Jennifer Lin – 2017 November 14
Researchers are adopting new tools that bring consistency and shareability to their experimental methods. Increasingly, these tools are viewed as key components in driving reproducibility and replicability. They provide transparency in reporting key methodological and analytical information. They are also used to share the artifacts that make up a processing trail for the results: the data, materials, analytical code, and related software on which the conclusions of a paper rely. Where expert feedback is also shared, such reviews further enrich this record. We capture these ideas and build on the “article nexus” idea from an earlier blog post with a new variation: “the research nexus.”
Jennifer Lin – 2017 October 24
Billions of researcher-hours have been spent doing peer reviews. What valuable work! Let’s get more mileage out of these labors and make these expert discussions citable, persistent, and linked up to the scholarly record. As we previously shared during Peer Review Week, Crossref is launching a new content type to support the registration of peer reviews, bringing us one step closer to that goal. Today, we are excited to announce that we’re open for deposits.