The integrity of the scholarly record is an essential aspect of research integrity. Every initiative and service that we have launched since our founding has been focused on documenting and clarifying the scholarly record in an open, machine-actionable and scalable form. All of this has been done to make it easier for the community to assess the trustworthiness of scholarly outputs. Now that the scholarly record itself has evolved beyond the published outputs at the end of the research process – to include both the elements of that process and its aftermath – preserving its integrity poses new challenges that we strive to meet, and we are reaching out to the community to help inform these efforts.
I’m pleased to share the 2022 board election slate. Crossref’s Nominating Committee received 40 submissions from members worldwide to fill five open board seats.
We maintain a balance of eight large member seats and eight small member seats. A member’s size is determined based on the membership fee tier they pay. We look at how our total revenue is generated across the membership tiers and split it down the middle. Like last year, about half of our revenue came from members in the tiers $0 - $1,650, and the other half came from members in tiers $3,900 - $50,000.
Our entire community – members, metadata users, service providers, community organizations, and researchers – creates and/or uses DOIs in some way, so making them more accessible is a worthy and overdue effort.
For the first time in five years and only the second time ever, we are recommending some changes to our DOI display guidelines (the changes aren’t really about display, but more on that below). We don’t take such changes lightly, because we know it means updating established workflows.
I’m delighted to say that Martin Paul Eve will be joining Crossref as a Principal R&D Developer starting in January 2023.
As a Professor of Literature, Technology, and Publishing at Birkbeck, University of London, Martin has always worked on issues relating to metadata and scholarly infrastructure. In joining the Crossref R&D group, Martin can focus full-time on helping us design and build a new generation of services and tools to help the research community navigate and make sense of the scholarly record.
Just when you thought 2020 couldn’t go any faster, it’s Peer Review Week again! Peer review is such an important part of the research process, and highlighting the role it plays is key to retaining and reinforcing trust in the publishing process.
“Maintaining trust in the peer review decision-making process is paramount if we are to solve the world’s most pressing problems. This includes ensuring that the peer review process is transparent (easily discoverable, accessible, and understandable by anyone writing, reviewing, or reading peer-reviewed content) and that everyone involved in the process receives the training and education needed to play their part in making it reliable and trustworthy.”
A key way that publishers can make peer reviews easily discoverable and accessible is by registering them with Crossref: creating a persistent identifier for each review, linking it to the relevant article, and providing rich metadata to show what part the item played in the evolution of the content. It also provides a way to acknowledge the incredible work academics do in this area.
For Peer Review week last year, Rosa and Rachael from Crossref created this short video to explain more.
Fast forward to 2020, and over 75,000 peer reviews have now been registered with us by a range of members, including Wiley, PeerJ, eLife, Stichting SciPost, Emerald, IOP Publishing, Publons, The Royal Society, and Copernicus. We encourage all members to register peer reviews with us, and you can keep up to date with who is registering them via this API query. (If you haven’t already, we recommend installing a JSON viewer for your browser to view the results.)
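If you'd rather explore these records programmatically than in a browser, a query like the one above can be built against the public Crossref REST API. A minimal sketch, assuming the `api.crossref.org` works endpoint and its `type:peer-review` filter (the `mailto` address is a placeholder for your own contact email, which routes requests to the "polite" pool):

```python
# Minimal sketch: build a Crossref REST API query for registered peer reviews.
# Endpoint and filter names follow the public REST API; verify against the
# current API documentation before relying on them.
import urllib.parse

BASE = "https://api.crossref.org/works"

def peer_review_query(rows=5, mailto="you@example.org"):
    """Return the URL for a peer-review works query.

    `mailto` is a placeholder contact address; supplying your own email
    identifies your requests to the API.
    """
    params = {
        "filter": "type:peer-review",  # restrict results to peer review records
        "rows": rows,                  # number of records per page
        "mailto": mailto,
    }
    return BASE + "?" + urllib.parse.urlencode(params)

print(peer_review_query())
# → https://api.crossref.org/works?filter=type%3Apeer-review&rows=5&mailto=you%40example.org
```

Fetching that URL returns JSON whose `message.total-results` field gives the running count of registered reviews, which is one way to watch the number grow over time.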
Register peer reviews and contribute to the Research Nexus
At Crossref, we talk a lot about the research nexus, and it’s a theme that you’re going to hear a lot more about from us in the coming months and years.
The published article no longer has the supremacy it once did, and other outputs - and inputs - have increasing importance. Linked data and protocols are key for reproducibility, peer reviews increase trust and show the evolution of knowledge, and other research objects help increase the discoverability of content. Registering these objects and stating the relationships between them supports the research nexus.
Peer reviews in particular are key to demonstrating that the scholarly record is not fixed - it’s a living entity that moves and changes over time. Registering peer reviews formally integrates these objects into the scholarly record and makes sure the links between the reviews and the article both exist and persist over time. It allows analysis or research on peer reviews and highlights richer discussions than those provided by the article alone, showing how discussion and conversation help to evolve knowledge. In particular, post-publication reviews highlight how the article is no longer the endpoint - after publication, research is further validated (or not!) and new ideas emerge and build on each other. You can see a real-life example of this from F1000 in a blog post written by Jennifer Lin a few years ago.
As we’ve said before:
Article metadata + peer review metadata = a fuller picture of the evolution of knowledge
Registering peer reviews also provides publishing transparency and reviewer accountability, and enables contributors to get credit for their work. If peer review metadata includes ORCID IDs, our ORCID auto-update service means that we can automatically update the author’s ORCID record (with their permission), while our forthcoming schema update will take this even further, making CRediT roles available in our schema.
How to register peer reviews with Crossref
You need to be a Crossref member to register peer reviews with us, and currently the only way to do so is by sending us your XML files. Unfortunately, you can’t yet register peer reviews using our helper tools like the OJS plugin, Metadata Manager, or the web deposit form.
We know that there’s a range of outputs from the peer review process, and our schema allows you to identify many of them, including referee reports, decision letters, and author responses. You can include outputs from the initial submission only, or cover all subsequent rounds of revisions, giving a really clear picture of the evolution of the article. Members can even register content for discussions after the article was published, such as post-publication reviews.
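To make the XML route above more concrete, here is a hedged sketch that assembles a minimal peer review fragment with Python's `xml.etree.ElementTree`. The element and attribute names (`peer_review`, `stage`, `type`, `revision-round`, the `isReviewOf` relation) follow our reading of the Crossref deposit schema and should be checked against the official schema documentation; the DOIs are example placeholders:

```python
# Hedged sketch of a minimal peer review deposit fragment. Element and
# attribute names are assumptions based on the Crossref deposit schema;
# check the official schema documentation before depositing.
import xml.etree.ElementTree as ET

def peer_review_fragment(review_doi, article_doi, round_no=1):
    """Build a minimal <peer_review> element linking a review to its article."""
    review = ET.Element("peer_review", {
        "stage": "pre-publication",    # or "post-publication" for later discussion
        "type": "referee-report",      # decision letters, author responses, etc. use other types
        "revision-round": str(round_no),
    })
    titles = ET.SubElement(review, "titles")
    ET.SubElement(titles, "title").text = f"Referee report, round {round_no}"

    # State the relationship between the review and the article it evaluates.
    program = ET.SubElement(review, "program")
    related = ET.SubElement(program, "related_item")
    relation = ET.SubElement(related, "inter_work_relation", {
        "relationship-type": "isReviewOf",
        "identifier-type": "doi",
    })
    relation.text = article_doi

    doi_data = ET.SubElement(review, "doi_data")
    ET.SubElement(doi_data, "doi").text = review_doi
    return ET.tostring(review, encoding="unicode")

print(peer_review_fragment("10.5555/example-review-1", "10.5555/example-article"))
```

The key design point is the `isReviewOf` relation: it is what formally ties each review round to the article, so the links between them exist and persist in the metadata rather than only on a landing page.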
Get involved with Peer Review Week 2020
We’re looking forward to seeing the debate sparked by Peer Review Week and hearing from our members about this important area. You can get involved by checking out the Peer Review Week 2020 website or following @PeerRevWeek and the hashtags #PeerRevWk20 #trustinpeerreview on Twitter.
We’re excited to see what examples of the evolution of knowledge will be discoverable in registered and linked peer reviews this time next year!