Just when you thought 2020 couldn’t go any faster, it’s Peer Review Week again! Peer review is such an important part of the research process, and highlighting the role it plays is key to retaining and reinforcing trust in the publishing process.
“Maintaining trust in the peer review decision-making process is paramount if we are to solve the world’s most pressing problems. This includes ensuring that the peer review process is transparent (easily discoverable, accessible, and understandable by anyone writing, reviewing, or reading peer-reviewed content) and that everyone involved in the process receives the training and education needed to play their part in making it reliable and trustworthy.”
A key way that publishers can make peer reviews easily discoverable and accessible is by registering them with Crossref - creating a persistent identifier for each review, linking it to the relevant article, and providing rich metadata to show what part it played in the evolution of the content. It also provides a way to acknowledge the incredible work done by academics in this area.
For Peer Review Week last year, Rosa and Rachael from Crossref created this short video to explain more.
Fast forward to 2020, and over 75k peer reviews have now been registered with us by a range of members including Wiley, PeerJ, eLife, Stichting SciPost, Emerald, IOP Publishing, Publons, The Royal Society and Copernicus. We encourage all members to register peer reviews with us - and you can keep track of who is doing so with this API query. (We recommend installing a JSON viewer for your browser to view the results if you haven’t done so already.)
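If you’d rather explore those results in code than in a browser, here’s a minimal sketch using our public REST API and Python’s requests library. It isn’t the exact query linked above; it assumes the type:peer-review filter and the publisher-name facet, and simply prints the members who have registered the most peer reviews.

```python
import requests

# Count registered peer reviews and group them by registering member.
# Illustrative sketch, not the exact query linked above: it assumes the
# "type:peer-review" filter and the "publisher-name" facet of the REST API.
url = "https://api.crossref.org/works"
params = {
    "filter": "type:peer-review",  # only peer review records
    "rows": 0,                     # no individual records, just counts
    "facet": "publisher-name:*",   # break the total down by member name
}
message = requests.get(url, params=params, timeout=30).json()["message"]

print("Peer reviews registered:", message["total-results"])
top_members = sorted(
    message["facets"]["publisher-name"]["values"].items(),
    key=lambda item: item[1],
    reverse=True,
)[:10]
for name, count in top_members:
    print(f"  {name}: {count}")
```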
Register peer reviews and contribute to the Research Nexus
At Crossref, we talk a lot about the research nexus, and it’s a theme that you’re going to hear a lot more about from us in the coming months and years.
The published article no longer has the supremacy it once did, and other outputs - and inputs - have increasing importance. Linked data and protocols are key for reproducibility, peer reviews increase trust and show the evolution of knowledge, and other research objects help increase the discoverability of content. Registering these objects and stating the relationships between them support the research nexus.
Peer reviews in particular are key to demonstrating that the scholarly record is not fixed - it’s a living entity that moves and changes over time. Registering peer reviews formally integrates these objects into the scholarly record and makes sure the links between the reviews and the article both exist and persist over time. It allows analysis or research on peer reviews and highlights richer discussions than those provided by the article alone, showing how discussion and conversation help to evolve knowledge. In particular, post-publication reviews highlight how the article is no longer the endpoint - after publication, research is further validated (or not!) and new ideas emerge and build on each other. You can see a real-life example of this from F1000 in a blog post written by Jennifer Lin a few years ago.
As we’ve said before:
Article metadata + peer review metadata = a fuller picture of the evolution of knowledge
Registering peer reviews also provides publishing transparency and reviewer accountability, and enables contributors to get credit for their work. If peer review metadata includes ORCID iDs, our ORCID auto-update service means we can automatically update the contributor’s ORCID record (with their permission), and our forthcoming schema update will take this even further by making CRediT roles available.
How to register peer reviews with Crossref
You need to be a Crossref member to register your peer reviews with us, and at the moment the only way to do so is by sending us your XML files. Unfortunately, peer reviews can’t yet be registered using our helper tools, such as the OJS plugin, Metadata Manager, or the web deposit form.
We know that there’s a range of outputs from the peer review process, and our schema allows you to identify many of them, including referee reports, decision letters, and author responses. You can include outputs from the initial submission only, or cover all subsequent rounds of revision, giving a really clear picture of the evolution of the article. Members can even register content from discussions that take place after publication, such as post-publication reviews.
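To make the XML a little more concrete, here’s a simplified, illustrative sketch of the kind of peer_review record a deposit contains, built with Python’s standard library so the structure is easy to follow. The DOIs, names, and dates are placeholders, the fragment would still need to sit inside a normal deposit file, and element names and ordering should always be checked against the current schema documentation and example deposits before sending anything.

```python
import xml.etree.ElementTree as ET

# Illustrative only: placeholder values throughout, and not a complete,
# validated deposit file (a real deposit wraps this in a doi_batch envelope).
review = ET.Element(
    "peer_review",
    {"stage": "pre-publication", "type": "referee-report", "revision-round": "1"},
)

# Who wrote the review.
contributors = ET.SubElement(review, "contributors")
reviewer = ET.SubElement(
    contributors, "person_name",
    {"contributor_role": "reviewer", "sequence": "first"},
)
ET.SubElement(reviewer, "given_name").text = "Ana"
ET.SubElement(reviewer, "surname").text = "Reviewer"

# A title and the date the review was completed.
titles = ET.SubElement(review, "titles")
ET.SubElement(titles, "title").text = "Referee report of: Example article title"
review_date = ET.SubElement(review, "review_date")
ET.SubElement(review_date, "month").text = "09"
ET.SubElement(review_date, "day").text = "21"
ET.SubElement(review_date, "year").text = "2020"

# The relationship that links the review to the article it reviews.
program = ET.SubElement(review, "program",
                        {"xmlns": "http://www.crossref.org/relations.xsd"})
related = ET.SubElement(program, "related_item")
relation = ET.SubElement(
    related, "inter_work_relation",
    {"relationship-type": "isReviewOf", "identifier-type": "doi"},
)
relation.text = "10.5555/example-article"  # placeholder article DOI

# The review's own DOI and landing page.
doi_data = ET.SubElement(review, "doi_data")
ET.SubElement(doi_data, "doi").text = "10.5555/example-review"  # placeholder
ET.SubElement(doi_data, "resource").text = "https://example.com/reviews/1"

print(ET.tostring(review, encoding="unicode"))
```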
Get involved with Peer Review Week 2020
We’re looking forward to seeing the debate sparked by Peer Review Week and hearing from our members about this important area. You can get involved by checking out the Peer Review Week 2020 website or following @PeerRevWeek and the hashtags #PeerRevWk20 and #trustinpeerreview on Twitter.
We’re excited to see what examples of the evolution of knowledge will be discoverable in registered and linked peer reviews this time next year!