We were delighted to engage with over 200 community members in our latest Community update calls. We aimed to present a diverse selection of highlights of our progress and to discuss your questions about participating in the Research Nexus. For those who didn’t get a chance to join us, I’ll briefly summarise the content of the sessions here, and I invite you to join the conversations on the Community Forum.
You can take a look at the slides here, and the recordings of the calls are available here.
We have some exciting news for fans of big batches of metadata: this year’s public data file is now available. As in years past, we’ve wrapped up all of our metadata records into a single download for everyone who wants to work with the complete set of Crossref metadata.
We’ve once again made this year’s public data file available via Academic Torrents, and, in response to feedback from public data file users, we’ve taken a few additional steps to make accessing this 185 GB file a little easier.
In 2022, we flagged up some changes to Similarity Check that were taking place in v2 of Turnitin’s iThenticate tool, used by members participating in the service. We noted that further enhancements were planned, and we want to highlight some changes that are coming very soon. These changes affect functionality used by account administrators; they don’t affect the Similarity Reports themselves.
From Wednesday 3 May 2023, administrators of iThenticate v2 accounts will notice some changes to the interface and improvements to the Users, Groups, Integrations, Statistics and Paper Lookup sections.
We’ve been spending some time speaking to the community about our role in research integrity, and particularly the integrity of the scholarly record. In this blog, we’ll be sharing what we’ve discovered, and what we’ve been up to in this area.
In previous posts in our “Integrity of the Scholarly Record (ISR)” series, we’ve discussed how the infrastructure Crossref builds and operates (together with our partners and integrators) captures and preserves the scholarly record, making it openly available to humans and machines through metadata and relationships describing all research activity.
Members can participate in Cited-by by completing the following steps:
1. Deposit references for one or more prefixes as part of your content registration process. Use your Participation Report to check your progress with depositing references. This step is not mandatory, but it is highly recommended to ensure that your citation counts are complete (a sketch of a reference deposit follows these steps).
2. We match the metadata in the references to DOIs to establish Cited-by links in our database. As new content is registered, we automatically update the citations and, for members with Cited-by alerts enabled, notify you of the new links.
3. Display the links on your website. We recommend displaying the citations you retrieve on your DOI landing pages.
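To illustrate step 1, here is a minimal sketch, built with Python’s standard library, of a citation list as it might appear within a content registration deposit. The DOIs, citation keys, and bibliographic details are placeholders, and a real deposit wraps a fragment like this inside a full metadata deposit file, so treat it as an illustration rather than a ready-to-submit deposit.

```python
# Minimal sketch: building a <citation_list> fragment for a reference deposit.
# All DOIs, keys, and bibliographic details are placeholders; a real deposit
# embeds this fragment inside a full content registration XML file.
import xml.etree.ElementTree as ET

citation_list = ET.Element("citation_list")

# A reference where the cited work's DOI is already known.
cited_with_doi = ET.SubElement(citation_list, "citation", key="ref1")
ET.SubElement(cited_with_doi, "doi").text = "10.5555/12345678"

# A reference supplied as metadata only; Crossref will attempt to match it to a DOI.
cited_unstructured = ET.SubElement(citation_list, "citation", key="ref2")
ET.SubElement(cited_unstructured, "unstructured_citation").text = (
    "Carberry, J. (2008). Toward a Unified Theory of High-Energy Metaphysics. "
    "Journal of Psychoceramics, 5(11), 1-3."
)

print(ET.tostring(citation_list, encoding="unicode"))
```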
If you are a member through a sponsor, you may have access to Cited-by through them – please contact your sponsor for more details. OJS users can use the Cited-by plugin.
Citation matching
Members sometimes submit references without including a DOI tag for the cited work. When this happens, we look for a match based on the metadata provided. If we find one, the reference metadata is updated with the DOI and we add the "doi-asserted-by": "crossref" tag. If we don’t find a match immediately, we will try again at a later date.
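Once a reference has been matched, the assertion is visible in the work’s metadata. The sketch below, using the Python requests library against our REST API, lists the references of a work where "doi-asserted-by" is "crossref". The DOI used is just an example, and results depend on the member having deposited references for that work.

```python
# Sketch: list the references of a work that were matched to DOIs by Crossref.
# The DOI below is an example; any registered DOI with deposited references works.
import requests

doi = "10.5555/12345678"  # example DOI
resp = requests.get(f"https://api.crossref.org/works/{doi}")
resp.raise_for_status()

references = resp.json()["message"].get("reference", [])
for ref in references:
    # "doi-asserted-by" is "crossref" when the DOI was supplied by our matching,
    # and "publisher" when the member deposited the DOI themselves.
    if ref.get("doi-asserted-by") == "crossref":
        print(ref.get("key"), ref.get("DOI"))
```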
There are some references for which we won’t find matches, for example where the DOI has been registered with an agency other than Crossref (such as DataCite), or where the reference refers to an object without a DOI, such as conferences, manuals, blog posts, and articles from some journals.
To perform matching, we first check whether a DOI tag is included in the reference metadata. If so, we assume it is correct and link the reference to the corresponding work. If there isn’t a DOI tag, we search using the metadata supplied and select candidate results whose relevance scores pass a threshold. The best match is then chosen through a further validation step. Learn more about how we match references. The same process is used for the results shown in our Simple Text Query tool.
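Our production matching is more involved than we can show here, but the simplified sketch below illustrates the general idea of candidate retrieval and thresholding using our public REST API: search with the reference’s bibliographic text, keep candidates whose relevance score clears a threshold, and take the best of those. The threshold value and the final validation are illustrative placeholders, not the parameters of the real matcher.

```python
# Simplified illustration of reference matching: candidate retrieval by
# bibliographic search, followed by score thresholding. The threshold and the
# "validation" here are placeholders, not the parameters of the real matcher.
import requests

SCORE_THRESHOLD = 60.0  # illustrative value only


def match_reference(unstructured_reference: str) -> str | None:
    """Return the DOI of the best candidate match, or None if no candidate
    clears the score threshold."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": unstructured_reference, "rows": 5},
    )
    resp.raise_for_status()
    candidates = resp.json()["message"]["items"]

    # Keep only candidates whose relevance score clears the threshold.
    plausible = [c for c in candidates if c.get("score", 0) >= SCORE_THRESHOLD]
    if not plausible:
        return None

    # A real matcher validates the best candidate against the reference
    # metadata (authors, year, pages, ...); here we simply take the top score.
    best = max(plausible, key=lambda c: c["score"])
    return best["DOI"]


print(match_reference(
    "Carberry, J. (2008). Toward a Unified Theory of High-Energy Metaphysics: "
    "Silly String Theory. Journal of Psychoceramics, 5(11)."
))
```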
All citations to a work are returned in the corresponding Cited-by query.
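Members enrolled in Cited-by can retrieve these links with a Cited-by query using their Crossref account credentials. The sketch below shows the general shape of such a request against the getForwardLinks endpoint; the username, password, and DOI are placeholders, and the XML response is only lightly parsed here, so inspect the raw response if yours is structured differently.

```python
# Sketch: retrieve Cited-by links (works that cite a given DOI) for a member
# enrolled in Cited-by. Username, password, and DOI are placeholders.
import xml.etree.ElementTree as ET

import requests

resp = requests.get(
    "https://doi.crossref.org/servlet/getForwardLinks",
    params={
        "usr": "role_or_email",     # placeholder credentials
        "pwd": "password",
        "doi": "10.5555/12345678",  # the cited work you want links for
    },
)
resp.raise_for_status()

# The response is XML; each forward_link element describes one citing work.
# Element names are hedged here, so fall back to resp.text if needed.
root = ET.fromstring(resp.content)
for element in root.iter():
    if element.tag.endswith("forward_link"):
        print(ET.tostring(element, encoding="unicode"))
```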
Page owner: Isaac Farley | Last updated 2023-April-28