To date, we have collected around 740 million events from 12 different sources since we launched our Event Data service in 2017. Each event is an online mention of the research associated with a DOI, either via the DOI directly or via the associated URL. However, we know that there is much more out there, so we would like to explore where we could expand.
We invite proposals to conduct a gap analysis for Event Data sources, looking at what we currently collect and seeing what more could be added.
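As an illustration of what an "event" looks like in practice, events for a given work can be retrieved from the public Event Data query API. The sketch below simply builds such a query URL; the DOI and email address are placeholders, and the helper function name is our own, not part of any Crossref library.

```python
from urllib.parse import urlencode

# Public Crossref Event Data query API endpoint.
BASE = "https://api.eventdata.crossref.org/v1/events"

def build_events_query(doi, source=None, mailto="you@example.org"):
    """Build a query URL for events that mention a given DOI.

    The `mailto` parameter identifies the caller, following
    Crossref's "polite" API etiquette. `source` optionally
    restricts results to one of the event sources
    (e.g. "wikipedia").
    """
    params = {"mailto": mailto, "obj-id": doi}
    if source:
        params["source"] = source
    return f"{BASE}?{urlencode(params)}"

# Placeholder DOI for illustration only.
print(build_events_query("10.5555/12345678", source="wikipedia"))
```

Fetching that URL would return a JSON page of events, each recording who mentioned the work, where, and when.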
We are delighted to announce the formation of a new Advisory Group to support us in improving preprint metadata. Preprints have grown in popularity over the last few years, with increasing focus brought by the need to rapidly disseminate knowledge in the midst of a global pandemic. We have supported metadata deposits for preprints under the content type ‘posted content’ since 2016, and members currently register around 17,000 new preprint metadata records each month.
It is time to put the ‘R’ back into R&D.
The Crossref R&D team was originally created to focus on the kinds of research projects that have allowed Crossref to make transformational technology changes, launch innovative new services, and engage with entirely new constituencies. Some illustrious projects that had their origins in the R&D group include:
- DOI Content Negotiation
- Similarity Check (originally CrossCheck)
- ORCID (originally Author DOIs)
- Crossmark
- The Open Funder Registry
- The Crossref REST API
- Linked Clinical Trials
- Event Data
- Grant registration
- ROR

And for each project that has graduated, there have been several that have not.
This announcement has been in the works for some time, but everything seems to take longer when there is a pandemic going on, including finding time and headspace to plan out our strategy for the next few years.
Over the last year or so we have had our heads down addressing how to scale our 20-year-old system and operation – and adapting to new ways of working. But we’ve also spent time talking to people, forging alliances, looking ahead, and making plans.
This section is for Similarity Check account administrators only. It explains how administrators need to set up the iThenticate account for their organizations before starting to add other users. It walks administrators through the parts of iThenticate that only account administrators can see, so if you aren’t an account administrator, you can ignore this section and skip to using your iThenticate account.
Not sure if you’re an account administrator? When you receive your email with your login details for iThenticate, log in and check if you can see the Manage Users tab. You can only see this tab if you’re an account administrator.
If you can’t see this tab, you’re not an account administrator, and you can skip ahead to using your iThenticate account for information on how to actually use the service to check your manuscripts.
Similarity Check administrator checklist - questions to answer before you begin
As a Similarity Check service user, your organization gets reduced-rate access to the iThenticate tool from Turnitin. You and your team can upload your manuscript submissions and receive a Similarity Report that shows areas of overlap between the manuscript and other published works.
As an administrator, you create and manage the users on your account, and you decide how your organization uses the iThenticate tool. You’ll find the system easier to use if you set it up correctly to start with. Consider the following questions carefully and set up your account accordingly before inviting any users to your account:
Exclusions allow you to set iThenticate to ignore particular phrases, document sections, common words, and URLs, so that they are not flagged in your account’s Similarity Reports.
We recommend starting without any exclusions to avoid excluding anything important. Once your users are experienced enough to identify words and phrases that appear frequently in Similarity Reports but do not indicate potential problems (and can therefore be ignored), you can start making careful use of this feature.
Set clear guidelines for your users so they understand the settings you have already applied, and can make skilful use of the options they can choose for themselves at report level.
4. Which iThenticate repositories will you want to check your manuscripts against?
iThenticate has a number of content repositories, grouped by the type of content they contain, including: Crossref, Crossref posted content, Internet, Publications, Your Indexed Documents.
You can choose which of iThenticate’s repositories you’re checking your manuscripts against. We recommend including them all to start with.
The person (whether an administrator or a user) who sets up a folder selects the repositories to check against for that folder. When the folder is shared, other users cannot adjust the repositories selected.
5. How will you budget for your document checking fees?
There’s a charge for each document checked, and you’ll receive an invoice in January each year for the documents you’ve checked in the previous year. If you’re a member of Crossref through a Sponsor, your Sponsor will receive this invoice.
As well as setting a Similarity Check document fees budget for your account each year, it’s useful to monitor document checking and see if you’re on track. You can monitor your usage in the reports section of the iThenticate platform. Ask yourself:
How many documents do you plan to check?
How often do you want to monitor usage? Set yourself a reminder to check your usage reports periodically.
How do you want to segment your report? You can report separately by groups of users, so think about what types of groups would make sense for your circumstances.
Learn more about how usage reports can help you monitor the number of documents checked on your account.
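Since document fees accrue per check and are invoiced annually, a simple projection can tell you whether you are on track. The sketch below is a minimal illustration of that arithmetic; the function name and the numbers are our own, not part of iThenticate or its reports.

```python
def on_track(docs_checked, months_elapsed, annual_budget_docs):
    """Project the year-end document count from usage so far,
    and say whether it fits within the annual budget.

    Returns (projected_total, within_budget).
    """
    projected = docs_checked / months_elapsed * 12
    return projected, projected <= annual_budget_docs

# Example: 450 documents checked after 4 months against a
# budget of 1,200 documents for the year.
projected, ok = on_track(450, 4, 1200)
print(projected, ok)  # projects 1350.0 documents: over budget
```

The actual figures come from the usage reports in the iThenticate platform; the point is only that a periodic check like this catches overruns before the annual invoice arrives.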
It’s a good idea to come back to these questions periodically, consider how your use of the tool is evolving, and make changes accordingly.