The integrity of the scholarly record is an essential aspect of research integrity. Every initiative and service we have launched since our founding has focused on documenting and clarifying the scholarly record in an open, machine-actionable, and scalable form, all to make it easier for the community to assess the trustworthiness of scholarly outputs. Now that the scholarly record itself has evolved beyond the published outputs at the end of the research process – to include both the elements of that process and its aftermath – preserving its integrity poses new challenges. We strive to meet them, and we are reaching out to the community to help inform these efforts.
I’m pleased to share the 2022 board election slate. Crossref’s Nominating Committee received 40 submissions from members worldwide to fill five open board seats.
We maintain a balance of eight large member seats and eight small member seats. A member’s size is determined by the membership fee tier they pay. We look at how our total revenue is generated across the membership tiers and split it down the middle. Like last year, about half of our revenue came from members in the $0–$1,650 tiers, and the other half came from members in the $3,900–$50,000 tiers.
Everyone in our community – members, metadata users, service providers, community organizations, and researchers – creates and/or uses DOIs in some way, so making them more accessible is a worthy and overdue effort.
For the first time in five years and only the second time ever, we are recommending some changes to our DOI display guidelines (the changes aren’t really about display, but more on that below). We don’t take such changes lightly, because we know it means updating established workflows.
I’m delighted to say that Martin Paul Eve will be joining Crossref as a Principal R&D Developer starting in January 2023.
As a Professor of Literature, Technology, and Publishing at Birkbeck, University of London, Martin has always worked on issues relating to metadata and scholarly infrastructure. By joining the Crossref R&D group, Martin can focus full-time on helping us design and build a new generation of services and tools to help the research community navigate and make sense of the scholarly record.
To work out which version you’re on, take a look at the website address that you use to access iThenticate. If you go to ithenticate.com, you are using v1. If you use a bespoke URL such as https://crossref-[your member ID].turnitin.com/, you are using v2.
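If you manage access URLs for several accounts, the rule above is easy to automate. The following is an illustrative sketch, not an official tool: the function name and the exact hostname patterns are assumptions based only on the rule described here.

```python
from urllib.parse import urlparse

def ithenticate_version(url: str) -> int:
    """Guess the iThenticate version from an access URL.

    Hypothetical helper based on the rule above:
    ithenticate.com -> v1; crossref-<member ID>.turnitin.com -> v2.
    """
    host = urlparse(url).hostname or ""
    if host.endswith("ithenticate.com"):
        return 1
    if host.startswith("crossref-") and host.endswith(".turnitin.com"):
        return 2
    raise ValueError(f"Unrecognised iThenticate URL: {url}")

print(ithenticate_version("https://www.ithenticate.com/"))         # 1
print(ithenticate_version("https://crossref-1234.turnitin.com/"))  # 2
```

Checking the parsed hostname (rather than searching the whole URL string) avoids false matches on paths or query strings.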
Use doc-to-doc comparison to compare a primary uploaded document with up to five comparison uploaded documents. Any documents that you upload to doc-to-doc comparison will not be indexed and will not be searchable against any future submissions.
Uploading a primary document to doc-to-doc comparison will cost you a single document submission, but the comparison documents uploaded will not cost you any submissions.
Start from Folders, go to the Submit a document menu, and click Doc-to-Doc Comparison.
The doc-to-doc comparison screen allows you to choose one primary document and up to five comparison documents. Choose the destination folder for the documents you will upload. The Similarity Report for the comparison will be added to the same folder.
For your primary document, provide the author’s first name, last name, and document title. If you do not provide these details, the filename will be used as the title, and the author details will remain blank.
If you have administrator permissions, you can assign the Similarity Report for the comparison to a reporting group by selecting one from the Reporting Group drop-down. Learn more about reporting groups.
Click Choose File, and select the file you want to upload as your primary document. See the file requirements for both the primary and comparison documents on the right of the screen.
You can choose up to five comparison documents to check against your primary document. These do not need titles or author details, but each filename must be unique. Click Choose Files, and select the files you would like to upload as comparison documents. To remove a file from the comparison before you upload it, click the X icon next to the file. To upload your files for comparison, click Upload.
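The two constraints above (at most five comparison documents, each with a unique filename) can be checked before uploading. This is an illustrative sketch only; the function name is hypothetical, and iThenticate performs its own validation.

```python
def validate_comparison_files(filenames: list[str]) -> bool:
    """Pre-flight check for a doc-to-doc comparison upload.

    Hypothetical helper reflecting the rules stated above:
    no more than five comparison documents, unique filenames.
    """
    if len(filenames) > 5:
        raise ValueError("At most five comparison documents are allowed.")
    if len(set(filenames)) != len(filenames):
        raise ValueError("Comparison document filenames must be unique.")
    return True

print(validate_comparison_files(["draft-v1.docx", "draft-v2.docx"]))  # True
```

A check like this is cheap to run locally and saves a wasted upload attempt.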
Once your document has been uploaded and compared against the comparison documents, it will appear in your chosen destination folder.
This upload will have ‘Doc-to-Doc Comparison’ beneath the document title to show that this is a comparison upload and has not been indexed.
The upload will be given a Similarity Score against the selected comparison documents, which is also displayed in the report column. Click the similarity percentage to open the doc-to-doc comparison in the Document Viewer.
The Document Viewer is separated into three sections:
Along the top of the screen, the paper information bar shows details about the primary document, including document title, author, date the report was processed, word count, number of comparison documents provided, and how many of those documents matched with the primary document.
The left panel shows the paper text: the text of your primary document, with matching text highlighted in red.
Your comparison documents will appear in the sources panel to the right, showing instances of matching text within the submitted documents.
By default, the doc-to-doc comparison will open the Document Viewer in the All Sources view. This view lists all the comparison documents you uploaded. Each comparison document has a percentage showing how much of its content is similar to the primary document. If a comparison document shares no matching text with the primary document, 0% is shown next to it.
Doc-to-doc comparison can also be viewed in Match Overview mode. In this view, the comparison documents are listed with highest match percentage first, and all the sources are shown together, color-coded, on the paper text.
Page owner: Kathleen Luschek | Last updated 2020-May-19