To work out which version you’re on, take a look at the website address that you use to access iThenticate. If you go to ithenticate.com, then you are using v1. If you use a bespoke URL such as https://crossref-[your member ID].turnitin.com/, then you are using v2.
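Purely as an illustration (this is not part of the Similarity Check service or its documentation), the rule above can be sketched as a short check on the address; the member ID in the example URL is a made-up placeholder.

```python
from urllib.parse import urlparse

def ithenticate_version(url: str) -> int:
    """Guess the iThenticate version (v1 or v2) from the access URL,
    following the rule described above: an ithenticate.com address means v1,
    a bespoke crossref-<member ID>.turnitin.com address means v2."""
    host = urlparse(url).netloc.lower()
    if host == "ithenticate.com" or host.endswith(".ithenticate.com"):
        return 1
    if host.startswith("crossref-") and host.endswith(".turnitin.com"):
        return 2
    raise ValueError(f"Unrecognised iThenticate URL: {url}")

# Hypothetical examples:
print(ithenticate_version("https://www.ithenticate.com/"))         # -> 1
print(ithenticate_version("https://crossref-1234.turnitin.com/"))  # -> 2
```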
When your organization signs up for Similarity Check, a central contact at your organization will become your Similarity Check account administrator. They will set up all the users on your account.
When your administrator adds you as a user, you’ll receive an email from noreply@ithenticate.com with the subject line “Account Created” containing a username and a single-use password. You may only log in once with the single-use password, and you must change it the first time you log in.
Log in to your user account (v1)
Start from the link in the invitation email from noreply@ithenticate.com with the subject line “Account Created” and click Login
Enter your username and single-use password
Click to agree to the terms of the end-user license agreement. These terms govern your personal use of the service. They’re separate from the central Similarity Check service agreement that your organization has agreed to.
To change your email address, remove your current address from the Email field, enter your new email in the same field, and click Update Profile to save.
To change your password, enter a new password in the Change Password field, repeat it in the Confirm Password field, and click Update Profile to save.
Find your way around for users (v1)
In the main navigation bar at the top of the screen, you will see several tabs:
Folders: this is the main area of iThenticate, where you upload, manage, and view documents for checking