We believe in Persistent Identifiers. We believe in defence in depth. Today we’re excited to announce an upgrade to our data resilience strategy.
Defence in depth means layers of security and resilience, and that means layers of backups. For some years now, our last line of defence has been a reliable, tried-and-tested technology. One that’s been around for a while. Yes, I’m talking about the humble 5¼ inch floppy disk.
Recording data citations supports data reuse and aids research integrity and reproducibility. Crossref makes it easy for our members to submit data citations to support the scholarly record.
TL;DR: Citations are essential, core metadata that all members should submit for all articles, conference proceedings, preprints, and books. Submitting data citations to Crossref has long been possible, and it’s easy; you just need to:
1. Include data citations in the references section, as you would for any other citation
2. Include a DOI or other persistent identifier for the data if it is available, just as you would for any other citation
3. Submit the references to Crossref through the content registration process, as you would for any other record (see the sketch below)

Your data citations will then flow through all the normal processes that Crossref applies to citations.
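For the submission step, here is a minimal sketch of uploading a deposit file whose reference list already includes data citations. It assumes the HTTPS deposit endpoint and the login_id/login_passwd/fname form fields described in Crossref’s deposit documentation, plus placeholder credentials and a hypothetical filename; check the current documentation before relying on the exact parameters.

```python
# Minimal sketch: POST a deposit XML file (whose citation_list already
# includes data citations) to Crossref's deposit endpoint.
import requests

DEPOSIT_URL = "https://doi.crossref.org/servlet/deposit"  # assumed endpoint

def submit_deposit(xml_path: str, username: str, password: str) -> str:
    """Upload a deposit file and return Crossref's immediate response text."""
    with open(xml_path, "rb") as xml_file:
        response = requests.post(
            DEPOSIT_URL,
            data={
                "operation": "doMDUpload",   # standard metadata upload operation
                "login_id": username,
                "login_passwd": password,
            },
            files={"fname": xml_file},       # the deposit XML, references included
            timeout=60,
        )
    response.raise_for_status()
    return response.text

# Example (hypothetical file and credentials):
# print(submit_deposit("article-with-data-citations.xml", "MYMEMBER", "secret"))
```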
At Crossref, we care a lot about the completeness and quality of metadata. Gathering robust metadata from across the global network of scholarly communication is essential for effective co-creation of the research nexus and making the inner workings of academia traceable and transparent. We invest time in community initiatives such as Metadata 20/20 and Better Together webinars. We encourage members to take time to look up their participation reports, and our team can support you if you’re looking to understand and improve any aspects of metadata coverage of your content.
What’s in the metadata matters because it is So.Heavily.Used.
You might be tired of hearing me say it, but that doesn’t make it any less true. Our open APIs now see over 1 billion queries per month. The metadata is ingested, displayed, and redistributed by a vast, global array of systems and services that are often designed, in whole or in part, to point users to relevant content. It’s also heavily used by researchers, who author the content described in the metadata they analyze.
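To give a concrete sense of that reuse, here is a minimal sketch that pulls the registered references, including any data citations, for a single work from the public REST API. The DOI and email address are placeholders for illustration only.

```python
# Minimal sketch: fetch the registered references (including data citations)
# for one work from the public Crossref REST API.
import requests

def get_references(doi: str) -> list[dict]:
    """Return the 'reference' list from a work's metadata, if any."""
    resp = requests.get(
        f"https://api.crossref.org/works/{doi}",
        params={"mailto": "you@example.org"},  # "polite pool" contact address
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["message"].get("reference", [])

# Example with a placeholder DOI:
# for ref in get_references("10.5555/12345678"):
#     print(ref.get("DOI") or ref.get("unstructured"))
```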
The Metadata Manager tool is in beta and contains many bugs. It’s being deprecated at the end of 2021. We recommend using the web deposit tool as an alternative, or the OJS plugin if your content is hosted on the OJS platform from PKP.
Once you click Deposit, we immediately process the deposit and display the results for accepted and rejected deposits. All deposit records accepted by the system have a live DOI.
All deposit results are archived and available for reference on the Deposit history tab on the top menu bar.
You can also see your deposit history in the admin tool - go to the Administration tab, then the Submissions tab. Metadata Manager deposit filenames begin with MDT. You can even review the XML that Metadata Manager has created on your behalf.
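If you prefer to check results outside the admin tool, a sketch along these lines can download the processing log for a single submission. The servlet URL and the usr/pwd/doi_batch_id/type parameter names are assumptions based on Crossref’s deposit documentation, and the batch id shown is hypothetical; verify both before use.

```python
# Sketch: download the processing log for one submission outside the admin tool.
# URL and parameter names (usr, pwd, doi_batch_id, type) are assumptions; confirm
# against the current Crossref documentation before relying on them.
import requests

LOG_URL = "https://doi.crossref.org/servlet/submissionDownload"

def fetch_submission_log(batch_id: str, username: str, password: str) -> str:
    """Return the result log for the deposit identified by its doi_batch_id."""
    resp = requests.get(
        LOG_URL,
        params={
            "usr": username,
            "pwd": password,
            "doi_batch_id": batch_id,
            "type": "result",   # 'result' = the processing log for the batch
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.text

# Example (hypothetical batch id and credentials):
# print(fetch_submission_log("MDT_20220722_0001", "MYMEMBER", "secret"))
```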
Updating existing records and failed deposits
Metadata Manager also makes it easy to update existing records, even if you didn’t use Metadata Manager to make the deposit in the first place. You must add the journal to your workspace before you can update the records associated with it - learn more about setting up a new journal in your workspace.
Accepted and Failed submissions can be updated using the respective tabs in the workspace. Click into the journal, and then click into the article. Add or make changes to the information, and then deposit.
What does the status “warning” in my submission result mean?
When similar metadata is registered for more than one DOI, it’s possible that the additional DOIs are duplicates. Because DOIs are intended to be unique, the potentially duplicated DOI is called a conflict. Learn more about the conflict report.
In Metadata Manager, if you register bibliographic metadata that is very similar to that of an existing DOI, you will see a “warning” status with your submission result. This is accurate.
When you return to your journal workspace in Metadata Manager to review your list of DOIs, the DOI that returned the “warning” will display as “failed”. This is inaccurate, as you can see if you try to resolve the DOI in question. We are working on improving the wording in this part of the process to make it less confusing.
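One quick way to confirm that the record really is live despite the “failed” label is to check that the DOI resolves. Here is a minimal sketch that asks doi.org whether it knows the DOI; the DOI in the example is a placeholder.

```python
# Minimal sketch: confirm that a DOI shown as "failed" in the workspace
# nevertheless resolves, i.e. the registration really did succeed.
import requests

def doi_resolves(doi: str) -> bool:
    """True if doi.org knows the DOI (it answers with a redirect to a landing page)."""
    resp = requests.head(f"https://doi.org/{doi}", allow_redirects=False, timeout=30)
    return resp.status_code in (301, 302, 303, 307, 308)

# Example with a placeholder DOI:
# print(doi_resolves("10.5555/12345678"))
```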
Page owner: Sara Bowman | Last updated 2022-July-22