The DOI error report lets you know when a user has seen one of your DOIs somewhere that doesn't resolve to a website.
The DOI error report helps make sure your DOI links go where they're supposed to. When a user clicks on a DOI that has not been registered, they are sent to a form that collects the DOI, their email address, and any comments they want to share. We compile those submissions into a daily DOI error report and email it, as a .csv attachment, to the technical contact at the member responsible for the DOI prefix.
If you would like the DOI error report to be sent to a different person, please contact us.
The DOI error report .csv file contains (where provided by the user):
DOI - the DOI being reported
URL - the referring URL
REPORTED-DATE - date the DOI was initially reported
USER-EMAIL - email of the user reporting the error
COMMENTS - any comments shared by the user
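The attachment is a plain .csv file, so it can be processed with standard tooling. A minimal sketch of reading it with Python's standard library (the header names and sample row below are assumptions based on the field list above; check the exact spelling in your own report):

```python
import csv
import io

# Hypothetical sample mimicking the DOI error report .csv attachment.
sample = """DOI,URL,REPORTED-DATE,USER-EMAIL,COMMENTS
10.5555/example.123,https://example.com/article,2024-07-01,reader@example.com,Link gives a 404
"""

def read_error_report(fileobj):
    """Yield each reported DOI with its context as a dict keyed by column name."""
    yield from csv.DictReader(fileobj)

# In practice you would open the saved attachment instead of io.StringIO.
rows = list(read_error_report(io.StringIO(sample)))
print(rows[0]["DOI"])  # -> 10.5555/example.123
```

Reading rows as dicts keyed by column name means the code keeps working even if column order changes between reports.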
We find that approximately two-thirds of reported errors are real problems. Common reasons why you might get this report include:
you’ve published/distributed a DOI but haven’t registered it
the DOI you published doesn’t match the registered DOI
a link was formatted incorrectly (for example, a stray period at the end of a DOI)
a user has made a mistake (confusing 1 for l or 0 for O, or cut-and-paste errors)
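Several of the causes above can be spotted by inspecting the reported string itself. A small sketch of that triage, using a variant of the regular expression Crossref has recommended for matching most modern Crossref DOIs (it will not match every legacy DOI, so treat a non-match as a hint, not proof):

```python
import re

# Variant of Crossref's recommended pattern for modern DOIs (assumption:
# adapted for this sketch; it intentionally ignores rare legacy forms).
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/[-._;()/:a-z0-9]+$", re.IGNORECASE)

def diagnose(reported_doi):
    """Classify some common problems seen in reported DOI strings."""
    doi = reported_doi.strip()
    if doi.endswith((".", ",", ";")):
        return "trailing punctuation - strip it and retry"
    if not DOI_PATTERN.match(doi):
        return "does not look like a DOI"
    return "well-formed - check whether it is registered"

print(diagnose("10.5555/example.123."))  # -> trailing punctuation - strip it and retry
```

This only catches formatting problems; a well-formed DOI can still be unregistered, which is a separate check.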
What should I do with my DOI error report?
Review the .csv file attached to your emailed report and check whether any legitimate DOIs are listed. Any legitimate DOIs found in this report should be registered immediately. When a DOI reported via the form is registered, we'll send an alert to the reporting user (if they've shared their email address with us).
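One way to check whether a reported DOI is actually registered is to look it up in the Crossref REST API: a request to `https://api.crossref.org/works/{doi}` returns HTTP 404 for a DOI that isn't registered. A minimal sketch, assuming that behaviour (note the DOI must be percent-encoded in the URL):

```python
import urllib.error
import urllib.parse
import urllib.request

API_BASE = "https://api.crossref.org/works/"

def works_url(doi):
    """Build the REST API lookup URL, percent-encoding the DOI."""
    return API_BASE + urllib.parse.quote(doi, safe="")

def is_registered(doi):
    """Return True if the Crossref REST API knows this DOI, False on 404."""
    try:
        with urllib.request.urlopen(works_url(doi)) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

print(works_url("10.5555/example.123"))
```

If you batch-check every DOI in a report this way, be polite to the API: add a pause between requests and include a contact email in your User-Agent, per the API's etiquette guidelines.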
I keep getting DOI error reports for DOIs that I have not published, what do I do about this?
It's possible that someone is trying to link to your content with the wrong DOI. If you do a web search for the reported DOI, you may find the source of the problem - we often find incorrect linking from user-provided content like Wikipedia, or from DOIs inadvertently distributed by members to PubMed. If it's still a mystery, please contact us.
Page maintainer: Isaac Farley Last updated: 2024-July-19