Making peer reviews citable, discoverable, and creditable
A number of our members have asked if they can register their peer reviews with us. They believe that discussions around scholarly works should have DOIs and be citable, providing further context and provenance for researchers reading the article. To that end, as we enter Peer Review Week 2017, we're pleased to announce that Crossref infrastructure will soon be extended to manage DOIs for peer reviews. Support for this new content type launches next month, with a schema specifically dedicated to the reviews and discussions of scholarly content.
Not dissimilar to other registered resources (datasets, working papers, preprints, translations, etc.), peer reviews are important scholarly contributions in their own right and form part of the scholarly record. In addition to the members already registering them, many more are looking to better handle these contributions and give recognition to a process that is critical to maintaining scientific quality.
Here are a few examples of existing Crossref DOIs for peer reviews: https://doi.org/10.1016/j.engfracmech.2015.01.019, https://doi.org/10.5194/wes-1-177-2016, and https://doi.org/10.14322/PUBLONS.R518142.
We are extending our infrastructure to support all members who make these scholarly discussions available to readers. To accommodate a wide range of publisher practices, this will include a range of outputs made publicly available from the peer review history, across any and all review rounds, including referee reports, decision letters, and author responses. Members will be able to register scholarly discussions of journal articles not only before but also after publication (e.g. "post-publication reviews").
Central to this new feature of the Crossref Content Registration service is a dedicated set of metadata that supports the discovery and investigation of peer reviews and links each review to the article it discusses. The peer review schema will characterize the review itself (for example: recommendation, type, license, contributor information, competing interests) and offer a view into the review process (e.g. pre- or post-publication stage, revision round, review date).
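To make those fields concrete, here is an illustrative sketch of the kind of record the schema is meant to capture. The field names below simply mirror the properties listed above; they are assumptions for illustration, not the actual schema element names, and the DOIs use the 10.5555 test prefix as placeholders.

```python
from datetime import date

# Hypothetical peer review record, mirroring the properties the schema
# will capture; names and values are placeholders, not the real schema.
review_metadata = {
    "doi": "10.5555/example-review-1",          # placeholder DOI (test prefix)
    "type": "referee-report",
    "recommendation": "major-revision",
    "stage": "pre-publication",                 # pre- or post-publication
    "revision_round": 1,
    "review_date": date(2017, 9, 11).isoformat(),
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "contributors": [
        # Example ORCID iD from the ORCID documentation
        {"name": "A. Reviewer", "orcid": "https://orcid.org/0000-0002-1825-0097"},
    ],
    "competing_interests": "None declared",
    "is_review_of": "10.5555/example-article",  # the article under discussion
}

# The link back to the reviewed article is what ties the review into
# the scholarly record.
print(review_metadata["doi"], "reviews", review_metadata["is_review_of"])
```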
Our custom support for peer reviews will ensure that:
- Readers can see the provenance and context of a work
- Links to this content persist over time
- The metadata is useful
- Reviews are connected to the full history of the published results
- Contributors are given credit for their work (we will ask for ORCID iDs)
- The citation record is clear and up-to-date
As with all content registered with Crossref, we will make peer review metadata available for machine and human access, across multiple interfaces (e.g. REST API, OAI-PMH, Crossref Metadata Search), to enable discoverability across the research ecosystem. This metadata may also support enrichment of scholarly discussion, reviewer accountability, publishing transparency, and analysis or research on peer review itself.
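As a sketch of what machine access might look like, the snippet below builds a REST API query and parses a trimmed sample of the JSON envelope the API returns. The filter value `type:peer-review` and the `is-review-of` relation are assumptions about how the new content type will be exposed; the sample response uses placeholder DOIs on the 10.5555 test prefix, so no network call is made.

```python
import json
from urllib.parse import urlencode

# Assumed query for the new content type; check the REST API docs
# once support launches.
base = "https://api.crossref.org/works"
params = {"filter": "type:peer-review", "rows": 5}
url = f"{base}?{urlencode(params)}"

# Trimmed, hand-written sample of the REST API's JSON envelope,
# with placeholder DOIs (10.5555 is a test prefix).
sample_response = json.loads("""
{
  "message": {
    "items": [
      {"DOI": "10.5555/example-review-1",
       "type": "peer-review",
       "relation": {"is-review-of": [{"id": "10.5555/example-article"}]}}
    ]
  }
}
""")

# Walk the items and follow each review back to the article it discusses.
for item in sample_response["message"]["items"]:
    reviewed = item.get("relation", {}).get("is-review-of", [])
    print(item["DOI"], "reviews", [r["id"] for r in reviewed])
```

The same relation metadata is what would let aggregators assemble a paper's full review history from the article's DOI alone.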
To reflect the nature of this special content, we will bundle the fees for peer review content into the cost of registering the article for members who publish both the journal article and its peer reviews. No matter how many reviews are associated with a paper, there will be a fixed fee for the full set.
Peer review infrastructure will arrive at Crossref in one month, and we are excited to engage our members who want to assign DOIs to peer reviews or migrate previously registered review content to the new schema. A special thanks to the members so far who have given feedback and advice to develop the schema: BMC, The BMJ, Copernicus, eLife, PeerJ, and Publons.
Please contact our membership specialist if you’d like to know more.