An article in the October 2015 issue of Learned Publishing, by two of the authors of this article (Paglione & Lawrence, 2015), described the creation, by a community Working Group led by Consortia Advancing Standards in Research Administration Information (CASRAI, http://dictionary.casrai.org/Peer_Review_Services), Faculty of 1000 (F1000), and Open Researcher and Contributor ID (ORCID), of a set of data exchange standards to support and acknowledge peer review activity (CASRAI, 2015). These data elements can be used to cite a broad range of peer review activity, from anonymous to fully attributed reviews, and covering reviews of multiple forms of research output. The citation description includes, at minimum, information relating to the person conducting the review, the type of review, and the publisher, funder, or other organization recognizing the reviewer. Other information may also be included, and the citation can also link to the content of the review or a reference to the item reviewed, if desired. The information describing the person conducting the review includes an ORCID ID, a persistent unique identifier for individuals. In September 2015, peer review citations were added to the ORCID registry to describe the review activity attributed to the person holding the ID. This functionality can be implemented by any ORCID member.
In this article, we will provide two case studies of different ways these peer review citations are being used by early adopters.
WHY CONNECT PEER REVIEW ACTIVITY?
Linking peer review activities to a researcher's ORCID ID ensures that this contribution is included with their other scholarly contributions such as publications, research data sets, grants, and other activity. Centralizing these contributions through the ORCID record will provide a better understanding of the full range of scholarly contributions made by researchers. Further, peer review citations can help encourage participation in this vital component of the scholarly communications process – whether for journal articles, books, conference submissions, promotion and tenure, grant applications, or pre-print annotation. As described in the October 2015 article, publishers and others are already experimenting with methods to incentivize researcher participation, and most have a growing interest in enabling citation of peer review activity. We hope that, by including peer review citations in the ORCID record and enabling them to be connected to the ORCID ID, this important activity will become recognized as an integral component of the overall scholarly record.
THE PEER REVIEW CITATION PROCESS
The functionality that ORCID has developed to enable this recognition works as follows:
When an individual agrees to perform a review, they are asked for their ORCID ID if it is not already linked to an editorial database.
The reviewer provides their ORCID ID, or generates a new one, by clicking a button and using or setting up their ORCID credentials.
At this time the reviewer also provides permission to the publisher or other organization to add the review citation to their ORCID record.
The review is performed.
At an interval determined by the publisher or other reviewing organization, a citation is posted to the reviewer's ORCID record. Depending on the anonymity of the review, citations may be posted immediately, at the end of a review cycle, or periodically.
Citations may be sparse, containing as little as the organization for which the review was performed, or rich, containing details about what was reviewed, and even a link to the review and paper itself if the review was open.
The ORCID record holder decides whether to make information about their peer review activity publicly available; this can be done at the individual citation level, allowing the individual to choose which reviews to share publicly and which to keep private.
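The permission step in the workflow above follows a standard OAuth authorization code flow: the reviewer signs in at ORCID and grants the reviewing organization the scope it needs to post to their record. The sketch below builds the kind of authorization URL involved; the endpoint and `/activities/update` scope follow ORCID's public OAuth documentation, but the client ID and redirect URI are hypothetical placeholders, and exact parameters may vary by API version.

```python
# Illustrative sketch of the permission step: directing a reviewer to ORCID
# to authenticate and authorize the organization to update their record.
# client_id and redirect_uri below are hypothetical placeholders.
from urllib.parse import urlencode

def build_orcid_authorize_url(client_id: str, redirect_uri: str) -> str:
    """Build the URL a reviewer visits to sign in to ORCID and grant the
    reviewing organization permission to add citations to their record."""
    params = {
        "client_id": client_id,          # credential issued to the ORCID member org
        "response_type": "code",         # standard OAuth authorization code flow
        "scope": "/activities/update",   # permission to add items to the record
        "redirect_uri": redirect_uri,    # where ORCID sends the reviewer back
    }
    return "https://orcid.org/oauth/authorize?" + urlencode(params)

url = build_orcid_authorize_url("APP-XXXX", "https://reviews.example.org/orcid/callback")
print(url)
```

Once the reviewer approves, ORCID returns an authorization code to the redirect URI, which the organization exchanges for a token it can use to post citations on the schedule it chooses.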
Our two case studies, while both relating to journal article review, offer contrasting approaches to the review process – one traditional, closed, single-blind pre-publication peer review and the other fully open, post-publication peer review.
CASE STUDY 1: AMERICAN GEOPHYSICAL UNION
The American Geophysical Union (AGU) is the largest Earth and space science organization. It has approximately 65,000 members and publishes 19 journals that are a mix of open access and subscription/hybrid titles. All journals use a traditional single-blind peer review, whereby the referees are not revealed to the author during or after the peer review process (although they can self-identify).
The American Geophysical Union's Geophysical Electronic Manuscript Submission (GEMS) system is provided by a vendor, eJournalPress (eJP), which has recently updated its system to support the collection of ORCID IDs for authors and reviewers and now supports updating a reviewer's ORCID record with their peer review activities.
The existing AGU ‘thank you’ e-mail to peer reviewers now includes an encrypted link inviting them to claim credit via their ORCID record, which takes the reviewer to the eJP landing page (see Fig. 1 for an example). There, the reviewer will find an explanation of how the process works and a confirmation button to enable them to claim credit. If the reviewer does not already have an ORCID ID – or has not entered it in GEMS – they are redirected to ORCID for authentication and asked for permission to update their ORCID record (Fig. 2). (Note: GEMS has a single-sign-on capability with ORCID.) Once the reviewer's ORCID record has been confirmed and permission to update granted, eJP adds the peer review citation to it.
Because the journals use a single-blind peer review process, the only review citation elements being used are: the person identifier (i.e. the ORCID ID), describing the person who performed the review; the organizational identifier, describing the organization (i.e. AGU) recognizing the person's review activity; the journal name; and the year the review was completed. This sparse citation ensures that it is not possible for others to determine which article(s) a reviewer is responsible for and is similar to information already published in annual lists thanking reviewers. Reviewers can still self-identify to the author if they would like. Each round of review will be counted (i.e. a separate citation is provided for each individual review, not just for each manuscript).
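A sparse citation of this kind can be pictured as a small record containing only the four elements just listed. The field names in this sketch are illustrative rather than the exact CASRAI/ORCID schema, and the ORCID iD shown is ORCID's own documented example iD, not a real reviewer.

```python
# Illustrative sketch of a sparse, single-blind review citation.
# Field names are hypothetical; the point is what is (and is not) recorded.
sparse_citation = {
    "reviewer_orcid": "0000-0002-1825-0097",  # ORCID's example iD, not a real reviewer
    "convening_organization": {
        "name": "American Geophysical Union",  # the org recognizing the review
    },
    "review_group": "Journal of Geophysical Research",  # journal name only
    "completion_year": 2015,
}

# Anonymity check: nothing in the citation identifies the reviewed manuscript,
# so the reviewer cannot be linked to any particular article.
assert "reviewed_item" not in sparse_citation
```

Because each round of review is cited separately, a reviewer who refereed two revisions of the same manuscript would accrue two such records.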
The American Geophysical Union's goals in adding peer review activities to their reviewers' ORCID records include the following:
Expanding appreciation for peer review while preserving reviewer anonymity.
Understanding more about who is reviewing for AGU and how often, in particular, from a membership perspective.
Improving recognition for AGU reviewers by enabling them to connect their review activities in their ORCID record.
The American Geophysical Union plans to extend credit to editors and associate editors in the future. AGU's editors and Publications Committee have asked for better recognition of reviewers. AGU also surveys authors and reviewers after the decision for each paper, and better recognition for this work, both individually and collectively, is a common theme. The ORCID implementation is one means of expanding this recognition and, if widely adopted, will help the community better understand this added value.
CASE STUDY 2: F1000
F1000 is the publisher of three services that support and inform the work of life scientists and clinicians – F1000Prime, F1000Research, and F1000Workspace. The first two services use reviewers for a type of post-publication peer review, but in differing ways; in both instances, these reviewers can now claim credit for their review activities on their ORCID record.
F1000Prime provides qualitative assessment by over 11,000 F1000 Faculty Members, who are experts across biology and medicine. In the course of their daily work, they identify the articles in the published literature that they think are most important and interesting, give each a 1–3 star rating (all positive), and write a short recommendation explaining why they find the article interesting. These ratings and recommendations are published alongside their name and affiliation and now form a database of over 165,000 recommendations spanning more than 4,000 journals. This activity is essentially a form of post-publication peer review of the world's biomedical literature, and Faculty Members can now obtain further credit for their work by including these recommendations in the review activity section of their ORCID record.
F1000Research is a publishing platform for biomedical research articles. Rather than a traditional double-blind review process, it uses near-immediate publication (following a set of rapid internal checks), followed by invited and fully open post-publication peer review. F1000Research has taken this approach to try to tackle a number of key issues in the traditional publication process. The first is to remove the long delay between when new findings are ready to be shared and when other researchers actually get to see and benefit from them; this typically takes several months, or sometimes even years, a delay that benefits no one. The second, related issue is the well-documented biases (Smith, 2010; Bastian, 2015) that anonymous pre-publication refereeing can cause, largely because referees are, by definition, experts in the same area and may well therefore be competitors. In the F1000Research model, articles undergo a rapid pre-publication check by an internal editorial team to ensure that the article is scientific, is authored by a researcher at a recognized scientific institution, is readable, meets ethical requirements for the field, and includes the underlying data. If the article passes these checks, it is formally published, and this then triggers invited peer review. All referee reports and the review status, together with the referee's name and affiliation, are published alongside the article as they come in. Authors can respond publicly to the referee reports and can publish new article versions, both of which are captured in a dynamic article citation.
Because these referee reports are signed and open, F1000Research can assign them digital object identifiers (DOIs) – and in fact some reports are highly used and cited in their own right (Fig. 3). The time and effort researchers spend in writing these referee reports is sizable, but is typically invisible despite playing such an important role in scientific progress. F1000 was especially keen to ensure that their referees obtain as much credit as possible for this invaluable task, and to ensure that, as ORCID is used increasingly by organizations to easily track all of a researcher's outputs, there was a way to include this work on their ORCID record.
In both cases, the F1000 process works in a similar way to the AGU one outlined earlier. A thank you email is sent to peer reviewers, which includes an encrypted link allowing reviewers to claim credit for their review activity and attach it to their ORCID record. However, because of the open nature of their peer review process, F1000 includes more information about review activities to ORCID records than AGU. Citation data includes the name of the referee, the name of the publication, the title of paper being refereed, the date of publication of the referee report, and the unique identifier (DOI) of both the paper and referee report (Fig. 4 shows an example of how these will look on an ORCID record). In the near future, F1000 plans to request permission from authors and reviewers to enable any type of output published through F1000 to automatically update their ORCID record.
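By contrast with the sparse single-blind case, an open-review citation of the kind F1000 deposits can be pictured as a record carrying all the elements just listed. Again, the field names are illustrative rather than the exact schema, and the iD, title, dates, and DOIs are placeholders, not real items.

```python
# Illustrative sketch of a rich, open post-publication review citation.
# All identifiers below are hypothetical placeholders, not real items.
rich_citation = {
    "reviewer_orcid": "0000-0002-1825-0097",        # ORCID's example iD
    "publication": "F1000Research",                 # where the review appeared
    "reviewed_item": {
        "title": "Example article title",           # placeholder title
        "doi": "10.12688/f1000research.0000.1",     # hypothetical paper DOI
    },
    "review_report": {
        "date": "2015-10-01",                       # hypothetical report date
        "doi": "10.5256/f1000research.0000.r0000",  # hypothetical report DOI
    },
}

# Unlike the single-blind case, an open review citation links the reviewer
# to both the paper reviewed and the published, citable referee report.
assert "reviewed_item" in rich_citation and "review_report" in rich_citation
```

Because the referee report has its own DOI, the review itself becomes a first-class, citable output rather than a bare acknowledgement.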
EXPECTED FUTURE OUTCOMES
These two case studies are late-stage prototypes and demonstrate functionality and user experience. As more researchers connect their ID during the review process, more citations will be enabled, supporting improved recognition for peer review in the AGU and F1000 communities. Educating researchers about opportunities for peer review citation and recognition, encouraging them to use their ID, or register for one if they have not already done so, and making the citation process as quick and seamless as possible will be critical to gaining widespread adoption of this functionality.
We expect several other organizations to implement ORCID's peer review functionality in the coming months, including the following: Aries Systems, Peerage of Science, the journal Politics and Religion, and Publons. While early interest by publishers and publishing vendors is encouraging, we hope that other types of organizations will also embrace the recommended data exchange standard and use it as they implement their own systems for recognition. This will be critical for ensuring recognition for the widest possible range of review activities.