Recognising the contribution of Nature Research journal referees

This guest blog comes from Ritu Dhand, VP Editorial, Nature Journals.

Nature’s trial of formally acknowledging the contribution of its peer reviewers, by naming them on published papers with the permission of both referee and author, has been expanded to eight more Nature Research journals. During the initial phase, 55% of referees opted in.

The Nature-branded journals publish over 8,000 primary research papers each year. Behind each paper is a talented team of reviewers who have helped our professional editors to assess the scientific claims being made. Peer review is the formal quality-assurance mechanism whereby research manuscripts are subjected to technical evaluation and assessment of impact, and is a cornerstone of quality, integrity and reproducibility in research. However, most reviewers receive little recognition for their efforts. Given how highly we value the contributions of our reviewers, we wanted to give them an option to be formally and publicly recognised for their role in the peer-review process.

We have also been exploring ways to introduce transparency to the peer-review process. For several years, researchers have argued that single-blind peer review, where the referees are unknown to the author, is sub-optimal. The lack of transparency means researchers must simply trust that referees and editors are acting with integrity and without bias. In a survey with responses from 1,230 Nature referees, 82% agreed that the traditional peer-review process is effective in ensuring that published work is of high quality. Yet 63% of respondents also agreed that publishers should experiment with alternative peer-review methods, and 51% agreed that peer review could be more transparent and that publishers should do more.

As a way of both acknowledging the work of reviewers and introducing transparency to the peer-review process, we launched our referee recognition trial at Nature in spring 2016. At the end of the peer-review process, authors and peer reviewers are given the option of having referee names formally acknowledged on the published paper. If the authors also agree, peer reviewers who give permission have their names included in the ‘reviewer information’ section of the paper, where we thank them for their contribution. On any given paper, some referees may choose to have their names listed while others remain anonymous.

Over the last three years, around 3,700 Nature referees across the natural sciences have chosen to be publicly recognised, and around 80% of Nature papers have at least one referee named. We have not seen any significant differences in behaviour between researchers in the life and physical sciences. 91% of Nature authors opted in to the trial, while among referees, 55% opted in (26% opted out and 19% did not respond). When surveyed, 80% of referees who had participated in the Nature referee recognition trial said they would be happy to be named again. The Nature Reviews journals also rolled out the referee recognition trial one year ago and saw 57% of reviewers opting in to be publicly named. More recently, we have rolled out the trial at eight Nature Research journals: Nature Astronomy, Nature Climate Change, Nature Nanotechnology, Nature Neuroscience, Nature Physics, Nature Plants, Nature Protocols and Nature Communications.

We analysed the gender and career stage of authors and referees who took part in the trial over a nine-month period (where these data were available in our peer review system or could be cross-checked with public sources). The percentages of female and male corresponding authors opting into the trial were similar: 90% and 93%, respectively. The proportion of female and male referees who agreed to be named was also similar: around 50% and 56%, respectively. A similar proportion of referees from early, middle and late career stages were happy to be named: 54% of researchers/post-docs, 50% of assistant/associate professors, and 55% of professors opted in to the trial.

We also surveyed all reviewers who had reviewed for our journals in the course of one year to better understand their motivations for participating in the peer-review process and their views on peer review more broadly. Altruism was a key driver of participation in peer review. 87% of researchers who responded said they considered it their academic duty to peer review and 77% said that participating would help to safeguard the quality of published research. Conversely, only 6% of reviewers noted that participating in peer review enhanced their CV and 7% said it encouraged favourable views from editors. Unsurprisingly, most reviewers (94%) said that the subject area is a key factor when deciding which manuscripts to review.

Despite the time and effort peer review requires, 71% of respondents did not expect acknowledgement for peer review, and 58% thought that rewards might compromise the review process. However, when asked from whom they would most value recognition, 44% said publishers or editors.

We also asked reviewers what they thought the impacts of public referee recognition might be. 78% felt that naming the reviewers would result in better written reports; 68% thought it would have a positive impact on transparency; 47% thought it would have a positive effect on honesty of reporting; and 52% of those who had not been formally acknowledged by a Nature-branded journal indicated that they would consider being named if given the option.

About a quarter of researchers opted out of the trial and appeared to be against the principle of referees being named on published papers. Their concerns focused mainly on the possibility that the practice could increase the chances of the system being gamed by individuals, perhaps starting a ‘you owe me’ mechanism, or that referee reports could be toned down, either to avoid upsetting authors or from fear of retaliation by disgruntled ones, particularly those in senior positions. Many of these researchers believe that peer review should always be confidential and are against this level of transparency. For these reasons, referee recognition remains optional at the Nature-branded journals.

That so many of our referees are choosing to be named reflects changing attitudes towards peer review. We are happy that we can publicly acknowledge so many of them for their contribution to peer review. We continue to listen to the community and acknowledge the call for further consideration of other approaches to peer review. Nature Communications has been publishing referee reports for over three years, and we are discussing whether offering this as an option at other Nature Research journals is something we can practically consider in the future. Watch this space for further information!


A related Nature editorial is also available here: Three-year trial shows support for recognizing peer reviewers

Towards minimal reporting standards for life scientists

This guest blog comes from a group of journal editors and experts in reproducibility and transparent reporting, who are putting together a framework for minimal reporting standards in the life sciences.

Transparency in reporting benefits scientific communication on many levels. While specific needs and expectations vary across fields, the effective use of research findings relies on the availability of core information about research materials, data, and analysis. These are the underlying principles that led to the design of the TOP guidelines, which outline a framework that over 1,000  journals and publishers have elected to follow.

In September 2017, the second major TOP guidelines workshop hosted by the Center for Open Science led to a position paper suggesting a standardized approach for reporting, provisionally entitled the TOP Statement.

Based on discussions at that meeting and at the 2017 Peer Review Congress, in December 2017 we convened a working group of journal editors and experts to support this overall effort by developing a minimal set of reporting standards for research in the life sciences. This framework could both inform the TOP statement and serve in other contexts where better reporting can improve reproducibility.

In this “minimal standards” working group, we aim to draw on the collective experience of journals implementing a range of approaches designed to enhance reporting and reproducibility (e.g. STAR Methods), on existing life-science checklists (e.g. the Nature Research reporting summary), and on the results of recent meta-research studying the efficacy of such interventions (e.g. Macleod et al. 2017; Han et al. 2017), to devise a set of minimal expectations that journals could agree to ask their authors to meet.

An advantage of aligning on minimal standards is consistency in policies and expectations across journals, which is beneficial for authors as they prepare papers for publication and for reviewers as they assess them. We also hope that other major stakeholders engaged in the research cycle, including institutional review bodies and funders, will see the value of agreeing on this type of reporting standard as a minimal expectation, as broad-based endorsement from an early stage in the research life cycle would provide important support for overall adoption and implementation.

The working group will provide three key deliverables:

  • A “minimal standards” framework setting out minimal expectations across four core areas of materials (including data and code), design, analysis and reporting (MDAR)
  • A “minimal standards” checklist intended to operationalize the framework by serving as an implementation tool to aid authors in complying with journal policies, and editors and reviewers in assessing reporting and compliance with policies
  • An “elaboration” document or user guide providing context for the “minimal standards” framework and checklist

While all three outputs are intended to provide tools to help journals, researchers and other stakeholders with adoption of the minimal standards framework, we do not intend to be prescriptive about the precise mechanism of implementation and we anticipate that in many cases they will be used as a yardstick within the context of an existing reporting system. Nevertheless, we hope these tools will provide a consolidated view to help raise reporting standards across the life sciences.

We anticipate completing draft versions of these tools by spring 2019.  We also hope to work with a wider group of journals, as well as funders, institutions, and researchers to gather feedback and seek consensus towards defining and applying these minimal standards.  As part of this feedback stage, we will conduct a “community pilot” involving interested journals to test application of the tools we provide within the context of their procedures and community. Editors or publishers who are interested in participating are encouraged to contact Veronique Kiermer and Sowmya Swaminathan for more information.

In the current working group, we have focused our efforts on life science papers because of extensive previous activity in this field in devising reporting standards for research and publication.  However, once the life science guidelines are in place we hope that we and others will be able to extend this effort to other areas of science and devise similar tools for other fields.  Ultimately, we believe that a shared understanding of expectations and clear information about experimental and analytical procedures have the potential to benefit many different areas of research as we all work towards greater transparency and the support that it provides for the progress of science.

We are posting this notification across multiple venues to maximize communication and outreach, and to give as many people as possible an opportunity to influence our thinking. We welcome comments and suggestions within the context of any of these posts or in other venues. If you have additional questions about our work, would like to be informed of progress, or would like to volunteer to provide input, please contact Veronique Kiermer and Sowmya Swaminathan.

On behalf of the “minimal standards” working group:
Karen Chambers (Wiley)
Andy Collings (eLife)
Chris Graf (Wiley)
Veronique Kiermer (Public Library of Science; vkiermer@plos.org)
David Mellor (Center for Open Science)
Malcolm Macleod (University of Edinburgh)
Sowmya Swaminathan (Nature Research/Springer Nature; s.swaminathan@us.nature.com)
Deborah Sweet (Cell Press/Elsevier)
Valda Vinson (Science/AAAS)

Nature Research journals improve accessibility of data availability statements

The Nature Research journals have taken further steps to promote transparency and reproducibility by making information on the availability of research data within our articles easier to access.      

This guest blog comes from Iain Hrynaszkiewicz, Head of Data Publishing, Open Research Group at Springer Nature, and Sowmya Swaminathan, Head of Editorial Policy and Research Integrity at Nature Research.

All research articles published in Nature Research titles now provide data availability statements as a distinct article section that is freely and universally accessible. This means that data availability statements are now as accessible as abstracts, full reference lists, supplementary information, acknowledgements and other key article information. See two examples from Nature here (pictured) and here.

Since 2016, we have required all primary research papers published in Nature Research journals to include a data availability statement. The aim of this policy was to make the conditions of access to the “minimal dataset” (defined as the dataset necessary to interpret, validate and extend the findings) transparent to all readers. Data availability statements have become a widely established mechanism for authors to describe consistently if and how the research data supporting their publications are available. Such statements are required by many other Springer Nature journals in addition to the Nature Research journals, including the BMC group of journals, as well as by journals from other publishers. They are also increasingly used by funding agencies, institutions and researchers as a means to measure data-sharing practices and behaviours, and to build better connections between data and the literature. Some funding agencies, such as the UK’s Engineering and Physical Sciences Research Council, also require data access statements for policy compliance.

We believe that enhancing discoverability of data availability statements, by providing them as a separate section, could also:

  • Increase the accessibility and reuse of the data supporting publications, by making them easier to find, for both humans and machines
  • Encourage citation and reuse of data, including data that are not publicly available
  • Promote good practices and common standards in preparing data availability statements
  • Enable funding agencies, institutions and other stakeholders to better monitor data sharing and compliance with data-sharing policies
  • Enable more precise research into data-sharing behaviours and practices

Our change in the way we present data availability statements to readers underscores our commitment to facilitating data access and the importance of data as a crucial component underlying the integrity, re-use and extension of published research. Our guide to authors and our specific guidance on data availability and data citations have been updated to reflect these changes.

Peer Review Week 2018: Creating equal opportunities for peer reviewers through training

This guest blog comes from James Houghton, Associate Publishing Manager, Nature Masterclasses.

This year’s Peer Review Week, a global event celebrating the essential role of peer review in maintaining scientific quality, kicked off on 10 September. Diversity and inclusion in peer review is this year’s theme. Peer review is an essential part of the scientific publishing process. It ensures a certain level of scientific rigour and accuracy in published work by giving authors critical feedback to improve their papers. It is an activity academics must find time for among all the other demands they juggle. The burden of peer review is carried unevenly, with some researchers doing more than their fair share and others not being offered (or not taking) the chance to participate.

Women are underrepresented in STEM fields, yet the proportion of women contributing as peer reviewers is smaller still than their representation in science overall. Female researchers are also less likely than their male counterparts to accept invitations to review. Early-career researchers, regardless of gender, can also be subject to seniority bias, which reduces their opportunities to contribute to peer review. These biases threaten the supply of reviewers needed to cope with the ever-increasing volume of scientific output. In addition, by being unwilling (owing to a lack of confidence, for instance) or unable to peer review, researchers can miss out on a valuable experience: one that could help them improve their writing skills, provide insights into emerging research topics or the latest advances in a field, and raise their profile as researchers.

The biases that lead to underrepresentation of certain groups in the peer-review process are often unconscious. One way to broaden participation is therefore to encourage editors to be mindful of underrepresented groups when selecting referees, and authors to do the same when recommending peer reviewers to assess their papers. Another approach is to improve access to training. Some researchers might have the chance to help their supervisors produce peer-review reports early in their careers, but this option is not available to every researcher. Formal training in how to produce a useful referee report can improve researchers’ confidence to participate in the review process, raise the quality of their reports, and help widen representation. However, such training is rare, and when given the opportunity to review, researchers without training are more likely to return reports that do not meet editors’ expectations, which can decrease the likelihood of their being asked to review again in the future.


At Nature Masterclasses we have developed a freely available online course that provides an overview of the peer review process and offers practical tips for how to be a great reviewer through video interviews, informative posts and interactive exercises. In our course, Nature Research editors and renowned scientists explain the importance of peer review and share their insights and experience on what editors expect from a good peer review report. They also advise on how to review an article and how to write and structure an excellent report.  The course also covers the ethics of peer review and new variations and innovations to improve the process. We hope that this resource will help level the playing field and ensure equal opportunities for researchers to peer review, independent of their gender, seniority, access to resources and geographical location.

Better training for peer reviewers will improve researchers’ confidence in their ability to provide an informative assessment and empower them to say “yes” when invited to review. High-quality reviews from trained researchers also benefit the academic community as a whole, delivering better scrutiny of submitted papers, informing editors in their decision-making process and helping authors to improve their publications.

If you’re interested in finding out more about peer review, you can visit the Springer Nature peer reviewer resource page or the Peer Review Week 2018 events page.
