A dedicated home for computational science

This guest blog comes from Fernando Chirigati, Chief Editor of Nature Computational Science.

 

Recent advances in computer technology, be it in hardware or software, have revolutionized the way researchers do science: problems that are too complex for human or analytical solutions are now easier to address, and problems that would once take years to solve can now be unraveled in days, hours, or even seconds. The development and use of advanced computing capabilities to analyse and solve scientific problems, also known as computational science, has undoubtedly played a key role in transformational scientific breakthroughs over the past century, making progress possible in many different disciplines.

Providing a multidisciplinary forum for computational science

Given the growing interest in and throughput of computational science research, Nature Research is launching Nature Computational Science, a new journal that will provide a home for exceptional work being done in this field. The primary focus of the journal will be on publishing research related to the development and use of computational techniques and mathematical models, as well as on their application to address complex problems across a range of scientific disciplines, including, but not limited to, bioinformatics, cheminformatics, geoinformatics, computational physics and cosmology, materials science, and urban science.

The multidisciplinary nature of computational science will be one of the journal’s strengths, creating an opportunity to bring together researchers from different scientific disciplines who can learn from each other. Nature Computational Science will offer a high-quality, curated forum for stimulating such a multidisciplinary environment, where all of the computational work across these different fields can be found in a single, dedicated place, thus fostering discussion and collaboration. 

We welcome submissions on both fundamental and applied research, from groundbreaking algorithms and frameworks that notably help to advance scientific research, to methodologies that use computing capabilities in novel ways to find new insights and solve challenging real-world problems. We also encourage researchers and developers to submit software systems that have a significant impact on science and that help scientists to experience ‘aha’ moments.

Like the other Nature titles, the journal has a dedicated team of in-house manuscript editors, production editors, and editorial assistants. Our editors will engage with research communities in academia and industry to provide exceptional author and reviewer service, and the journal will uphold high-quality editorial and publishing policies.

By commissioning Comments, Reviews, Perspectives, and News & Views articles, Nature Computational Science will provide accurate, up-to-date background on a variety of relevant topics, as well as encourage discussion on timely matters. Given our multidisciplinary community, we will highlight content that is of interest to a broad audience.

Nature Computational Science will be published online-only. For submissions from January 2021, the journal will be a Transformative Journal, meaning that Plan S-funded authors, as well as others wishing to publish open access, can submit their primary research safe in the knowledge that they will be complying with their funders’ requirements.

Fostering reproducible research

Many of the problems that computational science tackles today affect millions of people, which makes it essential to ensure that complex computational analyses lead to conclusions that are trustworthy and actionable. Nature Computational Science will champion the reproducibility of scientific outcomes, ensuring that articles meet the highest standards of reproducibility and transparency in reporting.

Our editors will work with authors and reviewers to pay close attention to the peer review and sharing of data and code. Code will be peer reviewed when it is central to the manuscript. As at other Nature journals, we will offer authors the option to use container-based platforms to facilitate code peer review and publication (see details here). Nature-branded journals strongly encourage researchers to use open repositories to share their code and data upon publication and to choose a license approved by the Open Source Initiative.

Recognizing the important role of preprint posting in the process of open scientific discourse, Nature Computational Science will offer authors the option to add a link from the published paper to its corresponding preprint, thus ensuring that the links are visible to all readers. Authors will be able to use established preprint servers such as arXiv and bioRxiv.

Launching in January 2021

Nature Computational Science will launch in January 2021, and it is now open for submissions. Find out more on our website. Any questions can be directed to computationalscience@nature.com.

Journals test the Materials Design Analysis Reporting (MDAR) checklist

Reproduced from the summary presentation, available at https://osf.io/znq64.

This guest blog comes from Sowmya Swaminathan, Head of Editorial Policy and Research Integrity at Nature Research.

We are pleased to share results from a pilot with 13 journals that tested the Materials Design Analysis Reporting (MDAR) checklist, a minimum standards reporting checklist for the life sciences.

The MDAR framework was designed to provide a harmonizing principle for the reporting requirements currently in use at various journals. It is meant to be flexible enough to adapt to different journal policies, and it provides two levels of reporting stringency: a minimum recommendation and a best practice recommendation. The checklist was designed as an optional instrument to help adoption of the reporting framework. A statement of task can be found here.
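To make the two-level design concrete, here is a minimal sketch of how a checklist item carrying both stringency levels might be represented in code; the schema, field names and example item are our own hypothetical illustration, not the actual MDAR format.

```python
from dataclasses import dataclass

@dataclass
class ReportingItem:
    """One reporting requirement with two stringency levels (hypothetical schema)."""
    topic: str           # e.g. "Materials", "Design", "Analysis" or "Reporting"
    minimum: str         # the floor that every compliant manuscript should meet
    best_practice: str   # the stronger recommendation a journal may choose to require

# Invented example item; consult the MDAR documents for the real requirements.
item = ReportingItem(
    topic="Materials",
    minimum="State the source and identifier of each cell line used.",
    best_practice="Also report authentication and contamination testing.",
)

def expectation(item: ReportingItem, strict: bool) -> str:
    # A journal can adopt either level without changing the framework itself.
    return item.best_practice if strict else item.minimum
```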

We are very grateful to the 13 participating journals and platforms – BMC Microbiology, Ecology & Evolution, eLife, EMBO journals, Epigenetics, F1000R, Molecular Cancer Therapeutics, Microbiology Open, PeerJ, PLOS Biology, PNAS, Science, Scientific Reports – for testing the MDAR checklist.

The pilot had two main goals: first, to understand whether the checklist was accessible and useful for authors and editors in complying with journal policy; and second, to understand whether the elements within the checklist are conveyed clearly enough to help fulfil policy expectations. In total, 211 authors across participating journals tested the checklist and provided their feedback. Participating journal teams screened 289 manuscripts using the checklist, and 89 of these manuscripts were subject to dual assessment by independent reviewers, which allowed us to determine inter-assessor agreement, and thus the clarity of specific items on the checklist.
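The summary does not specify which agreement statistic was used; as an illustration only, the sketch below computes simple percent agreement and Cohen’s kappa, one common chance-corrected measure, for the kind of paired yes/no item assessments a dual review might produce. The data are invented.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' paired labels, e.g. per-item compliance flags."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: assume each rater labels independently at their own rates.
    freq_a, freq_b = Counter(a), Counter(b)
    expected = sum(freq_a[k] * freq_b[k] for k in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented dual assessments of one checklist item across ten manuscripts
# (1 = item judged adequately reported, 0 = not).
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
agreement = sum(x == y for x, y in zip(rater1, rater2)) / len(rater1)
print(f"percent agreement: {agreement:.0%}")                 # 80%
print(f"Cohen's kappa: {cohens_kappa(rater1, rater2):.2f}")  # 0.52
```

High percent agreement with low kappa on a given item would suggest that raters agree mostly by chance, flagging that item’s wording as a candidate for clarification.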

We are encouraged to find that about 80% of authors and editors found the checklist useful to varying degrees, and that the majority of participating editors reported only a small increase in manuscript processing time as a result of using the checklist. Participating authors and editors did not identify major gaps in the requirements covered by the checklist; however, their feedback and the inter-assessor agreement results have given us a better understanding of areas in the checklist and elaboration document where the language is unclear and needs to be improved.

We are making the draft MDAR Framework, MDAR Checklist and MDAR Elaboration document and the pilot datasets available here. This work was also recently presented at the NASEM workshop on Enhancing Scientific Reproducibility through Transparent Reporting (slides available here).

We are currently gathering feedback on the MDAR framework, checklist and elaboration document from a broad group of about 40-50 experts on transparency and reproducibility. Based on the feedback from the pilot and the expert input, we anticipate revising all three MDAR outputs by the end of 2019.

We are sharing this update on the work of the Minimum Standards Working Group through coordinated posts on member platforms. If you would like more information about our work and progress, please contact Veronique Kiermer and Sowmya Swaminathan.

On behalf of the “minimal standards” working group:
Karen Chambers (Wiley)
Andy Collings (eLife)
Chris Graf (Wiley)
Veronique Kiermer (Public Library of Science; vkiermer@plos.org)
David Mellor (Center for Open Science)
Malcolm Macleod (University of Edinburgh)
Sowmya Swaminathan (Nature Research/Springer Nature; s.swaminathan@us.nature.com)
Deborah Sweet (Cell Press/Elsevier)
Valda Vinson (Science/AAAS)

What we have learnt testing container-platforms for peer review and publication of code

A year ago now, we launched a trial to test the use of cloud-container platforms for peer review and publication of code at several Nature journals. The trial phase of this initiative is now officially over, and we would like to share the experience and outcomes, and provide an overview of what comes next.

What problem are we trying to solve?

Our guiding principle is that when new code is central to the main claims made in a paper, it is imperative that the code meets the same quality and reproducibility standards as the paper itself. This means the code needs to be properly documented, evaluated by experts to confirm that it is functional (i.e. peer reviewed), and permanently identified and accessible at the time of publication to ensure the reproducibility of the results. (The same principles apply to other research objects, such as data and protocols, but those were not the focus of this particular trial.)

Nature Methods adopted the practice of peer reviewing code for software papers in 2007 (editorial). Under this practice, editors require authors to submit the source code, a test dataset and installation details, and ask the reviewers to install and test-run the code during peer review. This form of peer review is highly time-consuming for authors, editors and reviewers, but it is also necessary: it is not uncommon for reviewers to point to basic flaws in the instructions or files that would render the code completely unusable.

Over a year ago, we partnered with Code Ocean, a Docker-based platform that allows authors to deposit code and data and enables users to run the code in the cloud, either with the set parameters to reproduce the results or with new input values. Together we developed a set of workflows and basic platform functionality that enables authors to upload the code and data associated with their submission, and reviewers to access the platform anonymously during peer review (see figure, reproduced from reference 1).
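To give a feel for what “running the code with the set parameters, or with new input values” means in practice for a deposited capsule, here is a hedged sketch of what a submission’s entry point might look like; the script, parameter names and file paths are hypothetical and are not part of the Code Ocean API.

```python
# reproduce.py -- hypothetical entry point for a deposited code-and-data capsule.
# Running it with no arguments re-creates the published result from the deposited
# dataset; readers can instead point it at new input values to repurpose the analysis.
import argparse
import csv
import statistics

def analyse(path: str) -> float:
    """Toy analysis standing in for whatever computation the paper reports."""
    with open(path) as f:
        values = [float(row["value"]) for row in csv.DictReader(f)]
    return statistics.mean(values)

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Reproduce the paper's summary statistic.")
    # The default points at the deposited data, so the published number is the default output.
    parser.add_argument("--data", default="data/measurements.csv")
    args = parser.parse_args()
    print(f"mean value: {analyse(args.data):.3f}")
```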

The trial was meant to evaluate whether such a platform would provide:

  • A service to authors by assisting them in depositing their code and data and packaging them on an open, executable platform.
  • A service to reviewers by making code peer review easier (as easy as clicking a button). Reviewers can evaluate the code in the cloud using computing time that we provide as a publisher, not their own.
  • A service to readers by providing the code associated with the paper in a way that is properly identified, documented and supplied on a publicly accessible platform that allows running, reusing and repurposing the code.

The trial was optional for authors at the three participating journals (Nature Methods, Nature Biotechnology and Nature Machine Intelligence) and we tracked feedback from authors and reviewers, author opt in rates and user-engagement metrics.

What have we learnt? Results!

Over 95 papers have now participated in the trial and more than 20 published papers are providing open, verified, properly documented and cited code using the technology.

Despite the additional work that authors need to do upfront when they sign up to the trial, we have seen strong author uptake, with 54% of authors across all journals opting in to participate. Importantly, our reviewers actively engage with the platform. Capsules have received an average of 34 views via the private links provided to the reviewers. Approximately half of the reviewers signed up and duplicated the capsules, a requirement for running the code, and each reviewer who signed up ran the code 1.3 times on average. Notably, peer review of code in this manner has surfaced problems with some manuscripts that would have led to irreproducibility of the code and the results.

Upon publication, we provide the links to access the code and data in a ‘Code Availability’ statement of the paper, which is provided openly to all readers regardless of access status.

We are looking at ways to improve the workflows and overall experience by providing better information, an easier workflow for editors and authors, and better ways of surfacing code that is shared openly and peer reviewed, for example through the use of badges.

We have been very pleased to see that the high standards we are applying to ensure open science and the reproducibility of code in our papers have been noted, and we have received very positive feedback about the initiative from authors, reviewers and the wider scientific community. You can read more about the initiative and the results in the editorials below and in the Science Editor piece that we recently published.

Science Editor: Three approaches to support reproducible research

Nature Biotechnology: ‘Changing coding culture’

Nature Machine Intelligence: ‘Sharing high expectations’

Nature Methods: ‘Easing the burden of code review’

What’s next?

Given the positive effects we have seen so far, we will continue the current practice at these journals. We also want to learn how the workflow scales and to test it in more scientific disciplines, so we have added Nature, Nature Protocols and BMC Bioinformatics to the trial.

A huge thanks to our authors, editors and reviewers who have engaged with us in this journey; we couldn’t have done it without you! We hope that this initiative, alongside others that promote data and protocol sharing, will help our articles live up to the promise of more open and reproducible science.

References

1. Pastrana, E., Kousta, S. & Swaminathan, S. Three Approaches to Support Reproducible Research. Science Editor

This guest blog comes from Erika Pastrana, Editorial Director for the Nature Research Journals, and Sowmya Swaminathan, Head of Editorial Policy and Research Integrity at Nature Research.

Peer Review Week 2019: Improving peer review quality through transparent, reproducible research

This guest blog comes from Sowmya Swaminathan, Head of Editorial Policy and Research Integrity for Nature Research

By the time a research study reaches the peer review process, many crucial decisions that affect the rigour of the study design, methodology, data collection, analysis and reporting have already been made. Nevertheless, by developing and implementing editorial policies, and by providing a publishing infrastructure that supports the publication of transparent, reproducible research, editors, journals and publishers can help improve the published paper, adding value and quality to the peer review and publication process.

Broadly speaking, four pillars – policy, publishing infrastructure, advocacy and awareness, and collective action – have driven editorial and publishing innovation and furthered our mission to work in partnership with the research community to advance quality and integrity.  In this blog post I provide an overview and examples of the many initiatives undertaken at Nature Research to support publication of reproducible research.

Policy

Transparency is at the heart of our policies designed to improve the reproducibility of published research. We ask authors to report information about their experimental design, as well as to clearly identify and make their datasets, code and materials available, also making it easier for reviewers to access the information they need to assess the study appropriately. We strongly support open research practices such as sharing the underlying building blocks of the research article – data, code and protocols – through repositories.

We have found that policies centred on transparency have had an impact.  For example, independent studies have found that the Nature Research Life Science Reporting Summary, an instrument to support transparent reporting in life science articles, which we introduced in 2013, has improved reporting of statistics and other aspects of experimental design and analysis [1,2].

We recognize that what works for reporting in the life sciences is often not applicable to other disciplines. While we advocate a minimum threshold of transparency across the core aspects of data, code and materials, we have also worked with experts to tailor approaches designed to meet field-specific needs, for example in photovoltaics and photonics research.

Data availability is another area where implementing a policy focused on transparency has had clear benefits. Since 2016, when we introduced a mandatory data availability statement on all research articles published in Nature-branded journals, we have seen a rise in data sharing through public repositories across our journals, especially in the life sciences, and increased appreciation of the value of data sharing to underscore the integrity and credibility of published work in many disciplines.

Publishing Infrastructure

Designing an innovative peer review and publishing infrastructure that supports all aspects of publishing reproducible research is central to our overall vision for an open and transparent ecosystem. A robust technology infrastructure is also essential to drive large-scale adoption of best practice approaches by authors, reviewers and editors. Over the years, we have introduced a number of publishing innovations that have furthered our commitment to reproducibility. These include avenues for publishing data and protocols such as Scientific Data and Protocol Exchange, and new article formats like Data Descriptors and Registered Reports that focus on data and methodological rigour respectively, rather than the specific results.

More recently, three Nature Research journals have tested executable platforms for peer review and publication of code. Although the policy and practice of peer reviewing code has been in place at these journals for many years, powering the process through an executable platform sets the stage for a more seamless and scalable experience for authors, reviewers and editors.

Advocacy and awareness

Advocacy and awareness-raising in the broader research and publishing community are other important areas of engagement for us in advancing our commitment to integrity in research. In the pages of Nature and the Nature-branded journals, we have often highlighted and debated the many different, complex issues, challenges and solutions on the path to transparent, reproducible research including discipline-specific needs and barriers to reproducible research (for example, see recent discussions about reproducibility in nano-medicine and data and code sharing in physics).

Collective action

Shifting entrenched patterns of how research is conducted and published requires stakeholders across the research and publishing community to work collectively in the push for better practice. Nature Research journals are proud to have participated in and supported many such efforts to accelerate data sharing, advance best practice toward open and transparent research and align on minimum reporting standards.

We believe that our editors and journals have an important role to play in tackling the many issues that affect the quality and integrity of published research. Indeed, we feel privileged to be able to engage with a global and multidisciplinary research community and are committed to furthering the cause of transparent, reliable research with all the tools at our disposal.

Join the discussions during Peer Review Week: #QualityinPeerReview, #PeerRevWk2019 #PeerReviewWeek

References:

  1. The NPQIP Collaborative group, Did a change in Nature journals’ editorial policy for life sciences research improve reporting? BMJ Open Science 2019;3:e000035. doi: 10.1136/bmjos-2017-000035
  2. Han S, et al. (2017) A checklist is associated with increased quality of reporting preclinical biomedical research: A systematic review. PLoS ONE 12(9): e0183591. https://doi.org/10.1371/journal.pone.0183591

Recognising the contribution of Nature Research journal referees

This guest blog comes from Ritu Dhand, VP Editorial, Nature Journals.

Nature’s trial to formally acknowledge the contribution of its peer reviewers, by naming them on published papers with the permission of referee and author, has been expanded to eight more Nature Research journals. 55% of referees opted in during the initial phase.

The Nature-branded journals publish over 8,000 primary research papers each year. Behind each paper is a talented team of reviewers who have helped our professional editors to assess the scientific claims being made. Peer review is the formal quality-assurance mechanism whereby research manuscripts are subjected to technical evaluation and assessment of impact, and is a cornerstone of quality, integrity and reproducibility in research. However, most reviewers receive little recognition for their efforts. Given how highly we value the contributions of our reviewers, we wanted to give them an option to be formally and publicly recognised for their role in the peer-review process.

We have also been exploring ways to introduce transparency to the peer review process. For several years, researchers have argued that single-blind peer review, in which the referees are unknown to the author, is sub-optimal: the lack of transparency means researchers must simply trust that referees and editors are acting with integrity and without bias. In a survey with responses from 1,230 Nature referees, 82% agreed that the traditional peer review process is effective in ensuring that published work is of high quality. Yet 63% of respondents also agreed that publishers should experiment with alternative peer review methods, and 51% agreed that peer review could be more transparent and said they expect publishers to do more.

As a way of both acknowledging the work of reviewers and introducing transparency to the peer-review process, we launched our referee recognition trial at Nature in spring 2016. At the end of the peer-review process, authors and peer reviewers are given the option of having referee names formally acknowledged on the published paper. If the authors agree, peer reviewers who give permission have their names included in the ‘reviewer information’ section of the paper, where we thank them for their contribution. On a given paper, some referees may choose to have their names listed while others remain anonymous.

Over the last three years, around 3,700 Nature referees across the natural sciences have chosen to be publicly recognised, and around 80% of Nature papers have at least one referee named. We have not seen any significant differences in behaviour between researchers in the life and physical sciences. 91% of Nature authors opted in to the trial, while among referees, 55% opted in (26% opted out and 19% did not respond). When surveyed, 80% of referees who had participated in the Nature referee recognition trial said they would be happy to be named again. The Nature Reviews journals also rolled out the referee recognition trial one year ago and saw 57% of reviewers opting in to be publicly named. More recently, we have rolled out the trial at eight Nature Research journals: Nature Astronomy, Nature Climate Change, Nature Nanotechnology, Nature Neuroscience, Nature Physics, Nature Plants, Nature Protocols and Nature Communications.

We analysed the gender and career stage of authors and referees who took part in the trial over a nine-month period (where these data were available in our peer review system or could be cross-checked with public sources). The percentages of female and male corresponding authors opting into the trial were similar: 90% and 93%, respectively. The proportion of female and male referees who agreed to be named was also similar: around 50% and 56%, respectively. A similar proportion of referees from early, middle and late career stages were happy to be named: 54% of researchers/post-docs, 50% of assistant/associate professors, and 55% of professors opted in to the trial.

We also surveyed all reviewers who had reviewed for our journals in the course of one year to better understand their motivations for participating in the peer-review process and their views on peer review more broadly. Altruism was a key driver of participation in peer review. 87% of researchers who responded said they considered it their academic duty to peer review and 77% said that participating would help to safeguard the quality of published research. Conversely, only 6% of reviewers noted that participating in peer review enhanced their CV and 7% said it encouraged favourable views from editors. Unsurprisingly, most reviewers (94%) said that the subject area is a key factor when deciding which manuscripts to review.

Despite the time and effort peer review requires, 71% of respondents did not expect acknowledgement for peer review, and 58% thought that rewards might compromise the review process. However, when asked from whom they would most value recognition, 44% said publishers or editors.

We also asked reviewers what they thought the impacts of public referee recognition might be. 78% felt that naming the reviewers would result in better-written reports; 68% thought it would have a positive impact on transparency; 47% thought it would have a positive effect on honesty of reporting; and 52% of those who had not been formally acknowledged by a Nature-branded journal indicated that they would consider being named if given the option.

About a quarter of researchers opted out of the trial and appeared to be against the principle of referees being named on published papers. Their concerns mainly focused on the possibility that the practice could increase the chances of the system being gamed by individuals, perhaps starting a ‘you owe me’ dynamic, or of referee reports being toned down, either to avoid upsetting authors or out of fear of retaliation from disgruntled ones, particularly those in senior positions. Many of these researchers believe that peer review should always be confidential and are against this level of transparency. For these reasons, referee recognition remains optional on the Nature-branded journals.

That so many of our referees are choosing to be named reflects changing attitudes towards peer review, and we are happy to be able to publicly acknowledge the contribution of so many of them. We continue to listen to the community and acknowledge the call for further consideration of other ways to do peer review. Nature Communications has been publishing referee reports for over three years, and we are discussing whether offering this as an option on other Nature Research journals is something we can practically consider in the future. Watch this space for further information!

 

A related Nature editorial is also available here: Three-year trial shows support for recognizing peer reviewers

The Great Pacific Garbage Patch, beer supplies and more: the most popular science stories of 2018

Twenty-two of 2018’s papers in the Altmetric Top 100 were published in Nature Research journals: Nature, Nature Communications, Nature Plants, Nature Biotechnology, Nature Climate Change, Nature Human Behaviour, npj Science of Learning and Scientific Reports.

Launched today, the annual Altmetric Top 100 showcases the research published this year that has caught the public eye through international online attention. By tracking what people are saying about scholarly articles in the news, in blogs, on social media networks, on Wikipedia and in many other sources, Altmetric calculates an Attention Score for each paper.

In this blog, the team in the Nature Research Press Office has picked some of their favourite studies, summarised their findings, and linked to coverage they received in the wider media. The full list is available at https://www.altmetric.com/top100/2018.

For articles from our subscription journals, we’ve included Springer Nature SharedIt links, which means anyone can read them. SharedIt, our free content-sharing initiative, was launched in October 2016.

#7 Scientific Reports — Evidence that the Great Pacific Garbage Patch is rapidly accumulating plastic

More than 79,000 tonnes of ocean plastic are floating inside the Great Pacific Garbage Patch, a figure up to 16 times higher than previously estimated, reported a study published in Scientific Reports earlier this year.

The study proved popular with the press, generating over 1,400 news stories. Outlets that covered the research included NPR, BBC News, National Geographic, The Hindu and Spiegel.

#9 Nature — Global warming transforms coral reef assemblages

Credit: ARC Centre of Excellence for Coral Reef Studies / Gergely Torda

A paper in Nature reported that corals on the Great Barrier Reef experienced a catastrophic die-off following the extended marine heatwave of 2016, transforming the ecological functioning of almost one-third of the 3,863 reefs that comprise the world’s largest reef system. The paper generated over 1,000 news stories, including articles in The New York Times, NPR, The Financial Times and Le Monde.

#19 Nature Plants — Beer supply threatened by future weather extremes

 

Beer’s main ingredient, barley, will have substantially diminished yields as severe droughts and heat extremes become more frequent owing to climate change, reported a paper published in Nature Plants in October. Beer will become scarcer and more expensive to varying degrees depending on national economic status and culture. In Ireland, for example, beer prices could increase by between 43% and 338% by 2099 under the most severe climate scenario.

The Guardian, the Associated Press, Reuters, NPR, and BuzzFeed were among those to report on the findings.

#50 Nature — The genome of the offspring of a Neanderthal mother and a Denisovan father

Credit: Bence Viola, Max Planck Institute for Evolutionary Anthropology

A study in Nature reported the genome sequence of an ancient hominin bone fragment from Denisova Cave, Russia. The results suggested that the adolescent individual had a Neanderthal mother and a Denisovan father, providing direct evidence of interbreeding between Neanderthals and Denisovans. The paper received coverage in nearly 2,000 news outlets, including 35 target outlets and nearly 250 web stories in China, and was covered by BBC News, El País, Science, People’s Daily and National Geographic.

#69 Nature Communications — Embryos and embryonic stem cells from the white rhinoceros

Assisted reproductive technologies have been used to create hybrid embryos of the endangered northern white rhinoceros and a closely related subspecies, according to a Nature Communications study in July. In vitro fertilization has been used before in large mammals such as horses, but this was the first report of rhinoceros embryos successfully developed to the blastocyst stage in cell culture, potentially ready for implantation. The findings raise the possibility of preserving some of the genes of the northern white rhinoceros.

Media coverage of the findings included The New York Times, Nature, ABC Australia, The Financial Times and Die Zeit.

#83 Nature Human Behaviour — Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015

Attempts to replicate 21 experimental social science studies published in Nature and Science between 2010 and 2015 found that one of the four Nature papers and seven of the seventeen Science papers evaluated did not replicate under the primary high-powered replication method used. The study, published in Nature Human Behaviour in August, suggested that the original studies likely contained false positives and inflated effect sizes.

Reporting on the study included articles in The Washington Post, Times Higher Education, The Atlantic and BuzzFeed.

Towards minimal reporting standards for life scientists

This guest blog comes from a group of journal editors and experts in reproducibility and transparent reporting, who are putting together a framework for minimal reporting standards in the life sciences.

Transparency in reporting benefits scientific communication on many levels. While specific needs and expectations vary across fields, the effective use of research findings relies on the availability of core information about research materials, data, and analysis. These are the underlying principles that led to the design of the TOP guidelines, which outline a framework that over 1,000 journals and publishers have elected to follow.

In September 2017, the second major TOP guidelines workshop hosted by the Center for Open Science led to a position paper suggesting a standardized approach for reporting, provisionally entitled the TOP Statement.

Based on discussions at that meeting and at the 2017 Peer Review Congress, in December 2017 we convened a working group of journal editors and experts to support this overall effort by developing a minimal set of reporting standards for research in the life sciences. This framework could both inform the TOP statement and serve in other contexts where better reporting can improve reproducibility.

In this “minimal standards” working group, we aim to draw on the collective experience of journals implementing a range of approaches designed to enhance reporting and reproducibility (e.g. STAR Methods), on existing life science checklists (e.g. the Nature Research reporting summary), and on recent meta-research studying the efficacy of such interventions (e.g. Macleod et al. 2017; Han et al. 2017) to devise a set of minimal expectations that journals could agree to ask their authors to meet.

An advantage of aligning on minimal standards is consistency in policies and expectations across journals, which is beneficial for authors as they prepare papers for publication and for reviewers as they assess them. We also hope that other major stakeholders engaged in the research cycle, including institutional review bodies and funders, will see the value of agreeing on this type of reporting standard as a minimal expectation, as broad-based endorsement from an early stage in the research life cycle would provide important support for overall adoption and implementation.

The working group will provide three key deliverables:

  • A “minimal standards” framework setting out minimal expectations across four core areas of materials (including data and code), design, analysis and reporting (MDAR)
  • A “minimal standards” checklist intended to operationalize the framework by serving as an implementation tool to aid authors in complying with journal policies, and editors and reviewers in assessing reporting and compliance with policies
  • An “elaboration” document or user guide providing context for the “minimal standards” framework and checklist

While all three outputs are intended to provide tools to help journals, researchers and other stakeholders with adoption of the minimal standards framework, we do not intend to be prescriptive about the precise mechanism of implementation and we anticipate that in many cases they will be used as a yardstick within the context of an existing reporting system. Nevertheless, we hope these tools will provide a consolidated view to help raise reporting standards across the life sciences.

We anticipate completing draft versions of these tools by spring 2019.  We also hope to work with a wider group of journals, as well as funders, institutions, and researchers to gather feedback and seek consensus towards defining and applying these minimal standards.  As part of this feedback stage, we will conduct a “community pilot” involving interested journals to test application of the tools we provide within the context of their procedures and community. Editors or publishers who are interested in participating are encouraged to contact Veronique Kiermer and Sowmya Swaminathan for more information.

In the current working group, we have focused our efforts on life science papers because of extensive previous activity in this field in devising reporting standards for research and publication.  However, once the life science guidelines are in place we hope that we and others will be able to extend this effort to other areas of science and devise similar tools for other fields.  Ultimately, we believe that a shared understanding of expectations and clear information about experimental and analytical procedures have the potential to benefit many different areas of research as we all work towards greater transparency and the support that it provides for the progress of science.

We are posting this notification across multiple venues to maximize communication and outreach, and to give as many people as possible an opportunity to influence our thinking. We welcome comments and suggestions within the context of any of these posts or in other venues. If you have additional questions about our work, would like to be informed of progress, or would like to volunteer to provide input, please contact Veronique Kiermer and Sowmya Swaminathan.

On behalf of the “minimal standards” working group:
Karen Chambers (Wiley)
Andy Collings (eLife)
Chris Graf (Wiley)
Veronique Kiermer (Public Library of Science; vkiermer@plos.org)
David Mellor (Center for Open Science)
Malcolm Macleod (University of Edinburgh)
Sowmya Swaminathan (Nature Research/Springer Nature; s.swaminathan@us.nature.com)
Deborah Sweet (Cell Press/Elsevier)
Valda Vinson (Science/AAAS)

Supporting early career researchers through travel grants

As part of our commitment to championing the cause of promising early career researchers, the Communications journals (Biology, Chemistry and Physics) are introducing a new series of travel grants.

This guest blog comes from Joe Bennett, Publisher, Communications journals.

Today the Communications journals have introduced travel grants for early career researchers. Our hope is that by introducing these grants we can reach promising but underfunded researchers who need support the most. This is the first round of what we expect will become a recurring process, and is part of a longstanding commitment by the journals to champion the cause of early career researchers.

The grants

Three grants, each of €2,500, will be made available. We have chosen this amount because it will allow a researcher without access to other funding to attend an international scientific meeting and present their work. We understand that early career researchers are best placed to decide where they would benefit most from presenting their work, so applicants are invited to tell us which meeting they require funding to attend. As the grants are designed to support researchers working within limited means, recipients will receive the grant funds in full immediately after the panel has made its choice.

We have chosen to introduce this first round of travel grants because we believe that our journals should do more than just publish great science; they should also play an active role within the communities they serve. We also know that travel to scientific conferences allows researchers to present their work, hear about the latest research and meet other scientists from around the world to discuss ideas and possible collaborations.

The grants are available across the breadth of the subject areas of biology, chemistry and physics. Although the grants match the subject areas covered by Communications Biology, Communications Chemistry and Communications Physics, there is no requirement for applicants to have published in, or reviewed for, the journals previously. Likewise, there is no obligation for grant recipients to publish their work in the journals.

Supporting early career researchers is vital

We have written before about the challenges facing early career researchers, including the fierce competition for funding. They make a positive contribution to our journals as authors, reviewers and Editorial Board Members, and many of our own in-house editorial staff were early career researchers before joining Nature Research. Early career researchers are part of the fabric of our journals, and we believe that their work should be supported and their achievements highlighted. This is why we are proud to introduce this first round of grants and strengthen our commitment to championing their work.

A fair assessment

We considered carefully how to make the assessment process as fair as possible, mindful of how unconscious bias can influence decision making. We have designed our process to account for this and will consider each application on its own terms, guided by a shared set of principles. We have tried to ensure that our selection panels include members with a broad and diverse range of experiences, considering factors including gender, geography and whether members were the first in their family to join academia when deciding the composition of the panels. Active scientists drawn from the Editorial Board of each journal will join our in-house editors on the judging panels.

To be considered for a grant, applicants must first demonstrate that they have a need for funding support. We will then consider the promise of the research within the application when we choose the recipients. All applicants will be judged against the same criteria:

  • Has the applicant demonstrated that without the grant they would not have the necessary funding available to enable travel to the event?
  • Does the applicant plan to present research that the assessment panel feel has outstanding potential and should be seen by the wider community?
  • How does the applicant stand to benefit from travelling to and attending the meeting?
  • Has the applicant been working within a scientifically emerging country or in difficult circumstances?

We admire researchers who conduct research with limited resources, or who have overcome systemic barriers or any number of other challenges in pursuit of great science. When assessing applications, we will not select grant recipients on the basis of an exceptional track record, but will instead look for applicants with outstanding promise who have been working in difficult circumstances. Not only will the grants benefit the researchers in question; empowering and including traditionally marginalised researchers benefits the wider community, as we get to meet them, hear their ideas and learn from their experiences.

Apply

The grants are now open for applications until 5th November 2018. To read more and apply please visit our website: www.nature.com/early-career-travel-grants

Nature Research journals improve accessibility of data availability statements

The Nature Research journals have taken further steps to promote transparency and reproducibility by making information on the availability of research data within our articles easier to access.      

This guest blog comes from Iain Hrynaszkiewicz, Head of Data Publishing, Open Research Group at Springer Nature, and Sowmya Swaminathan, Head of Editorial Policy and Research Integrity at Nature Research.

All research articles published in Nature Research titles now provide data availability statements as a distinct article section that is freely and universally accessible. This means that data availability statements are now as accessible as abstracts, full reference lists, supplementary information, acknowledgements and other key article information. See two examples from Nature here (pictured) and here.

Since 2016, we have required all primary research papers published in Nature Research journals to include a data availability statement. The aim of this policy was to make the conditions of access to the “minimal dataset” (defined as the dataset necessary to interpret, validate and extend the findings) transparent to all readers. Data availability statements have become a widely established mechanism for authors to describe consistently whether and how the research data supporting their publications are available. Such statements are required by many other Springer Nature journals in addition to the Nature Research journals, including the BMC group of journals, as well as by other publishers. They are also increasingly used by funding agencies, institutions and researchers as a means to measure data-sharing practices and behaviours, and to build better connections between data and the literature. Some funding agencies, such as the UK’s Engineering and Physical Sciences Research Council, also require that data access statements are provided for policy compliance.

We believe that enhancing discoverability of data availability statements, by providing them as a separate section, could also:

  • Increase the accessibility and reuse of the data supporting publications, by making the statements easier to find by both humans and machines (see the sketch after this list)
  • Encourage citation and reuse of data, including of data that are not publicly available
  • Promote good practices and common standards in preparing data availability statements
  • Enable funding agencies, institutions and other stakeholders to better monitor data sharing and compliance with data-sharing policies
  • Enable more precise research of data-sharing behaviours and practices
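As a toy illustration of the machine-findability point in the first bullet above: once the statement is a consistently labelled section, even a simple script can pull it out of an article’s plain text. The heading convention, parsing approach and sample text below are assumptions made for the sketch, not a description of our production systems.

```python
import re

def extract_data_availability(fulltext: str):
    """Pull the 'Data availability' section out of plain article text (illustrative only)."""
    # Assumes the statement is a labelled section that runs until the next short
    # heading line or the end of the text; real pipelines would use structured XML.
    match = re.search(
        r"Data availability\s*\n(.+?)(?=\n[A-Z][^\n]{0,60}\n|\Z)",
        fulltext,
        flags=re.DOTALL | re.IGNORECASE,
    )
    return match.group(1).strip() if match else None

# Invented article snippet for demonstration.
article = (
    "Results\n...\n"
    "Data availability\n"
    "All data are available in a public repository under a dummy accession.\n"
    "References\n1. ...\n"
)
print(extract_data_availability(article))
```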

Our change in the way we present data availability statements to readers underscores our commitment to facilitating data access and the importance of data as a crucial component underlying the integrity, re-use and extension of published research. Our guide to authors and our specific guidance on data availability and data citations have been updated to reflect these changes.

Nature Research journals trial new tools to enhance code peer review and publication

Starting this month, three Nature journals (Nature Methods, Nature Biotechnology and Nature Machine Intelligence) will run a trial in partnership with Code Ocean to enable authors to share fully functional, executable code accompanying their articles, and to facilitate peer review of that code by the reviewers.

This guest blog comes from Erika Pastrana, Executive Editor for the Nature Research Journals, and Sowmya Swaminathan, Head of Editorial Policy and Research Integrity at Nature Research.

Increasing the reproducibility of scientific findings is a goal that all of us in the research enterprise share.

One path towards achieving this is to encourage authors to provide all relevant data and code associated with a published article. This enables others to re-run the analyses, reproduce the results and re-use the code and data to build on the work, advancing science further.

Since 2014 the Nature journals have required authors of studies with custom code or algorithms that are central to the conclusions to provide a “Code Availability” statement indicating whether and how the code or algorithm can be accessed, including any restrictions to access. In 2016, we adopted a policy of mandatory “data availability statements” on all Nature journal papers. The guiding principle is that these statements must provide enough information for readers to be able to reproduce the results and access the code and data for use in their own research.

A number of Nature Research journals have, for years, also peer reviewed code when it is central to the paper to ensure it is vetted scientifically, and provided the code as part of the published paper, typically in the supplementary information or via a link to a folder on GitHub (see this Nature Methods editorial from 2014). Despite our long-running efforts to publish code that is peer reviewed and useful, our platforms have not always been best suited to this task.

We know peer reviewing code is cumbersome: it requires authors to compile the code in a format that is accessible for others to check, and reviewers to download the code and data, set up the computational environment on their own computers and install the many dependencies that are often required to make it all work. To facilitate this process, we recently developed new guidelines for authors and a checklist to help during code submission, but there are now tools available that go beyond checklists and PDFs.

Code Ocean is a computational reproducibility platform that aims to make code more readily executable and discoverable. The platform, which is based on Docker, hosts the code and data in the necessary computational environment and allows users to re-run the analysis in the cloud and reproduce the results, bypassing the need to install the software.
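One simple way to check that a re-run in such an environment has actually reproduced the results is to compare the regenerated output files with the published ones. The checksum sketch below is our own illustration of that idea, not a feature of the Code Ocean platform, and the directory layout is hypothetical.

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum a file so regenerated outputs can be compared with deposited ones."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical layout: results/ holds the outputs deposited with the paper,
# rerun/ holds the files produced by re-running the analysis in the container.
for published in sorted(Path("results").glob("*.csv")):
    regenerated = Path("rerun") / published.name
    status = "OK" if sha256(published) == sha256(regenerated) else "MISMATCH"
    print(f"{published.name}: {status}")
```

In practice, numerical outputs can differ bitwise across environments, so tolerance-based comparisons of the parsed values are often more appropriate than strict checksums.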

The trial is optional for authors of papers undergoing code peer review at the selected journals. Reviewers will be offered as much runtime as they need to run the code and analyses (100 hours per month by default), and upon publication the code and data will be assigned a digital object identifier (DOI) and cited in the article, enabling readers to access them freely via a link. Code Ocean, through CLOCKSS, will guarantee the preservation of the code, data, results, metadata and computational environment.

By partnering with Code Ocean, we hope to further facilitate compliance with our policies and practices, and to provide benefits to authors, reviewers and readers by improving the peer review experience and facilitating sharing of code that is reproducible and useful. We hope this functionality will also enhance our papers by linking to a platform where the results, code and data can be more easily verified, reproduced and re-used.

We will be attentively listening to the response in our community, and will be surveying all the authors and reviewers that participate in the trial to learn from their experience.

Code Ocean web-based interface