Points of Significance is now free access

The Points of Significance column on statistics is now free access as part of a larger statistics resource on nature.com

Stats Collection

{credit}Erin Dewalt{/credit}

When Nature Methods launched the Points of Significance column over a year ago we were hopeful that biologists with a limited background in statistics, or those who just needed a refresher, would find it accessible and useful for improving the statistical rigor of their research. We have since received comments from researchers and educators in fields ranging from biology to meteorology who say they read the column regularly and use it in their courses. Hearing that the column has had a wider impact than we anticipated has been very encouraging, and we hope it continues for quite some time.

In the meantime, I’m very happy to say that the entire column is now freely available as part of the Statistics for biologists collection, a new free resource on nature.com that collects selected published articles on statistics from across the Nature family of journals. A blog post on Of Schemes and Memes by Veronique Kiermer (Director, Authors and Reviewers services) introduces this resource and discusses its role in NPG’s ongoing efforts to improve reproducibility in science.

Although the primary focus of the articles is on biological applications, researchers from many disciplines will find useful information in this collection. The Points of Significance articles have a dedicated page listing these articles in chronological order with a brief summary of each.

The Statistics for biologists collection will not be a static resource, but will be continuously updated with new content. In particular, each new Points of Significance article will be added as it is published. We have also tried to highlight content of a similar nature that other publishers have made freely available.

I’d like to thank Martin Krzywinski and Naomi Altman for their continued hard work on these columns, and our guest author Paul Blainey. Without their dedication and time the Points of Significance column would never have been this successful. Now it will be easier for even more people to benefit from these wonderful articles.

Super-resolution microscopy at Nature Methods

On this 10th anniversary of the first issue of Nature Methods, it is appropriate to look back at the relationship between the journal and super-resolution microscopy, one of the technologies we have chosen among the top ten methods developments of the journal's first ten years.

Super-resolution microscopy first appeared in Nature Methods with the online publication of two papers on August 9, 2006. One demonstrated the first super-resolution microscopy image using a genetically encoded fluorescent probe (Willig et al., 2006) and the other was the first publication describing stochastic optical reconstruction microscopy (STORM; Rust et al., 2006), the class of methods now often referred to as single-molecule localization microscopy (SMLM). Initially, the papers were largely overshadowed by the media storm accompanying the publication one day later of photoactivated localization microscopy (PALM) by Betzig et al. in Science, but the STORM and PALM papers together were instrumental in driving wider development of the nascent super-resolution microscopy field, which had previously been confined to a small number of highly specialized groups.

A visualization of SRM papers published in Nature Methods over the years.{credit}D. Evanko{/credit}

Nature Methods has now published 64 articles on super-resolution microscopy: 49 full original research articles, 11 Correspondences and 4 Review and Commentary articles. The accompanying illustration conveys the wide range of topics covered and their historical progression. Super-resolution microscopy was also our choice for Method of the Year in 2008.

The first three years of super-resolution microscopy (SRM) publications in Nature Methods were dominated by advances in localization-based SRM and early attempts at live-cell SRM. Betzig and colleagues defined important considerations for performing live-cell PALM (Shroff et al., 2008) and PALM was adapted as a massively parallel single-particle tracking technique called sptPALM (Manley et al., 2008). Another early paper demonstrated the use of dual-plane imaging for 3D SRM several microns deep into a sample (Juette et al., 2008), but to this day SRM is still dominated by 2D imaging.

It was clear that the probes used for localization-based SRM were critical to the performance of these techniques. In early 2009 we published the new fluorescent proteins PA-mCherry (Subach et al., 2009) and mEos2 (McKinney et al., 2009) from the Verkhusha and Looger labs, respectively. The performance characteristics of mEos2 and the Looger lab's very open reagent-sharing habits helped this protein come to dominate much fluorescent protein-based SRM.

In late 2009 we began to address the prior lack of sufficient attention to the analysis methods used in localization-based SRM by publishing two papers (Mortensen et al., 2009 and Smith et al., 2009) focused on maximum-likelihood algorithms for precisely estimating the centers of fluorophore image spots, a fundamental underpinning of the whole class of localization-based SRM methods. At this time we also started publishing Correspondences describing user-friendly software for performing the early localization analysis steps: first LivePALM (Hedde et al., 2009) and QuickPALM (Henriques et al., 2010), and in later years DAOSTORM (Holden et al., 2011) and RapidSTORM (Wolter et al., 2012). During this period researchers also scoured other fields for algorithms new to image analysis and imported the powerful compressed sensing analysis method (Zhu et al., 2012), developed novel localization methods like radial symmetry (Parthasarathy, 2012) and characterized and corrected the noise attributes of sCMOS cameras so that they could challenge EMCCDs as the camera of choice for localization-based SRM (Huang et al., 2013). A particularly interesting development was the use of Bayesian analysis for image generation that didn’t require explicit fluorophore localization and could work with the intrinsic blinking and bleaching of high-density GFP-labeled live samples (Cox et al., 2011).
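To give a flavor of how such maximum-likelihood localization works, here is a hedged, illustrative sketch only (not any of the published implementations; all function names and parameter values are hypothetical): the center of a simulated spot is estimated by maximizing a Poisson log-likelihood over candidate positions.

```python
import numpy as np

def gaussian_psf(cx, cy, size=11, sigma=1.3, amplitude=100.0, bg=2.0):
    """Expected photon counts for a pixelated 2D Gaussian spot (a common PSF model)."""
    y, x = np.mgrid[0:size, 0:size]
    return bg + amplitude * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def mle_localize(image, sigma=1.3, step=0.02):
    """Brute-force maximum-likelihood estimate of the spot center under Poisson noise.

    Real methods fit all model parameters with efficient solvers; a grid search
    over the center alone keeps the idea transparent.
    """
    size = image.shape[0]
    best, best_ll = None, -np.inf
    for cx in np.arange(size / 2 - 1, size / 2 + 1, step):
        for cy in np.arange(size / 2 - 1, size / 2 + 1, step):
            mu = gaussian_psf(cx, cy, size=size, sigma=sigma)
            # Poisson log-likelihood, dropping the data-only constant term
            ll = np.sum(image * np.log(mu) - mu)
            if ll > best_ll:
                best_ll, best = ll, (cx, cy)
    return best
```

With enough collected photons, such estimators approach a localization precision of roughly the PSF width divided by the square root of the photon count, which is what makes nanometer-scale localization possible from diffraction-limited spots.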

We soon determined that the later analysis steps required for interpretation of the underlying biology were most ripe for, and in need of, further development. Improper analysis could easily lead to artifacts, particularly when trying to use localization-based SRM to examine protein clustering (Annibale et al., 2011). Notable early work in this area was the use of pair correlation analysis to examine protein organization in the plasma membrane (Sengupta et al., 2011). An ongoing issue in analyzing localization-based SRM images has been determining the resolution of the resulting image, a far less straightforward task than one might expect. Adoption and development of Fourier ring correlation from electron microscopy provided a compelling solution to this challenge (Nieuwenhuizen et al., 2013), but more work remains to be done before researchers can be confident of reliably measuring the resolution of their images.
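The idea behind Fourier ring correlation can be sketched in a few lines (a simplified illustration assuming NumPy, not the published method): two independent images of the same structure are compared ring by ring in frequency space, and the spatial frequency at which the correlation drops below a threshold gives a resolution estimate.

```python
import numpy as np

def fourier_ring_correlation(img1, img2, n_rings=16):
    """Correlation between two images of the same structure, per frequency ring."""
    f1 = np.fft.fftshift(np.fft.fft2(img1))
    f2 = np.fft.fftshift(np.fft.fft2(img2))
    size = img1.shape[0]
    y, x = np.indices(img1.shape) - size // 2
    r = np.hypot(x, y)  # distance of each Fourier pixel from zero frequency
    edges = np.linspace(0, size // 2, n_rings + 1)
    frc = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (r >= lo) & (r < hi)
        num = np.abs(np.sum(f1[mask] * np.conj(f2[mask])))
        den = np.sqrt(np.sum(np.abs(f1[mask]) ** 2) * np.sum(np.abs(f2[mask]) ** 2))
        frc.append(num / den if den > 0 else 0.0)
    return np.array(frc)
```

The image resolution is then commonly read off as the inverse of the spatial frequency at which the FRC curve falls below a fixed threshold; shared signal keeps the curve high at low frequencies while uncorrelated noise drives it toward zero at high frequencies.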

Although manuscripts with a focus on analysis methods made up the majority of articles published in Nature Methods over the past 10 years, there were also continuous developments in imaging technology. STED microscopy was improved through the use of continuous-wave lasers (Willig et al., 2007) and time gating (Vicidomini et al., 2011). A STED configuration that created a spherical scanning spot was used to image the 3D structure of a single mitochondrion (Schmidt et al., 2008). There was also further development of the optical methods used for localization-based SRM. Temporal focusing of two-photon irradiation allowed confined photoactivation in whole cells, thus limiting photobleaching outside the imaging area (York et al., 2011). Confined photoactivation and imaging was also accomplished using dual orthogonal objectives to combine light-sheet microscopy with localization-based SRM (Cella Zanacchi et al., 2011). Finally, a dual-objective scheme with objectives facing one another, combined with astigmatism, improved the resolution of 3D localization-based SRM (Xu et al., 2012).

In recent years, alternative SRM methods have made an appearance. The scanning-based method reversible saturable optical fluorescence transitions (RESOLFT) was massively parallelized and used for imaging whole living cells (Chmyrov et al., 2013). An intriguing recent report combined elements of STED and localization-based SRM in a new imaging modality that discriminates nanoareas of fluorescently labeled rigid proteins using polarization (Hafi et al., 2014).

Improvements in imaging technology are of little use if you can’t label your targets of interest. Labeling methods have therefore been an important component of the SRM papers published in Nature Methods. Trimethoprim labeling (Wombacher et al., 2010) and SNAP-tag labeling (Klein et al., 2011) both allowed direct labeling of proteins in live cells, and this was combined with bright, fast-switching probes to allow fast 3D localization-based SRM in whole living cells at ~25 nm resolution (Jones et al., 2011). Other investigators improved labeling not by direct labeling using chemical tags but by using smaller affinity probes such as nanobodies (Ries et al., 2012) or aptamers (Opazo et al., 2012). A particularly intriguing class of labeling methods relies on DNA oligos. Barcoding (Lubeck et al., 2012) and sequential labeling (Jungmann et al., 2014; Lubeck et al., 2014) allowed highly multiplexed labeling of target proteins and nucleic acids.

With so many developments and choices for researchers it is increasingly important for them to have quality data on the relative performance of different techniques and tools. To this end, Nature Methods has been publishing increasing numbers of Analysis articles reporting such performance comparisons, and SRM has been no exception. The performance of a wide selection of chemical fluorophores for localization-based SRM was characterized in a real-life imaging situation (Dempsey et al., 2011) and a recent report characterized the photoactivation efficiency of fluorescent proteins (Durisic et al., 2014).

We hope you enjoyed this brief summary of SRM in Nature Methods. Although I have tried to include much of what Nature Methods has published in this field, the summary is by no means comprehensive. Most significantly, it doesn’t include many of the methods that are used to double the resolution of fluorescence microscopy. If there is sufficient interest we will consider extending our summary to include both these and more recent developments as they occur.

A change of leadership at Nature Methods

The former Chief Editor of Nature Methods bids farewell to his cherished scientific journal and welcomes its new Chief.

The first of this month marked my tenth year working for Nature Methods, nearly six of those years as Chief Editor. I joined the Nature Methods team as an assistant editor just as the journal was preparing its first issue and quickly became enamored of both the journal and the job. Coming from an engineering background, I have always had an affinity for the tools and techniques used by scientists, and working with the researchers developing the next generations of research methods has been an enormous privilege.

The changes that have taken place over the past decade have been remarkable, and for someone who started graduate school pouring polyacrylamide sequencing gels, the differences in what budding researchers have at their disposal in the lab are breathtaking. I hope that over the years Nature Methods has played some part in highlighting the importance of research methods development and in giving due credit to those researchers whose work enables so many others to make discoveries that would otherwise be infeasible or impossible.

One of my best memories as an editor is still the excitement of seeing Xiaowei Zhuang’s 2006 Gordon Research Conference poster describing stochastic optical reconstruction microscopy (STORM) and thinking, “What an amazing idea!” The excitement redoubled soon after at the same conference when George Patterson gave his talk on photoactivated localization microscopy (PALM). Little did I know at the time that this would be the start of a long running relationship between Nature Methods and the yet-to-emerge super-resolution microscopy community.

I will sorely miss the interactions I’ve had with the microscopy, biophysics and other communities I worked so closely with during my time at Nature Methods, but I’m confident that the journal will continue to serve these communities much as it has over these last ten years. The experience of developing, editing and publishing the Points of View column on data visualization and Points of Significance column on statistics has also been a highlight of my time at Nature Methods and, as services to authors, I’m glad I will still be involved with these.

In my new role as Head of Editorial Services for Nature Publishing Group and Palgrave Macmillan I will be serving the much broader research community, focusing on improving the author (and reviewer) experience before manuscript acceptance. I’ll still be reachable at my old email address and on Twitter, where I welcome comments on what can be done to improve the service we offer as a publisher.

I can’t say enough about how touched I am by the sentiments expressed in emails I’ve received from authors and reviewers I’ve worked with over the years. It has been an honor and privilege working with such amazing people and helping to communicate to the wider community your hard work developing important research methods and tools.

Even though I’ve moved on, Nature Methods lies in good hands with its new Chief Editor, Natalie de Souza. Natalie was a manuscript editor at Nature Methods for over seven years and is well placed to build on its past success.

A heartfelt thank-you to all the authors, reviewers, readers and colleagues I’ve worked with over the past ten years. The journal could never have been as successful as it has been without all your contributions.

Help create Nature Methods’ 10 year anniversary cover

Our October 2014 issue will be a celebration of our 10 year anniversary and we want you to help us celebrate by contributing to the creation of the anniversary issue cover.

We are looking for striking original images of the number 10 created by techniques or tools used for basic laboratory research in the biological sciences. This could be fluorescent cells patterned in the shape of a 10, a 10 written using two-photon lithography, a DNA origami-based 10, or a 10 written using any number of other methods. The more imaginative the better!

The best image, or images, will be used to create the 10 year anniversary cover for our October issue. We will also post the best submissions, or all of them if there aren’t too many, here on methagora with a short description of how the image was created and by whom. If you use a method you published with us we’ll also highlight that article.

Please no drawn, computer generated, or heavily edited images. Images that work well on a black background may be preferable if the final cover ends up being a composite of several submissions, but we encourage submissions regardless of the background color.

We must receive your submission by August 25, 2014 for us to consider it for the cover. Late submissions will, however, still be considered for posting here on methagora with a description and acknowledgment. Multiple submissions from a single individual are welcome and we’ll highlight as many as possible here on methagora.

Email each image candidate to methods@us.nature.com with your contact information and a short paragraph describing what it is and what research method or tool was used to generate it.

Please spread the word to your colleagues! We want the variety of submissions to be representative of the wide array of techniques that have appeared in our pages over these last ten years.

What we publish

The editors of a scientific journal have an editorial prerogative to publish articles that fall under the editorial scope of the journal as they see it. But defining this scope in a way that is clear to those outside the editorial team can be difficult and any definition can become dated as science and the journal evolve. Here we discuss the scope of Nature Methods.

As stated in our Guide to Authors, Nature Methods publishes “novel methods and significant improvements to tried-and-tested basic research techniques in the life sciences.” We broadly define “research techniques” as methodological procedures, biological or synthetic reagents, computational algorithms, software tools, instrumentation and other technologies.

The phrase “basic research” in the sentence above is key and, as explained in April’s Editorial, methods intended for later-stage research applied to the clinic, drug discovery or industrial processes are generally considered outside our scope. These applications are often classified as biotechnology and thus are probably more appropriate for Nature Biotechnology or, if clinically oriented, a Technical Report in Nature Medicine.

But as April’s Editorial acknowledges, method and tool developments can be relevant for both basic research and more ‘downstream’ applications. This requires us to continuously walk an editorial tightrope between the two. As circumstances change and fields develop, we may need to adjust how we apply our editorial scope.

As also stated in our Guide to Authors, Nature Methods is targeted at “academic and industry researchers actively involved in laboratory practice.” The phrase “laboratory practice” is intended to indicate that the journal generally only publishes methods for work that occurs in a research laboratory environment. On occasion, we may consider compelling work that doesn’t fall under the typical definition of laboratory research, particularly in areas like ecology where the basic research environment extends beyond the confines of a brick and mortar lab. An example was our publication of The Metatron: an experimental system to study dispersal and metaecosystems for terrestrial organisms.

We are constantly reassessing our editorial scope and can work with authors to adapt the presentation of work that might otherwise be considered out of scope if we feel it is sufficiently compelling, relevant to our readership and can be appropriately presented as important for basic research. We are happy to respond to presubmission inquiries, submitted via our manuscript submission system, asking about the appropriateness of the scope of a proposed manuscript. But if a manuscript is already written, please submit the full manuscript as a regular submission and don’t worry about formatting it to fit Nature Methods’ article style. This will allow us to make a more informed decision, and format can be dealt with later in the event we proceed toward publication. If a manuscript is clearly out of scope we will endeavor to provide a fast decision.

Finally, if there is an area of basic biological research that you feel is underrepresented in Nature Methods but should be of substantial interest to our readership, please let us know. For example, we published virtually no computational methods in the first several years of the journal, but they now represent a substantial fraction of our articles. As we strive to serve our readers we want to avoid falling into a pattern of publishing research limited to a few areas, but our success in doing this depends heavily on receiving submissions from a broad range of research areas, and we encourage the wider basic research community to consider Nature Methods even if we haven’t yet published much, or anything, in a particular area.

Here there be software

Software plays an important role in scientific research, and published studies increasingly rely on custom software code developed by authors. This calls for better transparency in research articles and improved access to the software and code itself.

This month in Nature Methods and on methagora we revisit issues regarding software reporting and availability first raised exactly seven years ago in our March 2007 Editorial “Social software”. Our March 2014 Editorial updates and expands on these editorial policies and a blog post provides details of our guidelines for custom algorithms and software reported in Nature Methods research papers. We encourage researchers to read these, particularly those considering submitting a research manuscript using or reporting custom software to us. We also hope that publicizing our editorial policies might aid other journals in thinking about how to handle algorithms and software associated with research they publish.

Of course, these efforts are only one small part of what needs to be done to improve access to and use of scientific research software. As can be seen by our somewhat complex guidelines, it is difficult to establish simple rules that are sensible and fair for all cases and all communities. Community participation will be essential for refining and improving how software is handled.

Nature Methods currently relies on the use of Supplementary Software zip files for authors to supply the software and code underlying research articles. This isn’t pretty but it fulfills our basic needs. For example, 50% of the research articles in our March issue contain Supplementary Software files. But better methods are needed to archive and document code and assign provenance.

An important initiative in this regard is the “Code as a research object” project, a collaboration between Mozilla Science Lab, GitHub and figshare that seeks to “better integrate code and scientific software into the scholarly workflow.” The aim is to create citable endpoints for the exact code used in particular studies. [Full disclosure: figshare is a product of Digital Science which, like Nature Methods, is part of Macmillan Publishers.]

The project is still in its early stages and follows on the similar but broader Research Object community project. Similarly, GigaScience and F1000Research are experimenting with archiving code and pipelines with DOIs.

We applaud these efforts and encourage the broader research community to participate in them. The current discussion about what is needed for code reuse, announced on the ScienceLab blog and taking place in a thread on GitHub, would greatly benefit from more input from researchers who don’t consider themselves code jockeys.

There are many sophisticated and powerful things that could be done in an ideal world to facilitate code exposure and reuse, but the situation at the great majority of journals is so underdeveloped and the needs so acute that even small flexible steps forward will have a positive impact. Most important is for facilities to be put in place that allow and encourage the entire community to move forward, not just a small portion of it.

Guidelines for algorithms and software in Nature Methods

A large proportion of original research published in Nature Methods relies to varying degrees on custom algorithms and software developed by the authors. Here we provide guidance on our relevant material sharing and reporting policies.

Nature Methods first outlined our material sharing and reporting standards for algorithms and software in a March 2007 Editorial. Now, after seven years of experience applying those policies, we have updated and expanded on them in our March 2014 Editorial. On this page we provide more detailed guidelines for authors submitting manuscripts containing unpublished algorithms and software they created. We are posting this information here because we’d like these guidelines to evolve and we want input from our communities on how they think this should happen. Please comment below and let us know your thoughts. We will update this document as our policies change.

Manuscripts published in Nature Methods include methods and tools in which algorithms and software represent an increasingly important methodological component. However, the degree to which they are central to the reported methodology can vary considerably. The algorithm or tool may be the entire motivation for publishing the work or it may be ancillary to it. Additionally, the methodology may be a novel algorithm of value in and of itself, but a coded implementation is still necessary for the authors to show that it works as expected. Finally, the software tool may implement existing algorithms in a user-friendly form to deliver high-value functionality of substantial general interest. Because of this wide variety, it is inappropriate to enforce one-size-fits-all standards for algorithms and software reported in Nature Methods. The guidelines below represent our current editorial position on software reporting and release.

Client-side Software
This is software that is installed and used on a personal computer and is not intended to be accessed remotely as a web service. It can be entirely stand-alone on a commonly available operating system (Windows, Mac OS X, or *nix) or can require the user to have a popular software platform installed (e.g., MATLAB or LabVIEW). In all cases, but particularly when using MATLAB or LabVIEW, all platform versions and software dependencies must be detailed in the supplied documentation.

At Submission

  • If the custom algorithm/software is central to the method and has not been reported previously in a published research paper, it must be supplied by the authors in a usable form, including one or more of the following:
    1. Source code
    2. Complete pseudocode
    3. Full mathematical description of the algorithm
    4. Compiled standalone software

    We strongly urge that full source code be provided. A compiled executable alone is not sufficient, but one may additionally be required if the tool is intended to be of wide general use. Final acceptable forms of release of the algorithm, software and code will be determined by the editor after consultation with referees. This decision will be influenced by the editorial motivation for publishing the work (e.g., high novelty or satisfying a wide general need).

  • If the software is ancillary to the methodology being reported or is a routine implementation of obvious processes, such as microscope control software or analyses that are otherwise adequately described, the software need not be supplied to reviewers at submission but final release requirements may change in the course of the review process.
  • Supplied source code or software must be accompanied by documentation sufficient for a typical user to compile, install and use the software. Depending on the nature of the software tool, how central it is to the manuscript and our editorial motivation for considering the work, the minimum documentation may be a simple readme file or a full manual in PDF format.
  • If appropriate, sample data known to work with the software should be provided along with the expected output. Referees are encouraged to try the tool by analyzing their own data.
  • The software and associated files may be supplied for reviewers as either:
    1. A single Supplementary Software zip file up to 200 MB in size
    2. Four DVDs to be mailed to the reviewers.
  • Any restrictions on the availability of software or code used to implement novel algorithms must be specified at the time of submission. Editors will decide whether any restrictions are acceptable in consultation with the reviewers. If some restrictions are deemed acceptable, they must be clearly explained in the methods section of the manuscript. Authors must supply all information needed for the reviewers to properly evaluate the software or code. If the motivation of the submitted manuscript is to provide a useful tool, rather than report a new algorithmic development, there should be no substantial restrictions on software or code availability.
  • We encourage authors to provide a license with the software or code.
  • A narrative description of key algorithmic components should be provided in the main text. Extensive equations, pseudocode or snippets of source code should be confined to the Online Methods or a Supplementary Note.

At Acceptance

  • If the software is central to the methodology and non-obvious, the source code should be provided in a Supplementary Software zip file as described above so that readers can easily access the exact code used to obtain the results in the paper. There are some possible exceptions:
    1. If the author’s institution requires a user to accept a license agreement or if the author has other reasonable grounds for not providing the source code as Supplementary Software, it may be acceptable for the author to host source code on an institutional server and require that users fill out an online form and agree to a license before downloading the software. In this instance the software must have version numbering and a link to the version used in the work must be provided in the manuscript.
    2. In some situations it may be permissible for authors to supply only compiled software as Supplementary Software, with the source code provided to academic users upon email request. Details of availability must be clearly stated in the manuscript.
    3. It is not acceptable to make software and code available by email request only.

  • If the software or code isn’t the main tool/method being reported in the manuscript the authors may provide a note in the readme file of the Supplementary Software cautioning users that the code is unsupported and not intended for general use. In this case it is permissible that the software or code be made available only by email request but the authors must state this availability in the manuscript.
  • Regardless of how the software is made available, the code supplied with the manuscript must be identical to that used to obtain the data in the paper. An exception can be made for changes that don’t alter the processing of input data. The authors may however provide a link to access new versions of the software.
  • We strongly encourage authors to include a license with all published software and code.
  • We encourage authors to provide macros for recording the software version and parameter settings during analyses or to integrate this functionality into the software itself.
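As a minimal sketch of what such version and parameter recording might look like in practice (the function and field names here are our own invention, not a prescribed format):

```python
import json
import platform
import sys
from datetime import datetime, timezone

def record_analysis_provenance(path, software_name, version, params):
    """Write a small JSON provenance record alongside an analysis output file.

    Captures the software version, the exact parameter settings used, and the
    runtime environment, so the analysis can be reported and repeated later.
    """
    record = {
        "software": software_name,
        "version": version,
        "parameters": params,
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    with open(path, "w") as fh:
        json.dump(record, fh, indent=2)
    return record
```

Writing such a record next to each analysis output makes it straightforward to state the exact software version and parameters in a manuscript, or to rerun the analysis identically.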

Web Tools/Resources
These represent a special class of software that often cannot be expected to follow the guidelines outlined above. This is particularly true if the web tool or resource is being supplied as a service and has few, if any, novel computational aspects. The only end-user requirement for web tools is that they be freely accessible with any modern web browser.

Nature.com provides a proxy server for reviewers to access web tools and resources anonymously.

At Submission

  • The authors must supply a working link and any necessary login information.
  • Any unpublished algorithms central to the operation of the tool should be supplied in forms 1, 3 or 4 detailed above.

At Acceptance

  • The authors should supply written confirmation that they will keep the website and tool operating and freely accessible for the foreseeable future.

Bring on the box plots

Box plots are excellent for visualizing important core statistics of sample data. We hope that a new online plotting tool, BoxPlotR, will help encourage their wider use in basic biological research.

The same three samples plotted by bar chart with s.e.m. error bars (left) and Tukey-style box plot (right). The box plot more clearly represents the underlying data.

A bar chart is often the first plot type people reach for when they want to compare values. This is appropriate when the values arise from counting. But when each value is the mean or median of data points taken from a sample, a bar chart is usually inappropriate. As discussed in our March Editorial and the accompanying Points of View and Points of Significance columns, a “mean-and-error” scatter-type plot or a box plot is more appropriate for sampled data. In summary, we strongly recommend box plots for samples of at least five data points; for smaller samples, mean-and-error plots are more appropriate.
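For readers who want to see exactly what a Tukey-style box plot draws, here is a small pure-Python sketch (an illustrative helper, not BoxPlotR’s actual code) computing the plot’s elements: the median, the quartile hinges that bound the box, the whiskers (most extreme points within 1.5 × IQR of the box) and any outliers beyond them.

```python
from statistics import median

def tukey_boxplot_stats(data):
    """Compute the elements of a Tukey-style box plot:
    median, quartile hinges, whisker ends and outliers."""
    xs = sorted(data)
    n = len(xs)
    med = median(xs)
    # Tukey's hinges: medians of the lower and upper halves
    # (the middle value joins both halves when n is odd)
    half = (n + 1) // 2
    q1 = median(xs[:half])
    q3 = median(xs[n - half:])
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    # Whiskers extend to the most extreme data points inside the fences
    lo_whisk = min(x for x in xs if x >= lo_fence)
    hi_whisk = max(x for x in xs if x <= hi_fence)
    outliers = [x for x in xs if x < lo_fence or x > hi_fence]
    return {"median": med, "q1": q1, "q3": q3,
            "whiskers": (lo_whisk, hi_whisk), "outliers": outliers}
```

For example, `tukey_boxplot_stats([1, 2, 3, 4, 5, 6, 7, 8, 9, 20])` puts the box at 3–8 with a median of 5.5, whiskers at 1 and 9, and flags 20 as an outlier.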

Box plots are heavily used in biomedical research, in which statisticians have historically had considerable input into study design and analysis. But although similar types and quantities of sample data also appear in basic research (such as that published in Nature journals), box plots are much less common than bar charts in these manuscripts. Last year in Nature Methods, for example, ~80% of sampled data were plotted using bar charts.

Discussions we had with the community suggested that one impediment to using box plots instead of bar charts for sample data was the limited support for box plots in the plotting programs commonly used by researchers. It also became apparent that some software that did support box plots was deficient in communicating to users what the different elements of the plot represented. As a result, strangely labeled box plots were showing up in published papers. At NPG we thought it would be useful to provide authors with a simple online tool they could use to generate basic box plots of their data for publication.

The origin of BoxPlotR
At the VizBi 2013 conference in Cambridge, Massachusetts, I mentioned NPG’s desire for such a tool at a breakout session chaired by Martin Krzywinski, in which the participants, including a young researcher named Jan Wildenhain, discussed what the community needed to create better figures. I also happened to mention our interest to Michaela Spitzer while visiting her poster, from the Juri Rappsilber and Mike Tyers labs, showing how the ‘shiny’ package from RStudio can be used to easily convert code written in R (a popular scripting language for statistics) into a visual application for exploring data.

Later at the conference Jan approached me and said he was intrigued by our desire for someone to design a webtool to create box plots and that he was interested in working on such a project. I happily told him to get in touch with me after the conference so we could discuss it further.

Three weeks after the conference concluded I still hadn’t heard from Jan and was beginning to worry that he had decided not to pursue the idea. Then, a few days later, I received an email from him. Much to my surprise, it contained a link to a highly functional tool that he and Michaela, on their own initiative, had gone ahead and created using shiny and R. What followed was a productive and rewarding period of discussion and development, during which Michaela incorporated additional functionality and made selected design changes. The tool was so well designed and functional that I encouraged them to submit it to Nature Methods for publication as a Correspondence. After further additions and changes based on comments raised during peer review, BoxPlotR was ready for publication.

Sample BoxPlotR plots

Sample BoxPlotR plots. Top: Simple Tukey-style box plot. Bottom: Tukey-style box plot with notches, means (crosses), 83% confidence intervals (gray bars; representative of p=0.05 significance) and n values.
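The 83% level in the caption is not arbitrary: for two independent means with roughly equal standard errors, two ~83% confidence intervals just stop overlapping at the separation where a two-sample z-test gives p ≈ 0.05. A minimal sketch of that arithmetic in pure Python (assuming normality and the familiar 1.96 two-sided critical value; this is a standard back-of-the-envelope argument, not BoxPlotR’s code):

```python
import math

def norm_cdf(z):
    # Standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Two means with equal standard error SE differ significantly at
# p = 0.05 when they are 1.96 * sqrt(2) * SE apart (two-sample z-test).
# Individual CIs of half-width z_star * SE just stop overlapping at
# exactly that separation when 2 * z_star * SE = 1.96 * sqrt(2) * SE.
z_star = 1.96 * math.sqrt(2) / 2
level = 2 * norm_cdf(z_star) - 1
print(round(100 * level, 1))  # ~83.4, hence "83% confidence intervals"
```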

Launch of BoxPlotR
To accompany the publication and launch of BoxPlotR, we thought it would be useful to provide some information and practical advice about box plots to our readers. Nils Gehlenborg, a former author of several Points of View articles with Bang Wong, agreed to resurrect that popular column for our February issue with an article on bar charts and box plots. Similarly, Martin Krzywinski and Naomi Altman agreed to delay our planned Points of Significance article on the two-sample and paired t-tests and instead devote an article to box plots.

Seeing how the community responded to our interest in creating an online box plot tool, and then working with its members on this project, has been a great experience. This never would have been possible without the initiative and talent of Jan and Michaela or the support they received from their PIs, Mike and Juri. We hope both our authors and others find BoxPlotR useful, and we encourage feedback. General comments can be made here on our blog or by emailing the journal. For specific bug reports and feature requests, please see the contact information at https://boxplot.tyerslab.com.

The Method of the Year for 2013 is… single-cell sequencing

Single-cell sequencing edged out the other contenders as our choice of Method of the Year for 2013. These techniques truly came into their own in 2013 and are fast providing new insights into the workings of single cells that ensemble methods cannot deliver.

Back in 2008 we chose next-generation sequencing as our Method of the Year, not only because of how the new techniques would improve performance in conventional sequencing applications but also because they opened up whole new applications, unthinkable with traditional Sanger sequencing. Our choice of Method of the Year for 2013 bears this out, as none of these single-cell sequencing applications would be possible without next-generation sequencing. Indeed, in some applications the sequencing is used almost exclusively for identifying and counting tagged molecules.

Our choice likely comes as a surprise to all those who were certain that we would pick CRISPR/Cas9 technology for targeted genome modification. This is certainly an exciting technology, not only for genome engineering but also for epigenome editing, as described in a Method to Watch. But genome editing with engineered nucleases was our pick for Method of the Year in 2011, and although CRISPR/Cas9 provides a huge practical improvement by largely dispensing with the need to engineer the nuclease, relying instead on a programmable guide RNA, the advance over 2011 is mostly one of ease of use.

Methods to investigate biology at the level of single cells have been of keen interest to Nature Methods since the journal started. Our first research article, from Robert Singer, described a paraffin-embedded tissue FISH (peT-FISH) method to simultaneously detect expression of several genes in situ in single cells while maintaining tissue morphology (Capodieci, P. 2005). This was followed by many other imaging-based methods for such things as measuring cell growth (Groisman, A. 2006), quantifying mRNA (Raj, A. 2008) and protein (Gordon, A. 2006) levels, profiling intracellular signaling (Krutzik, P.O. & Nolan, G.P. 2006; Loo, L.-H. 2007) and DNA insertion-site analysis (Schmidt, M. 2008) in single cells.

The number of original research articles published in Nature journals exploded in 2013. These numbers may not be complete.

The publication in 2009 of M. Azim Surani’s article on mRNA-Seq whole-transcriptome analysis of a single cell (Tang, F. 2009) helped signal the rise of sequencing-based methods for single-cell analysis. But even two years later, the Reviews and Perspectives in our supplement on single-cell analysis were more focused on imaging-based than on sequencing-based approaches.

It was only in 2013 that we finally saw an explosion of original research articles using or reporting single-cell sequencing methods in Nature-family journals. Numerous studies reported new biological results that relied on sequencing of whole or partial genomes or transcriptomes from single cells.

Our Method of the Year special feature has three Commentaries by researchers in the field, including some of the earliest developers and users of methods for single-cell analysis. An Editorial, News Feature and Primer describe our choice and provide helpful background information. We hope you enjoy the selection of articles in our special feature.

Is phototoxicity compromising experimental results?

Light-induced damage to biological samples during fluorescence imaging is known to occur but receives too little attention from researchers.

The December Technology Feature in Nature Methods asks whether super-resolution microscopy is right for you, and a point that came up repeatedly in our interviews with researchers is the danger of phototoxicity and photodamage caused by the high irradiation intensities these methods require. This has long been a concern with these methods, and many of the papers describing them mention it.

But as discussed in the December Editorial, even fluorescence microscopy at low irradiation intensities can cause dangerous levels of phototoxicity that permanently damage the sample. Microscopists are aware of these concerns, but there has been little effort to implement processes to reduce the likelihood of phototoxicity compromising the results of research studies. Dave Piston, Director of the Biophotonics Institute at Vanderbilt University School of Medicine, laments that although phototoxicity is a big deal, he has gotten zero traction with NIH reviewers in trying to establish some rules for it.

There are good resources available that highlight the dangers of phototoxicity and provide advice on how to limit it. Methods in Cell Biology Vol. 114 has an excellent chapter by Magidson and Khodjakov, “Circumventing Photodamage in Live-Cell Microscopy”, that should be mandatory reading for anyone using fluorescence microscopy in biological research. Nikon’s MicroscopyU also has a literature list with several dozen references and recommended reading on phototoxicity; it could use some updating but is still useful.

Despite the amount of microscopy literature discussing phototoxicity, discussion of the phenomenon in research articles published in Nature journals is conspicuously absent. This is highlighted by a simple full-text search we performed on the HTML versions of articles published in Nature, Nature Cell Biology, Nature Immunology, Nature Methods and Nature Neuroscience; the search was limited to original research articles.

The table below lists the number of articles containing each of the listed words in the period from January 1, 2005 to November 3, 2013 in each of the indicated journals. The percentages give the fraction of articles containing ‘phototoxicity’ relative to the number of articles containing each of the microscopy- or fluorescence-related terms. Note that this is NOT a measure of co-occurrence, only a measure of how common the term ‘phototoxicity’ is relative to the other terms.

Journal                phototoxicity   fluorescence     fluorescent      microscopy       microscope
                             #           #       %        #       %        #       %        #       %
Nature                       8         2120    0.4%     1925    0.4%     1995    0.4%     1918    0.4%
Nature Cell Biology          8          815    1.0%      728    1.1%      866    0.9%      822    1.0%
Nature Immunology            6          552    1.1%      574    1.0%      408    1.5%      326    1.8%
Nature Methods              27          565    4.8%      494    5.5%      441    6.1%      407    6.6%
Nature Neuroscience         18          639    2.8%      727    2.5%      587    3.1%      736    2.4%
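As a sanity check on how the percentages were derived, the following sketch recomputes the Nature Methods row from the counts above (the arithmetic is simply the ‘phototoxicity’ article count divided by the count for each microscopy-related term):

```python
# Counts copied from the Nature Methods row of the table above
phototoxicity = 27
term_counts = {"fluorescence": 565, "fluorescent": 494,
               "microscopy": 441, "microscope": 407}

# Percentage of 'phototoxicity' articles relative to each term's count
pcts = {term: round(100 * phototoxicity / n, 1)
        for term, n in term_counts.items()}
print(pcts)
# {'fluorescence': 4.8, 'fluorescent': 5.5, 'microscopy': 6.1, 'microscope': 6.6}
```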

 

The same analysis was repeated with the term ‘photodamage’ to determine whether there was a substantial difference in the usage of these two similar terms.

Journal                photodamage     fluorescence     fluorescent      microscopy       microscope
                             #           #       %        #       %        #       %        #       %
Nature                      18         2120    0.8%     1925    0.9%     1995    0.9%     1918    0.9%
Nature Cell Biology          6          815    0.7%      728    0.8%      866    0.7%      822    0.7%
Nature Immunology            2          552    0.4%      574    0.3%      408    0.5%      326    0.6%
Nature Methods              29          565    5.1%      494    5.9%      441    6.6%      407    7.1%
Nature Neuroscience         12          639    1.9%      727    1.7%      587    2.0%      736    1.6%

 

These results carry a potentially large caveat: the analysis did not include the text of the supplementary information. Still, the rarity with which phototoxicity or photodamage is discussed (0.4% to 7.1% relative to the microscopy terms) suggests that researchers do not appreciate how important it is to pay attention to artifacts resulting from light irradiation. Fortunately, there are exceptions to this state of affairs.

An excellent example of testing for phototoxicity and the subtle effects it can induce can be found in a manuscript from Jeff Magee’s lab at Janelia Farm Research Campus published last year in Nature. Quoting from the manuscript, “Particular care was taken to limit photodamage during imaging and uncaging. This included the use of a passive 8× pulse splitter in the uncaging path in most experiments to reduce photodamage drastically [Ji, N. et al. Nat. Methods (2008)]. Basal fluorescence of both channels was continuously monitored as an immediate indicator of damage to cellular structures. Subtle signs of damage included decreases in or loss of phasic Ca2+ signals in spine heads in response to either uncaging or current injection, small but persistent depolarization following uncaging, and changes in the kinetics of voltage responses to uncaging or current injection. Experiments were terminated if neurons exhibited any of these phenomena.”

It is easy to see how these changes in Ca2+ responses could have been interpreted as real biological effects caused by the uncaged glutamate rather than by the uncaging light itself.

It is unrealistic to expect that any mandates or oversight would be able to prevent or detect such consequences of phototoxicity in research studies. It is essential that investigators themselves be vigilant and implement appropriate controls to detect these effects. Na Ji, also at Janelia Farm Research Campus, says, “It is not enough to only look for instant and dramatic signs of phototoxicity. Sometimes the effects may be more subtle and even unperceivable during the imaging period, but may become obvious when the same sample is imaged the next day. Care has to be taken in data collection and interpretation, especially when the biological process under investigation itself is a subtle one.”

Finally, the application matters as much as the imaging method being used. For example, light-sheet microscopy is excellent at reducing irradiation levels in volumetric imaging. But some applications of super-resolution microscopy, even on living samples, might be less susceptible to phototoxicity artifacts than sensitive long-term light-sheet imaging of living samples. No microscope earns its user a free pass on the dangers of photodamage arising from phototoxicity. Everyone needs to be vigilant.

Update: A reader helpfully pointed out that the dangers of phototoxicity and photodamage also apply to optogenetics, where light (often in the blue region of the spectrum) is used to control protein activity.