A different kind of Method of the Year for 2012

Our choice of Method of the Year in prior years has tended to be a method that didn’t even exist a few years earlier but that quickly bounded onto the scientific stage and attracted the attention of a large portion of the scientific community. Targeted proteomics, our choice for 2012, has, on the other hand, existed for years in scaled-down, antibody-based forms. Western blotting, immunofluorescence, antibody arrays and similar methods can all be used to detect and measure targeted subsets of the proteins expressed in cells and tissues.

During this time the workhorse of proteomics, the mass spectrometer, has been used mostly for shotgun proteomics experiments, in which the goal is to analyze all the proteins in a sample. But the means to use these machines for targeted detection of defined subsets of proteins, and to obtain more reproducible measurements than shotgun experiments typically provide, have been around for decades.

Shotgun methods have remained mostly confined to specialist laboratories because many biologists are intimidated by the complexity of implementing and analyzing these experiments properly. Targeted proteomics, on the other hand, offers a tantalizing opportunity to bring a sampling of the power of mass spectrometry to the wider community of biologists. The assays are simpler, easier to run and well suited to the hypothesis-driven experiments that are the mainstay of biological research.

The ubiquitous Western blot has long filled a central role, or functioned as a crucial control, in many research studies. Unfortunately, performing a high-quality Western blot can feel a bit like roulette. Sometimes you get a fantastic-looking blot with an accurate antibody; other times the blot is blank, the bands look like they ran through some carnival ride, or the blot suffers from any number of other problems. This might prompt people either to look for a goat to appease the Western blot gods or to take unscientific liberties with the presentation of the data in order to make it look the way they believe it should. It also lessens the likelihood that important replicates are performed or reported.

Targeted mass spectrometry offers the possibility for thousands of labs to move away from, or supplement, Western blots and to improve the quality and quantity of their protein measurements. This is not as sexy as next-generation sequencing, super-resolution imaging or optogenetics, some of our prior choices of Method of the Year, but the potential for revolutionizing an arguably mundane yet indispensable technique was compelling enough that it played no small role in our decision. Only time will tell what impact the method has, and we eagerly look forward to the answer.

Our reporting standards for fluorescent proteins – Feedback wanted

Several years ago, based on informal input from various members of the community, Nature Methods established some internal minimum reporting standards for manuscripts describing new or improved fluorescent proteins. These were never formally reported but were often communicated to authors of submitted manuscripts when the characterization data provided didn’t meet these standards.

Recently we were fortunate enough to meet with a substantial number of fluorescent protein developers, who informally endorsed and helped revise these standards. Our revised reporting requirements for fluorescent proteins are listed below.

Minimum reporting requirements for fluorescent proteins

  1. Full absorption and excitation (250 nm to 750 nm) and emission (350 nm to 950 nm) curves under single-photon excitation, and at least some data under two-photon excitation
  2. Values for quantum yield, extinction coefficient, brightness and pKa
  3. Gel filtration data to show that the protein is monomeric or acknowledgement that it isn’t monomeric
  4. Data on fluorophore maturation time, including the final maturation percentage. A detailed protocol must be provided
  5. Image data on several representative protein fusions to show that the fluorescent protein does not disrupt the function of its fusion partner. These should include a tubulin fusion, since it is almost universally used for this purpose
  6. In vitro photostability data compared to other representative proteins. At a minimum, this should include decay curves under widefield and confocal illumination to test two different irradiation intensity regimes. Ideally, graphs of the decay time constant versus power should be provided (a minimal curve-fitting sketch follows this list)
  7. Cytotoxicity measured in mammalian cells by flow cytometry and compared to EGFP and at least one established fluorescent protein in the spectral range of the reported protein
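
To illustrate how the decay-curve analysis in requirement 6 might be summarized, here is a minimal Python sketch that fits a single-exponential bleaching model to a photobleaching trace and extracts a decay time constant. The data, variable names and the single-exponential model are purely illustrative assumptions, not a prescribed analysis; real bleaching curves often require multi-exponential fits.

```python
# Minimal sketch: fit a photobleaching decay curve to extract a decay time
# constant (reporting requirement 6). All data below are synthetic and the
# single-exponential model is an illustrative assumption.
import numpy as np
from scipy.optimize import curve_fit

def mono_exponential(t, i0, tau, offset):
    """Single-exponential bleaching model: I(t) = I0 * exp(-t / tau) + offset."""
    return i0 * np.exp(-t / tau) + offset

# Hypothetical data: time points (s) and background-subtracted, normalized
# fluorescence intensities recorded under constant widefield illumination.
time_s = np.linspace(0, 300, 151)
rng = np.random.default_rng(0)
intensity = np.exp(-time_s / 80.0) + 0.05 + rng.normal(0, 0.01, time_s.size)

# Fit the model; initial guesses are taken from the data itself.
p0 = [intensity.max(), time_s[-1] / 3, intensity.min()]
popt, pcov = curve_fit(mono_exponential, time_s, intensity, p0=p0)
i0_fit, tau_fit, offset_fit = popt
tau_err = np.sqrt(np.diag(pcov))[1]

print(f"decay time constant tau = {tau_fit:.1f} +/- {tau_err:.1f} s")
# Repeating the fit at several irradiation powers yields the data for a
# plot of decay time constant versus power, as suggested in requirement 6.
```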

We also used this opportunity to set some standards for photoswitchable fluorescent proteins. These proteins display quite complicated behaviors, and the desired characteristics can vary depending on the application; for example, the characteristics desired for (f)PALM/STORM differ from those desired for RESOLFT or SSIM super-resolution imaging. These new reporting standards are listed below and supplement the ones above, which also apply to photoswitchable fluorescent proteins.

Additional minimum reporting requirements for photoswitchable FPs

  1. Graphs of 20 or more switching cycles at different powers to show fatigue (decay), with full details on the powers and methods used
  2. Absorption spectra before and after photoconversion
  3. Optimal switching parameters at the best-performing power and at least one other power
  4. Measurement of how complete the switching is (a minimal analysis sketch follows this list)
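
As a rough illustration of how switching fatigue and completeness (requirements 1 and 4 above) could be quantified, here is a minimal Python sketch that computes per-cycle off-switching completeness and the on-state signal retained after the final cycle from paired on/off intensity readings. The input format, variable names and numbers are hypothetical assumptions, not a prescribed protocol.

```python
# Minimal sketch: summarize switching fatigue and completeness for a
# photoswitchable fluorescent protein from per-cycle intensity readings.
# The one-on/one-off-reading-per-cycle format is an illustrative assumption.
import numpy as np

def switching_summary(on_intensity, off_intensity):
    """Return per-cycle off-switching completeness and the fraction of
    on-state signal retained after the final cycle."""
    on = np.asarray(on_intensity, dtype=float)
    off = np.asarray(off_intensity, dtype=float)
    # Completeness: how much of the on-state signal is removed when the
    # protein is switched off in each cycle.
    completeness = 1.0 - off / on
    # Fatigue: on-state signal after the last cycle relative to the first.
    retained = on[-1] / on[0]
    return completeness, retained

# Hypothetical trace for 20 cycles: the on-state signal slowly fatigues and
# the off state retains a small residual signal.
cycles = np.arange(1, 21)
on_trace = 0.98 ** (cycles - 1)
off_trace = 0.03 * np.ones_like(on_trace)

completeness, retained = switching_summary(on_trace, off_trace)
print(f"mean switching completeness: {completeness.mean():.1%}")
print(f"on-state signal retained after {len(cycles)} cycles: {retained:.1%}")
# Repeating this analysis at different illumination powers provides the
# cycle graphs called for in requirement 1.
```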

We encourage developers and users of fluorescent proteins to comment on these minimal reporting standards. But more than that… we’d like your help in moving forward from here.

Are additional standards needed due to new developments?

Do we need standards similar to these in other areas?

We have found that enforcing common standards on highly related tools can greatly improve the efficiency and objectivity of the peer review process and help avoid holding similar developments to different standards. Of course, this requires flexibility in enforcement, and we will always allow editors some discretion in applying these requirements when there is a legitimate reason for doing so.

Fluorescent Proteins and Sensors Webinar – Questions & Answers

Our very first webinar is now live. The topic is “Fluorescent proteins and sensors: A practical discussion” and you can register to view it at www.nature.com/webcasts/fluorescent_proteins. Update: the registration link has been inactivated. Please go here to listen to the archived discussion in .mp3 format.

Nature Methods was joined by Robert Campbell, David Piston and Thomas Knopfel, who have been developing and using fluorescent proteins and sensors for years. We had a nice discussion that provided good practical information for users of these tools. If you haven’t watched it, I encourage you to do so. If you watch the webcast within the first month it is live, you have the opportunity to submit questions for our participants. Please use the form on the webcast viewing page to submit questions. There will be a delay in providing answers here on our blog while we consult with the participants.

Participants in fluorescent proteins and sensors webinar

Our participants: Robert Campbell, David Piston and Thomas Knopfel

Here we will be posting the questions we receive and the answers from our participants. Readers may also comment directly on the blog below, but we cannot guarantee that questions asked there will be answered. We do encourage anyone in the community to chime in with their response to any question that is posed, even if they don’t agree with our participants.

High-content screening: Our tech feature and the GE image competition

In single cell experiments, each well in a 384-well plate can spout a fountain of information. Chris Bakal at the Institute of Cancer Research, which is part of the University of London, practices “high content in high throughput” as he extracts hundreds of different features from single cells in his lab. In this month’s technology feature on single cell analysis, Bakal explains where his work leads and what he looks for in an imaging system.

In the past, drug discovery has driven high-content analysis, but that trend is shifting. High-content screening instruments are now increasingly finding homes in academic labs.

For example, the IN Cell Analyzer from GE Healthcare Life Sciences is also used at the University of Texas MD Anderson Cancer Center’s Department of Experimental Therapeutics. Geoffrey Grandjean, who was interviewed in this month’s technology feature, helped to set up a core facility service there that performs high-throughput, high-content siRNA screening. Seeing the images on a regular basis motivated him to start graduate school in experimental therapeutics.

Grandjean co-authored a paper in Cancer Research that used siRNA to look at the genetic factors regulating microtubule stability when ovarian cancer cells are treated with the drug paclitaxel. The study examined resistance to anti-mitotic chemotherapy agents and documented several gene clusters that influence the response to the drug in “strikingly different ways.” The team also found that modulating microtubule stability in cancer cells is a way to enhance paclitaxel cytotoxicity.

Credit: G. Grandjean/University of Texas MD Anderson Cancer Center

One of Grandjean’s images won last year’s IN Cell Analyzer Image Competition. In it, a cancer drug has made the cellular scaffold so rigid that the cell cannot divide, resulting in a huge cell that dwarfs those around it.

Voting for GE Healthcare’s 2012 cell imaging competition has opened in two categories: microscopy and high-content analysis. You can cast your vote by December 19th. Winners will have their images displayed in New York City’s Times Square.