Method of the Year 2016

As is our tradition, we have chosen a method, or in this case a set of methods, that has experienced rapid growth in recent years. This year’s choice, epitranscriptome analysis, does not comprise a single technique but is based on advances in detecting, enriching and profiling base modifications on all RNA species.

Some of these modifications are abundant and have known functions; others are rare and their roles are still obscure. We believe recent methodological advances, as detailed in a Review by Chengqi Yi and colleagues, lay the groundwork for comprehensive profiling of some of these marks, which will shed light on their roles in the cell.

Our selection of methods to watch highlights areas that we think will experience growth in the coming year and be influential in biological research: from global metabolomics, to RNA-targeting CRISPR, to elucidating single-cell function and faster brain imaging. We do not claim to provide a comprehensive list, and our choices may be biased by our fields of interest. We do hope you enjoy reading this feature, and if you disagree with us, or if you think we have overlooked an important area, please let us know.

Glycoscience: a tea party no longer

Later this year or early next, Richard Cummings plans to launch the Human Glycome Project. It will happen during a workshop that he is currently organizing and which is open to scientists from near and far. The workshop is slated to be held at the Radcliffe Institute for Advanced Study at Harvard University. Also in the works is a Harvard-based center for glycoscience that reaches out to potential collaborators at all Boston-area universities and academic medical centers.

Cummings, who hails from Alabama and who moved from Emory University School of Medicine to Harvard Medical School last fall, loves glycans, which are the ubiquitous carbohydrates made by all cells, and which can be linked to lipids or proteins. Both in humans and in a variety of animal species, the universe of glycolipids and glycoproteins is extraordinary, he says.

In Cummings’ box of plans is the development of a human reference glycome, so that the growing research community committed to these macromolecules can explore the diversity of the human glycome and develop methods and standards with which to do so. He also envisions comparative glycomics, the comparison of human, porcine and bovine glycomes to tease out differences and similarities. “It wasn’t possible before, really,” he says. But dreaming big in glycoscience is now becoming possible.

Glycobiology has been hampered by complicated methods, which his and other labs have been addressing over the years. In his recent work, published in the June issue of Nature Methods, the Cummings lab uses household bleach to release glycans from tissue and cells. He started this research at Emory School of Medicine and continued it at his new lab at Harvard Medical School. He also directs the Center for Functional Glycomics, a virtual center that he already led at Emory and that is funded by the National Institutes of Health to explore protein-glycan interactions and to develop new tools and technologies to explore glycoconjugate functions.

When people now stop by the Cummings lab they can, for example, leave with four grams of carbohydrates in a 50-ml tube full of white powder. “Those are all the carbohydrate structures in the pig lung,” he says. With this material on hand, scientists can use nuclear magnetic resonance techniques for glycan analysis.

Cummings and his team want to enable more labs around the world to study glycoscience by shipping material to colleagues upon request.

Hear Rick Cummings talk about the offer here (14 seconds)



Glycans are difficult to synthesize, but it is now possible to harvest them from natural sources such as eggs, meat or plants. “We can make them at such large scale now, we’re going to just give them away,” he says. Once purified, glycans can be archived, printed on microarrays to explore glycan recognition by lectins, antibodies, bacteria or viruses, or sequenced with mass spectrometry, nuclear magnetic resonance techniques or other methods.

As researchers become aware of the role of carbohydrates in health and disease, the field of glycoscience is broadening, says Cummings. Glycans are being recognized as one of the four major classes of macromolecules, alongside nucleic acids, proteins and lipids.

In the 1970s and 1980s, this field was just getting its start and it was considered merely another part of biochemistry. When carbohydrate researchers got together at meetings, it was more like “tea parties” with 50 to 100 attendees, says Cummings. Glycoscience was far from the spotlight. The community began using the term glycobiology, which Raymond Dwek coined in 1985 and which resonated with researchers. And then, he says, “all of us kind of chose the term glycomics at some point to distinguish ourselves scientifically from proteomics and the other ’omics.”

Hear Rick Cummings talk about the history of the field here (40 seconds)


Studying glycan function preceded the study of carbohydrate structure, says Cummings, a situation not unlike that in molecular biology. For example, work by the chemist Linus Pauling on sickle cell disease occurred before the responsible mutation had been identified and before it was possible to sequence DNA. “We really didn’t know the gene until years later,” says Cummings. The molecular biology arena exploded when it became possible to clone and to synthesize oligonucleotides. “We’re at that point now in glycoscience,” he says.

These days it’s increasingly difficult for scientists to overlook glycans, says Cummings. Access and collaboration are what is needed next to grow the field now that researchers are more than willing to, as he says, “dip their little toes in the glycoscience waters.” That being said, he does still hear disparaging comments about glycoscience, but he takes the remarks as a matter of pride. “So you can think of glycans as being like that little awkward kid on the playground who grew up to be a sizable individual whom no one bullies anymore.”

Understanding and documenting variation in human genomes

To understand disease one needs to understand the genetic variations that underlie it. Many tools exist that predict the deleteriousness of variants in the human genome: PolyPhen2, SIFT and CADD (combined annotation-dependent depletion), to name only a few. On page 109 of our March issue, Yuval Itan et al. present the mutation significance cutoff (MSC), which replaces a global threshold for calling variants deleterious, often used for CADD scores, with a gene-level threshold. For MSC, as for any other variant prediction tool, it was important to validate the quality of the predictions with variants known to be deleterious. Established mutation databases are often used as ground truth to test the quality of prediction tools. MSC, for example, was validated against variants found in two large databases, HGMD and ClinVar.
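As a rough illustration of the difference between the two approaches, the short Python sketch below flags a variant as deleterious only when its CADD score exceeds the cutoff for that particular gene rather than a single genome-wide value. The gene names, cutoff values and function here are invented for illustration; this is not the published MSC implementation.

    # Illustrative sketch only: contrast a single global CADD cutoff with
    # hypothetical gene-level cutoffs when flagging candidate variants.

    # Hypothetical gene-level cutoffs (values invented for illustration)
    gene_level_cutoff = {"TLR3": 5.3, "STAT1": 12.1, "MYD88": 3.4}

    GLOBAL_CUTOFF = 15.0  # an example genome-wide CADD threshold

    def is_deleterious(gene, cadd_score, use_gene_level=True):
        """Flag a variant if its CADD score reaches the relevant cutoff."""
        if use_gene_level and gene in gene_level_cutoff:
            return cadd_score >= gene_level_cutoff[gene]
        return cadd_score >= GLOBAL_CUTOFF

    # A variant in a gene with a low cutoff is flagged by the gene-level rule
    # even though it would be missed by the global threshold.
    print(is_deleterious("TLR3", 8.0))                        # True
    print(is_deleterious("TLR3", 8.0, use_gene_level=False))  # False

The point of a gene-level rule is that genes differ widely in how damaging their known disease-causing variants score, so a single genome-wide cutoff can be too strict for some genes and too lenient for others.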

The February editorial discusses the strengths and limitations of large human variation databases and emphasizes the importance of sharing variant data in publicly accessible databases. We encourage our readers to share their experience with these databases and to recommend their favorite ones.

Genomics at top speed: Q&A with Stephen Kingsmore

 

Stephen Kingsmore

Biomedical researcher Stephen Kingsmore is on the move. He has just taken on a new post running the new Rady Pediatric Genomics and Systems Medicine Institute, which is part of Rady Children’s Hospital in San Diego. He is leaving Children’s Mercy Hospital (CMH) in Kansas City, where he founded the Center for Pediatric Genomic Medicine.

Kingsmore, along with colleagues at CMH, has also just published a method called STAT-seq, with which the team performs whole-genome sequencing and analysis in 26 hours.

As Neil Miller, CMH’s director of informatics, explains, CMH is in the process of making most of the downstream characterization and interpretation software behind the STAT-seq pipeline freely available. The team also plans to make its warehouse of genetic variants available, and it wants to launch a software-as-a-service offering so that people without IT infrastructure can use these tools.

Nature Methods spoke to Kingsmore; what follows is an edited version of the conversation.

Q: To better diagnose and make treatment decisions about seriously ill babies in intensive care, you have found a way to sequence the whole genomes of parents and their newborn and analyze them in 26 hours. The babies might be having unexplained seizures, and the parents are deeply upset. Does this involve a lot of people doing the analysis? Or is much of the analysis automated? Just thinking there might be a new world of jobs opening up.

Stephen Kingsmore: The analysis and interpretation are highly automated. That being said, there will be a new world of jobs opening up as folk like us scale up to meet the needs of local populations.

There are 9 million people living in the Rady catchment area, so we foresee a need for 25,000 parent or child genomes a year! That’s a very large number of new genetic counselors.

Q: How do you validate that the genomic analysis is right, especially under these high-pressure circumstances with high stakes? Sanger sequencing?

S.K.: Yes, Sanger sequencing or other appropriate confirmatory test, depending on the type of mutation.

There may also be a need for functional validation, since all that Sanger does is to say that the letter code is correct – it doesn’t speak to whether the mutation is actually causing disease.

Q: Structural variants are complicated to find, making for time-consuming analysis, but they play a role in many diseases. What is needed to make them part of speedy genome analysis?

S.K.: Yes, this is a key need. We need robust, fast methods for finding structural variants genome-wide. Microarrays don’t pick up small structural variants or complex variants, such as inversions.

This will be a race between longer-read or longer-insert whole-genome sequencing and newer methods, such as the offerings from companies like 10X Genomics and BioNano.

We then need to integrate the two types of variant information so we get a full picture of variations. And all of that can happen with the ease of interpretation and speed now possible for whole genome sequencing.

Q: There are a number of fast computational analysis pipelines, such as Churchill, SpeedSeq and now yours. Is this officially a race of speed demons? One of them, SpeedSeq, describes genome analysis in 13 hours. How should these tools be compared in terms of sensitivity and specificity?

S.K.: We need a bake-off! I’m biased, but I think ours is the fastest, with genome analysis that takes between one and one and a half hours. It has the highest sensitivity and specificity for nucleotide variants, and the smallest IT footprint for local implementation.

However, ours is not yet in the public domain nor yet available on a software-as-a-service basis, and does not yet have fully integrated structural variant calls. We hope to rectify these things by the end of the year.
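As a minimal illustration of what such a bake-off measures, the Python sketch below counts true positives, false positives and false negatives for a call set against a truth set and reports sensitivity and precision. The call and truth sets here are invented, and a real comparison would benchmark against a curated reference such as Genome in a Bottle and would also have to reconcile differences in how the same variant can be represented.

    # Hypothetical variant sets, each entry (chromosome, position, ref, alt)
    truth = {("chr1", 1005, "A", "G"), ("chr2", 884, "C", "T"), ("chr7", 120, "G", "A")}
    calls = {("chr1", 1005, "A", "G"), ("chr2", 884, "C", "T"), ("chr9", 55, "T", "C")}

    tp = len(calls & truth)   # true positives: called and present in the truth set
    fp = len(calls - truth)   # false positives: called but absent from the truth set
    fn = len(truth - calls)   # false negatives: present in the truth set but missed

    sensitivity = tp / (tp + fn)   # fraction of true variants recovered
    precision = tp / (tp + fp)     # fraction of calls that are correct

    print(f"sensitivity={sensitivity:.2f}, precision={precision:.2f}")

Specificity in the strict sense is hard to quote for genome-wide variant calling, because the number of true-negative positions is enormous, which is why benchmarking efforts usually report sensitivity and precision instead.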

Q: In your new study you use proprietary hardware by a company called Edico Genome; others are using open source software. Do people need to decide on belonging to the open or closed club when they want to try to implement what you are doing?

S.K.: Edico is a mix of hardware and software. The overall cost, I think, is significantly lower than traditional compute plus freeware. We are strongly focused on making freeware versions of the software described in the manuscript available by the end of the year.

That being said, there are some excellent commercial software options, and Genomics England has gone that route after their bake-off for the 100k Genome Project. So yes, people should really step back at this juncture and think critically about their needs over the next two to three years.

Q: The data, particularly on children’s genomes, are sensitive but also of great interest to researchers. How do you work out data-sharing schemes for these data?

S.K.: This is a delicate balancing act. There is great value for researchers in being able to re-analyze genomes together with structured clinical data, especially where a diagnosis was not evident.

We like the secure NIH database of Genotypes and Phenotypes (dbGaP) route, which balances the need for confidentiality, even of de-identified data, with the needs of researchers and funding agencies.

 

The ethics of self-organizing tissue

It is becoming increasingly clear that stem cells are able to form remarkably complex structures in vitro if they are handled right. In this month’s issue, two pieces ask whether recent developments in methods for patterning embryonic stem cells in vitro raise potential ethical, regulatory or public-perception concerns, or whether they may do so in the future.

You can find the commentary from leading stem cell and developmental biologists here [https://www.nature.com/nmeth/journal/v12/n10/full/nmeth.3586.html] and the editorial here [https://www.nature.com/nmeth/journal/v12/n10/full/nmeth.3618.html].

We note that some of these matters were also brought up in a paper published at the end of last year (Cells Tissues Organs 2014;199(4):221-7).

 

DIY Biolabs – and why they matter

When proponents of Do-it-yourself Biology explain their motivation for getting involved in the movement, they often resort to colorful imagery. Take, for example, Patrick D’haeseleer, who helps organize the Counter Culture Labs in the San Francisco Bay Area. He asks, “When the first village tamed fire, the neighboring village was freaking out. Should only the village elders be allowed to make fire, or should we teach everybody?” “Any new technology has risk, but it behooves us to have all citizens know how these technologies work and what the risks are,” he continues. “The technology needs to be democratized because it will dominate the 21st century.”

In our September editorial we encouraged people to look up DIY biolabs in their backyard and consider getting involved.

A recent editorial in Nature also addresses the topic of citizen scientists, focusing on sample collection and data analysis. The authors raise the question of how conflicts of interest should be addressed and recommend full transparency about the motives and ambitions of citizen scientists.

We agree that it is important to be upfront about one’s involvement in scientific endeavors, but motives conflicting with those of established scientists need not preclude participation in the scientific process. As people learn more about methods and their potential, they may change their position on certain issues or, if not, they will have a better-grounded basis for what they believe. Either outcome is a success.

A physicist’s adventures in biology, funding and job-hunting:

Q&A with Ronald Walsworth, a staff scientist at the Harvard-Smithsonian Center for Astrophysics and a faculty member in the Harvard physics department.

The scientist profiled in the August issue of Nature Methods, Ronald Walsworth, a physicist at the Harvard-Smithsonian Center for Astrophysics (CfA), has built and tested a quantum diamond microscope that benefits from particular flaws in a diamond.

What follows is an edited excerpt of his conversation with Nature Methods’ technology editor Vivien Marx. Read more here.

Ronald Walsworth (r) and Chih-Hao Li (l) adjust a laser frequency comb used in the search for Earth-like exoplanets. Photo credit: Harvard-Smithsonian Center for Astrophysics

Q: The new instrument can quantify single cells. What else can it do, or what might it do?

RW: We have all these neat, cool things that diamond sensing can do, based on the way it helps to detect small changes in magnetic fields. Diamonds can go into extreme environments – underground, under water, or into high temperatures such as in an airplane engine – where there is a need for sensors.

Some companies are developing imaging systems based on our research on the special kind of flaw in diamonds that involves nitrogen-vacancy (NV) centers. There is an entire community of researchers working on NV centers who use them as nanoscale probes with which you can map out magnetic signatures at the near-atomic scale.

There are physical science applications, such as nanoscale probing of the surfaces of novel materials that might be used in computing or for energy storage. They are also used for condensed matter physics research. In the life sciences, these diamonds can probe living tissue. We’re pursuing an approach using a planar surface with many NV centers to image a sample. In our paper, we showed single-cell imaging, and we think we could move toward single-molecule sensing and imaging, such as assaying tissues for magnetic signatures in early-stage brain disease.

Q: You do fundamental physics. Isn’t biology too squishy for you?

RW: We are doing a lot of basic physics research, a bit of astrophysics, lots of different things. I enjoy learning new things. One way to do that and to be professionally productive is to develop new tools that are relevant from day one for some field that is new to me. Then I have the enjoyment of learning while I am contributing and while I’m still kind of ignorant. About halfway up the learning curve is where I often have my best ideas.

I am involved in collaborations on how to create networks of atomic clocks; another is about sensing gravitational waves; others are about new ways of detecting exoplanets. I am a professor in the department of physics and a member of the center for brain science. I have a lab there, too, next to [neuroscientist] Jeff Lichtman. Some astro-people are increasingly interested in aspects of biology, too. There’s the Origins of Life Initiative at Harvard led by my friend and exoplanet astronomer Dimitar Sasselov, who is also interested in questions of synthetic biology.

Q: Your background led you down this path?

RW: I did my PhD in physics at Harvard and kind of never left. I never really did a post-doc. I got my PhD in the physics department on atomic clocks and fundamental symmetry tests in physics and did some work at the CfA where there was an atomic clock group. Then I got an offer to do a post-doc with future Nobel Laureate David Wineland, at the National Institute of Standards and Technology. I felt that was great, but I wanted to finish up my PhD work.

One thing led to another, and after a year I developed some ideas of my own that I wanted to pursue. There was some unused lab space and I took a gamble that I could just keep myself going on my own and do the research that I wanted to do and declined the post-doc. Years passed and I was able to raise money and build up a research group from scratch.

Q: You said you don’t recommend this path to others, why might it not work for them?

RW: The Smithsonian had some funding that made me a principal investigator from day one. You didn’t have to go through some formal hiring process; you could just try things. If you got resources to keep yourself afloat, you did science. It was very fertile, and it gave me and others a path through which to rise up, if we could get science done.

By the mid-1990s, I had built a group of four to five people, without being hired by anyone in any formal way other than having people at CfA say: “you can stick around in some space in a corner.”

I never went through a formal hiring process following up on an advertised position. I’ve never been hired anywhere. The last time I applied for a post was in 1984 when I applied to graduate school.

Q: Is that kind of path still possible?

RW: There is a good aspect to having more formalized procedures: there isn’t nepotism and favoritism. Under-represented groups can be properly advanced and you are making sure there is a level playing field: those are all good things.

When it’s a bit more like the Wild West, there are lowered hurdles for young scientists when they are creative and full of energy. But even when you are trying to make things progressive and formalized, it becomes unintentionally regressive to satisfy older and middle-aged scientists on hiring panels and study sections who vet and decide that “this is the young person we are allowing to join our ranks.” That lets existing researchers decide who should be hired and given a chance, rather than giving everybody a chance to pursue their creativity and see how it works out.

Q: How do you personally advise young scientists?

RW: Now I am going to sound like a griping middle-aged guy, but I can think back to the young guy, too. There are too many barriers keeping people from trying things out. Scientists write proposals to get funding. We are enthusiastic about some of the proposals; others we are less excited about, and it’s tough for those.

But we always need to look out for the younger people, the next generation of scientists with promise and talent. We need to clear the path so they can get things done. I have physics graduate students from MIT and Harvard; some are also in chemistry or biophysics, some are MD/PhDs. I try to integrate everybody. They are in different locations, so we switch locations for lab group meetings. On Fridays we have group seminars and then we go out to lunch together. One of my post-docs is someone who should already have a job. That’s another thing that bothers me: there are not enough jobs.

There are jobs for people who want to leave science and use their great analytical skills to do data analysis in the financial world. But for people to deploy their skills properly and continue in science, it’s hard. I have helped land good jobs for some of my people, and I have a few more who are just finishing post-docs, who are just great and who need jobs.

Funding crisis in basic research

The editorial in our November issue discusses the shortage of funding for basic science in the US and how the ramifications of funding shortfalls on society at large can be measured. But the US is not alone in facing a much tighter research budget.

Outcries over changes in funding policies can also be heard in many European countries.

In a column in Nature on October 9, Amaya Moro-Martin, a member of the governing board of Euroscience, describes the problems and sums up their root cause by saying that “the policy-makers and leaders of an increasing number of nations have completely lost touch with the reality of research.” The numbers she quotes are indeed stark: Italy’s spending on basic research has dropped precipitously and recruitment of scientists has fallen by 90%. And the situation is equally dire, if not more so, in Spain, Greece and Portugal. She predicts that these budget cuts are triggering a brain drain from southern to northern Europe and, more seriously, are leading people to leave research altogether in search of a more stable career.

The focus of the European research commissioner on applied rather than basic research will not solve this problem. Applied research does not ask the fundamental questions that underlie new discoveries; instead, it improves upon what is already known.

Scientists in Canada voice similar concerns that their government is putting an increasing focus on funding projects that offer immediate commercial value rather than supporting basic research. A report by the Canadian Association of University Teachers summarizes the drop in Canadian federal funding for basic research over the last eight years. The authors urge their government to make basic research a priority and to leave the awarding of grants to peers rather than sidestepping this process by determining which projects or institutions will receive money.