XFEL projects, tools, data portals

Earlier this year, the EuXFEL’s first laser beam reached the ‘hutch’.{credit}Jessica Mancuso{/credit}

As of September 1, the European X-ray free-electron laser (EuXFEL) is ready for the research community’s experiments; details are on the facility’s user page.

In the September issue of Nature Methods we present some of the experimental ideas researchers are exploring at that facility.

Other XFELs are operational or in the works: the FERMI facility, the Linac Coherent Light Source (LCLS) at Stanford, the Pohang Accelerator Laboratory (PAL) X-ray Free-Electron Laser, the SPring-8 Angstrom Compact Free Electron Laser (SACLA) and the Swiss Free-Electron Laser (SwissFEL).

One day, there might even be an XFEL that fits on a tabletop (see below). The day is already here when scientists need to analyze mountains of XFEL data, and the EuXFEL will likely make those mountains grow. There are tools for that, and likely more to come (see below for a list of some of them).

Tabletop XFEL

To complement the large XFEL facilities, a number of research groups are developing benchtop XFELs1,2. Such projects involve miniaturizing all aspects of the technology, including the accelerator. Some groups are exploring ways to pass a laser through plasma to produce bright, high-energy, short-pulsed beams. Separately, some researchers use a terahertz generator, which can provide sufficiently high pulse energies, says Franz Kärtner, a physicist at the University of Hamburg who also holds an appointment at MIT. His team, along with Petra Fromme of Arizona State University, is developing such a tabletop XFEL instrument.

The scientists would like to use the instrument for coherent diffractive imaging and spectroscopy experiments on photosystem II, a protein complex involved in photosynthesis. Their compact XFEL approach, which will use a terahertz generator, lasers and nonlinear optics, is calculated to achieve photon energies between 10 and 12 keV, hard X-rays that can be harnessed for imaging at atomic resolution, says Kärtner.

Although this compact XFEL will generate fewer photons per shot than a large-scale FEL (10⁶ to 10⁹ photons per shot, as opposed to 10¹² photons or more), the machine will be able to produce very short pulses, on the order of 0.5 femtoseconds. That is 10–100 times shorter than current FEL pulses. And if that comes to be, says Kärtner, the peak power of the instrument may be almost on par with that of a large XFEL.
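A rough back-of-envelope check of that peak-power claim: peak power is pulse energy divided by pulse duration. The photon numbers, pulse length and 10–12 keV photon energy come from the figures quoted above; the ~25 fs pulse length for a large FEL is an illustrative assumption.

```python
# Back-of-envelope peak-power comparison between the proposed compact XFEL
# and a large-scale facility, using the figures quoted in the text.
E_PHOTON_J = 12e3 * 1.602e-19   # 12 keV photon energy in joules

def peak_power_watts(photons_per_shot, pulse_s):
    """Pulse energy (photons x energy per photon) divided by pulse duration."""
    return photons_per_shot * E_PHOTON_J / pulse_s

# Compact XFEL: up to ~1e9 photons in a ~0.5 fs pulse
compact = peak_power_watts(1e9, 0.5e-15)
# Large XFEL: ~1e12 photons; a ~25 fs pulse is assumed here for illustration
large = peak_power_watts(1e12, 25e-15)

print(f"compact: {compact:.1e} W, large: {large:.1e} W")
```

With these optimistic numbers, the compact machine reaches a few gigawatts of peak power, within roughly an order of magnitude of the large facility.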

A terahertz accelerator module for a table-top XFEL in the making
{credit}DESY/Heiner Müller-Elsner{/credit}

In the instrument’s terahertz-driven accelerator, acceleration gradients will be between 500 MV/m and 1 GV/m. It’s the high terahertz frequency that helps to compress electron bunches over short distances and that will let the developers use compact electron guns. In this fashion, they will be able to shoot a coherent electron beam directly from a gun and emit an X-ray beam much like a FEL does, says Kärtner.
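For a sense of what such gradients mean for accelerator size, here is a small sketch. Only the 500 MV/m to 1 GV/m gradient range comes from the text; the 20 MeV target beam energy is a hypothetical figure chosen purely to illustrate, and conventional accelerators with much lower gradients would need many meters to reach it.

```python
# How the quoted gradients translate into accelerator length:
# length = target energy / gradient.
# The 20 MeV target energy below is a hypothetical example;
# only the 500 MV/m - 1 GV/m gradient range comes from the text.
target_energy_MeV = 20.0  # hypothetical beam energy

for gradient_MV_per_m in (500.0, 1000.0):
    length_m = target_energy_MeV / gradient_MV_per_m
    print(f"{gradient_MV_per_m:.0f} MV/m -> {length_m * 100:.0f} cm")
```

At these gradients the accelerating structure shrinks to centimeters, which is what makes a tabletop footprint conceivable.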

Tabletop XFELs fill an important experimental gap between Röntgen’s X-ray tube and the large-scale FELs. “There is nothing in between,” says Kärtner. That’s akin to a situation in optical science in which researchers need to choose between a light bulb and a large-scale optical laser such as the one at the National Ignition Facility at Lawrence Livermore National Laboratory. If the developers of the compact XFEL succeed at packing enough photons into each shot, the instrument will have potential applications in many fields, he says, including enhanced characterization of materials or higher-resolution medical and structural biology imaging.

  1.  Kneip et al. Nat. Phys. 6, 980–983 (2010).
  2.  Kärtner, F. X. et al. Nucl. Instrum. Methods Phys. Res. A 829, 24–29 (2016).

Data mountains

XFEL-based experiments produce mountains of data. At the EuXFEL, two two-dimensional pixel detectors will each deliver 10–40 gigabytes of data every second of an experiment.
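To put those rates in perspective, a quick calculation of the data volume from a single detector over one shift; the 12-hour shift length is an illustrative assumption, not an EuXFEL figure.

```python
# Rough data-volume estimate for one EuXFEL detector at the quoted rates.
# A 12-hour experimental shift is assumed purely for illustration.
GB = 1e9                       # one gigabyte in bytes
shift_seconds = 12 * 3600      # hypothetical 12-hour shift

low = 10 * GB * shift_seconds    # at 10 GB/s
high = 40 * GB * shift_seconds   # at 40 GB/s

print(f"{low / 1e15:.2f}-{high / 1e15:.2f} PB per detector per shift")
```

Even at the low end, that is close to half a petabyte per detector per shift, which is why the storage and analysis pipeline matters as much as the instrument.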

Experimental data will be housed in the facility’s online systems and then moved to offline disk-based systems, also at the facility, where researchers can access and analyze them, says Filipe Maia, a biophysicist at Uppsala University.

The data torrent makes for “a daunting problem,” says Maia, “and currently there’s clearly a lack of user friendly tools.” This issue is a general trend rather than one unique to XFEL-based research, and researchers are getting better at handling large datasets as they familiarize themselves with the increasingly available tools. After publication, he hopes, XFEL data will be deposited in an online repository and shared with the community. One such resource is the Coherent X-ray Imaging Data Bank (CXIDB), which he built.


Here are some tools for analyzing and managing XFEL data.

CASS (CFEL-ASG): Suite of tools for real-time monitoring of XFEL experiments, data analysis and visualization, raw-data correction and crystal hit finding.
cctbx.xfel: Suite of tools for processing measurements made during SFX experiments at an XFEL. Built on the Computational Crystallography Toolbox.
CrystFEL: Software suite for processing SFX data.
Cheetah: Data analysis and high-throughput data reduction tools for SFX data.
Condor: Simulation of flash X-ray imaging to help solve structures without needing crystallization.
Dragonfly: Software/algorithm for single-particle imaging with XFELs.
Hummingbird: Real-time monitoring of XFEL experiments.
Hawk: Package for analyzing and phasing diffraction patterns from single-particle experiments.
IOTA: Spot-finding software for XFEL-based diffraction images. Part of the cctbx.xfel suite.
OnDA: Real-time monitoring and data analysis of XFEL experiments.
psana: A data analysis framework at LCLS.
SACLA analysis framework: Real-time data-processing pipeline at SACLA for serial femtosecond crystallography; it uses modified versions of Cheetah and CrystFEL.
WavePropaGator: Software framework for simulating XFEL experiments.
XATOM: Software for calculating and simulating X-ray–atom interactions. Part of the Xraypac software package.
XMDYN: Simulation tool for modeling the dynamics of matter exposed to high-intensity X-rays. Part of the Xraypac software package.

Resources and portals

Coherent X-ray Imaging Data Bank (CXIDB): A database for coherent X-ray imaging experiments.
LCLS data analysis: Data analysis resources at LCLS.
Protein Data Bank (PDB): Data repository for protein structures.
SIMEX: A project that aims to develop an experimental simulation platform for use at XFELs.

Sources: Henry Chapman, DESY; Janos Hajdu, Filipe Maia, Uppsala University; Sébastien Boutet, LCLS

LCLS: Linac Coherent Light Source, SLAC National Accelerator Laboratory (formerly the Stanford Linear Accelerator Center)
SACLA: SPring-8 Angstrom Compact Free Electron Laser
SFX: serial femtosecond crystallography
XFEL: X-ray free-electron laser

A physicist’s adventures in biology, funding and job-hunting:

Q&A with Ronald Walsworth, a staff scientist at the Harvard-Smithsonian Center for Astrophysics and a faculty member in the Harvard physics department.

The scientist profiled in the August issue of Nature Methods, Ronald Walsworth, a physicist at the Harvard-Smithsonian Center for Astrophysics (CfA), has built and tested a quantum diamond microscope that benefits from particular flaws in a diamond.

What follows is an edited excerpt of his conversation with Nature Methods’ technology editor Vivien Marx. Read more here.

Ronald Walsworth (r) and Chih-Hao Li (l) adjust a laser frequency comb used in the search for Earth-like exoplanets.
Photo credit: Harvard-Smithsonian Center for Astrophysics

Q: The new instrument can quantify single cells. What else can it do, or might it do?

RW: We have all these neat, cool things that diamond sensing can do, based on the way it helps to detect small changes in magnetic fields. Diamonds can go into extreme environments where there is a need for sensors: underground, under water, or at high temperatures such as in an airplane engine.

Some companies are developing imaging systems based on our research on the special kind of flaw in diamonds that involves nitrogen-vacancy (NV) centers. There is an entire community of researchers working on NV centers, using them as nanoscale probes with which you can map out magnetic signatures at the near-atomic scale.

There are physical-science applications, such as nanoscale probing of the surfaces of novel materials that might be used in computing or for energy storage. NV centers are also used in condensed-matter physics research. In the life sciences, these diamonds can probe living tissue. We’re pursuing an approach that uses a planar surface with many NV centers to image a sample. In our paper, we showed single-cell imaging, and we think we could move toward single-molecule sensing and imaging, such as assaying tissues for magnetic signatures in early-stage brain disease.

Q: You do fundamental physics. Isn’t biology too squishy for you?

RW: We are doing a lot of basic physics research, a bit of astrophysics, lots of different things. I enjoy learning new things. One way to do that and to be professionally productive is to develop new tools that are relevant from day one for some field that is new to me. Then I have the enjoyment of learning while I am contributing and while I’m still kind of ignorant. About half way up the learning curve is where I often have my best ideas.

I am involved in collaborations on creating networks of atomic clocks, on sensing gravitational waves and on new ways of detecting exoplanets. I am a professor in the department of physics and a member of the center for brain science. I have a lab there, too, next to [neuroscientist] Jeff Lichtman. Some astro-people are increasingly interested in aspects of biology, too. There’s the Origins of Life Initiative at Harvard led by my friend, the exoplanet astronomer Dimitar Sasselov, who is also interested in questions of synthetic biology.

Q: Your background led you down this path?

RW: I did my PhD in physics at Harvard and kind of never left; I never really did a post-doc. My PhD work, in the physics department, was on atomic clocks and fundamental symmetry tests in physics, and I did some of it at the CfA, where there was an atomic clock group. Then I got an offer to do a post-doc with future Nobel laureate David Wineland at the National Institute of Standards and Technology. I felt that was great, but I wanted to finish up my PhD work.

One thing led to another, and after a year I developed some ideas of my own that I wanted to pursue. There was some unused lab space and I took a gamble that I could just keep myself going on my own and do the research that I wanted to do and declined the post-doc. Years passed and I was able to raise money and build up a research group from scratch.

Q: You said you don’t recommend this path to others. Why might it not work for them?

RW: The Smithsonian had some funding that made me a principal investigator from day one. You didn’t have to go through a formal hiring process; you could just try things. If you got resources to keep yourself afloat, you did science. It was very fertile, and it gave me and others a path to rise up, if we could get science done.

By the mid-1990s, I had built a group of four to five people, without being hired by anyone in any formal way other than having people at CfA say: “you can stick around in some space in a corner.”

I never went through a formal hiring process following up on an advertised position. I’ve never been hired anywhere. The last time I applied for a post was in 1984 when I applied to graduate school.

Q: Is that kind of path still possible?

RW: There is a good aspect to having more formalized procedures: there isn’t nepotism and favoritism, under-represented groups can be properly advanced, and you are making sure there is a level playing field. Those are all good things.

When it’s a bit more like the Wild West, the hurdles are lower for young scientists who are creative and full of energy. But even when you are trying to make things progressive and formalized, it can become unintentionally regressive: older and middle-aged scientists on hiring panels and study sections vet candidates and decide that “this is the young person we are allowing to join our ranks.” That lets existing researchers decide who should be hired and given a chance, rather than giving everybody a chance to pursue their creativity and see how it works out.

Q: How do you personally advise young scientists?

RW: Now I am going to sound like a griping middle-aged guy, but I can think back to being the young guy, too. There are too many barriers keeping people from trying things out. Scientists write proposals to get funding; we are enthusiastic about some of the proposals and less excited about others. It’s tough for those whose proposals we are less excited about.

But we always need to look out for the younger people, the next generation of scientists with promise and talent. We need to clear the path so they can get things done. I have physics graduate students from MIT and Harvard; some are also in chemistry or biophysics, and some are MD/PhDs. I try to integrate everybody. They are in different locations, so we switch locations for lab group meetings. Fridays we have group seminars and then we go out to lunch together. One of my post-docs is someone who should already have a job. That’s another thing that bothers me: there are not enough jobs.

There are jobs for people who want to leave science and use their great analytical skills to do data analysis in the financial world. But for people who want to deploy their skills properly and continue in science, it’s hard. I have helped land good jobs for some of my people, and I have a few more who are just finishing post-docs, who are just great and who need jobs.

Is phototoxicity compromising experimental results?

Light-induced damage to biological samples during fluorescence imaging is known to occur but receives too little attention from researchers.

The December Technology Feature in Nature Methods asks whether super-resolution microscopy is right for you, and a point that came up repeatedly with the researchers we interviewed is the danger of phototoxicity and photodamage caused by the high irradiation intensities these methods require. This has long been a concern, and many of the papers describing the methods mention it.

But as discussed in the December Editorial, even fluorescence microscopy with low irradiation intensities can cause levels of phototoxicity that permanently damage the sample. Microscopists are aware of these concerns, but there has been little effort to implement processes intended to reduce the likelihood of phototoxicity compromising research results. Dave Piston, director of the Biophotonics Institute at Vanderbilt University School of Medicine, laments that although phototoxicity is a big deal, he has gotten zero traction with NIH reviewers when trying to establish some rules for it.

There are some good resources available to researchers that highlight the dangers of phototoxicity and provide advice on how to limit it. Methods in Cell Biology Vol. 114 has an excellent chapter by Magidson and Khodjakov, “Circumventing Photodamage in Live-Cell Microscopy,” that should be mandatory reading for all researchers using fluorescence microscopy for biological research. Also, Nikon’s MicroscopyU has a literature list with several dozen references and recommended reading on phototoxicity. It could use some updating but is still useful.

Despite the amount of microscopy literature that discusses phototoxicity, discussion of the phenomenon in research articles published in Nature journals is conspicuously absent. This is highlighted by a simple full-text search we performed on the HTML versions of research articles published in Nature, Nature Cell Biology, Nature Immunology, Nature Methods and Nature Neuroscience. The search was limited to original research articles.

The table below lists the number of occurrences of each of the listed terms from January 1, 2005 to November 3, 2013 in each of the indicated journals. The percentages give the number of articles containing ‘phototoxicity’ as a fraction of the number of articles containing each of the microscopy- or fluorescence-related terms. Note that this is NOT a measure of co-occurrence, only a measure of how common the term ‘phototoxicity’ is relative to the other terms.

Journal              phototoxicity (#)   fluorescence (# / %)   fluorescent (# / %)   microscopy (# / %)   microscope (# / %)
Nature               8                   2120 / 0.4%            1925 / 0.4%           1995 / 0.4%          1918 / 0.4%
Nature Cell Biology  8                   815 / 1.0%             728 / 1.1%            866 / 0.9%           822 / 1.0%
Nature Immunology    6                   552 / 1.1%             574 / 1.0%            408 / 1.5%           326 / 1.8%
Nature Methods       27                  565 / 4.8%             494 / 5.5%            441 / 6.1%           407 / 6.6%
Nature Neuroscience  18                  639 / 2.8%             727 / 2.5%            587 / 3.1%           736 / 2.4%
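The percentages above are straightforward to reproduce. As a worked example using the Nature Methods counts from the table:

```python
# Reproduce the table's percentages: occurrences of 'phototoxicity'
# relative to the number of articles containing each microscopy/fluorescence
# term. Counts are the Nature Methods row from the table above.
phototoxicity = 27
term_counts = {"fluorescence": 565, "fluorescent": 494,
               "microscopy": 441, "microscope": 407}

percentages = {term: round(100 * phototoxicity / n, 1)
               for term, n in term_counts.items()}
print(percentages)
# {'fluorescence': 4.8, 'fluorescent': 5.5, 'microscopy': 6.1, 'microscope': 6.6}
```

The same calculation, with the photodamage counts substituted, yields the second table.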

 

The same analysis was repeated with the term ‘photodamage’ to determine whether there was a substantial difference in the usage of these two similar terms.

Journal              photodamage (#)   fluorescence (# / %)   fluorescent (# / %)   microscopy (# / %)   microscope (# / %)
Nature               18                2120 / 0.8%            1925 / 0.9%           1995 / 0.9%          1918 / 0.9%
Nature Cell Biology  6                 815 / 0.7%             728 / 0.8%            866 / 0.7%           822 / 0.7%
Nature Immunology    2                 552 / 0.4%             574 / 0.3%            408 / 0.5%           326 / 0.6%
Nature Methods       29                565 / 5.1%             494 / 5.9%            441 / 6.6%           407 / 7.1%
Nature Neuroscience  12                639 / 1.9%             727 / 1.7%            587 / 2.0%           736 / 1.6%

 

These results carry a potentially large caveat: the analysis did not include the text of the supplementary information. Still, the rarity with which phototoxicity or photodamage is discussed (0.4% to 7% relative to the microscopy terms) suggests that researchers don’t fully appreciate how important it is to pay attention to artifacts that result from light irradiation. Luckily, there are exceptions to this state of affairs.

An excellent example of testing for phototoxicity and the subtle effects it can induce can be found in a manuscript from Jeff Magee’s lab at Janelia Farm Research Campus published last year in Nature. Quoting from the manuscript, “Particular care was taken to limit photodamage during imaging and uncaging. This included the use of a passive 8× pulse splitter in the uncaging path in most experiments to reduce photodamage drastically [Ji, N. et al. Nat. Methods (2008)]. Basal fluorescence of both channels was continuously monitored as an immediate indicator of damage to cellular structures. Subtle signs of damage included decreases in or loss of phasic Ca2+ signals in spine heads in response to either uncaging or current injection, small but persistent depolarization following uncaging, and changes in the kinetics of voltage responses to uncaging or current injection. Experiments were terminated if neurons exhibited any of these phenomena.”

It is easy to see how these changes in Ca2+ responses could have been interpreted as real biological effects caused by the uncaged glutamate rather than by the uncaging light itself.

It is unrealistic to expect that any mandates or oversight would be able to prevent or detect such consequences of phototoxicity in research studies. It is essential that investigators themselves be vigilant and implement appropriate controls to detect these effects. Na Ji, also at Janelia Farm Research Campus, says, “It is not enough to only look for instant and dramatic signs of phototoxicity. Sometimes the effects may be more subtle and even unperceivable during the imaging period, but may become obvious when the same sample is imaged the next day. Care has to be taken in data collection and interpretation, especially when the biological process under investigation itself is a subtle one.”

Finally, the application is just as important as the imaging method being used. For example, light-sheet microscopy is excellent at reducing irradiation levels in volumetric imaging. But some applications of super-resolution microscopy, even on living samples, might be less susceptible to artifacts caused by phototoxicity than are sensitive long-term light-sheet imaging applications of living samples. Nobody’s microscope earns them a free pass on the dangers of phototoxicity and photodamage. Everyone needs to be vigilant.

Update: A reader helpfully pointed out that the danger of phototoxicity and photodamage also applies to optogenetics, where light (often in the blue region of the spectrum) is used to control protein activity.

DNA origami on the rise

Nanotechnology is all the rage these days, but its use by practicing biologists is still very limited. A recent entry in the nanotechnology arena is DNA origami, a method for creating nanostructures out of DNA that is more accessible than previous methods and allows larger and more complex structures to be created with greater ease.

In the April issue of Nature Methods you will find a primer to DNA origami that provides an excellent introduction to this technology with valuable practical advice on designing and synthesizing DNA nanostructures using the DNA origami methodology. We hope that this primer will stimulate biologists or others new to this field to take a look at this technology and dream up exciting new applications.

One of the crucial steps of DNA origami is isolating your properly folded structure. A Correspondence by William Shih, one of the pioneers of DNA origami, describes some simple but very useful modifications to an agarose gel electroelution method that many people use for isolating PCR products or small DNA fragments from restriction digests. These changes greatly increase the efficiency of isolating intact large DNA nanostructures compared to existing methods.

Finally, the Editorial discusses the prospects of DNA nanostructures created using DNA origami as biological research tools.

Judging by the number of posters describing applications of DNA origami at the 2010 Gordon Research Conference on Single Molecule Approaches to Cell Biology, compared with previous years, the biological community, and the single-molecule biophysics community in particular, is showing interest in the methodology. Only time will tell whether it fares better among biologists than other promising nanotechnology tools and methods have.

We’d like to know what our readers think of the biological research prospects of this technology, or other nanotechnology tools and methods for that matter. Tell us what you think.