New Journal and New Determination in Time for Parkinson’s Awareness Month

Contributor: James Beck, Ph.D.

Today’s launch of npj Parkinson’s Disease has auspicious timing. As we publish the inaugural papers of this new journal – all seeking answers that can help us understand this chronic neurodegenerative disease – we do so during Parkinson’s Awareness Month in the US.

April, the birthday month of the late James Parkinson, is a time when we remember that seven to ten million people worldwide are still living with the disease, counting on science to help them lead better lives. It is a time when we remember the urgency of addressing both the visible and the invisible symptoms of the disease.

For you see, there is more to Parkinson’s disease than simply the visible signs of a shaking palsy. Parkinson’s disease is an ‘invisible’ disease too. Many of the symptoms people with Parkinson’s experience lie just under the surface as invisible, non-motor symptoms such as fatigue, autonomic dysfunction, balance problems and others. And as of now, these remain unsolved.

At a time when biomedical journals are proliferating, a new journal may not seem particularly noteworthy. However, npj Parkinson’s Disease is indeed different. Parkinson’s Awareness Month makes it clear that much work remains to be done to help those who live with Parkinson’s disease, and through npj Parkinson’s Disease, the Parkinson’s Disease Foundation and Nature Publishing Group are intent on publishing excellent science to get that work done.

The inaugural issue of the journal does exactly that, with articles addressing some of the unresolved issues in Parkinson’s disease, including how the disease might start and what might be the biological basis for one of its most troubling symptoms – hallucinations.

  • On the first point, despite the more than 50 years since the loss of dopamine was recognized as the basis for the profound changes in movement that people with Parkinson’s experience, scientists are still at a loss to explain what triggers the disease in all but a limited number of cases. Malú Tansey, Ph.D., and her colleagues at Emory University are working to understand more subtle triggers of Parkinson’s. Tansey et al. describe how a single nucleotide polymorphism in the MHC-II locus may link environmental factors to the regulation of antigen presentation. Their work continues the trend of showing how genes and environment remain inextricably linked in disease risk.
  • Another issue unseen by observers, but of which many people with PD are acutely aware, is hallucinations. Psychosis, unfortunately, is not an uncommon symptom in PD, but it can be one of the most debilitating and troubling, often leading to early placement in a nursing home. Therefore, the fMRI study by Simon Lewis, M.D., and colleagues using surrogates of visual hallucinations is particularly relevant. They reveal significant abnormal connectivity in attentional networks during tasks designed to mimic hallucinations. Identifying the fundamental issues underlying PD psychosis is an important step toward treating and preventing this disabling symptom.

(Image: Parkinson’s Disease Foundation)

While these represent just a few of the unsolved mysteries in Parkinson’s disease, there are many more. Deciding which mystery to tackle next is no mean feat. On our end, PDF is working to understand the unmet needs in Parkinson’s disease, and which are most pressing, by asking people who live with the disease about their priorities through our “people’s choice” research award. And because npj Parkinson’s Disease is open access, we can ensure that those answers can be disseminated for all to read.

As I have said before, at PDF, we believe that empowering patients and scientists with access to information will help us come closer to ending Parkinson’s disease.  We hope you’ll check out this first issue of npj Parkinson’s Disease, to see what we’re learning. There are millions of people worldwide counting on us to join with them in finding the answers.

James Beck, Ph.D., is Vice President, Scientific Affairs at the Parkinson’s Disease Foundation. He oversees the strategy of PDF’s research programs as part of the organization’s mission to end Parkinson’s disease.

 

Fast-track peer review experiment: First findings

 

Yesterday marked the close of our one month fast-track peer review experiment on Scientific Reports. The experiment, which was designed as a trial for a small number of manuscripts, was developed because we want to tackle some of the issues we see in peer review – authors tell us they are frustrated with the speed of peer review, and we also are keen to explore ways to credit reviewers. As one of a number of initiatives we are running this year, including the recent roll-out of double-blind peer review, we wanted to ascertain what demand there was for a faster service. You may be interested to read about the rationale for the trial in our previous blog post.

What we’ve heard from the community

Members of the Editorial Board of Scientific Reports have raised some philosophical issues, including the potential for discrimination against authors who are unable to pay additional fees, and concerns that the scheme may impact the quality of reviews or encourage unethical behaviour on the part of authors or reviewers. Nature Publishing Group is committed to the utmost editorial integrity, robustness and ethical practice. We are confident the fast-track set-up adhered to these standards, but we also take the concerns raised seriously and wish to acknowledge them here.

What we’ve learnt

We have already learnt a lot from the past four weeks, and expect to learn more as we follow up. We thank the Editorial Board members and all those who took the time to write to us for constructive conversations conducted with much good will and a shared commitment to Scientific Reports, and to tackling the wider issues that face academia. We have been listening to all of the community feedback. In future we will work more closely with the Scientific Reports Editorial Board, welcoming their suggestions on how we can address the challenges in the peer-review system and continue to improve Scientific Reports.

Outcomes from the trial

We wanted to share the immediate first findings and what we hope to do now that the trial has concluded.

The fast-track trial ended yesterday, and over the four weeks the service was available:

  • We received 25 requests from authors to pay for fast-tracked peer review.
  • These authors represent a range of institutions, countries and career stages: professors (16 authors), doctors (8 authors) and one PhD student.
  • Geographically, the highest number of manuscripts (10) came from China, but there were also submissions from the UK, US, Germany, Finland and Sweden, as well as other Asia-Pacific countries, including Japan, Taiwan, Korea and Singapore.

Although we have a limited data set from this trial, it did confirm that, for some communities, speed matters and that there is demand for a service of this type. The trial was capped at 40 manuscripts, and 25 submissions, from a pool of approximately 1,800 manuscripts submitted each month, is a small percentage (roughly 1.4%). We need to spend more time collecting qualitative feedback to understand the motivations of our authors.

We want to reiterate that our service for authors was in no way impacted by this small scale trial, and our commitment to robust peer review remains firm. This trial was never intended to be scalable, or permanent – it was an experiment to learn.

What next?

The fast-track service on Scientific Reports is no longer available. We will now review what we have learnt and follow up to gather feedback from trial participants, peer reviewers, Editorial Board members and the wider scientific community. Our goal continues to be to serve scientists and the advancement of science. We see challenges with the peer review system as it currently operates, and remain committed to innovating: to credit peer reviewers and to expedite the process without compromising its integrity or overloading busy researchers.

Ultimately, our aim is to provide a better service to our authors and more support for our Editorial Board and reviewers. We know with any innovation that there will be as many challenges as opportunities. We want to find ways to strike the right balance in offering services to authors to meet their needs, but we’re committed to finding those solutions with the research community.

Pushing the limits – Light in the 21st Century

Guest post by Congcong Huang, Associate Editor, Nature Communications, and Nicky Dean, Team Manager, Physics, Nature Communications.

This week we conclude our series of ‘beautiful experiments with light’ featured in our poll and finally reach the new millennium in which lasers continue to enable powerful and diverse experiments.

Our story begins with the generation of ultrafast laser pulses. Following the invention of the laser in 1960, enormous efforts were made to shorten the pulse duration, leading to femtosecond lasers in the late 1980s and finally, in 2001, to the first reports of attosecond laser pulses. In one attosecond (10⁻¹⁸ s), light travels slightly more than the length of a water molecule, while molecules are essentially frozen on this timescale: molecular vibrations occur on femtosecond (10⁻¹⁵ s) and rotations on picosecond (10⁻¹² s) timescales. This makes it possible to access the timescale of electron dynamics inside molecules.
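That distance is easy to sanity-check: the distance light covers in a time Δt is simply c·Δt. A quick sketch (the ~0.27 nm size of a water molecule is an approximate textbook figure, not from the original reports):

```python
# How far does light travel in one attosecond?
c = 2.998e8          # speed of light in vacuum, m/s
attosecond = 1e-18   # one attosecond, in seconds

distance_nm = c * attosecond * 1e9   # convert metres to nanometres
print(f"{distance_nm:.2f} nm")       # ~0.30 nm, comparable to a water molecule (~0.27 nm)
```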

As light pulses have been made ultrashort – short enough even to capture the motion of electrons – a natural question is whether the speed of light can be controlled to the same extent. It is not surprising that light slows down when it travels through glass or water, but this is only a modest effect. It was thus a stunning observation, made by Lene Hau and her group at Harvard in 1999, that light travels at bicycling speed – about 17 metres per second, seven orders of magnitude slower than c – in a cloud of sodium atoms cooled to just below its Bose-Einstein condensation temperature. The cold atoms alone cannot do the trick; it is the use of a laser field that efficiently cancels light absorption, known as electromagnetically induced transparency, that makes it possible. The demonstration opened a new chapter for laser-controlled optical materials.
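To put that slowdown in numbers: Hau and colleagues reported a group velocity of about 17 metres per second, a brisk bicycle ride. The ‘seven orders of magnitude’ figure follows directly:

```python
import math

c = 2.998e8      # speed of light in vacuum, m/s
v_group = 17.0   # group velocity reported by Hau et al. (1999), m/s

orders = math.log10(c / v_group)
print(f"{orders:.1f} orders of magnitude slower than c")  # ~7.2
```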

Meanwhile, more attempts at controlling the behavior of light were underway. As mentioned above, light slows down when it passes through a medium, an effect characterized by the medium’s refractive index. This index is normally positive, and it tells us how light rays will be bent when they move from one medium into another. You can see this effect by looking at a straw in a glass of water which appears to be sharply bent at the surface. In the late 1960s, Victor Veselago wondered what might happen if the refractive index was negative. He predicted that light entering such a medium should bend in the opposite sense to what we normally expect (as if the straw would bend the ‘wrong’ way). In 2001, David Smith and colleagues realized this prediction by constructing an artificial material, or ‘metamaterial’, made of an array of copper split-rings on circuit boards. Their metamaterial exhibited a negative refractive index at around 10 GHz. Following Smith’s demonstration, many more negative-index metamaterials have been made using all kinds of different structures, across a range of frequencies, including the visible spectrum.
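Veselago’s ‘wrong-way’ bending follows directly from Snell’s law, n₁ sin θ₁ = n₂ sin θ₂: flip the sign of n₂ and the refraction angle flips sign too. A minimal sketch (the n = −1.33 medium is of course hypothetical, chosen only to mirror water):

```python
import math

def refraction_angle(n1, n2, theta1_deg):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2)."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    return math.degrees(math.asin(s))

# Air into water (n = 1.33): the ray bends toward the normal, as with the straw.
print(round(refraction_angle(1.0, 1.33, 30), 1))    # 22.1
# Air into a hypothetical n = -1.33 medium: the refracted ray emerges on the
# *same* side of the normal as the incident ray, so the straw bends the 'wrong' way.
print(round(refraction_angle(1.0, -1.33, 30), 1))   # -22.1
```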

Diffraction pattern of a virus particle, taken with an X-ray free electron laser.



Update on Scientific Reports Fast Track Experiment

On Tuesday 24th March we introduced a small-scale, one month experiment on fast track peer review (up to 40 manuscripts maximum), which would enable authors to receive a first decision within three weeks of passing our quality control checks. You can read our original post on this here.

The trial has been running for two weeks and we have a couple of weeks left to run on it. We received some feedback from Editorial Board Members, authors and the general scientific community, both positive and negative. Last week a group of Editorial Board Members wrote to us, and our response of 31st March is below. The Editorial Board Members have since raised some additional questions which we will answer shortly, as an update to this blog post.

Rubriq are also updating their own website with more information about the trial and their procedures. We continue discussions with our Editorial Board Members and other members of the scientific community, and plan to feed back on the results of the trial once it has concluded and we have had time for analysis.

**

E-MAIL DATED 31st MARCH 2015

Dear colleagues,

Thank you for your thoughtful letter regarding our experiment on peer review with Rubriq. I appreciate your getting in touch, and I’m happy to clarify some aspects of the trial.

This is a very small scale pilot study involving a few manuscripts (approx. 40) over the course of a few weeks.  We have assessed the quality of service that Rubriq provides and feel confident that the peer review reports they will deliver will be of a comparable standard to our own. This was a crucial factor in our decision to work with them, as we place enormous importance on the quality of our peer review process.

Reviewer selection is carried out by a team of Peer Review Coordinators at Rubriq following criteria agreed by Scientific Reports in-house Editors so that they match our own –  including those used to exclude potential conflicts of interest. The reviews that are solicited by Rubriq first go through internal vetting by the Peer Review Coordinators to ensure that the reviewers have provided detailed, actionable evaluations. These reviews are then transferred to Scientific Reports (with the identities and affiliations of the reviewers) where they are then evaluated by the Scientific Reports in-house Editors.  

Appeals will be handled by Scientific Reports following standard procedures.

NPG is first and foremost here to serve scientists. As scientists our method is to run experiments, measure the results, learn and adapt. Testing and evolving the peer review process is something we’ve embraced over many years at NPG. The decision to conduct this pilot study was taken after careful consideration – in a 2014 survey of over 30,000 NPG researchers, authors told us that they want us to innovate when it comes to peer review:

  • 70% of authors are frustrated with the time peer review takes
  • 77% think traditional peer review could be made more efficient
  • 67% think publishers should experiment with alternative peer-review methods

Reviewers involved in the fast-track trial will be paid by Rubriq on condition they return their reports within an agreed timeframe – there should not be any difference in quality between fast track and standard reviewer reports.

We know with any innovation that there will be as many challenges as opportunities and so we will be closely monitoring the trial to check for any differences in metrics, other than faster time to first decision, between fast-track and standard submissions to ensure that we are not introducing any biases.

This small-scale study will not affect our standard peer review service, so authors who are unable to afford the fast-track option will not be impacted, and our usual waiver policy will continue to apply to non-fast-track submissions.

We’re committed to exploring, learning, and better understanding the needs and choices of our authors. We will be carefully considering feedback from across the whole research community, including our Editorial Board Members, when deciding whether to extend the pilot study.

I value your feedback as members of the Scientific Reports Editorial Board, and I would be happy to discuss this further if helpful.

With all best wishes,

Nandita Quaderi 

**

UPDATE 8th April 2015

We have responded to the following questions raised by 19 editorial board members of Scientific Reports. These responses were sent today as part of an email correspondence, but they may help to answer questions from others. As promised, we are sharing them here.

We have recently conducted a successful private parallel run of peer-review outputs comparing Rubriq with Scientific Reports.  We have worked closely with Rubriq to be confident that the reports they provide are as rigorous as we would expect from our own Scientific Reports reviewers. 

I’d like to take this opportunity to address the specific questions on your website:

Who are the peer review coordinators at Rubriq?

The peer review co-ordinators are PhDs who work as in-house members of Rubriq staff.  Based on the topic and the techniques used in the research, the peer review co-ordinators match manuscripts to reviewers with a published track record of expertise in these areas. They do not make decisions on the manuscript but perform a review of each reviewer’s report to ensure that the review is complete and thoughtfully executed.

What are their scientific and academic credentials?

The minimum requirement for a peer review co-ordinator is a PhD in the biomedical sciences.

The external peer reviewers they invite are all active, publishing researchers with a minimum of a doctorate-level degree in the relevant field.

Do they have the necessary expertise to assign reviewers?

Each peer review co-ordinator has at least three years’ experience in identifying and recruiting appropriate peer reviewers for manuscripts.

Why is there only internal vetting of peer reviews?

The standard peer review process at Scientific Reports involves editorial board members choosing peer reviewers, and then making editorial decisions based on reports provided by these reviewers. For the purposes of this small-scale pilot (which was not intended to be scalable in its current format) peer reviewers are chosen by Rubriq. The editorial decisions, based on peer reviewers’ reports, are made by PhD-qualified, in-house Scientific Reports editors. This ensures we can deliver a first decision within three weeks.

The trial has been running for two weeks and we have a couple of weeks left to run on it. We have received both positive and negative feedback, and plan to summarise key findings of the trial once it has concluded and we have had time for analysis.


Back to light, back to reality

Guest post by Federico Levi, Associate Editor Nature Communications

The experiments in this week’s blog entry accompanying our poll of ‘the most beautiful experiment with light’ were carried out in the second half of the twentieth century, when physicists were still struggling to accept the counter-intuitive implications of quantum physics.

One of the most bewildering embodiments of quantum theory is quantum entanglement. When two particles are entangled, performing a measurement on one of them seems to instantaneously influence the other, even if it is light-years away. This paradox, famously termed ‘spooky action at a distance’ by Albert Einstein, was formulated by Einstein, Podolsky and Rosen (EPR) in 1935. To restore reality, they argued that quantum physics could simply be our limited understanding of a deeper and less troubling theory, classically constructed over a set of ‘hidden variables’.

It took almost thirty years to devise a way to test their hypothesis. In 1964, John Stewart Bell proved the famous theorem that carries his name, showing that there would be an experimentally measurable difference between the predictions of quantum physics and those of the ‘less troubling theory’ imagined by EPR. Light provided the means to carry out this test. In 1972, by looking at the correlations in the linear polarization of photons emitted in an atomic cascade of calcium, Stuart J. Freedman and John F. Clauser tested Bell’s theorem at the Lawrence Berkeley Laboratory in California. The result was a landmark confirmation of quantum mechanics.

Jumping more than ten years ahead, we find researchers dealing with the consequences of yet another quantum principle: the indistinguishability of fundamental particles. Two particles in exactly the same quantum state must be considered essentially indistinguishable, and photons are no exception. In 1987, at the University of Rochester, New York, Chung Ki Hong, Zhe Yu Ou and Leonard Mandel showed what may be the most direct evidence of this principle. When a photon hits a 50:50 beam splitter, it can exit through either of two output ports with equal probability. But if two indistinguishable photons arrive at the beam splitter simultaneously, quantum interference forbids the outcome in which they follow different output paths. The pronounced ‘dip’ in the rate of simultaneous arrivals at the two outputs is a hallmark of the indistinguishability of a photon pair, a sought-after quality for quantum information applications.
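The cancellation behind that dip can be traced in a few lines of amplitude arithmetic. With a common 50:50 beam-splitter convention (transmission amplitude 1/√2, reflection i/√2), the two ways of producing a coincidence, both photons transmitted or both reflected, carry opposite amplitudes:

```python
# Hong-Ou-Mandel interference at a 50:50 beam splitter.
t = 1 / 2**0.5    # transmission amplitude
r = 1j / 2**0.5   # reflection amplitude (picks up a 90-degree phase)

# A coincidence (one photon at each output) can occur in two ways:
both_transmitted = t * t   # amplitude +1/2
both_reflected = r * r     # amplitude -1/2

# Indistinguishable photons: add the amplitudes first, then square.
p_quantum = abs(both_transmitted + both_reflected) ** 2
# Distinguishable photons: add the probabilities of the two paths instead.
p_classical = abs(both_transmitted) ** 2 + abs(both_reflected) ** 2

print(round(p_quantum, 10), round(p_classical, 10))  # 0.0 0.5
```

The coincidence rate drops from one half to zero, which is exactly the ‘dip’ measured in the experiment.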

Large-area metallic photonic crystal layer rolled onto a glass rod.



From the Big Bang to atomic clocks

Guest post by Amos Martinez, Associate Editor, Nature Communications

We kick off this week’s experiments for our poll with the discovery of a special kind of light: cosmic microwave background. The story of this discovery is a beautiful example of the fortuity of scientific discovery.

In 1964, Arno Penzias and Robert Wilson were looking for radio emission from the Milky Way using an antenna originally built for radio-wave satellite communications. They soon noticed a noise in the microwave region, evenly spread across all directions of space. After eliminating every known noise source, including a pigeon’s nest in the 6-m antenna, they concluded that the noise could only be coming from outside our galaxy.

Elsewhere, cosmologists were debating whether the universe had a beginning, created in a Big Bang, or had always existed. Advocates of the Big Bang theory Robert H. Dicke, Jim Peebles and David Wilkinson had predicted that, had the Big Bang taken place, it would have generated an enormous blast of radiation that should still be detectable in the microwave region with a sufficiently sensitive device. Sure enough, the persistent noise measured by Penzias and Wilson turned out to be caused not by pigeons but by radiation generated during the creation of the Universe. This discovery represented the first solid experimental proof of the Big Bang and, as Stephen Hawking put it, the final nail in the coffin of the steady-state theory.

Cosmic microwave background: Big Bang’s afterglow. (Credit: ESA, HFI & LFI consortia.)



Into the laser era

Guest post by Rachel Won, International Editor, Nature Photonics

This week’s set of experiments featured in our poll is all about the advent of the maser (microwave amplification by stimulated emission of radiation) and of the optical maser, now known as the laser, and the remarkably wide impact these inventions have had on science, technology and society.

The concept of stimulated emission was introduced in 1917 by Einstein, who found that absorption of radiation by atoms must be accompanied by a complementary process in which incoming radiation stimulates the emission of more radiation of the same kind. It was not until 1953 that the effect was experimentally demonstrated, by Charles Townes and his two graduate students at Columbia University in New York. Their maser used stimulated emission in a stream of energized ammonia molecules to amplify microwaves at a frequency of about 24.0 GHz. The development of a maser was simultaneously carried out by Nikolay Basov and Alexander Prokhorov at the Lebedev Institute in Moscow.

The achievements led to the award of the Nobel Prize in Physics in 1964 to Townes, Basov and Prokhorov.

The invention of the maser kicked off a race to create a similar device for visible light, now known as a laser (with ‘microwave’ replaced by ‘light’). In 1958 Townes, together with Arthur Schawlow, then at Bell Labs, published a paper extending the maser techniques to the infrared and optical region. The first working laser, however, was built by Theodore Maiman at Hughes Research Laboratories in 1960. His laser used a solid-state synthetic ruby crystal pumped by a flashlamp to produce red laser light at 694 nm.

Theodore Maiman and his invention, the first laser. (Photo credit: HRL Laboratories, LLC)



Betwixt and between

Guest post by Leonie Mueck, Associate Editor, Nature

In last week’s post you heard about beautiful experiments with light featured in our poll from the turn of the century. This week, we will talk about the time until the 1950s. And, while so many turning points in politics and history fall into that period, advances in optics and photonics were a bit betwixt and between. Scientists were modernizing their methods and instruments but still didn’t have modern-day tools like the laser, which in the 1960s would completely transform light-related research.

They did have highly advanced telescopes. When Edwin Hubble arrived at Mount Wilson Observatory in California in 1919, something else arrived by lucky coincidence at around the same time: the Hooker Telescope, which allowed Hubble to perform detailed investigations of spiral nebulae. Thanks to his measurements, we now know that those nebulae are in fact distant galaxies. As jaw-dropping as this finding was to his contemporaries, Hubble went on to show something even more ground-breaking. Looking at the Doppler shifts of as many galaxies as possible, he found in 1929 that the shift was proportional to the galaxies’ distance. The only plausible explanation for this phenomenon was that we live in an expanding Universe!
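Hubble’s proportionality is now written v = H₀·d. As a rough illustration, using a modern approximate value of the Hubble constant (about 70 km/s per megaparsec; Hubble’s own 1929 estimate was several times higher):

```python
H0 = 70.0  # Hubble constant, km/s per megaparsec (modern approximate value)

def recession_velocity(distance_mpc):
    """Hubble's law: v = H0 * d."""
    return H0 * distance_mpc  # km/s

# Double the distance, double the recession velocity.
print(recession_velocity(10.0), recession_velocity(20.0))  # 700.0 1400.0
```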

Andromeda Galaxy taken by Spitzer in infra-red, 24 micrometres. (Image: NASA/JPL–Caltech/K. Gordon, University of Arizona)



Flash forward: new surprises with light

Guest post by Maria Maragkou, Associate Editor at Nature Materials

This week’s entries for the poll of the most beautiful experiments with light occurred around the turn of the 20th century.

Wilhelm Roentgen, a physics professor in Würzburg, changed the course of medicine when he accidentally discovered X-rays, electromagnetic waves with wavelengths of roughly 0.1–10 nanometres. In November 1895, while experimenting with an electron-discharge tube covered with black cardboard, he noticed that a fluorescent screen some distance away lit up. Eventually he realised that the tube emitted a type of ray – marked X for unknown – that was blocked by dense material, such as lead or bone, but could penetrate other objects. As he held a piece of lead in front of the X-rays, he could see the contrast between bone and flesh on the fluorescent screen. A few weeks later, Roentgen took an X-ray picture of the hand of his wife, who allegedly exclaimed “I have seen my death” upon seeing it. With the discovery of X-rays it became possible to look inside the human body without surgery, and Roentgen earned the first ever Nobel Prize in Physics, in 1901, for this remarkable discovery.
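The 0.1–10 nanometre figures quoted above are wavelengths, and they translate into photon energies via E = hc/λ, with the handy shorthand hc ≈ 1239.84 eV·nm. A quick sketch:

```python
HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV * nm

def photon_energy_ev(wavelength_nm):
    """Photon energy E = h*c / lambda."""
    return HC_EV_NM / wavelength_nm

# A 0.1 nm X-ray photon carries ~12.4 keV, versus only ~2.5 eV for visible
# light at 500 nm -- a rough indication of why X-rays penetrate where
# visible light cannot.
print(round(photon_energy_ev(0.1)), round(photon_energy_ev(500.0), 2))  # 12398 2.48
```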


Roentgen’s first X-ray image featured in Nature in 1896.


From despair to repair: Empowering communities to restore their oceans


Dr Ayana Elizabeth Johnson with then Antigua and Barbuda Prime Minister, Baldwin Spencer. (Image: Waitt Institute)

Dr. Ayana Elizabeth Johnson is a marine biologist and Executive Director of the Waitt Institute. Johnson’s mission is to collect, create, actualize and amplify the best ideas in ocean conservation. Her work has been featured in the New York Times, on her blog for National Geographic, in The Atlantic, and elsewhere. She holds a Ph.D. from Scripps Institution of Oceanography, a BA from Harvard University in Environmental Science and Public Policy, and has worked on ocean policy at both the National Oceanic and Atmospheric Administration (NOAA) and the Environmental Protection Agency (EPA). You can find her talking oceans on Twitter @ayanaeliza

“People used to talk about the size of the fish they caught vertically,” says a perspicacious 15-year-old Curaçaoan, holding his hands off the ground at head height. “But now we show fish size horizontally.” As the young man lowers his hands and holds them shoulder-width apart to demonstrate, it is strikingly clear that the great fishing catches of old have all but gone from the southern Caribbean Sea.

The vibrantly scenic shores and glistening beaches of this bustling island are in stark contrast with the rather gloomier outlook of the once thriving Caribbean ecosystems that supported local fisheries. Speak to any of the older residents or fishermen on Curaçao and they’ll swear by the unprecedented changes they’ve seen in their oceans in the last half century.

This is a familiar picture across the Caribbean, which is suffering from the same threats of overfishing, climate change, pollution and habitat loss seen worldwide. In August 2014, the National Oceanic and Atmospheric Administration (NOAA) listed 20 species of coral as threatened under the Endangered Species Act, including five Caribbean species. Projected impacts of global warming and ocean acidification motivated this action, but as marine biologist Ayana Elizabeth Johnson eloquently writes in a New York Times op-ed: “climate change really is only half the story.”

Johnson’s encounter with the young Curaçaoan and his jarringly precocious words struck a chord with her eight years ago, in the midst of her PhD research. Focusing on fisheries management and ecology in the southern Caribbean, she interviewed more than 400 fishermen, scuba divers, and locals in Curaçao and Bonaire, to inquire what major changes they had seen in their oceans.

“It is critical to understand what local people see as the threats to the ocean, as the perceived problems have a huge influence on what the perceived solutions should be,” says Johnson. “Often scientists’ outside perspective can be very different to the local one – and this can lead to disconnect when discussing sustainable policy and solutions.”
