As online comments on newly published research become widespread, a new dilemma faces scientists wanting to enter the electronic fray: where to comment, and in what format for maximum impact?
That question faced Kenneth Lee, a researcher in regenerative medicine at the Chinese University of Hong Kong, when he wanted to post his critique of controversial stem-cell research. Lee’s research group has, like many other scientists, tried and failed to replicate the work, published in Nature at the end of January. The studies are now under investigation, with some of their authors calling for retraction.
Lee had his pick of online fora. He could have posted on a closely watched stem-cell blog by researcher Paul Knoepfler at the University of California, Davis, which has been collecting tales of failed replications. He could have posted on PubPeer.com, a website where people can make anonymous comments about published papers, which has also seen large amounts of traffic discussing problems with images used in the studies. He could have posted on PubMed Commons, an initiative launched last October that allows scientists to comment on published abstracts on the PubMed website. He might have chosen any number of other venues — such as the news articles reporting on the controversy — or even his own website.
Instead, Lee picked ResearchGate, a social network that boasts more than 4 million signed-up researchers. And instead of just adding his comment linked to the publication’s page on the site, Lee posted a structured mini-review, with sections for ‘methodology’, ‘analyses’, ‘references’, ‘findings’ and ‘conclusions’, and including his own images.
This did not happen by accident. ResearchGate’s managers had noticed that Lee was chattering about his replications on their network, and an employee invited him to be the first to try out their new post-publication review format. “I was very reluctant at first, but she said I keep the copyrights, so I reluctantly agreed,” Lee says. “This is how everything came together. I think it is just fate.”
“It looks interesting, and I am a supporter of innovative approaches to facilitate discussions among scientists in real time,” says microbiologist Ferric Fang of the University of Washington in Seattle. “A nice thing about the more structured format is that it encourages reviewers to be more systematic and to support their critiques. Short comments are OK but it is easier to make reckless statements in the absence of structure.” Fang adds that in this particular case, “I don’t expect the open review to have much impact on the paper since questions about its validity have already been raised”.
Asked why researchers should post their reviews on ResearchGate — as opposed to any other website — ResearchGate co-founder Ijad Madisch points out that his site has a community of verified scientists. “The content is free — anyone can read that from outside — but to contribute, you need to be affiliated with an institution that does research, so the quality is high,” he says. “I think Kenneth decided to publish on ResearchGate because he is part of an engaged community there. He wanted to get his replication out fast in order to warn others, and to get feedback on his work — rather than, say, write a letter to the editor, which can come six months after an article is published, and may be completely detached from the study itself. If there is one central place where people go, post-publication peer review becomes more efficient for everyone,” he says.
Will a few hubs such as ResearchGate or PubPeer.com dominate post-publication peer review? Or will online comments look more like a scattered hodgepodge of reviews, comments and discussions across websites unlinked to original publications? And if so, can search functions tie the thicket together? To these questions, Madisch has a simple answer: “I don’t know where this will end, but I do know it will be really big.”
Lee says he would still like to publish his results in a journal, so that his students get the credit they deserve for their efforts. He says he doesn’t know whether his work posted on ResearchGate could be considered a citable object in itself. “But it has already been cited by the Wall Street Journal, the BBC and the Boston Globe, so the impact is really far-reaching,” he notes. “The most important thing is that the finding is fairly and accurately reported so that other researchers can decide whether to use their valuable resources to continue pursuing the study.”
Online post-publication peer review, in the fuller sense that Lee has performed it, is unlikely to be common, says Fang. “Given the amount of time it takes to read and carefully review a paper, I suspect that the papers selected for discussion are going to be limited to very high-profile work about which readers have concerns. After all, there are something like a million new papers published each year and the average scientist reads only about 20–25 papers each month,” he says.
Elizabeth Iorns, chief executive of Science Exchange and an advocate for efforts to reproduce published scientific research, agrees. She points out a subtlety in the way scientists have rushed to replicate the findings. Rather than acting, as Lee did, as post-publication reviewers seeking to check the paper, she says, researchers are instead trying to adopt the method for their own laboratories, and so are often not performing exact replications of the original work.
“What we have learned is that researchers don’t generally want to perform confirmatory replication studies of other researchers’ findings,” she says.