Posted on behalf of Zoë Corbyn.
Make an article open access and it is more likely to get cited – at least that is one powerful argument in open access advocates’ arsenal to get researchers to make their work publicly available.
But new research published in The FASEB Journal, the journal of the Federation of American Societies for Experimental Biology, suggests this may not be the case. The research – which its author claims uses a more rigorous methodology than many previous studies – shows that while providing open access to scientific journal articles certainly leads to more downloads, the extra readership simply doesn’t translate into extra citations.
To test whether open access articles receive more citations than articles requiring a subscription, Cornell University communication researcher Philip Davis persuaded the publishers of 36 journals, spanning a wide range of subjects, to make roughly one in five of the 3,245 articles they published between January 2007 and February 2008 open access, with the articles chosen at random.
Davis then compared the citation counts of the 712 open access articles with those of the 2,533 subscription-only controls, finding that while the open access articles were downloaded more often in their first year, they were cited no more frequently – and no earlier – over the three-year period.
“The widely accepted ‘open access citation advantage’ appears to be spurious,” said Davis, who is also the executive editor of the controversial Scholarly Kitchen blog, which is not afraid of taking the open access movement to task. “There are many benefits to the free access of scientific information, but a citation advantage doesn’t appear to be one of them.”
The results stand in contrast to those of open access advocate Stevan Harnad of the University of Southampton. He led a study, published in PLoS ONE last year, that used a different method (comparing self-archived with non-self-archived articles in subscription journals) and found that open access articles received “significantly more” citations.
Harnad criticised the new study as “the sound of one hand clapping”, saying it offers “no basis” for the conclusions it draws. Davis’ sample is likely “too small” to show the citation advantage, he says, and the study does not properly address the key question: how far the citation advantage is real rather than an artefact of researchers selectively self-archiving their better (and therefore more citable) papers.
Davis’ study notes that articles that were also self-archived did receive 11% more citations on average over the three years, but with only 65 such articles (2% of the sample) the effect was not statistically significant.
“We do leave open the possibility that there is a real citation effect as a result of self-archiving but that we simply do not have the statistical power to detect it,” says Davis, though he also stresses that it would be “difficult, if not impossible” to tease out whether any such effect was the result of enhanced access or just of better (more citable) papers being self-archived.
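Davis’ point about statistical power is easy to illustrate. The Python sketch below simulates how often a comparison of 65 self-archived articles against a large control group would detect an 11% citation advantage. Only those two figures come from the study; the baseline citation rate, the skewed negative binomial model of citation counts, the control-group size and the choice of test are all illustrative assumptions, not details of Davis’ analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Assumed, illustrative parameters -- not figures from Davis' paper,
# except the 11% advantage and the 65-article subgroup size.
baseline_mean = 10.0      # hypothetical mean citations per control article
advantage = 1.11          # the 11% self-archiving advantage the study reports
n_archived, n_control = 65, 2500
n_trials = 5_000
alpha = 0.05

def skewed_citations(mean, size):
    """Draw citation counts from a heavily skewed negative binomial
    distribution (dispersion 1), a common rough model for citation data."""
    return rng.negative_binomial(1, 1.0 / (1.0 + mean), size)

rejections = 0
for _ in range(n_trials):
    archived = skewed_citations(baseline_mean * advantage, n_archived)
    control = skewed_citations(baseline_mean, n_control)
    # Welch's t-test as a simple stand-in for the study's actual analysis
    _, p = stats.ttest_ind(archived, control, equal_var=False)
    if p < alpha:
        rejections += 1

print(f"Estimated power to detect an 11% effect: {rejections / n_trials:.2f}")
```

Under these assumptions the estimated power comes out low, on the order of 10–15%, so a null result for a 65-article subgroup is exactly what one would expect even if the self-archiving effect were real – which is why Davis is careful to leave the question open.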
Image: photo by theseanster93 via Flickr under Creative Commons