News blog

Record number of journals banned for boosting impact factor with self-citations

More research journals than ever are boosting their impact factors by self-citation.

Every year, Thomson Reuters, the firm that publishes the impact-factor rankings, takes action against the most extreme offenders by banning them from the latest lists. It lets them in again, suitably chastened, a couple of years later.

And this year, the apparent game playing has reached an all-time high. Thomson Reuters has excluded 51 journals from its 2011 list, published yesterday; 28 of the banned are new offenders, says Marie McVeigh, director of the firm’s annual Journal Citation Reports (JCR), and the others remain blacklisted from last year. The full list is available here for subscribers to JCR.

That’s a substantial increase on previous years: 34 journals were excluded from the 2010 lists, compared to only 26 in 2009, 20 in 2008 and just 9 in 2007.

Almost all of those banned are excluded because of excessive self-citation, although three journals — Cell Transplantation, Medical Science Monitor and The Scientific World Journal — apparently worked together to cite each other and thus raise impact factors.  That “cartel” was originally reported by Phil Davis on The Scholarly Kitchen, and he has today posted a follow-up article on that ban. McVeigh says that this incident, which she calls “an anomaly in citation stacking”, is the only one of its kind that she has found.

To put all this in context, the 2011 ranking includes more than 10,500 journals, so the removed offenders make up fewer than 0.5% of the total.  Still, Thomson Reuters could have kicked out more journals. Its own statistics indicate that 140 journals have had self-citations making up more than 70% of total citations in the past two years. By comparison, four-fifths of journals keep this proportion below 30%.

But McVeigh explains that some cases of disproportionate self-citation aren’t censured if a journal’s citation counts are so small that they hardly affect the impact-factor rankings. “Before we suppress a journal we set thresholds extremely high, because we don’t want to get into the business of policing minor behaviour,” she says. “We only suppress a journal when we think that this numerically significantly alters the ranking category.” The firm does not ascribe motives to the anomalous citation patterns it sees, she adds.
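A minimal sketch of the two-part test McVeigh describes: a journal is suppressed only when its self-citation share is extreme and when removing self-citations would materially change its ranking. The thresholds and function names below are illustrative assumptions, not Thomson Reuters’ actual criteria:

```python
# Illustrative screen, not Thomson Reuters' real rule. The 70% share
# echoes the JCR statistic quoted above; the rank-shift cutoff is an
# invented placeholder.

def self_citation_share(self_cites, total_cites):
    """Fraction of a journal's citations that come from itself."""
    return self_cites / total_cites if total_cites else 0.0

def should_suppress(self_cites, total_cites, rank_with, rank_without,
                    share_cutoff=0.70, rank_shift_cutoff=10):
    """Flag only self-citation that is both extreme and consequential."""
    extreme = self_citation_share(self_cites, total_cites) > share_cutoff
    consequential = abs(rank_with - rank_without) >= rank_shift_cutoff
    return extreme and consequential

# Example: 80% self-citation that moves a journal 25 places in its category.
print(should_suppress(800, 1000, rank_with=5, rank_without=30))  # True
```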

One of the newly banned journals, the Journal of Biomolecular Structure and Dynamics (JBSD), boosted its impact factor from 1.1 in 2009 to 5.0 in 2010. Because of that sharp rise, Thomson Reuters last July invited it to explain its success in a ‘Rising Stars’ analysis.

In his Rising Stars response, editor-in-chief Ramaswamy Sarma told the firm that there were two main reasons for his journal’s high citation rate. First, after a controversial paper on protein folding, 31 laboratories had written commentaries on the work that were published in JBSD’s February 2011 issue, leading to a high number of JBSD citations. More importantly, since 2009 JBSD had encouraged authors to show how their papers related to other recent JBSD publications, to enhance the education of doctoral students reading the journal.

The Rising Stars article was published in September but removed within a week, Sarma says, after Thomson Reuters contacted him about the problem of self-citations.

When Nature contacted Sarma last July to inquire about his journal’s self-citation rate (tipped off by the blog BioChemical Matters), he said that he did not follow the Thomson Reuters impact-factor statistics, preferring instead to look at usage statistics from Google Analytics, which the journal displays on its old website. (According to a June 2012 announcement, the journal is no longer published by Adenine Press and has moved to Taylor & Francis.)

Today, Sarma says that the journal has discontinued its policy of encouraging self-citations as a teaching aid for students, and that its self-citation rate will return to an acceptable range. The journal wants to get back onto the Thomson Reuters JCR lists, he says.

“In a time when the discipline is fragmented by so many different journals, if one wants to run a viable and useful academic journal, covering both controversy and doctoral training, then continuing self-citation is unavoidable. And you say some scientists frown on it. An easy solution is for Thomson Reuters to publish an impact factor with self-citation only, and another one without self-citation to satisfy the unhappy people,” he wrote last July — a sentiment that he still holds.

Three years ago, Thomson Reuters did start publishing impact factors with and without self-citations. But in February, Allen Wilhite, an economist at the University of Alabama in Huntsville, suggested that the firm remove journal self-citations from its impact-factor calculation altogether, to remove any incentive for editors to accrue them. His plea came after he published an article in Science reporting that one in five academics in a variety of social-science and business fields said they had been asked to pad their papers with superfluous references to get published. But, as McVeigh told me at the time, Thomson Reuters feels this would be a “fundamental change to the impact-factor calculation” to “respond to something that has not shown itself to be a huge problem”.
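For context, the standard two-year impact factor is the number of citations received in a year to a journal’s content from the previous two years, divided by the number of citable items it published in those two years. A minimal sketch of the with- and without-self-citation variants (the figures are invented for illustration; the JCR’s exact counting rules are more involved):

```python
# Simplified two-year impact factor, with and without self-citations.
# The real JCR computation has finer rules about what counts as a
# "citable" item; this is only a sketch.

def impact_factor(cites_to_prev_two_years, citable_items):
    """Citations this year to items from the previous two years,
    divided by the number of citable items in those two years."""
    return cites_to_prev_two_years / citable_items

total_cites, self_cites, items = 500, 350, 100
print(impact_factor(total_cites, items))               # 5.0, with self-citations
print(impact_factor(total_cites - self_cites, items))  # 1.5, without
```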

Will the trend of ever-greater self-citation by journals continue? That perhaps depends on the importance placed on the impact factor itself — a number that many researchers feel scientists and research funders would do well to ignore.

Comments

  1. Report this comment

    Paul Peters said:

    I would like to respond to the statement in this post that “Cell Transplantation, Medical Science Monitor, and The Scientific World Journal… apparently worked together to cite each other and thus raise impact factors.” It is unfortunately true that two articles were published in The Scientific World Journal with excessive citations to the journal Cell Transplantation; these have subsequently been retracted on the grounds that they violate the journal’s Policy Against Citation Manipulation (https://www.tswj.com/policies/). Both articles were written by members of the Cell Transplantation Editorial Board, and the Editor who accepted them for publication in The Scientific World Journal, and who later left The Scientific World Journal’s Editorial Board, is one of the Section Editors for Cell Transplantation.

    While this situation (which is explained on The Scientific World Journal’s website at https://www.tswj.com/statement/) is very regrettable, it is incorrect to describe it as a “citation cartel”, since there have never been any articles with excessive citations to The Scientific World Journal published in Cell Transplantation or Medical Science Monitor. It appears that a number of Editors from Cell Transplantation worked together to exploit their positions on the Editorial Boards of other journals (including The Scientific World Journal) in order to boost the citation count of Cell Transplantation, but without the involvement of any other Editorial Board Members or the former publisher of The Scientific World Journal. We very much agree that better safeguards should have been in place to prevent this sort of article from being accepted for publication, and after Hindawi took over the publication of The Scientific World Journal last year we implemented a number of changes to the journal’s editorial workflow which should prevent similar cases from happening in the future. More recently, we developed a tool that our in-house staff now use to check every manuscript submitted to any of our journals, in order to detect possible cases of citation manipulation before an article is sent for peer review.

    While we very much regret the fact that The Scientific World Journal will not receive an Impact Factor for the current year, we appreciate the need for Thomson Reuters to take a firm stance against any manipulation of the citation record, which is an issue that we take very seriously as a publisher.

    Paul Peters

    Hindawi Publishing Corporation

  2. Report this comment

    N Sombrero said:

    “Citation cartel” is more commonly used, although not by everyone, to describe authors of papers. An example is the practice among some faculty at Singaporean and Chinese (PRC) universities. Some of these individuals run “sweat shops” and publish in large numbers. They team up across the ocean with their friends and former associates (who also run sweat shops and publish in large numbers) and cite each other. Presto! High h-indices for all to enjoy!

  3. Report this comment

    Bulusu Gopalakrishnan said:

    Apologies if this is a stupid question, but what is the need or use of this glorified citation index?
    It’s much ado about nothing, IMHO. And to top it all, this JCR is not “open access”! No wonder the “hype, hypocrisy and hot air” has inebriated researchers in our current dog-eat-dog world, resulting in the “fabrication” of these indices. I presume that the J. J. Thomsons and J. D. Bernals were happier without these “indices” and could devote their time to serious research.

  4. Report this comment

    Raj Rajagopalan said:

    Impact factors and h-indices are non-robust, non-ideal measures meant for an ideal world. In an ideal world, in which no one tries to manipulate the data and the statistics, and if one assumes that citing a paper is a true indication or measure of the “impact” of the cited paper, then these measures would make sense and be useful. But as everyone knows, or should know, such is not the case. People like Hirsch have done more damage than good to the cause of good science.

    Take the example of areas such as membrane technology for energy applications or water treatment, or the synthesis of nanoparticles for fuel cells, advanced materials, catalysis or water treatment. These are areas in which publication rates are very high (and the publishing population is very large). I know individuals who churn out publications at a rate of 15 to 50 per year or more (yes!). Never mind that most of the results are useless or irreproducible, or are not worth the time even to attempt to reproduce. Journals that publish papers in these areas tend to have high IFs. In fact, desk decisions about whether a manuscript will be sent out for peer review are based on the perceived potential impact of the manuscript (i.e., is it likely to generate a good number of immediate citations to help increase the IF of the journal?).

    Now take the author. If an author publishes, say, 30 papers a year, all he or she needs to do (in the extreme case) is to cite, in each of the 30 papers published in the current year, the 30 papers published the previous year. Then the author is guaranteed an h-index of 30 immediately! Of course, I do not know of anyone going to this extreme; but all one has to do is cite, say, at least 10 of one’s previous papers and quickly build a high h-index. And one does not have to resort to “unethical” practices to justify such self-citations. One can justifiably include a statement such as “A number of previous attempts have been made in the literature to tailor nanoparticles for applications in fuel-cell technology (see, for example, refs. 7–28)”, and then disperse the 10 self-citations between ref. 7 and ref. 28! (The sketch below spells out the arithmetic of the extreme case.)
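    A toy illustration of that extreme case (the numbers follow the hypothetical above, not any real author’s record):

    ```python
    # Toy model: an author publishes 30 papers this year, each citing
    # all 30 of the author's papers from last year. Every last-year
    # paper then has 30 citations, so the h-index is 30.

    def h_index(citation_counts):
        """Largest h such that h papers have at least h citations each."""
        h = 0
        for i, c in enumerate(sorted(citation_counts, reverse=True), start=1):
            if c >= i:
                h = i
        return h

    last_year = [30] * 30   # 30 papers, each cited by all 30 new papers
    print(h_index(last_year))  # 30
    ```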

    The problem is that universities and research institutions shamelessly replace critical thinking and serious leadership with blind adherence to highly faulty measures of scholarship and productivity. This is particularly true of up-and-coming educational institutions in places like Singapore, China and South Korea. To begin with, many of these institutions, in Asia as well as elsewhere, do not even bother to understand the tools they are given and what the tools are good for. You give a monkey a hammer, and the monkey will bang the hammer against anything it comes across. You give the monkey an iPad……..!

  5. Report this comment

    Martins Emeje said:

    How many of the R & D results that changed the course of human civilization, from agriculture and health to engineering, were published in IF journals? Why is it that big industries don’t publish the results of their inventions in journals? IF has rendered many scientists corrupt; they forge data in order to publish in high-IF journals and get promoted to professorships while knowing nothing. What were the IFs of the journals in which Archimedes, Einstein, Pasteur, Faraday and others published their results? I pity this generation!

Comments are closed.