The International Mathematical Union has released a report on the use of citations in assessing research quality. The report, Citation Statistics, is written from a mathematical perspective and strongly cautions against over-reliance on citation statistics such as the impact factor and the h-index. The belief that these statistics are accurate, objective and simple is unfounded.
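To see how such a statistic collapses a whole citation record into a single number, here is a minimal sketch of the h-index (the largest h such that an author has h papers with at least h citations each); the function name and data are illustrative, not taken from the report:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations.

    `citations` is a list of per-paper citation counts.
    """
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Two quite different records can share the same h-index,
# which is one reason a single number gives a coarse picture:
# h_index([100, 100, 100]) == h_index([3, 3, 3]) == 3
```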
The report states that the objectivity of citations is illusory because the meaning of citations is not well understood; that meaning can be very far from 'impact'. Although having a single number to judge quality is indeed simple, it can lead to a shallow understanding of something as complicated as research. Numbers are not inherently superior to sound judgments.
The report, written by mathematicians, promotes the sensible use of citation statistics in evaluating research and points out several common misuses. The authors recognize that assessment must be practical and that easily derived citation statistics will be part of the process, but caution that citations provide only a limited and incomplete view of research quality. Research is too important, they say, for its value to be measured with a single coarse tool.
(This is a précis of the press release accompanying publication of the report; see links above.)
Further discussion of the report, together with other matters related to citation and quality metrics, is taking place online at the Nature Network Citation in Science group, which all are welcome to join.