Letter by Sipido and Glänzel Regarding Article, “Poorly Cited Articles in Peer-Reviewed Cardiovascular Journals from 1997 to 2007: Analysis of 5-Year Citation Rates”
To the Editor:
There is no doubt that the drive for the quantification of scientific value was initially fueled by legitimate scientific curiosity and that it has helped the biomedical field to gain insight into the dissemination of results in a novel way. The work of Eugene Garfield and the Science Citation Index has been instrumental in tracing related publications and in building networks of science and communities. Further quantification indexes have opened new ways of looking at research, the scientific community, and the publication process. There is, however, grave concern that all of this is degenerating into a metrics mania in which numbers are generated for the sake of numbers, scientific rigor is no longer applied, and the evaluation of quality is superseded by the use of simple output statistics.1,2
In a recent issue of Circulation, Ranasinghe et al3 present an analysis of poorly cited articles in journals in the cardiovascular field. Are these data of sufficient quality? Are they novel? Do they impact the field?
Bibliometrics is a discipline with its own methodology and need for rigor. The methods of Ranasinghe et al3 in selecting their corpus lack rigor. It is stated that the list of journals is based on SCImago, but at least 3 journals that contribute substantially to the field, in terms of both the volume and the impact of the articles they publish, are not included. Conversely, other journals are included that cannot be traced for the full period of analysis. Any automatic retrieval should be followed by independent validation by experts, and multiple search strategies need to be compared and combined. The selection of included versus excluded article types is not rigorous either, because reviews are excluded along with “other noncitable items,” at odds with the quoted reference and with bibliometric studies (see, eg, Glanzel and Moed4). Other items relevant in terms of impact, such as letters5 and case reports, were also excluded.
The application of mathematical-statistical tests cannot disguise the otherwise somewhat rudimentary use of bibliometric methods. Recent studies providing a more differentiated methodology have not been considered.6 A major issue is the application of universal thresholds, first for the delineation of what the authors consider poorly cited articles and later for the determination of poorly cited journals. This selection is necessarily biased, because the predefined universal thresholds are arbitrary and do not take into account the different nature of articles or the different scopes and profiles of the journals. In particular, basic research, clinical research, and case studies are subject to different standards of citation impact.
Another methodological issue concerns the trend analysis. Ranasinghe et al3 find that the share of poorly cited articles decreases over time, but they also draw conclusions from the increase in their absolute number. This increase is a consequence of the fast growth of the research literature in the discipline. The danger of relying on absolute numbers has already been pointed out in the context of “inflationary bibliometric values.”7
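The arithmetic behind this point can be made concrete with a small sketch. All counts below are hypothetical, invented purely for illustration and not taken from Ranasinghe et al: when the total corpus grows fast enough, the share of poorly cited articles can fall steadily while their absolute number still rises.

```python
# Hypothetical counts illustrating shares vs absolute numbers under
# literature growth. These figures are invented for illustration only.
years = [1997, 2002, 2007]
total_articles = [10_000, 16_000, 25_000]  # hypothetical, growing corpus
poorly_cited = [4_000, 5_600, 7_500]       # hypothetical absolute counts

# The relative share declines (40% -> 35% -> 30%) even though the
# absolute count of poorly cited articles grows (4000 -> 7500).
shares = [p / t for p, t in zip(poorly_cited, total_articles)]
for year, count, share in zip(years, poorly_cited, shares):
    print(year, count, f"{share:.0%}")
```

This is why conclusions drawn from the rising absolute number, without normalizing for the growth of the field, can mislead.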
All of this implies that, as a bibliometric study, the selection of data and their analysis have major shortcomings.
The data are also not particularly novel. The analysis of poorly cited articles has been conducted before on a larger scale, if not in the particular case of cardiovascular research. It is not surprising that the resulting ranking of journals is closely related to rankings based on the impact factor, given that the same elements are involved (ie, number of articles and number of citations). A high number of poorly cited articles will obviously reduce the impact factor.
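The dilution effect mentioned above follows directly from the impact factor being a ratio of citations to citable items. A minimal sketch, using invented numbers (not data from the study), shows how adding poorly cited articles drags the ratio down:

```python
# Sketch of the impact-factor arithmetic: the impact factor is
# (citations to recent items) / (number of citable items), so items
# that attract few citations lower the ratio. Numbers are hypothetical.
def impact_factor(citations: int, citable_items: int) -> float:
    return citations / citable_items

base = impact_factor(citations=3_000, citable_items=600)     # 5.0
# Add 200 poorly cited articles contributing only 100 citations:
diluted = impact_factor(citations=3_100, citable_items=800)  # 3.875
print(base, diluted)
```

Because both rankings rest on the same two quantities, a strong correlation between them is expected rather than informative.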
The impact of a study is determined by the quality and positioning of the findings and by the perspective it offers. The discussion focuses on the latter and fails to address the limitations of the present data. This less than critical approach illustrates the problems in the debate on the impact of research. This is a highly relevant discussion, yet the difficulties involved in assessing impact are many. Nevertheless, with a careful and comprehensive approach, useful information can be obtained at the aggregate level, as illustrated by the analysis of the return on investment in biomedical research, with cardiovascular disease as a showcase.8
In conclusion, the report by Ranasinghe et al3 may generate further useful discussion on the nature of publications and of impact evaluation, but it is neither complete nor a very scholarly analysis, and the data should therefore be interpreted with caution. The evaluation of research based on journal metrics, at the aggregate and individual levels, is not likely to disappear soon and will remain high on the agenda of policy makers and scientists. Studies in this field therefore need expertise and scientific rigor.
Karin R. Sipido, MD, PhD
Department of Cardiovascular Sciences
University of Leuven
Wolfgang Glänzel, Dr. rer. nat., PhD
Center for R&D Monitoring
Department of Managerial Economics, Strategy & Innovation
University of Leuven
We discussed the commentary with Alan Daugherty, Editor-in-Chief of Arteriosclerosis, Thrombosis, and Vascular Biology, Thomas Lüscher, Editor-in-Chief of the European Heart Journal, and Irving Zucker, Editor-in-Chief of the American Journal of Physiology-Heart and Circulatory Physiology.
Dr Sipido is Editor-in-Chief of Cardiovascular Research; Dr Glänzel is Editor-in-Chief of Scientometrics.
© 2016 American Heart Association, Inc.
1. Van Noorden R.
2. San Francisco Declaration on Research Assessment (DORA). http://am.ascb.org/dora/. Accessed May 25, 2015.
3. Ranasinghe I, Shojaee A, Bikdeli B, Gupta A, Chen R, Ross JS, Masoudi FA, Spertus JA, Nallamothu BK, Krumholz HM. Poorly cited articles in peer-reviewed cardiovascular journals from 1997 to 2007: analysis of 5-year citation rates. Circulation.
Huffman MD, Baldridge A, Bloomfield GS, Colantonio LD, Prabhakaran P, Ajay VS, Suh S, Lewison G, Prabhakaran D.
8. European Medical Research Councils (EMRC). A Stronger Biomedical Research for a Better European Future. 2011. http://www.esf.org/fileadmin/Public_documents/Publications/emrc_wpII.pdf. Accessed May 25, 2015.