On the use of the number of citations in journals to evaluate Science--short-term citations, particularly / Sobre o uso do número de citações em periódicos para avaliar Ciência--em especial, citações de curto prazo.
For some time now, these procedures have drawn considerable criticism: they rely on specific metrics to establish who has more or less value in the Brazilian graduate system--who will receive a higher or lower score and, consequently, more or fewer funds and more or less scientific capital. Various research sectors have been presenting proposals in pursuit of new perspectives for such assessments, in the expectation that they will be adopted with due attention.
While these rules remain in force, some studies have reported very interesting results that reinforce the arguments above. This is the case of the post by Lilian Nassi-Calo (1) on the blog "SciELO em Perspectiva", under the title "A miopia dos indicadores bibliométricos" ("The myopia of bibliometric indicators") (2). The author argues that, faced with the large number of researchers, projects, papers and the like that must be considered, bibliometric indicators ease the work of evaluators by simplifying it (or making it simplistic). The researchers under assessment also follow this path when they share those values and even reinforce them. In the words of the author, "The use of bibliometric indicators for evaluation of science is a ubiquitous practice, despite the fact that there is no clear link between citations and quality, impact or scientific merit". She adds that "when one approaches innovation--an inherent characteristic of scientific research--the link is even more disconnected."
Nassi-Calo presents the results of an article published in the journal "Nature" by Stephan and colleagues (3), whose hypothesis is that excessive reliance on citations from the past two or three years does not favor, and may even discourage, the publication of innovative studies. In her words:
[...] the authors analyzed citations in the Web of Science of more than 660,000 papers published between 2001 and 2015, categorized as research with high, moderate, and no degree of innovation. As a proxy for degree of innovation, the researchers assessed the list of references of the papers in search of unusual patterns of combination. In this analysis, the authors concluded that highly innovative papers take longer to be cited than moderately innovative and non-innovative papers. Among highly innovative papers, there were two types of behavior: either they were very frequently cited--citations began to increase after 3-4 years and continued to increase until 15 years after publication--or they were ignored, in comparison to papers with no degree of innovation. However, it should be noted that within 3 years after publication, the likelihood that a highly innovative paper would rank among the 1% most cited papers was lower than the probability for articles with no degree of innovation. Therefore, the authors concluded that the current research evaluation system underestimates works that may have a high impact in long-term assessment. It is also important to stress that papers which proved to have a high impact in the course of time were published in journals with lower impact factors.
The words of Stephan's team illustrate the adverse consequences of evaluation procedures focused on citation indicators, particularly short-term ones: "[...] the more we are linked to short-term bibliometric indicators, the less we can reward research with a high potential to go beyond borders--as well as those who do such research."
At the end of her post, Nassi-Calo makes recommendations aimed at researchers, funding agencies, reviewers, editors and universities, stressing the importance of effectively recovering both quantitative and qualitative perspectives so as not to penalize proposals with greater potential to widen the horizons of science.
Shirley Donizete Prado and Fabiana Bom Kraemer
(1) Lilian Nassi-Calo is a chemist from the Institute of Chemistry of the University of São Paulo (IQ-USP) and holds a PhD in Biochemistry from the same institution. She was later a fellow of the Alexander von Humboldt Foundation in Würzburg, Germany, and, after completing her studies, was a professor and researcher at IQ-USP. She also worked in the private sector as an industrial chemist, currently serves as Coordinator of Scientific Communication at BIREME/PAHO/WHO, and is a collaborator of SciELO.
(2) NASSI-CALO, L. A miopia dos indicadores bibliométricos [online]. SciELO em Perspectiva, 2017 [viewed 27 August 2018]. Available from: https://blog.scielo.org/blog/2017/06/01/a-miopia-dos-indicadores-bibliometricos/.
(3) STEPHAN, P., VEUGELERS, R. and WANG, J. Reviewers are blinkered by bibliometrics. Nature [online]. 2017, vol. 544, no. 7651, pp. 411-412 [viewed 14 May 2017]. DOI: 10.1038/544411a. Available from: http://www.nature.com/news/reviewers-are-blinkered-by-bibliometrics-1.21877
Author: Prado, Shirley Donizete; Kraemer, Fabiana Bom
Publication: Demetra: Food, Nutrition & Health
Date: Sep 1, 2018