The Colombian journal measurement system (Publindex): paradoxes of a system that devalues knowledge produced locally
Source: Text published in: ASEUC (2019). Periodical publication: ¿Para indexar o para leer?. Colombia: Unilibros de Colombia. Available at: http://www.aseuc.org.co/unilibros/uflip/UflipUnilibros26/page_1.html (Published at the request of the author).
Wilson López López
The lack of consistency in the design of measurement systems at Colciencias, the Colombian Department of Science, Technology and Innovation, creates costly paradoxes affecting the production of knowledge in Colombia. In this editorial, I will address some of these issues and their pernicious consequences for scientific output and visibility. I have discussed other related topics in previous editorials and papers (Aguillo, Uribe & López-López, 2017; López-López, 2018; Acevedo-Triana et al., 2018; López-López et al., 2018).
Years ago, the Colombian system decided to enter a process of internationalisation of the knowledge produced locally, setting in motion several actions to discourage the creation of new journals and even to remove some of the existing ones. This process was meant to be supplemented by actions directed at getting Colombian journals covered by Web of Science (WoS) and Scopus and, once covered, at improving their position in those databases. All this rested on the premise that such databases would ensure scientific and editorial quality, both for the journals and for their contents.
However, this premise has been challenged by the Leiden Manifesto (2014) and Hicks et al. (2015), amongst others, which have criticised the proposed equivalence between journal quality and journal citation, as well as the lack of transparency and the technical flaws in some of the metrics used to rank journals.
Movements reacting to such criticisms keep gaining traction. Examples are the Declaration on Research Assessment (DORA) and the decisions by countries such as Germany and England to incorporate expert panels into assessment. Even in the United States, quartiles are not central to evaluation; Clarivate Analytics, the owner of WoS, recently published an open access report stating that bibliometric indicators are not only plagued with technical issues, but that they should probably not be used as a simple way to evaluate journals, researchers, and institutions (Adams, McVeigh, Pendlebury, & Szomszor, 2019). It is very striking that this report was produced by one of the key references used by the Colombian science and technology system, and yet the current Publindex process is still based on those metrics.
A very complete description of Publindex's systemic problems was recently offered by Flórez Carranza (2018). These issues include difficulties in recording information; pointless requests for information that lead to duplication; comparisons between journals that ignore the number of articles published, their internationality, or citation growth; disregard for the efforts journals undertake to get covered by WoS or Scopus; use of the highly criticised H5 metric; retroactive application of new criteria; ignorance of citation dynamics within disciplines, with ranking criteria that do not reflect how certain scientific communities work; and even the use of bibliometric principles to set the timing of calls.
These changes to the Publindex evaluation model have been pushed by Decreto 1279 of 2002, which regulates incentives for lecturers at public universities but which influences editorial and academic work nationally, even at private universities. This raises the question of whether regulations on incentives at public universities should govern national production; if so, a further question is whether private universities should adopt these criteria, even when they are detrimental to their own interests and devalue their publications.
The quartile rankings of WoS and Scopus, and those of their metric subsystems such as the JCR, the SJR (by the SCImago group) and CiteScore (Elsevier), have drawn criticism; the most visible came from DORA (2012) and concerns the way quartiles are established. The methodology can produce cases where, for instance, a journal in an area with few citations, and consequently low citation counts, sits in a higher quartile. It can also motivate skilful citation engineering, whereby a journal decreases the number of articles it publishes and secures enough citations to surpass the number of articles published annually; in such cases, a journal that publishes quality content but a higher number of papers suffers when the index is calculated.
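The arithmetic behind this citation-engineering incentive can be made concrete. The sketch below is an illustrative simplification, not Clarivate's or Elsevier's actual algorithm: it uses a bare citations-per-article ratio (the basic shape of impact-factor-style metrics) and hypothetical numbers to show why cutting output with the same citation total raises a journal's score.

```python
# Illustrative sketch only: a simple citations/articles ratio in the
# spirit of impact-factor-style metrics (NOT the real JCR or CiteScore
# calculation). All figures are hypothetical.

def impact_ratio(citations: int, articles_published: int) -> float:
    """Citations received to recent items, divided by the number of
    citable items published in the same window."""
    return citations / articles_published

# Two hypothetical journals attracting the same 200 citations:
selective = impact_ratio(citations=200, articles_published=50)
prolific = impact_ratio(citations=200, articles_published=200)

print(selective)  # 4.0 -- fewer papers, higher score
print(prolific)   # 1.0 -- more quality papers, lower score
```

Because only the ratio is ranked, the journal that publishes more is mechanically penalised, which is exactly the distortion DORA highlights.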
Other criticisms have focused on inconsistencies in the calculation of the indexes, since the algorithms used do not always conform to the relationship they claim to express. This problem is therefore not merely a matter of the methodological and technological limitations of the companies that create these indexes; indeed, those companies can develop, and have developed, several other indicators (Adams et al., 2019).
The worst problem, though, is that the Colombian system ignores these criticisms and fails to respond to them technically. This has two serious consequences. On the one hand, the contents carried by Colombian journals are undervalued: the system not only seems to care little about the country's low investment in science and technology, but also appears intent on devaluing journals through a scheme whose stated purpose is to reduce incentives for public university lecturers.
The other consequence is the discouragement of internationalisation, which directly contradicts one of the system's stated goals: the system itself hinders the ambitions of national journals to become indexed by WoS or Scopus, since it prefers Google Scholar's H5 index, which has normalisation issues, over WoS and Scopus citation counts. The result is a troublesome paradox.
Even though editors and researchers have a lot to say, we have very little influence on the policies governing the Colombian scientific system. Decision makers at universities do not seem to understand the cost of undermining their publications and national research; it seems they ignore how much has been invested in knowledge: money, technological development, continued training and qualification, the time of editors and editorial teams, the hours researchers spend writing, the articles that resulted from such projects, and more.
The unavoidable question arising from this situation is: who is going to pay for this costly process of diminishing the value of national journals? Will Colciencias cover this progressive and massive devaluation of our journals and of a large part of our research? The lack of effective responses by researchers, editors, institutional boards, and those responsible for the system brings echoes of García Márquez's Chronicle of a Death Foretold, which seems applicable to Publindex's current situation. We all know what is going on, but by doing next to nothing, we will in the end be forced to watch scientific publishing and national research in Colombia wither and die.
Acevedo-Triana, C., Torres, M., Aguilar-Bustamante, C., Hurtado-Parrado, C., Silva, L. M., & López-López, W. (2018). Analysis of Productivity and Impact of Colombian Journals in Psychology between 2000 and 2016. Revista Latinoamericana de Psicología, 50(3), 145-159. http://doi.org/10.14349/rlp.2018.v50.n3.2
Adams, J., McVeigh, M., Pendlebury, D., & Szomszor, M. (2019). Profiles, not metrics. Retrieved from: https://clarivate.com/g/profiles-not-metrics/
Aguillo Cano, I., Uribe Tirado, A., & López López, W. (2017). Visibilidad de los investigadores colombianos según sus indicadores en Google Scholar y ResearchGate. Diferencias y similitudes con la clasificación oficial del sistema nacional de ciencia-COLCIENCIAS. Revista Interamericana de Bibliotecología, 40(3), 221-230. doi: 10.17533/udea.rib.v40n3a03
DORA (2012). San Francisco Declaration on Research Assessment. Retrieved from: https://sfdora.org/read/es/. See also: https://twitter.com/DORAssessment
Flórez Carranza, F. (2018). Nociones de calidad e impacto: el lugar de las revistas indexadas de ciencias jurídicas en el nuevo sistema colombiano de competitividad, ciencia, tecnología e innovación. Vniversitas, 67(137). https://doi.org/10.11144/Javeriana.vj137.ncei
Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429-431. Retrieved from: http://www.ingenio.upv.es/sites/default/files/adjunto-pagina-basica/manifiesto_es.pdf
López López, W. (2018). Sobre la evaluación de la investigación y los investigadores: Críticas a las métricas y recomendaciones. Universitas Psychologica, 17(4), 1-2. https://doi.org/10.11144/Javeriana.upsy17-4.seii
López-López, W., Caycedo, C., Acevedo-Triana, C., Hurtado-Parrado, C., Silva, L., & Aguilar-Bustamante, M. C. (2018). Training, academic and professional development in psychology in Colombia: Challenges and perspectives. In G. Rich, L. K. De Souza, L. Zinkiewicz, J. Taylor, & J. L. Binti Jaafar (Eds.), Teaching Psychology Around the World (Vol. 4, pp. 53-79). UK: Cambridge Scholars Publishing.
Manifiesto de Leiden (2014). Manifiesto de Leiden sobre indicadores de investigación. Retrieved from: http://www.leidenmanifesto.org/
Presidency of the Republic of Colombia. Decreto 1279 (2002). By which the salary and benefits regime for lecturers at state universities is established. Colombia.