The geopolitical games of research assessment
María Alejandra Tejada Gómez
Research assessment has undergone a profound transformation in recent years. One of the main tensions is its corporatization by commercial databases (Web of Science, WoS, and Scopus) and by rankings mediated by the impact factor. Some have called this phenomenon the commodification or corporatization of knowledge production, in which every type of knowledge and form of dissemination has had to fit templates that come mainly from the basic sciences (Tejada-Gómez, 2019).
Positions on the use of these tools in science assessment are diverse. On the one hand, tools grounded in information science, scientometrics, and bibliometrics make it possible to visualize patterns of scientific output by field, institution, and country (Moed, 2017). Multilateral organizations, rankings, and science policies have used them as evaluation instruments tied to quality and internationalization standards for research assessment (Gläser, 2018).
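As a minimal illustration of the kind of pattern analysis these tools enable, consider the following sketch in Python, which counts publication output by country and by field. The records and field names are invented for illustration, standing in for a real WoS or Scopus export.

    # Minimal sketch: descriptive output counts by country and field.
    # The records are hypothetical, standing in for a WoS/Scopus export.
    from collections import Counter

    publications = [
        {"title": "Paper A", "country": "Colombia", "field": "Biology"},
        {"title": "Paper B", "country": "Brazil", "field": "Physics"},
        {"title": "Paper C", "country": "Colombia", "field": "Sociology"},
        {"title": "Paper D", "country": "Mexico", "field": "Biology"},
    ]

    by_country = Counter(p["country"] for p in publications)
    by_field = Counter(p["field"] for p in publications)

    for country, n in by_country.most_common():
        print(f"{country}: {n} publications")
    for field, n in by_field.most_common():
        print(f"{field}: {n} publications")

Real analyses aggregate millions of such records, which is precisely what gives the indexing corporations their leverage.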
On the other hand, the dark side is that scientific output, funded publicly or by universities and research institutions, is handed over to commercial corporations that index it and then sell it back to those same universities. Knowledge production is thus taxed twice while losing its intellectual property.
Information science has been criticized from different perspectives within the social sciences. Leckie, Given, and Buschman (2010) point to the lack of critical discourses that engage symbolic violence in knowledge production (Bourdieu), the democracy of knowledge (Habermas), power-knowledge (Foucault), resistance (Michel de Certeau), creating spaces for the other (Jacques Derrida), critical information and modernity (Heidegger), documenting human and non-human associations in laboratory life (Latour), and the laws of the markets (Callon).
Some responses to these movements in scientific production have come from China, whose measures regulating scientific data, known as data sovereignty (Jing & Yin, 2018), establish laws and regulations governing data privacy and State secrets. Data confidentiality agreements take priority, but data are also gathered for the benefit of institutions and the State, indexed, and used to generate the country's own metrics and indicators of knowledge production.
Other responses focus on developing new models of research practice and dissemination, such as the open access movement, which stresses good practices of research assessment as well as responsible assessment with qualitative and quantitative criteria grounded in disciplinary and local contexts. Among the most representative are the Altmetrics Manifesto (Priem et al., 2010), the San Francisco Declaration on Research Assessment (DORA, 2012), the Leiden Manifesto, a guide to improving research evaluation practices (Hicks, Wouters, Waltman, de Rijcke, & Rafols, 2015), the UK report "The Metric Tide" on responsible metrics, produced in the context of the Research Excellence Framework (REF) (2015), and the European Union's report on responsible metrics and evaluation for open science (Wilsdon, 2017).
The measurement of research impact has taken on new meanings beyond the Impact Factor, encompassing social, professional, economic, technological, political, organizational, environmental, health, cultural, symbolic, and educational impact (Godin & Doré, 2004). Researchers in the United Kingdom now report dissemination products that respond to the social impact of their research, using new forms of dissemination that engage stakeholders, inform public policies, and reach professional communities and other audiences (Rafols, 2018).
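For context, the indicator these broader notions move beyond is a simple citation ratio. The sketch below, in Python, shows the classic two-year Impact Factor calculation; the figures are invented for illustration.

    def impact_factor(citations_to_prev_two_years: int,
                      citable_items_prev_two_years: int) -> float:
        """Classic two-year impact factor for year Y: citations received
        in Y to items published in Y-1 and Y-2, divided by the number of
        citable items published in Y-1 and Y-2."""
        return citations_to_prev_two_years / citable_items_prev_two_years

    # Invented example: 150 citations received in 2018 to articles
    # published in 2016-2017, out of 100 citable items from those years.
    print(impact_factor(150, 100))  # 1.5

The simplicity of the ratio is part of the critique: a single journal-level number stands in for the many dimensions of impact listed above.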
Technologies have also generated new ways to produce knowledge: in the era of big data and data science, we work with research data in real time through Current Research Information Systems (CRIS). Other forms of dissemination are tied to preprints and specialized repositories (Robinson-García, Repiso, & Torres-Salinas, 2018).
Latin American Context
In Latin America, national policies for measuring research groups, output, and scientific journals are based on top products mediated by the quartiles of databases such as WoS and Scopus, the same criteria used by the rankings. The indicator becomes the end rather than the means, and numbers are prioritized over the quality of scientific content. The result is a patrimonial detriment to Latin American scientific production, which is poorly represented in these corporate systems. At the bottom of the discussion lies the incentive system that universities define to reward researchers.
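To make the mechanism concrete, the sketch below shows one common way quartiles (Q1 to Q4) are assigned: journals in a subject category are ranked by an indicator such as the impact factor, and the ranked list is split into four parts. Exact conventions vary by database; the journal names and values here are invented.

    def assign_quartiles(journals: dict) -> dict:
        """Rank journals by indicator value (descending) and label each
        with a quartile Q1..Q4 according to its rank position."""
        ranked = sorted(journals, key=journals.get, reverse=True)
        n = len(ranked)
        return {name: f"Q{min(4, int(4 * i / n) + 1)}"
                for i, name in enumerate(ranked)}

    category = {"Journal A": 3.2, "Journal B": 1.1,
                "Journal C": 0.4, "Journal D": 2.5}
    print(assign_quartiles(category))
    # {'Journal A': 'Q1', 'Journal D': 'Q2', 'Journal B': 'Q3', 'Journal C': 'Q4'}

Note that a journal's quartile depends entirely on which journals the database chooses to index in the category, which is where regional underrepresentation enters.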
It is urgent for the region to work on assessment indicators appropriate to the forms and models of production developed in our contexts (Alperin & Fishman, 2015). These corporations are the great players of the science system, but other forms of production, those that represent a discipline, a field of knowledge, or a particular region, should not remain unacknowledged. Accordingly, it is advisable to open the door to a diversification of knowledge dissemination that responds to disciplinary and contextual needs, for instance by developing manuals of good practice for scientific dissemination and production in each field of knowledge.
Some countries have included other databases, such as SciELO, in their models of research assessment and measurement; its scope and representation of the region are greater, although it is still selective. Still largely unacknowledged is the hard work of databases such as Redalyc, disciplinary and institutional repositories, digital libraries, and analyses based on Google Scholar, which better represent the knowledge of the region (Romero-Torres et al., 2013).
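As a concrete example of an alternative indicator, the sketch below computes the h-index used in studies such as Romero-Torres et al. (2013) to rank regional journals: the largest h such that h items have at least h citations each. The citation counts are invented.

    def h_index(citations: list) -> int:
        """Largest h such that h items have at least h citations each."""
        ranked = sorted(citations, reverse=True)
        h = 0
        for i, c in enumerate(ranked, start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    print(h_index([10, 8, 5, 4, 3]))  # 4: four items with >= 4 citations

Because it can be computed from any citation source, including Google Scholar, the h-index does not depend on a journal being indexed in WoS or Scopus.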
In terms of research assessment, it is advisable to follow best practices that combine quantitative and qualitative methods, bearing in mind the reflections proposed in the Leiden Manifesto (Hicks, Wouters, Waltman, de Rijcke, & Rafols, 2015), the Altmetrics Manifesto (Priem et al., 2010), the San Francisco Declaration on Research Assessment (DORA, 2012), and the UK responsible metrics agenda (Wilsdon, 2017).
REFERENCES:
Gläser, J. (2018). Accounting for field-specific research practices in surveys. 23rd International Conference on Science and Technology Indicators (STI 2018): Science, Technology and Innovation Indicators in Transition. Leiden, Netherlands: CWTS. Retrieved from https://openaccess.leidenuniv.nl/bitstream/handle/1887/65277/STI2018_paper_223.pdf?sequence=1
Godin, B., & Doré, C. (2004). Measuring the Impacts of Science: Beyond the Economic Dimension. Canada: CSIIC Working Paper.
Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429-431.
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved from http://altmetrics.org/manifesto.
Jing, C., & Yin, A. (2018). Brief introduction to the new measures regulating scientific data in China. Shanghai: Reed Smith Client Alerts.
Leckie, G. J., Given, L. M., & Buschman, J. (2010). Critical Theory for Library and Information Science: Exploring the Social from Across the Disciplines. Santa Barbara, California: ABC-CLIO.
Moed, H. F. (2017). Applied Evaluative Informetrics. Springer International Publishing. Retrieved from https://arxiv.org/abs/1705.06110
Molas, J., & Salter, A. (2010). Measuring third stream activities. Science and Technology Policy Research (SPRU).
Rafols, I. (2018, September 10). S&T indicators 'in the wild': Contextualisation and participation for responsible metrics. CWTS Blog. Retrieved from https://www.cwts.nl/blog?article=n-r2u254&title=st-indicators-in-the-wild-contextualisation-and-participation-for-responsible-metrics
Robinson-García, N., Repiso, R., & Torres-Salinas, D. (2018). Perspectivas y retos de los profesionales de la evaluación científica. El profesional de la información, 27(3), 461-466.
Romero-Torres, M., Acosta-Moreno, L. A., & Tejada-Gómez, M. A. (2013). Ranking de revistas científicas en Latinoamérica mediante el índice h: estudio de caso Colombia. Revista Española de Documentación Científica, 36(1). Retrieved from http://redc.revistas.csic.es.
Tejada-Gómez, M. A. (2019). University research governance: Responses to the tensions of the Scientific Journal Policy Publindex in Colombia (SJPPC). Doctoral thesis in progress. Enschede, Netherlands: University of Twente.
Vinck, D. (2013). Las culturas y humanidades digitales como nuevo desafío para el desarrollo de la ciencia y la tecnología en América Latina. Universitas Humanística, 76(76), 51-72. Retrieved from http://revistas.javeriana.edu.co/index.php/univhumanistic
Wilsdon, J., et al. (2017). Next-generation metrics: Responsible metrics and evaluation for open science. Report of the European Commission Expert Group on Altmetrics. Brussels: European Commission.