Blog Ameli
Reflection and Analysis on Science Communication

The geopolitical games of research assessment

María Alejandra Tejada Gómez

Research assessment has undergone significant transformation in recent years. One of the main tensions is its corporatization by databases (Web of Science, WoS, and Scopus) and by rankings mediated by the impact factor. Some have called this phenomenon the commodification or corporatization of knowledge production, in which every type of knowledge and every form of dissemination has had to be framed within templates that come, mainly, from the basic sciences (Tejada-Gómez, 2019).

Positions on the use of these tools in science assessment are diverse. On the one hand, tools based on information science, scientometrics and bibliometrics make it possible to visualise patterns of scientific output by field, institution and country (Moed, 2017). Multilateral organizations, rankings and science policies have adopted them as evaluation instruments tied to quality and internationalisation standards for research assessment (Gläser, 2018).

The dark side consists of handing scientific research output over to commercial corporations that index knowledge funded publicly or by universities and research institutions, and then sell it back to those same universities. This amounts to a double taxation of knowledge production, which also loses its intellectual property.

Information science has been criticised from different perspectives within the social sciences. Leckie, Given, & Buschman (2010) point to the absence of critical discourses that address: symbolic violence in knowledge production (Bourdieu); the democracy of knowledge (Habermas); power-knowledge (Foucault); resistance (Michel de Certeau); creating spaces for the other (Jacques Derrida); critical information and modernity (Heidegger); documenting human and non-human associations and laboratory life (Latour); and the laws of the markets (Callon).

Some responses to these scientific production dynamics have come from China, through measures regulating scientific data known as Data Sovereignty (Jing & Yin, 2018): applicable laws and regulations governing data privacy and State secrets, under which data confidentiality agreements are a priority but data are also gathered for the benefit of institutions and the State, indexed, and used to generate the country's own measures and indicators of knowledge production.

Other responses focus on developing new models of research practice and dissemination, such as the open access movement, which stresses good practices of research assessment as well as responsible research assessment through qualitative and quantitative criteria in disciplinary and local contexts. Among the most representative are: the Altmetrics Manifesto (Priem et al., 2010); the San Francisco Declaration on Research Assessment (DORA, 2012); the Leiden Manifesto, a guide to improving research evaluation practices (Hicks, Wouters, Waltman, de Rijcke, & Rafols, 2015); the UK Research Excellence Framework (REF) report "The Metric Tide" on responsible metrics (2015); and the European Union's report on responsible metrics and evaluation for open science (Wilsdon, 2017).

The measurement of research impact has taken on new meanings beyond the Impact Factor, encompassing social, professional, economic, technological, political, organizational, environmental, health, cultural, symbolic, and educational impact (Benoit & Doré, 2004). Researchers in the United Kingdom must submit reports with dissemination products that respond to the social impact of their research: new forms of dissemination that reach different actors of interest (stakeholder engagement), respond to public policies, and are cited by professional communities and other kinds of audiences (Rafols, 2018).

Technologies have generated new ways of producing knowledge: in the era of big data and data science, we will work with research data in real time through Current Research Information Systems (CRIS). Other forms of dissemination are tied to preprints and specialised repositories (Robinson-García, Repiso, & Torres-Salinas, 2018).

Latin American Context

In Latin America, national policies for measuring scientific groups, output and journals are based on top-tier products mediated by database quartiles, such as those of WoS and Scopus, the same criteria the rankings consider. The indicator becomes the end rather than the means: numbers are prioritised over the quality of scientific content. The result is a patrimonial detriment to Latin American scientific production, which is not represented in such corporate systems. At the bottom of the discussion lies the incentive system that universities define for the benefit of researchers.

It is urgent for the region to work on assessment indicators appropriate to the forms and models of production developed in our contexts (Alperin & Fishman, 2015). These corporations are the big players of the science system, but other forms of production that represent a field of knowledge or a particular region, by discipline or context, should not remain unacknowledged. Hence, it is advisable to open the door to a diversification of knowledge dissemination that responds to disciplinary and contextual needs; for instance, by developing manuals of good practices for scientific dissemination and production in each field of knowledge.

Some countries have included other databases, such as SciELO, in their research assessment models; its scope and representation of the region are greater, although still selective. Largely unrecognised is the hard work of databases such as Redalyc, disciplinary and institutional repositories, digital libraries, and analyses with Google Scholar, which better represent the knowledge of the region (Romero-Torres et al., 2013).

In terms of research assessment, it is advisable to follow best practices combining quantitative and qualitative methods, bearing in mind the reflections proposed in the Leiden Manifesto (Hicks, Wouters, Waltman, de Rijcke, & Rafols, 2015), the Altmetrics Manifesto (Priem et al., 2010), the San Francisco DORA Declaration on Research Assessment (2012), and the UK's responsible metrics (Wilsdon, 2017).

REFERENCES:

Benoit, G., & Doré, C. (2004). Measuring the impacts of science: Beyond the economic dimension. Canada: CSIIC Working Paper, CSIIC.

Gläser, J. (2018). Accounting for field-specific research practices in surveys. 23rd International Conference on Science and Technology Indicators (STI 2018): Science, technology and innovation indicators in transition. Leiden, Netherlands: CWTS. Available at: https://openaccess.leidenuniv.nl/bitstream/handle/1887/65277/STI2018_paper_223.pdf?sequence=1

Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429-431.

Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved from http://altmetrics.org/manifesto

Jing, C., & Yin, A. (2018). Brief introduction to the new measures regulating scientific data in China. Shanghai: Reed Smith Client Alerts.

Leckie, G. J., Given, L. M., & Buschman, J. (2010). Critical Theory for Library and Information Science: Exploring the Social from Across the Disciplines. Santa Barbara, California: ABC-CLIO.

Moed, H. F. (2017). Applied Evaluative Informetrics. Springer International Publishing. Retrieved from https://arxiv.org/abs/1705.06110

Molas, J., & Salter, A. (2010). Measuring third stream activities. Science and Technology Policy Research (SPRU).

Rafols, I. (2018, September 10). S&T indicators 'in the wild': Contextualisation and participation for responsible metrics. CWTS Blog. Retrieved from https://www.cwts.nl/blog?article=n-r2u254&title=st-indicators-in-the-wild-contextualisation-and-participation-for-responsible-metrics

Robinson-García, N., Repiso, R., & Torres-Salinas, D. (2018). Perspectivas y retos de los profesionales de la evaluación científica. El profesional de la información, 27(3), 461-466.

Romero-Torres, M., Acosta-Moreno, L. A., & Tejada-Gómez, M. A. (2013). Ranking de revistas científicas en Latinoamérica mediante el índice h: estudio de caso Colombia. Revista Española de Documentación Científica, 36(1). Retrieved from http://redc.revistas.csic.es

Tejada-Gómez, M. A. (2019). University research governance responds to the tensions of the Scientific Journal Policy Publindex in Colombia (SJPPC). Doctoral thesis in progress. Enschede, Netherlands: University of Twente.

Vinck, D. (2013). Las culturas y humanidades digitales como nuevo desafío para el desarrollo de la ciencia y la tecnología en América latina. Universitas Humanística, 76(76), 51-72. Retrieved from http://revistas.javeriana.edu.co/index.php/univhumanistic

Wilsdon, J., et al. (2017). Next-generation metrics: Responsible metrics and evaluation for open science. Report of the European Commission Expert Group on Altmetrics. Brussels: European Commission.

If you would like to post on the blog, write to ameli.conocimientoabierto@gmail.com describing your experience and/or contributions in Open Access. Participation in the blog is subject to evaluation by the coordinators. The writings published on the Ameli Blog are the responsibility of their authors and do not necessarily reflect the opinion of AmeliCA.



