Leiden rankings to add open-source version in 2024 – Research Professional News

“The Centre for Science and Technology Studies at Leiden University in the Netherlands, which publishes university rankings, plans to start a new ranking based entirely on open data and open algorithms in 2024.

The open-source CWTS ranking will sit alongside listings produced, as in previous years, based on bibliographic data from the Web of Science database of Clarivate*….”

Measuring open access publications: a novel normalized open access indicator

Abstract:  The issue of open access (OA) to scientific publications is attracting growing interest within the scientific community and among policy makers. Open access indicators are being calculated. In its 2019 ranking, the “Centre for Science and Technology Studies” (CWTS) provides the number and the share of OA publications per institution. This gives an idea of the degree of openness of institutions. However, not taking into account the disciplinary specificities and the specialization of institutions makes comparisons based on the shares of OA publications biased. We show that OA publishing practices vary considerably according to discipline. As a result, we propose two methods to normalize the OA share: by WoS subject categories and by disciplines. The Normalized Open Access Indicator (NOAI) corrects for disciplinary composition and allows better comparability of institutions or countries.
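To make the normalization concrete, here is a minimal sketch of one plausible form of a field-normalized OA indicator: an institution's observed OA output divided by the OA output expected from its disciplinary profile, using category-level baselines. The data and the observed/expected form are illustrative assumptions, not the authors' exact formula; a value above 1 would mean more OA than the institution's field mix alone predicts.

```python
# Minimal sketch of a field-normalized open access indicator (assumed form:
# observed OA publications / OA publications expected from the institution's
# disciplinary profile). All numbers are invented for illustration.
from collections import defaultdict

# (institution, subject_category, n_publications, n_oa_publications)
records = [
    ("Univ A", "Physics", 400, 320),
    ("Univ A", "History", 100,  20),
    ("Univ B", "Physics", 100,  60),
    ("Univ B", "History", 400,  40),
]

# Baseline: global OA share per subject category.
totals = defaultdict(lambda: [0, 0])        # category -> [pubs, oa_pubs]
for _, cat, n, oa in records:
    totals[cat][0] += n
    totals[cat][1] += oa
global_share = {cat: oa_pubs / pubs for cat, (pubs, oa_pubs) in totals.items()}

# Indicator: observed OA output vs. OA output expected given the field profile.
per_inst = defaultdict(lambda: [0.0, 0.0])  # institution -> [observed, expected]
for inst, cat, n, oa in records:
    per_inst[inst][0] += oa
    per_inst[inst][1] += n * global_share[cat]

for inst, (observed, expected) in sorted(per_inst.items()):
    raw = observed / sum(n for i, _, n, _ in records if i == inst)
    print(f"{inst}: raw OA share {raw:.2f}, normalized indicator {observed / expected:.2f}")
```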

How bibliometrics and school rankings reward unreliable science | The BMJ

“Metrics can reap huge rewards but, unfortunately, they’re also simple to game. And so, following Goodhart’s law—“When a measure becomes a target, it ceases to be a good measure”—citations are gamed,8 in increasingly cunning ways. Authors and editors create citation rings and cartels.9 Companies pounce on expired domains to hijack indexed journals10 and take their names, fooling unsuspecting researchers. Or researchers who are well aware of the game use this vulnerability to publish papers that cite their work.

Universities pay cash bonuses to faculty members who publish papers in highly ranked journals.11 Some institutions have reportedly even schemed to hire prominent academics who either add an affiliation to their papers or move employers outright.12 This means that those researchers’ papers—and citations—count toward the universities’ rankings. Researchers cite themselves, a lot.13 Journals have been found to encourage, or even require, authors to cite other work in the same periodical,14 and they fight over papers they think will be highly cited to win the impact factor arms race.15

Paper mills, which sell everything from authorship to complete articles, have proliferated,16 and while they’re not a new phenomenon, they have industrialised in recent years.17 They have figured out ways to ensure that authors peer review their own papers.18 In the United States, the “newest college admissions ploy” is “paying to make your teen a ‘peer-reviewed’ author.”19…”

Open access: learning from success stories at universities | blok de bid

From Google’s English:  “The document is organized in three sections. In the first part, the Working Group compares the evolution of the positions reached by its institutions (members of the 2019-2020 Open Access Working Group) in the 2019-21 editions of the CWTS Leiden Open Access ranking. The classification is based on the percentage of open access publications available, which makes it possible to evaluate the progress of the institutions in the implementation of open access. 

In the second part, four key factors that have helped the Group’s institutions achieve outstanding open access results, and thus a good position in that category of the CWTS Leiden ranking, are identified and described:

Open access policies. Institutions with strong policies of their own, beyond those required by funders, perform better than those without specific policies. These policies should place deposit workflows at the center of academic activity and promote the consolidation of an institutional team to support the implementation of open access.
 
Availability and configuration of the institutional system (repositories/CRIS). The presence of an interconnected institutional repository and research information management system (CRIS) is crucial. The document highlights the importance of capturing bibliographic metadata in the CRIS and of the workflow that transfers metadata and files, with the appropriate version of the text, to the repository, where it is offered in open access or with an embargo period. 
 
Institutional research support staff. It is critical to have a dedicated Open Access/Open Science training and support team within the institution, usually within the library. This team should offer guidance and assistance to researchers in preparing their publications for open access, validating publications in the CRIS, and meeting the requirements of funding policies. Open access training may include topics such as copyright, user licences, and research data management.
 
Collaboration and institutional commitment. Collaboration and commitment between different actors are essential to successfully introduce open access in institutions. This implies the active participation of researchers, libraries, IT services and other relevant departments. The institution should foster an environment in which open access is valued and supported, and where its importance to research and institutional reputation is recognized….”

Did a ‘nasty’ publishing scheme help an Indian dental school win high rankings? | Science | AAAS

Saveetha Dental College in Chennai, India, incentivizes undergraduate students to write research manuscripts, a practice resulting in over 1,400 scholarly works published by the school in a single year. However, an investigation by Retraction Watch revealed that these papers often systematically cite other works by Saveetha faculty, inflating citation metrics to boost the institution’s global reputation. Officials at the college deny knowledge of any concerted effort to use self-citation to enhance their standing, though external observers criticize the strategy as misleading and potentially harmful. Concerns also extend beyond self-citation, with critics pointing to the questionable quality of undergraduate research and the coercive nature of pressuring students to publish for the institution’s benefit.

 

Harnessing the Metric Tide: indicators, infrastructures & priorities for UK responsible research assessment

“This review was commissioned by the joint UK higher education (HE) funding bodies as part of the Future Research Assessment Programme (FRAP). It revisits the findings of the 2015 review The Metric Tide to take a fresh look at the use of indicators in research management and assessment. 

While this review feeds into the larger FRAP process, the authors have taken full advantage of their independence and sought to stimulate informed and robust discussion about the options and opportunities of future REF exercises. The report should be read in that spirit: as an input to ongoing FRAP deliberations, rather than a reflection of their likely or eventual conclusions. 

The report is written in three sections. Section 1 plots the development of the responsible research assessment agenda since 2015 with a focus on the impact of The Metric Tide review and progress against its recommendations. Section 2 revisits the potential use of metrics and indicators in any future REF exercise, and proposes an increased uptake of ‘data for good’. Section 3 considers opportunities to further support the roll-out of responsible research assessment policies and practices across the UK HE sector. Appendices include an overview of progress against the recommendations of The Metric Tide and a literature review. 

We make ten recommendations targeted at different actors in the UK research system, summarised as: 

1: Put principles into practice. 

2: Evaluate with the evaluated. 

3: Redefine responsible metrics. 

4: Revitalise the UK Forum. 

5: Avoid all-metric approaches to REF. 

6: Reform the REF over two cycles. 

7: Simplify the purposes of REF. 

8: Enhance environment statements. 

9: Use data for good. 

10: Rethink university rankings….”

Saudi universities entice top scientists to switch affiliations — sometimes with cash

“Research institutions in Saudi Arabia are gaming global university rankings by encouraging top researchers to change their main affiliations, sometimes in exchange for cash, and often with little obligation to do meaningful work. That’s the conclusion of a report that shows how, over the past decade, dozens of the world’s most highly cited researchers have switched their primary affiliations to universities in the country. That, in turn, has boosted the standing of Saudi Arabian institutions in university ranking tables, which consider the citation impacts of an institution’s researchers….”

Scientific research is deteriorating | Science & Tech | EL PAÍS English

“The field of scientific research is deteriorating because of the way the system is set up. Researchers do the research – financed with public funds – and then the public institutions that they work for pay the big scientific publishers several times over for reviewing and publishing submissions. Simultaneously, the researchers also review scientific papers for free, while companies like Clarivate or the Shanghai Ranking draft their lists, telling everyone who the good guys are (and leaving out the people who, apparently, aren’t worth consideration).

In the last 30 years – since we’ve been living with the internet – we’ve altered the ways in which we communicate, buy, teach, learn and even flirt. And yet, we continue to finance and evaluate science in the same way as in the last century. Young researchers – underpaid and pressured by the system – are forced to spend time trying to get into a “Top 40” list, rather than working in their laboratories and making positive changes in the world.

As the Argentines say: “The problem isn’t with the pig, but with the person who feeds it.” Consciously or unconsciously, we all feed this anachronistic and ineffective system, which is suffocated by the deadly embrace between scientific journals and university rankings. Our governments and institutions fill the coffers of publishers and other companies, who then turn around and sell us their products and inform us (for a price) about what counts as quality….

Despite the issues, there’s certainly reason to be optimistic: although we scientists are victims (and accomplices) of the current system, we’re also aware of its weaknesses. We want to change this reality.

 

After a long debate – facilitated by the Open Science unit of the European Commission – the Coalition for Advancing Research Assessment (COARA) has been created. In the last four months, more than 500 institutions have joined COARA, which – along with other commitments – will avoid the use of rankings in the evaluation of research. COARA is a step forward to analyze – in a coherent, collective, global and urgent manner – the reform of research evaluation. This will help us move away from an exclusively quantitative evaluation system of journals, towards a system that includes other research products and indicators, as well as qualitative narratives that define the specific contributions of researchers across all disciplines….”

 

 

Assessing Open Access Friendliness of National Institutes of Technology (NITs) A Data Carpentry Approach | DESIDOC Journal of Library & Information Technology, 2022-10

“Abstract: This research study aims to measure the Open Access (OA) friendliness of National Institutes of Technology (NITs) of India that are listed in the overall category of NIRF (National Institutional Ranking Framework), 2021 by taking into consideration four important OA parameters – i) OA publication share; ii) OA licensing scenario; iii) citation impact of OA publications; and iv) altmetric scores of OA publications. It deals with 64,485 publications of the selected 11 NITs during the period from 2012 to 2021 (10 years), citations received by these publications (5,42,638 citations), and altmetric attention scores of the documents (5,213 publications) during the period under study. A data carpentry tool, namely OpenRefine, and open access bibliographic/citation data sources such as Unpaywall, Dimensions, and Altmetric.com have been deployed to accomplish this large-scale study for ranking NITs by their Open Access Friendliness (OAF). The OAF indicator, as applied in this study, is a distributed weightage based 100-point scale built on top of the aforesaid OA parameters. The ranking framework shows that Sardar Vallabhbhai National Institute of Technology, Surat (est. in 1961) has achieved the top position with a score of 52.12 (out of 100), but in totality only 3 NITs (out of the selected 11 NITs) crossed the 50 per cent mark in the adapted OAF scale.”

Roy, A., & Mukhopadhyay, P. (2022). Assessing Open Access Friendliness of National Institutes of Technology (NITs) A Data Carpentry Approach. DESIDOC Journal of Library & Information Technology, 42(5), 331-338. https://doi.org/10.14429/djlit.42.5.18263
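As a rough illustration of how a distributed-weightage, 100-point composite like the OAF indicator can be assembled from the four parameters named in the abstract, here is a short sketch. The weights and the example values are hypothetical assumptions; the paper's actual weighting scheme is not reproduced here.

```python
# Sketch of a distributed-weightage Open Access Friendliness (OAF) score on a
# 100-point scale. The four parameters come from the abstract; the weights and
# the example values are hypothetical, not the paper's actual scheme.

WEIGHTS = {                        # hypothetical weights summing to 100 points
    "oa_publication_share": 40,
    "oa_licensing": 20,
    "oa_citation_impact": 25,
    "oa_altmetric_score": 15,
}

def oaf_score(parameters):
    """Each parameter is a 0-1 ratio relative to the best observed case;
    it is multiplied by its weight and summed into a 0-100 score."""
    return sum(WEIGHTS[name] * value for name, value in parameters.items())

# Illustrative institution with mid-range values on every parameter.
example = {
    "oa_publication_share": 0.45,
    "oa_licensing": 0.60,
    "oa_citation_impact": 0.55,
    "oa_altmetric_score": 0.40,
}
print(f"OAF score: {oaf_score(example):.2f} / 100")   # -> OAF score: 49.75 / 100
```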

Indonesian research access: quantity over quality?

“Prior to the open access movement and the proliferation of the internet, almost all Indonesian higher education institutions made thesis and dissertation collections closed, accessible only with certain permissions….

The lack of selection process and quality control for the scholarly resources uploaded to the institutional repositories had led to some unhelpful material making its way into them: documents with supervisor’s comments still visible; documents that were compressed or password protected; documents that were uploaded as multiple image files; documents that were available only partially; and so on….

 

When quantity trumps quality, the repositories become less effective as a means of disseminating scholarly works….”

Chirikov (2022) Does conflict of interest distort global university rankings? | Higher Education, SpringerLink

Chirikov, I. Does conflict of interest distort global university rankings?. High Educ (2022). https://doi.org/10.1007/s10734-022-00942-5

Abstract:

Global university rankings influence students’ choices and higher education policies throughout the world. When rankers not only evaluate universities but also provide them with consulting, analytics, or advertising services, rankers are vulnerable to conflicts of interest that may potentially distort their rankings. The paper assesses the impact of contracting with rankers on university ranking outcomes using a difference-in-difference research design. The study matches data on the positions of 28 Russian universities in QS World University Rankings between 2016 and 2021 with information on contracts these universities had for services from QS—the company that produces these rankings. The study compares the fluctuations in QS rankings with data obtained from the Times Higher Education rankings and data recorded by national statistics. The results suggest that the universities with frequent QS-related contracts had an increase of 0.75 standard deviations (~140 positions) in QS World University Rankings and an increase of 0.9 standard deviations in reported QS faculty-student ratio scores over 5 years, regardless of changes in the institutional characteristics. The observed distortions could be explained by university rankers’ self-serving bias that benefits both rankers and prestige-seeking universities and reinforces the persistence of rankings in higher education.
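For readers unfamiliar with the design, a minimal sketch of the difference-in-differences logic is shown below: the average change in scores for universities with QS contracts minus the average change for universities without them. The numbers are invented standardized scores, not data from the study.

```python
# Minimal sketch of a two-group, two-period difference-in-differences estimate.
# "Treated" = universities with frequent QS-related contracts; "control" = without.
# All scores are made-up standardized values, not the study's data.

def did_estimate(treated_before, treated_after, control_before, control_after):
    """Average change in the treated group minus average change in the control group."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_after) - mean(treated_before)) - (
        mean(control_after) - mean(control_before)
    )

treated_before = [0.10, -0.20, 0.00, 0.15]
treated_after  = [0.90,  0.55, 0.80, 0.95]
control_before = [0.05, -0.10, 0.20, 0.00]
control_after  = [0.15,  0.00, 0.25, 0.10]

estimate = did_estimate(treated_before, treated_after, control_before, control_after)
print(f"DiD estimate: {estimate:.2f} standard deviations")   # -> 0.70
```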

 

 

The Curtin Open Knowledge Initiative | LIBER Quarterly: The Journal of the Association of European Research Libraries

Abstract:  In the current era of worldwide competition in higher education, universities are caught up in market processes that encourage compliance with the measurement systems applied by world university rankings. Despite questions about the rankings’ methodologies and data sources, universities continue to adopt assessment and evaluation practices that require academic researchers to publish in sources indexed by the major commercial bibliographic databases used by world rankings. Building on a critique of the limited bibliometric measures and underlying assumptions of rankings, the Curtin Open Knowledge Initiative interdisciplinary research project aggregates and analyses scholarly research data including open access output from multiple open sources for more than 20,000 institutions worldwide. To understand who is creating knowledge and how diversity is enacted through the transmission of knowledge we analyse workforce demographic data. In this article, we discuss the project’s rationale, methodologies and examples of data analysis that can enable universities to make independent assessments, ask questions about rankings, and contribute to open knowledge-making and sharing.  Expanding on our presentation to the LIBER Online 2021 Conference, we discuss collaboration with academic libraries and other scholarly communication stakeholders to develop and extend the open knowledge project.

 


 

Wenaas (2022) Choices of immediate open access and the relationship to journal ranking and publish-and-read deals | Frontiers

Wenaas L (2022) Choices of immediate open access and the relationship to journal ranking and publish-and-read deals. Front. Res. Metr. Anal. 7:943932. doi: 10.3389/frma.2022.943932

The role of academic journals is significant in the reward system of science, which makes their rank important for the researcher’s choice in deciding where to submit. The study asks how choices of immediate gold and hybrid open access are related to journal ranking and how the uptake of immediate open access is affected by transformative publish-and-read deals, pushed by recent science policy. Data consists of 186,621 articles published with a Norwegian affiliation in the period 2013–2021, all of which were published in journals included in a nation-specific ranking, on one of two levels according to their importance, prestige, and perceived quality within a discipline. The results show that researchers chose hybrid open access twice as often in journals on the most prestigious level as in journals on the normal level. The opposite pattern was found for gold open access, where publishing on the normal level was chosen three times more often than on the high level. This can be explained by the absence of highly ranked gold open access journals in many disciplines. With the introduction of publish-and-read deals, hybrid open access has been boosted and has become a popular choice, enabling the researcher to publish open access in legacy journals. The results confirm the position of journals in the reward system of science and should inform policymakers about the effects of transformative arrangements and their costs, weighed against the overall level of open access.
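The "twice as often" and "three times more often" comparisons can be illustrated with a small tabulation of articles by journal level and OA route. The counts below are invented to reproduce the described pattern, not the Norwegian dataset.

```python
# Sketch of the comparison described above: share of hybrid vs. gold OA articles
# at each level of a two-level national journal ranking. Counts are invented
# for illustration only.
from collections import Counter

# (journal_level, oa_route) per article; "level 2" is the prestigious tier.
articles = (
    [("level 2", "hybrid")] * 120 + [("level 2", "gold")] * 30 + [("level 2", "closed")] * 450 +
    [("level 1", "hybrid")] * 100 + [("level 1", "gold")] * 150 + [("level 1", "closed")] * 750
)

counts = Counter(articles)
for level in ("level 2", "level 1"):
    total = sum(n for (lvl, _), n in counts.items() if lvl == level)
    hybrid = counts[(level, "hybrid")] / total
    gold = counts[(level, "gold")] / total
    print(f"{level}: hybrid {hybrid:.1%}, gold {gold:.1%}")
# -> level 2: hybrid 20.0%, gold 5.0%
# -> level 1: hybrid 10.0%, gold 15.0%
```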

 

Uses of the Journal Impact Factor in national journal rankings in China and Europe – Kulczycki – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  This paper investigates different uses of the Journal Impact Factor (JIF) in national journal rankings and discusses the merits of supplementing metrics with expert assessment. Our focus is national journal rankings used as evidence to support decisions about the distribution of institutional funding or career advancement. The seven countries under comparison are China, Denmark, Finland, Italy, Norway, Poland, and Turkey—and the region of Flanders in Belgium. With the exception of Italy, the top-tier journals used in national rankings are those classified at the highest level, tier, or number of points implemented. A total of 3,565 (75.8%) out of 4,701 unique top-tier journals were identified as having a JIF, with 55.7% belonging to the first Journal Impact Factor quartile. Journal rankings in China, Flanders, Poland, and Turkey classify journals with a JIF as being top-tier, but only when they are in the first quartile of the Average Journal Impact Factor Percentile. Journal rankings that result from expert assessment in Denmark, Finland, and Norway regularly classify journals as top-tier outside the first quartile, particularly in the social sciences and humanities. We conclude that experts, when tasked with metric-informed journal rankings, take into account quality dimensions that are not covered by JIFs.
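The metric-only rule described for China, Flanders, Poland, and Turkey can be expressed as a simple test: a journal counts as top-tier only if it has a JIF and falls in the first quartile of the Average Journal Impact Factor Percentile. The sketch below assumes the first quartile means a percentile of 75 or above; the journal records are hypothetical.

```python
# Sketch of the JIF-only top-tier rule described in the abstract. The >= 75
# threshold is an assumption about how "first quartile of the Average JIF
# Percentile" is operationalized; the journals listed are hypothetical.

def is_top_tier_by_jif(avg_jif_percentile):
    """Top-tier under the metric-only rule: a JIF exists and the Average JIF
    Percentile is in the first quartile (assumed >= 75)."""
    return avg_jif_percentile is not None and avg_jif_percentile >= 75.0

journals = {
    "Journal A": 91.2,   # Q1 -> top-tier under the JIF-only rule
    "Journal B": 62.5,   # has a JIF but outside Q1 -> not top-tier
    "Journal C": None,   # no JIF (common in SSH) -> expert assessment needed instead
}
for name, pct in journals.items():
    print(name, "->", is_top_tier_by_jif(pct))
```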