The emergence of university rankings: a historical-sociological account | SpringerLink

Wilbers, S., Brankovic, J. The emergence of university rankings: a historical-sociological account. High Educ (2021). https://doi.org/10.1007/s10734-021-00776-7

Abstract

Nowadays, university rankings are a familiar phenomenon in higher education all over the world. But how did rankings achieve this status? To address this question, we bring in a historical-sociological perspective and conceptualize rankings as a phenomenon in history. We focus on the United States and identify the emergence of a specific understanding of organizational performance in the postwar decades. We argue that the advent of this understanding constituted a discursive shift, which was made possible—most notably but not solely—by the rise of functionalism to the status of a dominant intellectual paradigm. The shift crystallized in the rankings of graduate departments, which were commissioned by the National Science Foundation and produced by the American Council on Education (ACE) in 1966 and 1970. Throughout the 1970s, social scientists became increasingly interested in the methods and merits of ranking higher education institutions, and in this work they would explicitly refer to the ACE rankings. This was accompanied by a growing recognition, already in the 1970s, that rankings had a place and purpose in the higher education system—a trend that has continued into the present day.

What’s Your Tier? Introducing Library Partnership (LP) Certification for Journal Publishers · Series 1.3: Global Transition to Open

“Four categories and an open-ended response are the heart of LP [Library Partnership] certification for journal publishers: 

Access examines when and how the public can view an article and what barriers exist to author participation in publishing. There is substantial nuance in this category, in part because publishers often approach access very differently. In broad strokes, publishers that provide full and immediate open access (OA) across all journals earn more points than those that simply allow author-led open archiving. Publishers with no or low APCs earn points, as do publishers offering APC waivers that apply to any of their journals (rather than forcing waiver-eligible authors to publish only in particular journals, that is, fully OA journals). 16 possible points. 

Rights focuses on author rights and reuse rights. Publishers that allow authors to retain all rights, or use Creative Commons licenses, earn points. 11 possible points.

Community considers ethical and business aspects of publishing. Points are awarded to nonprofit and society publishers. Legal actions against libraries or lobbying against OA do not earn points for a publisher, while evidence of transparency and responsible handling of user data do earn points. Membership in COPE is a plus. 12 possible points. 

Discoverability deals with the technical side of publishing. Publishers earn points through accessibility, ORCiD integration, participation in preservation organizations, and similar practices. 15 possible points.

The Open-Ended Response allows publishers to describe other actions they take to support equitable and open science/scholarship. 3 possible points. …”
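The category maximums quoted above add up to 57 possible points. As a rough illustration of how such a rubric might be tallied, here is a minimal Python sketch: the category names and maximums come from the excerpt, while the scoring function, the example scores, and the fictional publisher are hypothetical and not part of the LP specification.

```python
# Hypothetical sketch of tallying LP certification points; category maximums
# are taken from the excerpt, everything else is invented for illustration.

LP_CATEGORY_MAX = {
    "Access": 16,
    "Rights": 11,
    "Community": 12,
    "Discoverability": 15,
    "Open-Ended Response": 3,
}  # total: 57 possible points


def score_publisher(scores: dict) -> int:
    """Sum a publisher's points, keeping each category within its maximum."""
    total = 0
    for category, maximum in LP_CATEGORY_MAX.items():
        awarded = scores.get(category, 0)
        if not 0 <= awarded <= maximum:
            raise ValueError(f"{category}: {awarded} outside 0..{maximum}")
        total += awarded
    return total


# Example with made-up scores for a fictional publisher.
example = {"Access": 12, "Rights": 9, "Community": 8,
           "Discoverability": 10, "Open-Ended Response": 2}
print(score_publisher(example), "of", sum(LP_CATEGORY_MAX.values()))  # 41 of 57
```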

Gadd (2021) Mis-Measuring Our Universities: Why Global University Rankings Don’t Add Up | Frontiers in Research Metrics and Analytics

Gadd, E. (2021). Mis-Measuring Our Universities: Why Global University Rankings Don't Add Up. Frontiers in Research Metrics and Analytics. https://doi.org/10.3389/frma.2021.680023

Abstract: Draws parallels between the problematic use of GDP to evaluate economic success and the use of global university rankings to evaluate university success. Inspired by Kate Raworth’s Doughnut Economics, this perspective argues that the pursuit of growth as measured by such indicators creates universities that ‘grow’ up the rankings rather than universities that ‘thrive’ or ‘mature.’ Such growth creates academic wealth divides within and between countries, even though the direction of growth the rankings inspire does not truly reflect universities’ critical purpose or contribution. Highlights the incompatibility between universities’ alignment with socially responsible practices and their continued engagement with socially irresponsible ranking practices. Proposes four possible ways of engendering change in the university rankings space. Concludes by calling on leaders of ‘world-leading’ universities to join together to ‘lead the world’ in challenging global university rankings, and to set their own standards for thriving and maturing universities.

TRANSPARENT RANKING: All Repositories (August 2021) | Ranking Web of Repositories

In recent months, we have realized that the indexing of records from several open access repositories by Google Scholar is not as complete as it used to be, without any clear reason. From the experience of a few cases, it appears that GS penalizes errors in the metadata descriptions, so it is important for the affected repositories to check their level of indexing and try to identify potential problems. Please consider the following GS indexing guidelines https://scholar.google.com/intl/en/scholar/inclusion.html https://www.or2015.net/wp-content/uploads/2015/06/or-2015-anurag-google-scholar.pdf and the following material: Exposing Repository Content to Google Scholar; A few suggestions for improving the web visibility of the contents of your institutional OA repository; and “Altmetrics of the Open Access Institutional Repositories: A Webometrics Approach”.

As a service for the OA community, we are providing five lists of repositories (all (institutional+subject), institutional, portals, data, and CRIS) with the raw numbers of records in GS for their web domains (site:xxx.yyy.zz, excluding citations and patents), ranked by decreasing number of items as collected during the second week of August 2021. The list is still incomplete, as we are still adding new repositories.
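For repositories that want to check their own indexing along the same lines, the Python sketch below shows one way to build the site: queries described above and rank domains by record count. All domains and counts are placeholders, and the counts themselves would still have to be read off Google Scholar by hand (with citations and patents excluded), since Scholar offers no official API for this.

```python
# Minimal sketch of the Transparent Ranking approach: one site: query per
# repository domain, then rank by the number of records Google Scholar
# reports. Domains and counts below are placeholders, not real data.

repositories = ["repository.example.edu", "oa.example.org", "dspace.example.ac.id"]

# Queries to paste into Google Scholar (exclude citations and patents in the UI).
queries = {domain: f"site:{domain}" for domain in repositories}

# Record counts read off the results pages by hand (placeholder numbers).
record_counts = {
    "repository.example.edu": 48_200,
    "oa.example.org": 12_750,
    "dspace.example.ac.id": 31_400,
}

# Rank by decreasing number of indexed items, as in the published lists.
ranking = sorted(record_counts.items(), key=lambda item: item[1], reverse=True)
for rank, (domain, count) in enumerate(ranking, start=1):
    print(f"{rank}. {domain}: {count} records  ({queries[domain]})")
```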

Open Science rankings: yes, no, or not this way? A debate on developing and implementing transparency metrics. – JOTE | Journal of Trial and Error

“The Journal of Trial and Error is proud to present an exciting and timely event: a three-way debate on the topic of Open Science metrics, specifically, transparency metrics. Should we develop these metrics? What purposes do they fulfil? How should Open Science practices be encouraged? Are (transparency) rankings the best solution? These questions and more will be addressed in a dynamic and interactive debate with three researchers of different backgrounds: Etienne LeBel (Independent Meta-Scientist and founder of ERC-funded project ‘Curate Science’), Sarah de Rijcke (Professor of Science and Evaluation Studies and director of the Centre for Science and Technology Studies at Leiden University), and Juliëtte Schaafsma (Professor of Cultural Psychology at Tilburg University and fierce critic of rankings and audits). This is an event organized by the Journal of Trial and Error, and supported by the Open Science Community Tilburg, the Centre for Science and Technology Studies (CWTS, Leiden University), and the Open Science Community Utrecht.”

 

Ural Federal University: The University’s Open Archive has Risen Again in Repository Rankings – India Education

“In the ranking of institutional repositories, the university’s archive has risen two positions and is ranked 26th in the world out of more than 3,100 other resources. Moreover, the Ural Federal University archive continues to hold first place in Russia among institutional archives….”

University Rankings and Governance by Metrics and Algorithms | Zenodo

Abstract: This paper looks closely at how data analytic providers leverage rankings as part of their strategies to extract further rent and assets from the university, beyond their traditional roles as publishers and citation data providers. Multinational publishers such as Elsevier, which has over 2,500 journals in its portfolio, have transitioned to become data analytic firms. Rankings expand their ability to further monetize their existing journal holdings, as there is a strong association between publication in high-impact journals and improvement in rankings. The global academic publishing industry has become highly oligopolistic, and a small handful of legacy multinational firms now publish the majority of the world’s research output (see Larivière et al., 2015; Fyfe et al., 2017; Posada & Chen, 2018). It is therefore crucial that their roles and enormous market power in influencing university rankings be more closely scrutinized. We suggest that, owing to a combination of a lack of transparency regarding, for example, Elsevier’s data services and products and their self-positioning as a key intermediary in the commercial rankings business, they have managed to evade the social responsibilities and scrutiny that come with occupying such a critical public function in university evaluation. As the quest for ever-higher rankings often works in conflict with universities’ public missions, it is critical to raise questions about the governance of such private digital platforms and the compatibility between their private interests and the maintenance of universities’ public values.

 

Indonesia is number one in the world for open access journal publications: what this means for the local research ecosystem

“With the largest number of OA journals in the world, the knowledge produced by Indonesian researchers should be able to reach the public freely.

The government has started to realize this.

This is evidenced by the recent Law on the National Science and Technology System (UU Sisnas Iptek), which has begun to require this open access system for research publications to ensure that research results can be enjoyed by the public.

Through this obligation, the government hopes to encourage not only the transparency of the research process, but also innovations and new findings that benefit society….

According to our records, the research publication system in Indonesia has operated on a non-profit basis since the 1970s. At that time, research publications were sold for a subscription fee that was usually calculated from printing costs only. This system is different from that found in developed countries, which are dominated by commercial publishing companies.

This is where Indonesia outshines any other research ecosystem.

The few that can match it are the SciELO research ecosystem in Brazil and, from the African continent, the African Journals Online (AJOL) scientific publishing ecosystem and AfricArXiv…..”

Gaming the Metrics | The MIT Press

“The traditional academic imperative to “publish or perish” is increasingly coupled with the newer necessity of “impact or perish”—the requirement that a publication have “impact,” as measured by a variety of metrics, including citations, views, and downloads. Gaming the Metrics examines how the increasing reliance on metrics to evaluate scholarly publications has produced radically new forms of academic fraud and misconduct. The contributors show that the metrics-based “audit culture” has changed the ecology of research, fostering the gaming and manipulation of quantitative indicators, which lead to the invention of such novel forms of misconduct as citation rings and variously rigged peer reviews. The chapters, written by both scholars and those in the trenches of academic publication, provide a map of academic fraud and misconduct today. They consider such topics as the shortcomings of metrics, the gaming of impact factors, the emergence of so-called predatory journals, the “salami slicing” of scientific findings, the rigging of global university rankings, and the creation of new watchdogs and forensic practices.”

OA Monitoring: why do we get different results? – Digital Scholarship Leiden

“The differing percentages of OA can be explained by several factors: different stakeholders use different definitions of OA, different data sources, and different inclusion and exclusion criteria. But the precise nature of these differences is not always obvious to the casual reader.

In the next paragraphs we will look into the reports produced by three different monitors of institutional OA, namely, CWTS Leiden Ranking, the national monitoring in The Netherlands, and Leiden University Libraries’ own monitoring.

The EU Open Science Monitor also monitors trends in open access to publications, but because it does so only at a country level and not at the level of individual institutions, we have not included it in our comparison. However, the EU Monitor’s methodological note (including the annexes) explains their choice of sources.

We will end this blog post with a conclusion and our principles and recommendations….”
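As a minimal, hypothetical illustration of the point about definitions and inclusion criteria, the Python sketch below applies two different OA definitions to the same small set of records and gets two different percentages. The records and the definitions are invented and do not reproduce any of the three monitors’ actual criteria.

```python
# Hypothetical illustration: the same records, two different OA definitions,
# two different percentages. Records and definitions are invented.

publications = [
    {"doi": "10.1234/a", "oa_route": "gold"},
    {"doi": "10.1234/b", "oa_route": "green"},
    {"doi": "10.1234/c", "oa_route": "bronze"},
    {"doi": "10.1234/d", "oa_route": None},      # closed
]

def oa_share(pubs, counted_routes):
    """Share of publications whose OA route is in the counted set."""
    open_count = sum(1 for p in pubs if p["oa_route"] in counted_routes)
    return open_count / len(pubs)

# Monitor 1 counts gold and green only; monitor 2 also counts bronze.
print(f"{oa_share(publications, {'gold', 'green'}):.0%}")            # 50%
print(f"{oa_share(publications, {'gold', 'green', 'bronze'}):.0%}")  # 75%
```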

CWTS Leiden Ranking 2019 provides indicators of open access publishing and gender diversity

“The Leiden Ranking is based on data from Web of Science. We calculated the open access indicators in the Leiden Ranking 2019 by combining data from Web of Science and Unpaywall….

The open access indicators in the Leiden Ranking 2019 provide clear evidence of the growth of open access publishing. The top-left plot in the figure below shows that for most universities the share of open access publications is substantially higher in the period 2014–2017 than in the period 2006–2009. In Europe in particular, there has been a strong growth in open access publishing, as shown in the top-right plot. Compared to Europe, the prevalence of open access publishing is lower in North America and especially in Asia, and the growth in open access publishing has been more modest in these parts of the world…..”
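As a rough sketch of the kind of data combination described above, and not the Leiden Ranking’s actual pipeline, the example below matches Web of Science-style publication records to Unpaywall-style OA flags by DOI and computes an open access share per university and period. All records and field names are invented.

```python
# Hypothetical sketch: combine publication records (WoS-style) with
# Unpaywall-style OA flags by DOI and compute per-university OA shares.
# All data and field names are invented for illustration.
from collections import defaultdict

wos_records = [
    {"doi": "10.1/x1", "university": "Univ A", "period": "2014-2017"},
    {"doi": "10.1/x2", "university": "Univ A", "period": "2014-2017"},
    {"doi": "10.1/x3", "university": "Univ A", "period": "2006-2009"},
    {"doi": "10.1/y1", "university": "Univ B", "period": "2014-2017"},
]

unpaywall_oa = {"10.1/x1": True, "10.1/x2": False, "10.1/x3": False, "10.1/y1": True}

totals = defaultdict(int)
open_counts = defaultdict(int)
for record in wos_records:
    key = (record["university"], record["period"])
    totals[key] += 1
    if unpaywall_oa.get(record["doi"], False):
        open_counts[key] += 1

for university, period in sorted(totals):
    share = open_counts[(university, period)] / totals[(university, period)]
    print(f"{university} {period}: {share:.0%} open access")
```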

The costly prestige ranking of scholarly journals | Ravnetrykk

Abstract: The prestige ranking of scholarly journals is costly to science and to society. Researchers’ payoff in terms of career progress is determined largely by where they publish their findings, and less by the content of their scholarly work. This fact creates perverse incentives for researchers. Valuable research time is spent trying to satisfy reviewers and editors rather than being spent in the most productive direction. This in turn leads to unnecessarily long delays between the time research findings are made and the time they become public. This costly system is upheld by the scholarly community itself. Scholars supply the journals with their time, serving as reviewers and editors without asking for any pay, even though the bulk of scientific journals are published by big commercial enterprises enjoying super-profit margins. These super profits result from expensive licensing deals with the scholarly institutions. The free labour offered, on top of the payment for the licensing deals, should be viewed as part of the payment to these publishers – a payment in kind. Why not use this as a negotiating chip with the publishers? If a publisher asks more than is acceptable for a licensing deal, rather than walk away with no deal, the scholarly institutions could withdraw all the free labour offered by reviewers and editors.

 

Green Access Rank of Most Cited Journals in Criminology · Criminology Open

“Authors should consider this ranking when deciding where to publish articles. For more information on (1) the ranking, visit this companion page; (2) copyright/access at the ranked journals and many others, view the Wiki List of Criminology Journals and Determining Copyright at Criminology Journals; and, (3) the importance of green access to criminology, read my Open (Access) Letter to Criminologists. (Table is better viewed on computer or tablet than smartphone.)

Green Access Rank of Most Cited Journals in Criminology….”

Scientists call for reform on rankings and indices of science journals

“Researchers are used to being evaluated based on indices like the impact factors of the scientific journals in which they publish papers and their number of citations. A team of 14 natural scientists from nine countries are now rebelling against this practice, arguing that obsessive use of indices is damaging the quality of science….”