University Rankings and Governance by Metrics and Algorithms | Zenodo

Abstract: This paper looks closely at how data analytics providers leverage rankings as part of their strategies to extract further rent and assets from the university, beyond their traditional roles as publishers and citation data providers. Multinational publishers such as Elsevier, which has over 2,500 journals in its portfolio, have transitioned into data analytics firms. Rankings expand their ability to further monetize their existing journal holdings, as there is a strong association between publication in high-impact journals and improvement in rankings. The global academic publishing industry has become highly oligopolistic, and a small handful of legacy multinational firms now publish the majority of the world’s research output (see Larivière et al., 2015; Fyfe et al., 2017; Posada & Chen, 2018). It is therefore crucial that their roles and enormous market power in influencing university rankings be more closely scrutinized. We suggest that, owing to a combination of the lack of transparency surrounding, for example, Elsevier’s data services and products and the company’s self-positioning as a key intermediary in the commercial rankings business, such firms have managed to evade the social responsibility and scrutiny that come with occupying so critical a public function in university evaluation. Because the quest for ever-higher rankings often conflicts with universities’ public missions, it is critical to raise questions about the governance of these private digital platforms and the compatibility of their private interests with the maintenance of universities’ public values.

 

Indonesia is number 1 in the world for open access journal publication: what this means for the local research ecosystem

“With the largest number of OA journals in the world, the knowledge produced by Indonesian researchers should be able to reach the public freely.

The government has started to realize this.

This is evidenced by the recent Law on the National Science and Technology System (UU Sisnas Iptek), which has begun requiring this open access system for research publications to ensure that research results can be enjoyed by the public.

Through this obligation, the government hopes to encourage not only the transparency of the research process, but also innovations and new findings that benefit society….

According to our records, the research publication system in Indonesia has operated on a non-profit basis since the 1970s. At that time, research publications were sold for a subscription fee that was usually calculated from printing costs alone. This system differs from that of developed countries, where publishing is dominated by commercial companies.

This is where Indonesia surpasses other research ecosystems.

The few that can match it are the SciELO research ecosystem in Brazil and, from the African continent, the African Journals Online (AJOL) scientific publishing ecosystem and AfricArXiv…..”

Gaming the Metrics | The MIT Press

“The traditional academic imperative to “publish or perish” is increasingly coupled with the newer necessity of “impact or perish”—the requirement that a publication have “impact,” as measured by a variety of metrics, including citations, views, and downloads. Gaming the Metrics examines how the increasing reliance on metrics to evaluate scholarly publications has produced radically new forms of academic fraud and misconduct. The contributors show that the metrics-based “audit culture” has changed the ecology of research, fostering the gaming and manipulation of quantitative indicators, which lead to the invention of such novel forms of misconduct as citation rings and variously rigged peer reviews. The chapters, written by both scholars and those in the trenches of academic publication, provide a map of academic fraud and misconduct today. They consider such topics as the shortcomings of metrics, the gaming of impact factors, the emergence of so-called predatory journals, the “salami slicing” of scientific findings, the rigging of global university rankings, and the creation of new watchdogs and forensic practices.”

OA Monitoring: why do we get different results? – Digital Scholarship Leiden

“The differing percentages of OA can be explained by several factors: different stakeholders use different definitions of OA, different data sources, and different inclusion and exclusion criteria. But the precise nature of these differences is not always obvious to the casual reader.

In the next paragraphs we will look into the reports produced by three different monitors of institutional OA: the CWTS Leiden Ranking, the national monitoring in the Netherlands, and Leiden University Libraries’ own monitoring.

The EU Open Science Monitor also tracks trends in open access to publications, but because it does so only at the country level and not at the level of individual institutions, we have not included it in our comparison. The EU Monitor’s methodological note (including the annexes) explains its choice of sources.

We will end this blog post with a conclusion and our principles and recommendations….”
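The definitional point is easy to make concrete. Below is a minimal Python sketch, with invented publication records rather than data from any of the three monitors, showing how the same set of publications yields different OA percentages depending on which categories a monitor counts as open access:

```python
# Hypothetical publication records; the "oa_type" labels follow the common
# gold/hybrid/green/bronze/closed vocabulary, not any specific monitor.
publications = [
    {"doi": "10.1234/a", "oa_type": "gold"},
    {"doi": "10.1234/b", "oa_type": "green"},
    {"doi": "10.1234/c", "oa_type": "bronze"},
    {"doi": "10.1234/d", "oa_type": "hybrid"},
    {"doi": "10.1234/e", "oa_type": "closed"},
]

def oa_share(pubs, counted_as_oa):
    """Percentage of publications whose oa_type is in counted_as_oa."""
    hits = sum(1 for p in pubs if p["oa_type"] in counted_as_oa)
    return 100 * hits / len(pubs)

# A strict monitor counting only gold OA and a broad one that also counts
# green, hybrid and bronze report very different figures for the same data.
print(oa_share(publications, {"gold"}))                               # 20.0
print(oa_share(publications, {"gold", "green", "hybrid", "bronze"}))  # 80.0
```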

CWTS Leiden Ranking 2019 provides indicators of open access publishing and gender diversity

“The Leiden Ranking is based on data from Web of Science. We calculated the open access indicators in the Leiden Ranking 2019 by combining data from Web of Science and Unpaywall….

The open access indicators in the Leiden Ranking 2019 provide clear evidence of the growth of open access publishing. The top-left plot in the figure below shows that for most universities the share of open access publications is substantially higher in the period 2014–2017 than in the period 2006–2009. In Europe in particular, there has been a strong growth in open access publishing, as shown in the top-right plot. Compared to Europe, the prevalence of open access publishing is lower in North America and especially in Asia, and the growth in open access publishing has been more modest in these parts of the world…..”
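The Web of Science side of this combination requires a subscription, but the Unpaywall side is a public REST API. The sketch below is not the CWTS methodology, only a rough illustration of the combination step: given a list of DOIs standing in for a university's publication records, it asks Unpaywall which ones are open access and computes the OA share. The email address and example DOI are placeholders.

```python
import requests

def unpaywall_oa_share(dois, email="you@example.org"):
    """Fraction of the given DOIs that Unpaywall flags as open access.

    Queries the public Unpaywall v2 API (which requires an email
    parameter); DOIs it cannot resolve are skipped rather than
    counted as closed.
    """
    oa, resolved = 0, 0
    for doi in dois:
        r = requests.get(f"https://api.unpaywall.org/v2/{doi}",
                         params={"email": email})
        if r.status_code != 200:
            continue  # unknown DOI: leave it out of the denominator
        resolved += 1
        if r.json().get("is_oa"):
            oa += 1
    return oa / resolved if resolved else 0.0

# Hypothetical usage: in the setting the excerpt describes, the DOI list
# would come from a bibliographic database such as Web of Science.
print(unpaywall_oa_share(["10.1234/placeholder-doi"]))
```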

The costly prestige ranking of scholarly journals | Ravnetrykk

Abstract: The prestige ranking of scholarly journals is costly to science and to society. Researchers’ payoff in terms of career progress is determined largely by where they publish their findings, and less by the content of their scholarly work. This creates perverse incentives: valuable research time is spent trying to satisfy reviewers and editors rather than being spent in the most productive direction, which in turn leads to unnecessarily long delays between when research findings are made and when they become public. This costly system is upheld by the scholarly community itself. Scholars supply the journals with their time, serving as reviewers and editors without asking for any pay, even though the bulk of scientific journals are published by big commercial enterprises enjoying super-profit margins. These super profits result from expensive licensing deals with scholarly institutions. The free labour offered, on top of the payment for the licensing deals, should be viewed as part of the payment to these publishers: a payment in kind. Why not use this as a negotiating chip with the publishers? If a publisher asks more than is acceptable for a licensing deal, then rather than walking away with no deal, scholarly institutions could withdraw all the free labour offered by reviewers and editors.

 

Green Access Rank of Most Cited Journals in Criminology · Criminology Open

“Authors should consider this ranking when deciding where to publish articles. For more information on (1) the ranking, visit this companion page; (2) copyright/access at the ranked journals and many others, view the Wiki List of Criminology Journals and Determining Copyright at Criminology Journals; and (3) the importance of green access to criminology, read my Open (Access) Letter to Criminologists. (Table is better viewed on a computer or tablet than on a smartphone.)….”


Scientists call for reform on rankings and indices of science journals

“Researchers are used to being evaluated based on indices like the impact factors of the scientific journals in which they publish papers and their number of citations. A team of 14 natural scientists from nine countries are now rebelling against this practice, arguing that obsessive use of indices is damaging the quality of science….”

Chasing cash cows in a swamp? Perspectives on Plan S from Australia and the USA | Unlocking Research

“Rankings are a natural enemy of openness….

Australian universities are heavily financially reliant on overseas students….

University rankings are extremely important in the recruitment of overseas students….

There is incredible pressure on researchers in Australia to perform. This can take the form of reward, with many universities offering financial incentives for publication in ‘top’ journals….

For example, Griffith University’s Research and Innovation Plan 2017-2020 includes: “Maintain a Nature and Science publication incentive scheme”. Publication in these two journals comprises 20% of the score in the Academic Ranking of World Universities….”
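The 20% refers to ARWU’s “Papers published in Nature and Science” (N&S) indicator. As a rough illustration of why such an incentive scheme pays off under a weighted scoring formula, the sketch below computes a composite score using ARWU’s published indicator weights; the per-indicator scores themselves are invented:

```python
# ARWU indicator weights, per the published methodology; the scores used
# below are hypothetical per-indicator values on a 0-100 scale.
WEIGHTS = {
    "Alumni": 0.10,  # alumni winning Nobel Prizes / Fields Medals
    "Award":  0.20,  # staff winning Nobel Prizes / Fields Medals
    "HiCi":   0.20,  # highly cited researchers
    "N&S":    0.20,  # papers published in Nature and Science
    "PUB":    0.20,  # papers indexed in citation databases
    "PCP":    0.10,  # per-capita academic performance
}

def composite(scores):
    """Weighted ARWU-style composite score."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

scores = {"Alumni": 10, "Award": 5, "HiCi": 30, "N&S": 20, "PUB": 60, "PCP": 25}
base = composite(scores)

# Raising only the Nature/Science indicator passes a fifth of the gain
# straight into the composite score, which is what an N&S publication
# incentive scheme targets.
scores["N&S"] = 40
print(base, composite(scores))  # 26.5 30.5
```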

A perspective on problems and prospects for academic publishing in Geography – Meadows – 2016 – Geo: Geography and Environment – Wiley Online Library

Abstract:  This commentary highlights problems of inequity in academic publishing in geography that arise from the increasing use of metrics as a measure of research quality. In so doing, we examine patterns in the ranking of geographical journals in the major global databases (e.g. Web of Science, Scopus) and compare these with a more inclusive database developed by the International Geographical Union. The shortcomings of ranking systems are examined and are shown to include, inter alia, linguistic bias, the lack of representation of books and chapters in books, the geographical unevenness of accredited journals, problems of multi-authorship, the mismatch between ranking and social usefulness and alternative or critical thinking, as well as differences between physical and human geography. The hegemony of the global commercial publishing houses emerges as problematic for geography in particular. It is argued that the global community of geographers should continue to challenge the use of bibliometrics as a means of assessing research quality.

[Includes a section, “Is open access an adequate response?”]

Cites & Insights: The Gold OA Landscape 2011-2014

“This issue consists of an excerpted version of The Gold OA Landscape 2011-2014, published September 10, 2015 as a PDF ebook for $55.00 and on September 11, 2015 as a paperback book for $60.00. Both are currently available at Lulu.com (use the links, repeated here: http://www.lulu.com/content/ebook/the-gold-oa-landscape-2011-2014/17262336 for the ebook, http://www.lulu.com/content/paperback-book/the-gold-oa-landscape-2011-2014/17264390 for the paperback book). Both editions have ISBNs: 978-1-329-54713-1 for the PDF, 978-1-329-54762-9 for the paperback. The paperback should eventually be available through Amazon, Ingram or Barnes & Noble, but I don’t know when that will happen.

This book represents the first overview of essentially all of serious gold OA—that is, what’s published by the journals listed in the Directory of Open Access Journals. I believe it’s important for all OA publishers and for many libraries and OA advocates. If it does well, or if there’s some form of alternative funding, I’ll continue tracking the field in the future …

How many open access (OA) articles are published each year? How many open access (OA) journals publish how many OA articles? What proportion of those journals and articles involve fees (usually called Article Processing Charges or APCs)? Those seemingly-simple questions don’t have simple answers. The first one may not have an answer at all. This report provides a reasonably complete set of answers to the second and third questions and provides a detailed picture of the Gold OA landscape—that is, journals that make all refereed articles immediately available for anybody to read and download from the Internet, at no cost and with no barriers. This report is based on an exhaustive study of Gold OA journals as represented by the Directory of Open Access Journals (DOAJ) as of June 8, 2015, excluding journals that began publishing in 2015 (and two accidental duplications in DOAJ) …”
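Counting of this kind can be approximated from DOAJ’s downloadable journal metadata. The sketch below is only an outline, not Crawford’s method: the CSV location and the “APC” column name are assumptions about DOAJ’s export format, which has changed over the years, so the current file’s header should be checked before running.

```python
import csv
import urllib.request

# DOAJ has historically published its journal metadata as a CSV here;
# verify the current location and column names before relying on this.
DOAJ_CSV = "https://doaj.org/csv"

def apc_breakdown(url=DOAJ_CSV):
    """Count DOAJ journals with and without article processing charges.

    Assumes an 'APC' column holding 'Yes'/'No' values, which is an
    assumption about the export format rather than a documented contract.
    """
    with urllib.request.urlopen(url) as resp:
        lines = resp.read().decode("utf-8").splitlines()
    with_apc = without_apc = 0
    for row in csv.DictReader(lines):
        if row.get("APC", "").strip().lower() == "yes":
            with_apc += 1
        else:
            without_apc += 1
    return with_apc, without_apc

fee, free = apc_breakdown()
print(f"{fee} journals charge APCs, {free} do not")
```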

ResearchGate: Disseminating, Communicating and Measuring Scholarship?

Abstract: ResearchGate is a social network site for academics to create their own profiles, list their publications and interact with each other. Like Academia.edu, it provides a new way for scholars to disseminate their publications and hence potentially changes the dynamics of informal scholarly communication. This article assesses whether ResearchGate usage and publication data broadly reflect existing academic hierarchies and whether individual countries are set to benefit or lose out from the site. The results show that rankings based on ResearchGate statistics correlate moderately well with other rankings of academic institutions, suggesting that ResearchGate use broadly reflects traditional academic capital. Moreover, while Brazil, India and some other countries seem to be disproportionately taking advantage of ResearchGate, academics in China, South Korea and Russia may be missing opportunities to use ResearchGate to maximise the academic impact of their publications.
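
A “moderate correlation” between two institutional rankings is the kind of result a rank correlation coefficient produces. A minimal sketch, with wholly invented ranks standing in for the paper’s actual ResearchGate and comparison rankings:

```python
from scipy.stats import spearmanr

# Hypothetical ranks of the same ten institutions under two systems:
# a ResearchGate-statistics ranking and an established academic ranking.
researchgate_rank = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
established_rank  = [2, 1, 5, 3, 4, 8, 6, 10, 7, 9]

rho, p_value = spearmanr(researchgate_rank, established_rank)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# A rho well above 0 but well below 1 is what "correlate moderately
# well" means: the two orderings agree in broad outline, not in detail.
```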