“The Journal of Trial and Error is proud to present an exciting and timely event: a three-way debate on the topic of Open Science metrics, specifically, transparency metrics. Should we develop these metrics? What purposes do they fulfil? How should Open Science practices be encouraged? Are (transparency) rankings the best solution? These questions and more will be addressed in a dynamic and interactive debate with three researchers of different backgrounds: Etienne LeBel (Independent Meta-Scientist and founder of ERC-funded project ‘Curate Science’), Sarah de Rijcke (Professor of Science and Evaluation Studies and director of the Centre for Science and Technology Studies at Leiden University), and Juliëtte Schaafsma (Professor of Cultural Psychology at Tilburg University and fierce critic of rankings and audits). This is an event organized by the Journal of Trial and Error, and supported by the Open Science Community Tilburg, the Centre for Science and Technology Studies (CWTS, Leiden University), and the Open Science Community Utrecht.”
“In the ranking of institutional repositories, the university’s archive has risen two positions and is ranked 26th in the world out of more than 3,100 other resources. Moreover, the Ural Federal University archive continues to hold first place in Russia among institutional archives….”
Abstract: This paper looks closely at how data analytic providers leverage rankings as part of their strategies to extract further rent and assets from the university, beyond their traditional roles as publishers and citation data providers. Multinational publishers such as Elsevier, which has over 2,500 journals in its portfolio, have transitioned to become data analytic firms. Rankings expand their ability to further monetize their existing journal holdings, as there is a strong association between publication in high-impact journals and improvement in rankings. The global academic publishing industry has become highly oligopolistic, and a small handful of legacy multinational firms now publish the majority of the world’s research output (see Larivière et al., 2015; Fyfe et al., 2017; Posada & Chen, 2018). It is therefore crucial that their roles and enormous market power in influencing university rankings be more closely scrutinized. We suggest that, owing to a combination of a lack of transparency regarding, for example, Elsevier’s data services and products and their self-positioning as a key intermediary in the commercial rankings business, they have managed to evade the social responsibilities and scrutiny that come with occupying such a critical public function in university evaluation. As the quest for ever-higher rankings often conflicts with universities’ public missions, it is critical to raise questions about the governance of such private digital platforms and the compatibility between their private interests and the maintenance of universities’ public values.
“With the largest number of OA journals in the world, the knowledge of Indonesian researchers should be able to freely reach the public.
The government has started to realize this.
This is evidenced by the recent Law on the National Science and Technology System (UU Sisnas Science and Technology), which now requires open access for research publications so that research results can be enjoyed by the public.
Through this obligation, the government hopes to encourage not only the transparency of the research process, but also innovations and new findings that benefit society….
According to our records, the research publication system in Indonesia has operated on a non-profit principle since the 1970s. At that time, research publications were sold for a subscription fee usually calculated from printing costs alone. This system differs from that of developed countries, where publishing is dominated by commercial companies.
In this respect, Indonesia’s research ecosystem triumphs over any other.
The few that can match it are the SciELO research ecosystem in Brazil and, from the African continent, the African Journals Online (AJOL) scientific publishing ecosystem and AfricArXiv….”
“The traditional academic imperative to “publish or perish” is increasingly coupled with the newer necessity of “impact or perish”—the requirement that a publication have “impact,” as measured by a variety of metrics, including citations, views, and downloads. Gaming the Metrics examines how the increasing reliance on metrics to evaluate scholarly publications has produced radically new forms of academic fraud and misconduct. The contributors show that the metrics-based “audit culture” has changed the ecology of research, fostering the gaming and manipulation of quantitative indicators, which lead to the invention of such novel forms of misconduct as citation rings and variously rigged peer reviews. The chapters, written by both scholars and those in the trenches of academic publication, provide a map of academic fraud and misconduct today. They consider such topics as the shortcomings of metrics, the gaming of impact factors, the emergence of so-called predatory journals, the “salami slicing” of scientific findings, the rigging of global university rankings, and the creation of new watchdogs and forensic practices.”
“The differing percentages of OA can be explained by several factors: different stakeholders use different definitions of OA, different data sources, and different inclusion and exclusion criteria. But the precise nature of these differences is not always obvious to the casual reader.
In the next paragraphs we will look into the reports produced by three different monitors of institutional OA, namely, CWTS Leiden Ranking, the national monitoring in The Netherlands, and Leiden University Libraries’ own monitoring.
The EU Open Science Monitor also tracks trends in open access to publications, but because it does so only at the country level and not at the level of individual institutions, we have not included it in our comparison. However, the EU Monitor’s methodological note (including the annexes) explains its choice of sources.
We will end this blog post with a conclusion and our principles and recommendations….”
“The Leiden Ranking is based on data from Web of Science. We calculated the open access indicators in the Leiden Ranking 2019 by combining data from Web of Science and Unpaywall….
The open access indicators in the Leiden Ranking 2019 provide clear evidence of the growth of open access publishing. The top-left plot in the figure below shows that for most universities the share of open access publications is substantially higher in the period 2014–2017 than in the period 2006–2009. In Europe in particular, there has been a strong growth in open access publishing, as shown in the top-right plot. Compared to Europe, the prevalence of open access publishing is lower in North America and especially in Asia, and the growth in open access publishing has been more modest in these parts of the world…..”
Abstract: The prestige ranking of scholarly journals is costly to science and to society. Researchers’ payoff in terms of career progress is determined largely by where they publish their findings, and less by the content of their scholarly work. This creates perverse incentives for researchers. Valuable research time is spent trying to satisfy reviewers and editors rather than on the most productive work. This in turn leads to unnecessarily long delays between when research findings are made and when they become public. This costly system is upheld by the scholarly community itself. Scholars supply the journals with their time, serving as reviewers and editors without pay, even though the bulk of scientific journals are published by big commercial enterprises enjoying super-profit margins. These super profits result from expensive licensing deals with scholarly institutions. The free labour offered, on top of the payment for the licensing deals, should be viewed as part of the payment to these publishers – a payment in kind. Why not use this as a negotiating chip with the publishers? If a publisher demands more than is acceptable for a licensing deal, rather than walk away with no deal, the scholarly institutions could withdraw all the free labour offered by reviewers and editors.
“Authors should consider this ranking when deciding where to publish articles. For more information on (1) the ranking, visit this companion page; (2) copyright/access at the ranked journals and many others, view the Wiki List of Criminology Journals and Determining Copyright at Criminology Journals; and, (3) the importance of green access to criminology, read my Open (Access) Letter to Criminologists. (Table is better viewed on computer or tablet than smartphone.)
Green Access Rank of Most Cited Journals in Criminology….”
“Researchers are used to being evaluated based on indices like the impact factors of the scientific journals in which they publish papers and their number of citations. A team of 14 natural scientists from nine countries are now rebelling against this practice, arguing that obsessive use of indices is damaging the quality of science….”
Australian universities are heavily financially reliant on overseas students….
University rankings are extremely important in the recruitment of overseas students….
There is incredible pressure on researchers in Australia to perform. This can take the form of reward, with many universities offering financial incentives for publication in ‘top’ journals….
For example, Griffith University’s Research and Innovation Plan 2017-2020 includes: “Maintain a Nature and Science publication incentive scheme”. Publication in these two journals comprises 20% of the score in the Academic Ranking of World Universities….”
Abstract: This commentary highlights problems of inequity in academic publishing in geography that arise from the increasing use of metrics as a measure of research quality. In so doing, we examine patterns in the ranking of geographical journals in the major global databases (e.g. Web of Science, Scopus) and compare these with a more inclusive database developed by the International Geographical Union. The shortcomings of ranking systems are examined and are shown to include, inter alia, linguistic bias, the lack of representation of books and chapters in books, the geographical unevenness of accredited journals, problems of multi-authorship, the mismatch between ranking and social usefulness and alternative or critical thinking, as well as differences between physical and human geography. The hegemony of the global commercial publishing houses emerges as problematic for geography in particular. It is argued that the global community of geographers should continue to challenge the use of bibliometrics as a means of assessing research quality.
[Includes a section, “Is open access an adequate response?”]