Many scientists are transitioning to a new way of working, known as open science, which will require new ways of evaluating researchers’ work. At Utrecht University we are adapting the reward system to incentivise this shift. The change that has received the most public attention, ditching the publishing metric known as the journal impact factor, is important, but it is just one step in a much larger transformation. Through open science, researchers and research administrators seek to improve the quality, reproducibility and social impact of research. Open science includes open-access publishing, so that citizens and peers can read the fruits of publicly funded research without paying for the privilege, and a move to FAIR data, making information findable, accessible, interoperable and reusable. It also includes the sharing of research software.
From Google’s English: “On July 19, ScienceGuide published an open letter from 171 academics who are concerned about the new Recognition and Rewards of scientists. The signatories warn that the new ‘Recognition and Rewards’ will lead to more arbitrariness and a loss of quality. This will jeopardize the international top position of Dutch science, the writers argue, and will harm young academics in particular. …
It is striking that the young scientists the letter speaks of do not appear to have been involved in drafting it. It is also striking that the signatories of the open letter are themselves mainly at the top of the academic career ladder; 142 of the 171 are professors. As Young Science in Transition, PhD candidates Network Netherlands, PostDocNL, a large number of members of De Jonge Akademies and many other young researchers, we do not agree with the message they proclaim. On the contrary, statements like these worry us when it comes to our current and future careers. Young academics are eagerly awaiting a new system of Recognition and Rewards. …”
“During the last few weeks, several opinion pieces have appeared questioning the new Recognition and Rewards (R&R) and open science in Dutch academia. On July 13, the TU/e Cursor published interviews with professors who question the usefulness of a new vision on R&R (1). A day later, on July 14, the chairman of the board of NWO compared science to elite sport, with an emphasis on sacrifice and top performance (2), a line of thinking that fits the traditional way of R&R in academia. On July 19, an opinion piece by 171 university lecturers, senior lecturers and professors was published in ScienceGuide (3), again questioning the new vision of R&R. These articles, all published within a week, show that as the new R&R gains traction within universities, established scholars are questioning its usefulness and effectiveness. Like others before us (4), we would like to respond. …”
“Can we break out of this vicious cycle? Are there alternatives? Yes, there are. For some years now, various movements worldwide have sought to change the system for evaluating research. In 2012, the San Francisco Declaration on Research Assessment proposed eliminating metrics based on the impact factor. There was also the Charte de la désexcellence (“Charter of Dis-excellence”) mentioned above. In 2015, a group of academics signed the Leiden Manifesto, which warned of the “widespread misuse of indicators in evaluating scientific performance.” Since 2013, the group Science in Transition has sought to reform the science evaluation system. And since 2016, the Collectiu InDocentia, created at the University of Valencia (Spain), has been doing its part. …”
“The Tulane Supporting Impactful Publications (SIP) program helps cover fees for open-access options in high-impact peer-reviewed publications for Tulane scholars serving as corresponding authors who do not have grant or other funds available to cover them. The program is coordinated by the Office of Academic Affairs and Provost and co-funded by the Office of Academic Affairs and Tulane Libraries and Academic Information Resources. …
Eligible applicants may apply for funds once a peer-reviewed journal article has been accepted for publication in a journal with an impact factor of 8 or above. Applications for journals with impact factors below 8 will also be considered for funding when the corresponding author provides a compelling case to do so. One application may be submitted per eligible publication. …”
“Research institutes across the world have developed statements regarding the use of metrics, some in response to the Leiden Manifesto and the San Francisco Declaration on Research Assessment, and some independently.
Collected here are some examples of these statements….”
Abstract: This study investigates citation patterns between 2017 and 2020 for preprints published on three preprint servers: one specializing in biology (bioRxiv), one in chemistry (ChemRxiv), and one hosting preprints from all disciplines (Research Square). Showing that preprints are now regularly cited in peer-reviewed journal articles, books, and conference papers, the findings further substantiate the value of open science in relation to the citation-based metrics on which the evaluation of scholarship continues to rely. This analysis will be useful for informing research-based education in today’s scholarly communication.
“An examination of highly visible COVID-19 research articles reveals that 55% could be considered at risk of bias.
Only 11% of the evaluated early studies on COVID-19 adhered to good standards of reporting such as PRISMA or CONSORT.
There was no correlation between quality of reporting and either the journal Impact Factor or the article Altmetric Attention Score in early studies on COVID-19.
Most highly visible early articles on COVID-19 were published in The Lancet and the Journal of the American Medical Association.”
Open science or open access? Impact factor and risk for research diversity?
The aim of this study is to compare the annual SJR, and to evaluate other parameters reflecting the scientific impact of journals, by open access (OA) versus subscription access (SA) status in the field of obstetrics and gynecology, according to the SCImago database.
Material and methods: This study was conducted between September and December 2019 at Near East University. The SCImago Journal & Country Rank database was used to collect information about the journals. We evaluated and compared changes in the one-year SJR (SCImago Journal Rank) and journal impact factor (JIF) of OA and SA journals.
Results: Data from 183 scientific journals in the field of obstetrics and gynecology from the period between 1999 and 2018 were evaluated; 140 of these journals were SA and 43 were OA. The average SJR of OA journals in 1999 was 0.17, while it was 0.38 for SA journals. In 2018, these values were 0.31 and 0.78 for OA and SA journals, respectively. In the comparison of JIF, the 1999 average for OA journals was 0.09, while it was 0.66 for SA journals. In 2018, these values were 0.80 and 1.93 for OA and SA journals, respectively.
Conclusions: Access to information has become easier due to technological developments, and this will continue to affect journals’ access policies. Despite the problem of predatory journals, the rise of OA journals in both number and quality is likely to continue.
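A quick back-of-the-envelope sketch, using only the averages quoted in the abstract above (an illustration, not part of the study), shows how much faster the average JIF of OA journals grew than that of SA journals over the period:

```python
# Growth in average SJR and JIF between 1999 and 2018, computed from
# the averages reported in the abstract above (not the raw study data).
averages = {
    # metric: {access type: (1999 average, 2018 average)}
    "SJR": {"OA": (0.17, 0.31), "SA": (0.38, 0.78)},
    "JIF": {"OA": (0.09, 0.80), "SA": (0.66, 1.93)},
}

def growth_factor(start: float, end: float) -> float:
    """Ratio of the 2018 average to the 1999 average."""
    return end / start

for metric, groups in averages.items():
    for access, (v1999, v2018) in groups.items():
        print(f"{metric} {access}: x{growth_factor(v1999, v2018):.1f}")
```

By these figures the average JIF of OA journals grew roughly ninefold, versus roughly threefold for SA journals, which is consistent with the authors’ conclusion that the rise of OA journals is likely to continue.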
To assess the relationship of individual article citations in the Sport Sciences field to (i) journal impact factor; (ii) each article’s open access status; and (iii) Altmetric score components.
We searched the ISI Web of Knowledge InCites Journal Citation Reports database’s “Sport Sciences” category for the 20 journals with the highest two-year impact factor in 2018. We extracted the impact factor for each journal and each article’s open access status (yes or no). Between September 2019 and February 2020, we obtained individual citations, Altmetric scores and details of Altmetric components (e.g. number of tweets, Facebook posts, etc.) for each article published in 2017. Linear and multiple regression models were used to assess the relationship between the dependent variable (citation count) and the independent variables: article Altmetric score, open access status, and journal impact factor.
4,022 articles were included. Total Altmetric score, journal impact factor and open access status explained 32%, 14% and 1% of the variance in article citations, respectively (combined, the three variables explained 40% of the variance). The number of tweets related to an article was the Altmetric component that explained the highest proportion of the variance in article citations (37%).
Altmetric scores in Sport Sciences journals have a stronger relationship with the number of citations than does journal impact factor or open access status. Twitter may be the best social media platform on which to promote a research article, as it has a strong relationship with article citations.
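The variance-explained comparison described above can be sketched as follows. The study’s dataset is not public, so this uses synthetic numbers purely to illustrate the method: fit a least-squares regression on each predictor alone and on all predictors combined, and compare the resulting R² values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # synthetic articles; the actual study analysed 4,022

# Illustrative predictors (invented distributions, not the study's data)
altmetric = rng.gamma(shape=2.0, scale=10.0, size=n)    # Altmetric score
jif = rng.uniform(1.0, 12.0, size=n)                    # journal impact factor
open_access = rng.integers(0, 2, size=n).astype(float)  # 1 = open access

# Synthetic outcome: citations driven mostly by the Altmetric score, plus noise
citations = 0.8 * altmetric + 1.5 * jif + 2.0 * open_access + rng.normal(0, 10, n)

def r_squared(X, y):
    """Share of variance in y explained by a least-squares fit on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

for name, x in [("Altmetric", altmetric), ("JIF", jif), ("OA", open_access)]:
    print(f"{name}: R^2 = {r_squared(x.reshape(-1, 1), citations):.2f}")
combined = np.column_stack([altmetric, jif, open_access])
print(f"Combined: R^2 = {r_squared(combined, citations):.2f}")
```

Note that in-sample R² can only increase as predictors are added, which is why the combined model (40% in the study) explains more variance than any single predictor.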
“The new Journal Citation Indicator (JCI) accounts for the substantially different rates of publication and citation in different fields, Clarivate says. But the move is drawing little praise from the critics, who say the new metric remains vulnerable to misunderstanding and misuse….”
Abstract: Success and impact metrics in science are based on a system that perpetuates sexist and racist “rewards” by prioritizing citations and impact factors. These metrics are flawed and biased against already marginalized groups and fail to accurately capture the breadth of individuals’ meaningful scientific impacts. We advocate shifting this outdated value system to advance science through principles of justice, equity, diversity, and inclusion. We outline pathways for a paradigm shift in scientific values based on multidimensional mentorship and promoting mentee well-being. These actions will require collective efforts supported by academic leaders and administrators to drive essential systemic change.
“A Dutch university says it is formally abandoning the impact factor — a standard measure of scientific success — in all hiring and promotion decisions. By early 2022, every department at Utrecht University in the Netherlands will judge its scholars by other standards, including their commitment to teamwork and their efforts to promote open science, says Paul Boselie, a governance researcher and the project leader for the university’s new Recognition and Rewards scheme. “Impact factors don’t really reflect the quality of an individual researcher or academic,” he says. “We have a strong belief that something has to change, and abandoning the impact factor is one of those changes.” …”
“Traditional citation metrics, such as the journal impact factor, can frequently act as biased measurements of research quality and contribute to the broken system of research evaluation. Academic institutions and funding bodies are increasingly moving away from relying on citation metrics towards greater use of transparent expert opinion.
Faculty Opinions has championed this cause for two decades, with over 230k recommendations made by our 8000+ Faculty Members, and we are excited to introduce the next step in our evolution – a formidable mark of research quality – the new Faculty Opinions Score….
The Faculty Opinions Score assigns a numerical value to research publications in Biology and Medicine, quantifying their impact and quality relative to other publications in their field.
The Faculty Opinions Score is derived by combining our unique star-rated Recommendations on individual publications, made by world-class experts, with bibliometrics to produce a radically new metric in the research evaluation landscape.
Key properties of the Faculty Opinions Score:
A score of zero is assigned to articles with no citations and no recommendations.
The average score of a set of recommended articles has an expected value of 10. However, articles with many recommendations or highly cited articles may have a much higher score. There is no upper bound.
Non-recommended articles generally score lower than recommended articles.
Recommendations contribute more to the score than bibliometric performance. In other words, expert recommendations increase an article’s score (much) more than citations do….”
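The actual Faculty Opinions formula has not been published, but the properties listed above can be illustrated with a toy scoring function. Everything here, including the weight of 5.0 per star, is a hypothetical stand-in chosen only to satisfy those properties:

```python
import math

def toy_opinion_score(citations: int, rec_stars: list[int]) -> float:
    """Hypothetical score illustrating the stated properties; NOT the
    actual (unpublished) Faculty Opinions formula.

    Properties it reproduces:
    - zero with no citations and no recommendations
    - recommendations contribute (much) more than citations
    - no upper bound
    """
    citation_part = math.log1p(citations)       # citations count, but dampened
    recommendation_part = 5.0 * sum(rec_stars)  # expert star ratings dominate
    return citation_part + recommendation_part

# An article with one 3-star and one 2-star expert recommendation
score = toy_opinion_score(citations=12, rec_stars=[3, 2])
print(f"toy score: {score:.1f}")
```

Under this toy weighting, a single two-star recommendation outweighs even a thousand citations, mirroring the claim that expert recommendations increase an article’s score much more than citations do.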