Utrecht University Recognition and Rewards Vision

“By embracing Open Science as one of its five core principles, Utrecht University aims to accelerate and improve science and scholarship and its societal impact. Open science calls for a full commitment to openness, based on a comprehensive vision regarding the relationship with society. This ongoing transition to Open Science requires us to reconsider the way in which we recognize and reward members of the academic community. It should value teamwork over individualism and calls for an open academic culture that promotes accountability, reproducibility, integrity and transparency, and where sharing (open access, FAIR data and software) and public engagement are normal daily practice. In this transition we closely align ourselves with the national VSNU program as well as developments on the international level….”

Journal Citation Indicator. Just Another Tool in Clarivate’s Metrics Toolbox? – The Scholarly Kitchen

“The JCI has several benefits when compared against the standard Journal Impact Factor (JIF): It is based on a journal’s citation performance across three full years of citation data rather than a single year’s snapshot of a journal’s performance across the previous two years. Clarivate also promises to provide the JCI score to all journals in its Core Collection, even those journals that do not currently receive a JIF score.

The JCI also avoids the numerator-denominator problem of the JIF, where ALL citations to a journal are counted in the numerator, but only “citable items” (Articles and Reviews) are counted in the denominator. The JCI only focuses on Articles and Reviews.

Finally, like a good indicator, the JCI is easy to interpret. Average performance is set to 1.0, so a journal that receives a JCI score of 2.5 performed two-and-a-half times better than average, while a journal with a score of 0.5 performed only half as well.

To me, JCI’s biggest weakness is Clarivate’s bold claim that it achieved normalization across disciplines….”
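
To make the quoted points concrete, here is a minimal Python sketch of the numerator-denominator mismatch and the 1.0-average reading, using entirely made-up counts. The real JCI also normalizes by category, document type, and publication year; this only illustrates the arithmetic.

```python
# Minimal sketch (all counts invented) of the JIF numerator-denominator
# problem and the JCI's "1.0 = average" interpretation described above.

citable = {"articles": 80, "reviews": 20}   # items counted in the denominator
citations_to_citable = 300                  # citations to articles + reviews
citations_to_front_matter = 60              # editorials, letters, news, etc.

# JIF-style: ALL citations in the numerator, only "citable items" in the
# denominator, so front-matter citations inflate the score.
jif_like = (citations_to_citable + citations_to_front_matter) / sum(citable.values())

# JCI-style: numerator and denominator both restricted to articles and reviews,
# then divided by a (hypothetical) category mean so that average = 1.0.
category_mean_citations = 2.4
jci_like = (citations_to_citable / sum(citable.values())) / category_mean_citations

print(f"JIF-like score: {jif_like:.2f}")    # 3.60, boosted by front matter
print(f"JCI-like score: {jci_like:.2f}")    # 1.25, i.e., 25% above category average
```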

Triggle et al. (2021) Requiem for impact factors and high publication charges

Chris R. Triggle, Ross MacDonald, David J. Triggle & Donald Grierson (2021) Requiem for impact factors and high publication charges, Accountability in Research, DOI: 10.1080/08989621.2021.1909481

Abstract: Journal impact factors, publication charges and assessment of quality and accuracy of scientific research are critical for researchers, managers, funders, policy makers, and society. Editors and publishers compete for impact factor rankings, to demonstrate how important their journals are, and researchers strive to publish in perceived top journals, despite high publication and access charges. This raises questions of how top journals are identified, whether assessments of impacts are accurate and whether high publication charges borne by the research community are justified, bearing in mind that they also collectively provide free peer-review to the publishers. Although traditional journals accelerated peer review and publication during the COVID-19 pandemic, preprint servers made a greater impact with over 30,000 open access articles becoming available and accelerating a trend already seen in other fields of research. We review and comment on the advantages and disadvantages of a range of assessment methods and the way in which they are used by researchers, managers, employers and publishers. We argue that new approaches to assessment are required to provide a realistic and comprehensive measure of the value of research and journals and we support open access publishing at a modest, affordable price to benefit research producers and consumers.

Are stakeholders measuring the publishing metrics that matter?: Putting research into context

“Perhaps the most fundamental aspect of compiling and implementing more meaningful research metrics that the NISO panelists discussed is the importance of putting data into context. And, as the speakers noted, there are multiple facets of context to consider, including:

The strengths and limitations of different metrics by discipline/subject matter (e.g., some metrics are better suited to certain types of research)
The intended uses and overall strengths and limitations of particular data points (e.g., altmetrics are “indicators” of impact, not measures of quality, and the JIF was never meant to be used to measure the impact of individual articles or scholars)
The cultural context that a researcher is operating within and the opportunities, challenges, and biases they have experienced
How and where a research output fits within scholars’ other professional contributions (e.g., recognizing how individual research outputs are part of broader bodies of work and also measuring the impacts of scholarly outputs that do not fit within traditional publication-based assessment systems) …”

Novelty, Disruption, and the Evolution of Scientific Impact

Abstract:  Since the 1950s, citation impact has been the dominant metric by which science is quantitatively evaluated. But research contributions play distinct roles in the unfolding drama of scientific debate, agreement and advance, and institutions may value different kinds of advances. Computational power, access to citation data and an array of modeling techniques have given rise to a widening portfolio of metrics to extract different signals regarding these contributions. Here we unpack the complex, temporally evolving relationship of citation impact with novelty and disruption, two emerging measures that capture the degree to which science not only influences, but transforms later work. Novelty captures how research draws upon unusual combinations of prior work. Disruption captures how research comes to eclipse the prior work on which it builds, becoming recognized as a new scientific direction. We demonstrate that: 1) novel papers disrupt existing theories and expand the scientific frontier; 2) novel papers are more likely to become “sleeping beauties” and accumulate citation impact over the long run; 3) novelty can be reformulated as distance in journal embedding spaces to map the moving frontier of science. The evolution of embedding spaces over time reveals how yesterday’s novelty forms today’s scientific conventions, which condition the novelty, and surprise, of tomorrow’s breakthroughs.
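
As a gloss on point 3, one plausible way to operationalise “novelty as distance in journal embedding spaces” is the mean pairwise embedding distance among the journals a paper cites. Below is a sketch with invented three-dimensional embeddings; the paper’s actual construction may differ.

```python
# Hedged sketch: novelty as distance in a journal embedding space.
# Embedding vectors are invented for illustration.
import numpy as np

journal_embeddings = {
    "J. Neurosci":  np.array([0.9, 0.1, 0.0]),
    "Cell":         np.array([0.8, 0.3, 0.1]),
    "Phys. Rev. E": np.array([0.0, 0.2, 0.9]),
}

def cosine_distance(u, v):
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def novelty(cited_journals):
    """Mean pairwise distance among cited journals: higher when a paper
    combines literatures that normally sit far apart."""
    vecs = [journal_embeddings[j] for j in cited_journals]
    dists = [cosine_distance(vecs[i], vecs[k])
             for i in range(len(vecs)) for k in range(i + 1, len(vecs))]
    return float(np.mean(dists))

print(novelty(["J. Neurosci", "Cell"]))          # nearby fields -> low novelty
print(novelty(["J. Neurosci", "Phys. Rev. E"]))  # distant fields -> high novelty
```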

The Most Widely Disseminated COVID-19-Related Scientific Publications in Online Media: A Bibliometric Analysis of the Top 100 Articles with the Highest Altmetric Attention Scores

Abstract:  The novel coronavirus disease 2019 (COVID-19) is a global pandemic. This study’s aim was to identify and characterize the top 100 COVID-19-related scientific publications, which had received the highest Altmetric Attention Scores (AASs). Hence, we searched Altmetric Explorer using search terms such as “COVID” or “COVID-19” or “Coronavirus” or “SARS-CoV-2” or “nCoV” and then selected the top 100 articles with the highest AASs. For each article identified, we extracted the following information: the overall AAS, publishing journal, journal impact factor (IF), date of publication, language, country of origin, document type, main topic, and accessibility. The top 100 articles most frequently were published in journals with high (>10.0) IF (n = 67), were published between March and July 2020 (n = 67), were written in English (n = 100), originated in the United States (n = 45), were original articles (n = 59), dealt with treatment and clinical manifestations (n = 33), and had open access (n = 98). Our study provides important information pertaining to the dissemination of scientific knowledge about COVID-19 in online media.
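
The descriptive tabulation this study reports, ranking by AAS, taking the top 100, and counting characteristics, reduces to a few dataframe operations. Here is a sketch on a toy dataset; the column names are assumptions, not Altmetric Explorer’s actual export schema.

```python
# Hedged sketch of the top-100-by-AAS tabulation described above.
import pandas as pd

articles = pd.DataFrame({
    "title":       ["A", "B", "C", "D"],
    "aas":         [5400, 3200, 8100, 150],   # Altmetric Attention Score
    "journal_if":  [74.7, 11.2, 2.1, 45.5],   # journal impact factor
    "open_access": [True, True, False, True],
})

top = articles.nlargest(100, "aas")            # top 100 by attention score
print((top["journal_if"] > 10).sum(), "articles in journals with IF > 10")
print(top["open_access"].value_counts())       # how many are open access
```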


What Is the Price of Science? | mBio

Abstract:  The peer-reviewed scientific literature is the bedrock of science. However, scientific publishing is undergoing dramatic changes, which include the expansion of open access, an increased number of for-profit publication houses, and ready availability of preprint manuscripts that have not been peer reviewed. In this opinion article, we discuss the inequities and concerns that these changes have wrought.


Rethinking Research Assessment: Ideas for Action | DORA

“DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents that offer principles to guide institutional change and strategies to address the infrastructural implications of common cognitive biases to increase equity.

Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices….”

Conjoint analysis of researchers’ hidden preferences for bibliometrics, altmetrics, and usage metrics – Lemke – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  The amount of annually published scholarly articles is growing steadily, as is the number of indicators through which the impact of publications is measured. Little is known about how the increasing variety of available metrics affects researchers’ processes of selecting literature to read. We conducted ranking experiments embedded into an online survey with 247 participating researchers, most from the social sciences. Participants completed a series of tasks in which they were asked to rank fictitious publications regarding their expected relevance, based on their scores on six prototypical metrics. Through applying logistic regression, cluster analysis, and manual coding of survey answers, we obtained detailed data on how prominent metrics for research impact influence our participants in decisions about which scientific articles to read. Survey answers revealed a combination of qualitative and quantitative characteristics that researchers consult when selecting literature, while regression analysis showed that among quantitative metrics, citation counts tend to be of highest concern, followed by Journal Impact Factors. Our results suggest a comparatively favorable view of many researchers on bibliometrics and widespread skepticism toward altmetrics. The findings underline the importance of equipping researchers with solid knowledge about specific metrics’ limitations, as they seem to play significant roles in researchers’ everyday relevance assessments.
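
The regression step of such a ranking experiment can be emulated on synthetic pairwise choices: the coefficient on each metric difference estimates how strongly it sways the “which would you read?” decision. All data and weights below are invented; the study’s actual design and coding scheme are richer.

```python
# Hedged sketch: logistic regression on synthetic "which publication would
# you read?" choices, in the spirit of the conjoint analysis described above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Differences (option A minus option B) in standardised metric scores.
X = rng.normal(size=(n, 3))                  # columns: citations, JIF, tweets
true_weights = np.array([1.2, 0.8, 0.1])     # citations heaviest, altmetrics light
p_choose_A = 1 / (1 + np.exp(-X @ true_weights))
y = rng.random(n) < p_choose_A               # simulated choices

model = LogisticRegression().fit(X, y)
for name, coef in zip(["citations", "JIF", "tweets"], model.coef_[0]):
    print(f"{name}: {coef:+.2f}")            # recovered weights mirror the pattern
```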


“It’s hard to explain why this is taking so long” – scilog

When it comes into force at the beginning of 2021, the Open Access initiative “Plan S” is poised to help open up and improve academic publishing. Ulrich Pöschl, a chemist and early Open Access advocate, explains why free access to research results is important and how an up-to-date academic publishing system can work.

PBJ ranks higher, enhances diversity and offers free global access – Daniell – 2021 – Plant Biotechnology Journal – Wiley Online Library

“Since I started as the Editor-in-Chief in 2012, submission of manuscripts has almost tripled, despite transition to an open access journal a few years ago. Despite COVID-19, the number of submissions to PBJ [Plant Biotechnology Journal] continued to increase in 2020….”

A communication strategy based on Twitter improves article citation rate and impact factor of medical journals – ScienceDirect

[Note: even the abstract is OA.]

“Medical journals use Twitter to optimise their visibility in the scientific community. It is by far the most widely used social medium for sharing publications, since more than 20% of published articles receive at least one announcement on Twitter (compared to less than 5% of notifications on other social networks) [5]. It was initially described that, within a medical specialty, journals with a Twitter account have a higher impact factor than others and that the number of followers is correlated with the impact factor of the journal [6,7]. Several observational studies showed that the announcement of a medical article’s publication on Twitter was strongly associated with its citation rate in the following years [8–11]. In 2015, among anaesthesia journals, those with an active and influential Twitter account had a higher journal impact factor and a greater number of article citations than those not embracing social media [12]. A meta-analysis of July 2020 concluded that the presence of an article on social media was probably associated with a higher number of citations [13]. Finally, two randomised studies, published in 2020 and not included in this meta-analysis, also showed that, for a given journal, articles that benefited from exposure on Twitter were 1.5 to 9 times more cited in the year following publication than articles randomised to the “no tweeting” group [14,15].

The majority of these works have been published very recently, and the strategy of using Twitter to optimise the number of citations is now a challenge for all medical journals. Several retrospective studies have looked at the impact of a social media communication strategy adopted by medical journals. They have shown that the introduction of Twitter as part of this strategy was associated with a higher number of articles consulted, a higher number of citations and shorter delays in citation after publication [16,17]. Two studies (including one on anaesthesia journals) showed that journals that used a Twitter account to communicate were more likely to increase their impact factor than those that did not [12,18]. Some researchers even suggest that the dissemination of medical information through social media, allowing quick and easy access after the peer-review publication process, may supplant the classical academic medical literature in the future [19]. This evolution has led to the creation of a new type of Editor on several medical journal editorial boards: the social media Editor (sometimes with a “specialised social media team” to assist him or her) [20]. This medical Editor shares new journal articles across a range of social media platforms with the aim of improving the dissemination of journal content. Thus, beyond the scientific interest of a given article, which determines its chances of being cited, there is currently parallel editorial work consisting of optimising visibility on Twitter to increase the number of citations and improve the impact factor. Some authors are also starting to focus on the best techniques for using Twitter and the best ways to tweet to optimise communication, for example during a medical congress [21]….”
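
For readers who want to see what the headline numbers in these studies amount to, here is a sketch computing a citation-rate ratio and a rank-sum test between tweeted and control articles, from hypothetical one-year citation counts.

```python
# Hedged sketch: comparing citation counts of tweeted vs. non-tweeted articles.
# All counts are invented; real studies control for journal, topic, and time.
import numpy as np
from scipy import stats

tweeted = np.array([4, 7, 2, 9, 5, 6, 3, 8])   # citations in year after tweeting
control = np.array([1, 3, 0, 2, 4, 1, 2, 3])   # citations, "no tweeting" group

ratio = tweeted.mean() / control.mean()
u_stat, p_value = stats.mannwhitneyu(tweeted, control, alternative="greater")
print(f"citation rate ratio: {ratio:.2f}")     # 2.75x in this toy example
print(f"Mann-Whitney p = {p_value:.3f}")
```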