Boost for academic recognition and reward revolution

“Dutch academics are putting their foot on the gas in the rebellion against the uncritical use of journal impact factors to recognise and reward researchers, which was set in motion by the 2012 San Francisco Declaration on Research Assessment, or DORA.

From early next year, Utrecht University in the Netherlands will officially stop using the so-called ‘impact factor’ in all its hiring and promotions and judge its researchers by their commitment to open science, teamwork, public engagement and data sharing.

And despite opposition from some Dutch professors, the sweeping changes are gathering pace, with Leiden University among the Dutch institutions also pledging their support with their Academia in Motion paper….”

Impact of “impact factor” on early-career scientists | Rising Kashmir

“Usage of JIF by the scientific community as a predictor of impact has also increased, even while evidence of its predictive value has eroded; both correlations between article citation rate and JIF and the proportion of highly cited articles published in high-impact journals have declined since 1990, because digitization of journal content and the proliferation of open-access articles have profoundly changed how relevant literature is located and cited. Following a review of its history, a Web of Science search was carried out for articles published last year relevant to JIF; of 88 articles, about half are critiques of JIF, while the other half, for the most part, are journal editorials touting a new or higher impact factor for the year….

Hiring and promotion decisions are too important to be subject to the influence of a metric so vulnerable to manipulation and misrepresentation. Journals can boost their JIF by encouraging selective journal self-citation and by changing journal composition through the preferential publication of reviews, articles in fields with large constituencies, or articles on research topics with short half-lives. JIF has degenerated into a marketing tool for journals, as illustrated by the use of “Unofficial Impact Factors” in promotional material for journals that are not even indexed in Web of Science. It is also a marketing tool for academic institutions, as illustrated by the practice of Clarivate Analytics (which now owns the Science Citation Index) of awarding paper certificates and electronic “badges” to scientists determined to be Highly Cited Researchers (HCRs, #HighlyCited) by virtue of publishing papers in the top 1% by citations for their field and publication year. …


In science, it has been widely noted that using JIF as a proxy for scientific excellence undermines incentives to pursue novel, time-consuming, and potentially groundbreaking work…”
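As an editorial aside: both the definition being critiqued and the self-citation tactic described above are easy to make concrete. The sketch below implements the standard two-year JIF formula (citations received in year Y to items a journal published in years Y−1 and Y−2, divided by the citable items from those two years) and shows, with invented numbers for a hypothetical journal, how coerced self-citations move the score. It is an illustration of the published definition, not Clarivate’s actual procedure.

```python
def two_year_jif(citations, items, year):
    """Standard two-year journal impact factor for `year`.

    citations: dict, publication year -> citations received in `year`
               to items the journal published in that year.
    items:     dict, publication year -> number of citable items
               (articles and reviews) published in that year.
    """
    window = (year - 1, year - 2)
    return sum(citations[y] for y in window) / sum(items[y] for y in window)

# Hypothetical journal: 100 citable items per year, 300 external
# citations in 2021 to its 2019-2020 output.
items = {2019: 100, 2020: 100}
external = {2019: 120, 2020: 180}
print(two_year_jif(external, items, 2021))   # 1.5

# Now suppose the journal coerces one self-citation per published item:
inflated = {y: external[y] + items[y] for y in items}
print(two_year_jif(inflated, items, 2021))   # 2.5
```

The point of the toy numbers is that a two-thirds jump in the score requires no change whatsoever in the quality of the underlying research.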


Why Open Access: Economics and Business Researchers’ Perspectives

Abstract: Public research policies have been promoting open-access publication in recent years as an adequate model for the dissemination of scientific knowledge. However, its uptake varies widely across disciplines. This study explores the determinants of open-access publication among academic researchers of economics and business, as well as their assessment of different economic measures focused on publication stimulus. To do so, a survey of Spanish business and economics researchers was conducted. They reported that an average of 19% of their publications appeared in open-access journals, either hybrid or fully gold-route open access. Almost 80% of the researchers foresee a future increase in the volume of open-access publications. When determining where to publish their research results, their main criterion for selecting a scientific journal is the impact factor. Regarding open access, the most valued aspect is the visibility and dissemination it provides. Although the cost of publication is not the most relevant criterion in the choice of a journal, three out of four researchers consider that a reduction in fees and an increase in funding are measures that would boost the open-access model.


Dashboard will track hiring and promotion criteria

“A US$1.2 million grant will fund an effort to identify and publicize the criteria that universities around the world use to hire and promote researchers. The Declaration on Research Assessment (DORA), a global initiative to reform the evaluation of researchers, will use part of the funds to create an interactive dashboard that will shine much-needed light on a process that is often opaque and controversial, says programme director Anna Hatch, who is based in Washington DC. “When criteria are visible and transparent, universities can be held accountable,” she says. “Researchers will know how their contributions will be measured, so they can make a better case for themselves.”

DORA, conceived in 2012 at the annual meeting of the American Society for Cell Biology, called for improvements to the evaluation of researchers and the outputs of scholarly research. The declaration specifically calls for doing away with impact factors as a way to judge the merit of academics. So far, it has been signed by more than 20,000 individuals and institutions around the world.

The grant is from the Arcadia Fund, a UK-based charity that has supported many academic initiatives since its founding in 2001….”

Association between productivity and journal impact across disciplines and career age

Abstract: The association between productivity and impact of scientific production is a long-standing debate in science that remains controversial and poorly understood. Here we present a large-scale analysis of the association between yearly publication numbers and average journal-impact metrics for the Brazilian scientific elite. We find this association to be discipline-specific, career-age dependent, and similar among researchers with outlier and non-outlier performance. Outlier researchers outperform in either productivity or journal prestige, but rarely in both categories. Non-outliers also follow this trend and display negative correlations between productivity and journal prestige, but with discipline-dependent intensity. Our research indicates that academics are averse to simultaneous changes in their productivity and journal-prestige levels over consecutive career years. We also find that career patterns concerning productivity and journal prestige are discipline-specific, with most disciplines sharing a rise in productivity with career age and a higher chance of outperforming in journal impact during early career stages.
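To make the reported association concrete, the sketch below computes a Spearman rank correlation between one hypothetical researcher’s yearly publication counts and the mean impact metric of the journals used each year. The data are invented, and the paper’s actual methodology for the Brazilian dataset may well differ.

```python
import numpy as np
from scipy.stats import spearmanr

# Invented career record, one entry per career year.
papers_per_year  = np.array([2, 3, 5, 8, 12, 15, 14, 10])
mean_journal_jif = np.array([4.1, 3.8, 3.2, 2.9, 2.1, 1.8, 2.0, 2.4])

rho, p = spearmanr(papers_per_year, mean_journal_jif)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
# A negative rho reproduces the trade-off the abstract describes:
# years with more papers tend to be years with lower average
# journal prestige, and vice versa.
```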


Irrational rationality: critique of metrics-based evaluation of researchers and universities | Sustaining the Knowledge Commons / Soutenir les savoirs communs

The unique contribution of this chapter is a critique of the underlying belief behind both traditional and alternative metrics-based approaches to assessing research and researchers: the assumption that impact is good and an indicator of quality research, and that it therefore makes sense to measure impact, with the only open questions being whether particular technical measures of impact are accurate. For example, if impact is necessarily good, then the retracted study by Wakefield et al. that falsely correlated vaccination with autism is good research by any metric: it drew many academic citations both before and after retraction, along with citations in popular and social media, and it was arguably a factor in the real-world impact of the anti-vaccination movement, the subsequent return of preventable illnesses like measles, and the challenge of fighting COVID through vaccination. An alternative approach is suggested, using the University of Ottawa’s traditional collective agreement with APUO (the union of full-time professors) as a means of evaluation that considers many different types of publications and weighs quantity of publication in a way that gives evaluators the flexibility to take into account the kind of research and research output.

ERC Work Programme 2022

“Under Horizon Europe, beneficiaries of ERC grants must ensure open access to all peer-reviewed scientific publications relating to their results as set out in the Model Grant Agreement used for ERC actions. Beneficiaries must ensure that they or the authors retain sufficient intellectual property rights to comply with their open access requirements….

In the Track record (see “Proposal description”) the applicant Principal Investigator should list (if applicable, and in addition to any other scientific achievements deemed relevant by the applicant in relation to their research field and project): 1. Up to five publications in major international peer-reviewed multi-disciplinary scientific journals and/or in the leading international peer-reviewed journals, peer-reviewed conference proceedings and/or monographs of their respective research fields, highlighting those as main author or without the presence as co-author of their PhD supervisor (properly referenced, field-relevant bibliometric indicators [“except the Journal Impact Factor”] may also be included); preprints may be included, if freely available from a preprint server (preprints should be properly referenced and either a link to the preprint or a DOI should be provided);…”

European Research Council bans journal impact factor from bids | Times Higher Education (THE)

“One of the world’s most prestigious research funders has told academics that they must not include journal impact factors (JIF) in their applications, in the latest sign that the controversial metric has become discredited.

In the European Research Council’s (ERC) latest work programme, applicants are for the first time explicitly told to avoid mentioning the metric when listing their publications.

“Properly referenced, field relevant bibliometric indicators” can be used “except the journal impact factor”, states the new guidance, released on 14 July….”

Viewpoint: As part of global shift, Utrecht University is changing how it evaluates its researchers | Science|Business

Many scientists are transitioning to a new way of working, known as open science, which will require new ways of evaluating researchers’ work. At Utrecht University we are adapting the reward system so it will incentivise this shift. The change that has received the most public attention, ditching the publishing metric known as the journal impact factor, is important, but it’s just one step in a much larger transformation. Through open science, researchers and research administrators seek to improve the quality, reproducibility and social impact of research. Open science includes open-access publishing, so citizens and peers can access the fruits of publicly funded research without paying for the privilege, and a move to a system of FAIR data, making information easy for researchers to find, access, and reuse. Open science also includes software sharing.

We moeten af van telzucht in de wetenschap (“We need to get rid of the counting obsession in science”) – ScienceGuide

From Google’s English: “On July 19, ScienceGuide published an open letter from 171 academics who are concerned about the new Recognition and Rewards for scientists. The signatories warn that the new ‘Recognize and Appreciate’ approach leads to more arbitrariness and loss of quality. This, the writers argue, will jeopardize the leading international position of Dutch science and will adversely affect young academics in particular. …

It is notable that the young scientists of whom the letter speaks do not appear to have been involved in drafting it. It is also striking that the signatories of the open letter are themselves mainly at the top of the academic career ladder; 142 of the 171 signatories are in fact professors. As Young Science in Transition, the PhD Candidates Network Netherlands, PostDocNL, a large number of members of De Jonge Akademies, and many other young researchers, we do not agree with the message they are proclaiming. On the contrary, such views worry us when it comes to our current and future careers. Young academics are eagerly waiting for a new system of Recognition and Appreciation. …”

Why the new Recognition & Rewards actually boosts excellent science

“During the last few weeks, several opinion pieces have appeared questioning the new Recognition and Rewards (R&R) and Open Science in Dutch academia. On July 13, the TU/e Cursor published interviews with professors who question the usefulness of a new vision on R&R (1). A day later, on July 14, the chairman of the board of NWO compared science to top sport, with an emphasis on sacrifice and top performance (2), a line of thinking that fits the traditional way of R&R in academia. On July 19, an opinion piece was published by 171 university (senior) lecturers and professors (3), this time in ScienceGuide, again questioning the new vision of R&R. These articles, all published within a week, show that as the new R&R gains traction within universities, established scholars are questioning its usefulness and effectiveness. Like others before us (4), we would like to respond. …”

Why Do We Need to Change Research Evaluation Systems? — Observatory | Institute for the Future of Education

“Can we break out of this vicious cycle? Are there alternatives? Yes, there are. For some years now, various movements worldwide have sought to change the system for evaluating research. In 2012, the “San Francisco Declaration” proposed eliminating metrics based on the impact factor. There was also the Charte de la désexcellence (“Charter of Dis-Excellence”) mentioned above. In 2015, a group of academics signed the Leiden Manifesto, which warned of the “widespread misuse of indicators in evaluating scientific performance.” Since 2013, the group Science in Transition has sought to reform the science evaluation system. Finally, since 2016, the Collectiu InDocentia, created at the University of Valencia (Spain), has also been doing its part. …”

Supporting Impactful Publications (SIP) Program | provost

“The Tulane Supporting Impactful Publications (SIP) program assists in covering fees to support open-access options for high-impact peer-reviewed publications for Tulane scholars serving as corresponding authors who do not have grant or other funds available to cover them. This program is funded and coordinated by the Office of Academic Affairs and Provost and co-funded by the Office of Academic Affairs and Tulane Libraries and Academic Information Resources. …

Eligible applicants may apply for funds once a peer-reviewed journal article has been accepted for publication in a journal with an impact factor of 8 or above. Applications for journals with impact factors below 8 will also be considered for funding when the corresponding author provides a compelling case to do so. One application may be submitted per eligible publication….”

Is rapid scientific publication also high quality? Bibliometric analysis of highly disseminated COVID-19 research papers – Khatter – Learned Publishing – Wiley Online Library

“Key points

- An examination of highly visible COVID-19 research articles reveals that 55% could be considered at risk of bias.
- Only 11% of the evaluated early studies on COVID-19 adhered to good standards of reporting, such as PRISMA or CONSORT.
- There was no correlation between quality of reporting and either the journal Impact Factor or the article Altmetric Attention Score in early studies on COVID-19.
- Most highly visible early articles on COVID-19 were published in The Lancet and the Journal of the American Medical Association.”
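The third key point above is a correlation claim whose shape is easy to show in miniature: rank-correlate per-article reporting-quality scores against each article’s journal impact factor and Altmetric Attention Score. The sketch below uses invented stand-in data, not the study’s dataset, and a Spearman correlation, which may differ from the authors’ exact test.

```python
import numpy as np
from scipy.stats import spearmanr

# Invented stand-ins: reporting-quality score (0-100), journal impact
# factor, and Altmetric Attention Score for ten hypothetical articles.
quality   = np.array([35, 62, 48, 71, 22, 55, 40, 66, 30, 58])
jif       = np.array([59.1, 3.2, 45.5, 7.8, 38.6, 2.1, 56.3, 5.4, 12.9, 4.7])
altmetric = np.array([2100, 340, 1800, 95, 2600, 410, 990, 150, 1300, 220])

for name, metric in (("impact factor", jif), ("Altmetric score", altmetric)):
    rho, p = spearmanr(quality, metric)
    print(f"reporting quality vs {name}: rho = {rho:.2f} (p = {p:.3f})")
```

A rho near zero, as the key point reports for the real data, would mean that where an article lands on the prestige or attention scale says little about how well it is reported.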