It’s Time to Terminate Social Work’s Relationship with the Impact Factor | Social Work | Oxford Academic

“As a journal-level metric, the IF is unable to assess the value of any given article or author. To make this inference, one would need to read the article and assess its claims, scientific rigor, methodological soundness, and broader implications. What’s more, the IF (which represents the average number of citations across a finite set of eligible articles) is vulnerable to the skewness in citation rates among articles (Nature, 2005) and to the manipulation, negotiation, and gaming of its calculation among stakeholders (Ioannidis & Thombs, 2019). At a more fundamental level, the IF does not capture journal functioning such as improvements to (or worsening of) internal evaluative processes (e.g., effectiveness of peer review, changes to submission instructions and policies, use of and adherence to reporting guidelines, etc.; Dunleavy, 2022). These and other issues are explored in more depth by Seglen (1997)….
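The skewness problem described above is easy to see numerically. A minimal sketch, using made-up citation counts for ten articles (not data from any real journal), shows how the journal-level mean that the IF reports can sit far above what a typical article in the journal actually receives:

```python
from statistics import mean, median

# Hypothetical citation counts for ten articles in one journal's
# counting window; two highly cited papers dominate the total.
citations = [0, 0, 1, 1, 2, 2, 3, 4, 15, 92]

mean_citations = mean(citations)      # the journal-level average the IF reports
median_citations = median(citations)  # what a typical article receives

print(mean_citations)    # 12.0
print(median_citations)  # 2.0
```

Here the mean is six times the median: most articles in this invented journal are cited far less than the "average" suggests, which is exactly why a journal-level figure cannot stand in for the value of an individual article.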

In light of these limitations, social work should de-emphasize the IF and instead embrace a new set of evaluative tools. The San Francisco Declaration on Research Assessment (American Society for Cell Biology, 2013)—and more recently the Leiden Manifesto (Hicks et al., 2015)—typify such efforts. They encourage stakeholders (i.e., academic institutions, journals, funders, researchers) to consider using a multitude of qualitative and quantitative alternative metrics (i.e., “altmetrics”; Priem et al., 2012; see also …) when judging scholarly output—whether it be a journal article, a grant proposal, or even a hiring or tenure packet. …”


Wiley Signs Declaration on Research Assessment, Deepens Commitment to Responsible Research Assessment | John Wiley & Sons, Inc.

“Global research and education leader Wiley today announced it has signed the Declaration on Research Assessment (DORA), which is a world-wide initiative designed to improve the ways in which the outputs of scholarly research are evaluated. 

As the publisher of nearly 2,000 academic journals, Wiley will deliver more ways to assess and recognize research outputs, which in turn supports healthy scholarship and allows more researchers to thrive in their careers. To this end, Wiley will roll out a broad range of journal and article metrics across its journal portfolio with the aim of providing a holistic, well-rounded view of the value and impact of any author’s research. This includes metrics that measure levels of impact beyond citation value, including usage, re-use, reproducibility, peer review assessment, geographic reach, and public recognition via references in media outlets….”

Rethinking Research Assessment for the Greater Good: Findings from the RPT Project – Scholarly Communications Lab | ScholCommLab

“The review, promotion, and tenure (RPT) process is central to academic life and workplace advancement. It influences where faculty direct their attention, research, and publications. By unveiling the RPT process, we can inform actions that lead towards a greater opening of research. 

Between 2017 and 2022, we conducted a multi-year research project involving the collection and analysis of more than 850 RPT guidelines and 338 surveys with scholars from 129 research institutions across Canada and the US. Starting with a literature review of academic promotion and tenure processes, we launched six studies applying mixed methods approaches such as surveys and matrix coding.

So how do today’s universities and colleges incentivize open access research? Read on for 6 key takeaways from our studies….”

The Mechanics Behind A Precipitous Rise In Impact Factor: A Case Study From the British Journal of Sports Medicine

Abstract:  The impact factor is a popular but highly flawed proxy for the importance of academic journals. A variety of techniques exist to increase an individual journal’s impact factor but are usually described in abstract terms. Here, we investigate two of them in the historical publications of the British Journal of Sports Medicine: (1) the preferential publication of brief, citable, non-substantive academic content, and (2) the preferential publication of review or meta-analytic content. Simple content analysis reveals an exponential rise in published editorial and other brief content, a persistent growth in ‘highly citable’ content, and a dramatic drop in the proportion of empirical research published. These changes parallel the changes in impact factor over the same period. The implications of this are discussed.

Guest Post – New Winds from the Latin American Scientific Publishing Community – The Scholarly Kitchen

“To help evaluate interest in the idea of a regional association and to better understand editors’ perspectives on the use of journal metrics for science evaluations, a survey of journal editors was carried out, with 20 questions aimed at characterizing the journal they edit, such as subject area(s), audience, business model and adoption of open science, coverage by databases, strategies for increasing visibility, and use of metrics and indicators for journal management. The survey also included four questions about the use of citation impact indicators for national evaluations of science performed by governmental agencies in Latin America and their effects on the publication and research activities in the region….

A large majority of the editors who responded to the survey felt that the use of citation impact indicators for evaluating science in Latin America is inadequate or partially adequate (70%-88% depending on the specific area of evaluation)….

This feedback was used to support the development of the ALAEC Manifesto for the responsible use of metrics in research evaluation in Latin America and the Caribbean, which calls for a more inclusive and responsible use of journal-based metrics in research evaluation. It supports previous manifestos, such as the San Francisco Declaration on Research Assessment – DORA (2012), the Leiden Manifesto for Research Metrics (2015), and the Helsinki Initiative on Multilingualism in Scholarly Communication (2019). Acknowledging that the current criteria imposed by Latin American evaluating bodies have perverse consequences for the region’s journals and that authors will therefore have less incentive to submit articles to them, the manifesto has five main calls to action:


Re-establish quality criteria, valuing journals that:

Publish relevant research regardless of area or subject matter, language, target audience, or geographic scope
Bring a broad spectrum of scholarly and research contributions, such as replication, innovation, translation, synthesis, and meta-research
Practice open science, including open access
Adopt high ethical standards, prioritizing quality and integrity in scientific publication

Value and stimulate the work of scientific editors and their teams, promoting their training and development, and recognizing their fundamental role in the adoption and dissemination of good practices in scientific publication.
Ensure that national journals and publishers do not lose financial incentives and the flow of article submissions, allowing them to achieve and maintain high standards of quality and integrity in their editorial processes, especially for journals that practice open science and multilingualism.
Strengthen, disseminate, and protect national and regional infrastructures for scientific communication (SciELO, RedALyC, LatIndex, LA Referencia, and non-commercial CRIS systems), that favor open science and multilingualism, and that can generate the most appropriate metrics and indicators to evaluate local and regional science.
Encourage and value collaborative networks and exchanges between all actors in the ecosystem of knowledge production and dissemination: institutions, authors, reviewers and funding agencies, etc., in the region….”

The Price of Publishing: An Investigation of the Open Access… : Plastic and Reconstructive Surgery

Abstract:

Background: Open access publishing in plastic surgery has rapidly gained traction in the past decade. This study investigated the digital landscape of plastic surgery open access publishing.

Methods: This was a cross-sectional bibliometric investigation of plastic surgery–focused journals. Three publication models were investigated: subscription-only journals, hybrid journals offering both paywalled and open access publishing, and open access–only journals.

Results: Eighty-two journals were investigated. In 2010, open access journals comprised 18 percent of all plastic surgery journals online, subscription journals comprised 79 percent, and hybrid journals comprised 3 percent. Conversely, in 2020, open access journals comprised 55 percent of all journals, hybrid journals comprised 45 percent, and there were no subscription-only journals. Multivariable linear regression adjusting for article type/content demonstrated that open access articles from hybrid journals [beta coefficient, 1.3; F(4, 18) = 790; p = 0.05] and high-quality open access journals [beta coefficient, 0.9; F(4, 19) = 738; p = 0.04] were significantly positively associated with number of full-text views. Although impact factor and article processing charges were positively correlated [Pearson correlation coefficient: r(25) = 0.39, p = 0.04] for open access publishing, some high-quality open access journals were found to offer fee waivers/free publishing. Lastly, level of evidence offered by articles from open access versus hybrid journals differed.

Conclusions: Overall, this study highlighted important distinctions between trustworthy and predatory journals offering open access publishing in plastic surgery. Open access publishing in trustworthy sources offers greater visibility and is not necessarily cost-prohibitive, but some open access journals can be limited in scope (i.e., less coverage of subspecialty topics) and quality of content. Study findings were used to generate recommendations for navigating open access publishing in plastic surgery.
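The Pearson correlation the abstract reports between impact factor and article processing charges can be computed directly from its textbook definition. A minimal sketch with invented (APC, impact factor) pairs, not the study's data, illustrates the calculation:

```python
import math

# Hypothetical (APC in euros, impact factor) pairs for five open access
# journals; illustrative only, not the study's actual data.
apcs = [1000, 1500, 2000, 2500, 3000]
ifs = [1.2, 2.0, 2.1, 3.0, 3.4]

def pearson_r(xs, ys):
    """Pearson correlation: covariance over the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(apcs, ifs)
print(round(r, 3))  # close to +1: APC and IF rise together in this toy data
```

The study's reported r(25) = 0.39 is a much weaker positive association than this toy data produces, which is consistent with its observation that some high-quality open access journals charge nothing at all.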

Surveillance Publishing · Elephant in the Lab

“Clarivate’s business model is coming for scholarly publishing. Google is one peer, but the company’s real competitors are Elsevier, Springer Nature, Wiley, Taylor & Francis, and SAGE. Elsevier, in particular, has been moving into predictive analytics for years now. Of course the publishing giants have long profited off of academics and our university employers—by packaging scholars’ unpaid writing-and-editing labor only to sell it back to us as usuriously priced subscriptions or article processing charges (APCs). That’s a lucrative business that Elsevier and the others won’t give up. But they’re layering another business on top of their legacy publishing operations, in the Clarivate mold. The data trove that publishers are sitting on is, if anything, far richer than the citation graph alone.

Why worry about surveillance publishing? One reason is the balance sheet, since the companies’ trading in academic futures will further pad profits at the expense of taxpayers and students. The bigger reason is that our behavior—once alienated from us and abstracted into predictive metrics—will double back onto our work lives. Existing biases, like male academics’ propensity for self-citation, will receive a fresh coat of algorithmic legitimacy. More broadly, the academic reward system is already distorted by metrics. To the extent that publishers’ tallies and indices get folded into grant-making, tenure-and-promotion, and other evaluative decisions, the metric tide will gain power. The biggest risk is that scholars will internalize an analytics mindset, one already encouraged by citation counts and impact factors….”

Changes in Article Share and Growth by Publisher and Access Type in Journal Citation Reports 2016, 2018, and 2020

Abstract:  This study explored changes in the journal publishing market by publisher and access type using the major journals that publish about 95% of Journal Citation Reports (JCR) articles.

From JCR 2016, 2018, and 2020, a unique journal list by publisher was created in Excel and used to analyze the compound annual growth rate using pivot tables. In total, 10,953 major JCR journals were analyzed, focusing on publisher type, open access (OA) status, and mega journals (those publishing over 1,000 articles per year).

Among the 19 publishers that published over 10,000 articles per year in JCR 2020, the six large publishers published 59.6% of the articles and the other 13 publishers 22.5%; publishers outside this group accounted for the remaining 17.9%. Large and OA publishers increased their article share through leading mega journals, but the remaining publishers showed the opposite tendency. In JCR 2020, mega journals had a 26.5% article share and an excellent distribution in terms of the Journal Impact Factor quartile. Despite the high growth (22.6%) and share (26.0%) of OA articles, the natural growth of non-OA articles (7.3%) and of total articles (10.7%) caused a rise in journal subscription fees. Articles, citations, the impact factor, and the immediacy index all increased gradually, and the compound annual growth rate of the average immediacy index was almost double that of the average impact factor in JCR 2020.

The influence of OA publishers has grown under the dominance of large publishers, and mega journals may substantially change the journal market. Journal stakeholders should pay attention to these changes.
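The compound annual growth rate used throughout this abstract follows the standard formula CAGR = (end/start)^(1/years) − 1. A minimal sketch with hypothetical article counts (not figures from the study) over the four-year JCR 2016 to 2020 span:

```python
# Hypothetical article counts for one publisher in JCR 2016 and JCR 2020;
# the span between those editions is four years.
articles_2016 = 100_000
articles_2020 = 150_000
years = 4

# CAGR: the constant annual growth rate that turns the start value
# into the end value over the given number of years.
cagr = (articles_2020 / articles_2016) ** (1 / years) - 1
print(round(cagr * 100, 1))  # 10.7 (percent per year)
```

Note that a 50% rise over four years corresponds to well under 50/4 = 12.5% per year, because the growth compounds.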




New strategy pushes universities to embrace open science

“The European University Association (EUA) has set out a radical vision to support its 850 member institutions in 48 European countries to move to an open science system that aspires to open access not only to scholarly outputs, but also to the whole research process.

The strategy unveiled in the EUA Open Science Agenda 2025 document has set the goal of placing Europe’s universities in “a scholarly ecosystem”, characterised by academic ownership of scholarly communication and publishing – with open science becoming an integral part of research assessment practices – within three years.

The move is part of a growing trend by the research community to challenge the global dominance of increasingly expensive academic publications: despite recent progress in open access to scholarly outputs, an estimated 85% of new research articles in journals are still published behind paywalls.

Dr Vinciane Gaillard, EUA deputy director of research and innovation, told University World News that the EUA open science agenda strategy has been a year in the making and will be followed up by an action plan, with specific targets and a timeline to monitor progress, to be published in June….”

Towards more inclusive metrics and open science to measure research assessment in Earth and natural sciences

Abstract:  Science’s success and effect measures are built on a system that prioritizes citations and impact factors. These measurements are inaccurate and biased against already under-represented groups, and they fail to convey the range of individuals’ significant scientific contributions, especially open science. We argue for a transition in this out-of-date value system that promotes science by promoting diversity, equity, and inclusion. To achieve systemic change, it will necessitate a concerted effort led by academic leaders and administrators.

Open Access Publications is our mission in 2022: perspective from the editors of the European Journal of Clinical Investigation – Montecucco – – European Journal of Clinical Investigation – Wiley Online Library

“The impact factor of the European Journal of Clinical Investigation (EJCI) and also of other scientific journals has dramatically increased in 2020. At the Editorial Board Meeting in September 2021, we felt very proud of our work since January 2020, when we became the new Editors of EJCI. In contrast to other journals, we managed to attract not only COVID-19-related [1, 2] but also non-COVID-19-related articles [3, 4] receiving many citations and contributing to the impact factor of 2020. Obviously, we are indebted to all authors choosing EJCI for their submissions/publications as well as to all reviewers involved in judging the submitted manuscripts (list of reviewers displayed in Table 1). We have the impression that our Journal not only gained in terms of quality as expressed in the higher impact factor, but also in the organization of handling the increasing number of submissions. Despite the COVID-19 pandemic-related involvement in clinical and scientific work, the times of revision and final decision were markedly reduced in 2020 and established in 2021. At the Editorial Meeting, we discussed how to further improve our Journal in the near future. There was a broad consensus stating that we should push on the quality of Research Articles, Reviews and Editorials. Although it seems quite obvious, this mission appears to be a real challenge for Editors. What is the “quality” of an article? Consensus was obtained on appropriate methodological requirements, high clinical relevance and usefulness for patients’ care. This should hold for both basic/translational and clinical research. Based on these criteria that will be our “North Star” for next year, we also realized that after selecting a top-level article, it is mandatory to promote its diffusion as well. In this regard, we acknowledge that all articles should be pushed to be freely shared in an “Open Access” (OA) mode.
OA means that the article has no financial, legal, or technical barriers to being consulted by any reader anywhere in the world. The relevance of OA has been excellently demonstrated during the COVID-19 pandemic. Also, the European Community has indicated that the principal investigators of European Grants should publish the articles from their funded projects as OA. One major issue is that publishing OA requires the payment of a fee per article by the author to the publisher, the so-called article processing charge (APC). The mean APC is around €2000 but varies greatly between journals, from €1000 to €5000 per article. Therefore, financial issues, in particular for authors from low-income countries, might limit the worldwide spread of science. In order to avoid this practical but fundamental burden, we would like to inform readers that our publisher Wiley has implemented some conventions with institutes, universities and even countries (Figure 1) in order to cover the APC of OA publications….”


Bedreigingen voor fundamenteel wetenschappelijk onderzoek in Nederland brengen onze toekomstige welvaart in gevaar – ScienceGuide

From Google’s English:  “The approach by which Dutch science has risen to the top 5 in the world since the 1980s is under threat, write Raymond Poot and more than a hundred other scientists. Not through Open Access or Recognition and Valuation, but through the link between this and the signing of DORA and the roll-out of Open Science. In this contribution, Poot shares the conclusions and recommendations from a study into the consequences of Open Science and DORA. “A scenario of an internationally competitive Dutch science, where different talents can come into their own, is entirely possible. However, the current policy has to be drastically adjusted for that.” …

Dutch scientists are no longer assessed on the basis of international, scientific and measurable criteria, as was done very successfully at NWO for thirty years. These criteria have been partly removed by Open Science and DORA and replaced by criteria that are politically motivated and difficult to measure. As we described in our previous contribution in ScienceGuide, the negative effects of Open Science and DORA at NWO are amplified because measurable criteria are replaced by narratives. Sometimes the CV is even omitted entirely.  …

To show that ‘policy’ based on Open Science and DORA contains major risks that we should not get used to, I wrote a report with Bas de Bruin and Frank Grosveld that goes deeper into the matter. The report is supported by 105 scientists (further support for the report can be emailed to Raymond Poot). In the report we discuss the effects of DORA on evaluations, and we examine the underlying reasoning behind DORA. We also discuss Open Science’s focus on the (direct) benefit of research for society, its focus on public involvement in research, and its focus on team science and leadership. Finally, we discuss Plan S, Open Science’s current Open Access policy, which enforces Open Access for all Dutch scientific publications.

The conclusions of our report are alarming.  

1) The combination of different Open Science policies with DORA puts the fundamental sciences at a disadvantage compared to the more applied sciences. Through the ERC and Marie Curie competitions, Europe spends twenty-five percent of its innovation budget on scientist-initiated fundamental research, which is selected for excellence. The Netherlands spends only five percent of its budget on such research. Europe has a reason to spend so much on scientist-initiated research, according to conclusion two of our report. 

2) Scientist-initiated fundamental research that is selected on the basis of scientific quality provides considerably more social benefit per euro spent in the medium term than research that is selected on the basis of direct social or industrial relevance. This apparent paradox is related to the observation that the usefulness of scientific discoveries is very difficult to predict, while it is clear that without real discoveries there is little progress. While this message is difficult to sell to politicians, it is a very important one. 

3) Various Open Science measures reduce the quality of Dutch science by not selecting for scientific quality and at the same time creating a lot of bureaucracy. …”

Surveillance Publishing

Abstract:  This essay develops the idea of surveillance publishing, with special attention to the example of Elsevier. A scholarly publisher can be defined as a surveillance publisher if it derives a substantial proportion of its revenue from prediction products, fueled by data extracted from researcher behavior. The essay begins by tracing the Google search engine’s roots in bibliometrics, alongside a history of the citation analysis company that became, in 2016, Clarivate. The point is to show the co-evolution of scholarly communication and the surveillance advertising economy. The essay then refines the idea of surveillance publishing by engaging with the work of Shoshana Zuboff, Jathan Sadowski, Mariano-Florentino Cuéllar, and Aziz Huq. The recent history of Elsevier is traced to describe the company’s research-lifecycle data-harvesting strategy, with the aim to develop and sell prediction products to universities and other customers. The essay concludes by considering some of the potential costs of surveillance publishing, as other big commercial publishers increasingly enter the predictive-analytics market. It is likely, I argue, that windfall subscription-and-APC profits in Elsevier’s “legacy” publishing business have financed its decade-long acquisition binge in analytics, with the implication that university customers are budgetary victims twice over. The products’ purpose, I stress, is to streamline the top-down assessment and evaluation practices that have taken hold in recent decades, in tandem with the view that the university’s main purpose is to grow regional and national economies. A final pair of concerns is that publishers’ prediction projects may camouflage and perpetuate existing biases in the system—and that scholars may internalize an analytics mindset, one already encouraged by citation counts and impact factors.

All the research that’s fit to print: Open access and the news media | Quantitative Science Studies | MIT Press

Teresa Schultz; All the research that’s fit to print: Open access and the news media. Quantitative Science Studies 2021; 2 (3): 828–844. doi:

Abstract: The goal of the open access (OA) movement is to help everyone access scholarly research, not just those who can afford to. However, most studies looking at whether OA has met this goal have focused on whether other scholars are making use of OA research. Few have considered how the broader public, including the news media, uses OA research. I sought to answer whether the news media mentions OA articles more or less than paywalled articles by looking at articles published from 2010 through 2018 in journals across all four quartiles of the Journal Impact Factor using data obtained through and Web of Science. Gold, green and hybrid OA articles all had a positive correlation with the number of news mentions received. News mentions for OA articles did see a dip in 2018, although they remained higher than those for paywalled articles.