It’s Time to Terminate Social Work’s Relationship with the Impact Factor | Social Work | Oxford Academic

“As a journal-level metric, the IF is unable to assess the value of any given article or author. To make this inference, one would need to read the article and assess its claims, scientific rigor, methodological soundness, and broader implications. What’s more, the IF (which represents the average number of citations across a finite set of eligible articles) is vulnerable to the skewness in citation rates among articles (Nature, 2005) and to the manipulation, negotiation, and gaming of its calculation among stakeholders (Ioannidis & Thombs, 2019). At a more fundamental level, the IF does not capture journal functioning such as improvements to (or worsening of) internal evaluative processes (e.g., effectiveness of peer review, changes to submission instructions and policies, use of and adherence to reporting guidelines, etc.; Dunleavy, 2022). These and other issues are explored in more depth by Seglen (1997)….

In light of these limitations, social work should de-emphasize the IF and instead embrace a new set of evaluative tools. The San Francisco Declaration on Research Assessment (American Society for Cell Biology, 2013)—and more recently the Leiden Manifesto (Hicks et al., 2015)—typify such efforts. They encourage stakeholders (i.e., academic institutions, journals, funders, researchers) to consider using a multitude of qualitative and quantitative alternative metrics (i.e., “altmetrics”; Priem et al., 2012; see also https://metrics-toolkit.org/metrics/) when judging scholarly output—whether it be a journal article, a grant proposal, or even a hiring or tenure packet. …”
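
The skewness problem described in the excerpt above is easy to demonstrate numerically. Below is a minimal Python sketch with invented citation counts (not data from any real journal), showing how an IF-style mean can sit far above what a typical article in the same journal receives:

```python
# Two-year impact factor: citations received this year to items published
# in the previous two years, divided by the number of citable items.
# Citation counts below are invented to illustrate skew.
from statistics import mean, median

citations = [0, 0, 1, 1, 2, 2, 3, 4, 150, 210]  # citations per article

print(f"IF-style mean:  {mean(citations):.1f}")    # 37.3
print(f"Median article: {median(citations):.1f}")  # 2.0
```

Two outlier articles account for nearly all the citations, yet the journal-level average credits every article with a citation rate that almost none of them achieved.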

Challenges of scholarly communication: bibliometric transparency and impact

Abstract:  Citation metrics have value because they aim to make scientific assessment a level playing field, but urgent transparency-based adjustments are necessary to ensure that measurements yield the most accurate picture of impact and excellence. One problematic area is the handling of self-citations, which are either excluded or inappropriately accounted for when using bibliometric indicators for research evaluation. In this talk, I argue in favour of openly tracking self-citations and report on a study of self-referencing behaviour among various academic disciplines as captured by the curated bibliometric database Web of Science. Specifically, I examine the behaviour of thousands of authors grouped into 15 subject areas such as Biology, Chemistry, Science and Technology, Engineering, and Physics. I focus on the methodological set-up of the study and discuss data-science-related problems such as author name disambiguation and bibliometric indicator modelling. This talk is based on the following publication: Kacem, A., Flatt, J. W., & Mayr, P. (2020). Tracking self-citations in academic publishing. Scientometrics, 123(2), 1157–1165. https://doi.org/10.1007/s11192-020-03413-9
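
As a rough illustration of the kind of indicator modelling mentioned in the talk, a per-author self-citation rate can be computed as the share of an author's outgoing references that point to works the author co-wrote. The data structures below are invented for illustration and assume disambiguated author IDs; they are not the study's actual method:

```python
# Hypothetical sketch: an outgoing reference counts as a self-citation
# when the focal author also appears among the cited work's authors.
def self_citation_rate(papers: list[dict], author_id: str) -> float:
    """papers: [{"authors": set_of_ids, "references": [set_of_ids, ...]}, ...]"""
    total = self_cites = 0
    for paper in papers:
        if author_id not in paper["authors"]:
            continue  # skip papers the focal author did not write
        for ref_authors in paper["references"]:
            total += 1
            self_cites += author_id in ref_authors
    return self_cites / total if total else 0.0
```

Computing this rate at scale is exactly where the author name disambiguation problem discussed in the talk bites: without reliable author IDs, the membership tests above become unreliable.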

Publisher Scoring System Wiki

“This system has gone through several revisions. Below are links to the current version, previous versions, and related files.

In brief, journal publishers earn points in this scoring system by engaging in practices that demonstrate partnership with libraries, educators, and researchers. Library Partnership (LP) certification is calculated using a method similar to the U.S. Green Building Council’s LEED (Leadership in Energy and Environmental Design) certification for architectural and building projects. In LEED certification, architectural projects “earn points for various green building strategies across several categories. Based on the number of points achieved, a project earns one of four LEED rating levels: Certified, Silver, Gold or Platinum” (https://www.usgbc.org/). Where LEED certification assesses a building project’s practices in “credit categories” such as water efficiency or indoor air quality, LP certification assesses a publisher’s practices in four categories: Access, Rights, Community, and Discoverability.

A publisher’s partnership score reflects its overall achievement of credits. This score places the publisher in one of four levels or tiers, from Tier 1 (highest partnership practices) through Tier 4 (lowest partnership practices)….”
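
A minimal sketch of how a points-to-tier mapping of this kind might be computed; the credit values and tier cut-offs below are invented for illustration and do not reflect the wiki's actual scoring tables:

```python
# Hypothetical LP-style scoring: sum credits earned across the four
# categories, then map the total to a tier. All numbers are invented.
CATEGORIES = ("Access", "Rights", "Community", "Discoverability")

def lp_tier(credits: dict[str, int]) -> int:
    total = sum(credits.get(c, 0) for c in CATEGORIES)
    for tier, minimum in ((1, 75), (2, 50), (3, 25)):
        if total >= minimum:
            return tier
    return 4  # lowest partnership practices

print(lp_tier({"Access": 30, "Rights": 20, "Community": 10, "Discoverability": 5}))  # 2
```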

Becoming metrics literate: An analysis of brief videos that teach about the h-index | PLOS ONE

Abstract:  Introduction

Academia uses scholarly metrics, such as the h-index, to make hiring, promotion, and funding decisions. These high-stakes decisions require that those using scholarly metrics be able to recognize, interpret, critically assess and effectively and ethically use them. This study aimed to characterize educational videos about the h-index to understand available resources and provide recommendations for future educational initiatives.

Methods

The authors analyzed videos on the h-index posted to YouTube. Videos were identified by searching YouTube and were screened by two authors. To code the videos, the authors created a coding sheet assessing content and presentation style, with a focus on the videos’ educational quality based on Cognitive Load Theory. Two authors coded each video independently, with discrepancies resolved by group consensus.

Results

Thirty-one videos met inclusion criteria. Twenty-one videos (68%) were screencasts and seven used a “talking head” approach. Twenty-six videos (83%) defined the h-index and provided examples of how to calculate and find it. The importance of the h-index in high-stakes decisions was raised in 14 (45%) videos. Sixteen videos (52%) described caveats about using the h-index, with potential disadvantages for early-career researchers the most prevalent (n = 7; 23%). All videos incorporated various educational approaches with potential impact on viewer cognitive load. A minority of videos (n = 10; 32%) displayed professional production quality.

Discussion

The videos featured content with the potential to enhance viewers’ metrics literacies: many defined the h-index and described its calculation, providing viewers with skills to recognize and interpret the metric. However, fewer than half described the h-index as an author quality indicator, a use which has been contested, and caveats about h-index use were inconsistently presented, suggesting room for improvement. While most videos integrated practices to help balance viewers’ cognitive load, few (32%) were of professional production quality. Some videos missed opportunities to adopt particular practices that could benefit learning.
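
For context on the metric these videos teach: the h-index is the largest number h such that an author has h publications with at least h citations each. A minimal Python sketch of that calculation, with an invented citation list:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count < rank:
            break
        h = rank
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers each have >= 4 citations
```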

Wiley Signs Declaration on Research Assessment, Deepens Commitment to Responsible Research Assessment | John Wiley & Sons, Inc.

“Global research and education leader Wiley today announced it has signed the Declaration on Research Assessment (DORA), a worldwide initiative designed to improve the ways in which the outputs of scholarly research are evaluated.

As the publisher of nearly 2,000 academic journals, Wiley will deliver more ways to assess and recognize research outputs, which in turn supports healthy scholarship and allows more researchers to thrive in their careers. To this end, Wiley will roll out a broad range of journal and article metrics across its journal portfolio with the aim of providing a holistic, well-rounded view of the value and impact of any author’s research. These include metrics that measure impact beyond citations: usage, re-use, reproducibility, peer review assessment, geographic reach, and public recognition via references in media outlets….”

News – OLH annual report 2021

“The Open Library of Humanities is an award-winning, academic-led, diamond open-access publisher of 28 journals based in the Department of English, Theatre and Creative Writing at Birkbeck, University of London. We are part of an ecosystem of scholar-led, community-owned and non-profit publishers exploring different business models and innovative approaches to open access publishing that are adapted to the needs, in this case, of academics in the humanities. The platform was launched in 2015 by Birkbeck academics Professor Martin Eve and Dr Caroline Edwards and operated as an independent charity until May 2021, when it merged with the university. The decision to merge was taken, specifically, to protect the “academic-led” quality of the organisation and to protect the charity from financial and personnel risks.

With initial funding from the Andrew W. Mellon Foundation and subsequent support from Arcadia, a charitable fund of Lisbet Rausing and Professor Peter Baldwin, the platform currently covers its costs by payments from an international library consortium, rather than any author fee. This funding mechanism enables equitable open access in the humanities disciplines, with charges neither to readers nor authors….

Part of what makes the OLH model so appealing is our journal ‘flipping’ programme, through which we have sought to convert existing subscription titles to an open access model without fees. In September 2021 OLH re-opened its journal flipping programme and remains open to expressions of interest from subscription journals in the humanities seeking to move to a gold open access (OA) publishing model without author-facing charges (‘diamond’ OA). …”

Towards a new reward system for open science

The transition to an open science system affects the entire research process. The reward systems also need to be adjusted in order to support and mirror the open research landscape, but what will this work look like, and what will change? We met Gustav Nilsonne, chair of the European working group dealing with the issue and a participant in the SUHF working group on merit reviews.

Rethinking Research Assessment for the Greater Good: Findings from the RPT Project – Scholarly Communications Lab | ScholCommLab

“The review, promotion, and tenure (RPT) process is central to academic life and workplace advancement. It influences where faculty direct their attention, research, and publications. By unveiling the RPT process, we can inform actions that lead towards a greater opening of research. 

Between 2017 and 2022, we conducted a multi-year research project involving the collection and analysis of more than 850 RPT guidelines and 338 surveys with scholars from 129 research institutions across Canada and the US. Starting with a literature review of academic promotion and tenure processes, we launched six studies applying mixed methods approaches such as surveys and matrix coding.

So how do today’s universities and colleges incentivize open access research? Read on for 6 key takeaways from our studies….”

At what point do academics forego citations for journal status? | Impact of Social Sciences

“The limitations of journal-based citation metrics for assessing individual researchers are well known. However, the way in which these assessment systems differentially shape research practices within disciplines is less well understood. Presenting evidence from a new analysis of business and management academics, Rossella Salandra, Ammon Salter, and James Walker explore how journal status is valued by these academics and the point at which journal status becomes more prized than academic influence….”

The methodological quality of physical therapy related trial… : American Journal of Physical Medicine & Rehabilitation

Abstract:  Objective 

We aimed to compare the methodological quality of physical therapy-related trials published in open access journals with that of trials published in subscription-based journals, adjusting for subdiscipline, intervention type, endorsement of the Consolidated Standards of Reporting Trials (CONSORT), impact factor, and publication language.

Design 

In this meta-epidemiological study, we searched the Physiotherapy Evidence Database (PEDro) on May 8, 2021, to include any physical therapy-related trials published from January 1, 2020. We extracted variables such as CONSORT endorsement, the PEDro score, and publication type. We compared the PEDro score between the publication types using a multivariable generalized estimating equation (GEE), adjusting for covariates.

Results 

A total of 2,743 trials were included, with a mean total PEDro score (SD) of 5.8 (±1.5). Trials from open access journals had a lower total PEDro score than those from subscription-based journals (5.5 ± 1.5 vs. 5.9 ± 1.5; mean difference [MD]: −0.4; 95% confidence interval: −0.5 to −0.3). GEE revealed that open access publication was significantly associated with a lower total PEDro score (MD: −0.42; P < 0.001).

Conclusions 

In recent physical therapy-related trials, open access publications demonstrated lower methodological quality than subscription-based publications, although the difference was small.
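
For readers unfamiliar with the analytic approach named in the methods, below is a rough sketch of a GEE comparison of this kind using Python's statsmodels; the file name, column names, and model specification are invented for illustration and may differ from the study's actual analysis:

```python
# Hypothetical sketch: compare PEDro scores between publication types,
# clustering trials within journals via a generalized estimating equation.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("pedro_trials.csv")  # one row per trial (invented file)

model = smf.gee(
    "pedro_score ~ open_access + consort_endorsed + impact_factor + english",
    groups="journal_id",  # trials in the same journal are correlated
    data=df,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())  # open_access coefficient ~ adjusted mean difference
```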

Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies | Health Research Policy and Systems | Full Text

Abstract:  Background

In light of replication and translational failures, biomedical research practices have recently come under scrutiny. Experts have pointed out that the current incentive structures at research institutions do not sufficiently incentivise researchers to invest in robustness and transparency and instead incentivise them to optimize their fitness in the struggle for publications and grants. This cross-sectional study aimed to describe whether and how relevant policies of university medical centres in Germany support the robust and transparent conduct of research and how prevalent traditional metrics are.

Methods

For 38 German university medical centres, we searched for institutional policies for academic degrees and academic appointments as well as websites for their core facilities and research in general between December 2020 and February 2021. We screened the documents for mentions of indicators of robust and transparent research (study registration; reporting of results; sharing of research data, code and protocols; open access; and measures to increase robustness) and for mentions of more traditional metrics of career progression (number of publications; number and value of awarded grants; impact factors; and authorship order).

Results

While open access was mentioned in 16% of PhD regulations, other indicators of robust and transparent research were mentioned in less than 10% of institutional policies for academic degrees and academic appointments. These indicators were more frequently mentioned on the core facility and general research websites. Institutional policies for academic degrees and academic appointments had frequent mentions of traditional metrics.

Conclusions

References to robust and transparent research practices are, with a few exceptions, generally uncommon in institutional policies at German university medical centres, while traditional criteria for academic promotion and tenure still prevail.

Jamali, Wakeling & Abbasi (2022) Why do journals discontinue? A study of Australian ceased journals – Learned Publishing – Wiley Online Library

Jamali, H.R., Wakeling, S. and Abbasi, A. (2022), Why do journals discontinue? A study of Australian ceased journals. Learned Publishing, 35: 219-228. https://doi.org/10.1002/leap.1448

Abstract: Little is known about why journals discontinue, despite the significant implications. We present an analysis of 140 Australian journals that ceased from 2011 to mid-2021 and report the results of a survey of the editors of 53 of them. The mean age of the journals at discontinuation was 19.7 years (median = 16), with 57% being 10 years or older. About 54% of them belonged to educational institutions and 34% to non-profit organizations. In terms of subject, 75% of the journals belonged to the social sciences, humanities and arts. The survey showed that funding was an important reason for discontinuation, and lack of quality submissions and lack of support from the owners of the journal also played a role. Too much reliance on voluntary work appeared to be an issue for editorial processes. The dominant metric culture in the research environment and pressure for journals to perform well in journal rankings negatively affect local journals in attracting quality submissions. A fifth of journals indicated that they did not have a plan for the preservation of articles at the time of publication, and the current availability of the content of ceased journals appeared to be sub-optimal in many cases, with reliance on the websites of ceased journals or web-archive platforms.

Key points

One hundred and forty Australian journals ceased publishing between 2011 and 2020, with an average age of 19 years on cessation.
The majority of Australian journals that ceased publication 2011–2020 were in the social sciences, humanities and arts where local journals play an important role.
Funding was found to be a key reason for journal discontinuation followed by lack of support and quality submissions and over-reliance on voluntary work.
A metric-driven culture and journal rankings adversely impact local journals and can lead to discontinuation.
Many journals have neither a sustainable business model (or funding) nor a preservation plan, which jeopardizes journal continuation and long-term access to archived content.

Implementing the Declaration on Research Assessment: a publisher case study

Abstract:  There has been much debate around the role of metrics in scholarly communication, with particular focus on the misapplication of journal metrics, such as the impact factor in the assessment of research and researchers. Various initiatives have advocated for a change in this culture, including the Declaration on Research Assessment (DORA), which invites stakeholders throughout the scholarly communication ecosystem to sign up and show their support for practices designed to address the misuse of metrics. This case study provides an overview of the process undertaken by a large academic publisher (Taylor & Francis Group) in signing up to DORA and implementing some of its key practices in the hope that it will provide some guidance to others considering becoming a signatory. Our experience suggests that research, consultation and flexibility are crucial components of the process. Additionally, approaching signing with a project mindset versus a ‘sign and forget’ mentality can help organizations to understand the practical implications of signing, to anticipate and mitigate potential obstacles and to support cultural change.

The Mechanics Behind A Precipitous Rise In Impact Factor: A Case Study From the British Journal of Sports Medicine

Abstract:  The impact factor is a popular but highly flawed proxy for the importance of academic journals. A variety of techniques exist to increase an individual journal’s impact factor but are usually described in abstract terms. Here, we investigate two of them in the historical publications of the British Journal of Sports Medicine: (1) the preferential publication of brief, citable, non-substantive academic content, and (2) the preferential publication of review or meta-analytic content. Simple content analysis reveals an exponential rise in published editorial and other brief content, persistent growth in ‘highly citable’ content, and a dramatic drop in the proportion of empirical research published. These changes parallel the changes in impact factor over the time period available. The implications of this are discussed.
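
One widely documented mechanism behind such techniques is that the IF denominator counts only ‘citable items’ (substantive articles and reviews), while the numerator counts citations to everything the journal publishes, including editorials and letters. A toy Python model with invented numbers, sketching how shifting the content mix moves the metric:

```python
# Toy denominator effect: citations to brief front matter still count in
# the numerator, but the items themselves are often excluded from the
# denominator of citable items. All numbers are invented.
def impact_factor(article_cites, n_articles, front_matter_cites, n_front_matter):
    numerator = article_cites + front_matter_cites  # all citations count
    denominator = n_articles                        # only "citable items"
    return numerator / denominator

before = impact_factor(400, 100, 10, 5)    # mostly empirical articles
after = impact_factor(400, 50, 100, 120)   # fewer articles, more brief content
print(f"{before:.1f} -> {after:.1f}")      # 4.1 -> 10.0
```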