The impact of the COVID-19 pandemic on academic productivity

Abstract:  ‘Publish or perish’ is an expression describing the pressure on academics to consistently publish research to ensure a successful career in academia. With a global pandemic that has changed the world, how has it changed academic productivity? Here we show that academics are posting just as many publications on the arXiv pre-print server as if there were no pandemic: 168,630 were posted in 2020, a +12.6% change from 2019 and a +1.4σ deviation above the predicted 162,577 ± 4,393. However, some immediate impacts are visible in individual research fields. Conference cancellations have led to sharp drops in pre-prints, but laboratory closures have had mixed effects. Only some experimental fields show mild declines in outputs, with most being consistent with previous years or even increasing above model expectations. The most significant change is a 50% increase (+8σ) in quantitative biology research, all related to the COVID-19 pandemic. Some of these publications are by biologists using arXiv for the first time, and some are written by researchers from other fields (e.g., physicists, mathematicians). While quantitative biology pre-prints have returned to pre-pandemic levels, 20% of the research in this field is now focussed on the COVID-19 pandemic, demonstrating a strong shift in research focus.
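
As a back-of-the-envelope check on the figures quoted in this abstract, a minimal Python sketch; the implied 2019 total is derived here from the reported +12.6% change and is not a figure given in the abstract itself:

```python
# Back-of-the-envelope check of the arXiv figures quoted in the abstract.
observed_2020 = 168_630    # pre-prints posted in 2020
predicted_2020 = 162_577   # model prediction for 2020
prediction_sd = 4_393      # reported uncertainty (one standard deviation)

# Deviation of the observation from the prediction, in standard deviations.
sigma = (observed_2020 - predicted_2020) / prediction_sd
print(f"deviation: {sigma:+.1f} sigma")            # -> +1.4 sigma

# 2019 total implied by the reported +12.6% year-over-year change
# (an inference, not a figure stated in the abstract).
implied_2019 = observed_2020 / 1.126
print(f"implied 2019 total: {implied_2019:,.0f}")  # -> ~149,760
```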

Open Access Books – Part II – Delta Think

“If a publisher decides to implement an OA books program, what does it do with older titles? Does it make its backlist retrospectively OA? Or reserve OA for frontlist titles only? (Or both?)….

The chart above analyzes the lead times in indexing books. It shows how many years after publication books were added to the index (the DOAB) and deemed to be made OA.

If titles are made OA in their year of publication (deemed to be frontlist titles), the lead time will be zero. Just over 25% of DOAB titles are frontlist.
If titles were made OA after their year of publication (deemed to be backlist titles), then the lead time will be a positive number. Around 16% of titles were made OA the year after publication. The remaining 59% or so of titles are deep backlist.
Although not shown above, the oldest titles in the DOAB date back decades. Earlier years (before 2000) typically have a handful of titles per publication year, with annual numbers increasing significantly in more recent years. The oldest title in the index was published in 1787….
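
The lead-time bucketing described above is easy to make concrete. A minimal sketch, assuming lead time is simply the year a title was indexed as OA minus its publication year, with thresholds read off the excerpt (the function and example values are our own, not Delta Think's):

```python
# Classify a title by OA lead time: indexed/OA year minus publication year.
# Thresholds per the post: 0 = frontlist, 1 = backlist, >1 = deep backlist.
def classify(publication_year: int, oa_year: int) -> str:
    lead_time = oa_year - publication_year
    if lead_time < 0:
        raise ValueError("OA year precedes publication year")
    if lead_time == 0:
        return "frontlist"
    if lead_time == 1:
        return "backlist"
    return "deep backlist"

print(classify(2021, 2021))  # frontlist
print(classify(2019, 2020))  # backlist
print(classify(1787, 2015))  # deep backlist (OA year here is hypothetical)
```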

Patterns in license usage differ when analyzed by publication year (left) compared with the year titles were made OA or added to the index (right). We can clearly see that license use by publication year shows distinct patterns, while license use by indexed year appears more random….


We see that the proportion of CC BY licenses (colors at the bottom of each bar) is significantly lower in books (32%) than in journals (51%). Likewise, CC BY-NC (2nd from bottom) – books (4%) vs. journal articles (15%). But CC BY-NC-ND licenses show the opposite: books have a greater proportion (29%) than journals (18%)….”

Lessons from arXiv’s 30 years of information sharing | Nature Reviews Physics

“Since the launch of arXiv 30 years ago, modes of information spread in society have changed dramatically — and not always for the better. Paul Ginsparg, who founded arXiv, discusses how academic experience with online preprints can still inform information sharing more generally….”

Data sharing practices and data availability upon request differ across scientific disciplines | Scientific Data

Abstract:  Data sharing is one of the cornerstones of modern science that enables large-scale analyses and reproducibility. We evaluated data availability in research articles across nine disciplines in Nature and Science magazines and recorded corresponding authors’ concerns, requests and reasons for declining data sharing. Although data sharing has improved in the last decade, and particularly in recent years, data availability and willingness to share data still differ greatly among disciplines. We observed that statements of data availability upon (reasonable) request are inefficient and should not be allowed by journals. To improve data sharing at the time of manuscript acceptance, researchers should be better motivated to release their data with real benefits such as recognition or bonus points in grant and job applications. We recommend that data management costs should be covered by funding agencies; publicly available research data ought to be included in the evaluation of applications; and surveillance of data sharing should be enforced by both academic publishers and funders. These cross-discipline survey data are available from the PlutoF repository.


ARL and Six Universities Awarded National Science Foundation Grant to Study Discipline-Specific Models and Costs for Public Access to Research Data – Association of Research Libraries

“The US National Science Foundation (NSF) has awarded the Association of Research Libraries (ARL) and six universities involved in the Data Curation Network a $297,019 grant to conduct research, develop models, and collect costing information for public access to research data across five disciplinary areas. The project, Completing the Life Cycle: Developing Evidence-Based Models of Research Data Sharing, will start in August 2021….

This research seeks to answer the following questions:

Where are funded researchers across these institutions making their data publicly accessible and what is the quality of the metadata?
How are researchers making decisions about why and how to share research data?
What is the cost to the institution to implement the federally mandated public access to research data policy? …”

Journal impact factor gets a sibling that adjusts for scientific field | Science | AAAS

“The new Journal Citation Indicator (JCI) accounts for the substantially different rates of publication and citation in different fields, Clarivate says. But the move is drawing little praise from critics, who say the new metric remains vulnerable to misunderstanding and misuse….”

Journal Citation Indicator. Just Another Tool in Clarivate’s Metrics Toolbox? – The Scholarly Kitchen

“The JCI has several benefits when compared against the standard Journal Impact Factor (JIF): It is based on a journal’s citation performance across three full years of citation data rather than a single year’s snapshot of a journal’s performance across the previous two years. Clarivate also promises to provide the JCI score to all journals in its Core Collection, even those journals that do not currently receive a JIF score.

The JCI also avoids the numerator-denominator problem of the JIF, where ALL citations to a journal are counted in the numerator, but only “citable items” (Articles and Reviews) are counted in the denominator. The JCI only focuses on Articles and Reviews.

Finally, like a good indicator, the JCI is easy to interpret. Average performance is set to 1.0, so a journal that receives a JCI score of 2.5 performed two-and-a-half times better than average, while a journal with a score of 0.5 performed only half as well.

To me, JCI’s biggest weakness is Clarivate’s bold claim that it achieved normalization across disciplines….”
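
To make the numerator-denominator contrast concrete, a toy sketch with invented counts; the JCI line reduces Clarivate's category normalization to its simplest form (the actual JCI normalizes each paper by field, year and document type):

```python
# Toy illustration of the JIF numerator-denominator mismatch and of
# JCI-style field normalization. All numbers are invented.

# JIF-style ratio: ALL citations to the journal (including citations to
# editorials, letters, news items) in the numerator, but only "citable
# items" (articles and reviews) in the denominator.
citations_to_all_items = 1_200   # includes citations to front matter
citable_items = 400              # articles + reviews only
jif_like = citations_to_all_items / citable_items
print(f"JIF-like score: {jif_like:.2f}")  # inflated by the extra citations

# JCI-style ratio: each paper's citations divided by the expected citations
# for comparable papers, then averaged. 1.0 = field average, 2.5 = 2.5x.
paper_citations = [10, 4, 0, 6]
field_expected = [5.0, 5.0, 2.0, 4.0]   # expected cites per comparable paper
jci_like = sum(c / e for c, e in zip(paper_citations, field_expected)) / len(paper_citations)
print(f"JCI-like score: {jci_like:.2f}")
```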

Introducing the Journal Citation Indicator: A new, field-normalized measurement of journal citation impact – Web of Science Group

“In a recent blog post we discussed refinements in this year’s forthcoming release of the Journal Citation Reports (JCR)™, describing the addition of new content and hinting at a new metric for measuring the citation impact of a journal’s recent publications.

I’m now pleased to fully introduce the Journal Citation Indicator. By normalizing for different fields of research and their widely varying rates of publication and citation, the Journal Citation Indicator provides a single journal-level metric that can be easily interpreted and compared across disciplines….”

Can altmetric mentions predict later citations? A test of validity on data from ResearchGate and three social media platforms | Emerald Insight

Abstract:  Purpose

The main purpose of this study is to explore and validate the question “whether altmetric mentions can predict citations to scholarly articles”. The paper attempts to explore the nature and degree of correlation between altmetrics (from ResearchGate and three social media platforms) and citations.

Design/methodology/approach

A large data sample of scholarly articles published from India in the year 2016 is obtained from the Web of Science database, and the corresponding altmetric data are obtained from ResearchGate and three social media platforms (Twitter, Facebook and blogs, via the Altmetric.com aggregator). Correlations are computed between early altmetric mentions and later citation counts, for data grouped into different disciplinary groups.

Findings

Results show that the correlations between altmetric mentions and citation counts are positive, but weak. Correlations are relatively higher for data from ResearchGate than for data from the three social media platforms. Further, significant disciplinary differences are observed in the degree of correlation between altmetrics and citations.

Research limitations/implications

The results support the idea that altmetrics do not necessarily reflect the same kind of impact as citations. However, articles that get higher altmetric attention early may actually have a slight citation advantage. Further, altmetrics from academic social networks like ResearchGate correlate more strongly with citations than those from social media platforms.

Originality/value

The paper is novel in two respects. First, it takes altmetric data for a window of about 1–1.5 years after article publication and citation counts for a longer citation window of about 3–4 years after publication. Second, it is one of the first studies to analyze data from the ResearchGate platform, a popular academic social network, to understand the type and degree of these correlations.
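
The core computation in a study like this is a rank correlation between early altmetric counts and later citation counts, computed per discipline. A minimal sketch of that step using scipy and invented data (the study's own code and data are not reproduced here, so this is only an illustration of the technique):

```python
# Rank correlation between early altmetric mentions and later citations,
# as computed per discipline in studies of this kind. Data are invented.
from scipy.stats import spearmanr

# Early altmetric mentions (~1-1.5 years post-publication) and later
# citation counts (~3-4 years post-publication) for the same articles.
altmetric_mentions = [0, 2, 5, 1, 8, 0, 3, 12, 1, 4]
citations = [1, 3, 9, 2, 15, 0, 4, 20, 5, 6]

rho, p_value = spearmanr(altmetric_mentions, citations)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```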

Open-access publisher PLOS pushes to extend clout beyond biomedicine

“Non-profit life-sciences publisher PLOS is gunning for a bigger share of science beyond the biomedical realm with the launch of five journals in fields where open science is less widely adopted. They will be its first new titles in 14 years. It is also piloting a new open-access business model, in a bid to spread the cost of publishing more equally among researchers….

The new business model is the first shake-up at the publisher for a while, and has been eagerly anticipated….

 The publisher’s financial history is chequered. It first broke even in 2010. In recent years it has fallen into deficit, with 2019 the first year that it made an operating surplus since 2015….

The idea behind the new model is that the cost of publishing a paper is spread more equally across all of the authors’ institutions, rather than the corresponding author’s institution or funder footing the bill, as is standard with an article processing charge. PLOS says that as more members join the scheme, it will become cheaper for researchers to publish papers. So far, more than 75 institutions in 8 countries have signed up….

PLOS’s chief publishing officer, Niamh O’Connor, says that PLOS hopes to circumvent the idea that open access moves the cost of publishing a paper from the reader to the author. “While the article-processing model has allowed open access to develop, we don’t see that as the future,” she says. “We are working to a future where those barriers are removed.” …”
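
A toy sketch of the cost-spreading idea, under the simplifying assumption that a flat per-paper cost is split evenly across the institutions of all authors rather than billed to the corresponding author's institution alone. PLOS's actual model pools membership fees across institutions, so this is only a schematic illustration; all names and figures are invented:

```python
# Toy model of spreading a paper's publishing cost across all author
# institutions instead of billing only the corresponding author's
# institution. All figures are invented for illustration.
def per_institution_share(publishing_cost: float, author_institutions: list[str]) -> dict[str, float]:
    unique = sorted(set(author_institutions))
    share = publishing_cost / len(unique)
    return {inst: round(share, 2) for inst in unique}

# APC-style model: one institution pays everything.
print(per_institution_share(2000.0, ["Univ A"]))
# Cost-spreading model: the same cost split across three institutions.
print(per_institution_share(2000.0, ["Univ A", "Univ B", "Univ C"]))
```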

Influence of accessibility (open and toll-based) of scholarly publications on retractions | SpringerLink

“We have examined retracted publications in different subject fields and attempted to analyse whether online free accessibility (Open Access) influences retraction, by examining the scholarly literature published from 2000 through 2019, covering the most recent 20 years of publications. InCites, a research analytics tool developed by Clarivate Analytics®, was used together with the Web of Science, PubMed Central, and Retraction Watch databases to harvest data for the study. Retracted ‘Article’ and ‘Review’ publications were examined with respect to their online accessibility mode (Toll Access and Open Access), using non-parametric tests such as the Odds Ratio, Wilcoxon Signed Rank Test, Mann–Whitney U Test, and the Mann–Kendall and Sen’s methods. The Odds for OA articles to be retracted are about 1.62 times as large (62% higher) as for TA articles (95% CI 1.5, 1.7); 0.028% of OA publications are retracted compared with 0.017% of TA publications. Retractions have occurred in all subject areas. In eight subject areas, the Odds for retraction of OA articles are larger than for TA articles. In three subject areas, the Odds for retraction of OA articles are smaller than for TA articles. In the remaining 11 subject areas, no significant difference is observed. Post-retraction, though a decline is observed in the citation count of OA & TA publications (p < .01), the Odds for OA articles to get cited after retraction are about 1.21 times as large (21% higher) as for TA articles (95% CI 1.53, 1.72). TA publications are retracted earlier than OA publications (p < .01). We observed an increasing trend of retracted works published in both modes. However, the rate of retraction of OA publications is double the rate of retraction of TA publications.”
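
The headline odds ratio can be roughly reproduced from the retraction rates quoted in the abstract. A minimal sketch (the small gap to the reported 1.62 is consistent with rounding in the quoted percentages):

```python
# Odds ratio for retraction of OA vs TA publications, recomputed from the
# rounded rates quoted in the abstract (0.028% vs 0.017%).
def odds(p: float) -> float:
    return p / (1.0 - p)

p_retracted_oa = 0.00028   # 0.028% of OA publications retracted
p_retracted_ta = 0.00017   # 0.017% of TA publications retracted

odds_ratio = odds(p_retracted_oa) / odds(p_retracted_ta)
print(f"odds ratio ~ {odds_ratio:.2f}")   # ~1.65; the abstract reports 1.62
```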

Equity concerns persist over open-access publishing | Nature Index

“An analysis of more than 182,000 scholars in the United States has found that the researchers who publish in OA journals with APCs – which can cost several thousand dollars – are more likely to be male, at an advanced career stage, have access to federal funding, and/or be employed by prestigious universities.”

Open access journal publishing in the business disciplines: A closer look at the low uptake and discipline-specific considerations – Mikael Laakso, Bo-Christer Björk, 2021

Abstract:  The Internet has enabled efficient electronic publishing of scholarly journals and Open Access business models. Recent studies have shown that adoption of Open Access journals has been uneven across scholarly disciplines, with the business and economics disciplines in particular seeming to lag behind all other fields of research. Through bibliometric analysis of journals indexed in Scopus, we find the share of articles in Open Access journals in business, management, and accounting to be only 6%. We further studied the Open Access availability of articles published during 2014–2019 in journals included in the Financial Times 50 journal list (19,969 articles in total). None of the journals are fully Open Access, but 8% of the articles are individually open, and for a further 35% earlier manuscript versions are available openly on the web. The results suggest that the low adoption rate of Open Access journals in the business fields is a side-effect of evaluation practices emphasizing publication in journals included in particular ranking lists, which creates disincentives for business model innovation and barriers for new entrants among journals. Currently, most business school research has to be made Open Access in ways other than through full Open Access journals, and libraries play an important role in facilitating this in a sustainable way.