Journal Citation Indicator. Just Another Tool in Clarivate’s Metrics Toolbox? – The Scholarly Kitchen

“The JCI has several benefits when compared against the standard Journal Impact Factor (JIF): It is based on a journal’s citation performance across three full years of citation data rather than a single year’s snapshot of a journal’s performance across the previous two years. Clarivate also promises to provide the JCI score to all journals in its Core Collection, even those journals that do not currently receive a JIF score.

The JCI also avoids the numerator-denominator problem of the JIF, where ALL citations to a journal are counted in the numerator, but only “citable items” (Articles and Reviews) are counted in the denominator. The JCI focuses only on Articles and Reviews.

Finally, like a good indicator, the JCI is easy to interpret. Average performance is set to 1.0, so a journal that receives a JCI score of 2.5 performed two-and-a-half times better than average, while a journal with a score of 0.5 performed only half as well.

To me, JCI’s biggest weakness is Clarivate’s bold claim that it achieved normalization across disciplines….”
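The easy interpretation described in the excerpt follows from how the JCI is built: Clarivate defines it as the average of field-normalized citation scores for a journal's articles and reviews, so an average-performing paper contributes 1.0. A minimal sketch of that idea, with invented baseline values (the real normalization depends on field, document type, and publication year):

```python
# Minimal sketch of the field-normalization behind the JCI. The
# "field baseline" numbers are invented; in the real metric each paper
# is compared against the expected citation count for comparable papers
# in its field, document type, and publication year.

def normalized_impact(citations, field_baseline):
    """Citations received divided by the expected count for comparable papers."""
    return citations / field_baseline

def journal_indicator(papers):
    """Mean normalized impact over a journal's articles and reviews.

    An average-performing journal scores ~1.0; 2.5 means two-and-a-half
    times the field average, 0.5 means half.
    """
    scores = [normalized_impact(c, b) for c, b in papers]
    return sum(scores) / len(scores)

# (citations, field baseline) pairs for a toy journal
papers = [(10, 5.0), (3, 5.0), (12, 5.0), (5, 5.0)]
print(round(journal_indicator(papers), 2))  # → 1.5
```

Because the score is a ratio against a field-specific expectation, a 1.5 in a low-citation field and a 1.5 in a high-citation field are meant to be comparable, which is the cross-discipline claim the excerpt questions.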

Introducing the Journal Citation Indicator: A new, field-normalized measurement of journal citation impact – Web of Science Group

“In a recent blog post we discussed refinements in this year’s forthcoming release of the Journal Citation Reports (JCR)™, describing the addition of new content and hinting at a new metric for measuring the citation impact of a journal’s recent publications.

I’m now pleased to fully introduce the Journal Citation Indicator. By normalizing for different fields of research and their widely varying rates of publication and citation, the Journal Citation Indicator provides a single journal-level metric that can be easily interpreted and compared across disciplines….”

Can altmetric mentions predict later citations? A test of validity on data from ResearchGate and three social media platforms | Emerald Insight

Abstract:  Purpose

The main purpose of this study is to explore and validate whether altmetric mentions can predict citations to scholarly articles. The paper attempts to explore the nature and degree of correlation between altmetrics (from ResearchGate and three social media platforms) and citations.

Design/methodology/approach

A large data sample of scholarly articles published from India in 2016 is obtained from the Web of Science database, and the corresponding altmetric data are obtained from ResearchGate and three social media platforms (Twitter, Facebook and blogs, via the Altmetric.com aggregator). Correlations are computed between early altmetric mentions and later citation counts, for data grouped into different disciplinary groups.

Findings

Results show that the correlations between altmetric mentions and citation counts are positive but weak. Correlations are relatively higher for data from ResearchGate than for data from the three social media platforms. Further, significant disciplinary differences are observed in the degree of correlation between altmetrics and citations.

Research limitations/implications

The results support the idea that altmetrics do not necessarily reflect the same kind of impact as citations. However, articles that receive higher altmetric attention early may actually have a slight citation advantage. Further, altmetrics from academic social networks like ResearchGate are more strongly correlated with citations than those from social media platforms.

Originality/value

The paper is novel in two respects. First, it takes altmetric data for a window of about 1–1.5 years after article publication and citation counts for a longer citation window of about 3–4 years after publication. Second, it is one of the first studies to analyze data from ResearchGate, a popular academic social network, to understand the type and degree of these correlations.
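The core computation the abstract describes, a rank correlation between early altmetric counts and later citation counts, can be sketched with the standard library alone. All numbers below are invented for illustration; only the qualitative outcome (a weak positive correlation) mirrors the study's finding:

```python
# Sketch of the abstract's core analysis: a Spearman rank correlation
# between early altmetric mentions and later citation counts.
# All data below are invented for illustration.
from statistics import mean

def ranks(xs):
    """1-based ranks, with tied values sharing their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank-transformed data."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

mentions  = [0, 2, 5, 1, 8, 0, 3, 12]  # early altmetric window (~1-1.5 yr)
citations = [3, 1, 2, 6, 4, 2, 9, 5]   # later citation window (~3-4 yr)
print(round(spearman(mentions, citations), 2))  # → 0.26 (weak, positive)
```

Spearman (rather than Pearson) correlation is the usual choice here because both mention counts and citation counts are heavily skewed, with many zeros and a few very large values.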

Open-access publisher PLOS pushes to extend clout beyond biomedicine

“Non-profit life-sciences publisher PLOS is gunning for a bigger share of science beyond the biomedical realm with the launch of five journals in fields where open science is less widely adopted. They will be its first new titles in 14 years. It is also piloting a new open-access business model, in a bid to spread the cost of publishing more equally among researchers….

The new business model is the first shake-up at the publisher for a while, and has been eagerly anticipated….

 The publisher’s financial history is chequered. It first broke even in 2010. In recent years it has fallen into deficit, with 2019 the first year that it made an operating surplus since 2015….

The idea behind the new model is that the cost of publishing a paper is spread more equally across all of the authors’ institutions, rather than the corresponding author’s institution or funder footing the bill, as is standard with an article processing charge. PLOS says that as more members join the scheme, it will become cheaper for researchers to publish papers. So far, more than 75 institutions in 8 countries have signed up….
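The cost-spreading arithmetic described above can be illustrated in toy form. This is not PLOS's actual pricing scheme (the article gives no fee figures, and the real model is institution-level rather than per paper); the sketch simply contrasts one institution footing the whole bill with an equal split across all author institutions:

```python
# Toy illustration of the cost-sharing idea (NOT PLOS's actual pricing):
# split a paper's publishing cost equally across all contributing
# institutions instead of billing the corresponding author's institution.

def per_institution_share(cost, institutions):
    """Return each institution's equal share of a paper's publishing cost."""
    share = cost / len(institutions)
    return {inst: round(share, 2) for inst in institutions}

# Traditional APC: the corresponding author's institution pays everything.
print(per_institution_share(2000.0, ["Univ A"]))
# Shared model: four author institutions each pay a quarter.
print(per_institution_share(2000.0, ["Univ A", "Univ B", "Univ C", "Univ D"]))
```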

PLOS’s chief publishing officer, Niamh O’Connor, says that PLOS hopes to circumvent the idea that open access moves the cost of publishing a paper from the reader to the author. “While the article-processing model has allowed open access to develop, we don’t see that as the future,” she says. “We are working to a future where those barriers are removed.” …”

Influence of accessibility (open and toll-based) of scholarly publications on retractions | SpringerLink

“We have examined retracted publications in different subject fields and attempted to analyse whether online free accessibility (Open Access) influences retraction, by examining the scholarly literature published from 2000 through 2019, covering the most recent 20 years of publications. InCites, a research analytics tool developed by Clarivate Analytics®, together with the Web of Science, PubMed Central, and Retraction Watch databases, was used to harvest data for the study. Retracted ‘Article’ and ‘Review’ publications were examined with respect to their online accessibility mode (Toll Access and Open Access), using non-parametric tests such as the Odds Ratio, Wilcoxon Signed Rank Test, Mann–Whitney U Test, and the Mann–Kendall and Sen’s methods. The odds for OA articles to be retracted are about 1.62 times as large (62% higher) as for TA articles (95% CI 1.5, 1.7). 0.028% of OA publications are retracted compared with 0.017% of TA publications. Retractions have occurred in all subject areas. In eight subject areas, the odds of retraction for OA articles are larger than for TA articles. In three subject areas, the odds of retraction for OA articles are smaller than for TA articles. In the remaining 11 subject areas, no significant difference is observed. Post-retraction, though a decline is observed in the citation count of both OA and TA publications (p < .01), the odds for OA articles to be cited after retraction are about 1.21 times as large (21% higher) as for TA articles (95% CI 1.53, 1.72). TA publications are retracted earlier than OA publications (p < .01). We observed an increasing trend of retracted works published in both modes. However, the rate of retraction of OA publications is double the rate of retraction of TA publications.”
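The headline statistic above, an odds ratio with a 95% confidence interval, can be reproduced in sketch form with a standard Wald interval on the log odds ratio. The population sizes below are invented to make the arithmetic concrete; only the retraction rates (0.028% OA, 0.017% TA) come from the abstract:

```python
# Sketch of the abstract's headline statistic: an odds ratio with a
# Wald 95% confidence interval. The population sizes are invented;
# only the retraction rates (0.028% OA, 0.017% TA) come from the abstract.
import math

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: (a/b) / (c/d), with a Wald 95% CI.

    a = OA retracted, b = OA not retracted,
    c = TA retracted, d = TA not retracted.
    """
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# 280 of 1,000,000 OA papers retracted (0.028%);
# 170 of 1,000,000 TA papers retracted (0.017%).
or_, (lo, hi) = odds_ratio(280, 1_000_000 - 280, 170, 1_000_000 - 170)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # OR ≈ 1.65
```

With these toy counts the OR lands near the study's 1.62; the study's narrower CI reflects its own (much different) sample sizes.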

Equity concerns persist over open-access publishing | Nature Index

“An analysis of more than 182,000 scholars in the United States has found that the researchers who publish in OA journals with APCs – which can cost several thousand dollars – are more likely to be male, at an advanced career stage, have access to federal funding, and/or be employed by prestigious universities.”

Open access journal publishing in the business disciplines: A closer look at the low uptake and discipline-specific considerations – Mikael Laakso, Bo-Christer Björk, 2021

Abstract:  The Internet has enabled efficient electronic publishing of scholarly journals and Open Access business models. Recent studies have shown that adoption of Open Access journals has been uneven across scholarly disciplines, with the business and economics disciplines in particular seeming to lag behind all other fields of research. Through bibliometric analysis of journals indexed in Scopus, we find the share of articles in Open Access journals in business, management, and accounting to be only 6%. We further studied the Open Access availability of articles published during 2014–2019 in journals included in the Financial Times 50 journal list (19,969 articles in total). None of the journals is fully Open Access, but 8% of the articles are individually open, and for a further 35% earlier manuscript versions are openly available on the web. The results suggest that the low adoption rate of Open Access journals in the business fields is a side-effect of evaluation practices emphasizing publishing in journals included in particular ranking lists, creating disincentives for business-model innovation and barriers to entry for new journals. Currently, most business school research has to be made Open Access through ways other than full Open Access journals, and libraries play an important role in facilitating this in a sustainable way.

How is science clicked on Twitter? Click metrics for Bitly short links to scientific publications – Fang – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  To provide some context for the potential engagement behavior of Twitter users around science, this article investigates how Bitly short links to scientific publications embedded in scholarly Twitter mentions are clicked on Twitter. Based on the click metrics of over 1.1 million Bitly short links referring to Web of Science (WoS) publications, our results show that around 49.5% of them were not clicked by Twitter users. For those Bitly short links with clicks from Twitter, the majority of their Twitter clicks accumulated within a short period after they were first tweeted. Bitly short links to publications in the Social Sciences and Humanities tend to attract more clicks from Twitter than those in other subject fields. This article also assesses the extent to which Twitter clicks are correlated with other impact indicators. Twitter clicks are weakly correlated with scholarly impact indicators (WoS citations and Mendeley readers), but moderately correlated with other Twitter engagement indicators (total retweets and total likes). In light of these results, we highlight the importance of paying more attention to the click metrics of URLs in scholarly Twitter mentions, to improve our understanding of the more effective dissemination and reception of science information on Twitter.