All the Research That’s Fit to Print: Open Access and the News Media

Abstract:  The goal of the open access (OA) movement is to make scholarly research accessible to everyone, not just those who can afford it. However, most studies of whether OA has met this goal have focused on whether other scholars are making use of OA research. Few have considered how the broader public, including the news media, uses OA research. This study asked whether the news media mentions OA articles more or less often than paywalled articles, examining articles published from 2010 through 2018 in journals across all four quartiles of the Journal Impact Factor, using data obtained from Altmetric.com and the Web of Science. Gold, green, and hybrid OA status all correlated positively with the number of news mentions an article received. News mentions for OA articles did dip in 2018, although they remained higher than those for paywalled articles.

 

 

Is rapid scientific publication also high quality? Bibliometric analysis of highly disseminated COVID-19 research papers – Khatter – Learned Publishing

Abstract:  The impact of COVID-19 has underlined the need for reliable information to guide clinical practice and policy. This urgency has to be balanced against disruption to journal handling capacity and the continued need to ensure scientific rigour. We conducted a bibliometric analysis of reporting quality and risk of bias (RoB) among the 250 top-scoring COVID-19 research papers by Altmetric Attention Score (AAS) published between January and April 2020. Method-specific RoB tools were used to assess quality. After exclusions, 84 studies from 44 journals were included. Forty-three (51%) were case series/studies, and only one was a randomized controlled trial. Most authors were from institutions based in China (n = 44, 52%). The median AAS and impact factor were 2015 (interquartile range [IQR] 1,105–4,051.5) and 12.8 (IQR 5–44.2), respectively. Nine studies (11%) utilized a formal reporting framework, 62 (74%) included a funding statement, and 41 (49%) were at high RoB. This review of the most widely disseminated COVID-19 studies highlights a preponderance of low-quality case series, with few research papers adhering to good standards of reporting. It emphasizes the need for cautious interpretation of research and the increasingly vital responsibility that journals have in ensuring high-quality publications.

 

Social media platforms: a primer for researchers

Abstract:  Social media platforms play an increasingly important role in research, education, and clinical practice. As an inseparable part of open science, these platforms may increase the visibility of research outputs and facilitate scholarly networking. Editors who ethically moderate Twitter, Facebook, and other popular social media accounts for their journals can engage influential authors in post-publication communication and expand the societal impact of their publications. Several social media aggregators track and generate alternative metrics, which researchers can use to identify trending articles in their fields. More and more publishers showcase their achievements by displaying such metrics alongside traditional citations. The Scopus database also tracks both kinds of metrics to offer comprehensive coverage of indexed articles’ impact.

Understanding the advantages and limitations of various social media channels is essential for actively contributing to the post-publication communication, particularly in research-intensive fields such as rheumatology.

 

Can altmetric mentions predict later citations? A test of validity on data from ResearchGate and three social media platforms

Abstract:  Purpose

The main purpose of this study is to explore and validate whether altmetric mentions can predict citations to scholarly articles. The paper examines the nature and degree of correlation between altmetrics (from ResearchGate and three social media platforms) and citations.

Design/methodology/approach

A large sample of scholarly articles published from India in 2016 is obtained from the Web of Science database, and the corresponding altmetric data are obtained from ResearchGate and three social media platforms (Twitter, Facebook and blogs, via the Altmetric.com aggregator). Correlations are computed between early altmetric mentions and later citation counts, with the data grouped by discipline.

Findings

Results show that the correlation between altmetric mentions and citation counts is positive but weak. Correlations are relatively higher for data from ResearchGate than for data from the three social media platforms. Further, significant disciplinary differences are observed in the degree of correlation between altmetrics and citations.

Research limitations/implications

The results support the idea that altmetrics do not necessarily reflect the same kind of impact as citations. However, articles that attract higher altmetric attention early may actually have a slight citation advantage. Further, altmetrics from academic social networks like ResearchGate are more strongly correlated with citations than altmetrics from social media platforms.

Originality/value

The paper is novel in two respects. First, it takes altmetric data for a window of about 1–1.5 years after an article's publication and citation counts for a longer window of about 3–4 years after publication. Second, it is one of the first studies to analyse data from ResearchGate, a popular academic social network, to understand the type and degree of these correlations.
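The analysis the study describes, correlating early altmetric mentions with later citation counts per discipline, can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' code; the column names and toy data are invented for the example. Spearman's rank correlation is a natural choice for skewed count data of this kind.

```python
# Sketch: rank correlation between early altmetric mentions and later
# citation counts, grouped by discipline (toy data, assumed column names).
import pandas as pd

records = pd.DataFrame({
    "discipline": ["Physics"] * 5 + ["Medicine"] * 5,
    "mentions":   [0, 2, 5, 1, 8,  3, 0, 7, 2, 10],   # mentions ~1-1.5 yrs out
    "citations":  [1, 4, 9, 2, 15, 5, 1, 12, 3, 20],  # citations ~3-4 yrs out
})

# Spearman's rho per disciplinary group (rank-based, robust to outliers)
for name, group in records.groupby("discipline"):
    rho = group["mentions"].corr(group["citations"], method="spearman")
    print(f"{name}: rho={rho:.2f}")
```

With real data the per-discipline coefficients would differ, which is exactly the disciplinary variation the study reports.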

Filtering Academic Content by Re-use – Citation Counts and Altmetric Scores

“The demands on researchers to make all of the products of their research openly available continue to grow. As a result, the balance between the carrots and sticks for incentivising open research continues to be investigated.

The State of Open Data report(1) identifies a perceived lack of credit for sharing data for over 50% of those surveyed. The same respondents identified ‘Full Data Citation’ as the biggest motivating factor to publish data….”

Manipulation of bibliometric data by editors of scientific journals

“Such misuse of terms not only justifies the erroneous practice of research bureaucracy of evaluating research performance on those terms but also encourages editors of scientific journals and reviewers of research papers to ‘game’ the bibliometric indicators. For instance, if a journal seems to lack an adequate number of citations, the editor of that journal might decide to make it obligatory for its authors to cite papers from the journal in question. I know an Indian journal of fairly reasonable quality in terms of several other criteria but can no longer consider it so because it forces authors to include unnecessary (that is, plainly false) citations to papers in that journal. Any further assessment of this journal that includes self-citations will lead to a distorted measure of its real status….

An average paper in the natural or applied sciences lists at least 10 references.1 Some enterprising editors have taken this number to be the minimum for papers submitted to their journals. Such a norm is enforced in many journals from Belarus, and we, authors, are now so used to that norm that we do not even realize the distortions it creates in bibliometric data. Indeed, I often notice that some authors – merely to meet the norm of at least 10 references – cite very old textbooks and Internet resources with URLs that are no longer valid. The average for a good paper may be more than 10 references, and a paper with fewer than 10 references may yet be a good paper (The first paper by Einstein did not have even one reference in its original version!). I believe that it is up to a peer reviewer to judge whether the author has given enough references and whether they are suitable, and it is not for a journal’s editor to set any mandatory quota for the number of references….

Some international journals intervene arbitrarily to revise the citations in articles they receive: I submitted a paper with my colleagues to an American journal in 2017, and one of the reviewers demanded that we replace references in Russian with references in English. Two of us responded with a correspondence note titled ‘Don’t dismiss non-English citations’, which we submitted to Nature: in publishing that note, the editors of Nature removed some references – from the very paper2 that condemned the practice of replacing an author’s references with those more to the editor’s liking – and replaced them with a, perhaps more relevant, reference to a paper that we had not even read at that point! …

Editors of many international journals are now looking not for quality papers but for papers that will not lower the impact factor of their journals….”

The open access advantage for studies of human electrophysiology: Impact on citations and Altmetrics

“Highlights

• Barriers to accessing science contribute to knowledge inequalities.

• 35% of articles published in the last 20 years in electrophysiology are open access.

• Open access articles received 9–21% more citations and 39% more Altmetric mentions.

• Green open access (author archived) enjoyed similar benefit as Gold open access.

• Studies of human electrophysiology enjoy the “open access advantage” in citations….”

 

Altmetrics: Part 2 – Celebrating Altmetric’s Decade – An ATG Original – Charleston Hub

“Altmetric, the company, has been in existence for ten years now. The company has grown, and to get the view from company officials themselves, we submitted questions and various company officials responded to give us an inside look at Altmetric today – and what we might expect in the future….”

Factors associated with high Altmetric Attention Score in dermatology research – Iglesias-Puzas – Australasian Journal of Dermatology

Abstract:  Background

Alternative metrics are emerging scores to assess the impact of research beyond the academic environment.

Objective

To analyse whether a correlation exists between manuscript characteristics and alternative citation metrics.

Materials and methods

This bibliometric analysis included original articles published in the five journals with the highest impact factors during 2019.

We extracted the following characteristics from each record: journal, publication month, title, number of authors, type of institution, type of publication, research topic, number of references, financial support, free/open access status and literature citations. The main outcome measure was the identification of variables associated with higher social attention (an Altmetric Attention Score ≥ 25) using binary logistic regression. Model performance was assessed by the change in the area under the curve (AUC).

Results

A total of 840 manuscripts were included. The Altmetric scores across all five journals ranged from 0 to 465 (mean 12.51 ± 33.7; median 3). The most prevalent topic was skin cancer, and the most common study design was clinical science. The scientific journal (P < 0.001), the presence of conflicts of interest (OR 2.2 [95%CI 1.3–3.7]; P = 0.002) and open access status (OR 3.2 [95%CI 1.6–6.7]; P = 0.002) were found to be independent predictors of high Altmetric scores.

Conclusions

Our study suggests that an article’s social recognition may depend on certain manuscript characteristics, thus providing useful information on the dissemination of dermatology research to the general public.
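The modelling step this abstract describes, a binary logistic regression predicting high attention (Altmetric Attention Score ≥ 25) from manuscript features, evaluated by AUC, can be sketched as below. This is not the study's code: the features, coefficients, and data are synthetic, chosen only to loosely mirror the reported odds ratios for open access (~3.2) and conflicts of interest (~2.2).

```python
# Sketch: logistic regression for "high Altmetric attention" with AUC
# (synthetic data; feature names are illustrative assumptions).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
open_access = rng.integers(0, 2, n)   # 1 = open access article
conflicts   = rng.integers(0, 2, n)   # 1 = conflict of interest declared
n_authors   = rng.integers(1, 15, n)  # author count (no effect injected)

# Synthetic outcome: OA and conflicts raise the log-odds of high attention.
logit = -2.0 + np.log(3.2) * open_access + np.log(2.2) * conflicts
high_aas = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([open_access, conflicts, n_authors])
model = LogisticRegression().fit(X, high_aas)
auc = roc_auc_score(high_aas, model.predict_proba(X)[:, 1])
print(f"in-sample AUC = {auc:.2f}")
```

In the actual study the AUC would be computed on real manuscript records, and the fitted odds ratios (via `np.exp(model.coef_)`) are what the abstract reports as independent predictors.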

A Farewell to ALM, but not to article-level metrics! – The Official PLOS Blog

“In fact, the altmetrics movement has been so successful that it has spawned a market of providers who specialize in collecting and curating metrics and metadata about how research outputs are used and discussed. 

One of these services, in particular, has far outpaced the reach and capabilities of ALM, and PLOS is now excited to pass the baton of our altmetrics operations to the experts at Altmetric….”