Is rapid scientific publication also high quality? Bibliometric analysis of highly disseminated COVID-19 research papers – Khatter – Learned Publishing – Wiley Online Library

“Key points

 

An examination of highly visible COVID-19 research articles reveals that 55% could be considered at risk of bias.
Only 11% of the evaluated early studies on COVID-19 adhered to good standards of reporting such as PRISMA or CONSORT.
There was no correlation between quality of reporting and either the journal Impact Factor or the article Altmetric Attention Score in early studies on COVID-19.
Most highly visible early articles on COVID-19 were published in the Lancet and Journal of the American Medical Association.”

Altmetric Score Has a Stronger Relationship With Article Citations Than Journal Impact Factor and Open Access Status: A Cross-Sectional Analysis of 4,022 Sports Science Articles | Journal of Orthopaedic & Sports Physical Therapy

Abstract:  Objective

To assess the relationship of individual article citations in the Sport Sciences field to (i) journal impact factor; (ii) each article’s open access status; and (iii) Altmetric score components.

 

Design

Cross-sectional.

 

Methods

We searched the ISI Web of Knowledge InCites Journal Citation Reports database "Sport Sciences" category for the 20 journals with the highest 2-year impact factor in 2018. We extracted the impact factor for each journal and each article's open access status (yes or no). Between September 2019 and February 2020, we obtained individual citations, Altmetric scores and details of Altmetric components (e.g. number of tweets, Facebook posts, etc.) for each article published in 2017. Linear and multiple regression models were used to assess the relationship between the dependent variable (citation count) and the independent variables (article Altmetric score, open access status, and journal impact factor).
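
A minimal sketch of this kind of analysis, assuming a pandas DataFrame with hypothetical column names (citations, altmetric_score, impact_factor, open_access); the study's actual dataset and variable coding are not shown here:

    # Illustrative only: file name and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("sport_sciences_articles.csv")

    # Simple regressions: variance in citations explained by each predictor alone
    for predictor in ["altmetric_score", "impact_factor", "open_access"]:
        fit = smf.ols(f"citations ~ {predictor}", data=df).fit()
        print(f"{predictor}: R^2 = {fit.rsquared:.2f}")

    # Multiple regression: all three predictors combined
    combined = smf.ols("citations ~ altmetric_score + impact_factor + open_access",
                       data=df).fit()
    print(f"combined: R^2 = {combined.rsquared:.2f}")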

 

Results

4,022 articles were included. Total Altmetric score, journal impact factor, and open access status explained 32%, 14%, and 1% of the variance in article citations, respectively (when combined, the variables explained 40% of the variance in article citations). The number of tweets related to an article was the Altmetric component that explained the highest proportion of the variance in article citations (37%).

 

Conclusion

Altmetric scores in Sports Sciences journals have a stronger relationship with number of citations than does journal impact factor or open access status. Twitter may be the best social media platform to promote a research article as it has a strong relationship with article citations.

Meet the new Faculty Opinions Score – Faculty Opinions Blog

“Traditional citation metrics, such as the journal impact factor, can frequently act as biased measurements of research quality and contribute to the broken system of research evaluation. Academic institutions and funding bodies are increasingly moving away from relying on citation metrics towards greater use of transparent expert opinion. 

Faculty Opinions has championed this cause for two decades, with over 230k recommendations made by our 8000+ Faculty Members, and we are excited to introduce the next step in our evolution – a formidable mark of research quality – the new Faculty Opinions Score….

The Faculty Opinions Score assigns a numerical value to research publications in Biology and Medicine, quantifying their impact and quality compared to other publications in their field.

The Faculty Opinions Score is derived by combining our unique star-rated Recommendations on individual publications, made by world-class experts, with bibliometrics to produce a radically new metric in the research evaluation landscape.
Key properties of the Faculty Opinions Score: 

A score of zero is assigned to articles with no citations and no recommendations. 
The average score of a set of recommended articles has an expected value of 10. However, articles with many recommendations or highly cited articles may have a much higher score. There is no upper bound. 
Non-recommended articles generally score lower than recommended articles. 
Recommendations contribute more to the score than bibliometric performance. In other words, expert recommendations increase an article’s score (much) more than citations do….”
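
The actual scoring formula is proprietary and not given in the excerpt, but the stated properties can be illustrated with a deliberately simplified toy model; the function name and weights below are invented for illustration and are not the real Faculty Opinions computation:

    # Toy illustration ONLY; not the real, proprietary Faculty Opinions formula.
    # It mimics the stated properties: zero with no citations and no
    # recommendations, expert recommendations outweighing citations, and no
    # upper bound on the score.
    def toy_score(star_ratings, citations, rec_weight=3.0, cite_weight=0.1):
        """star_ratings: expert star ratings (e.g. 1-3 stars); citations: count."""
        if not star_ratings and citations == 0:
            return 0.0  # no citations and no recommendations -> score of zero
        return rec_weight * sum(star_ratings) + cite_weight * citations

    print(toy_score([3, 2], citations=40))  # recommended and cited: high score
    print(toy_score([], citations=40))      # cited but not recommended: lower
    print(toy_score([], citations=0))       # 0.0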

New metric ‘leverages opinions of 8,000 experts’ | Research Information

“Faculty Opinions has introduced a new metric in the research evaluation landscape, leveraging the opinions of more than 8,000 experts. 

The Faculty Opinions Score is designed to be an early indicator of an article’s future impact and a mark of research quality. The company describes the implications for researchers, academic institutions and funding bodies as ‘promising’….”

Social media attention and citations of published outputs from re-use of clinical trial data: a matched comparison with articles published in the same journals | BMC Medical Research Methodology | Full Text

Abstract:  Background

Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.

Methods

Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers on 3 repositories (CSDR, YODA project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mention of policy sources, media attention) and the total number of citations were compared between these two groups.
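
A hedged sketch of the primary comparison, assuming a paired nonparametric test such as the Wilcoxon signed-rank test (a reasonable choice for the matched design, though the abstract does not name the exact test) and hypothetical file and column names:

    # Illustrative only: file and column names (aas_reuse, aas_control) are
    # hypothetical; one matched case-control pair per row is assumed.
    import pandas as pd
    from scipy.stats import wilcoxon

    pairs = pd.read_csv("reuse_control_pairs.csv")
    stat, p = wilcoxon(pairs["aas_reuse"], pairs["aas_control"])
    print(f"median re-use AAS = {pairs['aas_reuse'].median():.1f}, "
          f"control AAS = {pairs['aas_control'].median():.1f}, p = {p:.2f}")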

Results

89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistical difference was found on any of the components of the Altmetric Attention Score. The median (interquartile range) numbers of citations were 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.

Conclusions

Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, the matching choices have some limitations, so results should be interpreted very cautiously. Also, citations of re-uses by policy sources were rare.

All the Research That’s Fit to Print: Open Access and the News Media

Abstract:  The goal of the open access (OA) movement is to help everyone access scholarly research, not just those who can afford to. However, most studies looking at whether OA has met this goal have focused on whether other scholars are making use of OA research. Few have considered how the broader public, including the news media, uses OA research. This study sought to answer whether the news media mentions OA articles more or less than paywalled articles by looking at articles published from 2010 through 2018 in journals across all four quartiles of the Journal Impact Factor, using data obtained through Altmetric.com and the Web of Science. Gold, green and hybrid OA articles all had a positive correlation with the number of news mentions received. News mentions for OA articles did see a dip in 2018, although they remained higher than those for paywalled articles.

Is rapid scientific publication also high quality? Bibliometric analysis of highly disseminated COVID-19 research papers – Khatter – Learned Publishing – Wiley Online Library

Abstract:  The impact of COVID-19 has underlined the need for reliable information to guide clinical practice and policy. This urgency has to be balanced against disruption to journal handling capacity and the continued need to ensure scientific rigour. We examined the reporting quality of highly disseminated COVID-19 research papers in a bibliometric analysis of reporting quality and risk of bias (RoB) amongst the 250 top-scoring Altmetric Attention Score (AAS) COVID-19 research papers published between January and April 2020. Method-specific RoB tools were used to assess quality. After exclusions, 84 studies from 44 journals were included. Forty-three (51%) were case series/studies, and only one was a randomized controlled trial. Most authors were from institutions based in China (n = 44, 52%). The median AAS and impact factor were 2015 (interquartile range [IQR] 1,105–4,051.5) and 12.8 (IQR 5–44.2), respectively. Nine studies (11%) utilized a formal reporting framework, 62 (74%) included a funding statement, and 41 (49%) were at high RoB. This review of the most widely disseminated COVID-19 studies highlights a preponderance of low-quality case series with few research papers adhering to good standards of reporting. It emphasizes the need for cautious interpretation of research and the increasingly vital responsibility that journals have in ensuring high-quality publications.

 

Social media platforms: a primer for researchers

Abstract:  Social media platforms play an increasingly important role in research, education, and clinical practice. As an inseparable part of open science, these platforms may increase the visibility of research outputs and facilitate scholarly networking. Editors who ethically moderate Twitter, Facebook, and other popular social media accounts for their journals may engage influential authors in post-publication communication and expand the societal implications of their publications. Several social media aggregators track and generate alternative metrics, which researchers can use to visualize trending articles in their fields. More and more publishers showcase their achievements by displaying such metrics alongside traditional citations. The Scopus database also tracks both metrics to offer comprehensive coverage of the indexed articles' impact.

Understanding the advantages and limitations of various social media channels is essential for actively contributing to the post-publication communication, particularly in research-intensive fields such as rheumatology.

 

Can altmetric mentions predict later citations? A test of validity on data from ResearchGate and three social media platforms | Emerald Insight

Abstract:  Purpose

The main purpose of this study is to explore and validate the question “whether altmetric mentions can predict citations to scholarly articles”. The paper attempts to explore the nature and degree of correlation between altmetrics (from ResearchGate and three social media platforms) and citations.

Design/methodology/approach

A large data sample of scholarly articles published from India in 2016 is obtained from the Web of Science database, and the corresponding altmetric data are obtained from ResearchGate and three social media platforms (Twitter, Facebook, and blogs, via the Altmetric.com aggregator). Correlations are computed between early altmetric mentions and later citation counts, for data grouped into different disciplinary groups.
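
A minimal sketch of the per-discipline correlation step, assuming Spearman rank correlations (a common choice for skewed altmetric counts; the abstract does not specify the coefficient) and hypothetical file and column names:

    # Illustrative only: file and column names are hypothetical.
    import pandas as pd
    from scipy.stats import spearmanr

    df = pd.read_csv("india_2016_articles.csv")
    for discipline, group in df.groupby("discipline"):
        rho, p = spearmanr(group["early_mentions"], group["citations_later"])
        print(f"{discipline}: rho = {rho:.2f} (p = {p:.3f}, n = {len(group)})")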

Findings

Results show that the correlation between altmetric mentions and citation counts is positive, but weak. Correlations are relatively higher for data from ResearchGate than for data from the three social media platforms. Further, significant disciplinary differences are observed in the degree of correlation between altmetrics and citations.

Research limitations/implications

The results support the idea that altmetrics do not necessarily reflect the same kind of impact as citations. However, articles that get higher altmetric attention early may actually have a slight citation advantage. Further, altmetrics from academic social networks like ResearchGate are more strongly correlated with citations than those from social media platforms.

Originality/value

The paper has novelty in two respects. First, it takes altmetric data for a window of about 1–1.5 years after article publication and citation counts for a longer citation window of about 3–4 years after publication. Second, it is one of the first studies to analyze data from the ResearchGate platform, a popular academic social network, to understand the type and degree of these correlations.

Filtering Academic Content by Re-use – Citation Counts and Altmetric Scores

“The demands on researchers to make all of the products of their research openly available continue to grow. As a result, the balance between the carrots and sticks for incentivising open research continues to be investigated.

The State of Open Data report(1) identifies a perceived lack of credit for sharing data among over 50% of those surveyed. The same respondents identified ‘Full Data Citation’ as the biggest motivating factor to publish data….”

Manipulation of bibliometric data by editors of scientific journals

“Such misuse of terms not only justifies the erroneous practice of research bureaucracy of evaluating research performance on those terms but also encourages editors of scientific journals and reviewers of research papers to ‘game’ the bibliometric indicators. For instance, if a journal seems to lack an adequate number of citations, the editor of that journal might decide to make it obligatory for its authors to cite papers from the journal in question. I know an Indian journal of fairly reasonable quality in terms of several other criteria but can no longer consider it so because it forces authors to include unnecessary (that is, plain false) citations to papers in that journal. Any further assessment of this journal that includes self-citations will lead to a distorted measure of its real status….

An average paper in the natural or applied sciences lists at least 10 references.1 Some enterprising editors have taken this number to be the minimum for papers submitted to their journals. Such a norm is enforced in many journals from Belarus, and we, authors, are now so used to that norm that we do not even realize the distortions it creates in bibliometric data. Indeed, I often notice that some authors – merely to meet the norm of at least 10 references – cite very old textbooks and Internet resources with URLs that are no longer valid. The average for a good paper may be more than 10 references, and a paper with fewer than 10 references may yet be a good paper (The first paper by Einstein did not have even one reference in its original version!). I believe that it is up to a peer reviewer to judge whether the author has given enough references and whether they are suitable, and it is not for a journal’s editor to set any mandatory quota for the number of references….

Some international journals intervene arbitrarily to revise the citations in articles they receive: I submitted a paper with my colleagues to an American journal in 2017, and one of the reviewers demanded that we replace references in Russian with references in English. Two of us responded with a correspondence note titled ‘Don’t dismiss non-English citations’ that we then submitted to Nature: in publishing that note, the editors of Nature removed some references – from the paper2 that condemned the practice of replacing an author’s references with those more to the editor’s liking – and replaced them with a, perhaps more relevant, reference to a paper that we had never even read at that point! …

Editors of many international journals are now looking not for quality papers but for papers that will not lower the impact factor of their journals….”

The open access advantage for studies of human electrophysiology: Impact on citations and Altmetrics – ScienceDirect

“Highlights

• Barriers to accessing science contribute to knowledge inequalities.

• 35% of articles published in the last 20 years in electrophysiology are open access.

• Open access articles received 9–21% more citations and 39% more Altmetric mentions.

• Green open access (author archived) enjoyed similar benefit as Gold open access.

• Studies of human electrophysiology enjoy the “open access advantage” in citations….”

 

Altmetrics: Part 2 – Celebrating Altmetric’s Decade – An ATG Original – Charleston Hub

“Altmetric, the company, has been in existence for ten years now. The company has grown, and to get the view from company officials themselves, we submitted questions and various company officials responded to give us an inside look at Altmetric today – and what we might expect in the future….”

Factors associated with high Altmetric Attention Score in dermatology research – Iglesias-Puzas – Australasian Journal of Dermatology – Wiley Online Library

Abstract:  Background

Alternative metrics are emerging scores to assess the impact of research beyond the academic environment.

Objective

To analyse whether a correlation exists between manuscript characteristics and alternative citation metrics.

Materials and methods

This bibliometric analysis included original articles published in the five journals with the highest impact factors during 2019.

We extracted the following characteristics from each record: journal, publication month, title, number of authors, type of institution, type of publication, research topic, number of references, financial support, free/open access status and literature citations. The main measure was the identification of variables associated with higher social attention (defined as an Altmetric Attention Score ≥ 25) using binary logistic regression. Model performance was assessed by the change in the area under the curve (AUC).
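
A minimal sketch of this modelling step, assuming scikit-learn and hypothetical file and column names (the study's actual variables and coding are only partially described above):

    # Illustrative only: file, feature, and column names are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("dermatology_articles.csv")
    y = (df["altmetric_score"] >= 25).astype(int)  # high social attention
    X = pd.get_dummies(
        df[["journal", "open_access", "conflicts_of_interest", "n_references"]],
        drop_first=True,  # one-hot encode categorical manuscript characteristics
    )

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"AUC = {roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]):.2f}")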

Results

A total of 840 manuscripts were included. The Altmetric scores across all five journals ranged from 0 to 465 (mean 12.51 ± 33.7; median 3). The most prevalent topic was skin cancer, and the most common study design was clinical science. The scientific journal (P < 0.001), the presence of conflicts of interest (OR 2.2 [95% CI 1.3–3.7]; P = 0.002) and open access status (OR 3.2 [95% CI 1.6–6.7]; P = 0.002) were identified as independent predictors of high Altmetric scores.

Conclusions

Our study suggests an article’s social recognition may depend on certain manuscript characteristics, thus providing useful information on the dissemination of dermatology research to the general public.

A Farewell to ALM, but not to article-level metrics! – The Official PLOS Blog

“In fact, the altmetrics movement has been so successful that it has spawned a market of providers who specialize in collecting and curating metrics and metadata about how research outputs are used and discussed. 

One of these services, in particular, has far outpaced the reach and capabilities of ALM, and PLOS is now excited to pass the baton of our altmetrics operations to the experts at Altmetric….”