Open access science leads to more citations – The Science Show – ABC Radio National

Abstract:  The traditional method of releasing scientific results, still widely practiced, is to have a paper published in a peer-reviewed journal, one usually accessible only by subscription. But that is changing. Some results can now be seen by all. And it goes further: some scientists release their results step by step and welcome feedback as experiments are underway. This is open access science. Kiera McNeice, Research Data Manager at Cambridge University Press, says the publisher is pushing for more open access research while maintaining high standards of peer review. She says open access leads to more citations, which for many scientists is a key measure of their work.

 

Citation Advantage? | Clarke & Esposito

“You might think that after 20 years of research and more than 130 studies on the subject, we’d have a clear picture of the effect that open access publishing has on an article’s citation performance. Unfortunately, decades of poor studies and a mystifying unwillingness to perform experimental controls for confounding factors continue to muddy the waters around the alleged open access citation advantage (OACA).

 
In a new paper published in PLOS ONE, authors from the University of Minnesota Libraries attempted to perform a meta-analysis of the 134 studies they could locate on the subject. But to be valid, a meta-analysis must look at comparable experiments, and because the OACA studies were so heterogeneous, this proved impossible. Definitions of “open access,” fields of study, time periods studied, etc. were all over the place, negating any possible conclusions that could be drawn….”

Wikipedia citations in Wikidata – Diff

From Google’s English:  “The Wikipedia Citations dataset currently includes approximately 30 million citations from Wikipedia pages to a variety of sources, including 4 million scientific publications. Strengthening the connections with external data services and providing structured data for one of the key elements of Wikipedia articles has two significant advantages: first, better identification of relevant encyclopedic articles related to academic studies; second, the strengthening of Wikipedia as a social authority and political hub, which would allow policy makers to gauge the importance of an article, a person, a research group or an institution by looking at how many Wikipedia articles cite them.

These are the motivations behind the “Wikipedia Citations in Wikidata” project, supported by a grant from the WikiCite Initiative. From January 2021 until the end of April, the team of Silvio Peroni (co-founder and director of OpenCitations), Giovanni Colavizza, Marilena Daquino, Gabriele Pisciotta and Simone Persiani of the University of Bologna (Department of Classical and Italian Philology) worked on the development of a codebase to enrich Wikidata with citations to academic publications that are currently referenced in the English Wikipedia. This codebase is divided into four Python software modules and integrates new components (a classifier to distinguish citations based on the cited source, and a search module to equip citations with identifiers from Crossref or other APIs). In doing so, Wikipedia Citations extends previous work that focused only on citations that already have identifiers….”
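
The identifier lookup performed by the search module can be approximated with a single call to Crossref's public REST API. Below is a minimal sketch, not the project's actual code; the `match_citation` helper is a name invented here for illustration.

```python
import requests
from typing import Optional

CROSSREF_API = "https://api.crossref.org/works"

def match_citation(raw_citation: str) -> Optional[str]:
    """Return the best-match DOI for a free-text citation, or None.

    A minimal sketch of identifier lookup via Crossref's bibliographic
    search; the project's real search module adds scoring, fallbacks to
    other APIs, and fuller error handling.
    """
    params = {"query.bibliographic": raw_citation, "rows": 1}
    response = requests.get(CROSSREF_API, params=params, timeout=10)
    response.raise_for_status()
    items = response.json()["message"]["items"]
    return items[0]["DOI"] if items else None

# Example: a reference string as it might appear in a Wikipedia article.
print(match_citation("Peroni, Shotton. OpenCitations, an infrastructure "
                     "organization for open scholarship. 2020"))
```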

Did You Ask for Citations? An Insight into Preprint Citations en route to Open Science

Abstract:  This study investigates citation patterns between 2017 and 2020 for preprints published in three preprint servers: one specializing in biology (bioRxiv), one in chemistry (ChemRxiv), and one hosting preprints in all disciplines (Research Square). Showing evidence that preprints are now regularly cited in peer-reviewed journal articles, books, and conference papers, the outcomes of this investigation further substantiate the value of open science, also in relation to the citation-based metrics on which the evaluation of scholarship continues to rely. This analysis will be useful to inform new research-based education in today’s scholarly communication.

 

The “Sci-Hub effect” can almost double the citations of research articles, study suggests

“Scientific articles that get downloaded from the scholarly piracy website Sci-Hub tend to receive more citations, according to a new study published in Scientometrics. The number of times an article was downloaded from Sci-Hub also turned out to be a robust predictor of future citations….”

Altmetric Score Has a Stronger Relationship With Article Citations Than Journal Impact Factor and Open Access Status: A Cross-Sectional Analysis of 4,022 Sports Science Articles | Journal of Orthopaedic & Sports Physical Therapy

Abstract:  Objective

To assess the relationship of individual article citations in the Sport Sciences field to (i) journal impact factor; (ii) each article’s open access status; and (iii) Altmetric score components.

 

Design

Cross-sectional.

 

Methods

We searched the ISI Web of Knowledge InCites Journal Citation Reports database “Sport Sciences” category for the 20 journals with the highest 2-year impact factor in 2018. We extracted the impact factor for each journal and each article’s open access status (yes or no). Between September 2019 and February 2020, we obtained individual citations, Altmetric scores, and details of Altmetric components (e.g. number of tweets and Facebook posts) for each article published in 2017. Linear and multiple regression models were used to assess the relationship between the dependent variable (citation number) and the independent variables: article Altmetric score, open access status, and journal impact factor.
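
In outline, this modelling step can be reproduced with ordinary least squares. The sketch below uses statsmodels on a small hypothetical data frame standing in for the extracted articles; the column names and values are invented.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical stand-in for the extracted dataset: one row per article.
df = pd.DataFrame({
    "citations":     [12, 3, 45, 7, 0, 22],
    "altmetric":     [80, 5, 150, 12, 1, 60],
    "open_access":   [1, 0, 1, 0, 0, 1],    # 1 = open access, 0 = not
    "impact_factor": [6.2, 2.1, 6.2, 3.4, 2.1, 4.8],
})

# Simple (single-predictor) models: R^2 is the variance explained.
for predictor in ["altmetric", "impact_factor", "open_access"]:
    r2 = smf.ols(f"citations ~ {predictor}", data=df).fit().rsquared
    print(f"{predictor}: R^2 = {r2:.2f}")

# Multiple regression combining all three independent variables.
combined = smf.ols("citations ~ altmetric + open_access + impact_factor",
                   data=df).fit()
print(f"combined: R^2 = {combined.rsquared:.2f}")
```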

 

Results

4,022 articles were included. Total Altmetric score, journal impact factor, and open access status explained 32%, 14%, and 1% of the variance in article citations, respectively (when combined, the three variables explained 40% of the variance in article citations). The number of tweets related to an article was the Altmetric component that explained the highest proportion of the variance in article citations (37%).

 

Conclusion

Altmetric scores in Sports Sciences journals have a stronger relationship with the number of citations than does journal impact factor or open access status. Twitter may be the best social media platform for promoting a research article, as the number of tweets has a strong relationship with article citations.

An analysis of scientometric data and publication policies of rheumatology journals | SpringerLink

“We show that OA publication does not affect citations or scientometric indexes of rheumatology journals….When choosing a rheumatology journal to publish OA, rheumatologists should consider individual OA citation patterns and APC charges together.”

Disturbance of greedy publishing to academia

Questionable publications have been criticized for their greedy behaviour, yet their influence on academia has not been investigated quantitatively. Here, we probe the impact of questionable publications through a systematic and comprehensive analysis of the various participants in academia, compared with their most similar unquestioned counterparts, using billions of citation records: the brokers, e.g. journals and publishers, and the prosumers, e.g. authors. Our analysis reveals that questionable publishers inflate their citation scores through publisher-level self-citations to their journals, while controlling journal-level self-citations to evade the evaluation criteria of journal indexing services; this makes them hard to detect with conventional journal-level metrics, but our novel metric can capture the pattern. We also show that both novelty and influence are lower for questionable publications than for their counterparts, implying a negative effect of questionable publications on the academic ecosystem. These findings provide a valuable basis for future policy-making.
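
The journal-level versus publisher-level distinction can be made concrete with a small calculation. The sketch below is not the authors' metric; it assumes a hypothetical table of citation records with invented journal and publisher columns, and simply contrasts the two self-citation rates the abstract describes.

```python
import pandas as pd

# Hypothetical citation records: one row per citation (schema invented).
cites = pd.DataFrame({
    "citing_journal":   ["J1", "J2", "J2", "J3", "J1", "J2"],
    "cited_journal":    ["J1", "J1", "J3", "J1", "J2", "J2"],
    "citing_publisher": ["P1", "P1", "P1", "P2", "P1", "P1"],
    "cited_publisher":  ["P1", "P1", "P2", "P1", "P1", "P1"],
})

# Journal-level self-citation rate: a journal citing itself.
journal_self = (cites["citing_journal"] == cites["cited_journal"]).mean()

# Publisher-level self-citation rate: citations that stay inside one
# publisher without being journal self-citations. A low journal-level
# rate paired with a high publisher-level rate is the pattern the paper
# associates with questionable publishers.
publisher_self = ((cites["citing_publisher"] == cites["cited_publisher"])
                  & (cites["citing_journal"] != cites["cited_journal"])).mean()

print(f"journal-level self-citation rate:   {journal_self:.2f}")
print(f"publisher-level self-citation rate: {publisher_self:.2f}")
```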

Promoting inclusive metrics of success and impact to dismantle a discriminatory reward system in science

Abstract:  Success and impact metrics in science are based on a system that perpetuates sexist and racist “rewards” by prioritizing citations and impact factors. These metrics are flawed and biased against already marginalized groups and fail to accurately capture the breadth of individuals’ meaningful scientific impacts. We advocate shifting this outdated value system to advance science through principles of justice, equity, diversity, and inclusion. We outline pathways for a paradigm shift in scientific values based on multidimensional mentorship and promoting mentee well-being. These actions will require collective efforts supported by academic leaders and administrators to drive essential systemic change.

 

Meet the new Faculty Opinions Score – Faculty Opinions Blog

“Traditional citation metrics, such as the journal impact factor, can frequently act as biased measurements of research quality and contribute to the broken system of research evaluation. Academic institutions and funding bodies are increasingly moving away from relying on citation metrics towards greater use of transparent expert opinion. 

Faculty Opinions has championed this cause for two decades, with over 230k recommendations made by our 8000+ Faculty Members, and we are excited to introduce the next step in our evolution – a formidable mark of research quality – the new Faculty Opinions Score….

The Faculty Opinions Score assigns a numerical value to research publications in Biology and Medicine to quantify their impact and quality compared to other publications in their field. 

The Faculty Opinions Score is derived by combining our unique star-rated Recommendations on individual publications, made by world-class experts, with bibliometrics to produce a radically new metric in the research evaluation landscape. 

Key properties of the Faculty Opinions Score: 

A score of zero is assigned to articles with no citations and no recommendations. 
The average score of a set of recommended articles has an expected value of 10. However, articles with many recommendations or highly cited articles may have a much higher score. There is no upper bound. 
Non-recommended articles generally score lower than recommended articles. 
Recommendations contribute more to the score than bibliometric performance. In other words, expert recommendations increase an article’s score (much) more than citations do….”
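
The excerpt lists properties rather than a formula, so any implementation is guesswork. The toy function below is purely hypothetical, with invented weights chosen only to reproduce the stated properties: a zero score for articles with neither citations nor recommendations, recommendations outweighing citations, and no upper bound.

```python
def toy_score(recommendations: int, citations: int,
              rec_weight: float = 5.0, cite_weight: float = 0.1) -> float:
    """Hypothetical illustration only; the real Faculty Opinions Score
    formula is not disclosed in the excerpt. The weights are invented so
    that expert recommendations dominate raw citation counts.
    """
    return rec_weight * recommendations + cite_weight * citations

print(toy_score(0, 0))    # 0.0  -> no citations, no recommendations
print(toy_score(2, 0))    # 10.0 -> two expert recommendations
print(toy_score(0, 100))  # 10.0 -> it takes 100 citations to match them
```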

Adding interactive citation maps to arXiv | arXiv.org blog

“We’re pleased to announce a new arXivLabs collaboration with Litmaps. The new arXivLabs feature allows arXiv users to quickly generate a citation map of the top connected articles, and then explore the citation network using the Litmaps research platform.

A citation network is a visualization of the literature cited by a research paper. The network shows how papers are related to each other in terms of concepts, subject areas, and history, and such networks are valuable for analyzing the development of research areas, making decisions on research directions, and assessing the impacts of research, researchers, institutes, countries, and individual papers.
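
A citation network of this kind maps naturally onto a directed graph. The sketch below uses networkx with invented paper identifiers to show how the "top connected" articles in such a map might be ranked; it is an illustration, not how Litmaps itself works.

```python
import networkx as nx

# Directed citation graph: an edge A -> B means paper A cites paper B.
# The paper identifiers are invented for illustration.
g = nx.DiGraph()
g.add_edges_from([
    ("2101.00001", "1905.11111"),
    ("2101.00001", "1712.22222"),
    ("2102.33333", "1905.11111"),
    ("2103.44444", "1905.11111"),
    ("1905.11111", "1712.22222"),
])

# Rank papers by total connections (citations made plus received), a
# rough analogue of the "top connected articles" in a citation map.
top = sorted(g.degree, key=lambda pair: pair[1], reverse=True)
for paper, degree in top[:3]:
    print(paper, degree)
```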

Readers can now view a Litmap citation network for a specific paper, directly from the arXiv abstract page by clicking on the “Bibliographic Tools” tab at the bottom of an abstract page and activating “Litmaps.” Using this tool, arXiv readers can now easily jump from articles they are interested in and use Litmaps’ custom visualization and automated search tools to find other critical articles they may have missed….”

OASPA endorses Make Data Count: join our webinar (July 13, 2021) to find out more

We invite you to join us for an interactive webinar hosted in collaboration with Make Data Count centered on best practices for data citation.

When: July 13th, 2021 

Time: 3:30 – 4:30 pm UK (2:30 – 3:30 pm UTC)

Other timezones: 7:30 am Pacific Time, 9:30 am Central Time, 10:30 am Eastern Time, 11:30 am Brasilia Time, 4:30 pm Central European Time, 3:30 pm West Africa Time, 4:30 pm South Africa Standard Time, 8:00 pm India Standard Time, 10:30 pm Central Indonesia Time

We’ll be introducing Make Data Count, sharing a publisher case study on data publishing and citation, and covering the how, why and when of data citation. We will also look at the importance of supporting data citations from the OASPA perspective, and we’ll collect feedback from participants on data citation in their communities in preparation for a further piece of work with OASPA members – we want to understand and help remove barriers to data citation, and support those already doing this valuable work. 

Please come prepared with questions, as there will be plenty of time for discussion!

Social media attention and citations of published outputs from re-use of clinical trial data: a matched comparison with articles published in the same journals | BMC Medical Research Methodology | Full Text

Abstract:  Background

Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.

Methods

Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers on 3 repositories (CSDR, YODA project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mention of policy sources, media attention) and the total number of citations were compared between these two groups.
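
Because each re-use (case) is matched to a control from the same journal, the comparison is paired. The sketch below shows a Wilcoxon signed-rank test on hypothetical Altmetric Attention Scores; it illustrates the kind of paired comparison described, not the study's actual analysis.

```python
from scipy.stats import wilcoxon

# Hypothetical paired Altmetric Attention Scores: each re-use is matched
# to a control article published in the same journal (values invented).
reuse_scores   = [5.9, 22.2, 1.3, 8.0, 3.5, 15.0]
control_scores = [2.8, 12.3, 0.3, 9.1, 3.0, 10.4]

# Wilcoxon signed-rank test for paired samples.
stat, p_value = wilcoxon(reuse_scores, control_scores)
print(f"W = {stat:.1f}, p = {p_value:.3f}")
```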

Results

89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistical difference was found on any of the components of the Altmetric Attention Score. The median (interquartile range) numbers of citations were 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.

Conclusions

Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, matching choices have some limitations, so results should be interpreted very cautiously. Also, citations by policy sources for re-uses were rare.

Flowcite Integrates Smart Citations from Scite to Speed Up Research – Flowcite

“Flowcite – a German-based service providing an all-in-one platform for academic research, writing, editing, and publishing – partners up with Brooklyn-based scite.ai to offer quick source evaluation for its users to ensure quality, improve the relevance of results, and thus save time on research….”