Research data communication strategy at the time of pandemics: a retrospective analysis of the Italian experience | Monaldi Archives for Chest Disease

Abstract:  The Coronavirus pandemic has radically changed the scientific world. During these difficult times, standard peer-review processes could be too slow for the continuously evolving knowledge about this disease. We wanted to assess whether the use of other types of network could be a faster way to disseminate knowledge about Coronavirus disease. We retrospectively analyzed the data flow among three distinct groups of networks during the first three months of the pandemic: PubMed, preprint repositories (bioRxiv and arXiv) and social media in Italy (Facebook and Twitter). The results show a significant difference in the number of original research articles published by PubMed and preprint repositories. On social media, we observed a remarkable number of physicians participating in the discussion, both in three distinct Italian-speaking Facebook groups and on Twitter. The standard scientific process of publishing articles (i.e., the peer-review process) remains the best way to get access to high-quality research. Nonetheless, this process may be too long during an emergency like a pandemic. The thoughtful use of other types of network, such as preprint repositories and social media, could be considered in order to improve the clinical management of COVID-19 patients.

 

A ‘no update’ update: setting the record straight – Altmetric

“You may have seen a blog post by Kent Anderson last week which indicated that Altmetric has changed the way we score Twitter as part of the Altmetric Attention Score. This is incorrect. We have not changed the Altmetric scoring algorithm. What we have done recently is update our documentation. Like everyone, we do this from time to time whenever we feel we can provide users with better clarity about what we do.  …”

Tackling information overload: identifying relevant preprints and reviewers – ASAPbio

“Christine Ferguson and Martin Fenner outlined their proposal to develop ways for researchers to find preprints relevant to their research immediately after the preprints appear. They propose an automated system that would identify preprints posted in the previous few days that had received attention via Twitter (i.e. based on the preprint receiving a minimal number of tweets).

During the discussion, the session attendees mentioned a number of currently-available tools that collect reactions and attention on preprints and/or allow researchers to discover the latest preprints:

CrossRef collects Event Data for individual preprints, including social media mentions, Hypothes.is annotations and more.
The Rxivist.org tool allows searching for bioRxiv and medRxiv preprints based on Twitter activity.
The search.bioPreprint tool developed by the University of Pittsburgh Medical Library allows searching preprints from different servers based on keywords or topics.
bioRxiv provides search options based on discipline and also has a dashboard that collects reactions and reviews on individual preprints, including Twitter comments.
EMBO has developed the Early Evidence Base platform which allows searching for refereed preprints.
Google Scholar indexes preprints and provides some filtering tools.

The attendees raised some questions about the use of Twitter as a filter and the risks for a metric based on tweets. How can we account for the risk of social media users gaming the system by artificially boosting attention on Twitter? How can we normalize for the fact that methods papers tend to receive more attention? Is there a risk that this system will be focused on papers from high-income countries that already receive a disproportionate share of attention?…”
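
As an illustration of the proposal outlined above, the sketch below filters a set of preprint records by posting date and tweet count. It is a minimal, hypothetical example: the record structure, field names, and thresholds are assumptions, and a real service would pull posting dates from a preprint server feed and tweet counts from an aggregator such as Crossref Event Data.

```python
from datetime import date, timedelta

# Hypothetical preprint records; a real system would populate these from a
# preprint server feed plus a tweet-count source such as Crossref Event Data.
preprints = [
    {"doi": "10.1101/2021.01.01.000001", "posted_date": date(2021, 1, 4), "tweet_count": 12},
    {"doi": "10.1101/2021.01.02.000002", "posted_date": date(2021, 1, 5), "tweet_count": 1},
    {"doi": "10.1101/2020.12.20.000003", "posted_date": date(2020, 12, 20), "tweet_count": 40},
]

def recent_attention(preprints, days=7, min_tweets=5, today=date(2021, 1, 6)):
    """Return preprints posted in the last `days` days with at least `min_tweets` tweets."""
    cutoff = today - timedelta(days=days)
    hits = [p for p in preprints
            if p["posted_date"] >= cutoff and p["tweet_count"] >= min_tweets]
    # Rank by tweet count so the most-discussed recent preprints surface first.
    return sorted(hits, key=lambda p: p["tweet_count"], reverse=True)

for p in recent_attention(preprints):
    print(p["doi"], p["tweet_count"])
```

The `min_tweets` threshold is exactly the kind of parameter the attendees questioned, since attention counted this way can be boosted artificially on Twitter.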

Can Twitter data help in spotting problems early with publications? What retracted COVID-19 papers can teach us about science in the public sphere | Impact of Social Sciences

“Publications that are based on wrong data or methodological mistakes, or that contain other types of severe errors, can spoil the scientific record if they are not retracted. Retraction of publications is one of the effective ways to correct the scientific record. However, before a problematic publication can be retracted, the problem has to be found and brought to the attention of the people involved (the authors of the publication and the editors of the journal). The earlier a problem with a published paper is detected, the earlier the publication can be retracted and the less wasted effort goes into new research that is based on disinformation within the scientific record. Therefore, it would be advantageous to have an early warning system that spots potential problems with published papers, or perhaps even earlier, based on a preprint version….”

Correlation between Twitter mentions and academic citations in sexual medicine journals | International Journal of Impotence Research

Abstract:  Social media services, especially Twitter, are commonly used as sharing tools in the scientific world, and this widespread use of Twitter could be an effective means of spreading academic publications. We therefore aimed to investigate the relationship between Twitter mentions and traditional citations of articles in sexual medicine journals. We reviewed the articles published in seven sexual medicine journals between January 2018 and June 2018 (two years after their publication). In the first half of 2018, 410 articles were extracted. Of these, 352 (85.9%) were original articles, while 58 (14.1%) were review articles. The median number of citations of articles mentioned at least once on Twitter was 7 (interquartile range: 0–111) for Google Scholar and 0 (interquartile range: 0–63) for Scopus; for articles not mentioned on Twitter, the medians were 4 (interquartile range: 0–25) for Google Scholar and 0 (interquartile range: 0–7) for Scopus. The publications mentioned on Twitter were cited more than the non-mentioned publications in the traditional citation system (p < 0.001). A significant relationship between citation numbers and tweet numbers was also observed (p < 0.001). Also, in the linear regression model, tweet numbers (p < 0.001) and article types (p < 0.001) were found to be related to the Google Scholar citation numbers. In conclusion, using Twitter as a professional tool in academic life would allow information to be propagated and responded to quickly, especially for sexual medicine journals.

 

Altmetric Score Has a Stronger Relationship With Article Citations Than Journal Impact Factor and Open Access Status: A Cross-Sectional Analysis of 4,022 Sports Science Articles | Journal of Orthopaedic & Sports Physical Therapy

Abstract:  Objective

To assess the relationship of individual article citations in the Sport Sciences field to (i) journal impact factor; (ii) each article’s open access status; and (iii) Altmetric score components.

 

Design

Cross-sectional.

 

Methods

We searched the ISI Web of Knowledge InCites Journal Citation Reports database “Sport Sciences” category for the 20 journals with the highest 2-year impact factor in 2018. We extracted the impact factor for each journal and each article’s open access status (yes or no). Between September 2019 and February 2020, we obtained individual citations, Altmetric scores and details of Altmetric components (e.g. number of tweets, Facebook posts, etc.) for each article published in 2017. Linear and multiple regression models were used to assess the relationship between the dependent variable (citation number) and the independent variables (article Altmetric score, open access status, and journal impact factor).
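
A minimal sketch of the type of analysis the Methods describe, using ordinary least squares in Python. The DataFrame and its column names (`citations`, `altmetric_score`, `open_access`, `impact_factor`) are hypothetical stand-ins, not the authors' data or code; the single-predictor R² values correspond to the per-variable variance explained reported in the Results, and the combined model to the 40% figure.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical article-level data; the study analyzed ~4,022 Sport Sciences articles.
df = pd.DataFrame({
    "citations":       [12, 3, 25, 8, 1, 40, 6, 15],
    "altmetric_score": [55, 4, 120, 30, 2, 200, 10, 60],
    "open_access":     [1, 0, 1, 0, 0, 1, 0, 1],      # 1 = open access
    "impact_factor":   [4.2, 2.1, 6.5, 3.0, 1.8, 6.5, 2.1, 4.2],
})

# Single-predictor models: variance in citations explained by each variable alone.
for predictor in ["altmetric_score", "impact_factor", "open_access"]:
    r2 = smf.ols(f"citations ~ {predictor}", data=df).fit().rsquared
    print(f"{predictor}: R^2 = {r2:.2f}")

# Multiple regression with all three predictors combined.
combined = smf.ols("citations ~ altmetric_score + impact_factor + open_access", data=df).fit()
print(f"combined model: R^2 = {combined.rsquared:.2f}")
```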

 

Results

4,022 articles were included. Total Altmetric score, journal impact factor, and open access status explained 32%, 14%, and 1% of the variance in article citations, respectively (when combined, the variables explained 40% of the variance in article citations). The number of tweets related to an article was the Altmetric component that explained the highest proportion of the variance in article citations (37%).

 

Conclusion

Altmetric scores in Sports Sciences journals have a stronger relationship with number of citations than does journal impact factor or open access status. Twitter may be the best social media platform to promote a research article as it has a strong relationship with article citations.

Optimizing the use of twitter for research dissemination: The “Three Facts and a Story” Randomized-Controlled Trial – Journal of Hepatology

Abstract:  Background

Published research promoted on Twitter reaches more readers. Tweets with graphics are more engaging than those without. Data are limited, however, regarding how to optimize multimedia tweets for engagement.

Methods

The “Three Facts and a Story” trial is a randomized-controlled trial comparing a tweet featuring a graphical abstract to paired tweets featuring the personal motivations behind the research and a summary of the findings. Fifty-four studies published by the Journal of Hepatology were randomized at the time of online publication. The primary endpoint was assessed at 28 days from online publication, with the primary outcome being full-text downloads from the journal website. Secondary outcomes included page views and Twitter engagement, including impressions, likes, and retweets.
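
The abstract reports medians with ranges and p-values but does not name the statistical test; a non-parametric comparison such as the Mann-Whitney U test is one plausible way to compare skewed download counts between the two arms. The sketch below uses entirely hypothetical download counts.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical 28-day full-text download counts per article in each arm.
story_downloads    = np.array([51, 60, 34, 71, 45, 55, 38, 66, 49, 52])
standard_downloads = np.array([25, 13, 41, 30, 22, 18, 35, 27, 20, 24])

def median_iqr(x):
    """Format a count distribution as median (25th-75th percentile)."""
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return f"{med:.0f} ({q1:.0f}-{q3:.0f})"

print("story tweets:   ", median_iqr(story_downloads))
print("standard tweets:", median_iqr(standard_downloads))

# Two-sided Mann-Whitney U test on the two independent samples.
stat, p = mannwhitneyu(story_downloads, standard_downloads, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.4f}")
```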

Results

Overall, 31 studies received standard tweets and 23 received story tweets. Five studies were randomized to story tweets but crossed over to standard tweets for lack of author participation. Most papers tweeted were original articles (94% standard, 91% story) and clinical topics (55% standard, 61% story). Story tweets were associated with a significant increase in the number of full-text downloads: 51 (34-71) versus 25 (13-41), p=0.002. There was also a non-significant increase in the number of page views. Story tweets generated an average of >1,000 more impressions than standard tweets (5,388 vs 4,280, p=0.002). Story tweets were associated with a similar number of retweets and a non-significant increase in the number of likes.

Conclusion

Tweets featuring the authors and their motivations may increase engagement with published research.

Will Podcasting and Social Media Replace Journals and Traditional Science Communication? No, But… | American Journal of Epidemiology | Oxford Academic

Abstract:  The digital world in which we live is changing rapidly. The changing media environment is having a direct impact on traditional forms of communication and knowledge translation in public health and epidemiology. Openly accessible digital media can be used to reach a broader and more diverse audience of trainees, scientists, and the lay public than traditional forms of scientific communication. The new digital landscape for delivering content is vast and new platforms are continuously being added. We focus on several, including Twitter and podcasting and discuss their relevance to epidemiology and science communication. We highlight three key reasons why we think epidemiologists should be engaging with these mediums: 1) science communication, 2) career advancement, 3) development of a community and public service. Other positive and negative consequences of engaging in these forms of new media are also discussed. The authors of this commentary are all engaged in social media and podcasting for scientific communication and in this manuscript, we reflect on our experience with these mediums as tools to advance the field of epidemiology.

 

Can altmetric mentions predict later citations? A test of validity on data from ResearchGate and three social media platforms | Emerald Insight

Abstract:  Purpose

The main purpose of this study is to explore and validate the question “whether altmetric mentions can predict citations to scholarly articles”. The paper attempts to explore the nature and degree of correlation between altmetrics (from ResearchGate and three social media platforms) and citations.

Design/methodology/approach

A large sample of scholarly articles published from India in the year 2016 is obtained from the Web of Science database, and the corresponding altmetric data are obtained from ResearchGate and three social media platforms (Twitter, Facebook and blogs, through the Altmetric.com aggregator). Correlations are computed between early altmetric mentions and later citation counts, for data grouped into different disciplinary groups.
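
A minimal sketch of the per-discipline correlation analysis described above. The DataFrame, its column names, and the choice of Spearman's rank correlation are assumptions for illustration; the paper's actual data and statistic are not reproduced here.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical article-level records: early altmetric mentions and later citations.
df = pd.DataFrame({
    "discipline":         ["Medicine", "Medicine", "Medicine", "Physics", "Physics", "Physics"],
    "altmetric_mentions": [10, 2, 7, 0, 5, 1],
    "citations":          [15, 3, 9, 2, 4, 1],
})

# Rank correlation between early mentions and later citations, per disciplinary group.
for discipline, group in df.groupby("discipline"):
    rho, p = spearmanr(group["altmetric_mentions"], group["citations"])
    print(f"{discipline}: rho = {rho:.2f} (p = {p:.3f})")
```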

Findings

Results show that the correlation between altmetric mentions and citation counts is positive but weak. Correlations are relatively higher for data from ResearchGate than for data from the three social media platforms. Further, significant disciplinary differences are observed in the degree of correlation between altmetrics and citations.

Research limitations/implications

The results support the idea that altmetrics do not necessarily reflect the same kind of impact as citations. However, articles that get higher altmetric attention early may actually have a slight citation advantage. Further, altmetrics from academic social networks like ResearchGate are more correlated with citations, as compared to social media platforms.

Originality/value

The paper has novelty in two respects. First, it takes altmetric data for a window of about 1–1.5 years after article publication and citation counts for a longer citation window of about 3–4 years after publication. Second, it is one of the first studies to analyze data from the ResearchGate platform, a popular academic social network, to understand the type and degree of correlations.

How is science clicked on Twitter? Click metrics for Bitly short links to scientific publications – Fang – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  To provide some context for the potential engagement behavior of Twitter users around science, this article investigates how Bitly short links to scientific publications embedded in scholarly Twitter mentions are clicked on Twitter. Based on the click metrics of over 1.1 million Bitly short links referring to Web of Science (WoS) publications, our results show that around 49.5% of them were not clicked by Twitter users. For those Bitly short links with clicks from Twitter, the majority of their Twitter clicks accumulated within a short period of time after they were first tweeted. Bitly short links to the publications in the field of Social Sciences and Humanities tend to attract more clicks from Twitter over other subject fields. This article also assesses the extent to which Twitter clicks are correlated with some other impact indicators. Twitter clicks are weakly correlated with scholarly impact indicators (WoS citations and Mendeley readers), but moderately correlated to other Twitter engagement indicators (total retweets and total likes). In light of these results, we highlight the importance of paying more attention to the click metrics of URLs in scholarly Twitter mentions, to improve our understanding about the more effective dissemination and reception of science information on Twitter.
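
The two headline analyses (the share of links never clicked, and the correlation of clicks with citation and engagement indicators) can be illustrated with a small sketch. The link-level records and column names below are hypothetical; the study's actual pipeline (Bitly click data, WoS citations, Mendeley readers, Altmetric engagement counts) is not reproduced here.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical Bitly-link-level records for tweeted publications.
links = pd.DataFrame({
    "clicks":    [0, 3, 0, 12, 1, 0, 25, 4],
    "citations": [2, 5, 1, 10, 3, 0, 20, 6],
    "retweets":  [0, 2, 1, 8, 1, 0, 15, 3],
    "likes":     [1, 4, 0, 10, 2, 1, 22, 5],
})

# Share of short links that were never clicked (the paper reports roughly 49.5%).
unclicked = (links["clicks"] == 0).mean()
print(f"unclicked links: {unclicked:.1%}")

# Rank correlations between clicks and scholarly / Twitter engagement indicators.
for col in ["citations", "retweets", "likes"]:
    rho, _ = spearmanr(links["clicks"], links[col])
    print(f"clicks vs {col}: rho = {rho:.2f}")
```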

 

Early Indicators of Scientific Impact: Predicting Citations with Altmetrics

Abstract:  Identifying important scholarly literature at an early stage is vital to the academic research community and other stakeholders such as technology companies and government bodies. Due to the sheer amount of research published and the growth of ever-changing interdisciplinary areas, researchers need an efficient way to identify important scholarly work. The number of citations a given research publication has accrued has been used for this purpose, but these take time to occur and longer to accumulate. In this article, we use altmetrics to predict the short-term and long-term citations that a scholarly publication could receive. We build various classification and regression models and evaluate their performance, finding neural networks and ensemble models to perform best for these tasks. We also find that Mendeley readership is the most important factor in predicting the early citations, followed by other factors such as the academic status of the readers (e.g., student, postdoc, professor), followers on Twitter, online post length, author count, and the number of mentions on Twitter, Wikipedia, and across different countries.
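
A minimal sketch of the general approach described (regression over altmetric features followed by feature-importance ranking), using a random forest as a stand-in for the models evaluated in the paper. The feature matrix, column names, and the simulated relationship between readership and citations are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 200

# Hypothetical altmetric features for n publications.
X = pd.DataFrame({
    "mendeley_readers":   rng.poisson(30, n),
    "tweet_mentions":     rng.poisson(5, n),
    "news_mentions":      rng.poisson(1, n),
    "wikipedia_mentions": rng.poisson(0.2, n),
})
# Hypothetical citation counts, loosely driven by readership and tweets plus noise.
y = 0.5 * X["mendeley_readers"] + 0.8 * X["tweet_mentions"] + rng.poisson(2, n)

# Ensemble regression model predicting citations from altmetric features.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Rank features by how much they contribute to predicting citations.
importances = pd.Series(model.feature_importances_, index=X.columns).sort_values(ascending=False)
print(importances)
```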

 

Sci-Hub Founder Criticises Sudden Twitter Ban Over “Counterfeit” Content | TorrentFreak

“Twitter has suspended the account of Sci-Hub, a site that offers a free gateway to paywalled research. The site is accused of violating the counterfeit policy of the social media platform. However, founder Alexandra Elbakyan believes that this is an effort to silence the growing support amidst a high profile court case in India.”