Towards a better understanding of Facebook Altmetrics in LIS field: assessing the characteristics of involved paper, user and post | SpringerLink

Abstract:  Facebook mentions of scholarly papers provide a novel way of reflecting and measuring the process of informal scientific communication. To uncover the underlying mechanism of Facebook altmetrics, it is essential to investigate the characteristics of their contextual data. Taking library and information science (LIS) papers as the empirical case, three categories of contextual data were gathered: data on the mentioned LIS papers, data on Facebook users, and data on Facebook posts. Hybrid methods including statistical analysis, content analysis and visualization analysis were adopted to analyze the data. Results show that: (1) Open access status and an active Facebook account help a scholarly paper get mentioned but do not boost the number of Facebook mentions. The number of citations, number of collaborating institutions, and number of collaborating countries showed a significantly positive correlation with the number of Facebook mentions. Health information management was the most mentioned research topic, while bibliometrics and scientific evaluation received on average the highest number of Facebook mentions. (2) Scientific Facebook users who mention LIS papers were widely scattered geographically but dominated by the USA, Spain, Germany, Brazil and Australia. Institutional users (89%) and academic users (84%) prevail, especially universities (14%), research institutes (12%), libraries (11%), academic associations (9%) and commercial organizations (8%). (3) Most scientific Facebook posts were relatively short, and their language distribution was less skewed than that of scientific tweets. Post content is mostly a combination of text, links and pictures, with neutral sentiment. Different types of users demonstrated significantly different content styles and topics of concern. These findings indicate that Facebook mentions of LIS papers mainly reflect institutional-level advocacy and attention, with a low level of engagement, and can be influenced by several features including collaborative patterns and research topics.
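As a rough illustration of the statistical step described above, the sketch below computes Spearman correlations between paper-level features and Facebook mention counts. The field names and sample values are invented for demonstration; they are not the study's dataset.

```python
# Hypothetical sketch: Spearman correlation between paper features and
# Facebook mention counts. Data and field names are illustrative only.
from scipy.stats import spearmanr

papers = [
    # (citations, n_institutions, n_countries, facebook_mentions)
    (12, 2, 1, 3),
    (45, 4, 3, 9),
    (3, 1, 1, 1),
    (78, 5, 2, 14),
    (20, 3, 2, 5),
]

features = ["citations", "n_institutions", "n_countries"]
mentions = [p[3] for p in papers]

for i, name in enumerate(features):
    values = [p[i] for p in papers]
    rho, p = spearmanr(values, mentions)
    print(f"{name}: rho={rho:.2f}, p={p:.3f}")
```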

 

Twitter turbulences and their impact on Altmetric… · Open Access @ Strathclyde

“So the most likely explanation one is able to fathom is that the ‘Twitter crisis’ may be hitting services largely based on social media impact. It’s not just that the desertion of large swathes of very active communicators in the scholarly comms domain to Mastodon has dried up the references (tweets) that should be there for Altmetric to catch. It’s – presumably – also that such a hit to the information-gathering workflow and to its associated business model has somehow rendered the Altmetric snapshot unreliable. The impact of this paper a week after release has surely been higher than 17 (as of Mar 21st, screenshot posted at the top)….”

Promoting Publications Through Plastic Surgery Journal Insta… : Annals of Plastic Surgery

Abstract:  Purpose 

Journals are increasingly using social media to increase article engagement. We aim to determine the impact of Instagram promotion on, and identify social media tools that effectively enhance, plastic surgery article engagement and impact.

Methods 

Instagram accounts for Plastic and Reconstructive Surgery, Annals of Plastic Surgery, Aesthetic Surgery Journal, and Aesthetic Plastic Surgery were reviewed for posts published by February 8, 2022. Open access journal articles were excluded. Post caption word count and number of likes, tagged accounts, and hashtags were recorded. Inclusion of videos, article links, or author introductions was noted. All articles from journal issues published between the dates of the first and last posts promoting articles were reviewed. Altmetric data approximated article engagement. Citation numbers from the National Institutes of Health iCite tool approximated impact. Differences in engagement and impact of articles with and without Instagram promotion were compared by Mann-Whitney U tests. Univariate and multivariable regressions identified factors predictive of more engagement (Altmetric Attention Score ≥5) and citations (≥7).
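The sketch below illustrates, under assumptions, the two analysis steps named in the Methods: a Mann-Whitney U comparison of promoted versus non-promoted articles, and a logistic regression on a binarised engagement threshold (AAS ≥ 5). The predictors and synthetic data are hypothetical and do not reproduce the paper's model specification.

```python
# Illustrative sketch only: compares engagement of promoted vs. non-promoted
# articles and fits a logistic model for "high engagement" (AAS >= 5).
# Data and predictor names are hypothetical.
import numpy as np
import statsmodels.api as sm
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
n = 200
promoted = rng.integers(0, 2, n)
hashtags = rng.poisson(3, n)
aas = rng.poisson(3 + 2 * promoted + hashtags)  # synthetic attention scores

# Mann-Whitney U: promoted vs. not promoted
u, p = mannwhitneyu(aas[promoted == 1], aas[promoted == 0], alternative="two-sided")
print(f"Mann-Whitney U={u:.0f}, p={p:.4f}")

# Logistic regression: do promotion and hashtag count predict AAS >= 5?
high_engagement = (aas >= 5).astype(int)
X = sm.add_constant(np.column_stack([promoted, hashtags]))
fit = sm.Logit(high_engagement, X).fit(disp=False)
print(np.exp(fit.params))  # exponentiated coefficients (odds ratios; first entry is baseline odds)
```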

Results 

A total of 5037 articles were included, with 675 (13.4%) promoted on Instagram. Of posts featuring articles, 274 (40.6%) included videos, 469 (69.5%) included article links, and 123 (18.2%) included author introductions. Promoted articles had higher median Altmetric Attention Scores and citations (P < 0.001). On multivariable analysis, using more hashtags predicted higher article Altmetric Attention Scores (odds ratio [OR], 1.85; P = 0.002) and more citations (OR, 1.90; P < 0.001). Including article links (OR, 3.52; P < 0.001) and tagging more accounts (OR, 1.64; P = 0.022) predicted higher Altmetric Attention Scores. Including author introductions negatively predicted Altmetric Attention Scores (OR, 0.46; P < 0.001) and citations (OR, 0.65; P = 0.047). Caption word count had no significant impact on article engagement or impact.

Conclusions 

Instagram promotion increases plastic surgery article engagement and impact. Journals should use more hashtags, tag more accounts, and include manuscript links to increase article metrics. We recommend that authors promote on journal social media to maximize article reach, engagement, and citations, which positively impacts research productivity with minimal additional effort in designing Instagram content.

Do altmetric scores reflect article quality? Evidence from the UK Research Excellence Framework 2021 – Thelwall – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with individual article quality scores. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014–2017/2018, split into 34 broadly field-based Units of Assessment (UoAs). Altmetrics correlated more strongly with research quality than previously found, although less strongly than raw and field normalized Scopus citation counts. Surprisingly, field normalizing citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best altmetric (e.g., three Spearman correlations with quality scores above 0.5), tweet counts are also a moderate strength indicator in eight UoAs (Spearman correlations with quality scores above 0.3), ahead of news (eight correlations above 0.3, but generally weaker), blogs (five correlations above 0.3), and Facebook (three correlations above 0.3) citations, at least in the United Kingdom. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities.
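As a hedged illustration of the reported analysis, the sketch below computes per-UoA Spearman correlations between an altmetric count and peer-review quality scores. The data frame, column names, and values are invented, not the REF dataset.

```python
# Assumption-laden sketch: Spearman correlation between Mendeley reader counts
# and peer-review quality scores, computed separately for each UoA.
import pandas as pd
from scipy.stats import spearmanr

df = pd.DataFrame({
    "uoa": ["UoA1", "UoA1", "UoA1", "UoA2", "UoA2", "UoA2"],
    "quality_score": [4, 3, 2, 4, 1, 3],
    "mendeley_readers": [120, 60, 15, 300, 5, 80],
})

for uoa, group in df.groupby("uoa"):
    rho, p = spearmanr(group["quality_score"], group["mendeley_readers"])
    print(f"{uoa}: Spearman rho={rho:.2f} (p={p:.3f})")
```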

Characterization and Reach of Orthopaedic Research Posted to Preprint Servers: Are We “Undercooking” Our Science?

Abstract:  Background 

Although biomedical preprint servers have grown rapidly over the past several years, the harm to patient health and safety remains a major concern among several scientific communities. Despite previous studies examining the role of preprints during the Coronavirus-19 pandemic, there is limited information characterizing their impact on scientific communication in orthopaedic surgery.

Questions/purposes 

(1) What are the characteristics (subspecialty, study design, geographic origin, and proportion of publications) of orthopaedic articles on three preprint servers? (2) What are the citation counts, abstract views, tweets, and Altmetric score per preprinted article and per corresponding publication?

Methods 

Three of the largest preprint servers (medRxiv, bioRxiv, and Research Square) with a focus on biomedical topics were queried for all preprinted articles published between July 26, 2014, and September 1, 2021, using the following search terms: “orthopaedic,” “orthopedic,” “bone,” “cartilage,” “ligament,” “tendon,” “fracture,” “dislocation,” “hand,” “wrist,” “elbow,” “shoulder,” “spine,” “spinal,” “hip,” “knee,” “ankle,” and “foot.” Full-text articles in English related to orthopaedic surgery were included, while nonclinical studies, animal studies, duplicate studies, editorials, abstracts from conferences, and commentaries were excluded. A total of 1471 unique preprints were included and further characterized in terms of the orthopaedic subspecialty, study design, date posted, and geographic factors. Citation counts, abstract views, tweets, and Altmetric scores were collected for each preprinted article and the corresponding publication of that preprint in an accepting journal. We ascertained whether a preprinted article was published by searching title keywords and the corresponding author in three peer-reviewed article databases (PubMed, Google Scholar, and Dimensions) and confirming that the study design and research question matched.
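A minimal sketch, under assumptions, of the preprint-to-publication matching step described above: comparing a preprint title against candidate titles returned by a bibliographic search. The helper function and similarity threshold are illustrative, not the authors' actual procedure, and the confirmation of study design and research question would still be done manually.

```python
# Rough sketch of matching a preprint to its published version by title
# similarity. Threshold and titles are invented for illustration.
from difflib import SequenceMatcher

def title_similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two titles."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

preprint_title = "Outcomes of total knee arthroplasty in elderly patients"
candidates = [
    "Outcomes of Total Knee Arthroplasty in Elderly Patients: A Cohort Study",
    "Shoulder instability after arthroscopic repair",
]

best = max(candidates, key=lambda t: title_similarity(preprint_title, t))
if title_similarity(preprint_title, best) >= 0.8:  # arbitrary cut-off
    print("Likely published version:", best)
else:
    print("No match found")
```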

Results 

The number of orthopaedic preprints increased from four in 2017 to 838 in 2020. The most common orthopaedic subspecialties represented were spine, knee, and hip. From 2017 to 2020, the cumulative counts of preprinted article citations, abstract views, and Altmetric scores increased. A corresponding publication was identified in 52% (762 of 1471) of preprints. As would be expected, because preprinting is a form of redundant publication, published articles that are also preprinted saw greater abstract views, citations, and Altmetric scores on a per-article basis.

Conclusion 

Although preprints remain an extremely small proportion of all orthopaedic research, our findings suggest that nonpeer-reviewed, preprinted orthopaedic articles are being increasingly disseminated. These preprinted articles have a smaller academic and public footprint than their published counterparts, but they still reach a substantial audience through infrequent and superficial online interactions, which are far from equivalent to the engagement facilitated by peer review. Furthermore, the sequence of preprint posting and journal submission, acceptance, and publication is unclear based on the information available on these preprint servers. Thus, it is difficult to determine whether the metrics of preprinted articles are attributable to preprinting, and studies such as the present analysis will tend to overestimate the apparent impact of preprinting. Despite the potential for preprint servers to function as a venue for thoughtful feedback on research ideas, the available metrics data for these preprinted articles do not demonstrate the meaningful engagement that is achieved by peer review in terms of the frequency or depth of audience feedback.

Clinical Relevance 

Our findings highlight the need for safeguards to regulate research dissemination through preprint media, which has never been shown to benefit patients and should not be considered as evidence by clinicians. Clinician-scientists and researchers have the most important responsibility of protecting patients from the harm of potentially inaccurate biomedical science and therefore must prioritize patient needs first by uncovering scientific truths through the evidence-based processes of peer review, not preprinting. We recommend all journals publishing clinical research adopt the same policy as Clinical Orthopaedics and Related Research®, The Bone & Joint Journal, The Journal of Bone and Joint Surgery, and the Journal of Orthopaedic Research, removing any papers posted to preprint servers from consideration.

Assessing Open Access Friendliness of National Institutes of Technology (NITs) A Data Carpentry Approach | DESIDOC Journal of Library & Information Technology, 2022-10

“Abstract: This research study aims to measure the Open Access (OA) friendliness of National Institutes of Technology (NITs) of India that are listed in the overall category of NIRF (National Institutional Ranking Framework), 2021 by taking into consideration four important OA parameters – i) OA publication share; ii) OA licensing scenario; iii) citation impact of OA publications; and iv) altmetric scores of OA publications. It deals with 64,485 publications of the selected 11 NITs during the period from 2012 to 2021 (10 years), citations received by these publications (5,42,638 citations), and altmetric attention scores of the documents (5,213 publications) during the period under study. A data carpentry tool, namely OpenRefine, and open access bibliographic/citation data sources such as Unpaywall, Dimensions, and Altmetric.com have been deployed to accomplish this large-scale study for ranking NITs by their Open Access Friendliness (OAF). The OAF indicator, as applied in this study, is a distributed weightage based 100-point scale built on top of the aforesaid OA parameters. The ranking framework shows that Sardar Vallabhbhai National Institute of Technology, Surat (est. in 1961) has achieved the top position with a score of 52.12 (out of 100), but in totality only 3 NITs (out of the selected 11 NITs) crossed the 50 per cent mark in the adapted OAF scale.”

Roy, A., & Mukhopadhyay, P. (2022). Assessing Open Access Friendliness of National Institutes of Technology (NITs) A Data Carpentry Approach. DESIDOC Journal of Library & Information Technology, 42(5), 331-338. https://doi.org/10.14429/djlit.42.5.18263
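Purely as an assumption-laden example, the sketch below shows how a distributed-weightage composite on a 100-point scale could be built from the four OA parameters named in the abstract. The weights and the normalisation to a 0-1 range are invented; the paper defines its own scheme.

```python
# Illustrative OAF-style composite score. Weights and normalisation are
# assumptions for demonstration, not the authors' published weighting.
def oaf_score(oa_share, licensing, oa_citation_impact, oa_altmetrics,
              weights=(40, 20, 25, 15)):
    """Each parameter is normalised to 0..1; weights sum to 100."""
    params = (oa_share, licensing, oa_citation_impact, oa_altmetrics)
    return sum(w * p for w, p in zip(weights, params))

# Example: an institute with 35% OA share, 50% open licences,
# 60% of a reference OA citation impact, 20% of a reference altmetric level.
print(round(oaf_score(0.35, 0.50, 0.60, 0.20), 2))  # -> 42.0
```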

Slow, slow, quick, quick, slow: five altmetric sources observed over a decade show evolving trends, by research age, attention source maturity and open access status | SpringerLink

Abstract:  The study of temporal trends in altmetrics is under-developed, and this multi-year observation study addresses some of the deficits in our understanding of altmetric behaviour over time. The attention surrounding research outputs, as partially captured by altmetrics, or alternative metrics, constitutes many varied forms of data. Over the years 2008–2013, a set of 7739 papers were sampled on six occasions. Five altmetric data sources were recorded (Twitter, Mendeley, News, Blogs and Policy) and analysed for temporal trends, with particular attention being paid to their Open Access status and discipline. Twitter attention both starts and ends quickly. Mendeley readers accumulate quickly, and continue to grow over the following years. News and blog attention is quick to start, although news attention persists over a longer timeframe. Citations in policy documents are slow to start, and are observed to be growing over a decade after publication. Over time, growth in Twitter activity is confirmed, alongside an apparent decline in blogging attention. Mendeley usage is observed to grow, but shows signs of recent decline. Policy attention is identified as the slowest form of impact studied by altmetrics, and one that strongly favours the Humanities and Social Sciences. The Open Access Altmetrics Advantage is seen to emerge and evolve over time, with each attention source showing different trends. The existence of late-emergent attention in all attention sources is confirmed.
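As an illustration of the temporal aggregation the study describes, the sketch below computes, from invented observation waves, how much of each source's attention arrives after the first observation; a "slow" source such as policy citations shows a large late share.

```python
# Minimal sketch of cumulative attention per source across repeated
# observation waves. The data frame and counts are invented for illustration.
import pandas as pd

obs = pd.DataFrame({
    "wave":   [1, 2, 3, 1, 2, 3],
    "source": ["Twitter", "Twitter", "Twitter", "Policy", "Policy", "Policy"],
    "count":  [150, 180, 182, 0, 3, 12],
})

# Share of final attention that accrued after the first observation wave.
growth = obs.sort_values("wave").groupby("source")["count"].agg(["first", "last"])
growth["late_share"] = 1 - growth["first"] / growth["last"]
print(growth)  # Policy -> 1.00 (all late); Twitter -> ~0.18 (mostly early)
```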

A study of the correlation between publication delays and measurement indicators of journal articles in the social network environment—based on online data in PLOS | SpringerLink

Abstract:  The development of network technology and open access has made numerous research results freely available online, thereby facilitating the growth of the emerging evaluation methods of Altmetrics. However, it is unknown whether the time interval from reception to publication has an impact on the evaluation indicators of articles in the social network environment. We construct a range of time-series indexes that represent the features of the evaluation indicators and then explore the correlation of acceptance delay, technical delay, and overall delay with the relevant indicators of citations, usage, sharing and discussions, and collections obtained from the open access journal platform PLOS. Moreover, this research also explores the differences in the correlations between the delays and the corresponding indicators for literature in six subject areas, and the discrepancies in these correlations across metric quartiles. The results of the Mann–Whitney U test reveal that the length of the delays affects the performance of the literature on some indicators. This study indicates that reducing the acceptance time and final publication time of articles can improve the efficiency of knowledge diffusion through the formal academic citation channel, but in the context of social networking communication, an appropriate interval at a particular stage of the publishing process can slightly enhance the sharing, discussion, and collection of articles, hence boosting the influence and attention received by the literature on the internet.
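A small sketch, with invented dates and indicator values, of how the three delays could be derived and correlated with an online indicator; PLOS field names and the study's actual time-series index construction may differ.

```python
# Hedged sketch: derive acceptance, technical and overall delays from the
# three date fields and correlate them with an altmetric-style indicator.
import pandas as pd
from scipy.stats import spearmanr

df = pd.DataFrame({
    "received":  pd.to_datetime(["2020-01-10", "2020-02-01", "2020-03-15"]),
    "accepted":  pd.to_datetime(["2020-04-20", "2020-05-11", "2020-06-01"]),
    "published": pd.to_datetime(["2020-05-05", "2020-06-20", "2020-06-25"]),
    "shares":    [42, 10, 87],  # invented sharing counts
})

df["acceptance_delay"] = (df["accepted"] - df["received"]).dt.days
df["technical_delay"] = (df["published"] - df["accepted"]).dt.days
df["overall_delay"] = (df["published"] - df["received"]).dt.days

for col in ["acceptance_delay", "technical_delay", "overall_delay"]:
    rho, p = spearmanr(df[col], df["shares"])
    print(f"{col}: rho={rho:.2f}, p={p:.3f}")
```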

 

Research on the relationships between discourse leading indicators and citations: perspectives from altmetrics indicators of international multidisciplinary academic journals | Emerald Insight

Abstract:  Purpose

This paper aims to analyze the relationships between discourse leading indicators and citations from the perspective of integrated altmetrics indicators. It seeks to provide a reference for comprehending the quantitative indicators of scientific communication in the era of open science, for constructing an evaluation indicator system of the discourse leading of academic journals, and for improving the discourse leading of academic journals.

Design/methodology/approach

Based on the theory of communication and the new pattern of scientific communication, this paper explores the formation process of academic journals' discourse leading. It obtains 874,119 citations and 6,378,843 altmetrics indicator data points from 65 international multidisciplinary academic journals. The relationships between indicators of discourse leading (altmetrics) and citations are studied by using descriptive statistical analysis, correlation analysis, principal component analysis, negative binomial regression analysis and marginal effects analysis. The connotation and essential characteristics of the indicators, as well as the strength and influence of the relationships, are further analyzed and explored. It is proposed that academic journals' discourse leading is composed of news discourse leading, social media discourse leading, peer review discourse leading, encyclopedia discourse leading, video discourse leading and policy discourse leading.
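As a hedged illustration of one of the methods listed above, the sketch below fits a negative binomial regression of citation counts on two altmetrics indicators using synthetic data. Variable names, the data, and the model specification are assumptions, not the authors' dataset or full analysis.

```python
# Illustrative negative binomial regression of citations on altmetrics
# indicators. All data below are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
news_mentions = rng.poisson(2, n)
tweets = rng.poisson(20, n)
# Over-dispersed synthetic citation counts loosely tied to the indicators.
citations = rng.negative_binomial(5, 5 / (6 + 0.5 * news_mentions + 0.05 * tweets))

X = sm.add_constant(np.column_stack([news_mentions, tweets]))
model = sm.GLM(citations, X, family=sm.families.NegativeBinomial())
result = model.fit()
print(np.exp(result.params))  # incidence rate ratios (first entry is the exponentiated intercept)
```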

Findings

It is found that the data for the 15 altmetrics indicators show a low degree of concentration around the center and a high degree of polarized dispersion overall; their distribution patterns do not follow normal distributions and instead show long-tailed, skewed curves. Overall, the 15 indicators are positively correlated, and wide gaps exist in the number of mentions and in coverage. The academic journals' discourse leading significantly affects total cites. When altmetrics indicators from international mainstream academic and social media platforms are used to explore the connotation and characteristics of academic journals' discourse leading, the influence or contribution to citations of social media discourse, news discourse, video discourse, policy discourse, peer review discourse and encyclopedia discourse decreases in that order.

Originality/value

This study is innovative in analyzing, at the academic journal level, the deep relationships between altmetrics indicators and citations from the perspective of correlation. First, this paper explores the formation process of academic journals' discourse leading. Second, this paper integrates altmetrics indicators to study the correlation between discourse leading indicators and citations. This study will help to enrich and improve basic theoretical issues and the composition of indicators, provide theoretical support for the construction of a discourse leading evaluation system for academic journals, and offer ideas for evaluation practice.

[2212.07811] Do altmetric scores reflect article quality? Evidence from the UK Research Excellence Framework 2021

Abstract:  Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with journal article quality. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014–2017/2018, split into 34 Units of Assessment (UoAs). The results show that altmetrics are better indicators of research quality than previously thought, although not as good as raw and field normalised Scopus citation counts. Surprisingly, field normalising citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best, tweet counts are also a relatively strong indicator in many fields, and Facebook, blogs and news citations are moderately strong indicators in some UoAs, at least in the UK. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities. The Altmetric Attention Score, although hybrid, is almost as good as Mendeley reader counts as a quality indicator and reflects more non-scholarly impacts.

 

A Framework for Amplifying the Teaching-Research Nexus Impact: Leveraging Altmetrics via Figshare | ASCILITE Publications

Abstract:  This concise paper explores, as a work in progress, a collaborative discussion-based webinar series that develops and amplifies understandings of the teaching-research nexus. Aimed at early and mid-career university academics, the series implements a design-based research framework using alternative publishing via Figshare and dissemination of the recordings of the live online webinars through social media networks. The webinars feature a panel of invited Deans Teaching and Learning from across the university in collaboration with three higher education research specialists. Results of ongoing analysis of development, engagement, outcomes, and impact are presented, and a preliminary framework is advanced for developing and amplifying understanding of the teaching-research nexus.

 

Impact Factors, Altmetrics, and Prestige, Oh My: The Relationship Between Perceived Prestige and Objective Measures of Journal Quality | SpringerLink

Abstract:  The focus of this work is to examine the relationship between subjective and objective measures of prestige for journals in our field. Findings indicate that metrics pulled from Clarivate, Elsevier, and Google all have statistically significant relationships with perceived journal prestige. Just as several widely used bibliometric metrics were related to prestige, so too were altmetric scores.

 

Influence of Social Networking Sites on Scholarly Communication: An Altmetrics Analysis of Selected LIS Journals

Abstract:  This study aims to examine the influence of social networking sites on scholarly papers published in Library and Information Science journals. The top 100 articles with high Altmetric Attention Scores (AAS) published in two renowned journals, the International Journal of Information Management and the Journal of the Medical Library Association, were taken for the study. The analysis found that LIS research is most often mentioned on Twitter, followed by news outlets and blogs. Student groups and librarians are among the most frequent readers of the publications. The Pearson correlation coefficient test revealed a very high and significant positive correlation between Scopus and Dimensions.ai citations, and between Mendeley readership and Scopus citations. However, the correlation between AAS and Dimensions.ai citations is low and not statistically significant. The findings indicate that journals need social media profiles to disseminate information among academia and society and to increase online attention to LIS research.

Keywords: Altmetrics, LIS research, online attention, dimensions, social media metrics.
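A minimal sketch of the correlation test reported above: Pearson's r between two citation sources for the same set of articles. The values are invented and only show the mechanics of the test.

```python
# Illustrative Pearson correlation between citation counts from two sources
# (e.g., Scopus and Dimensions.ai) for the same articles. Data are invented.
from scipy.stats import pearsonr

scopus = [10, 25, 3, 40, 18, 7]
dimensions = [12, 27, 2, 45, 20, 6]

r, p = pearsonr(scopus, dimensions)
print(f"Pearson r={r:.2f}, p={p:.4f}")
```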

Cureus | Association Between Twitter Mention and Open-Access Status on Article Citation Metrics in the Field of Ophthalmology

Abstract:  Introduction: It is possible that social media use can boost not just articles’ social impact but the number of citations and academic influence as well. If a positive correlation between Twitter usage and citation metrics exists in the ophthalmology literature, it is important to broadcast this information to the ophthalmology community so they can use Twitter to increase academic engagement with their research. There has also been an increase in the number of articles available as open access. Therefore, it is important to evaluate the presence of an open-access citation advantage in the field of ophthalmology. This study aims to evaluate the relationship between Twitter mention and open access status on citation metrics in the ophthalmology literature.

Methods: We conducted a retrospective cross-sectional study comparing article citation metrics to Twitter mentions and open access status. We gathered data on ophthalmology research articles from the six highest-ranked ophthalmology journals published as part of a January 2019 issue. Data were collected in April 2022, 38 months after online publication. Data on citations for each article was based on Google Scholar and Scopus websites. The Altmetric Bookmarklet extension was used to determine the amount of social engagement each article received. The open-access status of each article was based on the status listed in its corresponding journal. Two-tailed t-tests were used to compare social media engagement and open access status with the number of Google Scholar and Scopus citations.
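As an illustration of the comparison described in the Methods, the sketch below runs a two-tailed t-test on citation counts for tweeted versus non-tweeted articles; the numbers are invented, not the study's data.

```python
# Hedged sketch: two-tailed independent-samples t-test comparing citation
# counts of tweeted vs. non-tweeted articles. Values are illustrative.
from scipy.stats import ttest_ind

tweeted = [30, 22, 41, 18, 27, 35]
not_tweeted = [15, 12, 20, 9, 14]

t, p = ttest_ind(tweeted, not_tweeted)  # two-tailed by default
print(f"t={t:.2f}, p={p:.4f}")
```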

Results: A total of 102 original research articles were analyzed. 89 (87.3%) articles received a Twitter mention. Articles tweeted at least once had a significantly higher Google Scholar score (27.2 +/- 4) compared to articles not tweeted (16.4 +/- 1.7; 1.7-fold increase, p<0.05). Likewise, the average Scopus score was significantly higher for tweeted articles (18.6 +/- 2.6) compared to articles not tweeted (11.8 +/- 1.6; 1.6-fold increase, p<0.05). Articles listed as open access had a significantly higher number of Twitter mentions (11.8 +/- 1.8) compared to articles that were not open access (5.6 +/- 0.7; 2.1-fold increase, p<0.05). Open-access articles also had higher citation scores compared to articles that are not open access, but this relationship was not statistically significant.

Conclusion: This is the first study to evaluate the relationship between article Twitter mention and citation score in the field of ophthalmology. It demonstrates a significant positive correlation between the article Twitter mention and citation score and provides further evidence that social media engagement can be beneficial to the dissemination of academic information. Further studies on the relationship between social media engagement and article dissemination are warranted in the field of ophthalmology.