Adding interactive citation maps to arXiv | arXiv.org blog

“We’re pleased to announce a new arXivLabs collaboration with Litmaps. The new arXivLabs feature allows arXiv users to quickly generate a citation map of the top connected articles, and then explore the citation network using the Litmaps research platform.

A citation network is a visualization of the literature cited by a research paper. The network shows how papers are related to each other in terms of concepts, subject areas, and history, and such networks are valuable for analyzing the development of research areas, making decisions on research directions, and assessing the impacts of research, researchers, institutes, countries, and individual papers.

Readers can now view a Litmap citation network for a specific paper directly from the arXiv abstract page by clicking the “Bibliographic Tools” tab at the bottom of the page and activating “Litmaps.” Using this tool, arXiv readers can easily jump from articles they are interested in and use Litmaps’ custom visualization and automated search tools to find other critical articles they may have missed….”

OASPA endorses Make Data Count: join our webinar (July 13, 2021) to find out more

We invite you to join us for an interactive webinar hosted in collaboration with Make Data Count centered on best practices for data citation.

When: July 13th, 2021 

Time: 3:30–4:30 pm UK (2:30–3:30 pm UTC)

Other timezones: 7:30 am Pacific Time, 9:30 am Central Time, 10:30 am Eastern Time, 11:30 am Brasilia Time, 4:30 pm Central European Time, 3:30 pm West Africa Time, 4:30 pm South Africa Standard Time, 8:00 pm India Standard Time, 10:30 pm Central Indonesia Time (Time converter tool)

We’ll introduce Make Data Count, share a publisher case study on data publishing and citation, and cover the how, why, and when of data citation. We’ll also look at the importance of supporting data citation from the OASPA perspective, and collect feedback from participants on data citation in their communities in preparation for further work with OASPA members – we want to understand and help remove barriers to data citation, and support those already doing this valuable work.

Please come prepared with questions, as there will be plenty of time for discussion!

Social media attention and citations of published outputs from re-use of clinical trial data: a matched comparison with articles published in the same journals | BMC Medical Research Methodology | Full Text

Abstract:  Background

Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.

Methods

Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers in three repositories (CSDR, YODA Project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g., mentions in policy sources, media attention), and the total number of citations were compared between the two groups.

Results

89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses, and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistical difference was found on any of the components of the Altmetric Attention Score. The median (interquartile range) numbers of citations were 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.

Conclusions

Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, the matching choices have some limitations, so results should be interpreted cautiously. Also, citations of re-uses by policy sources were rare.
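The abstract does not name the statistical test used for the group comparisons above, but a nonparametric two-sample comparison of Altmetric Attention Scores could be sketched as follows. The scores here are invented illustrative values, not the study's data, and Mann–Whitney is only one plausible choice of test:

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical Altmetric Attention Scores (illustrative values only,
# loosely echoing the medians reported in the abstract).
reuse = np.array([5.9, 22.2, 1.3, 8.0, 14.5, 3.2, 0.5, 30.1])
controls = np.array([2.8, 12.3, 0.3, 4.1, 1.0, 6.7, 0.2, 9.9])

def median_iqr(x):
    """Return (median, 25th percentile, 75th percentile)."""
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return med, q1, q3

stat, p = mannwhitneyu(reuse, controls, alternative="two-sided")
print("re-use  median (IQR): %.1f (%.1f-%.1f)" % median_iqr(reuse))
print("control median (IQR): %.1f (%.1f-%.1f)" % median_iqr(controls))
print("Mann-Whitney U = %.1f, p = %.3f" % (stat, p))
```

With samples this small the test has little power, which mirrors the abstract's caveat that small average differences remain possible given the limited sample size.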

Flowcite Integrates Smart Citations from Scite to Speed Up Research – Flowcite

“Flowcite – a Germany-based service providing an all-in-one platform for academic research, writing, editing, and publishing – partners up with Brooklyn-based scite.ai to offer quick source evaluation for its users to ensure quality, improve the relevance of results, and thus save time on research….”

Assessing number and quality of urology open access journals… : Current Urology

Abstract:  Background/Aims: 

There is clear evidence that publishing research in an open access (OA) journal or under an OA model is associated with higher impact, in terms of the number of reads and citation rates. The development of OA journals and their quality are poorly studied in the field of urology. In this study, we aim to assess the number of OA journals, their quality in terms of CiteScore, percent cited, and quartiles, and their scholarly production during the period from 2011 to 2018.

Methods: 

We obtained data for all Scopus-indexed journals during the period from 2011 to 2018 from www.scopus.com and filtered the list for urology journals. For each journal, we extracted the following indices: CiteScore, citations, scholarly output, and SCImago quartile. We then analyzed the difference in quality indices between OA and non-OA urology journals.

Results: 

Urology journals increased from 66 journals in 2011 to 99 journals in 2018. The number of OA urology journals increased from only 10 (15.2%) journals in 2011 to 33 (33.3%) journals in 2018. The number of quartile 1 (top 25%) journals increased from only 1 journal in 2011 to 5 journals in 2018. Non-OA urology journals had significantly higher CiteScores than OA journals until 2015, after which the mean difference in CiteScore became smaller and statistically nonsignificant.

Conclusion: 

The number and quality of OA journals in the field of urology have increased over the last few years. Despite this increase, non-OA urology journals still show higher quality and output.

Introducing the Journal Citation Indicator: A new, field-normalized measurement of journal citation impact – Web of Science Group

“In a recent blog post we discussed refinements in this year’s forthcoming release of the Journal Citation Reports (JCR)™, describing the addition of new content and hinting at a new metric for measuring the citation impact of a journal’s recent publications.

I’m now pleased to fully introduce the Journal Citation Indicator. By normalizing for different fields of research and their widely varying rates of publication and citation, the Journal Citation Indicator provides a single journal-level metric that can be easily interpreted and compared across disciplines….”

Can altmetric mentions predict later citations? A test of validity on data from ResearchGate and three social media platforms | Emerald Insight

Abstract:  Purpose

The main purpose of this study is to explore and validate the question of whether altmetric mentions can predict citations to scholarly articles. The paper attempts to explore the nature and degree of correlation between altmetrics (from ResearchGate and three social media platforms) and citations.

Design/methodology/approach

A large data sample of scholarly articles published from India in the year 2016 is obtained from the Web of Science database, and the corresponding altmetric data are obtained from ResearchGate and three social media platforms (Twitter, Facebook, and blogs, via the Altmetric.com aggregator). Correlations are computed between early altmetric mentions and later citation counts, for data grouped into different disciplinary groups.

Findings

Results show that the correlation between altmetric mentions and citation counts is positive but weak. Correlations are relatively higher for data from ResearchGate than for data from the three social media platforms. Further, significant disciplinary differences are observed in the degree of correlation between altmetrics and citations.

Research limitations/implications

The results support the idea that altmetrics do not necessarily reflect the same kind of impact as citations. However, articles that get higher altmetric attention early on may have a slight citation advantage. Further, altmetrics from academic social networks like ResearchGate are more strongly correlated with citations than those from social media platforms.

Originality/value

The paper is novel in two respects. First, it takes altmetric data for a window of about 1–1.5 years after article publication and citation counts for a longer window of about 3–4 years after publication. Second, it is one of the first studies to analyze data from ResearchGate, a popular academic social network, to understand the type and degree of these correlations.
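The rank-correlation analysis described above can be sketched with synthetic data. The counts below are invented for illustration (a weak positive dependence is built in); they are not the study's dataset, and the study's exact correlation measure is not specified here beyond being a correlation between mentions and citations:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic data: later citation counts weakly driven by early
# altmetric mentions (illustrative only).
mentions = rng.poisson(3, size=500)
citations = rng.poisson(5 + 0.3 * mentions)

rho, p = spearmanr(mentions, citations)
print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")
```

Spearman's rho is a common choice for skewed count data like citations and mentions, since it depends only on ranks; a rho well below 0.5, as the abstract reports, indicates a positive but weak association.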

Filtering Academic Content by Re-use – Citation Counts and Altmetric Scores

“The demands on researchers to make all of the products of their research openly available continue to grow. As a result, the balance between carrots and sticks for incentivising open research continues to be investigated.

The State of Open Data report (1) identifies a perceived lack of credit for sharing data among over 50% of those surveyed. The same respondents identified ‘Full Data Citation’ as the biggest motivating factor to publish data….”

Influence of accessibility (open and toll-based) of scholarly publications on retractions | SpringerLink

“We have examined retracted publications in different subject fields and attempted to analyse whether online free accessibility (Open Access) influences retraction, by examining the scholarly literature published from 2000 through 2019, covering the most recent 20 years of publications. InCites, a research analytics tool developed by Clarivate Analytics®, in consultation with the Web of Science, PubMed Central, and Retraction Watch databases, was used to harvest data for the study. Retracted ‘Article’ and ‘Review’ publications were examined with respect to their online accessibility mode (Toll Access and Open Access), based on non-parametric methods like the Odds Ratio, Wilcoxon Signed Rank Test, Mann–Whitney U Test, and the Mann–Kendall and Sen’s methods. The odds for OA articles to have a retraction are about 1.62 times as large (62% higher) compared with TA articles (95% CI 1.5, 1.7). 0.028% of OA publications are retracted, compared with 0.017% of TA publications. Retractions have occurred in all subject areas. In eight subject areas, the odds of retraction of OA articles are larger than those of TA articles. In three subject areas, the odds of retraction of OA articles are smaller than those of TA articles. In the remaining 11 subject areas, no significant difference is observed. Post-retraction, although a decline is observed in the citation counts of both OA and TA publications (p < .01), the odds for OA articles to get cited after retraction are about 1.21 times as large (21% higher) compared with TA articles (95% CI 1.53, 1.72). TA publications are retracted earlier than OA publications (p < .01). We observed an increasing trend of retracted works published in both modes. However, the rate of retraction of OA publications is double the rate of retraction of TA publications.”
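An odds ratio of the kind quoted above, with a Wald 95% confidence interval, can be sketched from a 2×2 table of counts. The counts below are invented to roughly match the retraction rates in the excerpt (0.028% OA vs. 0.017% TA) and are not the study's actual data:

```python
import math

# Hypothetical 2x2 table of retraction counts (illustrative only):
#               retracted   not retracted
# Open Access       a             b
# Toll Access       c             d
a, b = 280, 999_720   # ~0.028% of OA publications retracted
c, d = 170, 999_830   # ~0.017% of TA publications retracted

odds_ratio = (a / b) / (c / d)

# Wald 95% confidence interval, computed on the log-odds scale.
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

With these made-up counts the odds ratio comes out near the study's 1.62; an interval that excludes 1 is what makes the OA-vs-TA difference statistically significant.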

Is Sci-Hub Increasing Visibility of Indian Research Papers? An Analytical Evaluation

Abstract:  Sci-Hub, founded by Alexandra Elbakyan in Kazakhstan in 2011, has over the years emerged as a very popular source for researchers to download scientific papers. It is believed that Sci-Hub contains more than 76 million academic articles. However, three foreign academic publishers (Elsevier, Wiley, and the American Chemical Society) have recently filed a lawsuit against Sci-Hub and LibGen before the Delhi High Court and prayed for the complete blocking of these websites in India. It is in this context that this paper attempts to find out how many Indian research papers are available on Sci-Hub and who downloads them. The citation advantage of Indian research papers available on Sci-Hub is analysed, with results confirming that such an advantage does exist.

Data Citation: Let’s Choose Adoption Over Perfection | Zenodo

“In the last decade, attitudes towards open data publishing have continued to shift, including a rising interest in data citation as well as in incorporating open data in research assessment (see Parsons et al. for an overview). This growing emphasis on data citation is driving incentives and evaluation systems for researchers publishing their data. While increased efforts and interest in data citation are a move in the right direction for understanding research data impact and assessment, there are clear difficulties and roadblocks to achieving universal and accessible data citation across all research disciplines. But these roadblocks can be mitigated and do not need to keep us in a constant limbo.

The unique properties of data as a citable object have attracted much-needed attention, although they have also created an unhelpful perception that data citation is a challenge and requires uniquely burdensome processes to implement. This perception of difficulty begins with defining a ‘citation’ for data. The reality is that all citations are relationships between scholarly objects. A ‘data citation’ can be as simple as a journal article or other dataset declaring that a dataset was important to the creation of that work. This is not a unique challenge.

However, many publishers and funders have elevated the relationship of data that “underlies the research” into a Data Availability Statement (DAS). This has helped address some issues publishers have found with typesetting or production techniques that stripped non-articles from citations. However, because of this segmentation of data from typical citation lists, and the exclusion of data citations from article metadata, many communities have felt they are at a stalemate about how to move forward….”

Google Scholar, Web of Science, and Scopus: Which is best for me? | Impact of Social Sciences

“Being able to find, assess, and place new research within a field of knowledge is integral to any research project. For social scientists this process is increasingly likely to take place on Google Scholar, closely followed by traditional scholarly databases. In this post, Alberto Martín-Martín, Enrique Orduna-Malea, Mike Thelwall, and Emilio Delgado-López-Cózar analyse the relative coverage of the three main research databases, Google Scholar, Web of Science, and Scopus, finding significant divergences in the social sciences and humanities, and suggest that researchers face a trade-off when using different databases: between more comprehensive but disorderly systems, and orderly but limited systems….”

Correlation Between Social Media Postings and Academic Citations of Hand Surgery Research Publications: A Pilot Study Using Twitter and Google Scholar – Journal of Hand Surgery

Abstract:  Purpose

The relationship between social media postings and academic citations of hand surgery research publications is not known. The objectives of this study were (1) to quantify adoption of social media for the dissemination of original research publications by 3 hand surgery journals, and (2) to determine the correlation between social media postings and academic citations in recent hand surgery research publications.

Methods

An Internet-based study was performed of all research articles from 3 hand surgery journals published from January 2018 to March 2019. A final sample of 472 original full-length scientific research articles was included. For each article, the total number of social media postings was determined using Twitter, as well as the number of tweets, number of retweets, number of tweets from an official outlet, and number of tweets from an author. The number of academic citations for each article was determined using Google Scholar.

Results

The average number of academic citations per article was 3.9. The average number of social media posts per article was 3.2, consisting of an average of 1.3 tweets and 1.9 retweets per article. The number of academic citations per article was weakly correlated with the number of social media postings, the number of tweets, and the number of retweets. The numbers of tweets from an official outlet and from an author were also weakly correlated with academic citations.

Conclusions

In the early adoption of social media for the dissemination of hand surgery research, there is a weak correlation between social media posting of hand surgery research and academic citation.