Challenges of scholarly communication: bibliometric transparency and impact

Abstract:  Citation metrics have value because they aim to make scientific assessment a level playing field, but urgent transparency-based adjustments are necessary to ensure that measurements yield the most accurate picture of impact and excellence. One problematic area is the handling of self-citations, which are either excluded or inappropriately accounted for when bibliometric indicators are used for research evaluation. In this talk, arguing in favour of openly tracking self-citations, I report on a study of self-referencing behaviour across academic disciplines as captured by the curated bibliometric database Web of Science. Specifically, I examine the behaviour of thousands of authors grouped into 15 subject areas, such as Biology, Chemistry, Science and Technology, Engineering, and Physics. I focus on the methodological set-up of the study and discuss data-science problems such as author name disambiguation and bibliometric indicator modelling. This talk is based on the following publication: Kacem, A., Flatt, J. W., & Mayr, P. (2020). Tracking self-citations in academic publishing. Scientometrics, 123(2), 1157–1165. https://doi.org/10.1007/s11192-020-03413-9
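To make the object of study concrete: a common operational definition treats a reference as a self-citation when the citing and cited papers share at least one author. The sketch below is a minimal illustration of that idea, not the authors' pipeline; it uses normalized name strings as a crude stand-in for real author name disambiguation, and all function and variable names are hypothetical.

```python
# Minimal sketch: flag self-citations by author-list overlap.
# Real pipelines on Web of Science data need proper author name
# disambiguation; normalized name strings are only a crude proxy.

def normalize(name: str) -> str:
    """Reduce 'Surname, Given' to 'surname g' (surname + first initial)."""
    surname, _, given = name.partition(",")
    return f"{surname.strip().lower()} {given.strip()[:1].lower()}"

def is_self_citation(citing_authors: list[str], cited_authors: list[str]) -> bool:
    """True if any author appears on both the citing and the cited paper."""
    return bool({normalize(a) for a in citing_authors}
                & {normalize(a) for a in cited_authors})

def self_citation_rate(paper_authors: list[str], references: list[list[str]]) -> float:
    """Share of a paper's references that are self-citations."""
    if not references:
        return 0.0
    return sum(is_self_citation(paper_authors, r) for r in references) / len(references)

# Toy example: one of three references shares an author with the citing paper.
authors = ["Smith, Jane", "Lee, Ann"]
refs = [["Lee, Ann", "Park, Min"], ["Doe, John"], ["Roe, Rachel"]]
print(self_citation_rate(authors, refs))  # 0.333...
```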

 

Visual citation navigation of open education resources using Litmaps | Emerald Insight

Abstract:  Purpose

The purpose of this study is to visualize the key literature on the topic “Open Educational Resources” using the research discovery tool “Litmaps”.

Design/methodology/approach

Litmaps, a visual citation navigation and science discovery platform, is used for the present study. It provides an interface for discovering scientific literature, exploring the research landscape and finding articles that are highly connected within a map. Litmaps offers quick-start options to import articles from a reference manager, a keyword search, an ORCID iD, a DOI or a seed article. In this paper, the “keyword search” option with the search strategy “Open Educational Resources” or “OER” is used.

Findings

The findings of the study revealed that Litmaps visually displays citations between articles over time. The generated map is dynamic, as it can be adjusted to the researcher’s needs.

Research limitations/implications

Litmaps helps researchers conduct a literature review in a brief and systematic way. It is helpful in finding related or relevant studies through a seed paper or keyword search.

Originality/value

The study makes a useful contribution to the literature on this topic, as one can independently find research topics and also compare topic overlap. The study provides insights that help researchers build citation maps and see connections between articles over time. The originality of the present paper lies in highlighting the importance of the research discovery tool Litmaps for researchers since, to the best of the authors’ knowledge, no research on its use has been published so far.

A comparison of scientometric data and publication policies of ophthalmology journals

Abstract: Purpose: 

This retrospective database analysis study aims to present the scientometric data of journals in the field of ophthalmology and to compare those data according to the journals’ open access (OA) publishing policies.

Methods: 

The scientometric data of 48 journals were obtained from the Clarivate Analytics InCites and Scimago Journal & Country Rank websites. The journal impact factor (JIF), Eigenfactor score (ES), SCImago Journal Rank (SJR), and Hirsch index (HI) were included. The OA publishing policies were separated into full OA with publishing fees, full OA without fees, and hybrid OA. The fees were stated in US dollars (USD).

Results: 

The four scientometric indexes had strong positive correlations; the highest correlation coefficients were observed between the SJR and JIF (R = 0.906) and between the SJR and HI (R = 0.798). However, some journals in the first quartile according to the JIF were in the second and third quartiles according to the SJR and HI and in the fourth quartile according to the ES. The OA articles published in hybrid journals received a median of 1.17-fold (0.15–2.71) more citations. Only the HI was higher in hybrid OA journals; the other scientometric indexes were similar to those of full OA journals. Full OA journals charged a median of 1525 USD less than hybrid journals.

Conclusion: 

The full OA model in ophthalmology journals does not have a positive effect on the scientometric indexes. In hybrid OA journals, choosing to publish OA may increase citations, but it would be more accurate to evaluate this on a journal-by-journal basis.
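For readers who want to replicate this kind of comparison, here is a minimal pandas sketch, not the authors' code: it correlates four journal-level indexes and assigns quartiles per index, showing how a journal's quartile can shift depending on the metric. All values and column names are invented.

```python
# Sketch: correlate journal metrics and assign per-metric quartiles.
# Invented values; real data would come from InCites / SCImago exports.
import pandas as pd

journals = pd.DataFrame({
    "journal": ["J1", "J2", "J3", "J4", "J5", "J6", "J7", "J8"],
    "JIF": [13.0, 5.1, 3.9, 3.0, 2.4, 1.9, 1.4, 0.9],
    "SJR": [6.9, 2.6, 2.0, 1.4, 1.1, 0.8, 0.6, 0.4],
    "HI":  [190, 110, 95, 70, 60, 45, 30, 20],
    "ES":  [0.05, 0.002, 0.015, 0.01, 0.008, 0.006, 0.004, 0.02],
})

# Rank correlations between the four indexes (the paper reports strong
# positive correlations, e.g. between SJR and JIF).
print(journals[["JIF", "SJR", "HI", "ES"]].corr(method="spearman"))

# Quartile per metric (Q1 = top 25%): a journal can sit in different
# quartiles depending on which index is used.
for metric in ["JIF", "SJR", "HI", "ES"]:
    ranks = journals[metric].rank(ascending=False)
    journals[f"{metric}_quartile"] = pd.qcut(ranks, 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(journals.filter(like="_quartile"))
```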

At what point do academics forego citations for journal status? | Impact of Social Sciences

“The limitations of journal-based citation metrics for assessing individual researchers are well known. However, the way in which these assessment systems differentially shape research practices within disciplines is less well understood. Presenting evidence from a new analysis of business and management academics, Rossella Salandra, Ammon Salter and James Walker explore how journal status is valued by these academics and the point at which journal status becomes more prized than academic influence….”

Correlation Between Altmetric Attention Scores and Citations for Articles Published in High–Impact Factor Ophthalmology Journals From 2018 to 2019 | Medical Journals and Publishing | JAMA Ophthalmology | JAMA Network

Importance

The Altmetric attention score (AAS) provides new information to gauge the impact of a research article not found through typical metrics, such as impact factor or citation counts.

Objective

To explore the association between AAS and common impact markers among high-impact ophthalmology journals from 2018 to 2019.

Design, Setting, and Participants

All articles published in the American Journal of Ophthalmology (AJO), JAMA Ophthalmology (JAMAO), and Ophthalmology (OPH) from January 1, 2018, to December 31, 2019, were collected for this cross-sectional study. Excluded articles were those missing Altmetric data at the time of data collection. The AAS and associated social media impact for each article were collected with the AAS calculator bookmarklet. Spearman rank correlation analyses and analysis of variance tests were conducted to assess differences in various metrics between AJO, JAMAO, and OPH. The study included articles of all document types (article, conference paper, editorial, erratum, letter, note, retracted, review, and short survey) and access status (open access and not open access).

Main Outcomes and Measures

The correlation between citation counts and Altmetric variables including AAS.

Results

A total of 2467 articles were published in the study period. There were 351 articles excluded owing to missing Altmetric data. Of the 2116 articles included in the analysis, 1039 (49.1%) were published in 2018, and 1077 (50.9%) were published in 2019; the mean number of citations was 8.8 (95% CI, 7.9-9.6) for AJO, 6.2 (95% CI, 5.3-7.1) for JAMAO, and 15.1 (95% CI, 13.3-17.0) for OPH. The mean AAS was 4.5 (95% CI, 3.3-5.6) for AJO (723 publications), 27.4 (95% CI, 22.1-32.8) for JAMAO (758 publications), and 15.1 (95% CI, 10.9-19.3) for OPH (635 publications). Citation rate was moderately correlated with AAS across the 3 journals (AJO, ρ = 0.39; P < .001; JAMAO, ρ = 0.41; P < .001; OPH, ρ = 0.40; P < .001), as well as minimally or moderately correlated with engagement or mention by Facebook posts (AJO, ρ = 0.38; P < .001; JAMAO, ρ = 0.24; P < .001; OPH, ρ = 0.20; P < .001), news outlet reporting (AJO, ρ = 0.12; P < .001; JAMAO, ρ = 0.38; P < .001; OPH, ρ = 0.19; P < .001), and Twitter posts (AJO, ρ = 0.40; P < .001; JAMAO, ρ = 0.38; P < .001; OPH, ρ = 0.42; P < .001).

Conclusions and Relevance

Results of this cross-sectional study suggest that citation rate has a moderate positive correlation with online and social media sharing of research in ophthalmology literature. Peer-reviewed journals may increase their reach and impact by sharing their literature through social media and online platforms.
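As a rough illustration of the analysis reported above (not the authors' code), per-journal Spearman rank correlations between citation counts and Altmetric attention scores can be computed with SciPy; the records below are invented.

```python
# Sketch: per-journal Spearman correlation between citations and AAS.
# Invented records; a real analysis would load the study's dataset.
from collections import defaultdict
from scipy.stats import spearmanr

articles = [
    # (journal, citation_count, altmetric_attention_score)
    ("AJO", 12, 5), ("AJO", 3, 1), ("AJO", 25, 14), ("AJO", 7, 2),
    ("JAMAO", 4, 30), ("JAMAO", 9, 55), ("JAMAO", 2, 8), ("JAMAO", 15, 70),
    ("OPH", 20, 18), ("OPH", 6, 4), ("OPH", 33, 25), ("OPH", 11, 9),
]

by_journal = defaultdict(lambda: ([], []))
for journal, citations, aas in articles:
    by_journal[journal][0].append(citations)
    by_journal[journal][1].append(aas)

for journal, (citations, aas) in sorted(by_journal.items()):
    rho, p = spearmanr(citations, aas)
    print(f"{journal}: rho={rho:.2f}, p={p:.3f}")
```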

Cite-seeing and Reviewing: A Study on Citation Bias in Peer Review

Citations play an important role in researchers’ careers as a key factor in the evaluation of scientific impact. Many anecdotes advise authors to exploit this fact and cite prospective reviewers to try to obtain a more positive evaluation for their submission. In this work, we investigate whether such a citation bias actually exists: does the citation of a reviewer’s own work in a submission cause them to be positively biased towards the submission? In conjunction with the review process of two flagship conferences in machine learning and algorithmic economics, we execute an observational study to test for citation bias in peer review. In our analysis, we carefully account for various confounding factors such as paper quality and reviewer expertise, and apply different modeling techniques to alleviate concerns regarding model mismatch. Overall, our analysis involves 1,314 papers and 1,717 reviewers and detects citation bias in both venues we consider. In terms of effect size, by citing a reviewer’s work, a submission has a non-trivial chance of getting a higher score from the reviewer: the expected increase in the score is approximately 0.23 on a 5-point Likert item. For reference, a one-point increase of a score by a single reviewer improves the position of a submission by 11% on average.
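A drastically simplified way to probe for such a bias (not the paper's actual causal methodology) is to regress review scores on a cited-the-reviewer indicator while controlling for proxies of paper quality and reviewer expertise. A sketch with statsmodels on synthetic data, where the true bias is set to the paper's reported 0.23:

```python
# Sketch: naive OLS estimate of citation bias on synthetic review data.
# score ~ cited + quality + expertise; the 'cited' coefficient is the
# bias estimate (the paper uses far more careful observational methods).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
quality = rng.normal(3.0, 0.7, n)       # latent paper-quality proxy
expertise = rng.uniform(1, 5, n)        # reviewer self-reported expertise
cited = rng.binomial(1, 0.3, n)         # 1 if the submission cites the reviewer
score = quality + 0.05 * expertise + 0.23 * cited + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([cited, quality, expertise]))
fit = sm.OLS(score, X).fit()
print(f"estimated bias: {fit.params[1]:.3f} (SE {fit.bse[1]:.3f})")
```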

Asclepias: Citing Software, Making Science

“The Asclepias Project builds networks of citations between the astronomical academic literature and software, helping you find the tools to push your research forward….

The Asclepias Project is a joint effort of the American Astronomical Society, the NASA Astrophysics Data System, Zenodo, and Sidrat Research, funded by the Alfred P. Sloan Foundation.”

Measures of Impact for Journals, Articles, and Authors | SpringerLink

“Journals and authors hope the work they do is important and influential. Over time, a number of measures have been developed to measure author and journal impact. These impact factor instruments are expanding and can be difficult to understand. The varying measures provide different perspectives and have varying strengths and weaknesses. A complete picture of impact for individual researchers and journals requires using multiple measures, and even then not all aspects of influence are fully captured. There are only a few players in the scholarly publishing world that collect data on article citations: Clarivate Analytics, Elsevier, and Google Scholar (Table 1). Measures of influence for authors and journals based on article citations use one of these sources and may vary slightly because of differing journal coverage….”

The rise of citational justice: how scholars are making references fairer

“Studies in bibliometrics have revealed persistent biases in citation patterns — women and people of colour, for instance, garner citations at lower rates than men do. An increasing number of researchers are calling on academics to acknowledge the inequities in citational practices — and, by paying more heed to work from groups that are typically under-cited, take action to reduce them. Some are referring to this idea as ‘citational ethics’ or ‘citational justice’. Initiatives include computer code that helps academics to estimate the balances of gender and race in their papers’ reference lists, a push for ‘citation diversity statements’ in research papers, and websites dedicated to highlighting papers from under-recognized groups. Journals, too, have started to take action, with some introducing guidance and tools for authors to highlight and address citational inequities in their own papers.”
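The code mentioned above typically works by inferring probable gender (or race) from author names and aggregating over a paper's reference list. The sketch below is a heavily simplified, hypothetical illustration of that idea: real citation-diversity tools use large probabilistic name databases and report uncertainty, whereas the tiny lookup table here is invented.

```python
# Hypothetical sketch: estimate the gender balance of a reference list
# from first authors' given names. Real tools use large probabilistic
# name databases; this toy lookup table is illustrative only.
from collections import Counter

NAME_GENDER = {  # invented toy lookup, not a real database
    "maria": "woman", "susan": "woman", "priya": "woman",
    "john": "man", "ahmed": "man",
}

def reference_gender_counts(first_author_names: list[str]) -> Counter:
    """Count inferred gender categories over a list of given names."""
    counts = Counter()
    for name in first_author_names:
        counts[NAME_GENDER.get(name.lower(), "unknown")] += 1
    return counts

refs = ["John", "Maria", "Wei", "Ahmed", "Susan", "Priya", "John"]
counts = reference_gender_counts(refs)
total = sum(counts.values())
for category, n in sorted(counts.items()):
    print(f"{category}: {n} ({100 * n / total:.0f}%)")
```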

Guest Post – New Winds from the Latin American Scientific Publishing Community – The Scholarly Kitchen

“To help evaluate interest in the idea of a regional association and to better understand editors’ perspectives on the use of journal metrics for science evaluations, a survey of journal editors was carried out, with 20 questions aimed at characterizing the journal they edit, such as subject area(s), audience, business model and adoption of open science, coverage by databases, strategies for increasing visibility, and use of metrics and indicators for journal management. The survey also included four questions about the use of citation impact indicators for national evaluations of science performed by governmental agencies in Latin America and their effects on the publication and research activities in the region….

A large majority of the editors who responded to the survey felt that the use of citation impact indicators for evaluating science in Latin America is inadequate or partially adequate (70%-88% depending on the specific area of evaluation)….

This feedback was used to support the development of the ALAEC Manifesto for the responsible use of metrics in research evaluation in Latin America and the Caribbean, which calls for a more inclusive and responsible use of journal-based metrics in research evaluation. It supports previous manifestos, such as the San Francisco Declaration on Research Assessment – DORA (2012), the Leiden Manifesto for Research Metrics (2015), and the Helsinki Initiative on Multilingualism in Scholarly Communication (2019). Acknowledging that the current criteria imposed by Latin American evaluating bodies have perverse consequences for the region’s journals and that authors will therefore have less incentive to submit articles to them, the manifesto has five main calls to action:

 

1. Re-establish quality criteria, valuing journals that:
   - Publish relevant research regardless of area or subject matter, language, target audience, or geographic scope
   - Bring a broad spectrum of scholarly and research contributions, such as replication, innovation, translation, synthesis, and meta-research
   - Practice open science, including open access
   - Adopt high ethical standards, prioritizing quality and integrity in scientific publication
2. Value and stimulate the work of scientific editors and their teams, promoting their training and development, and recognizing their fundamental role in the adoption and dissemination of good practices in scientific publication.
3. Ensure that national journals and publishers do not lose financial incentives and the flow of article submissions, allowing them to achieve and maintain high standards of quality and integrity in their editorial processes, especially for journals that practice open science and multilingualism.
4. Strengthen, disseminate, and protect national and regional infrastructures for scientific communication (SciELO, RedALyC, Latindex, LA Referencia, and non-commercial CRIS systems) that favor open science and multilingualism, and that can generate the most appropriate metrics and indicators to evaluate local and regional science.
5. Encourage and value collaborative networks and exchanges between all actors in the ecosystem of knowledge production and dissemination: institutions, authors, reviewers, funding agencies, etc., in the region….”

Surveillance Publishing · Elephant in the Lab

“Clarivate’s business model is coming for scholarly publishing. Google is one peer, but the company’s real competitors are Elsevier, Springer Nature, Wiley, Taylor & Francis, and SAGE. Elsevier, in particular, has been moving into predictive analytics for years now. Of course the publishing giants have long profited off of academics and our university employers—by packaging scholars’ unpaid writing-and-editing labor only to sell it back to us as usuriously priced subscriptions or article processing charges (APCs). That’s a lucrative business that Elsevier and the others won’t give up. But they’re layering another business on top of their legacy publishing operations, in the Clarivate mold. The data trove that publishers are sitting on is, if anything, far richer than the citation graph alone.

Why worry about surveillance publishing? One reason is the balance sheet, since the companies’ trading in academic futures will further pad profits at the expense of taxpayers and students. The bigger reason is that our behavior—once alienated from us and abstracted into predictive metrics—will double back onto our work lives. Existing biases, like male academics’ propensity for self-citation, will receive a fresh coat of algorithmic legitimacy. More broadly, the academic reward system is already distorted by metrics. To the extent that publishers’ tallies and indices get folded into grant-making, tenure-and-promotion, and other evaluative decisions, the metric tide will gain power. The biggest risk is that scholars will internalize an analytics mindset, one already encouraged by citation counts and impact factors….”

Article Processing Charges, Altmetrics and Citation Impact: Is there an economic rationale?

Abstract:  The present study aims to analyze 1) the relationship between the Citation Normalized Score of scientific publications and the Article Processing Charges (APCs) of Gold Open Access (OA) publications and 2) the determinants of APCs. To do so, we used APC information provided by the OpenAPC database, citation scores of publications from the WoS database and, for Altmetrics, data from the Altmetric.com database, over the period from 2006 to 2019 for 83,752 articles published in 4751 journals belonging to 267 distinct publishers. Results show that, contrary to common belief, paying high APCs does not necessarily increase the impact of publications. First, large publishers with high impact are not the most expensive. Second, publishers with the highest APCs are not necessarily the best in terms of impact. The correlation between APCs and impact is moderate. Regarding the determinants, results indicate that APCs are on average 50% higher in hybrid journals than in full OA journals. The results also suggest that Altmetrics do not have a great impact: OA articles that have garnered the most attention on the internet are articles with relatively low APCs. Another interesting result is that the “number of readers” indicator is more effective, as it is more correlated with classic bibliometric indicators than the Altmetrics score.
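A rough sketch of the hybrid-versus-full-OA comparison described above (not the authors' code): OpenAPC does publish per-article APC records, but the rows below are invented.

```python
# Sketch: compare APC levels and citation impact across OA models.
# Invented rows; real data would come from the OpenAPC dataset.
import pandas as pd

apcs = pd.DataFrame({
    "journal_type": ["hybrid", "hybrid", "hybrid", "hybrid",
                     "full_oa", "full_oa", "full_oa", "full_oa"],
    "apc_usd": [3200, 2900, 3500, 3100, 1800, 1500, 2100, 1700],
    "citation_normalized_score": [1.2, 0.9, 1.4, 1.0, 1.1, 0.8, 1.3, 0.9],
})

# Median APC by journal type: hybrid journals tend to charge more.
print(apcs.groupby("journal_type")["apc_usd"].median())

# Rank correlation between APC and impact: expected to be only moderate.
print(apcs["apc_usd"].corr(apcs["citation_normalized_score"], method="spearman"))
```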

 

China overtakes the US in terms of research quality, finds study – Physics World

“The quality of China’s scientific research output exceeded that of the US in 2019. That is according to a new analysis by researchers in the US, which also found that China had already overtaken the European Union in terms of research quality by 2015.  

China’s total research output has grown rapidly in recent years, but there has been a widespread belief that its “quality” – judged by the number of citations papers receive – is not as high as that of other countries. A common measure of a nation’s research quality is the percentage of its papers appearing in the top 1% of the most-cited papers globally. Since citation practices vary widely across disciplines, researchers typically weight the citation data of papers according to their fields before comparing countries’ scientific output. When comparing field-weighted citation data, the US has a higher percentage of research in the top 1% worldwide than China does….”
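The top-1% indicator described above can be made concrete with a small sketch (illustrative only; real analyses draw field-normalized citation data from databases such as Web of Science or Scopus, and the records here are randomly generated):

```python
# Sketch: a country's share of papers in the global top 1% most cited,
# with the 99th-percentile threshold computed within each field so that
# field-specific citation norms are respected.
import pandas as pd

papers = pd.DataFrame({
    "country": ["US", "CN", "US", "CN", "US", "CN", "US", "CN"] * 50,
    "field": (["biology"] * 4 + ["physics"] * 4) * 50,
    "citations": pd.Series(range(400)).sample(frac=1, random_state=0).to_list(),
})

# Flag papers at or above the 99th citation percentile of their own field.
threshold = papers.groupby("field")["citations"].transform(lambda c: c.quantile(0.99))
papers["top1pct"] = papers["citations"] >= threshold

# Share of each country's output landing in the field-weighted top 1%.
print(papers.groupby("country")["top1pct"].mean())
```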