Article processing charges for open access journal publishing: A review – Borrego – Learned Publishing – Wiley Online Library

Abstract:  Some open access (OA) publishers charge authors fees to make their articles freely available online. This paper reviews literature on article processing charges (APCs) that has been published since 2000. Despite praise for diamond OA journals, which charge no fees, most OA articles are published by commercial publishers that charge APCs. Publishers fix APCs depending on the reputation assigned to journals by peers. Evidence shows a relationship between high impact metrics and higher, faster rising APCs. Authors express reluctance about APCs, although this varies by discipline depending on previous experience of paying publication fees and the availability of research grants to cover them. Authors rely on a mix of research grants, library funds and personal assets to pay the charges. Two major concerns have been raised in relation to APCs: the inability of poorly funded authors to publish research and their impact on journal quality. Waivers have not solved the first issue. Research shows little extension of waiver use, unintended side effects on co-author networks and concerns regarding criteria to qualify for them. Bibliometric studies concur that journals that charge APCs have a similar citation impact to journals that rely on other income sources.


Open Access Advantages as a Function of the Discipline: Mixed-methods Study – ScienceDirect

Abstract:  Purpose

This mixed-methods study integrates bibliometric and altmetric investigation with a qualitative method in order to assess the prevalence and societal impact of Open Access (OA) publications, and to reveal the considerations behind researchers’ decisions to publish articles in closed and open access.


The bibliometric-altmetric study analyzed 584 OA and closed publications published between 2014 and 2019 by 40 Israeli researchers: 20 from STEM (Science, Technology, Engineering, Math) and 20 from SSH (Social Sciences and Humanities) disciplines. We used a multistage cluster sampling method to select a representative sample for the STEM disciplines group (engineering, computer science, biology, mathematics, and physics) and for the SSH disciplines group (sociology, economics, psychology, political science, and history). Required data were extracted from the Scopus and Unpaywall databases and the PlumX platform. Among the 40 researchers selected for the bibliometric-altmetric study, 20 agreed to be interviewed.


Comparing bibliometrics and altmetrics for the publications overall did not reveal any significant differences between OA and closed publications; differences were found only when comparing OA and closed publications across disciplines. STEM researchers published 59% of their publications in OA, compared to just 29% among those in SSH, and received significantly more bibliometric and altmetric citations from SSH OA publications and from their own closed-access publications. The altmetrics findings indicate that researchers are well acquainted with and active on social media. However, according to the interviewees, sharing research findings on social media brings no academic reward; it is viewed as a “public service”. Researchers’ primary consideration in publishing closed or OA was the journal impact factor.

Research limitations/implications

Our findings contribute to the growing body of research that addresses OA citation and societal-impact advantages. The findings suggest the need to adopt an OA policy after a thorough assessment of the consequences for SSH disciplines.

Journal impact factors and the future of open access publishing – Richardson – Journal of Applied Clinical Medical Physics – Wiley Online Library

“There are many challenges posed to publishers and scientific journals by the widespread use of the internet and the development of open access. It is not a perfect system, and many criticisms are valid. Reviews take a long time and are subject to bias. Reviewers are unrewarded for their efforts. Journal impact factors are becoming archaic, but no metric is perfect. New tools are being developed, but editors are not yet sure how to incorporate them into the process. All of these challenges will be faced by the JACMP and other open access journals. On an aspirational note, in 2018, the European Commission and European Research Council launched “cOAlition S,” an initiative (Plan S) that supports worldwide open access for research funded by public grants. Among others, the World Health Organization and the Bill and Melinda Gates Foundation are funders of Plan S. If enough entities agree that this is the correct path forward, we may see all journal platforms become open access and solve some of the financial problems therein.”

‘Responsible use of what?’ Navigating US university governance to approve an institutional statement on the responsible use of metrics

A slide presentation by Rachel Miles, Research Impact Coordinator at Virginia Tech University Libraries. 


Open Access Publication in Total Knee Arthroplasty is Associated with Increased Social Media Attention, but is not Associated with Increased Citations – The Journal of Arthroplasty

Abstract:  Background

Open access (OA) publication is growing in total joint arthroplasty literature. While OA manuscripts are free to view, these publications require a fee from authors. This study aimed to compare social media attention and citation rates between OA and non-OA publications in the total knee arthroplasty (TKA) literature.


The TKA articles published from 2016 to 2022 were identified using a national database; 9,606 publications were included, of which 4,669 (48.61%) were OA articles. Articles were grouped as OA or non-OA, and the Altmetric Attention Score (AAS), a weighted count of social media attention, and the Mendeley readership were analyzed using negative binomial regressions adjusting for days since publication. Independent t-tests were used to compare mean scores between the OA and non-OA groups.


The OA articles had a greater mean AAS (13.45 vs. 8.42, P = 0.012) and Mendeley readership (43.91 vs. 36.72, P < 0.001). OA was not an independent predictor of the number of citations when compared to non-OA articles (13.98 vs. 13.63, P = 0.914). Subgroup analysis of studies in the top 10 arthroplasty journals showed OA was not an independent predictor of AAS (13.51 vs. 9.53, P = 0.084) or number of citations (19.51 vs. 18.74, P = 0.495), but was an independent predictor of Mendeley readership (49.05 vs. 40.25, P < 0.003).


The OA publications in the TKA literature were associated with increased social media attention, but not with overall citations. This association was not observed among the top 10 journals. Authors may use these results to weigh the relative importance of readership, citations, and online engagement against the cost of OA publication.

Measuring the Impact and Influence of Scientific Activity in the Humanities and Social Sciences

Abstract:  Scientific activity in the Humanities and Social Sciences (HSS) presents special characteristics that require the use of various sources and methodologies to adequately assess its impact and influence on both academic and non-academic audiences. This study aims to explore the validity of traditional and alternative information sources for the analysis of the characteristics of HSS research and its academic impact and influence (considering social, media, informative and political influence). It is also intended to highlight the differences between Humanities (H) and Social Sciences (SS) and to analyse the variables that determine the different types of impact and influence of research in each of them. The following sources of information are used: Web of Science, conCIENCIA (institutional database), Google Scholar, Unpaywall, and Overton, focused on the study of the Spanish National Research Council (CSIC). The results obtained show that institutional sources make local research visible, which has high percentages of open access. The usefulness of alternative sources to measure social, media, informative and political influence is verified, since HSS publications have a considerable number of mentions. Significant differences are observed between H and SS in terms of publication coverage (higher in H in the institutional database), language (more Spanish in H), open access (higher percentages in SS) and impact measured through conCIENCIA (the greatest number of documents with a high impact is found in H). In addition, the influence on non-academic audiences is increased by the international orientation of research, the greater academic impact, the participation of SS centres and the immediacy of publications. This study is a starting point for future research, as it explores several tools and data sources to analyse the influence of HSS research on different audiences. A comprehensive analysis will also facilitate the proposal of new metrics applied to HSS assessment, highlighting its importance for society as a whole.


Distortion of journal impact factors in the era of paper mills: Molecular Therapy

Abstract:  Academia’s obsession with the journal impact factor has been a subject of debate for some time. Most would probably agree that it is useful as a crude measure of a journal’s prestige, quality, and general influence on a scientific or medical field but should not be overinterpreted. Nonetheless, some institutions go as far as disregarding a student’s or faculty member’s publications in journals with impact factors less than a certain number (often the magic number is 5) when it comes to performance evaluation, promotion, graduation, or hiring. Such overemphasis ignores that one journal with a lower impact factor may actually have more rigorous standards for acceptance of a paper than another with a higher impact factor. This situation may be observed for a variety of reasons, such as the degree of specialization of a journal or the ratio of review articles vs. original research papers. Another more nefarious contributor to a journal’s impact factor, manipulated citations, is also growing and threatening to expose the deepening cracks in the foundation of academia’s favorite metric.


Article-level metrics: A new approach to quantify reach and impact of published research – ScienceDirect

Abstract:  A spectrum of measuring tools is available to evaluate the impact of published literature and the journals in which it is published. Journal Level Metrics (JLM) such as the Journal Impact Factor (JIF) or CiteScore assess the reputation of peer-reviewed journals based on citation analysis, whereas Article Level Metrics (ALM) quantify the importance, reach and impact of a particular article and represent a new approach to quantifying the reach and impact of published research. Traditionally, JLM have served as a proxy for an individual publication’s significance; however, contemporary and evolving alternative metrics that measure the digital or societal influence of a particular article have gained popularity in recent times. These metrics help in the rapid dissemination of research, the development of newer research strategies and individual academic progress. We highlight the characteristics and importance of currently available ALM, and the newer ones influenced by social media, digital media and Open Access publishing models.




Altmetrics analysis of selected articles in the field of social sciences | Emerald Insight

Abstract:  Purpose

This study aims to measure the impact of the selected papers in the field of social sciences indexed in Scopus using altmetrics tools.


The research population consists of articles by Iranian researchers in the field of social sciences indexed in the Scopus database in 2014–2018. Some of the most important altmetric service providers were used to assess the presence of the research outputs on social media and to measure their impact. The relationship between altmetric activity and variables such as researchers’ scientific collaboration, open access journals and the quality of research journals has also been investigated through appropriate correlation tests.


The findings indicated that the most important social media platforms publishing Iranian articles are Mendeley, Twitter and Facebook. The correlation test showed a weak but statistically significant positive relationship between researchers’ scientific collaboration and their altmetric activity. There is also a weak but significant statistical relationship between journal openness and altmetric scores. The findings further suggest that articles published in journals with higher quality indicators have higher altmetric scores and are more likely to be present on social media.

Research implications

In this study, social network indicators have been introduced as a way to examine the effectiveness of research activities on social media. These indicators can be used to evaluate the impact and usefulness of articles and other scientific outputs, complementing traditional scientometric indicators and compensating for their shortcomings. What distinguishes altmetric criteria from other scientometric measures is their speed, ease and transparency. They allow publications to be evaluated regardless of their format and in the shortest possible time; in addition to scientific impact, the social impact of the works is also measured.


The results of these studies show that using altmetric service providers not only reflects the social impact of the publications of authors in different subject areas but also helps libraries, universities, research organizations and policymakers in planning, budgeting and allocating resources.

Altmetrics and their relationship with citation counts: a case of journal articles in physics | Emerald Insight

Abstract:  Purpose

The first purpose of the present study is to investigate the coverage of journal articles in Physics in various sources of altmetrics. Secondly, the study investigates the relationship between altmetrics and citations. Finally, the study also investigates whether the relationship between citations and altmetrics was stronger or weaker for those articles that had been mentioned at least once in the sources of altmetrics.


Journal articles in Physics with at least one author from an Indian institution, published during 2014–2018 and covered in sources of altmetrics, have been investigated. Altmetrics data were collected from an altmetrics aggregator. Spearman’s rank correlation coefficient (ρ) was used, as the data were found to be skewed.


The highest coverage was found on Twitter (22.68%), followed by Facebook (3.62%) and blogs (2.18%). Coverage in the remaining sources was less than 1%. The average number of Twitter mentions for journal articles tweeted at least once was 4 (3.99), and for Facebook mentions it was 1.48. Correlations between Twitter mentions and citations and between Facebook mentions and citations were found to be statistically significant but weakly positive.
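Because altmetric counts are heavily skewed (many zeros, a long tail), studies like this one use Spearman's rank correlation rather than Pearson's. A minimal sketch on simulated data (the counts below are invented for illustration, not the study's data):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 500

# Simulated skewed altmetric counts: many zeros, a long tail
tweets = rng.negative_binomial(1, 0.3, n)
# Citations loosely coupled to tweets
citations = rng.poisson(1 + 0.5 * tweets)

# Spearman's rho is rank-based, so it tolerates the skew
rho, p = spearmanr(tweets, citations)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```

Because it operates on ranks, Spearman's rho is robust to the extreme values that would distort a Pearson correlation on raw counts.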

Research limitations/implications

The study concludes that, owing to the low coverage of journal articles, altmetrics should be used cautiously for research evaluation, keeping in mind disciplinary differences. The study also suggests that altmetrics can function as a complement to citation-based metrics.


The study is one of the first large-scale altmetrics studies dealing with research in Physics. Also, Indian research has not been attended to in the altmetrics literature, and the present study fills that void.

Research assessment exercises are necessary — but we need to learn to do them better

“Research evaluation at the Australian Research Council (ARC), one of the country’s main funding bodies, is set to get a makeover. Last month, an independent expert review recommended that the ARC scrap its 13-year-old research-scoring system, known as Excellence in Research for Australia (ERA), and its companion exercise, the Engagement and Impact assessment, which grades the real-world benefits of institutions’ work. Both had been on hold since last August, awaiting the findings of the review.

This is a rare case in which an evaluation system can be rewritten from scratch. The ARC should take this opportunity to improve how it measures and communicates the value of Australia’s research workforce, on the basis of not just lessons learnt from the ERA’s deficiencies, but also principles that have been developed and implemented elsewhere in the world. In doing so, it will help to create a research culture that reflects the best possible values that research should represent….”

Harnessing the Metric Tide: indicators, infrastructures & priorities for UK responsible research assessment

“This review was commissioned by the joint UK higher education (HE) funding bodies as part of the Future Research Assessment Programme (FRAP). It revisits the findings of the 2015 review The Metric Tide to take a fresh look at the use of indicators in research management and assessment. 

While this review feeds into the larger FRAP process, the authors have taken full advantage of their independence and sought to stimulate informed and robust discussion about the options and opportunities of future REF exercises. The report should be read in that spirit: as an input to ongoing FRAP deliberations, rather than a reflection of their likely or eventual conclusions. 

The report is written in three sections. Section 1 plots the development of the responsible research assessment agenda since 2015 with a focus on the impact of The Metric Tide review and progress against its recommendations. Section 2 revisits the potential use of metrics and indicators in any future REF exercise, and proposes an increased uptake of ‘data for good’. Section 3 considers opportunities to further support the roll-out of responsible research assessment policies and practices across the UK HE sector. Appendices include an overview of progress against the recommendations of The Metric Tide and a literature review. 

We make ten recommendations targeted at different actors in the UK research system, summarised as: 

1: Put principles into practice. 

2: Evaluate with the evaluated. 

3: Redefine responsible metrics. 

4: Revitalise the UK Forum. 

5: Avoid all-metric approaches to REF. 

6: Reform the REF over two cycles. 

7: Simplify the purposes of REF. 

8: Enhance environment statements. 

9: Use data for good. 

10: Rethink university rankings….”

What Should Impact Assessment Look Like for Social Science? — Sage

“A decade ago, the San Francisco Declaration on Research Assessment, or DORA, tackled the pressing need to improve how funders, institutions, policy makers and others evaluated scientific research and its outputs. Existing measures, centered on scholarly citation, tended to use where the outputs were published as a proxy for the research’s quality, utility, and impact, measuring all disciplines with the same yardstick.

In the 10 years since, various efforts to improve assessment and measure societal impact have launched that downplay or even eliminate literature-based measurements. Ideas for these new measures focus on impact in the real world, address disciplinary differences such as those between social science and physical science, and offer useful tools for researchers and end-users alike.

This panel will engage representatives of various social and behavioral science disciplines, as well as publishers, to discuss:

What does impact assessment look like from their perch?

What should it look like?

How have their perspectives on impact changed over the last decade?

What changes would they like to see 10 years from now?

What necessary next steps should be taken – whether immediately practical or aspirational?…”

Saudi universities entice top scientists to switch affiliations — sometimes with cash

“Research institutions in Saudi Arabia are gaming global university rankings by encouraging top researchers to change their main affiliations, sometimes in exchange for cash, and often with little obligation to do meaningful work. That’s the conclusion of a report that shows how, over the past decade, dozens of the world’s most highly cited researchers have switched their primary affiliations to universities in the country. That, in turn, has boosted the standing of Saudi Arabian institutions in university ranking tables, which consider the citation impacts of an institution’s researchers….”

How can altmetrics improve the Public Communication of Science and Technology? An analysis on universities and altmetrics

Abstract:  In current research evaluation models, monitoring and impact evaluation are extended beyond peer-reviewed articles to include Public Communication of Science and Technology (PCST) activities. Through an online survey, we analyzed the perceptions of relevance and degree of application of altmetric indicators for the PCST of 51 sampled Brazilian federal universities. Among the 26 indicators examined, altmetrics proved to be an outlier in perceptions of relevance and application: 66.7% of respondents said they did not know the relevance of altmetrics for PCST or considered it not applicable to the field. Regarding the perception of relevance, the indicator “Mentions tracked by altmetrics” received high relevance scores (7 and 9) from 21.5% of respondents. The indicator was also the least applied, with only one university (1.9%) using it. In addition, 45% of respondents reported having no intention of applying it, 41.1% intend to apply it in the long term, and 11.7% in the short term.