Chefs de Cuisine: Perspectives from Publishing’s Top Table – Ziyad Marar – The Scholarly Kitchen

“What does open access (OA) / public access (PA) mean for your business?

It’s a slowish but profound reconfiguration of the research landscape. As William Gibson, the cyberpunk novelist, once put it, ‘the future is already here — it’s just not very evenly distributed’. When it comes to gold OA, there are parts of well-funded STM publishing that have gone OA already, and the rest should follow. And we are accelerating toward OA in this respect. But with social science (and the humanities), it’s a more complex story, and one that my colleagues and I don’t tire of telling. For instance, the National Science Foundation in the US has an annual budget of $9.8B, while the Social and Behavioral Science Directorate gets $285M of that, and yet the measly political science budget of around $18M is routinely targeted by US politicians as a ‘waste of taxpayers’ dollars’. You can imagine what that does for a model based primarily on APCs!

Since it is not one size fits all, I feel we need to take a lead in differentiating the OA future by subject domain. Engineering and Sociology need different things to flourish. The growth of national and consortial transformative agreements can give us a way to transition across all subject domains, but I suspect this will still — as the deals are renewed and assessed by the biomedical model — be challenging for social science research, for reasons that lead to its being undervalued more generally….”

Do altmetric scores reflect article quality? Evidence from the UK Research Excellence Framework 2021 – Thelwall – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with individual article quality scores. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014–2017/2018, split into 34 broadly field-based Units of Assessment (UoAs). Altmetrics correlated more strongly with research quality than previously found, although less strongly than raw and field normalized Scopus citation counts. Surprisingly, field normalizing citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best altmetric (e.g., three Spearman correlations with quality scores above 0.5), tweet counts are also a moderate strength indicator in eight UoAs (Spearman correlations with quality scores above 0.3), ahead of news (eight correlations above 0.3, but generally weaker), blogs (five correlations above 0.3), and Facebook (three correlations above 0.3) citations, at least in the United Kingdom. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities.
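The rank-correlation analysis this abstract describes can be illustrated with a minimal sketch. Spearman’s rho compares the rank orderings of two variables; here the per-article quality scores and Mendeley reader counts are invented for illustration, not data from the study:

```python
# Hedged sketch of a Spearman rank correlation between article quality
# scores and an altmetric. All values below are hypothetical.
from scipy.stats import spearmanr

quality_scores = [1, 2, 2, 3, 3, 4, 4, 4]           # REF-style quality ratings
mendeley_readers = [3, 10, 8, 25, 30, 60, 55, 80]   # per-article reader counts

rho, p_value = spearmanr(quality_scores, mendeley_readers)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

Rank correlation is the natural choice here because altmetric counts are heavily skewed; Spearman’s rho is insensitive to that skew, whereas Pearson’s r is not.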

Transparency in conducting and reporting research: A survey of authors, reviewers, and editors across scholarly disciplines | PLOS ONE

Abstract:  Calls have been made for improving transparency in conducting and reporting research, improving work climates, and preventing detrimental research practices. To assess attitudes and practices regarding these topics, we sent a survey to authors, reviewers, and editors. We received 3,659 (4.9%) responses out of 74,749 delivered emails. We found no significant differences between authors’, reviewers’, and editors’ attitudes towards transparency in conducting and reporting research, or towards their perceptions of work climates. Undeserved authorship was perceived by all groups as the most prevalent detrimental research practice, while fabrication, falsification, plagiarism, and not citing prior relevant research, were seen as more prevalent by editors than authors or reviewers. Overall, 20% of respondents admitted sacrificing the quality of their publications for quantity, and 14% reported that funders interfered in their study design or reporting. While survey respondents came from 126 different countries, due to the survey’s overall low response rate our results might not necessarily be generalizable. Nevertheless, results indicate that greater involvement of all stakeholders is needed to align actual practices with current recommendations.


An Investigation of Gold Open Access Publications of STEM Faculty at a Public University in the United States: Science & Technology Libraries: Vol 0, No 0

Abstract:  This study investigated Gold Open Access journal publication by science and engineering faculty at the authors’ university from 2013 to 2022. Specifically, did Gold Open Access (OA) by these faculty increase, and did the publication rate vary between disciplines? The authors found that Gold OA publication increased by 176% over the past 10 years, and that an important factor was the Libraries’ creation of an Open Access Publishing Fund in 2017. Disciplinary differences in publication rates were also notable, with life sciences research showing the highest rates of open access publication. An analysis of where our faculty are publishing found that MDPI is the most popular Open Access publisher in STEM fields, but many of the new Gold Open Access journals from traditional STEM publishers are also being chosen.


Chefs de Cuisine: Perspectives from Publishing’s Top Table – Jasmin Lange – The Scholarly Kitchen

“Knowledge is best shared openly, it’s the most impactful and our preferred mode of publishing research. The present-day reality in the fields we publish in, however, is a different one. While open access is by far the fastest growing business model, the vast majority of our revenues is still generated by paid access models such as subscription or outright purchase. We push to be more open, and we need to push harder to make sure the “openness gap” between Humanities/Social Sciences (HSS) and STM is not widening. Customers are increasingly aware of the negative impacts caused by the slower transition of HSS and are supportive of establishing sustainable models that work for all fields of research. This is a good development and Brill has benefited from this development in recent years. In 2023 we will get close to having published 1,200 gold OA books; what our small OA team has achieved together with our editorial department, makes me very proud.”

Changes in the absolute numbers and proportions of open access articles from 2000 to 2021 based on the Web of Science Core Collection: a bibliometric study

Abstract:  The ultimate goal of current open access (OA) initiatives is for library services to use OA resources. This study aimed to assess the infrastructure for OA scholarly information services by tabulating the number and proportion of OA articles in a literature database.
We measured the absolute numbers and proportions of OA articles at different time points across various disciplines based on the Web of Science (WoS) database.
The number (proportion) of available OA articles between 2000 and 2021 in the WoS database was 12 million (32.4%). The number (proportion) of indexed OA articles in 1 year was 0.15 million (14.6%) in 2000 and 1.5 million (48.0%) in 2021. The proportion of OA by subject categories in the cumulative data was the highest in the multidisciplinary category (2000–2021, 79%; 2021, 89%), high in natural sciences (2000–2021, 21%–46%; 2021, 41%–62%) and health and medicine (2000–2021, 37%–40%; 2021, 52%–60%), and low in social sciences and others (2000–2021, 23%–32%; 2021, 36%–44%), engineering (2000–2021, 17%–33%; 2021, 31%–39%) and humanities and arts (2000–2021, 11%–22%; 2021, 28%–38%).
Our study confirmed that increasingly many OA research papers have been published in the last 20 years, and the recent data show considerable promise for better services in the future. The proportions of OA articles differed among scholarly disciplines, and designing library services necessitates several considerations with regard to the customers’ demands, available OA resources, and strategic approaches to encourage the use of scholarly OA articles.
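As a back-of-envelope check, the yearly OA counts and proportions quoted in the abstract imply totals of indexed articles. This sketch uses only the rounded figures reported above:

```python
# Hedged sketch: derive the implied totals of WoS-indexed articles per year
# from the abstract's reported OA counts and OA shares (rounded figures).
reported = {
    2000: {"oa_articles": 0.15e6, "oa_share": 0.146},
    2021: {"oa_articles": 1.5e6, "oa_share": 0.480},
}

for year, r in reported.items():
    implied_total = r["oa_articles"] / r["oa_share"]
    print(f"{year}: implied total indexed articles ~ {implied_total / 1e6:.2f} M")
```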

Will Humanities and Social Sciences Publishing Consolidate? – The Scholarly Kitchen

“Today, I want to introduce a scenario that I believe should be modeled out by strategists, both in the publishing and library communities. In introducing this scenario, I want to underscore that I do not believe it to be inevitable, nor do I wish to advocate for it. But part of my job is to wonder about the future and to identify some scenarios that can inform planning in our sector. One of the scenarios that I have been considering more and more is a major consolidation among humanities and social sciences (HSS) publishers. 

In this piece, I focus primarily on consolidation among the US, UK, and EU commercial primary publishers. In this segment, consolidation is pursued largely through market-driven acquisitions and strategic partnerships. The same market factors that I discuss below will equally impact not-for-profit HSS publishers, but they may not wish, or may find it difficult, to consolidate in the same fashion. In some ways, though, this analysis may be of greatest importance for those that will find it most difficult to lead….

Finally, given the largely reactive concerns in academia and academic libraries to consolidation in STEM scholarly communication and infrastructure segments, is there any form of strategic investment or advocacy that can, from advocates’ perspective, constructively shape the HSS market before the consolidation scenario develops any further?”

Data Sharing Across Sectors Creates Better Early Warning Systems –

“The existing public-sector early warning systems for infectious disease and climate events are commonly disconnected; there are limited mechanisms in place relating the two. In other words, there is a lack of data that helps us understand and predict the impacts of extreme weather events and environmental changes on disease risk.

Attempting to find and connect climate and health data proves next to impossible with the current infrastructure in developing countries. For instance, when faced with an outbreak of dengue fever in Peru, the health minister has data on only health and demographics. If you wanted to combine that with climate data you would need to ask the minister of the environment. Want to relate economic data? Ask the minister of the economy and finance….


The Harmonize Project seeks to build a digital infrastructure of harmonized databases to feed early warning systems for epidemics exacerbated by climate change in the LAC region.


In collaboration with the Barcelona Supercomputing Center (BSC)—and a network in Brazil, Colombia, and the Dominican Republic—and supported by Wellcome, the project will bring together ministries, universities, private companies, social impact organizations, and more to create a complex data infrastructure and collect real longitudinal data on the ground. These new data sets will provide valuable information on seasonal variation in land use and human behavior given climate hazards, which are generally assumed to be unchanging in health impact models.

The outcome of such an infrastructure? Actionable knowledge to inform local risk mapping and create strong early warning systems to drive resilience in low-resource communities….”

Outside the library: Early career researchers and use of alternative information sources in pandemic times – Herman – Learned Publishing – Wiley Online Library

Abstract:  Presents findings from a study into the attitudes and practices of pandemic-era early career researchers (ECRs) in regard to obtaining access to the formally published scholarly literature, which focused on alternative providers, notably ResearchGate and Sci-Hub. The study is a part of the Harbingers project that has been exploring the work lives and scholarly communication practices of ECRs in pre-pandemic times and during the pandemic, and utilizes data from two rounds of interviews with around 170 ECRs from the sciences and social sciences in eight countries. Findings show that alternative providers, as represented by ResearchGate and Sci-Hub, have become established and appear to be gaining ground. However, there are considerable country- and discipline-associated differences. ECRs’ country-specific level of usage of the alternative providers is partly traceable to the adequacy of library provisions, although there are other factors at play in shaping ECRs’ attitudes and practices, most notably convenience and time saving, as well as the fact that these platforms have become embedded in the scholarly dashboard. There is a dearth of evidence of the impact of the pandemic on ECRs’ ways of obtaining scholarly papers.


The APC-Barrier and its effect on stratification in open access publishing | Quantitative Science Studies | MIT Press

Abstract:  Current implementations of Open Access (OA) publishing frequently involve Article Publishing Charges (APCs). Increasing evidence emerges that APCs impede researchers with fewer resources in publishing their research OA. We analysed 1.5 million scientific articles from journals listed in the Directory of Open Access Journals to assess average APCs and their determinants for a comprehensive set of journal publications, across scientific disciplines, world regions and through time. Levels of APCs were strongly stratified by scientific fields and the institutions’ countries, corroborating previous findings on publishing cultures and the impact of mandates of research funders. After controlling for country and scientific field with a multilevel mixture model, however, we found small to moderate effects of levels of institutional resourcing on the level of APCs. Effects were largest in countries with low GDP, suggesting decreasing marginal effects of institutional resources when general levels of funding are high. Our findings provide further evidence on how APCs stratify OA publishing and highlight the need for alternative publishing models.


Correlating article citedness and journal impact: an empirical investigation by field on a large-scale dataset | SpringerLink

Abstract:  In spite of previous research demonstrating the risks involved, and counsel against the practice as early as 1997, some research evaluations continue to use journal impact alone as a surrogate of the number of citations of hosted articles to assess the latter’s impact. Such usage is also taken up by research administrators and policy-makers, with very serious implications. The aim of this work is to investigate the correlation between the citedness of a publication and the impact of the host journal. We extend the analyses of previous literature to all STEM fields. Then we also aim to assess whether this correlation varies across fields and is stronger for highly cited authors than for lowly cited ones. Our dataset consists of a total of almost one million authorships of 2010–2019 publications authored by about 28,000 professors in 230 research fields. Results show a low correlation between the two indicators, more so for lowly cited authors as compared to highly cited ones, although differences occur across fields.


The Twitter accounts of scientific journals: a dataset

Abstract:  Twitter harbours dense networks of academics, but to what extent do scientific journals use that platform? This article introduces a dataset of 3,485 Twitter accounts pertaining to a sample of 13,821 journals listed in Web of Science’s three major indices (SCIE, SSCI and AHCI). The summary statistics indicate that 25.2% of the journals have a dedicated Twitter presence. This number is likely to grow, as, on average, every one and a half days sees yet another journal setting up a new profile. The share of Twitter presence, however, varies strongly by publisher and discipline. The most active discipline is political science, which has almost 75% of its journals on Twitter, while other research categories have zero. The median account issues 116 messages a year and it interacts with distinct other users once in two to three Tweets. Approximately 600 journals refer to themselves as ‘peer-reviewed’, while 263 journals refer to their citation-based impact (like the impact factor) in their profile description. All in all, the data convey immense heterogeneity with respect to the Twitter behaviour of scientific journals. As there are numerous deceptive Twitter profile names established by predatory publishers, it is recommended that journals establish their official accounts lest bogus journals mislead the public about scientific findings. The dataset is available for use for further scientometric analyses.

“Open Access helps both: authors and readers” : Peter Suber in an interview with Bodo Rödel (24 June 2022)

(The interview is in English and the abstract in German.)

Abstract:  From Google’s English:  In the interview, Open Access expert Peter Suber and Bodo Rödel, head of the “Publications and Scientific Information Services” department, discuss the effects of Open Access described in Suber’s 2012 book, the future development of publication platforms, the role of publishers, and changed user requirements in science. In addition, topics are addressed that are not originally caused by Open Access, such as gaining reputation or the impact of the predominance of a single academic language on non-native speakers.

[2212.07811] Do altmetric scores reflect article quality? Evidence from the UK Research Excellence Framework 2021

Abstract:  Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with journal article quality. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014-17/18, split into 34 Units of Assessment (UoAs). The results show that altmetrics are better indicators of research quality than previously thought, although not as good as raw and field normalised Scopus citation counts. Surprisingly, field normalising citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best, tweet counts are also a relatively strong indicator in many fields, and Facebook, blogs and news citations are moderately strong indicators in some UoAs, at least in the UK. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities. The Altmetric Attention Score, although hybrid, is almost as good as Mendeley reader counts as a quality indicator and reflects more non-scholarly impacts.


[2212.05416] In which fields are citations indicators of research quality?

Abstract:  Citation counts are widely used as indicators of research quality to support or replace human peer review and for lists of top cited papers, researchers, and institutions. Nevertheless, the extent to which citation counts reflect research quality is not well understood. We report the largest-scale evaluation of the relationship between research quality and citation counts, correlating them for 87,739 journal articles in 34 field-based Units of Assessment (UoAs) from the UK. We show that the two correlate positively in all academic fields examined, from very weak (0.1) to strong (0.5). The highest correlations are in health, life sciences and physical sciences and the lowest are in the arts and humanities. The patterns are similar for the field classification schemes of Scopus and Dimensions.ai. We also show that there is no citation threshold in any field beyond which all articles are excellent quality, so lists of top cited articles are not definitive collections of excellence. Moreover, log transformed citation counts have a close to linear relationship with UK research quality ranked scores that is shallow in some fields but steep in others. In conclusion, whilst appropriately field normalised citations associate positively with research quality in all fields, they never perfectly reflect it, even at very high values.
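The log-linear relationship this abstract describes can be probed with a small sketch: log-transform the citation counts, then correlate them with quality scores. The data below are invented for illustration; `log1p` is used so that zero-citation articles remain in the sample:

```python
# Hedged sketch: correlation of log-transformed citation counts with
# quality scores. All values below are hypothetical, not study data.
import math

citations = [0, 2, 5, 12, 30, 80, 200, 500]  # hypothetical citation counts
quality = [1, 1, 2, 2, 3, 3, 4, 4]           # hypothetical quality scores

# log1p(c) = log(1 + c) keeps zero-citation articles defined
log_citations = [math.log1p(c) for c in citations]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

print(f"Pearson r (log citations vs quality) = {pearson(log_citations, quality):.2f}")
```

On raw counts the same comparison is dominated by the few very highly cited articles; the log transform is what makes the relationship approximately linear, as the abstract notes.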