Laakso & Multas (2022, preprint) European scholarly journals from small- and mid-size publishers in times of Open Access: Mapping journals and public funding mechanisms | Zenodo

Mikael Laakso, & Anna-Maija Multas. (2022). European scholarly journals from small- and mid-size publishers in times of Open Access: Mapping journals and public funding mechanisms (Version 1). Zenodo. https://doi.org/10.5281/zenodo.5909512

Abstract:

Open Access (OA) publishing omits reader-side fees, which means the resources needed to sustain journals cannot come from the reader side. Large international publishers have been able to monetize OA publishing through national consortia and institutions, but small- and mid-sized publishers have not succeeded to the same degree. There is currently a lack of information concerning to what degree small- and mid-sized publishers are present in European countries, to what degree their journals are already OA, and how the countries are supporting these journals financially or technically to publish their materials OA. The methods of this study include bibliometric analysis, document analysis of web information, inquiries to OA experts in European countries, and a web survey of a small sample of journals in each country. The study found that there are 16,387 journals from small- and mid-sized publishers being published in European countries (incl. transcontinental states), of which 36% are already publishing OA. The vast majority of publishers active in Europe publish only a single journal (77% of all publishers publish only one journal). Journals from small- and mid-sized publishers were found to be multilingual or non-English to a higher degree than journals from large publishers (44% and 43% vs 6% and 5%). According to our observations there is large diversity in how (and whether) countries reserve and distribute funds to journals active in the countries, ranging from continuous inclusive subsidies to competitive grant funding or nothing at all. Funding information was often difficult to discover, and efforts to make such information more easily available would likely facilitate policy development in this area. We invite additions and corrections to the journal funding instrument information in order to make the data as comprehensive and accurate as possible.

Mapping Science: Tools for Bibliometric and Altmetric Studies

Introduction. The study investigates whether online attention, in the form of social media activity or video tutorials, affects the popularity of science mapping tools in the research community.
Method. We collected data from the Web of Science, Scopus, YouTube, Facebook, Twitter, and Instagram, using web-scraping tools. Bibliometrics, altmetrics and webometrics were applied to process the data and to analyse the tools Gephi, Sci2 Tool, VOSviewer, Pajek, CiteSpace and HistCite.
Analysis. Statistical and network analyses, and YouTube analytics, were used. The tools’ interfaces were assessed in the preliminary stage of the comparison. The results were plotted on charts and graphs, and compared.
Results. Social media and video tutorials had minimal influence on the popularity of different tools, as reflected by the number of papers within the Web of Science and Scopus where they featured. However, the small but constant growth of publications mentioning Gephi could be a result of Twitter promotion and a high number of video tutorials. The authors proposed four directions for further comparisons of science mapping software.
Conclusions. This work shows that biblio- and scientometricians are not influenced by social media visibility or accessibility of video tutorials. Future research on this topic could focus on evaluating the tools, their features and usability, or the availability of workshops.

Surveillance Publishing

Abstract: This essay develops the idea of surveillance publishing, with special attention to the example of Elsevier. A scholarly publisher can be defined as a surveillance publisher if it derives a substantial proportion of its revenue from prediction products, fueled by data extracted from researcher behavior. The essay begins by tracing the Google search engine’s roots in bibliometrics, alongside a history of the citation analysis company that became, in 2016, Clarivate. The point is to show the co-evolution of scholarly communication and the surveillance advertising economy. The essay then refines the idea of surveillance publishing by engaging with the work of Shoshana Zuboff, Jathan Sadowski, Mariano-Florentino Cuéllar, and Aziz Huq. The recent history of Elsevier is traced to describe the company’s research-lifecycle data-harvesting strategy, with the aim of developing and selling prediction products to universities and other customers. The essay concludes by considering some of the potential costs of surveillance publishing, as other big commercial publishers increasingly enter the predictive-analytics market. It is likely, I argue, that windfall subscription-and-APC profits in Elsevier’s “legacy” publishing business have financed its decade-long acquisition binge in analytics, with the implication that university customers are budgetary victims twice over. The products’ purpose, I stress, is to streamline the top-down assessment and evaluation practices that have taken hold in recent decades, in tandem with the view that the university’s main purpose is to grow regional and national economies. A final pair of concerns is that publishers’ prediction projects may camouflage and perpetuate existing biases in the system—and that scholars may internalize an analytics mindset, one already encouraged by citation counts and impact factors.

The OAPEN Library and the origin of downloads – libraries & academic institutions – OAPEN – supporting the transition to open access for academic books

On a regular basis, we look at the download data of the OAPEN Library and where it comes from. While examining the data from January to August 2021, we focused on the usage originating from libraries and academic institutions. Happily, we found that more than 1,100 academic institutions and libraries have used the OAPEN Library.

Of course, we do not actively track individual users. Instead we use a more general approach: we look at the website from which the download from the OAPEN Library originated. How does that work? For instance, when someone in the library of the University of Leipzig clicks on the download link of a book in the OAPEN Library, two things happen: first, the book is directly available on the computer that person is working on, and second, the OAPEN server notes the ‘return address’: https://katalog.ub.uni-leipzig.de/. We have no way of knowing who the person is that started the download; we just know the request originated from the Leipzig University Library. Furthermore, some organisations choose to suppress sending their ‘return address’, making them anonymous.

What is helpful to us is the fact that aggregators such as ExLibris, EBSCO or SerialSolutions use a specific return address. Examples are “west-sydney-primo.hosted.exlibrisgroup.com” – pointing to the library of Western Sydney University – or “sfx.unibo.it” – coming from the library of the Università di Bologna. In this way, many academic libraries can also be identified from their web address. Some academic institutions only display their ‘general’ address.
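
As a small illustration of the approach described above, the sketch below groups download counts by referrer domain. It is a hypothetical example: the helper function, the input format and the domain-to-institution mapping are assumptions for illustration, not OAPEN’s actual tooling.

```python
# Hypothetical sketch of referrer-based grouping of downloads; the function name
# and domain mapping below are illustrative assumptions, not OAPEN's code.
from collections import Counter
from urllib.parse import urlparse

# Illustrative mapping from referrer domains to institutions or aggregators.
KNOWN_DOMAINS = {
    "katalog.ub.uni-leipzig.de": "Leipzig University Library",
    "west-sydney-primo.hosted.exlibrisgroup.com": "Western Sydney University (Ex Libris Primo)",
    "sfx.unibo.it": "Università di Bologna (SFX)",
}

def downloads_per_institution(referrer_urls):
    """Count downloads per recognised institution; empty or unknown referrers
    (e.g. suppressed 'return addresses') are grouped as 'unknown/anonymous'."""
    counts = Counter()
    for url in referrer_urls:
        domain = urlparse(url).netloc
        counts[KNOWN_DOMAINS.get(domain, "unknown/anonymous")] += 1
    return counts

# Example usage with made-up referrer URLs taken from server logs:
referrers = [
    "https://katalog.ub.uni-leipzig.de/Record/0012345",
    "https://sfx.unibo.it/resolve?id=doi:10.26530/example",
    "",  # referrer suppressed by the sending organisation
]
print(downloads_per_institution(referrers))
```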

[…]

Author productivity pattern and applicability of Lotka’s inverse square law: a bibliometric appraisal of selected LIS open access journals | Emerald Insight

Abstract:  Purpose

The purpose of this paper is to determine whether the author productivity pattern of library and information science (LIS) open access journals adheres to Lotka’s inverse square law of scientific productivity. Since the law was introduced, it has been tested in various fields of knowledge, and results have varied. This study closely follows Lotka’s inverse square law in the field of LIS open access journals to establish a factual result and set a baseline for future studies on author productivity in LIS open access journals.

Design/methodology/approach

The publication data of ten selected LIS open access journals, pertaining to authorship and citations, were downloaded from the Scopus database and analysed using bibliometric indicators such as authorship pattern, collaborative index (CI), degree of collaboration (DC), collaborative coefficient (CC) and citation counts. The study applied Lotka’s inverse square law to assess the author productivity pattern of LIS open access journals, and the Kolmogorov-Smirnov (K-S) goodness-of-fit test was further applied to compare the observed and expected author productivity data.
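
For orientation, the collaboration indicators named above are commonly defined as follows; this is a sketch based on the standard formulations in the bibliometric literature, which the paper may apply in equivalent variant forms:

```latex
% f_j: number of papers with j authors; N: total papers; k: largest author count;
% N_s, N_m: numbers of single- and multi-authored papers.
\[
\mathrm{CI} = \frac{\sum_{j=1}^{k} j\, f_j}{N}, \qquad
\mathrm{DC} = \frac{N_m}{N_s + N_m}, \qquad
\mathrm{CC} = 1 - \frac{1}{N} \sum_{j=1}^{k} \frac{f_j}{j}.
\]
% Lotka's inverse square law: the number of authors contributing n papers,
% A(n), is expected to follow A(n) = A(1) / n^2.
```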

Findings

Inferences were drawn for the set objectives on the authorship pattern, collaboration trend and author productivity pattern of the LIS open access journals covered in this study. The single-authorship pattern is dominant in the LIS open access journals covered in this study. The CI, DC and CC are found to be 1.95, 0.47 and 0.29, respectively. The expected values as per Lotka’s law (n = 2) vary significantly from the observed values according to the chi-square test and the K-S goodness-of-fit test. Hence, the author productivity data in this study do not adhere to Lotka’s inverse square law of scientific productivity.
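
To make the test concrete, here is a minimal sketch of fitting Lotka’s law with exponent 2 to an author productivity distribution and computing a Kolmogorov-Smirnov statistic. The counts are hypothetical placeholders, not the study’s data, and the study’s exact testing procedure may differ.

```python
# Hypothetical sketch: comparing an observed author productivity distribution
# against Lotka's inverse square law (A(n) = C / n^2) with a simple K-S check.
# The observed counts below are placeholders, not the study's data.

def lotka_expected(max_papers, total_authors, exponent=2.0):
    """Expected number of authors writing n papers (n = 1..max_papers)
    under Lotka's law, normalised so the expected total matches."""
    norm = sum(1.0 / k ** exponent for k in range(1, max_papers + 1))
    return [total_authors * (1.0 / n ** exponent) / norm
            for n in range(1, max_papers + 1)]

# Observed: number of authors who wrote 1, 2, 3, 4, 5 papers (hypothetical).
observed = [600, 110, 45, 22, 13]
total = sum(observed)
expected = lotka_expected(len(observed), total)

# K-S statistic: maximum gap between cumulative observed and expected shares.
cum_obs = cum_exp = d_max = 0.0
for o, e in zip(observed, expected):
    cum_obs += o / total
    cum_exp += e / total
    d_max = max(d_max, abs(cum_obs - cum_exp))

critical = 1.36 / total ** 0.5  # approximate 5% critical value for large samples
print(f"D = {d_max:.3f}, critical ~ {critical:.3f}: "
      f"{'reject' if d_max > critical else 'do not reject'} the Lotka fit")
```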

Practical implications

Researchers may gain an idea of the author productivity patterns of LIS open access journals. This study used the K-S goodness-of-fit test and the chi-square test to validate the author productivity data. The inferences drawn from this study will serve as a baseline for future research on author productivity in LIS open access journals.

Originality/value

This study is significant in view of the growing research on open access journals in the field of LIS, and it identifies the authorship pattern, collaboration trend and author productivity pattern of such journals.

VOSviewer goes online! (Part 1)

Our VOSviewer software enables visualizations of bibliometric networks to be explored interactively. Nevertheless, VOSviewer visualizations often end up as static images in blog posts, research articles, policy reports, and PowerPoint presentations. In this way the visualizations lose a lot of their value, and in the end they may indeed be “just nice to look at but not useful or helpful”.

To address this problem, we have developed VOSviewer Online, a web-based version of VOSviewer released today. Using VOSviewer Online, visualizations of bibliometric networks can be explored interactively in a web browser. This makes it much easier to share interactive visualizations, and it reduces the need to show static images.

Job: Project officer indi:oa – Responsible evaluation and quality assurance of Open Access publications using bibliometric indicators (w/m/d), E 13 TV-L, part-time, fixed-term. Application deadline: July 19, 2021 | Niedersächsische Staats- und Universitätsbibliothek Göttingen (SUB Göttingen)

The Niedersächsische Staats- und Universitätsbibliothek Göttingen (SUB Göttingen) has for years been engaged in national and international projects to build infrastructures and services for scholarly publishing and the implementation of Open Access.

In this context, applications are invited for the BMBF project “indi:oa – Responsible evaluation and quality assurance of Open Access publications using bibliometric indicators” for the position of

Project officer (f/m/d)
Pay grade E 13 TV-L, part-time, fixed-term

to be filled at the SUB Göttingen at the earliest possible date on a part-time basis (75%, currently 29.85 hours per week). The position is fixed-term until the end of the project at the end of July 2023.

The BMBF joint project “indi:oa – Responsible evaluation and quality assurance of Open Access publications using bibliometric indicators” aims to counteract the uninformed use of bibliometric indicators in the context of the Open Access transformation at research institutions in Germany. Together with the Deutsches Zentrum für Hochschul- und Wissenschaftsforschung (DZHW), the project combines bibliometric studies with awareness-raising activities. The goal is to strengthen the willingness of research institutions to consider innovative, quality-assured Open Access publishing venues as an alternative to the established journals of the large publishers.

Your tasks:

Assessing the information needs of Open Access officers and research support offices with regard to Open Access and bibliometrics
Producing target-group-specific recommendations and training materials based on empirical evidence
Building and supporting an informal mentoring network
Coordinating the project consortium, including public relations

Required:

A university degree (Master’s or equivalent) in a social science or library and information science subject (or comparable)
Interest in questions of quantitative science studies and/or experience in research management, including publication advisory services
Strong problem-solving skills, independence and the ability to work in a team
Good written and spoken German and English

Desirable:

Knowledge of scholarly publishing, with a focus on Open Access and Open Science
Experience with qualitative or quantitative surveys
Experience in data analysis and visualisation using a statistical programming environment (for example R or Python)
Community engagement, for example in Open Science communities

We offer exciting, cross-departmental project work in a committed international team and a cooperative working atmosphere. We also support flexible working arrangements such as remote work and innovative training for professional development. Part-time work is possible. Topping up to full-time may be possible through involvement in related projects at the SUB Göttingen.

For any questions, please contact Mr Najko Jahn (e-mail) or Dr Birgit Schmidt (e-mail), +49 551 39-33181 (phone).

The University of Göttingen seeks to increase the proportion of women in areas in which they are underrepresented and therefore strongly encourages qualified women to apply. It also sees itself as a family-friendly university and promotes the compatibility of academia, career and family. The university aims to employ more severely disabled people; applications from severely disabled candidates will be given preference in the case of equal qualifications.

Please submit your application, with all relevant documents combined into a single document, by 19 July 2021, exclusively via the application portal.

Public draft: OA eBook Usage Data Analytics and Reporting Use-cases by Stakeholder. Feedback invited through July 10, 2021

Publishers, libraries, and a diverse array of scholarly communications platforms and services generate information about how OA books are accessed online. Since its launch in 2015, the OA eBook Usage Data Trust (@OAEBU_project) effort has brought together these stakeholders to document the barriers facing OA eBook usage analytics. To start addressing these challenges and to understand the role of a usage data trust, the effort has spent the last year studying and documenting the usage data ecosystem. Interview-based research led to the documentation of the OA book data supply chain, which maps related metadata and usage data standards and workflows. Dozens of participants worldwide have engaged in human-centered design workshops and communities of practice that went virtual during 2020. Together these communities revealed how OA book publishers, platforms, and libraries are looking beyond their need to provide usage and impact reports. Workshop findings are now documented within use cases that list the queries and activities where usage data analytics can help scholars and organizations to be more effective and strategic. Public comment is invited on the OA eBook Usage Data Analytics and Reporting Use Cases Report through July 10, 2021.

Meet the new Faculty Opinions Score – Faculty Opinions Blog

“Traditional citation metrics, such as the journal impact factor, can frequently act as biased measurements of research quality and contribute to the broken system of research evaluation. Academic institutions and funding bodies are increasingly moving away from relying on citation metrics towards greater use of transparent expert opinion. 

Faculty Opinions has championed this cause for two decades, with over 230k recommendations made by our 8000+ Faculty Members, and we are excited to introduce the next step in our evolution – a formidable mark of research quality – the new Faculty Opinions Score….

The Faculty Opinions Score assigns a numerical value to research publications in Biology and Medicine, to quantify their impact and quality compared to other publications in their field. 

The Faculty Opinions Score is derived by combining our unique star-rated Recommendations on individual publications, made by world-class experts, with bibliometrics to produce a radically new metric in the research evaluation landscape.

Key properties of the Faculty Opinions Score: 

A score of zero is assigned to articles with no citations and no recommendations. 
The average score of a set of recommended articles has an expected value of 10. However, articles with many recommendations or highly cited articles may have a much higher score. There is no upper bound. 
Non-recommended articles generally score lower than recommended articles. 
Recommendations contribute more to the score than bibliometric performance. In other words, expert recommendations increase an article’s score (much) more than citations do….”
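
The post does not disclose the actual formula behind the score. Purely to illustrate the stated properties (zero with no citations and no recommendations, recommendations weighted much more heavily than citations, no upper bound), here is a hypothetical toy combination; the weights and function below are assumptions, not Faculty Opinions’ method:

```python
# Toy illustration only: NOT the Faculty Opinions formula, which is not disclosed
# in the excerpt above. The weights below are arbitrary assumptions chosen so
# that expert recommendations dominate citation counts.

W_RECOMMENDATION = 3.0   # hypothetical weight per recommendation star
W_CITATION = 0.05        # hypothetical weight per citation

def toy_article_score(recommendation_stars: int, citations: int) -> float:
    """Zero with no recommendations and no citations; otherwise a weighted sum
    in which recommendations contribute far more than citations; no upper bound.
    (In a real system the weights would be calibrated, e.g. so that the mean
    score of recommended articles lands near 10.)"""
    if recommendation_stars == 0 and citations == 0:
        return 0.0
    return W_RECOMMENDATION * recommendation_stars + W_CITATION * citations

print(toy_article_score(0, 0))    # 0.0 -> no citations, no recommendations
print(toy_article_score(3, 40))   # recommended article: recommendations dominate
print(toy_article_score(0, 40))   # non-recommended article generally scores lower
```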

Morrison et al. (2021) Open access article processing charges 2011 – 2021 | uOttawa Research

by: Heather Morrison, Luan Borges, Xuan Zhao, Tanoh Laurent Kakou & Amit Nataraj Shanbhoug

Abstract

This study examines trends in open access article processing charges (APCs) from 2011 to 2021, building on a 2011 study by Solomon & Björk (2012). Two methods are employed: a modified replica and a status update of the 2011 journals. Data are drawn from multiple sources, and the datasets are available as open data (Morrison et al., 2021). Most journals do not charge APCs; this has not changed. The global average per-journal APC increased slightly, from 906 USD to 958 USD, while the per-article average increased from 904 USD to 1,626 USD, indicating that authors choose to publish in more expensive journals. Publisher size, type, impact metrics and subject affect charging tendencies, average APC and pricing trends. About half the journals from the 2011 sample are no longer listed in DOAJ in 2021, due to ceased publication or publisher de-listing. Conclusions include a caution about the potential of the APC model to increase costs beyond inflation, and a suggestion that support for the university sector (responsible for the majority of journals and nearly half the articles, with a tendency not to charge and very low average APCs) may be the most promising approach to achieving economically sustainable no-fee OA journal publishing.
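
As a small arithmetic illustration of why the per-article average can climb much faster than the per-journal average, consider the sketch below; the journal figures are hypothetical, not the study’s data. The per-article average weights each journal by its output, so a concentration of articles in expensive, high-volume journals pulls it upward even while most journals remain cheap or free.

```python
# Hypothetical illustration of per-journal vs. per-article average APC.
# The (APC in USD, articles per year) pairs are made-up values.
journals = [
    (0, 40),      # no-fee journal
    (500, 60),    # low-fee journal
    (3000, 900),  # expensive, high-volume journal attracting most articles
]

per_journal_avg = sum(apc for apc, _ in journals) / len(journals)
per_article_avg = (sum(apc * n for apc, n in journals)
                   / sum(n for _, n in journals))

print(f"per-journal average APC: {per_journal_avg:,.0f} USD")  # ~1,167 USD
print(f"per-article average APC: {per_article_avg:,.0f} USD")  # 2,730 USD
```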

A preprint of the full article is available here: https://ruor.uottawa.ca/handle/10393/42327

The two base datasets and their documentation are available as open data: Morrison, Heather et al., 2021, “2011 – 2021 OA APCs”, https://doi.org/10.5683/SP2/84PNSG, Scholars Portal Dataverse, V1

 

via https://sustainingknowledgecommons.org/2021/06/24/open-access-article-processing-charges-2011-2021/

Manipulation of bibliometric data by editors of scientific journals

“Such misuse of terms not only justifies the erroneous practice of research bureaucracy of evaluating research performance on those terms but also encourages editors of scientific journals and reviewers of research papers to ‘game’ the bibliometric indicators. For instance, if a journal seems to lack an adequate number of citations, the editor of that journal might decide to make it obligatory for its authors to cite papers from the journal in question. I know an Indian journal of fairly reasonable quality in terms of several other criteria, but I can no longer consider it so because it forces authors to include unnecessary (that is, plainly false) citations to papers in that journal. Any further assessment of this journal that includes self-citations will lead to a distorted measure of its real status….

An average paper in the natural or applied sciences lists at least 10 references [1]. Some enterprising editors have taken this number to be the minimum for papers submitted to their journals. Such a norm is enforced in many journals from Belarus, and we authors are now so used to that norm that we do not even realize the distortions it creates in bibliometric data. Indeed, I often notice that some authors – merely to meet the norm of at least 10 references – cite very old textbooks and Internet resources with URLs that are no longer valid. The average for a good paper may be more than 10 references, and a paper with fewer than 10 references may yet be a good paper (the first paper by Einstein did not have even one reference in its original version!). I believe that it is up to a peer reviewer to judge whether the author has given enough references and whether they are suitable; it is not for a journal’s editor to set any mandatory quota for the number of references….

Some international journals intervene arbitrarily to revise the citations in articles they receive: I submitted a paper with my colleagues to an American journal in 2017, and one of the reviewers demanded that we replace references in Russian with references in English. Two of us responded with a correspondence note titled ‘Don’t dismiss non-English citations’, which we then submitted to Nature: in publishing that note, the editors of Nature removed some references – from the very paper [2] that condemned the practice of replacing an author’s references with those more to the editor’s liking – and replaced them with a perhaps more relevant reference to a paper that we had not even read at that point! … 

Editors of many international journals are now looking not for quality papers but for papers that will not lower the impact factor of their journals….”