Publishers, libraries, and a diverse array of scholarly communications platforms and services generate information about how OA books are accessed online. Since its launch in 2015, the OA eBook Usage Data Trust (@OAEBU_project) effort has brought together these thought leaders to document the barriers facing OA eBook usage analytics. To start addressing these challenges and to understand the role of a usage data trust, the effort has spent the last year studying and documenting the usage data ecosystem. Interview-based research led to the documentation of the OA book data supply chain, which maps related metadata and usage data standards and workflows. Dozens worldwide have engaged in human-centered design workshops and communities of practice that went virtual during 2020. Together these communities revealed how OA book publishers, platforms, and libraries are looking beyond their need to provide usage and impact reports. Workshop findings are now documented within use-cases that list the queries and activities where usage data analytics can help scholars and organizations to be more effective and strategic. Public comment is invited for the OA eBook Usage Data Analytics and Reporting Use Cases Report through July 10, 2021.
“Outside of eLife and, to an extent, PLoS, no one of scale and weight in the commercial publishing sector has really climbed aboard the Open Science movement with a recognition of the sort of data and communication control that Open Science will require.
So what is that requirement? In two words – Replicability and Retraction. …”
German Research Foundation warns against the growing influence of major publishers on research. Scientific freedom is under threat from two sides.
Abstract: Open citation data can improve the transparency and robustness of scientific portfolio analysis, improve science policy decision-making, stimulate downstream commercial activity, and increase the discoverability of scientific articles. Once sparsely populated, public-domain citation databases crossed a threshold of one billion citations in February 2021. Shortly thereafter, the threshold of one billion public domain citations from the Crossref database alone was crossed. As the relative advantage of withholding data in closed databases has diminished with the flood of public domain data, this likely constitutes an irreversible change in the citation data ecosystem. The successes of this movement can guide future open data efforts.
This description from Twitter:
“Very excited to announce my latest project: Unsub Extender! https://unsubextender.lib.iastate.edu
Run an @unsub_org export .csv file through Unsub Extender to automatically make interactive plots and visualizations with filters. Written in python w/@streamlit and Altair @jakevdp @ellisonbg…”
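The tweet describes feeding an Unsub export CSV into a plotting tool. As a minimal sketch of that kind of post-processing, the snippet below loads per-journal cost and usage figures and ranks titles by cost-per-use, the core quantity in unsubscription analysis. The column names (`title`, `subscription_cost`, `downloads`) are illustrative assumptions, not the actual Unsub export schema, and the figures are synthetic.

```python
import pandas as pd

def cost_per_use(df: pd.DataFrame) -> pd.DataFrame:
    """Add a cost-per-use column and sort journals from worst to best value."""
    out = df.copy()
    out["cost_per_use"] = out["subscription_cost"] / out["downloads"]
    return out.sort_values("cost_per_use", ascending=False)

# Stand-in for pd.read_csv("unsub_export.csv") with hypothetical columns.
journals = pd.DataFrame({
    "title": ["Journal A", "Journal B", "Journal C"],
    "subscription_cost": [12000.0, 3000.0, 8000.0],
    "downloads": [400, 1500, 100],
})

ranked = cost_per_use(journals)
# Journal C (80.0 per download) ranks worst, Journal B (2.0) best.
```

A tool like Unsub Extender would then pass a frame like `ranked` to Altair/Streamlit widgets for interactive filtering; the tabular step above is the part that is independent of any plotting library.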
“This briefing paper issued by the Committee on Scientific Library Services and Information Systems (AWBI) of the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) on the subject of data tracking in digital research resources describes options for the digital tracking of research activities. It outlines how academic publishers are becoming data analytics specialists, indicates the consequences for research and its institutions, and identifies the types of data mining that are being used. As such, it primarily serves to present contemporary practices with a view to stimulating discussion so that positions can be adopted regarding the consequences of these practices for the academic community. It is aimed at all stakeholders in the research landscape….
Potentially, research tracking of this kind can fundamentally contradict academic freedom and informational self-determination. It can endanger scientists and hinder the freedom of competition in the field of information provision. For this reason, scholars and academic institutions must become aware of the problem and clarify the legal, technical and ethical framework conditions of their information supply – not least so as to avoid involuntarily violating applicable law, but also to ensure that academics are appropriately informed and protected. AWBI’s aim in issuing this briefing paper is to encourage a broad debate within the academic community – at the level of academic decision-makers, among academics, and within information infrastructure institutions – so as to reflect on the practice of tracking, its legality, the measures required for compliance with data protection and the consequences of the aggregation of usage data, thereby enabling such measures to be adopted. The collection of data on research and research activity can be useful as long as it follows clear-cut, transparent guidelines, minimises risks to individual researchers and ensures that academic organisations are able to use such data if not have control over it.”
“Yesterday’s news that Clarivate will acquire ProQuest, valued at $5.3 billion, is the largest transaction in recent memory in the scholarly information sector. Both companies are intermediaries — they each work extensively with publishers and libraries — and each has extensive interests in discovery, a lynchpin service in the research ecosystem. Will this transaction result in dramatically strengthened products and improved services for researchers, as its proponents foresee? Or will it result in information enclosure, lock-in, service deterioration, and price increases, as detractors forewarn? One thing is for certain: In Clarivate CEO Jerre Stead’s proclamation that “enterprise software is the fastest growing library market,” we can see the monetization of Lorcan Dempsey’s wry observation that “workflow is the new content.” …”
Abstract: This paper looks closely at how data analytics providers leverage rankings as part of their strategies to extract further rent and assets from the university, beyond their traditional roles as publishers and citation data providers. Multinational publishers such as Elsevier, which has over 2,500 journals in its portfolio, have transitioned into data analytics firms. Rankings expand their ability to further monetize their existing journal holdings, as there is a strong association between publication in high-impact journals and improvement in rankings. The global academic publishing industry has become highly oligopolistic, and a small handful of legacy multinational firms now publish the majority of the world’s research output (see Larivière et al. 2015; Fyfe et al. 2017; Posada & Chen, 2018). It is therefore crucial that their roles and enormous market power in influencing university rankings be more closely scrutinized. We suggest that, due to a combination of a lack of transparency regarding, for example, Elsevier’s data services and products and their self-positioning as a key intermediary in the commercial rankings business, they have managed to evade the social responsibilities and scrutiny that come with occupying such a critical public function in university evaluation. As the quest for ever-higher rankings often conflicts with universities’ public missions, it is critical to raise questions about the governance of such private digital platforms and the compatibility between their private interests and the maintenance of universities’ public values.
“Such misuse of terms not only justifies the erroneous practice of research bureaucracies of evaluating research performance on those terms but also encourages editors of scientific journals and reviewers of research papers to ‘game’ the bibliometric indicators. For instance, if a journal seems to lack an adequate number of citations, the editor of that journal might decide to make it obligatory for its authors to cite papers from the journal in question. I know an Indian journal of fairly reasonable quality in terms of several other criteria but can no longer consider it so because it forces authors to include unnecessary (that is, plainly false) citations to papers in that journal. Any further assessment of this journal that includes self-citations will lead to a distorted measure of its real status….
An average paper in the natural or applied sciences lists at least 10 references.1 Some enterprising editors have taken this number to be the minimum for papers submitted to their journals. Such a norm is enforced in many journals from Belarus, and we authors are now so used to it that we do not even realize the distortions it creates in bibliometric data. Indeed, I often notice that some authors – merely to meet the norm of at least 10 references – cite very old textbooks and Internet resources with URLs that are no longer valid. The average for a good paper may be more than 10 references, and a paper with fewer than 10 references may yet be a good paper (the first paper by Einstein did not have even one reference in its original version!). I believe that it is up to a peer reviewer to judge whether the author has given enough references and whether they are suitable; it is not for a journal’s editor to set any mandatory quota for the number of references….
Some international journals intervene arbitrarily to revise the citations in articles they receive: I submitted a paper with my colleagues to an American journal in 2017, and one of the reviewers demanded that we replace references in Russian with references in English. Two of us responded with a correspondence note titled ‘Don’t dismiss non-English citations’, which we then submitted to Nature: in publishing that note, the editors of Nature removed some references – from the very paper2 that condemned the practice of replacing an author’s references with those more to the editor’s liking – and replaced them with a, perhaps more relevant, reference to a paper that we had not even read at that point! …
Editors of many international journals are now looking not for quality papers but for papers that will not lower the impact factor of their journals….”
Abstract: In April 2020, the OAPEN Library moved to a new platform, based on DSpace 6. During the same period, IRUS-UK started working on the deployment of Release 5 of the COUNTER Code of Practice (R5). This is, therefore, a good moment to compare two widely used usage metrics – R5 and Google Analytics (GA). This article discusses the download data of close to 11,000 books and chapters from the OAPEN Library, from the period 15 April 2020 to 31 July 2020. When a book or chapter is downloaded, it is logged by GA and at the same time a signal is sent to IRUS-UK. This results in two datasets: the monthly downloads measured in GA and the usage reported by R5, also clustered by month. The number of downloads reported by GA is considerably larger than that reported by R5. The total number of downloads in GA for the period is over 3.6 million. In contrast, the number reported by R5 is 1.5 million, around 400,000 downloads per month. Contrasting R5 and GA data on a country-by-country basis shows significant differences. GA lists more than five times the number of downloads for several countries, although the totals for other countries are about the same. When looking at individual titles, of the 500 highest ranked titles in GA that are also part of the 1,000 highest ranked titles in R5, only 6% of the titles are relatively close together. The choice of metric service has considerable consequences for what is reported. Thus, drawing conclusions from the results should be done with care. One metric is not better than the other, but we should be open about the choices made. After all, open access book metrics are complicated, and we can only benefit from clarity.
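The comparison described in the abstract amounts to joining two per-title download counts and contrasting their magnitudes and rankings. The sketch below shows that shape of analysis with synthetic figures; the titles and counts are invented for illustration and are not the OAPEN Library data.

```python
import pandas as pd

# Per-title downloads as reported by two metric services for the same period.
# All values are synthetic.
ga = pd.DataFrame({"title": ["A", "B", "C", "D"],
                   "ga_downloads": [900, 450, 300, 120]})
r5 = pd.DataFrame({"title": ["A", "B", "C", "D"],
                   "r5_downloads": [350, 400, 90, 100]})

merged = ga.merge(r5, on="title")

# GA typically reports more events than COUNTER R5; the ratio shows how
# differently the two services count the same title.
merged["ratio"] = merged["ga_downloads"] / merged["r5_downloads"]

# Rank titles under each metric: even when totals differ, the orderings
# may or may not agree, which is what the 500-vs-1,000 comparison probes.
merged["ga_rank"] = merged["ga_downloads"].rank(ascending=False)
merged["r5_rank"] = merged["r5_downloads"].rank(ascending=False)
```

With these toy numbers, title A is first under GA but second under R5, illustrating how the choice of metric service changes not just totals but rankings.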
The scientific merit of a paper and its ability to reach broader audiences are essential for scientific impact. Scientific merit is therefore measured with scientometric indexes, and journals increasingly publish papers as open access (OA). In this study, we present scientometric data for journals published in clinical allergy and immunology and compare the scientometric data of journals in terms of their all-OA and hybrid-OA publication policies.
Data were obtained from Clarivate Analytics InCites, Scimago Journal & Country Rank, and journal websites. A total of 35 journals were evaluated for bibliometric data, journal impact factor (JIF), scientific journal ranking (SJR), Eigenfactor score (ES), and Hirsch index (h-index). Article publishing charges (APCs) were recorded in US dollars (USD).
The most common publication policy was hybrid-OA (n = 20). The median OA publishing APC was 3000 USD. Hybrid-OA journals charged a higher APC than all-OA journals (3570 USD vs. 675 USD, p = 0.0001). Very strong positive correlations were observed between SJR and JIF and between ES and h-index. All the journals in the h-index and ES first quartiles were hybrid-OA journals.
Based on these results, we recommend using SJR and ES together to evaluate journals in clinical allergy and immunology. Although there is a wide APC gap between all-OA and hybrid-OA journals, all journals within the first quartiles for h-index and ES were hybrid-OA. Our results conflict with literature reporting that the OA publication model increases citation counts.
Abstract: An overview is presented of resources and web analytics strategies useful for capturing usage statistics and assessing audiences for open access academic journals. A set of metrics complementary to citations is proposed to help journal editors and managers provide evidence of the performance of the journal as a whole, and of each article in particular, in the web environment. The measurements and indicators selected seek to generate added value for editorial management in order to ensure its sustainability. The proposal is based on three areas: counts of visits and downloads, optimization of the website alongside campaigns to attract visitors, and preparation of a dashboard for strategic evaluation. It is concluded that, by creating web performance measurement plans based on the resources and proposals analysed, journals may be better positioned to plan data-driven web optimization in order to attract authors and readers and to offer the accountability that the actors involved in the editorial process need to assess their open access business model.
“Jisc has announced that it will be using Unsub, an analytics dashboard, to help evaluate journal agreements that UK universities hold with publishers.
The dashboard, created in 2019 by the not-for-profit software company Our Research, can produce forecasts of different journal subscription scenarios, giving Jisc insight into the costs and benefits of subscription packages for each university and across the consortium. …”
Abstract: Traditionally, Web of Science and Scopus have been the two most widely used databases for bibliometric analyses. However, during the last few years some new scholarly databases, such as Dimensions, have come up. Several previous studies have compared different databases, either through a direct comparison of article coverage or by comparing the citations across the databases. This article aims to present a comparative analysis of the journal coverage of the three databases (Web of Science, Scopus and Dimensions), with the objective to describe, understand and visualize the differences between them. The most recent master journal lists of the three databases are used for analysis. The results indicate that the databases have significantly different journal coverage, with Web of Science being the most selective and Dimensions the most exhaustive. About 99.11% and 96.61% of the journals indexed in Web of Science are also indexed in Scopus and Dimensions, respectively. Scopus has 96.42% of its indexed journals also covered by Dimensions. The Dimensions database has the most exhaustive journal coverage, with 82.22% more journals than Web of Science and 48.17% more journals than Scopus. This article also analysed the research outputs for 20 selected countries for the 2010–2018 period, as indexed in the three databases, and identified database-induced variations in research output volume, rank, global share and subject area composition for different countries. It is found that there are clearly visible variations in the research output from different countries in the three databases, along with differential coverage of different subject areas by the three databases. The analytical study provides an informative and practically useful picture of the journal coverage of Web of Science, Scopus and Dimensions databases.
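The coverage percentages in the abstract (e.g. "99.11% of Web of Science journals are also in Scopus", "82.22% more journals than Web of Science") come down to set intersections over master journal lists. A minimal sketch of that computation, using tiny synthetic journal sets rather than the actual master lists:

```python
# Toy journal lists standing in for the three databases' master lists.
wos = {"J1", "J2", "J3", "J4"}
scopus = {"J1", "J2", "J3", "J4", "J5", "J6"}
dimensions = {"J1", "J2", "J3", "J5", "J6", "J7", "J8"}

def pct_covered(a: set, b: set) -> float:
    """Percentage of journals in list a that also appear in list b."""
    return 100.0 * len(a & b) / len(a)

wos_in_scopus = pct_covered(wos, scopus)          # all 4 WoS titles in Scopus
wos_in_dimensions = pct_covered(wos, dimensions)  # 3 of 4 WoS titles

# Relative exhaustiveness, as in "Dimensions has X% more journals than WoS":
extra = 100.0 * (len(dimensions) - len(wos)) / len(wos)
```

The same two quantities, computed over lists of tens of thousands of ISSNs, yield the 99.11%/96.61%/82.22% figures reported in the study.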
“Furthermore, it appears that the turn toward open access in the scholarly communications landscape is increasingly facilitating the agendas of an oligopoly of for-profit data analytics companies. Perhaps realizing that “they’ve found something that is even more profitable than selling back to us academics the content that we have produced,”5 they venture ever further up the research stream, with every intent to colonize and canalize its entire flow.6 This poses a severe threat to the independence and quality of scholarly inquiry.7
In the light of these troubling developments, the expansion of Dotawo from a “diamond” open access journal to a common access journal represents a strong reaffirmation of the call that the late Aaron Swartz succinctly formulated in his “Guerilla Open Access Manifesto”: …
Swartz’s is a call to action that transcends the limitations of the open access movement as construed by the BOAI Declaration by plainly affirming that knowledge is a common good. His call goes beyond open access, because it specifically targets materials that linger on a paper or silicon substrate in academic libraries and digital repositories without being accessible to “fair use.” The deposition of the references from Dotawo contributions in a public library is a first and limited attempt to offer a remedy, heeding the “Code of Best Practices in Fair Use” of the Association of Research Libraries, which approvingly cites the late Supreme Court Justice Brandeis that “the noblest of human productions — knowledge, truths ascertained, conceptions, and ideas — become, after voluntary communication to others, free as the air to common use.”9 This approach also dovetails with the interpretation of “folk law” recently propounded by Kenneth Goldsmith, the founder of the public library UbuWeb….”