Unsub Extender · Streamlit

This description is from Twitter:

“Very excited to announce my latest project: Unsub Extender! https://unsubextender.lib.iastate.edu

Run an @unsub_org export .csv file through Unsub Extender to automatically make interactive plots and visualizations with filters. Written in python w/@streamlit and Altair @jakevdp @ellisonbg…”
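The tweet names the whole stack: Python, Streamlit for the app, Altair for the plots. A minimal sketch of that pattern follows; the column names ("subject", "usage", "cpu", "title") are illustrative assumptions, not the actual Unsub export schema.

```python
# Sketch of a Streamlit + Altair app over an uploaded CSV, in the style
# the tweet describes. Column names are assumptions, not the real
# Unsub export schema.
import altair as alt
import pandas as pd
import streamlit as st

st.title("Unsub export explorer (sketch)")

uploaded = st.file_uploader("Upload an Unsub export .csv", type="csv")
if uploaded is not None:
    df = pd.read_csv(uploaded)

    # Sidebar filter on a hypothetical "subject" column.
    subjects = sorted(df["subject"].dropna().unique())
    chosen = st.sidebar.multiselect("Subjects", subjects, default=subjects)
    filtered = df[df["subject"].isin(chosen)]

    # Interactive scatter of usage vs. cost per use; hover shows the title.
    chart = (
        alt.Chart(filtered)
        .mark_circle(size=60)
        .encode(x="usage:Q", y="cpu:Q", tooltip=["title", "usage", "cpu"])
        .interactive()
    )
    st.altair_chart(chart, use_container_width=True)
```

Run with `streamlit run app.py`; Streamlit re-renders the chart whenever the sidebar filter changes, which is all "interactive plots with filters" requires.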

Data tracking in research: aggregation and use or sale of usage data by academic publishers

“This briefing paper issued by the Committee on Scientific Library Services and Information Systems (AWBI) of the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) on the subject of data tracking in digital research resources describes options for the digital tracking of research activities. It outlines how academic publishers are becoming data analytics specialists, indicates the consequences for research and its institutions, and identifies the types of data mining that are being used. As such, it primarily serves to present contemporary practices with a view to stimulating discussion so that positions can be adopted regarding the consequences of these practices for the academic community. It is aimed at all stakeholders in the research landscape….

Potentially, research tracking of this kind can fundamentally contradict academic freedom and informational self-determination. It can endanger scientists and hinder the freedom of competition in the field of information provision. For this reason, scholars and academic institutions must become aware of the problem and clarify the legal, technical and ethical framework conditions of their information supply – not least so as to avoid involuntarily violating applicable law, but also to ensure that academics are appropriately informed and protected. AWBI’s aim in issuing this briefing paper is to encourage a broad debate within the academic community – at the level of academic decision-makers, among academics, and within information infrastructure institutions – so as to reflect on the practice of tracking, its legality, the measures required for compliance with data protection and the consequences of the aggregation of usage data, thereby enabling such measures to be adopted. The collection of data on research and research activity can be useful as long as it follows clear-cut, transparent guidelines, minimises risks to individual researchers and ensures that academic organisations are able to use such data, if not control it.”

Clarivate to Acquire ProQuest – The Scholarly Kitchen

“Yesterday’s news that Clarivate will acquire ProQuest, valued at $5.3 billion, is the largest transaction in recent memory in the scholarly information sector. Both companies are intermediaries — they each work extensively with publishers and libraries — and each has extensive interests in discovery, a lynchpin service in the research ecosystem. Will this transaction result in dramatically strengthened products and improved services for researchers, as its proponents foresee? Or will it result in information enclosure, lock-in, service deterioration, and price increases, as detractors forewarn? One thing is for certain: In Clarivate CEO Jerre Stead’s proclamation that “enterprise software is the fastest growing library market,” we can see the monetization of Lorcan Dempsey’s wry observation that “workflow is the new content.” …”

University Rankings and Governance by Metrics and Algorithms | Zenodo

Abstract: This paper looks closely at how data analytics providers leverage rankings as part of their strategies to further extract rent and assets from the university beyond their traditional roles as publishers and citation data providers. A multinational publisher such as Elsevier, with over 2,500 journals in its portfolio, has transitioned to become a data analytics firm. Rankings expand its ability to further monetize its existing journal holdings, as there is a strong association between publication in high-impact journals and improvement in rankings. The global academic publishing industry has become highly oligopolistic, and a small handful of legacy multinational firms now publish the majority of the world’s research output (see Larivière et al., 2015; Fyfe et al., 2017; Posada & Chen, 2018). It is therefore crucial that their roles and enormous market power in influencing university rankings be more closely scrutinized. We suggest that, owing to a lack of transparency regarding, for example, Elsevier’s data services and products, combined with its self-positioning as a key intermediary in the commercial rankings business, the company has managed to evade the social responsibilities and scrutiny that come with occupying such a critical public function in university evaluation. As the quest for ever-higher rankings often works in conflict with universities’ public missions, it is critical to raise questions about the governance of such private digital platforms and the compatibility between their private interests and the maintenance of universities’ public values.


Manipulation of bibliometric data by editors of scientific journals

“Such misuse of terms not only justifies the research bureaucracy’s erroneous practice of evaluating research performance on those terms but also encourages editors of scientific journals and reviewers of research papers to ‘game’ the bibliometric indicators. For instance, if a journal seems to lack an adequate number of citations, the editor of that journal might decide to make it obligatory for its authors to cite papers from the journal in question. I know an Indian journal of fairly reasonable quality in terms of several other criteria, but I can no longer consider it so because it forces authors to include unnecessary (that is, plainly false) citations to papers in that journal. Any further assessment of this journal that includes self-citations will lead to a distorted measure of its real status….

An average paper in the natural or applied sciences lists at least 10 references [1]. Some enterprising editors have taken this number to be the minimum for papers submitted to their journals. Such a norm is enforced in many journals from Belarus, and we authors are now so used to it that we do not even realize the distortions it creates in bibliometric data. Indeed, I often notice that some authors – merely to meet the norm of at least 10 references – cite very old textbooks and Internet resources with URLs that are no longer valid. The average for a good paper may be more than 10 references, and a paper with fewer than 10 references may yet be a good paper (Einstein’s first paper did not have even one reference in its original version!). I believe that it is up to a peer reviewer to judge whether the author has given enough references and whether they are suitable, and it is not for a journal’s editor to set any mandatory quota for the number of references….

Some international journals intervene arbitrarily to revise the citations in articles they receive: I submitted a paper with my colleagues to an American journal in 2017, and one of the reviewers demanded that we replace the Russian-language references with references in English. Two of us responded with a correspondence note titled ‘Don’t dismiss non-English citations’, which we then submitted to Nature: in publishing that note, the editors of Nature removed some references – from the very paper [2] that condemned the practice of replacing an author’s references with those more to the editor’s liking – and replaced them with a perhaps more relevant reference to a paper that we had not even read at the time! …

Editors of many international journals are now looking not for quality papers but for papers that will not lower the impact factor of their journals….”

Open access book usage data – how close is COUNTER to the other kind?

Abstract: In April 2020, the OAPEN Library moved to a new platform, based on DSpace 6. During the same period, IRUS-UK started working on the deployment of Release 5 of the COUNTER Code of Practice (R5). This is, therefore, a good moment to compare two widely used usage metrics – R5 and Google Analytics (GA). This article discusses the download data of close to 11,000 books and chapters from the OAPEN Library, from the period 15 April 2020 to 31 July 2020. When a book or chapter is downloaded, it is logged by GA and at the same time a signal is sent to IRUS-UK. This results in two datasets: the monthly downloads measured in GA and the usage reported by R5, also clustered by month. The number of downloads reported by GA is considerably larger than that reported by R5: the total number of downloads in GA for the period is over 3.6 million, while R5 reports 1.5 million, around 400,000 downloads per month. Contrasting R5 and GA data on a country-by-country basis shows significant differences. GA lists more than five times the number of downloads for several countries, although the totals for other countries are about the same. When looking at individual titles, of the 500 highest-ranked titles in GA that are also among the 1,000 highest-ranked titles in R5, only 6% of the titles are relatively close together. The choice of metric service has considerable consequences for what is reported. Thus, conclusions should be drawn from the results with care. One metric is not better than the other, but we should be open about the choices made. After all, open access book metrics are complicated, and we can only benefit from clarity.
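Mechanically, the comparison the abstract describes is a join of two monthly per-title datasets. A rough sketch, with invented file and column names, of how the totals and per-title gaps could be tabulated:

```python
# Join GA and COUNTER R5 monthly download counts per title and compare.
# File names and columns ("title", "month", "downloads") are assumptions.
import pandas as pd

ga = pd.read_csv("ga_downloads.csv")   # title, month, downloads
r5 = pd.read_csv("r5_downloads.csv")   # title, month, downloads

merged = ga.merge(r5, on=["title", "month"], suffixes=("_ga", "_r5"))

# Overall totals, mirroring the 3.6 million (GA) vs 1.5 million (R5) gap.
print("GA total:", merged["downloads_ga"].sum())
print("R5 total:", merged["downloads_r5"].sum())

# Per-title ratio: where does GA diverge most from R5?
per_title = merged.groupby("title")[["downloads_ga", "downloads_r5"]].sum()
per_title["ga_to_r5"] = per_title["downloads_ga"] / per_title["downloads_r5"]
print(per_title.sort_values("ga_to_r5", ascending=False).head(10))
```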


Cureus | Scientometric Data and Open Access Publication Policies of Clinical Allergy and Immunology Journals

Abstract:

Introduction

The scientific merit of a paper and its ability to reach broader audiences are essential for scientific impact. Scientific merit is therefore measured by scientometric indexes, and journals are increasingly publishing papers as open access (OA). In this study, we present scientometric data for journals in clinical allergy and immunology and compare the scientometric data of journals in terms of their all-OA and hybrid-OA publication policies.

Methods

Data were obtained from Clarivate Analytics InCites, Scimago Journal & Country Rank, and journal websites. A total of 35 journals were evaluated for bibliometric data, journal impact factor (JIF), SCImago Journal Rank (SJR), Eigenfactor score (ES), and Hirsch index (h-index). Article publishing charges (APCs) were recorded in US dollars (USD).

Results

The most common publication policy was hybrid-OA (n = 20). The median OA publishing APC was 3000 USD. Hybrid-OA journals charged a higher APC than all-OA journals (3570 USD vs. 675 USD, p = 0.0001). Very strong positive correlations were observed between SJR and JIF and between ES and h-index. All the journals in the h-index and ES first quartiles were hybrid-OA journals.

Conclusion

Based on these results, we recommend using SJR and ES together to evaluate journals in clinical allergy and immunology. Although there is a wide APC gap between all-OA and hybrid-OA journals, all journals within the first quartiles for h-index and ES were hybrid-OA. Our results conflict with the literature stating that use of the OA publication model increases citation counts.
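The abstract does not name the statistical tests behind the p = 0.0001 comparison or the correlations, but given the reported medians and the skew typical of APC data, a nonparametric comparison and a rank correlation are plausible choices. A sketch under those assumptions, with an invented dataset:

```python
# Hypothetical reanalysis sketch: compare APCs of hybrid-OA vs all-OA
# journals and correlate SJR with JIF. The CSV and its columns
# ("policy", "apc_usd", "sjr", "jif") are assumptions for illustration.
import pandas as pd
from scipy.stats import mannwhitneyu, spearmanr

journals = pd.read_csv("allergy_journals.csv")

hybrid = journals.loc[journals["policy"] == "hybrid-OA", "apc_usd"]
all_oa = journals.loc[journals["policy"] == "all-OA", "apc_usd"]
u_stat, p = mannwhitneyu(hybrid, all_oa, alternative="two-sided")
print(f"APC, hybrid vs all-OA: U={u_stat:.1f}, p={p:.4f}")

rho, p_rho = spearmanr(journals["sjr"], journals["jif"])
print(f"SJR vs JIF: Spearman rho={rho:.2f}, p={p_rho:.4f}")
```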

Web analytics for open access academic journals: justification, planning and implementation | BiD: textos universitaris de biblioteconomia i documentació

Abstract: An overview is presented of resources and web analytics strategies useful for setting up solutions that capture usage statistics and assess audiences for open access academic journals. A set of metrics complementary to citations is proposed to help journal editors and managers provide evidence of the performance of the journal as a whole, and of each article in particular, in the web environment. The measurements and indicators selected seek to generate added value for editorial management in order to ensure its sustainability. The proposal is based on three areas: counts of visits and downloads, optimization of the website alongside campaigns to attract visitors, and preparation of a dashboard for strategic evaluation. It is concluded that, by creating web performance measurement plans based on the resources and proposals analysed, journals may be in a better position to plan data-driven web optimization, attract authors and readers, and offer the accountability that the actors involved in the editorial process need in order to assess their open access business model.
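Of the three areas, the first (counts of visits and downloads) is the easiest to bootstrap without a full analytics service, since the journal's own server logs already record every download. A minimal sketch, assuming a common-log-format access log and a made-up URL pattern for article PDFs:

```python
# Count successful PDF downloads per article from a web server access log.
# The log file name and the /article/<id>/pdf URL pattern are assumptions.
import re
from collections import Counter

pdf_request = re.compile(r'"GET (/article/\d+/pdf) HTTP/[\d.]+" 200')

counts = Counter()
with open("access.log", encoding="utf-8") as log:
    for line in log:
        match = pdf_request.search(line)
        if match:
            counts[match.group(1)] += 1

for path, n in counts.most_common(10):
    print(f"{n:6d}  {path}")
```

Counts like these would feed the dashboard the article proposes; filtering out known bots and double-clicks, as COUNTER does, is the obvious next refinement.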


Jisc partners with Unsub to evaluate UK university journal subscriptions | Jisc

“Jisc has announced that it will be using Unsub, an analytics dashboard, to help evaluate journal agreements that UK universities hold with publishers.

The dashboard, created in 2019 by the not-for-profit software company Our Research, can produce forecasts of different journal subscription scenarios, giving Jisc insight into the costs and benefits of subscription packages for each university and across the consortium. …”
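Unsub's actual forecasting model is richer than this, but the core of any such subscription scenario is simple arithmetic: weigh the package price against what the usage not already covered by open access would cost to fulfil another way. A toy illustration, with every figure invented:

```python
# Toy subscription-scenario arithmetic; every number below is invented.
subscription_cost = 250_000   # annual package price (USD), assumed
annual_downloads = 80_000     # total usage across the package, assumed
open_access_share = 0.45      # fraction of usage already available OA, assumed
per_request_price = 35.0      # assumed cost per unmet request (ILL / pay-per-view)

paid_usage = annual_downloads * (1 - open_access_share)
cancel_cost = paid_usage * per_request_price

print(f"Cost per use while subscribed: {subscription_cost / annual_downloads:.2f} USD")
print(f"Estimated annual cost if cancelled: {cancel_cost:,.0f} USD")
print("Cancelling looks cheaper" if cancel_cost < subscription_cost else "Keeping looks cheaper")
```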