The Horrors of Good Intentions: Told through the story of a dark repository.

“The library manages a dark repository, named Dark Blue because I have no imagination, for material needing preservation but not public access, such as preservation copies of digitized moving images and in-process born-digital material. You can read more about the implementation of this repository in this 2018 post. It is fair to say that Dark Blue has had some growing pains over the past few years, including incorrect packaging of material and broken deposit and withdrawal workflows. While these sound like technical problems, the thesis of this post is that our troubles with Dark Blue stem not from bad systems or policies, but from the limitations of people and time, and from choosing to do the “nice” thing over the realistic thing….”

The curious internal logic of open access policymaking – Samuel Moore

“This week, the White House Office of Science and Technology Policy (OSTP) declared 2023 its ‘Year of Open Science’, announcing ‘new grant funding, improvements in research infrastructure, broadened research participation for emerging scholars, and expanded opportunities for public engagement’. This announcement builds on the OSTP’s open access policy announcement last year that will require immediate open access to federally-funded research from 2025. Given the state of the academic publishing market, and the tendency for US institutions to look towards market-based solutions, such a policy change will result in more article-processing charge payments and, most likely, publishing agreements between libraries and academic publishers (as I have written about elsewhere). The OSTP’s policy interventions will therefore hasten the marketisation of open access publishing by further cementing the business models of large commercial publishers — having similar effects to the policy initiatives of European funders.

As the US becomes more centralised and maximalist in its approach to open access policymaking, European institutions are taking a leaf out of the North American book by implementing rights retention policies — of the kind implemented by Harvard in 2008 and adopted widely in North America thereafter. If 2023 will be the ‘year of open science’ in the USA, it will surely be the year of rights retention in Europe. This is largely in response to funders now refusing to pay APCs for hybrid journals — a form of profiteering initially permitted by many funders who now realise the error of their ways. With APC payments prohibited, researchers need rights retention to continue publishing in hybrid journals while meeting their funder requirements….”

Publication and data surveillance in academia | Research Information

“It is becoming increasingly clear that the core functions of higher education are destined to be quantified and that this data will be harvested, curated, and repackaged through a variety of enterprise management platforms. All aspects of the academic lifecycle, such as research production, publication, distribution, impact determination, citation analysis, grant award trends, graduate student research topics, and more can be sold, analysed, and gamed to an unhealthy degree. By unhealthy, we mean constricted and self-consuming, as the output we develop is directly contingent on the input we receive. Well-meaning tools, such as algorithmically derived research suggestions and citation analysis, create a shrinking and inequitable academic landscape that favours invisibly defined metrics of impact, which are reinforced through further citation, thereby limiting the scope and scale of research available….

As the shift to open access gains momentum, there is a danger of unintended consequences as enterprise platforms seek to maximise profit while the models shift from under their feet. As Alexander Grossmann and Björn Brembs discuss, the cost creep incurred by libraries reflects this pivot to a model of author costs, which are often supported by libraries, thereby shifting costing methods from the back-end subscription model to the front-end pay-to-publish model. It is not surprising or controversial that for-profit enterprise, database, and academic platform vendors seek to turn a profit. We should remain vigilant, however, about academia’s willingness to find the easy and convenient solution without considering the longer-term effects of what is being sold. In a recent industry platform webinar, academic enterprise representatives discussed the “alchemy” of user-derived data and their ability to repackage and sell this data, with consent, to development companies, with their key takeaway being a drive towards increased revenue. More to the point, they had learned the lessons of the tech industry, and more specifically of the social media companies, in understanding that the data we generate can be used to target us, to sell to us, and to use us for further development. They discussed the ways in which the use of this data would become, like social media, intelligent and drive user behaviour – further cinching the knot on the closed loop as algorithmically based suggestions further constrain research and reinforce a status quo enabled by profit motive in the guise of engagement, use, and reuse….”

Data for Good Can’t be a Casualty of Tech Restructuring • CrisisReady

“Technology companies like Meta, Twitter and Amazon are laying off thousands of employees as part of corporate restructuring in an uncertain global economy. In addition to jobs, many internal programs deemed unnecessary or financially infeasible may be lost. Programs that fall under the rubric of “corporate social responsibility” (CSR) are generally the first casualties of restructuring. CSR efforts include “data for good” programs designed to translate anonymized corporate data into social good and may be seen in the current climate as a way that companies cater to employee values or enable friendlier regulatory environments; in other words, nice-to-haves rather than need-to-haves for the bottom line.  

We believe the platforms built to safely and ethically share corporate data to support public policy are not a luxury that companies should jettison or monetize. The data we produce in our daily lives has become integral to how public decisions are made when planning for public health or disaster response. Our 21st century public data ecosystem is increasingly reliant on novel private data streams that corporations own and currently share only conditionally and, increasingly, for profit….

We contend that the rapid sharing of aggregated and anonymized location data with disaster response and public health agencies should be automatic and free — though conditional on strict privacy protocols and time-limited — during acute emergencies….

While the challenges to realizing the full value of private data for public good are many, there is precedent for a path forward. Two decades ago, the International Charter on Space and Major Disasters was negotiated to facilitate access to satellite data from companies and governments for the sake of responding to major disasters. A similar approach guaranteeing access rights to privately held data for good during emergencies is more important now….”

On the culture of open access: the Sci-hub paradox | Research Square

Abstract:  Shadow libraries have gradually become key players in scientific knowledge dissemination, despite their illegality in most countries of the world. Many publishers and scientist-editors decry such libraries for their copyright infringement and the loss of publication usage information, while some scholars and institutions support them, sometimes in a roundabout way, for their role in reducing inequalities of access to knowledge, particularly in low-income countries. Although there is a wealth of literature on shadow libraries, none of it has focused on their potential role in knowledge dissemination through the open access movement. Here we analyze how shadow libraries can affect researchers’ citation practices, highlighting some counter-intuitive findings about their impact on the Open Access Citation Advantage (OACA). Based on a large randomized sample, this study first shows that OA publications, including those in fully OA journals, receive more citations than their subscription-based counterparts do. However, the OACA has slightly decreased over the last seven years. Distinguishing subscription-based publications by whether or not they are accessible via the Sci-hub platform suggests that the generalization of Sci-hub use cancels out the positive effect of OA publishing. The results show that publications in fully OA journals (and to a lesser extent those in hybrid journals) are victims of the success of Sci-hub. Thus, paradoxically, although Sci-hub may seem to facilitate access to scientific knowledge, it negatively affects the OA movement as a whole, by reducing the comparative advantage of OA publications in terms of visibility for researchers. The democratization of the use of Sci-hub may therefore lead to a vicious cycle against the development of fully OA journals.
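
To make the comparison in the abstract concrete, here is a minimal sketch (not the authors’ code; the records, field names, and numbers are hypothetical) of computing an OA citation advantage as a ratio of mean citation counts, with the subscription-based group split by Sci-hub availability as the study describes:

```python
# Hypothetical illustration of an OACA-style comparison; not the study's code.
from statistics import mean

# Each record: citation count, OA status, and Sci-hub availability (toy data).
publications = [
    {"citations": 12, "oa": True,  "on_scihub": True},
    {"citations": 9,  "oa": True,  "on_scihub": False},
    {"citations": 4,  "oa": False, "on_scihub": True},
    {"citations": 2,  "oa": False, "on_scihub": False},
]

def mean_citations(records):
    return mean(r["citations"] for r in records)

oa_pubs = [r for r in publications if r["oa"]]
closed_pubs = [r for r in publications if not r["oa"]]
# The abstract's key move: split the subscription-based group by whether
# readers can reach the paper through Sci-hub anyway.
closed_on_scihub = [r for r in closed_pubs if r["on_scihub"]]

# OACA expressed as a ratio of mean citation counts; > 1 means an advantage.
print("OACA vs all subscription-based:",
      mean_citations(oa_pubs) / mean_citations(closed_pubs))
print("OACA vs subscription-based but on Sci-hub:",
      mean_citations(oa_pubs) / mean_citations(closed_on_scihub))
```

A real analysis would also control for publication year, field, and journal, as the study’s randomized sampling implies; the point of the sketch is only that the measured advantage shrinks once Sci-hub-accessible paywalled papers are treated as de facto accessible.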

The great convergence – Does increasing standardisation of journal articles limit intellectual creativity? | Impact of Social Sciences

“To be sure, plenty of original research across many disciplines is regularly published in otherwise conventional formats, and even producing a relatively conventional article in STS is not exactly a trivial matter (as we can attest from experience). Yet, we also believe that especially in interpretive fields, the perceived generative potential of research lies in enabling contributions that readers will find original, critical or otherwise inspiring. It is precisely this potential to generate surprise on a conceptual level that is at risk when a typical convention of how to frame arguments becomes too strong. Will STS be open and welcoming to diverse and varied intellectual traditions and concepts with this increasingly dominant typical article format? On the basis of our findings, we are not so sure.”

Beyond the fetish of open – Open Future

“Openness in technical architectures does not automatically lead to beneficial, progressive social, political and economic structures. Open (access) resources, without dedicated custodians, are prone to degradation or exploitation. Inviting open markets to address some of these issues brings about the problems of commodification. Absent resilient, mission- and community-specific governance systems, open social, economic, political networks and communities remain vulnerable to external interference, exploitation, or simple degradation. If openness means the free circulation of ideas, the porousness of community boundaries, the lack of ossified power relations, the fluidity of social, economic, cultural, political norms, structures, flows and processes, then this openness will always be under the threat of being appropriated and abused.

While openness has traditionally been framed as a source of resilience, it has become increasingly clear that open systems, societies and resources can also be extremely vulnerable. So, instead of focusing on openness as a magical solution that works equally well in all possible contexts, we may want to ask ourselves: what do we hope to achieve with openness? What may be the alternatives to openness that would allow us to achieve the same goals in a specific context? And most importantly, what precautions do we need to take to protect open resources from abuse?”

Pandemic and infodemic: the role of academic journals and preprints | SpringerLink

“In contrast, before the outbreak of the COVID-19 pandemic, clinical researchers were generally reluctant to adopt widespread sharing of preprints, probably because of concern about the potential harm to patients if medical treatment is based on findings that have not been vetted by peer reviewers. For example, the BMJ group opened a preprint server (ClinMedNetPrints.org) in 1999, but closed it in 2008 because only around 80 submissions had been posted during this period [7]. Cold Spring Harbor Laboratory launched a new server, bioRxiv, in 2013 and, together with the BMJ group and Yale University, medRxiv in 2019 [7], but clinical researchers did not actively use them.

The outbreak of the COVID-19 pandemic prompted clinical researchers to make active use of preprint servers, and during the first few years of the pandemic more than 35,000 preprints, mainly related to COVID-19, were posted to medRxiv. This marked increase in the posting of preprints indicates that clinical researchers have found benefits in preprints in the era of COVID-19: research outcomes can be disseminated quickly, potentially speeding up research that may lead to the development of vaccines and treatments; the quality of a draft can be improved by receiving feedback from a wider group of readers; authors can claim priority for their discoveries; and, unlike articles published in subscription-based journals, all preprints are freely available to anyone….”

The challenge of preprints for public health

“Despite disagreements over whether this form of publication is actually beneficial, its advantages and problems show a high degree of convergence among advocates and detractors. On the one hand, preprints are beneficial because they are a quicker way to disseminate scientific content with open access for everyone; on the other hand, the lack of adequate vetting, especially peer review, increases the risk of disseminating bad science and can lead to several problems [2]. The dissent lies in considering to what extent the possible risks outweigh the possible benefits (or vice versa).

The argument about rapid dissemination has strong supporting evidence. A study on preprint publication showed that preprints are published on average 14 months earlier than peer-reviewed journal articles [1]. This is expected, considering that the time-intensive process of peer review and manuscript revision is bypassed entirely. However, in this strength lies its very fragility: how can we ensure that this shorter process will not compromise the quality of the publication?

ASAPbio (Accelerating Science and Publication in Biology) [3] is a group of biology researchers that promotes preprint publication and has produced a number of studies attempting to allay concerns about quality, claiming, for example, that published articles previously submitted to a preprint server did not show relevant changes upon publication [4]. Authors from this group have argued that current approaches to evaluating research and researchers hold back wider adoption of preprints [5], which would explain their relatively small share of the general panorama of scientific publication.

Despite claims to the contrary, however, there are examples of poor studies published as preprints that have caused undesirable consequences for public health. Two methodologically flawed studies about a protective effect of tobacco smoking against COVID-19 (one of which had an author with known connections to the tobacco industry), for example, increased the commercialization of tobacco products in France and Iran [6], and a virology study that erroneously stated that the SARS-CoV-2 virus had “HIV insertions” fueled conspiracy theories about the virus being a bioweapon, which lingered on even after the preprint was removed from the server due to its egregious errors [7]. Studies have found that much of the public discussion, and even policy, was driven by what was published in preprints rather than in scientific journals [7,8,9,10]; thus, quality issues are a major cause of concern.

On the other hand, similar errors have been observed in traditional publishing; the publication of a poor-quality paper with undisclosed conflicts of interest in one of the most prestigious medical journals, The Lancet, which became the trigger for the contemporary wave of anti-vaccine activism, is a major, and regrettable, example. Understanding to what extent this problem is likely to occur with or without gatekeeping mechanisms is necessary….”

Open access and the evolving academic publishing landscape of the water sector | Water International

“The launch of many new water journals in recent years is a testament to the growth and importance of water research as a problematique, that is, as both a problem in and of itself and as an important correlate of other global challenges. As entire regions start to run dry or suffer repeated flooding due to climate change, it is more important than ever to understand water availability, quality, use and governance. And as the burgeoning industry of ‘nexus’ studies shows, researchers and policymakers have discovered that, indeed, most elements of society are linked to water. This is a great time to be a water scholar, with exciting new opportunities to collaborate with researchers from across the natural and social sciences, engineering, and humanities. Water scholars have also initiated many new journals, book series, etc., that clamour for our insights and academic production. But there are tensions too, linked to the perhaps too-rapid proliferation of journals, their transition to open access (OA) business models, and the unhealthy ways in which these are linked to career prospects for water scholars.”

WHO guiding principles for pathogen genome data sharing

“The world needs timely, high-quality and geographically representative sharing of pathogen genome data in as close to real time as possible. When pathogen genome data is shared nationally and internationally, it helps to prevent, detect, and respond to epidemics and pandemics. Regular collection and sharing of pathogen genome data is also crucial for endemic diseases, especially for pathogens that are resistant to antimicrobials and require regularly updated policies. Genomic surveillance is critical for early warning of new epidemics, for monitoring the evolution of infectious disease agents, and for developing diagnostics, medicines and vaccines. This technology has been crucial in the response to the COVID-19 pandemic, from identifying a novel coronavirus to developing the first diagnostic tests and vaccines, to tracking and identifying new variants. …”

Does Scholarly Publishing Have an Innovation Problem? – The Scholarly Kitchen

“At PLOS, we’ve been in deep conversation over the past few months with a number of people from four core groups: researchers, senior university administrators, funders, and librarians. Our conversations have been with a selected group who are engaged with the transition to open science (and weighted towards the biomedical sciences) so there’s clearly some inbuilt bias. But I think that my key takeaways have wider relevance….

While our stakeholders were all staunch supporters of open research, we heard significant divergence from librarians about business models. European library budgeting and negotiating is still heavily linked to the legacy of APCs and assessed by cost per article. In the US, every librarian we spoke to was strongly anti-APC. But all librarians were deeply frustrated with the pain and cost of managing OA deals – whatever their nature – and copyright terms across publishers. And all stakeholder groups expressed concern that moves towards open research would lead to further “land grabs” by large publishers to control yet more of the research enterprise….”

Google Scholar – Platforming the Scholarly Economy | Internet Policy Review

Abstract:  Google Scholar has become an important player in the scholarly economy. Whereas typical academic publishers sell bibliometrics, analytics and ranking products, Alphabet, through Google Scholar, provides “free” tools for academic search and scholarly evaluation that have made it central to academic practice. Leveraging political imperatives for open access publishing, Google Scholar has managed to intermediate data flows between researchers, research managers and repositories, and built its system of citation counting into a unit of value that coordinates the scholarly economy. At the same time, Google Scholar’s user-friendly but opaque tools undermine certain academic norms, especially around academic autonomy and the academy’s capacity to understand how it evaluates itself.

Access to chemical database Reaxys under threat in UK as fees spiral | Chemistry World

Concerns have been raised that institutional access to the Reaxys chemical and reactions database could end at universities across the UK in a row over rising costs. The dispute over subscription fees is being described as a potentially significant problem for chemists in the UK, and maybe worldwide.

Reaxys incorporates Beilstein – the largest organic chemistry database – and Gmelin – a sizeable repository of organometallic and inorganic compounds – as well as other key chemistry databases and resources. Launched in 2009 and licensed by commercial publishing giant Elsevier, Reaxys enables research chemists to search for and find chemical compounds, reactions, properties and synthesis planning information. It also includes the chemical patent literature.

The Joint Information Systems Committee (Jisc), an organisation that assists UK universities with digital resources and negotiates on behalf of the UK higher education and research sector, is currently in talks with Elsevier to make institutional access to Reaxys more affordable.

[…]