SciELO – Public Health – The challenge of preprints for public health

“ASAPbio (Accelerating Science and Publication in Biology) 3 is a group of biology researchers that promotes preprint publication and has produced a number of studies attempting to allay concerns about preprint quality, claiming, for example, that published articles previously submitted to a preprint server did not show relevant changes upon publication 4. Authors from this group have argued that current approaches to evaluating research and researchers hold back a more widespread adoption of preprints 5, which would explain their relatively small share of the overall panorama of scientific publication.

 

Despite claims to the contrary, however, there are examples of poor studies published as preprints that caused undesirable consequences for public health. Two methodologically flawed studies on a protective effect of tobacco smoking against COVID-19 (one of which had an author with known connections to the tobacco industry), for example, increased the commercialization of tobacco products in France and Iran 6, and a virology study that erroneously stated that the SARS-CoV-2 virus had “HIV insertions” fueled conspiracy theories about the virus being a bioweapon, which lingered even after the preprint was removed from the server due to its egregious errors 7. Studies have found that much of the public discussion, and even policy, was driven by what was published in preprints rather than in scientific journals 7,8,9,10; thus, quality issues are a major cause for concern.

 

On the other hand, similar errors have been observed in traditional publishing; the publication of a poor-quality paper with undisclosed conflicts of interest in one of the most prestigious medical journals, The Lancet, which became the trigger for the contemporary wave of anti-vaccine activism, is a major, and regrettable, example. Understanding to what extent this problem is likely to occur with or without gatekeeping mechanisms is necessary.

 

Preprint advocates have countered that the effect of poor science disseminated via preprints would be lessened by media reporting that explicitly indicated that those studies had not undergone peer review and thus warranted more criticism and reserve before being considered essential sources for public debate. This was probably the case for South African media 8, but in Brazil, a study found that less than 40% of preprint-based reports in the mass media clearly indicated their provisional character 11….”

Preprints in Health Professions Education: Raising Awareness… : Academic Medicine

Abstract:  A preprint is a version of a research manuscript posted by its authors to a preprint server before peer review. Preprints are associated with a variety of benefits, including the ability to rapidly communicate research, the opportunity for researchers to receive feedback and raise awareness of their research, and broad and unrestricted access. For early-career researchers, preprints also provide a mechanism for demonstrating research progress and productivity without the lengthy timelines of traditional journal publishing. Despite these benefits, few health professions education (HPE) research articles are deposited as preprints, suggesting that preprinting is not currently integrated into HPE culture. In this article, the authors describe preprints, their benefits and related risks, and the potential barriers that hamper their widespread use within HPE. In particular, the authors propose the barriers of discordant messaging and the lack of formal and informal education on how to deposit, critically appraise, and use preprints. To mitigate these barriers, several recommendations are proposed to facilitate preprints in becoming an accepted and encouraged component of HPE culture, allowing the field to take full advantage of this evolving form of research dissemination.

 

Open Science: Emergency Response or the New Normal? | Acta Médica Portuguesa

From Google’s English:  “To align with open science, the assessment of research and researchers has to be broader, valuing all contributions and results (and not just publications), and adopting an essentially qualitative perspective based on peer review, with limited and responsible use of quantitative indicators. There has also been slow progress in this domain, but it is hoped that the recently presented Agreement on Reforming Research Assessment and the Coalition for Advancing Research Assessment 10 will speed up and broaden the transformation of the assessment process. If the three conditions mentioned above are met in the coming years, open science will no longer be just the science of emergencies. Open and collaborative research practices, with rapid dissemination of results, could become dominant, being considered the correct way of doing science, without the need to designate them as open science.”

Antibiotic discovery in the artificial intelligence era – Lluka – Annals of the New York Academy of Sciences – Wiley Online Library

Abstract:  As the global burden of antibiotic resistance continues to grow, creative approaches to antibiotic discovery are needed to accelerate the development of novel medicines. A rapidly progressing computational revolution—artificial intelligence—offers an optimistic path forward due to its ability to alleviate bottlenecks in the antibiotic discovery pipeline. In this review, we discuss how advancements in artificial intelligence are reinvigorating the adoption of past antibiotic discovery models—namely natural product exploration and small molecule screening. We then explore the application of contemporary machine learning approaches to emerging areas of antibiotic discovery, including antibacterial systems biology, drug combination development, antimicrobial peptide discovery, and mechanism of action prediction. Lastly, we propose a call to action for open access of high-quality screening datasets and interdisciplinary collaboration to accelerate the rate at which machine learning models can be trained and new antibiotic drugs can be developed.

 

The challenge of preprints for public health

“Despite disagreements over whether this form of publication is actually beneficial, its advantages and problems show a high degree of convergence between advocates and detractors. On the one hand, preprints are beneficial because they are a quicker way to disseminate scientific content with open access to everyone; on the other hand, the lack of adequate vetting, especially peer review, increases the risk of disseminating bad science and can lead to several problems 2. The dissent lies in considering to what extent the possible risks outweigh the possible benefits (or vice versa).

 

The argument for rapid dissemination has strong supporting evidence. A study on preprint publication showed that preprints are published on average 14 months earlier than peer-reviewed journal articles 1. This is expected, considering that the time-intensive process of peer review and manuscript revision is bypassed entirely. However, in this strength lies its very fragility: how can one be assured that this shorter process will not compromise the quality of the publication?…”

The challenge of preprints for public health

“Preprints are “a form of a scholarly article which is not peer-reviewed yet but made available either as paper format or electronic copy” 1. After an early attempt by the U.S. National Institutes of Health in the early 1960s, the format really took hold in the early 1990s, first as an email server at Los Alamos National Laboratory, which later became a web service known as arXiv 1. In the following years, the number of both preprint servers and total preprints submitted to web services increased considerably; however, preprints are still a small fraction (6.4%) of the total output of scientific publication 1….”

WHO guiding principles for pathogen genome data sharing

“The world needs timely, high quality and geographically representative sharing of pathogen genome data in as close to real time as possible. When pathogen genome data is shared nationally and internationally, it helps to prevent, detect, and respond to epidemics and pandemics. Regular collection and sharing of pathogen genome data is also crucial for endemic diseases, especially for pathogens that are resistant to antimicrobials and require regularly updated policies. Genomic surveillance is critical for early warning of new epidemics, to monitor the evolution of infectious disease agents, and develop diagnostics, medicines and vaccines. This technology has been crucial in our response to the COVID-19 pandemic, from identifying a novel coronavirus to developing the first diagnostic tests and vaccines, to tracking and identifying new variants. …”

Understanding the Increasing Market Share of the Academic Publisher “Multidisciplinary Digital Publishing Institute” in the Publication Output of Central and Eastern European Countries: A Case Study of Hungary

As the open access movement has gained widespread popularity in the scientific community, academic publishers have gradually adapted to the new environment. The pioneer open access journals have turned themselves into megajournals, and the subscription-based publishers have established open access branches and turned subscription-based journals into hybrid ones. Perhaps the most dramatic outcome of the open access boom is the market entry of such fast-growing open access publishers as Frontiers and Multidisciplinary Digital Publishing Institute (MDPI). By 2021, in terms of the number of papers published, MDPI had become one of the largest academic publishers worldwide. However, the publisher’s market shares across countries and regions show an uneven pattern. Whereas in such scientific powers as the United States and China MDPI has remained a relatively small-scale player, it has gained a high market share in Europe, particularly in the Central and Eastern European (CEE) countries. In 2021, 28 percent of the SCI/SSCI papers authored/co-authored by researchers from CEE countries were published in MDPI journals, a share as high as the combined share of papers published by Elsevier and Springer Nature, the two largest academic publishers in the world. This paper seeks an explanation for the rapidly growing share of MDPI in the publication outputs of CEE countries, choosing Hungary as a case study. To do this, we employ data analysis to reveal some unique features of MDPI. We then present the results of a questionnaire survey conducted among Hungary-based researchers regarding MDPI and the factors that motivated them to publish in MDPI journals. Our results show that researchers generally consider MDPI journals sufficiently prestigious, emphasizing the importance of the inclusion of MDPI journals in the Scopus and Web of Science databases and their high ranks and impacts. However, most researchers posit that the quick turnaround time that MDPI journals offer is the top driver of publishing in such journals.

Does the Peer Review Process Need Blockchain? – NEO.LIFE

“Another major change in scientific publishing could come from the same blockchain-based infrastructure that’s enabling the rise of the rest of decentralized science. Washington University faculty member and VitaDAO core contributor Tim Peterson proposed his own peer review alternative, called The Longevity Decentralized Review (TLDR), and is assembling a team of editors to begin reviewing papers on longevity and aging….

 

TLDR works a lot like Reddit: First researchers post their work publicly, either directly or to numerous so-called “pre-print” servers like bioRxiv or medRxiv. These have been around for several years but became much more influential during the COVID-19 pandemic because of the speed with which they could bring research to other scientists. Reviewers get paid by the TLDR site, which is funded through charitable donations and from anyone who would like their manuscript peer-reviewed. VitaDAO is one of the TLDR backers, offering $VITA tokens for peer review of longevity-related projects of interest to VitaDAO. It’s anybody’s guess whether this will result in meaningful income to reviewers, but it’ll be more than the zero dollars and zero cents they earn now….”

Panel: Trends in Peer Review of Open Access Preprints

“Speed of research is a major feature of open access preprint platforms like arXiv – formal peer review can follow later, after rapid distribution of results. However, as submissions to arXiv and other preprint servers have grown, many researchers are seeking new avenues for community feedback and peer review. At this panel discussion, leaders in preprints and peer review will discuss current trends in virtual overlay journals, open peer review, and more.”

 

Tips for requesting articles from Internet Archive on OCLC’s resource sharing network | OCLC

“Join us for a webinar on November 9 to learn how Internet Archive is now quickly fulfilling Interlibrary Loan (ILL) requests for articles at no charge from libraries that use WorldShare ILL, Tipasa, and ILLiad. Staff at Internet Archive (OCLC symbol: IAILL) supply articles fast—with an average turnaround time of 37 minutes on OCLC’s resource sharing network.”

Frontiers | neuPrint: An open access tool for EM connectomics

Abstract:  Due to advances in electron microscopy and deep learning, it is now practical to reconstruct a connectome, a description of neurons and the chemical synapses between them, for significant volumes of neural tissue. Smaller past reconstructions were primarily used by domain experts, could be handled by downloading the data, and performance was not a serious problem. But new and much larger reconstructions upend these assumptions. These networks now contain tens of thousands of neurons and tens of millions of connections, with yet larger reconstructions pending, and are of interest to a large community of non-specialists. Allowing other scientists to make use of this data needs more than publication—it requires new tools that are publicly available, easy to use, and able to handle large data efficiently. We introduce neuPrint to address these data analysis challenges. neuPrint contains two major components—a web interface and programmer APIs. The web interface is designed to allow any scientist worldwide, using only a browser, to quickly ask and answer typical biological queries about a connectome. The neuPrint APIs allow more computer-savvy scientists to make more complex or higher-volume queries. neuPrint also provides features for assessing reconstruction quality. Internally, neuPrint organizes connectome data as a graph stored in a neo4j database. This gives high performance for typical queries, provides access through a public and well-documented query language, Cypher, and will extend well to future, larger connectomics databases. Our experience is also an experiment in open science. We find that a significant fraction of the readers of the article proceed to examine the data directly. In our case, preprints worked exactly as intended, with data inquiries and PDF downloads starting immediately after preprint publication and little affected by formal publication later. From this we deduce that many readers are more interested in our data than in our analysis of it, suggesting that data-only papers can be well appreciated and that public data release can speed up the propagation of scientific results by many months. We also find that providing, and keeping, the data available for online access imposes substantial additional costs on connectomics research.
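The abstract describes neuPrint storing connectome data as a graph in neo4j and exposing it through the Cypher query language. As a rough illustration only (the neuron names, synapse counts, and helper function below are hypothetical, invented for this sketch, and not real neuPrint data or its API), the kind of connectivity question such a query expresses can be sketched in plain Python:

```python
# Toy sketch of a typical connectome query: "which neurons receive at
# least N synapses from neuron X?" All data here is hypothetical.
# Each edge: (pre-synaptic neuron, post-synaptic neuron, synapse count).
edges = [
    ("A", "B", 12),
    ("A", "C", 3),
    ("B", "C", 25),
    ("C", "A", 7),
]

def strong_downstream_partners(neuron, min_weight):
    """Return (partner, weight) pairs for post-synaptic partners of
    `neuron` connected by at least `min_weight` synapses. Roughly
    analogous to a Cypher pattern such as
    MATCH (a:Neuron)-[w]->(b) WHERE w.weight >= $min RETURN b, w.weight
    but filtered client-side over a plain edge list."""
    return sorted(
        (post, weight)
        for pre, post, weight in edges
        if pre == neuron and weight >= min_weight
    )

print(strong_downstream_partners("A", 5))  # [('B', 12)]
```

In the real system, the equivalent question would be sent as a Cypher query to the neo4j database behind neuPrint's web interface or APIs, with the server doing the pattern matching and filtering rather than the client.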

 

EU/EEA routine surveillance open data policy

“Aggregate EU/EEA epidemiological routine surveillance data shall be:

As open as possible and as closed as necessary to protect personal or commercially sensitive information;
Compatible with the FAIR principles, i.e. findable, accessible, interoperable and reusable;
Publicly and easily accessible to any interested party regardless of their motivation;
Accessible free of charge through a standardised open access license;
Shared as timely as possible….”

Early sharing not the only driver for preprint use | Research Information

“But what is interesting is that while early sharing came out as important for authors, it is not their only driving motivator when using and selecting such services and adopting more open research practices. Authors are looking for more integrated services and want those platforms to offer multiple features that not only enhance the sharing, development, and discoverability of their work, but also enable them to track and monitor its progress:

Transparency was the top feature for authors when selecting an integrated preprint service:

71 per cent of authors said that greater transparency of the peer review process at journals was useful. Through its integration with peer review, In Review enables authors to see specific details of peer review and track their article, providing a high level of transparency into an often ‘hidden’ process.

50 per cent of authors said that the more transparent the service was, the more credible they felt it was, as it enabled greater accountability for the journal.

Integrated early sharing – authors surveyed stated that ease of use (69 per cent) and being able to share their manuscript as a preprint at the same time as submitting it to a journal (BMC/Springer journals) (83 per cent) had an impact on where they chose to take their work. We also learnt that this type of integrated solution is attractive for researchers in LMICs and for early career researchers….”