Transparency of COVID-19-related research: A meta-research study | PLOS ONE

Abstract:  Background

We aimed to assess adherence to five transparency practices (data availability, code availability, protocol registration, conflicts of interest (COI) disclosure, and funding disclosure) in open access Coronavirus disease 2019 (COVID-19)-related articles.

Methods

We searched and exported all open access COVID-19-related articles published from January 2020 to June 9, 2022 in PubMed-indexed journals from the Europe PubMed Central database. With a validated and automated tool, we detected transparency practices in three paper types: research articles, randomized controlled trials (RCTs), and reviews. Basic journal- and article-related information was retrieved from the database. We used R for the descriptive analyses.
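
The abstract does not give the authors' code, but a minimal R sketch of this kind of pipeline, under stated assumptions, might look as follows. It assumes the europepmc package for querying Europe PubMed Central (its epmc_search() function); the query string and record limit are illustrative only, and detect_transparency() is a hypothetical stand-in for the validated automated detector the authors mention (an rtransparent-style tool), not its real interface.

```r
# Minimal sketch, not the authors' pipeline: query Europe PMC for open access
# COVID-19-related records, then score each article for transparency practices.
library(europepmc)
library(dplyr)

# Illustrative Europe PMC query (not the authors' exact search strategy).
records <- epmc_search(
  query = '"COVID-19" AND OPEN_ACCESS:y AND PUB_YEAR:[2020 TO 2022]',
  limit = 1000
)

# Hypothetical wrapper around an automated transparency detector
# (e.g. an rtransparent-style tool); returns one logical flag per practice.
detect_transparency <- function(pmcid) {
  # ...retrieve the full text for `pmcid` and run the detector here...
  tibble(
    pmcid = pmcid,
    data_sharing = NA, code_sharing = NA, coi_disclosure = NA,
    funding_disclosure = NA, protocol_registration = NA
  )
}

results <- bind_rows(lapply(records$pmcid, detect_transparency))

# Descriptive summary: share of articles adhering to each practice.
results |>
  summarise(across(data_sharing:protocol_registration, ~ mean(.x, na.rm = TRUE)))
```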

Results

The total number of articles was 258,678, of which we were able to retrieve the full texts of 186,157 (72%) from the database. Over half of the papers (55.7%, n = 103,732) were research articles, 10.9% (n = 20,229) were review articles, and less than one percent (n = 1,202) were RCTs. Approximately nine-tenths of articles in all three paper types had a COI disclosure statement. Funding disclosure (83.9%, 95% confidence interval (CI): 81.7–85.8) and protocol registration (53.5%, 95% CI: 50.7–56.3) were more frequent in RCTs than in reviews or research articles. Reviews shared data (2.5%, 95% CI: 2.3–2.8) and code (0.4%, 95% CI: 0.4–0.5) less frequently than RCTs or research articles did. Articles published in 2022 had the highest adherence to all five transparency practices. Most reviews (62%) and research articles (58%) adhered to two transparency practices, whereas almost half of the RCTs (47%) adhered to three. There were journal- and publisher-related differences in all five practices, and articles that did not adhere to transparency practices were more likely to be published in the lowest-impact journals and less likely to be cited.
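
As a quick sanity check on the interval estimates quoted above, one of them can be reproduced approximately in base R; the numerator below is back-calculated from the reported percentage and sample size, so it illustrates the calculation rather than replicating the authors' exact method.

```r
# Approximate 95% CI for protocol registration among RCTs (53.5% of n = 1,202).
n_rct <- 1202
k_registered <- round(0.535 * n_rct)   # about 643 RCTs with a registered protocol
prop.test(k_registered, n_rct)$conf.int
# roughly 0.506 to 0.563, close to the reported 95% CI of 50.7-56.3
```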

Conclusion

While most articles were freely available and included a COI disclosure, adherence to the other transparency practices was far from acceptable. A much stronger commitment to open science practices, particularly protocol registration and data and code sharing, is needed from all stakeholders.

The next chapter for protocols.io

“Today, Springer Nature announced the acquisition of protocols.io. The press release is available here, but in this blog post we would like to share what this means for protocols.io as a company and for the researchers using the platform. Before we delve into the details, it is important to confirm that our business model, open access repository, and mission and vision will not alter with this acquisition….”

Springer Nature continues open research drive with acquisition of protocols.io | Springer Nature Group | Springer Nature

“Springer Nature, the world’s leading publisher of protocols, has acquired protocols.io – a secure platform for developing and sharing reproducible methods.

Scientific advancement depends on data credibility and work that can be verified, built upon and reproduced. Sharing all elements of research, including data, methods and materials, and even negative results, makes research more efficient, enables reproducibility and therefore builds trust in science. Studies show that lack of awareness of existing work or negative results leads to unnecessary duplication and could waste up to €26 billion in Europe alone.

By laying out detailed step-by-step instructions for research methods, aiming to standardise the process, ensure the accuracy of results, and enable research to be reproduced, protocols have a vital role to play in addressing this. With protocols.io joining Springer Nature’s leading protocol offering, researchers will now have the option to make their protocols openly available on the protocols.io platform (fully OA) as well as to publish them in peer-reviewed publications (searchable via Springer Nature Experiments)….”

A Simple Replication Agreement Could Improve Trust in Science – Scientific American

“One of the major challenges driving the replication crisis is that scientists often do not share all the information needed to replicate their work. Access to research materials is especially crucial for the replication of computational studies, given the increasing use of computational methods and such studies’ reliance on large data sets. Unfortunately, such access is far from guaranteed.

There are many reasons why….

To address this conflict, we propose a new policy instrument that could facilitate studies’ replicability without depriving scientists of their IP protection: the conditional access agreement (CAA). In short, the CAA establishes a private, controlled channel of communication for the transfer of replication materials between authors and replicators. This allows for on-demand replicability while maintaining the proprietary potential of a scientific study.

Under the CAA mechanism, when submitting a paper for publication, an author would execute an agreement with the journal, pledging to provide full access to replication materials upon demand by other researchers. The agreement would specify that anyone requesting access to the materials can only obtain it upon signing a nondisclosure agreement (NDA). The NDA would prohibit the use of the replication materials delivered by the original authors for any purpose other than replication….”

Facilitated Preprint Posting is now available for Lab Protocols at PLOS ONE – EveryONE

“When authors submit a Lab Protocol to PLOS ONE, they prepare a short manuscript that contextualizes their step-by-step protocol, describing the value it adds to the published literature and providing evidence that the protocol works. This additional context helps readers to decide whether and, if so, how to adapt the protocol for their own research.

In 2023, PLOS is making it easier for authors to share these protocol manuscripts as preprints, by expanding our partnership with the preprint server bioRxiv to include Lab Protocols.

During the submission process, Lab Protocol authors will now be asked if they want PLOS to forward their manuscript to bioRxiv to be considered for public posting within a few days. Facilitated posting to bioRxiv has been offered at PLOS ONE since 2018. Extending this service to Lab Protocols means that authors can share and get credit for their methods development work sooner, even as the peer review process unfolds….”

Change to protocols.io free plans

“Starting May 15, 2023, we will reduce the number of free private protocols from five (5) to two (2) for non-premium accounts. (As an integral commitment of protocols.io, public content will remain free to read and share.)

 

In this webinar, the protocols.io CEO will discuss the reasons for the change and answer questions about it.

The change will help protocols.io move towards financial sustainability and will encourage more academic researchers to share their protocols publicly. We remain committed to supporting free sharing and access to all public protocols.”

Open science in health psychology and behavioral medicine: A statement from the Behavioral Medicine Research Council.

Abstract:  Open Science practices include some combination of registering and publishing study protocols (including hypotheses, primary and secondary outcome variables, and analysis plans) and making available preprints of manuscripts, study materials, de-identified data sets, and analytic code. This statement from the Behavioral Medicine Research Council (BMRC) provides an overview of these methods, including preregistration, registered reports, preprints, and open research. We focus on rationales for engaging in Open Science and how to address shortcomings and possible objections. Additional resources for researchers are provided. Research on Open Science largely supports positive consequences for the reproducibility and reliability of empirical science. There is no solution that will encompass all Open Science needs in health psychology and behavioral medicine’s diverse research products and outlets, but the BMRC supports increased use of Open Science practices where possible.

Science Publishing Innovation: Why Do So Many Good Ideas Fail? – Science Editor

“Over a decade ago, BioMed Central (BMC) recognized the importance of postpublication discussion. Prepublication review can improve papers and catch errors, but only time and subsequent work of other scientists can truly show which results in a publication are robust and valid. Unlike a print journal (or print as a medium, in general), the Internet permits the readers to comment on published papers over time. So in 2002 BMC developed and enabled commenting on every one of its articles across its suite of journals. Not only does this allow for postpublication review, but it enables readers to easily ask authors and other readers a question, with public responses enriching the original manuscript, clarifying, and helping to improve the comprehension of the work.

This is a terrific idea, but it didn’t really catch on….

Remarkably, despite the creation of arXiv for physicists in 1990 and despite the enthusiastic embrace of preprints by the physics community, it has been assumed this is impossible for biology. The common argument is that biologists are different from physicists and the arXiv success is not informative. What many did find telling is the death of the 2007 preprint initiative from the Nature Publishing Group (NPG). NPG tried preprints with Nature Precedings, but adoption was low and in 2012 NPG pulled the plug on the experiment [3]. This triggered some skepticism about the prospects of the bioRxiv preprint effort from Cold Spring Harbor Lab (CSHL) Press [4]. Critics told the director of CSHL Press, John Inglis, that a preprint server for biologists simply couldn’t work [5].

Once again, we must ask the cause of the Nature Precedings failure. Did NPG kill it because biologists wouldn’t behave in the same way as physicists? We know that isn’t the case. Preprints in biology are all the rage today….

In the winter of 2012, Alexei Stoliartchouk and I came up with the idea for protocols.io—a central place where scientists can share and discover science methods. We wanted to create a site where corrections and the constant tweaking of science methods could be shared, even after publication in a journal….

Few people know about bioprotocols.com, but many know about OpenWetWare (OWW) and Nature Protocol Exchange—both open-access community resources for sharing protocols. Both have been mentioned to me countless times as evidence that protocols.io wouldn’t work. As with preprints, the problems that OWW and Protocol Exchange faced seemed to be proof that biologists would not share details of their methods on such a platform. As with bioRxiv, we are in the early days of protocols.io, but judging from the growth in the figure below, it’s hard to argue that biologists don’t need this or that they won’t take the time to publicly share their methods….”

Provide public access to ethics-approved study protocols

“Daniël Lakens argues for methodological review of study protocols before data are collected (Nature 613, 9; 2023). I call, in addition, for all study protocols involving ethics approval to be made publicly available once approval is granted. At a minimum, the protocols should be included in submitted and published research papers.

Sharing protocols is part of open science (see, for example, go.nature.com/4oicpuy and go.nature.com/4odf6wi). But biomedical research frequently flouts open-science principles (S. Serghiou et al. PLoS Biol. 19, e3001107; 2021). Protocols are often not available or not provided on request. When they are provided, they commonly date from after the study began, or do not align with study conduct and reporting (D. Campbell et al. Trials 23, 674; 2022)….”

MetaArXiv Preprints | Reproducible research practices and transparency across linguistics

Abstract:  Scientific studies of language span many disciplines and provide evidence for social, cultural, cognitive, technological, and biomedical studies of human nature and behavior. By becoming increasingly empirical and quantitative, linguistics has been facing challenges and limitations of the scientific practices that pose barriers to reproducibility and replicability. One of the proposed solutions to the widely acknowledged reproducibility and replicability crisis has been the implementation of transparency practices, e.g., open access publishing, preregistrations, sharing study materials, data, and analyses, performing study replications, and declaring conflicts of interest. Here, we assessed the prevalence of these practices in 600 randomly sampled journal articles from linguistics across two time points. In line with similar studies in other disciplines, we found a moderate proportion of articles published open access, but overall low rates of sharing materials, data, and protocols, no preregistrations, very few replications, and low rates of conflict of interest reports. These low rates have not increased noticeably between 2008/2009 and 2018/2019, pointing to remaining barriers and slow adoption of open and reproducible research practices in linguistics. As linguistics has not yet firmly established transparency and reproducibility as guiding principles in research, we provide recommendations and solutions for facilitating the adoption of these practices.

Open Science in Developmental Science | Annual Review of Developmental Psychology

Abstract:  Open science policies have proliferated in the social and behavioral sciences in recent years, including practices around sharing study designs, protocols, and data and preregistering hypotheses. Developmental research has moved more slowly than some other disciplines in adopting open science practices, in part because developmental science is often descriptive and does not always strictly adhere to a confirmatory approach. We assess the state of open science practices in developmental science and offer a broader definition of open science that includes replication, reproducibility, data reuse, and global reach.

 

Research transparency in dental research: A programmatic analysis – Raittio – European Journal of Oral Sciences – Wiley Online Library

Abstract:  We assessed adherence to five transparency practices—data sharing, code sharing, conflict of interest disclosure, funding disclosure, and protocol registration—in articles in dental journals. We searched and exported the full text of all research articles from PubMed-indexed dental journals available in the Europe PubMed Central database until the end of 2021. We programmatically assessed their adherence to the five transparency practices using a validated and automated tool. Journal- and article-related information was retrieved from ScimagoJR and Journal Citation Reports. Of all 329,784 articles published in PubMed-indexed dental journals, 10,659 (3.2%) were available to download. Of those, 77% included a conflict of interest disclosure, and 62% included a funding disclosure. Seven percent of the articles had a registered protocol. Data sharing (2.0%) and code sharing (0.1%) were rarer. Sixteen percent of articles did not adhere to any of the five transparency practices, 29% adhered to one, 48% adhered to two, 7.0% adhered to three, 0.3% adhered to four, and no article adhered to all five practices. Adherence to transparency practices increased over time; however, data and code sharing especially remained rare. Coordinated efforts involving all stakeholders are needed to change current transparency practices in dental research.
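
The breakdown in this abstract (no practices through all five) is simply a per-article count of detected practices. A small base R sketch, assuming a data frame of logical flags like the output of an automated detector (the toy values below are made up), shows how such a distribution could be tabulated:

```r
# Toy example: one row per article, one logical column per transparency practice.
flags <- data.frame(
  data_sharing       = c(FALSE, FALSE, TRUE),
  code_sharing       = c(FALSE, FALSE, FALSE),
  coi_disclosure     = c(TRUE,  TRUE,  TRUE),
  funding_disclosure = c(FALSE, TRUE,  TRUE),
  registration       = c(FALSE, FALSE, TRUE)
)

n_practices <- rowSums(flags)                          # 0-5 practices per article
prop.table(table(factor(n_practices, levels = 0:5)))   # share of articles at each level
```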

 

How do researchers really feel about methods-sharing? – The Official PLOS Blog

“In scientific communications, methods are finally getting their due. Tools for better communicating methods are everywhere these days—from new reporting standards and methods-specific article types, to dedicated methods journals and purpose-built repository platforms. But so far, no single solution has enjoyed wide adoption or been generally acknowledged as best practice.

Now, new data gathered by PLOS, with the support of protocols.io and TCC Africa, sheds light on how researchers view methods, and lends insight into their motivations and behaviors when it comes to methods-sharing. Over 1,000 researchers completed the survey. Respondents were concentrated primarily in the Life and Health Sciences, and tended to be more senior in their careers. Read on for highlights, or skip straight to the preprint for in-depth details.

Takeaway #1 – Established methods-sharing norms are insufficient…

Takeaway #2 – Researchers see methods sharing as important…

Takeaway #3 – When it comes to their specific goals, researchers aren’t satisfied…

Takeaway #4 – The main blockers to methods-sharing are practical…”

Left in the dark: the importance of publicly available clinical trial protocols – Braat – 2022 – Medical Journal of Australia – Wiley Online Library

“Prospective registration of a randomised controlled trial (RCT) based on a protocol with formal ethics approval is a benchmark for transparent medical research. The reporting of the primary results of the study should correspond to the design, analysis, and reporting specified in the protocol and trial registration. However, modifications to various aspects of the trial are often made after registration, ranging from administrative updates to substantial protocol amendments. To track the history of revisions, the protocol and registry entry should be updated, and the documentation trail should support an independent appraisal of whether any biases have been introduced that could affect interpretation of trial results.

In this issue of the MJA, Coskinas and colleagues report their investigation of changes to 181 phase 3 RCTs registered with the Australian New Zealand Clinical Trials Registry (ANZCTR) during 1 September 2007 – 31 December 2013 [1]. The authors compared protocol documents (including ANZCTR registration information) with subsequent journal publications for any changes to the primary outcome, treatment comparisons, analysis set definition, eligibility criteria, sample size, or primary analysis method. They found that protocols were available for only 124 trials (69%); it could be determined that no major changes had been made to eleven of these trials (9%), while 78 had definitely been modified (63%). By comparing publications with trial registration information, it was found that no changes were made to five of the 57 trials without available protocols (9%), and it could not be determined whether changes had been made to a further ten (18%)….”