Provide public access to ethics-approved study protocols

“Daniël Lakens argues for methodological review of study protocols before data are collected (Nature 613, 9; 2023). I call, in addition, for all study protocols involving ethics approval to be made publicly available once approval is granted. At a minimum, the protocols should be included in submitted and published research papers.

Sharing protocols is part of open science (see, for example, go.nature.com/4oicpuy and go.nature.com/4odf6wi). But biomedical research frequently flouts open-science principles (S. Serghiou et al. PLoS Biol. 19, e3001107; 2021). Protocols are often not available or not provided on request. When they are provided, they commonly date from after the study began, or do not align with study conduct and reporting (D. Campbell et al. Trials 23, 674; 2022)….”

MetaArXiv Preprints | Reproducible research practices and transparency across linguistics

Abstract:  Scientific studies of language span many disciplines and provide evidence for social, cultural, cognitive, technological, and biomedical studies of human nature and behavior. As it has become increasingly empirical and quantitative, linguistics has faced challenges and limitations of scientific practice that pose barriers to reproducibility and replicability. One of the proposed solutions to the widely acknowledged reproducibility and replicability crisis has been the implementation of transparency practices, e.g., open access publishing, preregistration, sharing of study materials, data, and analyses, performing study replications, and declaring conflicts of interest. Here, we assessed the prevalence of these practices in 600 randomly sampled journal articles from linguistics across two time points. In line with similar studies in other disciplines, we found a moderate proportion of articles published open access, but overall low rates of sharing materials, data, and protocols, no preregistrations, very few replications, and low rates of conflict of interest reports. These low rates have not increased noticeably between 2008/2009 and 2018/2019, pointing to remaining barriers and slow adoption of open and reproducible research practices in linguistics. As linguistics has not yet firmly established transparency and reproducibility as guiding principles in research, we provide recommendations and solutions for facilitating the adoption of these practices.

 

Open Science in Developmental Science | Annual Review of Developmental Psychology

Abstract:  Open science policies have proliferated in the social and behavioral sciences in recent years, including practices around sharing study designs, protocols, and data and preregistering hypotheses. Developmental research has moved more slowly than some other disciplines in adopting open science practices, in part because developmental science is often descriptive and does not always strictly adhere to a confirmatory approach. We assess the state of open science practices in developmental science and offer a broader definition of open science that includes replication, reproducibility, data reuse, and global reach.

 

Research transparency in dental research: A programmatic analysis – Raittio – European Journal of Oral Sciences – Wiley Online Library

Abstract:  We assessed adherence to five transparency practices—data sharing, code sharing, conflict of interest disclosure, funding disclosure, and protocol registration—in articles in dental journals. We searched and exported the full text of all research articles from PubMed-indexed dental journals available in the Europe PubMed Central database until the end of 2021. We programmatically assessed their adherence to the five transparency practices using a validated and automated tool. Journal- and article-related information was retrieved from ScimagoJR and Journal Citation Reports. Of all 329,784 articles published in PubMed-indexed dental journals, 10,659 (3.2%) were available to download. Of those, 77% included a conflict of interest disclosure, and 62% included a funding disclosure. Seven percent of the articles had a registered protocol. Data sharing (2.0%) and code sharing (0.1%) were rarer. Sixteen percent of articles did not adhere to any of the five transparency practices, 29% adhered to one, 48% adhered to two, 7.0% adhered to three, 0.3% adhered to four, and no article adhered to all five practices. Adherence to transparency practices increased over time; however, data and code sharing especially remained rare. Coordinated efforts involving all stakeholders are needed to change current transparency practices in dental research.
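For readers curious how such a programmatic screen can work in practice, below is a minimal sketch in the same spirit. It queries the Europe PMC REST API and flags a few crude keyword indicators of transparency practices. The journal name, query syntax details, and regular expressions are illustrative assumptions, and this is not the validated tool the authors used.

```python
"""Minimal sketch of a programmatic transparency screen, in the spirit of the
dental-journal study above. It is NOT the validated tool used by the authors;
the journal name, query string, and keyword patterns are illustrative
assumptions, and a real assessment would require a validated instrument."""

import re
import requests

EPMC = "https://www.ebi.ac.uk/europepmc/webservices/rest"

# Crude indicator patterns; a validated tool uses far richer rules.
INDICATORS = {
    "data_sharing": re.compile(r"data (are|is) available|data availability statement", re.I),
    "code_sharing": re.compile(r"code (is|are) available|github\.com", re.I),
    "coi_disclosure": re.compile(r"conflicts? of interest|competing interests?", re.I),
    "funding": re.compile(r"funded by|funding statement|grant no", re.I),
    "registration": re.compile(r"prospero|clinicaltrials\.gov|registered protocol", re.I),
}


def search_open_access(journal: str, page_size: int = 25) -> list[dict]:
    """Return open-access records for one journal (illustrative query syntax)."""
    params = {
        "query": f'JOURNAL:"{journal}" AND OPEN_ACCESS:y',
        "format": "json",
        "pageSize": page_size,
    }
    resp = requests.get(f"{EPMC}/search", params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("resultList", {}).get("result", [])


def screen_article(pmcid: str) -> dict[str, bool]:
    """Fetch the full-text XML and flag which transparency indicators appear."""
    resp = requests.get(f"{EPMC}/{pmcid}/fullTextXML", timeout=30)
    resp.raise_for_status()
    text = resp.text
    return {name: bool(rx.search(text)) for name, rx in INDICATORS.items()}


if __name__ == "__main__":
    # "Journal of Dentistry" is a placeholder; any PubMed-indexed dental journal works.
    for record in search_open_access("Journal of Dentistry", page_size=5):
        pmcid = record.get("pmcid")
        if pmcid:
            print(pmcid, screen_article(pmcid))
```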

 

How do researchers really feel about methods-sharing? – The Official PLOS Blog

“In scientific communications, methods are finally getting their due. Tools for better communicating methods are everywhere these days—from new reporting standards and methods-specific article types, to dedicated methods journals and purpose-built repository platforms. But so far, no single solution has enjoyed wide adoption or been generally acknowledged as best practice.

Now, new data gathered by PLOS, with the support of protocols.io and TCC Africa, sheds light on how researchers view methods, and lends insight into their motivations and behaviors when it comes to methods-sharing. Over 1,000 researchers completed the survey. Respondents were concentrated primarily in the Life and Health Sciences, and tended to be more senior in their careers. Read on for highlights, or skip straight to the preprint for in-depth details.

Takeaway #1 – Established methods-sharing norms are insufficient…

Takeaway #2 – Researchers see methods sharing as important…

Takeaway #3 – When it comes to their specific goals, researchers aren’t satisfied…

Takeaway #4 – The main blockers to methods-sharing are practical…”

Left in the dark: the importance of publicly available clinical trial protocols – Braat – 2022 – Medical Journal of Australia – Wiley Online Library

“Prospective registration of a randomised controlled trial (RCT) based on a protocol with formal ethics approval is a benchmark for transparent medical research. The reporting of the primary results of the study should correspond to the design, analysis, and reporting specified in the protocol and trial registration. However, modifications to various aspects of the trial are often made after registration, ranging from administrative updates to substantial protocol amendments. To track the history of revisions, the protocol and registry entry should be updated, and the documentation trail should support an independent appraisal of whether any biases have been introduced that could affect interpretation of trial results.

In this issue of the MJA, Coskinas and colleagues report their investigation of changes to 181 phase 3 RCTs registered with the Australian New Zealand Clinical Trials Registry (ANZCTR) between 1 September 2007 and 31 December 2013.1 The authors compared protocol documents (including ANZCTR registration information) with subsequent journal publications for any changes to the primary outcome, treatment comparisons, analysis set definition, eligibility criteria, sample size, or primary analysis method. They found that protocols were available for only 124 trials (69%); it could be determined that no major changes had been made to eleven of these trials (9%), while 78 had definitely been modified (63%). By comparing publications with trial registration information, it was found that no changes were made to five of the 57 trials without available protocols (9%), and it could not be determined whether changes had been made to a further ten (18%)….”

Access to unpublished protocols and statistical analysis plans of randomised trials | Trials | Full Text

Abstract:  Background

Access to protocols and statistical analysis plans (SAPs) increases the transparency of randomised trials by allowing readers to identify and interpret unplanned changes to study methods; however, they are often not made publicly available. We sought to determine how often study investigators would share unavailable documents upon request.

Methods

We used trials from two previously identified cohorts (cohort 1: 101 trials published in high impact factor journals between January and April of 2018; cohort 2: 100 trials published in June 2018 in journals indexed in PubMed) to determine whether study investigators would share unavailable protocols/SAPs upon request. We emailed corresponding authors of trials with no publicly available protocol or SAP up to four times.

Results

Overall, 96 of 201 trials (48%) across the two cohorts had no publicly available protocol or SAP (11/101 high-impact cohort, 85/100 PubMed cohort). In total, 8/96 authors (8%) shared some trial documentation (protocol only [n = 5]; protocol and SAP [n = 1]; excerpt from protocol [n = 1]; research ethics application form [n = 1]). We received protocols for 6/96 trials (6%), and a SAP for 1/96 trial (1%). Seventy-three authors (76%) did not respond, 7 authors (7%) responded but declined to share a protocol or SAP, and eight email addresses were invalid (8%). A total of 329 emails were sent (roughly 41 emails for each of the 8 trials whose documentation was eventually obtained). After emailing authors, the total number of trials with an available protocol increased by only 3%, from 52% to 55%.

Conclusions

Most study investigators did not share their unpublished protocols or SAPs upon direct request. Alternative strategies are needed to increase transparency of randomised trials and ensure access to protocols and SAPs.

A meta-research study of randomized controlled trials found infrequent and delayed availability of protocols – ScienceDirect

Abstract:  Objectives

Availability of randomized controlled trial (RCT) protocols is essential for the interpretation of trial results and research transparency.

Study Design and Setting

In this study, we determined the availability of RCT protocols approved in Switzerland, Canada, Germany, and the United Kingdom in 2012. For these RCTs, we searched PubMed, Google Scholar, Scopus, and trial registries for publicly available protocols and corresponding full-text publications of results. We determined the proportion of RCTs with (1) publicly available protocols, (2) publications citing the protocol, and (3) registries providing a link to the protocol. A multivariable logistic regression model explored factors associated with protocol availability.

Results

Three hundred twenty-six RCTs were included, of which 118 (36.2%) made their protocol publicly available; 56 (47.5%, 56 of 118) were provided as a peer-reviewed publication and 48 (40.7%, 48 of 118) as supplementary material. A total of 90.9% (100 of 110) of the protocols were cited in the main publication, and 55.9% (66 of 118) were linked in the clinical trial registry. Larger sample size (>500; odds ratio [OR] = 5.90, 95% confidence interval [CI], 2.75–13.31) and investigator sponsorship (OR = 1.99, 95% CI, 1.11–3.59) were associated with increased protocol availability. Most protocols were made available shortly before the publication of the main results.
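As a concrete illustration of the kind of model reported above, here is a hedged sketch of a multivariable logistic regression relating protocol availability to trial characteristics. The data frame, variable names, and simulated values are hypothetical; only the modelling approach mirrors the abstract.

```python
"""Illustrative sketch only: fits a logistic regression of protocol availability
on trial characteristics, mirroring the kind of model described above. The
columns and values here are simulated, not the study's data."""

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 326  # same order of magnitude as the cohort above; values are simulated

trials = pd.DataFrame({
    "protocol_available": rng.integers(0, 2, n),      # 1 = protocol publicly available
    "large_sample": rng.integers(0, 2, n),            # 1 = sample size > 500
    "investigator_sponsored": rng.integers(0, 2, n),  # 1 = investigator-sponsored
})

# Multivariable logistic regression; exponentiated coefficients are odds ratios.
model = smf.logit(
    "protocol_available ~ large_sample + investigator_sponsored", data=trials
).fit(disp=False)

odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())  # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```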

Conclusion

RCT protocols should be made available at an early stage of the trial.

The experiment begins: Arcadia publishing 1.0 · Reimagining scientific publishing

“In thinking about how to share Arcadia’s research, we wanted to keep features of traditional publishing that have been honed over centuries, but improve upon what hasn’t quite adapted to the nature of modern science and technology. We have a unique opportunity to use our own research to develop mechanisms of sharing and quality control that can be more agile and adaptable. Our initial attempt is outlined here and we will continue to iterate upon it, always keeping the advancement of knowledge as our guiding principle when making decisions on what to try next….

We are reimagining scientific publishing — sharing our work early and often, maximizing utility and reusability, and improving our science on the basis of public feedback.

This is our first draft. We have ambitious goals and we’re committed to replicable long-term solutions, but we also know that “perfection is the enemy of good.” We’re using this platform to release findings now rather than hiding them until we’ve gotten everything exactly how we want it. Readers can think of the pubs on this platform as drafts that will evolve over time, shaped by public feedback. The same goes for the platform itself! We’re treating our publishing project like an experiment — we’re not sure where we will land, but we can only learn if we try. In this pub, we’re sharing our strategy and the reasoning behind some of our key decisions, highlighting features we’re excited about and areas for improvement. …”

Assessing Open Science practices in physical activity behaviour change intervention evaluations | BMJ Open Sport & Exercise Medicine

Abstract:  Objectives Concerns on the lack of reproducibility and transparency in science have led to a range of research practice reforms, broadly referred to as ‘Open Science’. The extent that physical activity interventions are embedding Open Science practices is currently unknown. In this study, we randomly sampled 100 reports of recent physical activity randomised controlled trial behaviour change interventions to estimate the prevalence of Open Science practices.

Methods One hundred reports of randomised controlled trial physical activity behaviour change interventions published between 2018 and 2021 were identified, as used within the Human Behaviour-Change Project. Open Science practices were coded in the identified reports, including study pre-registration, protocol sharing, sharing of data, materials and analysis scripts, replication of a previous study, open access publication, funding sources and conflict of interest statements. Coding was performed by two independent researchers, with inter-rater reliability calculated using Krippendorff’s alpha.

Results 78 of the 100 reports provided details of study pre-registration and 41% provided evidence of a published protocol. 4% provided accessible open data, 8% provided open materials and 1% provided open analysis scripts. 73% of reports were published as open access and no studies were described as replication attempts. 93% of reports declared their sources of funding and 88% provided conflicts of interest statements. A Krippendorff’s alpha of 0.73 was obtained across all coding.
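For readers unfamiliar with the reliability statistic reported above, here is a small sketch of how agreement between two coders can be computed as Krippendorff’s alpha. The ratings are toy values, and the `krippendorff` Python package is just one possible implementation, not necessarily the one the authors used.

```python
"""Sketch of computing inter-rater reliability with Krippendorff's alpha for
two coders, as in the study above. The ratings below are made-up toy data."""

import numpy as np
import krippendorff  # pip install krippendorff

# Rows = coders, columns = coded reports; 1 = practice present, 0 = absent.
# Values are illustrative only.
coder_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
coder_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
reliability_data = np.array([coder_a, coder_b], dtype=float)

alpha = krippendorff.alpha(
    reliability_data=reliability_data,
    level_of_measurement="nominal",  # binary presence/absence coding
)
print(f"Krippendorff's alpha: {alpha:.2f}")
```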

Conclusion Open data, materials, analysis and replication attempts are currently rare in physical activity behaviour change intervention reports, whereas funding source and conflict of interest declarations are common. Future physical activity research should increase the reproducibility of their methods and results by incorporating more Open Science practices.

COAR Welcomes Significant Funding for the Notify Project

We are delighted to announce that COAR has been awarded a US$4 million grant from Arcadia, a charitable fund of Lisbet Rausing and Peter Baldwin. The four-year grant will go towards the COAR Notify Project, which is developing and implementing a standard protocol for connecting content in the distributed repository network with peer reviews and assessments in external services, using Linked Data Notifications.
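For context on the underlying mechanics, the sketch below shows what delivering a Linked Data Notification can look like: an Activity Streams 2.0 payload POSTed to a repository's LDN inbox. Every URL, identifier, and payload field here is an invented placeholder; the COAR Notify protocol layers its own notification patterns and vocabulary on top of this basic delivery step.

```python
"""Rough sketch of sending a W3C Linked Data Notification: an Activity
Streams 2.0 payload POSTed to a repository's LDN inbox. All URLs, IDs, and
payload fields are invented placeholders; COAR Notify defines the actual
notification patterns and vocabulary used in practice."""

import json
import uuid
import requests

INBOX_URL = "https://repository.example.org/inbox/"  # hypothetical LDN inbox

notification = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": f"urn:uuid:{uuid.uuid4()}",
    "type": "Announce",
    "actor": {"id": "https://review-service.example.org/", "type": "Service"},
    "origin": {"id": "https://review-service.example.org/", "type": "Service"},
    "target": {"id": "https://repository.example.org/", "type": "Service"},
    "object": {
        # e.g. a peer review produced by an external assessment service
        "id": "https://review-service.example.org/reviews/123",
        "type": "Document",
    },
    "context": {
        # the repository item the review is about
        "id": "https://repository.example.org/records/456",
        "type": "Document",
    },
}

response = requests.post(
    INBOX_URL,
    data=json.dumps(notification),
    headers={"Content-Type": "application/ld+json"},
    timeout=30,
)
response.raise_for_status()
print("Notification accepted:", response.status_code)
```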

Frontiers | The Academic, Societal and Animal Welfare Benefits of Open Science for Animal Science | Veterinary Science

Abstract:  Animal science researchers have the obligation to reduce, refine, and replace the usage of animals in research (3R principles). Adherence to these principles can be improved by transparently publishing research findings, data and protocols. Open Science (OS) can help to increase the transparency of many parts of the research process, and its implementation should thus be considered by animal science researchers as a valuable opportunity that can contribute to the adherence to these 3R-principles. With this article, we want to encourage animal science researchers to implement a diverse set of OS practices, such as Open Access publishing, preprinting, and the pre-registration of test protocols, in their workflows.

 

Interviews with the lab protocol community – insights from an Academic Editor and a reviewer – EveryONE

“What do you think are the benefits of lab protocols for open science?

RK: The PLOS ONE journal, in collaboration with protocols.io, has developed a unique and state-of-the-art platform for publishing lab protocols. This is a well-timed and useful innovation. The development of scientific knowledge is based on a variety of methodological approaches bordering on art. Because of the increasing complexity and diversity of scientific methods, an appropriate forum or open science platform is needed, where the research community can present the best solutions and point out problems that may be encountered in other laboratories. Such a platform should, of course, be open, and in this form it is really effective.

AF: Improving data reproducibility in research is one of today’s most important issues to address. Providing clear and detailed protocols, without limitation of words or space, is an effective way to communicate optimized protocols. This will directly help to improve data reproducibility between labs, as well as provide a thorough record of procedures that have been published in parallel. Improving communication of optimized protocols helps to drive robust research, allowing people to build their own research on already thorough studies, and not spend excessive time optimizing protocols based on poorly executed or explained protocols. …”

Open access methods and protocols promote open science in a pandemic – ScienceDirect

“How open-access methods and protocols publishing advanced the project’s goals

In considering a publication strategy, Milón was motivated by a common feeling of frustration: being fascinated by a new scientific publication and excited to try the new approach in his own lab, but ultimately being disappointed to realize that the methods reporting wasn’t quite robust enough to faithfully recreate the experiment. Milón sees this not only as an inconvenience for himself but as a broader challenge for research reproducibility. To help other groups avoid similar challenges when adopting the method, the results were therefore reviewed, polished, and packaged as three freely available scientific documents (Alcántara et al., 2021a; Alcántara et al., 2021b; Mendoza-Rojas et al., 2021). The development of the method, including detailed reporting of the various optimizations and analytical comparisons that informed each component of the assay, was described in Cell Reports Methods (Alcántara et al., 2021b). The methods paper provides the empirical justification for each step of the method and serves both as a general blueprint for future open-source diagnostic methods development and as a more specific template from which future modifications to any given step can be explored….”