How do researchers really feel about methods-sharing? – The Official PLOS Blog

“In scientific communications, methods are finally getting their due. Tools for better-communicating methods are everywhere these days—from new reporting standards and methods-specific article types, to dedicated methods journals and purpose-built repository platforms. But so far, no single solution has enjoyed wide adoption or been generally acknowledged as best practice.

Now, new data gathered by PLOS, with the support of protocols.io and TCC Africa, sheds light on how researchers view methods, and lends insight into their motivations and behaviors when it comes to methods-sharing. Over 1,000 researchers completed the survey. Respondents were concentrated primarily in the Life and Health Sciences, and tended to be more senior in their careers. Read on for highlights, or skip straight to the preprint for in-depth details.

Takeaway #1 – Established methods-sharing norms are insufficient…

Takeaway #2 – Researchers see methods sharing as important…

Takeaway #3 – When it comes to their specific goals, researchers aren’t satisfied…

Takeaway #4 – The main blockers to methods-sharing are practical…”

Left in the dark: the importance of publicly available clinical trial protocols – Braat – 2022 – Medical Journal of Australia – Wiley Online Library

“Prospective registration of a randomised controlled trial (RCT) based on a protocol with formal ethics approval is a benchmark for transparent medical research. The reporting of the primary results of the study should correspond to the design, analysis, and reporting specified in the protocol and trial registration. However, modifications to various aspects of the trial are often made after registration, ranging from administrative updates to substantial protocol amendments. To track the history of revisions, the protocol and registry entry should be updated, and the documentation trail should support an independent appraisal of whether any biases have been introduced that could affect interpretation of trial results.

In this issue of the MJA, Coskinas and colleagues report their investigation of changes to 181 phase 3 RCTs registered with the Australian New Zealand Clinical Trials Registry (ANZCTR) during 1 September 2007 – 31 December 2013.1 The authors compared protocol documents (including ANZCTR registration information) with subsequent journal publications for any changes to the primary outcome, treatment comparisons, analysis set definition, eligibility criteria, sample size, or primary analysis method. They found that protocols were available for only 124 trials (69%); it could be determined that no major changes had been made to eleven of these trials (9%), while 78 had definitely been modified (63%). By comparing publications with trial registration information, it was found that no changes were made to five of the 57 trials without available protocols (9%), and it could not be determined whether changes had been made to a further ten (18%)….”

Access to unpublished protocols and statistical analysis plans of randomised trials | Trials | Full Text

Abstract:  Background

Access to protocols and statistical analysis plans (SAPs) increases the transparency of randomised trials by allowing readers to identify and interpret unplanned changes to study methods; however, these documents are often not made publicly available. We sought to determine how often study investigators would share unavailable documents upon request.

Methods

We used trials from two previously identified cohorts (cohort 1: 101 trials published in high impact factor journals between January and April of 2018; cohort 2: 100 trials published in June 2018 in journals indexed in PubMed) to determine whether study investigators would share unavailable protocols/SAPs upon request. We emailed corresponding authors of trials with no publicly available protocol or SAP up to four times.

Results

Overall, 96 of 201 trials (48%) across the two cohorts had no publicly available protocol or SAP (11/101 high-impact cohort, 85/100 PubMed cohort). In total, 8/96 authors (8%) shared some trial documentation (protocol only [n = 5]; protocol and SAP [n = 1]; excerpt from protocol [n = 1]; research ethics application form [n = 1]). We received protocols for 6/96 trials (6%), and a SAP for 1/96 trials (1%). Seventy-three authors (76%) did not respond, 7 authors (7%) responded but declined to share a protocol or SAP, and eight email addresses (8%) were invalid. A total of 329 emails were sent (an average of 41 emails for every trial that sent documentation). After emailing authors, the total number of trials with an available protocol increased by only 3%, from 52% to 55%.

Conclusions

Most study investigators did not share their unpublished protocols or SAPs upon direct request. Alternative strategies are needed to increase transparency of randomised trials and ensure access to protocols and SAPs.

A meta-research study of randomized controlled trials found infrequent and delayed availability of protocols – ScienceDirect

Abstract:  Objectives

Availability of randomized controlled trial (RCT) protocols is essential for the interpretation of trial results and research transparency.

Study Design and Setting

In this study, we determined the availability of RCT protocols approved in Switzerland, Canada, Germany, and the United Kingdom in 2012. For these RCTs, we searched PubMed, Google Scholar, Scopus, and trial registries for publicly available protocols and corresponding full-text publications of results. We determined the proportion of RCTs with (1) publicly available protocols, (2) publications citing the protocol, and (3) registries providing a link to the protocol. A multivariable logistic regression model explored factors associated with protocol availability.

Results

Three hundred twenty-six RCTs were included, of which 118 (36.2%) made their protocol publicly available; 56 (47.6%, 56 of 118) were provided as a peer-reviewed publication and 48 (40.7%, 48 of 118) as supplementary material. A total of 90.9% (100 of 110) of the protocols were cited in the main publication, and 55.9% (66 of 118) were linked in the clinical trial registry. Larger sample size (>500; odds ratio [OR] = 5.90, 95% confidence interval [CI], 2.75–13.31) and investigator sponsorship (OR = 1.99, 95% CI, 1.11–3.59) were associated with increased protocol availability. Most protocols were made available shortly before the publication of the main results.

Conclusion

RCT protocols should be made available at an early stage of the trial.

The experiment begins: Arcadia publishing 1.0 · Reimagining scientific publishing

“In thinking about how to share Arcadia’s research, we wanted to keep features of traditional publishing that have been honed over centuries, but improve upon what hasn’t quite adapted to the nature of modern science and technology. We have a unique opportunity to use our own research to develop mechanisms of sharing and quality control that can be more agile and adaptable. Our initial attempt is outlined here and we will continue to iterate upon it, always keeping the advancement of knowledge as our guiding principle when making decisions on what to try next….

We are reimagining scientific publishing — sharing our work early and often, maximizing utility and reusability, and improving our science on the basis of public feedback.

This is our first draft. We have ambitious goals and we’re committed to replicable long-term solutions, but we also know that “perfection is the enemy of good.” We’re using this platform to release findings now rather than hiding them until we’ve gotten everything exactly how we want it. Readers can think of the pubs on this platform as drafts that will evolve over time, shaped by public feedback. The same goes for the platform itself! We’re treating our publishing project like an experiment — we’re not sure where we will land, but we can only learn if we try. In this pub, we’re sharing our strategy and the reasoning behind some of our key decisions, highlighting features we’re excited about and areas for improvement. …”

Assessing Open Science practices in physical activity behaviour change intervention evaluations | BMJ Open Sport & Exercise Medicine

Abstract:  Objectives Concerns about the lack of reproducibility and transparency in science have led to a range of research practice reforms, broadly referred to as ‘Open Science’. The extent to which physical activity interventions are embedding Open Science practices is currently unknown. In this study, we randomly sampled 100 reports of recent physical activity randomised controlled trial behaviour change interventions to estimate the prevalence of Open Science practices.

Methods One hundred reports of randomised controlled trial physical activity behaviour change interventions published between 2018 and 2021 were identified, as used within the Human Behaviour-Change Project. Open Science practices were coded in identified reports, including: study pre-registration, protocol sharing, data, materials and analysis scripts sharing, replication of a previous study, open access publication, funding sources and conflict of interest statements. Coding was performed by two independent researchers, with inter-rater reliability calculated using Krippendorff’s alpha.

Results 78% of the 100 reports provided details of study pre-registration and 41% provided evidence of a published protocol. 4% provided accessible open data, 8% provided open materials and 1% provided open analysis scripts. 73% of reports were published as open access and no studies were described as replication attempts. 93% of reports declared their sources of funding and 88% provided conflicts of interest statements. A Krippendorff’s alpha of 0.73 was obtained across all coding.

Conclusion Open data, materials, analysis and replication attempts are currently rare in physical activity behaviour change intervention reports, whereas funding source and conflict of interest declarations are common. Future physical activity research should increase the reproducibility of their methods and results by incorporating more Open Science practices.

COAR Welcomes Significant Funding for the Notify Project

We are delighted to announce that COAR has been awarded a US$4 million grant from Arcadia, a charitable fund of Lisbet Rausing and Peter Baldwin. The four-year grant will go towards the COAR Notify Project, which is developing and implementing a standard protocol for connecting the content in the distributed repository network with peer reviews and assessments in external services, using linked data notifications.

Frontiers | The Academic, Societal and Animal Welfare Benefits of Open Science for Animal Science | Veterinary Science

Abstract:  Animal science researchers have the obligation to reduce, refine, and replace the usage of animals in research (3R principles). Adherence to these principles can be improved by transparently publishing research findings, data and protocols. Open Science (OS) can help to increase the transparency of many parts of the research process, and its implementation should thus be considered by animal science researchers as a valuable opportunity that can contribute to the adherence to these 3R-principles. With this article, we want to encourage animal science researchers to implement a diverse set of OS practices, such as Open Access publishing, preprinting, and the pre-registration of test protocols, in their workflows.


Interviews with the lab protocol community – insights from an Academic Editor and a reviewer – EveryONE

“What do you think are the benefits of lab protocols for open science?

RK: PLOS ONE journal in collaboration with protocols.io has developed a unique and state-of-the-art platform for publishing lab protocols. This is a well-timed and useful innovation. The development of scientific knowledge is based on a variety of methodological approaches bordering on art. Because of the increasing complexity of scientific methods and their diversity, an appropriate forum or open science platform is needed, where the research community can present the best solution and point out the problems that may be encountered in other laboratories. Such a platform should of course be open, and in this form, it is really effective.

AF: Improving data reproducibility in research is one of today’s most important issues to address. Providing clear and detailed protocols, without limitation of words or space, is an effective way to communicate optimized protocols. This will directly help to improve data reproducibility between labs, as well as provide a thorough record of procedures that have been published in parallel. Improving communication of optimized protocols helps to drive robust research, allowing people to build their own research on already thorough studies, and not spend excessive time optimizing protocols based on poorly executed or explained protocols. …”

Open access methods and protocols promote open science in a pandemic – ScienceDirect

“How open-access methods and protocols publishing advanced the project’s goals

In considering a publication strategy, Milón was motivated by a common feeling of frustration: being fascinated by a new scientific publication and excited to try the new approach in his own lab, but ultimately being disappointed to realize that the methods reporting wasn’t quite robust enough to faithfully recreate the experiment. Milón sees this as not only an inconvenience for himself but a broader challenge for research reproducibility. To help prevent challenges to other groups adopting their method, their results were therefore reviewed, polished, and packaged as three freely available scientific documents (Alcántara et al., 2021a; Alcántara et al., 2021b; Mendoza-Rojas et al., 2021). The development of the method, including detailed reporting of the various optimizations and analytical comparisons that informed each component of the assay, was described in Cell Reports Methods (Alcántara et al., 2021b). The methods paper provides the empirical justification for each step of the method and serves as both a general blueprint for future open-source diagnostic methods development and as a more specific template from which future modifications to any given step can be explored….”

Why I am building Arcadia.

“I walked away with the backing to establish a new startup, Trove….

At Trove, we are led by curiosity and remain committed to learning and sharing the knowledge we’ve gained. There is no need to lock up the lessons we’ve learned from others in the tick community. In fact, we have sought their feedback, and we will publish most of our protocols, tools, and datasets without paywalls or delays. It’s the most rigorous any of us have ever had to be, and all of this is in the absence of journals. Our work may ultimately translate into products that could be useful to many more people….

For all these reasons, I have decided to take the best parts of my experiences to build a new research organization called Arcadia Science. I am co-founding Arcadia with yet another fierce woman scientist Prachee Avasthi, who is a leader among leaders in the fight for open science. …”

Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations | SpringerLink

Abstract:  Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.