Transparency in infectious disease research: meta-research survey of specialty journals | The Journal of Infectious Diseases | Oxford Academic

Abstract:  Background

Infectious diseases carry large global burdens and have implications for society at large. Therefore, reproducible, transparent research is extremely important.

Methods

We evaluated transparency indicators (code and data sharing, registration, and conflict-of-interest and funding disclosures) in the 5340 PubMed Central Open Access articles published in 2019 or 2021 in the 9 most-cited infectious disease specialty journals, using the text-mining R package rtransparent.

Results

In total, 5340 articles were evaluated: 1860 published in 2019 and 3480 in 2021, of which 1828 were on COVID-19. Text-mining identified code sharing in 98 (2%) articles, data sharing in 498 (9%), registration in 446 (8%), conflict of interest disclosures in 4209 (79%), and funding disclosures in 4866 (91%). There were substantial differences across the 9 journals: 1-9% for code sharing, 5-25% for data sharing, 1-31% for registration, 7-100% for conflicts of interest, and 65-100% for funding disclosures. Validation-corrected imputed estimates were 3%, 11%, 8%, 79%, and 92%, respectively. There were no major differences between articles published in 2019 and non-COVID-19 articles published in 2021. In 2021, non-COVID-19 articles had more data sharing (12%) than COVID-19 articles (4%).

Conclusions

Data sharing, code sharing, and registration are very uncommon in infectious disease specialty journals. Increased transparency is required.
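
The study's text-mining approach relies on the R package rtransparent, which flags transparency indicators in article full text. As a rough illustration of the general idea only (not the package's actual rules), the Python sketch below uses simple keyword and regex patterns; the pattern set and function names are assumptions for illustration.

```python
# Illustrative sketch only: the study used the R package rtransparent, whose
# actual detection rules are more elaborate. This standalone Python analogue
# shows the general idea of flagging transparency indicators with keyword and
# regex patterns; the patterns and function names here are assumptions.
import re

INDICATOR_PATTERNS = {
    "code_sharing": r"(code (is |are )?available|github\.com|source code)",
    "data_sharing": r"(data (are|is) available|data availability)",
    "registration": r"(clinicaltrials\.gov|NCT\d{8}|prospectively registered|trial registration)",
    "conflict_of_interest": r"(conflicts? of interest|competing interests?)",
    "funding": r"(funded by|funding|grant (no\.?|number))",
}

def detect_indicators(full_text: str) -> dict:
    """Return a True/False flag for each transparency indicator."""
    return {
        name: bool(re.search(pattern, full_text, flags=re.IGNORECASE))
        for name, pattern in INDICATOR_PATTERNS.items()
    }

if __name__ == "__main__":
    example = (
        "All data are available at https://osf.io/xxxx. The authors declare "
        "no conflicts of interest. This work was funded by grant no. 12345."
    )
    print(detect_indicators(example))
    # {'code_sharing': False, 'data_sharing': True, 'registration': False,
    #  'conflict_of_interest': True, 'funding': True}
```

In the actual study, flags like these were then validated against manual checks, which is what the "validation-corrected imputed estimates" in the abstract refer to.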

Call for Volunteers: TOP Guidelines Advisory Board and Preregistration Template Evaluation Working Group

“Are you passionate about promoting transparency and openness in scientific research? The Center for Open Science (COS) is currently seeking volunteers for two opportunities. We seek colleagues to join (1) the Transparency and Openness Promotion (TOP) Guidelines Advisory Board and (2) the Preregistration Template Evaluation Working Group….”

Open science in health psychology and behavioral medicine: A statement from the Behavioral Medicine Research Council.

Abstract:  Open Science practices include some combination of registering and publishing study protocols (including hypotheses, primary and secondary outcome variables, and analysis plans) and making available preprints of manuscripts, study materials, de-identified data sets, and analytic codes. This statement from the Behavioral Medicine Research Council (BMRC) provides an overview of these methods, including preregistration; registered reports; preprints; and open research. We focus on rationales for engaging in Open Science and how to address shortcomings and possible objections. Additional resources for researchers are provided. Research on Open Science largely supports positive consequences for the reproducibility and reliability of empirical science. There is no solution that will encompass all Open Science needs in health psychology and behavioral medicine’s diverse research products and outlets, but the BMRC supports increased use of Open Science practices where possible.

 

Registered report: Survey on attitudes and experiences regarding preregistration in psychological research | PLOS ONE

Abstract:  Background

Preregistration, the open science practice of specifying and registering details of a planned study prior to knowing the data, increases the transparency and reproducibility of research. Large-scale replication attempts for psychological results yielded shockingly low success rates and contributed to an increasing demand for open science practices among psychologists. However, preregistering one’s studies is still not the norm in the field. Here, we conducted a study to explore possible reasons for this discrepancy.

Methods

In a mixed-methods approach, we conducted an online survey assessing attitudes, motivations, and perceived obstacles with respect to preregistration. Respondents (N = 289) were psychological researchers who were recruited through their publications on Web of Science, PubMed, PSYNDEX, and PsycInfo, and through preregistrations on OSF Registries. Based on the theory of planned behavior, we predicted that positive attitudes (moderated by the perceived importance of preregistration), a favorable subjective norm, and higher perceived behavioral control positively influence researchers’ intention to preregister (directional hypothesis 1). Furthermore, we expected an influence of research experience on attitudes and on perceived motivations and obstacles regarding preregistration (non-directional hypothesis 2). We analyzed these hypotheses with multiple regression models and included preregistration experience as a control variable.

Results

Researchers’ attitudes, subjective norms, perceived behavioral control, and the perceived importance of preregistration significantly predicted researchers’ intention to use preregistration in the future (see hypothesis 1). Research experience influenced both researchers’ attitudes and their perception of motivations to preregister, but not the perception of obstacles (see hypothesis 2). Descriptive reports on researchers’ attitudes, motivations and obstacles regarding preregistration are provided.

Discussion

Many researchers had already preregistered and had a rather positive attitude toward preregistration. Nevertheless, several obstacles were identified that may be addressed to improve and foster preregistration.
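
For readers unfamiliar with how such a model is specified, the sketch below shows one way the survey's main regression could be set up: intention to preregister predicted by theory-of-planned-behavior constructs (attitude moderated by perceived importance, subjective norm, perceived behavioral control) with preregistration experience as a control. This is not the authors' code; the variable names, the moderation term, and the simulated data are assumptions for illustration.

```python
# Minimal sketch (not the authors' analysis code) of a multiple regression
# with a moderation term and a control variable, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(seed=1)
n = 289  # sample size reported in the abstract

df = pd.DataFrame({
    "attitude": rng.normal(size=n),
    "importance": rng.normal(size=n),
    "subjective_norm": rng.normal(size=n),
    "behavioral_control": rng.normal(size=n),
    "prereg_experience": rng.integers(0, 2, size=n),  # control variable
})
# Simulated outcome: intention to preregister, with a small moderation effect
df["intention"] = (
    0.4 * df["attitude"]
    + 0.2 * df["attitude"] * df["importance"]
    + 0.3 * df["subjective_norm"]
    + 0.2 * df["behavioral_control"]
    + 0.3 * df["prereg_experience"]
    + rng.normal(size=n)
)

# 'attitude * importance' expands to both main effects plus their interaction
model = smf.ols(
    "intention ~ attitude * importance + subjective_norm"
    " + behavioral_control + prereg_experience",
    data=df,
).fit()
print(model.summary())
```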

Nature welcomes Registered Reports

“This year marks the 50th anniversary of Nature’s decision to mandate peer review for all papers. It’s an appropriate time to introduce readers and authors to Registered Reports, a research-article format that Nature is offering from this week for studies designed to test whether a hypothesis is supported (see go.nature.com/3kivjh1).

The fundamental principle underpinning a Registered Report is that a journal commits to publishing a paper if the research question and the methodology chosen to address it pass peer review, with the result itself taking a back seat. For now, Nature is offering Registered Reports in the field of cognitive neuroscience and in the behavioural and social sciences. In the future, we plan to extend this to other fields, as well as to other types of study, such as more exploratory research.

Why are we introducing this format? In part to try to address publication bias, the tendency of the research system — editors, reviewers and authors — to favour the publication of positive over negative results. Registered Reports help to incentivize research regardless of the result. An elegant and robust study should be appreciated as much for its methodology as for its results….”


MetaArXiv Preprints | Reproducible research practices and transparency across linguistics

Abstract:  Scientific studies of language span many disciplines and provide evidence for social, cultural, cognitive, technological, and biomedical studies of human nature and behavior. By becoming increasingly empirical and quantitative, linguistics has been facing challenges and limitations of the scientific practices that pose barriers to reproducibility and replicability. One of the proposed solutions to the widely acknowledged reproducibility and replicability crisis has been the implementation of transparency practices, e.g., open access publishing, preregistration, sharing study materials, data, and analyses, performing study replications, and declaring conflicts of interest. Here, we assessed the prevalence of these practices in 600 randomly sampled journal articles from linguistics at two time points. In line with similar studies in other disciplines, we found a moderate proportion of articles published open access, but overall low rates of sharing materials, data, and protocols, no preregistrations, very few replications, and low rates of conflict of interest reports. These low rates have not increased noticeably between 2008/2009 and 2018/2019, pointing to remaining barriers and the slow adoption of open and reproducible research practices in linguistics. As linguistics has not yet firmly established transparency and reproducibility as guiding principles in research, we provide recommendations and solutions for facilitating the adoption of these practices.

 

About Meta-Psychology

“Meta-Psychology publishes theoretical and empirical contributions that advance psychology as a science through critical discourse related to individual articles, research lines, research areas, or psychological science as a field. Important contributions include systematic reviews, meta-analyses, replicability reports, and replication studies. We encourage pre-registered studies and registered reports (i.e., peer-review on the basis of theory, methods, and planned data-analysis, before data has been collected). Manuscripts introducing novel methods are welcome, but also tutorials on established methods that are still poorly understood by psychology researchers. We further welcome papers introducing statistical packages or other software useful for psychology researchers….”

 

Open Science in Developmental Science | Annual Review of Developmental Psychology

Abstract:  Open science policies have proliferated in the social and behavioral sciences in recent years, including practices around sharing study designs, protocols, and data and preregistering hypotheses. Developmental research has moved more slowly than some other disciplines in adopting open science practices, in part because developmental science is often descriptive and does not always strictly adhere to a confirmatory approach. We assess the state of open science practices in developmental science and offer a broader definition of open science that includes replication, reproducibility, data reuse, and global reach.

 

Open Science Practices in Communication Sciences and Disorders: A Survey | Journal of Speech, Language, and Hearing Research

Abstract:  Purpose: Open science is a collection of practices that seek to improve the accessibility, transparency, and replicability of science. Although these practices have garnered interest in related fields, it remains unclear whether open science practices have been adopted in the field of communication sciences and disorders (CSD). This study aimed to survey the knowledge, implementation, and perceived benefits and barriers of open science practices in CSD.

Method: An online survey was disseminated to researchers in the United States actively engaged in CSD research. Four core open science practices were examined: preregistration, self-archiving, gold open access, and open data. Data were analyzed using descriptive statistics and regression models.

Results: Two hundred twenty-two participants met the inclusion criteria. Most participants were doctoral students (38%) or assistant professors (24%) at R1 institutions (58%). Participants reported low knowledge of preregistration and gold open access. There was, however, a high level of desire to learn more for all practices. Implementation of open science practices was also low, most notably for preregistration, gold open access, and open data (< 25%). Predictors of knowledge and participation, as well as perceived barriers to implementation, are discussed.

Conclusion: Although participation in open science appears low in the field of CSD, participants expressed a strong desire to learn more in order to engage in these practices in the future.

Methodology over metrics: current scientific standards are a disservice to patients and society – Journal of Clinical Epidemiology

Abstract:  Covid-19 research made it painfully clear that the scandal of poor medical research, as denounced by Altman in 1994, persists today. The overall quality of medical research remains poor, despite longstanding criticisms. The problems are well known, but the research community fails to properly address them. We suggest that most problems stem from an underlying paradox: although methodology is undeniably the backbone of high-quality and responsible research, science consistently undervalues methodology. The focus remains more on the destination (research claims and metrics) than on the journey. Notwithstanding, research should serve society more than the reputation of those involved. While we notice that many initiatives are being established to improve components of the research cycle, these initiatives are too disjointed. The overall system is monolithic and slow to adapt. We assert that top-down action is needed from journals, universities, funders and governments to break the cycle and put methodology first. These actions should involve the widespread adoption of registered reports, balanced research funding between innovative, incremental and methodological research projects, full recognition and demystification of peer review, improved methodological review of reports, adherence to reporting guidelines, and investment in methodological education and research. Currently, the scientific enterprise is doing a major disservice to patients and society.

 

Left in the dark: the importance of publicly available clinical trial protocols – Braat – 2022 – Medical Journal of Australia – Wiley Online Library

“Prospective registration of a randomised controlled trial (RCT) based on a protocol with formal ethics approval is a benchmark for transparent medical research. The reporting of the primary results of the study should correspond to the design, analysis, and reporting specified in the protocol and trial registration. However, modifications to various aspects of the trial are often made after registration, ranging from administrative updates to substantial protocol amendments. To track the history of revisions, the protocol and registry entry should be updated, and the documentation trail should support an independent appraisal of whether any biases have been introduced that could affect interpretation of trial results.

In this issue of the MJA, Coskinas and colleagues report their investigation of changes to 181 phase 3 RCTs registered with the Australian New Zealand Clinical Trials Registry (ANZCTR) during 1 September 2007 – 31 December 2013.1 The authors compared protocol documents (including ANZCTR registration information) with subsequent journal publications for any changes to the primary outcome, treatment comparisons, analysis set definition, eligibility criteria, sample size, or primary analysis method. They found that protocols were available for only 124 trials (69%); it could be determined that no major changes had been made to eleven of these trials (9%), while 78 had definitely been modified (63%). By comparing publications with trial registration information, it was found that no changes were made to five of the 57 trials without available protocols (9%), and it could not be determined whether changes had been made to a further ten (18%)….”

PsyArXiv Preprints | Three Myths about Open Science That Just Won’t Die

Abstract:  Knowledge and implementation of open science principles and behaviors remains uneven between and within sub-disciplines in psychology, despite over 10 years of education and advocacy. One reason for the slow and uneven progress of the movement is a set of closely-held myths about the implications of open science practices, exacerbated by the relative isolation of various sub-disciplines in the field. This talk will cover three of the major recurring myths: that open science is in conflict with prioritizing diversity, that “open data” is a binary choice between fully open and accessible and completely closed off, and that preregistration and registered reports are only appropriate for certain types of research designs. Putting these myths to rest is necessary as we work towards improving our scientific practice.

 

Easing Into Open Science: A Guide for Graduate Students and Their Advisors | Collabra: Psychology | University of California Press

Abstract:  This article provides a roadmap to assist graduate students and their advisors to engage in open science practices. We suggest eight open science practices that novice graduate students could begin adopting today. The topics we cover include journal clubs, project workflow, preprints, reproducible code, data sharing, transparent writing, preregistration, and registered reports. To address concerns about not knowing how to engage in open science practices, we provide a difficulty rating of each behavior (easy, medium, difficult), present them in order of suggested adoption, and follow the format of what, why, how, and worries. We give graduate students ideas on how to approach conversations with their advisors/collaborators, ideas on how to integrate open science practices within the graduate school framework, and specific resources on how to engage with each behavior. We emphasize that engaging in open science behaviors need not be an all or nothing approach, but rather graduate students can engage with any number of the behaviors outlined.