Preregistration: A Plan, Not a Prison

“If you are worried about harming the publishability of your work once you have deviated from the original plan, focus on journals that have made a commitment to rewarding transparency, those that have signed the Transparency and Openness Promotion Guidelines, those that issue Open Practice Badges, or those that accept Registered Reports. These practices signal that the journals have a core commitment toward open and reproducible research and so are best suited to evaluating work based on ideal scientific practices.”

Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations

Abstract:  Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.

 

Publishing for science or science for publications? The role of open science to reduce research waste – Siegerink – 2021 – Journal of Thrombosis and Haemostasis

“One of the underlying ideas of Open Science is that when scientists are open about what they are doing, and what they have been up to, double work can be prevented. For example, Prospero (https://www.crd.york.ac.uk/prospero/), a registry in which authors can file their initiative to execute a systematic review and meta-analysis, can indeed fulfill that role. However, Chapelle et al show that only 10 of the 20 meta-analyses were indeed preregistered. …

Another way to reduce redundant publications is to share research before it is peer-reviewed by publishing it on a preprint server such as medRxiv.org. Although the coronavirus disease 2019 pandemic has popularized this practice, it is only used for a small fraction of all research output. Would further adoption of this practice be a way to further reduce research waste? The data collected by Chapelle et al suggest that the time window between “received” and “published online” was short. Preprints will only prevent double work when there is a sufficiently large window between these two timepoints during which other researchers have to decide whether or not to start a new project….

As long as the scientific enterprise incentivizes research waste and science for publications, time and resources are wasted. Open science practices cannot counteract this because they do not address the root cause….”

Beware performative reproducibility

I worry that, by adopting the trappings of reproducibility, poor-quality work can look as if it has engaged in best practices. The problem is that sloppy work is driven by a scientific culture that overemphasizes exciting findings. When funders and journals reward showy claims at the expense of rigorous methods and reproducible results, reforms to change practice could become self-defeating. Helpful new practices, rules and policies are transformed into meaningless formalities on the way to continuing to grab headlines at any cost.

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 27 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 3 [1, 3], min. 0, max. 9, out of a total possible score of 29) in sleep research and chronobiology journals.

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.
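
As a rough, purely illustrative sketch of how a summary like the one reported above might be tallied (this is not the study's code; the journal names, per-category scores, and 0–3-style scale below are invented for illustration): each journal's guideline scores are summed into a single TOP Factor, and the median and 25th/75th percentiles are then taken across journals.

```python
from statistics import quantiles

# Hypothetical guideline scores for three invented journals. Each entry holds one
# small integer score per TOP Factor category (data citation, data transparency,
# analysis code transparency, materials transparency, design and analysis
# guidelines, study preregistration, analysis plan preregistration, replication,
# registered reports, open science badges); the exact per-category scale is
# defined by the TOP Factor rubric and is only approximated here.
example_guideline_scores = {
    "Journal A": [1, 1, 0, 0, 1, 0, 0, 0, 0, 0],
    "Journal B": [2, 1, 1, 0, 1, 1, 1, 1, 1, 0],
    "Journal C": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
}

# One TOP Factor per journal: the sum of its category scores.
totals = sorted(sum(scores) for scores in example_guideline_scores.values())

# quantiles() with n=4 yields the 25th, 50th (median) and 75th percentiles.
q1, med, q3 = quantiles(totals, n=4)
print(f"median [25th, 75th percentile]: {med} [{q1}, {q3}], "
      f"min. {min(totals)}, max. {max(totals)}")
```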

Opening the Door to Registered Reports: Census of Journals Publishing Registered Reports (2013–2020)

Abstract:  Registered reports are a new publication workflow where the decision to publish is made prior to data collection and analysis, and thus cannot be dependent on the outcome of the study. An increasing number of journals have adopted this new mechanism; however, previous research suggests that submission rates are still relatively low. We conducted a census of journals publishing registered reports (N = 278) using independent coders to collect information from submission guidelines, with the goal of documenting journals’ early adoption of registered reports. Our results show that the majority of journals adopting registered reports are in psychology, and it typically takes about a year to publish the first registered report after adopting. However, many journals have still not published their first registered report. There is high variability in impact of journals adopting registered reports. Many journals do not include concrete information about policies that address concerns about registered reports (e.g., exploratory analysis); however, those that do typically allow these practices with some restrictions. Additionally, other open science practices are commonly recommended or required as part of the registered report process, especially open data and materials. Overall, many journals did not include many of the fields coded by the research team, which could be a barrier to submission for some authors. Though the majority of journals allow authors to be anonymous during the review process, a sizable portion do not, which could also be a barrier to submission. We conclude with future directions and implications for authors of registered reports, journals that have already adopted registered reports, and journals that may consider adopting registered reports in the future.

 

Promoting scientific integrity through open science in health psychology: results of the Synergy Expert Meeting of the European health psychology society

Abstract:  The article describes a position statement and recommendations for actions that need to be taken to develop best practices for promoting scientific integrity through open science in health psychology endorsed at a Synergy Expert Group Meeting. Sixteen Synergy Meeting participants developed a set of recommendations for researchers, gatekeepers, and research end-users. The group process followed a nominal group technique and voting system to elicit and decide on the most relevant and topical issues. Seventeen priority areas were listed and voted on; 15 of them were recommended by the group. Specifically, the following priority actions for health psychology were endorsed: (1) for researchers: advancing when and how to make data open and accessible at various research stages and understanding researchers’ beliefs and attitudes regarding open data; (2) for educators: integrating open science in research curricula, e.g., through online open science training modules, promoting preregistration, transparent reporting, open data and applying open science as a learning tool; (3) for journal editors: providing an open science statement, and open data policies, including a minimal requirements submission checklist. Health psychology societies and journal editors should collaborate in order to develop a coordinated plan for research integrity and open science promotion across behavioural disciplines.

 

Questionable and open research practices: attitudes and perceptions among quantitative communication researchers (PsyArXiv preprint)

Abstract:  Recent contributions have questioned the credibility of quantitative communication research. While questionable research practices (QRPs) are believed to be widespread, evidence for this claim is primarily derived from other disciplines. Before change in communication research can happen, it is important to document the extent to which QRPs are used and whether researchers are open to the changes proposed by the so-called open science agenda. We conducted a large survey among authors of papers published in the top-20 journals in communication science in the last ten years (N=1039). A non-trivial percentage of researchers report using one or more QRPs. While QRPs are generally considered unacceptable, researchers perceive QRPs to be common among their colleagues. At the same time, we find optimism about the use of open science practices in communication research. We end with a series of recommendations outlining what journals, institutions and researchers can do moving forward.

Reducing bias and improving transparency in medical research: a critical overview of the problems, progress and suggested next steps – Stephen H Bradley, Nicholas J DeVito, Kelly E Lloyd, Georgia C Richards, Tanja Rombey, Cole Wayant, Peter J Gill, 2020

Abstract:  In recent years there has been increasing awareness of problems that have undermined trust in medical research. This review outlines some of the most important issues including research culture, reporting biases, and statistical and methodological issues. It examines measures that have been instituted to address these problems and explores the success and limitations of these measures. The paper concludes by proposing three achievable actions which could be implemented to deliver significantly improved transparency and mitigation of bias. These measures are as follows: (1) mandatory registration of interests by those involved in research; (2) that journals support the ‘registered reports’ publication format; and (3) that comprehensive study documentation for all publicly funded research be made available on a World Health Organization research repository. We suggest that achieving such measures requires a broad-based campaign which mobilises public opinion. We invite readers to give feedback on the proposed actions and to join us in calling for their implementation.

 
