The potential butterfly effect of preregistered peer-reviewed research – The Official PLOS Blog

“Refocusing journal peer review on the study design phase exerts more and greater downstream changes. Peer review that focuses on evaluating the significance of the research question, the methods, and the analytical approach before work begins has the power to shape stronger, more rigorous and more creative research. Making an editorial decision while results are still unknown minimizes the potential impacts of confirmation bias and impact bias, taking science communication back to its roots, with an emphasis on quality, rigor, and pure intellectual curiosity. As Kiermer explains, “Preregistration and peer review of the study protocol with a journal is a way to tackle publication bias. As long as the protocol is followed, or any deviations explained, it’s a guarantee for the author that the results will be published, even if they don’t confirm their hypothesis.”

In combination, all of these factors contribute to a more complete and efficient scientific record, replete with studies exploring important hypotheses, performed to the very highest technical standards, and free from the distorting influence of impact-chasing, ego, and bias. A scientific record that is both demonstrably trustworthy, and widely trusted. And with that, there is no telling where science might go, or how quickly….”

Frontiers | The Academic, Societal and Animal Welfare Benefits of Open Science for Animal Science | Veterinary Science

Abstract:  Animal science researchers have the obligation to reduce, refine, and replace the usage of animals in research (3R principles). Adherence to these principles can be improved by transparently publishing research findings, data and protocols. Open Science (OS) can help to increase the transparency of many parts of the research process, and its implementation should thus be considered by animal science researchers as a valuable opportunity that can contribute to the adherence to these 3R-principles. With this article, we want to encourage animal science researchers to implement a diverse set of OS practices, such as Open Access publishing, preprinting, and the pre-registration of test protocols, in their workflows.

 

MPDL is supporting the Peer Community In (PCI) Initiative

Initiated at the request of several Max Planck institutes, the Max Planck Digital Library is supporting the platform “Peer Community in Registered Reports” by making a one-time funding contribution of 5,000 euros.

The Peer Community In (PCI) initiative is a non-profit, non-commercial platform that evaluates and recommends preprints in many scientific fields. The overarching aim of this researcher-run organization is to create specific communities of researchers reviewing and recommending, for free, unpublished preprints in their field.

 

Sharing is caring: Ethical implications of transparent research in psychology. – PsycNET

Abstract:  The call for greater openness in research data is quickly growing in many scientific fields. Psychology as a field, however, still falls short in this regard. Research is vulnerable to human error, inaccurate interpretation and reporting of study results, and decisions during the research process being biased toward favorable results. Despite the obligation to share data for verification and the importance of this practice for protecting against human error, many psychologists do not fulfill their ethical responsibility of sharing their research data. This has implications for the accurate and ethical dissemination of specific research findings and the scientific development of the field more broadly. Open science practices provide promising approaches to address the ethical issues of inaccurate reporting and false-positive results in psychological research literature that hinder scientific growth and ultimately violate several relevant ethical principles and standards from the American Psychological Association’s (APA’s) Ethical Principles of Psychologists Code of Conduct (APA, 2017). Still, current incentive structures in the field for publishing and professional advancement appear to induce hesitancy in applying these practices. With each of these considerations in mind, recommendations on how psychologists can ethically proceed through open science practices and incentive restructuring—in particular, data management, data and code sharing, study preregistration, and registered reports—are provided.

Preprints and preregistration: making your research or research plans publicly available at an early stage. – Digital Scholarship Leiden

“We explored two different ways of opening up research at an early stage: preprints and pre-registration. How to go about making your research or research plans publicly available at an early stage, and what can you expect to happen after you have done so?…”

Comparison of Preregistration Platforms

Abstract:  Preregistration can force researchers to front-load a lot of decision-making to an early stage of a project. Choosing which preregistration platform to use must therefore be one of those early decisions, and because a preregistration cannot be moved, that choice is permanent. This article aims to help researchers who are already interested in preregistration choose a platform by clarifying differences between them. Preregistration criteria and features are explained and analyzed for sites that cater to a broad range of research fields, including GitHub, AsPredicted, Zenodo, the Open Science Framework (OSF), and an “open-ended” variant of OSF. While a private prespecification document can help mitigate self-deception, this guide considers publicly shared preregistrations that aim to improve credibility. It therefore defines three of the criteria (a timestamp, a registry, and persistence) as a bare minimum for a valid and reliable preregistration. GitHub and AsPredicted fail to meet all three. Zenodo and OSF meet the basic criteria and vary in which additional features they offer.

What has Royal Society Open Science achieved in its first few years? | Royal Society Open Science

Abstract:  It has been a pleasure and a privilege to serve as the first Editor-in-Chief of Royal Society Open Science for the past 6 years. I step down at the end of December 2021, having completed two 3-year terms, and am taking the opportunity here to reflect on some of the successes and challenges that the journal has experienced and the innovations that we have introduced. When I was first approached back in 2015, the breadth of the journal, covering the whole of science, resonated with my own interests: my research career has ranged across the entire landscape of chemistry, while my leadership roles have embraced all of science, technology and medicine. The open access ethos, the objective refereeing policy that rejects the idea of only publishing what is in fashion, and the opportunities offered by a new venture that could transcend traditional disciplinary boundaries also all appealed to me. Among our successful innovations are Registered Reports, Replication Studies and the new ‘Science, Society and Policy’ section. The challenges have included the transition to paid article processing charges (APCs), whether to resist pressure to retract a controversial paper, and bullying of young female authors by established senior males in the same field. I explore all of these below, provide some statistics on the journal’s performance, also cover some of the notable papers we have published, and provide some concluding thoughts.

 

Comparing dream to reality: an assessment of adherence of the first generation of preregistered studies | Royal Society Open Science

Abstract:  Preregistration is a method to increase research transparency by documenting research decisions on a public, third-party repository prior to any influence by data. It is becoming increasingly popular in all subfields of psychology and beyond. Adherence to the preregistration plan may not always be feasible and is not even necessarily desirable, but without disclosure of deviations, readers who do not carefully consult the preregistration plan might get the incorrect impression that the study was conducted and reported exactly as planned. In this paper, we have investigated adherence and disclosure of deviations for all articles published with the Preregistered badge in Psychological Science between February 2015 and November 2017 and shared our findings with the corresponding authors for feedback. Two out of 27 preregistered studies contained no deviations from the preregistration plan. In one study, all deviations were disclosed. Nine studies disclosed none of the deviations. We mainly observed (un)disclosed deviations from the plan regarding the reported sample size, exclusion criteria and statistical analysis. This closer look at preregistrations of the first generation reveals possible hurdles for reporting preregistered studies and provides input for future reporting guidelines. We discuss the results and possible explanations, and provide recommendations for preregistered research.

 

Open-access science in the misinformation era

“While open-access science has made research available worldwide, some scholars worry that misinformation, fraud and politicization have become rampant in a system that rewards speed and sparkle….

In a widely discussed Scholarly Kitchen piece published last week, Schonfeld said that misinformation, politicization and other problems embedded in the open-access movement stem from a “mismatch” between the incentives in science and the ways in which “openness and politicization are bringing science into the public discourse.” …

While open access has democratized science, to good effect — making research available to sick patients interested in learning more about their condition or to scientists working in the Global South — it also has had “second-order effects” that are more concerning, he said.

“It’s now easier for scientific literature to be quoted and used in all sorts of political discourse,” Schonfeld said in an interview. “When the methods of scholarly publishing that we use today were first formed, there was no sense that there was going to be a kind of politicized discourse looking for opportunities to misinform the public and intentionally cause disunity.” …”

Preregistration: A Plan, Not a Prison

“If you are worried about harming the publishability of your work once you have deviated from the original plan, focus on journals that have made a commitment to rewarding transparency, those that have signed the Transparency and Openness Promotion Guidelines, those that issue Open Practice Badges, or those that accept Registered Reports. These practices signal that the journals have a core commitment toward open and reproducible research and so are best suited to evaluating work based on ideal scientific practices.”

Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations | SpringerLink

Abstract:  Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.

 

Publishing for science or science for publications? The role of open science to reduce research waste – Siegerink – 2021 – Journal of Thrombosis and Haemostasis – Wiley Online Library

“One of the underlying ideas of Open Science is that when scientists are open about what they are doing, and what they have been up to, double work can be prevented. For example, Prospero (https://www.crd.york.ac.uk/prospero/), a registry in which authors can file their intention to execute a systematic review and meta-analysis, can indeed fulfill that role. However, Chapelle et al. show that only 10 of the 20 meta-analyses were indeed preregistered. …

Another way to reduce redundant publications is to share research before it is peer reviewed by publishing it on a preprint server such as medRxiv.org. Although the coronavirus disease 2019 pandemic has popularized this practice, it is only used for a small fraction of all research output. Would further adoption of this practice be a way to further reduce research waste? The data collected by Chapelle et al. suggest that the time window between “received” and “published online” was short. Preprints will only prevent double work when there is a sufficiently large window between these two timepoints during which other researchers have to decide whether or not to start a new project….

As long as the scientific enterprise incentivizes research waste and science for publications, time and resources are wasted. Open science practices cannot counteract this because they do not address the root cause….”