Open Science Standards at Journals that Inform Evidence-Based Policy | SpringerLink

Abstract:  Evidence-based policy uses intervention research to inform consequential decisions about resource allocation. Research findings are often published in peer-reviewed journals. Because detrimental research practices associated with closed science are common, journal articles report more false positives and exaggerated effect sizes than would be desirable. Journal implementation of standards that promote open science—such as the Transparency and Openness Promotion (TOP) guidelines—could reduce detrimental research practices and improve the trustworthiness of research evidence on intervention effectiveness. We evaluated TOP implementation at 339 peer-reviewed journals that have been used to identify evidence-based interventions for policymaking and programmatic decisions. For each of the ten open science standards in TOP, most journals had not implemented it in their policies (instructions to authors), procedures (manuscript submission systems), or practices (published articles). Journals implementing at least one standard typically encouraged, but did not require, an open science practice. We discuss why and how journals could improve implementation of open science standards to safeguard evidence-based policy.


Call for Volunteers: TOP Guidelines Advisory Board and Preregistration Template Evaluation Working Group

“Are you passionate about promoting transparency and openness in scientific research? The Center for Open Science (COS) is currently seeking volunteers for two opportunities. We seek colleagues to join (1) the Transparency and Openness Promotion (TOP) Guidelines Advisory Board and (2) the Preregistration Template Evaluation Working Group….”

NASA’s Thirst for Open Source Software — and for Open Science – The New Stack

“Software has been a crucial component to all of NASA’s major achievements, from space travel to the deepest images of our universe. Naturally, NASA’s need for high-quality scientific software has led it to open source developers, and now to an ambitious new program based on the larger principles of “open science.”

Bringing NASA’s open source message to the annual FOSDEM conference was Steve Crawford, a space-loving astronomer who is now also the data officer of NASA’s science directorate, the group engaging the scientific community to define questions and expand research….

But there’s also an outreach to the world beyond NASA — including a new $40 million, five-year program called Transform to Open Science. The idea of open science involves free availability of research information to encourage outside contributions, and NASA is actively trying to lead us there….

The official TOPS webpage calls it NASA’s “global community initiative to spark change and inspire open science engagement through events and activities that will shift the current paradigm.” Throughout 2023, NASA TOPS will be partnering with 12 scientific professional societies in the scientific community “to advance the adoption of open science, roll out an open science curriculum, and support minority-serving institutions engagement with NASA through prizes, challenges, and hackathons.”…”

Exploring enablers and barriers to implementing the Transparency and Openness Promotion Guidelines: a theory-based survey of journal editors | Royal Society Open Science

Abstract:  The Transparency and Openness Promotion (TOP) Guidelines provide a framework to help journals develop open science policies. Theories of behaviour change can guide understanding of why journals do (not) implement open science policies and the development of interventions to improve these policies. In this study, we used the Theoretical Domains Framework to survey 88 journal editors on their capability, opportunity and motivation to implement TOP. Likert-scale questions assessed editor support for TOP, and enablers and barriers to implementing TOP. A qualitative question asked editors to provide reflections on their ratings. Most participating editors supported adopting TOP at their journal (71%) and perceived other editors in their discipline to support adopting TOP (57%). Most editors (93%) agreed their roles include maintaining policies that reflect current best practices. However, most editors (74%) did not see implementing TOP as a high priority compared with other editorial responsibilities. Qualitative responses expressed structural barriers to implementing TOP (e.g. lack of time, resources and authority to implement changes) and varying support for TOP depending on study type, open science standard, and level of implementation. We discuss how these findings could inform the development of theoretically guided interventions to increase open science policies, procedures and practices.


Further action toward valid science in Law and Human Behavior: Requiring open data, analytic code, and research materials.

“Beginning on March 1, 2023, Law and Human Behavior will raise its standard for data reporting and expand its focus to include analytic code and research materials. Adopting the recommended language from the TOP Guidelines (Nosek et al., 2015b), the journal will publish articles only if the data, analytic code, and research materials are clearly and precisely documented and are fully available to any researcher who wishes to reproduce the results or replicate the procedure.

Accordingly, authors using original data who seek to publish their research in the journal must make the following items publicly available: …

Authors reusing data from public repositories who pursue publication in Law and Human Behavior must provide program code, scripts for statistical packages, and other documentation sufficient to allow an informed researcher to precisely reproduce all published results….”


Evaluating Research Transparency and Openness in Communication Sciences and Disorders Journals | Journal of Speech, Language, and Hearing Research

Abstract:  Purpose:

To improve the credibility, reproducibility, and clinical utility of research findings, many scientific fields are implementing transparent and open research practices. Such open science practices include researchers making their data publicly available and preregistering their hypotheses and analyses. One way to enhance the adoption of open science practices is for journals to encourage or require submitting authors to participate in such practices. Accordingly, the American Speech-Language-Hearing Association’s Journals Program has recently announced its intention to promote open science practices. Here, we quantitatively assess the extent to which journals in communication sciences and disorders (CSD) encourage or require participation in several open science practices, using the Transparency and Openness Promotion (TOP) Factor metric.

Method:

TOP Factors were assessed for 34 CSD journals, as well as several journals in related fields. TOP Factors measure the level of implementation across 10 open science–related practices (e.g., data transparency, analysis plan preregistration, and replication) for a total possible score of 29 points.
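To make the arithmetic concrete, here is a minimal sketch of how a score of this kind can be tallied from per-standard implementation levels. The standard names, the uniform 0–3 level scale, and the flat summation are simplifying assumptions for illustration; the official TOP Factor rubric does not allow every item to reach the top level, hence the 29-point maximum noted above.

```python
# Minimal sketch of a TOP-Factor-style tally. The standard names, the
# uniform 0-3 scale, and the flat sum are assumptions for illustration;
# the official rubric caps some items lower, giving a 29-point maximum.

STANDARDS = [
    "data_citation", "data_transparency", "code_transparency",
    "materials_transparency", "design_analysis_reporting",
    "study_preregistration", "analysis_plan_preregistration",
    "replication", "registered_reports", "open_science_badges",
]

def top_factor(levels: dict[str, int]) -> int:
    """Sum implementation levels (0 = not implemented) across standards."""
    return sum(levels.get(standard, 0) for standard in STANDARDS)

# Example: a journal that requires data sharing (level 2) and encourages
# study preregistration (level 1), with no other standards implemented.
example = {"data_transparency": 2, "study_preregistration": 1}
print(top_factor(example))  # -> 3
```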

Results:

Collectively, CSD journals had very low TOP Factors (M = 1.4, range: 0–8). The related fields of Psychology (M = 4.0), Rehabilitation (M = 3.2), Linguistics (M = 1.7), and Education (M = 1.6) also had low scores, though Psychology and Rehabilitation had higher scores than CSD.

Conclusion:

CSD journals currently have low levels of encouraging or requiring participation in open science practices, which may impede adoption.

Publications | Free Full-Text | Adoption of Transparency and Openness Promotion (TOP) Guidelines across Journals

Abstract:  Journal policies continuously evolve to enable knowledge sharing and support reproducible science, but that change happens within a certain framework. The Transparency and Openness Promotion (TOP) guidelines comprise eight modular standards, each with three levels of increasing stringency, and can be used to evaluate the extent to which, and the stringency with which, journals promote open science. The guidelines define standards for data citation; transparency of data, materials, code, and design and analysis; replication; and analysis plan and study preregistration, plus two effective interventions: “Registered Reports” and “Open Science Badges”. Levels of adoption, summed across standards, define a journal’s TOP Factor. In this paper, we analysed the status of adoption of the TOP guidelines across two thousand journals reported in the TOP Factor metrics. We show that the majority of journals’ policies align with at least one of TOP’s standards, most likely “Data citation” (70%), followed by “Data transparency” (19%). Two-thirds of TOP standard adoptions are at Level 1 (the least stringent), whereas only 9% are at Level 3. Adoption of TOP standards differs across science disciplines: multidisciplinary journals (N = 1505) and journals from the social sciences (N = 1077) show the greatest number of adoptions. The measures journals take to implement open science practices could be improved in several ways: (1) improvements could be discipline-specific; (2) journals that have not yet adopted the TOP guidelines could do so; and (3) the stringency of adoptions could be increased.
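As an illustration of how such adoption statistics can be derived, the sketch below tallies, for hypothetical per-journal ratings, the share of journals adopting a given standard (level 1 or higher) and the mix of stringency levels among adoptions. The data structure and the ratings are invented for the example; this is not the paper's analysis code.

```python
# Illustrative sketch, not the paper's analysis code: tally the share of
# journals adopting each TOP standard (level >= 1) and the distribution
# of stringency levels among adoptions. Ratings here are invented.
from collections import Counter

journals = [                      # hypothetical {standard: level} ratings
    {"data_citation": 1, "data_transparency": 0},
    {"data_citation": 3, "data_transparency": 1},
    {"data_citation": 0, "data_transparency": 0},
]

def adoption_share(journals: list[dict], standard: str) -> float:
    """Fraction of journals implementing the standard at any level."""
    adopted = sum(1 for j in journals if j.get(standard, 0) >= 1)
    return adopted / len(journals)

# Distribution of stringency levels across all adoptions (level >= 1).
level_mix = Counter(lvl for j in journals for lvl in j.values() if lvl >= 1)

print(adoption_share(journals, "data_citation"))  # -> 0.666...
print(level_mix)                                  # -> Counter({1: 2, 3: 1})
```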


Measuring Research Transparency

“Measuring the transparency and credibility of research is fundamental to our mission. By having measures of transparency and credibility we can learn about the current state of research practice, we can evaluate the impact of our interventions, we can track progress on culture change, and we can investigate whether adopting transparency behaviors is associated with increasing credibility of findings….

Many groups have conducted research projects that manually code a sample of papers from a field to assess current practices. These are useful but highly effortful. If machines can be trained to do the work, we will get much more data, more consistently, and much faster. There are at least three groups that have made meaningful progress creating scalable solutions: Ripeta, SciScore, and DataSeer. These groups are trying to make it possible, accurate, and easy to assess many papers for whether the authors shared data, used reporting standards, identified their conflicts of interest, and other transparency relevant actions….”
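As a toy illustration of the task these tools automate, the sketch below flags transparency-relevant statements in article text with simple regular expressions. The named tools use trained models and far richer signals; the patterns and the sample text here are assumptions chosen only to make the idea concrete.

```python
# Toy transparency screen. Ripeta, SciScore, and DataSeer use trained
# models; these regex patterns and the sample text are illustrative only.
import re

INDICATORS = {
    "data_sharing": r"data (are|is) (publicly |openly )?available|osf\.io|doi\.org",
    "conflict_of_interest": r"conflicts? of interest|competing interests?",
    "reporting_standard": r"CONSORT|PRISMA|STROBE|ARRIVE",
}

def screen(text: str) -> dict[str, bool]:
    """Report which transparency indicators appear in an article's text."""
    return {name: bool(re.search(pattern, text, re.IGNORECASE))
            for name, pattern in INDICATORS.items()}

sample = ("All data are publicly available at https://osf.io/abcde/. "
          "The authors declare no conflicts of interest.")
print(screen(sample))
# -> {'data_sharing': True, 'conflict_of_interest': True,
#     'reporting_standard': False}
```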

Frontiers | Key Factors for Improving Rigor and Reproducibility: Guidelines, Peer Reviews, and Journal Technical Reviews | Cardiovascular Medicine

Abstract:  To respond to the NIH’s policy for rigor and reproducibility in preclinical research, many journals have implemented guidelines and checklists to guide authors in improving the rigor and reproducibility of their research. Transparency in developing detailed prospective experimental designs and providing raw data are essential premises of rigor and reproducibility. Standard peer reviews and journal-specific technical and statistical reviews are critical factors for enhancing rigor and reproducibility. This brief review also shares some experience from Arteriosclerosis, Thrombosis, and Vascular Biology, an American Heart Association journal, which has implemented several mechanisms to enhance rigor and reproducibility for preclinical research….

Getting Over TOP : Epidemiology

“In May 2015, the Center for Open Science invited Epidemiology to support the Transparency and Openness Promotion (TOP) Guidelines.1 After consulting our editors and former Editors-in-Chief, I declined this invitation and published an editorial to explain the rationale.2 Nonetheless, the Center for Open Science has assigned a TOP score to the journal and disseminated the score via Clarivate, which also disseminates the Journal Impact Factor. Given that Epidemiology has been scored despite opting not to support the TOP Guidelines, and that our score has been publicized by the Center for Open Science, we here restate and expand our concerns with the TOP Guidelines and emphasize that the guidelines are at odds with Epidemiology’s mission and principles.

We declined the invitation to support the TOP Guidelines for three main reasons. First, Epidemiology prefers that authors, reviewers, and editors focus on the quality of the research and the clarity of its presentation over adherence to one-size guidelines. For this reason, among others, the editors of Epidemiology have consistently declined opportunities to endorse or implement endeavors such as the TOP Guidelines.3–5

Second, the TOP Guidelines did not include a concrete plan for program evaluation or revision. Well-meaning guidelines with similar goals sometimes have the opposite of their intended effect.6 Our community would never accept a public health or medical intervention that had little evidence to support its effectiveness (more on that below) and no plan for longitudinal evaluation. We hold publication guidelines to the same standard.

Third, we declined the invitation to support the TOP Guidelines because they rest on the untenable premise that each research article’s results are right or wrong, as eventually determined by whether its results are reproducible or not. Too often, and including in the study of reproducibility that was foundational in the promulgation of the TOP Guidelines,7 reproducibility is evaluated by whether results are concordant in terms of statistical significance. This faulty approach has been used frequently, even though the idea that two results—one statistically significant and the other not—are necessarily different from one another is a well-known fallacy.8,9”

The Center for Open Science receives the Einstein Foundation Award for Promoting Quality in Research

“The Center for Open Science (COS) has been selected as the inaugural institutional recipient of the Einstein Foundation Award for Promoting Quality in Research.

The award “aims to provide recognition and publicity for outstanding efforts that enhance the rigor, reliability, robustness, and transparency of research in the natural sciences, the social sciences, and the humanities, and stimulate awareness and activities fostering research quality among scientists, institutions, funders, and politicians.”

COS is a nonprofit culture change organization founded in 2013 with the mission to increase openness, integrity, and reproducibility of research. COS takes a systems approach to supporting research culture change. COS builds and maintains a free, open source infrastructure for all disciplines of research, called the Open Science Framework (OSF), that enables the adoption of open practices across the research lifecycle. OSF flexibly integrates with other tools and services to make it efficient for researchers to plan, conduct, report on, and discover research within their current workflows. COS collaborates with grassroots organizations that support training and changing communities’ norms toward openness and integrity and provides solutions that empower communities to customize and promote open practices from within. COS works with funders, publishers, societies, and universities to shift incentives and policies to foster culture change toward rigor and transparency. Finally, COS investigates the state of research practices and evaluates the effectiveness of culture change initiatives. These interdependent activities incorporate a theory of change to create sustainable improvements to science as a social system.

The Einstein Foundation’s jury offered its official statement about the institutional award winner: “The Center for Open Science (COS) catalyzes global research culture change via a unique integrated behavior change model. By offering the Open Science Framework (OSF), collaborating with grassroots communities to grow engagement, advocating with stakeholders to adopt new policies and incentives, and evaluating effectiveness, COS helps to make open science the default. The Transparency and Openness Promotion (TOP) Guidelines, launched by COS in 2015, and supported by over 5,000 signatories, along with all of the major publishers, have initiated an overdue transformation in the publishing culture.”…”

Incorporating open science into evidence-based practice: The TRUST Initiative

Abstract:  To make evidence-based policy, the research ecosystem must produce trustworthy evidence. In the US, federal evidence clearinghouses evaluate research using published standards designed to identify “evidence-based” interventions. Because their evaluations can affect billions of dollars in funding, we examined 10 federal evidence clearinghouses. We found their standards focus on study design features such as randomization, but they overlook issues related to transparency, openness, and reproducibility that also affect whether research tends to produce true results. We identified intervention reports used by these clearinghouses, and we developed a method to assess journals that published those evaluations. Based on the Transparency and Openness Promotion (TOP) Guidelines, we created new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices). Of the 340 journals that published influential research, we found that some endorse the TOP Guidelines, but standards for transparency and openness are not applied systematically and consistently. Examining the quality of our tools, we also found varying levels of interrater reliability across different TOP standards. Using our results, we delivered a normative feedback intervention. We informed editors how their journals compare with TOP and with their peer journals. We also used the Theoretical Domains Framework to develop a survey concerning obstacles and facilitators to implementing TOP; 88 editors responded and reported that they are capable of implementing TOP but lack motivation to do so. The results of this program of research highlight ways to support and to assess transparency and openness throughout the evidence ecosystem.


Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations | SpringerLink

Abstract:  Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.


Evaluating implementation of the Transparency and Openness Promotion (TOP) guidelines: the TRUST process for rating journal policies, procedures, and practices | Research Integrity and Peer Review | Full Text

Abstract:  Background

The Transparency and Openness Promotion (TOP) Guidelines describe modular standards that journals can adopt to promote open science. The TOP Factor is a metric to describe the extent to which journals have adopted the TOP Guidelines in their policies. Systematic methods and rating instruments are needed to calculate the TOP Factor. Moreover, implementation of these open science policies depends on journal procedures and practices, for which TOP provides no standards or rating instruments.

Methods

We describe a process for assessing journal policies, procedures, and practices according to the TOP Guidelines. We developed this process as part of the Transparency of Research Underpinning Social Intervention Tiers (TRUST) Initiative to advance open science in the social intervention research ecosystem. We also provide new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices) according to standards in the TOP Guidelines. In addition, we describe how to determine the TOP Factor score for a journal, calculate reliability of journal ratings, and assess coherence among a journal’s policies, procedures, and practices. As a demonstration of this process, we describe a protocol for studying approximately 345 influential journals that have published research used to inform evidence-based policy.
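One common way to “calculate reliability of journal ratings” between two raters is percent agreement together with Cohen’s kappa; the sketch below shows that computation on invented ratings. The TRUST instruments may use different reliability statistics, so treat this only as an illustration of the step.

```python
# Hedged sketch: inter-rater reliability for two raters via Cohen's kappa.
# The TRUST Process may use other statistics; ratings here are invented.
from collections import Counter

def cohens_kappa(rater_a: list[int], rater_b: list[int]) -> float:
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)

# Hypothetical ratings of ten journals on one TOP standard (levels 0-3).
rater_a = [0, 1, 2, 0, 3, 1, 0, 2, 1, 0]
rater_b = [0, 1, 2, 1, 3, 1, 0, 2, 0, 0]
print(round(cohens_kappa(rater_a, rater_b), 2))  # -> 0.71 (observed 0.80)
```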

Discussion

The TRUST Process includes systematic methods and rating instruments for assessing and facilitating implementation of the TOP Guidelines by journals across disciplines. Our study of journals publishing influential social intervention research will provide a comprehensive account of whether these journals have policies, procedures, and practices that are consistent with standards for open science and thereby facilitate the publication of trustworthy findings to inform evidence-based policy. Through this demonstration, we expect to identify ways to refine the TOP Guidelines and the TOP Factor. Refinements could include: improving templates for adoption in journal instructions to authors, manuscript submission systems, and published articles; revising explanatory guidance intended to enhance the use, understanding, and dissemination of the TOP Guidelines; and clarifying the distinctions among different levels of implementation.