Measuring Research Transparency

“Measuring the transparency and credibility of research is fundamental to our mission. By having measures of transparency and credibility we can learn about the current state of research practice, we can evaluate the impact of our interventions, we can track progress on culture change, and we can investigate whether adopting transparency behaviors is associated with increasing credibility of findings….

Many groups have conducted research projects that manually code a sample of papers from a field to assess current practices. These are useful but highly effortful. If machines can be trained to do the work, we will get much more data, more consistently, and much faster. There are at least three groups that have made meaningful progress creating scalable solutions: Ripeta, SciScore, and DataSeer. These groups are trying to make it possible, accurate, and easy to assess many papers for whether the authors shared data, used reporting standards, identified their conflicts of interest, and other transparency relevant actions….”

Frontiers | Key Factors for Improving Rigor and Reproducibility: Guidelines, Peer Reviews, and Journal Technical Reviews | Cardiovascular Medicine

Abstract:  To respond to the NIH’s policy for rigor and reproducibility in preclinical research, many journals have implemented guidelines and checklists to guide authors in improving the rigor and reproducibility of their research. Transparency in developing detailed prospective experimental designs and providing raw data are essential premises of rigor and reproducibility. Standard peer reviews and journal-specific technical and statistical reviews are critical factors for enhancing rigor and reproducibility. This brief review also shares some experience from Arteriosclerosis, Thrombosis, and Vascular Biology, an American Heart Association journal, that has implemented several mechanisms to enhance rigor and reproducibility for preclinical research….

Getting Over TOP: Epidemiology

“In May 2015, the Center for Open Science invited Epidemiology to support the Transparency and Openness Promotion (TOP) Guidelines.1 After consulting our editors and former Editors-in-Chief, I declined this invitation and published an editorial to explain the rationale.2 Nonetheless, the Center for Open Science has assigned a TOP score to the journal and disseminated the score via Clarivate, which also disseminates the Journal Impact Factor. Given that Epidemiology has been scored despite opting not to support the TOP Guidelines, and that our score has been publicized by the Center for Open Science, we here restate and expand our concerns with the TOP Guidelines and emphasize that the guidelines are at odds with Epidemiology’s mission and principles. We declined the invitation to support the TOP Guidelines for three main reasons. First, Epidemiology prefers that authors, reviewers, and editors focus on the quality of the research and the clarity of its presentation over adherence to one-size guidelines. For this reason, among others, the editors of Epidemiology have consistently declined opportunities to endorse or implement endeavors such as the TOP Guidelines.3–5 Second, the TOP Guidelines did not include a concrete plan for program evaluation or revision. Well-meaning guidelines with similar goals sometimes have the opposite of their intended effect.6 Our community would never accept a public health or medical intervention that had little evidence to support its effectiveness (more on that below) and no plan for longitudinal evaluation. We hold publication guidelines to the same standard. Third, we declined the invitation to support the TOP Guidelines because they rest on the untenable premise that each research article’s results are right or wrong, as eventually determined by whether its results are reproducible or not. 
Too often, and including in the study of reproducibility that was foundational in the promulgation of the TOP Guidelines,7 reproducibility is evaluated by whether results are concordant in terms of statistical significance. This faulty approach has been used frequently, even though the idea that two results—one statistically significant and the other not—are necessarily different from one another is a well-known fallacy.8,9 ”

The Center for Open Science receives the Einstein Foundation Award for Promoting Quality in Research

“The Center for Open Science (COS) has been selected as the inaugural institutional recipient of the Einstein Foundation Award for Promoting Quality in Research.

The award “aims to provide recognition and publicity for outstanding efforts that enhance the rigor, reliability, robustness, and transparency of research in the natural sciences, the social sciences, and the humanities, and stimulate awareness and activities fostering research quality among scientists, institutions, funders, and politicians.”

COS is a nonprofit culture change organization founded in 2013 with the mission to increase openness, integrity, and reproducibility of research. COS takes a systems approach to supporting research culture change. COS builds and maintains a free, open source infrastructure for all disciplines of research, called the Open Science Framework (OSF), that enables the adoption of open practices across the research lifecycle. OSF flexibly integrates with other tools and services to make it efficient for researchers to plan, conduct, report on, and discover research within their current workflows. COS collaborates with grassroots organizations that support training and changing communities’ norms toward openness and integrity and provides solutions that empower communities to customize and promote open practices from within. COS works with funders, publishers, societies, and universities to shift incentives and policies to foster culture change toward rigor and transparency. Finally, COS investigates the state of research practices and evaluates the effectiveness of culture change initiatives. These interdependent activities incorporate a theory of change to create sustainable improvements to science as a social system.

The Einstein Foundation’s jury offered its official statement about the institutional award winner: “The Center for Open Science (COS) catalyzes global research culture change via a unique integrated behavior change model. By offering the Open Science Framework (OSF), collaborating with grassroots communities to grow engagement, advocating with stakeholders to adopt new policies and incentives, and evaluating effectiveness, COS helps to make open science the default. The Transparency and Openness Promotion (TOP) Guidelines, launched by COS in 2015, and supported by over 5,000 signatories, along with all of the major publishers, have initiated an overdue transformation in the publishing culture.”…”

Incorporating open science into evidence-based practice: The TRUST Initiative

Abstract:  To make evidence-based policy, the research ecosystem must produce trustworthy evidence. In the US, federal evidence clearinghouses evaluate research using published standards designed to identify “evidence-based” interventions. Because their evaluations can affect billions of dollars in funding, we examined 10 federal evidence clearinghouses. We found their standards focus on study design features such as randomization, but they overlook issues related to transparency, openness, and reproducibility that also affect whether research tends to produce true results. We identified intervention reports used by these clearinghouses, and we developed a method to assess journals that published those evaluations. Based on the Transparency and Openness Promotion (TOP) Guidelines, we created new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices). Of the 340 journals that published influential research, we found that some endorse the TOP Guidelines, but standards for transparency and openness are not applied systematically and consistently. Examining the quality of our tools, we also found varying levels of interrater reliability across different TOP standards. Using our results, we delivered a normative feedback intervention. We informed editors how their journals compare with TOP and with their peer journals. We also used the Theoretical Domains Framework to develop a survey concerning obstacles and facilitators to implementing TOP; 88 editors responded and reported that they are capable of implementing TOP but lack motivation to do so. The results of this program of research highlight ways to support and to assess transparency and openness throughout the evidence ecosystem.


Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations | SpringerLink

Abstract:  Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.

Evaluating implementation of the Transparency and Openness Promotion (TOP) guidelines: the TRUST process for rating journal policies, procedures, and practices | Research Integrity and Peer Review

Abstract:  Background

The Transparency and Openness Promotion (TOP) Guidelines describe modular standards that journals can adopt to promote open science. The TOP Factor is a metric to describe the extent to which journals have adopted the TOP Guidelines in their policies. Systematic methods and rating instruments are needed to calculate the TOP Factor. Moreover, implementation of these open science policies depends on journal procedures and practices, for which TOP provides no standards or rating instruments.

Methods

We describe a process for assessing journal policies, procedures, and practices according to the TOP Guidelines. We developed this process as part of the Transparency of Research Underpinning Social Intervention Tiers (TRUST) Initiative to advance open science in the social intervention research ecosystem. We also provide new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices) according to standards in the TOP Guidelines. In addition, we describe how to determine the TOP Factor score for a journal, calculate reliability of journal ratings, and assess coherence among a journal’s policies, procedures, and practices. As a demonstration of this process, we describe a protocol for studying approximately 345 influential journals that have published research used to inform evidence-based policy.

Discussion

The TRUST Process includes systematic methods and rating instruments for assessing and facilitating implementation of the TOP Guidelines by journals across disciplines. Our study of journals publishing influential social intervention research will provide a comprehensive account of whether these journals have policies, procedures, and practices that are consistent with standards for open science and thereby facilitate the publication of trustworthy findings to inform evidence-based policy. Through this demonstration, we expect to identify ways to refine the TOP Guidelines and the TOP Factor. Refinements could include: improving templates for adoption in journal instructions to authors, manuscript submission systems, and published articles; revising explanatory guidance intended to enhance the use, understanding, and dissemination of the TOP Guidelines; and clarifying the distinctions among different levels of implementation.
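The TOP Factor described in this abstract is, at its core, a sum of per-standard implementation levels. The sketch below is an illustrative assumption of how such a score could be computed; the standard names and example ratings are placeholders, not the official TRUST rating instruments:

```python
# Hypothetical sketch of a TOP Factor-style score: each TOP standard is rated
# at an implementation level (0 = no policy .. 3 = strictest), and a journal's
# score is the sum across standards. The standards listed here are illustrative
# and do not reproduce the official instruments.

TOP_STANDARDS = [
    "data_citation",
    "data_transparency",
    "code_transparency",
    "materials_transparency",
    "design_analysis_reporting",
    "study_preregistration",
    "analysis_plan_preregistration",
    "replication",
]

def top_factor(ratings: dict) -> int:
    """Sum the level (0-3) assigned to each standard; unrated standards count as 0."""
    for standard, level in ratings.items():
        if standard not in TOP_STANDARDS:
            raise ValueError(f"unknown standard: {standard}")
        if not 0 <= level <= 3:
            raise ValueError(f"level out of range for {standard}: {level}")
    return sum(ratings.get(s, 0) for s in TOP_STANDARDS)

# Example: a journal requiring data sharing (level 2) and encouraging
# study preregistration (level 1), with no policy on the other standards.
example = {"data_transparency": 2, "study_preregistration": 1}
print(top_factor(example))  # prints: 3
```

The modular design is the point: each standard is scored independently, so a journal can adopt strict data-sharing requirements while leaving, say, replication policy unaddressed.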

Health Psychology adopts Transparency and Openness Promotion (TOP) Guidelines.

“The Editors are pleased to announce that Health Psychology has adopted the Transparency and Openness Promotion (TOP) Guidelines (Center for Open Science, 2021). We and the other core American Psychological Association (APA) journals are implementing these guidelines at the direction of the APA Publications and Communications Board. Their decision was made with the support of the APA Council of Editors and the APA Open Science and Methodology Committee.

The TOP Guidelines were originally published in Science (Nosek et al., 2015) to encourage journals to incentivize open research practices. They are being implemented by a wide range of scientific publications, including some of the leading behavioral and medical research journals….”

Towards open science: what we know and what we need to know

“Open science presents itself as a set of policies and actions to disseminate research results in an accessible, free, reusable, and reproducible way through public digital repositories. As a movement, it uses three basic elements: open access to publications; data opening (whether raw data, models, specifications, or documentation); computational process opening (software and algorithms)(1).

Although it is not a new phenomenon, the term can still seem strange even to experienced researchers. Open access to articles, the first element, encountered (and still encounters) great resistance to becoming unanimous, although pressure from the scientific community and funding agencies has accelerated progress at this stage. On the other hand, data opening seems to have been better received, at least with respect to the deposit of scientific manuscripts in preprint format; however, this is only the beginning.

Concerning the Brazilian experience, SciELO and the Brazilian Institute of Information in Science and Technology (IBICT – Instituto Brasileiro de Informação em Ciência e Tecnologia) have been leading the opening process and for some time have designed guidelines and strategies to guide their journals towards open science: TOP (Transparency and Openness Promotion)(2). Interestingly, this system presents graded levels of openness, ranging from merely disclosing whether a given item is met to making its express fulfillment a condition for the manuscript to be published.

Although it has existed since 2017, it was only in 2020 that the alignment of Brazilian journals to TOP was indeed accelerated, and significant changes will be adopted in the journals in the coming months and years to adapt to such principles.

Having this information, and recognizing that changes have historically met resistance, especially when they happen in a long-standing system like the scientific publication system, we use our privilege of taking on multiple roles (author, reviewer, and editor) in the scientific publication process in Brazilian journals to reflect on and point out in this editorial four central issues related to editorial management that should be recurrent among the actors involved in the publication process in the coming months and years: …”

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 27 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 3 [1, 3], min. 0, max. 9, out of a total possible score of 29) in sleep research and chronobiology journals.

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.
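The summary statistics this abstract reports (median with 25th and 75th percentiles across journals) can be reproduced with the standard library. The scores below are fabricated placeholders for illustration only, not the study's 27 journal scores:

```python
import statistics

# Placeholder TOP Factor scores, for illustration only -- NOT the actual
# journal scores from the study above.
scores = [0, 1, 1, 1, 2, 2, 3, 3, 3, 3, 4, 5, 9]

median = statistics.median(scores)
# quantiles(n=4) returns the three quartile cut points [Q1, Q2, Q3]
# (default "exclusive" method).
q1, _, q3 = statistics.quantiles(scores, n=4)
print(f"median {median} [{q1}, {q3}], min {min(scores)}, max {max(scores)}")
# prints: median 3 [1.0, 3.5], min 0, max 9
```

Reporting the interquartile range alongside the median, as the authors do, is a sensible choice for small, skewed score distributions where a mean would be dominated by one or two high-scoring journals.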

Transparency and Open Science at the Journal of Personality – Wright – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been furthered for increasing the rigor of the published research through openness and transparency. The Journal has long prized and published the type of research with features, like large sample sizes (Fraley & Vazire, 2014), that has fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding) did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices that are encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal with a broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Openness Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these strike a desirable balance between compelling a high degree of openness and transparency while not being overly onerous and a deterrent for authors interested in the Journal as an outlet for their work….”

Data deposition required for all C19 Rapid Review publishers – OASPA

“The C19 Rapid Review Initiative – a large-scale collaboration of organisations across the scholarly publishing industry – has agreed to mandate data deposition across the original group of journals that set up the collaboration (eLife, F1000 Research, Hindawi, PeerJ, PLOS, Royal Society, FAIRsharing, Outbreak Science Rapid PREreview, GigaScience, Life Science Alliance, Ubiquity Press, UCL, MIT Press, Cambridge University Press, BMC, RoRi and AfricArXiv). New members aim to align in due course. 

The Initiative, which grew from a need to improve efficiency of peer review and publishing of crucial COVID-19 research, began in April 2020 and now involves over 20 publishers, industry experts, and scholarly communication organizations, supporting over 1,800 rapid reviewers across relevant fields. …”

From Bioethics to Data Sharing for Transparency in Nursing Research

“Our journal, Journal of Korean Academy of Nursing (JKAN), adopted a data sharing policy in December 2020 (https://www.jkan.or.kr/index.php?body=dataSharing) [3], which was applied from volume 50 issue 6 after extensive discussion. As editor-in-chief, I would like to inform our readers to enhance their understanding of the data sharing policy….”