Health Psychology adopts Transparency and Openness Promotion (TOP) Guidelines.

“The Editors are pleased to announce that Health Psychology has adopted the Transparency and Openness Promotion (TOP) Guidelines (Center for Open Science, 2021). We and the other core American Psychological Association (APA) journals are implementing these guidelines at the direction of the APA Publications and Communications Board. Their decision was made with the support of the APA Council of Editors and the APA Open Science and Methodology Committee.

The TOP Guidelines were originally published in Science (Nosek et al., 2015) to encourage journals to incentivize open research practices. They are being implemented by a wide range of scientific publications, including some of the leading behavioral and medical research journals….”

Towards open science: what we know and what we need to know

“Open science presents itself as a set of policies and actions to disseminate research results in an accessible, free, reusable, and reproducible way through public digital repositories. As a movement, it rests on three basic elements: open access to publications; open data (whether raw data, models, specifications, or documentation); and open computational processes (software and algorithms)(1).

Although it is not a new phenomenon, the term can still seem unfamiliar even to experienced researchers. Open access to articles, the first element, encountered (and still encounters) great resistance on the way to becoming universal, although pressure from the scientific community and funding agencies has accelerated progress at this stage. Data opening, on the other hand, seems to have been better received, at least in its interface with the deposit of scientific manuscripts in preprint format; however, this is only the beginning.

Concerning the Brazilian experience, SciELO and the Brazilian Institute of Information in Science and Technology (IBICT – Instituto Brasileiro de Informação em Ciência e Tecnologia) have been leading the opening process and have for some time been designing guidelines and strategies to steer their journals towards open science: TOP (Transparency and Openness Promotion)(2). Interestingly, this system defines levels of openness that range from merely disclosing whether a given item is present to requiring that it be expressly fulfilled before the manuscript can be published.

Although TOP has existed since 2017, it was only in 2020 that the alignment of Brazilian journals with it truly accelerated, and significant changes will be adopted by journals in the coming months and years to conform to these principles.

Given this information, and mindful that changes have historically met with resistance, especially within a long-established system such as scientific publishing, we use our privilege of holding multiple roles (author, reviewer, and editor) within the scientific publication process in Brazilian journals to reflect on and point out in this editorial four central issues of editorial management that should be recurrent among the actors involved in the publication process in the coming months and years: …”

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 27 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 3 [1, 3], min. 0, max. 9, out of a total possible score of 29) in sleep research and chronobiology journals.

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.

Transparency and Open Science at the Journal of Personality – Wright – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been put forward for increasing the rigor of the published research through openness and transparency. The Journal has long prized and published the type of research with features, like large sample sizes (Fraley & Vazire, 2014), that has fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding) did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices that are encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal with a broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Open Science Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these strike a desirable balance between compelling a high degree of openness and transparency while not being overly onerous and a deterrent for authors interested in the Journal as an outlet for their work….”

Data deposition required for all C19 Rapid Review publishers – OASPA

“The C19 Rapid Review Initiative – a large-scale collaboration of organisations across the scholarly publishing industry – has agreed to mandate data deposition across the original group of journals that set up the collaboration (eLife, F1000 Research, Hindawi, PeerJ, PLOS, Royal Society, FAIRsharing, Outbreak Science Rapid PREreview, GigaScience, Life Science Alliance, Ubiquity Press, UCL, MIT Press, Cambridge University Press, BMC, RoRi and AfricArXiv). New members aim to align in due course. 

The Initiative, which grew from a need to improve efficiency of peer review and publishing of crucial COVID-19 research, began in April 2020 and now involves over 20 publishers, industry experts, and scholarly communication organizations, supporting over 1,800 rapid reviewers across relevant fields. …”

From Bioethics to Data Sharing for Transparency in Nursing Research

“Our journal, Journal of Korean Academy of Nursing (JKAN), adopted a data sharing policy in December 2020 (https://www.jkan.or.kr/index.php?body=dataSharing) [3], which was applied from volume 50 issue 6 after extensive discussion. As editor-in-chief, I would like to inform our readers to enhance their understanding of the data sharing policy….”

Replicate Others as You Would Like to Be Replicated Yourself | PS: Political Science & Politics | Cambridge Core

“This article presents some principles on constructive ways to conduct replications. Following the style of the Transparency and Openness Promotion (TOP) guidelines for open science (Nosek et al. 2015), we summarize recommendations as a series of tiers from what is commonly done (Level I), what would be better (Level II), and what would be better still (Level III) (table 1). The aspiration of our constructive replication recommendations is to help fields move toward a research culture in which self-correction is welcomed, honest mistakes are normalized, and different interpretations of results are recognized as a routine outcome of the process. Changing culture is always difficult, of course, but conducting projects in line with these ideals, and encouraging them in others, are avenues available to researchers for contributing to an improved culture in which those ideals are closer to reality….”

APA Joins as New Signatory to TOP Guidelines

“The American Psychological Association (APA), the nation’s largest organization representing psychologists and psychological researchers, has become a signatory to the Transparency and Openness Promotion (TOP) Guidelines, an important step for helping to make research data and processes more open by default, according to the Center for Open Science (COS).

The TOP Guidelines are a community-driven effort to align research behaviors with scientific ideals. Transparency, open sharing, and reproducibility are core values of science, but not always part of daily practice. Journals, funders, and institutions can increase reproducibility and integrity of research by aligning their author or grantee guidelines with the TOP Guidelines.

The APA said it will officially begin implementing standardized disclosure requirements of data and underlying research materials (TOP Level 1). Furthermore, it encourages editors of core journals to move to Level 2 TOP (required transparency of data and research items when ethically possible). More information on the specific levels of adoption by each of the core journals will be coming in the first half of 2021….”

Open Science: Promises and Performance | Qualtrics Survey Solutions

“Although many scientists and organisations endorse this notion, progress has been slow. Some of my research explores the barriers that have impeded progress and makes recommendations to encourage future success. This survey forms part of that work and addresses a variety of issues, including attitudes towards data storage and access, the role of journals in open science, and associated ethical issues.

Those interested in scientific progress are invited to take part, and participation should take less than 10 minutes. Responses will be anonymous and participants can withdraw at any time.

The findings from the survey will be submitted to an open access journal and made available as an open access preprint. The raw data will be lodged with the Open Science Foundation …”

Transparency and open science principles… | Wellcome Open Research

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 28 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. This instrument rates the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 28 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 2.5 [1, 3], min. 0, max. 9, out of a total possible score of 28) in sleep research and chronobiology journals.

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support the recent developments in transparent and open science by implementing transparency and openness principles in their guidelines and making adherence to them mandatory.

Transparency and Openness Promotion (TOP) Guidelines

Abstract: The Transparency and Openness Promotion (TOP) Committee met in November 2014 to address one important element of the incentive systems – journals’ procedures and policies for publication. The outcome of the effort is the TOP Guidelines. There are eight standards in the TOP guidelines; each moves scientific communication toward greater openness. These standards are modular, facilitating adoption in whole or in part. However, they also complement each other, in that commitment to one standard may facilitate adoption of others. Moreover, the guidelines are sensitive to barriers to openness by articulating, for example, a process for exceptions to sharing because of ethical issues, intellectual property concerns, or availability of necessary resources.

Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves

Abstract: Despite the increase in the number of journals issuing data policies requiring authors to make data underlying reported findings publicly available, authors do not always do so, and when they do, the data do not always meet standards of quality that allow others to verify or extend published results. This phenomenon suggests the need to consider the effectiveness of journal data policies to present and articulate transparency requirements, and how well they facilitate (or hinder) authors’ ability to produce and provide access to data, code, and associated materials that meet quality standards for computational reproducibility. This article describes the results of a research study that examined the ability of journal-based data policies to: 1) effectively communicate transparency requirements to authors, and 2) enable authors to successfully meet policy requirements. To do this, we conducted a mixed-methods study that examined individual data policies alongside editors’ and authors’ interpretation of policy requirements to answer these research questions. Survey responses from authors and editors, along with results from a content analysis of data policies, found discrepancies among editors’ assertion of data policy requirements, authors’ understanding of policy requirements, and the requirements stated in the policy language as written. We offer explanations for these discrepancies and offer recommendations for improving authors’ understanding of policies and increasing the likelihood of policy compliance.
