Open science practices for eating disorders research

Abstract:  This editorial seeks to encourage the increased application of three open science practices in eating disorders research: Preregistration, Registered Reports, and the sharing of materials, data, and code. For each of these practices, we introduce updated International Journal of Eating Disorders author and reviewer guidance. Updates include the introduction of open science badges; specific instructions about how to improve transparency; and the introduction of Registered Reports of systematic or meta-analytical reviews. The editorial also seeks to encourage the study of open science practices. Open science practices pose considerable time and other resource burdens. Therefore, research is needed to help determine the value of these added burdens and to identify efficient strategies for implementing open science practices.

Do Open Science Badges Increase Trust in Scientists among Undergraduates, Scientists, and the Public?

Abstract:  Open science badges are a promising method to signal a study’s adherence to open science practices (OSP). In three experimental studies, we investigated whether badges affect trust in scientists by undergraduates (N = 270), scientists (N = 250), or the public (N = 257). Furthermore, we analyzed the moderating role of epistemic beliefs in this regard. Participants were randomly assigned to two of three conditions: Badges awarded (visible compliance to OSP), badges not awarded (visible noncompliance to OSP), and no badges (control). In all samples, our Bayesian analyses indicated that badges influence trust as expected with one exception in the public sample: an additional positive effect of awarded badges compared to no badges was not supported here. Further, we found evidence for the absence of a moderation by epistemic beliefs. Our results demonstrate that badges are an effective means to foster trust in scientists among target audiences of scientific papers.

Ouvrir la Science – Deuxième Plan national pour la science ouverte

From Google’s English:  “The National Open Science Plan announced in 2018 by the Minister of Higher Education, Research and Innovation, Frédérique Vidal, has enabled France to adopt a coherent and dynamic policy in the field of open science, coordinated by the Committee for Open Science, which brings together the ministry, research and higher education institutions and the scientific community. After three years of implementation, the progress made is notable. The rate of French scientific publications in open access rose from 41% to 56%. The National Open Science Fund was created, it launched two calls for projects in favor of open scientific publication and it supported structuring international initiatives. The National Research Agency and other funding agencies now require open access to publications and the drafting of data management plans for the projects they fund. The function of ministerial research data administrator has been created and a network is being deployed in the establishments. About twenty universities and research organizations have adopted an open science policy. Several guides and recommendations for putting open science into practice have been published.

The steps already taken and the evolution of the international context invite us to extend, renew and strengthen our commitments by adopting a second National Plan for Open Science, the effects of which will be deployed until 2024. With this new plan, France is continuing the ambitious trajectory initiated by the law for a digital republic of 2016 and confirmed by the research programming law of 2020, which includes open science in the missions of researchers and teacher-researchers.

This second National Plan extends its scope to the source code produced by research; it structures actions in favor of opening or sharing data through the creation of the Recherche Data Gouv platform; it multiplies the levers of transformation in order to generalize open science practices; and it presents disciplinary and thematic variations. It is firmly in line with a European ambition and proposes, within the framework of the French Presidency of the European Union, to act to take effective account of open science practices in individual and collective research evaluations. It is about initiating a process of sustainable transformation in order to make open science a common and shared practice…”

Increasing transparency through open science badges

“Authors who adopt transparent practices for an article in Conservation Biology are now able to select from 3 open science badges: open data, open materials, and preregistration. Badges appear on published articles as visible recognition and highlight these efforts to the research community. There is an emerging body of literature regarding the influences of badges, for example, an increased number of articles with open data (Kidwell et al. 2016) and increased rate of data sharing (Rowhani-Farid et al. 2018). However, in another study, Rowhani-Farid et al. (2020) found that badges did not “noticeably motivate” researchers to share data. Badges, as far as we know, are the only data-sharing incentive that has been tested empirically (Rowhani-Farid et al. 2017).

Rates of data and code sharing are typically low (Herold 2015; Roche et al. 2015; Archmiller et al. 2020; Culina et al. 2020). Since 2016, we have asked authors of contributed papers, reviews, method papers, practice and policy papers, and research notes to tell us whether they “provided complete machine- and human-readable data and computer code in Supporting Information or on a public archive.” Authors of 31% of these articles published in Conservation Biology said they shared their data or code, and all authors provided human-survey instruments in Supporting Information or via a citation or online link (i.e., shared materials)….”

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 27 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 3 [1, 3], min. 0, max. 9, out of a total possible score of 29) in sleep research and chronobiology journals.
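As a concrete illustration of how a summary of this kind is computed, here is a minimal Python sketch. The per-journal scores below are hypothetical stand-ins constructed to match the reported summary statistics; they are not the study’s actual data.

```python
import numpy as np

# Hypothetical TOP Factor totals for 27 journals (0-29 possible).
# These values are illustrative stand-ins chosen to reproduce the
# reported summary (median 3 [1, 3], min 0, max 9), NOT the study's data.
scores = np.array([0, 0, 1, 1, 1, 1, 1, 1, 2, 2, 3, 3, 3, 3,
                   3, 3, 3, 3, 3, 3, 3, 3, 4, 5, 6, 8, 9])

median = np.median(scores)
q25, q75 = np.percentile(scores, [25, 75])
print(f"median [25th, 75th percentile]: {median:g} [{q25:g}, {q75:g}]")
print(f"min {scores.min()}, max {scores.max()}, out of 29 possible")
```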

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.

Transparency and Open Science at the Journal of Personality – Wright – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been furthered for increasing the rigor of the published research through openness and transparency. The Journal has long prized and published the type of research with features, like large sample sizes (Fraley & Vazire, 2014), that has fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding), did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices that are encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal with a broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Openness Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these strike a desirable balance between compelling a high degree of openness and transparency while not being overly onerous and a deterrent for authors interested in the Journal as an outlet for their work….”

NISO’s Recommended Practice on Reproducibility Badging and Definitions Now Published | Industry Announcements and Events SSP-L

“The National Information Standards Organization (NISO) today announces the publication of its Recommended Practice, RP-31-2021, Reproducibility Badging and Definitions. Developed by the NISO Taxonomy, Definitions, and Recognition Badging Scheme Working Group, this new Recommended Practice provides a set of recognition standards that can be deployed across scholarly publishing outputs, to easily recognize and reward the sharing of data and methods….”

Badge Detail – Badgr

“This digital credential (Open Badge) recognises the completion of the online course “Open Badges for Open Science” at Foundations Level and certifies that the owner of this credential has attained these learning outcomes: 1. Report on how you understand the concept of Open Badges. Focus on the aims and practical uses of Open Badges. 2. Present/visualise the context of the Open Badges including history and organisations involved. 3. Create a portfolio of application fields and good practice examples of Open Badges which are relevant to you and your work. Designed by the OBERRED Erasmus+ project, the Foundations Level of the MOOC “Open Badges for Open Science” provides researchers, practitioners, educators, students and other stakeholders in the field of Research Data Management (RDM) with skills and knowledge in Open Badges which are relevant for successful engagement in Open Science.”

Full article: To share or not to share – 10 years of European Journal of Psychotraumatology

Abstract:  The European Journal of Psychotraumatology, owned by the European Society for Traumatic Stress Studies (ESTSS), launched as one of the first full Open Access ‘specialist’ journals in its field. Has this Open Access model worked in how the Journal has performed? With the European Journal of Psychotraumatology celebrating its ten-year anniversary, we look back at the past decade of sharing our research with the world, consider how the journal sits within the broader movement beyond Open Access to Open Research, and present new policies we have adopted to move the field of psychotraumatology to the next level of Open Research. While we as researchers now make our publications more often freely available to all, how often do we share our protocols, our statistical analysis plans, or our data? We all gain from more transparency and reproducibility, and big steps are being made in this direction. The journal’s decennial performance as well as the exciting new Open Research developments are presented in this editorial. The journal is no longer in its infancy and is eager to step into the next decade of Open Research.

Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study | Royal Society Open Science

Abstract:  For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrepancy’ (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.
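The abstract does not state which interval method the authors used, so the following is only a hedged illustration: a short Python sketch that computes exact (Clopper-Pearson) binomial confidence intervals for proportions like those reported. The function name and the choice of method are our assumptions, not the paper’s.

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) 95% CI for k successes in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Counts taken from the abstract: e.g., 16 of 25 articles with a major
# discrepancy, and 37 of 789 checked values remaining discrepant.
for k, n in [(16, 25), (9, 25), (6, 25), (3, 25), (7, 25), (37, 789)]:
    lo, hi = clopper_pearson(k, n)
    print(f"{k}/{n} = {k / n:.0%}, 95% CI [{lo:.0%}, {hi:.0%}]")
```

Running this yields intervals similar to those in the abstract, though small differences would be expected if the authors used another method (e.g., Wilson).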

 

FAIR metrics and certification, rewards and recognition, skills and training: FAIRsFAIR contribution to the EOSC Strategic Research and Innovation Agenda | FAIRsFAIR

“FAIRsFAIR is a key contributor to the ongoing development of global standards for FAIR data and repository certification and to the policies and practices that will turn the EOSC programme into a functioning infrastructure. The project strongly endorses all of the guiding principles already identified as relevant to implementing the EOSC vision, with a special emphasis on the importance of FAIR-by-design tools. The guiding principles are a multi-stakeholder approach; data as open as possible and as closed as necessary; implementation of a Web of FAIR data and related services for science; federation of existing research infrastructures; and the need for machine-run algorithms transparent to researchers….”