Investigating the Effectiveness of the Open Data Badge Policy at Psychological Science Through Computational Reproducibility

Abstract:  In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on adherence to its stated aim at Psychological Science: ensuring reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all articles provided at least some data, 6/14 articles provided analysis code or scripts, only 1/14 articles was rated as exactly reproducible, and 3/14 as essentially reproducible with minor deviations. We recommend that Psychological Science require a check of reproducibility at the peer review stage before awarding badges, and that the Open Data badge be renamed “Open Data and Code” to avoid confusion and encourage researchers to adhere to this higher standard.

 

An open science argument against closed metrics

“In the Open Scientist Handbook, I argue that open science supports anti-rivalrous science collaborations where most metrics are of little, or of negative value. I would like to share some of these arguments here….

Institutional prestige is a profound drag on the potential for networked science. If your administration has a plan to “win” the college ratings game, this plan will only make doing science harder. It makes being a scientist less rewarding. Playing finite games of chasing arbitrary metrics or ‘prestige’ drags scientists away from the infinite play of actually doing science….

As Cameron Neylon said at the metrics breakout of the ‘Beyond the PDF’ conference some years ago, “reuse is THE metric.” Reuse reveals and confirms the advantage that open sharing has over current, market-based, practices. Reuse validates the work of the scientist who contributed to the research ecosystem. Reuse captures more of the inherent value of the original discovery and accelerates knowledge growth….”

Open Badges: Meaningful Credential for Continuing Education in Libraries? | ZBW MediaTalk

“Why can Open Badges be a suitable way of promoting informally acquired knowledge and self-directed learning? And to what extent do they make an important contribution to validating skills gained in the context of lifelong learning? In this post, interview guest Meik Schild-Steiniger explains why he sees the topic as being a possible answer to the paradigm shift taking place in the culture of learning….”

Incentivising research data sharing: a scoping review

Abstract:  Background: Numerous mechanisms exist to incentivise researchers to share their data. This scoping review aims to identify and summarise evidence of the efficacy of different interventions to promote open data practices and provide an overview of current research.

Methods: This scoping review is based on data identified from Web of Science and LISTA, limited from 2016 to 2021. A total of 1128 papers were screened, with 38 items being included. Items were selected if they focused on designing or evaluating an intervention or presenting an initiative to incentivise sharing. Items comprised a mixture of research papers, opinion pieces and descriptive articles.

Results: Seven major themes in the literature were identified: publisher/journal data sharing policies, metrics, software solutions, research data sharing agreements in general, open science ‘badges’, funder mandates, and initiatives.

Conclusions: A number of key messages for data sharing include: the need to build on existing cultures and practices, meeting people where they are and tailoring interventions to support them; the importance of publicising and explaining the policy/service widely; the need to have disciplinary data champions to model good practice and drive cultural change; the requirement to resource interventions properly; and the imperative to provide robust technical infrastructure and protocols, such as labelling of data sets, use of DOIs, data standards and use of data repositories.

A non-traditional open-source solution for altmetrics | Emerald Insight

Abstract:  Purpose

Altmetrics carry the potential of highlighting scholarly content by measuring online interactions long before traditional metrics accumulate. The aim of this paper is to be the single point of access for librarians, scientists, information specialists, researchers and other scholars in public to learn to embed the open-source embeddable badge provided by Altmetric in their websites and showcase their article altmetrics. Libraries can take advantage of this free and innovative tool by incorporating it in their own websites or digital repositories.

Design/methodology/approach

This paper elucidates steps for embedding Altmetric institutional repository badges in personal websites or institutional repositories.

Findings

This open-source Altmetric tool tracks a range of sources to catch and collect the scholarly activity and assists in monitoring and reporting the attention surrounding an author’s work in a very timely manner.

Originality/value

This tool is freely available to libraries worldwide.

Open science practices for eating disorders research

Abstract:  This editorial seeks to encourage the increased application of three open science practices in eating disorders research: Preregistration, Registered Reports, and the sharing of materials, data, and code. For each of these practices, we introduce updated International Journal of Eating Disorders author and reviewer guidance. Updates include the introduction of open science badges; specific instructions about how to improve transparency; and the introduction of Registered Reports of systematic or meta-analytical reviews. The editorial also seeks to encourage the study of open science practices. Open science practices pose considerable time and other resource burdens. Therefore, research is needed to help determine the value of these added burdens and to identify efficient strategies for implementing open science practices.

Do Open Science Badges Increase Trust in Scientists among Undergraduates, Scientists, and the Public?

Abstract:  Open science badges are a promising method to signal a study’s adherence to open science practices (OSP). In three experimental studies, we investigated whether badges affect trust in scientists by undergraduates (N = 270), scientists (N = 250), or the public (N = 257). Furthermore, we analyzed the moderating role of epistemic beliefs in this regard. Participants were randomly assigned to two of three conditions: Badges awarded (visible compliance to OSP), badges not awarded (visible noncompliance to OSP), and no badges (control). In all samples, our Bayesian analyses indicated that badges influence trust as expected with one exception in the public sample: an additional positive effect of awarded badges compared to no badges was not supported here. Further, we found evidence for the absence of a moderation by epistemic beliefs. Our results demonstrate that badges are an effective means to foster trust in scientists among target audiences of scientific papers.

Ouvrir la Science – Deuxième Plan national pour la science ouverte

From Google’s English:  “The National Open Science Plan announced in 2018 by the Minister of Higher Education, Research and Innovation, Frédérique Vidal, has enabled France to adopt a coherent and dynamic policy in the field of open science, coordinated by the Committee for Open Science, which brings together the ministry, research and higher education institutions and the scientific community. After three years of implementation, the progress made is notable. The rate of French scientific publications in open access rose from 41% to 56%. The National Open Science Fund was created, it launched two calls for projects in favor of open scientific publication and it supported structuring international initiatives. The National Research Agency and other funding agencies now require open access to publications and the drafting of data management plans for the projects they fund. The function of ministerial research data administrator has been created and a network is being deployed in the establishments. About twenty universities and research organizations have adopted an open science policy. Several guides and recommendations for putting open science into practice have been published.

The steps already taken and the evolution of the international context invite us to extend, renew and strengthen our commitments by adopting a second National Plan for Open Science, the effects of which will be deployed until 2024. With this new plan, France is continuing the ambitious trajectory initiated by the law for a digital republic of 2016 and confirmed by the research programming law of 2020, which includes open science in the missions of researchers and teacher-researchers.

This second National Plan extends its scope to source codes resulting from research, it structures actions in favor of the opening or sharing of data through the creation of the Research Data Gouv platform, it multiplies the levers of transformation in order to generalize open science practices and it presents disciplinary and thematic variations. It is firmly in line with a European ambition and proposes, within the framework of the French Presidency of the European Union, to act to take effective account of open science practices in individual and collective research evaluations. It is about initiating a process of sustainable transformation in order to make open science a common and shared practice…”

Increasing transparency through open science badges

“Authors who adopt transparent practices for an article in Conservation Biology are now able to select from 3 open science badges: open data, open materials, and preregistration. Badges appear on published articles as visible recognition and highlight these efforts to the research community. There is an emerging body of literature regarding the influences of badges, for example, an increased number of articles with open data (Kidwell et al. 2016) and increased rate of data sharing (Rowhani-Farid et al. 2018). However, in another study, Rowhani-Farid et al. (2020) found that badges did not “noticeably motivate” researchers to share data. Badges, as far as we know, are the only data-sharing incentive that has been tested empirically (Rowhani-Farid et al. 2017).

Rates of data and code sharing are typically low (Herold 2015; Roche et al. 2015; Archmiller et al. 2020; Culina et al. 2020). Since 2016, we have asked authors of contributed papers, reviews, method papers, practice and policy papers, and research notes to tell us whether they “provided complete machine- and human-readable data and computer code in Supporting Information or on a public archive.” Authors of 31% of these articles published in Conservation Biology said they shared their data or code, and all authors provide human-survey instruments in Supporting Information or via a citation or online link (i.e., shared materials)….”

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results: Across the 27 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 3 [1, 3], min. 0, max. 9, out of a total possible score of 29) in sleep research and chronobiology journals.

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.

Transparency and Open Science at the Journal of Personality – Wright – – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been furthered for increasing the rigor of the published research through openness and transparency. The Journal has long prized and published the type of research with features, like large sample sizes (Fraley & Vazire, 2014), that has fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding) did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices that are encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal with a broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Openness Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these strike a desirable balance between compelling a high degree of openness and transparency while not being overly onerous and a deterrent for authors interested in the Journal as an outlet for their work….”

NISO’s Recommended Practice on Reproducibility Badging and Definitions Now Published | Industry Announcements and Events SSP-L

“The National Information Standards Organization (NISO) today announces the publication of its Recommended Practice, RP-31-2021, Reproducibility Badging and Definitions. Developed by the NISO Taxonomy, Definitions, and Recognition Badging Scheme Working Group, this new Recommended Practice provides a set of recognition standards that can be deployed across scholarly publishing outputs, to easily recognize and reward the sharing of data and methods….”

Badge Detail – Badgr

“This digital credential (Open Badge) recognises the completion of the online course “Open Badges for Open Science” at Foundations Level and certifies that the owner of this credential has attained these learning outcomes: 1. Report on how you understand the concept of Open Badges. Focus on the aims and practical uses of Open Badges. 2. Present/visualise the context of the Open Badges including history and organisations involved. 3. Create a portfolio of application fields and good practice examples of Open Badges which are relevant to you and your work. Designed by the OBERRED Erasmus+ project, the Foundations Level of the MOOC “Open Badges for Open Science” provides researchers, practitioners, educators, students and other stakeholders in the field of Research Data Management (RDM) with skills and knowledge in Open Badges which are relevant for successful engagement in Open Science.”