What’s in a Badge? A Computational Reproducibility Investigation of the Open Data Badge Policy in One Issue of Psychological Science – Sophia Crüwell, Deborah Apthorp, Bradley J. Baker, Lincoln Colling, Malte Elson, Sandra J. Geiger, Sebastian Lobentanzer, Jean Monéger, Alex Patterson, D. Samuel Schwarzkopf, Mirela Zaneva, Nicholas J. L. Brown, 2023

Abstract:  In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 articles provided at least some data and six provided analysis code, but only one article was rated to be exactly reproducible, and three were rated as essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the disclosure method of awarding badges.

 

About Meta-Psychology

“Meta-Psychology publishes theoretical and empirical contributions that advance psychology as a science through critical discourse related to individual articles, research lines, research areas, or psychological science as a field. Important contributions include systematic reviews, meta-analyses, replicability reports, and replication studies. We encourage pre-registered studies and registered reports (i.e., peer-review on the basis of theory, methods, and planned data-analysis, before data has been collected). Manuscripts introducing novel methods are welcome, but also tutorials on established methods that are still poorly understood by psychology researchers. We further welcome papers introducing statistical packages or other software useful for psychology researchers….”

 

Scholastica announces integration with Altmetric Badges for its OA Publishing Platform

“Scholastica, a leading software solutions provider for academic journals, announced today that its open access publishing platform now includes an Altmetric Badge integration option to help journals, their authors, and readers track alternative impact indicators for articles.

Journals subscribed to Scholastica’s open access publishing platform with a paid Altmetric account can enable the new integration to have Altmetric Badges automatically displayed on the public metrics page for all the articles they publish via Scholastica. Each Altmetric Badge links to an Altmetric details page that features a breakdown of online attention received by the article….”

Open Science Badges at Taylor & Francis – Editor Resources

“Open Science Badges (OSB) were designed by the Center for Open Science to acknowledge and encourage open science practices. They are offered as incentives for researchers to share data, materials, or to preregister their research. The badges are a visual signal for readers, indicating that the content of the study is available in perpetuity….”

Investigating the Effectiveness of the Open Data Badge Policy at Psychological Science Through Computational Reproducibility

Abstract:  In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its stated aim at Psychological Science: ensuring reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all articles provided at least some data, 6/14 articles provided analysis code or scripts, only 1/14 articles was rated to be exactly reproducible, and 3/14 essentially reproducible with minor deviations. We recommend that Psychological Science require a check of reproducibility at the peer review stage before awarding badges, and that the Open Data badge be renamed “Open Data and Code” to avoid confusion and encourage researchers to adhere to this higher standard.

 

An open science argument against closed metrics

“In the Open Scientist Handbook, I argue that open science supports anti-rivalrous science collaborations where most metrics are of little, or of negative value. I would like to share some of these arguments here….

Institutional prestige is a profound drag on the potential for networked science. If your administration has a plan to “win” the college ratings game, this plan will only make doing science harder. It makes being a scientist less rewarding. Playing finite games of chasing arbitrary metrics or ‘prestige’ drags scientists away from the infinite play of actually doing science….

As Cameron Neylon said at the metrics breakout of the ‘Beyond the PDF’ conference some years ago, “reuse is THE metric.” Reuse reveals and confirms the advantage that open sharing has over current, market-based, practices. Reuse validates the work of the scientist who contributed to the research ecosystem. Reuse captures more of the inherent value of the original discovery and accelerates knowledge growth….”

Open Badges: Meaningful Credential for Continuing Education in Libraries? | ZBW MediaTalk

“Why can Open Badges be a suitable way of promoting informally acquired knowledge and self-directed learning? And to what extent do they make an important contribution to validating skills gained in the context of lifelong learning? In this post, interview guest Meik Schild-Steiniger explains why he sees the topic as being a possible answer to the paradigm shift taking place in the culture of learning….”

Incentivising research data sharing: a scoping review

Abstract:  Background: Numerous mechanisms exist to incentivise researchers to share their data. This scoping review aims to identify and summarise evidence of the efficacy of different interventions to promote open data practices and provide an overview of current research.

Methods: This scoping review is based on data identified from Web of Science and LISTA, limited to the period 2016 to 2021. A total of 1128 papers were screened, and 38 items were included. Items were selected if they focused on designing or evaluating an intervention, or on presenting an initiative, to incentivise sharing. The items comprised a mixture of research papers, opinion pieces and descriptive articles.

Results: Seven major themes in the literature were identified: publisher/journal data sharing policies, metrics, software solutions, research data sharing agreements in general, open science ‘badges’, funder mandates, and initiatives.

Conclusions: A number of key messages for data sharing include: the need to build on existing cultures and practices, meeting people where they are and tailoring interventions to support them; the importance of publicising and explaining the policy/service widely; the need to have disciplinary data champions to model good practice and drive cultural change; the requirement to resource interventions properly; and the imperative to provide robust technical infrastructure and protocols, such as labelling of data sets, use of DOIs, data standards and use of data repositories.

A non-traditional open-source solution for altmetrics | Emerald Insight

Abstract:  Purpose

Altmetric has the potential to highlight scholarly content by measuring online interactions well before traditional metrics accumulate. This paper aims to be a single point of access for librarians, scientists, information specialists, researchers and other scholars learning to embed Altmetric's open-source embeddable badge in their websites to showcase article-level altmetrics. Libraries can take advantage of this free and innovative tool by incorporating it into their own websites or digital repositories.

Design/methodology/approach

This paper elucidates the steps for embedding Altmetric institutional repository badges in personal websites or institutional repositories (see the minimal embed sketch after this abstract).

Findings

This open-source Altmetric tool tracks a range of sources to capture scholarly activity, and it assists in monitoring and reporting the attention surrounding an author’s work in a timely manner.

Originality/value

This tool is freely available to libraries worldwide.
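
For readers who want to try this, here is a minimal sketch of such a badge embed, written as a TypeScript/DOM snippet. It assumes Altmetric's publicly documented embed pattern (the embed.js script plus a div carrying the "altmetric-embed" class and data-* attributes); the script URL and attribute names follow Altmetric's public embed documentation, and the DOI is a placeholder, not one tied to any article mentioned above.

// Minimal sketch of an Altmetric badge embed (assumptions: the public
// embed.js URL and the "altmetric-embed" class/data-* attributes, per
// Altmetric's embed documentation; the DOI argument is a placeholder).
function addAltmetricBadge(doi: string, container: HTMLElement): void {
  // Create the badge placeholder first: when embed.js loads, it scans
  // the page for elements with class "altmetric-embed" and renders a
  // badge for the DOI given in data-doi.
  const badge = document.createElement("div");
  badge.className = "altmetric-embed";
  badge.setAttribute("data-badge-type", "donut"); // "donut" is one documented badge style
  badge.setAttribute("data-doi", doi);
  container.appendChild(badge);

  // Load the embed script once per page; it renders all placeholders
  // present at the time it runs.
  if (!document.querySelector('script[src$="embed.js"]')) {
    const script = document.createElement("script");
    script.src = "https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js";
    document.head.appendChild(script);
  }
}

// Usage, e.g. in a repository page template:
// addAltmetricBadge("10.1000/xyz123", document.getElementById("metrics")!);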

Open science practices for eating disorders research

Abstract: This editorial seeks to encourage the increased application of three open science practices in eating disorders research: Preregistration, Registered Reports, and the sharing of materials, data, and code. For each of these practices, we introduce updated International Journal of Eating Disorders author and reviewer guidance. Updates include the introduction of open science badges; specific instructions about how to improve transparency; and the introduction of Registered Reports of systematic or meta-analytical reviews. The editorial also seeks to encourage the study of open science practices. Open science practices pose considerable time and other resource burdens. Therefore, research is needed to help determine the value of these added burdens and to identify efficient strategies for implementing open science practices.

Do Open Science Badges Increase Trust in Scientists among Undergraduates, Scientists, and the Public?

Abstract:  Open science badges are a promising method to signal a study’s adherence to open science practices (OSP). In three experimental studies, we investigated whether badges affect trust in scientists by undergraduates (N = 270), scientists (N = 250), or the public (N = 257). Furthermore, we analyzed the moderating role of epistemic beliefs in this regard. Participants were randomly assigned to two of three conditions: Badges awarded (visible compliance to OSP), badges not awarded (visible noncompliance to OSP), and no badges (control). In all samples, our Bayesian analyses indicated that badges influence trust as expected with one exception in the public sample: an additional positive effect of awarded badges compared to no badges was not supported here. Further, we found evidence for the absence of a moderation by epistemic beliefs. Our results demonstrate that badges are an effective means to foster trust in scientists among target audiences of scientific papers.

Ouvrir la Science – Deuxième Plan national pour la science ouverte

From Google’s English: “The National Open Science Plan announced in 2018 by the Minister of Higher Education, Research and Innovation, Frédérique Vidal, enabled France to adopt a coherent and dynamic open science policy, coordinated by the Committee for Open Science, which brings together the ministry, research and higher education institutions, and the scientific community. After three years of implementation, the progress is notable. The share of French scientific publications in open access rose from 41% to 56%. The National Open Science Fund was created; it launched two calls for projects in favor of open scientific publication and supported structuring international initiatives. The National Research Agency and other funding agencies now require open access to publications and the drafting of data management plans for the projects they fund. The post of ministerial research data administrator has been created, and a network is being deployed in the institutions. About twenty universities and research organizations have adopted an open science policy, and several guides and recommendations for putting open science into practice have been published.

The steps already taken and the evolution of the international context invite us to extend, renew and strengthen our commitments by adopting a second National Plan for Open Science, whose effects will be deployed through 2024. With this new plan, France continues the ambitious trajectory initiated by the Law for a Digital Republic of 2016 and confirmed by the Research Programming Law of 2020, which includes open science among the missions of researchers and teacher-researchers.

This second National Plan extends its scope to source code resulting from research; it structures actions in favor of the opening or sharing of data through the creation of the Recherche Data Gouv platform; it multiplies the levers of transformation in order to generalize open science practices; and it presents disciplinary and thematic variations. It is firmly in line with a European ambition and proposes, within the framework of the French Presidency of the European Union, to act for the effective recognition of open science practices in individual and collective research evaluations. The aim is to initiate a process of sustainable transformation in order to make open science a common and shared practice…”