Open Metrics Require Open Infrastructure

“Today, Zenodo announced their intentions to remove the altmetrics.com badges from their landing pages – and we couldn’t be more energized by their commitment to open infrastructure, supporting their mission to make scientific information open and free.

“We strongly believe that metadata about records including citation data & other data used for computing metrics should be freely available without barriers” – Zenodo Leadership….

In light of emerging needs for metrics and our work at Make Data Count (MDC) to build open infrastructure for data metrics, we believe that it is necessary for corporations or entities that provide analytics and researcher tools to share the raw data sources behind their work. In short, if we trust these metrics enough to display on our websites or add to our CVs, then we should also demand that they be available for us to audit….

These principles are core to our mission to build the infrastructure for open data metrics. As emphasis shifts in scholarly communication toward “other research outputs” beyond the journal article, we believe it is important to build intentionally open infrastructure, not repeating the mistakes made in the metrics systems developed for articles. We know that it is possible for the community to come together and develop the future of open metrics in a non-prescriptive manner, importantly built on completely open and reproducible infrastructure.”

Open Science Practices at the Journal of Traumatic Stress – Kerig – 2020 – Journal of Traumatic Stress – Wiley Online Library

Abstract:  This editorial describes new initiatives designed to promote and maintain open science practices (OSP) at the Journal of Traumatic Stress, to be enacted beginning January 2020. Following a brief description of the rationale underlying the argument for conducting and reporting research in ways that maximize transparency and replicability, this article summarizes changes in Journal submission and publication procedures that are designed to foster and highlight such practices. These include requesting an Open Science Practices Statement from authors of all accepted manuscripts, which will be published as supplementary material for each article, and providing authors with the opportunity to earn OSP badges for preregistering studies, making data available to other researchers by posting on a third party archive, and making available research materials and codes used in the study.


SSHOC WEBINAR: How to improve the quality of your repository? SSHOC and certification of repositories | DARIAH

“Certification is a sign of trust that benefits a data repository in many ways. How can your repository achieve certification? The SSHOC webinar will focus on the certification of digital repositories and how your repository can apply for the CoreTrustSeal. The webinar will also touch upon how SSHOC can support repositories seeking certification.

CoreTrustSeal is a community-driven certification framework with over 80 past certifications. The certification consists of sixteen requirements for which applicants are asked to provide self-assessment statements along with relevant evidence. CoreTrustSeal certification is sufficiently stringent for data repositories within the social sciences and humanities, but significantly less costly and labour-intensive than a formal audit against ISO/DIN standards. Certification requirements for the CoreTrustSeal are also reviewed every three years, compared with every five years for ISO/DIN standards. CoreTrustSeal is open to feedback and continuously considering the widest possible range of certification candidates….”

Did awarding badges increase data sharing in BMJ Open? A randomized controlled trial | Royal Society Open Science

Abstract:  Sharing data and code are important components of reproducible research. Data sharing in research is widely discussed in the literature; however, there are no well-established evidence-based incentives that reward data sharing, nor randomized studies that demonstrate the effectiveness of data sharing policies at increasing data sharing. A simple incentive, such as an Open Data Badge, might provide the change needed to increase data sharing in health and medical research. This study was a parallel group randomized controlled trial (protocol registration: doi:10.17605/OSF.IO/PXWZQ) with two groups, control and intervention, with 80 research articles published in BMJ Open per group, with a total of 160 research articles. The intervention group received an email offer for an Open Data Badge if they shared their data along with their final publication and the control group received an email with no offer of a badge if they shared their data with their final publication. The primary outcome was the data sharing rate. Badges did not noticeably motivate researchers who published in BMJ Open to share their data; the odds of awarding badges were nearly equal in the intervention and control groups (odds ratio = 0.9, 95% CI [0.1, 9.0]). Data sharing rates were low in both groups, with just two datasets shared in each of the intervention and control groups. The global movement towards open science has made significant gains with the development of numerous data sharing policies and tools. What remains to be established is an effective incentive that motivates researchers to take up such tools to share their data.
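For readers who want to check the arithmetic behind the reported effect, here is a minimal sketch computing an odds ratio and a Wald 95% confidence interval from the 2×2 counts the abstract gives (two sharing articles out of 80 per group). It is illustrative only; the paper’s reported OR of 0.9 with CI [0.1, 9.0] presumably reflects a different (e.g. exact or adjusted) estimator.

```python
import math

# Counts taken from the abstract: 2 of 80 articles shared data in each group.
a, b = 2, 78   # intervention: shared, not shared
c, d = 2, 78   # control: shared, not shared

odds_ratio = (a / b) / (c / d)

# Wald 95% confidence interval, computed on the log-odds scale
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI [{lower:.2f}, {upper:.2f}]")
# -> OR = 1.00, 95% CI [0.14, 7.28]
```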


Supporting journal publishing practices in the global south | Research Information

“Journals in the developing world face challenges in becoming known and respected in the international research landscape. Siân Harris describes Journal Publishing Practices and Standards, established and managed by African Journals Online and INASP.”

OSF | Badges to Acknowledge Open Practices Wiki

“A “PA” (Protected Access) notation may be added to open data badges if sensitive, personal data are available only from an approved third party repository that manages access to data to qualified researchers through a documented process. To be eligible for an open data badge with such a notation, the repository must publicly describe the steps necessary to obtain the data and detailed data documentation (e.g. variable names and allowed values) must be made available publicly. This notation is not available to researchers who state that they will make “data available upon request” and is not available if requests for data sharing are evaluated on any criteria beyond considerations for compliance with proper handling of sensitive data. For example, this notation is not available if limitations are placed on the permitted use of the data, such as for data that are only made available for the purposes of replicating previously published results or for which there is substantive review of analytical results. Review of results to avoid disclosure of confidential information is permissible….”

Open Science Comes To Policy Analysis – CEGA – Medium

“This post is co-authored by Fernando Hoces de la Guardia, BITSS postdoctoral scholar, along with Sean Grant (Associate Behavioral and Social Scientist at RAND) and CEGA Faculty Director Ted Miguel. It is cross-posted with the BITSS Blog.

The Royal Society’s motto, “Take nobody’s word for it,” reflects a key principle of scientific inquiry: as researchers, we aspire to discuss ideas in the open, to examine our analyses critically, to learn from our mistakes, and to constantly improve. This type of thinking shouldn’t guide only the creation of rigorous evidence; rather, it should extend to the work of policy analysts whose findings may affect very large numbers of people. At the end of the day, a commitment to scientific rigor in public policy analysis is the only durable response to potential attacks on credibility. We, the three authors of this blog (Fernando Hoces de la Guardia, Sean Grant, and Ted Miguel), recently published a working paper suggesting a parallel between the reproducibility crisis in social science and observed threats to the credibility of public policy analysis. Researchers and policy analysts both perform empirical analyses; have a large amount of undisclosed flexibility when collecting, analyzing, and reporting data; and may face strong incentives to obtain “desired” results (for example, p-values of <0.05 in research, or large negative/positive effects in policy analysis)….”

APPRAISE (A Post-Publication Review and Assessment In Science Experiment) | ASAPbio

“I describe here a new project – called Appraise – that is both a model and experimental platform for what peer review can and should look like in a world without journals….

The rise of preprints gives us the perfect opportunity to create a new system that takes full advantage of the Internet to more rapidly, effectively and fairly engage the scientific community in assessing the validity, audience and impact of published works….

APPRAISE (A Post-Publication Review and Assessment In Science Experiment)…

It is perhaps easiest to think of Appraise as an editorial board without a journal (and we hope to be a model for how existing editorial boards can transition away from journals). Like journal editorial boards, they will curate the scientific literature through the critical process of peer review. However, members of Appraise will not be reviewing papers submitted to a journal and deciding whether they should be published. Rather, Appraise reviewers are working in service of members of the scientific community, selecting papers they think warrant scrutiny and attention, and reviewing them to help others find, understand and assess published papers….

In the spirit of openness we encourage Appraise members to identify themselves, but recognize that the ability to speak freely sometimes requires anonymity. Appraise will allow members to post reviews anonymously provided that there are no conflicts of interest and the reviewer does not use anonymity as a shield for inappropriate behavior. Whether reviewers are publicly identified or not, Appraise will never tolerate personal attacks of any kind.

We are launching Appraise with a small group of scientists. This is for purely practical purposes – to develop our systems and practices without the challenges of managing a large, open community. But the goal is to open the platform up to everyone as quickly as possible.”

APA releases new journal article reporting standards

“Brian Nosek, PhD, co-founder and director of the Center for Open Science, welcomed the new standards. “Achieving the ideals of transparency in science requires knowing what one needs to be transparent about,” he said. “These updated standards will improve readers’ understanding of what happened in the research. This will improve both the accuracy of interpretation of the existing evidence, and the ability to replicate and extend the findings to improve understanding.” APA has partnered with the Center for Open Science to advance open science practices in psychological research through open science badges on articles, a data repository for APA published articles and designating the COS’ PsyArXiv as the preferred preprint server for APA titles….”

Emory Libraries Blog | Put a badge on it: incentives for data sharing and reproducibility

“How do you encourage researchers to share the data underlying their publications? The journal Psychological Science introduced a digital badge system in 2014 to signify when authors make the data and related materials accompanying their articles openly available. Criteria to earn the Open Data badge include (1) sharing data via a publicly accessible repository with a persistent identifier, such as a DOI, (2) assigning an open license, such as CC-BY or CC0, allowing reuse and credit to the data producer, and (3) providing enough documentation that another researcher could reproduce the reported results (Badges to Acknowledge Open Practices project on the Open Science Framework)….”

OSF | Badges to Acknowledge Open Practices

“There is no central authority determining the validity of scientific claims. Accumulation of scientific knowledge proceeds via open communication with the community. Sharing evidence for scientific claims facilitates critique, extension, and application. Despite the importance of open communication for scientific progress, present norms do not provide strong incentives for individual researchers to share data, materials, or their research process. Journals can provide such incentives by acknowledging open practices with badges in publications….”

Open Science Badges

“What are Open Science Badges?

Badges to acknowledge open science practices are incentives for researchers to share data or materials, or to preregister.

Badges signal to the reader that the content has been made available and certify its accessibility in a persistent location….


Badges seem silly. Do they work?

Yes. Implementing these badges dramatically increases the rate of data sharing (Kidwell et al., 2016).

A recent systematic review identified this badging program as the only evidence-based incentive program that is effective at increasing rates of data sharing (Rowhani-Farid et al., 2017).

View a list of journals and organizations that have adopted badges here….”

CREDIT reflects Complete Workflow

“CREDIT is a cloud-enabled SaaS tool for data management that gives authors an opportunity to register their Additional Research Outputs (AROs), reflecting RAW, REPEAT & NULL/NEGATIVE entities generated at various stages of the research workflow, to ensure their reusability and to gain credit, thereby contributing to enriching research articles and reproducible science. The CREDIT framework and interface are developed on FAIR data principles…. The appearance of these badges happens dynamically, which creates the possibility that the metrics around the data, generated when readers engage with it, would be fed back to the main published article in real time (accessible via the badge, enhancing discoverability and also giving credit to authors). And in the near future we also plan to roll out badges that can be embedded in PDF articles….”
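As an illustration of the dynamic-badge idea described above, here is a minimal sketch of a badge endpoint that returns the current metrics for a registered output each time the badge is rendered. Everything in it (the route, the field names, the in-memory metrics store) is a hypothetical assumption; CREDIT’s actual API is not documented in the excerpt.

```python
# Hypothetical sketch of a dynamically updated metrics badge: the article
# page embeds a badge URL, and the server returns up-to-date metrics for the
# registered output every time the badge is rendered. None of the names
# below come from CREDIT's (undocumented) API.
from flask import Flask, jsonify

app = Flask(__name__)

# Placeholder store keyed by DOI; a real service would query live usage data.
METRICS = {
    "10.1234/example-aro": {"views": 1520, "downloads": 87, "citations": 3},
}

@app.route("/badge/<path:doi>")
def badge(doi):
    # The <path:doi> converter accepts slashes, which DOIs contain.
    m = METRICS.get(doi)
    if m is None:
        return jsonify({"error": "unknown output"}), 404
    # JSON in shields.io's "endpoint" format, so a badge image rendered from
    # this URL always shows current numbers.
    return jsonify({
        "schemaVersion": 1,
        "label": "ARO metrics",
        "message": f"{m['views']} views | {m['downloads']} downloads",
    })

if __name__ == "__main__":
    app.run()
```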

Open Humans

“Open Humans is a program of the nonprofit Open Humans Foundation and has been funded by the Robert Wood Johnson Foundation and the Knight Foundation. Our 2015 launch was written up in Forbes, Newsweek, Scientific American, and more.

You decide when to share. You have valuable data, and you’ll decide when to share it. The data you provide will be private by default. You can choose which projects to share with. You can also opt to make some (or all) of your data public, so anyone can access and research it!

Studies, projects, and more. Browse our activities list to see the many potential data sources you can add, and interesting projects you can join.

Be a part of research. We’ll recognize your contributions with badges on your profile page, invite you to talk to other community members in our online forums, and periodically post new activities, study updates, and relevant interviews in our newsletters and on our blog….”

PsyArXiv Preprints | Suggestions to Advance Your Mission: An Open Letter to Dr. Shinobu Kitayama, Editor of JPSP:ASC

An open letter to the new editor-in-chief of Journal of Personality and Social Psychology: Attitudes and Social Cognition, urging the adoption of best practices for data sharing, reproducibility, and open science.