Data sharing: putting Nature’s policy to the test

“Policies for sharing research data promote reproducibility of published results by supporting independent verification of raw data, methods and conclusions (see, for example, go.nature.com/3oinwy4). Confirmation validates the efforts of the original researchers, reassures the scientific community and encourages others to build on the findings (see go.nature.com/3om9ken). Here we recount our experience of accessing data provided by the authors of two prominent Nature papers.

Our investigations, which took 12 people roughly a year, upheld the conclusions of both papers (V. L. Li et al. Nature 606, 785–790 (2022); T. Iram et al. Nature 605, 509–515 (2022)). In each case, we found most of the data online and successfully reproduced most findings after discussion with the authors. When we had difficulty reproducing analyses on the basis of publicly available data and materials alone, the authors provided clarification about data and methods, which resolved most discrepancies.

This positive experience prompted us to generate a checklist to help researchers to facilitate reproducibility of their published findings through sharing of data and statistical methods (see https://osf.io/ps3y9).”

 

Reply to: Recognizing and marshalling the pre-publication error correction potential of open data for more reproducible science | Nature Ecology & Evolution

“In response to our paper, Chen et al.2 highlighted that mandatory open data policies also increase opportunities for detecting and correcting errors pre-publication. We welcome Chen et al.’s comment and acknowledge that we omitted discussing the important, positive impact that mandatory open data policies can have on various pre-publication processes. Our study design and the interpretation of our results were probably influenced by our prior experience of reporting data anomalies and research misconduct to journals, and witnessing first-hand the challenges of post-publication error correction3,4,5. As long-standing advocates of transparency and reproducibility in research, we would celebrate empirical evidence that data sharing mandates increase pre-publication error detection….”

Recognizing and marshalling the pre-publication error correction potential of open data for more reproducible science | Nature Ecology & Evolution

“We enthusiastically applaud Berberi and Roche’s1 effort to evaluate the effects of journals’ mandatory open data policies on the error correction potential of science. Berberi and Roche conclude that at present there is “no evidence that mandatory open data policies increase error correction”. This may come as a surprise and a disappointment to advocates of open science. However, we suggest that by only addressing effects on post-publication error correction, Berberi and Roche miss the crucial dimension of pre-publication error correction potential in their assessment and may therefore substantially underestimate the true merits of mandatory open data policies….”

 

New Guidelines for Presenting Electrochemical Data in All ACS Journals | ACS Measurement Science Au

“Electrochemistry has become a cornerstone in many facets of modern chemistry research. The past few years have witnessed the rapid growth of research areas that employ electrochemical principles and methods, including batteries, supercapacitors, solar cells, fuel cells, electrolyzers, carbon dioxide reduction, nitrogen reduction, and organic electrosynthesis, to just name a few. As such, there has been an expansion in the number of papers reporting electrochemical testing and characterization. Publications reporting electrochemistry-related experiments have become prevalent in many ACS journals including, but not limited to, ACS Applied Materials and Interfaces, ACS Catalysis, ACS Energy Letters, ACS Measurement Science Au, ACS Organic & Inorganic Au, Journal of the American Chemical Society, Organic Letters, The Journal of Organic Chemistry, and The Journal of Physical Chemistry. There have been a variety of guidelines and checklists developed for some of the experimental protocols required for characterizing specific technologies (e.g., rotating ring disk electrochemistry measurements for oxygen reduction electrocatalysis and isotope experiments for nitrogen reduction to ammonia testing). However, no guidelines are available for the presentation of characterization data from general electrochemical measurements. This lack of standardization has resulted in papers being published with insufficient details for readers to reliably replicate the experiments. To outline best practices, we have developed a set of guidelines for reporting electrochemical experimentation and characterization in ACS journals. These guidelines, similar to the existing ACS guidelines for reporting NMR data and X-ray crystallography data for chemical compound and materials characterization, can be found on our ACS Research Data Guidelines website. 
The guidelines for reporting electrochemical data are split into two sections: guidelines for reporting voltammetry and amperometry measurements, and guidelines for reporting bulk electrolysis procedures….”

Is open science a double-edged sword?: data sharing and the changing citation pattern of Chinese economics articles | SpringerLink

Abstract:  Data sharing is an important part of open science (OS), and more and more institutions and journals have been enforcing open data (OD) policies. OD is advocated as a way to increase academic influence and promote scientific discovery and development, but this proposition has not been well substantiated. This study explores the nuanced effects of OD policies on the citation pattern of articles using the case of Chinese economics journals. China Industrial Economics (CIE) is the first and so far only Chinese social science journal to adopt a compulsory OD policy, requiring all published articles to share original data and processing code. We use article-level data and a difference-in-differences (DID) approach to compare the citation performance of articles published in CIE and 36 comparable journals. First, we find that the OD policy quickly increased the number of citations: each article on average received 0.25, 1.19, 0.86, and 0.44 more citations in the first four years after publication, respectively. However, we also find that the citation benefit of the OD policy rapidly decreased over time, and even became negative in the fifth year after publication. In conclusion, this changing citation pattern suggests that an OD policy can be a double-edged sword: it can quickly increase citation performance while simultaneously accelerating the aging of articles.
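The core of the difference-in-differences comparison described in this abstract can be sketched in a few lines. This is an illustrative toy under stated assumptions, not the study's actual code: the journal labels and citation counts below are hypothetical, and the real analysis uses regression with controls rather than raw group means.

```python
# Minimal difference-in-differences (DID) sketch: compare the change in mean
# citations for a treated journal (with an open data mandate) against the
# change for comparison journals over the same window.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """DID effect = (treated post-pre change) - (control post-pre change)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_post) - mean(treated_pre)) - (
        mean(control_post) - mean(control_pre)
    )

# Hypothetical per-article citation counts before/after an OD policy.
cie_pre = [2, 3, 1, 4]    # treated-journal articles before the mandate
cie_post = [4, 5, 3, 6]   # treated-journal articles after the mandate
ctrl_pre = [2, 2, 3, 3]   # comparison-journal articles, same windows
ctrl_post = [3, 2, 4, 3]

effect = did_estimate(cie_pre, cie_post, ctrl_pre, ctrl_post)
print(effect)  # → 1.5 extra citations attributable to the policy in this toy data
```

The subtraction of the control-group change is what lets the design separate the policy's effect from citation trends common to all journals.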

 

Guest Post – Are We Providing What Researchers Need in the Transition to Open Science? – The Scholarly Kitchen

“Why — despite live examples of seeing the impact of open research practices and the indication from researchers and the academic community that they want open research practices to be the norm — is there such a disparity between awareness, behavior, and action? How can we close this gap so that behaviors align with aspirations around open science?

Putting all these studies together, the reasons presented for the gap are mixed but include concerns around data misuse; lack of credit for sharing data; and the need for better support in how to make data and research sustainably open. Mandates, particularly funder mandates for this particular sample group, seem to have a limited role in driving authors to practice open research (although that may well change with new mandates for data sharing coming into effect from very large funding bodies such as federal agencies in the US). Comparatively, institutional encouragement had relatively good success. Where applicable, journal requirements to share materials, code, or data, or journal encouragement to facilitate preprint deposition, drove the same or greater degree of success as institutional encouragement….

One conclusion that becomes apparent is that more can be done by publishers and their partners to directly help and facilitate the adoption of open research practices. Encouraging or mandating sharing of objects as part of the manuscript publication process is an effective and efficient way of ensuring that open science practices are followed. Journals have been successful in the past in enforcing data-sharing mandates around the release of protein and nucleic acid sequences, for example, so we know that the right policies and initiatives can bring positive change….”

Preprinting and Data Sharing in a New Normal? | Journal of the ASEAN Federation of Endocrine Societies

“JAFES [Journal of the ASEAN Federation of Endocrine Societies] should carefully consider the details in adopting data sharing as a policy. What will be the form and format of the data to be archived and shared? How will it impact or change a participant’s informed consent? How do these policies relate to the prevailing regulations on Data Privacy? How should data be organized, presented, and framed to prevent misinterpretation or misanalysis?

Another development is the emergence of preprints, which may become the norm in future publications. Preprints are scientific articles that are already published online despite not having undergone or completed full peer review, a seemingly unusual concept in a research world where peer review is the most critical requirement and standard for scholarly publications. What preprints make up for, despite the lack of peer review, is the swiftness of publication, which may be particularly helpful in the setting of a novel disease or public health emergency, such as the COVID-19 pandemic. Issues such as article quality, ethics, citations and retractions need to be considered. JAFES will need to weigh the value of preprints as a platform for sharing knowledge, or for sharing of data for that matter. Sharing data and using preprints are two new publication trends. JAFES continues to thoroughly review these strategies to determine how they will be useful for our journal and its readers. We expect more innovations to come. Indeed, the learning continues.”

Further action toward valid science in Law and Human Behavior: Requiring open data, analytic code, and research materials.

“Beginning on March 1, 2023, Law and Human Behavior will raise its standard for data reporting and expand its focus to include analytic code and research materials. Adopting the recommended language from the TOP Guidelines (Nosek et al., 2015b), the journal will publish articles only if the data, analytic code, and research materials are clearly and precisely documented and are fully available to any researcher who wishes to reproduce the results or replicate the procedure.

Accordingly, authors using original data who seek to publish their research in the journal must make the following items publicly available: …

 

Authors reusing data from public repositories who pursue publication in Law and Human Behavior must provide program code, scripts for statistical packages, and other documentation sufficient to allow an informed researcher to precisely reproduce all published results….”

 

Evaluating Research Transparency and Openness in Communication Sciences and Disorders Journals | Journal of Speech, Language, and Hearing Research

Abstract:  Purpose:

To improve the credibility, reproducibility, and clinical utility of research findings, many scientific fields are implementing transparent and open research practices. Such open science practices include researchers making their data publicly available and preregistering their hypotheses and analyses. One way to enhance the adoption of open science practices is for journals to encourage or require submitting authors to participate in such practices. Accordingly, the American Speech-Language-Hearing Association’s Journals Program has recently announced its intention to promote open science practices. Here, we quantitatively assess the extent to which several journals in communication sciences and disorders (CSD) encourage or require participation in several open science practices by using the Transparency and Openness Promotion (TOP) Factor metric.

Method:

TOP Factors were assessed for 34 CSD journals, as well as several journals in related fields. TOP Factors measure the level of implementation across 10 open science–related practices (e.g., data transparency, analysis plan preregistration, and replication) for a total possible score of 29 points.

Results:

Collectively, CSD journals had very low TOP Factors (M = 1.4, range: 0–8). The related fields of Psychology (M = 4.0), Rehabilitation (M = 3.2), Linguistics (M = 1.7), and Education (M = 1.6) also had low scores, though Psychology and Rehabilitation had higher scores than CSD.

Conclusion:

CSD journals currently have low levels of encouraging or requiring participation in open science practices, which may impede adoption.

Reproducibility Policy | Sociological Science

“Starting with submissions received after April 1, 2023, authors of articles relying on statistical or computational methods will be required to deposit replication packages as a condition of publication in Sociological Science. Replication packages must contain both the statistical code and — when legally and ethically possible — the data required to fully reproduce the reported results. With this policy, Sociological Science hopes other high-impact journals in Sociology will follow suit in setting standards for reproducibility of published work….”

Opinion: The Promise and Plight of Open Data | TS Digest | The Scientist

“At the same time, open data allow anyone to reproduce a study’s analyses and validate its findings. Occasionally, readers identify errors in the data or analyses that slipped through the peer-review process. These errors can be handled through published corrections or retractions, depending on their severity. One would expect open data to result in more errors being identified and fixed in published papers. 

But are journals with open-data policies more likely than their traditional counterparts to correct published research with erroneous results? To answer this, we collected information on data policies and article retractions for 199 journals that publish research in the fields of ecology and evolution, and compared retraction rates before and after open-data policies were implemented. 

Surprisingly, we found no detectable link between data-sharing policies and annual rates of article retractions. We also found that the publication of corrections was not affected by requirements to share data, and that these results persisted after accounting for differences in publication rates among journals and time lags between policy implementation and study publication due to the peer-review process. While our analysis was restricted to studies in ecology and evolution, colleagues in psychology and medicine have suggested to us that they expect similar patterns in their fields of study. 

Do these results mean that open-data policies are ineffective? No. There is no doubt that open data promote transparency, but our results suggest that a greater potential for error detection does not necessarily translate into greater error correction. We propose three additional practices, some of which could actually improve open-data practices, to help science self-correct. …”

Clinical Trial Data-sharing Policies Among Journals, Funding Agencies, Foundations, and Other Professional Organizations: A Scoping Review – Journal of Clinical Epidemiology

Abstract:  Objectives

To identify the similarities and differences in data-sharing policies for clinical trial data endorsed by biomedical journals, funding agencies, and other professional organizations, and to determine the beliefs and opinions regarding data-sharing policies for clinical trials discussed in articles published in biomedical journals.

Study Design

Two searches were conducted: a bibliographic search for published articles that present beliefs, opinions, similarities, and differences regarding policies governing the sharing of clinical trial data, and a gray literature search (non-peer-reviewed publications) to identify important data-sharing policies of selected biomedical journals, foundations, funding agencies, and other professional organizations.

Results

A total of 471 articles were included after database searching and screening: 45 from the bibliographic search and 426 from the gray literature search. A total of 424 data-sharing policies were included. Fourteen of the 45 published articles from the bibliographic search (31.1%) discussed only advantages specific to data-sharing policies, 27 (60%) discussed both advantages and disadvantages, and 4 (8.9%) discussed only disadvantages. A total of 216 journals (of 270; 80%) specified a data-sharing policy provided by the journal itself. One hundred industry data-sharing policies were included, and 32 (32%) referenced a data-sharing policy on their website. One hundred and thirty-six of 327 organizations (42%) specified a data-sharing policy.

Conclusion

We found many similarities listed as advantages to data sharing, and fewer disadvantages were discussed within the literature. Additionally, we found a wide variety of commonalities and differences in data-sharing policies endorsed by biomedical journals, funding agencies, and other professional organizations, such as the lack of standardization between policies and inadequately addressed details regarding the accessibility of research data. Our study may not include information on all data-sharing policies, and our data are limited to the entities’ descriptions of each policy.

How society publishers practice open science beyond open access publishing? | PUBMET

Abstract:  Scholarly publishing has rapidly moved towards open access (OA) over the last few decades. However, OA publishing is only one part of a larger open science movement. The recent UNESCO recommendation on open science (UNESCO, 2021) defines open science broadly to cover the openness of scientific knowledge, science infrastructures, engagement with societal actors, and dialogue with other knowledge systems. In the recommendation, open scientific knowledge includes OA to scientific publications but also open research data, metadata, open educational resources, software, source code, and hardware. Earlier research on open scientific knowledge from the point of view of academic publishers has mainly focused on a single element, such as OA publishing, and neglected the others.

This paper aims to fill this gap by surveying how society publishers in Finland have adopted other elements of open scientific knowledge. In Finland, learned societies account for 70% of national journal output (Late et al., 2020), and their publishing model is mainly diamond open access, which excludes article processing charges and relies on publishing subsidies (Pölönen et al., 2020). Furthermore, their activities often go beyond scholarly publishing to include education and research activities such as funding research and collecting and storing research data (Korkeamäki et al., 2019).

We conducted an electronic survey addressed to Finnish learned societies in November 2021 to answer the following research questions: “To what degree society publishers take up the elements of open scientific knowledge including open access to publications, open data, and open education?” (RQ1) and “Are elements of open scientific knowledge cumulative?” (RQ2)

In total 97 society publishers responded (40% response rate). We analysed their responses through nine variables measuring how they adopted different elements of open scholarly knowledge (Table 1) in view of the UNESCO recommendations for open science.

The results show that elements related to open scholarly publications prevail. Almost 70% of respondents publish either gold, green, or hybrid OA publications. Most society publishers reported supporting open data policies, but only some collect, store, and provide open access to research datasets. Furthermore, only a few societies offer training for opening research data. Even so, a high share of publishers offers open education, and some share their educational materials openly. Although earlier studies have reported differences in adopting open science between disciplines (Rousi & Laakso, 2020), our analysis does not support these findings.

It does confirm, however, that adopting the elements of open scholarly knowledge is cumulative: OA publishers are more likely to take up other elements of open scholarly knowledge, although adopting all elements is not yet common. Since the activities of publishers other than societies (e.g. collecting research data, offering education) seem to influence the take-up of these elements, further research on their activities is needed. For example, it would show how often and how openly these other publishers provide research data or education beyond publishing.

C&RL Data Sharing Policy Survey

“College & Research Libraries (C&RL), the official research journal of the Association of College & Research Libraries, is in the process of developing a data sharing policy to encourage authors to share the data and any documentation underlying the results of their research. The C&RL Editorial Board would like to hear from the journal’s authors and others concerning this forthcoming policy.

 

We are hoping you would be willing to answer a few questions to help inform this effort….”