Guest Post – Are We Providing What Researchers Need in the Transition to Open Science? – The Scholarly Kitchen

“Why — despite live examples of seeing the impact of open research practices and the indication from researchers and the academic community that they want open research practices to be the norm — is there such a disparity between awareness, behavior, and action? How can we close this gap so that behaviors align with aspirations around open science?

Putting all these studies together, the reasons presented for the gap are mixed but include concerns around data misuse; lack of credit for sharing data; and the need for better support in how to make data and research sustainably open. Mandates, particularly funder mandates for this particular sample group, seem to have a limited role in driving authors to practice open research (although that may well change with new mandates for data sharing coming into effect from very large funding bodies such as federal agencies in the US). Comparatively, institutional encouragement had relatively good success. Where applicable, journal requirements to share materials, code, or data, or journal encouragement to facilitate preprint deposition, drove the same or greater degree of success as institutional encouragement….

One conclusion that becomes apparent is that more can be done by publishers and their partners to directly help and facilitate the adoption of open research practices. Encouraging or mandating sharing of objects as part of the manuscript publication process is an effective and efficient way of ensuring that open science practices are followed. Journals have been successful in the past in enforcing data-sharing mandates around the release of protein and nucleic acid sequences, for example, so we know that the right policies and initiatives can bring positive change….”

Preprinting and Data Sharing in a New Normal? | Journal of the ASEAN Federation of Endocrine Societies

“JAFES [Journal of the ASEAN Federation of Endocrine Societies] should carefully consider the details in adopting data sharing as a policy. What will be the form and format of the data to be archived and shared? How will it impact or change a participant’s informed consent? How do these policies relate to the prevailing regulations on Data Privacy? How should data be organized, presented, and framed, to prevent misinterpretation or misanalysis?

Another development is the emergence of preprints, which may become the norm in future publications. Preprints are scientific articles that are already published online despite not having undergone or completed full peer review – a seemingly unusual concept in a research world where peer review is the most critical requirement and standard for scholarly publications. What preprints make up for, despite the lack of peer review, is the swiftness of publication, which may be particularly helpful in the setting of a novel disease or public health emergency, such as the COVID-19 pandemic. Issues such as article quality, ethics, citations, and retractions need to be considered. JAFES will need to weigh the value of preprints as a platform for sharing knowledge, or for sharing of data for that matter. Sharing data and using preprints are two new publication trends. JAFES continues to thoroughly review these strategies to determine how these will be useful for our journal and its readers. We expect more innovations to come. Indeed, the learning continues.”

Further action toward valid science in Law and Human Behavior: Requiring open data, analytic code, and research materials.

“Beginning on March 1, 2023, Law and Human Behavior will raise its standard for data reporting and expand its focus to include analytic code and research materials. Adopting the recommended language from the TOP Guidelines (Nosek et al., 2015b), the journal will publish articles only if the data, analytic code, and research materials are clearly and precisely documented and are fully available to any researcher who wishes to reproduce the results or replicate the procedure.

Accordingly, authors using original data who seek to publish their research in the journal must make the following items publicly available: …


Authors reusing data from public repositories who pursue publication in Law and Human Behavior must provide program code, scripts for statistical packages, and other documentation sufficient to allow an informed researcher to precisely reproduce all published results….”


Evaluating Research Transparency and Openness in Communication Sciences and Disorders Journals | Journal of Speech, Language, and Hearing Research

Abstract:  Purpose:

To improve the credibility, reproducibility, and clinical utility of research findings, many scientific fields are implementing transparent and open research practices. Such open science practices include researchers making their data publicly available and preregistering their hypotheses and analyses. A way to enhance the adoption of open science practices is for journals to encourage or require submitting authors to participate in such practices. Accordingly, the American Speech-Language-Hearing Association’s Journals Program has recently announced their intention to promote open science practices. Here, we quantitatively assess the extent to which several journals in communication sciences and disorders (CSD) encourage or require participation in several open science practices by using the Transparency and Openness Promotion (TOP) Factor metric.



Method:

TOP Factors were assessed for 34 CSD journals, as well as several journals in related fields. TOP Factors measure the level of implementation across 10 open science–related practices (e.g., data transparency, analysis plan preregistration, and replication) for a total possible score of 29 points.



Results:

Collectively, CSD journals had very low TOP Factors (M = 1.4, range: 0–8). The related fields of Psychology (M = 4.0), Rehabilitation (M = 3.2), Linguistics (M = 1.7), and Education (M = 1.6) also had low scores, though Psychology and Rehabilitation had higher scores than CSD.



Conclusions:

CSD journals currently have low levels of encouraging or requiring participation in open science practices, which may impede adoption.
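
As a rough illustration of how a TOP Factor aggregates, the metric can be sketched as a per-journal sum of ordinal rubric scores across the ten practices. The practice names below follow the TOP Guidelines, but the per-journal scores are invented for illustration; real TOP Factors are assigned by coders against the published rubric, not computed from made-up numbers like these.

```python
# Sketch of TOP Factor aggregation: each journal gets an ordinal rubric score
# per open science practice, and the journal's TOP Factor is the sum.
# Practice names follow the TOP Guidelines; the scores below are invented.

PRACTICES = [
    "data_citation", "data_transparency", "code_transparency",
    "materials_transparency", "design_analysis_reporting",
    "study_preregistration", "analysis_preregistration",
    "replication", "registered_reports", "open_science_badges",
]

def top_factor(scores: dict) -> int:
    """Sum the rubric scores across the ten TOP practices (missing -> 0)."""
    return sum(scores.get(p, 0) for p in PRACTICES)

def field_summary(journals: dict) -> tuple:
    """Mean and range of TOP Factors for a set of journals, as the study reports per field."""
    totals = [top_factor(s) for s in journals.values()]
    return (round(sum(totals) / len(totals), 1), min(totals), max(totals))

# Invented example: one journal with modest requirements, one with none.
journals = {
    "Journal A": {"data_transparency": 2, "code_transparency": 1, "replication": 1},
    "Journal B": {},
}
print(field_summary(journals))  # → (2.0, 0, 4): mean, min, max across the field
```

The mean-and-range summary mirrors how the abstract reports each field (e.g., CSD: M = 1.4, range 0–8).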

Reproducibility Policy | Sociological Science

“Starting with submissions received after April 1, 2023, authors of articles relying on statistical or computational methods will be required to deposit replication packages as a condition of publication in Sociological Science. Replication packages must contain both the statistical code and — when legally and ethically possible — the data required to fully reproduce the reported results. With this policy, Sociological Science hopes other high-impact journals in Sociology will follow suit in setting standards for reproducibility of published work….”

Opinion: The Promise and Plight of Open Data | TS Digest | The Scientist

“At the same time, open data allow anyone to reproduce a study’s analyses and validate its findings. Occasionally, readers identify errors in the data or analyses that slipped through the peer-review process. These errors can be handled through published corrections or retractions, depending on their severity. One would expect open data to result in more errors being identified and fixed in published papers. 

But are journals with open-data policies more likely than their traditional counterparts to correct published research with erroneous results? To answer this, we collected information on data policies and article retractions for 199 journals that publish research in the fields of ecology and evolution, and compared retraction rates before and after open-data policies were implemented. 

Surprisingly, we found no detectable link between data-sharing policies and annual rates of article retractions. We also found that the publication of corrections was not affected by requirements to share data, and that these results persisted after accounting for differences in publication rates among journals and time lags between policy implementation and study publication due to the peer-review process. While our analysis was restricted to studies in ecology and evolution, colleagues in psychology and medicine have suggested to us that they expect similar patterns in their fields of study. 

Do these results mean that open-data policies are ineffective? No. There is no doubt that open data promote transparency, but our results suggest that a greater potential for error detection does not necessarily translate into greater error correction. We propose three additional practices, some of which could actually improve open-data practices, to help science self-correct. …”

Clinical Trial Data-sharing Policies Among Journals, Funding Agencies, Foundations, and Other Professional Organizations: A Scoping Review – Journal of Clinical Epidemiology

Abstract:  Objectives

To identify the similarities and differences in data-sharing policies for clinical trial data that are endorsed by biomedical journals, funding agencies, and other professional organizations. Additionally, to determine the beliefs and opinions regarding data-sharing policies for clinical trials discussed in articles published in biomedical journals.


Study Design

Two searches were conducted. The first was a bibliographic search for published articles that present beliefs, opinions, similarities, and differences regarding policies governing the sharing of clinical trial data. The second search analyzed the gray literature (non-peer-reviewed publications) to identify important data-sharing policies in selected biomedical journals, foundations, funding agencies, and other professional organizations.



Results

A total of 471 articles were included after database search and screening, with 45 from the bibliographic search and 426 from the gray literature search. A total of 424 data-sharing policies were included. Fourteen of the 45 published articles from the bibliographic search (31.1%) discussed only advantages specific to data-sharing policies, 27 (27/45; 60%) discussed both advantages and disadvantages, and 4 (4/45; 8.9%) discussed only disadvantages specific to data-sharing policies. A total of 216 journals (of 270; 80%) specified a data-sharing policy provided by the journal itself. One hundred industry data-sharing policies were included, and 32 (32%) referenced a data-sharing policy on their website. One hundred and thirty-six (42%) organizations (of 327) specified a data-sharing policy.



Conclusion

We found many similarities listed as advantages to data sharing, and fewer disadvantages discussed, within the literature. Additionally, we found a wide variety of commonalities and differences — such as the lack of standardization between policies and inadequately addressed details regarding the accessibility of research data — that exist in data-sharing policies endorsed by biomedical journals, funding agencies, and other professional organizations. Our study may not include information on all data-sharing policies, and our data are limited to the entities’ descriptions of each policy.

How society publishers practice open science beyond open access publishing? | PUBMET

Abstract:  Scholarly publishing has rapidly moved towards open access (OA) over the last few decades. However, OA publishing is only one part of a larger open science movement. The recent UNESCO recommendation for open science (UNESCO, 2021) defines open science broadly to cover the openness of scientific knowledge, science infrastructures, engagement with societal actors, and dialogue with other knowledge systems. In its recommendations, open scientific knowledge includes OA to scientific publications but also open research data, metadata, open educational resources, software, and source code and hardware. Earlier research about open scientific knowledge from the point of view of academic publishers has mainly been focused on one element such as OA publishing and neglected other elements.

This paper aims to fill this gap by surveying how society publishers in Finland adopted other elements of open scientific knowledge. In Finland, learned societies account for 70% of national journal output (Late et al., 2020) and their publishing model is mainly diamond open access, which excludes author processing charges and relies on publishing subsidies (Pölönen et al., 2020). Furthermore, their activities often go beyond scholarly publishing to include education and research activities such as funding research and collecting and storing research data (Korkeamäki et al., 2019).

We conducted an electronic survey addressed to Finnish learned societies in November 2021 to answer the following research questions: “To what degree do society publishers take up the elements of open scientific knowledge, including open access to publications, open data, and open education?” (RQ1) and “Are elements of open scientific knowledge cumulative?” (RQ2)

In total 97 society publishers responded (40% response rate). We analysed their responses through nine variables measuring how they adopted different elements of open scholarly knowledge (Table 1) in view of the UNESCO recommendations for open science.

The results show that elements related to open scholarly publications prevail. Almost 70% of respondents publish either gold, green, or hybrid OA publications. Most society publishers reported supporting open data policies, but only some collect, store, and provide open access to research datasets. Furthermore, only a few societies offer training for opening research data. Even so, a high share of publishers offers open education, and some share their educational materials openly. Although earlier studies have reported differences in adopting open science between disciplines (Rousi & Laakso, 2020), our analysis does not support these findings.

However, it has confirmed that adopting the elements of open scholarly knowledge is cumulative: OA publishers are more likely to take up other elements of open scholarly knowledge, although adopting all elements is not yet common. Since the activities (e.g., collecting research data, offering education) of publishers other than societies seem to influence the take-up of these elements, further research on those activities is needed; for example, it would show how often and how openly these other publishers provide research data or education beyond publishing.

C&RL Data Sharing Policy Survey

“College & Research Libraries (C&RL), the official research journal of the Association of College & Research Libraries, is in the process of developing a data sharing policy to encourage authors to share the data and any documentation underlying the results of their research. The C&RL Editorial Board would like to hear from the journal’s authors and others concerning this forthcoming policy.


We are hoping you would be willing to answer a few questions to help inform this effort….”

When open data closes the door: A critical examination of the past, present and the potential future for open data guidelines in journals – Prosser – British Journal of Social Psychology – Wiley Online Library

Abstract:  Opening data promises to improve research rigour and democratize knowledge production. But it also presents practical, theoretical, and ethical considerations for qualitative researchers in particular. Discussion about open data in qualitative social psychology predates the replication crisis. However, the nuances of this ongoing discussion have not been translated into current journal guidelines on open data. In this article, we summarize ongoing debates about open data from qualitative perspectives, and through a content analysis of 261 journals we establish the state of current journal policies for open data in the domain of social psychology. We critically discuss how current common expectations for open data may not be adequate for establishing qualitative rigour, can introduce ethical challenges, and may place those who wish to use qualitative approaches at a disadvantage in peer review and publication processes. We advise that future open data guidelines should aim to reflect the nuance of arguments surrounding data sharing in qualitative research, and move away from a universal “one-size-fits-all” approach to data sharing. This article outlines the past, present, and the potential future of open data guidelines in social-psychological journals. We conclude by offering recommendations for how journals might more inclusively consider the use of open data in qualitative methods, whilst recognizing and allowing space for the diverse perspectives, needs, and contexts of all forms of social-psychological research.


Long-term availability of data associated with articles in PLOS ONE | PLOS ONE

Abstract:  The adoption of journal policies requiring authors to include a Data Availability Statement has helped to increase the availability of research data associated with research articles. However, having a Data Availability Statement is not a guarantee that readers will be able to locate the data; even if provided with an identifier like a uniform resource locator (URL) or a digital object identifier (DOI), the data may become unavailable due to link rot and content drift. To explore the long-term availability of resources including data, code, and other digital research objects associated with papers, this study extracted 8,503 URLs and DOIs from a corpus of nearly 50,000 Data Availability Statements from papers published in PLOS ONE between 2014 and 2016. These URLs and DOIs were used to attempt to retrieve the data through both automated and manual means. Overall, 80% of the resources could be retrieved automatically, compared to much lower retrieval rates of 10–40% found in previous papers that relied on contacting authors to locate data. Because a URL or DOI might be valid but still not point to the resource, a subset of 350 URLs and 350 DOIs were manually tested, with 78% and 98% of resources, respectively, successfully retrieved. Having a DOI and being shared in a repository were both positively associated with availability. Although resources associated with older papers were slightly less likely to be available, this difference was not statistically significant, suggesting that URLs and DOIs may be an effective means for accessing data over time. These findings point to the value of including URLs and DOIs in Data Availability Statements to ensure access to data on a long-term basis.
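
The automated retrieval step the study describes can be sketched as a simple resolvability check: convert each DOI to a doi.org URL, issue an HTTP request, and count successes. This is a minimal sketch under assumed conventions, not the study's actual pipeline; a real audit would also handle retries, rate limiting, and the "content drift" the abstract mentions, where a link resolves but no longer points at the data.

```python
# Minimal link-availability check: DOIs resolve through the doi.org proxy,
# plain URLs are requested as-is; anything returning HTTP 2xx/3xx counts
# as automatically retrievable.
import urllib.request
import urllib.error

def to_url(identifier: str) -> str:
    """DOIs (which start with the '10.' prefix) resolve via doi.org; URLs pass through."""
    if identifier.lower().startswith("10."):
        return "https://doi.org/" + identifier
    return identifier

def is_retrievable(identifier: str, timeout: float = 10.0) -> bool:
    """True if the identifier resolves to a successful HTTP response."""
    try:
        req = urllib.request.Request(
            to_url(identifier), method="HEAD",
            headers={"User-Agent": "link-check-sketch"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.getcode() < 400
    except (urllib.error.URLError, TimeoutError, ValueError):
        return False  # malformed identifier, dead link, or timeout

def availability_rate(identifiers: list) -> float:
    """Fraction of identifiers retrieved automatically (the study's 80% figure)."""
    hits = sum(is_retrievable(i) for i in identifiers)
    return hits / len(identifiers) if identifiers else 0.0
```

Note that a 200 response is necessary but not sufficient, which is why the study supplemented the automated pass with manual checks of a subset of URLs and DOIs.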



Embracing the value of research data: introducing the JCHLA/JABSC Data Sharing Policy | Journal of the Canadian Health Libraries Association / Journal de l’Association des bibliothèques de la santé du Canada

Abstract:  As health sciences researchers have been asked to share their data more frequently due to funder policies, journal requirements, or interest from their peers, health sciences librarians (HSLs) have simultaneously begun to provide support to researchers in this space through training, participating in RDM efforts on research grants, and developing comprehensive data services programs. If supporting researchers’ data sharing efforts is a worthwhile investment for HSLs, it is crucial that we practice data sharing in our own research endeavours. Sharing data is a positive step in the right direction, as it can increase the transparency, reliability, and reusability of HSL-related research outputs. Furthermore, having the ability to identify and connect with researchers in relation to the challenges associated with data sharing can help HSLs empathize with their communities and gain new perspectives on improving support in this area. To that end, the Journal of the Canadian Health Libraries Association / Journal de l’Association des bibliothèques de la santé du Canada (JCHLA / JABSC) has developed a Data Sharing Policy to improve the transparency and reusability of research data underlying the results of its publications. This paper will describe the approach taken to inform and develop this policy.


Facts and Figures for open research data

“Figures and case studies related to accessing and reusing the data produced in the course of scientific production.”

Many researchers say they’ll share data — but don’t

“Most biomedical and health researchers who declare their willingness to share the data behind journal articles do not respond to access requests or hand over the data when asked, a study reports1. …

But of the 1,792 manuscripts for which the authors stated they were willing to share their data, more than 90% of corresponding authors either declined or did not respond to requests for raw data (see ‘Data-sharing behaviour’). Only 14%, or 254, of the contacted authors responded to e-mail requests for data, and a mere 6.7%, or 120 authors, actually handed over the data in a usable format. The study was published in the Journal of Clinical Epidemiology on 29 May….

Puljak’s results square with those of a study that Danchev led, which found low rates of data sharing by authors of papers in leading medical journals that stipulate all clinical trials must share data2. …

Past research suggests that some fields, such as ecology, embrace data sharing more than others. But multiple analyses of COVID-19 clinical trials — including some from Li4,5 and Tan6 — have reported that anywhere from around half to 80% of investigators are unwilling or not planning to share data freely….

To encourage researchers to prepare their data, Li says, journals could make data-sharing statements more prescriptive. They could require authors to detail where they will share raw data, who will be able to access it, when and how.


Funders could also raise the bar for data sharing. The US National Institutes of Health, in an effort to curb wasteful, irreproducible research, will soon mandate that grant applicants include a data-management and sharing plan in their applications. Eventually, they will be required to share data publicly….”
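
The response funnel in the quoted figures can be recomputed directly from the reported counts: 254 responders and 120 usable datasets out of the 1,792 manuscripts whose authors stated they were willing to share.

```python
# Recompute the data-sharing funnel percentages from the article's counts.
willing = 1792        # manuscripts whose authors stated willingness to share
responded = 254       # corresponding authors who answered e-mail requests
shared_usable = 120   # authors who handed over data in a usable format

def pct(n: int, denom: int = willing) -> float:
    """Share of the willing-to-share manuscripts, as a rounded percentage."""
    return round(100 * n / denom, 1)

print(pct(responded))      # → 14.2 (the article's "Only 14%")
print(pct(shared_usable))  # → 6.7 (the article's "a mere 6.7%")
```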

Data Sharing and Reanalyses Among Randomized Clinical Trials Published in Surgical Journals Before and After Adoption of a Data Availability and Reproducibility Policy | Medical Journals and Publishing | JAMA Network Open | JAMA Network

Abstract:  Importance  Clinical trial data sharing holds promise for maximizing the value of clinical research. The International Committee of Medical Journal Editors (ICMJE) adopted a policy promoting data sharing in July 2018.

Objective  To evaluate the association of the ICMJE data sharing policy with data availability and reproducibility of main conclusions among leading surgical journals.

Design, Setting, and Participants  This cross-sectional study, conducted in October 2021, examined randomized clinical trials (RCTs) in 10 leading surgical journals before and after the implementation of the ICMJE data sharing policy in July 2018.

Exposure  Implementation of the ICMJE data sharing policy.

Main Outcomes and Measures  To demonstrate a pre-post increase in data availability from 5% to 25% (α = .05; β = 0.1), 65 RCTs published before and 65 RCTs published after the policy was issued were included, and their data were requested. The primary outcome was data availability (ie, the receipt of sufficient data to enable reanalysis of the primary outcome). When data sharing was available, the primary outcomes reported in the journal articles were reanalyzed to explore reproducibility. The reproducibility features of these studies were detailed.

Results  Data were available for 2 of 65 RCTs (3.1%) published before the ICMJE policy and for 2 of 65 RCTs (3.1%) published after the policy was issued (odds ratio, 1.00; 95% CI, 0.07-14.19; P > .99). A data sharing statement was observed in 11 of 65 RCTs (16.9%) published after the policy vs none before the policy (risk ratio, 2.20; 95% CI, 1.81-2.68; P = .001). Data obtained for reanalysis (n = 4) were not from RCTs published with a data sharing statement. Of the 4 RCTs with available data, all of them had primary outcomes that were fully reproduced. However, discrepancies or inaccuracies that were not associated with study conclusions were identified in 3 RCTs. These concerned the number of patients included in 1 RCT, the management of missing values in another RCT, and discrepant timing for the principal outcome declared in the study registration and reported in the third RCT.

Conclusions and Relevance  This cross-sectional study suggests that data sharing practices are rare in surgical journals despite the ICMJE policy and that most RCTs published in these journals lack transparency. The results of these studies may not be reproducible by external researchers.
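
The reported odds ratio of 1.00 follows directly from the identical 2-of-65 counts before and after the policy. A quick check of the point estimate only (the paper's confidence interval depends on its own interval method, which is not reproduced here):

```python
# Point estimate of the odds ratio from the study's 2x2 table:
# data available for 2 of 65 RCTs both before and after the ICMJE policy.
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """OR for a 2x2 table: (a/b) / (c/d), where b and d are the 'no' counts."""
    return (a / b) / (c / d)

after_yes, after_no = 2, 63    # 2 of 65 RCTs after the policy
before_yes, before_no = 2, 63  # 2 of 65 RCTs before the policy
print(odds_ratio(after_yes, after_no, before_yes, before_no))  # → 1.0
```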