Cross-Sectional Evaluation of Open Science Practices at Imaging Journals: A Meta-Research Study – Mohammed Kashif Al-Ghita, Kelly Cobey, David Moher, Mariska M.G. Leeflang, Sanam Ebrahimzadeh, Eric Lam, Paul Rooprai, Ahmed Al Khalil, Nabil Islam, Hamza Algodi, Haben Dawit, Robert Adamo, Mahdi Zeghal, Matthew D.F. McInnes, 2023

Abstract:  Objective: To evaluate the open science policies of imaging journals, and compliance with these policies in published articles. Methods: From the included imaging journals we extracted open science policy details: protocol registration, reporting guidelines, funding, ethics and conflicts of interest (COI), data sharing, and open access publishing. The 10 most recently published studies from each journal were assessed to determine adherence to these policies. We summarized the proportion of open science practices adopted into an Open Science Score (OSS) for all journals and articles, and evaluated relationships between the OSS and journal/article-level variables. Results: 82 journals/820 articles were included. The OSS of journals and articles was 58.3% and 31.8%, respectively. Of the journals, 65.9% had protocol registration policies and 78.1% had reporting guideline policies. 79.3% of journals were members of COPE, 81.7% had plagiarism policies, 100% required disclosure of funding, and 97.6% required disclosure of COI and ethics approval. 81.7% had data sharing policies and 15.9% were fully open access. 7.8% of articles had a registered protocol, 8.4% followed a reporting guideline, 77.4% disclosed funding, 88.7% disclosed COI, and 85.6% reported ethics approval. 12.3% of articles shared their data. 51% of articles were available through open access or as a preprint. The OSS was higher for journals with DOAJ membership (80% vs 54.2%; P < .0001). Impact factor was not correlated with journal OSS. Knowledge synthesis articles had higher OSS scores (44.5%) than prospective (32.6%) or retrospective (30.0%) studies (P < .0001). Conclusion: Imaging journals endorsed just over half of the open science practices considered; however, uptake of these practices at the article level was lower.
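The abstract describes the OSS as the proportion of open science practices a journal or article satisfies. Below is a minimal sketch of that kind of proportion-based score, assuming a simple checklist representation; the practice list and field names are illustrative, not the authors' exact scoring instrument.

```python
# Illustrative sketch: an Open Science Score (OSS) as the proportion of
# tracked open science practices that a journal or article satisfies.
# The practice names below are assumptions based on the abstract, not
# the authors' exact instrument.

PRACTICES = [
    "protocol_registration",
    "reporting_guidelines",
    "funding_disclosure",
    "coi_disclosure",
    "ethics_approval",
    "data_sharing",
    "open_access",
]

def open_science_score(record: dict) -> float:
    """Return the fraction of tracked practices present in a record."""
    met = sum(bool(record.get(p, False)) for p in PRACTICES)
    return met / len(PRACTICES)

# Example: an article that registered a protocol, disclosed funding and
# COI, and reported ethics approval scores 4/7, roughly 57.1%.
article = {
    "protocol_registration": True,
    "funding_disclosure": True,
    "coi_disclosure": True,
    "ethics_approval": True,
}
print(f"OSS = {open_science_score(article):.1%}")
```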

 

Results of PLOS experiments to increase sharing and discovery of research data – The Official PLOS Blog

“For PLOS, increasing data-sharing rates—and especially increasing the amount of data shared in a repository—is a high priority. 

Research data is a vital part of the scientific record, essential to both understanding and reproducing published research. And data repositories are the most effective and impactful way to share research data. Not only is deposited data safer and more discoverable, but articles with data in a repository also have a 25% higher citation rate on average.

With support from the Wellcome Trust, we’ve been experimenting with two solutions designed to increase awareness about data repositories and promote data repository use among both authors and readers. One solution didn’t achieve its expected outcome in the context in which we tested it (a “negative” result), while the other shows promise as a tool for increasing engagement with deposited data. The mixed outcomes are an example of why it’s so important to share all research results regardless of their outcome, whether “positive” or “negative”. We hope that our experiences, what we’ve learned, and above all the data and results can help the scholarly communications community to develop new and better solutions to meet the challenges we all face, and advance Open Science.

Read on for a quick summary of the studies we conducted. Or get the full details from our new preprint on Figshare, and explore the data for yourself….”

Incentivising best practice in research data sharing: Experiments to increase use of and engagement with data repositories

Abstract:  Improving the uptake of repositories to share research data is an aim of many publishers, funders and infrastructure providers. Even at the publisher PLOS, which has a mandatory data sharing policy, repositories are still used less commonly than Supporting Information to share data. This preprint presents the results of two experiments that tested solutions aimed at increasing the use of repositories for data sharing and increasing engagement with shared data. The experiments—integration of the Dryad repository into the manuscript submission system at PLOS Pathogens, and implementation of an Accessible Data icon to signal data shared in a repository on published articles across the PLOS journal portfolio—were designed as interventions requiring minimal extra effort from authors (researchers). We collected usage data on these solutions as well as survey (n=654 and n=4,898) and interview (n=12) data from submitting authors. The results show that author uptake of the integrated repository (used by ~2% of submissions) was lower than expected, in part due to lack of awareness despite the various communication methods used. Integration of data repositories into the journal submission process, in the context in which we tested it, may not increase use of repositories without additional visibility or policy incentives. Our survey results suggest the Accessible Data icon did have some effect on author behaviour, although not in the expected way: it influenced repository choice for authors who had already planned to use a repository rather than influencing the choice of sharing method. Furthermore, the Accessible Data icon was successful in increasing engagement with shared data, as measured by average monthly views of datasets linked to a cohort of 543 published articles that displayed it, which rose from 2.5 to 3.0 (an increase of 20%) between the 12-month periods on either side of the icon's introduction. The results of these two experiments provide valuable insights to publishers and other stakeholders about strategies for increasing the use of repositories for sharing research data.
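As a quick check on the headline number, the 20% figure is the relative change in average monthly dataset views between the two 12-month windows: (3.0 - 2.5) / 2.5 = 0.20, i.e. a 20% increase.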

 

Use of plain language summaries in anaesthesia journals – Keane – Anaesthesia – Wiley Online Library

“Plain language summaries (also described as lay summaries) aim to make scientific research more accessible to a wider audience (including patients, caregivers and the general public) by describing research findings in clear, concise language that avoids technical jargon [1, 2]. Plain language summaries can enhance knowledge and understanding of research, increase participation in clinical trials and encourage patient engagement and involvement in research [2-4]. Summaries of proposed research written in language understood by the general public are a crucial component of most research grant applications and are mandatory for clinical trials conducted in European Union member states [5]. Plain language summaries have become increasingly popular in recent years as a means of improving the dissemination and understanding of research findings. Some anaesthesia journals now require authors to provide plain language summaries alongside their research articles. These summaries typically include information about the research question; methods used to answer the question; main findings of the study; and implications of those findings for clinical practice [2]. They typically consist of around 250 words that avoid technical jargon or complex statistical terminology [1, 2]. We aimed to determine the extent and characteristics of plain language summaries in high-ranking anaesthesia journals….”

Evaluation of Transparency and Openness Guidelines in Physical Therapy Journals | Physical Therapy | Oxford Academic

Abstract:  Objective The goals of this study were to evaluate the extent to which physical therapy journals support open science research practices by adhering to the Transparency and Openness Promotion (TOP) guidelines, and to assess the relationship between journal scores and journal impact factor. Methods Scimago, mapping studies, the National Library of Medicine, and journal author guidelines were searched to identify physical therapy journals for inclusion. Journals were graded on 10 standards (29 available points in total) related to transparency with data, code, research materials, study design and analysis, preregistration of studies and statistical analyses, replication, and open science badges. The relationship between journal transparency and openness scores and journal impact factor was determined. Results Thirty-five journals’ author guidelines were assigned transparency and openness factor scores. The median score (interquartile range) across journals was 3.00 (3.00) out of 29 points (scores across all journals ranged from 0–8). The 2 standards with the highest degree of implementation were design and analysis transparency (reporting guidelines) and study preregistration. No journals reported on code transparency, materials transparency, replication, or open science badges. Transparency and openness promotion factor scores were a significant predictor of journal impact factor scores. Conclusion There is low implementation of the transparency and openness promotion standards by physical therapy journals. Transparency and openness promotion factor scores demonstrated predictive ability for journal impact factor scores. Journal policies must improve to make open science practices the standard in research. Journals are in an influential position to guide practices that can improve the rigor of publication, which ultimately enhances the evidence-based information used by physical therapists. Impact Transparent, open, and reproducible research will move the profession forward by improving the quality of research and increasing confidence in results for implementation in clinical care.
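The abstract's final analytic claim, that TOP factor scores predict journal impact factor, amounts to a simple journal-level regression. A hedged sketch of that analysis follows; the data values are invented for illustration, and the paper's exact modelling choices may differ.

```python
# Illustrative sketch of the journal-level analysis described above:
# regress journal impact factor on TOP factor score. All values are
# invented for demonstration only.
import numpy as np
from scipy import stats

top_scores = np.array([0, 1, 2, 3, 3, 4, 5, 6, 7, 8])  # hypothetical TOP scores (0-29 scale)
impact_factor = np.array([1.2, 1.5, 2.1, 2.0, 2.8,
                          3.1, 3.5, 3.2, 4.0, 4.4])     # hypothetical journal impact factors

result = stats.linregress(top_scores, impact_factor)
print(f"slope = {result.slope:.2f}, r = {result.rvalue:.2f}, p = {result.pvalue:.4f}")
```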

Measuring protocol sharing: are we on the right track? – The Official PLOS Blog

“For almost a year, Open Science Indicators have offered the ability to measure three Open Science practices: data sharing, code sharing, and preprint posting. Now, PLOS and DataSeer are adding a fourth indicator for protocol sharing. As we expand the tool’s capabilities, we invite your feedback on the approach we’ve taken in this preliminary data release….

We drafted a set of requirements built on our OSI measurement framework and consulted on them with stakeholders including tool providers, meta-researchers, and other methods experts. We then worked with DataSeer to operationalize the requirements. Our current approach detects links to or citations of outputs from an allowlist of publications and repositories known to focus on protocols. In keeping with our approach to measuring data and code sharing, we also detect relevant metadata from supplementary information where available. Please consult our methods documentation for more detail….
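A minimal sketch of what an allowlist-based detector of this kind could look like is below; the allowlist entries and matching logic are illustrative assumptions, not DataSeer's actual implementation.

```python
# Illustrative allowlist-based detector for protocol sharing: flag an
# article if its text links to or cites a source known to focus on
# protocols. Entries and matching logic are assumptions for the sketch.
import re

PROTOCOL_ALLOWLIST = [
    r"protocols\.io",
    r"protocol\s+exchange",
    r"bio-protocol",
]

def mentions_protocol_repository(article_text: str) -> bool:
    """Return True if the text references an allowlisted protocol source."""
    return any(re.search(pattern, article_text, re.IGNORECASE)
               for pattern in PROTOCOL_ALLOWLIST)

print(mentions_protocol_repository(
    "The full staining protocol is available at https://www.protocols.io/..."
))  # -> True
```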

Our roadmap for further developing the protocols indicator includes adding detection of protocols on lab websites and other online locations. We plan to look more deeply at citations of published protocols, so that we can understand the extent to which authors are pointing to procedures actually used in their study as opposed to referencing protocols for some other reason. We also want to be able to assess how often researchers share their own protocols versus protocols created by others.

 

Just as importantly, we’d like to hear from you: are there publications or repositories missing from our allowlist? How should we address the limitations of an allowlist-based approach? And are there other ways of communicating detailed methods information that we should consider? We’d be grateful for your input by November 15; you can comment below or write to mlaflamme [at] plos.org to share your perspective.”

How does mandated code-sharing change peer review? – The Official PLOS Blog

“On March 31, 2021, PLOS Computational Biology introduced a new journal requirement: mandated code sharing. If the research process included the creation of custom code, the authors were required to make it available during the peer review assessment, and to make it public upon publication of their research article—similar to the longstanding data sharing requirement for all PLOS journals. The aim, of course, is to improve reproducibility and increase understanding of research.

At the end of the year-long trial period, code sharing had risen from 53% in 2019 to 87% for 2021 articles submitted after the policy went into effect. Evidence in hand, the journal Editors-in-Chief decided to make code sharing a permanent feature of the journal. Today, the sharing rate is 96%….”
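The post reports only percentages, but one way to ask whether such a jump in sharing rates is statistically meaningful is a two-proportion z-test. The sketch below uses invented article counts chosen to match the reported 53% and 87% rates; the actual denominators are not given in the post.

```python
# Hypothetical significance check for the rise in code-sharing rates.
# The article counts are invented to match the reported percentages.
from statsmodels.stats.proportion import proportions_ztest

shared = [106, 261]   # articles sharing code: 106/200 = 53%, 261/300 = 87%
totals = [200, 300]   # articles assessed in each period (hypothetical)

stat, pvalue = proportions_ztest(shared, totals)
print(f"53% vs 87%: z = {stat:.2f}, p = {pvalue:.3g}")
```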

The use and acceptability of preprints in health and social care settings: A scoping review | PLOS ONE

Abstract:  Background

Preprints are open and accessible scientific manuscripts or reports that are shared publicly, through a preprint server, before being submitted to a journal. The value and importance of preprints have grown since their contribution during the public health emergency of the COVID-19 pandemic. Funders and publishers are establishing their positions on the use of preprints in grant applications and publishing models. However, the evidence supporting the use and acceptability of preprints varies across funders, publishers, and researchers. This scoping review explored the current evidence on the use and acceptability of preprints in health and social care settings by publishers, funders, and the research community throughout the research lifecycle.

Methods

A scoping review was undertaken with no study or language limits. The search strategy was limited to the last five years (2017–2022) to capture changes influenced by COVID-19 (e.g., accelerated use and role of preprints in research). The review included international literature, including grey literature, and two databases were searched: Scopus and Web of Science (24 August 2022).

Results

379 titles and abstracts and 193 full-text articles were assessed for eligibility. Ninety-eight articles met the eligibility criteria and were included for full extraction. For barriers and challenges, 26 statements were grouped under four main themes (volume/growth of publications, quality assurance/trustworthiness, risks associated with credibility, and validation). For benefits and value, 34 statements were grouped under six themes (openness/transparency, increased visibility/credibility, open review process, open research, democratic process/systems, and increased productivity/opportunities).

Conclusions

Preprints provide opportunities for rapid dissemination, but there is a need for clear policies and guidance from journals, publishers, and funders. Cautionary measures are needed to maintain the quality and value of preprints, paying particular attention to how findings are translated for the public. More research is needed to address some of the uncertainties identified in this review.

Data sharing: putting Nature’s policy to the test

“Policies for sharing research data promote reproducibility of published results by supporting independent verification of raw data, methods and conclusions (see, for example, go.nature.com/3oinwy4). Confirmation validates the efforts of the original researchers, reassures the scientific community and encourages others to build on the findings (see go.nature.com/3om9ken). Here we recount our experience of accessing data provided by the authors of two prominent Nature papers.

Our investigations, which took 12 people roughly a year, upheld the conclusions of both papers (V. L. Li et al. Nature 606, 785–790 (2022); T. Iram et al. Nature 605, 509–515 (2022)). In each case, we found most of the data online and successfully reproduced most findings after discussion with the authors. When we had difficulty reproducing analyses on the basis of publicly available data and materials alone, the authors provided clarification about data and methods, which resolved most discrepancies.

This positive experience prompted us to generate a checklist to help researchers to facilitate reproducibility of their published findings through sharing of data and statistical methods (see https://osf.io/ps3y9).”

 

Reply to: Recognizing and marshalling the pre-publication error correction potential of open data for more reproducible science | Nature Ecology & Evolution

“In response to our paper, Chen et al.2 highlighted that mandatory open data policies also increase opportunities for detecting and correcting errors pre-publication. We welcome Chen et al.’s comment and acknowledge that we omitted discussing the important, positive impact that mandatory open data policies can have on various pre-publication processes. Our study design and the interpretation of our results were probably influenced by our prior experience of reporting data anomalies and research misconduct to journals, and witnessing first-hand the challenges of post-publication error correction3,4,5. As long-standing advocates of transparency and reproducibility in research, we would celebrate empirical evidence that data sharing mandates increase pre-publication error detection….”

Recognizing and marshalling the pre-publication error correction potential of open data for more reproducible science | Nature Ecology & Evolution

“We enthusiastically applaud Berberi and Roche’s1 effort to evaluate the effects of journals’ mandatory open data policies on the error correction potential of science. Berberi and Roche conclude that at present there is “no evidence that mandatory open data policies increase error correction”. This may come as a surprise and a disappointment to advocates of open science. However, we suggest that by only addressing effects on post-publication error correction, Berberi and Roche miss the crucial dimension of pre-publication error correction potential in their assessment and may therefore substantially underestimate the true merits of mandatory open data policies….”

 

New Guidelines for Presenting Electrochemical Data in All ACS Journals | ACS Measurement Science Au

“Electrochemistry has become a cornerstone in many facets of modern chemistry research. The past few years have witnessed the rapid growth of research areas that employ electrochemical principles and methods, including batteries, supercapacitors, solar cells, fuel cells, electrolyzers, carbon dioxide reduction, nitrogen reduction, and organic electrosynthesis, to name just a few. As such, there has been an expansion in the number of papers reporting electrochemical testing and characterization. Publications reporting electrochemistry-related experiments have become prevalent in many ACS journals including, but not limited to, ACS Applied Materials and Interfaces, ACS Catalysis, ACS Energy Letters, ACS Measurement Science Au, ACS Organic & Inorganic Au, Journal of the American Chemical Society, Organic Letters, The Journal of Organic Chemistry, and The Journal of Physical Chemistry. There have been a variety of guidelines and checklists developed for some of the experimental protocols required for characterizing specific technologies (e.g., rotating ring disk electrochemistry measurements for oxygen reduction electrocatalysis and isotope experiments for nitrogen reduction to ammonia testing). However, no guidelines are available for the presentation of characterization data from general electrochemical measurements. This lack of standardization has resulted in papers being published with insufficient details for readers to reliably replicate the experiments. To outline best practices, we have developed a set of guidelines for reporting electrochemical experimentation and characterization in ACS journals. These guidelines, similar to the existing ACS guidelines for reporting NMR data and X-ray crystallography data for chemical compound and materials characterization, can be found on our ACS Research Data Guidelines website. (1) The guidelines for reporting electrochemical data are split into two sections: guidelines for reporting voltammetry and amperometry measurements, and guidelines for reporting bulk electrolysis procedures….”

Is open science a double-edged sword?: data sharing and the changing citation pattern of Chinese economics articles | SpringerLink

Abstract:  Data sharing is an important part of open science (OS), and more and more institutions and journals have been enforcing open data (OD) policies. OD is advocated as a way to increase academic influence and promote scientific discovery and development, but this proposition has not been well substantiated. This study explores the nuanced effects of OD policies on the citation pattern of articles, using the case of Chinese economics journals. China Industrial Economics (CIE) is the first and so far only Chinese social science journal to adopt a compulsory OD policy, requiring all published articles to share their original data and processing code. We use article-level data and a difference-in-differences (DID) approach to compare the citation performance of articles published in CIE and 36 comparable journals. First, we find that the OD policy quickly increased the number of citations: each article on average received 0.25, 1.19, 0.86, and 0.44 more citations in the first, second, third, and fourth years after publication, respectively. However, we also found that the citation benefit of the OD policy decreased rapidly over time, and even became negative in the fifth year after publication. In conclusion, this changing citation pattern suggests that an OD policy can be a double-edged sword, quickly increasing citation performance but simultaneously accelerating the aging of articles.
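For readers unfamiliar with the method, a difference-in-differences estimate compares the change in citations for articles in the treated journal (CIE) before and after the OD policy against the same change for control journals. A minimal sketch on invented data follows; the variable names and model specification are illustrative, not the authors' own.

```python
# Minimal difference-in-differences (DID) sketch in the spirit of the
# study: the coefficient on treated:post estimates the citation effect
# of the open data policy. All data here are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "citations": [5, 7, 9, 10, 4, 5, 6, 6],
    "treated":   [1, 1, 1, 1, 0, 0, 0, 0],   # 1 = published in CIE
    "post":      [0, 0, 1, 1, 0, 0, 1, 1],   # 1 = after the OD policy
})

model = smf.ols("citations ~ treated + post + treated:post", data=df).fit()
print(f"DID estimate: {model.params['treated:post']:.2f}")
```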