Investigating the Effectiveness of the Open Data Badge Policy at Psychological Science Through Computational Reproducibility

Abstract:  In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on adherence to its stated aim at Psychological Science: ensuring reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all articles provided at least some data, 6 of the 14 articles provided analysis code or scripts, only 1 of the 14 articles was rated as exactly reproducible, and 3 of the 14 as essentially reproducible with minor deviations. We recommend that Psychological Science require a check of reproducibility at the peer-review stage before awarding badges, and that the Open Data badge be renamed “Open Data and Code” to avoid confusion and to encourage researchers to adhere to this higher standard.

 

Which solutions best support sharing and reuse of code? – The Official PLOS Blog

“PLOS has released a preprint and supporting data on research conducted to understand the needs and habits of researchers in relation to code sharing and reuse as well as to gather feedback on prototype code notebooks and help determine strategies that publishers could use to increase code sharing.

Our previous research led us to implement a mandatory code sharing policy at PLOS Computational Biology in March 2021 to increase the amount of code shared alongside published articles. As well as exploring policy to support code sharing, we have also been collaborating with NeuroLibre, an initiative of the Canadian Open Neuroscience Platform, to learn more about the potential role of technological solutions for enhancing code sharing. NeuroLibre is one of a growing number of interactive or executable technologies for sharing and publishing research, some of which have become integrated with publishers’ workflows….”

Open Data – PLOS

“Publishing in a PLOS journal carries with it a commitment to make the data underlying the conclusions in your research article publicly available upon publication.

Our data policy underscores the rigor of the research we publish, and gives readers a fuller understanding of each study….”

A year of open access

“It’s been just over a year since the journals published by the American Society for Biochemistry and Molecular Biology became fully open access. We asked the editors of the ASBMB’s journals how the transition has gone and what they’re planning for the future. Here’s what they told us….

To achieve gold open access, we partnered with commercial publisher Elsevier; however, it is important to recognize that JBC remains, at its core, a journal “for scientists, run by scientists.” Full editorial control of all manuscripts remains with the editors at JBC. In addition, JBC is one of the few journals that performs data-integrity analysis on the papers it publishes.

But what does the future hold? The implementation of open access raises an equally important aspect of science publishing in 2021 and beyond: open science….”

Trust, scholarship and data sharing – Thistlethwaite – 2022 – The Clinical Teacher – Wiley Online Library

“For some journals, publishers and editors require that all raw data are deposited on submission, for example into a public repository. However, what should an editor do with these? Most editors are part-time with other academic or clinical responsibilities, as at The Clinical Teacher, and do not have the capacity to scrutinise all data and analyse them again. Consider the amount of text arising from a qualitative study and the time it takes for the team of researchers to analyse, interpret and synthesise these data. In addition, I could not be sure that all collected data have been deposited. In a scholarly system reliant on altruism as well as trust, I would not expect unremunerated reviewers to put in long hours to check that research data can be trusted. Some journals do employ statisticians to comment specifically on statistical tests and results….”

Implementing an Open & FAIR data sharing policy—A case study in the earth and environmental sciences – Cannon – 2022 – Learned Publishing – Wiley Online Library

Abstract:  This paper outlines the impact of the introduction of an Open & FAIR (findable, accessible, interoperable, and reusable) data sharing policy on six earth and environmental science journals published by Taylor & Francis, beginning in November 2019. Notably, 18 months after implementing this new policy, we observed minimal impact on submissions, acceptance rates, or peer-review times for the participating journals. This paper describes the changes that were required to internal systems and processes in order to implement the new policy, and compares our findings with recent literature reports on the impact of journals introducing data-sharing policies.

Full article: Open science and sharing personal data widely – legally impossible for Europeans?

“A requirement for having a research paper published in many medical journals is that the authors include a data sharing statement. Although the requirement from the International Committee of Medical Journal Editors is not very strict, simply requiring a statement [1], interpretation varies. Some journals essentially require that data must be readily available for other researchers for the paper to be accepted.

While most of us eagerly welcome open science and reuse of data to ensure reproducible science, the EU General Data Protection Regulation (GDPR) provides strong protection of privacy and rather restricts and counteracts open sharing of personal data [2]. Some editors will accept that, for legal reasons, data are not readily sharable with anyone other than peer reviewers. However, editors of non-European journals will often object to a GDPR-compatible data sharing statement and, consequently and often at the last minute, reject the research paper.

Why is this an issue? How difficult is it for European researchers to share data with researchers in other parts of the world?”

Keeping science reproducible in a world of custom code and data | Ars Technica

“Daniella Lowenberg, principal investigator of the Make Data Count initiative, describes the ideals to which these data-sharing requirements aspire. “We want a world where data are routinely being used for discovery, to advance science, for evidence-based and data-driven policy,” she says. In some places, the future is already here. “There are data sets that drive entire fields,” she says, and “the field of research would not be where it is without these open data sets that are driving it.” As an example, she points to this data set of the wood density of 16,468 trees, which has been downloaded over 17,000 times.

With that ideal in mind, journal editors increasingly make publication contingent upon open data and code. I checked about 2,700 journals published by Springer, one of the largest publishers of academic journals, for submission guidelines that state that authors must make all materials like data and code available.

The results suggest that open data and code are more of a custom in some fields than in others. Among ecology journals, 37 percent have an availability requirement, while only 7 percent of surgery and 6 percent of education journals do. Other fields are between these extremes, with 16 to 23 percent of management, engineering, math, economics, medicine, and psychology journals stating such a requirement….”
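For a purely illustrative sense of how such a tally might be computed, the sketch below assumes a hypothetical CSV of journal policy records with columns "field" and "requires_availability" and reports the percentage of journals in each field whose guidelines require data and code availability; the file name and column names are assumptions for illustration, not part of the Ars Technica analysis.

```python
# Illustrative sketch only: tally, per field, the share of journals whose
# submission guidelines require that data and code be made available.
# The file name and column names below are assumptions for illustration.
import csv
from collections import defaultdict

counts = defaultdict(lambda: {"total": 0, "required": 0})

with open("journal_policies.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        field = row["field"].strip().lower()
        counts[field]["total"] += 1
        if row["requires_availability"].strip().lower() in {"true", "yes", "1"}:
            counts[field]["required"] += 1

for field, c in sorted(counts.items()):
    pct = 100 * c["required"] / c["total"]
    print(f"{field}: {pct:.0f}% of {c['total']} journals require data/code availability")
```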

Medical journal requirements for clinical trial data sharing: Ripe for improvement

“Summary points

Efficient sharing and reuse of data from clinical trials are critical in advancing medical knowledge and developing improved treatments.
We believe that the International Committee of Medical Journal Editors (ICMJE) clinical trial data sharing policy is currently inadequate.
Although data sharing plans help increase transparency, they do not ensure that data are shared, and they are often inadequately implemented.
We believe that the ICMJE should adopt a stronger policy on data sharing that is enforced rigorously in all ICMJE member and affiliated journals.
The policy should include a strong evaluation component to ensure that all clinical trial data are shared, their value maximized, and data producers incentivized….”

Converting Access Microbiology to an open research platform: community survey results | Microbiology Society

Abstract:  Following the Microbiology Society’s successful bid for a Learned Society Curation Award from the Wellcome Trust and Howard Hughes Medical Institute, the Society is converting our sound science, open access journal, Access Microbiology, to an open research platform. As part of this, we conducted a survey of our community to gauge current attitudes towards the platform and here we present some of these results. The majority of respondents (57 %) said they would always or sometimes want to remain anonymous on their peer review report, whilst 75 % of respondents said that as an author they would be happy to make the data underlying their research open. There was a clear desire for a range of research types that are often seen with sound science publications and rigorous research. An encouraging 94 % of respondents stated that the platform is somewhere they would consider publishing, demonstrating the enthusiasm in these respondents for a new publishing platform for their community. Given this data and that from our previous focus group research, the platform will launch as outlined in the original project proposal and adopt a transparent peer review model with an open data policy.

Data sharing policies: share well and you shall be rewarded | Synthetic Biology | Oxford Academic

Abstract:  Sharing research data is an integral part of the scientific publishing process. By sharing data, authors enable their readers to use their results in a way that the textual description of the results does not allow by itself. In order to achieve this objective, data should be shared in a way that makes it as easy as possible for readers to import them in computer software where they can be viewed, manipulated and analyzed. Many authors and reviewers seem to misunderstand the purpose of the data sharing policies developed by journals. Rather than being an administrative burden that authors should comply with to get published, the objective of these policies is to help authors maximize the impact of their work by allowing other members of the scientific community to build upon it. Authors and reviewers need to understand the purpose of data sharing policies to assist editors and publishers in their efforts to ensure that every article published complies with them.

Prospective Clinical Trial Registration: A Prerequisite for Publishing Your Results | Radiology

“The ICMJE requires that clinical trial results be published in the same clinical trial registry where the trial is registered. These results are in the form of a short (≤500 words) abstract or table (6,7). Full disclosure of the existing results publication in a clinical trial registry should be explicitly stated when the manuscript is submitted for publication. The Food and Drug Administration (FDA) has indicated it will enforce trial results reporting related to ClinicalTrials.gov (8). The FDA is authorized to seek civil monetary penalties from responsible parties. In the United States, the sponsor of an applicable clinical trial is considered the responsible party, unless or until the sponsor designates a qualified principal investigator as the responsible party. The FDA issued its first Notice of Noncompliance in April 2021 for failure to report in ClinicalTrials.gov the safety and effectiveness results for the drug dalantercept in combination with axitinib in patients with advanced renal cell carcinoma (8).

Finally, as of July 1, 2018, manuscripts submitted to ICMJE journals that report the results of clinical trials must contain a data sharing statement. Clinical trials that begin enrolling participants on or after January 1, 2019, must include a data sharing plan in the trial registration (for further information, see www.icmje.org/recommendations/browse/publishing-and-editorial-issues/clinical-trial-registration.html). Since most clinical trials take 2 or more years for results to be reported, the Radiology editorial board had expected such mandatory data sharing plans to be reported in the current year. However, because of the COVID-19 pandemic, many clinical trials were halted. Thus, journal publication requirements to include data sharing statements are more likely to impact authors beginning in 2023. Data sharing statements required for Radiological Society of North America (RSNA) journals may be found at https://pubs.rsna.org/page/policies#clinical.

In conclusion, prospective clinical trial registration is a mechanism allowing us to ensure transparency in clinical research conduct, honest and complete reporting of the clinical trial results, and minimization of selective result publications. Since its inception in 2004, this requirement has evolved into a policy that is practiced by major medical journals worldwide, is mandatory for publication of trial results, and, in some circumstances, is enforced by the FDA. Further, ICMJE journals, including RSNA journals, are expecting manuscripts that report trial results to include statements on data sharing. As each clinical trial design is unique, we encourage authors to refer to the full description of the current ICMJE policy at icmje.org for additional information pertaining to their specific circumstances.”

PsyArXiv Preprints | When open data closes the door: Problematising a one size fits all approach to open data in journal submission guidelines

Abstract:  Opening data promises to improve research rigour and democratise knowledge production. But it also poses practical, theoretical, and ethical risks for qualitative research. Despite discussion about open data in qualitative social psychology predating the replication crisis, the nuances of this discussion have not been translated into current journal policies. Through a content analysis of 261 journals in the domain of social psychology, we establish the state of current journal policies for open data. We critically discuss how these expectations may not be adequate for establishing qualitative rigour, can introduce ethical challenges, and may place those who wish to use qualitative approaches at a disadvantage in peer review and publication processes. We assert that open data requirements should include clearer guidelines that reflect the nuance of data sharing in qualitative research, and move away from a universal ‘one-size-fits-all’ approach to data sharing.

 

OSF Preprints | A survey of funders’ and institutions’ needs for understanding researchers’ open research practices

Abstract:  A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices — sharing of research data, code and software. However, funders and institutions lack sufficient tools, time or resources to monitor compliance with these policies.

To better understand funder and institution needs related to understanding open research practices of researchers, we targeted funders and institutions with a survey in 2020 and received 122 completed responses. Our survey assessed and scored (from 0 to 100) the importance of, and satisfaction with, 17 factors associated with understanding open research practices. These include knowing if a research paper includes links to research data in a repository; knowing if a research grant made code available in a public repository; knowing if research data were made available in a reusable form; and knowing reasons why research data are not publicly available. Half of respondents had tried to evaluate researchers’ open research practices in the past and 78% plan to do this in the future. The most common method used to find out if researchers are practicing open research was personal contact with researchers, and the most common reason for doing it was to increase their knowledge of researchers’ sharing practices (e.g. determine the current state of sharing; track changes in practices over time; compare different departments/disciplines). The results indicate that nearly all of the 17 factors we asked about in the survey were underserved. The mean importance of all factors to respondents was 71.7, approaching the 75 threshold of “very important”. The average satisfaction across all factors was 41.3, indicating a negative level of satisfaction with the ability to complete these tasks. The results imply an opportunity for better solutions to meet these needs. The growth of policies and requirements for making research data and code available does not appear to be matched with solutions for determining whether these policies have been complied with. We conclude that publishers can better support some of the needs of funders and institutions by introducing simple solutions such as:

– Mandatory data availability statements (DAS) in research articles
– Not permitting generic “data available on request” statements
– Enabling and encouraging the use of data repositories and other methods that make data available in a more reusable way
– Providing visible links to research data on publications
– Making information on data and code sharing practices in publications available to institutions and funding agencies
– Extending policies that require transparency in sharing of research data to the sharing of code

How can publishers better meet the open research needs of funders and institutions?

“Publishers investing in simple solutions in their workflows can help to better meet the needs of funders and institutions who wish to support open research practices, research released this week by PLOS concludes.

Policies can be an effective solution for changing research culture and practice. A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices — sharing of research data, code and software — as do publishers. Seeking to deepen our understanding of funder and institution needs related to open research, we surveyed more than 100 funders and institutions in 2020. We wanted to know if they are evaluating how researchers share data and code, how they are doing it, why they are doing it, and how satisfied they are with their ability to get these tasks done. Our results are available as a preprint along with an anonymised dataset….

Simple solutions more publishers could provide include:

Mandatory Data Availability Statements (DAS) in all relevant publications.
Across the STM industry around 15% of papers include a DAS. Since we introduced our data availability policy in 2014, 100% of PLOS research articles include a DAS.
Supporting researchers to provide information on why research data (and code) are not publicly available with their publications.
Time and again “data available on request” has been shown to be ineffective at supporting new research — and is not permitted in PLOS journals. 
Enabling and encouraging the use of data repositories.
Recommending the use of data repositories is a useful step, but making them easily and freely accessible — integrated into the publishing process — can be even more effective. Rates of repository use are higher in journals that partner closely with repositories and remove cost barriers to their use.
Providing visible links to research data on publications.
Many researchers also struggle to find data they can reuse, hence PLOS will soon be experimenting with improving this functionality in our articles, and integrating the Dryad repository with submission….”