“While publishers in multiple fields are adopting preprints, we have discovered a great deal of confusion about the pros and cons of preprinting, as well as disparity in publishers’ policies regarding preprinting in health professions education (HPE). In seeking to resolve this confusion, we documented preprint policies at 74 journals within HPE (e.g. nursing, medicine, pharmacy, dentistry, rehabilitation sciences, nutrition). We culled preprint policies for 43 (58%) journals using journal websites, JISC’s Sherpa Romeo tool, and Wikipedia’s list of academic publishers by preprint policy. We then obtained information from email solicitations for an additional 27 (36%), leaving us without information for 4 (5%). Of the 70 journals for which we have information, 53 (76%) will review/accept preprinted manuscripts; 11 (16%) will not; and 6 (9%) are unclear or make decisions on a case-by-case basis. (For a link to our list of HPE journals and our understanding of their policies regarding preprinted manuscripts, see https://jahse.med.utah.edu/submission/ and select “Where to Publish”.) No wonder there is confusion.
We encourage our colleagues across the health professions to join our call to eliminate this confusion by encouraging all HPE journals to support and promote preprinting. The value of preprinting has only become more important during the COVID-19 pandemic. Being able to preprint scholarship prior to formal submission enhances formative review and revision, augments the benefits of peer coaching, and promotes higher quality publications. Preprinting also makes work available to others more quickly, which can enhance collaboration and uptake of new ideas without compromising the eventual copyright of the final published product.”
Abstract: In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its stated aim at Psychological Science: ensuring reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all articles provided at least some data, 6/14 articles provided analysis code or scripts, only 1/14 articles was rated as exactly reproducible, and 3/14 as essentially reproducible with minor deviations. We recommend that Psychological Science require a check of reproducibility at the peer review stage before awarding badges, and that the Open Data badge be renamed “Open Data and Code” to avoid confusion and encourage researchers to adhere to this higher standard.
“PLOS has released a preprint and supporting data on research conducted to understand the needs and habits of researchers in relation to code sharing and reuse as well as to gather feedback on prototype code notebooks and help determine strategies that publishers could use to increase code sharing.
Our previous research led us to implement a mandatory code sharing policy at PLOS Computational Biology in March 2021 to increase the amount of code shared alongside published articles. As well as exploring policy to support code sharing, we have also been collaborating with NeuroLibre, an initiative of the Canadian Open Neuroscience Platform, to learn more about the potential role of technological solutions for enhancing code sharing. NeuroLibre is one of a growing number of interactive or executable technologies for sharing and publishing research, some of which have become integrated with publishers’ workflows….”
“It’s been just over a year since the journals published by the American Society for Biochemistry and Molecular Biology became fully open access. We asked the editors of the ASBMB’s journals how the transition has gone and what they’re planning for the future. Here’s what they told us….
To achieve gold open access, we partnered with commercial publisher Elsevier; however, it is important to recognize that JBC remains, at its core, a journal “for scientists, run by scientists.” Full editorial control of all manuscripts remains with the editors at JBC. In addition, JBC is one of the few journals that performs data-integrity analysis on the papers it publishes.
But what does the future hold? The implementation of open access raises an equally important aspect of science publishing in 2021 and beyond: open science….”
“Nature Portfolio journals encourage posting of preprints of primary research manuscripts on preprint servers of the authors’ choice, authors’ or institutional websites, and open communications between researchers whether on community preprint servers or preprint commenting platforms….
Preprints may be posted at any time during the peer review process. Posting of preprints is not considered prior publication and will not jeopardize consideration at Nature Portfolio journals….
Springer Nature has partnered with Research Square (Springer Nature has a majority interest in Research Square) to provide In Review, a journal-integrated solution for preprint sharing, supporting authors across all the communities we serve to share their research early….
Authors may choose any license for the preprint, including Creative Commons licenses. …
Preprints may be cited in the reference list of articles under consideration at Nature Portfolio journals….”
“For some journals, publishers and editors require that all raw data are deposited on submission, for example into a public repository. However, what should an editor do with these? Most editors are part-time with other academic or clinical responsibilities, as at The Clinical Teacher, and do not have the capacity to scrutinise all data and analyse them again. Consider the amount of text arising from a qualitative study and the time it takes for the team of researchers to analyse, interpret and synthesise these data. In addition, I could not be sure that all collected data have been deposited. In a scholarly system reliant on altruism as well as trust, I would not expect unremunerated reviewers to put in long hours to check that research data can be trusted. Some journals do employ statisticians to comment specifically on statistical tests and results….”
Abstract: This paper outlines the impact of the introduction of an Open & FAIR (findable, accessible, interoperable, and reusable) data sharing policy on six earth and environmental science journals published by Taylor & Francis, beginning in November 2019. Notably, 18 months after implementing this new policy, we observed minimal impact on submissions, acceptance rates, or peer-review times for the participating journals. This paper describes the changes that were required to internal systems and processes in order to implement the new policy, and compares our findings with recent literature reports on the impact of journals introducing data-sharing policies.
Abstract: India’s primary science funding agencies, the Department of Science & Technology, and the Department of Biotechnology (DST & DBT) together formulated an open access (OA) policy in 2014. This policy mandates immediate self-archival of research articles generated from publicly funded research across all the institutions in suitable repositories. But with inadequate infrastructure and awareness, the OA mandate did not flourish as expected. This paper aims to understand whether journal policies impede the prospect of DST-DBT OA policy and the possible routes to achieve policy compliance. The analysis presented in this paper tracks down the journal self-archiving policies of the top 50 popular journals (among Indian authors) from each of the six STEM fields—Biology, Chemistry, Clinical-Medicine, Engineering, Materials science, and Physics. The results show that most journals have an embargo of 12–24 months on self-archiving of the post-print (final author version after peer-review), which contradicts the DST-DBT OA mandate. The study also reveals that hybrid journals dominate, and article processing charges create a new form of inequity for Indian STEM researchers. We expect that these findings will be helpful for the funding agencies to restructure their policies, and negotiate with journal publishers to resolve the contradictions.
Abstract: Background: Numerous mechanisms exist to incentivise researchers to share their data. This scoping review aims to identify and summarise evidence of the efficacy of different interventions to promote open data practices and provide an overview of current research.
Methods: This scoping review is based on data identified from Web of Science and LISTA, limited from 2016 to 2021. A total of 1128 papers were screened, with 38 items being included. Items were selected if they focused on designing or evaluating an intervention or presenting an initiative to incentivise sharing. Items comprised a mixture of research papers, opinion pieces and descriptive articles.
Results: Seven major themes in the literature were identified: publisher/journal data sharing policies, metrics, software solutions, research data sharing agreements in general, open science ‘badges’, funder mandates, and initiatives.
Conclusions: A number of key messages for data sharing include: the need to build on existing cultures and practices, meeting people where they are and tailoring interventions to support them; the importance of publicising and explaining the policy/service widely; the need to have disciplinary data champions to model good practice and drive cultural change; the requirement to resource interventions properly; and the imperative to provide robust technical infrastructure and protocols, such as labelling of data sets, use of DOIs, data standards and use of data repositories.
“Daniella Lowenberg, principal investigator of the Make Data Count initiative, describes the ideals to which these data-sharing requirements aspire. “We want a world where data are routinely being used for discovery, to advance science, for evidence-based and data-driven policy,” she says. In some places, the future is already here. “There are data sets that drive entire fields,” she says, and “the field of research would not be where it is without these open data sets that are driving it.” As an example, she points to this data set of the wood density of 16,468 trees, which has been downloaded over 17,000 times.
With that ideal in mind, journal editors increasingly make publication contingent upon open data and code. I checked about 2,700 journals published by Springer, one of the largest publishers of academic journals, for submission guidelines stating that authors must make all materials, such as data and code, available.
The results suggest that sharing open data and code is more of a custom in some fields than in others. Among ecology journals, 37 percent have an availability requirement, while only 7 percent of surgery and 6 percent of education journals do. Other fields fall between these extremes, with 16 to 23 percent of management, engineering, math, economics, medicine, and psychology journals stating such a requirement….”
Efficient sharing and reuse of data from clinical trials are critical in advancing medical knowledge and developing improved treatments.
We believe that the International Committee of Medical Journal Editors (ICMJE) clinical trial data sharing policy is currently inadequate.
Although data sharing plans help increase transparency, they do not ensure that data are shared, and they are often inadequately implemented.
We believe that the ICMJE should adopt a stronger policy on data sharing that is enforced rigorously across all ICMJE member and affiliated journals.
The policy should include a strong evaluation component to ensure that all clinical trial data are shared, their value maximized, and data producers incentivized….”
Abstract: Following the Microbiology Society’s successful bid for a Learned Society Curation Award from the Wellcome Trust and Howard Hughes Medical Institute, the Society is converting our sound science, open access journal, Access Microbiology, to an open research platform. As part of this, we conducted a survey of our community to gauge current attitudes towards the platform and here we present some of these results. The majority of respondents (57%) said they would always or sometimes want to remain anonymous on their peer review report, whilst 75% of respondents said that as an author they would be happy to make the data underlying their research open. There was a clear desire for the platform to accept the range of research types often seen in sound science and rigorous research publications. An encouraging 94% of respondents stated that the platform is somewhere they would consider publishing, demonstrating the enthusiasm in these respondents for a new publishing platform for their community. Given these data and those from our previous focus group research, the platform will launch as outlined in the original project proposal and adopt a transparent peer review model with an open data policy.
Abstract: Sharing research data is an integral part of the scientific publishing process. By sharing data, authors enable their readers to use their results in a way that the textual description of the results does not allow by itself. In order to achieve this objective, data should be shared in a way that makes it as easy as possible for readers to import them in computer software where they can be viewed, manipulated and analyzed. Many authors and reviewers seem to misunderstand the purpose of the data sharing policies developed by journals. Rather than being an administrative burden that authors should comply with to get published, the objective of these policies is to help authors maximize the impact of their work by allowing other members of the scientific community to build upon it. Authors and reviewers need to understand the purpose of data sharing policies to assist editors and publishers in their efforts to ensure that every article published complies with them.
“The ICMJE requires that clinical trial results be published in the same clinical trial depository where the trial is registered. These results are in the form of a short (≤500 words) abstract or table (6,7). Full disclosure of the existing results publication in a clinical trial registry should be explicitly stated when the manuscript is submitted for publication. The Food and Drug Administration (FDA) has indicated it will enforce trial results reporting related to ClinicalTrials.gov (8). The FDA is authorized to seek civil monetary penalties from responsible parties, including additional penalties for continued noncompliance. In the United States, the sponsor of an applicable clinical trial is considered the responsible party, unless or until the sponsor designates a qualified principal investigator as the responsible party. The FDA issued its first Notice of Noncompliance in April 2021 for failure to report results in ClinicalTrials.gov based on a lack of reporting the safety and effectiveness results for the drug dalantercept in combination with axitinib in patients with advanced renal cell carcinoma (8).
Finally, as of July 1, 2018, manuscripts submitted to ICMJE journals that report the results of clinical trials must contain a data sharing statement. Clinical trials that begin enrolling participants on or after January 1, 2019, must include a data sharing plan in the trial registration. (For further information, see www.icmje.org/recommendations/browse/publishing-and-editorial-issues/clinical-trial-registration.html.) Since most clinical trials take 2 or more years for results to be reported, the Radiology editorial board had expected such mandatory data sharing plans to be reported in the current year. However, because of the COVID-19 pandemic, many clinical trials were halted. Thus, journal publication requirements to include data sharing statements are more likely to impact authors beginning in 2023. Data sharing statements required for Radiological Society of North America (RSNA) journals may be found at https://pubs.rsna.org/page/policies#clinical.
In conclusion, prospective clinical trial registration is a mechanism allowing us to ensure transparency in clinical research conduct, honest and complete reporting of the clinical trial results, and minimization of selective result publications. Since its inception in 2004, this requirement has evolved into a policy that is practiced by major medical journals worldwide, is mandatory for publication of trial results, and, in some circumstances, is enforced by the FDA. Further, ICMJE journals, including RSNA journals, are expecting manuscripts that report trial results to include statements on data sharing. As each clinical trial design is unique, we encourage authors to refer to the full description of the current ICMJE policy at icmje.org for additional information pertaining to their specific circumstances.”