Abstract: A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices — sharing of research data, code and software. However, funders and institutions lack sufficient tools, time or resources to monitor compliance with these policies.
To better understand funder and institution needs related to understanding open research practices of researchers, we targeted funders and institutions with a survey in 2020 and received 122 completed responses. Our survey assessed and scored (from 0-100) the importance of, and satisfaction with, 17 factors associated with understanding open research practices. These include knowing if a research paper includes links to research data in a repository; knowing if a research grant made code available in a public repository; knowing if research data were made available in a reusable form; and knowing reasons why research data are not publicly available. Half of respondents had tried to evaluate researchers’ open research practices in the past and 78% plan to do this in the future. The most common method used to find out if researchers are practicing open research was personal contact with researchers, and the most common reason for doing so was to increase their knowledge of researchers’ sharing practices (e.g. determine the current state of sharing; track changes in practices over time; compare different departments/disciplines). The results indicate that nearly all of the 17 factors we asked about in the survey were underserved. The mean importance of all factors to respondents was 71.7, approaching the 75 threshold of “very important”. The mean satisfaction across all factors was 41.3, indicating a negative level of satisfaction with the ability to complete these tasks. The results imply an opportunity for better solutions to meet these needs. The growth of policies and requirements for making research data and code available does not appear to be matched by solutions for determining whether these policies have been complied with.
We conclude that publishers can better support some of the needs of funders and institutions by introducing simple solutions such as:
– Mandatory data availability statements (DAS) in research articles
– Not permitting generic “data available on request” statements
– Enabling and encouraging the use of data repositories and other methods that make data available in a more reusable way
– Providing visible links to research data on publications
– Making information on data and code sharing practices in publications available to institutions and funding agencies
– Extending policies that require transparency in sharing of research data to sharing of code
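The survey's importance/satisfaction scoring can be illustrated with a small calculation. The factor names and scores below are invented for illustration only (they are not the survey's actual data); the "underserved" test here is a simple importance-minus-satisfaction gap, one plausible reading of the abstract's framing rather than the authors' stated method:

```python
# Hypothetical 0-100 importance/satisfaction scores for a few factors,
# mimicking the survey's scoring scheme (values are invented).
factors = {
    "links to data in a repository":   {"importance": 80, "satisfaction": 45},
    "code available in a public repo": {"importance": 75, "satisfaction": 35},
    "data available in reusable form": {"importance": 70, "satisfaction": 40},
}

mean_importance = sum(f["importance"] for f in factors.values()) / len(factors)
mean_satisfaction = sum(f["satisfaction"] for f in factors.values()) / len(factors)

# Call a factor "underserved" when importance exceeds satisfaction.
underserved = [name for name, f in factors.items()
               if f["importance"] > f["satisfaction"]]

print(f"mean importance:   {mean_importance:.1f}")
print(f"mean satisfaction: {mean_satisfaction:.1f}")
print(f"underserved factors: {len(underserved)} of {len(factors)}")
```

With the real survey data, the analogous means were 71.7 (importance) and 41.3 (satisfaction), and nearly all 17 factors showed this kind of gap.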
“Publishers investing in simple solutions in their workflows can help to better meet the needs of funders and institutions who wish to support open research practices, research released this week by PLOS concludes.
Policies can be an effective solution for changing research culture and practice. A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices — sharing of research data, code and software — as do publishers. Seeking to deepen our understanding of funder and institution needs related to open research, we surveyed more than 100 funders and institutions in 2020. We wanted to know if they are evaluating how researchers share data and code, how they are doing it, why they are doing it, and how satisfied they are with their ability to get these tasks done. Our results are available as a preprint along with an anonymised dataset….
Simple solutions more publishers could provide include:
Mandatory Data Availability Statements (DAS) in all relevant publications.
Across the STM industry, around 15% of papers include a DAS. Since we introduced our data availability policy in 2014, 100% of PLOS research articles include a DAS.
Supporting researchers to provide information on why research data (and code) are not publicly available with their publications.
Time and again “data available on request” has been shown to be ineffective at supporting new research — and is not permitted in PLOS journals.
Enabling and encouraging the use of data repositories.
Recommending the use of data repositories is a useful step, but making them easily and freely accessible — integrated into the publishing process — can be even more effective. Rates of repository use are higher in journals that partner closely with repositories and remove cost barriers to their use.
Providing visible links to research data on publications. Many researchers also struggle to find data they can reuse, hence PLOS will soon be experimenting with improving this functionality in our articles, and integrating the Dryad repository with submission….”
“Imagine if every statistical analysis was accompanied by comments from a professional statistician; if every method described had been critiqued by a methodological expert; if interpretations could be published of the same analysis from a wide variety of people with different training, backgrounds or experience. How much richer would the research record be? How much more useful than each of us only passing our own personal judgement on each article, bounded by our own inevitably narrow experience, and unshared with others?
This was all part of my thinking when coming up with Octopus, the platform that is designed to be the new, digital-first primary research record for scientific work….”
“The number of preprint servers has increased substantially in the last five years and now stands at no fewer than sixty. More than thirty new servers have appeared in the past five years. These servers are diverse, focusing on subdisciplines, specific geographies, or specific languages, and have varying degrees of penetration and technical sophistication. Existing publishing services and workflows are now being reimagined to accommodate preprints. This essay examines how a publisher-centric approach simplifies workflows and speeds the process of peer review through preprint pre-assessment and the checks and balances being implemented by publishers and third parties to build trust and confidence in preprints….”
“This year, Peer Review Week is taking place September 20th through 24th, and ScienceOpen has put together an expert panel to discuss the whys and hows of open peer review. Peer Review Week is an annual weeklong event, led by the community, to celebrate the essential role that peer review plays in scientific and academic communication. The event brings together all those committed to sharing the central message that quality peer review, whatever shape or form it might take, is critical to scholarly research. As proponents of open peer review, we thought it would be the perfect topic to discuss during Peer Review Week this year. Below you will find the details and registration link for the panel we have assembled to discuss open peer review, which will take place on September 24th at 4 pm London time (UTC+1). We would love to have you join this free, online event during Peer Review Week!…”
Abstract: Peer review is an integral component of contemporary science. While peer review focuses attention on promising and interesting science, it also encourages scientists to pursue some questions at the expense of others. Here, we use ideas from forecasting assessment to examine how two modes of peer review — ex ante review of proposals for future work and ex post review of completed science — motivate scientists to favor some questions instead of others. Our main result is that ex ante and ex post peer review push investigators toward distinct sets of scientific questions. This tension arises because ex post review allows an investigator to leverage her own scientific beliefs to generate results that others will find surprising, whereas ex ante review does not. Moreover, ex ante review will favor different research questions depending on whether reviewers rank proposals in anticipation of changes to their own personal beliefs, or to the beliefs of their peers. The tension between ex ante and ex post review puts investigators in a bind, because most researchers need to find projects that will survive both. By unpacking the tension between these two modes of review, we can understand how they shape the landscape of science and how changes to peer review might shift scientific activity in unforeseen directions.
“To sum up, the existence of a public version of a manuscript (i.e., the preprint) opens up many new avenues for peer review, and these are largely positive for the integrity of the scientific record. However, many of these peer review efforts run in parallel to peer review at the journal. As I hope I’ve illustrated above, there’s no clear way to decide what counts as legitimate discussion of a preprint and what is unethical duplicate peer review. As preprints become more prevalent we may need to abandon our hopes of enforcing sequential peer review entirely, and that may not be a bad thing.”
Abstract: The coronavirus pandemic has radically changed the scientific world. During these difficult times, standard peer-review processes could be too slow for the continuously evolving knowledge about this disease. We wanted to assess whether the use of other types of network could be a faster way to disseminate knowledge about coronavirus disease. We retrospectively analyzed the data flow among three distinct groups of networks during the first three months of the pandemic: PubMed, preprint repositories (bioRxiv and arXiv) and social media in Italy (Facebook and Twitter). The results show a significant difference in the number of original research articles published by PubMed and preprint repositories. On social media, we observed a remarkable number of physicians participating in the discussion, both in three distinct Italian-speaking Facebook groups and on Twitter. The standard scientific process of publishing articles (i.e., the peer-review process) remains the best way to get access to high-quality research. Nonetheless, this process may be too slow during an emergency like a pandemic. The thoughtful use of other types of network, such as preprint repositories and social media, could be taken into consideration in order to improve the clinical management of COVID-19 patients.
“PLOS keeps a watchful and enthusiastic eye on emerging research, and we update our policies as needed to address new challenges and opportunities that surface. In doing so, we work to advance our core mission and values aimed at transforming research communication and promoting Open Science.
Here, I summarize a few key updates we made between 2016-2021….”
“We see PREreview Communities as elements of an interconnected network of preprint reviewers, each with its own shared purpose, rules, and components—all working together towards a shared goal. This goal is to build a better research culture, one where everyone regardless of their career level, identity, cultural and educational background, is empowered to share their constructive feedback openly, is valued, and is recognized for their contributions….”
“In early August, it was announced that UK Research and Innovation (UKRI) would provide significant funding for a new open publishing platform. Called Octopus, this initiative is not yet fully launched, but when it is it plans to “provide a new ‘primary research record’ for recording and appraising research “as it happens’”; UKRI calls Octopus “a ground-breaking global service which could positively disrupt research culture for the better.” I reached out to Octopus’s founder, Dr. Alexandra Freeman, to ask some questions about Octopus and its plans for the future….”
Abstract: This paper outlines a creative Wikipedia-based project developed by the University of Kansas (KU) Libraries and the KU Biology Department. Inspired by the tenets of open pedagogy, the purpose of this project is to use Wikipedia as a way for students to learn about the scholarly peer review process while also producing material that can be shared and used by the world outside the classroom. The paper is divided into three sections, the first summarizing literature pertinent to the paper’s topic. From here, the paper describes the proposed assignment, detailing a process wherein students write new articles for the encyclopedia which are then anonymously peer reviewed by other students in the class; when articles are deemed acceptable, they are published via Wikipedia. The parallels between this project and academic peer review are emphasized throughout. The paper closes by discussing the importance of this project, arguing that it fills a known scholarly need, actively produces knowledge, furthers the aims of the open access movement, and furthers scientific outreach initiatives.
“What’s the deal with how we review papers for venues (like conferences and journals) for free and then they go on to sell and restrict access to them? How about let’s only review for venues that freely distribute papers and stop reviewing for those that restrict access? We can also stop reviewing for those that charge high publishing charges, I believe over $100 per submission is unacceptable.
We have the power to put an end to closed access research. By only reviewing for venues that freely distribute papers, we will ensure they have the best publications and become the premier venues. It will then become in everyone’s best interest to publish in venues with freely accessible papers….”
“Publications that are based on wrong data, contain methodological mistakes, or suffer from other types of severe errors can spoil the scientific record if they are not retracted. Retraction of publications is one of the effective ways to correct the scientific record. However, before a problematic publication can be retracted, the problem has to be found and brought to the attention of the people involved (the authors of the publication and editors of the journal). The earlier a problem with a published paper is detected, the earlier the publication can be retracted and the less wasted effort goes into new research that is based on disinformation within the scientific record. Therefore, it would be advantageous to have an early warning system that spots potential problems with published papers, or perhaps even earlier, based on a preprint version….”