“Outside of eLife and, to an extent, PLoS, no one of scale and weight in the commercial publishing sector has really climbed aboard the Open Science movement with a recognition of the sort of data and communication control that Open Science will require.
So what is that requirement? In two words – Replicability and Retraction. …”
“‘Open research’ (used interchangeably with ‘open science’) is an all-encompassing term speaking to the set of practices that aim to improve the accessibility, reproducibility, and integrity of research outputs. It’s also complex, spanning issues such as open access, open practices that increase the integrity and reproducibility of research (e.g., Registered Reports, open data and code), open collaboration, and open recognition (e.g., transparent peer review and the CRediT Contributor Roles Taxonomy).
So, what do researchers think about open research? We invited researchers to participate in Wiley’s Open Research Survey to share their views and experiences of open research practices. Our findings make clear that researchers welcome open research initiatives: they are motivated to publish open access, willing to share data, and ready to experiment with opening up the peer review process (see overview below for more detail).
Recent studies have shown that articles that are freely available obtain more citations and are downloaded more often. Institutions are beginning to reward and recognise open research practices, especially in recruitment and for promotion. Funders are also requiring that researchers publish open access and share data (for example, Horizon Europe).
Open research isn’t the future – it’s the here and now, and journal editors have a vital role to play in facilitating open research and open publishing practices alongside researchers, institutions, funders, and publishers. Editors can play their part by supporting open access publishing, adopting Registered Reports, adopting open data policies and data availability statements, recognizing and celebrating open research practices such as displaying open research badges on published articles, and opening up peer review. If you want to implement one or more of these initiatives on your journal, please speak with your Wiley Journal Publishing Manager….”
“Open science reduces waste and accelerates the discovery of knowledge, solutions, and cures for the world’s most pressing needs. Shifting research culture toward greater openness, transparency, and reproducibility is challenging, but there are incremental steps at every stage of the research lifecycle that can improve rigor and reduce waste. Visit cos.io to learn more.”
Abstract: In January 2020, I presented at the Librarians Building Momentum for Reproducibility virtual conference. The theme of the presentation was preregistration and registered reports and their role in reproducibility of research results. The presentation was twofold in that it provided background information on these themes and then advocated for the adoption of a registered reports submission track in Library and Information Science journals. I asked attendees to notify me if they wanted to learn more and to join me in contacting LIS journals to advocate for this model. The first journal that we targeted was College & Research Libraries. We drafted a letter that was sent to editor Wendi Arant Kaspar who discussed the topic with the editorial board and ultimately asked me to write a guest editorial for C&RL.
Ignore citation counts. Given that citations are unrelated to (easily-predictable) replicability, let alone any subtler quality aspects, their use as an evaluative tool should stop immediately.
Open data, enforced by the NSF/NIH. There are problems with privacy but I would be tempted to go as far as possible with this. Open data helps detect fraud. And let’s have everyone share their code, too—anything that makes replication/reproduction easier is a step in the right direction.
Financial incentives for universities and journals to police fraud. It’s not easy to structure this well because on the one hand you want to incentivize them to minimize the frauds published, but on the other hand you want to maximize the frauds being caught. Beware Goodhart’s law!
Why not do away with the journal system altogether? The NSF could run its own centralized, open website; grants would require publication there. Journals are objectively not doing their job as gatekeepers of quality or truth, so what even is a journal? A combination of taxonomy and reputation. The former is better solved by a simple tag system, and the latter is actually misleading. Peer review is unpaid work anyway; it could continue as is. Attach a replication prediction market (with the estimated probability displayed in gargantuan neon-red font right next to the paper title) and you’re golden. Without the crutch of “high ranked journals” maybe we could move to better ways of evaluating scientific output. No more editors refusing to publish replications. You can’t shift the incentives: academics want to publish in “high-impact” journals, and journals want to selectively publish “high-impact” research. So just make it impossible. Plus, as a bonus side effect, this would finally sink Elsevier….”
“We believe that the value of science is in the rigor of the method, not the appeal of the results – an ethos at the heart of our publishing model. Choosing to publish your research as a Registered Report puts this into practice, by shifting the focus away from the results and back to the research question. Registered Reports can be used for research in almost any field of study, from psychology and neuroscience to medicine and ecology.
Registered Reports on F1000Research follow a two-stage process: first, the Study Protocol (Stage 1) is published and peer-reviewed by subject experts before data collection begins. Then, once the research has been completed, the Research Article (Stage 2) is published, peer-reviewed, and awarded a Registered Report badge.
F1000Research is the first publisher to combine the Registered Reports format with an open, post-publication peer review model. Alongside our open data policy, this format enhances credibility, and takes transparency and reproducibility in research to the next level….”
“3.5.7 Registered reports and open practices badges
One possible way to incorporate all the information listed above and to combat the stigma against papers that report nonsignificant findings is through the implementation of Registered Reports or rewarding transparent research practices. Registered Reports are empirical articles designed to eliminate publication bias and incentivize best scientific practice: the methods and proposed analyses are preregistered and reviewed prior to the research being conducted. This format is designed to minimize bias, while also allowing complete flexibility to conduct exploratory (unregistered) analyses and report serendipitous findings. The cornerstone of the Registered Reports format is that the authors submit as a Stage 1 manuscript an introduction, complete and transparent methods, and the results of any pilot experiments (where applicable) that motivate the research proposal, written in the future tense. These proposals include a description of the key research question and background literature, hypotheses, experimental design and procedures, analysis pipeline, a statistical power analysis, and a full description of the planned comparisons. Submissions are reviewed by editors, peer reviewers, and, in some journals, statistical editors; those meeting the rigorous and transparent requirements for conducting the proposed research are offered an in-principle acceptance, meaning that the journal guarantees publication if the authors conduct the experiment in accordance with their approved protocol. Many journals publish the Stage 1 report, which could be beneficial not only for citations, but for the authors’ progress reports and tenure packages. Following data collection, the authors prepare and resubmit a Stage 2 manuscript that includes the introduction and methods from the original submission plus their obtained results and discussion.
The Stage 2 manuscript will undergo full review; referees will consider whether the data test the authors’ proposed hypotheses by satisfying the approved outcome-neutral conditions, will ensure the authors adhered precisely to the registered experimental procedures, and will review any unregistered post hoc analyses added by the authors to confirm they are justified, methodologically sound, and informative. At this stage, the authors must also share their data (see also Wiley’s Data Sharing and Citation Policy) and analysis scripts on a public and freely accessible archive such as Figshare, Dryad, or the Open Science Framework. Additional details, including template reviewer and author guidelines, can be found on the Open Science Framework from the Center for Open Science (see also [94]).
Authors who practice transparent and rigorous science should be recognized for this work. Funders can encourage and reward open practice in significant ways (see https://wellcome.ac.uk/what-we-do/our-work/open-research). One way journals can support this is to award badges to authors in recognition of these open scientific practices. Badges certify that a particular practice was followed, but do not define good practice. As defined by the Open Science Framework, three badges can be earned. The Open Data badge is earned for making publicly available the digitally shareable data necessary to reproduce the reported results; these data must be accessible via an open-access repository, and must be permanent (e.g., a registration on the Open Science Framework, or an independent repository at www.re3data.org). The Open Materials badge is earned when the components of the research methodology needed to reproduce the reported procedure and analysis are made publicly available. The Preregistered badge is earned for having a preregistered design, whereas the Preregistered+Analysis Plan badge is earned for having both a preregistered research design and an analysis plan, with results reported according to that plan. Additional information about the badges, including the requirements for earning each, can be found on the Open Science Framework from the Center for Open Science….”
Abstract: The Open Science movement has gained considerable traction in the last decade. The Open Science movement tries to increase trust in research results and open the access to all elements of a research project to the public. Central to these goals, Open Science has promoted five critical tenets: Open Data, Open Analysis, Open Materials, Preregistration, and Open Access. All Open Science elements can be thought of as extensions to the traditional way of achieving openness in science, which has been scientific publication of research outcomes in journals or books. Open Science in education sciences, however, has the potential to be much more than a safeguard against questionable research. Open Science in education science provides opportunities to (a) increase the transparency and therefore replicability of research and (b) develop and answer research questions about individuals with learning disabilities and learning difficulties that were previously impossible to answer due to complexities in data analysis methods. We will provide overviews of the main tenets of Open Science (i.e., Open Data, Open Analysis, Open Materials, Preregistration, and Open Access), show how they are in line with grant funding agencies’ expectations for rigorous research processes, and present resources on best practices for each of the tenets.
Abstract: Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in recent years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.
Methods: We scored the author guidelines of a comprehensive set of 28 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. This instrument rates the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.
Results: Across the 28 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 2.5 [1, 3], min. 0, max. 9, out of a total possible score of 28) in sleep research and chronobiology journals.
Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support the recent developments in transparent and open science by implementing transparency and openness principles in their guidelines and making adherence to them mandatory.
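The TOP Factor arithmetic summarised in the Results above is simple to sketch. The breakdown below is an assumption inferred from the stated maximum of 28: nine TOP standards rated 0–3 each plus an open-science-badges item rated 0–1. The journal names and ratings are hypothetical, purely for illustration:

```python
from statistics import median

# Hypothetical per-criterion ratings for three journals: nine TOP
# standards scored 0-3 each, plus a final open-science-badges item
# scored 0-1 (assumed breakdown; it yields the stated maximum of 28).
journal_ratings = {
    "Journal A": [1, 1, 0, 0, 1, 0, 0, 0, 0, 0],
    "Journal B": [0, 2, 1, 0, 0, 0, 0, 0, 0, 0],
    "Journal C": [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
}

# A journal's TOP Factor is the sum of its per-criterion ratings.
totals = {name: sum(ratings) for name, ratings in journal_ratings.items()}
print(totals)  # {'Journal A': 3, 'Journal B': 3, 'Journal C': 1}

# The study then summarises the distribution across journals,
# e.g. the median score reported in the Results.
print(median(totals.values()))  # 3
```

With real data the same summary would be taken over all 28 journals, which is how a median of 2.5 on a 0–28 scale arises.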
“During the 2010s, I gradually adopted open science practices. With each study I started, I began to take more and more steps to make my research transparent. I started uploading my data, documenting analysis procedures, pre-registering my work, and taking other steps to ensure my research was transparent. After adding components of open science to my work, I finally decided in fall 2017 that I would conduct a fully open science project. My only regret was that I didn’t fully embrace open science earlier….
This article is the result of the first fully open science study of my career, though I had adopted pieces of open science beforehand. Here is what I learned from this study: …”
“Expectations by funders for transparent and reproducible methods are on the rise. This session will cover expectations for preregistration, data sharing, and open access results of three key funders of education research including the Institute of Education Sciences, the National Science Foundation, and Arnold Ventures. Presenters will cover practical resources for meeting these requirements such as the Registry for Efficacy and Effectiveness Studies (REES), the Open Science Framework (OSF), and EdArXiv.”
“3) That all publicly funded research is registered and published in designated Research Repositories. The majority of research is funded by public and charitable funds. Yet huge amounts of research are never published at all, which, aside from being an indefensible waste of public money, is a major source of publication bias [3]. Meanwhile, basic research documentation which is essential to ensure appropriate research conduct, such as protocols, is only sometimes available, either on voluntary databases or upon agreement of study authors. The World Health Organization (WHO) has long urged registration of trials in affiliated ‘primary registries’, such as ClinicalTrials.gov [17] and the EU Clinical Trials Register [18], which can all be searched simultaneously via a dedicated WHO website [19]. Mandatory registration of trials has improved transparency, although compliance with publication requirements is poor [20], possibly hampered by problems with the basic functionality of some major registries [21][22]. Even where trials have been registered, usually only very limited information is shared, rather than the full protocols required to really understand study plans. Most researchers don’t work in trials. Some principled scientists do register their work, but while this remains voluntary such researchers are likely to remain a minority. For all publicly funded research, not just trials, comprehensive documentation including protocols, statistical analysis plans, statistical analysis code, and raw or appropriately de-identified summary data should be available on a single WHO-affiliated repository, designated for that purpose by each state or group of states. Depositing documentation need not become onerous for researchers and could actually replace much of the overly bureaucratic reporting currently required by funders and ethics committees. Different solutions may exist in different countries.
For example, England’s Health Research Authority could develop such a registry [23] by building on its existing public databases [24]. Or, through additional national funding and international support, existing platforms which promote transparency and accessibility [25][26][27] could be designated for this purpose through collaboration with national research bodies.”
“A new ranking system for academic journals measuring their commitment to research transparency will be launched next month – providing what many believe will be a useful alternative to journal impact scores.
Under a new initiative from the Center for Open Science, based in Charlottesville, Virginia, more than 300 scholarly titles in psychology, education and biomedical science will be assessed on 10 measures related to transparency, with their overall result for each category published in a publicly available league table.
The centre aims to provide scores for about 1,000 journals within six to eight months of their site’s launch in early February….”
“Are traditional research articles still meeting researchers’ communication needs? Over the past decade, Open Science and the rise in digital publications together have facilitated a more agile ecosystem of research-sharing. For researchers, that means: faster pathways to sharing their discoveries; greater transparency of assessment which helps increase reliability and public trust; and more opportunities for collaborations that accelerate advancements in the field.
With increased options for sharing and evaluating science, we’re looking at ways to segment the research-sharing lifecycle to fit the research process. How do we share important, urgent discoveries earlier, without compromising quality? What other essential products of research can we be more transparent about?…”