Mapping contemporary “research on research” and “science studies”: how new methods change the traditional academic landscape and inform public open science policies

Abstract:  One of the ambitions outlined by France’s second National Plan for Open Science (July 2021) was to create an Open Science Lab dedicated to developing “research on research” with a focus on open science and with the objective of informing French public policy decisions. As part of the groundwork for setting up this Lab, the French Committee for Open Science called for an exploratory study to better understand the international scope and context of “research on research” (RoR) and its connections with open science, as well as with other research currently being carried out in metascience, science of science, and science and technology studies.

Far from presenting a static landscape, the study found that while some research on science and scientific communities is based on well-established, pre-existing academic fields and methods, other, more recent trends (metascience, metaresearch, RoR, etc.) have adopted a prescriptive commitment to fostering better and more open science. It highlights the debates that contemporary research on research and science is fueling around key issues such as reproducibility, evidence-based policy, integrity, and inclusivity. It also echoes some community-issued concerns about “reinventing the wheel” when it comes to studying science, scientific communities, and their productions.

Nature welcomes Registered Reports

“This year marks the 50th anniversary of Nature’s decision to mandate peer review for all papers. It’s an appropriate time to introduce readers and authors to Registered Reports, a research-article format that Nature is offering from this week for studies designed to test whether a hypothesis is supported (see go.nature.com/3kivjh1).

The fundamental principle underpinning a Registered Report is that a journal commits to publishing a paper if the research question and the methodology chosen to address it pass peer review, with the result itself taking a back seat. For now, Nature is offering Registered Reports in the field of cognitive neuroscience and in the behavioural and social sciences. In the future, we plan to extend this to other fields, as well as to other types of study, such as more exploratory research.

Why are we introducing this format? In part to try to address publication bias, the tendency of the research system — editors, reviewers and authors — to favour the publication of positive over negative results. Registered Reports help to incentivize research regardless of the result. An elegant and robust study should be appreciated as much for its methodology as for its results….”

What’s in a Badge? A Computational Reproducibility Investigation of the Open Data Badge Policy in One Issue of Psychological Science – Sophia Crüwell, Deborah Apthorp, Bradley J. Baker, Lincoln Colling, Malte Elson, Sandra J. Geiger, Sebastian Lobentanzer, Jean Monéger, Alex Patterson, D. Samuel Schwarzkopf, Mirela Zaneva, Nicholas J. L. Brown, 2023

Abstract:  In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 articles provided at least some data and six provided analysis code, but only one article was rated to be exactly reproducible, and three were rated as essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the disclosure method of awarding badges.
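
The gap the authors document between “data shared” and “results reproduced” is easiest to see in miniature. Below is a hedged, minimal sketch of the kind of check a reproducibility rating rests on: recompute one reported statistic from a shared dataset and compare it with the published value. The file name, column, reported value, and tolerance are hypothetical placeholders, not the study’s actual protocol.

```python
# Minimal computational reproducibility check (illustrative placeholders).
import pandas as pd

REPORTED_MEAN = 4.21   # value transcribed from the article (hypothetical)
TOLERANCE = 0.005      # allow for rounding in the published text

df = pd.read_csv("shared_open_data.csv")   # data deposited under the badge
recomputed = df["response_time"].mean()

if abs(recomputed - REPORTED_MEAN) <= TOLERANCE:
    print(f"Reproduced: {recomputed:.3f} matches reported {REPORTED_MEAN}")
else:
    print(f"Discrepancy: recomputed {recomputed:.3f} vs reported {REPORTED_MEAN}")
```

In practice a full check repeats this for every reported value, which is why missing analysis code so often downgrades an article from “exactly reproducible” to something weaker.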

Open science, closed doors: The perils and potential of open science for research in practice | Industrial and Organizational Psychology | Cambridge Core

This paper advocates for the value of open science in many areas of research. However, after briefly reviewing the fundamental principles underlying open science practices and their use and justification, the paper identifies four incompatibilities between those principles and scientific progress through applied research. The incompatibilities concern barriers to sharing and disclosure, limitations and deficiencies of overidentifying with hypothetico-deductive methods of inference, the paradox of replication efforts resulting in less robust findings, and changes to the professional research and publication culture such that it will narrow in favor of a specific style of research. Seven recommendations are presented to maximize the value of open science while minimizing its adverse effects on the advancement of science in practice.

Water science must be Open Science | Nature Water

“Since water is a common good, the outcome of water-related research should be accessible to everyone. Since Open Science is more than just open access research articles, journals must work with the research community to enable fully open and FAIR science…”

Open Science for water research | Luxembourg Institute of Science and Technology

“For the launch of the new scientific journal Nature Water, researchers Emma and Stan Schymanski contributed an article about the future of water research. This opinion paper focuses on the importance of open science in a field where, due to its global societal relevance, knowledge and research results should be freely accessible by a wide range of stakeholders. The publication also highlights the interdisciplinary expertise brought to Luxembourg by the two FNR ATTRACT fellows on such a topical subject….

Research on water systems can help us face these considerable challenges but needs to consider the global societal relevance of its subject. “Since water is a common good, it should be natural that the outcome of water-related research is accessible to everyone,” explains Dr Stan Schymanski. “It needs to become freely available and re-usable for everybody, without the need for paid licenses to view publications or use data.”

The two researchers insist on the importance of implementing Open Science in its broadest definition. It has to go beyond open access to research articles: it must also include open data and open-source computer code. Additionally, open data should be aligned with the FAIR Principles, which describe how to make data findable, accessible, interoperable and reusable. Open reproducible research can only be achieved through the combination of all these aspects.
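
As a concrete illustration of what “open data aligned with the FAIR Principles” can look like in practice, here is a minimal, hypothetical metadata record for a shared hydrological dataset. The field names loosely follow common DataCite/schema.org conventions; the identifiers and URLs are invented placeholders, not anything prescribed by the article.

```python
# Illustrative FAIR-style metadata sidecar for an open dataset.
# All identifiers and URLs below are invented placeholders.
import json

metadata = {
    "identifier": "https://doi.org/10.xxxx/example",    # findable: persistent ID
    "title": "Daily streamflow, example catchment, 1950-2020",
    "license": "https://creativecommons.org/licenses/by/4.0/",  # reusable
    "format": "text/csv",                                # interoperable: open format
    "accessURL": "https://repository.example.org/streamflow.csv",  # accessible
    "variables": [{"name": "discharge", "unit": "m3/s"}],
}

with open("streamflow.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```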

Their Nature Water article details how this is vital for the development of early warning systems for floods, for example, as reliable forecasting relies heavily on real-time sharing of meteorological data. It is also crucial when studying processes on long time scales, such as groundwater recharge, which can take centuries in arid systems. Understanding these natural mechanisms is only possible through free access to long time series of hydrological data across the globe.

After reviewing the tools already available to perform open water research – such as open repositories, templates to facilitate reproducibility assessments, practical guidelines for sharing code and choosing appropriate licenses – the two authors call for substantial additional efforts toward fully open science….”

Toward open and reproducible epidemiology | American Journal of Epidemiology | Oxford Academic

Abstract:  Starting in the 2010s, researchers in the experimental social sciences rapidly began to adopt increasingly open and reproducible scientific practices. These practices include publicly sharing deidentified data when possible, sharing analysis code, and preregistering study protocols. Empirical evidence from the social sciences suggests such practices are feasible, can improve analytic reproducibility, and can reduce selective reporting. In academic epidemiology, adoption of open-science practices has been slower than in the social sciences (with some notable exceptions, such as registering clinical trials). Epidemiologic studies are often large, complex, conceived after data have already been collected, and difficult to directly replicate by collecting new data. These characteristics make it especially important to ensure their integrity and analytic reproducibility. Open-science practices can also pay immediate dividends to researchers’ own work by clarifying scientific reasoning and encouraging well-documented, organized workflows. We consider how established epidemiologists and early-career researchers alike can help midwife a culture of open science in epidemiology through their research practices, mentorship, and editorial activities.
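
One practice the abstract names, sharing deidentified data, can be sketched concretely. The snippet below is a hedged illustration, not guidance from the article: the column names are invented, and real deidentification also requires a disclosure-risk review, not just dropping and coarsening columns.

```python
# Illustrative preparation of a deidentified extract before public sharing.
# Column names are hypothetical; a real workflow adds a disclosure-risk review.
import pandas as pd

df = pd.read_csv("cohort_raw.csv")

# Remove direct identifiers entirely.
public = df.drop(columns=["name", "address", "medical_record_number"])

# Coarsen quasi-identifiers: keep birth year only, bin age into 5-year groups.
public["birth_year"] = pd.to_datetime(df["date_of_birth"]).dt.year
public = public.drop(columns=["date_of_birth"])
public["age_group"] = (df["age"] // 5) * 5

public.to_csv("cohort_public.csv", index=False)
```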


TIER2

“Enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility…

TIER2 aims to boost knowledge on reproducibility, create tools, engage communities, implement interventions and policy across different contexts to increase re-use and overall quality of research results….”

Boosting the reproducibility of research

“Recent years have seen perceptions of a “reproducibility crisis” grow in various disciplines. Scientists see poor levels of reproducibility as a severe threat to scientific self-correction, the efficiency of research processes, and societal trust in research results. A major Horizon Europe-funded project named TIER2 starts this month to study these issues and improve reproducibility across diverse scientific contexts….

The interdisciplinary TIER2 consortium comprises ten members from universities and research centers across Europe. They share a long history of successful cooperation and have extensive experience in completed EU projects, especially in the fields of Open Science, Research Integrity, and Science Policy: Know Center (Austria), Athena Research Center (Greece), Amsterdam University Medical Center (Netherlands), Aarhus University (Denmark), Pensoft Publishing (Bulgaria), GESIS Leibniz Institute for the Social Sciences (Germany), OpenAIRE (EU), Charité – University of Medicine Berlin (Germany), Oxford University (UK), and Alexander Fleming Biomedical Sciences Research Center (Greece)….”


MetaArXiv Preprints | Reproducible research practices and transparency across linguistics

Abstract:  Scientific studies of language span many disciplines and provide evidence for social, cultural, cognitive, technological, and biomedical studies of human nature and behavior. By becoming increasingly empirical and quantitative, linguistics has been facing challenges and limitations of the scientific practices that pose barriers to reproducibility and replicability. One of the proposed solutions to the widely acknowledged reproducibility and replicability crisis has been the implementation of transparency practices, e.g., open access publishing, preregistrations, sharing study materials, data, and analyses, performing study replications, and declaring conflicts of interest. Here, we have assessed the prevalence of these practices in 600 randomly sampled journal articles from linguistics across two time points. In line with similar studies in other disciplines, we found a moderate proportion of articles published open access, but overall low rates of sharing materials, data, and protocols, no preregistrations, very few replications, and low rates of conflict-of-interest reports. These low rates have not increased noticeably between 2008/2009 and 2018/2019, pointing to remaining barriers and the slow adoption of open and reproducible research practices in linguistics. As linguistics has not yet firmly established transparency and reproducibility as guiding principles in research, we provide recommendations and solutions for facilitating the adoption of these practices.
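
Prevalence figures of this kind are simple proportions over a coded sample, usually reported with a confidence interval. A minimal sketch, using invented counts rather than the paper’s actual results:

```python
# Estimate the prevalence of a transparency practice in a coded sample,
# with a 95% Wilson confidence interval. All counts are hypothetical.
from statsmodels.stats.proportion import proportion_confint

n_sampled = 300        # articles coded at one time point (hypothetical)
n_sharing_data = 24    # of those, articles that shared data (hypothetical)

rate = n_sharing_data / n_sampled
low, high = proportion_confint(n_sharing_data, n_sampled, alpha=0.05, method="wilson")
print(f"Data sharing: {rate:.1%} (95% CI {low:.1%}-{high:.1%})")
```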


Open Science to Increase Reproducibility in Science – Research Data Management

“OSIRIS will investigate, trial and implement interventions to improve reproducibility in science

While over the past decade many interventions to improve reproducibility have been introduced, targeted at funders, publishers, or individual researchers, only a few of them have been empirically tested. OSIRIS will do just that, testing existing and newly developed interventions, including Open Science practices, through controlled trials. The underlying drivers, barriers, and incentives of reproducibility will also be studied in a systematic way. The aim is to deliver and disseminate guidance about evidence-based interventions that can improve the reproducibility of scientific findings….”
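
“Testing interventions through controlled trials” implies a straightforward analysis at the end: compare the rate of reproducible results between an intervention arm and a control arm. The sketch below is a hedged illustration with invented numbers; OSIRIS’s actual designs and outcome measures may differ.

```python
# Two-arm comparison of reproducibility rates (hypothetical numbers).
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

reproducible = np.array([41, 28])   # reproducible results: intervention, control
total = np.array([100, 100])        # studies assessed per arm

stat, p_value = proportions_ztest(reproducible, total)
print(f"41% vs 28% reproducible: z = {stat:.2f}, p = {p_value:.3f}")
```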

Ten simple rules for implementing open and reproducible research practices after attending a training course | PLOS Computational Biology

Abstract:  Open, reproducible, and replicable research practices are a fundamental part of science. Training is often organized at the grassroots level, offered by early-career researchers for early-career researchers. Buffet-style courses that cover many topics can inspire participants to try new things; however, they can also be overwhelming. Participants who want to implement new practices may not know where to start once they return to their research team. We describe ten simple rules to guide participants of relevant training courses in implementing robust research practices in their own projects once they return to their research group. This includes (1) prioritizing and planning which practices to implement, which involves obtaining support and convincing others involved in the research project of the added value of implementing new practices; (2) managing problems that arise during implementation; and (3) making reproducible research and open science practices an integral part of a future research career. We also outline strategies that course organizers can use to prepare participants for implementation and support them during this process.