Abstract: The state of open science needs to be monitored to track changes over time and to identify areas where interventions can drive improvements. In order to monitor open science practices, they first need to be well defined and operationalized. To reach consensus on which open science practices to monitor at biomedical research institutions, we conducted a modified 3-round Delphi study. Participants were research administrators, researchers, specialists in dedicated open science roles, and librarians. In rounds 1 and 2, participants completed an online survey evaluating a set of potential open science practices, and for round 3, we hosted two half-day virtual meetings to discuss and vote on items that had not reached consensus. Ultimately, participants reached consensus on 19 open science practices. This core set of open science practices will form the foundation for institutional dashboards and may also be of value for the development of policy, education, and interventions.
Abstract: Starting in the 2010s, researchers in the experimental social sciences rapidly began to adopt increasingly open and reproducible scientific practices. These practices include publicly sharing deidentified data when possible, sharing analysis code, and preregistering study protocols. Empirical evidence from the social sciences suggests such practices are feasible, can improve analytic reproducibility, and can reduce selective reporting. In academic epidemiology, adoption of open-science practices has been slower than in the social sciences (with some notable exceptions, such as registering clinical trials). Epidemiologic studies are often large, complex, conceived after data have already been collected, and difficult to directly replicate by collecting new data. These characteristics make it especially important to ensure their integrity and analytic reproducibility. Open-science practices can also pay immediate dividends to researchers’ own work by clarifying scientific reasoning and encouraging well-documented, organized workflows. We consider how established epidemiologists and early-career researchers alike can help midwife a culture of open science in epidemiology through their research practices, mentorship, and editorial activities.
Citizen science (CS), as an enabler of open science (OS) practices, is a low-cost and accessible method for data collection in biodiversity monitoring, which can empower and educate the public both on scientific research priorities and on environmental change. Just as OS increases research transparency and scientific democratisation, so too should properly implemented CS. Here, we present the findings of a systematic review exploring “openness” of CS in biodiversity monitoring. CS projects were scored between −1 (closed) and 1 (open) on their adherence to defined OS principles: accessible data, code, software, publication, data management plans, and preregistrations. Openness scores per principle were compared to see where OS is more frequently utilised across the research process. The relationship between interest in CS and openness within the practice was also tested. Overall, CS projects had an average open score of 0.14. There was a significant difference in open scores between OS principles (p < 0.0001), where “open data” was the most adhered-to practice, while the lowest scores were found for preregistrations. The apparent level of interest in CS was not shown to correspond to a significant increase in openness within CS (p = 0.8464). These results reveal that CS is not generally “open” despite being an OS approach, with implications for how the public can interact with the research that they play an active role in contributing to. The development of systematic recommendations on where and how OS can be implemented across the research process in citizen science projects is encouraged.
Abstract: Scientific studies of language span many disciplines and provide evidence for social, cultural, cognitive, technological, and biomedical studies of human nature and behavior. By becoming increasingly empirical and quantitative, linguistics has been facing challenges and limitations of the scientific practices that pose barriers to reproducibility and replicability. One of the proposed solutions to the widely acknowledged reproducibility and replicability crisis has been the implementation of transparency practices, e.g. open access publishing, preregistrations, sharing study materials, data, and analyses, performing study replications and declaring conflicts of interest. Here, we have assessed the prevalence of these practices in 600 randomly sampled journal articles from linguistics across two time points. In line with similar studies in other disciplines, we found a moderate proportion of articles published open access, but overall low rates of sharing materials, data, and protocols, no preregistrations, very few replications and low rates of conflict of interest reports. These low rates have not increased noticeably between 2008/2009 and 2018/2019, pointing to remaining barriers and slow adoption of open and reproducible research practices in linguistics. As linguistics has not yet firmly established transparency and reproducibility as guiding principles in research, we provide recommendations and solutions for facilitating the adoption of these practices.
“OSIRIS will investigate, trial and implement interventions to improve reproducibility in science
While over the past decade many interventions to improve reproducibility have been introduced, targeted at funders, publishers or individual researchers, only a few of them have been empirically tested. OSIRIS will do just that, testing existing and newly developed interventions, including Open Science practices, through controlled trials. The underlying drivers, barriers and incentives of reproducibility will also be studied in a systematic way. The aim is to deliver and disseminate guidance about evidence-based interventions that can improve the reproducibility of scientific findings….”
“The Sustainable Development Goals (SDGs) and open science are symbiotic processes. No SDG reveals this connection more strongly than SDG 13-Climate Action. This perspective uses the SDGs as a lens to explore open science practices and prospects. It illustrates, through the concept of Net-Zero, how open science has been an accelerator of SDG 13-Climate Action. It also shows how open science can be further advanced in the context of SDG 13, discussing related SDGs such as Goal 9-Industry, Innovation and Infrastructure; Goal 16-Peace, Justice, and Strong Institutions; and Goal 17-Partnerships for the Goals. In these ways, this perspective describes opportunities for open science and SDG-Climate Action to support and accelerate one another.”
“Open science, the practice of sharing findings and resources towards the collaborative pursuit of scientific progress and societal good, can accelerate the pace of research and contribute to a more equitable society. However, the current culture of scientific research is not optimally structured to promote extensive sharing of a range of outputs. In this policy position paper, we outline current open science practices and key bottlenecks in their broader adoption. We propose that national science agencies create a digital infrastructure framework that would standardize open science principles and make them actionable. We also suggest ways of redefining research success to align better with open science, and to incentivize a system where sharing various research outputs is beneficial to researchers.”
“Implementing FAIR Workflows: A Proof of Concept Study in the Field of Consciousness is a 3-year project funded by the Templeton World Charity Foundation. In this project, DataCite works with a number of partners on providing an exemplar workflow that researchers can use to implement FAIR practices throughout their research lifecycle. In this monthly blog series, the different project participants will share perspectives on FAIR practices and recommendations.
In this post, Xiaoli Chen, project lead at DataCite, reflects on the gap between acknowledging FAIR and practicing FAIR….”
For many years now, the open social scholarship community in Canada has examined its practices and capacities for scholarly communication in the digital age, both in terms of making the scholarly discourse richer, more efficient, and more responsive, and with an eye to making scholarly discourse in the humanities more relevant and interesting to audiences outside our specific disciplines and indeed the academy itself. Attention to “new knowledge environments” has proved both fruitful and inspiring, but the scholarly community remains rooted in a set of very traditional scholarly communications forms/practices: conference presentations, journal articles, and books. These traditional forms are rooted in—even arguably constitutionally defined by—peer review practices. Whether these traditional forms have bright futures in the digital age is a topic for another discussion, but it seems fair to argue that peer review itself is and will continue to be a constitutional component of scholarly communications.
What to make of peer review, then? As an artifact largely of the twentieth century and the late age of print, we might expect its role to shift in new, digital formats and genres, and its form and function to be responsive to disciplinary and methodological innovations. And yet, there is a sense in which peer review remains a stubborn, poorly understood, and ritualized practice. We generally lack good conceptual models of the what, the how, and the why of peer review practices, even as we consistently uphold their centrality to scholarly work.
As such, this essay is an exploration of peer review in theory and practice, and an attempt to work out what it might mean in the context of the humanities specifically, and especially in terms of open social scholarship. In this essay, I take the scholarly journal as the fundamental case, and as such much of the discussion that follows is an appraisal and attempted re-imagining of some fairly conventional forms. My aim here is not to praise or condemn peer review itself, nor any particular flavour or format of it in practice. Rather, the goal is to understand what peer review might ideally be for in open, humanities scholarship: how we might think about it, how to identify the precepts upon which our practices might be founded, and indeed, where its heart lies.
Abstract: Purpose: Open science is a collection of practices that seek to improve the accessibility, transparency, and replicability of science. Although these practices have garnered interest in related fields, it remains unclear whether open science practices have been adopted in the field of communication sciences and disorders (CSD). This study aimed to survey the knowledge, implementation, and perceived benefits and barriers of open science practices in CSD.
Method: An online survey was disseminated to researchers in the United States actively engaged in CSD research. Four core open science practices were examined: preregistration, self-archiving, gold open access, and open data. Data were analyzed using descriptive statistics and regression models.
Results: Two hundred twenty-two participants met the inclusion criteria. Most participants were doctoral students (38%) or assistant professors (24%) at R1 institutions (58%). Participants reported low knowledge of preregistration and gold open access. There was, however, a high level of desire to learn more for all practices. Implementation of open science practices was also low, most notably for preregistration, gold open access, and open data (< 25%). Predictors of knowledge and participation, as well as perceived barriers to implementation, are discussed.
Conclusion: Although participation in open science appears low in the field of CSD, participants expressed a strong desire to learn more in order to engage in these practices in the future.
“The National Academies of Sciences, Engineering, and Medicine’s Roundtable on Aligning Incentives for Open Scholarship will organize a one-day public workshop in conjunction with its Fall 2022 meeting. The workshop will explore actions being taken by various stakeholder organizations to foster the broad adoption of policies and practices in support of open scholarship. It will bring together participants from universities, scholarly societies, federal agencies, and private research funders. A Proceedings of a Workshop–in Brief will be prepared by designated rapporteurs and distributed broadly.
The public is invited to register to join virtually….”
Hyde, A., Pattinson, D., & Shannon, P. (2022). Designing for Emergent Workflow Cultures: eLife, PRC, and Kotahi. Commonplace. https://doi.org/10.21428/6ffd8432.ef6691ea
Scholarly publishing is evolving, and there is a need to understand and design the new (emergent) workflows while also designing technology to capture and support these processes. This article documents an ongoing collaboration to develop technology to meet emergent workflows in scholarly publishing, namely Publish-Review-Curate (PRC). We explore this topic with different eLife PRC community stakeholders using Kotahi, a flexible open-source scholarly publishing platform that can support variant workflows (built by Coko).
Ruehling, B., & Piersig, K. (2022). A Book Sprint as a concurrent editorial process. Commonplace. https://doi.org/10.21428/6ffd8432.e1116e78
The Book Sprints method facilitates collaborative content production. A group of experts is guided by a facilitator to write, edit, and produce a book in five days. The strength of the method is the focus on collaboration. It allows the contributors to combine their different research approaches and experiences into a cohesive work. The outcome is not a collection of articles, but a co-authored book with a streamlined reader journey. […]
Steinhardt I, Kruschick F (2022) Knowledge Equity and Open Science in qualitative research – Practical research considerations. Research Ideas and Outcomes 8: e86387. https://doi.org/10.3897/rio.8.e86387
How can Knowledge In/Equity be addressed in qualitative research by taking the idea of Open Science into account? Two projects from the Open Science Fellows Programme by Wikimedia Deutschland will be used to illustrate how Open Science practices can succeed in qualitative research, thereby reducing In/Equity. In this context, In/Equity is considered as a fair and equal representation of people, their knowledge and insights, and encompasses questions about how epistemic, structural, institutional and personal biases generate and shape knowledge. Three questions guide this approach: firstly, what do we understand by In/Equity in the context of knowledge production in these projects? Secondly, who will be involved in knowledge generation, and to what extent will they be valued or unvalued? Thirdly, how can data be made accessible for re-use to enable true participation and sharing?
Fischer C, Hirsbrunner SD, Teckentrup V (2022) Producing Open Data. Research Ideas and Outcomes 8: e86384. https://doi.org/10.3897/rio.8.e86384
Open data offer the opportunity to economically combine data into large-scale datasets, fostering collaboration and re-use in the interest of treating both researchers’ resources and study participants with care. Whereas the advantages of utilising open data might be self-evident, the production of open datasets also challenges individual researchers. This is especially true for open data that include personal data, for which higher legal requirements apply. Building mainly on our own experience as scholars from different research traditions (life sciences, social sciences and humanities), we describe best-practice approaches for opening up research data. We reflect on common barriers and strategies to overcome them, condensed into a step-by-step guide of actionable advice designed to mitigate the costs and promote the benefits of open data on three levels at once: society, the disciplines and individual researchers. Our contribution may prevent researchers and research units from re-inventing the wheel when opening data and enable them to learn from our experience.