Open Science Practices in Communication Sciences and Disorders: A Survey | Journal of Speech, Language, and Hearing Research

Abstract:  Purpose: Open science is a collection of practices that seek to improve the accessibility, transparency, and replicability of science. Although these practices have garnered interest in related fields, it remains unclear whether open science practices have been adopted in the field of communication sciences and disorders (CSD). This study aimed to survey the knowledge, implementation, and perceived benefits and barriers of open science practices in CSD.

Method: An online survey was disseminated to researchers in the United States actively engaged in CSD research. Four core open science practices were examined: preregistration, self-archiving, gold open access, and open data. Data were analyzed using descriptive statistics and regression models.

Results: Two hundred twenty-two participants met the inclusion criteria. Most participants were doctoral students (38%) or assistant professors (24%) at R1 institutions (58%). Participants reported low knowledge of preregistration and gold open access. There was, however, a high level of desire to learn more for all practices. Implementation of open science practices was also low, most notably for preregistration, gold open access, and open data (< 25%). Predictors of knowledge and participation, as well as perceived barriers to implementation, are discussed.

Conclusion: Although participation in open science appears low in the field of CSD, participants expressed a strong desire to learn more in order to engage in these practices in the future.

Methodology over metrics: current scientific standards are a disservice to patients and society – Journal of Clinical Epidemiology

Abstract:  COVID-19 research made it painfully clear that the scandal of poor medical research, as denounced by Altman in 1994, persists today. The overall quality of medical research remains poor, despite longstanding criticisms. The problems are well known, but the research community fails to properly address them. We suggest that most problems stem from an underlying paradox: although methodology is undeniably the backbone of high-quality and responsible research, science consistently undervalues methodology. The focus remains more on the destination (research claims and metrics) than on the journey. Notwithstanding, research should serve society more than the reputation of those involved. While we notice that many initiatives are being established to improve components of the research cycle, these initiatives are too disjointed. The overall system is monolithic and slow to adapt. We assert that top-down action is needed from journals, universities, funders and governments to break the cycle and put methodology first. These actions should involve the widespread adoption of registered reports, balanced research funding between innovative, incremental and methodological research projects, full recognition and demystification of peer review, improved methodological review of reports, adherence to reporting guidelines, and investment in methodological education and research. Currently, the scientific enterprise is doing a major disservice to patients and society.

 

Left in the dark: the importance of publicly available clinical trial protocols – Braat – 2022 – Medical Journal of Australia – Wiley Online Library

“Prospective registration of a randomised controlled trial (RCT) based on a protocol with formal ethics approval is a benchmark for transparent medical research. The reporting of the primary results of the study should correspond to the design, analysis, and reporting specified in the protocol and trial registration. However, modifications to various aspects of the trial are often made after registration, ranging from administrative updates to substantial protocol amendments. To track the history of revisions, the protocol and registry entry should be updated, and the documentation trail should support an independent appraisal of whether any biases have been introduced that could affect interpretation of trial results.

In this issue of the MJA, Coskinas and colleagues report their investigation of changes to 181 phase 3 RCTs registered with the Australian New Zealand Clinical Trials Registry (ANZCTR) during 1 September 2007 – 31 December 2013.1 The authors compared protocol documents (including ANZCTR registration information) with subsequent journal publications for any changes to the primary outcome, treatment comparisons, analysis set definition, eligibility criteria, sample size, or primary analysis method. They found that protocols were available for only 124 trials (69%); it could be determined that no major changes had been made to eleven of these trials (9%), while 78 had definitely been modified (63%). By comparing publications with trial registration information, it was found that no changes were made to five of the 57 trials without available protocols (9%), and it could not be determined whether changes had been made to a further ten (18%)….”

PsyArXiv Preprints | Three Myths about Open Science That Just Won’t Die

Abstract:  Knowledge and implementation of open science principles and behaviors remains uneven between and within sub-disciplines in psychology, despite over 10 years of education and advocacy. One reason for the slow and uneven progress of the movement is a set of closely-held myths about the implications of open science practices, exacerbated by the relative isolation of various sub-disciplines in the field. This talk will cover three of the major recurring myths: that open science is in conflict with prioritizing diversity, that “open data” is a binary choice between fully open and accessible and completely closed off, and that preregistration and registered reports are only appropriate for certain types of research designs. Putting these myths to rest is necessary as we work towards improving our scientific practice.

 

Easing Into Open Science: A Guide for Graduate Students and Their Advisors | Collabra: Psychology | University of California Press

Abstract:  This article provides a roadmap to assist graduate students and their advisors to engage in open science practices. We suggest eight open science practices that novice graduate students could begin adopting today. The topics we cover include journal clubs, project workflow, preprints, reproducible code, data sharing, transparent writing, preregistration, and registered reports. To address concerns about not knowing how to engage in open science practices, we provide a difficulty rating of each behavior (easy, medium, difficult), present them in order of suggested adoption, and follow the format of what, why, how, and worries. We give graduate students ideas on how to approach conversations with their advisors/collaborators, ideas on how to integrate open science practices within the graduate school framework, and specific resources on how to engage with each behavior. We emphasize that engaging in open science behaviors need not be an all or nothing approach, but rather graduate students can engage with any number of the behaviors outlined.

 

Reporting and transparent research practices in sports medicine and orthopaedic clinical trials: a meta-research study | BMJ Open

Abstract:  Objectives Transparent reporting of clinical trials is essential to assess the risk of bias and translate research findings into clinical practice. While existing studies have shown that deficiencies are common, detailed empirical and field-specific data are scarce. Therefore, this study aimed to examine current clinical trial reporting and transparent research practices in sports medicine and orthopaedics.

Setting Exploratory meta-research study on reporting quality and transparent research practices in orthopaedics and sports medicine clinical trials.

Participants The sample included clinical trials published in the top 25% of sports medicine and orthopaedics journals over 9 months.

Primary and secondary outcome measures Two independent reviewers assessed pre-registration, open data and criteria related to scientific rigour, such as randomisation, blinding and sample size calculations, as well as the study sample and data analysis.

Results The sample included 163 clinical trials from 27 journals. While the majority of trials mentioned rigour criteria, essential details were often missing. Sixty per cent (95% confidence interval (CI) 53% to 68%) of trials reported sample size calculations, but only 32% (95% CI 25% to 39%) justified the expected effect size. Few trials indicated the blinding status of all main stakeholders (4%; 95% CI 1% to 7%). Only 18% (95% CI 12% to 24%) included information on randomisation type, method and concealed allocation. Most trials reported participants’ sex/gender (95%; 95% CI 92% to 98%) and information on inclusion and exclusion criteria (78%; 95% CI 72% to 84%). Only 20% (95% CI 14% to 26%) of trials were pre-registered. No trials deposited data in open repositories.

Conclusions These results will aid the sports medicine and orthopaedics community in developing tailored interventions to improve reporting. While authors typically mention blinding, randomisation and other factors, essential details are often missing. Greater acceptance of open science practices, like pre-registration and open data, is needed. As these practices have been widely encouraged, we discuss systemic interventions that may improve clinical trial reporting.

OSF Preprints | Open science practices in psychiatric genetics: a primer

Abstract:  Open science is a set of practices to ensure that all research elements are transparently reported and freely accessible for all to learn, assess, and build on. Psychiatric genetics has led among the health sciences in implementing some open science practices in common study designs, such as replication as part of genome-wide association studies. However, while additional open science practices could be embedded in genetics research to further improve its quality and accessibility, guidelines for doing so are limited, and those that exist are largely not specific to the data, privacy, and research conduct challenges of psychiatric genetics. Here, we present a primer of open science practices in psychiatric genetics for multiple steps of the research process, including deciding on a research topic with patients/non-academic collaborators, equitable authorship and citation practices, considerations in designing a replicable, reproducible study, pre-registrations, open data, and privacy issues. We provide tips for creating informative figures, using inclusive, precise language, and following reporting standards. We also discuss considerations in working with non-academic research collaborators (citizen scientists) and outline ways of disseminating research through preprints, blogs, social media, and accessible lecture materials. Finally, we provide a list of extra resources to support every step of the research process.

 

Three Steps to Open Science for Qualitative Research in Psychology

Abstract: Principles and applications of open science (also referred to as open research or open scholarship) in psychology have emerged in response to growing concerns about the replicability, transparency, reproducibility, and robustness of psychological research alongside global moves to open science in many fields. Our objective in this paper is to inform ways of collectively constructing open science practices and systems that are appropriate to, and get the best out of, the full range of qualitative and mixed-method approaches used in psychology. We achieve this by describing three areas of open research practice (contributorship, pre-registration, and open data) and explore how and why qualitative researchers might consider engaging with these in ways that are compatible with a qualitative research paradigm. We argue it is crucial that open research practices do not (even inadvertently) exclude qualitative research, and that qualitative researchers reflect on how we can meaningfully engage with open science in psychology.

 

Advances in transparency and reproducibility in the social sciences – ScienceDirect

Abstract:  Worries about a “credibility crisis” besieging science have ignited interest in research transparency and reproducibility as ways of restoring trust in published research. For quantitative social science, advances in transparency and reproducibility can be seen as a set of developments whose trajectory predates the recent alarm. We discuss several of these developments, including preregistration, data-sharing, formal infrastructure in the form of resources and policies, open access to research, and specificity regarding research contributions. We also discuss the spillovers of this predominantly quantitative effort towards transparency for qualitative research. We conclude by emphasizing the importance of mutual accountability for effective science, the essential role of openness for this accountability, and the importance of scholarly inclusiveness in figuring out the best ways for openness to be accomplished in practice.


Open Science Badges at Taylor & Francis – Editor Resources

“Open Science Badges (OSB) were designed by the Center for Open Science to acknowledge and encourage open science practices. They are offered as incentives for researchers to share data, materials, or to preregister their research. The badges are a visual signal for readers, indicating that the content of the study is available in perpetuity….”

Open Science Practices in Gambling Research Publications (2016–2019): A Scoping Review | SpringerLink

Abstract:  The replication crisis has stimulated researchers around the world to adopt open science research practices intended to reduce publication bias and improve research quality. Open science practices include study pre-registration, open data, open access, and avoiding methods that can lead to publication bias and low replication rates. Although gambling studies uses similar research methods as behavioral research fields that have struggled with replication, we know little about the uptake of open science research practices in gambling-focused research. We conducted a scoping review of 500 recent (1/1/2016–12/1/2019) studies focused on gambling and problem gambling to examine the use of open science and transparent research practices. Our results showed that a small percentage of studies used most practices: whereas 54.6% (95% CI: [50.2, 58.9]) of studies used at least one of nine open science practices, each practice’s prevalence was: 1.6% for pre-registration (95% CI: [0.8, 3.1]), 3.2% for open data (95% CI: [2.0, 5.1]), 0% for open notebook, 35.2% for open access (95% CI: [31.1, 39.5]), 7.8% for open materials (95% CI: [5.8, 10.5]), 1.4% for open code (95% CI: [0.7, 2.9]), and 15.0% for preprint posting (95% CI: [12.1, 18.4]). In all, 6.4% (95% CI: [4.6, 8.9]) of the studies included a power analysis and 2.4% (95% CI: [1.4, 4.2]) were replication studies. Exploratory analyses showed that studies that used any open science practice, and open access in particular, had higher citation counts. We suggest several practical ways to enhance the uptake of open science principles and practices both within gambling studies and in science more generally.

 

Assessing Open Science practices in physical activity behaviour change intervention evaluations | BMJ Open Sport & Exercise Medicine

Abstract:  Objectives Concerns on the lack of reproducibility and transparency in science have led to a range of research practice reforms, broadly referred to as ‘Open Science’. The extent that physical activity interventions are embedding Open Science practices is currently unknown. In this study, we randomly sampled 100 reports of recent physical activity randomised controlled trial behaviour change interventions to estimate the prevalence of Open Science practices.

Methods One hundred reports of randomised controlled trial physical activity behaviour change interventions published between 2018 and 2021 were identified, as used within the Human Behaviour-Change Project. Open Science practices were coded in identified reports, including: study pre-registration, protocol sharing, data, materials and analysis scripts sharing, replication of a previous study, open access publication, funding sources and conflict of interest statements. Coding was performed by two independent researchers, with inter-rater reliability calculated using Krippendorff’s alpha.

Results 78% of the 100 reports provided details of study pre-registration and 41% provided evidence of a published protocol. 4% provided accessible open data, 8% provided open materials and 1% provided open analysis scripts. 73% of reports were published as open access and no studies were described as replication attempts. 93% of reports declared their sources of funding and 88% provided conflicts of interest statements. A Krippendorff’s alpha of 0.73 was obtained across all coding.

Conclusion Open data, materials, analysis and replication attempts are currently rare in physical activity behaviour change intervention reports, whereas funding source and conflict of interest declarations are common. Future physical activity research should increase the reproducibility of their methods and results by incorporating more Open Science practices.

New Instructions to Authors Emphasize Open Science, Transparency, Full Reporting of Sociodemographic Characteristics of the Sample, and Avoidance of Piecemeal Publication | Annals of Behavioral Medicine | Oxford Academic

“We have updated our Author guidelines to more fully reflect the journal’s values and to better align manuscript reporting practices with scientific ideals for open transparency, open science, and contextualization. Accordingly, we are adding a number of new requirements for manuscript submission to Annals of Behavioral Medicine. The updated Author Guidelines (https://academic.oup.com/abm/pages/General_Instructions) describe them in full detail. Here, we briefly summarize some of the most important changes.”

Merit Review Policy – [U of Maryland, Psychology Department]

“Examples of specific evaluative criteria to be used in merit review, based on professional standards for evaluating faculty performance….Openness and transparency: Degree to which research, data, procedures, code, and research products are made openly available where appropriate; the use of registered reports or pre-registration. Committee should recognize that researchers may not be able to share some types of data, such as when data are proprietary or subject to ethical concerns over confidentiality [7, 1, 6, 2, 5]. These limitations should be documented by faculty.”

 

The potential butterfly effect of preregistered peer-reviewed research – The Official PLOS Blog

“Refocusing journal peer review on the study design phase exerts more and greater downstream changes. Peer review that focuses on evaluating the significance of the research question, the methods and analytical approach before work begins, has the power to shape stronger, more rigorous and more creative research. Making an editorial decision while results are still unknown minimizes the potential impacts of confirmation bias and impact bias, taking science communication back to its roots, with an emphasis on quality, rigor, and pure intellectual curiosity. As Kiermer explains, “Preregistration and peer review of the study protocol with a journal is a way to tackle publication bias. As long as the protocol is followed, or any deviations explained, it’s a guarantee for the author that the results will be published, even if they don’t confirm their hypothesis.”

In combination, all of these factors contribute to a more complete and efficient scientific record, replete with studies exploring important hypotheses, performed to the very highest technical standards, and free from the distorting influence of impact-chasing, ego, and bias. A scientific record that is both demonstrably trustworthy, and widely trusted. And with that, there is no telling where science might go, or how quickly….”