Abstract: Objective: The open science movement seeks to make research more transparent, and to that end, researchers are increasingly expected or required to archive their data in national repositories. In qualitative trauma research, data sharing could compromise participants’ safety, privacy, and confidentiality because narrative data can be more difficult to de-identify fully. There is little guidance in the traumatology literature regarding how to discuss data-sharing requirements with participants during the informed consent process. Within a larger research project in which we interviewed assault survivors, we developed and evaluated a protocol for informed consent for qualitative data sharing and engaging participants in data de-identification. Method: We conducted qualitative interviews with N = 32 adult sexual assault survivors regarding (a) how to conduct informed consent for data sharing, (b) whether participants should have input on sharing their data, and (c) whether they wanted to redact information from their transcripts prior to archiving. Results: No potential participants declined participation after learning about the archiving mandate. Survivors indicated that they wanted input on archiving because the interview is their story of trauma and abuse and it would be disempowering not to have control over how this information was shared and disseminated. Survivors also wanted input on this process to help guard their privacy, confidentiality, and safety. None of the participants elected to redact substantive data prior to archiving. Conclusions: Engaging participants in the archiving process is a feasible practice that is important and empowering for trauma survivors.
Abstract: Opening data promises to improve research rigour and democratize knowledge production. But it also presents practical, theoretical, and ethical considerations for qualitative researchers in particular. Discussion about open data in qualitative social psychology predates the replication crisis. However, the nuances of this ongoing discussion have not been translated into current journal guidelines on open data. In this article, we summarize ongoing debates about open data from qualitative perspectives, and through a content analysis of 261 journals we establish the state of current journal policies for open data in the domain of social psychology. We critically discuss how current common expectations for open data may not be adequate for establishing qualitative rigour, can introduce ethical challenges, and may place those who wish to use qualitative approaches at a disadvantage in peer review and publication processes. We advise that future open data guidelines should aim to reflect the nuance of arguments surrounding data sharing in qualitative research, and move away from a universal “one-size-fits-all” approach to data sharing. This article outlines the past, present, and the potential future of open data guidelines in social-psychological journals. We conclude by offering recommendations for how journals might more inclusively consider the use of open data in qualitative methods, whilst recognizing and allowing space for the diverse perspectives, needs, and contexts of all forms of social-psychological research.
Abstract: Qualitative data sharing practices in psychology have not developed as rapidly as those in parallel quantitative domains. This is often explained by numerous epistemological, ethical and pragmatic issues concerning qualitative data types. In this article, I provide an alternative to the frequently expressed, often reasonable, concerns regarding the sharing of qualitative human data by highlighting three advantages of qualitative data sharing. I argue that sharing qualitative human data is not by default ‘less ethical’, ‘riskier’ and ‘impractical’ compared with quantitative data sharing, but in some cases more ethical, less risky and easier to manage for sharing because (1) informed consent can be discussed, negotiated and validated; (2) the shared data can be curated by special means; and (3) the privacy risks are mainly local instead of global. I hope this alternative perspective further encourages qualitative psychologists to share their data when it is epistemologically, ethically and pragmatically possible.
“Psychology journals are not immune to targeting by paper mills. Difficulties in obtaining peer reviewers have led many journals, such as this one, to ask authors to recommend peer reviewers. This creates a crack in the defences of a journal against fraud, if it is combined with lack of editorial oversight. This case illustrates the benefits of open peer review in detecting fraud….”
“The American Psychological Association has signed an agreement with Jisc offering participating UK authors capped open access publishing in its journals….”
Abstract: Open science is a set of practices to ensure that all research elements are transparently reported and freely accessible for all to learn, assess, and build on. Psychiatric genetics has led among the health sciences in implementing some open science practices in common study designs, such as replication as part of genome-wide association studies. However, while additional open science practices could be embedded in genetics research to further improve its quality and accessibility, guidelines for doing so are limited. They are largely not specific to data, privacy, and research conduct challenges in psychiatric genetics. Here, we present a primer of open science practices in psychiatric genetics for multiple steps of the research process, including deciding on a research topic with patients/non-academic collaborators, equitable authorship and citation practices, considerations in designing a replicable, reproducible study, pre-registrations, open data, and privacy issues. We provide tips for creating informative figures, using inclusive, precise language, and following reporting standards. We also discuss considerations in working with non-academic research collaborators (citizen scientists) and outline ways of disseminating research through preprints, blogs, social media, and accessible lecture materials. Finally, we provide a list of extra resources to support every step of the research process.
“The number of articles indexed on ScienceOpen is rapidly approaching 80 million. ScienceOpen works with a diverse community of publishers and is a great and useful resource for research in many subjects, including psychology.
Many of our partners publish in the field of psychology, and in this post, we will highlight some of the most notable and recently added psychology-related content and collections on the ScienceOpen network. …”
Abstract: Principles and applications of open science (also referred to as open research or open scholarship) in psychology have emerged in response to growing concerns about the replicability, transparency, reproducibility, and robustness of psychological research, alongside global moves to open science in many fields. Our objective in this paper is to inform ways of collectively constructing open science practices and systems that are appropriate to, and get the best out of, the full range of qualitative and mixed-method approaches used in psychology. We achieve this by describing three areas of open research practice (contributorship, pre-registration, and open data) and exploring how and why qualitative researchers might consider engaging with these in ways that are compatible with a qualitative research paradigm. We argue it is crucial that open research practices do not (even inadvertently) exclude qualitative research, and that qualitative researchers reflect on how we can meaningfully engage with open science in psychology.
“The University of Maryland is rewarding faculty members in the department of psychology who perform and disseminate research in accordance with open science practices. In April, the department adopted new guidelines that explicitly codify open science as a core criterion in tenure and promotion review.
The change was several years in the making and championed by Michael Dougherty, chair of the department. “When you think about the goal and purpose of higher education and why we take these positions, it’s because we felt there would be some good that we could impart on the world,” Dougherty said. “The traditional markers of impact are how many times you’ve been cited [in a journal]. That’s not the type of impact that is valuable to the broader society.”
The new policy was necessary, he said, so incentives for advancement reflect the values of scientists and their institutions….”
The scientific community has long recognized the benefits of open science. Today, governments and research agencies worldwide are increasingly promoting and mandating open practices for scientific research. However, for open science to become the default model for scientific research, researchers must perceive open practices as accessible and achievable. A significant obstacle is the lack of resources providing a clear direction on how researchers can integrate open science practices into their day-to-day workflows. This article outlines and discusses ten concrete strategies that can help researchers use and disseminate open science. The first five strategies address basic ways of getting started in open science that researchers can put into practice today. The last five strategies are for researchers who are more advanced in open practices and wish to advocate for open science. Our paper will help researchers navigate the transition to open science practices and support others in shifting toward openness, thus contributing to better science.
“Ten years ago, mindful of the trend of research being increasingly generated, but not increasingly revisited, Nosek and colleagues decided to re-run a series of published scientific experiments, creating the “Many Labs” project. The global effort, which at times has been both headline-grabbing and apple cart-turning, wrapped up at the end of April….”
Abstract: Objectives: Concerns about the lack of reproducibility and transparency in science have led to a range of research practice reforms, broadly referred to as ‘Open Science’. The extent to which physical activity interventions embed Open Science practices is currently unknown. In this study, we randomly sampled 100 reports of recent physical activity randomised controlled trial behaviour change interventions to estimate the prevalence of Open Science practices.
Methods: One hundred reports of randomised controlled trial physical activity behaviour change interventions published between 2018 and 2021 were identified, as used within the Human Behaviour-Change Project. Open Science practices were coded in identified reports, including: study pre-registration, protocol sharing, data, materials and analysis scripts sharing, replication of a previous study, open access publication, funding sources and conflict of interest statements. Coding was performed by two independent researchers, with inter-rater reliability calculated using Krippendorff’s alpha.
Results: 78% of the 100 reports provided details of study pre-registration and 41% provided evidence of a published protocol. 4% provided accessible open data, 8% provided open materials and 1% provided open analysis scripts. 73% of reports were published as open access and no studies were described as replication attempts. 93% of reports declared their sources of funding and 88% provided conflicts of interest statements. A Krippendorff’s alpha of 0.73 was obtained across all coding.
Conclusion: Open data, materials, analysis scripts and replication attempts are currently rare in physical activity behaviour change intervention reports, whereas funding source and conflict of interest declarations are common. Future physical activity research should increase the reproducibility of its methods and results by incorporating more Open Science practices.
“It takes 270 psychologists, 100 study findings, and four years to start a replication crisis. Or at least these were the ingredients of the Reproducibility Project: Psychology, published in 2015. This project responded to widespread concerns that many psychological findings might be false-positives and that the reported effects therefore do not actually exist. The goal of the Reproducibility Project was to test whether these concerns were well-founded by estimating the replicability of psychological science….
A more positive view of the Reproducibility Project and its findings has developed over recent years. Many scholars believe that the Reproducibility Project and similar efforts led researchers to finally acknowledge that there was a problem. These replication efforts sparked a transformation of psychological science towards more openness and transparency. For example, many researchers now preregister their studies and make their data and materials publicly available. Consequently, some scholars argue that psychological research is now more credible and productive than ever….”
“Examples of specific evaluative criteria to be used in merit review, based on professional standards for evaluating faculty performance…. Openness and transparency: Degree to which research, data, procedures, code, and research products are made openly available where appropriate; the use of registered reports or pre-registration. Committee should recognize that researchers may not be able to share some types of data, such as when data are proprietary or subject to ethical concerns over confidentiality [7, 1, 6, 2, 5]. These limitations should be documented by faculty.”