International Survey on Data Sharing and Re-use in Traumatic Stress Research

“The Global Collaboration on Traumatic Stress, a coalition of 11 scientific societies in the field of traumatic stress, is conducting a survey to better understand traumatic stress researchers’ opinions and experiences regarding data sharing and data re-use.

If you are a traumatic stress researcher at any career stage (including trainees), we invite you to share your opinions and experiences by participating in this survey. …”

PsyArXiv Preprints | When open data closes the door: Problematising a one size fits all approach to open data in journal submission guidelines

Abstract: Opening data promises to improve research rigour and democratise knowledge production. But it also poses practical, theoretical, and ethical risks for qualitative research. Despite discussion about open data in qualitative social psychology predating the replication crisis, the nuances of this discussion have not been translated into current journal policies. Through a content analysis of 261 journals in the domain of social psychology, we establish the state of current journal policies for open data. We critically discuss how these expectations may not be adequate for establishing qualitative rigour, can introduce ethical challenges, and may place those who wish to use qualitative approaches at a disadvantage in peer review and publication processes. We assert that open data requirements should include clearer guidelines that reflect the nuance of data sharing in qualitative research, and move away from a universal ‘one-size-fits-all’ approach to data sharing.

 

How misconduct helped psychological science to thrive

“Despite this history, before Stapel, researchers were broadly unaware of these problems or dismissed them as inconsequential. Some months before the case became public, a concerned colleague and I proposed to create an archive that would preserve the data collected by researchers in our department, to ensure reproducibility and reuse. A council of prominent colleagues dismissed our proposal on the basis that competing departments had no similar plans. Reasonable suggestions that we made to promote data sharing were dismissed on the unfounded grounds that psychology data sets can never be safely anonymized and would be misused out of jealousy, to attack well-meaning researchers. And I learnt about at least one serious attempt by senior researchers to have me disinvited from holding a workshop for young researchers because it was too critical of suboptimal practices….

Much of the advocacy and awareness has been driven by early-career researchers. Recent cases show how preregistering studies, replication, publishing negative results, and sharing code, materials and data can both empower the self-corrective mechanisms of science and deter questionable research practices and misconduct….

For these changes to stick and spread, they must become systemic. We need tenure committees to reward practices such as sharing data and publishing rigorous studies that have less-than-exciting outcomes. Grant committees and journals should require preregistration or explanations of why it is not warranted. Grant-programme officers should be charged with checking that data are made available in accordance with mandates, and PhD committees should demand that results are verifiable. And we need to strengthen a culture in which top research is rigorous and trustworthy, as well as creative and exciting….”

A New Option for Scientific Exchange and an Alternative to the Commentary Format – Patricia J. Bauer, 2021

“Letters to the Editors will be disseminated online only to permit more rapid publication and to keep the discussion timely and responsive. They will be hosted on Figshare, which is an online open-access repository and the site that hosts all of the journal’s Supplemental Material. To further speed dissemination, accepted Letters to the Editors will not be copyedited or held for replies but instead will be disseminated as quickly as possible on acceptance. Letters to the Editors will have DOIs but, fitting their existence in the liminal space between a formal publication and an unmediated social media conversation, they will not be indexed (i.e., discoverable through PubMed, PsycInfo, etc.). To facilitate connections between the target article and Letters to the Editors, they will be linked to each other. …”

Full article: Making data meaningful: guidelines for good quality open data

“In the most recent editorial for The Journal of Social Psychology (JSP), J. Grahe (2021) set out and justified a new journal policy: publishing papers now requires authors to make available all data on which claims are based. This places the journal amongst a growing group of forward-thinking psychology journals that mandate open data for research outputs. It is clear that the editorial team hopes to raise the credibility and usefulness of research in the journal, as well as the discipline, through increased research transparency….

This commentary represents a natural and complementary alliance between the ambition of JSP’s open data policy and the reality of how data sharing often takes place. We share with JSP the belief that usable and open data is good for social psychology and supports effective knowledge exchange within and beyond academia. For this to happen, we must have not just more open data, but open data that is of a sufficient quality to support repeated use and replication (Towse et al., 2020). Moreover, it is becoming clear that researchers across science are seeking guidance, training and standards for open data provision (D. Roche et al., 2021; Soeharjono & Roche, 2021). With this in mind, we outline several simple steps and point toward a set of freely available resources that can help make datasets more valuable and impactful. Specifically, we explain how to make data meaningful: easily findable, accessible, complete and understandable. We have provided a simple checklist (Table 1) and useful resources (Appendix A) based on our recommendations; these can also be found on the project page for this article (https://doi.org/10.17605/OSF.IO/NZ5WS). While we have focused mostly on sharing quantitative data, much of what has been discussed remains relevant to qualitative research (for an in-depth discussion of qualitative data sharing, see DuBois et al., 2018)….”
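
To make the "complete and understandable" point concrete, here is a minimal sketch, not taken from the article or its Table 1, of one common way to pair a shared dataset with a machine-readable codebook. All file names, variables, and coding conventions below are hypothetical illustrations:

```python
# A minimal sketch of shipping a dataset together with a codebook so that
# re-users can interpret every variable. Names and values are hypothetical.
import pandas as pd

data = pd.DataFrame({
    "participant_id": [1, 2, 3],
    "condition": ["control", "treatment", "control"],
    "anxiety_score": [12, 18, 9],  # hypothetical 0-21 scale
})

codebook = pd.DataFrame({
    "variable": ["participant_id", "condition", "anxiety_score"],
    "description": [
        "Anonymised participant identifier",
        "Experimental condition (control / treatment)",
        "Total score on a 0-21 anxiety scale; higher = more anxious",
    ],
    "type": ["integer", "categorical", "integer"],
    "missing_code": ["n/a", "n/a", "-99"],
})

# Plain CSV keeps both files accessible without proprietary software.
data.to_csv("study1_data.csv", index=False)
codebook.to_csv("study1_codebook.csv", index=False)
```

Depositing both files together in a repository that issues a DOI covers the "findable and accessible" half of the same checklist.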

Frontiers | The Ethic of Access: An AIDS Activist Won Public Access to Experimental Therapies, and This Must Now Extend to Psychedelics for Mental Illness | Psychiatry

“If patients with mental illnesses are to be treated fairly in comparison with other categories of patients, they must be given access to promising experimental therapies, including psychedelics. The right of early access to promising therapies was advanced as an ethical principle by activist Larry Kramer during the AIDS pandemic, and has now largely been adopted by the medical establishment. Patients are regularly granted access to experimental drugs for many illness categories, such as cancer and infectious diseases. The need for expanded access is especially relevant during evolving crises like the AIDS and the coronavirus pandemics. In contrast to non-psychiatric branches of medicine, psychiatry has failed to expedite access to promising drugs in the face of public health emergencies, psychological crises, the wishes of many patients, and the needs of the community. Psychiatry must catch up to the rest of medicine and allow the preferences of patients for access to guide policy and law regarding unapproved medications like psychedelics….

Open questions include how to amplify the voices of patients regarding experimental therapies like psychedelics, how to implement early access, how to educate the public about this option once it exists, and how to ensure equitable access for multiple marginalized groups. A model of political engagement like ACT UP may not work for patients whose symptoms include lack of motivation and will, and who are at risk for re-traumatization. The authors are exploring an entirely patient-led counterpart to traditional academic peer review, which allows diverse patient communities to provide meaningful input into therapies that result from trials….”

 

IBM, MIT and Harvard release DARPA “Common Sense AI” dataset at ICML 2021 | IBM Research Blog

“Before we can build machines that make decisions based on common sense, the AI powering those machines must be capable of more than simply finding patterns in data. It must also consider the intentions, beliefs, and desires of others that people use to intuitively make decisions.

At the 2021 International Conference on Machine Learning (ICML), we are releasing a new dataset for benchmarking AI intuition, along with two machine learning models representing different approaches to the problem. The research has been done with our colleagues at MIT and Harvard University to accelerate the development of AI that exhibits common sense. These tools rely on testing techniques that psychologists use to study the behavior of infants….”

Advancing Scientific Integrity, Transparency, and Openness in Child Development Research: Challenges and Possible Solutions – Gilmore – 2020 – Child Development Perspectives – Wiley Online Library

Abstract: In 2019, the Governing Council of the Society for Research in Child Development (SRCD) adopted a Policy on Scientific Integrity, Transparency, and Openness (SRCD, 2019a) and accompanying Author Guidelines on Scientific Integrity and Openness in Child Development (SRCD, 2019b). In this issue, a companion article (Gennetian, Tamis-LeMonda, & Frank) discusses the opportunities to realize SRCD’s vision for a science of child development that is open, transparent, robust, and impactful. In this article, we discuss some of the challenges associated with realizing SRCD’s vision. In identifying these challenges—protecting participants and researchers from harm, respecting diversity, and balancing the benefits of change with the costs—we also offer constructive solutions.

 

Open science as a path to education of new psychophysiologists – ScienceDirect

Highlights

 

“Open science increases access to resources for training.

Open education practices empower educators to make use of open science resources.

PURSUE is an open education initiative for training in psychophysiology.

PURSUE’s model of open education can generalize to other STEM fields….”

 

Health Psychology adopts Transparency and Openness Promotion (TOP) Guidelines.

“The Editors are pleased to announce that Health Psychology has adopted the Transparency and Openness Promotion (TOP) Guidelines (Center for Open Science, 2021). We and the other core American Psychological Association (APA) journals are implementing these guidelines at the direction of the APA Publications and Communications Board. Their decision was made with the support of the APA Council of Editors and the APA Open Science and Methodology Committee.

The TOP Guidelines were originally published in Science (Nosek et al., 2015) to encourage journals to incentivize open research practices. They are being implemented by a wide range of scientific publications, including some of the leading behavioral and medical research journals….”

Symposium: A critical analysis of the scientific reform movement

“As the science reform movement has gathered momentum to change research culture and behavior relating to openness, rigor, and reproducibility, so has the critical analysis of the reform efforts. This symposium includes five perspectives examining distinct aspects of the reform movement to illuminate and challenge underlying assumptions about the value and impact of changing practices, to identify potential unintended or counterproductive consequences, and to provide a meta perspective of metascience and open science. It’s meta, all the way up.

Each presenter will provide a 15-minute perspective followed by a concluding discussion among the panelists and a time to address audience questions. Visit cos.io/meta-meta to view session abstracts and speaker info.”

PsychOpen CAMA

“PsychOpen CAMA enables accessing meta-analytic datasets, reproducing meta-analyses and dynamically updating evidence from new primary studies collaboratively….

A CAMA (Community Augmented Meta Analysis) is an open repository for meta-analytic data that provides meta-analytic analysis tools….

PsychOpen CAMA enables easy access and automated reproducibility of meta-analyses in psychology and related fields. This has several benefits for the research community:

Evidence can be kept updated by adding new studies published after the meta-analysis.
Researchers with special research questions can use subsets of the data or rerun meta-analyses using different moderators.
Flexible analyses with the datasets enable the application of new statistical procedures or different graphical displays.
The cumulated evidence in the CAMA can be used to get a quick overview of existing research gaps. This may suggest which study designs or moderators would be especially interesting for future studies, so that limited research resources are used where they will most enhance the evidence.
Given existing meta-analytic evidence, the necessary sample size of future studies to detect an effect of a reasonable size can be estimated. Moreover, the effect of possible future studies on the results of the existing meta-analytic evidence can be simulated.
PsychOpen CAMA offers tutorials to better understand the reasoning behind meta-analyses and to learn the basic steps of conducting a meta-analysis to empower other researchers to contribute to our project for the benefit of the research community….”
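
The sample-size planning benefit described above lends itself to a short illustration. The following is a minimal sketch, not PsychOpen CAMA's actual interface (which the excerpt does not document), of turning a pooled meta-analytic effect size into a required sample size for a future study; the effect size is a hypothetical placeholder:

```python
# A minimal sketch of power planning from meta-analytic evidence.
# The pooled effect size is a hypothetical placeholder, not a value
# drawn from any PsychOpen CAMA dataset.
from statsmodels.stats.power import TTestIndPower

meta_analytic_d = 0.35  # hypothetical pooled Cohen's d

# Per-group sample size for 80% power in a two-sided,
# two-sample t-test at alpha = .05 (nobs1 is left unset, so it is solved for).
n_per_group = TTestIndPower().solve_power(
    effect_size=meta_analytic_d,
    alpha=0.05,
    power=0.80,
    alternative="two-sided",
)
print(f"Required n per group: {n_per_group:.0f}")  # about 129 per group
```

Simulating the impact of a possible future study, the other benefit mentioned above, extends the same idea: append the hypothetical study's effect size and sample size to the dataset, rerun the meta-analysis, and compare the pooled estimates before and after.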

 

Improving Social Science: Lessons from the Open Science Movement | PS: Political Science & Politics | Cambridge Core

“Recent years have been times of turmoil for psychological science. Depending on whom you ask, the field underwent a “replication crisis” (Shrout and Rodgers 2018) or a “credibility revolution” (Vazire 2018) that might even climax in “psychology’s renaissance” (Nelson, Simmons, and Simonsohn 2018). This article asks what social scientists can learn from this story. Our take-home message is that although differences in research practices make it difficult to prescribe cures across disciplines, much still can be learned from interdisciplinary exchange. We provide nine lessons but first summarize psychology’s experience and what sets it apart from neighboring disciplines….”

Transparency and Open Science at the Journal of Personality – Wright – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been furthered for increasing the rigor of the published research through openness and transparency. The Journal has long prized and published the type of research with features, like large sample sizes (Fraley & Vazire, 2014), that has fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding), did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices that are encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal with a broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Openness Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these strike a desirable balance between compelling a high degree of openness and transparency while not being overly onerous and a deterrent for authors interested in the Journal as an outlet for their work….”