Full article: Making data meaningful: guidelines for good quality open data

“In the most recent editorial for The Journal of Social Psychology (JSP), J. Grahe (2021) set out and justified a new journal policy: publishing papers now requires authors to make available all data on which claims are based. This places the journal amongst a growing group of forward-thinking psychology journals that mandate open data for research outputs.1 It is clear that the editorial team hopes to raise the credibility and usefulness of research in the journal, as well as the discipline, through increased research transparency….

This commentary represents a natural and complementary alliance between the ambition of JSP’s open data policy and the reality of how data sharing often takes place. We share with JSP the belief that usable and open data is good for social psychology and supports effective knowledge exchange within and beyond academia. For this to happen, we must have not just more open data, but open data that is of a sufficient quality to support repeated use and replication (Towse et al., 2020). Moreover, it is becoming clear that researchers across science are seeking guidance, training and standards for open data provision (D. Roche et al., 2021; Soeharjono & Roche, 2021). With this in mind, we outline several simple steps and point toward a set of freely available resources that can help make datasets more valuable and impactful. Specifically, we explain how to make data meaningful: easily findable, accessible, complete and understandable. We have provided a simple checklist (Table 1) and useful resources (Appendix A) based on our recommendations; these can also be found on the project page for this article (https://doi.org/10.17605/OSF.IO/NZ5WS). While we have focused mostly on sharing quantitative data, much of what has been discussed remains relevant to qualitative research (for an in-depth discussion of qualitative data sharing, see DuBois et al., 2018)….”
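One concrete way to act on the "complete and understandable" recommendation is to ship a dataset together with a machine-readable codebook that documents every variable. A minimal sketch in Python (the file names, variable names, and values here are invented for illustration, not taken from the article or its checklist):

```python
import csv
import json

# Illustrative rows; a real dataset would come from the study itself.
rows = [
    {"participant_id": 1, "condition": "control", "score": 42},
    {"participant_id": 2, "condition": "treatment", "score": 57},
]

# Write the data as plain CSV so it stays accessible without special software.
with open("study1_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

# A codebook mapping each variable name to a human-readable description
# makes the dataset understandable to reusers without the original authors.
codebook = {
    "participant_id": "Unique anonymised participant identifier",
    "condition": "Experimental condition: 'control' or 'treatment'",
    "score": "Total questionnaire score (0-100, higher = better)",
}
with open("study1_codebook.json", "w") as f:
    json.dump(codebook, f, indent=2)
```

Pairing a plain-text data file with a codebook in this way costs little and directly supports the repeated use and replication the commentary calls for.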

Frontiers | The Ethic of Access: An AIDS Activist Won Public Access to Experimental Therapies, and This Must Now Extend to Psychedelics for Mental Illness | Psychiatry

“If patients with mental illnesses are to be treated fairly in comparison with other categories of patients, they must be given access to promising experimental therapies, including psychedelics. The right of early access to promising therapies was advanced as an ethical principle by activist Larry Kramer during the AIDS pandemic, and has now largely been adopted by the medical establishment. Patients are regularly granted access to experimental drugs for many illness categories, such as cancer and infectious diseases. The need for expanded access is especially relevant during evolving crises like the AIDS and the coronavirus pandemics. In contrast to non-psychiatric branches of medicine, psychiatry has failed to expedite access to promising drugs in the face of public health emergencies, psychological crises, the wishes of many patients, and the needs of the community. Psychiatry must catch up to the rest of medicine and allow the preferences of patients for access to guide policy and law regarding unapproved medications like psychedelics….

Open questions include how to amplify the voices of patients regarding experimental therapies like psychedelics, how to implement early access, how to educate the public about this option once it exists, and how to ensure equitable access for multiple marginalized groups. A model of political engagement like ACT UP may not work for patients whose symptoms include lack of motivation and will, and who are at risk for re-traumatization. The authors are exploring an entirely patient-led counterpart to traditional academic peer review, which allows diverse patient communities to provide meaningful input into therapies that result from trials….”


IBM, MIT and Harvard release DARPA “Common Sense AI” dataset at ICML 2021 | IBM Research Blog

“Before we can build machines that make decisions based on common sense, the AI powering those machines must be capable of more than simply finding patterns in data. It must also consider the intentions, beliefs, and desires of others that people use to intuitively make decisions.

At the 2021 International Conference on Machine Learning (ICML), we are releasing a new dataset for benchmarking AI intuition, along with two machine learning models representing different approaches to the problem. The research has been done with our colleagues at MIT and Harvard University to accelerate the development of AI that exhibits common sense. These tools rely on testing techniques that psychologists use to study the behavior of infants….”

Advancing Scientific Integrity, Transparency, and Openness in Child Development Research: Challenges and Possible Solutions – Gilmore – 2020 – Child Development Perspectives – Wiley Online Library

Abstract:  In 2019, the Governing Council of the Society for Research in Child Development (SRCD) adopted a Policy on Scientific Integrity, Transparency, and Openness (SRCD, 2019a) and accompanying Author Guidelines on Scientific Integrity and Openness in Child Development (SRCD, 2019b). In this issue, a companion article (Gennetian, Tamis-LeMonda, & Frank) discusses the opportunities to realize SRCD’s vision for a science of child development that is open, transparent, robust, and impactful. In this article, we discuss some of the challenges associated with realizing SRCD’s vision. In identifying these challenges—protecting participants and researchers from harm, respecting diversity, and balancing the benefits of change with the costs—we also offer constructive solutions.


Open science as a path to education of new psychophysiologists – ScienceDirect

Highlights


Open science increases access to resources for training.

Open education practices empower educators to make use of open science resources.

PURSUE is an open education initiative for training in psychophysiology.

PURSUE’s model of open education can generalize to other STEM fields….”


Health Psychology adopts Transparency and Openness Promotion (TOP) Guidelines.

“The Editors are pleased to announce that Health Psychology has adopted the Transparency and Openness Promotion (TOP) Guidelines (Center for Open Science, 2021). We and the other core American Psychological Association (APA) journals are implementing these guidelines at the direction of the APA Publications and Communications Board. Their decision was made with the support of the APA Council of Editors and the APA Open Science and Methodology Committee.

The TOP Guidelines were originally published in Science (Nosek et al., 2015) to encourage journals to incentivize open research practices. They are being implemented by a wide range of scientific publications, including some of the leading behavioral and medical research journals….”

Symposium: A critical analysis of the scientific reform movement

“As the science reform movement has gathered momentum to change research culture and behavior relating to openness, rigor, and reproducibility, so has the critical analysis of the reform efforts. This symposium includes five perspectives examining distinct aspects of the reform movement to illuminate and challenge underlying assumptions about the value and impact of changing practices, to identify potential unintended or counterproductive consequences, and to provide a meta perspective of metascience and open science. It’s meta, all the way up.

Each presenter will provide a 15-minute perspective followed by a concluding discussion among the panelists and a time to address audience questions. Visit cos.io/meta-meta to view session abstracts and speaker info.”

PsychOpen CAMA

“PsychOpen CAMA enables accessing meta-analytic datasets, reproducing meta-analyses and dynamically updating evidence from new primary studies collaboratively….

A CAMA (Community Augmented Meta-Analysis) is an open repository for meta-analytic data that provides analysis tools….

PsychOpen CAMA enables easy access and automated reproducibility of meta-analyses in psychology and related fields. This has several benefits for the research community:

Evidence can be kept updated by adding new studies published after the meta-analysis.
Researchers with special research questions can use subsets of the data or rerun meta-analyses using different moderators.
Flexible analyses with the datasets enable the application of new statistical procedures or different graphical displays.
The cumulated evidence in the CAMA can be used to get a quick overview of existing research gaps. This may suggest which study designs or moderators would be especially worthwhile for future studies, helping direct limited research resources where they most strengthen the evidence.
Given existing meta-analytic evidence, the necessary sample size of future studies to detect an effect of a reasonable size can be estimated. Moreover, the effect of possible future studies on the results of the existing meta-analytic evidence can be simulated.
PsychOpen CAMA offers tutorials to better understand the reasoning behind meta-analyses and to learn the basic steps of conducting a meta-analysis to empower other researchers to contribute to our project for the benefit of the research community….”
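The sample-size point in the excerpt above can be made concrete: given a pooled effect size from an existing meta-analysis, the per-group sample size a future study needs follows from the standard normal-approximation power formula. A minimal sketch (this is not PsychOpen CAMA's own tooling, and the pooled effect value is hypothetical):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sample comparison to detect a
    standardized mean difference d, using the normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Hypothetical pooled effect size taken from an existing meta-analysis
pooled_d = 0.35
print(n_per_group(pooled_d))  # -> 129 participants per group
```

The same quantities can be inverted: fixing a planned sample size and solving for the detectable effect is the usual starting point for simulating how a possible future study would shift the existing meta-analytic estimate.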


Improving Social Science: Lessons from the Open Science Movement | PS: Political Science & Politics | Cambridge Core

“Recent years have been times of turmoil for psychological science. Depending on whom you ask, the field underwent a “replication crisis” (Shrout and Rodgers 2018) or a “credibility revolution” (Vazire 2018) that might even climax in “psychology’s renaissance” (Nelson, Simmons, and Simonsohn 2018). This article asks what social scientists can learn from this story. Our take-home message is that although differences in research practices make it difficult to prescribe cures across disciplines, much still can be learned from interdisciplinary exchange. We provide nine lessons but first summarize psychology’s experience and what sets it apart from neighboring disciplines….”

Transparency and Open Science at the Journal of Personality – Wright – – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been put forward for increasing the rigor of the published research through openness and transparency. The Journal has long prized and published the type of research with features, like large sample sizes (Fraley & Vazire, 2014), that has fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding), did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices that are encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal with a broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Openness Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these strike a desirable balance between compelling a high degree of openness and transparency while not being overly onerous and a deterrent for authors interested in the Journal as an outlet for their work….”

BABCP journals, openness and transparency | Behavioural and Cognitive Psychotherapy | Cambridge Core

“Our BABCP journals have for some time been supportive of open science in its various forms. We are now taking the next steps towards this in terms of our policies and practices. For some things we are transitioning to the changes (but would encourage our contributors to embrace these as early as possible), and in others we are implementing things straight away. This is part of the global shift to open practices in science, and has many benefits and few, if any, drawbacks. See for example http://www.unesco.org/new/en/communication-and-information/portals-and-platforms/goap/open-science-movement/

One of the main drivers for open science has been the recent ‘reproducibility crisis’, which crystallised long-standing concerns about a range of biases within and across research publication. Open science and research transparency will provide the means to reduce the impact of such biases, and can reasonably be considered to be a paradigm change. There are benefits beyond dealing with problems, however.

McKiernan et al. (2016) for example suggest that ‘open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities’. This is, of course, from a researcher-focused perspective. The BABCP and the Journal Editors take the view that open and transparent research practices will have the greatest long-term impact on service users both directly and indirectly through more accurate reporting and interpretation of research and its applications by CBT practitioners. So what are the practical changes we are implementing in partnership with our publisher, Cambridge University Press?…”

PsyArXiv Preprints | Replicability, Robustness, and Reproducibility in Psychological Science

Abstract:  Replication, an important, uncommon, and misunderstood practice, is making a comeback in psychology. Achieving replicability is a necessary but not sufficient condition for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and validity can advance knowledge. Assessing replicability can be productive for generating and testing hypotheses by actively confronting current understanding to identify weaknesses and spur innovation. For psychology, the 2010s might be characterized as a decade of active confrontation. Systematic and multi-site replication projects assessed current understanding and observed surprising failures to replicate many published findings. Replication efforts also highlighted sociocultural challenges, such as disincentives to conduct replications, framing of replication as personal attack rather than healthy scientific practice, and headwinds for replication contributing to self-correction. Nevertheless, innovation in doing and understanding replication, and its cousins, reproducibility and robustness, have positioned psychology to improve research practices and accelerate progress.

As new venues for peer review flower, will journals catch up? – Psychonomic Society Featured Content

“Given that preprints are here to stay, the field should be devoting resources to getting them certified more quickly as having received some amount of expert scrutiny. This is particularly important, of course, for preprints making claims relevant to the response to the pandemic.

In many cases, one component of this certification is already happening very quickly. More publicly-available peer review is happening today than ever before – just not at our journals. While academic journals typically call on half a handful of hand-picked, often reluctant referees, social media is not as limiting, and lively expert discussions are flourishing at forums like Twitter, Pubpeer, and the commenting facility of preprint servers.

So far, most journals have simply ignored this. As a result, science is now happening on two independent tracks, one slow, and one fast. The fast track is chaotic and unruly, while the slow track is bureaucratic and secretive – at most journals the experts’ comments never become available to readers, and the resulting evaluation by the editor of the strengths and weaknesses of the manuscript is never communicated to readers….

Will we need to reinvent the scientific journal wheel, or will legacy journals catch up with the modern world, by both taking advantage of and adding value to the peer review that is happening on the fast track?”


PsyArXiv Preprints | The Pandemic as a Portal: Reimagining Psychological Science as Truly Open and Inclusive

Abstract:  Psychological science is at an inflection point: The COVID-19 pandemic has already begun to exacerbate inequalities that stem from our historically closed and exclusive culture. Meanwhile, reform efforts to change the future of our science are too narrow in focus to fully succeed. In this paper, we call on psychological scientists—focusing specifically on those who use quantitative methods in the United States as one context in which such a conversation can begin—to reimagine our discipline as fundamentally open and inclusive. First, we discuss who our discipline was designed to serve and how this history produced the inequitable reward and support systems we see today. Second, we highlight how current institutional responses to address worsening inequalities are inadequate, as well as how our disciplinary perspective may both help and hinder our ability to craft effective solutions. Third, we take a hard look in the mirror at the disconnect between what we ostensibly value as a field and what we actually practice. Fourth and finally, we lead readers through a roadmap for reimagining psychological science in whatever roles and spaces they occupy, from an informal discussion group in a department to a formal strategic planning retreat at a scientific society.