Responsible Research Assessment | Open Science Talk

Abstract:  Felix Schönbrodt, Professor of Psychology at Ludwig-Maximilians-Universität (LMU) in Munich, describes an initiative that he coordinates within the Deutsche Gesellschaft für Psychologie (German Psychological Society). Motivated by the Reproducibility Crisis and rising frustration with the publishers of high-ranking journals, Schönbrodt has co-authored three position papers on responsible research assessment. Their proposal is a two-stage evaluation system for hiring: the first stage uses responsible metrics with an emphasis on open data, pre-registration and several aspects of reproducibility, while the second stage focuses on a qualitative (content-oriented) evaluation of selected candidates. The group’s proposals have so far drawn published feedback from more than 40 scholars. Besides his nationwide work within the German Psychological Society, Schönbrodt is the managing director of LMU’s Open Science Centre, where scholars from different disciplines convene for workshops on various aspects of Open Science. Under the nickname «nicebread» (Schön = nice, Brodt = bread), he also runs a personal blog and a project webpage on GitHub.


Open-Science Guidance for Qualitative Research: An Empirically Validated Approach for De-Identifying Sensitive Narrative Data – Rebecca Campbell, McKenzie Javorka, Jasmine Engleton, Kathryn Fishwick, Katie Gregory, Rachael Goodman-Williams, 2023

Abstract:  The open-science movement seeks to make research more transparent and accessible. To that end, researchers are increasingly expected to share de-identified data with other scholars for review, reanalysis, and reuse. In psychology, open-science practices have been explored primarily within the context of quantitative data, but demands to share qualitative data are becoming more prevalent. Narrative data are far more challenging to de-identify fully, and because qualitative methods are often used in studies with marginalized, minoritized, and/or traumatized populations, data sharing may pose substantial risks for participants if their information can be later reidentified. To date, there has been little guidance in the literature on how to de-identify qualitative data. To address this gap, we developed a methodological framework for remediating sensitive narrative data. This multiphase process is modeled on common qualitative-coding strategies. The first phase includes consultations with diverse stakeholders and sources to understand reidentifiability risks and data-sharing concerns. The second phase outlines an iterative process for recognizing potentially identifiable information and constructing individualized remediation strategies through group review and consensus. The third phase includes multiple strategies for assessing the validity of the de-identification analyses (i.e., whether the remediated transcripts adequately protect participants’ privacy). We applied this framework to a set of 32 qualitative interviews with sexual-assault survivors. We provide case examples of how blurring and redaction techniques can be used to protect names, dates, locations, trauma histories, help-seeking experiences, and other information about dyadic interactions.


How to make research reproducible: psychology protocol gives 86% success rate

“In a bid to restore its reputation, experimental psychology has now brought its A game to the laboratory. A group of heavy-hitters in the field spent five years working on new research projects under the most rigorous and careful experimental conditions possible and getting each other’s labs to try to reproduce the findings.

Published today in Nature Human Behaviour, the results show that the original findings could be replicated 86% of the time — significantly better than the 50% success rate reported by some systematic replication efforts….”

PhD position in open science | Erasmus University Rotterdam

“The Brain & Cognition Team at Erasmus University Rotterdam seeks applicants for a fully funded four-year PhD position on the project Open Science Reform: Past, Present, and Future, under the supervision of Steven Verheyen, Oliver Lindemann, and Rolf Zwaan. The position involves taking stock of the impact the open science movement has had on psychological science and investigating which further steps toward a transparent and inclusive science can be taken. To this end, you will perform both quantitative and qualitative research into the publication culture during the decade that followed the replication crisis to assess the impact of the various scientific reforms, and look into how publication policies and undergraduate education can be further shaped to promote responsible science. You are expected to write articles for an academic audience, develop tools or policies that can be implemented more broadly, and, where possible, to join or initiate team science projects, for instance through the Society for the Improvement of Psychological Science or the Framework for Open and Reproducible Research Training. The position is ideally suited for candidates who are interested in handling large data sets, and who are keen to engage with various stakeholders on how to better our discipline….”

NOT-OD-23-180: Request for Information (RFI): Inviting Comments and Suggestions on Opportunities and Challenges for the Collection, Use, and Sharing of Real-World Data (RWD) including Electronic Health Records, for NIH Supported Biomedical and Behavioral Research

“The purpose of this Request for Information (RFI) is to solicit public comments on the use of Real-World Data (RWD), including Electronic Health Records, for Biomedical and Behavioral Research…. 

Researchers are increasingly using data collected in real-world settings to augment traditional research studies as well as develop more effective treatments and interventions for patients. These “real-world data (RWD)”, defined by the U.S. Food and Drug Administration, are data relating to patient health status and/or the delivery of health care routinely collected from a variety of sources. Examples of RWD include data derived from electronic health records, medical claims data, data from product or disease registries, and data gathered from other sources (such as digital health technologies) that can inform on health status. While these data hold tremendous promise for biomedical and behavioral research, they can be collected from a variety of sources through multiple mechanisms, creating challenges for researchers and questions for those whose data are being shared.

Importantly, the National Institutes of Health (NIH) is committed to ensuring participant privacy and autonomy are protected in all NIH supported research. As NIH establishes health-related research data platforms that include access to RWD, NIH continues to prioritize maximizing data access while upholding participant preferences regarding the collection and use of their data. Most recently, through an NIH Director Advisory Committee, NIH met with stakeholders to understand their perspectives on benefits and risks of combining and using human datasets, particularly from disparate sources (e.g., research and non-research settings) and how their data should be used in biomedical research. NIH will continue working to incorporate these perspectives in its research studies to build trust and honor participant preferences. Input requested on this RFI will be used to inform NIH’s continuing development of guidance on the use of RWD for research and assist in the planning for appropriate mechanisms and programs for research with RWD….”

Research Sharing Survey


“The survey should take about 10 minutes.
Your participation in this survey is completely voluntary, and you are free to withdraw at any time until you submit the survey. Answers will never be associated with individual participants and the results will only be analyzed in aggregate. Any research findings and survey data that are publicly shared will be anonymized.
The survey will close on September 30, 2023, 11:59 pm Pacific Daylight Time.
If you have any questions about the survey, please contact the PLOS Research Team (….”

Supporting PsyArXiv: Your Support Matters!

“PsyArXiv, the psychological science preprint server, needs your support to continue serving as a free platform for sharing our work.

Created during SIPS 2016 by SIPS members like you, PsyArXiv is one of the most successful SIPS products. It is maintained by SIPS and hosted by OSF Preprints, which allows it to interface seamlessly with OSF projects. PsyArXiv currently hosts around 30,000 preprints, with more than 6,000 new manuscripts added every year and an average of 9,000 page views per day!…”

PsyArXiv Preprints | ReproduceMe: lessons from a pilot project on computational reproducibility

Abstract:  If a scientific paper is computationally reproducible, the analyses it reports can be repeated independently by others. At present, most papers are not reproducible. However, the tools to enable computational reproducibility are now widely available, using free and open source software. We conducted a pilot study in which we offered ‘reproducibility as a service’ within a UK psychology department for a period of 6 months. Our rationale was that most researchers lack either the time or expertise to make their own work reproducible, but might be willing to allow this to be done by an independent team. Ten papers were converted into reproducible format using R Markdown, such that all analyses were conducted by a single script that could download raw data from online platforms as required, generate figures, and produce a PDF of the final manuscript. For some studies this involved reproducing analyses originally conducted using commercial software. The project was an overall success, with strong support from the contributing authors, who saw clear benefit from this work, including greater transparency and openness, and ease of use for the reader. Here we describe our framework for reproducibility, summarise the specific lessons learned during the project, and discuss the future of computational reproducibility. Our view is that computationally reproducible manuscripts embody many of the core principles of open science, and should become the default format for scientific communication.
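The single-entry-point pattern the abstract describes (one script that fetches the raw data, runs every analysis, and renders the manuscript output) can be sketched in miniature. The ReproduceMe project itself used R Markdown; the Python sketch below is illustrative only, and every name in it (load_data, analyse, render_report, the inline CSV) is a hypothetical stand-in, with an inline string standing in for the download-from-an-online-platform step.

```python
# Minimal sketch of a single-script reproducible analysis: one entry point
# loads the raw data, runs the analysis, and emits the report, so anyone can
# regenerate the reported results with one command.
import csv
import io
import statistics

# Stand-in for raw data that a real pipeline would download from an online
# platform (e.g. an OSF project).
RAW_DATA = """participant,condition,score
1,control,12.1
2,control,11.4
3,treatment,14.9
4,treatment,15.2
"""


def load_data(text: str) -> list[dict]:
    """Parse the raw CSV into a list of records (the 'download' step)."""
    return list(csv.DictReader(io.StringIO(text)))


def analyse(rows: list[dict]) -> dict[str, float]:
    """Compute the per-condition mean score (the 'analysis' step)."""
    by_condition: dict[str, list[float]] = {}
    for row in rows:
        by_condition.setdefault(row["condition"], []).append(float(row["score"]))
    return {cond: statistics.mean(vals) for cond, vals in by_condition.items()}


def render_report(means: dict[str, float]) -> str:
    """Emit the results section; a real pipeline would render a full PDF."""
    lines = ["# Results"]
    for cond in sorted(means):
        lines.append(f"Mean score ({cond}): {means[cond]:.2f}")
    return "\n".join(lines)


if __name__ == "__main__":
    print(render_report(analyse(load_data(RAW_DATA))))
```

Because the data, analysis, and report generation live in one script, a reader reproduces the manuscript's numbers by running it end to end, which is the core property the pilot project aimed for.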

PsyArXiv Preprints | Concerns about Replicability, Theorizing, Applicability, Generalizability, and Methodology across Two Crises in Social Psychology

Abstract:  Twice in the history of social psychology has there been a crisis of confidence. The first started in the 1960s and lasted until the end of the 1970s, and the second crisis dominated the 2010s. In both these crises, researchers discussed fundamental concerns about the replicability of findings, the strength of theories in the field, the societal relevance of research, the generalizability of effects, and problematic methodological and statistical practices. On the basis of extensive quotes drawn from articles published during both crises, I explore the similarities and differences in discussions across both crises in social psychology.

American Psychological Association partners with ResearchGate | Research Information

“The American Psychological Association (APA) and ResearchGate have entered a partnership aimed at amplifying the reach and discoverability of APA’s journals by providing ResearchGate members with direct access to their articles through the platform.

APA’s collection of peer-reviewed journals spans the breadth and depth of psychology, many published in partnership with APA’s specialty divisions and other national and international psychological organisations. As a ResearchGate partner, APA will provide access to more than 5,000 new articles a year, as well as backfile content of more than 300,000 articles.

Authors of the articles included in this partnership will have their content automatically added to their profiles on ResearchGate, giving them easy access to statistics that showcase the impact of their work and providing an opportunity for them to connect with their readers….”

The replication crisis has led to positive structural, procedural, and community changes | Communications Psychology

Abstract:  The emergence of large-scale replication projects yielding success rates substantially lower than expected caused the behavioural, cognitive, and social sciences to experience a so-called ‘replication crisis’. In this Perspective, we reframe this ‘crisis’ through the lens of a credibility revolution, focusing on positive structural, procedural and community-driven changes. We then outline a path to expand ongoing advances and improvements. The credibility revolution has been an impetus to several substantive changes which will have a positive, long-term impact on our research environment.

From the body of the article: “An academic movement collectively known as open scholarship (incorporating Open Science and Open Research) has driven constructive change by accelerating the uptake of robust research practices while concomitantly championing a more diverse, equitable, inclusive, and accessible psychological science….”

No-pay publishing: use institutional repositories

“The European Council’s recommended open, equitable and sustainable scholarly publishing system, free to readers and authors, has been dismissed as unsustainable and too costly (see Nature; 2023). However, institutional repositories run by research institutions offer an inexpensive and sustainable route to realizing this aspiration.

Such non-profit repositories are ubiquitous and capable of hosting ‘diamond’ open-access academic journals, which are free to publish and to read. In Spain, for example, the journal Psicológica is owned by the Spanish Society for Experimental Psychology and published on DIGITAL.CSIC, the institutional repository of the Spanish National Research Council (see …).

Transferred in 2022 from a commercial publisher, Psicológica publishes about 50 articles, preprints and peer reviews annually. Publication costs are shared between the journal — which is financially supported by the society — and the publicly funded repository, which provides services such as archiving, DOI assignation and metadata curation. At an estimated cost of €30 (US$34) per publication, Psicológica can increase its output without incurring substantial extra costs. This underscores the sustainability of such models.”

Gorilla Experiment Builder – Easily Create Online Behavioural Experiments

“Online research is a fast-growing field and we’re committed to facilitating high quality research and open science. An overview of Gorilla is presented in our Behavior Research Methods paper, Gorilla in our midst: An online behavioral experiment builder. We have also published a peer-reviewed large-scale study of timing accuracy across platforms, web-browsers and devices.

In our desire to make life easier for researchers, and as supporters of Open Science practices, Gorilla includes an open access repository for sharing research materials.  Explore Gorilla Open Materials to discover and clone a wide range of tasks, questionnaires and experiments shared by our users.”

PsychOpen: Call for proposals

“PsychOpen GOLD is an open access publishing platform for primary research in psychology and related fields. Currently, fifteen journals are published by PsychOpen GOLD. The purpose of this call is to expand the spectrum of PsychOpen GOLD’s journal portfolio to include additional fields of psychological research. Of particular interest are journals that focus on issues germane to or of concern to the general population or that break new ground in communicating psychological research to the general population. Paper formats supported within the journals may range from classical research articles to data publications and other paper types designed to promote open and transparent research.

We welcome proposals to found new journals as well as proposals to transfer existing journals to PsychOpen GOLD (e.g., transforming a subscription-based journal to an Open Access journal). Applications are invited from scientists and scholarly societies from all parts of the world….”

Predicting psychologists’ approach to academic reciprocity and data sharing with a theory of collective action | Emerald Insight

“This study found that data sharing among psychologists is driven primarily by their perceptions of community benefits, academic reciprocity and the norms of data sharing. This study also found that academic reciprocity is significantly influenced by psychologists’ perceptions of community benefits, academic reputation and the norms of data sharing. Both academic reputation and academic reciprocity are affected by psychologists’ prior experiences with data reuse. Additionally, psychologists’ perceptions of community benefits and the norms of data sharing are significantly affected by the perception of their academic reputation.”