Inaccuracy in the Scientific Record and Open Postpublication Critique – Chris R. Brewin, 2023

Abstract:  There is growing evidence that the published psychological literature is marred by multiple errors and inaccuracies and often fails to reflect the changing nature of the knowledge base. At least four types of error are common: citation error, methodological error, statistical error, and interpretation error. In the face of the apparent inevitability of these inaccuracies, core scientific values such as openness and transparency require that correction mechanisms be readily available. In this article, I review standard mechanisms in psychology journals and find them to have limitations. I then consider the effects of more widely enabling open postpublication critique in the same journal, in addition to conventional peer review. This mechanism is well established in medicine and the life sciences but rare in psychology, and it may assist psychological science to correct itself.

 

What’s in a Badge? A Computational Reproducibility Investigation of the Open Data Badge Policy in One Issue of Psychological Science – Sophia Crüwell, Deborah Apthorp, Bradley J. Baker, Lincoln Colling, Malte Elson, Sandra J. Geiger, Sebastian Lobentanzer, Jean Monéger, Alex Patterson, D. Samuel Schwarzkopf, Mirela Zaneva, Nicholas J. L. Brown, 2023

Abstract:  In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 articles provided at least some data and six provided analysis code, but only one article was rated as exactly reproducible, and three were rated as essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the disclosure method of awarding badges.
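
To make the reproducibility check concrete: in essence, it means rerunning the analysis on the shared data and comparing the recomputed statistics with the published ones. Below is a minimal Python sketch of that idea; the file name, variable names, and reported value are hypothetical, and the study itself used a more structured rating scheme than a single comparison.

```python
# Minimal sketch of a computational reproducibility check of the kind the
# authors performed. All specifics are hypothetical: the shared data file,
# column names, and the reported statistic are stand-ins, not values from
# any article in the target issue.
import pandas as pd
from scipy import stats

REPORTED_T = 2.31  # hypothetical value copied from a published results section

df = pd.read_csv("shared_open_data.csv")  # hypothetical Open Data badge file
group_a = df.loc[df["condition"] == "a", "score"]
group_b = df.loc[df["condition"] == "b", "score"]

t_stat, p_value = stats.ttest_ind(group_a, group_b)

# "Exactly reproducible" would mean matching the reported value to the
# precision given in the paper; small discrepancies might still count as
# "essentially reproducible" under the authors' rating scheme.
if round(t_stat, 2) == REPORTED_T:
    print(f"Reproduced: t = {t_stat:.2f} matches the reported value.")
else:
    print(f"Deviation: recomputed t = {t_stat:.2f}, reported t = {REPORTED_T}.")
```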

 

Opportunities, challenges and tensions: Open science through a lens of qualitative social psychology – Pownall – British Journal of Social Psychology

Abstract:  In recent years, there has been a focus in social psychology on efforts to improve the robustness, rigour, transparency and openness of psychological research. This has led to a plethora of new tools, practices and initiatives that each aim to combat questionable research practices and improve the credibility of social psychological scholarship. However, the majority of these efforts derive from quantitative, deductive, hypothesis-testing methodologies, and there has been a notable lack of in-depth exploration about what the tools, practices and values may mean for research that uses qualitative methodologies. Here, we introduce a Special Section of BJSP: Open Science, Qualitative Methods and Social Psychology: Possibilities and Tensions. The authors critically discuss a range of issues, including authorship, data sharing and broader research practices. Taken together, these papers urge the discipline to carefully consider the ontological, epistemological and methodological underpinnings of efforts to improve psychological science, and advocate for a critical appreciation of how mainstream open science discourse may (or may not) be compatible with the goals of qualitative research.

 

Experimentology: An Open Science Approach to Experimental Psychology Methods

“How do we create generalizable theories of human behavior? Experiments provide us with a tool for measuring causal effects, which provide the basis for building theories. If we design our experiments appropriately, we can even begin to estimate generalizable relationships between different psychological constructs. But how do you do an experiment?

This book provides an introduction to the workflow of the experimental researcher in the psychological sciences. The organization is sequential, from the planning stages of the research process through design, data collection, analysis, and reporting. We introduce these concepts via narrative examples from a range of sub-disciplines, including cognitive, developmental, and social psychology. Throughout, we also illustrate the pitfalls that led to the “replication crisis” in psychology. To help researchers avoid these pitfalls, we advocate for an open-science-based approach, providing readers with guidance for preregistration, project management, data sharing, and reproducible writing….”
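
The “reproducible writing” mentioned here is the practice of generating reported statistics from code at build time rather than typing them by hand, so the manuscript text can never drift out of sync with the analysis. The book teaches this workflow in R; the sketch below only conveys the same idea in Python, with invented data.

```python
# Illustration of "reproducible writing": a results sentence is assembled
# from computed values instead of hand-typed numbers. Data values here are
# invented for the example.
from scipy import stats

condition_a = [5.1, 4.8, 6.0, 5.5, 4.9, 5.7]
condition_b = [4.2, 4.5, 3.9, 4.8, 4.1, 4.4]

t, p = stats.ttest_ind(condition_a, condition_b)
dof = len(condition_a) + len(condition_b) - 2

# An APA-style results sentence generated from the analysis itself:
sentence = (
    f"Scores were higher in condition A than in condition B, "
    f"t({dof}) = {t:.2f}, p = {p:.3f}."
)
print(sentence)
```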

Ten Strategies to Foster Open Science in Psychology and Beyond | Collabra: Psychology | University of California Press

Abstract:  The scientific community has long recognized the benefits of open science. Today, governments and research agencies worldwide are increasingly promoting and mandating open practices for scientific research. However, for open science to become the by-default model for scientific research, researchers must perceive open practices as accessible and achievable. A significant obstacle is the lack of resources providing a clear direction on how researchers can integrate open science practices in their day-to-day workflows. This article outlines and discusses ten concrete strategies that can help researchers use and disseminate open science. The first five strategies address basic ways of getting started in open science that researchers can put into practice today. The last five strategies are for researchers who are more advanced in open practices to advocate for open science. Our paper will help researchers navigate the transition to open science practices and support others in shifting toward openness, thus contributing to building a better science.

 

NIMH Creates Publicly Accessible Resource With Data From Healthy Volunteers

“Studying healthy people can help researchers understand how the brain works in states of health and illness. Although many mental health studies include healthy participants as a comparison group, these studies typically focus on selected measures relevant to a certain functional domain or specific mental illness. The Healthy Research Volunteer Study at the National Institute of Mental Health aims to build a comprehensive, publicly accessible resource with a range of brain and behavioral data from healthy volunteers.

This resource aims to shed light on basic questions about brain function and translational questions about the relationship between brain and behavior. Although the study focuses on healthy volunteers, the data also have relevance to clinical questions about neurobiological, cognitive, and emotional processes associated with certain mental illnesses.

The NIMH Healthy Research Volunteer Study is unique in the breadth and depth of its measures. All data collected as part of the study are anonymized and shared with the research community via the OpenNeuro repository….”
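
For readers who want to work with such data: OpenNeuro datasets can be fetched programmatically, for example with the community openneuro-py client. In the sketch below the accession number is a placeholder, not the identifier of the NIMH Healthy Research Volunteer dataset, and the include filter is purely illustrative.

```python
# Minimal sketch of fetching an OpenNeuro dataset with the openneuro-py
# client (pip install openneuro-py). The accession number below is a
# placeholder, NOT the actual ID of the NIMH Healthy Research Volunteer
# dataset; look the dataset up on https://openneuro.org before running.
import openneuro

openneuro.download(
    dataset="dsXXXXXX",             # placeholder accession number
    target_dir="nimh_volunteers",   # local directory to download into
    include=["sub-01"],             # optional: restrict to one subject
)
```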

About Meta-Psychology

“Meta-Psychology publishes theoretical and empirical contributions that advance psychology as a science through critical discourse related to individual articles, research lines, research areas, or psychological science as a field. Important contributions include systematic reviews, meta-analyses, replicability reports, and replication studies. We encourage preregistered studies and registered reports (i.e., peer review on the basis of theory, methods, and planned data analysis, before data have been collected). We welcome manuscripts introducing novel methods, as well as tutorials on established methods that are still poorly understood by psychology researchers. We further welcome papers introducing statistical packages or other software useful for psychology researchers….”

 

Open Science, Mental Health, and Sustainable Development: A Proposed Model for a Low-Resource Setting

“Mental health is an important concern in low- and middle-income countries and must be addressed for sustainable development. Open science is a movement that can contribute significantly towards addressing mental health challenges. Mental health in India and other low- and middle-income countries faces many challenges, such as lack of resources and low investment. This policy brief proposes an intervention model using the core principles of open science to transform the mental health programmes run by local self-government institutions in India. The model can co-opt key stakeholders involved in data collection, programme implementation, and monitoring for standardisation. Kerala’s participatory development experience is employed as a case to describe the model. By empowering frontline health workers, accredited volunteers, and officials of the childcare system, and by implementing open science principles, this model could help address mental health challenges with minimal resource allocation through the streamlining of the data management process. It could also encourage increased participation in open science through the citizen science model, opening scientific research to non-specialists. Open science principles such as collective benefit, equity, participation, sustainability, and inclusiveness can also be promoted.”

 

Further action toward valid science in Law and Human Behavior: Requiring open data, analytic code, and research materials.

“Beginning on March 1, 2023, Law and Human Behavior will raise its standard for data reporting and expand its focus to include analytic code and research materials. Adopting the recommended language from the TOP Guidelines (Nosek et al., 2015b), the journal will publish articles only if the data, analytic code, and research materials are clearly and precisely documented and are fully available to any researcher who wishes to reproduce the results or replicate the procedure.

Accordingly, authors using original data who seek to publish their research in the journal must make the following items publicly available: …

 

Authors reusing data from public repositories who pursue publication in Law and Human Behavior must provide program code, scripts for statistical packages, and other documentation sufficient to allow an informed researcher to precisely reproduce all published results….”
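
What counts as code “sufficient to allow an informed researcher to precisely reproduce all published results” will vary by study, but a common baseline is an analysis script that fixes its own random seeds and logs the software environment it ran under. A minimal sketch of that pattern, with hypothetical file and column names:

```python
# Sketch of a self-documenting analysis script of the kind such a policy
# anticipates: it records the software environment and fixes random seeds
# so that an informed reader can rerun it and obtain identical numbers.
# File and column names are hypothetical.
import platform
import random
import sys

import numpy as np
import pandas as pd

SEED = 2023  # fixed seed so any resampling is repeatable bit for bit
random.seed(SEED)
np.random.seed(SEED)

# Log the environment alongside the results, since version drift is a
# common source of irreproducibility.
print(f"Python {platform.python_version()}, numpy {np.__version__}, "
      f"pandas {pd.__version__}", file=sys.stderr)

df = pd.read_csv("study_data.csv")  # hypothetical shared data file
summary = df.groupby("condition")["outcome"].agg(["mean", "std", "count"])
summary.to_csv("table1_descriptives.csv")  # regenerates a published table
print(summary)
```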

 

Responsible Research Assessment I: Implementing DORA for hiring and promotion in psychology | PsychArchives

Abstract:  The use of journal impact factors and other metric indicators of research productivity, such as the h-index, has been heavily criticized for being invalid for the assessment of individual researchers and for fueling a detrimental “publish or perish” culture. Multiple initiatives call for developing alternatives to existing metrics that better reflect quality (instead of quantity) in research assessment. This report, written by a task force established by the German Psychological Society, proposes how responsible research assessment could be done in the field of psychology. We present four principles of responsible research assessment in hiring and promotion and suggest a two-step assessment procedure that combines the objectivity and efficiency of indicators with a qualitative, discursive assessment of shortlisted candidates. The main aspects of our proposal are (a) to broaden the range of relevant research contributions to include published data sets and research software, along with research papers, and (b) to place greater emphasis on quality and rigor in research evaluation.

 

Open Science in Developmental Science | Annual Review of Developmental Psychology

Abstract:  Open science policies have proliferated in the social and behavioral sciences in recent years, including practices around sharing study designs, protocols, and data and preregistering hypotheses. Developmental research has moved more slowly than some other disciplines in adopting open science practices, in part because developmental science is often descriptive and does not always strictly adhere to a confirmatory approach. We assess the state of open science practices in developmental science and offer a broader definition of open science that includes replication, reproducibility, data reuse, and global reach.

 

Open Access Information Specialist, 1.0 FTE, permanent | Leibniz-Institut für Psychologie (ZPID)

English version (translated from the German original via deepl.com):

We are looking for an Open Access Information Specialist (m/f/d) (pay grade TV-L E13, working hours 100%, permanent), starting February 1, 2023 or later.

Tasks and Functions:

You are part of the department “Archiving and Publishing Services” at ZPID, which provides innovative products and services for psychology and related disciplines that follow the Open Science idea. One of the lead products of this infrastructure area is PsychOpen GOLD, ZPID’s open access publication platform for first publications in psychology and related disciplines. In close cooperation with renowned scholars, professional societies and international editorial boards, 15 journals are currently produced and published on PsychOpen GOLD.

Together with the PsychOpen GOLD team, you will provide operational support for all tasks arising in the context of the PsychOpen GOLD service catalog and, in coordination with other stakeholders (e.g., external service providers and scientific partners), contribute significantly to the optimization of the associated workflows. For detailed information on the PsychOpen GOLD service catalog, please see https://doi.org/10.23668/psycharchives.4632. In addition, you will provide scientific support for the development of the platform and participate in the acquisition of third-party funding.

Your profile:

Master’s degree (or equivalent/higher) in information science or another relevant discipline, e.g., data science, library science, social and behavioral sciences
Advanced experience in scientific work and publishing (e.g., as an author or reviewer)
Expertise in publication-related application programs (e.g., office applications, graphics processing, PDF authoring)
IT skills, preferably in markup languages (e.g., XML, HTML, LaTeX), transformation and stylesheet languages (e.g., XSL, CSS), scripting and programming languages (e.g., PHP, JavaScript, Python), and database systems and languages (e.g., MySQL)
Project management skills
Very good command of German and English, both written and spoken
A proactive, structured, results-oriented, and independent way of working
Strong teamwork skills and social competence

Desirable:

Knowledge of scientific publication infrastructures (e.g., subject databases, DOI registration agencies), standards (e.g., metadata standards, Open Science Standards), and/or software systems (e.g., OJS, Editorial Manager)
Psychological expertise and methodological knowledge
Experience in the area of third-party funding acquisition
Basic knowledge of legal frameworks (licenses, copyright, data and privacy rights)
