Merit Review Policy – [U of Maryland, Psychology Department]

“Examples of specific evaluative criteria to be used in merit review, based on professional standards for evaluating faculty performance…. Openness and transparency: Degree to which research, data, procedures, code, and research products are made openly available where appropriate; the use of registered reports or pre-registration. Committee should recognize that researchers may not be able to share some types of data, such as when data are proprietary or subject to ethical concerns over confidentiality [7, 1, 6, 2, 5]. These limitations should be documented by faculty.”

What do participants think of our research practices? An examination of behavioural psychology participants’ preferences | Royal Society Open Science

Abstract:  What research practices should be considered acceptable? Historically, scientists have set the standards for what constitutes acceptable research practices. However, there is value in considering non-scientists’ perspectives, including research participants’. A total of 1873 participants from MTurk and university subject pools were surveyed after their participation in one of eight minimal-risk studies. We asked participants how they would feel if (mostly) common research practices were applied to their data: p-hacking/cherry-picking results, selective reporting of studies, Hypothesizing After Results are Known (HARKing), committing fraud, conducting direct replications, sharing data, sharing methods, and open access publishing. An overwhelming majority of psychology research participants thought questionable research practices (e.g. p-hacking, HARKing) were unacceptable (68.3–81.3%) and were supportive of practices to increase transparency and replicability (71.4–80.1%). A surprising number of participants expressed positive or neutral views toward scientific fraud (18.7%), raising concerns about data quality. We grapple with this concern and interpret our results in light of the limitations of our study. Despite the ambiguity in our results, we argue that there is evidence (from our study and others’) that researchers may be violating participants’ expectations and should be transparent with participants about how their data will be used.


Sharing is caring: Ethical implications of transparent research in psychology. – PsycNET

Abstract:  The call for greater openness in research data is quickly growing in many scientific fields. Psychology as a field, however, still falls short in this regard. Research is vulnerable to human error, inaccurate interpretation and reporting of study results, and decisions during the research process that are biased toward favorable results. Despite the obligation to share data for verification and the importance of this practice for protecting against human error, many psychologists do not fulfill their ethical responsibility of sharing their research data. This has implications for the accurate and ethical dissemination of specific research findings and the scientific development of the field more broadly. Open science practices provide promising approaches to address the ethical issues of inaccurate reporting and false-positive results in the psychological research literature that hinder scientific growth and ultimately violate several relevant ethical principles and standards from the American Psychological Association’s (APA’s) Ethical Principles of Psychologists and Code of Conduct (APA, 2017). Still, current incentive structures in the field for publishing and professional advancement appear to induce hesitancy in applying these practices. With each of these considerations in mind, recommendations are provided on how psychologists can ethically proceed through open science practices and incentive restructuring—in particular, data management, data and code sharing, study preregistration, and registered reports.

Investigating the Effectiveness of the Open Data Badge Policy at Psychological Science Through Computational Reproducibility

Abstract:  In April 2019, Psychological Science published its first issue in which all research articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its stated aim at Psychological Science: ensuring reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all articles provided at least some data, 6/14 articles provided analysis code or scripts, only 1/14 articles was rated to be exactly reproducible, and 3/14 essentially reproducible with minor deviations. We recommend that Psychological Science require a check of reproducibility at the peer review stage before awarding badges, and that the Open Data badge be renamed “Open Data and Code” to avoid confusion and encourage researchers to adhere to this higher standard.


Is it time to share qualitative research data?

Abstract:  Policies by the National Institutes of Health and the National Science Foundation and scandals surrounding failures to reproduce the findings of key studies in psychology have generated increased calls for sharing research data. Most of these discussions have focused on quantitative, rather than qualitative, research data. This article examines scientific, ethical, and policy issues surrounding sharing qualitative research data. We consider advantages of sharing data, including enabling verification of findings, promoting new research in an economical manner, supporting research education, and fostering public trust in science. We then examine standard procedures for archiving and sharing data, such as anonymizing data and establishing data use agreements. Finally, we engage a series of concerns with sharing qualitative research data, such as the importance of relationships in interpreting data, the risk of reidentifying participants, issues surrounding consent and data ownership, and the burden of data documentation and depositing on researchers. For each concern, we identify options that enable data sharing or describe conditions under which select data might be withheld from a data repository. We conclude by suggesting that the default assumption should be that qualitative data will be shared unless concerns exist that cannot be addressed through standard data depositing practices such as anonymizing data or through data use agreements.


Show your work: Tools for open developmental science – ScienceDirect

Abstract:  Since grade school, students of many subjects have learned to “show their work” in order to receive full credit for assignments. Many of the reasons for students to show their work extend to the conduct of scientific research. And yet multiple barriers make it challenging to share and show the products of scientific work beyond published findings. This chapter discusses some of these barriers and how web-based data repositories help overcome them. The focus is on Databrary.org, a data library specialized for storing and sharing video data with a restricted community of institutionally approved investigators. Databrary was designed by and for developmental researchers, and so its features and policies reflect many of the specific challenges faced by this community, especially those associated with sharing video and related identifiable data. The chapter argues that developmental science poses some of the most interesting, challenging, and important questions in all of science, and that by openly sharing much more of the products and processes of our work, developmental scientists can accelerate discovery while making our scholarship much more robust and reproducible.


Open Science and Multicultural Research: Some Data, Considerations, and Recommendations

Abstract:  Objectives: There are two potentially useful but nonintersecting efforts to help ensure that psychological science produces valid and credible information and contributes to the understanding of diverse human experiences. Whereas North American ethnic minority psychology research/cultural diversity science (EM/D) emphasizes cultural competency to yield contextualized psychological understanding of understudied and underserved minority populations, current open science (OS) approaches emphasize material and data sharing, and statistical proficiency to maximize the replicability of mainstream findings. To illuminate the extent of and explore reasons for this bifurcation, and OS’s potential impact on EM/D, we conducted three studies. Methods and Results: In Study 1, we reviewed editorial/publishing policies and empirical articles appearing in four major EM/D journals on the incentives for and use of OS. Journals varied in OS-related policies; 32 of 823 empirical articles incorporated any OS practices. Study 2 was a national mixed-methods survey of EM/D scholars’ (N=141) and journal editors’ (N=16) views about and experiences with OS practices. Emergent themes included beliefs about the impact of OS on scientific quality, possible professional disadvantages for EM/D scholars, and concerns about the welfare of and ethical risks posed for communities of color. In Study 3, we explored community research participants’ beliefs about data sharing and the credibility of science/scientists (N=1,104). Participants were receptive to data sharing and viewed psychological science favorably. Conclusions: We provide data-driven recommendations for researchers to assemble the best tools for approaching the knowledge-production process with transparency, humility, and cultural competency.


Psicológica and DIGITAL.CSIC join forces for Sustainable Diamond Open Access and Repository as a Publisher Services – Open Scholar C.I.C.

“We are excited to announce the relaunch of Psicológica, the journal of the Spanish Society for Experimental Psychology (SEPEX), as a Diamond Open Access journal published exclusively on DIGITAL.CSIC, the institutional repository of the Spanish National Research Council (CSIC).

This project kicks off at a time when both the sustainability of Diamond Open Access journals and the opportunities to consolidate Repository as a Publisher services have come to the fore in the global discussion about innovative scholarly communications. On the one hand, in light of the heated debates about the true costs of academic publishing, the direct partnership between a society-owned journal and a publicly funded repository demonstrates the viability of a novel and sustainable publishing model that does not entail any costs to authors, institutions, readers or libraries. On the other hand, DIGITAL.CSIC takes a further step in its agenda to expand publishing services, by providing a full peer review workflow on top of its infrastructure. A rigorous quality control of incoming manuscripts is performed by senior volunteer academics with the collaboration of expert reviewers, while the institutional repository and its staff of professional librarians provide a state-of-the-art publishing infrastructure, including peer review management, metadata curation, DOI minting, support for database indexing and harvesting by aggregators and search engines, support for policy development, a user support service, and digital preservation. This whole set of services on top of an institutional repository opens the door for truly innovative publishing controlled by the scholarly community and without the intermediation of third parties….”


CAPMH: development of the first open access journal in the field of child mental health | Child and Adolescent Psychiatry and Mental Health | Full Text

“Child and Adolescent Psychiatry and Mental Health (CAPMH) was founded in 2007 as an initiative of Prof. Fegert. He served as the Editor-in-Chief, with support from Dr. Benedetto Vitiello (Italy) as Deputy Editor-in-Chief and Prof. Goldbeck (Germany) and Jacinta Tan (UK) as Associate Editors. The journal was the first independent, open access, online journal in the field, with the mission to provide an international platform for rapid and comprehensive scientific communication on child and adolescent mental health issues from diverse cultures and contexts.

The first issue of CAPMH was released in 2007. In February 2013, CAPMH became the official journal of the International Association for Child and Adolescent Psychiatry and Allied Professions (IACAPAP) and was later also affiliated with the European Association for Forensic Child and Adolescent Psychiatry, Psychology and other involved Professions (EFCAP). Since its inception, the journal has grown rapidly and has received funding from different foundations. However, the journal also faced difficulties and setbacks: sadly, Prof. Goldbeck, one of the founding editors, unexpectedly passed away in 2017….”

Comparing dream to reality: an assessment of adherence of the first generation of preregistered studies | Royal Society Open Science

Abstract:  Preregistration is a method to increase research transparency by documenting research decisions on a public, third-party repository prior to any influence by data. It is becoming increasingly popular in all subfields of psychology and beyond. Adherence to the preregistration plan may not always be feasible and is not even necessarily desirable, but without disclosure of deviations, readers who do not carefully consult the preregistration plan might get the incorrect impression that the study was conducted and reported exactly as planned. In this paper, we investigated adherence and disclosure of deviations for all articles published with the Preregistered badge in Psychological Science between February 2015 and November 2017 and shared our findings with the corresponding authors for feedback. Two out of 27 preregistered studies contained no deviations from the preregistration plan. In one study, all deviations were disclosed. Nine studies disclosed none of their deviations. We mainly observed (un)disclosed deviations from the plan regarding the reported sample size, exclusion criteria and statistical analysis. This closer look at preregistrations of the first generation reveals possible hurdles for reporting preregistered studies and provides input for future reporting guidelines. We discuss the results and possible explanations, and provide recommendations for preregistered research.


International Survey on Data Sharing and Re-use in Traumatic Stress Research

“The Global Collaboration on Traumatic Stress, a coalition of 11 scientific societies in the field of traumatic stress, is conducting a survey to better understand traumatic stress researchers’ opinions and experiences regarding data sharing and data re-use.

If you are a traumatic stress researcher at any career stage (including trainees) we invite you to share your opinions and experiences by participating in this survey. …”

PsyArXiv Preprints | When open data closes the door: Problematising a one size fits all approach to open data in journal submission guidelines

Abstract:  Opening data promises to improve research rigour and democratise knowledge production. But it also poses practical, theoretical, and ethical risks for qualitative research. Despite discussion about open data in qualitative social psychology predating the replication crisis, the nuances of this discussion have not been translated into current journal policies. Through a content analysis of 261 journals in the domain of social psychology, we establish the state of current journal policies for open data. We critically discuss how these expectations may not be adequate for establishing qualitative rigour, can introduce ethical challenges, and may place those who wish to use qualitative approaches at a disadvantage in peer review and publication processes. We assert that open data requirements should include clearer guidelines that reflect the nuance of data sharing in qualitative research, and move away from a universal ‘one-size-fits-all’ approach to data sharing.


How misconduct helped psychological science to thrive

“Despite this history, before Stapel, researchers were broadly unaware of these problems or dismissed them as inconsequential. Some months before the case became public, a concerned colleague and I proposed to create an archive that would preserve the data collected by researchers in our department, to ensure reproducibility and reuse. A council of prominent colleagues dismissed our proposal on the basis that competing departments had no similar plans. Reasonable suggestions that we made to promote data sharing were dismissed on the unfounded grounds that psychology data sets can never be safely anonymized and would be misused out of jealousy, to attack well-meaning researchers. And I learnt about at least one serious attempt by senior researchers to have me disinvited from holding a workshop for young researchers because it was too critical of suboptimal practices….

Much of the advocacy and awareness has been driven by early-career researchers. Recent cases show how preregistering studies, replication, publishing negative results, and sharing code, materials and data can both empower the self-corrective mechanisms of science and deter questionable research practices and misconduct….

For these changes to stick and spread, they must become systemic. We need tenure committees to reward practices such as sharing data and publishing rigorous studies that have less-than-exciting outcomes. Grant committees and journals should require preregistration or explanations of why it is not warranted. Grant-programme officers should be charged with checking that data are made available in accordance with mandates, and PhD committees should demand that results are verifiable. And we need to strengthen a culture in which top research is rigorous and trustworthy, as well as creative and exciting….”

A New Option for Scientific Exchange and an Alternative to the Commentary Format – Patricia J. Bauer, 2021

“Letters to the Editors will be disseminated online only to permit more rapid publication and to keep the discussion timely and responsive. They will be hosted on Figshare, which is an online open-access repository and the site that hosts all of the journal’s Supplemental Material. To further speed dissemination, accepted Letters to the Editors will not be copyedited or held for replies but instead will be disseminated as quickly as possible on acceptance. Letters to the Editors will have DOIs but, fitting their existence in the liminal space between a formal publication and an unmediated social media conversation, they will not be indexed (i.e., discoverable through PubMed, PsycInfo, etc.). To facilitate connections between the target article and Letters to the Editors, they will be linked to each other. …”