Abstract: In this short practice paper, we introduce the public version of the Qualitative Data Repository’s (QDR) Curation Handbook. The Handbook documents and structures curation practices at QDR. We describe the background and genesis of the Handbook and highlight some of its key content.
Abstract: To make evidence-based policy, the research ecosystem must produce trustworthy evidence. In the US, federal evidence clearinghouses evaluate research using published standards designed to identify “evidence-based” interventions. Because their evaluations can affect billions of dollars in funding, we examined 10 federal evidence clearinghouses. We found their standards focus on study design features such as randomization, but they overlook issues related to transparency, openness, and reproducibility that also affect whether research tends to produce true results. We identified intervention reports used by these clearinghouses, and we developed a method to assess journals that published those evaluations. Based on the Transparency and Openness Promotion (TOP) Guidelines, we created new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices). Of the 340 journals that published influential research, we found that some endorse the TOP Guidelines, but standards for transparency and openness are not applied systematically and consistently. Examining the quality of our tools, we also found varying levels of interrater reliability across different TOP standards. Using our results, we delivered a normative feedback intervention. We informed editors how their journals compare with TOP and with their peer journals. We also used the Theoretical Domains Framework to develop a survey concerning obstacles and facilitators to implementing TOP; 88 editors responded and reported that they are capable of implementing TOP but lack motivation to do so. The results of this program of research highlight ways to support and to assess transparency and openness throughout the evidence ecosystem.
Ferguson, L. M., Bertelmann, R., Bruch, C., Messerschmidt, R., Pampel, H., Schrader, A. C., Schultze-Motel, P., Weisweiler, N. L. (2021): Helmholtz Open Science Briefing: Gute (digitale) wissenschaftliche Praxis und Open Science: Support und Best Practices zur Umsetzung des DFG-Kodex „Leitlinien zur Sicherung guter wissenschaftlicher Praxis“ (Version 2.0). Potsdam: Helmholtz Open Science Office, 21 p. https://doi.org/10.48440/os.helmholtz.027
English abstract (translated from the German): The German Research Foundation's (DFG) code of conduct, “Guidelines for Safeguarding Good Research Practice,” has been in effect since 1 August 2019. Open Science aspects are relevant to many of the guidelines contained in the DFG Code. The Helmholtz Open Science Office provides this handout to address those aspects. Drawing on selected recommendations of the DFG Code, the handout describes in practical terms the relevance of Open Science to implementing the Code at the Helmholtz Centres. The aim of the Helmholtz Open Science Office is to provide impetus for anchoring Open Science in good (digital) scientific practice. [The present version 2.0 is an updated version of the handout.]
“In the most recent editorial for The Journal of Social Psychology (JSP), J. Grahe (2021) set out and justified a new journal policy: publishing papers now requires authors to make available all data on which claims are based. This places the journal amongst a growing group of forward-thinking psychology journals that mandate open data for research outputs.1 It is clear that the editorial team hopes to raise the credibility and usefulness of research in the journal, as well as the discipline, through increased research transparency….
This commentary represents a natural and complementary alliance between the ambition of JSP’s open data policy and the reality of how data sharing often takes place. We share with JSP the belief that usable and open data is good for social psychology and supports effective knowledge exchange within and beyond academia. For this to happen, we must have not just more open data, but open data that is of a sufficient quality to support repeated use and replication (Towse et al., 2020). Moreover, it is becoming clear that researchers across science are seeking guidance, training and standards for open data provision (D. Roche et al., 2021; Soeharjono & Roche, 2021). With this in mind, we outline several simple steps and point toward a set of freely available resources that can help make datasets more valuable and impactful. Specifically, we explain how to make data meaningful: easily findable, accessible, complete and understandable. We have provided a simple checklist (Table 1) and useful resources (Appendix A) based on our recommendations; these can also be found on the project page for this article (https://doi.org/10.17605/OSF.IO/NZ5WS). While we have focused mostly on sharing quantitative data, much of what has been discussed remains relevant to qualitative research (for an in-depth discussion of qualitative data sharing, see DuBois et al., 2018)….”
“This document is designed for journals and editorial boards that wish to establish a data policy. A data policy defines what the journal expects from its authors in terms of managing and sharing the data related to its publications.
This document is intended in particular for editors of journals in the humanities and social sciences, as they have been relatively less active in this area than their counterparts in science, technology and medicine. However, it can be useful to all editors, regardless of the disciplinary scope of their journal….”
Abstract: In this era of information overload and misinformation, it is a challenge to rapidly translate evidence-based health information to the public. Wikipedia is a prominent global source of health information with high traffic, multilingual coverage, and acceptable quality control practices. Viewership data following the Ebola crisis and during the COVID-19 pandemic reveals that a significant number of web users located health guidance through Wikipedia and related projects, including its media repository Wikimedia Commons and structured data complement, Wikidata.
The basic idea discussed in this paper is to increase and expedite health institutions’ global reach to the general public, by developing a specific strategy to maximize the availability of focused content into Wikimedia’s public digital knowledge archives. It was conceptualized from the experiences of leading health organizations such as Cochrane, the World Health Organization (WHO) and other United Nations Organizations, Cancer Research UK, National Network of Libraries of Medicine, and Centers for Disease Control and Prevention (CDC)’s National Institute for Occupational Safety and Health (NIOSH). Each has customized strategies to integrate content in Wikipedia and evaluate responses.
We propose the development of an interactive guide on the Wikipedia and Wikidata platforms to support health agencies, health professionals and communicators in quickly distributing key messages during crisis situations. The guide aims to cover basic features of Wikipedia, including adding key health messages to Wikipedia articles; citing expert sources to facilitate fact-checking; staging text for translation into multiple languages; automating metrics reporting; sharing non-text media; anticipating offline reuse of Wikipedia content in apps or virtual assistants; structuring data for querying and reuse through Wikidata; and profiling other flagship projects from major health organizations.
In the first phase, we propose the development of a curriculum for the guide using information from prior case studies. In the second phase, the guide would be tested on select health-related topics as new case studies. In its third phase, the guide would be finalized and disseminated.
“In this compendium, we compile Open Science guides with their specific features and fields of application. The book was made as part of a student seminar at the Hannover University of Applied Sciences and Arts in close cooperation with the TIB Open Science Lab as part of TIB Book Sprints R&D….”
“This Practical Guide provides guidance to ensure the long-term preservation and accessibility of research data, and supports organisations to provide a framework in which researchers can share their output in a sustainable way.
It includes three complementary maturity matrices for funders, performers, and data infrastructures. These allow them to evaluate the current status of their policies and practices, and to identify next steps towards sustainable data sharing and seeking alignment with other organisations in doing so….”
This briefing paper aims to support decision makers at research organisations and research funders in developing new monitoring exercises, or in assessing and improving existing processes, to measure the Open Access status of publications.
The availability of data and information on the current state of scholarly publishing is invaluable to help advance Open Access. Given the complexity of the scholarly publishing system, this involves a multitude of decisions.
This briefing paper provides recommendations on the three main questions an organisation should answer to develop a monitoring exercise: Why, What, and How?
Examples of different monitoring exercises have been selected to represent different use cases, organisational setups, data sources, and strategies of interpretation.
Abstract: Over the past three years, “Data Repository Selection-Criteria That Matter” – “a set of criteria for the identification and selection of those data repositories that accept research data submissions” – were developed by a group of publishers facilitated by the FAIRsharing initiative. Throughout this time, a large number of organizations and individuals have formulated responses and expressed concern about the criteria and the process through which the criteria were developed. Collectively, our organizations consider that the “Data Repository: Selection Criteria that Matter” recommendations – as currently conceived – will act as an impediment to achieving these aims. As such, we are issuing this Joint Position Statement to highlight the community’s concerns and request that the authors of these criteria respond with specific actions.
“The National Lottery Heritage Fund’s licensing requirement supports open access to the rich heritage in the UK and the exciting possibilities of digital transformation in the cultural sector. All materials created or digitised with grant funding are subject to this requirement, which was updated in September 2020.
Open licences and public domain dedications are tools that give the public permission to use materials typically protected by copyright and other laws….
This guide explains open licensing and provides a step-by-step approach to the open licensing requirement for each stage of your project.
It is aimed at The National Lottery Heritage Fund applicants and grantees but contains useful information for anyone who supports open access to cultural heritage….”