“Three Paths to Open Access is a handout that can be shared with researchers to provide an overview of three common options for making their work open access. The content can be edited to better reflect your institution’s open access support services. For a more in-depth exploration of this topic, see our YouTube video, Three Routes to Open Access: https://www.youtube.com/watch?v=hkSLywLnS9c …”
Open-source science builds on open and free resources that include data, metadata, software, and workflows. Informed decisions on whether and how to (re)use digital datasets depend on an understanding of the quality of the underpinning data and relevant information. However, quality information, being difficult to curate and often context specific, is currently not readily available for sharing within and across disciplines. To help address this challenge and promote the creation and (re)use of freely and openly shared information about the quality of individual datasets, members of several groups around the world, collaborating with international domain experts, have undertaken an effort to develop international community guidelines with practical recommendations for the Earth science community. The guidelines were inspired by the guiding principles of being findable, accessible, interoperable, and reusable (FAIR). Use of the FAIR dataset quality information guidelines is intended to help stakeholders, such as scientific data centers, digital data repositories, and producers, publishers, stewards and managers of data, to: i) capture, describe, and represent quality information of their datasets in a manner that is consistent with the FAIR Guiding Principles; ii) allow for the maximum discovery, trust, sharing, and reuse of their datasets; and iii) enable international access to and integration of dataset quality information. This article describes the processes that developed the guidelines that are aligned with the FAIR principles, presents a generic quality assessment workflow, describes the guidelines for preparing and disseminating dataset quality information, and outlines a path forward to improve their disciplinary diversity.
“Below are some of the fundamental guidelines of transformative agreements, as defined by the ESAC Initiative community; a listing of the specific requirements that have been adopted by national consortia and other organizations can be found here https://esac-initiative.org/guidelines/. …”
“From optimising supply chains and supporting innovation, to addressing sector challenges and delivering public services, we have seen that sharing data can generate benefits for companies, the economy, society and the environment.
However, a common concern for organisations looking to share data is in providing assurance to senior leaders that sharing a particular set of data will not generate negative impacts on reputation; compromise legal compliance or negatively affect their place in the market; or cause harm to society, the economy or the environment.
With this in mind, we’ve created this guide to help organisations identify, assess and manage risks related to sharing data that they hold.
This guide seeks to provide early steps – prior to seeking legal counsel (if that is required) – to consider real and perceived risks in sharing data to identify suitable mitigating actions. We include typical risk categories, key questions to consider and suggestions on how to minimise harm….”
Abstract: Are you interested in the field of scholarly communications or have you recently been hired at your institution as the director of scholarly initiatives? The concepts presented in Sustaining and Enhancing the Scholarly Communications Department: A Comprehensive Guide by Kris S. Helge, Ahmet Meti Tmava, and Amanda R. Zerangue provide guidance for the scholarly communications librarian, especially those new to the profession.
Wissenschaftsrat (2022): Empfehlungen zur Transformation des wissenschaftlichen Publizierens zu Open Access; Köln. DOI: https://doi.org/10.57674/fyrc-vb61
“In May 2015, the Center for Open Science invited Epidemiology to support the Transparency and Openness Promotion (TOP) Guidelines.1 After consulting our editors and former Editors-in-Chief, I declined this invitation and published an editorial to explain the rationale.2 Nonetheless, the Center for Open Science has assigned a TOP score to the journal and disseminated the score via Clarivate, which also disseminates the Journal Impact Factor. Given that Epidemiology has been scored despite opting not to support the TOP Guidelines, and that our score has been publicized by the Center for Open Science, we here restate and expand our concerns with the TOP Guidelines and emphasize that the guidelines are at odds with Epidemiology’s mission and principles. We declined the invitation to support the TOP Guidelines for three main reasons. First, Epidemiology prefers that authors, reviewers, and editors focus on the quality of the research and the clarity of its presentation over adherence to one-size guidelines. For this reason, among others, the editors of Epidemiology have consistently declined opportunities to endorse or implement endeavors such as the TOP Guidelines.3–5 Second, the TOP Guidelines did not include a concrete plan for program evaluation or revision. Well-meaning guidelines with similar goals sometimes have the opposite of their intended effect.6 Our community would never accept a public health or medical intervention that had little evidence to support its effectiveness (more on that below) and no plan for longitudinal evaluation. We hold publication guidelines to the same standard. Third, we declined the invitation to support the TOP Guidelines because they rest on the untenable premise that each research article’s results are right or wrong, as eventually determined by whether its results are reproducible or not. 
Too often, and including in the study of reproducibility that was foundational in the promulgation of the TOP Guidelines,7 reproducibility is evaluated by whether results are concordant in terms of statistical significance. This faulty approach has been used frequently, even though the idea that two results—one statistically significant and the other not—are necessarily different from one another is a well-known fallacy.8,9 ”
“Whether starting to develop your own open access strategy or assessing a publisher “read and publish” offer for the first time, adapting to the changes underway in the scholarly publishing landscape can be daunting. Luckily, clear signposts have emerged and, thanks to the excellent resources shared by the community, there is no need to re-invent the wheel.
The ESAC Reference Guide is the narrative manifestation of a mapping exercise conducted in Spring 2021 by members of the international ESAC community who have accumulated deep, first-hand knowledge and expertise in the negotiation and implementation of transformative agreements with scholarly publishers. Threading together and contextualizing the many local guidelines, recommendations, toolkits, templates and data openly available, the reference guide serves as an authoritative and essential orientation for librarians and consortium staff just beginning to approach or looking to update their transformative agreement strategies based on the latest benchmarks.
The ESAC Reference Guide develops through the phases of preparing, negotiating and implementing an agreement, but libraries and library consortia each have their own unique starting points, and the steps they take in adopting transformative agreements will have local flavors….”
Abstract: In this short practice paper, we introduce the public version of the Qualitative Data Repository’s (QDR) Curation Handbook. The Handbook documents and structures curation practices at QDR. We describe the background and genesis of the Handbook and highlight some of its key content.
Abstract: To make evidence-based policy, the research ecosystem must produce trustworthy evidence. In the US, federal evidence clearinghouses evaluate research using published standards designed to identify “evidence-based” interventions. Because their evaluations can affect billions of dollars in funding, we examined 10 federal evidence clearinghouses. We found their standards focus on study design features such as randomization, but they overlook issues related to transparency, openness, and reproducibility that also affect whether research tends to produce true results. We identified intervention reports used by these clearinghouses, and we developed a method to assess journals that published those evaluations. Based on the Transparency and Openness Promotion (TOP) Guidelines, we created new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices). Of the 340 journals that published influential research, we found that some endorse the TOP Guidelines, but standards for transparency and openness are not applied systematically and consistently. Examining the quality of our tools, we also found varying levels of interrater reliability across different TOP standards. Using our results, we delivered a normative feedback intervention. We informed editors how their journals compare with TOP and with their peer journals. We also used the Theoretical Domains Framework to develop a survey concerning obstacles and facilitators to implementing TOP; 88 editors responded and reported that they are capable of implementing TOP but lack motivation to do so. The results of this program of research highlight ways to support and to assess transparency and openness throughout the evidence ecosystem.
“A Researcher’s Guide to Working With Public Access Books…”
Not even an abstract is OA.
Ferguson, L. M., Bertelmann, R., Bruch, C., Messerschmidt, R., Pampel, H., Schrader, A. C., Schultze-Motel, P., Weisweiler, N. L. (2021): Helmholtz Open Science Briefing: Gute (digitale) wissenschaftliche Praxis und Open Science: Support und Best Practices zur Umsetzung des DFG-Kodex „Leitlinien zur Sicherung guter wissenschaftlicher Praxis“ (Version 2.0), (Helmholtz Open Science Briefing), Potsdam : Helmholtz Open Science Office, 21 p. https://doi.org/10.48440/os.helmholtz.027
English abstract (via deepl.com): The German Research Foundation (DFG) code “Guidelines for Safeguarding Good Research Practice” has been in effect since 1 August 2019. Open Science aspects are relevant to many of the guidelines contained in the DFG Code. The Helmholtz Open Science Office provides this handout to address these aspects. Drawing on selected recommendations of the DFG Code, the handout describes in practical terms the relevance of Open Science to implementing the Code at the Helmholtz Centres. The aim of the Helmholtz Open Science Office is to provide impulses for anchoring Open Science in good (digital) scientific practice. [The present version 2.0 is an updated version of the handout.]
Orig. German abstract: Seit dem 01.08.2019 ist der Kodex „Leitlinien zur Sicherung guter wissenschaftlicher Praxis“ der Deutschen Forschungsgemeinschaft (DFG) gültig. Für viele der im DFG-Kodex enthaltenen Leitlinien sind Open-Science-Aspekte relevant. Das Helmholtz Open Science Office stellt für diese Aspekte die vorliegende Handreichung bereit. Diese Handreichung beschreibt praxisnah anhand ausgewählter Empfehlungen des DFG-Kodexes die Relevanz von Open Science bei der Implementierung des Kodexes an den Helmholtz-Zentren. Anliegen des Helmholtz Open Science Office ist es, mit dieser Handreichung Impulse zur Verankerung von Open Science in der guten (digitalen) wissenschaftlichen Praxis zu geben. [Die vorliegende Version 2.0 ist eine aktualisierte Version der Handreichung.]
A Guide to Publishing Research
A Guide to Sharing Your Research Online
A Guide to Research Data Management
A Guide to Copyright and Creative Commons in Research
A Guide to Open Access
“In the most recent editorial for The Journal of Social Psychology (JSP), J. Grahe (2021) set out and justified a new journal policy: publishing papers now requires authors to make available all data on which claims are based. This places the journal amongst a growing group of forward-thinking psychology journals that mandate open data for research outputs.1 It is clear that the editorial team hopes to raise the credibility and usefulness of research in the journal, as well as the discipline, through increased research transparency….
This commentary represents a natural and complementary alliance between the ambition of JSP’s open data policy and the reality of how data sharing often takes place. We share with JSP the belief that usable and open data is good for social psychology and supports effective knowledge exchange within and beyond academia. For this to happen, we must have not just more open data, but open data that is of a sufficient quality to support repeated use and replication (Towse et al., 2020). Moreover, it is becoming clear that researchers across science are seeking guidance, training and standards for open data provision (D. Roche et al., 2021; Soeharjono & Roche, 2021). With this in mind, we outline several simple steps and point toward a set of freely available resources that can help make datasets more valuable and impactful. Specifically, we explain how to make data meaningful: easily findable, accessible, complete and understandable. We have provided a simple checklist (Table 1) and useful resources (Appendix A) based on our recommendations; these can also be found on the project page for this article (https://doi.org/10.17605/OSF.IO/NZ5WS). While we have focused mostly on sharing quantitative data, much of what has been discussed remains relevant to qualitative research (for an in-depth discussion of qualitative data sharing, see DuBois et al., 2018)….”