Sharing published short academic works in institutional repositories after six months | LIBER Quarterly: The Journal of the Association of European Research Libraries

Abstract:  The ambition of the Netherlands, laid down in the National Plan Open Science, is to achieve 100% open access for academic publications. This ambition was to be achieved by 2020; in practice, however, between 70% and 75% of articles are expected to be open access for that year. Until recently, the Netherlands has focused on the gold route – open access via journals and publishers’ platforms. This route is likely to be costly, and it cannot cover all articles and other publication types. Since 2015, the Dutch Copyright Act has offered an alternative with the implementation of Article 25fa (also known as the ‘Taverne Amendment’), facilitating the green route, i.e. open access via (trusted) repositories. This amendment allows researchers to share short scientific works (e.g. articles and book chapters in edited collections), regardless of any restrictive guidelines from publishers. From February 2019 until August 2019 all Dutch universities participated in the pilot ‘You Share, we Take Care!’ to test how this copyright amendment could be interpreted and implemented by institutions as a policy instrument to enhance green open access and “self-archiving”. In 2020 steps were taken to scale up further implementation of the amendment. This article describes the outcomes of the pilot and shares best practices on implementation and awareness activities in the period from the pilot until early 2021, in which libraries played an instrumental role in building trust and working on effective implementation at an institutional level. It concludes with possible next steps for alignment, for example at a European level.

 

Incorporating open science into evidence-based practice: The TRUST Initiative

Abstract:  To make evidence-based policy, the research ecosystem must produce trustworthy evidence. In the US, federal evidence clearinghouses evaluate research using published standards designed to identify “evidence-based” interventions. Because their evaluations can affect billions of dollars in funding, we examined 10 federal evidence clearinghouses. We found their standards focus on study design features such as randomization, but they overlook issues related to transparency, openness, and reproducibility that also affect whether research tends to produce true results. We identified intervention reports used by these clearinghouses, and we developed a method to assess journals that published those evaluations. Based on the Transparency and Openness Promotion (TOP) Guidelines, we created new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices). Of the 340 journals that published influential research, we found that some endorse the TOP Guidelines, but standards for transparency and openness are not applied systematically and consistently. Examining the quality of our tools, we also found varying levels of interrater reliability across different TOP standards. Using our results, we delivered a normative feedback intervention. We informed editors how their journals compare with TOP and with their peer journals. We also used the Theoretical Domains Framework to develop a survey concerning obstacles and facilitators to implementing TOP; 88 editors responded and reported that they are capable of implementing TOP but lack motivation to do so. The results of this program of research highlight ways to support and to assess transparency and openness throughout the evidence ecosystem.

 

Open Research Infrastructure Programs at LYRASIS

“Academic libraries, and institutional repositories in particular, play a key role in the ongoing quest for ways to gather metrics and connect the dots between researchers and research contributions in order to measure “institutional impact,” while also streamlining workflows to reduce administrative burden. Identifying accurate metrics and measurements for illustrating “impact” is a goal that many academic research institutions share, but these goals can only be met to the extent that all organizations across the research and scholarly communication landscape are using best practices and shared standards in research infrastructure. For example, persistent identifiers (PIDs) such as ORCID iDs (Open Researcher and Contributor Identifier) and DOIs (Digital Object Identifiers) have emerged as crucial best practices for establishing connections between researchers and their contributions while also serving as a mechanism for interoperability in sharing data across systems. The more institutions using persistent identifiers (PIDs) in their workflows, the more connections can be made between entities, making research objects more FAIR (findable, accessible, interoperable, and reusable). Also, when measuring institutional repository usage, clean, comparable, standards-based statistics are needed for accurate internal assessment, as well as for benchmarking with peer institutions….”
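
As a minimal illustration of the interoperability role described above, the sketch below shows how a bare PID can be turned into the resolvable URL that connects a researcher or research object across systems. The helper name `pid_to_url` and its two-scheme scope (DOIs and ORCID iDs only) are hypothetical choices for this example, not part of any particular repository platform:

```python
import re


def pid_to_url(pid: str) -> str:
    """Turn a bare persistent identifier into its resolvable URL.

    Recognises two common PID schemes: DOIs (which begin with a
    '10.' prefix) and ORCID iDs (four hyphenated groups of four
    characters, where the final character may be 'X').
    """
    pid = pid.strip()
    if pid.startswith("10."):
        return f"https://doi.org/{pid}"
    if re.fullmatch(r"\d{4}-\d{4}-\d{4}-\d{3}[\dX]", pid):
        return f"https://orcid.org/{pid}"
    raise ValueError(f"unrecognised PID: {pid!r}")
```

For example, `pid_to_url("0000-0002-1825-0097")` yields `https://orcid.org/0000-0002-1825-0097`. Because every system that stores the bare identifier can derive the same URL, the PID itself becomes the shared key for exchanging data between repositories, funder systems, and citation databases.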

Digital GLAM Spaces Conference

“Web accessibility and user experience are important for centering those who want to learn, research, and teach with digital and digitized cultural heritage. In 2018, the University of Oregon was awarded an Andrew W. Mellon Foundation grant to experiment and build collaboration capacity for how GLAM assets could be used in innovative ways for research.

Digital GLAM Spaces is a conference about building community around web accessibility and user experience. It’s a place for GLAM practitioners to share definitions and best practices for what UX and accessibility are; communicate digital strategies for incorporating user research into digital projects; and talk about the people, skillsets, and support needed to do this work better and make web accessibility and user experience part of our work rather than bolted on….”

Best Practices for Software Registries and Repositories | FORCE11

“Software is a fundamental element of the scientific process, and cataloguing scientific software is helpful to enable software discoverability. During the years 2019-2020, the Task Force on Best Practices for Software Registries of the FORCE11 Software Citation Implementation Working Group worked to create Nine Best Practices for Scientific Software Registries and Repositories. In this post, we explain why scientific software registries and repositories are important, why we wanted to create a list of best practices for such registries and repositories, the process we followed, what the best practices include, and what the next steps for this community are….”

Open Environmental Data Project

“We partner and collaborate with institutes, nonprofits, individuals and universities to help articulate best practices for new data commons models in the environmental context….

We provide insight that identifies, evaluates and summarizes scientific, legal, economic and cultural incentive and strategy levers for advancing environmental generative actions….”

Preprints in times of COVID19: the time is ripe for agreeing on terminology and good practices | BMC Medical Ethics | Full Text

Abstract:  Over recent years, the research community has been increasingly using preprint servers to share manuscripts that are not yet peer-reviewed. Although this enables quick dissemination of research findings, the practice raises several challenges in publication ethics and integrity. In particular, preprints have become an important source of information for stakeholders interested in COVID19 research developments, including traditional media, social media, and policy makers. Despite caveats about their nature, many users can still confuse preprints with peer-reviewed manuscripts. If unconfirmed but already widely shared first-draft results later prove wrong or misinterpreted, it can be very difficult to “unlearn” what we thought was true. Complexity further increases if unconfirmed findings have been used to inform guidelines. To help achieve a balance between early access to research findings and its negative consequences, we formulated five recommendations: (a) consensus should be sought on a term clearer than ‘preprint’, such as ‘unrefereed manuscript’, ‘manuscript awaiting peer review’ or ‘non-reviewed manuscript’; (b) caveats about unrefereed manuscripts should be prominent on their first page, and each page should include a red watermark stating ‘Caution—Not Peer Reviewed’; (c) preprint authors should certify that their manuscript will be submitted to a peer-review journal, and should regularly update the manuscript status; (d) high-level consultations should be convened to formulate clear principles and policies for the publication and dissemination of non-peer-reviewed research results; (e) in the longer term, an international initiative to certify servers that comply with good practices could be envisaged.

 

Guest Post – APC Waiver Policies; A Job Half-done? – The Scholarly Kitchen

“Most, if not all, open access publishers offer to waive publication charges (of whatever flavor) for researchers in lower and middle-income countries (LMICs) without access to funds to pay them. After all, no-one wants to see open access actually increasing barriers and reducing diversity and inclusion in direct opposition to one of its fundamental objectives. However, as an echo of the “build it and they will come” mentality, waiver policies may end up failing to achieve their intended outcome if they are poorly constructed and communicated to their intended beneficiaries. A recent study by INASP revealed that fully 60% of respondents to an AuthorAID survey had paid Article Processing Charges (APCs) from their own pockets, despite the widespread availability of waivers. This could be due to internal organizational bureaucracy, but is more likely due to a lack of awareness and understanding of APC waivers and how to claim them.

A White Paper published jointly by STM and Elsevier’s International Center for the Study of Research in September 2020 on how to achieve an equitable transition to open access included a specific recommendation to make publisher policies on APC waivers more consistent and more transparent. The authors commented, “Even though this business model may turn out to be an interim step on the road to universal open access, it is likely to persist for several years to come and may unwittingly end up preventing much important research from reaching its intended audience.”…”

Negotiating Open Access Journal Agreements: An Academic Library Case Study

 

The COVID-19 pandemic has presented an opportunity for academic libraries to advance open access (OA) to scholarly articles. Awareness among faculty of the importance of OA has increased significantly during the pandemic, as colleges and universities struggle financially and seek sustainable access to high-quality scholarly journals. Consortia have played an important role in establishing negotiation principles on OA journal agreements. While the number of OA agreements is increasing, case studies involving individual libraries are still limited. This paper reviews existing literature on publisher negotiation principles related to OA journal negotiations and reflects on recent cases at an academic library in Pennsylvania, in order to identify best practices in OA journal negotiations. It provides recommendations on roles, relationships, and processes, as well as essential terms of OA journal agreements. This study’s findings are most relevant to large academic libraries that are interested in negotiating with scholarly journal publishers independently or through consortia.

PID Strategy of Dutch Research Council (NWO) – PID Best Practices – The PID Forum

“The Dutch Research Council (NWO) has published its Persistent Identifier (PID) strategy to improve its capacity for analysing the impact of research. In the strategy, NWO describes how it will gradually implement PIDs in the coming years. PIDs are an increasingly important component of scholarly communication because of the increasing digitisation of research. They ensure that research is findable and help save researchers time and effort.

The NWO PID strategy can be summarised by the following five recommendations:

Implement ORCID iDs for researchers in grant application, peer review, and project reporting workflows.
Implement Crossref Grant ID in grant application and project reporting workflows.
Implement research organisation IDs in grant application and project reporting workflows.
Contribute to shaping the national PID landscape by participating in the ORCID-NL consortium and in a future PID Advisory Board.
Collaborate with other funders in the international PID landscape, for instance within the context of Science Europe….”
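
The first recommendation assumes that grant-application systems can tell well-formed ORCID iDs from mistyped ones. ORCID iDs end in a check character computed with the ISO 7064 MOD 11-2 algorithm, so a submission form can validate an iD offline before storing it. The sketch below (the helper names are hypothetical, chosen for this example) implements that check:

```python
def orcid_check_digit(base_digits: str) -> str:
    """Compute the ORCID iD check character (ISO 7064 MOD 11-2).

    `base_digits` is the 15-digit identifier without hyphens and
    without the final check character.
    """
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    # A result of 10 is written as the letter 'X'.
    return "X" if result == 10 else str(result)


def is_valid_orcid(orcid: str) -> bool:
    """Check a hyphenated ORCID iD such as '0000-0002-1825-0097'."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    if not digits[:15].isdigit():
        return False
    return orcid_check_digit(digits[:15]) == digits[15].upper()
```

For instance, `is_valid_orcid("0000-0002-1825-0097")` returns `True` (this is the example iD used in ORCID’s own documentation), while transposing or mistyping a digit makes the check fail. Running this kind of check at data entry keeps bad identifiers out of the workflows the recommendations describe.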

Analyzing Education Data with Open Science Best Practices, R, and OSF | OER Commons

“Overview: The webinar features Dr. Joshua Rosenberg from the University of Tennessee, Knoxville and Dr. Cynthia D’Angelo from the University of Illinois at Urbana-Champaign discussing best-practice examples for using R. They will present: a) general strategies for using R to analyze educational data and b) accessing and using data on the Open Science Framework (OSF) with R via the osfr package. This session is for both those new to R and those with R experience looking to learn more about strategies and workflows that can help make it possible to analyze data in a more transparent, reliable, and trustworthy way.”