CHECKLIST FOR OPEN ACCESS PUBLISHERS ON IMPLEMENTING THE UNESCO RECOMMENDATION ON OPEN SCIENCE

“This document is part of the UNESCO Open Science Toolkit, designed to support implementation of the UNESCO Recommendation on Open Science. It has been produced in partnership with the Open Access Scholarly Publishing Association (OASPA), a diverse community of organizations engaged in open scholarship. The aim is to provide practical assistance to the open access publishing community to better understand the Recommendation by highlighting the areas that apply to open access publishers who wish to support its implementation….”

 

Incentivizing Collaborative Open Research (ICOR) Envisions a Culture That Rewards Open Science – SPARC

“The sweeping movement towards open research has set in motion changes across funding bodies, institutions, and scholars. For open research to take off, sharing at all stages of the research cycle needs to be easy and the benefits explicitly recognized.

A new project is cataloguing best practices and promoting real incentives to work in the open, with the aim of improving reproducibility and accelerating outcomes to advance science.

Incentivizing Collaborative Open Research (ICOR) began in 2020 with discussions among a circle of 20 strategists led by Kristen Ratan, founder of Strategies for Open Science (Stratos), and Sarah Greene, founder of Rapid Science. The team set out to identify policies, tools, and practices that can be tested to provide evidence regarding the power of operating in the open. A goal of ICOR is to highlight pockets of innovation and to connect researchers with concrete practices that ease and improve their work in open science….”

Enhancing community participation & understanding of preprint review – YouTube

“In this video, Jane Alfred (Catalyst Editorial Ltd) and Iratxe Puebla (ASAPbio) provide an overview of preprint review, its benefits to researchers and the research community and different platforms available for preprint review. The video also discusses good practices in preprint review and ways in which individual researchers can participate in preprint review.”

New report provides insights into global OA landscape — and with a focus on China

A new report released today provides insights into the complex and evolving global Open Access landscape, with a particular focus on China. The report is the product of a collaboration between the STM Association and the China Association for Science and Technology (CAST) focused on the bilateral sharing of ideas and best practices in OA publishing.

 

RI Webinar: New models of publishing: Libraries as publishers – 1579092

“Summary

In an effort to short-circuit the traditional publishing route, libraries are increasingly publishing their own material. What are the upsides to this route, and what problems are associated with it? What tools are available to libraries that want to publish?

 

Key takeaways

An insight into the scholarly publishing landscape
Hear about the latest technologies and new models of publishing that can support the needs of the scholarly community
Learn about best practices and shared expertise in library publishing…”

RI Webinar: Conforming to the REF: An international view – 1579136

“Summary

The REF (Research Excellence Framework) is a UK-specific measure for research institutions to assess the quality of their research output and is pertinent to libraries, research offices, university planning departments and institutions. This webcast, however, will aim to place the REF in a global context and explore other frameworks that are in place around the world.

Key takeaways

Learn how the outputs of scholarly research are evaluated globally

Hear from experts about best practices in the assessment of researchers and scholarly research

Key insights into why representation of researchers in the design of research assessment practices across the world is crucial…”

Producing Open Data

Abstract:  Open data offer the opportunity to economically combine data into large-scale datasets, fostering collaboration and re-use in the interest of treating researchers’ resources, as well as study participants, with care. While the advantages of utilising open data might be self-evident, the production of open datasets also challenges individual researchers. This is especially true for open data that include personal data, for which stricter requirements have been legislated. Mainly building on our own experience as scholars from different research traditions (life sciences, social sciences and humanities), we describe best-practice approaches for opening up research data. We reflect on common barriers and strategies to overcome them, condensed into a step-by-step guide focused on actionable advice, in order to mitigate the costs and promote the benefits of open data on three levels at once: society, the disciplines and individual researchers. Our contribution may prevent researchers and research units from re-inventing the wheel when opening data and enable them to learn from our experience.

 

Intelligent open science: viral genomic data sharing during the COVID-19 pandemic – GOV.UK

“A case study on how data was shared across borders during the coronavirus pandemic, and best practice for responding to future global emergencies….

While genomic sequencing data was shared more quickly and widely than ever before during the COVID-19 pandemic, in many cases it was shared too late, or in too partial a form, to support the emergency response.

There is broad consensus that existing norms for data sharing are not well-adapted to an emergency context in which near real-time sharing is the desired goal.

Following the open science commitments made during the UK’s G7 Presidency, BEIS (the Department for Business, Energy & Industrial Strategy) commissioned this study to add depth and precision to existing recommendations on:

data sharing across borders
related research practice
related cultural issues

The findings are intended to inform understanding of open science best practice in responding to future global emergencies….”

Nine best practices for research software registries and repositories [PeerJ]

Abstract:  Scientific software registries and repositories improve software findability and research transparency, provide information for software citations, and foster preservation of computational methods in a wide range of disciplines. Registries and repositories play a critical role by supporting research reproducibility and replicability, but developing them takes effort and few guidelines are available to help prospective creators of these resources. To address this need, the FORCE11 Software Citation Implementation Working Group convened a Task Force to distill the experiences of the managers of existing resources in setting expectations for all stakeholders. In this article, we describe the resultant best practices which include defining the scope, policies, and rules that govern individual registries and repositories, along with the background, examples, and collaborative work that went into their development. We believe that establishing specific policies such as those presented here will help other scientific software registries and repositories better serve their users and their disciplines.

 

Author interview: Nine best practices for software repositories and registries

PeerJ talks to Daniel Garijo about the recently published PeerJ Computer Science article Nine best practices for research software registries and repositories. The article is featured in the PeerJ Software Citation, Indexing, and Discoverability Special Issue.


 

Can you tell us a bit about yourself?

This work would not have been possible without the SciCodes community, the participants of the 2019 Scientific Software Registry Collaboration Workshop, and the FORCE11 Software Citation Implementation Working Group. It all started when a task force of that working group undertook the initial work that is detailed in the paper, and then formed SciCodes to continue working together. We are a group of software enthusiasts who maintain and curate research software repositories and registries from different disciplines, including geosciences, neuroscience, biology, and astronomy (currently, more than 20 resources and 30 participants worldwide are members of the initiative).

 

Can you briefly explain the research you published in PeerJ?

In examining the literature, we found best practices and policy suggestions for many different aspects of science, software, and data, but none that specifically addressed software repositories and registries. Our goal was to examine our own and other similar resources, share practices, discuss common challenges, and develop a set of basic best practices for these resources.  

 

What did you find? And how are these practices having an impact?

We were surprised to find a lot of diversity between our resources. We expected that our domains, missions, and the types of software in our collections would be different, but we expected more commonality in the software metadata our different resources collect! We had far fewer fields in common than expected. For example, some resources might collect information on what operating system a software package runs on, while other resources may not. In retrospect, this makes sense, since disciplines have different goals and expectations for the sharing and reusability of research software, and differ in how heterogeneous (or not) the technology they use is.

 

The practices outlined in our work aim to strengthen registries and repositories by enacting policies that make our resources more transparent to our users and by encouraging us to think more about the long-term availability of software entries. They also provide a way for us to work cooperatively to make our metadata searchable across resources, as software that is useful in one field may have applications in another.

Our proposed practices are already having an impact. They have helped member registries audit their practices and begin enacting policies and procedures that strengthen them. By doing so, they encourage long-term success for their communities. Through this paper, we hope that other registries will find these practices useful and, just maybe, contribute to the conversation by joining SciCodes.
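To make the metadata-diversity point above concrete, the following is a minimal sketch, in Python, of the kind of record a software registry might collect, using field names from the public CodeMeta/schema.org vocabulary. The record itself is hypothetical; it is not the schema used by SciCodes or by any particular registry.

import json

# Hypothetical CodeMeta-style record (codemeta.json) illustrating software
# metadata fields that some registries collect and others do not. Field names
# come from the public CodeMeta/schema.org vocabulary; the values are invented.
record = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "example-analysis-tool",          # hypothetical package name
    "description": "Toy entry showing commonly collected fields.",
    "programmingLanguage": "Python",
    "operatingSystem": ["Linux", "macOS"],    # a field only some registries record
    "license": "https://spdx.org/licenses/MIT",
    "codeRepository": "https://example.org/example-analysis-tool",
}

with open("codemeta.json", "w") as fh:
    json.dump(record, fh, indent=2)

A registry could index records like this on whichever fields it cares about (operatingSystem, for instance) and simply ignore the fields it does not collect, which is one way the cross-registry metadata search mentioned above could work.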

 

What kinds of lessons do you hope your readers take away from the research?

We hope the proposed practices will help new and existing resources consider key aspects of their maintainability, metadata and future availability. We expected that the process of converging on common practices would be easy, but developing policies and practices that cover a wide range of disciplines and missions was challenging. We are grateful to our funders for enabling us to convene such a great group of experts and, of course, to the experts for contributing their time to help make our initial draft better.

 

How did you first hear about PeerJ, and what persuaded you to submit to us?

An editor of this special issue on software citation, indexing and discoverability (https://peerj.com/special-issues/84-software) mentioned that this would be an interesting paper for the community. While not fitting neatly into this category, we felt that the workshop discussions and resulting best practices contribute substantially to the software citation ecosystem, as repositories and registries are a mechanism to promote discovery, reuse, and credit for software.

 


You can find more PeerJ author interviews here.

FAIREST: A Framework for Assessing Research Repositories | Data Intelligence | MIT Press

Abstract:  The open science movement has gained significant momentum within the last few years. This comes along with the need to store and share research artefacts, such as publications and research data. For this purpose, research repositories need to be established. A variety of solutions exist for implementing such repositories, covering diverse features, ranging from custom depositing workflows to social media-like functions.

In this article, we introduce the FAIREST principles, a framework inspired by the well-known FAIR principles, but designed to provide a set of metrics for assessing and selecting solutions for creating digital repositories for research artefacts. The goal is to support decision makers in choosing such a solution when planning for a repository, especially at an institutional level. The metrics included are therefore based on two pillars: (1) an analysis of established features and functionalities, drawn from existing dedicated, general purpose and commonly used solutions, and (2) a literature review on general requirements for digital repositories for research artefacts and related systems. We further describe an assessment of 11 widespread solutions, with the goal of providing an overview of the current landscape of research data repository solutions, identifying gaps and research challenges to be addressed.

CU Boulder receives collaborative national grant for open science project | University Libraries | University of Colorado Boulder

“This multi-year research project aspires to establish community-informed recommendations on how to assign persistent identifiers like Digital Object Identifiers (DOIs) and Research Resource Identifiers (RRIDs) to research facilities and instrumentation. CU Boulder is also working with the National Center for Atmospheric Research (NCAR) and Florida State University on the project to strengthen coordination among researchers in order to advance FAIR data principles and open science practices. 

The “Findable Accessible Interoperable Reusable (FAIR) Open Science Facilities and Instruments” project is one of 10 projects funded by the US National Science Foundation (NSF) as part of its Findable, Accessible, Interoperable, Reusable, Open Science Research Coordination Networks (FAIROS RCN) program. FAIR is a set of international principles that focus on making scientific research more open and transparent.

Johnson said these projects are part of a nationwide attempt to establish norms and best practices that strengthen coordination among researchers and advance FAIR data principles and open science practices. …”
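For readers unfamiliar with how persistent identifiers behave in practice, the sketch below shows one common pattern: resolving a DOI to machine-readable metadata via DOI content negotiation, as documented by the DOI registration agencies at crosscite.org. The DOI used here is a hypothetical placeholder, not an identifier minted by this project, and the exact fields returned depend on the registration agency.

# Resolve a DOI to DataCite JSON metadata via content negotiation at doi.org.
# The DOI below is hypothetical and will not resolve; substitute a real one.
import requests

doi = "10.1234/example-instrument"  # hypothetical DOI

response = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.datacite.datacite+json"},
    timeout=30,
)
response.raise_for_status()
record = response.json()

# Typical DataCite fields include titles, creators, publisher, and publicationYear.
print(record.get("titles"), record.get("creators"), record.get("publicationYear"))

Identifiers for facilities and instruments, such as the DOIs and RRIDs discussed above, are useful precisely because they make this kind of programmatic lookup and citation possible.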

Revised principles of transparency and best practice released | OASPA

A revised version of the Principles of Transparency and Best Practice in Scholarly Publishing has been released by four key scholarly publishing organizations today. These guiding principles are intended as a foundation for best practice in scholarly publishing to help existing and new journals reach the best possible standards. 

The fourth edition of the Principles represents a collective effort among the four organizations to align the principles with today’s scholarly publishing landscape. The last update was in 2018, and the landscape has changed since then. Guidance is provided on the information that should be made available on websites, peer review, access, author fees and publication ethics. The principles also cover ownership and management, copyright and licensing, and editorial policies. They stress the need for inclusivity in scholarly publishing and emphasize that editorial decisions should be based on merit and not affected by factors such as the origin of the manuscript or the nationality, political beliefs or religion of the author.

 

How Figshare meets the NIH ‘Desirable Characteristics for Data Repositories’ – a help article for using figshare

“The new NIH Policy for Data Management and Sharing (effective January 25, 2023) includes supplemental information on Selecting a Data Repository (NOT-OD-21-016), which outlines the data repository characteristics that researchers should seek out when sharing their NIH-funded research data and materials.

Figshare.com is an appropriate and well-established generalist repository for researchers to permanently store the datasets and other materials produced from their NIH-funded research and to include in their NIH Data Management and Sharing Plans. Figshare+ uses the same repository infrastructure to offer support for sharing large datasets, with transparent costs that can be included in funding proposal budgets. Note that Figshare may also be included in Data Management and Sharing Plans in combination with discipline-specific repositories, for sharing any research outputs that may not be accepted in more specialized repositories. Figshare is currently working with NIH as part of their Generalist Repository Ecosystem Initiative to continue enhancing our support for NIH-funded researcher needs.

Figshare repositories offer established repository infrastructure, including adherence to community best practices and standards for persistence, provenance, and discoverability, with the flexibility to share any file type and any type of research material and documentation. Figshare makes it easy to share your data in a way that is citable and reusable and to get credit for all of your work.

Figshare is listed as a recommended data sharing resource in the following: 

NIH Scientific Data Sharing: Generalist Repositories
NIH National Library of Medicine (NLM): Generalist Repositories
NIH HEAL Initiative Recommended Repositories
Nature’s Data Repository Guidance …”
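As a rough illustration of what depositing to a generalist repository can look like operationally, here is a minimal Python sketch that creates a private draft item using what we understand to be the public Figshare REST API (v2) with a personal access token. The endpoint, header format, and metadata fields are assumptions to be checked against the current Figshare API documentation; attaching files, reserving a DOI, and publishing are separate follow-up calls not shown here.

# Create a private draft item on Figshare (API v2 assumed); verify endpoints and
# fields against the current Figshare API documentation before relying on this.
import requests

API = "https://api.figshare.com/v2"
TOKEN = "YOUR-PERSONAL-ACCESS-TOKEN"  # placeholder; never commit real tokens
headers = {"Authorization": f"token {TOKEN}"}

item = {
    "title": "Example NIH-funded dataset",  # hypothetical metadata
    "description": "Dataset shared under an NIH Data Management and Sharing Plan.",
    "keywords": ["open data", "NIH DMS Plan"],
}

response = requests.post(f"{API}/account/articles", json=item, headers=headers, timeout=30)
response.raise_for_status()
print("Draft item created:", response.json().get("location"))

Once the draft exists, files are uploaded and the item is published in later calls, at which point the record becomes citable with its own DOI.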