Category Archives: oa.standards
Open Research Infrastructure Programs at LYRASIS
“Academic libraries, and institutional repositories in particular, play a key role in the ongoing quest for ways to gather metrics and connect the dots between researchers and research contributions in order to measure “institutional impact,” while also streamlining workflows to reduce administrative burden. Identifying accurate metrics and measurements for illustrating “impact” is a goal that many academic research institutions share, but these goals can only be met to the extent that all organizations across the research and scholarly communication landscape are using best practices and shared standards in research infrastructure. For example, persistent identifiers (PIDs) such as ORCID iDs (Open Researcher and Contributor IDs) and DOIs (Digital Object Identifiers) have emerged as crucial best practices for establishing connections between researchers and their contributions while also serving as a mechanism for interoperability in sharing data across systems. The more institutions use PIDs in their workflows, the more connections can be made between entities, making research objects more FAIR (findable, accessible, interoperable, and reusable). Also, when measuring institutional repository usage, clean, comparable, standards-based statistics are needed for accurate internal assessment, as well as for benchmarking with peer institutions….”
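What makes PIDs a mechanism for interoperability is that both DOIs and ORCID iDs resolve to machine-readable metadata over open HTTP services. The sketch below is illustrative, not from the post: it assumes network access and uses two public endpoints, doi.org content negotiation and the public ORCID API at pub.orcid.org, with well-known example identifiers (a real journal-article DOI and ORCID's long-standing demo record for "Josiah Carberry").

```python
import json
import urllib.request

def fetch_doi_metadata(doi: str) -> dict:
    """Resolve a DOI to machine-readable metadata via content negotiation.

    doi.org forwards the request to the registration agency (e.g. Crossref
    or DataCite), which can return Citation Style Language JSON.
    """
    req = urllib.request.Request(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def fetch_orcid_works(orcid_id: str) -> dict:
    """List the works attached to an ORCID record via the public ORCID API."""
    req = urllib.request.Request(
        f"https://pub.orcid.org/v3.0/{orcid_id}/works",
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Example identifiers; substitute any registered DOI / ORCID iD.
    article = fetch_doi_metadata("10.1038/s41586-020-2649-2")
    print("Article title:", article.get("title"))

    works = fetch_orcid_works("0000-0002-1825-0097")
    print("Works on ORCID record:", len(works.get("group", [])))
```

Because both identifiers dereference to structured records like this, a repository or CRIS can link a deposit to its author and its published version without manual lookups, which is the "connecting the dots" the post describes.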
Who we are – ENJOI – Science communication
“ENJOI (ENgagement and JOurnalism Innovation for Outstanding Open Science Communication) will explore and test engagement as a key asset of innovation in science communication distributed via media platforms, with a strong focus on journalism. Through a combination of methodologies and in collaboration with producers, target users and stakeholders of science communication, ENJOI will co-create and select a set of standards, principles and indicators (SPIs) condensed into a Manifesto for Outstanding Open Science Communication.
ENJOI will deploy a series of actions via Engagement Workshops, Labs, field and participatory research, evaluation and testing phases. It will also build an Observatory as its landmark product to make all results and outputs available to foster capacity building and collaboration of all actors in the field. ENJOI will work in four countries: Belgium, Italy, Portugal and Spain, taking into account different cultural contexts.
ENJOI’s ultimate goal is to improve science communication by making it more consistently reliable, truthful, open and engaging. At the same time, ENJOI will contribute to the active development of critical thinking, digital awareness and media literacy of all actors involved in the process….”
SPIs – ENJOI – Science communication
“In order to address this challenge [of disinformation], the ENJOI project is working to co-create and select a set of Standards, Principles and Indicators (SPIs) for Outstanding Open Science Communication (OOSC). What are SPIs?…
The Catalan Association of Science Communication (ACCC) was in charge of identifying and selecting the ENJOI SPIs. With the help of the ENJOI network, they surveyed existing academic literature, including books, explored grey literature, and consulted several experts to identify a set of documents that can be considered a representative sample of past efforts to define SPIs….”
JCORE upgrades to JATS 1.3 – Highwire Press
“HighWire’s journal-hosting solution JCORE can now transform XML to version 1.3 of the industry-standard Journal Article Tag Suite (JATS), the latest version. This means that publishers using the platform can now seamlessly comply with the specifications of this interoperable standard.
The British Medical Journal (BMJ) is the first JCORE publisher to offer the JATS 1.3 download for all of its OA content, which includes over 57,000 articles. This option may be attractive to BMJ and other publishers, as the ability to download full text in a machine-readable format is one of the strong recommendations made by cOAlition S within the Plan S Principles and Implementation guidance for publishers. …”
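To illustrate what a machine-readable JATS download buys downstream users, here is a minimal sketch (illustrative, not from the post). It assumes a JATS 1.3 article saved locally as article.xml and relies on the fact that JATS elements are not namespaced, so plain element paths work.

```python
import xml.etree.ElementTree as ET

# Minimal sketch: extract the title and DOI from a JATS article.
# JATS keeps bibliographic metadata in <front>/<article-meta>; the DOI
# appears as <article-id pub-id-type="doi">.
tree = ET.parse("article.xml")  # assumed local JATS 1.3 file
root = tree.getroot()           # the <article> element

# For simplicity this ignores any inline markup inside the title.
title = root.findtext("front/article-meta/title-group/article-title")

doi = None
for article_id in root.iterfind("front/article-meta/article-id"):
    if article_id.get("pub-id-type") == "doi":
        doi = article_id.text
        break

print("Title:", title)
print("DOI:  ", doi)
```

The same handful of lines works against any publisher's JATS output, which is exactly the interoperability argument behind Plan S's machine-readability recommendation.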
Access & License Indicators Revision | NISO website
“The NISO Access & License Indicators Working Group looks forward to receiving your comments! You may access the draft document, add a comment, and view comments received, which will be considered by the Working Group prior to final publication. Note: Comments on changed material (indicated in highlighted text) are prioritized.
This working group began its efforts in late 2020.
It is extending NISO RP-22-2015 Access and License Indicators (ALI) to add metadata and indicators that would allow metadata users, such as content platforms, to filter or target subsets of license information. This filtering or sub-setting would enable applications to determine whether their users can share a specific journal article version – or elements thereof – under specific contexts (e.g., sharing in researcher collaboration groups or on public profiles)….”
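For context, the existing recommended practice already defines two machine-readable elements in the ali namespace (http://www.niso.org/schemas/ali/1.0/): <ali:free_to_read> and <ali:license_ref>, typically carried in an article's permissions metadata; the revision would layer finer-grained sharing indicators on top of them. Below is a minimal sketch of how a content platform might read the current elements. The sample XML and the date logic are illustrative assumptions, not text from the draft.

```python
import xml.etree.ElementTree as ET
from datetime import date

# Namespace for the two ALI elements defined by NISO RP-22-2015.
ALI = "{http://www.niso.org/schemas/ali/1.0/}"

# Illustrative sample record, not taken from the draft under review.
SAMPLE = """<permissions xmlns:ali="http://www.niso.org/schemas/ali/1.0/">
  <ali:free_to_read/>
  <ali:license_ref start_date="2021-06-01">https://creativecommons.org/licenses/by/4.0/</ali:license_ref>
</permissions>"""

perms = ET.fromstring(SAMPLE)

# <ali:free_to_read/> asserts the content is free to read (optionally
# bounded by start_date/end_date attributes, omitted here).
print("Free to read:", perms.find(f"{ALI}free_to_read") is not None)

# <ali:license_ref> points at the license that applies from start_date on.
for ref in perms.iterfind(f"{ALI}license_ref"):
    start = ref.get("start_date")
    if start is None or date.fromisoformat(start) <= date.today():
        print("License in force:", ref.text.strip())
```

The proposed revision would let logic like this answer narrower questions, such as whether a given article version may be shared in a collaboration group or on a public profile.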
Access & License Indicators Revision | NISO website
“The NISO Access & License Indicators Working Group looks forward to receiving your comments! You may access the draft document, add a comment, and view comments received, which will be considered by the Working Group prior to final publication. Note: Comments on changed material (indicated in highlighted text) are prioritized.
This working group began its efforts in late 2020.
It is extending NISO RP-22-2015 Access and License Indicators (ALI) to add metadata and indicators that would allow metadata users, such as content platforms, to filter or target subsets of license information. This filtering or sub-setting would enable applications to determine whether their users can share a specific journal article version – or elements thereof – under specific contexts (e.g., sharing in researcher collaboration groups or on public profiles)….”
Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations | SpringerLink
Abstract: Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.
Principles and Standards for OA Arrangements Between Libraries/Consortia and Smaller Independent Publishers
“The transition to Open Access requires change on the part of all stakeholders, and it is particularly crucial that there is active cross-stakeholder alignment focused on enabling smaller independent publishers to transition successfully. In recognition of this, cOAlition S and ALPSP have asked us to convene groups to work on shared principles, data, licenses, and workflows as outlined in our recent report (see https://www.coalition-s.org/open-access-agreements-with-smaller-publishers-require-active-cross-stakeholder-alignment-report-says/).
We are seeking expressions of interest in engaging with this work. Ideally, we would like a diverse array of people who are knowledgeable about the topic, who can represent their communities and influence working practices, and who are backed by organisations willing to communicate, champion, implement and maintain the outputs that will emerge from this work….”
Open Access principles and standards – information power
[ORFG recommendations to the White House Office of Science and Technology Policy]
“This response to the White House Office of Science and Technology Policy’s “Request for Information To Improve Federal Scientific Integrity Policies” is submitted on behalf of the Open Research Funders Group….
The Open Research Funders Group is supportive of the White House Office of Science and Technology Policy’s commitment to explore good practices Federal agencies can adopt to improve scientific integrity, promote transparency, prioritize evidence-based decision making, and promote equity. We believe that the promotion of and adherence to open science principles is a catalytic enabling strategy in support of these goals. Specifically, we recommend that the OSTP prioritize making as much of the research lifecycle as possible openly available for access and reuse. This includes, but is not limited to, preregistrations, protocols, preprints, articles, data, code, and software. The rationale is simple. Research cannot be considered reliable unless it can be tested, replicated, and built upon. Making critical components of the research lifecycle unavailable hampers OSTP’s pursuit of scientific integrity at best, and renders it impossible at worst. Limiting access to research outputs has the further effect of rendering science opaque, which negatively impacts public trust in the research endeavor writ large….”
ORFG’s Response to OSTP’s Federal Scientific Integrity RFI — Open Research Funders Group
“The Open Research Funders Group (ORFG) is pleased to submit a formal response to the White House Office of Science and Technology Policy’s “Request for Information To Improve Federal Scientific Integrity Policies”. The comments, which may be found in their entirety here, encourage the federal government to prioritize making as much of the research lifecycle as possible openly available for access and reuse. This includes, but is not limited to, preregistrations, protocols, preprints, articles, data, code, and software. The rationale is simple. Research cannot be considered reliable unless it can be tested, replicated, and built upon. Making critical components of the research lifecycle unavailable hampers OSTP’s pursuit of scientific integrity at best, and renders it impossible at worst. Limiting access to research outputs has the further effect of rendering science opaque, which negatively impacts public trust in the research endeavor writ large….”
COAR releases resource types vocabulary version 3.0 for repositories with new look and feel – COAR
“We are pleased to announce the release of version 3.0 of the resource types vocabulary. Since 2015, three COAR Controlled Vocabularies have been developed and are maintained by the Controlled Vocabulary Editorial Board: Resource types, access rights and version types. These vocabularies have a new look and are now being managed using the iQvoc platform, hosted by the University of Vienna Library.
Using controlled vocabularies enables repositories to be consistent in describing their resources, helps with search and discovery of content, and allows machine readability for interoperability. The COAR vocabularies are available in several languages, supporting multilingualism across repositories. They also play a key role in making semantic artifacts and repositories compliant with the FAIR Principles, in particular when it comes to findability and interoperability….”
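In practice, the machine readability described here means that repository records carry stable vocabulary concept URIs rather than free-text type labels. A minimal sketch of that mapping step follows; the specific concept URIs and the local type labels are illustrative assumptions, and authoritative values should always be taken from the published vocabulary.

```python
# Illustrative sketch: map a repository's local, free-text resource types
# to COAR concept URIs when exporting records, so harvesters and
# aggregators can interpret the type field uniformly. The URIs below are
# examples; take authoritative values from the published vocabulary at
# https://vocabularies.coar-repositories.org/resource_types/
LOCAL_TO_COAR = {
    "article": "http://purl.org/coar/resource_type/c_6501",  # journal article
    "dataset": "http://purl.org/coar/resource_type/c_ddb1",  # dataset
    "thesis": "http://purl.org/coar/resource_type/c_46ec",   # thesis
}

def coar_type(local_type: str) -> str | None:
    """Return the COAR concept URI for a local type label, if mapped."""
    return LOCAL_TO_COAR.get(local_type.strip().lower())

record = {"title": "An example repository record", "type": "Article"}
print(coar_type(record["type"]) or "unmapped type: extend LOCAL_TO_COAR")
```

Because every repository that adopts the vocabulary emits the same URI for, say, a journal article, aggregators can filter and count resource types consistently across systems and languages.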
Controlled Vocabularies for Repositories: COAR Vocabularies Documentation
“Documentation for vocabularies developed and managed by COAR….: Resource Types…Access Rights…Version Types.”