Revised principles of transparency and best practice released | OASPA

A revised version of the Principles of Transparency and Best Practice in Scholarly Publishing has been released by four key scholarly publishing organizations today. These guiding principles are intended as a foundation for best practice in scholarly publishing to help existing and new journals reach the best possible standards. 

The fourth edition of the Principles represents a collective effort among the four organizations to align the principles with today’s scholarly publishing landscape, which has changed considerably since the last update in 2018. Guidance is provided on the information that should be made available on websites, peer review, access, author fees, and publication ethics. The principles also cover ownership and management, copyright and licensing, and editorial policies. They stress the need for inclusivity in scholarly publishing and emphasize that editorial decisions should be based on merit and not affected by factors such as the origin of the manuscript or the nationality, political beliefs, or religion of the author.

How Figshare meets the NIH ‘Desirable Characteristics for Data Repositories’ – a help article for using figshare

“The new NIH Policy for Data Management and Sharing (effective January 25, 2023) includes supplemental information on Selecting a Data Repository (NOT-OD-21-016), which outlines the data repository characteristics that researchers should seek out when sharing their NIH-funded research data and materials. 

Figshare.com is an appropriate and well-established generalist repository for researchers to permanently store the datasets and other materials produced from their NIH-funded research and to include in their NIH Data Management and Sharing Plans. Figshare+ uses the same repository infrastructure to offer support for sharing large datasets, with transparent costs that can be included in funding proposal budgets. Note that Figshare may also be included in Data Management and Sharing Plans in combination with discipline-specific repositories, for sharing any research outputs that may not be accepted in more specialized repositories. Figshare is currently working with NIH as part of their Generalist Repository Ecosystem Initiative to continue enhancing our support for the needs of NIH-funded researchers. 

Figshare repositories offer established repository infrastructure including adherence to community best practices and standards for persistence, provenance, and discoverability with the flexibility to share any file type and any type of research material and documentation. Figshare makes it easy to share your data in a way that is citable and reusable and to get credit for all of your work. 

Figshare is listed as a recommended data sharing resource in the following: 

NIH Scientific Data Sharing: Generalist Repositories
NIH National Library of Medicine (NLM): Generalist Repositories
NIH HEAL Initiative Recommended Repositories
Nature’s Data Repository Guidance …”

Analysis of Harvard Medical School Countway Library’s MOOC Course, Best Practices for Biomedical Research Data Management: Learner Demographics and Motivations

Abstract:  The Harvard Medical School Countway Library’s Massive Open Online Course (MOOC) Best Practices for Biomedical Research Data Management launched on Canvas in January 2018. This report analyzes learner-reported data and course-generated analytics for the course from March 2020 through June 2021. This analysis focuses on three subsets of participant data during the pandemic to understand global learner demographics and interest in biomedical research data management. 

Adoption of World Health Organization Best Practices in Clinical Trial Transparency Among European Medical Research Funder Policies | Global Health | JAMA Network Open | JAMA Network

Abstract:  Importance  Research funders can reduce research waste and publication bias by requiring their grantees to register and report clinical trials.

Objective  To determine the extent to which 21 major European research funders’ efforts to reduce research waste and publication bias in clinical trials meet World Health Organization (WHO) best practice benchmarks and to investigate areas for improvement.

Design, Setting, and Participants  This cross-sectional study was based on 2 to 3 independent assessments of each funder’s publicly available documentation and validation of results with funders during 2021. Included funders were the 21 largest nonmultilateral public and philanthropic medical research funders in Europe, with a combined budget of more than US $22 billion.

Exposures  Scoring of funders using an 11-item assessment tool based on WHO best practice benchmarks, grouped into 4 broad categories: trial registries, academic publication, monitoring, and sanctions. Funder references to reporting standards were captured.

Main Outcomes and Measures  The primary outcome was funder adoption or nonadoption of 11 policy and monitoring measures to reduce research waste and publication bias as set out by WHO best practices. The secondary outcomes were whether and how funder policies referred to reporting standards. Outcomes were preregistered after a pilot phase that used the same outcome measures.

Results  Among 21 of the largest nonmultilateral public and philanthropic funders in Europe, some best practices were more widely adopted than others, with 14 funders (66.7%) mandating prospective trial registration and 6 funders (28.6%) requiring that trial results be made public on trial registries within 12 months of trial completion. Less than half of funders actively monitored whether trials were registered (9 funders [42.9%]) or whether results were made public (8 funders [38.1%]). Funders implemented a mean of 4 of 11 best practices in clinical trial transparency (36.4%) set out by WHO. The extent to which funders adopted WHO best practice items varied widely, ranging from 0 practices for the French Centre National de la Recherche Scientifique and the ministries of health of Germany and Italy to 10 practices (90.9%) for the UK National Institute for Health Research. Overall, 9 funders referred to reporting standards in their policies.

Conclusions and Relevance  This study found that many European medical research funder policy and monitoring measures fell short of WHO best practices. These findings suggest that funders worldwide may need to identify and address gaps in policies and processes.
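
The percentages in the Results above are simple proportions of the 21 funders assessed (or, for the item-level scores, of the 11 WHO best-practice items). A minimal sketch of the arithmetic, using only the counts given in the abstract:

```python
# Reproduce the percentages in the abstract from the raw counts.
# All counts come directly from the Results section above.

N_FUNDERS = 21   # funders assessed
N_ITEMS = 11     # WHO best-practice items in the assessment tool

def pct(numerator: int, denominator: int) -> float:
    """Proportion expressed as a percentage, rounded to one decimal."""
    return round(100 * numerator / denominator, 1)

# Funder-level adoption counts
assert pct(14, N_FUNDERS) == 66.7   # mandate prospective registration
assert pct(6, N_FUNDERS) == 28.6    # require results within 12 months
assert pct(9, N_FUNDERS) == 42.9    # monitor trial registration
assert pct(8, N_FUNDERS) == 38.1    # monitor results reporting

# Item-level scores
assert pct(4, N_ITEMS) == 36.4      # mean practices implemented per funder
assert pct(10, N_ITEMS) == 90.9     # top-scoring funder (UK NIHR)
```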

June HELIOS Newsletter — Higher Education Leadership Initiative for Open Scholarship

“Open Scholarship Good Practices:

This working group will (1) curate current good practices resources that institutions can adapt and adopt, and (2) scope an on-demand open scholarship support service/National Open Office Hours service. Simultaneously, the working group will begin to curate curricula for training the next generation of researchers to engage in good open scholarship practices by design….”

Building Data Resilience Through Collaborative Networks | Educopia Institute

“The aim of this symposium is to share information and best practices on the opportunities, challenges, models, methodologies, successes, and collaborative strategies concerning data sharing for digital scholarship, science, and community formation more broadly. The broad audience addressed will include faculty, librarians, technologists, and university administrators interested in these topics….”

Integrity and security in the global research ecosystem

“Open and transparent communication and dissemination of scientific information and data and sharing of research materials are essential for the global science ecosystem to operate effectively….

However, new challenges and threats are emerging as some governments and non-state actors exhibit increasingly forceful efforts to unfairly exploit and distort the open research environment for their own interests. Many countries now consider unauthorised information transfer and foreign interference in public research as a serious national and economic security risk and a threat to freedom of scientific research….

Hence, the aim of the project was to identify good practices to safeguard national and economic security whilst protecting freedom of enquiry, promoting international research cooperation, and ensuring openness and non-discrimination….”

UNESCO global call for best practices in open science: Response from communities – Google Docs

“UNESCO is aiming to collect best practices in open science at individual, institutional, national, regional and international levels with a particular focus on the seven priority areas of action highlighted in the Recommendation. 

The resulting compendium of best practices will be a useful tool to better understand the current landscape of open science, share lessons learned, identify and connect open science actors around the world, and further develop innovative solutions for open science in a collaborative, inclusive and transparent manner.

Submissions can be made in English, French or Spanish, by 15 July 2022. Website with more information: https://www.unesco.org/en/articles/unesco-launches-global-call-best-practices-open-science …”

UNESCO’s Global Call for Best Practices in Open Science | UNESCO

“In November 2021, at the 41st session of the General Conference of UNESCO, 193 Member States unanimously adopted the first global standard-setting instrument on open science, the UNESCO Recommendation on Open Science.

Developed through a regionally balanced, multistakeholder, inclusive and transparent consultation process, this landmark international agreement defines shared values and principles for open science, and identifies measures to make science more accessible, the scientific process more inclusive and the outputs of science more readily available and relevant to society.

To assist Member States with the implementation of the Recommendation, UNESCO is launching a Global Call for Best Practices in Open Science to collect best practices in open science at individual, institutional, national, regional and international levels with a particular focus on the seven priority areas highlighted in the Recommendation (p. 20 available here).

The resulting compendium of best practices will be made widely available and broadly disseminated and will be a useful tool to better understand the current landscape of best practices in open science, to identify possible gaps and challenges, and to share lessons learned to improve knowledge and understanding.

If you are involved in an open science initiative that you consider to be a good example or best practice in open science, please provide your input to the survey in English, French or Spanish.

You are encouraged to fill in the questionnaire by 15 July 2022.”

Community workshop to respond to UNESCO’s global call for best practices in open science

“Further to the adoption of the UNESCO Recommendation on Open Science in November 2021, UNESCO is launching a Global Call for Best Practices in Open Science. This call aims to collect best practices in open science at individual, institutional, national, regional, and international levels with a particular focus on the seven priority areas of action highlighted in the Recommendation.

Invest in Open Infrastructure (IOI) has been working to conduct research to provide strategic support and investment guidance to funders, budget holders, policymakers, and other stakeholders on investing in open infrastructure for scholarship and research. To this end, we wish to work with our community to contribute to this Global Call, to gather our experiences to identify best practices in supporting, adopting, using, and contributing to open infrastructure.

To this end, we are collaborating with the Turing Way, the Tools, Practices & Systems (TPS) Programme at the Alan Turing Institute, and Open Life Science to create a series of three 90-minute community workshops. Each workshop is hosted by a partner organization/initiative and will focus on the one or two priority areas of action most central to that community’s work. We invite everyone interested in learning more about others’ practices in supporting open science and open infrastructure to participate in our workshops to contribute to a community response to the UNESCO call.

Wednesday 8 June 2022, 10-11:30 am EDT (see this in your time zone): On “investing in open science infrastructures and services“ hosted by IOI; register here

Wednesday 15 June 2022, 10-11:30 am EDT (see this in your time zone): On “promoting innovative approaches for open science at different stages of the scientific process“ and “promoting international and multi-stakeholder cooperation in the context of open science and with a view to reducing digital, technological and knowledge gaps” hosted by the Turing Way and the TPS Programme; register here

Wednesday 22 June 2022, 10-11:30 am EDT (see this in your time zone): On “investing in human resources, training, education, digital literacy and capacity building for open science“ and, “fostering a culture of open science and aligning incentives for open science” hosted by Open Life Science; register here

We will draft a community response to the UNESCO call based on the input from the session and will share our response publicly upon submission….”

European parliamentarians urge action on missing clinical trial results

“A cross-party group of members of the European parliament has sent an open letter to regulators urging them to not drop the ball on over 3,400 clinical trial results that are still missing on the EudraCT trial registry, in violation of long-standing transparency rules.

Under European rules, institutions running investigative drug trials must make their results public within 12 months of trial completion. While the rules are set at the European level, responsibility for encouraging and enforcing compliance lies with the national medicines regulators in each country….”

Global Community Guidelines for Documenting, Sharing, and Reusing Quality Information of Individual Digital Datasets

Open-source science builds on open and free resources that include data, metadata, software, and workflows. Informed decisions on whether and how to (re)use digital datasets are dependent on an understanding of the quality of the underpinning data and relevant information. However, quality information, being difficult to curate and often context specific, is currently not readily available for sharing within and across disciplines. To help address this challenge and promote the creation and (re)use of freely and openly shared information about the quality of individual datasets, members of several groups around the world have undertaken an effort to develop international community guidelines with practical recommendations for the Earth science community, collaborating with international domain experts. The guidelines were inspired by the guiding principles of being findable, accessible, interoperable, and reusable (FAIR). Use of the FAIR dataset quality information guidelines is intended to help stakeholders, such as scientific data centers, digital data repositories, and producers, publishers, stewards and managers of data, to: i) capture, describe, and represent quality information of their datasets in a manner that is consistent with the FAIR Guiding Principles; ii) allow for the maximum discovery, trust, sharing, and reuse of their datasets; and iii) enable international access to and integration of dataset quality information. This article describes the process used to develop the guidelines in alignment with the FAIR principles, presents a generic quality assessment workflow, describes the guidelines for preparing and disseminating dataset quality information, and outlines a path forward to improve their disciplinary diversity.
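
The guidelines concern structured, shareable quality information about a dataset rather than any particular software, but the general shape of such a record can be sketched. Every field name below is an illustrative assumption, not the schema the guidelines themselves define:

```python
from dataclasses import dataclass, asdict

# Hypothetical sketch of a dataset quality-information record in the
# spirit of the FAIR guidelines: findable (persistent identifier),
# accessible (an open, machine-readable serialization), interoperable
# (a controlled term for the assessment type), reusable (provenance
# and a license for the quality information itself).
@dataclass
class QualityAssessment:
    dataset_doi: str       # persistent identifier of the dataset assessed
    assessment_type: str   # e.g. "completeness", "accuracy" (controlled term)
    method: str            # how the assessment was produced
    result_summary: str    # human-readable outcome
    assessed_by: str       # responsible party (provenance)
    date: str              # ISO 8601 date of the assessment
    license: str = "CC-BY-4.0"  # reuse terms for the quality information

record = QualityAssessment(
    dataset_doi="10.0000/example.dataset",   # placeholder DOI
    assessment_type="completeness",
    method="automated null-value scan of all variables",
    result_summary="99.2% of expected observations present",
    assessed_by="Example Data Center",
    date="2022-06-01",
)

# asdict() yields a plain dict, ready to serialize as JSON alongside
# the dataset's other metadata.
print(asdict(record)["assessment_type"])
```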

OER Publishing and Libraries

Abstract:  This presentation explored current library Open Educational Resources (OER) publishing practices and presented research results on those practices. This original research surveyed academic librarians involved in OER publication projects to begin to address the need for expanded dialogue and the development of best practices for publishing OER. The survey results illustrate a broad picture of current practices and serve as a foundation for creating a best practice guide for library OER publishing. The presentation addressed author recruitment and marketing, publishing tools and platforms, and publishing support outside the library.

A Registry of Editorial Boards – a new trust signal for scholarly communications? – Crossref

“Whilst most journal websites give only the names of the editors, some add a country, some include affiliations, and very few link to a professional profile or an ORCID iD. Even when it is clear when the editorial board details were last updated, it is hardly ever possible to find information on past editorial boards, and almost none list declarations of competing interests.

We hear of instances where a researcher’s name has been listed on the board of a journal without their knowledge or agreement, potentially to deceive other researchers into submitting their manuscripts. Regular reports of impersonation, nepotism, collusion and conflicts of interest have become a cause for concern.

Similarly, recent studies on gender representation and gender and geographical disparity on editorial boards have highlighted the need to do better in this area and provide trusted, reliable and coherent information on editorial board members in order to add transparency, prevent unethical behaviour, maintain trust, promote and support research integrity….

We are proposing the creation of some form of Registry of Editorial Boards to encourage best practice around editorial boards’ information and governance that can easily be accessed and used by the community….”
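
As a thought experiment, a minimal registry entry addressing the gaps the post identifies (ORCID iDs, affiliations, board history, competing-interest declarations) might look like the sketch below. Every field name here is a hypothetical illustration, not a Crossref schema:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical editorial-board registry entry. Term dates make past
# boards reconstructable; the confirmation flag guards against names
# being listed without the researcher's knowledge.
@dataclass
class BoardMembership:
    journal_issn: str
    person_name: str
    orcid: Optional[str]            # links the entry to a verified profile
    affiliation: Optional[str]
    role: str                       # e.g. "Editor-in-Chief", "Associate Editor"
    term_start: str                 # ISO 8601; with term_end, preserves history
    term_end: Optional[str] = None  # None while the term is current
    confirmed_by_member: bool = False   # member has verified the listing
    competing_interests: str = "none declared"

entry = BoardMembership(
    journal_issn="0000-0000",       # placeholder ISSN
    person_name="A. Researcher",    # placeholder name
    orcid="0000-0000-0000-0000",    # placeholder ORCID iD
    affiliation="Example University",
    role="Associate Editor",
    term_start="2020-01-01",
    confirmed_by_member=True,
)

# A registry query for the current board would filter on open-ended terms:
is_current = entry.term_end is None
print(is_current)
```

The key design point is that entries are never deleted: closing a term (setting `term_end`) rather than removing the record is what makes past boards findable.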