The Transparency and Openness Promotion (TOP) Guidelines describe modular standards that journals can adopt to promote open science. The TOP Factor is a metric to describe the extent to which journals have adopted the TOP Guidelines in their policies. Systematic methods and rating instruments are needed to calculate the TOP Factor. Moreover, implementation of these open science policies depends on journal procedures and practices, for which TOP provides no standards or rating instruments.
We describe a process for assessing journal policies, procedures, and practices according to the TOP Guidelines. We developed this process as part of the Transparency of Research Underpinning Social Intervention Tiers (TRUST) Initiative to advance open science in the social intervention research ecosystem. We also provide new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices) according to standards in the TOP Guidelines. In addition, we describe how to determine the TOP Factor score for a journal, calculate reliability of journal ratings, and assess coherence among a journal’s policies, procedures, and practices. As a demonstration of this process, we describe a protocol for studying approximately 345 influential journals that have published research used to inform evidence-based policy.
The TRUST Process includes systematic methods and rating instruments for assessing and facilitating implementation of the TOP Guidelines by journals across disciplines. Our study of journals publishing influential social intervention research will provide a comprehensive account of whether these journals have policies, procedures, and practices that are consistent with standards for open science and thereby facilitate the publication of trustworthy findings to inform evidence-based policy. Through this demonstration, we expect to identify ways to refine the TOP Guidelines and the TOP Factor. Refinements could include: improving templates for adoption in journal instructions to authors, manuscript submission systems, and published articles; revising explanatory guidance intended to enhance the use, understanding, and dissemination of the TOP Guidelines; and clarifying the distinctions among different levels of implementation.
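The abstract above mentions determining a journal's TOP Factor score and calculating the reliability of journal ratings. As a minimal sketch, assuming the TOP Factor is the sum of per-standard implementation levels (each standard rated 0–3) and that rater reliability is reported as simple percent agreement, the calculation could look like this; the standard names and function names here are illustrative, not taken from the TOP instruments themselves:

```python
# Hypothetical sketch: a journal is rated 0-3 on each TOP standard,
# and the TOP Factor is the sum of those levels. Percent agreement is
# the share of standards on which two independent raters match.

TOP_STANDARDS = [
    "data citation", "data transparency", "analysis code transparency",
    "materials transparency", "design & analysis reporting",
    "study preregistration", "analysis preregistration",
    "replication", "registered reports", "open science badges",
]

def top_factor(ratings: dict) -> int:
    """Sum of per-standard levels (0 = no policy ... 3 = strictest)."""
    return sum(ratings.get(s, 0) for s in TOP_STANDARDS)

def percent_agreement(rater_a: dict, rater_b: dict) -> float:
    """Fraction of standards on which two raters assign the same level."""
    agree = sum(rater_a.get(s, 0) == rater_b.get(s, 0) for s in TOP_STANDARDS)
    return agree / len(TOP_STANDARDS)

ratings = {"data transparency": 2, "study preregistration": 1, "replication": 1}
print(top_factor(ratings))  # 4
```

In practice the TRUST Initiative's rating instruments define the exact standards, levels, and reliability statistics; this sketch only illustrates the arithmetic shape of a summed modular score.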
“Still, Moher acknowledges that the center has not fully engaged with policy makers—the research funders, journal editors, and institutional leaders who have the power to change norms. He cites the need to gather evidence before suggesting any changes to the status quo. “We have the evidence in lots of areas now, and I think what we need to do is actually try to work much closer with policy people now,” he says. Part of that effort involves putting together an open-science dashboard to help institutions keep track of their own accomplishments and shortcomings in how accessible their research is.
The center is also building collaborations with institutions to help them implement open-science practices. An example, Moher says, is the Montreal Neurological Institute, where the Centre for Journalology is auditing data-sharing practices and introducing an educational program to make data sharing the norm. If that effort is successful, “we’ll slowly start to introduce other open-science practices,” he says….”
This briefing paper aims to support decision makers at research organisations and research funders in developing new monitoring exercises, or in assessing and improving existing processes, for measuring the Open Access status of publications.
The availability of data and information on the current state of scholarly publishing is invaluable for advancing Open Access. Given the complexity of the scholarly publishing system, designing a monitoring exercise involves a multitude of decisions.
This briefing paper provides recommendations on the three main questions an organisation should answer to develop a monitoring exercise: Why, What, and How?
Examples of different monitoring exercises have been selected to represent different use cases, organisational setups, data sources, and strategies of interpretation.
“Last month the National Health and Medical Research Council sought submissions on going to immediate OA on publication. If publishers refuse, the council suggested, authors’ accepted manuscripts could be made available by named institutional repositories (CMM April 16).
Which is good, but Drs Kingsley and Smith (both ex Cambridge University’s Office of Scholarly Communication) suggest tighter wording to make intent impossible to ignore.
And they call for checks, which institutions could use to make sure OA actually occurs. “There is evidence that even ‘light touch’ compliance checking results in significant behavioural change,” they write, especially if “there is a significant consequence for non-compliance” – which could be tying grants to OA rules….”
From Google’s English: “The indicator is produced and launched annually by the Danish Agency for Education and Research, which is part of the Ministry of Education and Research. The indicator monitors the implementation of the Danish Open Access strategy 2018-2025 by collecting and analyzing publication data from the Danish universities.
OVERVIEW – National strategic goals and the realization of them at national and university level.
OA TYPES – Types of Open Access realization at national and local level.
DATA – Data for download as well as documentation at an overview and technical level.
GUIDANCE – Information to support the Danish universities’ implementation of Open Access, such as important dates and guidelines.
FAQ – Frequently Asked Questions….”
Abstract: In 2019, the Governing Council of the Society for Research in Child Development (SRCD) adopted a Policy on Scientific Integrity, Transparency, and Openness (SRCD, 2019a) and accompanying Author Guidelines on Scientific Integrity and Openness in Child Development (SRCD, 2019b). In this issue, a companion article (Gennetian, Tamis-LeMonda, & Frank) discusses the opportunities to realize SRCD’s vision for a science of child development that is open, transparent, robust, and impactful. In this article, we discuss some of the challenges associated with realizing SRCD’s vision. In identifying these challenges—protecting participants and researchers from harm, respecting diversity, and balancing the benefits of change with the costs—we also offer constructive solutions.
“The OA2020 Community of Practice was established to expand the shared knowledge and implementation of OA (open access) and transformative agreement principles and mechanisms.
The number of new open access and transformative agreements is rapidly growing, yet an understanding of how they work is far from universal. Mutual exchanges of ideas, tactics, and a deeper knowledge of the current and potential models will foster opportunities for open access publishing on a larger scale.
Experienced OA and transformative agreement pioneers share their experiences and approaches and collaboratively address emerging issues and models. Collectively, the Community discusses such topics as the importance of data analysis in agreements, faculty and institutional buy-in, and the impact of shifting funding models and workflows into open access….”
Abstract: This article describes a program session covering the nuances and complexities of “Read and Publish” transformative agreements. The session, a panel led by Assistant Marketing Director at AIP Publishing, Sara Rotjan, included the perspectives of three individuals – the researcher, the publisher, and the librarian – to give audience members a well-rounded idea of how transformative agreements are being negotiated by various stakeholders in Open Access publishing. In addition to outlining the infrastructures of the “Read and Publish” model, panelists also detailed the unique role they play in developing and implementing “Read and Publish” models at their own institutions. They also discussed some of the challenges with “Read and Publish” models and how these challenges are being addressed by internal and external stakeholders.
“A new database established by a collaborative team including Penn State University Libraries aims to provide centralized, consistent access to scholarly research metadata for Penn State faculty research, while eliminating much of the administrative work involved with research-activity reporting software used by higher education faculty.
The Researcher Metadata Database (RMD) aggregates content from multiple scholarly research databases including Digital Measures, Pure, the Penn State Electronic Theses and Dissertations database, National Science Foundation (NSF), Open Access Button and Clarivate (formerly Web of Science). RMD not only provides a single application programming interface (API) for faculty profiles and department web pages, but also facilitates implementation of Penn State’s Open Access Policy and the ability to generate reports on common data requests.
A unique feature of the RMD is the ability to push information to the Open Researcher and Contributor ID (ORCID) system, whose identifiers are increasingly used by funding organizations such as NSF and the National Institutes of Health (NIH) as a source of information on research activity, including biographical sketches of researchers….”
“In 2018, a group of mostly European funders sent shock waves through the world of scientific publishing by proposing an unprecedented rule: The scientists they funded would be required to make journal articles developed with their support immediately free to read when published.
The new requirement, which takes effect starting this month, seeks to upend decades of tradition in scientific publishing, whereby scientists publish their research in journals for free and publishers make money by charging universities and other institutions for subscriptions. Advocates of the new scheme, called Plan S (the “S” stands for the intended “shock” to the status quo), hope to destroy subscription paywalls and speed scientific progress by allowing findings to be shared more freely. It’s part of a larger shift in scientific communication that began more than 20 years ago and has recently picked up steam.
Scientists have several ways to comply with Plan S, including by paying publishers a fee to make an article freely available on a journal website, or depositing the article in a free public repository where anyone can download it. The mandate is the first by an international coalition of funders, which now includes 17 agencies and six foundations, including the Wellcome Trust and Howard Hughes Medical Institute, two of the world’s largest funders of biomedical research….”