Enhancing transparency through open government data: the case of data portals and their features and capabilities | Emerald Insight

Abstract:  Purpose

This paper draws on evidence from computer-mediated transparency to examine the argument that open government data and the national data infrastructures that embody it, open data portals, can enhance transparency by providing relevant features and capabilities for stakeholder interaction.

Design/methodology/approach

The methodology consisted of a two-step strategy for investigating the research questions. First, a web content analysis was conducted to identify the most common features and capabilities provided by existing national open data portals. Second, a Delphi process was carried out, surveying domain experts to measure the diversity of their opinions on the topic.

Findings

The identified features and capabilities were classified into categories and ranked by importance. By formalizing these feature-related transparency mechanisms, through which stakeholders work with data sets, the paper offers recommendations on how to incorporate them into the design and development of open data portals.

Social implications

The creation of appropriate open data portals aims to fulfil the principles of open government and to enable stakeholders to engage effectively in policy- and decision-making processes.

Originality/value

By analyzing existing national open data portals and validating the feature-related transparency mechanisms, this paper fills a gap in the existing literature on designing and developing open data portals for transparency efforts.
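
To make the study's second step concrete, here is a minimal Python sketch of how Delphi-style expert ratings could be aggregated into an importance ranking, with Kendall's coefficient of concordance (W) as a consensus check. The feature names, the ratings, and the choice of Kendall's W are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a Delphi aggregation step: convert each expert's
# importance ratings to ranks, sum the ranks per feature, and compute
# Kendall's W as a consensus measure. Features and ratings are hypothetical.
features = ["data search", "API access", "visualizations", "feedback forms"]
ratings = [           # rows = experts, columns = features, 1-5 importance scale
    [5, 4, 3, 2],
    [5, 3, 4, 2],
    [4, 5, 3, 2],
]

def ranks(row):
    """Convert one expert's ratings to ranks (1 = most important), averaging ties."""
    order = sorted(range(len(row)), key=lambda j: -row[j])
    r = [0.0] * len(row)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and row[order[j + 1]] == row[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

m, n = len(ratings), len(features)
rank_rows = [ranks(row) for row in ratings]
rank_sums = [sum(rr[j] for rr in rank_rows) for j in range(n)]

# Kendall's W (no tie correction, for brevity): W = 12*S / (m^2 * (n^3 - n))
mean_rank_sum = m * (n + 1) / 2
S = sum((rs - mean_rank_sum) ** 2 for rs in rank_sums)
W = 12 * S / (m ** 2 * (n ** 3 - n))

print(f"Kendall's W = {W:.2f} (1 = full consensus)")
for feature, rs in sorted(zip(features, rank_sums), key=lambda fr: fr[1]):
    print(f"{feature}: rank sum {rs}")
```

With full consensus among experts W approaches 1; low values would signal the need for a further Delphi round before fixing the ranking.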

Our Commitment to Price Transparency – The Official PLOS Blog

“PLOS is committed to transparency in all its forms—from our Open Science practices that we urge our authors to adopt, to providing our community clear insight into our journals and activities. Last year, Plan S provided a pilot opportunity for the latter through their Price & Service Transparency Framework which becomes a requirement for Plan S compliance in July 2022. We have committed to participate in and share our reporting from that framework each year and we are once again sharing our price transparency data in the spreadsheet and chart below. Read on for more details of how the framework has changed and what that means for PLOS. …”

Clearinghouse Standards of Evidence on the Transparency, Openness, and Reproducibility of Intervention Evaluations | SpringerLink

Abstract:  Clearinghouses are influential repositories of information on the effectiveness of social interventions. To identify which interventions are “evidence-based,” clearinghouses review intervention evaluations using published standards of evidence that focus primarily on internal validity and causal inferences. Open science practices can improve trust in evidence from evaluations on the effectiveness of social interventions. Including open science practices in clearinghouse standards of evidence is one of many efforts that could increase confidence in designations of interventions as “evidence-based.” In this study, we examined the policies, procedures, and practices of 10 federal evidence clearinghouses that review preventive interventions—an important and influential subset of all evidence clearinghouses. We found that seven consider at least one open science practice when evaluating interventions: replication (6 of 10 clearinghouses), public availability of results (6), investigator conflicts of interest (3), design and analysis transparency (3), study registration (2), and protocol sharing (1). We did not identify any policies, procedures, or practices related to analysis plan registration, data sharing, code sharing, material sharing, and citation standards. We provide a framework with specific recommendations to help federal and other evidence clearinghouses implement the Transparency and Openness Promotion (TOP) Guidelines. Our proposed “TOP Guidelines for Clearinghouses” includes reporting whether evaluations used open science practices, incorporating open science practices in their standards for receiving “evidence-based” designations, and verifying that evaluations used open science practices. Doing so could increase the trustworthiness of evidence used for policy making and support improvements throughout the evidence ecosystem.

Integrating Qualitative Methods and Open Science: Five Principles for More Trustworthy Research* | Journal of Communication | Oxford Academic

Abstract:  Recent initiatives toward open science in communication have prompted vigorous debate. In this article, we draw on qualitative and interpretive research methods to expand the key priorities that the open science framework addresses, namely producing trustworthy and quality research. This article contributes to communication research by integrating qualitative methodological literature with open communication science research to identify five broader commitments for all communication research: validity, transparency, ethics, reflexivity, and collaboration. We identify key opportunities where qualitative and quantitative communication scholars can leverage the momentum of open science to critically reflect on and improve our knowledge production processes. We also examine competing values that incentivize dubious practices in communication research, and discuss several metascience initiatives to enhance diversity, equity, and inclusion in our field and value multiple ways of knowing.

Review Commons – Improve your paper and streamline publication through journal-independent peer-review.

“Review Commons is a platform for high-quality journal-independent peer-review in the life sciences.

Review Commons provides authors with a Refereed Preprint, which includes the authors’ manuscript, reports from a single round of peer review and the authors’ response. Review Commons also facilitates author-directed submission of Refereed Preprints to affiliate journals to expedite editorial consideration, reduce serial re-review and streamline publication.

Review Commons transfers Refereed Preprints on behalf of the authors to bioRxiv and 17 affiliate journals from EMBO Press, eLife, ASCB, The Company of Biologists, Rockefeller University Press and PLOS.

Review Commons will:

Allow reviewers to focus on the science, not specific journal fit.
Enrich the value of preprints.
Reduce re-reviewing at multiple journals.
Accelerate the publishing process by providing journals with high-quality referee reports….”

New Peer Review Model Pushes for Transparency and Efficiency in Science – SPARC

“Last December, a new platform was launched to provide scientists independent peer review of their work before submitting to a journal. Review Commons aims to give authors quick, clear, and objective insight that focuses on the rigor of the research rather than its fit for a particular publication. 

Spearheaded by ASAPbio, EMBO, and 17 affiliate journals in the life sciences, with funding from The Helmsley Charitable Trust, the initiative’s open approach is intended to expedite the publication process. It does this by allowing reviews to be reused by multiple journals, while providing publicly-visible feedback on research shared as preprints. Once authors receive comments, they have a chance to respond before submitting for consideration at one of the participating journals from EMBO Press, eLife, ASCB, The Company of Biologists, Rockefeller University Press and PLoS. …”

More Unexpected Consequences: How the Plan S Transformative Journal Route Favors Larger Incumbent Publishers – The Scholarly Kitchen

“But once you read the Transformative Journal reporting requirements, you will realize that this route is likely impossible for journals other than those from larger and wealthier publishers. Once again, a well-intentioned policy has created further inequities in scholarly communication….

Transformative Journals (TJs) are one route offered by cOAlition S “to encourage publishers to transition to immediate Open Access.” Through this route, a subscription/hybrid journal can remain compliant and eligible for Plan S authors by committing to a transition to becoming fully-OA and meeting a set of OA growth requirements each year until 2024, when support for TJs ends and they are expected to fully convert over to OA. Let’s ignore for now the OA growth requirements for TJs – DeltaThink’s recent analysis covers this well and shows how unrealistic the numbers are and how few journals are likely to progress adequately given the timelines involved…

Instead, I want to focus on the reporting requirements for TJs. Tallying up the number of OA articles published each year is easy to accomplish. The transparent pricing reporting requirements remain vague and meaningless enough that they shouldn’t prove too onerous for even smaller publishers to put together. Where things get difficult, if not impossible, is in the requirement for an annual public report to cOAlition S, a report that must include data on downloads, citations, and Altmetric scores for all papers published, and that must be sub-divided into OA papers versus non-OA papers.

For those working at larger publishing houses, this likely sounds trivial. You’d just assign your team of in-house bibliometric analysts to pull citation data from your expensive Web of Science, Scopus, or Dimensions subscription. Download information can be obtained from the usage tracking service you pay for, or perhaps it’s included from the full-service publishing platform that your organization owns or that you employ each year at significant cost. Altmetric numbers can come from your access to the paid service of the same name. Your employee bibliometricians will, of course, spend the necessary time parsing out the OA articles from everything else.

Hopefully the theme running through that last paragraph was fairly obvious – none of this is free, much of it is very expensive, and in-house bibliometric expertise is rare among smaller publishers….”
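
For a sense of what the reporting burden involves, here is a minimal sketch of how a small publisher might assemble the citation portion of a TJ report using the free OpenAlex API in place of paid bibliometric tools. The DOI list is a placeholder, and note that downloads and Altmetric scores are not available from OpenAlex and would still require separate, often paid, sources, which is precisely the post's point.

```python
# Minimal sketch of part of a cOAlition S Transformative Journal report,
# using the free OpenAlex API as a stand-in for Web of Science/Scopus/
# Dimensions. DOIs are placeholders; downloads and Altmetric scores are
# NOT in OpenAlex and would need other sources (e.g., COUNTER usage logs).
import requests

dois = ["10.1371/journal.pone.0000001"]  # hypothetical list of the year's papers

report = {"oa": {"papers": 0, "citations": 0},
          "non_oa": {"papers": 0, "citations": 0}}

for doi in dois:
    resp = requests.get(f"https://api.openalex.org/works/doi:{doi}", timeout=30)
    resp.raise_for_status()
    work = resp.json()
    bucket = "oa" if work["open_access"]["is_oa"] else "non_oa"
    report[bucket]["papers"] += 1
    report[bucket]["citations"] += work["cited_by_count"]

print(report)  # OA vs non-OA paper and citation tallies for the annual report
```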

Evaluating implementation of the Transparency and Openness Promotion (TOP) guidelines: the TRUST process for rating journal policies, procedures, and practices | Research Integrity and Peer Review | Full Text

Abstract:  Background

The Transparency and Openness Promotion (TOP) Guidelines describe modular standards that journals can adopt to promote open science. The TOP Factor is a metric to describe the extent to which journals have adopted the TOP Guidelines in their policies. Systematic methods and rating instruments are needed to calculate the TOP Factor. Moreover, implementation of these open science policies depends on journal procedures and practices, for which TOP provides no standards or rating instruments.

Methods

We describe a process for assessing journal policies, procedures, and practices according to the TOP Guidelines. We developed this process as part of the Transparency of Research Underpinning Social Intervention Tiers (TRUST) Initiative to advance open science in the social intervention research ecosystem. We also provide new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices) according to standards in the TOP Guidelines. In addition, we describe how to determine the TOP Factor score for a journal, calculate reliability of journal ratings, and assess coherence among a journal’s policies, procedures, and practices. As a demonstration of this process, we describe a protocol for studying approximately 345 influential journals that have published research used to inform evidence-based policy.

Discussion

The TRUST Process includes systematic methods and rating instruments for assessing and facilitating implementation of the TOP Guidelines by journals across disciplines. Our study of journals publishing influential social intervention research will provide a comprehensive account of whether these journals have policies, procedures, and practices that are consistent with standards for open science and thereby facilitate the publication of trustworthy findings to inform evidence-based policy. Through this demonstration, we expect to identify ways to refine the TOP Guidelines and the TOP Factor. Refinements could include: improving templates for adoption in journal instructions to authors, manuscript submission systems, and published articles; revising explanatory guidance intended to enhance the use, understanding, and dissemination of the TOP Guidelines; and clarifying the distinctions among different levels of implementation.
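
As an illustration of the scoring the TRUST process describes, here is a minimal sketch in which each TOP standard receives a level from 0 to 3, the TOP Factor is the sum across standards, and two raters are compared by simple percent agreement. The scores below are hypothetical, not ratings of any real journal, and the TRUST instruments use more detailed rubrics and reliability statistics than this.

```python
# Minimal sketch of TOP Factor scoring: each TOP standard gets a level from
# 0 (no policy) to 3 (strictest), and the TOP Factor is their sum. Two raters
# score independently; percent agreement serves as a crude reliability check.
TOP_STANDARDS = [
    "citation", "data transparency", "code transparency",
    "materials transparency", "design & analysis", "study registration",
    "analysis plan registration", "replication",
]

rater_a = {"citation": 2, "data transparency": 1, "code transparency": 0,
           "materials transparency": 0, "design & analysis": 2,
           "study registration": 1, "analysis plan registration": 0,
           "replication": 1}
rater_b = dict(rater_a, **{"data transparency": 2})  # one disagreement

def top_factor(scores):
    """Sum of 0-3 level scores across the TOP standards."""
    return sum(scores[s] for s in TOP_STANDARDS)

agreement = sum(rater_a[s] == rater_b[s] for s in TOP_STANDARDS) / len(TOP_STANDARDS)
print(f"TOP Factor (rater A): {top_factor(rater_a)}")
print(f"TOP Factor (rater B): {top_factor(rater_b)}")
print(f"Percent agreement: {agreement:.0%}")
```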

Open Science rankings: yes, no, or not this way? A debate on developing and implementing transparency metrics. – JOTE | Journal of Trial and Error

“The Journal of Trial and Error is proud to present an exciting and timely event: a three-way debate on the topic of Open Science metrics, specifically, transparency metrics. Should we develop these metrics? What purposes do they fulfil? How should Open Science practices be encouraged? Are (transparency) rankings the best solution? These questions and more will be addressed in a dynamic and interactive debate with three researchers of different backgrounds: Etienne LeBel (Independent Meta-Scientist and founder of ERC-funded project ‘Curate Science’), Sarah de Rijcke (Professor of Science and Evaluation Studies and director of the Centre for Science and Technology Studies at Leiden University), and Juliëtte Schaafsma (Professor of Cultural Psychology at Tilburg University and fierce critic of rankings and audits). This is an event organized by the Journal of Trial and Error, and supported by the Open Science Community Tilburg, the Centre for Science and Technology Studies (CWTS, Leiden University), and the Open Science Community Utrecht.”

Boldly growing: PLOS’ new titles and business model update for institutions

“With PLOS’ recent announcement of five new titles in April, PLOS is keen to introduce our newest titles and business model to the library community.

Join PLOS’ outreach, publishing, and partnerships teams for an introduction to these new titles and PLOS’ newest non-APC based, equity-focused business model.

You can learn more about the rationale for launching new titles on the PLOS blog: https://theplosblog.plos.org/2021/04/launching-new-journals-2021/
and recent coverage from Nature: https://www.nature.com/articles/d41586-020-01907-3

This webinar is open to libraries, consortia, and PLOS institutional partners and registration is required….”

REPEAT (Reproducible Evidence: Practices to Enhance and Achieve Transparency)

“Replication is a cornerstone of the scientific method. Historically, public confidence in the validity of healthcare database research has been low. Drug regulators, patients, clinicians, and payers have been hesitant to trust evidence from databases due to high profile controversies with overturned and conflicting results. This has resulted in underuse of a potentially valuable source of real-world evidence. …

Division of Pharmacoepidemiology & Pharmacoeconomics [DoPE]

Brigham & Women’s Hospital and Harvard Medical School.”