Dissecting the tension of open science standards implementation in management and organization journals

Abstract:  Growing concerns about the credibility of scientific findings have sparked a debate on new transparency and openness standards in research. Management and organization studies scholars generally support the new standards, while emphasizing the unique challenges associated with their implementation in this paradigmatically diverse discipline. In this study, I analyze the costs to authors and journals associated with the implementation of new transparency and openness standards, and provide a progress report on the implementation level thus far. Drawing on an analysis of the submission guidelines of 60 empirical management journals, I find that the call for greater transparency was received, but resulted in implementations that were limited in scope and depth. Even standards that could have been easily adopted were left unimplemented, producing a paradoxical situation in which research designs that need transparency standards the most are not exposed to any, likely because the standards are irrelevant to other research designs.

 

Top Publishers Aim To Own The Entire Academic Research Publishing Stack; Here’s How To Stop That Happening | Techdirt

“Techdirt’s coverage of open access — the idea that the fruits of publicly-funded scholarship should be freely available to all — shows that the results so far have been mixed. On the one hand, many journals have moved to an open access model. On the other, the overall subscription costs for academic institutions have not gone down, and neither have the excessive profit margins of academic publishers. Despite that success in fending off this attempt to re-invent the way academic work is disseminated, publishers want more. In particular, they want more money and more power. In an important new paper, a group of researchers warn that companies now aim to own the entire academic publishing stack …

To prevent commercial monopolization, to ensure cybersecurity, user/patient privacy, and future development, these standards need to be open, under the governance of the scholarly community. Open standards enable switching from one provider to another, allowing public institutions to develop tender or bidding processes, in which service providers can compete with each other with their services for the scientific workflow.

Techdirt readers will recognize this as exactly the idea that lies at the heart of Mike’s influential essay “Protocols, Not Platforms: A Technological Approach to Free Speech”. Activist and writer Cory Doctorow has also been pushing for the same thing — what he calls “adversarial interoperability”. It seems like an idea whose time has come, not just for academic publishing, but every aspect of today’s digital world.”

Beyond open: Key criteria to assess open infrastructure

“Today, we wanted to share more about how we’re examining the open infrastructure and open technology landscape to further equitable, just, and accessible infrastructure, and what’s emerging as our key criteria. These criteria are designed to center community, reliability, and transformative influence into our analysis. Below we elaborate on those attributes….

The criteria below represent our first cut at examining infrastructure for transformative influence, or a demonstration of the intention and ability to create change towards our vision of an equitable, just, and accessible infrastructure for all….”

 

The Advanced Research Consortium Joins the Open Library Foundation as Project Member | Open Library Foundation

“The Advanced Research Consortium (ARC) has joined the Open Library Foundation as a Project Member. By joining the Open Library Foundation, ARC is able to leverage the community of projects that are part of the Open Library Foundation.

The Advanced Research Consortium (ARC) serves as a hub of humanities virtual research environments or research nodes. ARC provides support, coordination, and a set of evolving standards for more than 200 digital humanities projects that are open access and peer reviewed by five period-specific and thematic research communities, with more projects and communities joining every year. The ARC Catalog is available through BigDIVA (Big Data Infrastructure Visualization Application), a web-based search and discovery service designed for humanities scholars and students….”

Open Grant Proposals · Business of Knowing, summer 2021

“One of those informal frontiers is crowdfunding for scientific research. For the past year, I’ve worked on Experiment, helping hundreds of scientists design and launch crowdfunding campaigns for their research questions. Experiment has been doing this for almost a decade, with more than 1,000 successfully funded projects on the platform. The process is very different than the grant funding mechanisms set up by agencies and foundations. It’s not big money yet, as the average fundraise is still ~$5,000. But in many ways, the process is better: faster, transparent, and more encouraging to early-career scientists. Of all the lessons learned, one stands out for broader consideration: grant proposals and processes should be open by default.

Grant proposals that meet basic requirements for scientific merit and rigor should be posted online, ideally in a standardized format, in a centralized (or several) database or clearinghouse. They should include more detail than just the abstract and dollar amount totals that are currently shown on federal databases, especially in terms of budgets and costs. The proposals should include a DOI number so that future work can point back to the original question, thinking, and scope. A link to these open grant proposals should be broadly accepted as sufficient for submission to requests from agencies or foundations….

Open proposals would make research funding project-centric, rather than funder-centric….

Open proposals would promote more accurate budgets….

Open proposals would increase the surface area of collaboration….

Open proposals would improve citation metrics….

Open proposals would create an opportunity to reward the best question-askers in addition to the best question-answerers….

Open proposals would give us a view into the whole of science, including the unfunded proposals and the experiments with null results….”

Standards, Inputs, and Outputs: Strategies for improving data-sharing and consortia-based epidemiologic research | American Journal of Epidemiology | Oxford Academic

Abstract:  Data sharing improves epidemiology research, but sharing data frustrates epidemiologic researchers. The inefficiencies of current methods and options for data-sharing are increasingly documented and easily understood by any study that has shared its data and any researcher who has received shared data. Temprosa and Moore et al. (Am J Epidemiol. XXXX;XXX(XX):XXXX–XXXX) describe how the COnsortium of METabolomics Studies (COMETS) developed and deployed a flexible analytic platform to eliminate key pain points in large-scale metabolomics research. COMETS Analytics includes an online tool, but its cloud computing and technology are supporting, rather than the lead, actors in this script. The COMETS team identified the need to standardize diverse and inconsistent metabolomics and covariate data and models across its many participating cohort studies, and then they developed a flexible tool that gave its member studies choices about how they wanted to meet the consortium’s analytic requirements. Different specialties will have different specific research needs and will likely continue to use and develop an array of diverse analytic and technical solutions for their projects. COMETS Analytics shows how important and enabling the upstream attention to data standards and data consistency are to producing high-quality metabolomics, consortium-based, and large-scale epidemiology research.

 

Open Research Infrastructure Programs at LYRASIS

“Academic libraries, and institutional repositories in particular, play a key role in the ongoing quest for ways to gather metrics and connect the dots between researchers and research contributions in order to measure “institutional impact,” while also streamlining workflows to reduce administrative burden. Identifying accurate metrics and measurements for illustrating “impact” is a goal that many academic research institutions share, but these goals can only be met to the extent that all organizations across the research and scholarly communication landscape are using best practices and shared standards in research infrastructure. For example, persistent identifiers (PIDs) such as ORCID iDs (Open Researcher and Contributor Identifier) and DOIs (Digital Object Identifiers) have emerged as crucial best practices for establishing connections between researchers and their contributions while also serving as a mechanism for interoperability in sharing data across systems. The more institutions using persistent identifiers (PIDs) in their workflows, the more connections can be made between entities, making research objects more FAIR (findable, accessible, interoperable, and reusable). Also, when measuring institutional repository usage, clean, comparable, standards-based statistics are needed for accurate internal assessment, as well as for benchmarking with peer institutions….”
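One reason PIDs work so well as connective tissue between systems is that they are machine-checkable. As a small illustration (not part of the LYRASIS excerpt above), ORCID iDs end in a check character computed with the ISO/IEC 7064 MOD 11-2 algorithm, which lets a repository catch most transcription errors before attempting a lookup. The sketch below implements that published checksum; the function names are mine.

```python
def orcid_checksum(base_digits: str) -> str:
    """Compute the ORCID check character for the first 15 digits
    using ISO/IEC 7064 MOD 11-2."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated ORCID iD such as 0000-0002-1825-0097."""
    compact = orcid.replace("-", "")
    if len(compact) != 16:
        return False
    return orcid_checksum(compact[:15]) == compact[15]

# 0000-0002-1825-0097 is the sample iD used in ORCID's own documentation.
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```

The same self-verifying property is what makes PIDs safe to pass between repositories, CRIS systems, and usage-statistics services without a central authority re-checking every record.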

Who we are – ENJOI – Science communication

“ENJOI (ENgagement and JOurnalism Innovation for Outstanding Open Science Communication) will explore and test engagement as a key asset of innovation in science communication distributed via media platforms, with a strong focus on journalism. Through a combination of methodologies and in collaboration with producers, target users and stakeholders of science communication, ENJOI will co-create and select a set of standards, principles and indicators (SPIs) condensed to a Manifesto for an Outstanding Open Science Communication.

ENJOI will deploy a series of actions via Engagement Workshops, Labs, field and participatory research, evaluation and testing phases. It will also build an Observatory as its landmark product to make all results and outputs available to foster capacity building and collaboration of all actors in the field. ENJOI will work in four countries: Belgium, Italy, Portugal and Spain, taking into account different cultural contexts.

ENJOI’s ultimate goal is that of improving science communication by making it more consistently reliable, truthful, open and engaging. Contextually, ENJOI will contribute to the active development of critical thinking, digital awareness and media literacy of all actors involved in the process….”

SPIs – ENJOI – Science communication

“In order to address this challenge [of disinformation], the ENJOI project is working to co-create and select a set of Standards, Principles and Indicators (SPIs) for Outstanding Open Science Communication (OOSC). What are SPIs?…

The Catalan Association of Science Communication (ACCC) was in charge of identifying and selecting the ENJOI SPIs. With the help of the ENJOI network, they surveyed existing academic literature, including books, explored grey literature, and consulted several experts to identify a set of documents that can be considered a representative sample of past efforts to define SPIs….”