Review Commons – Improve your paper and streamline publication through journal-independent peer-review.

“Review Commons is a platform for high-quality journal-independent peer-review in the life sciences.

Review Commons provides authors with a Refereed Preprint, which includes the authors’ manuscript, reports from a single round of peer review and the authors’ response. Review Commons also facilitates author-directed submission of Refereed Preprints to affiliate journals to expedite editorial consideration, reduce serial re-review and streamline publication.

Review Commons transfers Refereed Preprints on behalf of the authors to bioRxiv and 17 affiliate journals from EMBO Press, eLife, ASCB, The Company of Biologists, Rockefeller University Press and PLOS.

Review Commons will:

Allow reviewers to focus on the science, not specific journal fit.
Enrich the value of preprints.
Reduce re-reviewing at multiple journals.
Accelerate the publishing process by providing journals with high-quality referee reports….”

New Peer Review Model Pushes for Transparency and Efficiency in Science – SPARC

“Last December, a new platform was launched to provide scientists independent peer review of their work before submitting to a journal. Review Commons aims to give authors quick, clear, and objective insight that focuses on the rigor of the research rather than its fit for a particular publication. 

Spearheaded by ASAPbio, EMBO, and 17 affiliate journals in the life sciences, with funding from The Helmsley Charitable Trust, the initiative’s open approach is intended to expedite the publication process. It does this by allowing reviews to be reused by multiple journals, while providing publicly-visible feedback on research shared as preprints. Once authors receive comments, they have a chance to respond before submitting for consideration at one of the participating journals from EMBO Press, eLife, ASCB, The Company of Biologists, Rockefeller University Press and PLoS. …”

More Unexpected Consequences: How the Plan S Transformative Journal Route Favors Larger Incumbent Publishers – The Scholarly Kitchen

“But once you read the Transformative Journal reporting requirements, you will realize that this route is likely impossible for journals other than those from larger and wealthier publishers. Once again, a well-intentioned policy has created further inequities in scholarly communication….

Transformative Journals (TJs) are one route offered by cOAlition S “to encourage publishers to transition to immediate Open Access.” Through this route, a subscription/hybrid journal can remain compliant and eligible for Plan S authors by committing to a transition to becoming fully-OA and meeting a set of OA growth requirements each year until 2024, when support for TJs ends and they are expected to fully convert over to OA. Let’s ignore for now the OA growth requirements for TJs – DeltaThink’s recent analysis covers this well and shows how unrealistic the numbers are and how few journals are likely to progress adequately given the timelines involved…

Instead, I want to focus on the reporting requirements for TJs. Tallying up the number of OA articles published each year is easy to accomplish. The transparent pricing reporting requirements remain vague and meaningless enough that they shouldn’t prove too onerous for even smaller publishers to put together. Where things get difficult, if not impossible, is in the requirement for an annual public report to cOAlition S, a report that must include data on downloads, citations, and Altmetric scores for all papers published, and that must be sub-divided into OA papers versus non-OA papers.

For those working at larger publishing houses, this likely sounds trivial. You’d just assign your team of in-house bibliometric analysts to pull citation data from your expensive Web of Science, Scopus, or Dimensions subscription. Download information can be obtained from the usage tracking service you pay for, or perhaps it’s included from the full-service publishing platform that your organization owns or that you employ each year at significant cost. Altmetric numbers can come from your access to the paid service of the same name. Your employee bibliometricians will, of course, spend the necessary time parsing out the OA articles from everything else.

Hopefully the theme running through that last paragraph was fairly obvious – none of this is free, much of it is very expensive, and in-house bibliometric expertise is rare among smaller publishers….”
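
As a rough illustration of what that reporting requirement amounts to once the underlying data have been obtained, the sketch below aggregates per-article downloads, citations, and Altmetric scores, sub-divided by OA status. It assumes the publisher has already compiled these metrics into a table; the column names and figures are invented for illustration, and, as the post argues, assembling that table is precisely the expensive part.

```python
# Illustrative sketch only: what the required annual report boils down to once
# per-article metrics have been collected. The table, column names, and numbers
# below are invented for illustration.
import pandas as pd

articles = pd.DataFrame([
    {"doi": "10.1234/a1", "is_oa": True,  "downloads": 820, "citations": 4, "altmetric_score": 12.5},
    {"doi": "10.1234/a2", "is_oa": False, "downloads": 310, "citations": 1, "altmetric_score": 2.0},
    {"doi": "10.1234/a3", "is_oa": True,  "downloads": 150, "citations": 0, "altmetric_score": 0.0},
])

# Sub-divide into OA versus non-OA papers and summarise the required metrics.
report = (
    articles
    .groupby("is_oa")[["downloads", "citations", "altmetric_score"]]
    .agg(["count", "sum", "mean"])
)
print(report)
```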

Evaluating implementation of the Transparency and Openness Promotion (TOP) guidelines: the TRUST process for rating journal policies, procedures, and practices | Research Integrity and Peer Review

Abstract

Background

The Transparency and Openness Promotion (TOP) Guidelines describe modular standards that journals can adopt to promote open science. The TOP Factor is a metric to describe the extent to which journals have adopted the TOP Guidelines in their policies. Systematic methods and rating instruments are needed to calculate the TOP Factor. Moreover, implementation of these open science policies depends on journal procedures and practices, for which TOP provides no standards or rating instruments.

Methods

We describe a process for assessing journal policies, procedures, and practices according to the TOP Guidelines. We developed this process as part of the Transparency of Research Underpinning Social Intervention Tiers (TRUST) Initiative to advance open science in the social intervention research ecosystem. We also provide new instruments for rating journal instructions to authors (policies), manuscript submission systems (procedures), and published articles (practices) according to standards in the TOP Guidelines. In addition, we describe how to determine the TOP Factor score for a journal, calculate reliability of journal ratings, and assess coherence among a journal’s policies, procedures, and practices. As a demonstration of this process, we describe a protocol for studying approximately 345 influential journals that have published research used to inform evidence-based policy.

Discussion

The TRUST Process includes systematic methods and rating instruments for assessing and facilitating implementation of the TOP Guidelines by journals across disciplines. Our study of journals publishing influential social intervention research will provide a comprehensive account of whether these journals have policies, procedures, and practices that are consistent with standards for open science and thereby facilitate the publication of trustworthy findings to inform evidence-based policy. Through this demonstration, we expect to identify ways to refine the TOP Guidelines and the TOP Factor. Refinements could include: improving templates for adoption in journal instructions to authors, manuscript submission systems, and published articles; revising explanatory guidance intended to enhance the use, understanding, and dissemination of the TOP Guidelines; and clarifying the distinctions among different levels of implementation.
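
For readers unfamiliar with the metric, a minimal sketch of how a TOP Factor style score can be computed is given below. It assumes the score is simply the sum of a journal's implementation level (0–3) across the modular TOP standards; the standard names and example ratings are illustrative only and do not reproduce the official TOP Factor rubric, which includes additional items.

```python
# Minimal sketch: a TOP Factor style score as the sum of per-standard
# implementation levels (0-3). Standard names and example ratings are
# illustrative and do not reproduce the official TOP Factor rubric.
TOP_STANDARDS = [
    "data citation",
    "data transparency",
    "analysis code transparency",
    "materials transparency",
    "design and analysis reporting",
    "study preregistration",
    "analysis plan preregistration",
    "replication",
]

def top_factor(ratings: dict) -> int:
    """Sum the 0-3 level assigned to each TOP standard."""
    for standard, level in ratings.items():
        if standard not in TOP_STANDARDS:
            raise ValueError(f"unknown standard: {standard}")
        if not 0 <= level <= 3:
            raise ValueError(f"level out of range for {standard}: {level}")
    return sum(ratings.values())

example_journal = {standard: 0 for standard in TOP_STANDARDS}
example_journal["data transparency"] = 2
example_journal["study preregistration"] = 1
print(top_factor(example_journal))  # -> 3
```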

Open Science rankings: yes, no, or not this way? A debate on developing and implementing transparency metrics. – JOTE | Journal of Trial and Error

“The Journal of Trial and Error is proud to present an exciting and timely event: a three-way debate on the topic of Open Science metrics, specifically, transparency metrics. Should we develop these metrics? What purposes do they fulfil? How should Open Science practices be encouraged? Are (transparency) rankings the best solution? These questions and more will be addressed in a dynamic and interactive debate with three researchers of different backgrounds: Etienne LeBel (Independent Meta-Scientist and founder of ERC-funded project ‘Curate Science’), Sarah de Rijcke (Professor of Science and Evaluation Studies and director of the Centre for Science and Technology Studies at Leiden University), and Juliëtte Schaafsma (Professor of Cultural Psychology at Tilburg University and fierce critic of rankings and audits). This is an event organized by the Journal of Trial and Error, and supported by the Open Science Community Tilburg, the Centre for Science and Technology Studies (CWTS, Leiden University), and the Open Science Community Utrecht.”


Boldly growing: PLOS’ new titles and business model update for institutions

“With PLOS’ recent announcement of five new titles in April, PLOS is keen to introduce our newest titles and business model to the library community.

Join PLOS’ outreach, publishing, and partnerships teams for an introduction to these new titles and PLOS’ newest non-APC based, equity-focused business model.

You can learn more about the rationale for launching new titles on the PLOS blog: https://theplosblog.plos.org/2021/04/launching-new-journals-2021/
and recent coverage from Nature: https://www.nature.com/articles/d41586-020-01907-3

This webinar is open to libraries, consortia, and PLOS institutional partners and registration is required….”

REPEAT (Reproducible Evidence: Practices to Enhance and Achieve Transparency)

“Replication is a cornerstone of the scientific method. Historically, public confidence in the validity of healthcare database research has been low. Drug regulators, patients, clinicians, and payers have been hesitant to trust evidence from databases due to high-profile controversies with overturned and conflicting results. This has resulted in underuse of a potentially valuable source of real-world evidence….

Division of Pharmacoepidemiology & Pharmacoeconomics [DoPE]

Brigham & Women’s Hospital and Harvard Medical School.”

Contracting in the Age of Open Access Publications. A Systematic Analysis of Transformative Agreements | Ouvrir la Science

The “socioeconomics of scientific publication” Project, Committee for Open Science

Final report – 17 December 2020 – Contract No. 206-150

Quentin Dufour (CNRS postdoctoral fellow), David Pontille (CNRS senior researcher), Didier Torny (CNRS senior researcher)

Mines ParisTech, Center for the Sociology of Innovation • PSL University

Supported by the Ministry of Higher Education, Research and Innovation

Summary

This study focuses on one of the contemporary innovations linked to the economy of academic publishing: the so-called transformative agreements, a relatively circumscribed object within the relations between library consortia and academic publishers, and temporally situated between 2015 and 2020. The stated objective of this type of agreement is to organise the transition from the traditional model of journal subscription (often offered as thematic bundles or collections) to that of open access, by reallocating the budgets devoted to it.

Our sociological analysis constitutes a first systematic study of this object, based on a review of 197 agreements. The corpus thus constituted includes agreements characterised by the co-presence of a subscription component and an open access publication component, however minimal (publication “tokens” offered, reductions on APCs, etc.). As a result, agreements that only concern centralised funding for open access publishing were excluded from the analysis, whether with publishers that only offer author-pays journals (PLOS, Frontiers, MDPI, etc.) or with publishers whose catalogue includes open access journals. The oldest agreement in our corpus was signed in 2010 and the most recent ones in 2020; agreements starting only in 2021, even if announced during the study, were not retained.

Several results emerge from our analysis. First of all, there is a great diversity of actors involved, with 22 countries and 39 publishers, even if some consortia (Netherlands, Sweden, Austria, Germany) and publishers (CUP, Elsevier, RSC, Springer) signed many more agreements than others. Secondly, the duration of the agreements, ranging from one to six years, reveals a very unequal distribution: more than half of the agreements (103) were signed for three years, and only a small proportion (22 agreements) for four years or more. Finally, despite repeated calls for transparency, fewer than half of the agreements (96) had an accessible text at the time of this study, with no recent trend towards greater availability.

Of the 96 agreements available (47 of which were signed in 2020), 62 have been analysed in depth. To our knowledge, this is the first analysis on this scale of a type of material that was not only unpublished but previously subject to confidentiality clauses. Based on a careful reading, the study describes their properties in detail, from the materiality of the document to the financial formulas, including their morphology and all the rights and duties of the parties. We therefore analysed the content of the agreements as a collection, looking for commonalities and variations through an explicit coding of their characteristics. The study also points out some uncertainties, in particular their “transitional” character, which remains strongly debated.

From a morphological point of view, the agreements show a great diversity in size (from 7 to 488 pages) and structure. Nevertheless, by definition, they all articulate two essential objects: on the one hand, the conditions for reading journal articles, in the form of a subscription, combining concerns of access and security; on the other hand, the modalities of open access publication, articulating the management of a new type of workflow with a whole series of possible options. These options include the scope of the journals considered (hybrid and/or open access), the licences available, the degree of obligation to publish, the eligible authors, and the volume of publishable articles.

One of the most important results of this in-depth analysis is the discovery of an almost complete decoupling, within the agreements themselves, between the subscription object and the publication object. Subscription, of course, is systematically configured as a closed world, subject to payment, which triggers a series of checks on the legitimate circulation of both information content and users; in particular, it insists on prohibitions on the reuse or even copying of academic articles. Open access publishing, on the other hand, is attached to a world governed by free access to content, which leads to concerns about workflow management and accessibility modalities. Moreover, the different elements that make up these contractual objects are not interconnected: on one side, the readers are all members of the subscribing institutions; on the other, only the corresponding authors are concerned; the lists of journals accessible to readers and those reserved for open access publication are usually distinct; and the workflows have totally different…

New Open Access Business Models – What’s Needed to Make Them Work? – The Scholarly Kitchen

“The third CHORUS Forum meeting, held last week, is a relatively new entrant into the scholarly communication meeting calendar. The meeting has proven to be a rare opportunity to bring together publishers, researchers, librarians, and research funders. I helped organize and moderated a session during the Forum, on the theme of “Making the Future of Open Research Work.” You can watch my session, which looked at new models for sustainable and robust open access (OA) publishing, along with the rest of the meeting in the video below.

The session focuses on the operationalization of the move to open access and the details of what it takes to experiment with a new business model. The model the community has the most experience with, the individual author paying an article-processing-charge (APC), works really well for some authors, in some subject areas, in some geographies. But it is not a universal solution to making open access work and it creates new inequities as it resolves others….

Some of the key takeaways for me were found in the commonalities across all of the models. The biggest hurdle that each organization faced in executing its plans was gathering and analyzing author data. As Sara put it, “Data hygiene makes or breaks all of these models.” For PLOS and the ACM, what they’re asking libraries to support is authorship – the model essentially says “this many papers had authors from your institution and what you pay will largely be based on the volume of your output.” But disambiguating author identity, and especially identifying which institutions each represents, remains an enormous problem. While we do have persistent identifiers (PIDs) like ORCID, and the still-under-development ROR, their use is not universal, and we still lack a unifying mechanism to connect the various PIDs into a simple, functional tool to support this type of analysis.

One solution would be requiring authors to accurately identify their host institutions from a controlled vocabulary, but this runs up against most publishers’ desire to streamline the article submission process. There’s a balance to be struck, but probably one that’s going to ask authors to provide more accurate and detailed information….

[M]oving beyond the APC is essential to the long-term viability of open access, and there remains much experimentation to be done….”
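
One illustrative approach to the affiliation-disambiguation problem described above is to match free-text affiliation strings against ROR. The sketch below uses ROR's public affiliation-matching endpoint as I understand the v1 API; the endpoint behaviour and response fields should be verified against current ROR documentation, and the example affiliation string is invented.

```python
# Illustrative sketch (not from the post): matching a free-text affiliation
# string to a ROR ID via ROR's affiliation-matching endpoint. Endpoint and
# response fields follow the public ROR v1 API as I understand it; verify
# against current ROR documentation. The example affiliation is invented.
import requests

def match_affiliation(affiliation: str):
    resp = requests.get(
        "https://api.ror.org/organizations",
        params={"affiliation": affiliation},
        timeout=10,
    )
    resp.raise_for_status()
    for item in resp.json().get("items", []):
        # ROR flags at most one candidate as a confident ("chosen") match.
        if item.get("chosen"):
            return item["organization"]["id"]
    return None  # no confident match; a human would need to disambiguate

print(match_affiliation("Dept. of Biology, Univ. of Somewhere, Anytown"))
```

Even a confident match only addresses part of the problem: connecting the matched institution to an agreement or billing account still requires joining across identifier systems, which is exactly the missing unifying mechanism the session pointed to.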

Building a service to support cOAlition S’s Price & Service Transparency Frameworks: an Invitation to Tender | Plan S

“The European Science Foundation (ESF), on behalf of the cOAlition S members, is seeking to contract with a supplier to build a secure, authentication-managed web-based service which will enable:

academic publishers to upload data, in accord with one of the approved cOAlition S Price and Service Transparency Frameworks;
approved users to be able to log in to this service and, for a given journal, determine what services are provided and at what price;
approved users to be able to select several journals and compare the services offered and prices charged by the different journals selected;
the Journal Checker Tool (JCT) – via an API call – to determine whether a journal has (or has not) provided data in line with one of the approved Price and Service Transparency Frameworks.

Given that some of the data that will be made accessible through this service is considered sensitive, it is imperative that suppliers can build a secure service such that data uploaded by a publisher, and intended by them for approved users only, cannot be accessed by any other publisher.

This service must be functional – in terms of allowing publishers to upload their “Framework Reports” – by 1 December 2021. The service must be accessible to all approved users – including the JCT – by 1 June 2022….”
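
Since the service is only at the tender stage, any interface is necessarily hypothetical, but the JCT-facing requirement above reduces to a per-journal lookup of whether Framework data exist. A sketch, with an invented base URL, endpoint, and response shape:

```python
# Hypothetical sketch of the JCT-facing lookup described in the tender. The
# base URL, endpoint, and response shape are invented for illustration; the
# real interface will be defined by whichever supplier wins the tender.
import requests

SERVICE_BASE = "https://transparency.example.org/api"  # placeholder URL

def has_framework_data(issn: str) -> bool:
    """Return True if the journal has supplied data under an approved
    Price & Service Transparency Framework, per the (hypothetical) service."""
    resp = requests.get(f"{SERVICE_BASE}/journals/{issn}/framework-status", timeout=10)
    resp.raise_for_status()
    # Assume a response like {"issn": "1234-5678", "framework_report": true}.
    return bool(resp.json().get("framework_report"))

# The Journal Checker Tool would make a call like this for each journal lookup:
# has_framework_data("1234-5678")
```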

Increasing transparency through open science badges

“Authors who adopt transparent practices for an article in Conservation Biology are now able to select from 3 open science badges: open data, open materials, and preregistration. Badges appear on published articles as visible recognition and highlight these efforts to the research community. There is an emerging body of literature regarding the influences of badges, for example, an increased number of articles with open data (Kidwell et al. 2016) and increased rates of data sharing (Rowhani-Farid et al. 2018). However, in another study, Rowhani-Farid et al. (2020) found that badges did not “noticeably motivate” researchers to share data. Badges, as far as we know, are the only data-sharing incentive that has been tested empirically (Rowhani-Farid et al. 2017).

Rates of data and code sharing are typically low (Herold 2015; Roche et al. 2015; Archmiller et al. 2020; Culina et al. 2020). Since 2016, we have asked authors of contributed papers, reviews, method papers, practice and policy papers, and research notes to tell us whether they “provided complete machine- and human-readable data and computer code in Supporting Information or on a public archive.” Authors of 31% of these articles published in Conservation Biology said they shared their data or code, and all authors provide human-survey instruments in Supporting Information or via a citation or online link (i.e., shared materials)….”

Open Research Transparency

“Currently, innovative ideas are abundant in science, yet we are still short of practical tools to implement these ideas in everyday practice. A tool is practical if it can achieve its aim while requiring little or no extra effort from the user. The consideration of user experience, efficiency, and user-friendliness is still weak in the development of scientific tools. In this workshop, three early career researchers will present their innovations that aim to improve scientific practice in an efficient way, and we invite the audience to a discussion to formalise our thinking about the development of new tools….”

Association Science2 (Science for Science)

“Our objectives are:

to promote the dissemination of high-quality research without private intermediaries, primarily through the creation of top-level open access journals with low article-processing charges (€500/article + VAT);
to prioritize standards of excellence and complete transparency in the process of open dissemination of science;
to promote the training of early career scientists from around the world, prioritizing excellence….”