Introducing PReF: Preprint Review Features – ASAPbio

“Preprint reviews hold the potential to build trust in preprints and drive innovation in peer review. However, the variety of platforms available to contribute comments and reviews on preprints means that it can be difficult for readers to gain a clear picture of the process that led to the reviews linked to a particular preprint. 

To address this, ASAPbio organized a working group to develop a set of features that could describe preprint review processes in a way that is simple to implement. We are proud to share Preprint Review Features (PReF) in an OSF Preprint. PReF consists of 8 key-value pairs, describing the key elements of preprint review. The white paper includes detailed definitions for each feature, an implementation guide, and an overview of how the characteristics of active preprint review projects map to PReF. We also developed a set of graphic icons (below) that we encourage the preprint review community to reuse alongside PReF. 
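Because PReF is defined as a set of eight key-value pairs, it lends itself to a simple machine-readable representation. The sketch below is illustrative only: the key names and values shown are hypothetical placeholders, and the authoritative feature definitions are those in the PReF white paper.

```python
# Illustrative sketch of a PReF record as eight key-value pairs.
# Key names and values are hypothetical placeholders; the authoritative
# feature definitions are in the PReF white paper on OSF.
pref_record = {
    "review_requested_by": "authors",            # who solicited the review
    "reviewer_selected_by": "organizer",         # how reviewers were chosen
    "review_coverage": "complete_paper",         # scope of the review
    "reviewer_identity_known_to": "public",      # degree of reviewer anonymity
    "competing_interests": "checked",            # whether COIs are declared
    "public_interaction": "included",            # open discussion during review
    "author_response": "included",               # whether authors can respond
    "decision": "none",                          # whether a decision is issued
}

# A platform implementing PReF would publish one such record per
# review process, alongside the reviews themselves.
assert len(pref_record) == 8
```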

While the Peer Review Terminology developed by the STM working group and the Open Peer Review taxonomy provided useful background for our discussions, they were designed with a focus on journal-based peer review, and do not capture all the possible elements that can be part of preprint review. We acknowledge that there are nuances and different views as to what constitutes “peer review,” “feedback,” and “commenting”; rather than create strict definitions, our aim was to parse out important aspects of the process involved in any form of review on preprints, and to do so in a format that could be used by platforms that host, coordinate, or aggregate such activities. Therefore, we are glad to see that PReF is already implemented on ReimagineReview and on review aggregators like Early Evidence Base and Sciety. We hope that our efforts in the development and adoption of PReF will promote better visibility and discoverability of preprint review….”

COAR Welcomes Significant Funding for the Notify Project

We are delighted to announce that COAR has been awarded a US$4 million grant from Arcadia, a charitable fund of Lisbet Rausing and Peter Baldwin. The four-year grant will go towards the COAR Notify Project, which is developing and implementing a standard protocol for connecting the content in the distributed repository network with peer reviews and assessments in external services, using Linked Data Notifications.
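In the W3C Linked Data Notifications model that COAR Notify builds on, a sender POSTs a JSON-LD payload to a target's advertised inbox. The sketch below is a minimal, hypothetical notification announcing a review of a repository item; all URLs and identifiers are invented for illustration, and the actual COAR Notify payload patterns are defined by that project, not here.

```python
import json

# Hypothetical Linked Data Notification announcing that an external
# review service has reviewed an item held in a repository. All URLs
# and identifiers below are invented for illustration only.
notification = {
    "@context": ["https://www.w3.org/ns/activitystreams"],
    "id": "urn:uuid:0370c0fb-bb78-4a9b-87f5-bed307a509dd",
    "type": ["Announce"],
    "actor": {"id": "https://review-service.example/", "type": "Service"},
    "object": {  # the review being announced
        "id": "https://review-service.example/reviews/42",
        "type": ["Document"],
    },
    "context": {  # the repository item the review is about
        "id": "https://repository.example/record/123",
        "type": ["Document"],
    },
    "target": {"id": "https://repository.example/inbox/", "type": "Service"},
}

payload = json.dumps(notification, indent=2)
# In practice, the sender would POST `payload` to the target's inbox
# with Content-Type: application/ld+json.
```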

The potential butterfly effect of preregistered peer-reviewed research – The Official PLOS Blog

“Refocusing journal peer review on the study design phase exerts more numerous and greater downstream changes. Peer review that focuses on evaluating the significance of the research question, the methods, and the analytical approach before work begins has the power to shape stronger, more rigorous, and more creative research. Making an editorial decision while results are still unknown minimizes the potential impacts of confirmation bias and impact bias, taking science communication back to its roots, with an emphasis on quality, rigor, and pure intellectual curiosity. As Kiermer explains, “Preregistration and peer review of the study protocol with a journal is a way to tackle publication bias. As long as the protocol is followed, or any deviations explained, it’s a guarantee for the author that the results will be published, even if they don’t confirm their hypothesis.”

In combination, all of these factors contribute to a more complete and efficient scientific record, replete with studies exploring important hypotheses, performed to the very highest technical standards, and free from the distorting influence of impact-chasing, ego, and bias. A scientific record that is both demonstrably trustworthy, and widely trusted. And with that, there is no telling where science might go, or how quickly….”

Designing an Open Peer Review Process for Open Access Guides | Community-led Open Publication Infrastructures for Monographs (COPIM)

by Simon Worthington

The LIBER Citizen Science Working Group is embarking on the design of an open peer review process for the guidebook series being published on the topic of citizen science for research libraries. The LIBER working group, in collaboration with COPIM, is looking for input and feedback on the design of the open peer review workflow. COPIM is supporting the working group by contributing its experience and knowledge of open access book publishing, with respect to collaborative post-publication input, community peer review processes, and reuse. The first section of the guide, “Citizen Science Skilling for Library Staff, Researchers, and the Public,” has already been published, with three more sections to follow.


SocArXiv Papers | Targeted, actionable and fair: reviewer reports as feedback and its effect on ECR career choices

Abstract: Previous studies of the use of peer review for the allocation of competitive research funding have concentrated on questions of efficiency and how to make the ‘best’ decision, by ensuring that successful applicants are also the most productive or visible in the long term. This paper examines how peer review can be used as a participatory research governance tool, focusing on the function feedback plays in the development of early career researcher (ECR) applicants. Using a combination of surveys, interviews, and linguistic coding of reviewer reports, this study explores how reviewer reports provided to unsuccessful applicants as an artefact of the peer-review decision-making process can be considered a method of feedback. Specifically, it examines which components of this feedback underpinned their decisions to re-submit their grant applications following first failure, change their research topics, or withdraw from academia entirely. Peer review feedback, we argue, sends signals to applicants that encourage them to persist (continue) or switch (not continue) even when the initial application has failed. The results lead to the identification of standards of feedback for funding agencies and peer reviewers to promote when providing reviewer feedback to applicants as part of their peer review process. The results also highlight a function of peer review overlooked by current research: one that is not concentrated solely on reaching an outcome, but that can be used effectively to support the development of individuals and their future research plans.


SurveyMonkey Powered Online Survey

“Thank you in advance for taking the time to respond to this survey about eLife. It should take no more than 10 minutes to complete. 


We seek to transform research communication and we’d love to hear your thoughts related to initiatives we’ve got underway.

All questions are optional. Your feedback is anonymous and it will help us better understand the expectations of the community and drive change and innovation in scientific and medical publishing….”

Keep calm and carry on: moral panic, predatory publishers, peer review, and the emperor’s new clothes | Journal of the Medical Library Association

Abstract:  The moral panic over the impact of so-called predatory publishers continues unabated. It is important, however, to resist the urge to simply join in this crusade without pausing to examine the assumptions upon which such concerns are based. It is often assumed that established journals are almost sacrosanct, and that their quality, secured by peer review, is established. It is also routinely presumed that such journals are immune to the lure of easy money in return for publication. Rather than looking at the deficits that may be apparent in the practices and products of predatory publishers, this commentary invites you to explore the weaknesses that have been exposed in traditional academic journals but are seldom discussed in the context of predatory publishing. The inherent message for health and medical services staff, researchers, academics, and students is, as always, to critically evaluate all sources of information, whatever their provenance.


I Don’t Peer-Review for Non-Open Journals, and Neither Should You

“Most will also agree that editorial work should be done in the service of up-and-coming OA journals rather than to prop up the reputations of those that remain paywalled. But withholding peer review from non-open journals is more controversial. Even OA campaigners sometimes raise objections. These I now propose to rebut….”

ASAPbio Crowd preprint review 2022 sign-up form

“Following our trial last year, ASAPbio is running further preprint crowd review activities in 2022. Our goal is to provide an engaging environment for researchers to participate in providing feedback on preprints and support public reviews for preprints.

In 2022, we will be coordinating public reviews for different disciplines. We are pleased to say that we are collaborating with SciELO Preprints to also coordinate the review of preprints in Portuguese. This year we will cover the following disciplines:

– Cell biology preprints from bioRxiv (English)
– Biochemistry preprints from bioRxiv (English)
– Infectious diseases preprints from SciELO Preprints (Portuguese)

This form is for reviewers who will participate in the review of preprints from bioRxiv; to sign up for the review of SciELO Preprints in Portuguese, please complete this form: https://docs.google.com/forms/d/e/1FAIpQLSd0wrAa7FLrw8I1j5p9mysWrstehPqDqsn9UPjUbqrwRnQU-A/viewform

We invite researchers in the disciplines above to join our crowd preprint review activities, and particularly encourage early career researchers to participate. The activities will run for three months, from mid-May to August 2022….”

New policy: Review Commons makes preprint review fully transparent – ASAPbio

“In a major step toward promoting preprint peer review as a means of increasing transparency and efficiency in scientific publishing, Review Commons is updating its policy: as of 1 June 2022, peer reviews and the authors’ response will be posted by Review Commons to bioRxiv or medRxiv when authors transfer their refereed preprint to the first affiliate journal….”

Economists want to see changes to their peer review system. Let’s do something about it. | VOX, CEPR Policy Portal

Peer review is central to the evaluation of research, but surprisingly little is known about its inner workings. This column presents the results of a survey of over 1,400 economists asking about their experiences with the system. The findings suggest that there are opportunities for improvement in the allocation and production of referee reports, as well as in the length of the process. The authors consider an assortment of proposals to address these issues, some of which command broad support from their respondents.

GigaScience and GigaByte Groups Join Sciety

Over the last month, we have added two new groups, GigaScience and GigaByte, from the journals of the same name, increasing the number of specialist teams displaying their evaluations on Sciety.

GigaScience and GigaByte are part of GigaScience Press. With a decade-long history of open-science publishing, they aim to revolutionise publishing by promoting reproducibility of analyses and data dissemination, organisation, understanding, and use. As open-access and open-data journals, they publish all research objects (including data, software, and workflows) from ‘big data’ studies across the life and biomedical sciences. These resources are managed using the FAIR Principles for scientific data management and stewardship, which state that research data should be Findable, Accessible, Interoperable, and Reusable. They also follow the practices of transparency and openness in science publishing, and as such, they embrace open peer review (which is mandated for both journals) and preprints (which are strongly encouraged in GigaScience and mandated for GigaByte). The opportunities for combining both are covered by GigaScience in its video on open science and preprint peer review for Peer Review Week.


Frontiers | Key Factors for Improving Rigor and Reproducibility: Guidelines, Peer Reviews, and Journal Technical Reviews | Cardiovascular Medicine

Abstract:  To respond to the NIH’s policy for rigor and reproducibility in preclinical research, many journals have implemented guidelines and checklists to guide authors in improving the rigor and reproducibility of their research. Transparency in developing detailed prospective experimental designs and providing raw data are essential premises of rigor and reproducibility. Standard peer reviews and journal-specific technical and statistical reviews are critical factors for enhancing rigor and reproducibility. This brief review also shares some experience from Arteriosclerosis, Thrombosis, and Vascular Biology, an American Heart Association journal, that has implemented several mechanisms to enhance rigor and reproducibility for preclinical research….