SurveyMonkey Powered Online Survey

“Thank you in advance for taking the time to respond to this survey about eLife. It should take no more than 10 minutes to complete. 


We seek to transform research communication and we’d love to hear your thoughts related to initiatives we’ve got underway.

All questions are optional. Your feedback is anonymous and it will help us better understand the expectations of the community and drive change and innovation in scientific and medical publishing….”

New policy: Review Commons makes preprint review fully transparent – ASAPbio

“In a major step toward promoting preprint peer review as a means of increasing transparency and efficiency in scientific publishing, Review Commons is updating its policy: as of 1 June 2022, peer reviews and the authors’ response will be posted by Review Commons to bioRxiv or medRxiv when authors transfer their refereed preprint to the first affiliate journal….”

eLife welcomes Fiona Hutton as new Head of Publishing | For the press | eLife

eLife is pleased to announce Fiona Hutton as its new Head of Publishing.

Originally a life scientist specialising in cancer virology, Hutton brings 20 years of STM publishing experience to eLife, including her most recent positions as Head of STM Open Access Publishing and Executive Publisher at Cambridge University Press, UK. She formally begins her role with eLife today, taking over from Interim Head of Publishing Peter Rodgers.

Quantitative Science Studies successfully completes transparent peer review pilot | International Society for Scientometrics and Informetrics

“In August 2020 Quantitative Science Studies (QSS) started a transparent peer review pilot, in close collaboration with our publisher, MIT Press. For articles accepted for publication in QSS, the review reports, along with the responses of the authors and the decision letters of the editor, were published in Publons, provided that the authors agreed to participate in the pilot. Reviewer identities were not revealed, unless reviewers preferred to give up their anonymity.

By publishing review reports, QSS aims to provide insight into the strengths and weaknesses of an article and into unresolved disagreements among authors, reviewers, and editors. This information may provide helpful context for readers. It also increases the accountability of reviewers and editors.

We are pleased to announce the successful completion of the QSS transparent peer review pilot. An overwhelming majority of the authors who submitted their work to QSS decided to participate in the pilot. For 90% of the articles submitted to QSS during the pilot and accepted for publication in the journal, the authors agreed to publish the review reports. The review reports for these articles are openly available in Publons under a CC BY license. In a limited number of cases, reviewers decided to reveal their identity….”

Open Peer Reviewers in Africa: Nominations are now open to recruit future peer-review trainers across the continent | Inside eLife | eLife

AfricArXiv, Eider Africa, eLife, PREreview, and TCC Africa have collaborated to develop a peer-review training workshop, Open Peer Reviewers in Africa, tailored to the region-specific context of African researchers. They co-created tools and strategies for scholarly literature evaluation, and are now ready to pilot the new workshop series with researchers who would be interested in sharing their knowledge by training others, and helping co-develop the resources further.

Job posting: ASAPbio Communications Assistant – ASAPbio

“ASAPbio is seeking a part-time Communications Assistant to help us share our work on preprints & open peer review with the life sciences community. This position provides an opportunity to engage with open science and build communication skills; it might be especially well-suited for students or postdocs.”

Nature is trialling transparent peer review — the early results are encouraging

“In an attempt to change things, Nature Communications has since 2016 been encouraging authors to publish peer-review exchanges. In February 2020, and to the widespread approval of Twitter’s science community, Nature announced that it would offer a similar opportunity. Authors of new manuscript submissions can now have anonymous referee reports — and their own responses to these reports — published at the same time as their manuscript. Those who agree to act as reviewers know that both anonymous reports and anonymized exchanges with authors might be published. Referees can also choose to be named, should they desire.

A full year’s data are now in, and the results are encouraging. During 2021, nearly half (46%) of authors chose to publish their discussions with reviewers, although there is variation between disciplines (see ‘Peer review opens up’). Early data suggest more will do so in 2022. This is a promising trend. And we strongly encourage more researchers to take this opportunity to publish their exchanges. Last year, some 69% of Nature Communications’ published research articles were accompanied by anonymous peer-review reports together with author–reviewer exchanges, including manuscripts in life sciences (73% of published papers), chemistry (59%), physics (64%) and Earth sciences (77%)….

The benefits to research are huge. Opening up peer review promotes more transparency, and is valuable to researchers who study peer-review systems. It is also valuable to early-career researchers more broadly. Each set of reports is a real-life example, a guide to how to provide authors with constructive feedback in a collegial manner….”

Guest Post: Preprint Feedback is Here – Let’s Make it Constructive and FAST – The Scholarly Kitchen

“Last year, we convened a Working Group to reflect upon and develop a set of best practices for public preprint feedback. Our rationale was to provide a framework that could benefit and support authors, reviewers, and the community to engage in public and open scientific discussion of preprints, while ensuring a thriving and welcoming environment for everyone. The group worked on a set of “norms” that reflect the behaviors and culture we would like to promote to increase participation and acceptance of preprint public commenting. The result is the FAST principles for preprint feedback, a set of 14 principles clustered around four broad themes:

Focused: assess whether our comments and feedback are targeted towards relevant and actionable parts of the preprint (e.g., the current focus of the paper or the scientific work, rather than suitability for a particular journal).
Appropriate: ensure that before engaging in any kind of scientific discussion, we have reflected on our biases and behave with the same level of integrity as in any other scientific exchange.
Specific: similar to reviewing a manuscript for a journal, preprint feedback should also evaluate the study’s claims against the data and clarify whether the critiques are major or just meant to tackle minor issues that don’t affect the overall findings.
Transparent: as with any type of scientific discussion, it is key to be as open and transparent as possible, embracing any oversights and crediting everyone who participated in the work. However, we acknowledge that not everyone is comfortable signing their comments. In such cases, we provide alternative options for reviewers to disclose their background or expertise to contextualize the comments they post….”

Let’s make our reviews open, starting now

“The good news is that we do not need to wait for a protracted system change, there are ways in which researchers can already make their reviews both more open and useful for the community:

Only review for journals that publish the review reports for the articles they publish.
When reviewing for a journal, if the paper has an associated preprint, post the review publicly on the preprint, as proposed by Ludo Waltman, James Fraser and Cooper Smout. This attaches the evaluation to the scientific work itself, independent of the eventual journal’s decision.
Post your comments and reviews on preprints. Researchers discuss research papers in many formats and forums, independent of a journal’s editorial process. They discuss the latest paper in their research area with collaborators, debate articles at journal clubs or social media, or discuss research at conferences. All that feedback is valuable, and if made public it would not only benefit the authors but also other members of the research community and the public….”

History Can Be Open Source: Democratic Dreams and the Rise of Digital History | The American Historical Review | Oxford Academic

Abstract: In an ongoing commitment to experimentation, the AHR invited an “open peer review” of a submitted manuscript, “History Can Be Open Source: Democratic Dreams and the Rise of Digital History,” by Joseph L. Locke (University of Houston–Victoria) and Ben Wright (University of Texas at Dallas). Given that Locke and Wright argued for the coexistence of transparency alongside formal academic peer review, subjecting their submission to an open review made sense. The peer review process itself tested the propositions about the democratization of scholarship they put forth in their submission. Their article appears in a new section of the AHR, “Writing History in a Digital Age,” overseen by consulting editor Lara Putnam (https://ahropenreview.com/). The maturation of digital history has propelled historians’ embrace of open educational resources. But, this article argues, open access licensing is not enough. Digital history’s earliest practitioners promised not just more accessible digital materials, but a broader democratization of history itself. This article therefore moves beyond questions of technological innovation and digital access in the rise of digital history to engage more fundamental and intractable questions about inequality, community, and participatory historical inquiry.


Preprints and open peer review come of age | Research Information

“Scott Edmunds highlights opportunities to improve the publishing process by improving transparency, accountability and speed

As editor in chief of GigaScience, for a decade we have increased transparency and trust in our published work by throwing open the entire peer review process, and letting people really ‘look under the hood’. This isn’t a new concept: the first experiments in open peer review go back at least 40 years, and the medical community started embracing it in the late 90s, but in the last few years the practice has become increasingly mainstream. Many Nature journals and PLOS have taken the leap in opening their review process, and the UNESCO Recommendation on Open Science specifically highlighted open peer review as one of the approaches its member states should promote for open science. The challenges of the Covid-19 pandemic make these moves even more timely, as increasing transparency in research is an extremely useful weapon in fighting scepticism about the scientific process during these turbulent times.

As peer review has opened, reviewers are now able to gain credit for their hard work, and technological approaches are now available to help them advertise and amplify their peer reviewing duties. On top of third-party platforms that capture peer review history, such as the pioneering Publons, ORCID has provided peer review functionality and a ‘Reviews Activity’ section on its profile pages. Leveraging this, journals such as GigaScience, F1000Research and PeerJ have been giving their peer reviews DOIs to make them independently citable and easily claimed in author ORCID profiles….”

It is finally time for post-publication review

“Several platforms now exist for constructive post-publication conversations. Some succeed in addressing a few of these challenges, but none have been adopted widely. For example, PubPeer has gained traction as a place where researchers can post criticisms of poor-quality papers. Sites such as ScienceOpen, SciBase and PREreview aim to be repositories of crowdsourced manuscript reviews. A growing list of post-publication platforms is hosted at Reimagine Review. Increasingly, authors can post insights about their own work on social media, and readers will ask questions and provide feedback.

If the conversation about post-publication review has been ongoing for many years, and the platforms exist and are ready to use, there must be larger reasons why scientists rarely engage with papers — their own or their colleagues’ — once the final version goes up online.

I see three main challenges to post-publication peer review:

Incentives. Scientists want issues of reproducibility and post-publication dialogue to be addressed but have no incentive to engage. They rightfully ask, “Will this help me get a postdoc position? How about a promotion? Will this increase my standing in the scientific community?” Any platform that seeks to be effective must align self-interest and nobler motives.

Political realities. Science positions itself as an objective pursuit of truth, but research scientists know that’s not always how it works. How likely is a graduate student to criticize publicly their professor’s work, even when their point is valid? A failure to recognize the prominent role these dynamics play in human behavior will limit any solution’s effectiveness.

Access. Reviewing a paper requires being able to read that paper, but most scientific knowledge is published in subscription-based journals whose business model is built on limited access. While the proliferation of preprint servers and open-access journals is a tremendous step in the right direction, the research community has a long way to go before publishing a paper means everyone can read it. …”

FAST principles for preprint feedback – ASAPbio

“While preprint feedback is beneficial for the authors, reviewers, readers and other stakeholders, public commenting on preprints has so far remained relatively low. Cultural barriers likely influence participation in public preprint feedback. Authors fear that competitors will leave unfair criticism, or that even fair criticism will bias journal editors and evaluators: while nearly every paper will be thoroughly criticized during journal peer review, the rarity of this feedback being out in the open might lead some to believe that the paper receiving it is especially problematic. Potential reviewers, especially those who rely on more senior colleagues for career advancement, are concerned about retribution for public criticism, or simply harming their reputation by leaving uninformed feedback.

In order to overcome these concerns, we convened a Working Group to discuss how to alleviate the social friction associated with public feedback by developing a set of behavioral norms to guide constructive participation in preprint review. The Working Group brought together relevant stakeholders (researchers, editors, preprint review platform representatives, funders) to discuss the challenges around participation in preprint review and explore what cultural norms could enable and foster further participation in public commentary and feedback. …”