Time to Reform Academic Publishing | Forum

“In particular, as graduate, professional, and medical students, we have been shaped by the relics of an inequitable publishing model that was created before the age of the internet. Our everyday work—from designing and running experiments to diagnosing and treating patients—relies on the results of taxpayer-funded research. Having these resources freely available will help to accelerate innovation and level the playing field for smaller and less well-funded research groups and institutions. With this goal of creating an equitable research ecosystem in mind, we want to highlight the importance of creating one that is equitable in whole….

But today, the incentives for institutions do not align with the goals of equity, nor do incentives within institutions always align with these goals; change will be necessary to help support a more equitable system. This is especially true for early-career researchers, who might struggle to comply with new open-access guidelines if they need to pay a high article publishing fee to make their research open in a journal that is valued by their institutions’ promotion and tenure guidelines.

To these ends, it is imperative that the process for communicating research results to the public and other researchers does not shift from a “pay-to-read” model to a “pay-to-publish” model. That is, we should not use taxpayer dollars to pay publishers to make research available, nor should we simply pass these costs on to researchers. This approach would be unsustainable long-term and would go against the equity goals of the new OSTP policy. Instead, we hope that funders, professional societies, and institutions will come along with us in imagining and supporting innovative ways for communicating science that are more equitable and better for research….”

Survey points to two key challenges with preprint feedback: recognition and trust – ASAPbio

“In preparation for the Recognizing Preprint Peer Review workshop, ASAPbio integrated input from two working groups to prepare a survey for researchers, funders, and journal editors and publishing organization employees. The survey sought to gather views and experience with preprint feedback and review from a broad range of stakeholders, to help inform the conversations at the workshop.

The survey garnered 230 responses, and we share here summaries of the two largest categories of respondents: 161 responses from researchers and 51 responses from journal editors and publishing organization employees. You can view the results on Google Sheets and on Zenodo….

Most respondents had received no feedback on preprints, which, for the purpose of this survey, we defined as any public commentary on preprints. Of those who had received some feedback, only a small fraction indicated that the feedback came in the form of detailed reviews. 

With few researchers having received feedback, perhaps it’s unsurprising that a significant number of them expressed concerns with the prospect: the most significant concerns related to hesitancy about the quality or fairness of the feedback and about the commenter’s motivations for providing it.

However, more than half of respondents said they’d be likely or very likely to request feedback on their preprints if journals incorporated preprint reviews into editorial decisions or treated them like reviews transferred from another journal. Other potential incentives, such as funders recognizing preprint peer reviews in various ways, were not far behind….”

Incentivizing Collaborative Open Research (ICOR) Envisions a Culture That Rewards Open Science – SPARC

“The sweeping movement towards open research has set in motion changes across funding bodies, institutions, and scholars. For open research to take off, sharing at all stages of the research cycle needs to be easy and the benefits explicitly recognized.

A new project is cataloguing best practices and promoting real incentives to work in the open, with the aim of improving reproducibility and accelerating outcomes to advance science.

Incentivizing Collaborative Open Research (ICOR) began in 2020 with discussions among a circle of 20 strategists led by Kristen Ratan, founder of Strategies for Open Science (Stratos) and Sarah Greene, founder of Rapid Science. The team set out to identify policies, tools, and practices that can be tested to provide evidence regarding the power of operating in the open. A goal of ICOR is to highlight pockets of innovation and to connect researchers with concrete practices that ease and improve their work in open science….”

Open Science to Increase Reproducibility in Science – Research Data Management

“OSIRIS will investigate, trial and implement interventions to improve reproducibility in science

While over the past decade many interventions to improve reproducibility have been introduced, targeted at funders, publishers or individual researchers, only a few of them have been empirically tested. OSIRIS will do just that, testing existing and newly developed interventions, including Open Science practices, through controlled trials. The underlying drivers, barriers and incentives of reproducibility will also be studied in a systematic way. The aim is to deliver and disseminate guidance about evidence-based interventions that can improve the reproducibility of scientific findings….”

Fifth U.S. Open Government National Action Plan | open.USA.gov

“Broaden Public Access to Federally-Funded Research Findings and Data.

Many important scientific and technological discoveries, including those that have helped mitigate the COVID-19 pandemic, have been supported by American tax dollars. Yet frequently, the results of such Federally-funded research are out of reach for many Americans, available only for a cost or with unnecessary delays. These barriers to accessing Federally-supported research deepen inequalities, as funding disadvantages faced by under-resourced institutions like minority-serving colleges and universities prevent communities from accessing the results of research that taxpayers have funded. To tackle these obstacles and unlock new possibilities for further innovation and participation in science, the Federal Government previously delivered guidance to agencies to develop plans for greater public access to taxpayer-funded research.

Looking forward, the Biden-Harris Administration is taking new steps to expand and accelerate access to publicly-funded research results by ensuring that publications and associated data resulting from Federally funded research are freely and publicly available without delay after publication. Making data underpinning research publications more readily available improves transparency into Federally-supported work, enabling others to replicate and build on research findings. Going forward, the Government commits to supporting access to Federally-funded science and data through several mechanisms, including through the National Science and Technology Council’s Subcommittee on Open Science; by permitting researchers to include publication and data sharing costs in their research budget proposals to Federal grant programs; by launching programs aimed at awarding more grants to early-stage researchers as well as encouraging a diverse pool of award applicants; and by exploring new incentive structures to recognize institutions and researchers who are supporting public access to data and research.”

Market forces influence editorial decisions – ScienceDirect

“In this issue of Cortex, Huber et al. recount their experience in attempting to update the scientific record through an independent replication of a published study (Huber, Potter, & Huszar, 2019). In general, publishers resist issuing retractions, refutations or corrections to their stories or papers for fear of losing public trust, diminishing their brand and possibly ceding their market share (Sullivan, 2018). Unfortunately, this is just one way that market logic – retaining a competitive advantage among peers – explicitly or implicitly influences editorial priorities and decisions more broadly….

There’s the well-known tautology that news is what newsrooms decide to cover and what’s “newsworthy” is influenced by market logic. That news organizations, charged with relating truth and facts, are subject to market-based decisions is a major source of contention among the discerning public. It should be even more contentious that the stewards of scientific knowledge, academic publishers, are also beholden to it….

Although top journals are loath to admit they ‘chase cites’ (Editorial, 2018), market forces make this unavoidable. One example is a strategy akin to product cost cross-subsidization, such as when in journalism profitable traffic-driving, click-bait articles subsidize more costly and in-depth, long-form investigative reporting. In order to attract the ‘best’ science, top journals must maintain a competitive impact factor. If the impact factor strays too far from the nearest competitor, then the journal will have trouble publishing the science it deems most important, because of the value that coveted researchers place on perceived impact….

Although publishers tout the value of replications and pay lip service to other reformative practices, their policies in this regard are often vague and non-committal….

Most professional editors are committed to advancing strong science, but however well-intentioned and good-faith reforms may be, they are necessarily hamstrung by market forces. This includes restrained requirements for more rigorous and responsible research conduct. Journals do not want to put in place policies that seem so onerous that authors decide to publish instead in competing but less demanding journals. Researchers need incentives for and enforcement of more rigorous research practices, but they want easier paths to publication. The result is that new policies at top journals allow publishers to maintain a patina of progressiveness in the absence of real accountability….

The reforms suggested by Huber et al. are welcome short-term fixes, but the community should demand longer-term solutions that break up the monopoly of academic publishers and divorce the processes of evaluation, publication and curation (Eisen and Polka, 2018). Only then may we wrest the power of science’s stewardship from the heavy hand of the market.”

Indonesian research access: quantity over quality?

“Prior to the open access movement and the proliferation of the internet, almost all Indonesian higher education institutions made thesis and dissertation collections closed, accessible only with certain permissions….

The lack of a selection process and quality control for the scholarly resources uploaded to the institutional repositories had led to some unhelpful material making its way into them: documents with supervisors’ comments still visible; documents that were compressed or password protected; documents that were uploaded as multiple image files; documents that were available only partially; and so on….

When quantity trumps quality, the repositories become less effective as a means of disseminating scholarly works….”

Toppling the Ivory Tower: Increasing Public Participation in Research Through Open and Citizen Science

“Prior to the emergence of professional researchers, amateurs without formal training primarily made contributions to science in what is known as ‘citizen science.’ Over time, science has become less accessible to the public, while at the same time public participation in research has decreased. However, recent progress in open and citizen science may be the key to strengthening the relationship between researchers and the public. Citizen science may also be key to collecting data that would otherwise be unobtainable through traditional sources, such as measuring progress on the United Nations Sustainable Development Goals (SDGs). However, despite myriad benefits, there has been limited legislative action taken to promote open and citizen science policies. The underlying issues are incentive systems which overemphasize publication in high impact, for-profit journals. The suggested policy solutions include: 1) creating an open database for citizen science projects, 2) restricting publishers from disadvantaging citizen science, and 3) incorporating open science in researcher evaluation.”

Equitable Research Capacity Towards the Sustainable Development Goals: The Case for Open Science Hardware

“Changes in science funders’ mandates have resulted in advances in open access to data, software, and publications. Research capacity, however, is still unequally distributed worldwide, hindering the impact of these efforts. We argue that to achieve the Sustainable Development Goals (SDGs), open science policies must shift focus from products to processes and infrastructure, including access to open source scientific equipment. This article discusses how conventional, black box, proprietary approaches to science hardware reinforce inequalities in science and slow down innovation everywhere, while also representing a threat to research capacity strengthening efforts. We offer science funders three policy recommendations to promote open science hardware for research capacity strengthening: a) incorporating open hardware into existing open science mandates, b) incentivizing demand through technology transfer and procurement mechanisms, c) promoting the adoption of open hardware in national and regional service centers. We expect this agenda to foster capacity building towards enabling the more equitable and efficient science needed to achieve the SDGs.”

Reducing Barriers to Open Science by Standardizing Practices and Realigning Incentives

“Open science, the practice of sharing findings and resources towards the collaborative pursuit of scientific progress and societal good, can accelerate the pace of research and contribute to a more equitable society. However, the current culture of scientific research is not optimally structured to promote extensive sharing of a range of outputs. In this policy position paper, we outline current open science practices and key bottlenecks in their broader adoption. We propose that national science agencies create a digital infrastructure framework that would standardize open science principles and make them actionable. We also suggest ways of redefining research success to align better with open science, and to incentivize a system where sharing various research outputs is beneficial to researchers.”

Open research is a tough nut to crack. Here’s how

“Investment, training and incentives are required if the sector is going to rise to the challenge of truly embracing open research…

But despite enthusiasm from funders, appropriate support for researchers is often lacking, perhaps because of the incentives that act against institutions finding shared solutions. Open research requires digital infrastructure combined with appropriate training. This is a team challenge – researchers, technicians and professional services staff (such as those working in library teams but also in staff development) need to work together to deliver this effectively….

The solution, in our view, is collaboration.

Training in open research practices, for example, can (and should) be coordinated. To a degree it can even be centralised, in a similar way to how digital infrastructure can be centralised where that is appropriate (such as with Jisc). For example, train-the-trainer courses allow institutions to send trainers (perhaps drawn from both academic and professional services staff, and across career stages) to work together to develop individual workshops that are tailored to the local audience but share common elements that maximise interoperability. This, of course, will require institutions to contribute to a common effort – a very different approach to the local approach to training that is typical, but one that is ultimately likely to be both more effective and more cost-effective.

Similarly, incentives such as promotion criteria, open research prizes and so on can also be harmonised across institutions. Aligned promotion criteria will also serve to promote researcher mobility, if what is good for a researcher’s career (and for research) at one institution will also benefit them when they move to another. Offering open research prizes across multiple institutions – perhaps taking advantage of existing regional clusters – will reduce costs for individual institutions and also foster the sharing of effective and innovative approaches to open research across institutions, to mutual benefit. Plus, the impact of this training and these incentives can be monitored through targeted evaluation across all participating institutions, allowing for ongoing evaluation and benchmarking….”

Champions of Transparency in Education: What Journal Reviewers Can Do to Encourage Open Science Practices

Abstract:  As the field of education and especially gifted education gradually moves towards open science, our research community increasingly values the transparency and openness brought by open science practices. Yet, individual researchers may be reluctant to adopt open science practices due to low incentives, the barrier of extra workload, or a lack of support to apply these in certain areas, such as qualitative research. We encourage and give guidelines to reviewers to champion open science practices by warmly influencing authors to consider applying open science practices to quantitative, qualitative, and mixed methods research and by providing ample support to produce higher-quality publications. Instead of imposing open science practices on authors, we advocate that reviewers suggest small, non-threatening, specific steps to support authors without making them feel overwhelmed, judged, or punished. We believe that these small steps taken by reviewers will make a difference in creating a more supportive environment for researchers to adopt better practices.

‘Stop Congratulating Colleagues for Publishing in High-Impact Factor Journals’ – The Wire Science

“The current scholarly publishing system is detrimental to the pursuit of knowledge and needs a radical shift. Publishers have already anticipated new trends and have tried to protect their profits.
Current publishers’ power stems from the historical roots of their journals – and researchers are looking for symbolic status in the eye of their peers by publishing in renowned journals.
To counter them effectively, we need to identify obstacles that researchers themselves might face. Journals still perform some useful tasks and it requires effort to devise working alternatives.
There have already been many attempts and partial successes to drive a new shift in scholarly publishing. Many of them should be further developed and generalised.
In this excerpt from a report prepared by the Basic Research Community for Physics, the authors discuss these successes and make recommendations to different actors….”

Open Scholarship Priorities and Next Steps: Public Workshop Registration, Mon, Dec 5, 2022 at 8:30 AM | Eventbrite

“The National Academies of Sciences, Engineering, and Medicine’s Roundtable on Aligning Incentives for Open Scholarship will organize a one-day public workshop in conjunction with its Fall 2022 meeting. The workshop will explore actions being taken by various stakeholder organizations to foster the broad adoption of policies and practices in support of open scholarship. It will bring together participants from universities, scholarly societies, federal agencies, and private research funders. A Proceedings of a Workshop–in Brief will be prepared by designated rapporteurs and distributed broadly.

The public is invited to register to join virtually….”

Motivations, concerns and selection biases when posting preprints: A survey of bioRxiv authors | PLOS ONE

Abstract:  Since 2013, the usage of preprints as a means of sharing research in biology has rapidly grown, in particular via the preprint server bioRxiv. Recent studies have found that journal articles that were previously posted to bioRxiv received a higher number of citations or mentions/shares on other online platforms compared to articles in the same journals that were not posted. However, the exact causal mechanism for this effect has not been established, and it may in part be related to authors’ biases in the selection of articles that are chosen to be posted as preprints. We aimed to investigate this mechanism by conducting a mixed-methods survey of 1,444 authors of bioRxiv preprints, to investigate the reasons that they post or do not post certain articles as preprints, and to make comparisons between articles they choose to post and not post as preprints. We find that authors are most strongly motivated to post preprints to increase awareness of their work and increase the speed of its dissemination; conversely, the strongest reasons for not posting preprints centre around a lack of awareness of preprints and reluctance to publicly post work that has not undergone a peer review process. We additionally find evidence that authors do not consider quality, novelty or significance when posting or not posting research as preprints; however, authors retain an expectation that articles they post as preprints will receive more citations or be shared more widely online than articles not posted.