Diversity, sustainability and quality must be the hallmarks of academic publishing in Europe – The Guild

“Ahead of the June Competitiveness Council, where the ministers will be invited to adopt conclusions on research assessment and implementation of Open Science policies, The Guild urges the member states to ensure that Open Access serves science, not publishers.

While research excellence requires free flow of knowledge, some Open Access strategies and models increase the financial burden on research institutions. Article Processing Charges (APCs), used by some of the Open Access journals, exacerbate the unsustainable situation of journal spending in university libraries and create unequal access to knowledge. Greater transparency on the publication costs for Open Access journals, and fair and transparent contractual arrangements with publishers are crucial for monitoring the proper use of public research funding.

It is important to develop alternative and sustainable non-APC Open Access models. The Guild calls for the member states to support the development and uptake of Diamond Open Access journals and platforms, which often consist of community-driven, academic-led and academic-owned publishing initiatives. Unlike other Open Access models, Diamond Open Access journals and platforms do not charge any fees to authors or readers. Thus, they can further empower researchers to disseminate their research results, ensuring bibliodiversity and vital academic publishing….”

The potential butterfly effect of preregistered peer-reviewed research – The Official PLOS Blog

“Refocusing journal peer review on the study design phase exerts more and greater downstream changes. Peer review that focuses on evaluating the significance of the research question, the methods, and the analytical approach before work begins has the power to shape stronger, more rigorous and more creative research. Making an editorial decision while results are still unknown minimizes the potential impact of confirmation bias and impact bias, taking science communication back to its roots, with an emphasis on quality, rigor, and pure intellectual curiosity. As Kiermer explains, “Preregistration and peer review of the study protocol with a journal is a way to tackle publication bias. As long as the protocol is followed, or any deviations explained, it’s a guarantee for the author that the results will be published, even if they don’t confirm their hypothesis.”

In combination, all of these factors contribute to a more complete and efficient scientific record, replete with studies exploring important hypotheses, performed to the very highest technical standards, and free from the distorting influence of impact-chasing, ego, and bias. A scientific record that is both demonstrably trustworthy, and widely trusted. And with that, there is no telling where science might go, or how quickly….”

Why preprints are good for patients | Nature Medicine

“Rapid communication of clinical trial results has likely saved lives during the COVID-19 pandemic and should become the new norm….

But during health emergencies, there are many tensions, one of which is the mismatch between the urgent need for information and evidence and the much longer time frames of scientific peer review and publication. The COVID-19 pandemic is the first global health emergency of the new information age, with data and results widely shared via social media. This has resulted in very real difficulties in distinguishing important information from noise, and real news from fake news. How should the research and medical community best manage this new reality?…

Some may argue that the speed advantage of preprints does not outweigh the risks of poor-quality, misleading or even fraudulent research being published and acted upon. I would counter that clinicians should not rely solely on peer review to assess the validity and meaningfulness of research findings. This is because dubious, perhaps fraudulent data can still get through peer review, as was seen with early COVID papers published and then retracted from two of the most prestigious medical journals. In addition, even valid data can be misleading. There has been an avalanche of observational data that passed peer review and was then used to justify treatments, most notably with hydroxychloroquine, but the susceptibility of observational methodology to moderate biases means that such data should not be the basis of patient care.

I take two lessons from our experience running the largest COVID-19 clinical trial over the last two years. The first is that the preprint system has come of age, demonstrating huge value in rapidly communicating important research findings. Almost daily I am alerted through social media by trusted sources and colleagues to important new findings published as preprints. A degree of immediate peer review is also available by means of the preprint comments section and from colleagues via social media. The full peer-reviewed manuscripts usually appear many weeks or even months later. I cannot envisage a future without such rapid dissemination of new evidence.

Given this new reality, the second lesson is that we must ensure that the medical community and policy makers are sufficiently skilled in critical thinking and scientific methods that they can make sensible decisions, regardless of whether an article is peer reviewed or not.”

Questionable research practices among researchers in the most research-productive management programs – Kepes – Journal of Organizational Behavior – Wiley Online Library

Abstract:  Questionable research practices (QRPs) among researchers have been a source of concern in many fields of study. QRPs are often used to enhance the probability of achieving statistical significance, which affects the likelihood of a paper being published. Using a sample of researchers from 10 top research-productive management programs, we compared hypotheses tested in dissertations to those tested in journal articles derived from those dissertations to draw inferences concerning the extent of engagement in QRPs. Results indicated that QRPs related to changes in sample size and covariates were associated with unsupported dissertation hypotheses becoming supported in journal articles. Researchers also tended to exclude unsupported dissertation hypotheses from journal articles. Likewise, results suggested that many article hypotheses may have been created after the results were known (i.e., HARKed). Articles from prestigious journals contained a higher percentage of potentially HARKed hypotheses than those from less well-regarded journals. Finally, articles published in prestigious journals were associated with more QRP usage than those in less prestigious journals. QRPs increase the percentage of supported hypotheses and result in effect sizes that likely overestimate population parameters. As such, results reported in articles published in our most prestigious journals may be less credible than previously believed.
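The abstract's core mechanism, that QRPs inflate the share of supported hypotheses, is easy to see in a minimal simulation (not from the paper; all settings here are illustrative). One sample-size QRP is optional stopping: testing repeatedly as data accumulate and stopping at the first significant result, which pushes the false-positive rate well above the nominal 5% even when the null hypothesis is true:

```python
import math
import random

random.seed(42)

def study_significant(max_n, looks):
    """One simulated study under a true null (observations ~ N(0, 1)).

    Run a two-sided z-test (known sd = 1) at each sample size in `looks`;
    report 'significant' if any look reaches p < .05, i.e. |z| > 1.96.
    """
    data = []
    for n in range(1, max_n + 1):
        data.append(random.gauss(0.0, 1.0))
        if n in looks:
            # z = mean / (sd / sqrt(n)) with sd = 1, which simplifies to:
            z = sum(data) / math.sqrt(n)
            if abs(z) > 1.96:
                return True
    return False

TRIALS = 4000
# QRP version: peek at the data five times and stop at the first "hit".
peek = sum(study_significant(50, {10, 20, 30, 40, 50}) for _ in range(TRIALS)) / TRIALS
# Honest version: a single test at the preplanned sample size.
fixed = sum(study_significant(50, {50}) for _ in range(TRIALS)) / TRIALS

print(f"false-positive rate, fixed n: {fixed:.3f}; with peeking: {peek:.3f}")
```

Under these settings the fixed-n false-positive rate lands near the nominal 0.05, while the peeking rate is typically two to three times higher, even though no individual test was miscomputed.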

» How College Students Are Improving Wikipedia

“Some of that information has been added by college students from New England, written as a class assignment. Wiki Education, a small nonprofit, runs a program called the Wikipedia Student Program, in which we support college and university faculty who want to assign their students to write Wikipedia articles as part of their coursework.

Why do instructors assign their students to edit Wikipedia as a course assignment? Research shows a Wikipedia assignment increases motivation for students while providing them with learning objectives like critical thinking, research, writing for a public audience, evaluating and synthesizing sources, and peer review. Especially important in today’s climate of misinformation and disinformation are the critical digital media literacy skills students gain from writing for Wikipedia, where they’re asked to consider and evaluate the reliability of the sources they’re citing. In addition to the benefits to student learning outcomes, instructors are also glad to see Wikipedia’s coverage of their discipline get better. And it does get better; studies such as this and this and this have shown the quality of content students add to Wikipedia is high.

Since 2010, more than 5,100 courses have participated in the program and more than 102,000 student editors have added more than 85 million words to Wikipedia. That’s 292,000 printed pages or the equivalent of 62 volumes of a printed encyclopedia. To put that in context, the last print edition of Encyclopedia Britannica had only 32 volumes. That means Wikipedia Student Program participants have added nearly twice as much content as was in Britannica. …”
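As a quick sanity check on the figures quoted above (a back-of-envelope sketch using only the roundup's own numbers), the volume arithmetic does support the "nearly twice as much as Britannica" claim:

```python
# Figures quoted in the excerpt above.
words_added = 85_000_000    # words added by student editors since 2010
printed_pages = 292_000     # claimed printed-page equivalent
student_volumes = 62        # claimed encyclopedia-volume equivalent
britannica_volumes = 32     # volumes in the last print edition of Britannica

words_per_page = words_added / printed_pages   # implied words per printed page
ratio = student_volumes / britannica_volumes   # basis of the "nearly twice" claim

print(f"{words_per_page:.0f} words/page; {ratio:.2f}x Britannica")
```

The implied ~290 words per printed page is a plausible conversion, and 62 / 32 comes out just under 2, consistent with "nearly twice".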

Rethinking Research Assessment for the Greater Good: Findings from the RPT Project – Scholarly Communications Lab | ScholCommLab

“The review, promotion, and tenure (RPT) process is central to academic life and workplace advancement. It influences where faculty direct their attention, research, and publications. By unveiling the RPT process, we can inform actions that lead towards a greater opening of research. 

Between 2017 and 2022, we conducted a multi-year research project involving the collection and analysis of more than 850 RPT guidelines and 338 surveys with scholars from 129 research institutions across Canada and the US. Starting with a literature review of academic promotion and tenure processes, we launched six studies applying mixed methods approaches such as surveys and matrix coding.

So how do today’s universities and colleges incentivize open access research? Read on for 6 key takeaways from our studies….”

The rise of preprints — University Affairs

“Peer review, despite its flaws, is one of the most important pillars of the scientific process. So preprint servers, which make scientific papers that have yet to be reviewed or published available online, have been slow to catch on in many fields.

But then came the pandemic.

“COVID changed everything,” says Jim Handman, executive director of the Science Media Centre of Canada. Scientists, science communicators, and journalists who had been wary of using preprints in the past suddenly felt the urgency to get important new information out as fast as possible to help deal with the unprecedented public health threat. The use of preprint servers skyrocketed. Now, everyone is adapting to this new way of working, developing best practices to harness the benefits of increased speed and wider reach while mitigating the risks of sharing unreviewed science.

Most of the time, the world of scholarly publishing moves at an almost glacial pace. New publications can take months or even years to wind their way through the process of peer review and publication. Even then, they can be hard to access for most people. So 30 years ago, some scientists started posting their work in online repositories before it had been formally reviewed and published. ArXiv, which shares research on math, physics, and astronomy, was the first to launch in 1991. It was followed by repositories for other subject areas over the next few decades….”

What do participants think of our research practices? An examination of behavioural psychology participants’ preferences | Royal Society Open Science

Abstract:  What research practices should be considered acceptable? Historically, scientists have set the standards for what constitutes acceptable research practices. However, there is value in considering non-scientists’ perspectives, including research participants’. A total of 1,873 participants from MTurk and university subject pools were surveyed after their participation in one of eight minimal-risk studies. We asked participants how they would feel if (mostly) common research practices were applied to their data: p-hacking/cherry-picking results, selective reporting of studies, Hypothesizing After Results are Known (HARKing), committing fraud, conducting direct replications, sharing data, sharing methods, and open access publishing. An overwhelming majority of psychology research participants think questionable research practices (e.g. p-hacking, HARKing) are unacceptable (68.3–81.3%), and were supportive of practices to increase transparency and replicability (71.4–80.1%). A surprising number of participants expressed positive or neutral views toward scientific fraud (18.7%), raising concerns about data quality. We grapple with this concern and interpret our results in light of the limitations of our study. Despite the ambiguity in our results, we argue that there is evidence (from our study and others’) that researchers may be violating participants’ expectations and should be transparent with participants about how their data will be used.

The methodological quality of physical therapy related trial… : American Journal of Physical Medicine & Rehabilitation

Abstract:  Objective 

We aimed to compare the methodological quality of physical therapy-related trials published in open access with that of trials published in subscription-based journals, adjusting for subdiscipline, intervention type, endorsement of the consolidated standards of reporting trials (CONSORT), impact factor, and publication language.

Design 

In this meta-epidemiological study, we searched the Physiotherapy Evidence Database (PEDro) on May 8, 2021, to include any physical therapy-related trials published from January 1, 2020. We extracted variables such as CONSORT endorsement, the PEDro score, and publication type. We compared the PEDro score between the publication types using a multivariable generalized estimating equation (GEE) by adjusting for covariates.

Results 

A total of 2,743 trials were included, with a mean total PEDro score (SD) of 5.8 (±1.5). Trials from open access journals had a lower total PEDro score than those from subscription-based journals (5.5 ± 1.5 vs. 5.9 ± 1.5, mean difference [MD]: −0.4; 95% confidence interval: 0.3–0.5). GEE revealed that open access publication was significantly associated with the total PEDro score (MD: −0.42; P < 0.001).

Conclusions 

In the recent physical therapy-related trials, open access publications demonstrated lower methodological quality than subscription-based publications, although with a small difference.

Opening Up to Open Science

“This way of sharing science has some benefits: peer review, for example, helps to ensure (even if it never guarantees) scientific integrity and prevent inadvertent misuse of data or code. But the status quo also comes with clear costs: it creates barriers (in the form of publication paywalls), slows the pace of innovation, and limits the impact of research. Fast science is increasingly necessary, and with good reason. Technology has not only improved the speed at which science is carried out, but many of the problems scientists study, from climate change to COVID-19, demand urgency. Whether modeling the behavior of wildfires or developing a vaccine, the need for scientists to work together and share knowledge has never been greater. In this environment, the rapid dissemination of knowledge is critical; closed, siloed knowledge slows progress to a degree society cannot afford. Imagine the consequences today if, as in the 2003 SARS disease outbreak, the task of sequencing genomes still took months and tools for labs to share the results openly online didn’t exist. Today’s challenges require scientists to adapt and better recognize, facilitate, and reward collaboration….

This tension between individual and institutional incentives and the progress of science must be recognized and resolved in a manner that contributes to solving the great challenges of today and the future. To change the culture, researchers must do more than take a pledge; they must change the game—the structures, the policies, and the criteria for success. In a word, open science must be institutionalized….

A powerful open science story can be found in the World Climate Research Programme’s Coupled Model Intercomparison Project (CMIP), established in 1995. Before CMIP, with the internet in its infancy, climate model results were scattered around the world and difficult to access and use. CMIP inspired 40 modeling groups and about 1,000 researchers to collaborate on advancing modeling techniques and setting guidelines for how and where to share results openly. That simple step led to an unexpected transformation: as more people were able to access the data, the community expanded, and more groups contributed data to CMIP. More people asking questions and pointing out issues in their results helped drive improvements. In its assessment reports, the Intergovernmental Panel on Climate Change relied on research publications using CMIP data to assess climate change. As a platform, CMIP enabled thousands of scientists to work together, self-correct their work, and create further ways to collaborate—a virtuous circle that attracted more scientists and more data, and increased the speed and usefulness of the work….

The most important message from these reports is that all parts of science, from individual researchers to universities and funding agencies, need to coordinate their efforts to ensure that early adopters aren’t jeopardizing their careers by joining the open science community. The whole enterprise has to change to truly realize the full benefits of open science. Creating this level of institutional adoption also requires updating policies, providing training, and recognizing and rewarding collaborative science….”

Frontiers | Key Factors for Improving Rigor and Reproducibility: Guidelines, Peer Reviews, and Journal Technical Reviews | Cardiovascular Medicine

Abstract:  To respond to the NIH’s policy for rigor and reproducibility in preclinical research, many journals have implemented guidelines and checklists to guide authors in improving the rigor and reproducibility of their research. Transparency in developing detailed prospective experimental designs and providing raw data are essential premises of rigor and reproducibility. Standard peer reviews and journal-specific technical and statistical reviews are critical factors for enhancing rigor and reproducibility. This brief review also shares some experience from Arteriosclerosis, Thrombosis, and Vascular Biology, an American Heart Association journal, that has implemented several mechanisms to enhance rigor and reproducibility for preclinical research….

Digitales Publizieren und die Qualitätsfrage – AuROA

From Google’s English:  “The two-day interdisciplinary event deals with digital publishing in the humanities. Its thematic priorities are quality criteria for humanities publications in connection with, and as a result of, open access; digital and scholar-led publishing; and current problems of scholarly publishing such as reputation-building mechanisms, peer review, and data tracking.

A theoretical part addresses the sociological, epistemological, and policy questions surrounding open access in the humanities. The different disciplinary perspectives of humanities scholars are brought together around shared problems and interests. Setting quality criteria against current publication practice leads to the controversial topic of peer review.

In the practical part, the goal is the joint development of position papers on problem areas and task-oriented requirements for quality assurance in the humanities (in book format). Current examples of the implementation of academic and library-organized publishing are presented and discussed.

The third part focuses on other current problems of scientific publishing, such as data tracking….”