BishopBlog: Is Hindawi “well-positioned for revitalization?”

“Over the past year, special issues of dozens of Hindawi journals have been exposed as being systematically manipulated, resulting in the delisting of more than 20 Hindawi journals from major journal databases, as well as the retraction of more than 2,700 papers by the publisher. This “unexpected event” at Hindawi also led to a slump in profits for the parent company, John Wiley & Sons. However, in a recent statement, the president, CEO & director of Wiley, Brian Napack, stated that Hindawi was now ready for revitalization and reinstatement of the special issue program. In my opinion, Wiley has not dealt adequately with the integrity issues that led to the problem, but appears focused on growth through the medium of special issues. This raises questions as to whether Hindawi’s operation is sustainable in the long term. …”

ROSiE General Guidelines on Responsible Open Science are now available!

“Responsible Open Science has emerged as a critical framework for promoting transparency, collaboration, and ethical conduct in the rapidly evolving landscape of scientific research. As part of the ROSiE project, D5.2, titled “Strategic Policy Paper on Responsible Open Science,” addresses various crosscutting issues and challenges in Open Science (OS), research ethics, and integrity.

D5.2 aims to equip policymakers, research institutions (RPOs and RFOs), publishers, researchers, and the public with the necessary tools and knowledge to facilitate the transition towards action and practice-oriented policy methods.

D5.3 represents a significant step forward by transforming D5.2’s and the other deliverables’ recommendations into a set of guidelines. These guidelines are designed to support stakeholders in embracing Responsible Open Science practices and provide practical tools and knowledge.

Furthermore, D5.3 establishes the first-ever guidance document on Open Science in Europe, showcasing the ROSiE project’s commitment to advancing the field.

By adopting the ROSiE General Guidelines for Responsible Open Science, stakeholders across the research landscape can actively contribute to the promotion of responsible Open Science. Through this collaborative effort, we aim to foster transparency and societal impact in Europe and beyond.”

DIAMAS deliverable: D3.1 IPSP Best Practices Quality evaluation criteria, best practices, and assessment systems for Institutional Publishing Service Providers (IPSPs) | Zenodo

“This report outlines existing quality evaluation criteria, best practices, and assessment systems for IPSPs developed by international associations, RPOs, governments, and international databases. It also analyses academic literature on research evaluation of IPSPs, assessment criteria and indicators. The analysis matrix includes the following categories, which will also be the core components of EQSIP: 

Funding: description of the funding model, OA business model, transparency in listing all funding sources, etc. 

Ownership and governance: legal ownership, mission, and governance.

Open science practices: OA policy, copyright and licensing, open peer review, data availability, new approaches to research assessment, etc.

Editorial quality, editorial management, and research integrity.  

Technical service efficiency: technical strength, interoperability – metadata, ISSN, PIDs, machine readability, and accessible journal website. 

Visibility, including indexation, communication, marketing and impact.

Equity, Diversity and Inclusion (EDI): multilingualism, gender equity….”

Best practices for open access publishing | EIFL

“The DIAMAS (Developing Institutional Open Access Publishing Models to Advance Scholarly Communication) project has published a best practices report highlighting quality evaluation criteria and assessment systems for Institutional Publishing Service Providers (IPSPs).  

EIFL is a partner in the DIAMAS project, which was formed to support high-quality, sustainable, open access publishing, and to develop common standards, guidelines and practices for the Diamond institutional publishing sector. Diamond Open Access refers to a scholarly publication model in which journals and platforms do not charge fees to either authors or readers. 

Iryna Kuchma, Manager of the EIFL Open Access Programme, and Milica Ševkušić, Project Coordinator for the EIFL Open Access Programme, co-authored this report, which is based on analyses of existing quality evaluation criteria, best practices and assessment systems for IPSPs developed by international associations, Research Performing Organizations, governments, and international databases. The report also analyzes academic literature on research evaluation of IPSPs, assessment criteria and indicators.

The recommendations and tips cover seven categories, which are also the core components of the Extensible Quality Standard for Institutional Publishing (EQSIP). Also included in the report is a self-assessment checklist for IPSPs which you can use to see how your publishing practices measure up….”

Reproducibility and Research Integrity – Science, Innovation and Technology Committee

“The United Kingdom is experiencing the largest-ever increase in public investment in research and development, with the Government R&D budget set to reach £20 billion a year by 2024/5. The creation of the new Department for Science, Innovation and Technology has been advanced by the Government as heralding an increased focus on research and innovation—seen to be among Britain’s main strengths.

At the same time, there have been increasing concerns raised that the integrity of some scientific research is questionable because of failures to be able to reproduce the claimed findings of some experiments or analyses of data and therefore confirm that the original researcher’s conclusions were justified. Some people have described this as a ‘reproducibility crisis’.

In 2018, our predecessor committee published a report ‘Research Integrity’. Some of the recommendations of that report were implemented—such as the establishment of a national research integrity committee.

This report looks in particular at the issue of the reproducibility of research….

We welcome UKRI’s policy of requiring open access to research that it funds, but we recommend that this should go further in requiring the recipients of research grants to share data and code alongside the publications arising from the funded research….”

Webinar – Scholarly Communication in Crisis: Research Integrity and Open Scholarship – OASPA

“Research integrity and ethical standards for publication underpin the research endeavor, ensuring that researchers can confidently build on the outputs of others and ensuring public trust in research. The integrity of scholarly communications is dependent on the trust of the research community in the peer review and publication processes that are part of it. However, this confidence is starting to break down, due to a significant rise in unethical research and publication practices, fueled by academic incentive structures heavily skewed toward certain types of publication metrics. These practices, including “paper mills” and “peer review rings” are happening at scale and systemically undermine publication processes, striking at the heart of what publishers do as the custodians of the research record. 

This problem is a complex and interconnected one and this webinar brings together experts to explore the question of whether open scholarship practices and tools can help detect malpractices and be part of the solution. The speakers will approach this shared problem from a variety of angles, albeit all through the lens of open research and scholarship and how they are building open tools and evidencing the impact of their work.   

We welcome panelists Adam Day, Brian Nosek and Dorothy Bishop, and Chair Catriona MacCallum….”

The Cape Town Statement on fairness, equity and diversity in research

“Even the push towards openness and transparency in science publishing — which many have argued is a way to foster greater integrity in research — has created more barriers for investigators in low-resource environments.

Sharing data, for example, requires having enough institutional infrastructure and resources to first curate, manage, store and (in the case of data relating to people) encrypt the data — and to deal with requests to access them. Also, the pressure placed on researchers of LMICs by high-income-country funders to share their data as quickly as possible frequently relegates them to the role of data collectors for better-resourced teams. With enough time, all sorts of locally relevant questions that were not part of the original project could be investigated by local researchers. But, well-resourced investigators in high-income countries — who were not part of the original project — are often better placed to conduct secondary analyses.

Unforeseen difficulties are arising around publishing, too. Currently, the costs to publish an article in gold open-access journals (which typically range from US$500–$3,000) are prohibitive for most researchers and institutions in LMICs. The University of Cape Town, for example, which produces around 3,300 articles each year, has an annual budget of $180,000 for article-processing costs. This covers only about 120 articles per year.

Because of this, researchers in these countries frequently publish their papers in subscription-based journals. But scientists working in similar contexts can’t access such journals because the libraries in their institutions are unable to finance subscriptions to a wide range of journals. All this makes it even harder for researchers to build on locally relevant science….”
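A quick back-of-the-envelope check of the figures quoted above. The article count, budget, and number of covered articles are taken from the excerpt; the implied average article-processing charge is my own arithmetic, shown here as an illustrative sketch:

```python
# Figures quoted in the Cape Town Statement excerpt above.
annual_articles = 3300        # articles the University of Cape Town produces per year
apc_budget_usd = 180_000      # annual budget for article-processing costs
covered_articles = 120        # articles that budget reportedly covers

# Derived values (my arithmetic, not stated in the excerpt):
implied_avg_apc = apc_budget_usd / covered_articles   # average APC the budget assumes
coverage_rate = covered_articles / annual_articles    # share of yearly output covered

print(f"Implied average APC: ${implied_avg_apc:,.0f}")   # falls within the quoted $500–$3,000 range
print(f"Share of annual output covered: {coverage_rate:.1%}")
```

The implied average of $1,500 per article sits squarely in the $500–$3,000 range the excerpt quotes, and the budget covers under 4% of the university's annual output, which is the gap the statement is pointing at.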

 

The Cape Town Statement on Fostering Research Integrity through Fairness and Equity advocates for fair practice from conception to implementation of research and provides 20 recommendations aimed at all involved stakeholders.

“The 7th World Conference on Research Integrity (7thWCRI) was held in Cape Town in May 2022 with the conference theme “Fostering Research Integrity in an unequal world”. Participants at this conference recognised that unfair and inequitable research practices remain prevalent at all stages of research from proposal development to funding application, data collection, analysis, sharing and access, reporting and translation. These practices can impact the integrity of research in many ways, including skewing research priorities and agendas with research questions that are irrelevant for local needs, power imbalances that undermine fair recognition of knowledge contributions within collaborations, including unfair acknowledgement of contributions to published work, lack of diversity and inclusivity in collaborations, and unfair data management practices that disadvantage researchers in low resource environments. Furthermore, a drive towards open science as a pillar of research integrity fails to recognise the financial burden placed on under-resourced researchers and institutions, and the reality that highly trained and well-resourced researchers in HIC may disproportionately benefit from reanalysing openly shared data by LMIC researchers. In response to these challenges the following statement of goals, values and recommendations aims to contribute to the growing global recognition that fairness and equity are essential requirements of integrity in all research contexts.

This statement advocates for fair practice from conception to implementation of research and provides 20 recommendations aimed at all involved stakeholders. These recommendations are grouped under values that were identified as important underpinning considerations in discussion groups at the 7th WCRI. These values include diversity, inclusivity, mutual respect, shared accountability, indigenous knowledge recognition and epistemic justice (ensuring that the value of knowledge is not based on biases related to gender, race, ethnicity, culture, socio-economic status etcetera)….”

Why research integrity matters and how it can be improved

Scholars need to be able to trust each other, because otherwise they cannot collaborate and use each other’s findings. Similarly, trust is essential for research to be applied for the benefit of individuals, society, or the natural environment. This trustworthiness is threatened when researchers engage in questionable research practices or worse. By adopting open science practices, research becomes transparent and accountable; only then is it possible to verify whether trust in research findings is justified. The magnitude of the issue is substantial, with a prevalence of four percent for both fabrication and falsification, and more than 50% for questionable research practices. This implies that researchers regularly engage in behaviors that harm the validity and trustworthiness of their work. What is good for the quality and reliability of research is not always good for a scholarly career. How researchers navigate this dilemma depends on how virtuous the researcher at issue is, but also on the local research climate and the perverse incentives built into the way the research system functions. Research institutes, funding agencies and scholarly journals can do a lot to foster research integrity, first and foremost by improving the quality of peer review and reforming researcher assessment.

Guest Post – Enabling Trustable, Transparent, and Efficient Submission and Review in an Era of Digital Transformation – The Scholarly Kitchen

“As the Open Science movement produces increasingly complex scientific analyses and rich research outputs that include not only articles but also data, models, physical samples, software, media, and more, those outputs also need to meet the FAIR criteria (findable, accessible, interoperable, and reusable). Developing shared storehouses for data, submissions, and images — a direction that STM publishers are heading in — could be key to making AI tools better trained, and thus more useful, allowing detection of integrity issues such as duplication and image manipulation across, as well as within, publications….”

TIER2

“Enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility…

TIER2 aims to boost knowledge on reproducibility, create tools, engage communities, implement interventions and policy across different contexts to increase re-use and overall quality of research results….”

OSTP Releases Framework for Strengthening Federal Scientific Integrity Policies and Practices | OSTP | The White House

“Today, the White House Office of Science and Technology Policy (OSTP) released A Framework for Federal Scientific Integrity Policy and Practice, a roadmap that will help strengthen scientific integrity policies and practices across the federal government.

This framework builds on the assessment of federal scientific integrity policies and practices described in the January 2022 report, Protecting the Integrity of Government Science, and draws from extensive input from federal agencies, as well as from across sectors, including academia, the scientific community, public interest groups, and industry. It has several key components that federal departments and agencies will use to improve scientific integrity policies and practices, including:

A consistent definition of scientific integrity for all federal agencies

A model scientific integrity policy to guide agencies as they build and update their policies

A set of tools to help agencies regularly assess and improve their policies and practices…”

Open issues for education in radiological research: data integrity, study reproducibility, peer-review, levels of evidence, and cross-fertilization with data scientists | SpringerLink

Abstract:  We are currently facing extraordinary changes. A harder and harder competition in the field of science is open in each country as well as in continents and worldwide. In this context, what should we teach to young students and doctors? There is a need to look backward and return to “fundamentals”, i.e. the deep characteristics that must characterize the research in every field, even in radiology. In this article, we focus on data integrity (including the “declarations” given by the authors who submit a manuscript), reproducibility of study results, and the peer-review process. In addition, we highlight the need of raising the level of evidence of radiological research from the estimation of diagnostic performance to that of diagnostic impact, therapeutic impact, patient outcome, and social impact. Finally, on the emerging topic of radiomics and artificial intelligence, the recommendation is to aim for cross-fertilization with data scientists, possibly involving them in the clinical departments.


Research Integrity and Reproducibility are Two Aspects of the Same Underlying Issue – A Report from STM Week 2022 – The Scholarly Kitchen

“Imagine if the integrity of the publishing process didn’t rely purely on publishers’ ability to detect fraud, malpractice, or mistakes based on the limited information available in a submitted manuscript. Instead, what if this responsibility were spread throughout the ecosystem, from funder grant management system, to data management plan, to data center, to lab notebook, to preprint, to published version of record, making use of trusted assertions to build an open, verifiable research environment that also leverages transparency so that publishers, funders, institutions, and other researchers could all trace findings and claims back through the whole research process? 

The vision I laid out above may sound utopian, but much of the technology and tools required already exist. As well as the TREs [Trusted Research Environments], which can be seen as a model for traceability, and ORCID trust markers, which illustrate how the same thing can be done securely in the open, initiatives like Center for Open Science, and Octopus show how a range of outputs and activities can be used to document the entire research process.

The problem is not technology, it’s a wicked mix of perverse incentives, network effects, business model inertia, and sustainability challenges that lock us all into the same restrictive ideas about what constitutes a research publication, and what counts for prestige and career advancement. To address the range of challenges from poor research practice to industrial-scale fraud by paper mills, we need a whole-sector approach that involves funders, institutional research management and libraries, researchers, and publishers. As fellow Chef Alice Meadows and I wrote in a previous post, it really does take a village, and cross-sector collaboration is vital to building the interoperable research information infrastructure needed to connect the people, places, and things of the scholarly ecosystem in a way that is verifiable and trusted.”