Quantitative research assessment: using metrics against gamed metrics | Internal and Emergency Medicine

Abstract:  Quantitative bibliometric indicators are widely used and widely misused for research assessments. Some metrics have acquired major importance in shaping and rewarding the careers of millions of scientists. Given their perceived prestige, they may be widely gamed in the current “publish or perish” or “get cited or perish” environment. This review examines several gaming practices, including authorship-based, citation-based, editorial-based, and journal-based gaming as well as gaming with outright fabrication. Different patterns are discussed, including massive authorship of papers without meriting credit (gift authorship), team work with over-attribution of authorship to too many people (salami slicing of credit), massive self-citations, citation farms, H-index gaming, journalistic (editorial) nepotism, journal impact factor gaming, paper mills and spurious content papers, and spurious massive publications for studies with demanding designs. For all of those gaming practices, quantitative metrics and analyses may be able to help in their detection and in placing them into perspective. A portfolio of quantitative metrics may also include indicators of best research practices (e.g., data sharing, code sharing, protocol registration, and replications) and poor research practices (e.g., signs of image manipulation). Rigorous, reproducible, transparent quantitative metrics that also inform about gaming may strengthen the legacy and practices of quantitative appraisals of scientific work.
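
To make one of these quantitative checks concrete, here is a minimal sketch (our own illustration, not code from the review) of two indicators such a metrics portfolio might compute from hypothetical citation counts: the h-index, and a crude self-citation share that could help flag possible citation gaming when unusually high.

    # Illustrative only: toy citation data, not figures from the review.
    def h_index(citations):
        """h-index: the largest h such that h papers each have >= h citations."""
        ranked = sorted(citations, reverse=True)
        return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

    def self_citation_share(total_citations, self_citations):
        """Fraction of a researcher's citations that come from their own papers."""
        return self_citations / total_citations if total_citations else 0.0

    papers = [45, 30, 12, 9, 7, 4, 2, 1]    # hypothetical per-paper citation counts
    print(h_index(papers))                   # -> 5
    print(self_citation_share(110, 38))      # -> ~0.35; high shares merit scrutiny

Any threshold for "unusually high" is a judgment call that varies by field; as the review argues, such numbers should inform scrutiny rather than decide it.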

 

The Responsible Research(er) Recruitment Checklist: A best practice guide for applying principles of responsible research assessment in researcher recruitment materials

Abstract:  Assessment of potential academic staff members is necessary for making recruitment decisions. Amidst growing concern over the use of inappropriate quantitative indicators for research and researcher evaluation, institutions have begun to reform their policies to emphasise broader, responsible researcher assessment. To help implement such reforms, here we share a best-practice Responsible Research(er) Recruitment Checklist for engaging with the principles of responsible research assessment when writing recruitment materials such as job adverts for research and academic roles. Aligned with the San Francisco Declaration on Research Assessment (DORA) principles, the checklist provides guidance on how to emphasise the primacy of research content and researcher contributions to published articles, without reliance on journal-based metrics. The checklist also recommends that evaluations consider a broad range of research outputs, and that collaboration, citizenship, author contributions, and Open Research practices be recognised. At the time of writing, the checklist is being piloted.

Results of PLOS experiments to increase sharing and discovery of research data – The Official PLOS Blog

“For PLOS, increasing data-sharing rates—and especially increasing the amount of data shared in a repository—is a high priority. 

Research data is a vital part of the scientific record, essential to both understanding and reproducing published research. And data repositories are the most effective and impactful way to share research data. Not only is deposited data safer and more discoverable, but articles with data in a repository also have a 25% higher citation rate on average.

With support from the Wellcome Trust, we’ve been experimenting with two solutions designed to increase awareness about data repositories and promote data repository use among both authors and readers. One solution didn’t achieve its expected outcome in the context in which we tested it (a “negative” result), while the other shows promise as a tool for increasing engagement with deposited data. The mixed outcomes are an example of why it’s so important to share all research results regardless of their outcome – whether “positive” or “negative”. We hope that our experiences, what we’ve learned, and above all the data and results can help the scholarly communications community to develop new and better solutions to meet the challenges we all face, and advance Open Science.

Read on for a quick summary of the studies we conducted. Or get the full details from our new preprint on Figshare, and explore the data for yourself….”

Incentivising best practice in research data sharing: Experiments to increase use of and engagement with data repositories

Abstract:  Improving the uptake of repositories to share research data is an aim of many publishers, funders and infrastructure providers. Even at the publisher PLOS, which has a mandatory data sharing policy, repositories are still used less commonly than Supporting Information to share data. This preprint presents the results of two experiments testing solutions that aimed to increase the use of repositories for data sharing and to increase engagement with shared data. The experiments—integration of the Dryad repository into the manuscript submission system at PLOS Pathogens, and implementation of an Accessible Data icon to signal data shared in a repository on published articles across the PLOS journal portfolio—were designed as interventions that required minimal extra effort from authors (researchers). We collected usage data on these solutions as well as survey (n=654 and n=4,898) and interview (n=12) data from submitting authors. The results show that author uptake of the integrated repository (used by ~2% of submissions) was lower than expected, in part due to lack of awareness despite various communication methods being used. Integration of data repositories into the journal submission process, in the context in which we tested it, may not increase use of repositories without additional visibility or policy incentives. Our survey results suggest the Accessible Data icon did have some effect on author behaviour, although not in the expected way: it influenced repository choice for authors who had already planned to use a repository, rather than influencing the choice of sharing method. Furthermore, the Accessible Data icon was successful in increasing engagement with shared data, as measured by an increase in average monthly views of datasets linked to a cohort of 543 published articles that displayed it, from 2.5 to 3.0 (an increase of 20%), comparing the 12-month periods on either side of the icon’s introduction. The results of these two experiments provide valuable insights to publishers and other stakeholders about strategies for increasing the use of repositories for sharing research data.
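
As a quick arithmetic check of the headline effect size, the snippet below (ours, using only the figures quoted in the abstract) reproduces the reported 20% increase in average monthly dataset views.

    # Figures taken directly from the abstract above.
    before, after = 2.5, 3.0                 # average monthly views per dataset
    increase = (after - before) / before
    print(f"{increase:.0%}")                 # -> 20%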

 

How to make research reproducible: psychology protocol gives 86% success rate

“In a bid to restore its reputation, experimental psychology has now brought its A game to the laboratory. A group of heavy-hitters in the field spent five years working on new research projects under the most rigorous and careful experimental conditions possible and getting each other’s labs to try to reproduce the findings.

Published today in Nature Human Behaviour, the results show that the original findings could be replicated 86% of the time — significantly better than the 50% success rate reported by some systematic replication efforts….”

Open Access Best Practices and Licensing – Sridhar Gutam

“Within scholarly communication, open licensing plays a pivotal role in making work openly accessible while preserving rights and control. Open licenses facilitate dissemination, collaboration, and knowledge exchange by offering clarity and reducing access barriers. They promote transparency and can be applied to various research outputs, seamlessly aligning with OA principles. Open licensing extends permissions beyond default copyright law, granting creators the ability to define how others can access, engage with, share, and build upon their work. Creative Commons licenses exemplify this approach….”

Adopting Good Practices in Asian Repositories: A Community Conversation

“This 2-hour webinar will include presentations about two COAR recommendations – the COAR Community Framework for Good Practices in Repositories and COAR’s good practice advice on managing multilingual and non-English language content in repositories – and will be followed by an interactive discussion with attendees.”

Using Altmetric Data Responsibly: A Guide to Interpretation and Good Practice

Abstract:  This guide focuses specifically on data from the data provider and company Altmetric, though other types of altmetrics are mentioned and occasionally used for comparison, such as the Open Syllabus database, which traces educational engagement with scholarly outputs. The guide opens with an introduction, followed by an overview of Altmetric and the Altmetric Attention Score, Altmetrics and Responsible Research Assessment, Output Types Tracked by Altmetric, and the Altmetric Sources of Attention, which include: News and Mainstream Media; Social Media (X (formerly Twitter), Facebook, Reddit, and historical data from Google+, Pinterest, LinkedIn, and Sina Weibo); Patents; Peer Review; Syllabi (historical data only); Multimedia; Public Policy Documents; Wikipedia; Research Highlights; Reference Managers; and Blogs. It closes with a conclusion, a list of related resources and readings, two appendices, and references. The guide is intended for librarians, practitioners, funders, and other users of Altmetric data, as well as those interested in incorporating altmetrics into their bibliometric practice and/or research analytics. It can also help researchers preparing for annual evaluations and promotion and tenure reviews, who can use the data in informed and practical ways. Finally, it can serve as a useful reference for research managers and university administrators who want to understand broader online engagement with research publications, beyond traditional citation-based bibliometrics, while avoiding misusing, misinterpreting, or abusing Altmetric data when making decisions, creating policies, and evaluating faculty members and researchers at their institutions.
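
For readers who want to work with this data programmatically, here is a minimal sketch of fetching the Altmetric Attention Score for a single DOI via Altmetric’s public v1 API. The endpoint and the "score" field are as commonly documented; the DOI below is a placeholder, and current Altmetric documentation, rate limits, and terms of use should be checked before relying on this in practice.

    # Minimal sketch; endpoint and field names per Altmetric's public v1 API.
    import requests

    def attention_score(doi):
        resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
        if resp.status_code == 404:
            return None                      # Altmetric has tracked no attention for this DOI
        resp.raise_for_status()
        return resp.json().get("score")      # the Altmetric Attention Score

    print(attention_score("10.1000/example"))   # placeholder DOI, for illustration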

The Research Data Management Workbook

“The Research Data Management Workbook is a collection of exercises for researchers to improve their data management. The Workbook contains exercises across the data lifecycle, though the range of activities is not comprehensive. Instead, exercises focus on discrete practices within data management that are structured and can be reproduced by any researcher.

The book is divided into chapters, loosely by phases of the data lifecycle, with one or more exercises in each chapter. Every exercise comes with a description of its value within data management, instructions on how to do the exercise, original source of the exercise (when applicable), and the exercise itself.

The Workbook is intended as a supplement to existing data management education. If you would like to learn more about the principles of data management, please see the article “Foundational Practices of Research Data Management” (K. Briney et al., 2020) or read the book “Data Management for Researchers” (K. A. Briney, 2015). …”

Journal Production Guidance for Software and Data Citations | Scientific Data

Abstract:  Software and data citation are emerging best practices in scholarly communication. This article provides structured guidance to the academic publishing community on how to implement software and data citation in publishing workflows. These best practices support the verifiability and reproducibility of academic and scientific results, sharing and reuse of valuable data and software tools, and attribution to the creators of the software and data. While data citation is increasingly well-established, software citation is rapidly maturing. Software is now recognized as a key research result and resource, requiring the same level of transparency, accessibility, and disclosure as data. Software and data that support academic or scientific results should be preserved and shared in scientific repositories that support these digital object types for discovery, transparency, and use by other researchers. These goals can be supported by citing these products in the Reference Section of articles and effectively associating them with the software and data preserved in scientific repositories. Publishers need to mark up these references in a specific way to enable downstream processes.
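
To illustrate what such reference markup can look like, the sketch below builds a JATS-style data citation in Python. The tagging follows common JATS4R-style conventions with placeholder values; it is our assumption of typical practice, not a verbatim excerpt from the guidance, so consult the article and your publisher’s DTD for the authoritative tagging.

    # Sketch: a data citation tagged as JATS-style XML (placeholder values).
    import xml.etree.ElementTree as ET

    ref = ET.Element("ref", id="ref42")
    cit = ET.SubElement(ref, "element-citation", {"publication-type": "data"})
    ET.SubElement(cit, "data-title").text = "Example ocean temperature dataset"
    ET.SubElement(cit, "source").text = "Zenodo"
    ET.SubElement(cit, "year").text = "2023"
    ET.SubElement(cit, "pub-id", {"pub-id-type": "doi"}).text = "10.5281/zenodo.0000000"

    print(ET.tostring(ref, encoding="unicode"))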

Monitoring and evaluating the effectiveness of UKRI’s open access policy: Principles, opportunities and challenges | Policy Commons

Abstract:  This report sets out principles, opportunities and challenges for the development of a monitoring and evaluation framework for UK Research and Innovation’s open access (OA) policy. The recommended evaluation questions were identified through interviews and workshops with a range of external stakeholders and in-depth desk research investigating existing monitoring and evaluation activities. The report also provides an overview of stakeholder views about key considerations for monitoring and evaluating the policy including principles of best practice. The report annex sets out recommended approaches to answering the questions, including data sources, aggregation and analysis methodologies. UKRI will consider the outcomes and recommendations of this project in developing its final monitoring and evaluation framework.

REPORT: Best Practices for Institutional Publishing Service Providers – DIAMAS

“DIAMAS plans to improve Open Access publishing practices. To do so, we will create the Extensible Quality Standard for Institutional Publishing (EQSIP), which aims to ensure the quality and transparency of governance, processes, and workflows in institutional publishing. The best practices report is an initial step in this process.

The report is based on an analysis of existing quality evaluation criteria, best practices, and assessment systems in publishing developed by international publishers’ associations, research funding organisations, international indexing databases, etc. (full dataset available here). If you are an institutional publisher, a service provider involved in Open Access publishing, or a journal editor, this report can help you learn about current best practices and identify where you need to align.

Our recommendations and tips cover seven categories, which are also the core components of the Extensible Quality Standard for Institutional Publishing (EQSIP): 1) Funding; 2) Ownership and governance; 3) Open science practices; 4) Editorial quality, editorial management, and research integrity; 5) Technical service efficiency; 6) Visibility; and 7) Equity, Diversity and Inclusion (EDI).

A self-assessment checklist summarises the best practices outlined in the report. Institutional publishers, service providers, and journal editors can use it to get an idea of the future EQSIP, assess their current practices, and see where to make improvements.”

Dear Colleague Letter: Innovations in Open Science (IOS) Planning Workshops (nsf23141) | NSF – National Science Foundation

“The recent memo titled “Ensuring Free, Immediate, and Equitable Access to Federally Funded Research,” also referred to as the Nelson Memo, issued by the Office of Science and Technology Policy (OSTP), has provided policy guidance to federal agencies on public access requirements for federally funded research. The need for a better, innovative data and research infrastructure that embraces open science principles to serve the interconnected scientific communities has never been more urgent.

Through this Dear Colleague Letter (DCL), the Division of Atmospheric and Geospace Sciences (AGS) in the Directorate for Geosciences (GEO) is calling for workshop proposals focused on identifying critical needs for innovations in open science for data infrastructure that can serve the research community at a national-needs level and have the potential to significantly advance research in atmospheric and geospace sciences, ensuring that their research outputs, broadly defined, comply with the FAIR (Findable, Accessible, Interoperable, and Reusable) principles. The workshop proposals will provide the AGS community an opportunity to come together to discuss needs, best practices, and resources necessary to build a data infrastructure through which open and equitable research can be achieved….”