Quantitative research assessment: using metrics against gamed metrics | Internal and Emergency Medicine

Abstract:  Quantitative bibliometric indicators are widely used and widely misused for research assessments. Some metrics have acquired major importance in shaping and rewarding the careers of millions of scientists. Given their perceived prestige, they may be widely gamed in the current “publish or perish” or “get cited or perish” environment. This review examines several gaming practices, including authorship-based, citation-based, editorial-based, and journal-based gaming as well as gaming with outright fabrication. Different patterns are discussed, including massive authorship of papers without meriting credit (gift authorship), teamwork with over-attribution of authorship to too many people (salami slicing of credit), massive self-citations, citation farms, H-index gaming, journalistic (editorial) nepotism, journal impact factor gaming, paper mills and spurious content papers, and spurious massive publications for studies with demanding designs. For all of those gaming practices, quantitative metrics and analyses may be able to help in their detection and in placing them into perspective. A portfolio of quantitative metrics may also include indicators of best research practices (e.g., data sharing, code sharing, protocol registration, and replications) and poor research practices (e.g., signs of image manipulation). Rigorous, reproducible, transparent quantitative metrics that also inform about gaming may strengthen the legacy and practices of quantitative appraisals of scientific work.
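
By way of illustration (this sketch is not taken from the review), one of the simplest detection signals for citation gaming mentioned above is an author's self-citation share, which can be computed directly from citation records. The record layout below is hypothetical:

    # Hypothetical record layout: each paper lists its authors and, for every
    # incoming citation, the set of authors on the citing paper.
    def self_citation_share(papers, author_id):
        """Fraction of citations to an author's papers that come from
        papers the same author also (co-)wrote."""
        total, self_cites = 0, 0
        for paper in papers:
            if author_id not in paper["authors"]:
                continue
            for citing_authors in paper["incoming_citations"]:
                total += 1
                if author_id in citing_authors:
                    self_cites += 1
        return self_cites / total if total else 0.0

    papers = [{"authors": {"A1", "A2"},
               "incoming_citations": [{"A1"}, {"B7"}, {"A1", "C3"}]}]
    print(self_citation_share(papers, "A1"))  # 2 of 3 incoming citations are self-citations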

 

Publishing your research work: Updated concepts and nuances of few metrics used to assess journal quality – PubMed

Abstract:  Authors have a multitude of options when choosing a journal for publishing their research. However, their choice is mostly based on the academic credit required for promotion, the cost of publication, the timeliness of the process, and so on. The purpose of this narrative review is to enlighten authors about some other journal metrics used to assess journal ranking and quality internationally. The main concepts discussed in this paper are the impact factor and CiteScore. The paper includes an explanation of terms such as Web of Science and Journal Citation Reports, and how they relate to the impact factor. This will help authors make an informed decision when choosing a journal for publishing their research. Along with the historical concepts, we have included the latest updates about the changes made to the Journal Citation Reports and the impact factor released in June 2023. We hope this review will encourage the inclusion of such concepts in the curriculum of postgraduate courses, given that publishing a paper and choosing a journal are integral aspects of a researcher’s work life.
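
For context, the two-year journal impact factor that the review discusses has a standard definition, restated here rather than quoted from the paper:

    \[
    \mathrm{JIF}_{Y} \;=\; \frac{C_{Y}(Y-1) + C_{Y}(Y-2)}{N_{Y-1} + N_{Y-2}}
    \]

where $C_{Y}(y)$ is the number of citations received in year $Y$ by items the journal published in year $y$, and $N_{y}$ is the number of citable items the journal published in year $y$. CiteScore, the Scopus counterpart, currently uses a four-year window for both the citations and the documents counted.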

 

The Responsible Research(er) Recruitment Checklist: A best practice guide for applying principles of responsible research assessment in researcher recruitment materials

Abstract:  Assessment of potential academic staff members is necessary for making recruitment decisions. Amidst growing concern over the use of inappropriate quantitative indicators for research and researcher evaluation, institutions have begun to reform their policies to emphasise broader, responsible researcher assessment. To help implement such reforms, here we share a best practice Responsible Research(er) Recruitment Checklist for engaging with the principles of responsible research assessment in the writing of recruitment materials such as job adverts for research and academic roles. Aligned with the San Francisco Declaration on Research Assessment (DORA) principles, the checklist provides guidance on how to emphasise the primacy of research content and researcher contributions to published articles, without reliance on journal-based metrics. The checklist also recommends that evaluations consider a broad range of research outputs, and that collaboration, citizenship, author contributions, and Open Research practices be recognised. At the time of writing, the checklist is being piloted.

Exploring the Dimensions of Scientific Impact: A Comprehensive Bibliometric Analysis Investigating the Influence of Gender, Mobility, and Open Access

Abstract: The Science of Science field advances the measurement, evaluation, and prediction of scientific outcomes through the study of extensive scholarly data. For these purposes, bibliometrics is an appropriate approach that studies large volumes of scientific data using mathematical and statistical methods, and is widely used to assess the impact of papers and authors within a specific field or community. However, conducting bibliometric analyses poses several methodological, technical, and informational challenges (e.g., collecting and cleaning data, calculating indicators) which need to be addressed. This thesis aims to tackle some of these challenges and shed light on the factors influencing scientific impact, specifically focusing on open access publishing, international mobility, and influential factors on the h-index. The thesis also makes methodological contributions, such as author disambiguation and co-authorship network analysis, which provide insights into methodological and informational challenges within bibliometric analysis. Another methodological challenge addressed in this research is the inference of gender for a significant number of authors to obtain gender-related insights. By employing gender inference techniques, the research explores gender as an influential factor in scientific impact, shedding light on potential gender inequalities within the scholarly community. The research employs a bibliometric approach and utilizes mainly Scopus, a comprehensive dataset encompassing various disciplines, to make the following contributions:

• We explore the impact of publishing behavior, particularly the adoption of open access practices, on knowledge dissemination and scholarly communication. With this intention, we investigate the impact of journals flipping from closed access to open access publishing models [74]. Changes in publication volumes and citation impact are analyzed, demonstrating an overall increase in publication output and improved citation metrics following the transition to open access. However, the magnitude of changes varies across scientific disciplines. In another study [76], we utilize a dataset of articles published by Springer Nature and employ correlation and regression analyses to examine the relationship between authors’ country affiliations, publishing models, and citation impact. Utilizing a machine learning approach, we estimate the publishing model of papers based on different factors. The findings reveal different patterns in authors’ choices of publishing models based on income levels, the availability of Article Processing Charge waivers, and journal rank. The study highlights potential inequalities in access to open access publishing and its citation advantage.

• We investigate the association between scholars’ mobility patterns, socio-demographic characteristics, and their scientific activity and impact. By utilizing network and regression analyses, along with various statistical techniques, we investigate the international mobility of researchers. Furthermore, we conduct a comparative analysis of scientific outcomes, considering factors such as publications, citations, and measures of co-authorship network centrality. The findings reveal gender inequalities in mobility across scientific fields and countries and positive correlations between mobility and scientific success.

• Centered on the prediction of scholars’ h-index as a metric of scientific impact (the standard h-index computation is sketched below), another of our studies [77] employs machine learning techniques. We examine author-, co-authorship-, paper-, and venue-specific characteristics, in addition to prior impact-based features. The results emphasize the significance of non-prior impact-based features, particularly for early-career scholars in the long term, while also revealing the limited influence of gender on h-index prediction.
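
As background for this bullet (not part of the thesis itself), the h-index being predicted has a simple standard definition: the largest h such that the author has at least h papers each cited at least h times. A minimal sketch with made-up citation counts:

    def h_index(citations):
        """Largest h such that at least h papers have >= h citations each."""
        counts = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    print(h_index([10, 8, 5, 4, 3]))  # 4
    print(h_index([25, 8, 5, 3, 3]))  # 3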

The findings of this research hold implications for researchers, academic institutions, and policymakers aiming to advance scientific knowledge and foster equitable practices. By uncovering the influential factors that shape scientific impact and addressing potential gender disparities, this research contributes to the broader objective of promoting diversity, inclusivity, and excellence within the scholarly community.

bjoern.brembs.blog » German funder DFG: Why the sudden inconsistency?

“Given this long and consistent track-record, now complemented by two major official statements, one could be forgiven to think that applicants for funding at the DFG now feel assured that they will not be judged by their publication venues any longer. After all, journal prestige is correlated with experimental unreliability, so using it as an indicator clearly constitutes “inappropriate use of journal-based metrics”. With all this history, it came as a shock to many when earlier this year, one of the DFG panels deciding which grant proposals get funded, published an article in the German LaborJournal magazine that seemed to turn the long, hard work of the DFG in this area on its head….”

Edinburgh Open Research Conference 2024, May 29, 2024 | The University of Edinburgh

The forthcoming Edinburgh Open Research Conference (EOR) will take place on Wednesday 29th May 2024. While the conference will be hybrid, we ask that all contributors attend in person.

Edinburgh University is committed to making Open Research the new normal, and a vital part of achieving that is contributing to positive culture change within research. But how do we and other institutions go about this?

So, this year, we are asking the question: 

How can Open Research contribute to positive Culture Change in Research more broadly? 

The focus will be on the role of Open Research in changing research culture for the better. We will come together to ask how principles of FAIRness, reproducibility, recognition, integrity, and participation can steer us towards a healthier, more vibrant research environment. We will also consider the underlying factors that can drive research culture change in all its forms, how we can measure progress, and how Open Research intersects with other aspects of research culture that involve a shift in research values, behaviours, expectations and attitudes, such as EDI, health, and working patterns.

The three central themes of the conference will be:  

Next Generation Metrics 
Research Integrity 
Education and Skills 

But we also welcome contributions addressing the other Pillars of Open Science: FAIR Data, Scholarly Communications, Reward & Recognition, Citizen Science, and EOSC.

We are seeking contributions addressing this question in the following forms:

Talks (15 mins)
Lightning Talks (5 mins) 
Posters  

We are keen for an array of speakers from a range of backgrounds (academic, professional services, and students). We especially welcome contributions from early career researchers, junior professional services staff, and technicians.

 

Hostler (2023) Open Research Reforms and The Capitalist University: Areas of Opposition and Alignment | SocArXiv Papers

Hostler, T. (2022, May 7). Open Research Reforms and The Capitalist University: Areas of Opposition and Alignment. https://doi.org/10.31235/osf.io/r4qgc

Abstract:  There is a need for a nuanced and theoretically grounded analysis of the socio-political consequences of methodological reforms proposed by the open research movement. This paper contributes by utilising the theory of academic capitalism and considering how open research reforms may interact with the priorities and practices of the capitalist university. Three manifestations of academic capitalism are considered: the development of a highly competitive job market for researchers based on metricized performance, the increase in administration resulting from university systems of compliance, and the reorganization of academic labour along principles of “post-academic science”. The ways in which open research reforms both oppose and align with these manifestations are then considered, to explore the relationships between specific reforms and academic capitalist praxis. Overall, it is concluded that open research advocates must engage more closely with the potential of reforms to negatively affect academic labour conditions, which may bring them into conflict with either university management or those who uphold the traditional principles of an ‘all round’ academic role.

 

Open Access Best Practices and Licensing – Sridhar Gutam

“Within scholarly communication, open licensing plays a pivotal role in making work openly accessible while preserving rights and control. Open licenses facilitate dissemination, collaboration, and knowledge exchange by offering clarity and reducing access barriers. They promote transparency and can be applied to various research outputs, seamlessly aligning with OA principles. Open licensing extends permissions beyond default copyright law, granting creators the ability to define how others can access, engage with, share, and build upon their work. Creative Commons licenses exemplify this approach….”

Higher Education, Vol. 86, Iss. 4: Special Issue – The institutionalization of rankings in higher education: continuities, interdependencies, engagement

Guest Editors:

Jelena Brankovic, Bielefeld University, Germany
Julian Hamann, Humboldt-Universität zu Berlin, Germany
Leopold Ringel, Bielefeld University, Germany

[…] we introduce the special issue of Higher Education that centers on the question of the institutionalization of rankings in higher education. The article has three parts. In the first part, we argue that grand narratives such as globalization and neoliberalism are unsatisfactory as standalone explanations of why and how college and university rankings become institutionalized. As a remedy, we invite scholars to pay closer attention to the dynamics specific to higher education that contribute to the proliferation, persistence, and embeddedness of rankings. In the second part, we weave the articles included in the issue into three sub-themes—continuities, interdependencies, and engagement—which we link to the overarching theme of institutionalization. Each contribution approaches the subject of rankings from a different angle and casts a different light on continuities, interdependencies, and engagement, thus suggesting that the overall story is much more intricate than often assumed. In the third and final part, we restate the main takeaways of the issue and note that systematic comparative research holds great promise for furthering our knowledge of the subject. We conclude the article with the hope that the special issue will stimulate further questioning of rankings—in higher education and in higher education research.

 

Snijder (2023) Measured in a context: making sense of open access book data | UKSG Insights

 

Abstract: Open access (OA) book platforms, such as JSTOR, OAPEN Library or Google Books, have been available for over a decade. Each platform shows usage data, but this results in confusion about how well an individual book is performing overall. Even within one platform, there are considerable usage differences between subjects and languages. Some context is therefore necessary to make sense of OA book usage data. A possible solution is a new metric – the Transparent Open Access Normalized Index (TOANI) score. It is designed to provide a simple answer to the question of how well an individual open access book or chapter is performing. The transparency is based on clear rules and on making all of the data used visible. The data is normalized, using a common scale for the complete collection of an open access book platform, and, to keep the level of complexity as low as possible, the score is based on a simple metric. As a proof of concept, the usage of over 18,000 open access books and chapters in the OAPEN Library has been analysed to determine whether each individual title has performed as well as can be expected compared to similar titles.
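
The abstract does not reproduce the TOANI rules, so the following is only a hypothetical sketch of the general idea of normalizing a title's usage against similar titles on the same platform (same subject and language), not the published methodology; field names and figures are made up:

    from statistics import median

    def normalized_usage(title, collection):
        """Hypothetical: compare one book's downloads with the median of its
        peer group (same subject and language) on the same platform."""
        peers = [b["downloads"] for b in collection
                 if b["subject"] == title["subject"]
                 and b["language"] == title["language"]]
        baseline = median(peers) if peers else 0
        return title["downloads"] / baseline if baseline else None

    book = {"subject": "History", "language": "en", "downloads": 840}
    collection = [
        {"subject": "History", "language": "en", "downloads": 400},
        {"subject": "History", "language": "en", "downloads": 600},
        {"subject": "Economics", "language": "en", "downloads": 2000},
    ]
    print(normalized_usage(book, collection))  # 840 / 500 = 1.68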

Using Altmetric Data Responsibly: A Guide to Interpretation and Good Practice

Abstract:  This guide focuses specifically on data from the data provider and company, Altmetric, but other types of altmetrics are mentioned and occasionally used as a comparison in this guide, such as the Open Syllabus database, which is used to gauge educational engagement with scholarly outputs. This guide opens with an introduction followed by an overview of Altmetric and the Altmetric Attention Score, Altmetrics and Responsible Research Assessment, Output Types Tracked by Altmetric, and the Altmetric Sources of Attention, which include: News and Mainstream Media, Social Media (X (formerly Twitter), Facebook, Reddit, and historical data from Google+, Pinterest, LinkedIn, and Sina Weibo); Patents, Peer Review, Syllabi (historical data only), Multimedia, Public Policy Documents, Wikipedia, Research Highlights, Reference Managers, and Blogs; finally, there is a conclusion, a list of related resources and readings, two appendices, and references. This guide is intended for use by librarians, practitioners, funders, and other users of Altmetric data, or those who are interested in incorporating altmetrics into their bibliometric practice and/or research analytics. It can also help researchers preparing for annual evaluations or promotion and tenure reviews, who can use the data in informed and practical ways. It can also be a useful reference guide for research managers and university administrators who want to understand the broader online engagement with research publications beyond traditional scholarly citations, also known as bibliometrics, but who also want to avoid misusing, misinterpreting, or abusing Altmetric data when making decisions, creating policies, and evaluating faculty members and researchers at their institutions.
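
The Altmetric Attention Score mentioned above is, broadly, a weighted count of mentions across the listed sources; the sketch below only illustrates the general shape of such a score with placeholder weights and is not Altmetric's actual weighting or deduplication scheme:

    # Generic weighted-count sketch; the weights are placeholders, not
    # Altmetric's real values.
    EXAMPLE_WEIGHTS = {"news": 8.0, "blogs": 5.0, "policy": 3.0,
                       "wikipedia": 3.0, "x_posts": 1.0, "facebook": 0.25}

    def weighted_attention(mentions):
        """mentions: dict mapping source name to number of mentions."""
        return sum(EXAMPLE_WEIGHTS.get(source, 0.0) * count
                   for source, count in mentions.items())

    print(weighted_attention({"news": 2, "x_posts": 30, "blogs": 1}))  # 51.0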

The quantification of Open Scholarship – a mapping review | Quantitative Science Studies | MIT Press

Abstract:  This mapping review addresses scientometric indicators that quantify open scholarship. The goal is to determine which open scholarship metrics are currently being applied and which are discussed, e.g. in policy papers. The paper contributes to a better understanding of how open scholarship is quantitatively recorded in research assessment and where gaps can be identified. The review is based on a search in four databases, each with 22 queries. Out of 3385 hits, we coded 248 documents chosen according to the research questions. The review discusses the open scholarship metrics of the documents as well as the topics addressed in the publications, the disciplines the publications come from, and the journals in which they were published. The results indicate that research and teaching practices are unequally represented with regard to open scholarship metrics. Open research material is a central and exhaustively covered topic in the publications. Open teaching practices, on the other hand, play a role in the discussion and strategy papers of the review, but open teaching material is not recorded using concrete scientometric indicators. Here we see a research gap and discuss the potential for further research and investigation.

[2309.15884] The strain on scientific publishing

Abstract:  Scientists are increasingly overwhelmed by the volume of articles being published. Total articles indexed in Scopus and Web of Science have grown exponentially in recent years; in 2022 the article total was 47% higher than in 2016, which has outpaced the limited growth, if any, in the number of practising scientists. Thus, publication workload per scientist (writing, reviewing, editing) has increased dramatically. We define this problem as the strain on scientific publishing. To analyse this strain, we present five data-driven metrics showing publisher growth, processing times, and citation behaviours. We draw these data from web scrapes, requests for data from publishers, and material that is freely available through publisher websites. Our findings are based on millions of papers produced by leading academic publishers. We find specific groups have disproportionately grown in their articles published per year, contributing to this strain. Some publishers enabled this growth by adopting a strategy of hosting special issues, which publish articles with reduced turnaround times. Given pressures on researchers to publish or perish to be competitive for funding applications, this strain was likely amplified by these offers to publish more articles. We also observed widespread year-over-year inflation of journal impact factors coinciding with this strain, which risks confusing quality signals. Such exponential growth cannot be sustained. The metrics we define here should enable this evolving conversation to reach actionable solutions to address the strain on scientific publishing.