Correlating article citedness and journal impact: an empirical investigation by field on a large-scale dataset | SpringerLink

Abstract:  In spite of previous research demonstrating the risks involved, and counsel against the practice as early as 1997, some research evaluations continue to use journal impact alone as a surrogate for the citation counts of the articles a journal hosts when assessing their impact. Such usage has also been taken up by research administrators and policy-makers, with very serious implications. The aim of this work is to investigate the correlation between the citedness of a publication and the impact of the host journal. We extend the analyses of previous literature to all STEM fields, and also assess whether this correlation varies across fields and is stronger for highly cited authors than for lowly cited ones. Our dataset consists of almost one million authorships of 2010–2019 publications authored by about 28,000 professors in 230 research fields. Results show a low correlation between the two indicators, more so for lowly cited authors than for highly cited ones, although differences occur across fields.

 

The Twitter accounts of scientific journals: a dataset

Abstract:  Twitter harbours dense networks of academics, but to what extent do scientific journals use that platform? This article introduces a dataset of 3,485 Twitter accounts pertaining to a sample of 13,821 journals listed in Web of Science’s three major indices (SCIE, SSCI and AHCI). The summary statistics indicate that 25.2% of the journals have a dedicated Twitter presence. This number is likely to grow: on average, a journal sets up a new profile every one and a half days. The share of Twitter presence, however, varies strongly by publisher and discipline. The most active discipline is political science, which has almost 75% of its journals on Twitter, while other research categories have zero. The median account issues 116 messages a year and interacts with distinct other users once in every two to three tweets. Approximately 600 journals refer to themselves as ‘peer-reviewed’, while 263 journals refer to their citation-based impact (like the impact factor) in their profile description. All in all, the data convey immense heterogeneity with respect to the Twitter behaviour of scientific journals. As there are numerous deceptive Twitter profile names established by predatory publishers, it is recommended that journals establish their official accounts lest bogus journals mislead the public about scientific findings. The dataset is available for use for further scientometric analyses.

“Open Access helps both: authors and readers”: Peter Suber in an interview with Bodo Rödel (24 June 2022)

(The interview is in English and the abstract in German.)

Abstract (translated from the German):  In the interview, Open Access expert Peter Suber and Bodo Rödel, head of the “Publications and Scientific Information Services” department, discuss the effects of Open Access described in Suber’s 2012 book, the future development of publication platforms, the role of publishers, and changed user requirements in science. They also address topics not originally caused by Open Access, such as gaining reputation or the impact of the predominance of one academic language on non-native speakers.

[2212.07811] Do altmetric scores reflect article quality? Evidence from the UK Research Excellence Framework 2021

Abstract:  Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from this http URL and Mendeley associate with journal article quality. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014-17/18, split into 34 Units of Assessment (UoAs). The results show that altmetrics are better indicators of research quality than previously thought, although not as good as raw and field normalised Scopus citation counts. Surprisingly, field normalising citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best, tweet counts are also a relatively strong indicator in many fields, and Facebook, blogs and news citations are moderately strong indicators in some UoAs, at least in the UK. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities. The Altmetric Attention Score, although hybrid, is almost as good as Mendeley reader counts as a quality indicator and reflects more non-scholarly impacts.

 

[2212.05416] In which fields are citations indicators of research quality?

Abstract:  Citation counts are widely used as indicators of research quality to support or replace human peer review and for lists of top cited papers, researchers, and institutions. Nevertheless, the extent to which citation counts reflect research quality is not well understood. We report the largest-scale evaluation of the relationship between research quality and citation counts, correlating them for 87,739 journal articles in 34 field-based Units of Assessment (UoAs) from the UK. We show that the two correlate positively in all academic fields examined, from very weak (0.1) to strong (0.5). The highest correlations are in health, life sciences and physical sciences and the lowest are in the arts and humanities. The patterns are similar for the field classification schemes of Scopus and this http URL. We also show that there is no citation threshold in any field beyond which all articles are excellent quality, so lists of top cited articles are not definitive collections of excellence. Moreover, log transformed citation counts have a close to linear relationship with UK research quality ranked scores that is shallow in some fields but steep in others. In conclusion, whilst appropriately field normalised citations associate positively with research quality in all fields, they never perfectly reflect it, even at very high values.

 

PsyArXiv Preprints | Three Myths about Open Science That Just Won’t Die

Abstract:  Knowledge and implementation of open science principles and behaviors remains uneven between and within sub-disciplines in psychology, despite over 10 years of education and advocacy. One reason for the slow and uneven progress of the movement is a set of closely-held myths about the implications of open science practices, exacerbated by the relative isolation of various sub-disciplines in the field. This talk will cover three of the major recurring myths: that open science is in conflict with prioritizing diversity, that “open data” is a binary choice between fully open and accessible and completely closed off, and that preregistration and registered reports are only appropriate for certain types of research designs. Putting these myths to rest is necessary as we work towards improving our scientific practice.

 

Investigating Open Access Publishing Practices of Early and Mid-Career Researchers in Humanities and Social Sciences Disciplines – Ayeni – 2022 – Proceedings of the Association for Information Science and Technology – Wiley Online Library

Abstract:  Although open access (OA) to research outputs has been proven to improve research readership, citation, and impact, the uptake of OA in some disciplines has remained low. In this paper, we investigated and compared OA publishing practices of early career and mid-career researchers in the Humanities, Arts, and Social Sciences (HASS) disciplines in Canada. A descriptive survey design using an online questionnaire was employed. Participants were drawn from a group of 15 public research universities via their openly available emails on university websites. Survey data was analyzed with descriptive and inferential statistics. Findings show that in the last three years, 74.1% of mid-career researchers have published in OA journals, compared to 63.1% of early career researchers. However, OA publishing of monographs (21.3%) and conference proceedings (29.9%), as well as the frequency and extent of OA publishing, remains low among all participants. ANOVA results (F [2, 218] = 3.683, p = .027, η2 = .033) showed that 3.3% of the variance in researchers’ OA publishing frequency can be attributed to their disciplines. Overall, OA publishing among researchers in the HASS disciplines is still low. Hence, there is a need to identify factors that facilitate or hinder HASS researchers’ OA publishing.
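The effect size reported in the abstract is consistent with the F statistic and its degrees of freedom: for a one-way ANOVA, η² = (F·df_between) / (F·df_between + df_within). A quick sketch verifying the arithmetic (the function name is ours; the numbers are from the abstract):

```python
def eta_squared(f_stat, df_between, df_within):
    """Recover eta squared from a one-way ANOVA F statistic.

    Uses the identity eta^2 = SS_between / SS_total
                            = (F * df_between) / (F * df_between + df_within).
    """
    return (f_stat * df_between) / (f_stat * df_between + df_within)

# Values reported in the abstract: F(2, 218) = 3.683
eta2 = eta_squared(3.683, 2, 218)
print(round(eta2, 3))  # ≈ 0.033, i.e. ~3.3% of variance explained by discipline
```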

 

Directory of Open Access Preprint Repositories: Home

“It is becoming an increasingly common practice for researchers to share their preprints because it allows them to disseminate their research results quickly and openly with the rest of the world. As a result, there is a growing number of preprint-specific and generalist repositories that support the sharing of preprints.

This directory provides a list of preprint repositories that are available to the research community. It helps researchers find the most appropriate platform for them, enabling them to browse through existing repositories by discipline, location, language, functionalities, and other facets.

The directory is jointly managed by Centre pour la Communication Scientifique Directe (CCSD) and Confederation of Open Access Repositories (COAR). The data in this directory was originally compiled through the GPPdP (Groupe Projet Plateformes de Prepublications) project, with financial support from the French Ministry of Research’s Open Science Committee (CoSO)….”

Understanding differences of the OA uptake within the German university landscape (2010-2020). Part 1: journal-based OA

Abstract:  This study investigates the determinants of the uptake of Full and Hybrid Open Access (OA) in the university landscape of Germany. It adapts the governance equaliser as a heuristic for this purpose and distinguishes between three factors: the disciplinary profile (academic self-governance), infrastructures and services of universities that aim to support OA (managerial self-governance), and large transformative agreements (part of state regulation). The uptake of OA, the influence of the disciplinary profile of universities, and the influence of transformative agreements are measured by combining several data sources (including Web of Science, Unpaywall, an authority file of standardised German affiliation information, the ISSN-Gold-OA 4.0 list, and lists of publications covered by transformative agreements). For managerial self-governance, a structured data collection was created by harvesting different sources of information and by manual online search. To determine the explanatory power of the different factors, a series of regression analyses was performed for different periods and for both Full and Hybrid OA. In the regression analyses, the factor that best explained differences in the uptake of both OA types was academic self-governance. For the year 2020, Hybrid OA transformative agreements became a second relevant factor. However, all variables that reflect local infrastructural support and services for OA (managerial self-governance) turned out to be non-significant. To deepen the understanding of the adoption of OA at the level of institutions, the outcomes of the regression analyses are contextualised by an interview study conducted with 20 OA officers of German universities.

 

Charting variety, scope, and impact of open access diamond journals in various disciplines and regions: a survey-based observational study

Abstract

Purpose: The variety, scope, and impact of open access (OA) diamond journals across disciplines and regions from July 22 to September 11, 2020 were charted to characterize the current OA diamond landscape.

Methods: The total number of diamond journals was estimated, including those outside the Directory of Open Access Journals (DOAJ). The distribution across regions, disciplines, and publisher types was described. The scope of journals in terms of authorship and readership was investigated. Information was collected on linguistic diversity, journal dynamics and life cycle, and their visibility in scholarly databases.

Results: The number of OA diamond journals is estimated to be 29,000. OA diamond journals are estimated to publish 356,000 articles per year. The OA diamond sector is diverse in terms of regions (45% in Europe, 25% in Latin America, 16% in Asia, and 5% in the United States/Canada) and disciplines (60% humanities and social sciences, 22% sciences, and 17% medicine). More than 70% of OA diamond journals are published by university-owned publishers, including university presses. The majority of OA diamond journals are small, publishing fewer than 25 articles a year. English (1,210), Spanish (492), and French (342) are the most common languages of the main texts. Out of 1,619 journals, 1,025 (63.3%) are indexed in DOAJ, 492 (30.4%) in Scopus, and 321 (19.8%) in Web of Science.

Conclusion: The patterns and trends reported herein provide insights into the diversity and importance of the OA diamond journal landscape and the accompanying opportunities and challenges in supporting this publishing model.

SocArXiv Papers | Misapplied Metrics: Variation in the h-index within and between disciplines

Abstract:  Scholars and university administrators have a vested interest in building equitable valuation systems of academic work for both practical (e.g., resource distribution) and more lofty purposes (e.g., what constitutes “good” research). Well-established inequalities in science pose a difficult challenge to those interested in constructing a parsimonious and fair method for valuation, as stratification occurs within academic disciplines, but also between them. Despite warnings against the practice, the popular h-index has been formally used as one such metric of valuation. In this article, we use the case of the h-index to examine how within- and between-discipline inequalities extend from the reliance on metrics, an illustration of the risk involved in the so-called “tyranny of metrics.” Using data from over 42,000 high-performing scientists across 120 disciplines, we construct multilevel models predicting the h-index. Results suggest significant within-discipline variation in several forms, including a female penalty, as well as significant between-discipline variation. Conclusions include recommendations to avoid using the h-index or similar metrics for valuation purposes.

Guest Post – Has Peer Review Created a Toxic Culture in Academia? Moving from ‘Battering’ to ‘Bettering’ in the Review of Academic Research – The Scholarly Kitchen

” It seems that many reviewers see their primary role as deflating the arguments and methodologies of the manuscripts they receive, often without any concern for the way the author will receive the comments or whether the critique can be addressed and revised….

A format that might be appealing, not only to authors and reviewers but to publishers as well, takes a page out of the work of HSS book publishers and how they review manuscripts.

 

One of the main differences between the STEM journal and HSS book submission process is that book acquisitions editors get involved in the process before the manuscript is complete (and sometimes before there is a manuscript at all). This process starts at a relatively early stage in the writing process, creating a situation whereby editors are incentivized to help authors and sign them up before other publishers can swoop in and publish it themselves. Consider the potential parallels with the increasing use of journal preprints, as a place where journal editors could hop in and start the process of working with authors at an early stage in the process. (It may also help that you can submit book proposals to multiple publishers simultaneously)….”

 

SciELO – Brazil – Availability of Open Access journals by scientific fields, specialization and Open Access regulations in the YERUN universities

Abstract:  The availability of Open Access journals in the various fields of knowledge in Clarivate Analytics’ Web of Science is hypothesized to present strong inequalities, thus affecting the choice of journals by researchers wishing to publish their research results in Open Access. The first objective of this research was to test this hypothesis by crossing the list of journals available in WoS with the lists of the Directory of Open Access Journals. The availability of OA journals presents strong inequalities, ranging from 5 to 40% depending on the field of knowledge. At the level of universities, such disparity in the availability of Open Access journals is an important factor in their fulfilment of Open Access mandates, given their specialization profiles. As the second objective, the publications available in the Web of Science (from 2016 to 2020) of the universities belonging to the YERUN Network (Young European Research Universities) are studied in order to identify their specialization profiles, their Open Access types (and evolution), and the possible interactions between their specialization and the availability of Open Access journals in their respective fields of specialization. A general overview of the volumes of funded research and the different proportions of Open Access and non-Open Access in funded and non-funded research is also provided. The indicator “Open Access Likelihood” is introduced and applied as a proxy for the likelihood of Open Access publication, taking into account the fields of specialization of the YERUN universities. The results of its application underline the need to take into consideration both specialization and Open Access availability when designing feasible Open Access mandates. Future research includes the study of the availability of Open Access journals by tiers of impact actors.

 

Open Research in the Humanities | Unlocking Research

“The Working Group on Open Research in the Humanities was chaired by Prof. Emma Gilby (MMLL) with Dr. Rachel Leow (History), Dr. Amelie Roper (UL), Dr. Matthias Ammon (MMLL and OSC), Dr. Sam Moore (UL), Prof. Alexander Bird (Philosophy), and Prof. Ingo Gildenhard (Classics). We met for four meetings in July, September, October and December 2021, with a view to steering and developing services in support of Open Research in the Humanities. We aimed notably to offer input on how to define Open Research in the Humanities, how to communicate effectively with colleagues in the Arts and Humanities (A&H), and how to reinforce the prestige around Open Research. We hope to add our perspective to the debate on Open Science by providing a view ‘from the ground’ and from the perspective of a select group of humanities researchers. These disciplinary considerations inevitably overlap, in some measure, with the social sciences and indeed some aspects of STEM, and we hope that they will therefore have a broad audience and applicability.

Academics in A&H are, in the main, deeply committed to sharing their research. They consider their main professional contribution to be the instigation and furthering of diverse cultural conversations. They also consider open public access to their work to be a valuable goal, alongside other equally prominent ambitions: aiming at research quality and diversity, and offering support to early career scholars in a challenging and often precarious employment landscape.  

Although A&H cover a diverse range of disciplines, it is possible to discern certain common elements which guide their profile and impact. These common elements also guide the discussion that follows….”