Impact Factors, Altmetrics, and Prestige, Oh My: The Relationship Between Perceived Prestige and Objective Measures of Journal Quality | SpringerLink

Abstract: The focus of this work is to examine the relationship between subjective and objective measures of prestige of journals in our field. Findings indicate that metrics drawn from Clarivate, Elsevier, and Google all have statistically significant relationships with perceived journal prestige. Just as several widely used bibliometric measures were related to prestige, so were altmetric scores.

 

Starstruck by journal prestige and citation counts? On students’ bias and perceptions of trustworthiness according to clues in publication references | SpringerLink

Abstract: Research is becoming increasingly accessible to the public via open access publications, researchers’ social media postings, outreach activities, and popular disseminations. A healthy research discourse is typified by debates, disagreements, and diverging views. Consequently, readers may rely on the information available, such as publication reference attributes and bibliometric markers, to resolve conflicts. Yet, critical voices have warned about the uncritical and one-sided use of such information to assess research. In this study we wanted to gain insight into how individuals without research training place trust in research based on clues present in publication references. A questionnaire was designed to probe respondents’ perceptions of six publication attributes. A total of 148 students responded to the questionnaire, of whom 118 were undergraduate students (with limited experience and knowledge of research) and 27 were graduate students (with some knowledge and experience of research). The results showed that respondents were mostly influenced by the number of citations and the recency of publication, while author names, publication type, and publication origin were less influential. There were few differences between undergraduate and graduate students, with the exception that undergraduate students more strongly favoured publications with multiple authors over publications with single authors. We discuss possible implications for teachers who incorporate research articles in their curriculum.

 

Less ‘prestigious’ journals can contain more diverse research, by citing them we can shape a more just politics of citation. | Impact of Social Sciences

“The ‘top’ journals in any discipline are those that command the most prestige, and that position is largely determined by the number of citations their published articles garner. Despite being highly problematic, citation-based metrics remain ubiquitous, influencing researchers’ review, promotion and tenure outcomes. Bibliometric studies in various fields have shown that the ‘top’ journals are heavily dominated by research produced in and about a small number of ‘core’ countries, mostly the USA and the UK, and thus reproduce existing global power imbalances within and beyond academia.

In our own field of higher education, studies over many years have revealed persistent western hegemony in published scholarship. However, we observed that most studies tend to focus their analysis on the ‘top’ journals, and (by default) on those that publish exclusively in English. We wondered if publication patterns were similar in other journals. So, we set out to compare (among other things) the author affiliations and study contexts of articles published in journals in the top quartile of impact (Q1), with those in the bottom quartile of impact (Q4)….”

Why making academic research free is complicated – Vox

“Freeing research largely paid for by taxpayer money can seem like a no-brainer, but over time, the potential downsides of open science efforts like the Plan S mandate have become more apparent. While pay-to-publish but free-to-read platforms bring more research to the public, they can add barriers for researchers and worsen some existing inequalities in academia. Scientific publishing will remain a for-profit industry and a highly lucrative one for publishers. Shifting the fees onto authors doesn’t change this.

Many of the newly founded open-access journals drop the fees entirely, but even if they’re not trying to make a profit, they still need to cover their operating costs. They fall back on ad revenue, individual donations or philanthropic grants, corporate sponsorship, and even crowdfunding.

But open-access platforms often lack the prestige of well-known top journals like Nature. Scientists early in their careers — as well as those at less wealthy universities in low-income countries — often rely on precarious, short-term grant funding to carry out their research. Their careers depend on putting out an impressive publication record, which is already an uphill battle….”

 

Communities, Commoning, Open Access and the Humanities: An Interview with Martin Eve – ScienceOpen

Abstract: Leading open access publishing advocate and pioneer Professor Martin Paul Eve considers several topics in an interview with WPCC special issue editor Andrew Lockett. These include the merits of considering publishing in the context of commons theory and commoning, digital platforms as creative and homogeneous spaces, cosmolocalism, the work of intermediaries or boundary organisations, and the differing needs of library communities. Eve is also asked to reflect on research culture, the academic prestige economy, the challenges facing the humanities, digital models in trade literature markets, and current influences in terms of work in scholarly communications and recent academic literature. Central concerns arising in the discussion are the importance of values, and of value for money, in an environment shaped by increasing demands for policies determined by crude data monitoring, policies that are less than fully thought through in terms of their impact and their implications for academics and their careers.

 

Journal prestige is still important in how scholars judge one another

“Aside from an individual’s personal interactions with another academic, the perceived quality of the journal where a researcher publishes is the most influential factor when forming an opinion on their academic standing, with almost half (49 percent) of 9,609 respondents saying it is important and 12 percent saying it is most important.

Asked about citation metrics, 24 percent say a scholar’s h-index and other similar measures are important, and 5 percent say they are the most crucial factor….

Last month more than 350 organizations from more than 40 countries signed a new compact, building on the 2015 Leiden Manifesto, which would see research evaluated mainly on qualitative measures and journal-based metrics abandoned. That agreement came nearly 10 years after the signing of the San Francisco Declaration on Research Assessment, which sought to phase out the use of journal-based metrics in funding, appointment and promotion decisions, and which has now been signed by almost 20,000 individuals and 2,600 institutions worldwide….”

‘Replacing Academic Journals’ | Jeff Pooley

[…]

There’s lots to unpack in the Brembsian alternative proposed here. One cornerstone is the adoption of open standards that—as best I understand it—would enable university repositories and nonprofit, community-led platforms like Open Library of Humanities (OLH) to form a kind of global, interoperable library. A second cornerstone is a regulated market for services. In an open procurement process, publishers and other firms—nonprofit or otherwise—would submit bids for peer review services, for example, or for copy editing or even writing software. The idea is that a regulated marketplace will, through competition enabled by open standards, discipline the overall system’s cost.
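
Pooley's summary doesn't tie the proposal to any one standard, but existing protocols already hint at what the interoperable-library cornerstone could look like in practice. The sketch below harvests metadata from a repository over OAI-PMH, an open standard many university repositories already speak; the endpoint and field choices are illustrative assumptions, not part of the Brembs et al. proposal.

```python
# Sketch: harvesting repository metadata over OAI-PMH, one of the open
# standards that could knit repositories into an interoperable "library".
# The arXiv endpoint below is just an illustration; any OAI-PMH-compliant
# repository answers the same verbs.
import requests
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def list_records(base_url, metadata_prefix="oai_dc"):
    """Yield (title, identifier) pairs from a repository's OAI-PMH feed."""
    resp = requests.get(base_url, params={
        "verb": "ListRecords",
        "metadataPrefix": metadata_prefix,
    }, timeout=30)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    for record in root.iter(f"{OAI_NS}record"):
        title = record.find(f".//{DC_NS}title")
        ident = record.find(f".//{DC_NS}identifier")
        if title is not None:
            yield title.text, ident.text if ident is not None else None

for title, ident in list_records("https://export.arxiv.org/oai2"):
    print(title, "->", ident)
```

Because every compliant repository exposes the same interface, a harvester like this can aggregate many repositories into one searchable layer, which is the kind of interoperability the proposal leans on.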

It’s a fascinating proposal, one that—as the paper notes—could be implemented with existing technologies. The problem is the lever of change. The incumbent publishers’ entrenched position, Brembs et al. explain, renders a first move by libraries or scholars impractical. That leaves funders, whose updated rules and review criteria could, the paper argues, tip the incentive structure in the direction of an open, journal-free alternative.

[…]

 

Rethinking Research Assessment for the Greater Good: Findings from the RPT Project – Scholarly Communications Lab | ScholCommLab

“The review, promotion, and tenure (RPT) process is central to academic life and workplace advancement. It influences where faculty direct their attention, research, and publications. By unveiling the RPT process, we can inform actions that lead towards a greater opening of research. 

Between 2017 and 2022, we conducted a multi-year research project involving the collection and analysis of more than 850 RPT guidelines and 338 surveys with scholars from 129 research institutions across Canada and the US. Starting with a literature review of academic promotion and tenure processes, we launched six studies applying mixed methods approaches such as surveys and matrix coding.

So how do today’s universities and colleges incentivize open access research? Read on for 6 key takeaways from our studies….”

At what point do academics forego citations for journal status? | Impact of Social Sciences

“The limitations of journal-based citation metrics for assessing individual researchers are well known. However, the way in which these assessment systems differentially shape research practices within disciplines is less well understood. Presenting evidence from a new analysis of business and management academics, Rossella Salandra, Ammon Salter, and James Walker explore how journal status is valued by these academics and the point at which journal status becomes more prized than academic influence….”

An open science argument against closed metrics

“In the Open Scientist Handbook, I argue that open science supports anti-rivalrous science collaborations where most metrics are of little, or even negative, value. I would like to share some of these arguments here….

Institutional prestige is a profound drag on the potential for networked science. If your administration has a plan to “win” the college ratings game, this plan will only make doing science harder. It makes being a scientist less rewarding. Playing finite games of chasing arbitrary metrics or ‘prestige’ drags scientists away from the infinite play of actually doing science….

As Cameron Neylon said at the metrics breakout of the ‘Beyond the PDF’ conference some years ago, “reuse is THE metric.” Reuse reveals and confirms the advantage that open sharing has over current, market-based, practices. Reuse validates the work of the scientist who contributed to the research ecosystem. Reuse captures more of the inherent value of the original discovery and accelerates knowledge growth….”

bjoern.brembs.blog » Why publication services must not be negotiated

“Recently, the “German Science and Humanities Council” (Wissenschaftsrat) has issued its “Recommendations on the Transformation of Academic Publishing: Towards Open Access”. On page 33 they write that increasing the competition between publishers is an explicit goal of current transformative agreements:

publishers become publication service providers and enter into competition with other providers

This emphasis on competition refers back to the simple fact that as content (rather than service) providers, legacy publishers currently enjoy monopolies on their content, as, e.g., the European Commission has long recognized: In at least two market analyses, one dating as far back as 2003 and one from 2015, the EC acknowledges the lack of a genuine market due to the lack of substitutability…

Without such prestige, the faculty argue, they cannot work and would risk their careers and funding. Arguments that these ancient vehicles are unreliable, unaffordable and dysfunctional are brushed aside by emphasizing that their academic freedom allows them to drive whatever vehicle they want to their field work. Moreover, they argue, the price of around one million is “very attractive” because of the prestige the money buys them.

With this analogy, it becomes clear why and how tenders protect the public interest against any individual interests. In this analogy, it is likely also clear that academic freedom does not and should not trump all other considerations. In this respect, I would consider the analogy very fitting and have always argued for such a balance of public and researcher interests: academic freedom does not automatically exempt academics from procurement rules.

Therefore, ten experts advocate a ban on all negotiations with publishers and instead call for policies ensuring that all publication services for public academic institutions are awarded by tender, analogous to the example set by Open Research Europe and to how all other, non-digital infrastructure contracts are awarded.”

bjoern.brembs.blog » Replacing the prestige signal

“Evidence suggests that the prestige signal in our current journals is noisy, expensive and flags unreliable science. There is a lack of evidence that the supposed filter function of prestigious journals is not just a biased random selection of already self-selected input material. As such, massive improvement along several variables can be expected from a more modern implementation of the prestige signal….

How could a more modern system support a ‘prestige signal’ that would actually deserve the moniker? Obviously, if journals were to be replaced by a modern information infrastructure, only our imagination limits which filters the scholarly community may want to implement. Some general ideas may help guide that brainstorming process: if the use of such ‘prestige’ not only in career advancement and funding, but also in the defense of the current system is anything to go by, there should be massive demand for a prestige signal that is worth its price. Today, this prestige arises from selectivity based on expertise (Nature’s slogan always was “the world’s best science”). This entails an expert-based filter that selects only very few (‘the best’) out of the roughly 3 million peer-reviewed articles published each year. Importantly, there is no a priori need to objectively specify the criteria for this filter in advance. In a scenario in which all journals have been replaced by a modern infrastructure for text, data and code, such services (maybe multiple services, competing for our subscriptions?) would need to record not just the articles they selected (as now) but also those they explicitly did not select, in addition to the rest that was never considered. Users (or the services themselves, or both) would then be able to compute track records of such services according to criteria that are important to them.

Mimicking current implementations, the number of citations could, e.g., be used to determine which service selected the most highly cited articles, how many it missed, and which it falsely didn’t even consider. But why stop at bare citations? A modern infrastructure allows for plenty of different markers of scholarly quality. One could just as well use an (already existing) citation typology to differentiate between different types of citations; one could count replications, media mentions, anything, really, to derive track records by which these services may be compared. Given the plentiful demand indicated by the fervent supporters of prestigious journals, services would compete with each other, using their track records, for the subscriptions of individual and institutional users, providing innovation at competitive prices, just like any other service market. Such an efficient, competitive marketplace of services, however, can only ever arise if the current monopoly journals are replaced with a system that allows for such a market to be designed. If demand were not as high as expected, but such a signal were nevertheless desired by some, a smaller, more basic implementation could be arranged on a non-profit, open-source basis, funded by the vast savings that replacing journals would entail. One might also opt to hold competitions for such services, awarding prizes to the service that best serves the needs of the scholarly community. The possibilities are endless – but only once the scholarly community finds a way to put itself into a position where it has any power over the implementation of its infrastructure.”
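
To make the track-record idea concrete, here is a minimal sketch. It assumes, hypothetically, that each service publishes three disjoint sets of article IDs (selected, explicitly rejected, never considered) and that a citation count is available for every article; the threshold-based “hit” criterion and all names are illustrative assumptions, and, as the post notes, replications or citation types could be substituted for bare citations in the same way.

```python
# Sketch: scoring a "prestige service" by its selection track record,
# along the lines the post describes. All names are hypothetical; this
# is not an existing API.
from dataclasses import dataclass, field

@dataclass
class ServiceLog:
    selected: set = field(default_factory=set)        # articles the service picked
    rejected: set = field(default_factory=set)        # seen but passed over
    not_considered: set = field(default_factory=set)  # never screened at all

def track_record(log, citations, threshold=100):
    """Compare a service's picks against a crude 'hit' criterion:
    articles whose citation count reaches `threshold`."""
    hits = {a for a, c in citations.items() if c >= threshold}
    picked_hits = log.selected & hits
    precision = len(picked_hits) / len(log.selected) if log.selected else 0.0
    recall = len(picked_hits) / len(hits) if hits else 0.0
    return {
        "precision": precision,                       # share of picks that were hits
        "recall": recall,                             # share of hits the service picked
        "missed": len(log.rejected & hits),           # hits it explicitly rejected
        "overlooked": len(log.not_considered & hits), # hits it never even considered
    }

# Toy data: five articles with citation counts, one service's log.
citations = {"a1": 250, "a2": 12, "a3": 180, "a4": 3, "a5": 400}
log = ServiceLog(selected={"a1", "a2"}, rejected={"a3"}, not_considered={"a4", "a5"})
print(track_record(log, citations))
# -> {'precision': 0.5, 'recall': 0.333..., 'missed': 1, 'overlooked': 1}
```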

 

Journal Prestige Index: Expanding the Horizons of Assessment of Research Impact

“Against the backdrop of the above facts, a new research metric for a transparent, fair and comprehensive assessment of research impact was the need of the hour. Of late, the Higher Education Commission (HEC) of Pakistan has taken a bold initiative in this regard and developed a new combinatorial, proprietary and derived metric known as the journal prestige index (JPI), which takes into account six well-established, publicly available, and most influential citation-related parameters for its calculation [7]. These were chosen from a list of 29 different citation and usage metrics [8]. Both raw scores and percentiles are used to give equal weightage to all six factors….

HJRS (the HEC Journal Recognition System) represents a bold step in the right direction for academic journal recognition and ranking.”
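
The excerpt does not disclose the JPI formula (the metric is described as proprietary), but the stated ingredients (six citation-related parameters, raw scores plus percentiles, equal weighting) suggest a composite along the following lines. This is a sketch under those assumptions only: the parameter names, the min-max normalization, and the 50/50 raw/percentile blend are guesses for illustration, not the actual JPI.

```python
# Sketch of an equally weighted composite index in the spirit of the JPI
# description above. Parameter names and scaling are assumptions.
def percentile_rank(value, population):
    """Fraction of the population at or below `value`, on a 0-100 scale."""
    return 100.0 * sum(v <= value for v in population) / len(population)

def composite_index(journal, cohort, parameters):
    """Equal-weight average over all parameters, blending a min-max
    scaled raw score with the journal's percentile in its cohort."""
    total = 0.0
    for p in parameters:
        values = [j[p] for j in cohort]
        lo, hi = min(values), max(values)
        raw = 100.0 * (journal[p] - lo) / (hi - lo) if hi > lo else 0.0
        pct = percentile_rank(journal[p], values)
        total += 0.5 * raw + 0.5 * pct  # raw and percentile weighted equally
    return total / len(parameters)

# Hypothetical parameters and toy cohort for illustration only.
PARAMS = ["citescore", "sjr", "snip", "jif", "eigenfactor", "hindex"]
cohort = [
    {"citescore": 3.1, "sjr": 0.8, "snip": 1.1, "jif": 2.4, "eigenfactor": 0.002, "hindex": 40},
    {"citescore": 6.0, "sjr": 1.9, "snip": 1.6, "jif": 5.1, "eigenfactor": 0.010, "hindex": 95},
    {"citescore": 1.2, "sjr": 0.3, "snip": 0.6, "jif": 0.9, "eigenfactor": 0.001, "hindex": 15},
]
print(round(composite_index(cohort[0], cohort, PARAMS), 1))
```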