HEIs must embrace 2028 REF’s research culture focus | Times Higher Education (THE)

“The UK’s next Research Excellence Framework (REF) exercise promises changes that could make research more effective and the research environment more equitable. It is up to higher education institutions whether this rare opportunity to recognise the teams they rely on is seized or squandered….

Of course publications are vital, but they are far from the only important output, and they frequently owe their existence to other, overlooked research outputs. For instance, about 70 per cent of researchers from across disciplines report that software is fundamental to their work. But of the 186,000 outputs submitted to the last REF, only 31 were for data work and 11 for software….”

My research culture is better than yours | Wonkhe

“So whilst I disagree with Iain Mansfield that it’s a mistake to allocate 25 per cent of REF outcomes to research culture, we need to make sure this has the desired long-term effect. The risk of pitting us all against each other in some unholy research culture competition is that hyper-competition was always at the heart of so many of our unhelpful research cultures. In fact, a lot of the research culture challenges we face are outwith the agency and reach of individual institutions, leaving collaboration as our only mechanism to create real change.

One thing is for sure: if we don’t get this right and research culture does become the next big competition in HE, we all know who’s going to win: our large, old and wealthy friends, the Very Research Intensives. Not only do they do more research – a fundamental prerequisite when it comes to research culture – they also benefit from many other forms of social and economic ‘research capital’….”

Harnessing the Metric Tide: indicators, infrastructures & priorities for UK responsible research assessment

“This review was commissioned by the joint UK higher education (HE) funding bodies as part of the Future Research Assessment Programme (FRAP). It revisits the findings of the 2015 review The Metric Tide to take a fresh look at the use of indicators in research management and assessment. 

While this review feeds into the larger FRAP process, the authors have taken full advantage of their independence and sought to stimulate informed and robust discussion about the options and opportunities of future REF exercises. The report should be read in that spirit: as an input to ongoing FRAP deliberations, rather than a reflection of their likely or eventual conclusions. 

The report is written in three sections. Section 1 plots the development of the responsible research assessment agenda since 2015 with a focus on the impact of The Metric Tide review and progress against its recommendations. Section 2 revisits the potential use of metrics and indicators in any future REF exercise, and proposes an increased uptake of ‘data for good’. Section 3 considers opportunities to further support the roll-out of responsible research assessment policies and practices across the UK HE sector. Appendices include an overview of progress against the recommendations of The Metric Tide and a literature review. 

We make ten recommendations targeted at different actors in the UK research system, summarised as: 

1: Put principles into practice. 

2: Evaluate with the evaluated. 

3: Redefine responsible metrics. 

4: Revitalise the UK Forum. 

5: Avoid all-metric approaches to REF. 

6: Reform the REF over two cycles. 

7: Simplify the purposes of REF. 

8: Enhance environment statements. 

9: Use data for good. 

10: Rethink university rankings….”

Should the UK replace journals with a REF repository? | Times Higher Education (THE)

“There is a long-standing debate about whether the UK’s Research Excellence Framework is a waste of time and money given its insistence on re-assessing tens of thousands of papers that have already been reviewed by journals. Why not just base REF scores on journal rankings instead?

One answer is that, as Robert de Vries put it in a recent article for Times Higher Education, journal-administered peer review “sucks”. De Vries is conscious, though, that the obvious alternative to journals, post-publication review on subject repositories, might quickly descend into a social-media-style “attention-economy hellscape”, which would be even worse.

His solution is to oblige everyone who publishes on such platforms to undertake post-publication review to ensure that visibility is a function of merit. But I believe that a specific REF repository would be a better solution, eliminating reviewing redundancy while upholding high standards….”

Do altmetric scores reflect article quality? Evidence from the UK Research Excellence Framework 2021 – Thelwall – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract: Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with individual article quality scores. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014–2017/2018, split into 34 broadly field-based Units of Assessment (UoAs). Altmetrics correlated more strongly with research quality than previously found, although less strongly than raw and field normalized Scopus citation counts. Surprisingly, field normalizing citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best altmetric (e.g., three Spearman correlations with quality scores above 0.5); tweet counts are also a moderate-strength indicator in eight UoAs (Spearman correlations with quality scores above 0.3), ahead of news (eight correlations above 0.3, but generally weaker), blogs (five correlations above 0.3), and Facebook (three correlations above 0.3) citations, at least in the United Kingdom. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities.
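
The figures reported here are Spearman rank correlations between per-article indicator counts and peer-review quality scores within a Unit of Assessment. A minimal sketch of that calculation, using scipy.stats.spearmanr on invented numbers (the articles and counts below are illustrative, not data from the study):

```python
# Minimal sketch: rank-correlating altmetric counts with peer-review
# quality scores, per Unit of Assessment. All data here is hypothetical.
from scipy.stats import spearmanr

# Hypothetical per-article records for one UoA:
# (Mendeley readers, tweet count, REF-style quality score 1-4)
articles = [
    (120, 15, 4),
    (40, 3, 3),
    (8, 0, 2),
    (95, 22, 4),
    (12, 1, 1),
    (60, 9, 3),
]

readers = [a[0] for a in articles]
tweets = [a[1] for a in articles]
quality = [a[2] for a in articles]

# Spearman compares ranks rather than raw values, so a few heavily
# tweeted or heavily read articles do not dominate the skewed counts.
rho_readers, _ = spearmanr(readers, quality)
rho_tweets, _ = spearmanr(tweets, quality)
print(f"Mendeley readers vs quality: rho = {rho_readers:.2f}")
print(f"Tweets vs quality:           rho = {rho_tweets:.2f}")
```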

Can artificial intelligence assess the quality of academic journal articles in the next REF? | Impact of Social Sciences

“For journal article prediction, there is no knowledge base related to quality that could be leveraged to predict REF scores across disciplines, so only the machine learning AI approach is possible. All previous attempts to produce related predictions have used machine learning (or statistical regression, which is also a form of pattern matching). Thus, we decided to build machine learning systems to predict journal article scores. As inputs, based on an extensive literature review of related prior work, we chose: field and year normalised citation rate; authorship team size, diversity, productivity, and field and year normalised average citation impact; journal names and citation rates (similar to the Journal Impact Factor); article length and abstract readability; and words and phrases in the title, keywords and abstract. We used provisional REF2021 scores for journal articles with these inputs and asked the AI to spot patterns that would allow it to accurately predict REF scores….”
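
A rough sketch of the kind of pipeline the excerpt describes is below, assuming scikit-learn; the gradient-boosted model, the feature names, and the toy data are illustrative stand-ins for the inputs the authors list, not their actual system:

```python
# Illustrative sketch of a REF-score prediction pipeline of the kind the
# authors describe: bibliometric and team features plus title/abstract
# text. Feature names, model choice, and data are assumptions.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import Pipeline

df = pd.DataFrame({
    "norm_citation_rate": [1.8, 0.4, 2.9, 0.9],     # field/year normalised
    "team_size": [5, 2, 12, 3],
    "journal_citation_rate": [3.1, 1.2, 5.6, 2.0],  # JIF-like journal rate
    "abstract_readability": [42.0, 55.3, 38.9, 60.1],
    "title_abstract": [
        "deep learning for protein structure prediction",
        "a survey of medieval trade routes",
        "genome wide association study of hypertension",
        "qualitative study of teacher wellbeing",
    ],
    "ref_score": [4, 2, 4, 3],  # provisional peer-review score (target)
})

numeric = ["norm_citation_rate", "team_size",
           "journal_citation_rate", "abstract_readability"]

pipeline = Pipeline([
    ("features", ColumnTransformer([
        # Words and phrases from the title/abstract become TF-IDF features
        ("text", TfidfVectorizer(max_features=500), "title_abstract"),
        ("nums", "passthrough", numeric),
    ])),
    # Any pattern-matching learner could stand in here
    ("model", GradientBoostingRegressor()),
])

pipeline.fit(df.drop(columns="ref_score"), df["ref_score"])
print(pipeline.predict(df.drop(columns="ref_score")))
```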

Open access in scholarly publishing: Where are we now? | Research Information

“Notably, 2023 marks a decade since two important events. Not only David Bowie’s return to releasing records, but Research Councils UK’s (the predecessor to UKRI) launch of its open access policy. This was a watershed moment for UK research, a clear statement of intent to make open access a full-scale reality. But 10 years on, it is pertinent to ask, where are we now?…

In fact, 2022 certainly witnessed a continuing paradigm shift, particularly UKRI’s open access policy coming into effect for articles and conference proceedings. This represents a step-change to full and immediate open access for publicly funded research, and essentially incorporates Plan S into the UK research landscape. Similar policies have been launched by other funders, including the National Institute for Health & Care Research and Cancer Research UK. 

Moreover, 2022 saw the release of the Research Excellence Framework (REF) 2021 results, marking another milestone for open access. REF 2021’s open access mandate for journal articles and conference proceedings has arguably had the greatest impact in driving open access engagement by researchers. What was once a niche pursuit that was opposed by many researchers is now overwhelmingly regarded as an everyday part of the research lifecycle. There is a growing sense of positive engagement too, with researchers increasingly publishing open access because they want to and not just because they have to….”

[2212.07811] Do altmetric scores reflect article quality? Evidence from the UK Research Excellence Framework 2021

Abstract: Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with journal article quality. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014-17/18, split into 34 Units of Assessment (UoAs). The results show that altmetrics are better indicators of research quality than previously thought, although not as good as raw and field normalised Scopus citation counts. Surprisingly, field normalising citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best; tweet counts are also a relatively strong indicator in many fields, and Facebook, blogs and news citations are moderately strong indicators in some UoAs, at least in the UK. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities. The Altmetric Attention Score, although hybrid, is almost as good as Mendeley reader counts as a quality indicator and reflects more non-scholarly impacts.
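
"Field normalised" citation counts divide each article's citations by the average for its field and publication year, making counts comparable across disciplines; the abstract's surprise is that this adjustment can weaken the indicator within a single field. A minimal sketch of the normalisation itself, on made-up data:

```python
# Sketch of field/year normalisation: each article's citation count is
# divided by the mean for its (field, year) group. Data is hypothetical.
from collections import defaultdict

# (field, year, citations) for a handful of invented articles
articles = [
    ("oncology", 2016, 40),
    ("oncology", 2016, 10),
    ("history", 2016, 4),
    ("history", 2016, 1),
]

# Accumulate mean citations per (field, year) group
totals = defaultdict(lambda: [0, 0])  # (field, year) -> [sum, count]
for field, year, cites in articles:
    totals[(field, year)][0] += cites
    totals[(field, year)][1] += 1

for field, year, cites in articles:
    s, n = totals[(field, year)]
    normalised = cites / (s / n)  # 1.0 = average for that field and year
    print(f"{field} {year}: {cites} cites -> normalised {normalised:.2f}")
```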

RI Webinar: Conforming to the REF: An international view

“Summary

The REF is a UK-specific measure for research institutions to assess the quality of their research output and is pertinent to libraries, research offices, university planning departments and institutions. But this webcast will aim to look at this within a global context and explore other frameworks that are in place around the world.

Key takeaways

Learn how the outputs of scholarly research are evaluated globally

Hear from experts about best practices in the assessment of researchers and scholarly research

Key insights into why representation of researchers in the design of research assessment practices across the world is crucial…”

David Sweeney: UK right to pursue impact agenda | Times Higher Education (THE)

“Mr Sweeney’s powerful influence in steering the UK sector towards open-access research is a key part of his legacy, helping to set up the Finch report in 2011, which later laid down the “unanswerable” principle that “results of research that has been publicly funded should be freely accessible in the public domain”. As UK Research and Innovation’s lead on open access, Sweeney was also influential in ensuring the funder was an early supporter of Plan S, the Europe-wide open access drive, while UKRI’s own policies, which took effect in April, pushed requirements further. “The Finch report was significant and moved the dial on open access but without this global collaboration we won’t be able to move the system further,” he reflected….”

Assessing the Impact of the UK’s Research Excellence Framework on the Relationship between University Scholarly Output and Education and Regional Economic Growth | Academy of Management Learning & Education

Abstract:  This paper assesses the relationship between stakeholder influence, university scholarly and educational output, and regional economic growth. Specifically, we theorize that stakeholder intervention with respect to university teaching and learning, scholarly research, and entrepreneurship enhances the contribution of universities to regional economic growth. We test this theory using data from the UK’s Research Excellence Framework (REF), an evaluation of the research impact of British higher education institutions. We find that business school graduates, as well as graduates in STEM and health fields, have a positive impact on regional human capital development. On the other hand, stakeholder influence, through the REF, appears to have a negative effect on the retention of human capital, but a positive effect on commercialization in the region. Our findings provide new evidence of positive economic spillovers arising from university research and education and the role of fields, such as business administration, in enhancing human capital development and economic growth. They also lend credence to the notion that graduates are an important channel of knowledge and technology transfer.

Future Research Assessment Programme – UKRI

“The Future Research Assessment Programme has been initiated at the request of UK and devolved government ministers and funding bodies.

This significant piece of work will be led by the four UK higher education funding bodies:

Research England
Scottish Funding Council
Higher Education Funding Council for Wales
Department for the Economy, NI.

It aims to explore possible approaches to the assessment of UK higher education research performance.

Through dialogue with the higher education sector, the programme seeks to understand what a healthy, thriving research system looks like and how an assessment model can best form its foundation.

The work strands include evaluating the REF 2021 and understanding international research assessment practice, as well as investigating possible evaluation models and approaches, to identify those that can encourage and strengthen the emphasis on delivering excellent research and impact, and support a positive research culture, while simplifying and reducing the administrative burden on the HE sector.

This programme of work is expected to conclude by late 2022….”

Open Access Monographs: Making Mandates Reality, Thu 23 Jun 2022 at 14:00 | Eventbrite

“This half-day webinar galvanises a much-needed sector-wide conversation on OA monographs in the context of the UK’s policy landscape. Expert panels of speakers from the library, publishing and policy worlds will outline the current state-of-play and discuss how we can move to meet the imminent OA mandates from cOAlition S/Plan S in Europe and UKRI in the UK, and potential implications of the REF.

Featuring expert speakers from UKRI (Rachel Bruce) and Jisc (Caren Milloy), the event will open with a discussion of monograph policies and mandates before moving to an academic viewpoint from Professor Martin Eve (Birkbeck, University of London) who will talk about various international OA funding models and the need to move quickly from pilot phases to business as usual.

The second half of the session will highlight the challenges of getting OA metadata into supply chains and systems often designed for closed books, and will discuss the concomitant challenges posed by metrics and reporting on OA books (speakers TBC). The afternoon will close with a view from the library perspective and expert speakers from the libraries at the Universities of York (Sarah Thompson), Aberdeen (Simon Bains) and Imperial College (Chris Banks). There will be time for Q&A after each set of speakers….”

Evaluating research assessment | Jisc

“A large-scale review, commissioned by Research England on behalf of the four higher education funding bodies and published by RAND Europe, collected attitudes to the REF in real time as UK institutions prepared their submissions. It gathered views via a survey (with 3,000+ researcher responses), as well as focus groups and one-to-one interviews with researchers, research managers, and institutional leads.

The review also considered the impact of changes made to the REF since the previous exercise in 2014….”

Industry not harvest: Principles to minimise collateral damage in impact assessment at scale | Impact of Social Sciences

“As the UK closes the curtains on the Research Excellence Framework 2021 (REF2021) and embarks on another round of consultation, there is little doubt that, whatever the outcome, the expectation remains that research should be shown to be delivering impact. If anything, this expectation is only intensifying. Fuelled by the stated success of REF 2014, the appetite for impact assessment also appears – at least superficially – to be increasing internationally, albeit largely stopping short of mirroring a fully formalised REF-type model. Within this context, the UK’s Future Research Assessment Programme was recently announced, with a remit to explore revised or alternative approaches. Everything is on the table, so we are told, and the programme sensibly includes the convening of an external body of international advisors to cast their, hopefully less jaded eyes upon proceedings….”