Research assessment exercises are necessary — but we need to learn to do them better

“Research evaluation at the Australian Research Council (ARC), one of the country’s main funding bodies, is set to get a makeover. Last month, an independent expert review recommended that the ARC scrap its 13-year-old research-scoring system, known as Excellence in Research for Australia (ERA), and its companion exercise, the Engagement and Impact assessment, which grades the real-world benefits of institutions’ work. Both had been on hold since last August, awaiting the findings of the review.

This is a rare case in which an evaluation system can be rewritten from scratch. The ARC should take this opportunity to improve how it measures and communicates the value of Australia’s research workforce, on the basis of not just lessons learnt from the ERA’s deficiencies, but also principles that have been developed and implemented elsewhere in the world. In doing so, it will help to create a research culture that reflects the best possible values that research should represent….”

Science Europe signs DORA – Science Europe

“Agreed at the last Science Europe Governing Board meeting and based on discussions and advice from the Science Europe Working Groups on Research Culture and Open Science, we are pleased to announce that Science Europe is signing the San Francisco Declaration on Research Assessment (DORA).”

What Should Impact Assessment Look Like for Social Science? — Sage

“A decade ago, the San Francisco Declaration on Research Assessment, or DORA, tackled the pressing need to improve how funders, institutions, policy makers and others evaluated scientific research and its outputs. Existing measures, centered on scholarly citation, tended to use where the outputs were published as a proxy for the research’s quality, utility, and impact, measuring all disciplines with the same yardstick.

In the 10 years since, various efforts to improve assessment and measure societal impact have launched that downplay or even eliminate literature-based measurements. Ideas for these new measures focus on impact in the real world, address disciplinary differences such as those between social science and physical science, and offer useful tools for researchers and end-users alike.

This panel will engage representatives of various social and behavioral science disciplines, as well as publishers, to discuss:

What does impact assessment look like from their perch?

What should it look like?

How have their perspectives on impact changed over the last decade?

What changes would they like to see 10 years from now?

What necessary next steps should be taken – whether immediately practical or aspirational?…”

DORA 10th Anniversary Events | DORA

“The San Francisco Declaration on Research Assessment (DORA) will be 10 years old on May 16, 2023.

To mark the occasion, we invite the worldwide community to join with us in organizing a week of events to examine both DORA’s impact on the reform of research assessment and the challenges that still lie ahead. The momentum for change is building across the globe, but we need to keep pushing together to achieve our goals!…”

Towards the future of responsible research assessment: Announcing DORA’s new three-year strategic plan | DORA

“Five years after making a decisive shift to become an international campaigning initiative and completing the aims of its first strategic plan, the San Francisco Declaration on Research Assessment (DORA) is pleased to announce a new three-year strategic plan to continue its focus on implementing global research assessment reform.

To meet the goals of our first strategic plan, published in 2018, we have:

Worked to increase awareness through the creation of the resource library, our social media, public events and community calls, and co-hosting the meeting Driving Institutional Change for Research Assessment Reform with HHMI.
Promoted the use of tools and processes to implement reform by partnering with community members to hold workshops and create resources, establishing an international dialogue via the creation of communities of practice for research funders and initiatives working to implement assessment reform, and creating the case study repository and the community engagement grants program.
Extended the disciplinary and geographic scope of DORA by updating our operational structure to do our utmost to distribute power and address structural inequities that limited participation in the Steering Committee….”

Young researchers in action: the road towards a new PhD evaluation | DORA

“Less emphasis on bibliometrics, more focus on personal accomplishments and growth in research-related competencies. That is the goal of Young Science in Transition’s (Young SiT) new evaluation approach for PhD candidates in Utrecht, the Netherlands. But what do PhD candidates think about the new evaluation? With the DORA engagement grant, we did in-depth interviews with PhD candidates and found out how the new evaluation can be improved and successfully implemented.

The beginning: from idea to evaluation

Together with Young SiT, a thinktank of young scientists at the UMC Utrecht, we (Inez Koopman and Annemijn Algra) have been working on the development and implementation of a new evaluation method for PhD candidates since 2018.1 In this new evaluation, PhD candidates are asked to describe their progress, accomplishments and learning goals. The evaluation also includes a self-assessment of their competencies. We started bottom-up, small, and locally. This meant that we first tested our new method in our own PhD program (Clinical and Experimental Neurosciences, where approximately 200 PhDs are enrolled). After a first round of feedback, we realized the self-evaluation tool (the Dutch PhD Competence Model) needed to be modernized. Together with a group of enthusiastic programmers, we critically reviewed its content, gathered user feedback from various early-career networks and transformed the existing model into a modern and user-friendly web-based tool.2

In the meantime, we started approaching other PhD programs from the Utrecht Graduate School of Life Sciences (GSLS) to further promote and enroll our new method. We managed to get support ‘higher up’: the directors and coordinators of the GSLS and Board of Studies of Utrecht University were interested in our idea. They too were working on a new evaluation method, so we decided to team up. Our ideas were transformed into a new and broad evaluation form and guide that can soon be used by all PhD candidates enrolled in one of the 15 GSLS programs (approximately 1,800 PhDs).

However, during the many discussions we had about the new evaluation, one question kept popping up: ‘but what is the scientific evidence that this new evaluation is better than the old one?’ Although the old evaluation, which included a list of all publications and prizes, was also implemented without any scientific evidence, it was a valid question. We needed to further understand the PhD perspective, and not only the perspective of PhDs in early-career networks. Did PhD candidates think the new evaluation was an improvement, and if so, how could it be improved even further?

We used our DORA engagement grant to set up an in-depth interview project with a first group of PhD candidates using the newly developed evaluation guide and new version of the online PhD Competence Model. Feedback about the pros and cons of the new approach helps us shape PhD research assessment….”

Navigating Responsible Research Assessment Guidelines – Leiden Madtrics

“Responsible Research Assessment is discussed and used in many contexts. However, Responsible Research Assessment does not have a unifying definition, and likewise its guidelines indicate that the implementation of Responsible Research Assessment can have many different scopes.

Research assessment has a long history of continuously introducing new methods, tools, and agendas — for example, peer review of publications dating back to the 17th century and catalogues from the 19th century that facilitated publication counting. This blog post discusses Responsible Research Assessment (RRA), an agenda gaining attention today. The blog post gives an introduction to RRA and discusses how to navigate RRA guidelines, which can be a complex task….”

DORA at 10: Looking back at the history and forward to the future of research assessment | DORA

“DORA will be 10 years old in May 2023 and we are planning to mark the occasion! We’ll be holding a weeklong celebration for DORA’s 10th Anniversary and we’re inviting you to join in by organizing an event on research assessment for your local community. We want to have conversations about what DORA has done and what we still need to do all over the globe! DORA’s 10th Anniversary Celebration will be comprised of two parts:

Two plenary online sessions to discuss the state of the field, our past decade of work, and our future plans.
A global program of local or regional events that will allow communities to share insights and challenges in reforming, innovating, and researching responsible research assessment policies and practices….”

‘The attitude of publishers is a barrier to open access’ | UKSG

“Transitioning to open research is incredibly important for the University of Liverpool for two reasons: the external environment we are now operating in, and our own philosophy and approach to research.

But there are barriers, particularly the research culture and the attitude of publishers….

In my experience, the biggest barrier is culture: researchers are used to operating in a particular way. Changing practice and mindset takes time and must be conducted sensitively.

Open research benefits all researchers, so having their support on this journey is vitally important.

Some researchers are concerned that publishing their work open access has implications for their intellectual property (IP) rights. In fact, this is a perceived problem, since the same IP protections apply to all work, whether published behind a paywall or published open access.

Despite the recognition that citation metrics are not a suitable proxy for research assessment, some researchers continue to seek the kudos of publishing in a so-called prestige journal with a high impact factor, such as ‘Nature’. They see this as a key career goal and worry their progression will falter without this achievement….

So, while I acknowledge there has been significant progress towards open access globally, and in particular compliance with UKRI’s open access policy, the attitude of publishers, which are driven by profit margins, continues to be an unacceptable barrier….”

Responsible Research Assessment I: Implementing DORA for hiring and promotion in psychology | PsychArchives

Abstract:  The use of journal impact factors and other metric indicators of research productivity, such as the h-index, has been heavily criticized for being invalid for the assessment of individual researchers and for fueling a detrimental “publish or perish” culture. Multiple initiatives call for developing alternatives to existing metrics that better reflect quality (instead of quantity) in research assessment. This report, written by a task force established by the German Psychological Society, proposes how responsible research assessment could be done in the field of psychology. We present four principles of responsible research assessment in hiring and promotion and suggest a two-step assessment procedure that combines the objectivity and efficiency of indicators with a qualitative, discursive assessment of shortlisted candidates. The main aspects of our proposal are (a) to broaden the range of relevant research contributions to include published data sets and research software, along with research papers, and (b) to place greater emphasis on quality and rigor in research evaluation.


The Commission signs the Agreement on Reforming Research Assessment and endorses the San Francisco Declaration on Research Assessment

“Today, the Commission has signed the Agreement on Reforming Research Assessment. The Agreement sets a common direction for changes in assessment practices for research, researchers and research organisations, with the goal to maximise the quality and impact of research. It covers the principles, commitments and timeframe for reforms and lays out the principles for the Coalition for Advancing Research Assessment (CoARA). The Coalition is a group of organisations willing to work together to implement the reform. The Coalition’s establishment is one of the main expected outcomes of the European Research Area (ERA) Policy Agenda for 2022-2024, which includes an action to advance the reform of the assessment system for research, researchers and institutions.


Together with the Agreement’s signature, today also marked the Commission’s endorsement of the San Francisco Declaration on Research Assessment (DORA), which sets recommendations to improve the evaluation of researchers and the outputs of scholarly research. The Commission signed the Agreement and endorsed DORA in its capacity as a research funding organisation….”


DORA’s new policy on engagement and outreach for organizational signatories | DORA

DORA is pleased to announce today the publication of our Engagement and outreach policy for organizational signatories to the San Francisco Declaration on Research Assessment.

This policy aims to give a more specific answer to the question ‘How should DORA be enforced?’, which has arisen on a number of occasions when reports have been received of signatory organizations apparently not acting in compliance with the provisions of the Declaration. On such occasions, DORA has reiterated that it is not an accrediting organization but seeks, through constructive dialogue, to resolve any misunderstandings.