“DORA sought to fund ideas to advance assessment reform at academic institutions at any stage of readiness. Projects could be targeted to any level within an academic institution, including (but not limited to) reform efforts at the graduate program, department, library, or institution level, and should address one or more key aspects of educating, planning, implementing, training, iteratively improving, and scaling policies and practices. More than 55 ideas were submitted from individuals and teams in 29 countries! After careful review, members of the Steering Committee selected 10 proposals to support….”
“Global research and education leader Wiley today announced it has signed the Declaration on Research Assessment (DORA), which is a world-wide initiative designed to improve the ways in which the outputs of scholarly research are evaluated.
As the publisher of nearly 2,000 academic journals, Wiley will deliver more ways to assess and recognize research outputs, which in turn supports healthy scholarship and allows more researchers to thrive in their careers. To this end, Wiley will roll out a broad range of journal and article metrics across its journal portfolio with the aim of providing a holistic, well-rounded view of the value and impact of any author’s research. This includes metrics that measure levels of impact beyond citation value, including usage, re-use, reproducibility, peer review assessment, geographic reach, and public recognition via references in media outlets….”
“The review, promotion, and tenure (RPT) process is central to academic life and workplace advancement. It influences where faculty direct their attention, research, and publications. By unveiling the RPT process, we can inform actions that lead towards a greater opening of research.
Between 2017 and 2022, we conducted a multi-year research project involving the collection and analysis of more than 850 RPT guidelines and 338 surveys with scholars from 129 research institutions across Canada and the US. Starting with a literature review of academic promotion and tenure processes, we launched six studies applying mixed-methods approaches such as surveys and matrix coding.
So how do today’s universities and colleges incentivize open access research? Read on for 6 key takeaways from our studies….”
Abstract: There has been much debate around the role of metrics in scholarly communication, with particular focus on the misapplication of journal metrics, such as the impact factor in the assessment of research and researchers. Various initiatives have advocated for a change in this culture, including the Declaration on Research Assessment (DORA), which invites stakeholders throughout the scholarly communication ecosystem to sign up and show their support for practices designed to address the misuse of metrics. This case study provides an overview of the process undertaken by a large academic publisher (Taylor & Francis Group) in signing up to DORA and implementing some of its key practices in the hope that it will provide some guidance to others considering becoming a signatory. Our experience suggests that research, consultation and flexibility are crucial components of the process. Additionally, approaching signing with a project mindset versus a ‘sign and forget’ mentality can help organizations to understand the practical implications of signing, to anticipate and mitigate potential obstacles and to support cultural change.
“Good examples of leadership by example include work done by UKRI and NWO on narrative CVs. Some institutions are being bold as well, for example the work being done at Technical University of Denmark on Open Science Research Profiles; TU Delft are pioneering a cooperative self-assessment approach to research quality assurance; and of course, the pioneering work at Leiden University. Complementing these efforts are innovations in infrastructure like the CRediT taxonomy and RAiD, the research activity identifier from ARDC. There are more leadership examples listed on this page compiled by Rachel Miles at Virginia Tech….”
DORA challenges academic institutions, funding organizations, and scholarly societies to include a broader representation of researchers in the design of responsible research assessment policies and practices. We apply this same standard to our organization, and we want to do our utmost to address the long-standing structural inequalities that limit participation and success in academia.
Over the past 18 months, DORA has reviewed our operational structure to see how we can better live our values as an organization. While we have global aspirations to accelerate research assessment reform, our governance effectively limited participation in the Steering Committee to representatives from organizations in Europe and North America. Although our Advisory Board had representation from every continent and provided a global perspective, this dual committee structure did not embody the aspirations for equity that we wish to see incorporated into responsible research assessment. The DORA Steering Committee and Advisory Board therefore worked collaboratively—through multiple rounds of discussion, feedback and refinements—to develop a new governance set-up. The work was informed by examples drawn from the wider scholarly communications community. We are particularly grateful to Invest in Open Infrastructure and Code for Science & Society for sharing their work on developing anti-racist governance.
“Here we outline reasons for and against accepting Elsevier’s 7th proposal. It is based on information provided by the University of Cambridge library. …
We recommend the deal be rejected after considering the following pros and cons. If you agree, please consider signing at the bottom of the document, and share with colleagues. (Anyone from the UK can sign.) …”
“Each year, DORA reflects on progress toward responsible research assessment. How is research valued in different communities and how might that have changed in 2021? What tools are the community creating to support policy development? What types of research assessment policies are being developed to reduce the influence of journal-based metrics and recognize a broad range of contributions? How are communities coming together to improve practice and support culture change?
The following list of new developments and recommended reading, viewing, and listening was created with input from the DORA Steering Committee. While the search was extensive, it was not exhaustive, and we might have missed something in the process. Please let us know what other advances we should consider adding to the resource library (email firstname.lastname@example.org). …”
From Google’s English: “The approach by which Dutch science has risen to the top 5 in the world since the 1980s is under threat, write Raymond Poot and more than a hundred other scientists. Not through Open Access or Recognition and Valuation as such, but through their linkage to the signing of DORA and the roll-out of Open Science. In this contribution, Poot shares the conclusions and recommendations from a study into the consequences of Open Science and DORA. “A scenario of an internationally competitive Dutch science, where different talents can come into their own, is entirely possible. However, the current policy has to be drastically adjusted for that.” …
Dutch scientists are no longer assessed on the basis of international, scientific and measurable criteria, as was done very successfully at NWO for thirty years. These criteria have been partly removed by Open Science and DORA and replaced by criteria that are politically motivated and difficult to measure. As we described in our previous contribution in ScienceGuide, the negative effects of Open Science and DORA at NWO are amplified because measurable criteria are replaced by narratives. Sometimes the CV is even omitted entirely. …
To show that ‘policy’ based on Open Science and DORA carries major risks that we should not become accustomed to, I wrote a report with Bas de Bruin and Frank Grosveld that goes deeper into the matter. The report is supported by 105 scientists (further support for the report can be emailed to Raymond Poot). In the report we discuss the effects of DORA on evaluations, and we examine the underlying reasoning behind DORA. We also discuss the focus of Open Science on the (direct) benefit of research for society, the focus on public involvement in research, and the focus on team science and leadership. Finally, we discuss the current Open Access policy of Open Science, Plan S, which aims to enforce Open Access for all Dutch scientific publications.
The conclusions of our report are alarming.
1) The combination of different Open Science policies with DORA puts the fundamental sciences at a disadvantage compared to the more applied sciences. Through the ERC and Marie Curie competitions, Europe spends twenty-five percent of its innovation budget on scientist-initiated fundamental research, which is selected for excellence. The Netherlands spends only five percent of its budget on such research. Europe has a reason to spend so much on scientist-initiated research, according to conclusion two of our report.
2) Scientist-initiated fundamental research that is selected on the basis of scientific quality provides considerably more social benefit per euro spent in the medium term than research that is selected on the basis of direct social or industrial relevance. This apparent paradox is related to the observation that the usefulness of scientific discoveries is very difficult to predict, while it is clear that without real discoveries there is little progress. While this message is difficult to sell to politicians, it is a very important one.
3) Various Open Science measures reduce the quality of Dutch science by not selecting for scientific quality and at the same time creating a lot of bureaucracy. …”
“As institutions experiment with and refine academic assessment policies and practices, there is a need for knowledge sharing and tools to support culture change. On September 9, 2021, we held a community call to gather early-stage input for a new resource: an interactive online dashboard to identify, track, and display good practices for academic career assessment. The dashboard is one of the key outputs of Tools to Advance Research Assessment (TARA), a DORA project to facilitate the development of new policies and practices for academic career assessment, sponsored by Arcadia, a charitable fund of Lisbet Rausing and Peter Baldwin….
It comes as no surprise that academic assessment reform is complex. Institutions are at different stages of readiness for reform and have implemented new practices in a variety of academic disciplines, career stages, and evaluation processes. The dashboard aims to capture this progress and provide counter-mapping to common proxy measures of success (e.g., Journal Impact Factor (JIF), H-index, and university rankings). Currently, we envision that the general uses of the dashboard will include:
Tracking policies: Collecting academic institutional standards for hiring, promotion, and tenure.
Capturing new and innovative policies: Enabling the ability to share new assessment policies and practices.
Visualizing content: Displaying source material to see or identify patterns or trends in assessment reform.
Because the dashboard will highlight positive trends and examples in academic career assessment, it is important to define what constitutes good practice. One idea comes from the 2020 working paper from the Research on Research Institute (RoRI), where the authors define responsible research assessment as: approaches to assessment which incentivize, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures….”
“Be as open as you can, publish as openly as you can, submit preprints and open data – but continue publishing in the journals that you think are the best for your career. No one has to become an open science martyr; you can be open without harming your career chances. But at the same time, recognize the deep flaws of the current system of evaluation and rewards and call for reform – as an ERC grantee your voice carries weight….”
“Over the past year, it has become increasingly clear that research assessment reform is a systems challenge that requires collective action. Point interventions simply do not solve these types of complex challenges that involve multiple stakeholders. Because of this, we dedicated our efforts in 2020 to building a community of practice and finding new ways to support organizations seeking to improve the decision-making that impacts research careers.
Current events also influenced our approach this year and shaped our thinking about research assessment reform. The Covid-19 pandemic abruptly disrupted academic research worldwide, as it did many other industries. For academics with limited access to research laboratories and other on-campus resources, work stalled. Without appropriate action, this disruption will have a profound effect on the advancement and promotion of the academic workforce, and it will likely disproportionately affect women and underrepresented and minoritized researchers. So in April DORA called on institutions to redefine their expectations and clearly communicate how evaluation procedures will be modified. In May, DORA organized a webinar with Rescuing Biomedical Research to better understand specific faculty concerns as a result of the pandemic….
In the Fall of 2020, DORA initiated a new community project with Schmidt to develop a means for institutions to gauge their ability to support academic assessment interventions and set them up for success. Our goal for the project was to support the development of new practices by helping institutions analyze the outcomes of their efforts. More than 70 individuals across 26 countries on 6 continents responded to our informal survey in August, and about 35 people joined us for 3 working sessions in September. From these activities, we heard that it was important to look beyond individual interventions to improve assessment, because the success of these interventions depends on institutional conditions and capabilities. We were also reminded that institutional capabilities shape interventions, so it is important not only to gauge success but also to actively support interventions. These and other insights led us to create SPACE to Evolve Academic Assessment: a rubric for analyzing institutional conditions and progress indicators. The first draft of the rubric was developed in the last quarter of 2020. The final version was released in 2021 after an initial pilot phase with seven members of the academic community, including a college dean, policy advisor, research administrator, faculty member, and graduate student….
Another addition to the website was a repository of case studies documenting key elements of institutional change to improve academic career assessment, such as motivations, processes, timelines, new policies, and the types of people involved. The repository, Reimagining academic assessment: stories of innovation and change, was produced in partnership with the European University Association and SPARC Europe. At the time of launch, the repository included 10 structured case studies coming from 7 universities and 3 national consortia. Nine of the 10 cases are from Europe and one is from China. The case studies have shown us the importance of coalition-building to gain bottom-up support for change. We also learned that limited awareness and capacity for incentivizing and rewarding a broader range of academic activities were challenges that all the cases had to overcome. By sharing information about the creation of new policies and practices, we hope the case studies will serve as a source of inspiration for institutions seeking to review or improve academic career assessment….
Policy progress for research assessment reform continued to gain momentum in 2020. A new national policy on research assessment in China announced in February prohibits cash rewards for research papers and indicates that institutions can no longer exclusively hire or promote researchers based on their number of publications or citations. In June, Wellcome published guidance for research organizations on how to implement responsible and fair approaches for research assessment that are grounded in…
“One lesson is that the declaration’s authors did not consider redundancy as a possible outcome of research assessment, focusing instead on hiring, promotion and funding decisions. However, in my view, redundancy processes should not be delegated to crude metrics and should be informed by the principles of Dora.
That said, it is not Dora’s job as an organisation to intervene in the gritty particulars of industrial disputes. Nor can we arbitrate in every dispute about research assessment practices within signatory organisations. …
Recently, we have re-emphasised that university signatories must make it clear to their academic staff what signing Dora means. Organisations should demonstrate their commitment to Dora’s principles to their communities, not seek accreditation from us. In doing so, they empower their staff to challenge departures from the spirit of the declaration. Grant conditions introduced by signatory funders such as the Wellcome Trust and Research England buttress this approach.
Dora’s approach to community engagement taps into the demand for research assessment reform while acknowledging the lack of consensus on how best to go about it. The necessary reforms are complex, intersecting with the culture change needed to make the academy more open and inclusive. They also have to overcome barriers thrown up by academics comfortable with the status quo and the increasing marketisation of higher education. In such a complex landscape, Dora has no wish to be prescriptive. Rather, we need to help institutions find their own way, which will sometimes mean allowing room for course corrections….”
“Dutch academics are putting their foot on the gas in the rebellion against the uncritical use of journal impact factors to recognise and reward researchers, which was set in motion by the 2012 San Francisco Declaration on Research Assessment, or DORA.
From early next year, Utrecht University in the Netherlands will officially stop using the so-called ‘impact factor’ in all its hiring and promotions and judge its researchers by their commitment to open science, teamwork, public engagement and data sharing.
And despite opposition from some Dutch professors, the sweeping changes are gathering pace, with Leiden University among the Dutch institutions also pledging their support with their Academia in Motion paper….”
“A US$1.2 million grant will fund an effort to identify and publicize the criteria that universities around the world use to hire and promote researchers. The Declaration on Research Assessment (DORA), a global initiative to reform the evaluation of researchers, will use part of the funds to create an interactive dashboard that will shine much-needed light on a process that is often opaque and controversial, says programme director Anna Hatch, who is based in Washington DC. “When criteria are visible and transparent, universities can be held accountable,” she says. “Researchers will know how their contributions will be measured, so they can make a better case for themselves.”
DORA, conceived in 2012 at the annual meeting of the American Society for Cell Biology, called for improvements to the evaluation of researchers and the outputs of scholarly research. The declaration specifically calls for doing away with impact factors as a way to judge the merit of academics. So far, it has been signed by more than 20,000 individuals and institutions around the world.
The grant is from the Arcadia Fund, a UK-based charity that has supported many academic initiatives since its founding in 2001….”