Community Call: Introducing the 2022 Project TARA tools to support responsible research assessment | DORA

“Join DORA for a community call to introduce two new responsible research evaluation tools and provide feedback on future tool development. The toolkit is part of Project TARA, which aims to identify, understand, and make visible the criteria and standards universities use to make hiring, promotion, and tenure decisions. This interactive call will explore these new tools, which were each created to help community members who are seeking:

Strategies on how to debias committees and deliberative processes: It is increasingly recognized that more diverse decision-making panels make better decisions. Learn how to debias your committees and decision-making processes with this one-page brief.
Ideas on how to incorporate a wider range of contributions in their evaluation policies and practices: Capturing scholarly “impact” often relies on familiar suspects like h-index, JIF, and citations, despite evidence that these indicators are narrow, often misleading, and generally insufficient to capture the full richness of scholarly work. Learn how to consider a wider breadth of contributions in assessing the value of academic activities with this one-page brief….”

Grants and hiring: will impact factors and h-indices be scrapped?

“Universities, scientific academies, funding institutions and other organizations around the world will have the option to sign a document that would oblige signatories to change how they assess researchers for jobs, promotions and grants.

Signatories would commit to moving away from standard metrics such as impact factors, and adopting a system that rewards researchers for the quality of their work and their full contributions to science. “People are questioning the way they are being evaluated,” says Stephane Berghmans, director of research and innovation at the European University Association (EUA). The Brussels-based group helped to draft the agreement, which is known as the Agreement on Reforming Researcher Assessment. “This was the time.” 

Universities and other endorsers will be able to sign the agreement from 28 September. The European Commission (EC) announced plans last November for putting together the agreement; it proposed that assessment criteria reward ethics and integrity, teamwork and a variety of outputs, along with ‘research quality’ and impact. In January, the commission began to draft the agreement with the EUA and others….”

Supporting Open Science in the Promotion & Tenure Process: Lessons from the University of Maryland

“The academic promotion and tenure process establishes the incentive structure for institutions of higher education. Open science champions have long advocated for the process to better reflect important open science scholarship that is often under-valued and neglected in academia. This webinar will highlight the five-year effort in the Psychology Department at the University of Maryland to adopt new guidelines that explicitly codify open science as a core criterion in tenure and promotion review. Discussion will include forces supporting and resisting open science behaviors and strategies for creating buy-in across the department. According to Dr. Dougherty, Department Chair, the new policy was necessary to ensure incentives for advancement reflect the values of scientists and their institutions.”

“Open Access Publishing Biases OER” by Chelsee Dickson and Christina Holm

Knowing that the peer review process can introduce issues of bias, what then of other aspects of the publishing cycle? For example, what of the subvention funding provided by some institutions to support their faculty in pursuing dissemination of research in Open Access (OA) journals? This Open Educational Resource (OER) will present an overview of the OA landscape and provide learners with tools to develop their own inquiries into the inequities present within the OA publishing industry. All assignments include suggested grading rubrics and build upon one another in a cumulative manner.

Colleges Should Reward Efforts to Make Research Open | MIT Libraries News

“We applaud the August 25 memorandum from the White House Office of Science and Technology Policy (OSTP) on Ensuring Free, Immediate, and Equitable Access to Federally Funded Research that calls on federal agencies to develop policies that will provide immediate open access to the outputs of federally funded research (“‘A Historic Moment’: New Guidance Requires Federally Funded Research to Be Open Access,” The Chronicle, August 25).

The potential benefits of immediate open access to research articles and to the data underlying the research include improving rigor and reliability, increased opportunity for reuse of data to ask new questions, faster and wider dissemination of new knowledge, broader participation in the research process, and the potential to reduce global inequities in publishing of and access to federally funded research.

Along with a diverse community of long-time advocates of open scholarship, we welcome the new OSTP guidance and its potential for accelerating a transition to a more open and equitable scholarly ecosystem. Funder requirements, however, are only one element of a complex system of norms and incentives. A major barrier to the widespread embrace of — and therefore the ultimate success of — mandates like the OSTP guidance is the degree to which scholars experience current incentive systems as at odds with practicing open scholarship. When individual career success incentives and reward systems — as codified in hiring, promotion, and tenure standards — are experienced as misaligned with open scholarship values and mandates, individual scholars are left in an impossible bind. Left unresolved, this misalignment will undermine the potential positive impacts of open scholarship generally and the OSTP guidance specifically, as many scholars are likely to navigate the seemingly inherent tensions via pro-forma compliance at best, and active resistance at worst. Something has to give.

The good news is that universities can make simple changes to hiring, promotion, and tenure practices to ensure that the work scholars do to make their research openly available is recognized and rewarded. Including language in hiring, promotion, and tenure guidelines that signals that open sharing of research outputs, and the impact of that sharing, is valued will go a long way toward aligning the incentives for career success with the practice of open scholarship — making what is now increasingly required, also what is rewarded.”

CLACSO Declaration: “A New Academic and Scientific Evaluation for a Science with Social Relevance in Latin America and the Caribbean” | Universo Abierto

From Google’s English: “This declaration was approved by the XXVII General Assembly of CLACSO, within the framework of the 9th Latin American and Caribbean Conference on Social Sciences, in Mexico City in June 2022. It was enriched with the contributions of various regional and international specialists and representatives of CLACSO member centers, who participated in the plenary ‘Balance, perspectives and challenges for a new agenda for academic evaluation in Latin America and the Caribbean’ at the International Seminar of the Latin American Forum on Scientific Evaluation (FOLEC)-CLACSO during the 9th Conference.

In this way, CLACSO-FOLEC, together with a multiplicity of actors committed to the issue, has consolidated a common Declaration of Principles, with broad consensus, on responsible academic evaluation from and for Latin America and the Caribbean. Following these guidelines, CLACSO-FOLEC seeks to promote the implementation of these principles – converted into proposals and tools for action – by the national science and technology organizations and the scientific and higher education institutions of the region. It likewise promotes the study and survey of good practices and innovations in evaluation processes….

We would very much like your individual and/or institutional support for the Declaration. You can add your endorsement via the link.”

ALLEA’s Response to Council Conclusions on Research Assessment and Open Science – ALLEA

“ALLEA welcomes the adoption of the Conclusions on Research Assessment and Implementation of Open Science by the Council of the European Union on 10 June.

The Conclusions are in agreement with points that ALLEA has made over the years, in particular on the necessity of appropriately implementing and rewarding open science practices and the development of research assessment criteria that follow principles of excellence, research integrity and trustworthy science.

At the same time, ALLEA continues to stress that it matters how we open knowledge, as the push for Open Access publishing has also paved the way for various unethical publishing practices. The inappropriate use of journal- and publication-based metrics in funding, hiring and promotion decisions has been one of the obstacles in the transition to a more open science, and furthermore fails to recognize and reward the diverse set of competencies, activities, and outputs needed for our research ecosystem to flourish….”

Reforming research assessment: the Agreement is now final

“Launched in January 2022 as a co-creation exercise, the process of drafting an agreement for reforming research assessment has reached an important milestone. On 8 July, the final version of the agreement was presented at a Stakeholder Assembly bringing together the 350+ organisations from 40+ countries that had expressed interest in being involved in the process. Today, the final Agreement is made public with this news.

Organisations involved include public and private research funders, universities, research centres, institutes and infrastructures, associations and alliances thereof, national and regional authorities, accreditation and evaluation agencies, learned societies and associations of researchers, and other relevant organisations, representing a broad diversity of views and perspectives. They have provided feedback to the evolving drafts of the agreement, as prepared by a team composed of representatives from the European University Association (EUA), Science Europe, the European Commission and Dr Karen Stroobants in her individual capacity as researcher with expertise in research on research. A core group of 20 research organisations, representing the diversity of the research community across Europe, also contributed to the drafting process, while EU Member States and Associated Countries have been consulted on the agreement in the framework of the ERA Forum and the European Research Area Committee (ERAC).

The Agreement on Reforming Research Assessment sets a shared direction for changes in assessment practices for research, researchers and research performing organisations, with the overarching goal to maximise the quality and impact of research. The Agreement includes the principles, commitments and timeframe for reforms and lays out the principles for a Coalition of organisations willing to work together in implementing the changes. The final version of the Agreement can be accessed here….”

Research assessment reform in action: Updates from research funders in Canada and New Zealand | DORA

“Research funding organizations are often thought of as leaders in the movement toward more responsible research evaluation practices. Often, the perception of “excellence” in research culture is filtered through the lens of who and what type of work receives funding. However, when a narrow set of indicators is used to determine who receives funding, the result can be a subsequent narrowing of academia’s perceptions of research excellence (e.g., journal impact factor (JIF), h-index). This places funders in the unique position of being able to help “set the tone” for research culture through their own efforts to reduce reliance on flawed proxy measures of quality and implement a more holistic approach to the evaluation of researchers for funding opportunities. Whether funders are seeking inspiration from their peers or insight on iterative policy development, the ability to come together to discuss reform activity is critical for achieving widespread culture change. At DORA’s June Funder Community of Practice (CoP) meetings, we heard how DORA is being implemented by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the New Zealand Ministry of Business, Innovation and Employment (MBIE)….”

European Commission signs first grant agreements under Horizon Europe | European Research Executive Agency

The European Commission recently signed grant agreements with 49 projects that successfully applied to Horizon Europe: Reforming and Enhancing the European R&I System and Research Infrastructures.  

Find out more about these two funding opportunities and the upcoming projects below.   

Reforming and Enhancing the European R&I System 

Reforming the European R&I System is part of Horizon Europe’s ‘Widening participation and strengthening the European Research Area’ call (Destination 3).

The call for funding opened on 8 June 2021 and closed on 23 September 2021.

Out of the 44 applications received, 20 projects covering 15 topics were funded, for a total of about 50.5 million euros of European Commission contribution. 

Projects start between June 2022 and September 2022.

Find below an overview of the selected projects per call topic(s)/type(s) of action:

[…]

Journal prestige is still important in how scholars judge one another

“Aside from an individual’s personal interactions with another academic, the perceived quality of the journal where a researcher publishes is the most influential factor when forming an opinion on their academic standing, with almost half (49 percent) of 9,609 respondents saying it is important and 12 percent saying it is most important.

Asked about citation metrics, 24 percent say a scholar’s h-index and other similar measures are important, and 5 percent say they are the most crucial factor….

Last month more than 350 organizations from more than 40 countries signed a new compact, building on the 2015 Leiden Manifesto, under which research would be evaluated mainly on qualitative measures and journal-based metrics abandoned. That agreement came nearly 10 years after the signing of the San Francisco Declaration on Research Assessment, which sought to phase out the use of journal-based metrics when making funding, appointment and promotion decisions, and which has now been signed by almost 20,000 individuals and 2,600 institutions worldwide….”

Recognizing Our Collective Responsibility in the Prioritization of Open Data Metrics · Issue 4.3, Summer 2022

Abstract: With the rise in data-sharing policies, development of supportive infrastructure, and the amount of data published over the last decades, evaluation and assessment are increasingly necessary to understand the reach, impact, and return on investment of data-sharing practices. As biomedical research stakeholders prepare for the implementation of the updated National Institutes of Health (NIH) Data Management and Sharing Policy in 2023, it is essential that the development of responsible, evidence-based open data metrics is prioritized. If the community is not mindful of our responsibility in building for assessment upfront, there are prominent risks to the advancement of open data-sharing practices: failing to live up to the policy’s goals, losing community ownership of the open data landscape, and creating disparate incentive systems that do not allow for researcher reward. These risks can be mitigated if the community recognizes data as its own scholarly output, resources and leverages open infrastructure, and builds broad community agreement around approaches for open data metrics, including using existing standards and resources. In preparation for the NIH policy, the community has an opportune moment to build for researchers’ best interests and support the advancement of biomedical sciences, including assessment, reward, and mechanisms for improving policy resources and supportive infrastructure as the space evolves.

Support Europe’s bold vision for responsible research assessment

“Concerns about the distorting effects of commonly used assessment procedures have already led to initiatives such as the San Francisco Declaration on Research Assessment (so far signed by more than 2,500 institutions, including Nature’s publisher Springer Nature, and 19,000 individuals); the Leiden Manifesto for research metrics; the SCOPE principles established by the International Network of Research Management Societies; and the Metric Tide report, commissioned by UK funding bodies. There are, in fact, at least 15 distinct efforts urging policymakers, funders and heads of institutions to ensure that assessment systems minimize harm.

Many of the architects of these projects are becoming concerned that each subsequent initiative amounts to more (no doubt, valuable) talk, but less by way of practical action.

The Agreement on Reforming Research Assessment, announced on 20 July and open for signatures on 28 September, is perhaps the most hopeful sign yet of real change. More than 350 organizations have pooled experience, ideas and evidence to come up with a model agreement to create more-inclusive assessment systems. The initiative, four years in the making, is the work of the European University Association and Science Europe (a network of the continent’s science funders and academies), in concert with predecessor initiatives. It has the blessing of the European Commission, but with an ambition to become global….”

Gardner et al. (2022) Implementing the Declaration on Research Assessment: a publisher case study

Gardner, Victoria, Mark Robinson, and Elisabetta O’Connell. 2022. “Implementing the Declaration on Research Assessment: A Publisher Case Study”. Insights 35: 7. DOI: http://doi.org/10.1629/uksg.573

Abstract

There has been much debate around the role of metrics in scholarly communication, with particular focus on the misapplication of journal metrics, such as the impact factor in the assessment of research and researchers. Various initiatives have advocated for a change in this culture, including the Declaration on Research Assessment (DORA), which invites stakeholders throughout the scholarly communication ecosystem to sign up and show their support for practices designed to address the misuse of metrics. This case study provides an overview of the process undertaken by a large academic publisher (Taylor & Francis Group) in signing up to DORA and implementing some of its key practices in the hope that it will provide some guidance to others considering becoming a signatory. Our experience suggests that research, consultation and flexibility are crucial components of the process. Additionally, approaching signing with a project mindset versus a ‘sign and forget’ mentality can help organizations to understand the practical implications of signing, to anticipate and mitigate potential obstacles and to support cultural change.


Reforming research assessment: what does it … | Open Research Europe

“This is an exciting announcement for Open Research Europe, as the Agreement would help mainstream practices that support robustness, openness and transparency of research and the research process. Open Research Europe already implements many of these practices, such as:

No Journal Impact Factor, instead promoting the responsible use of individual article indicators.
Early sharing of results with open post-publication peer-review as well as an open data policy where research data supporting articles is deposited in trusted repositories, facilitating reproducibility of research.
14 article types accepted across all subject areas, including publication of confirmatory, null, and negative results.
Education, training, and support for researchers for peer-review and open research practices.
Opportunities for increased credit for additional activities undertaken by researchers, such as Advisory Board roles and peer reviewing (each peer-review can be cited independently from the article)….”