“Our vision is that the assessment of research, researchers and research organisations recognises the diverse outputs, practices and activities that maximise the quality and impact of research. This requires basing assessment primarily on qualitative judgement, for which peer review is central, supported by responsible use of quantitative indicators.”
“Launched in January 2022 as a co-creation exercise, the process of drafting an agreement for reforming research assessment has reached an important milestone. On 8 July, the final version of the agreement was presented at a Stakeholder Assembly bringing together the 350+ organisations from 40+ countries that had expressed interest in being involved in the process. Today, the final Agreement is made public.
Organisations involved have provided feedback to the evolving drafts of the agreement, as prepared by a team composed of representatives from the European University Association (EUA), Science Europe, and the European Commission, alongside Dr Karen Stroobants in her individual capacity as researcher with expertise in research on research.
A core group of 20 research organisations, representing the diversity of the research community across Europe, also contributed to the drafting process, while EU Member States and Associated Countries have been consulted on the agreement in the framework of the ERA Forum and the European Research Area Committee (ERAC).
The Agreement on Reforming Research Assessment sets a shared direction for changes in assessment practices for research, researchers and research performing organisations, with the overarching goal to maximise the quality and impact of research. The Agreement includes the principles, commitments and timeframe for reforms and lays out the principles for a Coalition of organisations willing to work together in implementing the changes.
Signatories will commit to a common vision, which is that the assessment of research, researchers and research organisations recognises the diverse outputs, practices and activities that maximise the quality and impact of research. This requires basing assessment primarily on qualitative judgement supported by responsible use of quantitative indicators.”
“The academic promotion and tenure process establishes the incentive structure for institutions of higher education. Open science champions have long advocated for the process to better reflect important open science scholarship that is often under-valued and neglected in academia. This webinar will highlight the five-year effort in the Psychology Department at the University of Maryland to adopt new guidelines that explicitly codify open science as a core criteria in tenure and promotion review. Discussion will include forces supporting and resisting open science behaviors and strategies for creating buy-in across the department. According to Dr. Dougherty, Department Chair, the new policy was necessary to ensure incentives for advancement reflect the values of scientists and their institutions.”
“Asian research powerhouses will introduce open access (OA) mandates within the next “two to three” years, experts have predicted, in the wake of last month’s landmark order by the Biden administration.
Under the US decision, the published results of federally funded research must be made immediately and freely available to readers, starting from 2025. This follows the introduction of similar rules across Europe and the UK, spearheaded by the Plan S initiative.
Home to four of the top 10 research-producing countries – China, Japan, South Korea and India – Asia now appears poised to become the next battleground….”
“We applaud the August 25 memorandum from the White House Office of Science and Technology Policy (OSTP) on Ensuring Free, Immediate, and Equitable Access to Federally Funded Research that calls on federal agencies to develop policies that will provide immediate open access to the outputs of federally funded research (“‘A Historic Moment’: New Guidance Requires Federally Funded Research to Be Open Access,” The Chronicle, August 25).
The potential benefits of immediate open access to research articles and to the data underlying the research include improved rigor and reliability, increased opportunity for reuse of data to ask new questions, faster and wider dissemination of new knowledge, broader participation in the research process, and the potential to reduce global inequities in publishing of and access to federally funded research.
Along with a diverse community of long-time advocates of open scholarship, we welcome the new OSTP guidance and its potential for accelerating a transition to a more open and equitable scholarly ecosystem. Funder requirements, however, are only one element of a complex system of norms and incentives. A major barrier to the widespread embrace of — and therefore the ultimate success of — mandates like the OSTP guidance is the degree to which scholars experience current incentive systems as at odds with practicing open scholarship. When individual career success incentives and reward systems — as codified in hiring, promotion, and tenure standards — are experienced as misaligned with open scholarship values and mandates, individual scholars are left in an impossible bind. Left unresolved, this misalignment will undermine the potential positive impacts of open scholarship generally and the OSTP guidance specifically, as many scholars are likely to navigate the seemingly inherent tensions via pro-forma compliance at best, and active resistance at worst. Something has to give.
The good news is that universities can make simple changes to hiring, promotion, and tenure practices to ensure that the work scholars do to make their research openly available is recognized and rewarded. Including language in hiring, promotion, and tenure guidelines that signals that open sharing of research outputs, and the impact of that sharing, is valued will go a long way toward aligning the incentives for career success with the practice of open scholarship — making what is now increasingly required, also what is rewarded.”
“ALLEA welcomes the adoption of the Conclusions on Research Assessment and Implementation of Open Science by the Council of the European Union on 10 June.
The Conclusions are in agreement with points that ALLEA has made over the years, in particular on the necessity of appropriately implementing and rewarding open science practices and on the development of research assessment criteria that follow principles of excellence, research integrity and trustworthy science.
At the same time, ALLEA continues to stress that it matters how we open knowledge, as the push for Open Access publishing has also paved the way for various unethical publishing practices. The inappropriate use of journal- and publication-based metrics in funding, hiring and promotion decisions has been one of the obstacles in the transition to a more open science, and furthermore fails to recognize and reward the diverse set of competencies, activities, and outputs needed for our research ecosystem to flourish….”
“The International Network of Research Management Societies (INORMS) Research Evaluation Group (REG) brings together representatives from a range of global member research management societies to work towards better, fairer, and more meaningful research evaluation. The SCOPE Framework was developed by the REG as a practical way of implementing responsible research evaluation principles to design robust evaluations. We hope this guide will provide a useful steer to research evaluators around the world who are keen to engage with best practice and provide the best service to their organisations….”
“The University of Maryland is rewarding faculty members in the department of psychology who perform and disseminate research in accordance with open science practices. In April, the department adopted new guidelines that explicitly codify open science as a core criteria in tenure and promotion review.
The change was several years in the making and championed by Michael Dougherty, chair of the department. “When you think about the goal and purpose of higher education and why we take these positions, it’s because we felt there would be some good that we could impart on the world,” Dougherty said. “The traditional markers of impact are how many times you’ve been cited [in a journal]. That’s not the type of impact that is valuable to the broader society.”
The new policy was necessary, he said, so incentives for advancement reflect the values of scientists and their institutions….”
“Yet it isn’t clear what the relationship is between the greater sharing of research materials and the so-called democratisation at work in open science. What actually is democratising and collectivising about what HELIOS is trying to do?
It is important to ask this question because HELIOS is, by all accounts, a top-down initiative led by senior figures of research-intensive universities in the US. Despite the casual association between open science and collectivity, it appears that HELIOS is more a way for university leaders to coerce researchers into a cultural change, not something that is led by the research community at large. While changing tenure guidelines to prioritise publishing in open access journals, sharing FAIR data and releasing reusable open code may have some good outcomes, they are not themselves the basis for greater collective governance of science. Instead, these changes will provide an economic reason for researchers to adopt open science practices, a reason still based on individual progress within the academy….”
“A large coalition of colleges and universities aims to change hiring, promotion, and tenure practices to reward collaboration….
As Bahlai’s experience shows, scientists aren’t always rewarded for conducting research in accordance with open science principles. A new initiative plans to change that. The Higher Education Leadership Initiative for Open Scholarship, or HELIOS, which launched this March, is a coalition of more than 75 member colleges and universities that have committed to fostering open science practices, including through their hiring, promotion, and tenure decisions….
“The scientists bought into it,” Yamamoto says, adding that he doesn’t blame Lewin for coming up with an innovative marketing strategy. The journals wouldn’t have succeeded in shifting the culture if the scientific community hadn’t bought into the concepts of prestige and status, he says.
Yamamoto says this competition to publish work in prestigious journals led to an emphasis on individual contributions over collaboration in academia today. For example, tenure committees heavily weigh publication as a first or senior author, especially in prestigious journals—a process that can take several years, delaying when others have access to advances in scientific knowledge, he says. Yamamoto says it’s also common for committees to completely disregard papers where the tenure candidate is listed as a middle author.
Those individualistic values aren’t limited to universities and colleges. Grant agencies, for example, may decide to deny funding to a group of researchers if they get scooped by another team investigating a similar problem, Yamamoto says.
“So those kinds of values and practices then serve as a very strong disincentive for an investigator to practice open science,” he says….
HELIOS wants to bend academia’s incentive structures toward cultivating collaboration. To accomplish this, like-minded institutions have gathered several times since 2021—beginning with a roundtable discussion convened by the National Academy of Sciences—to discuss priorities and strategies. The proceedings of a 2021 member workshop, “Developing a Toolkit for Open Science Practices,” include language that institutions can use to show students and faculty their commitment to open science. The toolkit also includes templates for evaluating open science practices in job and tenure applications, with example criteria including publishing in open-access journals, posting data using FAIR (findable, accessible, interoperable, and reusable) principles, and sharing other research outputs such as computer code….
Mangravite says she “one hundred percent” sees this divide between senior and junior faculty. But she says that rather than waiting for older faculty to retire, what’s needed is to incentivize younger faculty to participate in open science now instead of continuing to hold them to traditional standards set by more senior academics….”
“We are investigating how researchers assess candidates’ research outputs when they are on a committee for hiring review, promotion or tenure, or grant applications. …”
“Are you currently employed as a senior administrator (e.g., President, Provost, Vice-Provost, Dean, Department head), researcher, or librarian at a research institute in the United States? Do you have experience with research assessment practices within your institute?
If so, DORA invites you to complete our survey about faculty (assistant, associate, or full professor) hiring, promoting, and tenure practices within your institute….”
“Examples of specific evaluative criteria to be used in merit review, based on professional standards for evaluating faculty performance…. Openness and transparency: Degree to which research, data, procedures, code, and research products are made openly available where appropriate; the use of registered reports or pre-registration. The committee should recognize that researchers may not be able to share some types of data, such as when data are proprietary or subject to ethical concerns over confidentiality [7, 1, 6, 2, 5]. These limitations should be documented by faculty.”
The transition to an open science system affects the entire research process. Reward systems also need to be adjusted to support and mirror the open research landscape — but what will this work look like, and what will change? We met Gustav Nilsonne, chair of the European working group dealing with the issue and a participant in the SUHF working group on merit reviews.