Why Do We Need to Change Research Evaluation Systems? — Observatory | Institute for the Future of Education

“Can we break out of this vicious cycle? Are there alternatives? Yes, there are. For some years now, various movements worldwide have sought to change the system for evaluating research. In 2012, the “San Francisco Declaration” proposed eliminating metrics based on the impact factor. There was also the Charte de la désexcellence (“Charter of Dis-Excellence”) mentioned above. In 2015, a group of academics signed the Leiden Manifesto, which warned of the “widespread misuse of indicators in evaluating scientific performance.” Since 2013, the group Science in Transition has sought to reform the science evaluation system. Finally, since 2016, the Collectiu InDocentia, created at the University of Valencia (Spain), has also been doing its part. …”

DORA receives $1.2M grant from Arcadia to accelerate research assessment reform | DORA

“Research assessment reform is part of the open research movement in academia that asks the question: Who and what is research for? The San Francisco Declaration on Research Assessment (DORA), an initiative that operates under the sponsorship of the American Society for Cell Biology, has been awarded a 3-year, $1.2M grant from Arcadia – a charitable fund of Lisbet Rausing and Peter Baldwin. The generous funding will support Tools to Advance Research Assessment (TARA), a project to facilitate the development of new policies and practices for academic career assessment. Project TARA is a collaboration with Sarah de Rijcke, Professor in Science and Evaluation Studies and director of the Centre for Science and Technology Studies (CWTS) at Leiden University, and Ruth Schmidt, Associate Professor at the Institute of Design at the Illinois Institute of Technology.

The grant for Project TARA will help DORA to identify, understand, and make visible the criteria and standards universities use to make hiring, promotion, and tenure decisions. This information will be used to create resources and practical guidance on the reform of research assessment for academic and scholarly institutions. The grant provides DORA with crucial support to create the following outputs:

•  An interactive online dashboard that tracks criteria and standards academic institutions use for hiring, review, promotion, and tenure.
•  A survey of U.S. academic institutions to gain a broad understanding of institutional attitudes and approaches to research assessment reform.
•  A toolkit of resources informed by the academic community to support academic institutions working to improve policy and practice….”

Impact factor abandoned by Dutch university in hiring and promotion decisions

“A Dutch university says it is formally abandoning the impact factor — a standard measure of scientific success — in all hiring and promotion decisions. By early 2022, every department at Utrecht University in the Netherlands will judge its scholars by other standards, including their commitment to teamwork and their efforts to promote open science, says Paul Boselie, a governance researcher and the project leader for the university’s new Recognition and Rewards scheme. “Impact factors don’t really reflect the quality of an individual researcher or academic,” he says. “We have a strong belief that something has to change, and abandoning the impact factor is one of those changes.” …”

Incorporating Preprints into Academic Assessment | DORA

“Join DORA and ASAPbio on Tuesday, June 29, for a joint webinar on preprints and academic assessment….

Speakers will discuss important topics surrounding the incorporation of preprints into academic assessment: the value of considering preprints in academic assessment, how preprints can be included in existing assessment processes, and what challenges may arise along the way. Participants will have the opportunity to engage in the dialogue and ask questions of the speakers in the last section of the webinar.


This webinar is free to attend and open to everyone interested in improving research assessment. In particular, this webinar will aim to equip early career researchers, faculty, and academic leadership with the knowledge to advocate for the use of preprints at their institutions.”

Game over: empower early career researchers to improve research quality

Abstract:  Processes of research evaluation are coming under increasing scrutiny, with detractors arguing that they have adverse effects on research quality, and that they support a research culture of competition to the detriment of collaboration. Based on three personal perspectives, we consider how current systems of research evaluation lock early career researchers and their supervisors into practices that are deemed necessary to progress academic careers within the current evaluation frameworks. We reflect on the main areas in which changes would enable better research practices to evolve; many align with open science. In particular, we suggest a systemic approach to research evaluation, taking into account its connections to the mechanisms of financial support for the institutions of research and higher education in the broader landscape. We call for more dialogue in the academic world around these issues and believe that empowering early career researchers is key to improving research quality.


Open Scholarship Support Guide (PDF)

“Steps to Support Open Scholarship

Open scholarship entails a culture shift in how research is conducted in universities. It requires action on the part of university administration, working in concert with faculty, sponsors and disciplinary communities.  Universities should consider steps in three areas:

•  Policies: Language and guidance should be reviewed for alignment with open scholarship, in particular: (1) academic hiring, review, tenure and promotion (valuing diverse types of research products; metrics that incentivize the open dissemination of articles, data, and other research outputs; and valuing collaborative research); (2) intellectual property (ownership, licensing and distribution of data, software, materials and publications); (3) research data protection (for data to be stored and shared through repositories); (4) attribution (recognizing the full range of contributions); and (5) privacy (ensuring that privacy obligations are met).

•  Services and Training: Researchers need support to ensure that data and other research objects are managed according to the FAIR Principles: findable, accessible, interoperable and reusable. While the specific solution must be tailored to the discipline and research, common standards, including Digital Object Identifiers (DOIs), must be followed.

•  Infrastructure: Archival storage is required for data, materials, specimens and publications to permit reuse. Searchable portals are needed to register research products so they can be located and accessed. Universities can realize efficiencies by utilizing external resources (including existing disciplinary repositories) and by developing shared resources that span the institution when external resources do not exist.

Presidents and provosts are encouraged to work with their academic senates to create an open scholarship initiative that promotes institution-wide actions supporting open scholarship practices, while remaining sufficiently flexible to accommodate disciplinary differences and norms….”
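The Services and Training step in the guide treats DOIs as the baseline standard that makes research objects findable. For a concrete sense of what that buys, a registered DOI can be resolved to machine-readable metadata through doi.org content negotiation, a mechanism supported by Crossref and DataCite. The sketch below is a minimal illustration, not part of the guide itself; the example DOI is a placeholder, so substitute an identifier you actually manage.

```python
# Minimal sketch: turn a DOI into machine-readable citation metadata via
# doi.org content negotiation (supported by Crossref and DataCite).
import requests

DOI = "10.1038/nature12373"  # placeholder DOI, for illustration only

resp = requests.get(
    f"https://doi.org/{DOI}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
resp.raise_for_status()
meta = resp.json()

# A findable object carries its own description: title, publisher, creators.
print(meta.get("title"))
print(meta.get("publisher"))
print([author.get("family") for author in meta.get("author", [])])
```

The same request with an `Accept: text/x-bibliography` header returns a formatted reference, which is one reason DOIs, unlike bare URLs, can feed citation and reuse tracking directly.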

Statement on the Scholarly Merit and Evaluation of Open Scholarship in Linguistics | Linguistic Society of America

“Open Scholarship can be a key component of a scholar’s portfolio in a number of situations, including but not limited to hiring, review, promotion, and awards. Because Open Scholarship can take many forms, evaluating this work may require different tools and approaches from those used for publications like journal articles and books. In particular, citation counts, a common tool for evaluating publications, are not available for some kinds of Open Scholarship in the same form or from the same providers as they are for publications. Here we share recommendations on how to assess the use of Open Scholarship materials, through citations and beyond, covering both materials that have formal peer review and those that do not.

For tenure & promotion committees, program managers, department chairs, hiring committees, and others tasked with evaluating Open Scholarship, NASEM has prepared a discipline-agnostic rubric that can be used as part of hiring, review, or promotion processes. Outside letters of evaluation can also provide insight into the significance and impact of Open Scholarship work. Psychologist Brian Nosek (2017) provides some insight into how a letter writer can evaluate Open Scholarship, and includes several ways that evaluation committees can ask for input specifically about contributions to Open Scholarship. Nosek suggests that letter writers and evaluators comment on ways that individuals have contributed to Open Scholarship through “infrastructure, service, metascience, social media leadership, and their own research practices.” We add that using Open Scholarship in the classroom, whether through open educational materials, open pedagogy, or teaching of Open Scholarship principles, should be included in this list. Evaluators can explicitly ask for these insights in requests to letter writers, for example by including the request to “Please describe the impact that [scholar name]’s openly available research outputs have had from the research, public policy, pedagogic, and/or societal perspectives.” These evaluations can be particularly important when research outputs are not formally peer reviewed.

For scholars preparing hiring, review, promotion, or other portfolios that include Open Scholarship, we recommend not only discussing the Open Scholarship itself, but also its documented and potential impacts on both the academic community as well as broader society. Many repositories housing Open Scholarship materials provide additional metrics such as views, downloads, comments, and forks (or reuse cases) alongside citations in published literature. The use and mention of material with a Digital Object Identifier (DOI) can be tracked using tools such as ImpactStory, Altmetric.com, and other alternative metrics. To aid with evaluation of this work, the creator should share these metrics where available, along with any other qualitative indicators (such as personal thank-yous, reuse stories, or online write-ups) that can give evaluators a sense of the impact of their work. The Metrics Toolkit provides examples and use cases for these kinds of metrics. This is of potential value when peer review of these materials may not take the same form as with published journals or books; thoughtful use and interpretation of metrics can help evaluators understand the impact and importance of the work.

The Linguistic Society of America reaffirms its commitment to fair review of Open Scholarship in hiring, tenure, and promotion, endorses all of these approaches to peer review and evaluation of Open Scholarship, and encourages scholars, departments, and personnel committees to take them into careful consideration and implement language about Open Scholarship in their evaluation processes.”
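Where the statement mentions tracking DOI-bearing materials with Altmetric.com and similar services, the snippet below sketches what a scholar assembling a portfolio might actually run. It is an assumption-laden example, not LSA guidance: Altmetric’s public v1 endpoint is rate-limited (heavier use requires an API key), the DOI is a placeholder, and the field names should be checked against Altmetric’s current documentation.

```python
# Hedged sketch: fetch attention counts for one DOI from Altmetric's
# public v1 API. The endpoint returns 404 when no attention is recorded.
import requests

DOI = "10.1038/nature12373"  # placeholder; use the DOI of your own output

resp = requests.get(f"https://api.altmetric.com/v1/doi/{DOI}", timeout=30)
if resp.status_code == 404:
    print("No attention data recorded for this DOI.")
else:
    resp.raise_for_status()
    data = resp.json()
    # Field names follow Altmetric's v1 response at the time of writing.
    print("Altmetric score: ", data.get("score"))
    print("News mentions:   ", data.get("cited_by_msm_count"))
    print("Twitter accounts:", data.get("cited_by_tweeters_count"))
    print("Mendeley readers:", data.get("readers_count"))
```

As the statement itself notes, counts like these are best shared alongside qualitative indicators, and the Metrics Toolkit is the place to check what each metric can and cannot support.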

Dissemination of applied research to the field: attitudes and practices of faculty authors in social work

Abstract: In applied research disciplines like social work, there is a clear disconnect between the production and dissemination of research and the access and use of research in practice. This research/practice divide is particularly problematic for practitioners required to work within evidence-based or research-informed frameworks. To explore this issue, we conducted a nationwide survey and qualitative interviews with social work faculty regarding their research dissemination attitudes and practices, especially to non-academic audiences. The survey and interviews provide data on faculty dissemination methods, attitudes toward gold and green open access, and promotion and tenure considerations. Results demonstrate that faculty are primarily engaged with traditional publishing models and much less engaged with dissemination to non-academic audiences. Faculty are skeptical of open access journals, avoid article processing charges and are only minimally engaged with institutional repositories. Faculty are conflicted regarding the dissemination of their research, especially in the context of promotion and tenure. Shifting dissemination toward non-academic audiences would require increased confidence in open access, support for the creation of practitioner-focused materials and prioritizing the impact of research on practice.


Recognition and rewards – Open Science – Universiteit Utrecht

“Open science means action. And the way we offer recognition and reward to academics and university staff is key in bringing about the transition that Utrecht University aims for. Over the course of the past year the working group on Recognition and Rewards, part of the Open Science Programme, has reflected on and thoroughly debated a novel approach to ensuring that we offer room for everyone’s talent, resulting in a new vision (pdf)….

In the current system, researchers and their research are judged by journal impact factors, publisher brands and H-indices, and not by actual quality, real use, real impact and openness characteristics….

Under those circumstances, at best open science practices are seen as posing an additional burden without rewards. At worst, they are seen as actively damaging chances of future funding and promotion & tenure. Early career researchers are perhaps the most dependent on traditional evaluation culture for career progression, a culture held in place by established researchers, as well as by institutional, national and international policies, including funder mandates….”


Utrecht University Recognition and Rewards Vision

“By embracing Open Science as one of its five core principles, Utrecht University aims to accelerate and improve science and scholarship and its societal impact. Open science calls for a full commitment to openness, based on a comprehensive vision regarding the relationship with society. This ongoing transition to Open Science requires us to reconsider the way in which we recognize and reward members of the academic community. It should value teamwork over individualism and calls for an open academic culture that promotes accountability, reproducibility, integrity and transparency, and where sharing (open access, FAIR data and software) and public engagement are normal daily practice. In this transition we closely align ourselves with the national VSNU program as well as developments on the international level….”

Practitioner Perspectives: The DOERS3 Collaborative on OER in Tenure and Promotion – BCcampus

“Andrew McKinney, OER coordinator at the City University of New York (CUNY), and Amanda Coolidge, director of Open Education at BCcampus in British Columbia, Canada, share the development of an adaptable matrix to help faculty include OER (Open Educational Resources) in their tenure and promotion portfolios. …”

Recognition and rewards in the Open Era: Turning thoughts into actions | Open Working

“The TU Delft Open Science programme held its very first thematic session on the Recognition and Rewards cross-cutting theme on October 5, 2020. The Open Science Programme currently has 5 projects and 3 cross-cutting themes, from FAIR software to Open Education. This means that the programme core team is composed of members from many different departments (not only within the Library), bringing in their diverse perspectives and skills! But this also poses a challenge for teamwork: we need a way for us all to stay in touch, to see and learn from each other’s work, and to contribute and provide feedback – hence the idea of the thematic sessions. Ingrid Vos, the leader of the Recognition and Rewards theme, has kindly volunteered to lead this first thematic session. Since this theme relates to everyone’s work within the Open Science Programme, Ingrid wanted to make sure everyone can be effectively engaged in the session and their voices can be heard – more on this below.

Key takeaways:

•  A re-examination of rewards and recognition is needed to further fuel the cultural and behavioural changes towards open science.
•  TU Delft’s work in this area builds upon VSNU’s “Room for everyone’s talent” position paper. Every university in the Netherlands has a committee on Recognition & Rewards; the TU Delft committee is led by Ena Voûte.
•  The Open Science Programme team had fruitful discussions around open research and education behaviours and “products”: how to evaluate, appreciate and reward these, as well as emerging career paths.

We’d love to hear your ideas and thoughts, both on rewards and recognition and on how you’d like to contribute and participate in these discussions; please use the comment section of this post! …”

Rethinking Research Assessment: Ideas for Action | DORA

“DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents that offer principles to guide institutional change and strategies to address the infrastructural implications of common cognitive biases to increase equity.

Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices….”

Reimagining Academic Career Assessment: Stories of innovation and change

“This report and the accompanying online repository bring together case studies in responsible academic career assessment. Gathered by the San Francisco Declaration on Research Assessment (DORA), the European University Association (EUA), and the Scholarly Publishing and Academic Resources Coalition (SPARC) Europe, the case studies independently serve as a source of inspiration for institutions looking to improve their academic career assessment practices. Following the publication of guidelines and recommendations on more responsible evaluation approaches, such as DORA, the Leiden Manifesto for Research Metrics, and the Metric Tide, more and more institutions have begun to consider how to implement a range of practical changes and innovations in recent years. However, information about the creation and development of new practices in academic career assessment is not always easy to find. Collectively, the case studies will further facilitate this “practical turn” toward implementation by providing a structured overview and conceptual clarity on key characteristics and contextual factors. In doing so, the report examines emerging pathways of institutional reform of academic career assessment…”