“Join DORA for a community call to introduce two new responsible research evaluation tools and provide feedback on future tool development. The toolkit is part of Project TARA, which aims to identify, understand, and make visible the criteria and standards universities use to make hiring, promotion, and tenure decisions. This interactive call will explore these new tools, which were each created to help community members who are seeking:
Strategies on how to debias committees and deliberative processes: It is increasingly recognized that more diverse decision-making panels make better decisions. Learn how to debias your committees and decision-making processes with this one-page brief.
Ideas on how to incorporate a wider range of contributions in their evaluation policies and practices: Capturing scholarly “impact” often relies on the usual suspects, such as the h-index, JIF, and citation counts, despite evidence that these indicators are narrow, often misleading, and generally insufficient to capture the full richness of scholarly work. Learn how to consider a wider breadth of contributions in assessing the value of academic activities with this one-page brief….”
“Research funding organizations are often thought of as leaders in the movement toward more responsible research evaluation practices. Often, the perception of “excellence” in research culture is filtered through the lens of who and what type of work receives funding. However, when a narrow set of indicators (e.g., journal impact factor (JIF), h-index) is used to determine who receives funding, the result can be a subsequent narrowing of academia’s perceptions of research excellence. This places funders in the unique position of being able to help “set the tone” for research culture through their own efforts to reduce reliance on flawed proxy measures of quality and implement a more holistic approach to the evaluation of researchers for funding opportunities. Whether funders are seeking inspiration from their peers or insight on iterative policy development, the ability to come together to discuss reform activity is critical for achieving widespread culture change. At DORA’s June Funder Community of Practice (CoP) meetings, we heard how DORA is being implemented by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the New Zealand Ministry of Business, Innovation and Employment (MBIE)….”
“Aside from an individual’s personal interactions with another academic, the perceived quality of the journal where a researcher publishes is the most influential factor when forming an opinion on their academic standing, with almost half (49 percent) of 9,609 respondents saying it is important and 12 percent saying it is most important.
Asked about citation metrics, 24 percent say a scholar’s h-index and other similar measures are important, and 5 percent say they are the most crucial factor….
Last month more than 350 organizations from more than 40 countries signed a new compact, building on the 2015 Leiden Manifesto, which would see research evaluated mainly on qualitative measures and journal-based metrics abandoned. That agreement came nearly 10 years after the signing of the San Francisco Declaration on Research Assessment, which sought to phase out the use of journal-based metrics when making funding, appointment and promotion decisions, and which has now been signed by almost 20,000 individuals and 2,600 institutions worldwide….”
Gardner, Victoria, Mark Robinson, and Elisabetta O’Connell. 2022. “Implementing the Declaration on Research Assessment: A Publisher Case Study”. Insights 35: 7. DOI: http://doi.org/10.1629/uksg.573
There has been much debate around the role of metrics in scholarly communication, with particular focus on the misapplication of journal metrics, such as the impact factor in the assessment of research and researchers. Various initiatives have advocated for a change in this culture, including the Declaration on Research Assessment (DORA), which invites stakeholders throughout the scholarly communication ecosystem to sign up and show their support for practices designed to address the misuse of metrics. This case study provides an overview of the process undertaken by a large academic publisher (Taylor & Francis Group) in signing up to DORA and implementing some of its key practices in the hope that it will provide some guidance to others considering becoming a signatory. Our experience suggests that research, consultation and flexibility are crucial components of the process. Additionally, approaching signing with a project mindset versus a ‘sign and forget’ mentality can help organizations to understand the practical implications of signing, to anticipate and mitigate potential obstacles and to support cultural change.
“DORA is excited to announce the launch of the SPACE workshop kit, a resource designed for those seeking to share the SPACE rubric with their communities. This kit is intended to equip individuals with the materials needed to run their own version of the SPACE workshop for participants from their academic association, institution, department, and more. It contains English and Spanish versions of a slide deck, pre-workshop materials, and instructions for breakout room facilitators. The kit also includes a workbook for workshop participants to capture concrete existing and future actions for research assessment reform….”
“The International Network of Research Management Societies (INORMS) Research Evaluation Group (REG) brings together representatives from a range of global member research management societies to work towards better, fairer, and more meaningful research evaluation. The SCOPE Framework was developed by the REG as a practical way of implementing responsible research evaluation principles to design robust evaluations. We hope this guide will provide a useful steer to research evaluators around the world who are keen to engage with best practice and provide the best service to their organisations….”
“SPIE, the international society for optics and photonics, today announced that it has signed the San Francisco Declaration on Research Assessment (DORA).
Developed during the 2012 Annual Meeting of the American Society for Cell Biology in San Francisco, DORA commits to improve the ways in which the outputs of scholarly research are evaluated. Under the agreement, participating publishers agree to provide a range of article-level metrics to encourage assessment based on the scientific content of the article rather than the journal in which it was published; encourage responsible authorship practices; remove reuse limitations on reference lists; and reduce the constraints on the number of references in research articles. To date, more than 21,000 individuals and organizations in nearly 160 countries have signed DORA.
SPIE currently publishes the Proceedings of SPIE and 14 peer-reviewed journals through its SPIE Digital Library platform. The SPIE Digital Library, the world’s largest collection of optics and photonics applied research, comprises more than 560,000 publications covering topical areas ranging from biomedical optics and neuroscience to physics and astronomy-related technology….”
“Since 2013, more than 21,000 individuals and organizations in 158 countries have signed the San Francisco Declaration on Research Assessment (DORA) and committed to improving the ways research and researchers are assessed for hiring, promotion, and funding decisions. As an initiative, DORA raises awareness and facilitates the implementation of good practice in research assessment to catalyze change and improve equity in academia.
The DORA Internship is an opportunity for individuals with an interest in scholarly communications, science policy, and science diplomacy. The intern will gain first-hand experience working for an international non-profit initiative. The internship is a remote part-time position (15–20 hours per week; $20 per hour) for a period of 6 months. The intern will report to the DORA program director. All degree levels, including recent graduates and currently enrolled students, are encouraged to apply….”
“Are you currently employed as a senior administrator (e.g., President, Provost, Vice-Provost, Dean, Department head), researcher, or librarian at a research institute in the United States? Do you have experience with research assessment practices within your institute?
If so, DORA invites you to complete our survey about faculty (assistant, associate, or full professor) hiring, promotion, and tenure practices within your institute….”
“As a journal-level metric, the IF is unable to assess the value of any given article or author. To make this inference, one would need to read the article and assess its claims, scientific rigor, methodological soundness, and broader implications. What’s more, the IF (which represents the average number of citations across a finite set of eligible articles) is vulnerable to the skewness in citation rates among articles (Nature, 2005) and to the manipulation, negotiation, and gaming of its calculation among stakeholders (Ioannidis & Thombs, 2019). At a more fundamental level, the IF does not capture journal functioning such as improvements to (or worsening of) internal evaluative processes (e.g., effectiveness of peer review, changes to submission instructions and policies, use and adherence to reporting guidelines, etc.; Dunleavy, 2022). These and other issues are explored in more depth by Seglen (1997)….
In light of these limitations, social work should de-emphasize the IF and instead embrace a new set of evaluative tools. The San Francisco Declaration on Research Assessment (American Society for Cell Biology, 2013)—and more recently the Leiden Manifesto (Hicks et al., 2015)—typify such efforts. They encourage stakeholders (i.e., academic institutions, journals, funders, researchers) to consider using a multitude of qualitative and quantitative alternative metrics (i.e., “altmetrics”; Priem et al., 2012; see also https://metrics-toolkit.org/metrics/) when judging scholarly output—whether it be a journal article, a grant proposal, or even a hiring or tenure packet. …”
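The skewness problem described above can be illustrated with a minimal sketch. The citation counts below are hypothetical, not real journal data: a JIF-style indicator is a mean over a journal's citable items, so one highly cited article can inflate it well above what the typical article in that journal earns.

```python
from statistics import mean, median

# Hypothetical citation counts for one journal's citable items over a
# two-year window (illustrative only; not data from any real journal).
citations = [0, 0, 1, 1, 2, 2, 3, 3, 4, 84]

# A JIF-style indicator is the mean: total citations / citable items.
jif_like = mean(citations)

# The median shows what a typical article in the set actually received.
typical = median(citations)

print(f"mean (JIF-like): {jif_like}")  # mean (JIF-like): 10.0
print(f"median article:  {typical}")   # median article:  2.0
```

Here a single outlier (84 citations) drives the mean to 10.0 while the median article earned only 2 citations, which is the sense in which a journal-level average says little about any individual article.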
“DORA sought to fund ideas to advance assessment reform at academic institutions at any stage of readiness. Projects could be targeted to any level within an academic institution, including (but not limited to) reform efforts at the graduate program, department, library, or institution level, and should address one or more key aspects of education, planning, implementing, training, iteratively improving, and scaling policies and practices. More than 55 ideas were submitted from individuals and teams in 29 countries! After careful review, members of the Steering Committee selected 10 proposals to support….”
“Global research and education leader Wiley today announced it has signed the Declaration on Research Assessment (DORA), which is a world-wide initiative designed to improve the ways in which the outputs of scholarly research are evaluated.
As the publisher of nearly 2,000 academic journals, Wiley will deliver more ways to assess and recognize research outputs, which in turn supports healthy scholarship and allows more researchers to thrive in their careers. To this end, Wiley will roll out a broad range of journal and article metrics across its journal portfolio with the aim of providing a holistic, well-rounded view of the value and impact of any author’s research. This includes metrics that measure levels of impact beyond citation value, such as usage, re-use, reproducibility, peer review assessment, geographic reach, and public recognition via references in media outlets….”
“The review, promotion, and tenure (RPT) process is central to academic life and workplace advancement. It influences where faculty direct their attention, research, and publications. By unveiling the RPT process, we can inform actions that lead towards a greater opening of research.
Between 2017 and 2022, we conducted a multi-year research project involving the collection and analysis of more than 850 RPT guidelines and 338 surveys with scholars from 129 research institutions across Canada and the US. Starting with a literature review of academic promotion and tenure processes, we launched six studies applying mixed methods approaches such as surveys and matrix coding.
So how do today’s universities and colleges incentivize open access research? Read on for 6 key takeaways from our studies….”