Category Archives: oa.p&t

“Today the UKRN is delighted to announce the launch of one of the largest national initiatives in the world to reform how open research is recognised and rewarded when researchers are recruited, promoted and appraised. The ‘OR4’ project, part of the UKRN’s Open Research Programme, today announces the 43 UK academic research organisations that have joined either as case studies or as part of a wider community of practice. This group of institutions is incredibly diverse, including the Royal College of Music, Queen’s University Belfast, the Universities of Cambridge, the West of Scotland, Swansea and Durham, and the CRUK Scotland Institute (full list on the OR4 web page). Together, they employ over 80,000 academic staff, all of whom we hope will benefit from this initiative. OR4 also aligns UK developments with leading international work, for example, CoARA, the European OPUS Project, and the US HELIOS network. For more information, please see the OR4 web page.”
Using Altmetric Data Responsibly: A Guide to Interpretation and Good Practice
Abstract: This guide focuses specifically on data from the data provider and company Altmetric, though other types of altmetrics are mentioned and occasionally used for comparison, such as the Open Syllabus database, which captures educational engagement with scholarly outputs. The guide opens with an introduction, followed by an overview of Altmetric and the Altmetric Attention Score, Altmetrics and Responsible Research Assessment, Output Types Tracked by Altmetric, and the Altmetric Sources of Attention, which include: News and Mainstream Media; Social Media (X (formerly Twitter), Facebook, Reddit, and historical data from Google+, Pinterest, LinkedIn, and Sina Weibo); Patents; Peer Review; Syllabi (historical data only); Multimedia; Public Policy Documents; Wikipedia; Research Highlights; Reference Managers; and Blogs. It closes with a conclusion, a list of related resources and readings, two appendices, and references. The guide is intended for librarians, practitioners, funders, and other users of Altmetric data, and for anyone interested in incorporating altmetrics into their bibliometric practice and/or research analytics. It can help researchers preparing for annual evaluations and promotion and tenure reviews to apply the data in informed and practical ways, and it can serve as a reference for research managers and university administrators who want to understand the broader online engagement with research publications beyond traditional scholarly citations (bibliometrics) while avoiding misusing, misinterpreting, or abusing Altmetric data when making decisions, creating policies, and evaluating faculty members and researchers at their institutions.
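For readers who want to look past the composite score to the underlying sources of attention, the data behind a single output can be retrieved programmatically. Below is a minimal sketch using Altmetric's free public v1 API; the endpoint, the example DOI, and the response field names (such as `score` and `cited_by_msm_count`) are assumptions drawn from Altmetric's published documentation and should be verified against the current docs before use.

```python
# Minimal sketch: fetch Altmetric attention data for one DOI via the free
# public API. Endpoint and field names are assumptions to verify against
# current Altmetric documentation; the free tier is rate-limited and
# returns HTTP 404 when no attention has been recorded for a DOI.
import requests

def altmetric_attention(doi: str) -> dict | None:
    """Return Altmetric's JSON record for a DOI, or None if untracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:  # no recorded attention for this output
        return None
    resp.raise_for_status()
    return resp.json()

record = altmetric_attention("10.1038/nature12373")  # illustrative DOI
if record is None:
    print("No Altmetric attention recorded for this DOI.")
else:
    # Reporting source counts alongside the composite Attention Score
    # supports the responsible interpretation the guide recommends: the
    # score alone says nothing about where the attention came from.
    print("Attention Score: ", record.get("score"))
    print("News mentions:   ", record.get("cited_by_msm_count", 0))
    print("Policy mentions: ", record.get("cited_by_policies_count", 0))
```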
Overemphasis on publications may disadvantage historically excluded groups in STEM before and during COVID-19: A North American survey-based study | PLOS ONE
Abstract: Publishing is a strong determinant of academic success and there is compelling evidence that identity may influence the academic writing experience and writing output. However, studies rarely quantitatively assess the effects of major life upheavals on trainee writing. The COVID-19 pandemic introduced unprecedented life disruptions that may have disproportionately impacted different demographics of trainees. We analyzed anonymous survey responses from 342 North American environmental biology graduate students and postdoctoral scholars (hereafter trainees) about scientific writing experiences to assess: (1) how identity interacts with scholarly publication totals and (2) how the COVID-19 pandemic influenced trainee perceptions of scholarly writing productivity and whether there were differences among identities. Interestingly, identity had a strong influence on publication totals, but it differed by career stage with graduate students and postdoctoral scholars often having opposite results. We found that trainees identifying as female and those with chronic health conditions or disabilities lag in publication output at some point during training. Additionally, although trainees felt they had more time during the pandemic to write, they reported less productivity and motivation. Trainees who identified as female; Black, Indigenous, or as a Person of Color [BIPOC]; and as first-generation college graduates were much more likely to indicate that the pandemic affected their writing. Disparities in the pandemic’s impact on writing were most pronounced for BIPOC respondents; a striking 85% of BIPOC trainees reported that the pandemic affected their writing habits, and overwhelmingly felt unproductive and unmotivated to write. Our results suggest that the disproportionate impact of the pandemic on writing output may only heighten the negative effects commonly reported amongst historically excluded trainees. Based on our findings, we encourage the academy to consider how an overemphasis on publication output during hiring may affect historically excluded groups in STEM—especially in a post-COVID-19 era.
Call for Submissions: Tenure, Promotion, and Recognition · Series 3.2: Recognition and Rewards
“We seek proposals that will address topics or questions surrounding tenure, promotion, and recognition models, such as: …
How can Open Science be leveraged to transform recognition models and processes? …”
Project TARA | DORA
“Project TARA is supported by a generous three-year grant from Arcadia, a charitable fund of Lisbet Rausing and Peter Baldwin. It will help DORA identify, understand, and make visible the criteria and standards universities use to make hiring, promotion, and tenure decisions. This information will be used to create resources and practical guidance on research assessment reform for academic and scholarly institutions.”
English – Knowledge Equity Network
“For Higher Education Institutions
Publish a Knowledge Equity Statement for your institution by 2025, incorporating tangible commitments aligned with the principles and objectives below.
Commit to institutional action(s) to support a sustained increase of published educational material being open and freely accessible for all to use and reuse for teaching, learning, and research.
Commit to institutional action(s) to support a sustained increase of new research outputs being transparent, open and freely accessible for all, and which meet the expectations of funders.
Use openness as an explicit criterion in reaching hiring, tenure, and promotion decisions. Reward and recognise open practices across both research and research-led education. This should include the importance of interdisciplinary and/or collaborative activities, and the contribution of all individuals to those activities.
Define Equity, Diversity and Inclusion targets that will contribute towards open and inclusive Higher Education practices, and report annually on progress against these targets.
Create new mechanisms within and between Higher Education Institutions that allow for further widening of participation and increased diversity of staff and student populations.
Review the support infrastructure for open Higher Education, and invest in the human, technical, and digital infrastructure that is needed to make open Higher Education a success.
Promote the use of open interoperability principles for any research or education software/system that you procure or develop, explicitly highlighting the option of making all or parts of content open for public consumption.
Ensure that all research data conforms to the FAIR Data Principles: findable, accessible, interoperable, and reusable.
For Funding Agencies
Publish a statement that open dissemination of research findings is a critical component in evaluating the productivity and integrity of research.
Incorporate open research practices into assessment of funding proposals.
Incentivise the adoption of Open Research through policies, frameworks and mandates that require open access for publications, data, and other outputs, with as liberal a licence as possible for maximum reuse.
Actively manage funding schemes to support open infrastructures and open dissemination of research findings, educational resources, and underpinning data.
Explicitly define reward and recognition mechanisms for globally co-produced and co-delivered open educational resources that benefit society….”
Spotlight Series Recap: Incentivizing Open in Reappointment, Promotion, Tenure, and Hiring — Higher Education Leadership Initiative for Open Scholarship
“On March 22, 2023, the Higher Education Leadership Initiative for Open Scholarship (HELIOS) convened academic leaders to discuss incentivizing open scholarship practices in hiring, reappointment, promotion, and tenure (RPT)….
McKiernan framed the day’s conversation: “when we are talking about incentives within promotion, tenure, and hiring, what we’re really talking about is what universities value, what they recognize, and whether they are the same things.” In McKiernan’s research, she and her co-authors have found that what gets rewarded in these policies is not always what universities state they value. University mission statements often talk about the importance of community and public engagement for the betterment of society. Open scholarship practices like making our work openly available by sharing data, code, notebooks, and all kinds of outputs allow individuals to engage with the work, collaborate, and build on the work. There are many public aspects of what faculty do in their day-to-day work, including openly disseminating scholarly outputs, but tenure and promotion guidelines at many universities do not adequately reward the public engagement and outreach that open scholarship practices enable….”
Young researchers in action: the road towards a new PhD evaluation | DORA
“Less emphasis on bibliometrics, more focus on personal accomplishments and growth in research-related competencies. That is the goal of Young Science in Transition’s (Young SiT) new evaluation approach for PhD candidates in Utrecht, the Netherlands. But what do PhD candidates think of the new evaluation? With the DORA engagement grant, we conducted in-depth interviews with PhD candidates and found out how the new evaluation can be improved and successfully implemented.
The beginning: from idea to evaluation
Together with Young SiT, a think tank of young scientists at the UMC Utrecht, we (Inez Koopman and Annemijn Algra) have been working on the development and implementation of a new evaluation method for PhD candidates since 2018 [1]. In this new evaluation, PhD candidates are asked to describe their progress, accomplishments and learning goals. The evaluation also includes a self-assessment of their competencies. We started bottom-up, small, and locally. This meant that we first tested our new method in our own PhD program (Clinical and Experimental Neurosciences, where approximately 200 PhD candidates are enrolled). After a first round of feedback, we realized the self-evaluation tool (the Dutch PhD Competence Model) needed to be modernized. Together with a group of enthusiastic programmers, we critically reviewed its content, gathered user feedback from various early career networks and transformed the existing model into a modern and user-friendly web-based tool [2].
In the meantime, we started approaching other PhD programs from the Utrecht Graduate School of Life Sciences (GSLS) to further promote and roll out our new method. We managed to get support ‘higher up’: the directors and coordinators of the GSLS and the Board of Studies of Utrecht University were interested in our idea. They too were working on a new evaluation method, so we decided to team up. Our ideas were transformed into a new and broad evaluation form and guide that can soon be used by all PhD candidates enrolled in one of the 15 GSLS programs (approximately 1800 PhD candidates).
However, during the many discussions we had about the new evaluation, one question kept popping up: ‘but what is the scientific evidence that this new evaluation is better than the old one?’ Although the old evaluation, which included a list of all publications and prizes, was also implemented without any scientific evidence, it was a valid question. We needed to further understand the PhD perspective, and not only the perspective of PhD candidates in early career networks. Did PhD candidates think the new evaluation was an improvement, and if so, how could it be improved even further?
We used our DORA engagement grant to set up an in-depth interview project with a first group of PhD candidates using the newly developed evaluation guide and new version of the online PhD Competence Model. Feedback about the pros and cons of the new approach helps us shape PhD research assessment….”
Guidelines for Broadening the Definition of Historical Scholarship | Perspectives on History | AHA
“On January 5, 2023, the AHA Council approved the Guidelines for Broadening the Definition of Historical Scholarship. In most history departments, “scholarship” has traditionally and primarily encompassed books, journal articles and book chapters, and papers presented at conferences. The weight and significance of each of these vary considerably by institution. The most valued coin of the realm remains not just the book—especially for early and midcareer scholars—but a particular kind of book known only in academia and scholarly publishing as a “monograph.” Yet many other categories of books don’t count: textbooks, official histories, anthologies, translations and critical editions, reference books, and more. These have not been deemed to be “creating new knowledge.” …
The AHA Council has decided that it is time to map a broader terrain of scholarship, with more flexible boundaries. There are many ways to be a historian, many ways to do historical work….
This recommendation and the guidelines that follow rest on four pillars:
A wide range of scholarly historical work can be undertaken in ways consistent with our disciplinary standards and values, from writing briefing papers and op-eds, to testifying in legislatures and courts, participating in the work of regulatory agencies, publishing textbooks and reference books, expanding our media presence across a wide range of platforms, and more.
To support such publicly engaged and/or policy-oriented work, history departments should give it appropriate scholarly credit in personnel decisions. Not doing so diminishes the public impact of historians and cedes to others—observers less steeped in our discipline-specific methods, epistemologies, and standards—the podium from which to shape the historical framing of vital public conversations.
Historians cannot expect decision makers or other potential audiences to appreciate the value of our work if we don’t affirm its value ourselves.
All historical work can be peer-reviewed, whether before or after publication….”
AHA: use public-facing work in hiring decisions | Times Higher Education (THE)
“Textbooks, congressional testimony, media appearances, historical gaming – the American Historical Association is urging universities to accept more types of work from candidates for hiring, promotion, tenure and other benefits.
It is a development that historians say follows movement – particularly within the field of public history – towards broader recognition. That field involves work regarding national parks, museums, documentaries, archives and historical preservation….”
SocArXiv Papers | Value dissonance in research(er) assessment: Individual and institutional priorities in review, promotion and tenure criteria
Preprint on reforming research assessment, with Open Access as one of the criteria:
“Analysis of an international survey of 198 respondents reveals a deep disjunct between personal beliefs and perceived institutional priorities (“value dissonance”), with practices of open and responsible research, as well as “research citizenship” comparatively poorly valued by institutions at present. Our findings hence support current moves to reform research assessment.”
Responsible Research Assessment I: Implementing DORA for hiring and promotion in psychology | PsychArchives
Abstract: The use of journal impact factors and other metric indicators of research productivity, such as the h-index, has been heavily criticized for being invalid for the assessment of individual researchers and for fueling a detrimental “publish or perish” culture. Multiple initiatives call for developing alternatives to existing metrics that better reflect quality (instead of quantity) in research assessment. This report, written by a task force established by the German Psychological Society, proposes how responsible research assessment could be done in the field of psychology. We present four principles of responsible research assessment in hiring and promotion and suggest a two-step assessment procedure that combines the objectivity and efficiency of indicators with a qualitative, discursive assessment of shortlisted candidates. The main aspects of our proposal are (a) to broaden the range of relevant research contributions to include published data sets and research software, along with research papers, and (b) to place greater emphasis on quality and rigor in research evaluation.
Watch the Supporting Open Science in the Promotion & Tenure Process: Lessons from the University of Maryland Webinar
“The academic promotion and tenure process establishes the incentive structure for institutions of higher education. Open science champions have long advocated for the process to better reflect important open science scholarship that is often under-valued and neglected in academia.
COS hosted a webinar on September 27, 2022, highlighting the five-year effort in the Psychology Department at the University of Maryland to adopt new guidelines that explicitly codify open science as a core criterion in tenure and promotion review. According to Dr. Michael Dougherty, Department Chair, the new policy was necessary to ensure incentives for advancement reflect the values of scientists and their institutions….”
COARA – Coalition for Advancing Research Assessment
“Our vision is that the assessment of research, researchers and research organisations recognises the diverse outputs, practices and activities that maximise the quality and impact of research. This requires basing assessment primarily on qualitative judgement, for which peer review is central, supported by responsible use of quantitative indicators.”
The Agreement on Reforming Research Assessment is now final – COARA
“Launched in January 2022 as a co-creation exercise, the process of drafting an agreement for reforming research assessment has reached an important milestone. On 8 July, the final version of the agreement was presented at a Stakeholder Assembly bringing together the 350+ organisations from 40+ countries that had expressed interest in being involved in the process. Today, the final Agreement is made public with this news.
Organisations involved have provided feedback to the evolving drafts of the agreement, as prepared by a team composed of representatives from the European University Association (EUA), Science Europe, and the European Commission, alongside Dr Karen Stroobants in her individual capacity as researcher with expertise in research on research.
A core group of 20 research organisations, representing the diversity of the research community across Europe, also contributed to the drafting process, while EU Member States and Associated Countries have been consulted on the agreement in the framework of the ERA Forum and the European Research Area Committee (ERAC).
The Agreement on Reforming Research Assessment sets a shared direction for changes in assessment practices for research, researchers and research performing organisations, with the overarching goal to maximise the quality and impact of research. The Agreement includes the principles, commitments and timeframe for reforms and lays out the principles for a Coalition of organisations willing to work together in implementing the changes.
Signatories will commit to a common vision, which is that the assessment of research, researchers and research organisations recognises the diverse outputs, practices and activities that maximise the quality and impact of research. This requires basing assessment primarily on qualitative judgement supported by responsible use of quantitative indicators.”