SPACE to evolve academic assessment: A rubric for analyzing institutional conditions and progress indicators | DORA

“This is part of DORA’s toolkit of resources to support academic institutions that are improving their policies and practices. Find the other resources in the toolkit here.

Improving research and scholarship assessment practices requires the ability to analyze the outcomes of efforts and interventions. However, when conducted only at the unit level of individual interventions, these evaluations and reflections miss opportunities to understand how institutional conditions themselves set the table for the success of new efforts, or how developing institutional capabilities might improve the effectiveness and impact of these new practices at greater scale. The SPACE rubric was developed to help institutions at any stage of academic assessment reform gauge their institutional ability to support interventions and set them up for success.

Organizations can use the SPACE rubric to support the implementation of fair and responsible academic career assessment practices in two ways: First, it can help establish a baseline for the current state of infrastructural conditions, to gauge an institution’s ability to support the development and implementation of new academic assessment practices and activities. Second, the rubric can be used to retroactively analyze how strengths or gaps in these institutional conditions may have impacted the outcomes of concrete interventions targeted to specific types of academic assessment activities—such as hiring, promotion, tenure, or even graduate student evaluation—either helping or hindering progress toward those goals.

The SPACE rubric is a result of DORA’s partnership with Ruth Schmidt, Associate Professor at the Institute of Design at the Illinois Institute of Technology, who led the iterative participatory design process. The creation of the rubric was informed by nearly 75 individuals across 26 countries on six continents, and benefited from multiple rounds of feedback….”

Recognition and rewards – Open Science – Universiteit Utrecht

“Open science means action. And the way we offer recognition and reward to academics and university staff is key in bringing about the transition that Utrecht University aims for. Over the course of the past year, the working group on Recognition and Rewards, part of the Open Science Programme, has reflected on and thoroughly debated a novel approach to ensuring that we offer room for everyone’s talent, resulting in a new vision (pdf)….

In the current system, researchers and their research are judged by journal impact factors, publisher brands and H-indices, and not by actual quality, real use, real impact and openness characteristics….

Under those circumstances, at best open science practices are seen as posing an additional burden without rewards. At worst, they are seen as actively damaging chances of future funding and promotion & tenure. Early career researchers are perhaps the most dependent on traditional evaluation culture for career progression, a culture held in place by established researchers, as well as by institutional, national and international policies, including funder mandates….”

Utrecht University Recognition and Rewards Vision

“By embracing Open Science as one of its five core principles, Utrecht University aims to accelerate and improve science and scholarship and its societal impact. Open science calls for a full commitment to openness, based on a comprehensive vision regarding the relationship with society. This ongoing transition to Open Science requires us to reconsider the way in which we recognize and reward members of the academic community. It should value teamwork over individualism and call for an open academic culture that promotes accountability, reproducibility, integrity and transparency, and where sharing (open access, FAIR data and software) and public engagement are normal daily practice. In this transition we closely align ourselves with the national VSNU program as well as developments on the international level….”

Are stakeholders measuring the publishing metrics that matter?: Putting research into context

“Perhaps the most fundamental aspect of compiling and implementing more meaningful research metrics that the NISO panelists discussed is the importance of putting data into context. And, as the speakers noted, there are multiple facets of context to consider, including:

The strengths and limitations of different metrics by discipline/subject matter (e.g., some metrics are better suited to certain types of research)
The intended uses and overall strengths and limitations of particular data points (e.g., altmetrics are “indicators” of impact, not measures of quality, and the JIF was never meant to be used to measure the impact of individual articles or scholars; see the formula sketch after this list)
The cultural context that a researcher is operating within and the opportunities, challenges, and biases they have experienced
How and where a research output fits within scholars’ other professional contributions (e.g., recognizing how individual research outputs are part of broader bodies of work and also measuring the impacts of scholarly outputs that do not fit within traditional publication-based assessment systems) …”
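
A note on the JIF point in the second item above: the impact factor is a journal-level average by construction, which is why it cannot speak to individual articles. The standard two-year definition, paraphrased here from Clarivate’s published formula rather than from the article quoted above, is:

\[
\mathrm{JIF}_{Y} = \frac{\text{citations received in year } Y \text{ by items the journal published in years } Y-1 \text{ and } Y-2}{\text{number of citable items the journal published in years } Y-1 \text{ and } Y-2}
\]

Because citation distributions within a journal are heavily skewed, this mean is typically dominated by a small fraction of highly cited papers, so it cannot serve as a surrogate for the quality or impact of any single article or author.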

Open access publishing is the ethical choice | Wonkhe

“I had a stroke half a decade ago and found I couldn’t access the medical literature on my extremely rare vascular condition.

I’m a capable reader, but I couldn’t get past the paywalls – which seemed absurd, given that most research is publicly funded. While I had already long been an open access advocate by that point, this strengthened my resolve.

The public is often underestimated. Keeping research locked behind paywalls under the assumption that most people won’t be interested in, or capable of, reading academic research is patronising….

While this moral quandary should not be passed to young researchers, there may be benefits to them in taking a firm stance. Early career researchers are less likely to have grants to pay for article processing charges to make their work open access compared to their senior colleagues. Early career researchers are also the ones who are inadvertently paying the extortionate subscription fees to publishers. According to data from the Higher Education Statistics Agency (HESA), the amount of money UK universities fork out each year to access paywalled content from Elsevier – the largest academic publisher in the world – could pay 1,028 academic researchers a salary of £45,000 per year.
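
The article does not quote the underlying HESA total, but the annual figure it implies follows directly from the numbers given; this back-of-envelope total is inferred from the article’s own figures, not a number stated in the excerpt:

\[
1{,}028 \times \text{£}45{,}000 = \text{£}46{,}260{,}000 \approx \text{£}46.3 \text{ million per year}
\]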

We know for-profit publishers, such as Elsevier, hold all the cards with respect to those prestigious titles. What we need are systematic “read and publish” deals that allow people to publish where they want without having to find funding for open access….

The current outlook for prospective researchers hoping to secure an academic position at a university is compromised because so much money is spent propping up for-profit, commercial publishers. Rather than focusing on career damage to those who can’t publish with an Elsevier title, we should focus on the opportunity cost of hundreds of lost careers in academia….”

Incentivization Blueprint — Open Research Funders Group

“A growing number of funders are eager to encourage grantees to share their research outputs – articles, code and materials, and data. To accelerate the adoption of open norms, deploying the right incentives is of paramount importance. Specifically, the incentive structure needs both to reduce its reliance on publication in high-impact journals as a primary metric and to properly value and reward a range of research outputs.

This Incentivization Blueprint seeks to provide funders with a stepwise approach to adjusting their incentivization schemes to more closely align with open access, open data, open science, and open research. Developed by the Open Research Funders Group, the Blueprint provides organizations with guidance for developing, implementing, and overseeing incentive structures that maximize the visibility and usability of the research they fund.

A number of prominent funders have committed to taking steps to implement the Incentivization Blueprint. Among them are the following: …”

Open Research Funders Group (ORFG) | DORA

“The ORFG released guidance for funders called Incentivizing the sharing of research outputs through research assessment: a funder implementation blueprint. The group created the document to assist funders in encouraging researchers to maximize the impact of their work by openly sharing research outputs. The blueprint identifies three goals for success:

change the perception that publication in high-impact journals is the only metric that counts;
provide demonstrable evidence that, while journal articles are important, we value and reward all types of research outputs; and
ensure that indicators like the venue of publication or journal impact factor are not used as surrogate measures of quality in researcher assessment.

To do this, the blueprint provides three steps with concrete actions for funders: 1) policy development and declarations, 2) implementation, and 3) engagement. Template language for funders is included in the document to promote easy uptake….”

Indonesia should stop pushing its academics to chase empty indicators – Nikkei Asia

“An assessment system that predominantly evaluates research performance based on journal output and citations is steering academics from developing countries like mine toward chasing quantity over quality, and toward being exploited while doing so.

Researchers in Indonesia are the second most likely in the world to publish in dubious journals that print articles for a fee without proper scientific peer review (the process in which several experts in the field review the merit of the research), according to a new study by economists Vit Machacek and Martin Srholec.

These predatory journals prey on academics whose career progression, and therefore salary increases, are determined by credit points. They exploit the processing fees that authors pay to make articles open to the public, pocketing the payment, an average of $178, an amount close to the basic salary of an entry-level lecturer at a state university in Indonesia, without facilitating proper peer review. The papers published by predatory journals are often low quality, with typographical and grammatical errors….

In addition to the predatory journal problem, the metric also discourages scientific collaboration. Because the metric values article count, academics who want to turn out several journal articles from a data set have an incentive to hold on to it rather than share it for other scientists to analyze….”

Rethinking Research Assessment: Ideas for Action | DORA

“DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents that offer principles to guide institutional change and strategies to address the infrastructural implications of common cognitive biases to increase equity.

Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices….”
