Responsible Research Network, Finland | DORA

“Finland is among the first countries to have developed national recommendations on responsible research evaluation. In 2020, a task force formed by the Federation of Finnish Learned Societies published the “Good Practice in Researcher Evaluation: Recommendation for Responsible Evaluation of a Researcher in Finland.” A major driver for the national recommendation was the need to make conscious decisions in evaluation processes. Although many national entities were involved in developing the Recommendation, the approach is considered “bottom-up” and there was broad and enthusiastic buy-in among Finnish academic stakeholders….

A national task force was founded on shared concerns identified by learned societies, research funders, policy organizations, publishers, national open science coordination, and the national research integrity board. While many national entities were involved in the Recommendation’s creation, the approach is considered “bottom-up”; Finland has a historic culture of autonomy for academic stakeholders….

In addition, the Recommendation’s timing coincided with the uptake of FAIR (findable, accessible, interoperable, and reusable) data and open science initiatives in Finland. These initiatives incentivize and reward researchers for producing open and FAIR data, and they align with the Recommendation. In the coming years, the focus will be on building the capacity to move evaluation practices beyond quantitative publication metrics and into closer alignment with the goals of the Recommendation….”

Research evaluation in context 1: Introducing research evaluation in the Netherlands – Leiden Madtrics

“As a member of the working group for the monitoring and further development of the evaluation protocol – and as an employee of CWTS – let me provide insight and context. In a series of blog posts, I will focus on the evaluation procedure and the evaluation goals as described in the current protocol for the evaluation of research units. I will also step back to the bigger picture and pay attention to the context in which the evaluation protocols have been developed and function….”

Annual report: a recap of the San Francisco Declaration on Research Assessment (DORA) activities in 2020 | DORA

“Over the past year, it has become increasingly clear that research assessment reform is a systems challenge that requires collective action. Point interventions simply do not solve these types of complex challenges that involve multiple stakeholders. Because of this, we dedicated our efforts in 2020 to building a community of practice and finding new ways to support organizations seeking to improve the decision-making that impacts research careers.

Current events also influenced our approach this year and evolved our thinking about research assessment reform. The Covid-19 pandemic abruptly disrupted academic research around the world, as it did many other industries. For academics with limited access to research laboratories and other on-campus resources, work stalled. Without appropriate action, this disruption will have a profound effect on the advancement and promotion of the academic workforce, and it will likely disproportionately affect women and underrepresented and minoritized researchers. So in April, DORA called on institutions to redefine their expectations and clearly communicate how evaluation procedures would be modified. In May, DORA organized a webinar with Rescuing Biomedical Research to better understand specific faculty concerns arising from the pandemic….

In the fall of 2020, DORA initiated a new community project with Schmidt to develop a means for institutions to gauge their ability to support academic assessment interventions and set them up for success. Our goal for the project was to support the development of new practices by helping institutions analyze the outcomes of their efforts. More than 70 individuals across 26 countries on 6 continents responded to our informal survey in August, and about 35 people joined us for 3 working sessions in September. From these activities, we heard that it was important to look beyond individual interventions, because the success of any single intervention depends on institutional conditions and capabilities; gauging those conditions matters as much as supporting the interventions themselves. These and other insights led us to create SPACE to Evolve Academic Assessment: a rubric for analyzing institutional conditions and progress indicators. The first draft of the rubric was developed in the last quarter of 2020. The final version was released in 2021 after an initial pilot phase with seven members of the academic community, including a college dean, policy advisor, research administrator, faculty member, and graduate student….

Another addition to the website was a repository of case studies documenting key elements of institutional change to improve academic career assessment, such as motivations, processes, timelines, new policies, and the types of people involved. The repository, Reimagining academic assessment: stories of innovation and change, was produced in partnership with the European University Association and SPARC Europe. At the time of launch, the repository included 10 structured case studies coming from 7 universities and 3 national consortia. Nine of the 10 cases are from Europe and one is from China. The case studies have shown us the importance of coalition-building to gain bottom-up support for change. We also learned that limited awareness and capacity for incentivizing and rewarding a broader range of academic activities were challenges that all the cases had to overcome. By sharing information about the creation of new policies and practices, we hope the case studies will serve as a source of inspiration for institutions seeking to review or improve academic career assessment….

Policy progress for research assessment reform continued to gain momentum in 2020. A new national policy on research assessment in China, announced in February, prohibits cash rewards for research papers and indicates that institutions can no longer hire or promote researchers based solely on their number of publications or citations. In June, Wellcome published guidance for research organizations on how to implement responsible and fair approaches for research assessment that are grounded in…

Innovation, entrepreneurship, promotion, and tenure

“Academic promotion and tenure (P&T) processes that typically prioritize faculty grants and publications can fail to fully assess and value entrepreneurial, innovative endeavors (1) that can produce the kind of societal impacts that universities are increasingly being called on to provide and that many faculty and students increasingly prioritize (2, 3). A more inclusive assessment of scholarship and creative activity to better recognize and reward innovation and entrepreneurship (I&E) will require “broadening the bar” (4) to reflect evolving forms of faculty impact without diluting or increasing the requirements for advancement. Expanding what we value as scholarship can also help augment who we value as scholars and thus support a more innovative and diverse professoriate. We highlight work by the Promotion and Tenure–Innovation and Entrepreneurship (PTIE) coalition to promote policies and practices to recognize the impact of faculty I&E. We posit that this strategy can be broadly applicable (beyond I&E) to recognize the many and evolving dimensions along which faculty create societal impacts….

I&E—along with diversity, equity, and inclusion (DEI); interdisciplinary team science; open science; community engagement; and others—represent examples of the many evolving forms of scholarship for the 21st-century faculty member. That said, these types of scholarship can be overlooked or undervalued in the process by which universities review, reward, and advance the academic workforce (8, 11, 12). As these evolutions are incorporated into the fabric of higher education, the faculty evaluation process thus needs to be updated to reflect this changing landscape….”

Coalition Members | Promotion and Tenure – Innovation and Entrepreneurship (PTIE) Summit

“Coalition members are universities committed to being part of the conversation on this topic. Coalition membership does not constitute endorsement of specific solutions or promotion & tenure (P&T) policies. Any opinions, findings, and conclusions or recommendations expressed are those of the authors and do not necessarily reflect the views of the organizations listed below. By joining the non-binding PTIE Coalition, the representative(s) from the institution are committing to the following:


• Stay engaged with us as we develop our program for the 2020 PTIE summit – providing suggestions and insights and serving as a sounding board for ideas.
• Provide a representative from your institution who will attend the Virtual National Summit on September 16-18, 2020.
• Consider adopting the recommendations from the 2020 PTIE summit for expanding P&T guidelines on your own campus.
• Allow the PTIE organizing committee to list your institution’s name and/or logo on a webpage as an institution (along with our other coalition institutions) committed to advancing I&E on their campus for their faculty and students. The webpage will be housed on our www.ptie.org website and will list that your participation in this coalition consists of these four bullet points listed here….”

How should Dora be enforced? – Research Professional News

“One lesson is that the declaration’s authors did not consider redundancy as a possible outcome of research assessment, focusing instead on hiring, promotion and funding decisions. However, in my view, redundancy processes should not be delegated to crude metrics and should be informed by the principles of Dora. 

That said, it is not Dora’s job as an organisation to intervene in the gritty particulars of industrial disputes. Nor can we arbitrate in every dispute about research assessment practices within signatory organisations. …

Recently, we have re-emphasised that university signatories must make it clear to their academic staff what signing Dora means. Organisations should demonstrate their commitment to Dora’s principles to their communities, not seek accreditation from us. In doing so, they empower their staff to challenge departures from the spirit of the declaration. Grant conditions introduced by signatory funders such as the Wellcome Trust and Research England buttress this approach. 

Dora’s approach to community engagement taps into the demand for research assessment reform while acknowledging the lack of consensus on how best to go about it. The necessary reforms are complex, intersecting with the culture change needed to make the academy more open and inclusive. They also have to overcome barriers thrown up by academics comfortable with the status quo and the increasing marketisation of higher education. In such a complex landscape, Dora has no wish to be prescriptive. Rather, we need to help institutions find their own way, which will sometimes mean allowing room for course corrections….”

NSF-funded, Husker-led project to evaluate open-access educational resources | Nebraska Today | University of Nebraska–Lincoln

“So the landmark NSF report called on educators to prioritize conceptual understanding over facts, emphasize the scientific process as much as the result, and explore ways to give students a greater stake in their own learning. Even before the report, but especially in the decade after, instructors developed so-called open educational resources — freely available lesson plans, lab activities, quizzes and other course materials — to help incite the instructional revolution.

With the support of a nearly $2 million grant from the NSF, Couch is now leading a five-year, multi-institutional effort to gauge the creation, evolution and implementation of those open educational resources….”

Boost for academic recognition and reward revolution

“Dutch academics are putting their foot on the gas in the rebellion against the uncritical use of journal impact factors to recognise and reward researchers, which was set in motion by the 2012 San Francisco Declaration on Research Assessment, or DORA.

From early next year, Utrecht University in the Netherlands will officially stop using the so-called ‘impact factor’ in all its hiring and promotions and judge its researchers by their commitment to open science, teamwork, public engagement and data sharing.

And despite opposition from some Dutch professors, the sweeping changes are gathering pace, with Leiden University among the Dutch institutions also pledging support through their Academia in Motion paper….”

Open Education in Promotion, Tenure, and Faculty Development

“This resource was developed by a working group from the Iowa Open Education Action Team (Iowa OER). Our team built upon DOERS3’s OER in Tenure & Promotion Matrix to help faculty and staff advocate for the inclusion of open educational practices (OEP) in promotion, tenure, and faculty evaluation practices at their institutions. Below, you can find our main document, directions for interacting with the text, and handouts you can use or adapt for your own advocacy work….”

Impact of “impact factor” on early-career scientists | Rising Kashmir

“Usage of JIF by the scientific community as a predictor of impact has also increased, even while evidence of its predictive value has eroded; both correlations between article citation rate and JIF and the proportion of highly cited articles published in high-impact journals have declined since 1990, because digitization of journal content and the proliferation of open-access articles have profoundly changed how relevant literature is located and cited. A review of JIF’s history was followed by a Web of Science search for articles published last year relevant to JIF; of 88 articles, about half are critiques of JIF, while the other half are, for the most part, journal editorials touting a new or higher impact factor for the year….

Hiring and promotion decisions are too important to be subject to the influence of a metric so vulnerable to manipulation and misrepresentation. Journals can boost their JIF by encouraging selective journal self-citation and by changing journal composition through the preferential publication of reviews, articles in fields with large constituencies, or articles on research topics with short half-lives. JIF has degenerated into a marketing tool for journals, as illustrated by the use of “Unofficial Impact Factors” in promotional material for journals that are not even indexed in Web of Science. It is also a marketing tool for academic institutions, as illustrated by the practice of Clarivate Analytics (which now owns the Science Citation Index) of awarding paper certificates and electronic “badges” to scientists determined to be Highly Cited Researchers (HCRs, #HighlyCited) by virtue of publishing papers in the top 1% by citations for their field and publication year. …

In science, it has been widely noted that using JIF as a proxy for scientific excellence undermines incentives to pursue novel, time-consuming, and potentially groundbreaking work…”
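For context on the metric these critiques target, the two-year journal impact factor is a simple citation ratio. A standard formulation (the symbols here are illustrative, not from the excerpt):

\[
\mathrm{JIF}_Y = \frac{C_Y}{N_{Y-1} + N_{Y-2}}
\]

where \(C_Y\) is the number of citations received in year \(Y\) by items the journal published in years \(Y-1\) and \(Y-2\), and \(N_{Y-1}\) and \(N_{Y-2}\) are the counts of “citable items” (chiefly research articles and reviews) published in those two years. Because the numerator counts citations to all content, including editorials and other front matter, while the denominator counts only citable items, shifting a journal’s mix toward reviews or encouraging self-citation tends to raise the ratio – the manipulation routes the excerpt describes.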

Dashboard will track hiring and promotion criteria

“A US$1.2 million grant will fund an effort to identify and publicize the criteria that universities around the world use to hire and promote researchers. The Declaration on Research Assessment (DORA), a global initiative to reform the evaluation of researchers, will use part of the funds to create an interactive dashboard that will shine much-needed light on a process that is often opaque and controversial, says programme director Anna Hatch, who is based in Washington DC. “When criteria are visible and transparent, universities can be held accountable,” she says. “Researchers will know how their contributions will be measured, so they can make a better case for themselves.”

DORA, conceived in 2012 at the annual meeting of the American Society for Cell Biology, called for improvements to the evaluation of researchers and the outputs of scholarly research. The declaration specifically calls for doing away with impact factors as a way to judge the merit of academics. So far, it has been signed by more than 20,000 individuals and institutions around the world.

The grant is from the Arcadia Fund, a UK-based charity that has supported many academic initiatives since its founding in 2001….”

TU Berlin unterzeichnet San Francisco Declaration on Research Assessment (TU Berlin signs DORA)

Influence of the journal impact factor in research assessment is criticized

By signing the San Francisco Declaration on Research Assessment (DORA) on July 14, 2021, Technische Universität (TU) Berlin joins an international movement of researchers and institutions advocating for greater equality and transparency in the evaluation of scientific research results. As of mid-July 2021, 2,251 organizations and 17,721 individuals worldwide had signed the declaration, including the German Research Foundation (DFG).

ERA Portal Austria – ERC announces its plans for 2022

With the adoption of its 2022 work programme, the ERC has also announced its formal endorsement of the San Francisco Declaration on Research Assessment (DORA), in line with its long-standing adherence to the highest standards of research assessment. The ERC is convinced that broad implementation of research assessment procedures integrating the DORA principles is key to an equitable transition to Open Science.