UKRI commissioned Research Consulting to undertake a project to support the development of its monitoring and evaluation framework.
In 2020, the French Ministry of Higher Education and Research (MESR) launched the Translations and Open Science project with the aim of exploring the opportunities offered by translation technologies to foster multilingualism in scholarly communication and thus help remove language barriers, in line with Open Science principles.
During the initial phase of the project (2020), a first working group, made up of experts in natural language processing and translation, published a report setting out recommendations and avenues for experimentation with a view to establishing a scientific translation service combining relevant technologies, resources and human skills.
Once developed, the scientific translation service is intended to:
address the needs of different users, including researchers (authors and readers), readers outside the academic community, publishers of scientific texts, dissemination platforms or open archives;
combine specialised language technologies and human skills, in particular adapted machine translation engines and in-domain language resources to support the translation process;
be founded on the principles of open science, hence based on open-source software as well as shareable resources, and used to produce open access translations.
Project Goals
In order to follow up on these recommendations and lay the foundations of the translation service, the OPERAS Research Infrastructure was commissioned by the MESR to coordinate a series of preparatory studies in the following areas:
Mapping and collection of scientific bilingual corpora: identifying and defining the conditions for collecting and preparing corpora of bilingual scientific texts, which will serve as training datasets for specialised translation engines, source data for terminology extraction, and material for translation memory creation.
Use case study for a technology-based scientific translation service: drafting an overview of the current translation practices in scholarly communication and defining the use cases of a technology-based scientific translation service (associated features, expected quality, editorial and technical workflows, and involved human experts).
Machine translation evaluation in the context of scholarly communication: evaluating a set of translation engines to translate specialised texts.
Roadmap and budget projections: making budget projections to anticipate the costs to develop and run the service.
The four preparatory studies are planned over a one-year period starting in September 2022.
The present call for tenders covers only study (3), Machine translation evaluation in the context of scholarly communication.
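For context on what study (3) involves in practice, the short Python sketch below shows one common way automatic machine translation evaluation can be set up: candidate engine outputs are compared against human reference translations using metrics such as BLEU and chrF (here via the sacrebleu library). The engine names, sentences and metric choice are illustrative assumptions, not requirements of the call; for specialised scholarly texts, automatic scores would typically be complemented by human evaluation of terminology and adequacy.

# Illustrative sketch only: scoring two hypothetical translation engines against
# human reference translations with automatic metrics (BLEU and chrF) via the
# sacrebleu library. Engine names and sentences are invented for demonstration.
from sacrebleu.metrics import BLEU, CHRF

# Reference translations produced by specialist human translators (one segment per entry).
references = [
    "Open science aims to make research outputs freely available to all.",
    "The corpus contains article abstracts from the life sciences.",
]

# Hypothetical outputs from two candidate engines for the same source segments.
engine_outputs = {
    "engine_a": [
        "Open science aims at making research outputs freely available to everyone.",
        "The corpus includes article abstracts from the life sciences.",
    ],
    "engine_b": [
        "Open science wants research products to be free.",
        "The corpus holds summaries of life science.",
    ],
}

bleu, chrf = BLEU(), CHRF()
for name, hypotheses in engine_outputs.items():
    bleu_score = bleu.corpus_score(hypotheses, [references]).score
    chrf_score = chrf.corpus_score(hypotheses, [references]).score
    print(f"{name}: BLEU = {bleu_score:.1f}, chrF = {chrf_score:.1f}")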
by Andrea Chiarelli
Between August and November 2022, almost 80 individuals from across the research and publishing landscape contributed to a study we delivered on behalf of UK Research and Innovation (UKRI), to support the development of a monitoring and evaluation (M&E) framework for their Open Access (OA) policy.
The framework will help UKRI and the sector assess open access progress, levels of compliance with the policy and its effectiveness. It will also seek to generate insights into open access publication trends across the UK and, where possible, their impact on academic practices and society.
We are in the process of finalising project outputs for public dissemination alongside our associates Bianca Kramer and Cameron Neylon, but we are now in a position to share some high-level findings and next steps. This blog covers five key principles we identified from our discussions with the research and publishing communities, as well as considering the implications for UKRI’s future M&E efforts.
Over the last month, we have added two new groups, GigaScience and GigaByte, from the journals of the same name, increasing the number of specialist teams displaying their evaluations on Sciety.
GigaScience and GigaByte are part of GigaScience Press. With a decade-long history of open-science publishing, they aim to revolutionise publishing by promoting reproducibility of analyses and data dissemination, organisation, understanding, and use. As open-access and open-data journals, they publish all research objects (data, software and workflows) from ‘big data’ studies across the life and biomedical sciences. These resources are managed using the FAIR Principles for scientific data management and stewardship, which state that research data should be Findable, Accessible, Interoperable and Reusable. They also follow the practices of transparency and openness in science publishing, and as such, they embrace open peer review (which is mandated for both journals) and preprints (which are strongly encouraged in GigaScience and mandated for GigaByte). The opportunities for combining both are covered by GigaScience in its video on open science and preprint peer review for Peer Review Week.
Meticulous self-evaluative practices in the offices of academic periodicals can be helpful in reducing widespread uncertainty about the quality of scholarly journals. This paper summarizes the results of the second part of a qualitative worldwide study among 258 senior editors of scholarly journals across disciplines. By means of a qualitative questionnaire, the survey investigated respondents’ perceptions of needed changes in their own editorial workflow that could, according to their beliefs, positively affect the quality of their journals. The results show that the most relevant past improvements indicated by respondents were achieved by: (a) raising the required quality criteria for manuscripts, by defining standards for desk rejection and/or shaping the desired qualities of the published material, and (b) guaranteeing a rigorous peer review process. Respondents believed that, currently, three areas have the most pressing need for amendment: ensuring higher overall quality of published articles (26% of respondents qualified this need as very high or high), increasing the overall quality of peer-review reports (23%), and raising reviewers’ awareness of the required quality standards (20%). Bivariate analysis shows that respondents who work with non-commercial publishers reported an overall greater need to improve implemented quality assessment processes. Work overload, inadequate reward systems, and a lack of time for development activities were cited by respondents as the greatest obstacles to implementing necessary amendments.
via deepl.com: The following spectrum for the evaluation of OA transformation contracts and publishing offers is a result of the work of the independent working group “Forum 13+” and is primarily aimed at negotiators of OA transformation contracts and thus at acquisition and licensing experts at academic libraries and library consortia.
German original:
Das folgende Spektrum zur Bewertung von Open Access-Transformationsverträgen und Verlagsangeboten ist ein Arbeitsergebnis der unabhängigen Arbeitsgruppe „Forum 13+“ und richtet sich in erster Linie an Verhandler*innen von Open Access-Transformationsverträgen und damit an die Erwerbungs- und Lizenzierungsexpert*innen an wissenschaftlichen Bibliotheken und an Bibliothekskonsortien.
by Erzsébet Tóth-Czifra
Research assessment (i.e. decisions on the allocation of research funds, academic career advancement, and the hiring of staff) has long been recognized as the Achilles heel of firmly grounding Open Science practices in research realities. In academia, scholars still face conflicting injunctions and have to walk both paved and unpaved paths, while advocacy and research policy efforts repeatedly point out the enormous complexities, systemic impediments and failed attempts in scaling up thoughtful, alternative proxies that could replace the current harmful system dominated by publisher prestige.
As an example of the many voices urging and supporting this change, in DARIAH’s 2019 response to the stakeholder consultation on the European Commission report Future of scholarly publishing and scholarly communication, we argue that the vicious circle in which research evaluation lingers can only be broken through a set of urgent and harmonized actions, and call for a new social contract at the European level involving funders, research performing institutions and their ministries, university networks, disciplinary communities and research infrastructure providers (including publishers).
Gadd, E. (2021). Mis-Measuring Our Universities: Why Global University Rankings Don’t Add Up. Frontiers in Research Metrics and Analytics. https://doi.org/10.3389/frma.2021.680023
Abstract: Draws parallels between the problematic use of GDP to evaluate economic success and the use of global university rankings to evaluate university success. Inspired by Kate Raworth’s Doughnut Economics, this perspective argues that the pursuit of growth as measured by such indicators creates universities that ‘grow’ up the rankings rather than those which ‘thrive’ or ‘mature.’ Such growth creates academic wealth divides within and between countries, despite the direction of growth as inspired by the rankings not truly reflecting universities’ critical purpose or contribution. Highlights the incompatibility between universities’ alignment with socially responsible practices and continued engagement with socially irresponsible ranking practices. Proposes four possible ways of engendering change in the university rankings space. Concludes by calling on leaders of ‘world-leading’ universities to join together to ‘lead the world’ in challenging global university rankings, and to set their own standards for thriving and maturing universities.
Avanço, Karla, Balula, Ana, Błaszczyńska, Marta, Buchner, Anna, Caliman, Lorena, Clivaz, Claire, … Wieneke, Lars. (2021, June 29). Future of Scholarly Communication. Forging an inclusive and innovative research infrastructure for scholarly communication in Social Sciences and Humanities. Zenodo. https://doi.org/10.5281/zenodo.5017705
This report discusses the scholarly communication issues in Social Sciences and Humanities that are relevant to the future development and functioning of OPERAS. The outcomes collected here can be divided into two groups of innovations regarding 1) the operation of OPERAS, and 2) its activities. The “operational” issues include the ways in which an innovative research infrastructure should be governed (Chapter 1) as well as the business models for open access publications in Social Sciences and Humanities (Chapter 2). The other group of issues is dedicated to strategic areas where OPERAS and its services may play an instrumental role in providing, enabling, or unlocking innovation: FAIR data (Chapter 3), bibliodiversity and multilingualism in scholarly communication (Chapter 4), the future of scholarly writing (Chapter 5), and quality assessment (Chapter 6). Each chapter provides an overview of the main findings and challenges with emphasis on recommendations for OPERAS and other stakeholders like e-infrastructures, publishers, SSH researchers, research performing organisations, policy makers, and funders. Links to data and further publications stemming from work concerning particular tasks are located at the end of each chapter.
“Part of our mission at bioRxiv is to alert readers to reviews and discussion of preprints and support the different ways readers provide feedback to authors on their work. These include tweets, comments on preprints and community- or journal-organized peer reviews. bioRxiv improves discoverability of such efforts by linking to peer reviews, community discussions and mentions of the preprint in social and traditional media. By aggregating this information in a new dashboard, we are now making these even easier for readers to find and access.
A series of new icons now appears in the dashboard launch bar, above each Abstract, representing different sources of preprint discussion or evaluation; the numbers of each evaluation or interaction are shown, and clicking on one of the icons opens a dashboard with details of the entries in that section….”