Designing an Open Peer Review Process for Open Access Guides | Community-led Open Publication Infrastructures for Monographs (COPIM)

by Simon Worthington

The LIBER Citizen Science Working Group is embarking on the design of an open peer review process for the guidebook series it is publishing on citizen science for research libraries. The working group, in collaboration with COPIM, is seeking input and feedback on the design of the open peer review workflow. COPIM is supporting the working group by contributing its experience and knowledge of open access book publishing, particularly with respect to collaborative post-publication input, community peer review processes, and reuse. The first section of the guide, Citizen Science Skilling for Library Staff, Researchers, and the Public, has already been published, with three more sections to follow.

Books Contain Multitudes: Exploring Experimental Publishing (2022 update) | Community-Led Open Publication Infrastructures for Monographs (COPIM)

Books Contain Multitudes: Exploring Experimental Publishing is a three-part research and scoping report created to support the Experimental Publishing and Reuse Work Package (WP 6) of the COPIM project. It also serves as a resource for the scholarly community, especially for authors and publishers interested in pursuing more experimental forms of book publishing. This is the second version of the report (you can find the first version here), incorporating feedback from our community, updates, and new additions, predominantly to section 2 (typology) and section 3 (workflows, tools, and platforms). For this second version of Books Contain Multitudes we have drawn in resources from a research report we previously published on reuse of and interaction with open access books, from a series of Twitter threads we have shared online, and from feedback received over the past year on the first version; these resources are now incorporated in section 3 of the report.

COPIM (Community-led Open Publication Infrastructures for Monographs) is a 3-year project led by Coventry University as part of an international partnership of researchers, universities, librarians, open access (OA) book publishers and infrastructure providers and is funded by The Research England Development Fund and Arcadia—a charitable fund of Lisbet Rausing and Peter Baldwin. COPIM is building community-owned, open systems and infrastructures to enable OA book publishing to flourish, delivering major improvements in the infrastructures used by OA book publishers and those publishers making a transition to OA. The project addresses the key technological, structural, and organisational hurdles—around funding, production, dissemination, discovery, reuse, and archiving—that are standing in the way of the wider adoption and impact of OA books. COPIM will realign OA book publishing away from competing commercial service providers to a more horizontal and cooperative knowledge-sharing approach.

As part of seven connected Work Packages, COPIM will work on:

1) integrated capacity-building amongst presses
2) access to and development of consortial, institutional, and other funding channels
3) development and piloting of appropriate business models
4) cost reductions achieved by economies of scale
5) mutually supportive governance models
6) integration into library, repository, and digital learning environments
7) the re-use of and experimentation with OA books
8) the effective and robust archiving of OA content
9) knowledge transfer to stakeholders through various pilots

In the Experimental Publishing and Reuse Work Package we are looking at ways to more closely align existing software, tools, technologies, workflows, and infrastructures for experimental publishing with the workflows of OA book publishers. To do so, we have produced a set of experimental book pilot projects, which are being developed with the aid of these new tools and workflows and integrated into COPIM’s infrastructures. As part of these pilot projects, relationships have been established with open source publishing platforms, software providers, and projects focused on experimental long-form publications, and outreach activities have been and will be conducted with OA book publishers and authors to further promote experimental publishing opportunities. We have also explored how non-experimental OA books are (re)used by the scholarly community, examining the technologies and cultural strategies that are most effective in promoting interaction with and reuse of OA book content. This includes building communities around content and collections via annotations, comments, and post-publication review (e.g., via the social annotation platform hypothes.is) to enable more collaborative forms of knowledge production. To support this, we have mapped both existing technological solutions and cultural barriers and best practices with respect to reuse in a research report on Promoting and Nurturing Interactions with Open Access Books: Strategies for Publishers and Authors.
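To make the annotation-based interaction mentioned above more concrete, the short sketch below shows how a publisher might load the Hypothesis client on an open access book’s web page. This is a minimal, hypothetical illustration (in TypeScript, assuming a standard browser environment and the publicly documented https://hypothes.is/embed.js loader), not a description of COPIM’s own pilot implementations.

// Minimal sketch: inject the Hypothesis client so readers can annotate,
// comment on, and discuss the open access book page in the margin.
// Assumes a browser environment; the embed URL is Hypothesis's documented loader.
function enableHypothesisAnnotations(): void {
  const script = document.createElement("script");
  script.src = "https://hypothes.is/embed.js";
  script.async = true;
  document.head.appendChild(script);
}

// Load the client once the page content is ready.
window.addEventListener("DOMContentLoaded", enableHypothesisAnnotations);

Loading the client this way keeps annotation optional and separate from the book’s own markup; a publisher could equally add the same script tag directly to its page templates.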

We are also producing an online resource and toolkit, or Compendium, to promote and support the publication of experimental books. The ExPub Compendium will provide an easy-to-browse catalogue of experimental publishing tools, practices, and examples of experimental books, along with the relationships between them. This report has been produced to support …

Open Science in Practice Webinar Series | NWO

“This webinar series showcases the projects awarded an Open Science Fund grant and will cover a wide variety of open science topics. On this page you can find more information about the upcoming webinars and recordings of the previous webinars.

The NWO Open Science Fund provides financial support to the leaders and pioneers who are putting open science into practice. The projects funded in the 2020-2021 round of the programme cover a broad range of open science practices, from developing open source tools and platforms for open science, to FAIR sharing of research data and software, and to bringing about the necessary culture change.

The seminars will be held every two months in 2022, starting in mid-February 2022. Each seminar will focus on a specific open science topic and will feature 1-3 speakers, with at least 30 minutes of Q&A and discussion specific to the main topic of the seminar. Recordings will be made available here following the session. The seminars will be held in English….”

Introducing level X in the Norwegian Publication Indicator | Nordic Perspectives on Open Science

Røeggen, Vidar. 2021. “Introducing Level X in the Norwegian Publication Indicator: Involving the Research Community When Evaluating Journals Operating in the Borderland Between Predatory and Reputable Practice”. Nordic Perspectives on Open Science, December. https://doi.org/10.7557/11.6376.

By introducing the Norwegian Publication Indicator in 2004, Norway became part of an international development in which the allocation of basic funds to research institutions is increasingly linked to performance indicators (Dansk center for forskningsanalyse, 2014). Denmark and Finland have also implemented what is frequently labeled “The Norwegian Model”. The model has inspired changes in similar national models in Flanders (Belgium) and Poland, and it is used for local purposes by several universities in Sweden and by the University of Dublin in Ireland (Sivertsen, 2018). The research community has been deeply involved in designing and adopting the model in Norway, and the annual processes of evaluating journals depend on the involvement of panels in every field of research. The indicator has an interactive webpage where researchers can communicate and discuss publication channels openly, and the final decisions made by panels when nominating journals to the highest level (level 2) are transparent and openly available on the webpage.

The indicator depends on information from a national registry of approved publication channels that is managed by the Norwegian Directorate for Higher Education and Skills (HK-dir.). As of November 2021, the Norwegian register for scientific journals, series and publishers contains 26 127 journals at the basic level (level 1) and 2 193 journals at the highest level (level 2); level 2 journals are identified by research panels in 84 different fields of research. Researchers can suggest new publication channels for the registry, and these suggestions are examined according to the following criteria:

Journals/series must:

Be identified with a valid ISSN, confirmed by the International ISSN Register (a requirement since 2014)
Have an academic editorial board (or equivalent) consisting primarily of researchers from universities, institutes, or other organizations that carry out research
Have established procedures for external peer review
Have a national or international authorship, meaning that at most 2/3 of the authors may belong to the same institution

Publishers must:

Be editorially organized to publish works in accordance with the definition of a scientific publication
Have a scientific publishing program with external advisors, aimed at distribution to scholars and research institutions
Have a national or international authorship, meaning that at most 2/3 of the authors may belong to the same institution

New suggestions are prepared by the secretariat at the register and then approved by the National Board of Scholarly Publishing (NPU). The research community is thus deeply involved in both the operation and the further development of the indicator.

The secretariat at HK-dir. processes approximately 1 600 new proposals annually, and NPU has observed a new tendency in recent years: an ever-increasing number of the incoming suggestions represent channels where there is uncertainty about approval or rejection. On the one hand, an examination of the available information on these journals’ webpages shows that the journals apparently satisfy our criteria. On the other hand, NPU sometimes identifies ongoing discussion in the research community as to whether editorial practice is in accordance with how the journals describe their own routines. In addition, researchers often inform both NPU and the secretariat at HK-dir. about their own bad experiences with a journal and ask us to investigate further.

Researchers often refer to these journals as “predatory journals”, or to the activity they represent as “predatory publishing”. But what does predatory publishing mean in 2021? The term has been co-opted to describe a range of activities, from a lack of rigorous peer review to exploitative publishing models (Hanson, 2021). Journals and publishers are not simply either predatory or exemplars of high standards; rather, they sit on a continuum from predatory practice to high standards of research integrity and practice. NPU therefore discusses where to draw the line on this continuum.

A survey of researchers’ code sharing and code reuse practices, and assessment of interactive notebook prototypes | OSF Preprints

Cadwallader, L., & Hrynaszkiewicz, I. (2022, March 2). A survey of researchers’ code sharing and code reuse practices, and assessment of interactive notebook prototypes. https://doi.org/10.31219/osf.io/tys8p

Abstract: This research aimed to understand the needs and habits of researchers in relation to code sharing and reuse; gather feedback on prototype code notebooks created by NeuroLibre; and help determine strategies that publishers could use to increase code sharing. We surveyed 188 researchers in computational biology. Respondents were asked how often and why they look at code; which methods of accessing code they find useful and why; what aspects of code sharing are important to them; and how satisfied they are with their ability to complete these tasks. Respondents were asked to look at a prototype code notebook and give feedback on its features. They were also asked how much time they spent preparing code and whether they would be willing to increase this in order to use a code sharing tool, such as a notebook. For readers of research articles, the most common reason for looking at code (70%) was to gain a better understanding of the article. The most commonly encountered method of code sharing, linking articles to a code repository, was also the most useful method of accessing code from the reader’s perspective. As authors, the respondents were largely satisfied with their ability to carry out tasks related to code sharing. The most important of these tasks were ensuring that the code ran in the correct environment and sharing code with good documentation. The average researcher, according to our results, is unwilling to incur the additional costs (in time, effort, or expenditure) that are currently needed to use code sharing tools alongside a publication. We infer that different models for funding and producing interactive or executable research outputs are needed if they are to reach a large number of researchers. To increase the amount of code shared by authors, PLOS Computational Biology is therefore focusing on policy rather than tools.

Friesike et al. (2022) Striving for Societal Impact as an Early-career Researcher: Reflections on Five Common Concerns | Emerald Insight

Friesike, S., Dobusch, L. and Heimstädt, M. (2022), “Striving for Societal Impact as an Early-career Researcher: Reflections on Five Common Concerns”, Gümüsay, A.A., Marti, E., Trittin-Ulbrich, H. and Wickert, C. (Ed.) Organizing for Societal Grand Challenges (Research in the Sociology of Organizations, Vol. 79), Emerald Publishing Limited, Bingley, pp. 239-255. https://doi.org/10.1108/S0733-558X20220000079022

Abstract

Many early-career researchers (ECRs) are motivated by the prospect of creating knowledge that is useful, not just within but also beyond the academic community. Although research facilities, funders, and academic journals praise this eagerness for societal impact, the path toward such contributions is by no means straightforward. In this essay, we address five common concerns faced by ECRs when they strive for societal impact. We discuss the opportunity costs associated with impact work, the fuzziness of current impact measurement, the challenge of incremental results, the actionability of research findings, and the risk of saying something wrong in public. We reflect on these concerns in light of our own experience with impact work and conclude by suggesting a “post-heroic” perspective on impact, whereby seemingly mundane activities are linked in a meaningful way.

Professional Program in Open Education | KPU.ca – Kwantlen Polytechnic University

This comprehensive and flexible online program is designed to develop expertise and capacity across a broad spectrum of open educational practices, including open educational resources and pedagogies, educational technologies, policy, advocacy, and scholarship. The program balances both theoretical and practical elements, always ensuring that critical perspectives and issues are foregrounded.

Note that the program will be offered from Fall 2022. Check back often, as additional program information will continue to be added.

Case Study of Open Access practices: Limitations and Opportunities in Public Libraries in Nigeria | by Isaac Oloruntimilehin | Creative Commons: We Like to Share | Mar, 2022 | Medium

“Online Public Access Catalogue (OPAC)…

Institutional Repositories…

The Nigerian Copyright Act…

Expressions of Folklore…

Nigerian Language Oral History Documentation Project…

Challenges…”

Pontika et al. (2022) Indicators of research quality, quantity, openness and responsibility in institutional promotion, review and tenure policies across seven countries | MetaArXiv Preprints

Pontika, N., Klebel, T., Correia, A., Metzler, H., Knoth, P., & Ross-Hellauer, T. (2022, March 3). Indicators of research quality, quantity, openness and responsibility in institutional promotion, review and tenure policies across seven countries. https://doi.org/10.31222/osf.io/b9qaw

Abstract: The need to reform research assessment processes related to career advancement at research institutions has become increasingly recognised in recent years, especially in order to better foster open and responsible research practices. Current assessment criteria are believed to focus too heavily on inappropriate criteria related to productivity and quantity, as opposed to quality, collaborative open research practices, and the socio-economic impact of research. However, evidence of the extent of these issues is urgently needed to inform actions for reform. We analyse current practices as revealed by documentation on institutional review, promotion and tenure (RPT) processes in seven countries (Austria, Brazil, Germany, India, Portugal, the United Kingdom, and the United States of America). Through systematic coding and analysis of 143 RPT policy documents from 107 institutions for the prevalence of 17 criteria (including those related to qualitative or quantitative assessment of research, service to the institution or profession, and open and responsible research practices), we compare assessment practices across a range of international institutions to significantly broaden this evidence base. Although the prevalence of indicators varies considerably between countries, overall we find that open and responsible research practices are currently minimally rewarded and that problematic practices of quantification continue to dominate.

Editorial misconduct: the case of online predatory journals

The number of publishers that offer academics, researchers, and postgraduate students the opportunity to publish articles and book chapters quickly and easily has been growing steadily in recent years. This can be ascribed to a variety of factors, e.g., increasing Internet use, the Open Access movement, academic pressure to publish, and the emergence of publishers with questionable interests that cast doubt on the reliability and the scientific rigor of the articles they publish.

All this has transformed the scholarly and scientific publishing scene and has opened the door to the appearance of journals whose editorial procedures differ from those of legitimate journals. These publishers are called predatory because their manuscript publishing process deviates from the norm (very short publication times, non-existent or low-quality peer review, surprisingly low rejection rates, etc.).

The objective of this article is to spell out the editorial practices of these journals in order to make them easier to spot and thus to alert researchers who are unfamiliar with them. It therefore reviews and highlights the work of other authors who have for years been calling attention to how these journals operate, to their unique features and behaviors, and to the consequences of publishing in them.

The most relevant conclusions include the scant awareness of the existence of such journals (especially among less experienced researchers), the enormous harm they cause to authors’ reputations, the harm they cause to researchers taking part in promotion or professional accreditation procedures, and the feelings of chagrin and helplessness that come from seeing one’s work printed in low-quality journals. Comprehensive future research on why authors decide to submit valuable articles to these journals is also needed.

The paper therefore also discusses the scale of this phenomenon and how to distinguish such journals from ethical ones.

EIFL checklist for using OJS in journal publishing | EIFL

We have updated and revised the EIFL checklist of good practices for using the free and open source software Open Journal Systems (OJS) for journal editing and publishing. OJS is the most widely used publishing software in EIFL partner countries.

The checklist, by Iryna Kuchma, Manager of the EIFL Open Access Programme, takes forward a key goal of EIFL – to ensure the growth and sustainability of digital repositories and journal publishing platforms. 

OJS was created by the Public Knowledge Project (PKP), a multi-university initiative developing free and open source software to improve the quality and reach of scholarly publishing.

This is the second version of the checklist. It includes more details about the current production release of the software, OJS 3, as well as tips on the organizational identifiers plugin, DOAJ (Directory of Open Access Journals) registration, copyright and licensing, the PKP Preservation Network, and journal content accessibility. We have also updated the ‘further reading’ list.

Developing tools and practices to promote open and efficient science

“In this talk, I’ll introduce three new tools that aim to improve the efficiency of researchers’ work and the accumulation of knowledge. I’ll argue that minimizing extra workload and increasing the ease of use have key importance at the introduction of new research practices. The tools that I’ll share are:

The Transparency Checklist, a consensus-based general ShinyApp checklist to improve and document the transparency of research reports;
Tenzing, a solution to simplify the CRediT-based documentation and reporting the contributions to scholarly articles; and
the Multi-analyst guidance, a consensus-based guide for conducting and documenting multi-analyst studies….”