“Who Is the FAIRest of Them All?” Authors, Entities, and Journals Regarding FAIR Data Principles

Abstract:  The perceived need, since the second decade of the 21st century, to improve the infrastructure supporting the reuse of scholarly data led to the design of a concise set of principles and metrics named the FAIR Data Principles. This paper, part of an extended study, aims to identify the main authors, entities, and scientific journals linked to research on the FAIR Data Principles. The research followed a qualitative approach, using documentary research and the constant comparison method for coding and categorizing the sampled data. The sample studied showed that most authors were located in the Netherlands, with Europe accounting for more than 70% of the authors considered. Most of these are researchers who work in higher education institutions. Such entities can be found in most of the territorial-administrative areas under consideration, with the USA being the country with the most entities and Europe the world region where they are most numerous. The journal with the most texts in the sample was Insights, and 2020 was the year in which the most texts were published. Two of the most prominent authors in the sampled texts were located in the Netherlands, while the other two were in France and Australia.


Journal transparency – the new Journal Comparison Service from PlanS | Maverick Publishing Specialists

“At a recent STM Association webinar, Robert Kiley, Head of Open Research at the Wellcome Trust, presented an informative overview of the new Journal Comparison Service from PlanS. He stated that the goal of this new tool is to meet the needs of the research community who “have called for greater transparency regarding the services publishers provide and the fees they charge. Many publishers are willing to be responsive to this need, but until now there was no standardised or secure way for publishers to share this information with their customers.” Publishers of scholarly journals are invited to upload data on their journals – one data set for each journal. The cOAlition S Publisher’s Guide points out that the data is all information that publishers already have in some form, and it will need to be uploaded every year for the previous year.

There are two versions of data that can be supplied and I took a look at the version developed by Information Power (see https://www.coalition-s.org/journal-comparison-service-resources-publishers/ for the details and an FAQ). There are 34 fields, including basic journal identifiers plus additional information in three broad categories: prices (APC data; subscription prices plus discount policies); editorial data (acceptance rates, peer review times, COUNTER 5 data); and costs (price and service information)….

As a previous publisher of a portfolio of journals, I know that allocating these kinds of costs back to a specific journal is at best a guesstimate and very unlikely to be accurate and comparable.

The webinar included a contribution from Rod Cookson, CEO of International Water Association (IWA) Publishing. Rod has been an advocate for transparency and helped to create the toolkit for publishers who want to negotiate transformative agreements (https://www.alpsp.org/OA-agreements). Rod reported that it had taken six people 2–3 months to gather the data to complete the 34 fields in the comparison tool. IWA Publishing publishes 14 journals….”


Uses of the Journal Impact Factor in national journal rankings in China and Europe – Kulczycki – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  This paper investigates different uses of the Journal Impact Factor (JIF) in national journal rankings and discusses the merits of supplementing metrics with expert assessment. Our focus is national journal rankings used as evidence to support decisions about the distribution of institutional funding or career advancement. The seven countries under comparison are China, Denmark, Finland, Italy, Norway, Poland, and Turkey—and the region of Flanders in Belgium. With the exception of Italy, top-tier journals in national rankings are those classified at the highest level, tier, or point value implemented. A total of 3,565 (75.8%) out of 4,701 unique top-tier journals were identified as having a JIF, with 55.7% belonging to the first Journal Impact Factor quartile. Journal rankings in China, Flanders, Poland, and Turkey classify journals with a JIF as top-tier, but only when they are in the first quartile of the Average Journal Impact Factor Percentile. Journal rankings that result from expert assessment in Denmark, Finland, and Norway regularly classify journals as top-tier outside the first quartile, particularly in the social sciences and humanities. We conclude that experts, when tasked with metric-informed journal rankings, take into account quality dimensions that are not covered by JIFs.


National differences in dissemination and use of open access literature

Open Access (OA) dissemination has gained a lot of momentum over the last decade, thanks to the implementation of several OA policies by funders and institutions, as well as the development of several new platforms that facilitate the publication of OA content at low or no cost. Studies have shown that nearly half of the contemporary scientific literature could be available online for free. However, few studies have compared the use of OA literature across countries. This study aims to provide a global picture of OA adoption by country, using two indicators: publications in OA and references made to articles in OA. We find that, on average, low-income countries publish and cite OA at the highest rate, while upper-middle-income and high-income countries publish and cite OA articles at below world-average rates. These results highlight national differences in OA uptake and suggest that more OA initiatives at the institutional, national, and international levels are needed to support wider adoption of open scholarship.


Open Research in the Humanities | Unlocking Research

“The Working Group on Open Research in the Humanities was chaired by Prof. Emma Gilby (MMLL) with Dr. Rachel Leow (History), Dr. Amelie Roper (UL), Dr. Matthias Ammon (MMLL and OSC), Dr. Sam Moore (UL), Prof. Alexander Bird (Philosophy), and Prof. Ingo Gildenhard (Classics). We met for four meetings in July, September, October and December 2021, with a view to steering and developing services in support of Open Research in the Humanities. We aimed notably to offer input on how to define Open Research in the Humanities, how to communicate effectively with colleagues in the Arts and Humanities (A&H), and how to reinforce the prestige around Open Research. We hope to add our perspective to the debate on Open Science by providing a view ‘from the ground’ and from the perspective of a select group of humanities researchers. These disciplinary considerations inevitably overlap, in some measure, with the social sciences and indeed some aspects of STEM, and we hope that they will therefore have a broad audience and applicability.

Academics in A&H are, in the main, deeply committed to sharing their research. They consider their main professional contribution to be the instigation and furthering of diverse cultural conversations. They also consider open public access to their work to be a valuable goal, alongside other equally prominent ambitions: aiming at research quality and diversity, and offering support to early career scholars in a challenging and often precarious employment landscape.  

Although A&H cover a diverse range of disciplines, it is possible to discern certain common elements which guide their profile and impact. These common elements also guide the discussion that follows….”

NUI Galway IP Policy and OER: Comparing NUI Galway and Peer Institutions in Ireland | The HardiBlog: Blog for the NUI Galway Library

by Kris Meen

I blogged recently about Open Educational Resource policies and whether NUI Galway ought to think about reviewing its own policies with an eye towards making them more OER-enabling. More recently, it occurred to me that it might be useful to have a look at some peer institutions in Ireland and their IP policies to see if I could get an impression of how NUI Galway’s policies stack up to others’ in terms of their OER-friendliness. I went ahead and found the IP policies of five peer universities: Maynooth University, Trinity College Dublin, University College Cork, University College Dublin, and the University of Limerick. What I found was interesting: the IP policy at NUI Galway appears to be a bit of an outlier, and in some ways would probably be considered less OER-friendly than at least some of our peers. I include links to all six IP policies (NUI Galway and five peer institutions) below as Appendix A.


Journal Selection Primer for Neuroradiology Researchers – ScienceDirect

“Authors can also benefit from the open-access model. Open access articles are freely available to all, including physicians, researchers, and patients. Thus, it can potentially lead to an increase in visibility, use, and citation of your article (8). However, researchers must distinguish reputable journals from predatory journals in the open-access publication model. Unfortunately, predatory journals have managed to bleed into the PubMed and PubMed Central databases in recent years. Therefore, we also recommend that authors check their target journals’ affiliations with reputable scholarly organizations such as the DOAJ, the Open Access Scholarly Publishers Association (OASPA), and the Committee on Publication Ethics (COPE) (6). Some journals (indexed or non-indexed) may also be affiliated with a reputable national society; for instance, the American Journal of Neuroradiology (AJNR) is affiliated with the American Society of Neuroradiology (Table 4)….”

Attitudes, willingness, and resources to cover Article Publishing Charges (APC): the influence of age, position, income level country, discipline and open access habits

Abstract:  The rise of open access (OA) publishing has been followed by the expansion of Article Publishing Charges (APCs), which move the financial burden of scholarly journal publishing from readers to authors. We present the results of an international survey of a randomly selected sample (N=3,422) that explores attitudes towards this pay-to-publish, or Gold OA, model among scholars. We test the predictive role of age, professional position, discipline, and country income level in this regard. We found that APCs are perceived more as a global threat to science than as a deterrent to personal professional careers. Academics in low and lower-middle income countries hold the most unfavorable opinions of the APC system. The less experimental disciplines held more negative perceptions of APCs compared to STEM and the life sciences. Age and access to external funding stood as negative predictors of refusal to pay to publish. Commitment to OA self-archiving predicted a negative global perception of APCs. We conclude that access to external research funds influences the acceptance and perception of the pay-to-publish model, underscoring the economic dimension of the problem and warning about inequality between center and periphery.

Subscription-based and open access dermatology journals: the publication model dilemma

Medical journalism and the dissemination of peer-reviewed research serve to promote and protect the integrity of scholarship. We evaluated the publication models of dermatology journals to provide a snapshot of the current state of publishing. A total of 106 actively publishing dermatology journals were identified using the SCImago Journal Rankings (SJR) citation database. Journals were classified by publication model (subscription-based and open-access), publishing company, publisher type (commercial, professional society, and university), MEDLINE-indexing status, and SJR indicator. Of these, 65 (61.32%) dermatology journals were subscription-based and 41 (38.68%) were open-access. In addition, 59 (55.66%) journals were indexed in MEDLINE, and most were subscription-based (N=51) and published by commercial entities (N=54). MEDLINE-indexing status was significantly different across publisher types (P<0.001), access types (P<0.001), and the top four publishers (P=0.016). Distribution of the SJR indicator was significantly different across publisher types (P<0.001) and access types (all journals, P=0.001; indexed journals only, P=0.046). More than 91% of MEDLINE-indexed titles were published by commercial entities, and among them, four companies controlled the vast majority. Discontinuation of access to any one of the top publishers in dermatology can significantly and disproportionately impact education and scholarship.

Comparative Analyses of Medicinal Chemistry and Cheminformatics Filters with Accessible Implementation in Konstanz Information Miner (KNIME)

Abstract:  High-throughput virtual screening (HTVS) is, in conjunction with rapid advances in computer hardware, becoming a staple of drug design research campaigns and cheminformatics. In this context, virtual compound library design becomes crucial, as it generally constitutes the first step, where quality-filtered databases are essential for efficient downstream research. Multiple filters for compound library design have therefore been devised and reported in the scientific literature. We collected and compared the most common filters in medicinal chemistry (PAINS, REOS, Aggregators, van de Waterbeemd, Oprea, Fichert, Ghose, Mozzicconacci, Muegge, Egan, Murcko, Veber, Ro3, Ro4, and Ro5) to facilitate their open-access use. We then implemented these filters in the open platform Konstanz Information Miner (KNIME) as a freely accessible and simple workflow, compatible with small or large compound databases, for the benefit of readers and as an aid in the early drug design steps.


Plan S Journal Comparison Service: open for publishers to register and deposit price and service data | Plan S

cOAlition S is excited to release today the Journal Comparison Service (JCS), a secure, free, and long-anticipated digital service that aims to shed light on publishing fees and services.

Starting from today, publishers can register with the JCS publisher portal. After signing a service agreement, publishers can share information, at journal level, highlighting the services they provide and the prices they charge in line with one of the Plan S approved price and service transparency frameworks. These data are then made available to librarians via a secure online system.  Examples of data that will be made available through the service include information about the publication frequency, the peer review process, times from submission to acceptance, the range of list prices for APCs, subscription prices, and how the price is allocated over a defined set of services.


A comparison of scientometric data and publication policies of ophthalmology journals

Abstract: Purpose: 

This retrospective database analysis study aims to present the scientometric data of journals publishing in the field of ophthalmology and to compare the scientometric data of ophthalmology journals according to the open access (OA) publishing policies.

Methods: 

The scientometric data of 48 journals were obtained from Clarivate Analytics InCites and Scimago Journal & Country Rank websites. Journal impact factor (JIF), Eigenfactor score (ES), scientific journal ranking (SJR), and Hirsch index (HI) were included. The OA publishing policies were separated into full OA with publishing fees, full OA without fees, and hybrid OA. The fees were stated as US dollars (USD).

Results: 

The four scientometric indexes showed strong positive correlations; the highest correlation coefficients were observed between the SJR and JIF (R = 0.906) and between the SJR and HI (R = 0.798). However, some journals in the first quartile by JIF fell into the second and third quartiles by SJR and HI, and into the fourth quartile by ES. OA articles published in hybrid journals received a median of 1.17-fold (0.15–2.71) more citations. Only the HI was higher in hybrid OA journals; the other scientometric indexes were similar to those of full OA journals. Full OA journals charged a median of 1525 USD less than hybrid journals.
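The reported associations between indexes are pairwise correlation coefficients. As a minimal sketch of how such a coefficient is computed, the snippet below calculates Pearson's r between two journal indicators; the SJR and JIF values used here are invented for illustration, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    assert n == len(ys) and n > 1
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical SJR and JIF values for five journals (illustrative only).
sjr = [0.5, 1.2, 2.0, 3.1, 4.4]
jif = [1.0, 2.1, 3.9, 5.5, 8.2]
r = pearson_r(sjr, jif)  # close to 1.0: the two indicators rank journals similarly
```

A value of R near 1, as between SJR and JIF in the study, means the two indicators largely agree on journal ordering, which is why quartile disagreements between them are the more informative finding.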

Conclusion: 

Full OA model in ophthalmology journals does not have a positive effect on the scientometric indexes. In hybrid OA journals, choosing to publish OA may increase citations, but it would be more accurate to evaluate this on a journal basis.

Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases | SpringerLink

Abstract:  This paper introduces a novel scientometric method and applies it to estimate the subject coverage of many of the popular English-focused bibliographic databases in academia. The method uses query results as a common denominator to compare a wide variety of search engines, repositories, digital libraries, and other bibliographic databases, extending existing sampling-based approaches that analyze smaller sets of database coverage. The findings show the relative and absolute subject coverage of 56 databases—information that has often not been available before. Knowing a database’s absolute subject coverage allows the selection of the most comprehensive databases for searches requiring high recall/sensitivity, particularly relevant in lookup or exploratory searches. Knowing a database’s relative subject coverage allows the selection of specialized databases for searches requiring high precision/specificity, particularly relevant in systematic searches. The findings illustrate not only differences in the disciplinary coverage of Google Scholar, Scopus, or Web of Science, but also of less frequently analyzed databases. For example, researchers might be surprised that Meta (discontinued), Embase, and Europe PMC were found to cover more records than PubMed in medicine and other health subjects. These findings should encourage researchers to re-evaluate their go-to databases, including against newly introduced options. Searching more comprehensive databases can improve retrieval, particularly when selecting the most fitting databases needs careful thought, such as in systematic reviews and meta-analyses. This comparison can also help librarians and other information experts re-evaluate expensive database procurement strategies. Researchers without institutional access learn which open databases are likely most comprehensive in their disciplines.
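The core idea of the method, using the record counts that the same subject queries return in each database as a common denominator, can be sketched as follows; the database names and hit counts below are invented for illustration, not taken from the study:

```python
# Hypothetical hit counts that one subject's query set returns in each
# database (names and numbers are invented for illustration).
hits = {"Database A": 120_000, "Database B": 95_000, "Database C": 40_000}

# Absolute coverage: the raw number of records each database returns,
# which favors comprehensive databases for high-recall searches.
best_for_recall = max(hits, key=hits.get)

# Relative coverage: each database's share of all returned records,
# which helps identify specialized databases for high-precision searches.
total = sum(hits.values())
relative_coverage = {db: n / total for db, n in hits.items()}
```

The same counts thus support both selection strategies the abstract describes: absolute counts for recall-oriented searches, shares for precision-oriented ones.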


The methodological quality of physical therapy related trial… : American Journal of Physical Medicine & Rehabilitation

Abstract:  Objective 

We aimed to compare the methodological quality of physical therapy-related trials published in open access journals with that of trials published in subscription-based journals, adjusting for subdiscipline, intervention type, endorsement of the Consolidated Standards of Reporting Trials (CONSORT) statement, impact factor, and publication language.

Design 

In this meta-epidemiological study, we searched the Physiotherapy Evidence Database (PEDro) on May 8, 2021, to include any physical therapy-related trials published from January 1, 2020. We extracted variables such as CONSORT endorsement, the PEDro score, and publication type. We compared the PEDro score between the publication types using a multivariable generalized estimating equation (GEE) by adjusting for covariates.

Results 

A total of 2,743 trials were included, with a mean total PEDro score of 5.8 (SD 1.5). Trials from open access journals had a lower total PEDro score than those from subscription-based journals (5.5 ± 1.5 vs. 5.9 ± 1.5; mean difference [MD]: −0.4; 95% confidence interval: 0.3–0.5). GEE revealed that open access publication was significantly associated with the total PEDro score (MD: −0.42; P < 0.001).

Conclusions 

In recent physical therapy-related trials, open access publications demonstrated lower methodological quality than subscription-based publications, although the difference was small.