“Dimensions has entered a partnership with the world’s largest university press, Oxford University Press (OUP). Under the agreement, more than 27,000 books and 500 journal titles from OUP’s Oxford Academic digital publishing platform will be fully indexed and discoverable in Dimensions….”
Category Archives: oa.dimensions
[2307.09704] How are exclusively data journals indexed in major scholarly databases? An examination of the Web of Science, Scopus, Dimensions, and OpenAlex
Abstract: As part of the data-driven paradigm and open science movement, the data paper is becoming a popular way for researchers to publish their research data, based on academic norms that cross knowledge domains. Data journals have also been created to host this new academic genre. The growing number of data papers and journals has made them an important large-scale data source for understanding how research data is published and reused in our research system. One barrier to this research agenda is a lack of knowledge as to how data journals and their publications are indexed in the scholarly databases used for quantitative analysis. To address this gap, this study examines how a list of 18 exclusively data journals (i.e., journals that primarily accept data papers) are indexed in four popular scholarly databases: the Web of Science, Scopus, Dimensions, and OpenAlex. We investigate how comprehensively these databases cover the selected data journals and, in particular, how they present the document type information of data papers. We find that the coverage of data papers, as well as their document type information, is highly inconsistent across databases, which creates major challenges for future efforts to study them quantitatively. As a result, we argue that efforts should be made by data journals and databases to improve the quality of metadata for this emerging genre.
Recalibrating the Scope of Scholarly Publishing: A Modest Step in a Vast Decolonization Process | Quantitative Science Studies | MIT Press
Abstract: By analyzing 25,671 journals largely absent from common journal counts, as well as Web of Science and Scopus, this study demonstrates that scholarly communication is more of a global endeavor than is commonly credited. These journals, employing the open source publishing platform Open Journal Systems (OJS), have published 5.8 million items; they are in 136 countries, with 79.9% in the Global South and 84.2% following the OA diamond model (charging neither reader nor author). A substantial proportion of journals operate in more than one language (48.3%), with research published in a total of 60 languages (led by English, Indonesian, Spanish, and Portuguese). The journals are distributed across the social sciences (45.9%), STEM (40.3%), and the humanities (13.8%). For all their geographic, linguistic, and disciplinary diversity, 1.2% are indexed in the Web of Science and 5.7% in Scopus. On the other hand, 1.0% are found in Cabells Predatory Reports, while 1.4% show up in Beall’s questionable list. This paper seeks both to document and to historically situate the expanded scale and diversity of scholarly publishing, in the hope that this recognition may assist humankind in taking full advantage of what is increasingly a global research enterprise.
Journals to trial tool that automatically flags reproducibility and transparency issues in papers | News | Chemistry World
“A tool using natural language processing and machine learning algorithms is being rolled-out on journals to automatically flag reproducibility, transparency and authorship problems in scientific papers.
The tool, Ripeta, has existed since 2017 and has already been run on millions of journal papers following its release, but now the tool’s creators have enabled its latest versions to be run on papers before peer review. In August, Ripeta was integrated with the widely used manuscript submission system Editorial Manager in a bid to identify shortcomings in papers before they are sent out to peer review at journals. At this stage the tool’s creators won’t disclose which journals are using Ripeta, citing commercial confidentiality.
Ripeta sifts through papers to identify ‘trust markers’ for papers such as whether they contain data and code availability statements, open access statements, as well as ethical approvals, author contributions, repository notices and funding declarations.
From October 2022, the technology behind Ripeta was also integrated in the scholarly database Dimensions, giving users access to metadata about trust markers – for a fee – in 33 million academic papers published since 2010….”
Improving Research Output and Visibility at Mzuzu University Through Dimensions, the Open Access Research Discovery Solution
Registration form for an event. No other info.
Frontiers | Measuring Research Information Citizenship Across ORCID Practice | Research Metrics and Analytics
“Over the past 10 years, stakeholders across the scholarly communications community have invested significantly not only to increase the adoption of ORCID by researchers, but also to build the broader infrastructures that are needed both to support ORCID and to benefit from it. These parallel efforts have fostered the emergence of a “research information citizenry” among researchers, publishers, funders, and institutions. This paper takes a scientometric approach to investigating how effectively ORCID roles and responsibilities within this citizenry have been adopted. Focusing specifically on researchers, publishers, and funders, ORCID behaviors are measured against the approximated research world represented by the Dimensions dataset….”
The effect of data sources on the measurement of open access: A comparison of Dimensions and the Web of Science
Abstract: With the growing number of open access (OA) mandates, the accurate measurement of OA publishing is an important policy issue. Existing studies have provided estimates of the prevalence of OA publications ranging from 27.9% to 53.7%, depending on the data source and period of investigation. This paper aims at providing a comparison of the proportion of OA publishing as represented in two major bibliometric databases, Web of Science (WoS) and Dimensions, and assesses how the choice of database affects the measurement of OA across different countries. Results show that a higher proportion of publications indexed in Dimensions are OA than those indexed by WoS, and that this is particularly true for publications originating from outside North America and Europe. The paper concludes with a discussion of the cause and consequences of these differences, motivating the use of more inclusive databases when examining OA, especially for publications originating beyond North America and Europe.
[2109.13640] Measuring Research Information Citizenship Across ORCID Practice
Abstract: Over the past 10 years stakeholders across the scholarly communications community have invested significantly not only to increase the adoption of ORCID by researchers, but also to build the broader infrastructures that are needed both to support ORCID and to benefit from it. These parallel efforts have fostered the emergence of a “research information citizenry”, which comprises, but is not limited to, researchers, publishers, funders, and institutions. This paper takes a scientometric approach to investigating how effectively ORCID roles and responsibilities within this citizenry have been adopted. Focusing specifically on researchers, publishers, and funders, ORCID behaviours are measured against the approximated research world represented by the Dimensions dataset.
Data sources and their effects on the measurement of open access. Comparing Dimensions with the Web of Science
With the number of open access (OA) mandates at the funder and institution level growing, the accurate measurement of OA publishing is an important policy question. Existing studies have provided estimates of the prevalence of OA publications ranging from 27.9% to 53.7%, depending on the data source and period of investigation. This paper aims at providing a comparison of the proportion of OA publishing as represented in two bibliometric databases, Web of Science (WoS) and Dimensions, and assesses how the choice of database affects the measurement of OA across different countries. Results show that publications indexed in Dimensions have a higher percentage of OA than those indexed by WoS, especially for publications from outside North America and Europe. The paper concludes with a discussion of the cause and consequences of these differences, motivating the use of more inclusive databases when examining OA, especially for publications originating beyond North America and Europe.
From indexation policies through citation networks to normalized citation impacts: Web of Science, Scopus, and Dimensions as varying resonance chambers
Abstract: Dimensions was introduced as an alternative bibliometric database to the well-established Web of Science (WoS) and Scopus; however, all three databases have fundamental differences in coverage and content, resulting from their owners’ indexation philosophies. In light of these differences, we explore here, using a citation network analysis and assessment of normalised citation impact of “duplicate” publications, whether the three databases offer structurally different perspectives of the bibliometric landscape or if they are essentially homogenous substitutes. Our citation network analysis of core and exclusive 2016-2018 publications revealed a large set of core publications indexed in all three databases that are highly self-referential. In comparison, each database selected a set of exclusive publications that appeared to hold similarly low levels of relevance to the core set and to one another, with slightly more internal communication between exclusive publications in Scopus and Dimensions than in WoS. Our comparison of normalised citations for 41,848 publications indexed in all three databases found that German sectors were evaluated as more impactful in Scopus and Dimensions compared to WoS, particularly sectors with an applied research focus. We conclude that the databases do present structurally different perspectives, although Scopus and Dimensions, with their additional circle of applied research, vary more from the more basic research-focused WoS than they do from one another.
The journal coverage of Web of Science, Scopus and Dimensions: A comparative analysis | SpringerLink
Abstract: Traditionally, Web of Science and Scopus have been the two most widely used databases for bibliometric analyses. However, during the last few years some new scholarly databases, such as Dimensions, have come up. Several previous studies have compared different databases, either through a direct comparison of article coverage or by comparing the citations across the databases. This article aims to present a comparative analysis of the journal coverage of the three databases (Web of Science, Scopus and Dimensions), with the objective to describe, understand and visualize the differences among them. The most recent master journal lists of the three databases are used for analysis. The results indicate that the databases have significantly different journal coverage, with the Web of Science being the most selective and Dimensions being the most exhaustive. About 99.11% and 96.61% of the journals indexed in Web of Science are also indexed in Scopus and Dimensions, respectively. Scopus has 96.42% of its indexed journals also covered by Dimensions. The Dimensions database has the most exhaustive journal coverage, with 82.22% more journals than Web of Science and 48.17% more journals than Scopus. This article also analysed the research outputs for 20 selected countries for the 2010–2018 period, as indexed in the three databases, and identified database-induced variations in research output volume, rank, global share and subject area composition for different countries. It is found that there are clearly visible variations in the research output from different countries in the three databases, along with differential coverage of different subject areas by the three databases. The analytical study provides an informative and practically useful picture of the journal coverage of the Web of Science, Scopus and Dimensions databases.
Open Access surpasses subscription publication globally for the first time | Dimensions
“In the vein of keeping things moving, the Dimensions team has introduced many new features over the last few years. Most recently, they have updated the Open Access classifications in Dimensions and introduced some additional fields that some of you may find helpful.
The Open Access data in Dimensions is sourced from our colleagues at Unpaywall. When we first launched Dimensions, Unpaywall was almost as new as we were, but in the meantime both Unpaywall and Dimensions have moved on. The new release of Dimensions now tracks the Unpaywall OA classifications. This means that the filters in Dimensions should be more consistent and easier to understand – we now have: Green, Bronze, Gold, Hybrid, All OA and Closed. Of course, all the Open Access filters are available in the free version of Dimensions as well.
While we have seen the percentage of OA increasing rapidly in recent years, especially in countries like China, Germany and the UK, it was not until 2020 that more outputs were published through Open Access channels than traditional subscription channels globally….”
Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations | SpringerLink
Abstract: New sources of citation data have recently become available, such as Microsoft Academic, Dimensions, and the OpenCitations Index of CrossRef open DOI-to-DOI citations (COCI). Although these have been compared to the Web of Science Core Collection (WoS), Scopus, or Google Scholar, there is no systematic evidence of their differences across subject categories. In response, this paper investigates 3,073,351 citations found by these six data sources to 2,515 English-language highly-cited documents published in 2006 from 252 subject categories, expanding and updating the largest previous study. Google Scholar found 88% of all citations, many of which were not found by the other sources, and nearly all citations found by the remaining sources (89–94%). A similar pattern held within most subject categories. Microsoft Academic is the second largest overall (60% of all citations), including 82% of Scopus citations and 86% of WoS citations. In most categories, Microsoft Academic found more citations than Scopus and WoS (182 and 223 subject categories, respectively), but had coverage gaps in some areas, such as Physics and some Humanities categories. After Scopus, Dimensions is fourth largest (54% of all citations), including 84% of Scopus citations and 88% of WoS citations. It found more citations than Scopus in 36 categories, more than WoS in 185, and displays some coverage gaps, especially in the Humanities. Following WoS, COCI is the smallest, with 28% of all citations. Google Scholar is still the most comprehensive source. In many subject categories Microsoft Academic and Dimensions are good alternatives to Scopus and WoS in terms of coverage.
How PLOS uses Dimensions to validate next generation Open Access agreements | Dimensions
“While there are few, if any, organizations that can claim to have perfect data, the goal should undoubtedly be to strive for a level that is as good as possible. “Data underpins and supports the discussions, the agreements and of course the metrics for success following an agreement,” says Sara. She continues, “at PLOS, we combine data from our own internal sources together with external data sources like Dimensions – which give us the crucial, broader view of the market place outside of PLOS alone.”
How does Dimensions support PLOS? “PLOS relies on Dimensions for baseline data about institutions and their funding sources for agreement discussions but also for internal business analytics,” notes Sara. She adds, “Dimensions Analytics is particularly easy to use for non-analysts like myself who want to get in, get a specific question answered (like who is the most frequent funder of a specific country or institution), and get out quickly.” PLOS understands that subject matter experts need to dedicate their time to more significant impact analysis tasks. Accessing a database like Dimensions Analytics that already provides analytical views – layered on top of the data itself – means that many questions can be answered by the PLOS team at all levels. …”