Abstract: With the growing number of open access (OA) mandates, the accurate measurement of OA publishing is an important policy issue. Existing studies have provided estimates of the prevalence of OA publications ranging from 27.9% to 53.7%, depending on the data source and period of investigation. This paper aims to compare the proportion of OA publishing as represented in two major bibliometric databases, Web of Science (WoS) and Dimensions, and assesses how the choice of database affects the measurement of OA across different countries. Results show that a higher proportion of the publications indexed in Dimensions are OA than of those indexed in WoS, and that this is particularly true for publications originating from outside North America and Europe. The paper concludes with a discussion of the causes and consequences of these differences, motivating the use of more inclusive databases when examining OA, especially for publications originating beyond North America and Europe.
Abstract: Dimensions was introduced as an alternative bibliometric database to the well-established Web of Science (WoS) and Scopus; however, all three databases differ fundamentally in coverage and content, as a result of their owners’ indexation philosophies. In light of these differences, we explore here, using a citation network analysis and an assessment of the normalised citation impact of “duplicate” publications, whether the three databases offer structurally different perspectives on the bibliometric landscape or are essentially homogeneous substitutes. Our citation network analysis of core and exclusive 2016–2018 publications revealed a large set of core publications indexed in all three databases that are highly self-referential. In comparison, each database indexed a set of exclusive publications that held similarly low levels of relevance to the core set and to one another, with slightly more internal communication among exclusive publications in Scopus and Dimensions than in WoS. Our comparison of normalised citations for 41,848 publications indexed in all three databases found that German sectors were valued as more impactful in Scopus and Dimensions than in WoS, particularly sectors with an applied research focus. We conclude that the databases do present structurally different perspectives, although Scopus and Dimensions, with their additional circle of applied research, vary more from the basic-research-focused WoS than they do from one another.
“The Covid-19 pandemic has triggered an explosion of knowledge, with more than 200,000 papers published to date. At one point last year, scientific output on the topic was doubling every 20 days. This huge growth poses big challenges for researchers, many of whom have pivoted to coronavirus research without experience or preparation.
Mainstream academic search engines are not built for such a situation. Tools such as Google Scholar, Scopus and Web of Science provide long, unstructured lists of results with little context.
These work well if you know what you are looking for. But for anyone diving into an unknown field, it can take weeks, even months, to identify the most important topics, publication venues and authors. This is far too long in a public health emergency.
The result has been delays, duplicated work, and problems with identifying reliable findings. This lack of tools to provide a quick overview of research results and evaluate them correctly has created a crisis in discoverability itself. …
Building on these, meta-aggregators such as Base, Core and OpenAIRE have begun to rival and in some cases outperform the proprietary search engines. …”
“Being able to find, assess and place new research within a field of knowledge is integral to any research project. For social scientists this process is increasingly likely to take place on Google Scholar, closely followed by traditional scholarly databases. In this post, Alberto Martín-Martín, Enrique Orduna-Malea, Mike Thelwall and Emilio Delgado-López-Cózar analyse the relative coverage of the three main research databases, Google Scholar, Web of Science and Scopus, finding significant divergences in the social sciences and humanities, and suggest that researchers face a trade-off when using different databases: between more comprehensive but disorderly systems, and orderly but limited systems….”
Abstract: Traditionally, Web of Science and Scopus have been the two most widely used databases for bibliometric analyses. However, during the last few years some new scholarly databases, such as Dimensions, have emerged. Several previous studies have compared different databases, either through a direct comparison of article coverage or by comparing citations across the databases. This article presents a comparative analysis of the journal coverage of three databases (Web of Science, Scopus and Dimensions), with the objective of describing, understanding and visualizing the differences among them. The most recent master journal lists of the three databases are used for the analysis. The results indicate that the databases have significantly different journal coverage, with Web of Science being the most selective and Dimensions the most exhaustive. About 99.11% and 96.61% of the journals indexed in Web of Science are also indexed in Scopus and Dimensions, respectively. Scopus has 96.42% of its indexed journals also covered by Dimensions. The Dimensions database has the most exhaustive journal coverage, with 82.22% more journals than Web of Science and 48.17% more journals than Scopus. This article also analysed the research outputs for 20 selected countries for the 2010–2018 period, as indexed in the three databases, and identified database-induced variations in research output volume, rank, global share and subject area composition for different countries. There are clearly visible variations in the research output of different countries across the three databases, along with differential coverage of subject areas. The analytical study provides an informative and practically useful picture of the journal coverage of the Web of Science, Scopus and Dimensions databases.
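The pairwise coverage percentages quoted in this abstract are simple set arithmetic over journal identifiers. A minimal sketch, assuming we have each database's master journal list as a set of ISSNs (the ISSNs below are placeholders, not real master-list data):

```python
# Pairwise journal-coverage overlap between databases, computed from sets
# of journal identifiers (e.g. ISSNs). The sets below are toy stand-ins
# for the real master journal lists of WoS, Scopus and Dimensions.
def overlap_share(a: set, b: set) -> float:
    """Percentage of journals in `a` that are also indexed in `b`."""
    return 100 * len(a & b) / len(a)

wos = {"1234-5678", "2345-6789", "0000-0001"}
scopus = {"1234-5678", "2345-6789", "3456-7890"}
dimensions = scopus | {"4567-8901", "5678-9012"}

print(f"WoS journals also in Scopus: {overlap_share(wos, scopus):.1f}%")
print(f"Scopus journals also in Dimensions: {overlap_share(scopus, dimensions):.1f}%")
```

The same `overlap_share` applied in both directions for each database pair yields the asymmetric figures reported above (e.g. 99.11% of WoS journals in Scopus, but far fewer Scopus journals in WoS).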
Maddi, A., Lardreau, E. & Sapinho, D. Open access in Europe: a national and regional comparison. Scientometrics (2021). https://doi.org/10.1007/s11192-021-03887-1
Open access to scientific publications has progressively become a key issue for European policy makers, resulting in concrete measures by the different member countries to promote its development. The aim of this paper is, after providing a quick overview of OA policies in Europe, to carry out a comparative study of OA practices across European countries, using data from the Web of Science (WoS) database. This analysis is based on two indicators: the OA share, which illustrates the evolution over time, and the normalized OA indicator (NOAI), which allows spatial comparisons, taking into account the disciplinary structures of countries. Results show a general trend towards the development of OA over time, as expected, but with large disparities between countries, depending on how early they began taking measures in favor of OA. While it is possible to stress the importance of policy and its influence on open access at the country level, this does not appear to be the case at the regional level: there is little variability between regions within the same country in terms of open access indicators.
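A field-normalized indicator like the NOAI can be understood as observed OA share divided by the OA share expected from a country's disciplinary mix. The sketch below illustrates that construction; it is an assumed analogue of standard field normalization, not necessarily the paper's exact formula, and all field names and rates are invented:

```python
# Field-normalised OA indicator: a country's observed OA share divided by
# the share expected given its disciplinary mix. All numbers are illustrative.
WORLD_OA_SHARE = {"biology": 0.60, "engineering": 0.30}  # per-field world OA rates

def noai(pubs_by_field: dict, oa_by_field: dict) -> float:
    """pubs_by_field: a country's publication counts per field;
    oa_by_field: its OA publication counts per field.
    Returns >1 if the country is more OA than its field mix predicts."""
    total = sum(pubs_by_field.values())
    observed = sum(oa_by_field.values()) / total
    expected = sum(n * WORLD_OA_SHARE[f] for f, n in pubs_by_field.items()) / total
    return observed / expected

country = {"biology": 100, "engineering": 100}
country_oa = {"biology": 70, "engineering": 30}
print(round(noai(country, country_oa), 2))  # 0.50 observed vs 0.45 expected
```

Because the indicator controls for disciplinary structure, a country specialized in low-OA fields is not penalized for its field mix when compared spatially.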
Abstract: This study is one of the first to use the recently introduced open access (OA) labels in the Web of Science (WoS) metadata to investigate whether OA articles published in journals listed in the Directory of Open Access Journals (DOAJ) experience a citation advantage in comparison to subscription journal articles, specifically those for which no self-archived versions are available. Bibliometric data on all articles and reviews indexed in WoS and published from 2013 to 2015 were analysed. In addition to the normalised citation score (NCS), we used two additional measures of citation advantage: whether an article was cited at all, and whether an article is among the most frequently cited percentile of articles within its respective subject area (pptop X%). For each WoS subject area, the strength of the relationship between access status (whether an article was published in an OA journal) and each of these three measures was calculated. We found that OA journal articles experience a citation advantage in very few subject areas and, in most of these subject areas, the citation advantage was found on only a single measure, namely whether the article was cited at all. Our results lead us to conclude that access status accounts for little of the variability in the number of citations an article accumulates. The methodology and the calculations used in this study are described in detail, and we believe that the lessons we learnt, and the recommendations we make, will be of much use to future researchers interested in using the WoS OA labels, and to the field of citation advantage in general.
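The three citation-advantage measures named in this abstract can be sketched in a few lines. This is a minimal illustration under the assumption that normalisation is against the field/year mean and that pptop X% means membership in the top X% most-cited articles of the field; the citation counts are invented:

```python
# Sketch of the three measures: normalised citation score (NCS),
# cited-at-all, and top-percentile membership (pptop X%).
import statistics

FIELD_CITATIONS = [0, 1, 2, 3, 4, 5, 10, 15, 20, 40]  # all articles in one field/year

def ncs(citations: int) -> float:
    """Citations normalised by the field/year average (here mean = 10)."""
    return citations / statistics.mean(FIELD_CITATIONS)

def cited_at_all(citations: int) -> bool:
    return citations > 0

def in_top_share(citations: int, share: float = 0.10) -> bool:
    """Is the article among the top `share` most-cited in its field? (pptop 10%)"""
    threshold = sorted(FIELD_CITATIONS, reverse=True)[int(len(FIELD_CITATIONS) * share) - 1]
    return citations >= threshold

print(ncs(20))           # 2.0: twice the field/year average
print(cited_at_all(0))   # False
print(in_top_share(40))  # True: ties for the single top-10% slot
```

The study's per-subject-area analysis then relates each of these three outcomes to access status, which is why an advantage can appear on one measure (cited at all) but not the others.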
” Clarivate Plc (NYSE:CCC), a global leader in providing trusted information and insights to accelerate the pace of innovation, is supporting the Open Access Monitor (OA Monitor), Germany with the provision of Web of Science™ publication, grant and funding data to increase the impact of scientific scholarship and to enable more equitable participation in research. Clarivate™ will provide weekly customised data from the Web of Science covering the publication literature for the DACH region (which includes Germany, Switzerland and Austria).
Supported by the German Federal Ministry of Education and Research (BMBF) and managed by Forschungszentrum Jülich, the OA Monitor provides evaluations of both the volume and financing of publications at federal, state and institutional level in the DACH region. The ability to connect the corresponding author data from the Web of Science with the publication fee information sourced by OA Monitor will have particularly broad implications for the German academic library community. The data will also help policy makers gauge the status of the transformation to Open Access (OA). …”
Abstract: New sources of citation data have recently become available, such as Microsoft Academic, Dimensions, and the OpenCitations Index of CrossRef open DOI-to-DOI citations (COCI). Although these have been compared to the Web of Science Core Collection (WoS), Scopus, or Google Scholar, there is no systematic evidence of their differences across subject categories. In response, this paper investigates 3,073,351 citations found by these six data sources to 2,515 English-language highly-cited documents published in 2006 from 252 subject categories, expanding and updating the largest previous study. Google Scholar found 88% of all citations, many of which were not found by the other sources, and nearly all citations found by the remaining sources (89–94%). A similar pattern held within most subject categories. Microsoft Academic is the second largest overall (60% of all citations), including 82% of Scopus citations and 86% of WoS citations. In most categories, Microsoft Academic found more citations than Scopus and WoS (182 and 223 subject categories, respectively), but had coverage gaps in some areas, such as Physics and some Humanities categories. After Scopus, Dimensions is fourth largest (54% of all citations), including 84% of Scopus citations and 88% of WoS citations. It found more citations than Scopus in 36 categories, more than WoS in 185, and displays some coverage gaps, especially in the Humanities. Following WoS, COCI is the smallest, with 28% of all citations. Google Scholar is still the most comprehensive source. In many subject categories Microsoft Academic and Dimensions are good alternatives to Scopus and WoS in terms of coverage.
Abstract: We present a large-scale comparison of five multidisciplinary bibliographic data sources: Scopus, Web of Science, Dimensions, Crossref, and Microsoft Academic. The comparison considers all scientific documents from the period 2008–2017 covered by these data sources. Scopus is compared in a pairwise manner with each of the other data sources. We first analyze differences between the data sources in the coverage of documents, focusing for instance on differences over time, differences per document type, and differences per discipline. We then study differences in the completeness and accuracy of citation links. Based on our analysis, we discuss strengths and weaknesses of the different data sources. We emphasize the importance of combining a comprehensive coverage of the scientific literature with a flexible set of filters for making selections of the literature.
Abstract: Rigorous evidence identification is essential for systematic reviews and meta-analyses (evidence syntheses), because the sample selection of relevant studies determines a review’s outcome, validity, and explanatory power. Yet, the search systems allowing access to this evidence provide varying levels of precision, recall, and reproducibility and also demand different levels of effort. To date, it remains unclear which search systems are most appropriate for evidence synthesis and why. Advice on which search engines and bibliographic databases to choose for systematic searches is limited and lacking systematic, empirical performance assessments.
This study investigates and compares the systematic search qualities of 28 widely used academic search systems, including Google Scholar, PubMed and Web of Science. A novel, query-based method tests how well users are able to interact with each system and retrieve records from it. The study is the first to show the extent to which search systems can effectively and efficiently perform (Boolean) searches with regard to precision, recall and reproducibility. We found substantial differences in the performance of search systems, meaning that their usability in systematic searches varies. Indeed, only half of the search systems analysed, and only a few Open Access databases, can be recommended for evidence syntheses without adding substantial caveats. In particular, our findings demonstrate why Google Scholar is inappropriate as a principal search system.
We call for database owners to recognise the requirements of evidence synthesis, and for academic journals to re-assess quality requirements for systematic reviews. Our findings aim to support researchers in conducting better searches for better evidence synthesis.
“How much is your institution spending on APC fees?
How does your institution’s Open Access footprint compare to your peers?
In this session, learn how you can use data from the Web of Science to calculate your institution’s spend on Open Access, and to benchmark your institution’s participation in OA publishing against activity at peer institutions.
We’ll also discuss recent market developments, including how Plan S, a multi-national initiative aimed at making an increasing share of research findings available in OA publications, may impact faculty at U.S. institutions….”
“The Web of Science Group (a Clarivate Analytics company) has entered into a new partnership with Emerald Publishing, to pilot the industry’s first cross-publisher, scalable and transparent peer review workflow from Publons and ScholarOne across three of Emerald’s leading journals.
Transparent peer review shows the complete peer review process from initial review to final decision, and has gained popularity with authors, reviewers and editors alike in recent years.
The new transparent peer review service will be rolled out across Online Information Review, Industrial Lubrication and Tribology and International Journal of Social Economics. The workflows ensure that alongside the published article, readers can access a comprehensive peer review history, including reviewer reports, editor decision letters and authors’ responses. Each of these elements is assigned its own digital object identifier (DOI), which helps readers easily reference and cite the peer review content. Transparency can also aid the teaching of best practice in peer review. The transparent peer review workflow complies with best-practice data privacy regulation, ensuring the individual preferences of authors, peer reviewers and journals are met….”