Microsoft Academic Website: No longer accessible after Dec. 31, 2020.
Microsoft Academic Graph: No longer providing updated data or access to old releases after Dec. 31, 2021; however, existing copies can still be used under license.
Microsoft Academic has been on a mission to explore new ways to empower researchers and research organizations to achieve more. The research project is characterized by two sets of technologies: one that reads all the Bing-indexed web pages and organizes the most up-to-date academic knowledge into a knowledge base called Microsoft Academic Graph (MAG), and the other that performs semantic reasoning and inference to serve that knowledge through the Microsoft Academic search website and API. We are proud that these data and web services have been found useful in numerous research projects around the world, and excited to see more community-driven, public efforts emerge.
One question that we are asked frequently, though, is how the technologies powering Microsoft Academic can be used by institutions outside of academia to make organizational knowledge more discoverable and accessible. Over the years, we have openly shared some of the building blocks, such as the language and network similarity packages, and the core search engine MAKES. With the continued progress in data access, we believe now is the right time to fully explore opportunities to extend this technology to new industries and transition to community approaches for academic research.
Microsoft Research will continue to support the automated AI agents powering Microsoft Academic services through the end of calendar year 2021. During this time, we encourage existing Microsoft Academic users to begin transitioning to other equivalent services. Below are just a few of the many great options available to the community.
Thank you very much for the years of support and encouragement. We are immensely grateful to have learned and grown from your feedback over the years. As we pass the torch to these community-driven efforts, we invite you to join us in continuing to contribute ideas and suggestions that nurture, embrace, and grow these platforms.
Abstract: New sources of citation data have recently become available, such as Microsoft Academic, Dimensions, and the OpenCitations Index of CrossRef open DOI-to-DOI citations (COCI). Although these have been compared to the Web of Science Core Collection (WoS), Scopus, or Google Scholar, there is no systematic evidence of their differences across subject categories. In response, this paper investigates 3,073,351 citations found by these six data sources to 2,515 English-language highly-cited documents published in 2006 from 252 subject categories, expanding and updating the largest previous study. Google Scholar found 88% of all citations, many of which were not found by the other sources, and nearly all citations found by the remaining sources (89–94%). A similar pattern held within most subject categories. Microsoft Academic is the second largest overall (60% of all citations), including 82% of Scopus citations and 86% of WoS citations. In most categories, Microsoft Academic found more citations than Scopus and WoS (182 and 223 subject categories, respectively), but had coverage gaps in some areas, such as Physics and some Humanities categories. After Scopus, Dimensions is fourth largest (54% of all citations), including 84% of Scopus citations and 88% of WoS citations. It found more citations than Scopus in 36 categories, more than WoS in 185, and displays some coverage gaps, especially in the Humanities. Following WoS, COCI is the smallest, with 28% of all citations. Google Scholar is still the most comprehensive source. In many subject categories Microsoft Academic and Dimensions are good alternatives to Scopus and WoS in terms of coverage.
Abstract: We present a large-scale comparison of five multidisciplinary bibliographic data sources: Scopus, Web of Science, Dimensions, Crossref, and Microsoft Academic. The comparison considers all scientific documents from the period 2008-2017 covered by these data sources. Scopus is compared in a pairwise manner with each of the other data sources. We first analyze differences between the data sources in the coverage of documents, focusing for instance on differences over time, differences per document type, and differences per discipline. We then study differences in the completeness and accuracy of citation links. Based on our analysis, we discuss strengths and weaknesses of the different data sources. We emphasize the importance of combining a comprehensive coverage of the scientific literature with a flexible set of filters for making selections of the literature.
“In this blog post, I will talk specifically about a very important source of data used by academic search engines – Microsoft Academic Graph (MAG) – and do a brief review of four academic search engines – Microsoft Academic, Lens.org, Semantic Scholar and Scinapse – which use MAG among other sources….
We live in a time where large (>50 million) scholarly discovery indexes are no longer as hard to create as in the past, thanks to freely available scholarly article index data like Crossref and MAG.”
“This article examines the current trends and elaborates the future potentials of AI in its role for making science more open and accessible. Based on the experience derived from a research project called Microsoft Academic, the advocates have reasons to be optimistic about the future of open science as the advanced discovery, ranking, and distribution technologies enabled by AI are offering strong incentives for scientists, funders and research managers to make research articles, data and software freely available and accessible….”
Abstract: In the last 3 years, several new (free) sources for academic publication and citation data have joined the now well-established Google Scholar, complementing the two traditional commercial data sources: Scopus and the Web of Science. The most important of these new data sources are Microsoft Academic (2016), Crossref (2017) and Dimensions (2018). Whereas Microsoft Academic has received some attention from the bibliometric community, there are as yet very few studies that have investigated the coverage of Crossref or Dimensions. To address this gap, this brief letter assesses Crossref and Dimensions coverage in comparison to Google Scholar, Microsoft Academic, Scopus and the Web of Science through a detailed investigation of the full publication and citation record of a single academic, as well as six top journals in Business & Economics. Overall, this first small-scale study suggests that, when compared to Scopus and the Web of Science, Crossref and Dimensions have a similar or better coverage for both publications and citations, but a substantively lower coverage than Google Scholar and Microsoft Academic. If our findings can be confirmed by larger-scale studies, Crossref and Dimensions might serve as good alternatives to Scopus and the Web of Science for both literature reviews and citation analysis. However, Google Scholar and Microsoft Academic maintain their position as the most comprehensive free sources for publication and citation data.
Abstract: “This is the first in-depth study on the coverage of Microsoft Academic (MA). The coverage of a verified publication list of a university was analyzed on the level of individual publications in MA, Scopus, and Web of Science (WoS). Citation counts were analyzed and issues related to data retrieval and data quality were examined. A Perl script was written to retrieve metadata from MA. We find that MA covers journal articles, working papers, and conference items to a substantial extent. MA surpasses Scopus and WoS clearly with respect to book-related document types and conference items but falls slightly behind Scopus with regard to journal articles. MA shows the same biases as Scopus and WoS with regard to the coverage of the social sciences and humanities, non-English publications, and open-access publications. Rank correlations of citation counts are high between MA and the benchmark databases. We find that the publication year is correct for 89.5% of all publications and the number of authors for 95.1% of the journal articles. Given the fast and ongoing development of MA, we conclude that MA is on the verge of becoming a bibliometric superpower. However, comprehensive studies on the quality of MA data are still lacking.”
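The study above retrieved MA metadata programmatically (via a Perl script) from the Academic Knowledge API, whose Evaluate endpoint returned entities with short attribute codes such as `Ti` (normalized title), `Y` (publication year), and `CC` (citation count). As a minimal sketch of what parsing such a payload could look like, the snippet below processes a response-shaped JSON document; the sample records and the author expression are invented for illustration, not taken from the retired service.

```python
import json

# Sample payload shaped like an Academic Knowledge API "Evaluate" response.
# Attribute codes follow the documented MAG entity scheme:
#   Ti = normalized title, Y = publication year, CC = citation count.
# The records themselves are invented for illustration.
SAMPLE_RESPONSE = json.dumps({
    "expr": "Composite(AA.AuN=='jane doe')",
    "entities": [
        {"Ti": "an example paper on citation analysis", "Y": 2017, "CC": 42},
        {"Ti": "another example on data source coverage", "Y": 2019, "CC": 7},
    ],
})


def parse_entities(raw):
    """Extract title/year/citation records from an Evaluate-style payload."""
    payload = json.loads(raw)
    return [
        {"title": e.get("Ti"), "year": e.get("Y"), "citations": e.get("CC", 0)}
        for e in payload.get("entities", [])
    ]


if __name__ == "__main__":
    for rec in parse_entities(SAMPLE_RESPONSE):
        print(f"{rec['year']}  {rec['citations']:>4}  {rec['title']}")
```

Once flattened like this, records from MA could be matched against Scopus or WoS export files on title and year, which is essentially the comparison the coverage studies above perform at scale.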