Manipulation of bibliometric data by editors of scientific journals

“Such misuse of terms not only justifies the erroneous practice of research bureaucracies evaluating research performance on those terms but also encourages editors of scientific journals and reviewers of research papers to ‘game’ the bibliometric indicators. For instance, if a journal seems to lack an adequate number of citations, the editor of that journal might decide to make it obligatory for its authors to cite papers from the journal in question. I know an Indian journal of fairly reasonable quality in terms of several other criteria, but I can no longer consider it so because it forces authors to include unnecessary (that is, plainly false) citations to papers in that journal. Any further assessment of this journal that includes self-citations will lead to a distorted measure of its real status….

An average paper in the natural or applied sciences lists at least 10 references.1 Some enterprising editors have taken this number to be the minimum for papers submitted to their journals. Such a norm is enforced in many journals from Belarus, and we authors are now so used to it that we do not even realize the distortions it creates in bibliometric data. Indeed, I often notice that some authors – merely to meet the norm of at least 10 references – cite very old textbooks and Internet resources with URLs that are no longer valid. The average for a good paper may be more than 10 references, and a paper with fewer than 10 references may still be a good paper (Einstein’s first paper did not contain even one reference in its original version!). I believe that it is up to a peer reviewer to judge whether the author has given enough references and whether they are suitable; it is not for a journal’s editor to set any mandatory quota for the number of references….

Some international journals intervene arbitrarily to revise the citations in articles they receive: I submitted a paper with my colleagues to an American journal in 2017, and one of the reviewers demanded that we replace references in Russian with references in English. Two of us responded with a correspondence note titled ‘Don’t dismiss non-English citations’, which we then submitted to Nature: in publishing that note, the editors of Nature removed some references – from the very paper2 that condemned the practice of replacing an author’s references with those more to the editor’s liking – and replaced them with a reference, perhaps more relevant, to a paper that we had never read at that point! …

Editors of many international journals are now looking not for quality papers but for papers that will not lower the impact factor of their journals….”
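The distortion the letter describes is easy to make concrete. The following minimal sketch – our illustration, not anything from the letter, with placeholder numbers – computes a journal’s average citations per paper with and without journal self-citations, the figure the writer argues is inflated by coerced citation:

    # Minimal sketch of the distortion described above: the same citation
    # counts with and without journal self-citations. Numbers are placeholders.
    papers = [
        {"citations": 12, "self_citations": 7},  # self = citations from the same journal
        {"citations": 5,  "self_citations": 4},
        {"citations": 9,  "self_citations": 6},
    ]

    with_self = sum(p["citations"] for p in papers) / len(papers)
    without_self = sum(p["citations"] - p["self_citations"] for p in papers) / len(papers)

    print(f"average citations per paper, all sources: {with_self:.2f}")
    print(f"excluding journal self-citations:         {without_self:.2f}")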

The DataCite MDC Stack

“In May, the Make Data Count team announced that we have received additional funding from the Alfred P. Sloan Foundation for work on the Make Data Count (MDC) initiative. This will enable DataCite to do additional work in two important areas:

Implement a bibliometrics dashboard that enables bibliometricians – funded by a separate Sloan grant – to do quantitative studies around data usage and citation behaviors.

Increase adoption of standardized data usage statistics across repositories by developing a log processing service that offloads much of the hard work from repositories.

In this blog post, we want to provide more technical details about the upcoming work on the bibliometrics dashboard; the log processing service will be the topic of a future blog post. The bibliometrics dashboard will be based on several important infrastructure pieces that DataCite has built over the past few years, which are briefly described below….”
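One of those infrastructure pieces is DataCite Event Data, which records the usage and citation events a dashboard like this could aggregate. The sketch below is our illustration, not code from the post: it queries the public events endpoint for a placeholder DOI, and the parameter and attribute names reflect our reading of the API, so they should be checked against the current documentation.

    import requests

    # Fetch usage and citation events for a dataset DOI from DataCite Event Data.
    # Endpoint and parameters are our reading of the public API; the DOI below
    # is a placeholder, not one taken from the post.
    API = "https://api.datacite.org/events"

    def fetch_events(doi, page_size=100):
        """Return the list of event records (citations, views, downloads) for a DOI."""
        params = {"doi": doi, "page[size]": page_size}
        resp = requests.get(API, params=params, timeout=30)
        resp.raise_for_status()
        return resp.json().get("data", [])

    for event in fetch_events("10.5061/dryad.example"):  # placeholder DOI
        attr = event["attributes"]
        print(attr["relation-type-id"], attr["subj-id"], "->", attr["obj-id"])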

Exploring possibilities to use bibliometric data to monitor Gold open access publishing at the national level – van Leeuwen – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  This article describes the possibilities to analyze open access (OA) publishing in the Netherlands in an internationally comparative way. OA publishing is now actively stimulated by Dutch science policy, as it is in the United Kingdom. We conducted a bibliometric baseline measurement to assess the current situation and to be able to measure developments over time. We collected data from various sources for three smaller European countries (the Netherlands, Denmark, and Switzerland). Not all of the analyses for this baseline measurement are included here. The analysis presented in this article focuses on the various ways OA can be defined using the Web of Science, limiting the analysis mainly to Gold OA. From the data we collected we conclude that the way OA is currently registered in various electronic bibliographic databases is quite unclear, and that the various methods applied deliver different results, although the impact scores derived from the data point in the same direction.
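To make the definitional problem concrete: one common way to operationalize Gold OA at the national level is to match each publication’s journal ISSN against a list of fully OA journals (for example, one exported from DOAJ) and compute the Gold OA share per country. The sketch below is our illustration, not the authors’ method; the file names and column labels are assumptions.

    import csv

    # Minimal sketch: classify publications as Gold OA by matching the journal
    # ISSN against a list of fully OA journals (e.g. exported from DOAJ).
    # File names and column labels are illustrative assumptions.

    def load_oa_issns(path="doaj_journals.csv"):
        """Collect the ISSNs of fully open access journals into a set."""
        with open(path, newline="", encoding="utf-8") as f:
            return {row["issn"] for row in csv.DictReader(f) if row["issn"]}

    def gold_oa_share_by_country(pubs_path, oa_issns):
        """Fraction of each country's publications appearing in Gold OA journals."""
        totals, gold = {}, {}
        with open(pubs_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):  # assumed columns: country, issn
                c = row["country"]
                totals[c] = totals.get(c, 0) + 1
                if row["issn"] in oa_issns:
                    gold[c] = gold.get(c, 0) + 1
        return {c: gold.get(c, 0) / n for c, n in totals.items()}

    if __name__ == "__main__":
        shares = gold_oa_share_by_country("publications.csv", load_oa_issns())
        for country, share in sorted(shares.items()):
            print(f"{country}: {share:.1%} Gold OA")

Swapping in a different journal list, or a database’s own OA flag, changes which papers count as Gold OA – which is precisely the registration problem the abstract reports.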

Scientific Production on Open Access: A Worldwide Bibliometric Analysis in the Academic and Scientific Context – E-LIS repository

[Abstract] This research aims to diachronically analyze the worldwide scientific production on open access, in the academic and scientific context, in order to contribute to knowledge and visualization of its main actors. The method combined bibliographical, descriptive, and analytical research with bibliometric techniques, especially indicators of production, scientific collaboration, and thematic co-occurrence. The Scopus database was used as the source to retrieve articles on the subject, yielding a corpus of 1,179 articles. Frequency tables for the variables were constructed with Bibexcel, Pajek was used to visualize the collaboration network, and VOSviewer to build the keyword network. As for the results, the most productive researchers come from countries such as the United States, Canada, France, and Spain. Journals with higher impact in the academic community have disseminated the newly constructed knowledge. A collaboration network with a few subnetworks whose co-authors come from different countries was observed. In conclusion, this study identifies the themes of debate that mark the development of open access at the international level, and it shows that open access is one of the new emerging and frontier fields of library and information science.
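The counting behind such an analysis – production frequencies, co-authorship pairs, and keyword co-occurrence – is straightforward to sketch. The example below is our minimal illustration, not the study’s actual Bibexcel/Pajek/VOSviewer pipeline; the two sample records are placeholders, assuming each record carries author and keyword lists.

    from collections import Counter
    from itertools import combinations

    # Minimal sketch of the counting behind the abstract's analysis:
    # production frequencies, co-authorship pairs, and keyword co-occurrence.
    # The sample records are placeholders, not data from the study.
    records = [
        {"authors": ["A. Silva", "B. Costa"], "keywords": ["open access", "repositories"]},
        {"authors": ["B. Costa", "C. Rossi"], "keywords": ["open access", "bibliometrics"]},
    ]

    # Production indicator: how many papers each author appears on.
    author_freq = Counter(a for r in records for a in r["authors"])

    # Edges weighted by how often two authors (or two keywords) appear together;
    # weighted pair lists like these are what Pajek and VOSviewer render as networks.
    coauthor_edges = Counter(
        pair for r in records for pair in combinations(sorted(r["authors"]), 2)
    )
    keyword_edges = Counter(
        pair for r in records for pair in combinations(sorted(r["keywords"]), 2)
    )

    print(author_freq.most_common(3))
    print(coauthor_edges.most_common(3))
    print(keyword_edges.most_common(3))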