There is clear evidence that publishing research in an open access (OA) journal or under an OA model is associated with higher impact, in terms of readership and citation rates. The development and quality of OA journals are poorly studied in the field of urology. In this study, we aim to assess the number of OA urology journals, their quality in terms of CiteScore, percent cited, and quartiles, and their scholarly output during the period from 2011 to 2018.
We obtained data for all Scopus-indexed journals from www.scopus.com for the period 2011 to 2018 and filtered the list for urology journals. For each journal, we extracted the following indices: CiteScore, citations, scholarly output, and SCImago quartile. We then analyzed the differences in these quality indices between OA and non-OA urology journals.
The number of urology journals increased from 66 in 2011 to 99 in 2018. The number of OA urology journals increased from only 10 (15.2%) in 2011 to 33 (33.3%) in 2018. The number of quartile 1 (top 25%) journals increased from only 1 in 2011 to 5 in 2018. Non-OA urology journals had significantly higher CiteScores than OA journals until 2015, after which the mean difference in CiteScore became smaller and statistically non-significant.
The number and quality of OA journals in the field of urology have increased over the last few years. Despite this increase, non-OA urology journals still show higher quality and output.
“The rise of OA and the megajournals has turned out to be a lucrative model for publishing houses.1,2 But is it good for the scientific community as a whole? Opinions on this differ from field to field, with the more translational fields, like biology and medicine, taking a more enthusiastic stance and more fundamental fields, like mathematics and physics, a more skeptical one. (See the commentary by Jason Wright in Physics Today, February 2020, page 10, and reference 3.)
There is also a noticeable generational difference of opinion. Some younger scientists view the trend toward OA scientific journals more favorably than their older colleagues do. …”
Abstract: Objectives: The aim of this study is to compare the annual SJR and to evaluate the other parameters that show the scientific effect of journals in terms of open access (OA) or subscription access (SA) in the field of obstetrics and gynecology according to the SCImago database. Material and methods: This study was conducted between September and December 2019 at Near East University. The SCImago Journal & Country Rank database was used to collect information about the journals. We evaluated and compared the changes in the one-year SJR (SCImago Journal Rank) and journal impact factor (JIF) of OA and SA journals. Results: Data from 183 scientific journals in the field of obstetrics and gynecology from the period between 1999 and 2018 were evaluated, where 140 of these journals were SA and 43 were OA. The average SJR of OA journals in 1999 was 0.17, while it was 0.38 for SA journals. In 2018, these values were 0.31 and 0.78 for OA and SA journals, respectively. In the comparison of JIF, the average for OA journals in 1999 was 0.09, while it was 0.66 for SA journals. In 2018, these values were 0.80 and 1.93 for OA and SA journals, respectively. Conclusions: Access to information has become easier due to technological developments, and this will continue to affect the access policies of journals. Despite the disadvantages of predatory journals, the rise of OA journals in terms of number and quality is likely to continue. Key words: open access journal; impact factor; subscription access journal; SCImago; obstetrics; gynecology.
Abstract: The impact of COVID-19 has underlined the need for reliable information to guide clinical practice and policy. This urgency has to be balanced against disruption to journal handling capacity and the continued need to ensure scientific rigour. We examined the reporting quality of highly disseminated COVID-19 research papers using a bibliometric analysis examining reporting quality and risk of bias (RoB) amongst the 250 top-scoring Altmetric Attention Score (AAS) COVID-19 research papers between January and April 2020. Method-specific RoB tools were used to assess quality. After exclusions, 84 studies from 44 journals were included. Forty-three (51%) were case series/studies, and only one was a randomized controlled trial. Most authors were from institutions based in China (n = 44, 52%). The median AAS and impact factor were 2015 (interquartile range [IQR] 1,105–4,051.5) and 12.8 (IQR 5–44.2), respectively. Nine studies (11%) utilized a formal reporting framework, 62 (74%) included a funding statement, and 41 (49%) were at high RoB. This review of the most widely disseminated COVID-19 studies highlights a preponderance of low-quality case series with few research papers adhering to good standards of reporting. It emphasizes the need for cautious interpretation of research and the increasingly vital responsibility that journals have in ensuring high-quality publications.
Abstract: A retrospective observational study was conducted to evaluate open-access journals in obstetrics and gynaecology, published between 2011 and 2019. Journals were classified based on their registration in open-access journal directories. Of 176 journals, 47 were not registered. Journals registered in the Directory of Open Access Journals (DOAJ) demonstrated good overall quality, and their journal metrics were significantly higher than those of non-registered journals or journals registered in other directories. The lack of editor names and indexing information on a journal’s website are the most distinctive features of non-registered journals. Non-registration in an open-access journal directory indicates a lack of transparency and may ultimately indicate that a journal is predatory.
“Open science means action. And the way we offer recognition and reward to academics and university staff is key in bringing about the transition that Utrecht University aims for. Over the course of the past year the working group on Recognition and Rewards, part of the Open Science Programme, has reflected and thoroughly debated a novel approach to ensuring that we offer room for everyone’s talent, resulting in a new vision (pdf)….
In the current system, researchers and their research are judged by journal impact factors, publisher brands and H-indices, and not by actual quality, real use, real impact and openness characteristics….
Under those circumstances, at best open science practices are seen as posing an additional burden without rewards. At worst, they are seen as actively damaging chances of future funding and promotion & tenure. Early career researchers are perhaps the most dependent on traditional evaluation culture for career progression, a culture held in place by established researchers, as well as by institutional, national and international policies, including funder mandates….”
“By embracing Open Science as one of its five core principles1, Utrecht University aims to accelerate and improve science and scholarship and its societal impact. Open science calls for a full commitment to openness, based on a comprehensive vision regarding the relationship with society. This ongoing transition to Open Science requires us to reconsider the way in which we recognize and reward members of the academic community. It should value teamwork over individualism and calls for an open academic culture that promotes accountability, reproducibility, integrity and transparency, and where sharing (open access, FAIR data and software) and public engagement are normal daily practice. In this transition we closely align ourselves with the national VSNU program as well as developments on the international level….”
Abstract: Preprints are an increasingly important component of the scholarly record and preprint platforms have correspondingly grown in number. Academic communities value preprints for the opportunity to share early findings with peers and receive immediate feedback on not-yet-reviewed works. With the COVID pandemic, a broader audience is turning to preprints, as political leaders, journalists, and the public seek new information about the virus. Complications arise, however, when the unvetted nature of these works is not clearly signaled alongside discussions of their findings. In late 2020, Rick Anderson captured these concerns, highlighting cases where discredited preprints remained available to read, presenting a potential for misinformation. Anderson posited that preprint platform providers, not just editors, should ensure adequate preprint vetting and be willing to retract them. With the availability of two new open-source preprint platforms–PKP’s Open Preprint Systems (OPS) and Birkbeck’s Janeway preprint server–library publishers now have familiar, robust infrastructure for entering this space and are a logical home for such services, especially given a strong commitment to a specific research community. But what additional responsibilities must we accept–if any–as publishers of this genre? Should we establish terms for vetting of submissions? Without adequate domain knowledge, how would we enforce, or even audit, such terms? How do we indicate that a specific preprint’s findings have not yet been formally accepted? What about obligations regarding debunked publications? What are the responsibilities of platform providers, publishers, and editors? Should library publishers, as a community of practice, expand on the proposed best practices related to preprint metadata to ensure we are responsible actors in providing access to early research? 
Panelists will explore these questions during the session’s first half, and invite attendee participation for the second. Registered attendees will receive an advance survey regarding current/planned preprint publishing, in order to identify additional discussion topics.
“We have examined retracted publications in different subject fields and attempted to analyse whether online free accessibility (Open Access) influences retraction by examining the scholarly literature published from 2000 through 2019, covering the most recent 20 years of publications. InCites, a research analytics tool developed by Clarivate Analytics®, in consultation with the Web of Science, PubMed Central, and Retraction Watch databases, was used to harvest data for the study. Retracted ‘Article’ and ‘Review’ publications were examined with respect to their online accessibility mode (Toll Access and Open Access), based on non-parametric tests such as the Odds Ratio, Wilcoxon Signed Rank Test, Mann–Whitney U Test, and the Mann–Kendall and Sen’s methods. The odds of retraction for OA articles are about 1.62 times as large (62% higher) as for TA articles (95% CI 1.5, 1.7). 0.028% of OA publications are retracted compared with 0.017% of TA publications. Retractions have occurred in all subject areas. In eight subject areas, the odds of retraction are larger for OA articles than for TA articles. In three subject areas, the odds of retraction are smaller for OA articles than for TA articles. In the remaining 11 subject areas, no significant difference is observed. Post-retraction, though a decline is observed in the citation counts of both OA and TA publications (p < .01), the odds of an OA article being cited after retraction are about 1.21 times as large (21% higher) as for a TA article (95% CI 1.53, 1.72). TA publications are retracted earlier than OA publications (p < .01). We observed an increasing trend of retracted works published in both modes. However, the rate of retraction of OA publications is double the rate for TA publications.
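As a quick illustration of the arithmetic behind the headline odds ratio, the sketch below recomputes it from the rounded retraction rates quoted above (0.028% for OA, 0.017% for TA). This is illustrative only: the rounded percentages give roughly 1.65, slightly different from the study's reported 1.62, which was presumably computed from the underlying publication counts.

```python
def odds_ratio(p_exposed: float, p_control: float) -> float:
    """Odds ratio between two proportions.

    Converts each proportion p into odds p / (1 - p),
    then takes the ratio of the two odds.
    """
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_control = p_control / (1 - p_control)
    return odds_exposed / odds_control

# Retraction rates reported in the study, as fractions of all publications
p_oa = 0.00028  # 0.028% of OA publications retracted
p_ta = 0.00017  # 0.017% of TA publications retracted

print(round(odds_ratio(p_oa, p_ta), 2))  # roughly 1.65
```

Because both proportions are tiny, the odds ratio is nearly identical to the simple ratio of the rates themselves (0.028 / 0.017), which is also why the text can describe OA retraction rates as roughly double only in later years' terms while the overall odds ratio sits near 1.6.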
“ICMRA1 and WHO call on the pharmaceutical industry to provide wide access to clinical data for all new medicines and vaccines (whether full or conditional approval, under emergency use, or rejected). Clinical trial reports should be published without redaction of confidential information for reasons of overriding public health interest….
Regulators continue to spend considerable resources negotiating transparency with sponsors. Both positive and negative clinically relevant data should be made available, while only personal data and individual patient data should be redacted. In any case, aggregated data are unlikely to lead to re-identification of personal data and techniques of anonymisation can be used….
Providing systematic public access to the data supporting approvals and rejections of medicines reviewed by regulators is long overdue, despite existing initiatives such as those from the European Medicines Agency and Health Canada. The COVID-19 pandemic has revealed how essential access to data is to public trust. ICMRA and WHO call on the pharmaceutical industry to commit, within short timelines and without waiting for legal changes, to providing voluntary unrestricted access to trial results data for the benefit of public health.”
Abstract: Proper peer review and quality of published articles are often regarded as signs of reliable scientific journals. The aim of this study was to compare whether the quality of statistical reporting and data presentation differs among articles published in ‘predatory dental journals’ and in other dental journals. We evaluated 50 articles published in ‘predatory open access (OA) journals’ and 100 clinical trials published in legitimate dental journals between 2019 and 2020. The quality of statistical reporting and data presentation of each paper was assessed on a scale from 0 (poor) to 10 (high). The mean (SD) quality score of statistical reporting and data presentation was 2.5 (1.4) for the predatory OA journals, 4.8 (1.8) for the legitimate OA journals, and 5.6 (1.8) for the more visible dental journals. The mean values differed significantly (p < 0.001). The quality of statistical reporting of clinical studies published in predatory journals was found to be lower than in open access and highly cited journals. This difference in quality is a wake-up call to consume study results critically. Poor statistical reporting suggests generally lower quality in publications whose authors and journals are less subject to critique through peer review.
Abstract: Traditional peer review is undergoing increasing questioning, given the increase in scientific fraud detected and the replication crisis biomedical research is currently going through. Researchers, academic institutions, and research funding agencies actively promote scrutiny of the scientific record, and multiple tools have been developed to achieve this. Several biomedical journals were founded with post-publication peer review as a feature, and there are several digital platforms that make this process possible. In addition, an increasing number of biomedical journals allow commenting on articles published on their websites, which is also possible in preprint repositories. Moreover, publishing houses and researchers are largely using social networks for the dissemination and discussion of articles, which sometimes culminates in refutations and retractions.
Abstract: This academic thought piece provides an overview of the history of, and current trends in, publishing practices in the scientific fields known to the authors (chemical sciences, social sciences and humanities), as well as a discussion of how open access mandates such as Plan S from cOAlition S will affect these practices. It begins by summarizing the evolution of scientific publishing, in particular how it was shaped by the learned societies, and highlights how important quality assurance and scientific management mechanisms are being challenged by the recent introduction of ever more stringent open access mandates. The authors then discuss the various reactions of the researcher community to the introduction of Plan S, and elucidate a number of concerns: that it will push researchers towards a pay-to-publish system which will inevitably create new divisions between those who can afford to get their research published and those who cannot; that it will disrupt collaboration between researchers on the different sides of cOAlition S funding; and that it will have an impact on academic freedom of research and publishing. The authors analyse the dissemination of, and responses to, an open letter distributed and signed in reaction to the introduction of Plan S, before concluding with some thoughts on the potential for evolution of open access in scientific publishing.
“Yet, in far too many cases, we are still requiring very expensive textbooks in our classes. Over the degree program, students are expending thousands of dollars for texts that many sell back to the bookstore for less than half of their original value. Much of the material embedded in the texts is either already available freely online or could be assembled by the instructor from open-access sources. At the same time, many instructors still complain that the text does not precisely fit their needs; they skip chapters and assign additional readings to update the material in the text that is already one or two years out of date before the book hits the students’ desks. Why not just create your own texts and update them as often as is needed?
During the first three semesters in COVID times, awareness of open educational resources (OER) has surged among faculty members. Faculty members who put their classes online through remote learning discovered more fully the range and timeliness of relevant materials that are available online. In a study by Bay View Analytics, sponsored by the William and Flora Hewlett Foundation, it was found that faculty who adopted OER rated their materials superior to the commercial alternatives, and while the percentage of required OER materials did not increase, the percentage of supplemental OER materials did….”
“F1000 is collaborating with two Chinese customers to develop open research publishing platforms dedicated to the research and application of collaborative robots and ‘digital twin’ technologies. Both will be the world’s first open publishing platforms in their fields and will launch for submission in July 2021.
The platforms will utilise F1000’s open research publishing model, enabling all research outputs to be published open access, as well as combining the benefits of pre-printing (providing rapid publication with no editorial bias) with mechanisms to assure quality and transparency (invited and open peer review, archiving and indexing). They also offer researchers an open and transparent peer review process and have a mandatory FAIR data policy to provide full and easy access to the source data underlying the results….”