“[Q] Which brings us to Wikipedia. Many of us consult it, slightly wary of its bias, depth, and accuracy. But, as you’ll be sharing in your speech at Intellisys, the content actually ends up being surprisingly reliable. How does that happen?
[A] The answer to “should you believe Wikipedia?” isn’t simple. In my book I argue that the content of a popular Wikipedia page is actually the most reliable form of information ever created. Think about it—a peer-reviewed journal article is reviewed by three experts (who may or may not actually check every detail), and then is set in stone. The contents of a popular Wikipedia page might be reviewed by thousands of people. If something changes, it is updated. Those people have varying levels of expertise, but if they support their work with reliable citations, the results are solid. On the other hand, a less popular Wikipedia page might not be reliable at all….”
eLife is excited to announce a new approach to peer review and publishing in medicine, including public health and health policy.
One of the most notable impacts of the COVID-19 pandemic has been the desire to share important results and discoveries quickly, widely and openly, leading to rapid growth of the preprint server medRxiv. Despite the benefits of rapid, author-driven publication in accelerating research and democratising access to results, the growing number of clinical preprints means that individuals and institutions may act quickly on new information before it is adequately scrutinised.
To address this challenge, eLife is bringing its system of editorial oversight by practicing clinicians and clinician-investigators, and its rigorous, consultative peer review, to preprints. The journal’s goal is to produce ‘refereed preprints’ on medRxiv that provide readers and potential users with a detailed assessment of the research, comments on its potential impact, and perspectives on its use. By providing this rich and rapid evaluation of new results, eLife hopes that refereed preprints, rather than journal impact factor, will become a reliable indicator of quality in medical research.
“Research software is a fundamental and vital part of research worldwide, yet there remain significant challenges to software productivity, quality, reproducibility, and sustainability. Improving the practice of scholarship is a common goal of the open science, open source software and FAIR (Findable, Accessible, Interoperable and Reusable) communities, but improving the sharing of research software has not yet been a strong focus of the latter.
To improve the FAIRness of research software, the FAIR for Research Software (FAIR4RS) Working Group has sought to understand how to apply the FAIR Guiding Principles for scientific data management and stewardship to research software, bringing together existing and new community efforts. Many of the FAIR Guiding Principles can be directly applied to research software by treating software and data as similar digital research objects. However, specific characteristics of software — such as its executability, composite nature, and continuous evolution and versioning — make it necessary to revise and extend the principles.
This document presents the first version of the FAIR Principles for Research Software (FAIR4RS Principles). It is an outcome of the FAIR for Research Software Working Group (FAIR4RS WG).
The FAIR for Research Software Working Group is jointly convened as an RDA Working Group, FORCE11 Working Group, and Research Software Alliance (ReSA) Task Force.”
There is clear evidence that publishing research in an open access (OA) journal, or under an OA model, is associated with higher impact in terms of number of reads and citation rates. The development of OA journals and their quality are poorly studied in the field of urology. In this study, we aim to assess the number of OA journals, their quality in terms of CiteScore, percent cited and quartiles, and their scholarly production during the period from 2011 to 2018.
We obtained data about journals from www.scopus.com, and we filtered the list for urology journals. We obtained data for all Scopus indexed journals during the period from 2011 to 2018. For each journal, we extracted the following indices: CiteScore, Citations, scholarly output, and SCImago quartiles. We analyzed the difference in quality indices between OA and non-OA urology journals.
Urology journals have increased from 66 journals in 2011 to 99 journals in 2018. The number of OA urology journals has increased from only 10 (15.2%) journals in 2011 to 33 (33.3%) journals in 2018. The number of quartile 1 (the top 25%) journals has increased from only 1 journal in 2011 to 5 journals in 2018. Non-OA urology journals had a significantly higher CiteScore than OA journals until 2015, after which the mean difference in CiteScore became smaller and statistically insignificant.
The number and quality of OA journals in the field of urology have increased over the last few years. Despite this increase, non-OA urology journals still show higher quality and output.
“The rise of OA and the megajournals has turned out to be a lucrative model for publishing houses.1,2 But is it good for the scientific community as a whole? Opinions on this differ from field to field, with the more translational fields, like biology and medicine, taking a more enthusiastic stance and more fundamental fields, like mathematics and physics, a more skeptical one. (See the commentary by Jason Wright in Physics Today, February 2020, page 10, and reference 3.)
There is also a noticeable generational difference of opinion. Some younger scientists view the trend toward OA scientific journals more favorably than their older colleagues do. …”
Abstract: Objectives: The aim of this study is to compare the annual SJR and to evaluate the other parameters that show the scientific effect of journals in terms of open access (OA) or subscription access (SA) in the field of obstetrics and gynecology according to the SCImago database. Material and methods: This study was conducted between September and December 2019 at Near East University. The SCImago Journal & Country Rank database was used to collect information about the journals. We evaluated and compared the changes in the one-year SJR (SCImago Journal Rank) and journal impact factor (JIF) of OA and SA journals. Results: Data from 183 scientific journals in the field of obstetrics and gynecology from the period between 1999 and 2018 were evaluated; 140 of these journals were SA and 43 were OA. The average SJR of OA journals in 1999 was 0.17, while it was 0.38 for SA journals. In 2018, these values were 0.31 and 0.78 for OA and SA journals, respectively. In the comparison of JIF, the average for OA journals in 1999 was 0.09, while it was 0.66 for SA journals. In 2018, these values were 0.80 and 1.93 for OA and SA journals, respectively. Conclusions: Access to information has become easier due to technological developments, and this will continue to affect the access policies of journals. Despite the disadvantages of predatory journals, the rise of OA journals in both number and quality is likely to continue. Key words: open access journal; impact factor; subscription access journal; SCImago; obstetrics; gynecology.
Abstract: The impact of COVID-19 has underlined the need for reliable information to guide clinical practice and policy. This urgency has to be balanced against disruption to journal handling capacity and the continued need to ensure scientific rigour. We examined the reporting quality of highly disseminated COVID-19 research papers using a bibliometric analysis of reporting quality and risk of bias (RoB) among the 250 top-scoring Altmetric Attention Score (AAS) COVID-19 research papers published between January and April 2020. Method-specific RoB tools were used to assess quality. After exclusions, 84 studies from 44 journals were included. Forty-three (51%) were case series/studies, and only one was a randomized controlled trial. Most authors were from institutions based in China (n = 44, 52%). The median AAS and impact factor were 2015 (interquartile range [IQR] 1,105–4,051.5) and 12.8 (IQR 5–44.2), respectively. Nine studies (11%) utilized a formal reporting framework, 62 (74%) included a funding statement, and 41 (49%) were at high RoB. This review of the most widely disseminated COVID-19 studies highlights a preponderance of low-quality case series, with few research papers adhering to good standards of reporting. It emphasizes the need for cautious interpretation of research and the increasingly vital responsibility that journals have in ensuring high-quality publications.
Abstract: A retrospective observational study was conducted to evaluate open-access journals in obstetrics and gynaecology, published between 2011 and 2019. Journals were classified based on their registration in open-access journal directories. Of 176 journals, 47 were not registered. Journals registered in the Directory of Open Access Journals (DOAJ) demonstrated good overall quality, and their journal metrics were significantly higher than those of non-registered journals or journals registered in other directories. The lack of editor names and indexing information on a journal’s website are the most distinctive features of non-registered journals. Non-registration in an open-access journal directory indicates a lack of transparency and may ultimately indicate that a journal is predatory.
“Open science means action. And the way we offer recognition and reward to academics and university staff is key in bringing about the transition that Utrecht University aims for. Over the course of the past year the working group on Recognition and Rewards, part of the Open Science Programme, has reflected on and thoroughly debated a novel approach to ensuring that we offer room for everyone’s talent, resulting in a new vision (pdf)….
In the current system, researchers and their research are judged by journal impact factors, publisher brands and H-indices, and not by actual quality, real use, real impact and openness characteristics….
Under those circumstances, open science practices are at best seen as posing an additional burden without rewards. At worst, they are seen as actively damaging chances of future funding and of promotion and tenure. Early career researchers are perhaps the most dependent on the traditional evaluation culture for career progression, a culture held in place by established researchers as well as by institutional, national and international policies, including funder mandates….”
“By embracing Open Science as one of its five core principles1, Utrecht University aims to accelerate and improve science and scholarship and its societal impact. Open science calls for a full commitment to openness, based on a comprehensive vision regarding the relationship with society. This ongoing transition to Open Science requires us to reconsider the way in which we recognize and reward members of the academic community. It should value teamwork over individualism and calls for an open academic culture that promotes accountability, reproducibility, integrity and transparency, and where sharing (open access, FAIR data and software) and public engagement are normal daily practice. In this transition we closely align ourselves with the national VSNU program as well as developments on the international level….”