Survey on Open Science Practices in Functional Neuroimaging – ScienceDirect

Abstract:  Replicability and reproducibility of scientific findings are paramount for sustainable progress in neuroscience. Preregistration of the hypotheses and methods of an empirical study before analysis, the sharing of primary research data, and compliance with data standards such as the Brain Imaging Data Structure (BIDS) are considered effective practices to secure progress and to substantiate the quality of research. We investigated the current level of adoption of open science practices in neuroimaging and the difficulties that prevent researchers from using them.

Email invitations to participate in the survey were sent to addresses obtained through a PubMed search of human functional magnetic resonance imaging studies published between 2010 and 2020. A total of 283 respondents completed the questionnaire.

Although half of the participants were experienced with preregistration, the willingness to preregister studies in the future was modest. The majority of participants had experience with sharing primary neuroimaging data. Most of the participants were interested in implementing a standardized data structure such as BIDS in their labs. Based on demographic variables, we compared participants on seven subscales, which had been generated through factor analysis. Exploratory analyses found that experienced researchers at lower career levels had a greater fear of being transparent and that researchers residing in the EU had a higher need for data governance. Additionally, researchers at medical faculties, compared with those at other university faculties, reported less supportive supervisors with regard to open science practices and a higher need for data governance.

The results suggest growing adoption of open science practices but also highlight a number of important impediments.
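As a rough illustration of the analysis pipeline sketched in this abstract (questionnaire items reduced to seven subscales via factor analysis, followed by exploratory group comparisons), the Python snippet below shows one way such an analysis could look. The data, column layout, and grouping variable are invented for the example and are not taken from the study.

```python
# Hypothetical sketch only: simulated Likert-type responses (283 respondents x 35 items)
# and an invented demographic grouping variable (e.g. EU vs non-EU residence).
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
items = pd.DataFrame(rng.integers(1, 6, size=(283, 35)).astype(float))
groups = rng.choice(["EU", "non-EU"], size=283)

# Seven-factor solution, analogous to the seven subscales mentioned in the abstract.
fa = FactorAnalysis(n_components=7, rotation="varimax", random_state=0)
scores = fa.fit_transform(items)            # factor scores, shape (283, 7)

# Exploratory, unadjusted group comparison on each subscale.
for k in range(scores.shape[1]):
    in_eu = scores[groups == "EU", k]
    outside_eu = scores[groups == "non-EU", k]
    stat, p = mannwhitneyu(in_eu, outside_eu)
    print(f"subscale {k + 1}: U = {stat:.1f}, p = {p:.3f}")
```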

WHA clinical trial resolution: draft text now public

“The World Health Organization today published the draft text of the clinical trial resolution being debated at the ongoing World Health Assembly.

The resolution’s overall aim is to improve the coordination, design, conduct and reporting of clinical trials worldwide. It was partly spurred by the realisation that hundreds – maybe thousands – of Covid clinical trials have ended up as costly research waste….

Promoting, as appropriate, measures to facilitate the timely reporting of both positive and negative interpretable clinical trial results in alignment with the WHO joint statement on public disclosure of results from clinical trials and the WHO joint statement on transparency and data integrity, including through registering the results on a publicly available clinical trial registry within the [global trial registry network] ICTRP, and encouraging timely publication of the trial results preferably in an open-access publication.

Exploring measures during public health emergencies of international concern to encourage researchers to rapidly and responsibly share interpretable results of clinical trials, including negative results, with national regulatory bodies or other appropriate authorities, including WHO for clinical guideline development and emergency use listing (EUL), to support rapid regulatory decision-making and emergency adaptation of clinical and public health guidelines as appropriate, including through pre-print publication. …”

Solving medicine’s data bottleneck: Nightingale Open Science | Nature Medicine

“Open datasets, curated around unsolved medical problems, are vital to the development of computational research in medicine, but remain in short supply. Nightingale Open Science, a non-profit computing platform, was founded to catalyse research in this nascent field….”

New Instructions to Authors Emphasize Open Science, Transparency, Full Reporting of Sociodemographic Characteristics of the Sample, and Avoidance of Piecemeal Publication | Annals of Behavioral Medicine | Oxford Academic

“We have updated our Author guidelines to more fully reflect the journal’s values and to better align manuscript reporting practices with scientific ideals for open transparency, open science, and contextualization. Accordingly, we are adding a number of new requirements for manuscript submission to Annals of Behavioral Medicine. The updated Author Guidelines (https://academic.oup.com/abm/pages/General_Instructions) describe them in full detail. Here, we briefly summarize some of the most important changes.”

An Open-Publishing Response to the COVID-19 Infodemic – PMC

Abstract:  The COVID-19 pandemic catalyzed the rapid dissemination of papers and preprints investigating the disease and its associated virus, SARS-CoV-2. The multifaceted nature of COVID-19 demands a multidisciplinary approach, but the urgency of the crisis combined with the need for social distancing measures present unique challenges to collaborative science. We applied a massive online open publishing approach to this problem using Manubot. Through GitHub, collaborators summarized and critiqued COVID-19 literature, creating a review manuscript. Manubot automatically compiled citation information for referenced preprints, journal publications, websites, and clinical trials. Continuous integration workflows retrieved up-to-date data from online sources nightly, regenerating some of the manuscript’s figures and statistics. Manubot rendered the manuscript into PDF, HTML, LaTeX, and DOCX outputs, immediately updating the version available online upon the integration of new content. Through this effort, we organized over 50 scientists from a range of backgrounds who evaluated over 1,500 sources and developed seven literature reviews. While many efforts from the computational community have focused on mining COVID-19 literature, our project illustrates the power of open publishing to organize both technical and non-technical scientists to aggregate and disseminate information in response to an evolving crisis.
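The nightly data-refresh step described in this abstract can be pictured with a short, purely illustrative Python sketch: a scheduled job fetches the current source list, recomputes a manuscript statistic, and writes it to a file that the automated build can pick up. The URL, file name, and field names are placeholders, not the project's actual code or sources.

```python
# Illustrative sketch of a nightly data refresh; not the project's actual implementation.
import json
import pandas as pd

DATA_URL = "https://example.org/covid19-sources.csv"      # placeholder online source

def refresh_stats(url: str = DATA_URL, out_path: str = "stats.json") -> dict:
    """Fetch the latest source list and recompute manuscript statistics."""
    df = pd.read_csv(url)                                  # current snapshot of the data
    stats = {
        "n_sources": int(len(df)),                         # e.g. the "over 1,500 sources" figure
        "n_preprints": int((df["type"] == "preprint").sum()),
        "last_updated": pd.Timestamp.now(tz="UTC").isoformat(),
    }
    with open(out_path, "w") as fh:
        json.dump(stats, fh, indent=2)                     # read back in when the manuscript is rebuilt
    return stats

if __name__ == "__main__":
    refresh_stats()
```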

Why did clinical trial registries fail to prevent Covid research chaos?

“There is a long-standing global ethical obligation to register all trials before they start, shored up by regulatory requirements in some jurisdictions. Data from 18 registries worldwide feed into the WHO-managed International Clinical Trials Registry Platform (ICTRP), providing a continuously updated overview of who is researching what, when, where and how – at least in theory.

If the registry infrastructure had worked and been used as intended, much of the COVID-19 research chaos would have been avoided.

For example, researchers considering launching a hydroxychloroquine trial could have searched ICTRP and discovered that the drug was already being investigated by numerous other trials. Those researchers could accordingly have focused on investigating other treatment options instead, or aligned their outcome measures with existing trials. …

The global registry infrastructure has long been inadequately supported by legislators and regulators, and is woefully underfunded.

This persistent neglect of the world’s only comprehensive directory of medical research led to costly research waste on an incredible scale during the pandemic.

The WHO recommends that member states should by law require every interventional trial to be registered and reported. In addition, WHO recommends that all trial results should be made public specifically on a registry within 12 months, and that registry data should be kept up to date.

By enforcing these three simple rules, regulators would ensure that there is a comprehensive, up-to-date global database of all trials and their results.

In reality, existing laws in the EU and the US only cover a small minority of trials and are not being effectively enforced, while many other jurisdictions have no relevant laws at all. …”

Archiving the COVID Tracking Project – Bay Area Open Science Group – LibCal – UC Berkeley Library

“Are you interested in making your research more openly available? Want to learn about open science tools and platforms that can make your research more effective and reproducible? The Bay Area Open Science Group is intended to bring together students, faculty, and staff from the Stanford, Berkeley, and UCSF communities to learn about open science, discuss the application of open science practices in a research context, and meet other members of the community who are interested in (or already are) incorporating open science practices into their work….

Gather around virtually with colleagues at Stanford and Berkeley for a presentation on The COVID Tracking Project by Kevin Miller, a former team lead with the project who is archiving the project’s data and collections for the UCSF Archives & Special Collections. The project was a volunteer-run, community-science program that became a critical source of national pandemic data accidentally and overnight. He will discuss how it was built, and the challenges of archiving such a massive, born-digital collection….”

Factors Associated with Open Access Publishing Costs in Oncology Journals

Background Open access (OA) publishing represents an exciting opportunity to facilitate dissemination of scientific information to global audiences. However, OA publication is often associated with significant article processing charges (APCs) for authors, which may thus serve as a barrier to publication.

Methods We identified oncology journals using the SCImago Journal & Country Rank database. All journals with an OA publication option and APC data openly available were included. We searched journal websites and tabulated journal characteristics, including APC amount (USD), OA model (hybrid vs full), 2-year impact factor (IF), H-index, number of citable documents, modality/treatment specific (if applicable), and continent of origin. We generated a multiple regression model to identify journal characteristics independently associated with OA APC amount.

Results Of 367 oncology journals screened, 251 met final inclusion criteria. The median APC was 2957 USD (IQR 1958–3450). On univariable testing, journals with a greater number of citable documents (p < 0.001), higher IF (p < 0.001), higher H-index (p < 0.001), and those using the hybrid OA model (p < 0.001) or originating in Europe/North America (p < 0.001) tended to have higher APCs. In our multivariable model, number of citable documents, IF, OA publishing model, and region persisted as significant predictors of processing charges.

Conclusions OA publication costs are greater in oncology journals that publish more citable articles, utilize the hybrid OA model, have higher IF, and are based in North America or Europe. These findings may inform targeted action to help the oncology community fully appreciate the benefits of open science.
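For readers who want to picture the Methods concretely, here is a minimal sketch of the kind of multivariable model described above, written with statsmodels. The file and column names are hypothetical; the study's journal-level data are not reproduced here.

```python
# Hypothetical columns: apc_usd, citable_docs, impact_factor, h_index,
# oa_model ("hybrid"/"full"), region ("Europe/North America"/"other").
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

journals = pd.read_csv("oncology_journal_apcs.csv")       # placeholder file name

model = smf.ols(
    "apc_usd ~ np.log(citable_docs) + impact_factor + h_index"
    " + C(oa_model) + C(region)",
    data=journals,
).fit()
print(model.summary())    # coefficients show which journal characteristics predict APCs
```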

DataWorks! Challenge | HeroX

“Share your story of how you reused or shared data to further your biological and/or biomedical research effort and get recognized!…

The Federation of American Societies for Experimental Biology (FASEB) and the National Institutes of Health (NIH) are championing a bold vision of data sharing and reuse. The DataWorks! Prize fuels this vision with an annual challenge that showcases the benefits of research data management while recognizing and rewarding teams whose research demonstrates the power of data sharing or reuse practices to advance scientific discovery and human health. We are seeking new and innovative approaches to data sharing and reuse in the fields of biological and biomedical research. 

To incentivize effective practices and increase community engagement around data sharing and reuse, the 2022 DataWorks! Prize will distribute up to 12 monetary team awards. Submissions will undergo a two-stage review process, with final awards selected by a judging panel of NIH officials. The NIH will recognize winning teams with a cash prize, and winners will share their stories in a DataWorks! Prize symposium.”

A comparison of scientometric data and publication policies of ophthalmology journals

Abstract: Purpose: 

This retrospective database analysis aims to present scientometric data for journals in the field of ophthalmology and to compare these data according to the journals' open access (OA) publishing policies.

Methods: 

The scientometric data of 48 journals were obtained from the Clarivate Analytics InCites and SCImago Journal & Country Rank websites. Journal impact factor (JIF), Eigenfactor score (ES), SCImago Journal Rank (SJR), and Hirsch index (HI) were included. The OA publishing policies were categorized as full OA with publishing fees, full OA without fees, or hybrid OA. Fees are stated in US dollars (USD).

Results: 

The four scientometric indexes had strong positive correlations; the highest correlation coefficients were observed between the SJR and JIF (R = 0.906) and between the SJR and HI (R = 0.798). However, journals in the first quartile according to the JIF fell into the second and third quartiles according to the SJR and HI and into the fourth quartile according to the ES. OA articles published in hybrid journals received a median of 1.17-fold (0.15–2.71) more citations. Only the HI was higher in hybrid OA journals; the other scientometric indexes were similar to those of full OA journals. Full OA journals charged a median of 1525 USD less than hybrid journals.

Conclusion: 

The full OA model in ophthalmology journals does not have a positive effect on scientometric indexes. In hybrid OA journals, choosing to publish OA may increase citations, but this is more accurately evaluated on a journal-by-journal basis.
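As an aside for readers reproducing this kind of analysis, the pairwise correlations reported in the Results could be computed along the following lines. The abstract does not state whether the R values are Pearson or Spearman coefficients; this sketch uses Spearman rank correlation, and the file and column names are hypothetical.

```python
# Illustrative sketch only; the study's journal-level data are not reproduced here.
import pandas as pd
from scipy.stats import spearmanr

journals = pd.read_csv("ophthalmology_journals.csv")      # placeholder file, one row per journal
indexes = ["JIF", "ES", "SJR", "HI"]                       # the four scientometric indexes

# Pairwise rank correlations between the indexes.
for i, a in enumerate(indexes):
    for b in indexes[i + 1:]:
        rho, p = spearmanr(journals[a], journals[b])
        print(f"{a} vs {b}: R = {rho:.3f}, p = {p:.3g}")
```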

Frontiers | The Academic, Societal and Animal Welfare Benefits of Open Science for Animal Science | Veterinary Science

Abstract:  Animal science researchers have the obligation to reduce, refine, and replace the usage of animals in research (3R principles). Adherence to these principles can be improved by transparently publishing research findings, data and protocols. Open Science (OS) can help to increase the transparency of many parts of the research process, and its implementation should thus be considered by animal science researchers as a valuable opportunity that can contribute to the adherence to these 3R-principles. With this article, we want to encourage animal science researchers to implement a diverse set of OS practices, such as Open Access publishing, preprinting, and the pre-registration of test protocols, in their workflows.

Why preprints are good for patients | Nature Medicine

“Rapid communication of clinical trial results has likely saved lives during the COVID-19 pandemic and should become the new norm….

But during health emergencies, there are many tensions, one of which is the mismatch between the urgent need for information and evidence and the much longer time frames of scientific peer review and publication. The COVID-19 pandemic is the first global health emergency of the new information age, with data and results widely shared via social media. This has resulted in very real difficulties in distinguishing important information from noise, and real news from fake news. How should the research and medical community best manage this new reality?…

Some may argue that the speed advantage of preprints does not outweigh the risks of poor-quality, misleading or even fraudulent research being published and acted upon. I would counter that clinicians should not rely solely on peer review to assess the validity and meaningfulness of research findings. This is because dubious, perhaps fraudulent data can still get through peer review, as was seen with early COVID papers published and then retracted from two of the most prestigious medical journals. In addition, even valid data can be misleading. There has been an avalanche of observational data that passed peer review and was then used to justify treatments, most notably with hydroxychloroquine, but the susceptibility of observational methodology to moderate biases means that such data should not be the basis of patient care.

I take two lessons from our experience running the largest COVID-19 clinical trial over the last two years. The first is that the preprint system has come of age, demonstrating huge value in rapidly communicating important research findings. Almost daily I am alerted through social media by trusted sources and colleagues to important new findings published as preprints. A degree of immediate peer review is also available by means of the preprint comments section and from colleagues via social media. The full peer-reviewed manuscripts usually appear many weeks or even months later. I cannot envisage a future without such rapid dissemination of new evidence.

Given this new reality, the second lesson is that we must ensure that the medical community and policy makers are sufficiently skilled in critical thinking and scientific methods that they can make sensible decisions, regardless of whether an article is peer reviewed or not.”

“The EMA is withholding too much information”, 1 May 2022

“Transparency is a requirement for better and safer patient care. There is no valid reason to hide information about clinical trials, their methodology or their results, or evaluation data obtained on drugs after their market introduction, particularly data on adverse effects.

The creation of the European Medicines Agency (EMA) in 1995 constituted a step forward, compared with the practices of France’s drug regulatory agency at the time. For example, the EMA’s online publication of information on drug evaluations, such as European Public Assessment Reports, was a major advance in transparency as to the data in its possession….

It is one thing for pharmaceutical companies to consider that data showing the limitations of their drugs are commercially sensitive. But it is quite another – and utterly unacceptable – for the EMA to actually orchestrate the concealment of these data by pharmaceutical companies.

Transparency is not a fad or an end in itself. In the pharmaceutical field, it is a requirement for better and safer patient care. There is no valid reason to hide information about clinical trials, their methodology or their results, or evaluation data obtained on drugs after their market introduction, particularly data on adverse effects.

Perhaps there are certain individuals within the EMA who are dissatisfied with this situation? Or who are simply resigned to the power relations at play? Or who feel that the way the EMA operates is a necessary compromise, given the varying legislation? If so, these individuals are not speaking up and their opinions are not reflected in the EMA’s practices. Whatever the case may be, Prescrire’s negative assessment of the level of transparency at the EMA is intended as a wake-up call for policy makers and for legal bodies (such as the Ombudsman) who are in a position to improve the EMA’s operational practices….”