Informationsplattform Open Access: Life Sciences

In medicine and the life sciences, open access is particularly supported by mandates from research funders such as the National Institutes of Health (NIH), the Wellcome Trust, and the Bill & Melinda Gates Foundation. Calls for freer access to the results of research in medicine and the life sciences point to the direct link between open access and public health, especially in the global South. Consequently, the World Health Organization (WHO) is also committed to promoting open access to the results of medical research, and operates IRIS, a repository for information sharing. Some research funders also provide publication platforms on which the results of the research that they fund can be published. Examples of such platforms include Wellcome Open Research and Gates Open Research. To comply with funding requirements, the published results of NIH-funded research must be made accessible in PubMed Central (PMC), the disciplinary repository for biomedical and life sciences journal literature at the NIH National Library of Medicine.

Open Science Conference 2021 | United Nations

“With the advent of the pandemic, the component of openness in the scientific process has achieved criticality. Since 2019, when the Dag Hammarskjöld Library held the first Open Science Conference in the United Nations headquarters in New York, the global open movement has been significantly enriched with new national and international policies and frameworks as well as daring and visionary initiatives, both private and public. Research and funding institutions, libraries, publishers switched content to open access, in some cases overnight, to ensure unhindered access for researchers and the public, solidifying a tacit understanding of Open Science principles. The roundtable discussion among 19 eminent personalities in Open Science that preceded the Library’s 2019 Conference had resulted in a document of principles elaborating on the necessary elements needed for the creation of a Global Open Science Commons for the SDGs.

In the 2nd OPEN SCIENCE CONFERENCE, From Tackling the Pandemic to Addressing Climate Change, policy makers, main IGO actors, librarians, publishers and research practitioners will engage in a public dialogue focusing on what Open Science has learned from COVID-19 and how this can be applied in actions addressing the global climate crisis, at the interface of science, technology, policy and research….”

eLife and Medicine: Rigorous review and editorial oversight of clinical preprints | eLife

“In all, eLife’s ambitions in medicine are broader than just becoming a new open-access medical journal. This is a larger effort underscoring a cultural change to emphasize the importance of preprints and reviewing preprints; to focus on transparency, not just on open access but also on open data and open methods; and to encourage responsible behaviors in medical publishing – elements that are necessary for the translation of meaningful scientific investigation to the betterment of human health. Towards this aspiration, eLife’s reinvigorated Medicine section will cherish the support of the physician–scientist community around the globe….”

Meet the new Faculty Opinions Score – Faculty Opinions Blog

“Traditional citation metrics, such as the journal impact factor, can frequently act as biased measurements of research quality and contribute to the broken system of research evaluation. Academic institutions and funding bodies are increasingly moving away from relying on citation metrics towards greater use of transparent expert opinion. 

Faculty Opinions has championed this cause for two decades, with over 230k recommendations made by our 8000+ Faculty Members, and we are excited to introduce the next step in our evolution – a formidable mark of research quality – the new Faculty Opinions Score….

The Faculty Opinions Score assigns a numerical value to research publications in Biology and Medicine to quantify their impact and quality compared to other publications in their field.

The Faculty Opinions Score is derived by combining our unique star-rated Recommendations on individual publications, made by world-class experts, with bibliometrics to produce a radically new metric in the research evaluation landscape.

Key properties of the Faculty Opinions Score: 

A score of zero is assigned to articles with no citations and no recommendations. 
The average score of a set of recommended articles has an expected value of 10. However, articles with many recommendations or highly cited articles may have a much higher score. There is no upper bound. 
Non-recommended articles generally score lower than recommended articles. 
Recommendations contribute more to the score than bibliometric performance. In other words, expert recommendations increase an article’s score (much) more than citations do….”
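
The excerpt above does not give the underlying formula, so the following is only a hypothetical sketch of how a metric with the listed properties might be assembled from star-rated recommendations and field-normalized citations. Every name and weight in it is an assumption made for illustration, not Faculty Opinions' actual method.

# Hypothetical sketch only: the real Faculty Opinions Score formula is not public.
# The weights below are invented so that this toy score reproduces the stated
# properties: zero with no citations and no recommendations, recommended articles
# averaging around 10, no upper bound, and recommendations counting for more
# than citations.

def illustrative_score(star_ratings, citations, field_citation_avg):
    """star_ratings: list of 1-3 star expert recommendations for one article."""
    if not star_ratings and citations == 0:
        return 0.0  # property: no citations and no recommendations -> score of zero

    # Recommendation component: dominant term, scaled so a typical
    # recommended article lands near the expected value of 10.
    rec_component = 5.0 * sum(star_ratings)

    # Bibliometric component: citations relative to the field average,
    # deliberately weighted less than the recommendations.
    cit_component = 2.0 * (citations / field_citation_avg) if field_citation_avg else 0.0

    return rec_component + cit_component

# Example: one 2-star recommendation plus field-average citation performance
print(illustrative_score([2], citations=30, field_citation_avg=30))  # prints 12.0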

New metric ‘leverages opinions of 8,000 experts’ | Research Information

“Faculty Opinions has introduced a new metric in the research evaluation landscape, leveraging the opinions of more than 8,000 experts. 

The Faculty Opinions Score is designed to be an early indicator of an article’s future impact and a mark of research quality. The company describes the implications for researchers, academic institutions and funding bodies as ‘promising’….”

Developing a scalable framework for partnerships between health agencies and the Wikimedia ecosystem

Abstract:  In this era of information overload and misinformation, it is a challenge to rapidly translate evidence-based health information to the public. Wikipedia is a prominent global source of health information with high traffic, multilingual coverage, and acceptable quality control practices. Viewership data following the Ebola crisis and during the COVID-19 pandemic reveals that a significant number of web users located health guidance through Wikipedia and related projects, including its media repository Wikimedia Commons and structured data complement, Wikidata.

The basic idea discussed in this paper is to increase and expedite health institutions’ global reach to the general public, by developing a specific strategy to maximize the availability of focused content in Wikimedia’s public digital knowledge archives. It was conceptualized from the experiences of leading health organizations such as Cochrane, the World Health Organization (WHO) and other United Nations Organizations, Cancer Research UK, National Network of Libraries of Medicine, and Centers for Disease Control and Prevention (CDC)’s National Institute for Occupational Safety and Health (NIOSH). Each has customized strategies to integrate content in Wikipedia and evaluate responses.

We propose the development of an interactive guide on the Wikipedia and Wikidata platforms to support health agencies, health professionals and communicators in quickly distributing key messages during crisis situations. The guide aims to cover basic features of Wikipedia, including adding key health messages to Wikipedia articles; citing expert sources to facilitate fact-checking; staging text for translation into multiple languages; automating metrics reporting; sharing non-text media; anticipating offline reuse of Wikipedia content in apps or virtual assistants; structuring data for querying and reuse through Wikidata; and profiling other flagship projects from major health organizations.

In the first phase, we propose the development of a curriculum for the guide using information from prior case studies. In the second phase, the guide would be tested on select health-related topics as new case studies. In its third phase, the guide would be finalized and disseminated.
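
One feature mentioned in the abstract, structuring data for querying and reuse through Wikidata, can be illustrated with a short query against Wikidata's public SPARQL endpoint. The specific query below (listing a few items classified as diseases) is an example chosen for illustration, not a query from the paper or the proposed guide.

# Example of querying structured health data from Wikidata's public SPARQL
# endpoint. P31 is Wikidata's "instance of" property and Q12136 is the item
# for "disease"; the query itself is illustrative only.
import requests

QUERY = """
SELECT ?disease ?diseaseLabel WHERE {
  ?disease wdt:P31 wd:Q12136 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 10
"""

response = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "health-content-example/0.1 (demo script)"},
)
response.raise_for_status()

# Print the English labels of the returned disease items
for row in response.json()["results"]["bindings"]:
    print(row["diseaseLabel"]["value"])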

What We Learned Doing Fast Grants – Future

“And so, in early April, we decided to start Fast Grants, which we hoped could be one of the faster sources of emergency science funding during the pandemic. We had modest hopes given our inexperience and lack of preparation, but we felt that the opportunity to provide even small accelerations would be worthwhile given the scale of the disaster. 

The original vision was simple: an application form that would take scientists less than 30 minutes to complete and that would deliver funding decisions within 48 hours, with money following a few days later….

The first round of grants were given out within 48 hours. Later rounds of grants, which often required additional scrutiny of earlier results, were given out within two weeks. These timelines were much shorter than the alternative sources of funding available to most scientists. Grant recipients were required to do little more than publish open access preprints and provide monthly one-paragraph updates….”

Impact of a new institutional medical journal on professional identity development and academic cultural change: A qualitative study – Hayes – – Learned Publishing – Wiley Online Library

Abstract:  We launched a new institutional open access journal, the Journal of Maine Medical Center (JMMC), in 2018. We sought to engender community support and engagement through purposeful design and implementation. An ad hoc group was formed of institutional members with diverse backgrounds. Editorial Board and Editorial Team members were drawn from within the academic community. The journal name, aims and scope, recognizable logo, cover page and images were all strategically selected in order to engender institutional and community support. Institutional funding was solicited to support an open-access, no-fee model. We adopted a philosophy of supporting novice authors with revisions of manuscripts that show merit, as opposed to immediate rejection. We assessed the success of community engagement through semi-structured interviews of authors and reviewers and qualitative analysis of the transcripts. As evidenced by their perceptions, we have made positive steps toward supporting the academic mission of our institution and the scholarly professional identity of our participants. We outline a number of elements that are relevant to the start of a new academic journal and community engagement that we feel would be of interest to others considering a similar undertaking.

Social media attention and citations of published outputs from re-use of clinical trial data: a matched comparison with articles published in the same journals | BMC Medical Research Methodology | Full Text

Abstract:  Background

Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.

Methods

Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers on 3 repositories (CSDR, YODA project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mention of policy sources, media attention) and the total number of citations were compared between these two groups.

Results

89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistical difference was found in any of the components of the Altmetric Attention Score. The median (interquartile range) numbers of citations were 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.

Conclusions

Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, matching choices have some limitations, so results should be interpreted very cautiously. Also, citations by policy sources for re-uses were rare.
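
As a minimal sketch of the kind of matched comparison described in this abstract, assume the per-article Altmetric Attention Scores have already been collected for the re-use papers and their matched controls. The excerpt does not name the statistical test used, so the paired Wilcoxon signed-rank test below is an assumption, chosen only because it suits skewed scores in a matched design; the numbers are toy values, not the study's data.

# Minimal sketch with invented numbers: compare Altmetric Attention Scores
# between re-use papers (cases) and their matched control papers.
import numpy as np
from scipy.stats import wilcoxon

def median_iqr(scores):
    """Median and interquartile range, as reported in the abstract."""
    q1, med, q3 = np.percentile(scores, [25, 50, 75])
    return med, q1, q3

# Toy values only; each re-use is paired with its matched control.
reuse_scores = [5.9, 1.3, 22.2, 8.0, 3.1, 0.4, 15.4]
control_scores = [2.8, 0.3, 12.3, 1.0, 4.2, 0.9, 6.5]

for label, scores in (("re-uses", reuse_scores), ("controls", control_scores)):
    med, q1, q3 = median_iqr(scores)
    print(f"{label}: median {med:.1f} (IQR {q1:.1f}-{q3:.1f})")

# Paired comparison of matched case-control scores (the test choice is an assumption)
stat, p_value = wilcoxon(reuse_scores, control_scores)
print(f"p = {p_value:.2f}")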

Editorial: Can Journals, as Trusted Intermediaries, Cut Through the Signal-to-Noise Problem in Medical Publishing?

“Although open-access publication has its upsides, for purposes of this essay, I am going to lump publishing in open-access journals in with posting to preprint servers as potentially problematic. My reason for doing so is that both make it harder for clinicians to separate helpful research from distracting, unhelpful, and in the case of preprint servers, unvetted material. In previous editorials, I’ve highlighted some redeeming qualities of open-access publication [17, 18]; I also note that open access is a publication option here at CORR®. But from where I sit today, it’s becoming clear to me that the distortion of publication incentives that are inherent to fully open-access journals does not serve readers (or their patients) very well….”

Delays in reporting and publishing trial results during pandemics: cross sectional analysis of 2009 H1N1, 2014 Ebola, and 2016 Zika clinical trials | BMC Medical Research Methodology | Full Text

Abstract:  Background

Pandemic events often trigger a surge of clinical trial activity aimed at rapidly evaluating therapeutic or preventative interventions. Ensuring rapid public access to the complete and unbiased trial record is particularly critical for pandemic research given the urgent associated public health needs. The World Health Organization (WHO) established standards requiring posting of results to a registry within 12 months of trial completion and publication in a peer reviewed journal within 24 months of completion, though compliance with these requirements among pandemic trials is unknown.

Methods

This cross-sectional analysis characterizes availability of results in trial registries and publications among registered trials performed during the 2009 H1N1 influenza, 2014 Ebola, and 2016 Zika pandemics. We searched trial registries to identify clinical trials testing interventions related to these pandemics, and determined the time elapsed between trial completion and availability of results in the registry. We also performed a comprehensive search of MEDLINE via PubMed, Google Scholar, and EMBASE to identify corresponding peer reviewed publications. The primary outcome was the compliance with either of the WHO’s established standards for sharing clinical trial results. Secondary outcomes included compliance with both standards, and assessing the time elapsed between trial completion and public availability of results.

Results

Three hundred thirty-three trials met eligibility criteria, including 261 H1N1 influenza trials, 60 Ebola trials, and 12 Zika trials. Of these, 139 (42%) either had results available in the trial registry within 12 months of study completion or had results available in a peer-reviewed publication within 24 months. Five trials (2%) met both standards. No results were available in either a registry or publication for 59 trials (18%). Among trials with registered results, a median of 42 months (IQR 16–76 months) elapsed between trial completion and results posting. For published trials, the median elapsed time between completion and publication was 21 months (IQR 9–34 months). Results were available within 24 months of study completion in either the trial registry or a peer-reviewed publication for 166 trials (50%).

Conclusions

Very few trials performed during prior pandemic events met established standards for the timely public dissemination of trial results.
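
The compliance outcome defined in this abstract reduces to simple date arithmetic. A minimal sketch of that check, using invented trial dates rather than the study's data, might look like this:

# Minimal sketch of the WHO timeliness check described above: a trial complies
# if registry results appear within 12 months of completion or a peer-reviewed
# publication appears within 24 months. All dates below are invented examples.
from datetime import date

def months_between(start, end):
    """Approximate whole months elapsed between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month)

def meets_who_standard(completion, registry_results=None, publication=None):
    in_registry = (registry_results is not None
                   and months_between(completion, registry_results) <= 12)
    in_publication = (publication is not None
                      and months_between(completion, publication) <= 24)
    return in_registry or in_publication

# Hypothetical trial: completed June 2015, registry results posted May 2016
print(meets_who_standard(date(2015, 6, 30), registry_results=date(2016, 5, 1)))  # True

# Hypothetical trial: completed June 2015, published only in March 2019
print(meets_who_standard(date(2015, 6, 30), publication=date(2019, 3, 1)))  # False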

Covid-19 and Open Access in the Humanities: Impacts and Emerging Trends

Abstract:  Discussions of open-access publishing tend to center the scientific disciplines, and this trend has continued during the Covid-19 pandemic. But while the pandemic has certainly shed new light on the importance of openly accessible medical research, its effects—from economic impacts to attitudinal shifts—have been felt and speculated about across disciplines. This paper presents an investigation into present and future impacts of the pandemic on open-access publishing in the humanities, which have historically been slower to adopt open-access models than other disciplines. A survey distributed to scholarly publishing professionals, academic librarians, and others working in open-access humanities publishing sought to determine what changes these professionals had observed in their field since the start of the pandemic, as well as what impacts they projected for the long term. While the lasting effects of this still-evolving global health and economic crisis remain uncertain, the survey results indicate that open-access humanities professionals have already observed changes in areas including market demand, institutional interest, and funding, while many of them predict that the pandemic will have a long-term impact on the field. These findings contribute to an ongoing conversation about the place of the humanities in the open-access publishing landscape and the need for sustainable institutional investment.

How the COVID pandemic is changing global science collaborations

“Another long-term trend that researchers are watching out for is the push for scientists to share their research data more openly. This was mandated by the biomedical funding charity, Wellcome, for research that it funded on COVID-19, although there have been instances of people circumventing the rules by making data available ‘upon request’.

In theory, the push for open data might lessen international collaboration if it is no longer necessary to establish personal relationships to access data. Sugimoto says this could happen, but also wonders whether open data might help to link researchers from across the world by making their work more visible. “It could actually, in some ways, enhance and increase international collaboration rather than diminish it,” she says….”

eLife announces new approach to publishing in medicine | For the press | eLife

eLife is excited to announce a new approach to peer review and publishing in medicine, including public health and health policy.

One of the most notable impacts of the COVID-19 pandemic has been the desire to share important results and discoveries quickly, widely and openly, leading to rapid growth of the preprint server medRxiv. Despite the benefits of rapid, author-driven publication in accelerating research and democratising access to results, the growing number of clinical preprints means that individuals and institutions may act quickly on new information before it is adequately scrutinised.

To address this challenge, eLife is bringing its system of editorial oversight by practicing clinicians and clinician-investigators, and rigorous, consultative peer review to preprints. The journal’s goal is to produce ‘refereed preprints’ on medRxiv that provide readers and potential users with a detailed assessment of the research, comments on its potential impact, and perspectives on its use. By providing this rich and rapid evaluation of new results, eLife hopes peer-reviewed preprints will become a reliable indicator of quality in medical research, rather than the journal impact factor.