DataWorks! Challenge | HeroX

“Share your story of how you reused or shared data to further your biological and/or biomedical research effort and get recognized!…

The Federation of American Societies for Experimental Biology (FASEB) and the National Institutes of Health (NIH) are championing a bold vision of data sharing and reuse. The DataWorks! Prize fuels this vision with an annual challenge that showcases the benefits of research data management while recognizing and rewarding teams whose research demonstrates the power of data sharing or reuse practices to advance scientific discovery and human health. We are seeking new and innovative approaches to data sharing and reuse in the fields of biological and biomedical research. 

To incentivize effective practices and increase community engagement around data sharing and reuse, the 2022 DataWorks! Prize will distribute up to 12 monetary team awards. Submissions will undergo a two-stage review process, with final awards selected by a judging panel of NIH officials. The NIH will recognize winning teams with a cash prize, and winners will share their stories in a DataWorks! Prize symposium.”

The Rise of Open Access Journals in Radiation Oncology: Are We Paying for Impact? – ScienceDirect

Purpose/Objective(s)

We aimed to examine how the rise of open access (OA) journals in biomedicine has impacted resident research in radiation oncology.

Materials/Methods

We built a comprehensive database of first-author, PubMed-searchable articles published by US radiation oncology residents who graduated between 2015 and 2019. We then classified each journal in which these manuscripts appeared as either OA or non-OA, and obtained the current article processing charge (APC) for every publication that appeared in an OA journal. Lastly, we performed a secondary analysis to identify the factors associated with publishing an article in an OA journal.

Results

The US radiation oncology residents in this study published 2,637 first-author, PubMed-searchable manuscripts, 555 (21.0%) of which appeared in 138 OA journals. The number of publications in OA journals increased from 0.47 per resident for the class of 2015 to 0.79 per resident for the class of 2019. Likewise, the number of publications in OA journals with a 2019 impact factor of zero increased from 0.14 per resident for the class of 2015 to 0.43 per resident for the class of 2019. Publications in OA journals garnered fewer citations than those in non-OA journals (8.9 versus 14.9, P < 0.01). 90.6% of OA journals levy an APC for original research reports (median $1,896), which is positively correlated with their 2019 impact factor (r = 0.63, P < 0.01). Aggregate APCs totaled $900,319.21 for all US radiation oncology residency programs and appeared to increase over the study period.

Conclusion

The number of first-author, PubMed-searchable manuscripts published by graduating US radiation oncology residents in OA journals rose significantly over the study period. US radiation oncology residency programs appear to be investing increasing and significant sums of money to publish the work of their residents in these journals. A more substantive discussion about the proper role of OA journals in resident research is needed.

Blog – Europe PMC: Europe PMC adopts the Principles of Open Scholarly Infrastructure

“As a long-standing service and infrastructure provider in the open science ecosystem, Europe PMC supports the Principles of Open Scholarly Infrastructure (POSI). We welcome the momentum gathering behind this initiative to promote the need to support and sustain the open infrastructure.

Europe PMC has been a part of the public and open infrastructure for over 15 years and is run and managed by EMBL-EBI (which is part of the pan-European organisation of EMBL). It is funded by 34 international funders and is community-driven, open infrastructure, set in the context of key global open data resources such as the European Nucleotide Archive (INSDC), the wwPDB and the European Genome-Phenome Archive. All of these resources exist for the public good, led by scientific need and international collaborations, and have open governance structures and a commitment to long-term sustainability. Together with PMC USA, Europe PMC is a part of the PubMed Central International archive network, which plays an integral part in fulfilling shared goals to enable international open science. Europe PMC has been selected as an ELIXIR Core Data Resource, which means that it is of fundamental importance to the wider life-science community and the long-term preservation of biological data….”

Investigating the replicability of preclinical cancer biology

Abstract:  Replicability is an important feature of scientific research, but aspects of contemporary research culture, such as an emphasis on novelty, can make replicability seem less important than it should be. The Reproducibility Project: Cancer Biology was set up to provide evidence about the replicability of preclinical research in cancer biology by repeating selected experiments from high-impact papers. A total of 50 experiments from 23 papers were repeated, generating data about the replicability of a total of 158 effects. Most of the original effects were positive effects (136), with the rest being null effects (22). A majority of the original effect sizes were reported as numerical values (117), with the rest being reported as representative images (41). We employed seven methods to assess replicability, and some of these methods were not suitable for all the effects in our sample. One method compared effect sizes: for positive effects, the median effect size in the replications was 85% smaller than the median effect size in the original experiments, and 92% of replication effect sizes were smaller than the original. The other methods were binary – the replication was either a success or a failure – and five of these methods could be used to assess both positive and null effects when effect sizes were reported as numerical values. For positive effects, 40% of replications (39/97) succeeded according to three or more of these five methods, and for null effects 80% of replications (12/15) were successful on this basis; combining positive and null effects, the success rate was 46% (51/112). A successful replication does not definitively confirm an original finding or its theoretical interpretation. Equally, a failure to replicate does not disconfirm a finding, but it does suggest that additional investigation is needed to establish its reliability.

 

NIH issues a seismic mandate: share data publicly

“In January 2023, the US National Institutes of Health (NIH) will begin requiring most of the 300,000 researchers and 2,500 institutions it funds annually to include a data-management plan in their grant applications — and to eventually make their data publicly available.

Researchers who spoke to Nature largely applaud the open-science principles underlying the policy — and the global example it sets. But some have concerns about the logistical challenges that researchers and their institutions will face in complying with it. Namely, they worry that the policy might exacerbate existing inequities in the science-funding landscape and could be a burden for early-career scientists, who do the lion’s share of data collection and are already stretched thin….

Such a seismic shift in practice has left some researchers worried about the amount of work that the mandate will require when it becomes effective….

Others worry that data-management activities will further sap funds from under-resourced labs. Although the policy outlines certain fees that researchers can add to their proposed budgets to offset the costs of compliance with the mandate, it doesn’t specify what criteria the NIH will use to grant these requests….

Despite its potential pitfalls, Ross thinks that the policy will have a ripple effect that will persuade smaller funding agencies and industry to adopt similar changes. “This policy establishes what people expect from clinical research,” he says. “It’s essentially saying the culture of research needs to change.” ”

Making Biomedical Sciences publications more accessible for machines | SpringerLink

Abstract:  With the rapidly expanding catalogue of scientific publications, especially within the Biomedical Sciences field, it is becoming increasingly difficult for researchers to search for, read or even interpret emerging scientific findings. PubMed, just one of the current biomedical data repositories, comprises over 33 million citations for biomedical research, and over 2500 publications are added each day. To further strengthen the impact of biomedical research, we suggest that there should be more synergy between publications and machines. By bringing machines into the realm of research and publication, we can greatly augment the assessment, investigation and cataloging of the biomedical literary corpus. The effective application of machine-based manuscript assessment and interpretation is now crucial, and potentially stands as the most effective way for researchers to comprehend and process the tsunami of biomedical data and literature. Many biomedical manuscripts are currently published online in poorly searchable document types, with figures and data presented in formats that are partially inaccessible to machine-based approaches. The structure and format of biomedical manuscripts should be adapted to facilitate machine-assisted interrogation of this important literary corpus. In this context, it is important to embrace the concept that biomedical scientists should also write manuscripts that can be read by machines. It is likely that an enhanced human–machine synergy in reading biomedical publications will greatly enhance biomedical data retrieval and reveal novel insights into complex datasets.

 

Funders – About – Europe PMC

“Europe PMC has 33 research funders, across Europe. The Europe PMC funders expect:

Research outputs arising from research that we fund to be made freely and readily available;
Electronic copies of any biomedical research papers that have been accepted for publication in a peer-reviewed journal, and are supported in whole or in part by funding from any of the Europe PMC Funders, to be made available through PubMed Central (PMC) and Europe PMC, as soon as possible and in any event within six months of the journal publisher’s official date of final publication;
Authors and publishers, if an open access fee has been paid, to license research papers such that they may be freely copied and re-used for purposes such as text and data mining, provided that such uses are fully attributed. This is also encouraged where no fee has been paid….”

Data Management and Sharing

“You are not alone. Many researchers in the life sciences are interested in data reuse but find it hard to navigate today’s confusing open data landscape. To address researchers’ concerns around openly sharing their data, we launched FASEB DataWorks!, a new initiative that brings the biological and biomedical research communities together to advance human health through data sharing and reuse.

The initiative features four components:

DataWorks! Salons are conversation spaces for the biological and biomedical research community to exchange ideas and design effective practices for data sharing and reuse;
DataWorks! Help Desk provides guidance for the biological and biomedical research community to navigate and adopt data sharing and reuse policies and practices;
DataWorks! Prize recognizes biological and biomedical research teams that integrate data sharing and reuse to advance human health; and
DataWorks! Community enables biological and biomedical researchers and teams to hone skills and mentor peers in data management and sharing.

To ensure we meet the research community’s needs and evolve as the culture changes, FASEB is taking an incremental approach to FASEB DataWorks! The salon series kicked off in October 2021 and continues over the next 12 months. DataWorks! Community will launch this year, followed by the help desk and prize launching in 2022–2023.”

Apropos Data Sharing: Abandon the Distrust and Embrace the Opportunity | DNA and Cell Biology

Abstract:  In this commentary, we focus on the ethical challenges of data sharing and its potential in supporting biomedical research. Taking human genomics (HG) and European governance for sharing genomic data as a case study, we consider how to balance competing rights and interests—balancing protection of the privacy of data subjects and data security, with scientific progress and the need to promote public health. This is of particular relevancy in light of the current pandemic, which stresses the urgent need for international collaborations to promote health for all. We draw from existing ethical codes for data sharing in HG to offer recommendations as to how to protect rights while fostering scientific research and open science.

 

A Survey of Biomedical Journals To Detect Editorial Bias and Nepotistic Behavior

Alongside the growing concerns regarding predatory journal growth, other questionable editorial practices have gained visibility recently. Among them, we explored the usefulness of the Percentage of Papers by the Most Prolific author (PPMP) and the Gini index (level of inequality in the distribution of authorship among authors) as tools to identify journals that may show favoritism in accepting articles by specific authors. We examined whether the PPMP, complemented by the Gini index, could be useful for identifying cases of potential editorial bias, using all articles in a sample of 5,468 biomedical journals indexed in the National Library of Medicine. For articles published between 2015 and 2019, the median PPMP was 2.9%, and 5% of journals exhibited a PPMP of 10.6% or more. Among the journals with the highest PPMP or Gini index values, where a few authors were responsible for a disproportionate number of publications, a random sample was manually examined, revealing that the most prolific author was part of the editorial board in 60 cases (61%). The papers by the most prolific authors were more likely to be accepted for publication within 3 weeks of their submission. Results of an analysis on a subset of articles, excluding nonresearch articles, were consistent with those of the principal analysis. In most journals, publications are distributed across a large number of authors. Our results reveal a subset of journals where a few authors, often members of the editorial board, were responsible for a disproportionate number of publications. To enhance trust in their practices, journals need to be transparent about their editorial and peer review practices.
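For readers unfamiliar with the two screening metrics named above, the following is a minimal illustrative sketch (not the study authors' code) of how PPMP and the Gini index could be computed from per-author publication counts within a single journal; the author-count data here is hypothetical.

```python
# Illustrative computation of the two editorial-bias screening metrics:
# PPMP (Percentage of Papers by the Most Prolific author) and the Gini index
# of authorship concentration. Author counts below are made up for the example.

def ppmp(counts):
    """Share of a journal's papers written by its single most prolific author, in percent."""
    return 100 * max(counts) / sum(counts)

def gini(counts):
    """Gini index of authorship inequality: 0 = perfectly equal, approaching 1 = concentrated."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    # Standard discrete Gini formula over the sorted distribution.
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

# Hypothetical journal: one author wrote 30 of 100 first-author papers,
# the remaining 70 papers each by a distinct author.
counts = [30] + [1] * 70
print(round(ppmp(counts), 1))  # 30.0
print(round(gini(counts), 2))  # 0.29
```

A journal with publications spread evenly across many authors yields a low PPMP and a Gini index near zero, which is why the study flags only the upper tail (e.g., PPMP above the 95th-percentile value of 10.6%) for manual review.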

Data Sharing in Biomedical Sciences: A Systematic Review of Incentives | Biopreservation and Biobanking

Abstract:  Background: The lack of incentives has been described as the rate-limiting step for data sharing. Currently, the evaluation of scientific productivity by academic institutions and funders has been heavily reliant upon the number of publications and citations, raising questions about the adequacy of such mechanisms to reward data generation and sharing. This article provides a systematic review of the current and proposed incentive mechanisms for researchers in biomedical sciences and discusses their strengths and weaknesses.

Methods: PubMed, Web of Science, and Google Scholar were queried for original research articles, editorials, and opinion articles on incentives for data sharing. Articles were included if they discussed incentive mechanisms for data sharing, were applicable to biomedical sciences, and were written in English.

 

Results: Although coauthorship in return for the sharing of data is common, this might be incompatible with authorship guidelines and raise concerns over the ability of secondary analysts to contest the proposed research methods or conclusions that are drawn. Data publication, citation, and altmetrics have been proposed as alternative routes to credit data generators, which could address these disadvantages. Their primary downsides are that they are not well-established, it is difficult to acquire evidence to support their implementation, and that they could be gamed or give rise to novel forms of research misconduct.

Conclusions: Alternative recognition mechanisms need to be more commonly used to generate evidence on their power to stimulate data sharing, and to assess where they fall short. There is ample discussion in policy documents on alternative crediting systems to work toward Open Science, which indicates that there is an interest in working out more elaborate metascience programs.