Prospective Clinical Trial Registration: A Prerequisite for Publishing Your Results | Radiology

“The ICMJE requires that clinical trial results be published in the same clinical trial registry where the trial is registered. These results take the form of a short (≤500 words) abstract or table (6,7). Full disclosure of the existing results publication in a clinical trial registry should be explicitly stated when the manuscript is submitted for publication. The Food and Drug Administration (FDA) has indicated it will enforce trial results reporting related to ClinicalTrials.gov (8). The FDA is authorized to seek civil monetary penalties from responsible parties, including additional penalties for continued noncompliance. In the United States, the sponsor of an applicable clinical trial is considered the responsible party, unless or until the sponsor designates a qualified principal investigator as the responsible party. The FDA issued its first Notice of Noncompliance in April 2021 for failure to report results in ClinicalTrials.gov, based on a lack of reporting of the safety and effectiveness results for the drug dalantercept in combination with axitinib in patients with advanced renal cell carcinoma (8).

Finally, as of July 1, 2018, manuscripts submitted to ICMJE journals that report the results of clinical trials must contain a data sharing statement. Clinical trials that begin enrolling participants on or after January 1, 2019, must include a data sharing plan in the trial registration (for further information, see www.icmje.org/recommendations/browse/publishing-and-editorial-issues/clinical-trial-registration.html). Since most clinical trials take 2 or more years for results to be reported, the Radiology editorial board had expected such mandatory data sharing plans to be reported in the current year. However, because of the COVID-19 pandemic, many clinical trials were halted. Thus, journal publication requirements to include data sharing statements are more likely to impact authors beginning in 2023. Data sharing statements required for Radiological Society of North America (RSNA) journals may be found at https://pubs.rsna.org/page/policies#clinical.

In conclusion, prospective clinical trial registration is a mechanism allowing us to ensure transparency in clinical research conduct, honest and complete reporting of the clinical trial results, and minimization of selective result publications. Since its inception in 2004, this requirement has evolved into a policy that is practiced by major medical journals worldwide, is mandatory for publication of trial results, and, in some circumstances, is enforced by the FDA. Further, ICMJE journals, including RSNA journals, are expecting manuscripts that report trial results to include statements on data sharing. As each clinical trial design is unique, we encourage authors to refer to the full description of the current ICMJE policy at icmje.org for additional information pertaining to their specific circumstances.”

CORONA Project Demonstrates Value of Sharing Knowledge to Save Lives – SPARC

“On Friday, March 13, 2020, much of the United States shut down with COVID-19 restrictions. Three days later, Dr. David Fajgenbaum launched an effort to track and publicly share what drugs were being tried to combat the disease.

The CORONA (Covid-19 Registry of Off-label & New Agents) Project has been a valued resource ever since, keeping an inventory – in real time – of the now more than 500 treatments that have been administered to COVID-19 patients. Fajgenbaum led a team that has reviewed thousands of journal articles to identify the drugs, determine which are most promising at various stages, and make it all available through an open-source data repository.

In the past 18 months, more than 30,000 unique users have viewed the database, including members of the general public and officials from the National Institutes of Health and the U.S. Food and Drug Administration who have used the database to select drugs to test in clinical trials….”

Clinical trials and tribulations: lessons from spinal cord injury studies registered on ClinicalTrials.gov | Spinal Cord

Abstract:  Objective

ClinicalTrials.gov is an online trial registry that provides public access to information on past, present, and future clinical trials. While increasing transparency in research, the quality of the information provided in trial registrations is highly variable. The objective of this study is to assess key areas of information on ClinicalTrials.gov in interventional trials involving people with spinal cord injuries.

Setting

Interventional trials on ClinicalTrials.gov involving people with spinal cord injuries.

Methods

A subset of data on interventional spinal cord injury trials was downloaded from ClinicalTrials.gov. Reviewers extracted information pertaining to study type, injury etiology, spinal cord injury characteristics, timing, study status, and results.

Results

Of the interventional trial registrations reviewed, 62.5%, 58.6%, and 24.3% reported injury level, severity, and etiology, respectively. The timing of intervention relative to injury was reported in 72.8% of registrations. Most trials identified a valid study status (89.2%), but only 23.5% of completed studies had posted results.

Conclusions

Our review provides a snapshot of interventional clinical trials conducted in the field of spinal cord injury and registered in ClinicalTrials.gov. Areas for improvement were identified with regard to reporting injury characteristics, as well as posting results.

Characteristics of available studies and dissemination of research using major clinical data sharing platforms – Enrique Vazquez, Henri Gouraud, Florian Naudet, Cary P Gross, Harlan M Krumholz, Joseph S Ross, Joshua D Wallach, 2021

Abstract:  Background/Aims:

Over the past decade, numerous data sharing platforms have been launched, providing access to de-identified individual patient-level data and supporting documentation. We evaluated the characteristics of prominent clinical data sharing platforms, including types of studies listed as available for request, data requests received, and rates of dissemination of research findings from data requests.

Methods:

We reviewed publicly available information listed on the websites of six prominent clinical data sharing platforms: Biological Specimen and Data Repository Information Coordinating Center, ClinicalStudyDataRequest.com, Project Data Sphere, Supporting Open Access to Researchers–Bristol Myers Squibb, Vivli, and the Yale Open Data Access Project. We recorded key platform characteristics, including listed studies and available supporting documentation, information on the number and status of data requests, and rates of dissemination of research findings from data requests (i.e. publications in peer-reviewed journals, preprints, conference abstracts, or results reported on the platform’s website).

Results:

The number of clinical studies listed as available for request varied among five data sharing platforms: Biological Specimen and Data Repository Information Coordinating Center (n = 219), ClinicalStudyDataRequest.com (n = 2897), Project Data Sphere (n = 154), Vivli (n = 5426), and the Yale Open Data Access Project (n = 395); Supporting Open Access to Researchers did not provide a list of Bristol Myers Squibb studies available for request. Individual patient-level data were nearly always reported as being available for request, as opposed to only Clinical Study Reports (Biological Specimen and Data Repository Information Coordinating Center = 211/219 (96.3%); ClinicalStudyDataRequest.com = 2884/2897 (99.6%); Project Data Sphere = 154/154 (100.0%); and the Yale Open Data Access Project = 355/395 (89.9%)); Vivli did not provide downloadable study metadata. Of 1201 data requests listed on ClinicalStudyDataRequest.com, Supporting Open Access to Researchers–Bristol Myers Squibb, Vivli, and the Yale Open Data Access Project platforms, 586 requests (48.8%) were approved (i.e. data access granted). The majority were for secondary analyses and/or developing/validating methods (ClinicalStudyDataRequest.com = 262/313 (83.7%); Supporting Open Access to Researchers–Bristol Myers Squibb = 22/30 (73.3%); Vivli = 63/84 (75.0%); the Yale Open Data Access Project = 111/159 (69.8%)); four were for re-analyses or corroborations of previous research findings (ClinicalStudyDataRequest.com = 3/313 (1.0%) and the Yale Open Data Access Project = 1/159 (0.6%)). Ninety-five (16.1%) approved data requests had results disseminated via peer-reviewed publications (ClinicalStudyDataRequest.com = 61/313 (19.5%); Supporting Open Access to Researchers–Bristol Myers Squibb = 3/30 (10.0%); Vivli = 4/84 (4.8%); the Yale Open Data Access Project = 27/159 (17.0%)). Forty-two (6.8%) additional requests reported results through preprints, conference abstracts, or on the platform’s website (ClinicalStudyDataRequest.com = 12/313 (3.8%); Supporting Open Access to Researchers–Bristol Myers Squibb = 3/30 (10.0%); Vivli = 2/84 (2.4%); Yale Open Data Access Project = 25/159 (15.7%)).

Conclusion:

Across six prominent clinical data sharing platforms, information on studies and request metrics varied in availability and format. Most data requests focused on secondary analyses and approximately one-quarter of all approved requests publicly disseminated their results. To further promote the use of shared clinical data, platforms should increase transparency, consistently clarify the availability of the listed studies and supporting documentation, and ensure that research findings from data requests are disseminated.

Status, use and impact of sharing individual participant data from clinical trials: a scoping review | BMJ Open

Abstract:  Objectives To explore the impact of data-sharing initiatives on the intent to share data, on actual data sharing, on the use of shared data and on research output and impact of shared data.

Eligibility criteria All studies investigating data-sharing practices for individual participant data (IPD) from clinical trials.

Sources of evidence We searched the Medline database, the Cochrane Library, the Science Citation Index Expanded and the Social Sciences Citation Index via Web of Science, and preprints and proceedings of the International Congress on Peer Review and Scientific Publication. In addition, we inspected major clinical trial data-sharing platforms, contacted major journals/publishers, editorial groups and some funders.

Charting methods Two reviewers independently extracted information on methods and results from resources identified using a standardised questionnaire. A map of the extracted data was constructed and accompanied by a narrative summary for each outcome domain.

Results 93 studies identified in the literature search (published between 2001 and 2020, median: 2018) and 5 from additional information sources were included in the scoping review. Most studies were descriptive and focused on early phases of the data-sharing process. While the willingness to share IPD from clinical trials is extremely high, actual data-sharing rates are suboptimal. A survey of journal data-sharing policies suggests poor to moderate enforcement by publishers. Metrics provided by platforms suggest that a large majority of data remains unrequested. When requested, the purpose of the reuse is more often secondary analyses and meta-analyses, rarely re-analyses. Finally, studies focused on the real impact of data-sharing were rare and used surrogates such as citation metrics.

Conclusions There is currently a gap in the evidence base for the impact of IPD sharing, which entails uncertainties in the implementation of current data-sharing policies. High-level evidence is needed to assess whether the value of medical research increases with data-sharing practices.

NIH-Wide Strategic Plan: Fiscal Years 2021-2025

“NIH is committed to making findings from the research that it funds accessible and available in a timely manner, while also providing safeguards for privacy, intellectual property, security, and data management. For instance, NIH-funded investigators are expected to make the results and accomplishments of their activities freely available within 12 months of publication. NIH also encourages investigators to share results prior to peer review, such as through preprints, to speed the dissemination of their findings and enhance the rigor of their work through informal peer review. A robust culture of data sharing is critical to continued progress in science, maximizing NIH’s investment in research, and assurance of the highest levels of transparency and rigor. To this end, NIH will continue to promote opportunities for data management and sharing while allowing flexibility for various data types, sharing platforms, and strategies. Additionally, NIH is implementing a policy requiring that all applications include data sharing and management plans that consider input from stakeholders….”

Clinical trial transparency and data sharing among biopharmaceutical companies and the role of company size, location and product type: a cross-sectional descriptive analysis | BMJ Open

Abstract:  Objectives To examine company characteristics associated with better transparency and to apply a tool used to measure and improve clinical trial transparency among large companies and drugs, to smaller companies and biologics.

Design Cross-sectional descriptive analysis.

Setting and participants Novel drugs and biologics Food and Drug Administration (FDA) approved in 2016 and 2017 and their company sponsors.

Main outcome measures Using established Good Pharma Scorecard (GPS) measures, companies and products were evaluated on their clinical trial registration, results dissemination and FDA Amendments Act (FDAAA) implementation; companies were ranked using these measures and a multicomponent data sharing measure. Associations between company transparency scores with company size (large vs non-large), location (US vs non-US) and sponsored product type (drug vs biologic) were also examined.

Results 26% of products (16/62) had publicly available results for all clinical trials supporting their FDA approval and 67% (39/58) had public results for trials in patients by 6 months after their FDA approval; 58% (32/55) were FDAAA compliant. Large companies were significantly more transparent than non-large companies (overall median transparency score of 95% (IQR 91–100) vs 59% (IQR 41–70), p<0.001), attributable to higher FDAAA compliance (median of 100% (IQR 88–100) vs 57% (IQR 0–100), p=0.01) and better data sharing (median of 100% (IQR 80–100) vs 20% (IQR 20–40), p<0.01). No significant differences were observed by company location or product type.

Conclusions It was feasible to apply the GPS transparency measures and ranking tool to non-large companies and biologics. Large companies are significantly more transparent than non-large companies, driven by better data sharing procedures and implementation of FDAAA trial reporting requirements. Greater research transparency is needed, particularly among non-large companies, to maximise the benefits of research for patient care and scientific innovation.

Clinical trial results for FDA-approved drugs often remain hidden, new study finds

“A team of American researchers examined 62 products by 42 pharma companies that gained FDA approval in 2016 and 2017. Collectively, these drugs and biologics were approved based on 1,017 clinical trials involving more than 187,000 participants….

Around a quarter of these trials were subject to the FDA Amendments Act, a transparency law that requires drug makers to register applicable trials on a public registry within 21 days of their start date, and to make their results public on the registry within 30 days of initial FDA approval of a product.

The study team found that 55 of the 62 FDA approvals included at least one clinical trial that was subject to the transparency law. However, in the case of 13 products, these trials did not consistently meet legal registration or reporting requirements.

Large pharma companies were far more likely to comply with the law. For example, Merck Sharp & Dohme was legally responsible for registering and reporting 27 trials, and fully complied in every single case. However, several other major players – Gilead, Johnson & Johnson / Janssen, Novo Nordisk, Sanofi, and Shire – fell short of legal requirements.


Nonetheless, the study – which also covered companies’ data sharing policies – found that overall, there had been “sustained improvement” in pharma industry disclosure practices compared to previous years….”

European law could improve ‘scandalous’ lack of clinical trial data reporting | Science | AAAS

“The global pandemic has turned a spotlight on clinical trials, which test thousands of drugs and therapies each year. In Europe, however, the enthusiasm for trials is not matched with a zeal for reporting the results to the public. A total of 3846 European trials—nearly 28% of 13,874 completed trials in the EU Clinical Trials Register (EUCTR) on 1 July—had not posted their results on the register, according to the latest data from the EU Trials Tracker, set up by U.K. researchers in 2018 to expose lax reporting. Public research hospitals and universities, not drugmakers, are responsible for the vast majority of the lapses, which appear to violate European rules that require sponsors to post their results within 1 year of a trial’s conclusion….”

Clinical trials: regulators’ inaction has left EU registry “riddled with inaccurate and missing data” | The BMJ

“Nearly 6000 clinical trial results are currently missing from the European trial registry, despite transparency rules requiring countries to upload results within 12 months of trial completion, a report has found.1

Researchers from the University of Oxford said the findings show that medicines regulators in the 14 European countries included in the report have failed to ensure that important data on new drugs and vaccines are rapidly and consistently made public….”

Public access to protocols of contemporary cancer randomized clinical trials | Trials | Full Text

Abstract:  Access to randomized clinical trial (RCT) protocols is necessary for the interpretation and reproducibility of the study results, but protocol availability has been lacking. We determined the prevalence of protocol availability for all published cancer RCTs in January 2020. We found that only 36.1% (48/133) of RCTs had an accessible protocol and only 11.3% of RCTs (15/133) had a publicly accessible protocol that was not behind a paywall. Only 18.0% (24/133) of RCTs were published in conjunction with the protocol on the journal website. In conclusion, few cancer RCTs have an accessible research protocol. Journals should require publication of RCT protocols along with manuscripts to improve research transparency.


Social media attention and citations of published outputs from re-use of clinical trial data: a matched comparison with articles published in the same journals | BMC Medical Research Methodology | Full Text

Abstract:  Background

Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.

Methods

Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers on 3 repositories (CSDR, YODA project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mention of policy sources, media attention) and the total number of citations were compared between these two groups.

Results

89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistical difference was found on any of the components of the Altmetric Attention Score. The median (interquartile range) numbers of citations were 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.

Conclusions

Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, matching choices have some limitations, so results should be interpreted very cautiously. Also, citations by policy sources for re-uses were rare.

Delays in reporting and publishing trial results during pandemics: cross sectional analysis of 2009 H1N1, 2014 Ebola, and 2016 Zika clinical trials | BMC Medical Research Methodology | Full Text

Abstract:  Background

Pandemic events often trigger a surge of clinical trial activity aimed at rapidly evaluating therapeutic or preventative interventions. Ensuring rapid public access to the complete and unbiased trial record is particularly critical for pandemic research given the urgent associated public health needs. The World Health Organization (WHO) established standards requiring posting of results to a registry within 12 months of trial completion and publication in a peer reviewed journal within 24 months of completion, though compliance with these requirements among pandemic trials is unknown.

Methods

This cross-sectional analysis characterizes availability of results in trial registries and publications among registered trials performed during the 2009 H1N1 influenza, 2014 Ebola, and 2016 Zika pandemics. We searched trial registries to identify clinical trials testing interventions related to these pandemics, and determined the time elapsed between trial completion and availability of results in the registry. We also performed a comprehensive search of MEDLINE via PubMed, Google Scholar, and EMBASE to identify corresponding peer reviewed publications. The primary outcome was the compliance with either of the WHO’s established standards for sharing clinical trial results. Secondary outcomes included compliance with both standards, and assessing the time elapsed between trial completion and public availability of results.

Results

Three hundred thirty-three trials met eligibility criteria, including 261 H1N1 influenza trials, 60 Ebola trials, and 12 Zika trials. Of these, 139 (42%) either had results available in the trial registry within 12 months of study completion or had results available in a peer-reviewed publication within 24 months. Five trials (2%) met both standards. No results were available in either a registry or publication for 59 trials (18%). Among trials with registered results, a median of 42 months (IQR 16–76 months) elapsed between trial completion and results posting. For published trials, the median elapsed time between completion and publication was 21 months (IQR 9–34 months). Results were available within 24 months of study completion in either the trial registry or a peer reviewed publication for 166 trials (50%).

Conclusions

Very few trials performed during prior pandemic events met established standards for the timely public dissemination of trial results.

JAMA Publishes Trial Results Delayed 5 Years. Here’s Why

“A treatment for shortening the painful episodes of sickle cell disease (SCD) is not effective, results published in JAMA indicate. But the effort it took to publish the findings is an important part of the story and reveals problems with data ownership, company motivations, and public resources that go well beyond a single clinical trial or experimental agent….”