“Nearly 6000 clinical trial results are currently missing from the European trial registry, despite transparency rules requiring countries to upload results within 12 months of trial completion, a report has found.1
Researchers from the University of Oxford said the findings show that medicines regulators in the 14 European countries included in the report have failed to ensure that important data on new drugs and vaccines are rapidly and consistently made public….”
Abstract: Access to randomized clinical trial (RCT) protocols is necessary for the interpretation and reproducibility of the study results, but protocol availability has been lacking. We determined the prevalence of protocol availability for all published cancer RCTs in January 2020. We found that only 36.1% (48/133) of RCTs had an accessible protocol and only 11.3% of RCTs (15/133) had a publicly accessible protocol that was not behind a paywall. Only 18.0% (24/133) of RCTs were published in conjunction with the protocol on the journal website. In conclusion, few cancer RCTs have an accessible research protocol. Journals should require publication of RCT protocols along with manuscripts to improve research transparency.
Data-sharing policies in randomized clinical trials (RCTs) should have an evaluation component. The main objective of this case–control study was to assess the impact of published re-uses of RCT data in terms of media attention (Altmetric) and citation rates.
Re-uses of RCT data published up to December 2019 (cases) were searched for by two reviewers on 3 repositories (CSDR, YODA project, and Vivli) and matched to control papers published in the same journal. The Altmetric Attention Score (primary outcome), components of this score (e.g. mention of policy sources, media attention) and the total number of citations were compared between these two groups.
89 re-uses were identified: 48 (53.9%) secondary analyses, 34 (38.2%) meta-analyses, 4 (4.5%) methodological analyses and 3 (3.4%) re-analyses. The median (interquartile range) Altmetric Attention Scores were 5.9 (1.3–22.2) for re-uses and 2.8 (0.3–12.3) for controls (p = 0.14). No statistical difference was found on any of the components of the Altmetric Attention Score. The median (interquartile range) numbers of citations were 3 (1–8) for re-uses and 4 (1–11.5) for controls (p = 0.30). Only 6/89 re-uses (6.7%) were cited in a policy source.
Using all available re-uses of RCT data to date from major data repositories, we were not able to demonstrate that re-uses attracted more attention than a matched sample of studies published in the same journals. Small average differences are still possible, as the sample size was limited. However, the matching choices have limitations, so results should be interpreted very cautiously. Also, citations by policy sources for re-uses were rare.
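The study above summarizes each group's Altmetric Attention Scores as a median with an interquartile range. A minimal stdlib-only sketch of that summary statistic (the score values below are made up for illustration, not the study's data):

```python
from statistics import quantiles, median

def median_iqr(scores):
    """Return (median, (Q1, Q3)) for a list of scores."""
    q1, _, q3 = quantiles(scores, n=4, method="inclusive")
    return median(scores), (q1, q3)

# Made-up example Altmetric scores for a handful of papers
reuse_scores = [5.9, 1.3, 22.2, 0.0, 8.4, 3.1, 14.0]
med, (q1, q3) = median_iqr(reuse_scores)
print(f"median {med} (IQR {q1}-{q3})")
```

Reporting median (IQR) rather than mean is the usual choice here because attention scores are heavily right-skewed: a few highly publicized papers would dominate an average.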
Pandemic events often trigger a surge of clinical trial activity aimed at rapidly evaluating therapeutic or preventative interventions. Ensuring rapid public access to the complete and unbiased trial record is particularly critical for pandemic research given the urgent associated public health needs. The World Health Organization (WHO) established standards requiring posting of results to a registry within 12 months of trial completion and publication in a peer reviewed journal within 24 months of completion, though compliance with these requirements among pandemic trials is unknown.
This cross-sectional analysis characterizes availability of results in trial registries and publications among registered trials performed during the 2009 H1N1 influenza, 2014 Ebola, and 2016 Zika pandemics. We searched trial registries to identify clinical trials testing interventions related to these pandemics, and determined the time elapsed between trial completion and availability of results in the registry. We also performed a comprehensive search of MEDLINE via PubMed, Google Scholar, and EMBASE to identify corresponding peer reviewed publications. The primary outcome was compliance with either of the WHO’s established standards for sharing clinical trial results. Secondary outcomes included compliance with both standards and the time elapsed between trial completion and public availability of results.
Three hundred thirty-three trials met eligibility criteria, including 261 H1N1 influenza trials, 60 Ebola trials, and 12 Zika trials. Of these, 139 (42%) either had results available in the trial registry within 12 months of study completion or had results available in a peer-reviewed publication within 24 months. Five trials (2%) met both standards. No results were available in either a registry or publication for 59 trials (18%). Among trials with registered results, a median of 42 months (IQR 16–76 months) elapsed between trial completion and results posting. For published trials, the median elapsed time between completion and publication was 21 months (IQR 9–34 months). Results were available within 24 months of study completion in either the trial registry or a peer reviewed publication for 166 trials (50%).
Very few trials performed during prior pandemic events met established standards for the timely public dissemination of trial results.
“NONE OF MAASTRICHT UNIVERSITY’S 7 DUE TRIALS HAVE BEEN REPORTED….All clinical trials on the European Union Clinical Trials Register (EUCTR) must report their results in the registry within a year of completion.”
“A treatment for shortening the painful episodes of sickle cell disease (SCD) is not effective, results published in JAMA indicate. But the effort it took to publish the findings is an important part of the story and reveals problems with data ownership, company motivations, and public resources that go well beyond a single clinical trial or experimental agent….”
“ICMRA1 and WHO call on the pharmaceutical industry to provide wide access to clinical data for all new medicines and vaccines (whether full or conditional approval, under emergency use, or rejected). Clinical trial reports should be published without redaction of confidential information for reasons of overriding public health interest….
Regulators continue to spend considerable resources negotiating transparency with sponsors. Both positive and negative clinically relevant data should be made available, while only personal data and individual patient data should be redacted. In any case, aggregated data are unlikely to lead to re-identification of personal data and techniques of anonymisation can be used….
Providing systematic public access to data supporting approvals and rejections of medicines reviewed by regulators is long overdue, despite existing initiatives such as those from the European Medicines Agency and Health Canada. The COVID-19 pandemic has revealed how essential access to data is to public trust. ICMRA and WHO call on the pharmaceutical industry to commit, within short timelines, and without waiting for legal changes, to provide voluntary unrestricted access to trial results data for the benefit of public health.”
“RIAT is an international effort to tackle bias in the way research is reported with the goal of providing more accurate information to patients and other healthcare decision makers.
Randomized controlled trials are known as medicine’s “gold standard” for reliable evidence. However, they are falling short of that standard, in large part due to two fundamental problems:
MISREPORTING: many trials that are published are inaccurately or incompletely reported (misreported trials)
INVISIBILITY: not all trials conducted are published (unpublished trials)
When the original investigators or sponsors do not correct misreporting, or even leave the entire trial unpublished, they can be considered to have abandoned their trial. And the downstream effects can be substantial, leading to false conclusions about the effectiveness and safety of medical interventions.
The RIAT initiative aims to address these problems by offering a methodology that allows other people to responsibly correct the record….”
The FOI relates to the HRA’s “Clinical Trial Registration Audit Report” covering trials receiving ethics approval during H1 2018:
Please provide the following information:
1. A copy of the full data set that formed the basis for the report of September 2015, including all lines and columns included in the original data set. Please provide the data in Excel format. In case you do not provide the full data set, please redact (rather than delete) the data fields not released, leaving intact the corresponding line and/or column headings.
2. An estimate of the total HRA staff workload involved in performing this audit, using FTE person-days as the metric.
Please note that in response to a similar previous request, the HRA found that it is in the public interest to release this information:
Please also note that in its previous response (linked above), the HRA provided a data set that was barely usable. Please provide a data set that is comprehensible and fully usable in order to avoid the need to manage a request for internal review….”
“Researchers who receive federal help consistently fail to report their results to the public. The government should hold them accountable….
Researchers using federal funds to conduct cancer trials — experiments involving drugs or medical devices that rely on volunteer subjects — were sometimes taking more than a year to report their results to the N.I.H., as required. “If you don’t report, the law says you shouldn’t get any funding,” he said, citing an investigation I had published in Stat with my colleague Talia Bronshtein. “Doc, I’m going to find out if it’s true, and if it’s true, I’m going to cut funding. That’s a promise.”
It was true then. It’s true now. More than 150 trials completed since 2017 by the N.I.H.’s National Cancer Institute, which leads the $1.8 billion Moonshot effort, should have reported results by now. About two-thirds reported after their deadlines or not at all, according to a University of Oxford website that tracks clinical trials regulated by the Food and Drug Administration and National Institutes of Health. Some trial results are nearly two years overdue. Over all, government-sponsored scientists have complied less than half the time for trial results due since 2018. (A spokeswoman for the N.I.H. said, “We are willing to do all measures to ensure compliance with ClinicalTrials.gov results reporting.”)…
In 2016, Dr. Francis Collins, the director of the National Institutes of Health, announced that the agency would begin penalizing researchers for failing to comply with its reporting requirements. “We are serious about this,” he said at the time. Yet in the years since, neither the F.D.A. nor N.I.H. has enforced the law. …”
“Led by Europe’s largest academic trial sponsor, Austrian universities are now making their clinical trial results public at an impressive pace. In parallel, national medicines regulator BASG is intensifying its efforts to promote clinical trial transparency.
Overall, Austria’s 14 largest sponsors have made 37% of their due trial results public, compared to just 18% a year ago. Results are still missing for 233 long-completed trials.
Over the past year, the country’s three major medical universities alone have uploaded 65 trial results onto the European trial registry….”
Abstract: This project seeks to conduct language translation on metadata labels for research publications, attribution data, and clinical trials information to make data about medical research queryable in underserved languages through Wikidata and the Linked Open Web. This project has the benefit of distributing content through Wikipedia and Wikidata, which have an annual userbase of a billion users and which have established actionable standards to practice diversity, inclusion, openness, FAIRness, and transparency about program development. The impact will be localized access to basic research information in various Global South languages to integrate with existing community efforts for establishing the same. Although Wikidata development in this direction seems inevitable, the cultural and social exchange required to establish global multilingual research partnerships could begin now with support rather than later as a second phase effort for including the developing world. Wikipedia and Wikidata are established forums with an existing active userbase for multilingual research collaboration, but the research practices there are still immature. By applying metadata expertise through this project, we will elevate the current amateur development with more stable Linked Open Data compatibility to English language databases. Using the wiki distribution and discussion platform to develop the global conversation about data sharing will set good precedents for the trend of global research collaboration.
“Biopharma companies with the most effective and robust clinical trial disclosure programs often have one thing in common: a leadership that recognizes the importance of transparency beyond mere regulatory compliance.
These companies – primarily some of the largest pharmaceutical firms such as GlaxoSmithKline plc – have a commitment at the executive level to, for example, publish their disclosure policies, making generous commitments to protocol registration, results disclosure, plain language summaries, and the sharing of a broad range of clinical documents. They invest as well in both tools and company policies to meet these commitments.
In contrast, smaller companies typically delay investing in the focused systems needed for even the modest goal of regulatory compliance, let alone providing a strategic view into disclosure activities. For example, they often make do with manual, spreadsheet-type approaches rather than a centralized review and monitoring system. But the repercussions of having less effective programs can go well beyond just regulatory penalties. (See sidebar.) …”
The United States has mobilized the full force of its clinical research enterprise to address the Covid-19 pandemic, allocating billions of dollars to support timely research. As of January 2021, for example, the National Institutes of Health (NIH) had issued nearly a thousand awards cumulatively worth roughly $2 billion to support Covid-19 projects ranging from the development of medical products (including diagnostics and vaccines) to evaluations of population-specific risk factors and outcomes.1 Such initiatives, which have yielded new technologies and important evidence, illustrate the value of robust scientific infrastructure.
• We constructed a corpus of RCT publications annotated with CONSORT checklist items.
• We developed text mining methods to identify methodology-related checklist items.
• A BioBERT-based model performs best in recognizing adequately reported items.
• A phrase-based method performs best in recognizing infrequently reported items.
• The corpus and the text mining methods can be used to address reporting transparency….”
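The "phrase-based method" highlighted above can be illustrated with a toy recognizer: match sentences against a lexicon of trigger phrases per CONSORT item. The item IDs and phrase lists below are our illustrative assumptions, not the authors' actual lexicon or corpus:

```python
# Hypothetical trigger phrases for two CONSORT methodology items
CHECKLIST_PHRASES = {
    "8a_randomisation_sequence": [
        "computer-generated random",
        "random number table",
        "block randomization",
    ],
    "11a_blinding": [
        "double-blind",
        "single-blind",
        "masked to treatment",
    ],
}

def find_checklist_items(sentence: str) -> set[str]:
    """Return the CONSORT item IDs whose trigger phrases occur in the sentence."""
    lowered = sentence.lower()
    return {item_id
            for item_id, phrases in CHECKLIST_PHRASES.items()
            if any(p in lowered for p in phrases)}

print(find_checklist_items(
    "Allocation used a computer-generated random sequence in a double-blind design."))
```

Such a lexicon approach needs no training data, which is why it can outperform a fine-tuned model on items that appear too rarely in the annotated corpus to learn from, as the highlights note.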