Randomized controlled trials (RCTs) may be considered the gold standard for generating clinical evidence, but there is growing interest in using real-world evidence (RWE). However, only a small portion of clinical trials could be replicated in the real world, according to a new study published in JAMA Network Open.
The challenge with RCTs is that many people get left out because these trials often have stringent inclusion criteria, explained Andrew Norden, MD, MPH, MBA, chief medical officer of COTA Healthcare, in an interview with The American Journal of Managed Care®’s Evidence-Based Oncology™ (EBO).
In the JAMA Network Open study, the authors examined 220 US-based trials published in the 7 highest-impact journals (New England Journal of Medicine, Lancet, JAMA, The BMJ, Annals of Internal Medicine, JAMA Internal Medicine, and PLoS Medicine) with the goal of identifying which trials could possibly be replicated using administrative claims or data from electronic health records (EHRs).
While RWE better reflects patient demographics, comorbidities, adherence, and use of concurrent treatment in actual clinical environments, “enthusiasm for using retrospective RWE to complement the evidence generated by RCTs may still need to be tempered by feasibility concerns because it is unclear whether it is reasonable to expect that observational data can be used to address the same clinical questions being answered by traditional clinical trials,” the authors explained.
Nearly all (92.7%) of the trials were randomized; of those, 55.4% were double-blind, 14.7% were single-blind, and 29.9% were open-label. Two-thirds (66.8%) of the studies tested pharmaceutical interventions, 15.5% tested educational, procedural, or behavioral interventions, 9.1% tested clinical or surgical procedures, 6.4% tested medical devices, and 2.3% tested other interventions.
Importantly, only 39.1% of the trials had an intervention that could be ascertained from observational data. The most common interventions that could not be ascertained from EHR or claims data were new drugs not yet approved by the FDA (34.1% of trials) and educational, behavioral, or procedural interventions (14.1% of trials).
Of the 86 trials for which the intervention could be ascertained, 39 involved FDA-approved drugs being studied for a new population or indication, 32 were comparative effectiveness studies, and 15 examined postapproval safety or efficacy.
According to the authors, the findings reinforce the need for caution before assuming that RWE can replace RCTs. They note that observational studies are not useful for examining the safety and efficacy of a product that is not widely used in clinical practice, and that many of the trials could not be replicated because the relevant data would be unlikely to appear in a structured form in the EHR.
In his interview with EBO, Norden noted that COTA is not trying to eliminate RCTs. Instead, he views RWE as a way to extend the findings of the RCT and offer additional insight into how drugs work for populations who might not normally be included in RCTs.
The authors of the JAMA Network Open study concluded that while real-world data hold promise, their use is currently hindered by the difficulty of replicating the design elements of clinical trials.
“Further development of observational methods and data systems may help realize the potential of RWE and may, in turn, translate into more generalizable medical research and more rapid evaluation of medical products,” they wrote.
Reference

Bartlett VL, Dhruva SS, Shah ND, Ryan P, Ross JS. Feasibility of using real-world data to replicate clinical trial evidence [published online October 9, 2019]. JAMA Netw Open. doi:10.1001/jamanetworkopen.2019.12869