The Economics of Resistant Pathogens and Antibiotic Innovation
Published Online: April 23, 2014
Michael R. McKellar, BA; Michael E. Chernew, PhD; and A. Mark Fendrick, MD
Pharmaceutical Innovation Has Driven Vast Improvements In Health
The last 70 years have seen an explosion in pharmaceutical and biotechnological innovation that has improved and lengthened the lives of millions across the globe. In the early period of drug development following World War II, several pharmaceuticals were introduced to treat communicable and infectious diseases. More recent innovations have produced agents that manage chronic conditions, including long-course infections such as tuberculosis and viral hepatitis. Many of these drugs provide a direct cure for, or protection against, otherwise deadly diseases such as bacterial pneumonia and HIV/AIDS, while others help control the symptoms of, and prevent complications from, common conditions such as pharyngitis, otitis media, and urinary tract infections.
Costs of Innovation
While the pharmaceutical breakthroughs of the last half-century have improved the lives of millions, the development costs have been substantial. Pharmaceutical markets are generally characterized by large up-front research and development (R&D) investments and relatively low marginal production costs. This leaves discovering firms susceptible to competitors who could manufacture the same or a similar compound without repeating the costly and risky R&D. Due in part to the high risk of these investments and the time required to move from discovery to regulatory approval, some researchers estimate that the total capitalized cost of developing a new drug exceeds $800 million (in 2000 dollars).1 Firms would not undertake such an investment if others could free-ride on the resulting knowledge. To promote socially desirable innovation and to provide suitable financial incentives for companies to develop novel drugs, national governments and international law grant discovering firms exclusivity rights under various patent systems.
The Special Case of Antibiotics
Social Value
Antimicrobial drug classes, which include antibacterials (antibiotics) and antifungals, have been developed to treat diseases caused by a wide range of infectious agents. Due in part to the tremendous health benefits provided by antibiotics and their unique clinical characteristics, this highly valued pharmaceutical class warrants special consideration. In the years following the widespread adoption of penicillin in the 1940s, antibiotics sharply reduced the mortality associated with many of the most devastating diseases in human history, including bacterial pneumonia, bloodstream infections, tuberculosis, and childhood diseases such as scarlet fever and diphtheria.2 More recently, as a result of the availability of antibiotics, neither the United States military nor the British military reported a single case of gangrene—a highly fatal infection prior to World War II—during the entirety of the Iraq War.3 In addition to curing many otherwise fatal infections, antibiotics reduce morbidity by decreasing hospitalizations and major clinical complications; for example, they reduce infections of diabetic ulcers, a leading cause of amputation. Moreover, antibiotics used as prophylaxis have made possible many surgical procedures that were otherwise too risky to perform.
From a cost-effectiveness perspective, antibiotics compare highly favorably with many other interventions. A course of antibiotics that fully cures a life-threatening or limb-threatening infection usually costs between $2000 and $3000.4 This compares favorably with therapeutic agents that provide a benefit—but no cure—for disorders such as cancer, multiple sclerosis, rheumatoid arthritis, lupus, and rare/orphan diseases, and that often cost between $100,000 and $300,000.5 The value of antibiotics also compares favorably with treatments for other infectious diseases such as the hepatitis C virus, for which new therapies cost more than $60,000.6
The emergence of pathogens that are resistant to antibiotics creates unique scientific, clinical, and economic challenges. While the use of most medications does not have an impact beyond the individual patient using it, some drugs, such as vaccines, actually provide benefits to nonusers. Utilization of vaccines generates “herd immunity,” indirectly protecting the unvaccinated.
Use of antibiotics has the potential to create the opposite problem. What separates this class of agents from other pharmaceuticals is that the targets of antibiotics are independent, living organisms (as opposed to enzymes or cell receptors) that can adapt to environmental stressors through natural selection.
Despite the specter of resistance, existing pathogens necessitate antibiotic use. The potential for future resistance, however, affects the definition of appropriate use. Common definitions of appropriate use consider clinical appropriateness (ie, clinical benefit exceeds clinical risk) and economic appropriateness (ie, clinical benefit exceeds clinical risk plus cost).
From a societal perspective, clinical risk would include the impact of use on future resistance. For this reason, the greater the impact of use on future resistance, the lower optimal utilization should be.
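The two definitions of appropriate use, and the way the resistance externality lowers optimal utilization, can be sketched as a simple decision rule. This is a minimal illustration only; the function names and all numeric values are hypothetical assumptions, not figures from the article:

```python
def individually_appropriate(benefit: float, risk: float) -> bool:
    # Clinical appropriateness: expected clinical benefit exceeds expected clinical risk.
    return benefit > risk

def societally_appropriate(benefit: float, risk: float, resistance_cost: float) -> bool:
    # The societal view adds the expected cost of future resistance to the risk side,
    # so a larger resistance externality implies lower optimal utilization.
    return benefit > risk + resistance_cost

# Illustrative (made-up) values: a marginal prescription whose benefit narrowly
# exceeds the individual clinical risk can fail the societal test once the
# resistance externality is counted.
benefit, risk, resistance_cost = 10.0, 8.0, 5.0
print(individually_appropriate(benefit, risk))                   # True
print(societally_appropriate(benefit, risk, resistance_cost))    # False
```

The gap between the two rules is the wedge the article describes: patients and providers who weigh only the first condition will prescribe more often than the societal optimum allows.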
Yet individual patients and providers may not fully consider the societal cost associated with resistance. As a result, utilization may exceed the optimum, with factors such as diagnostic uncertainty and patient demand contributing to excess use. A 2013 Centers for Disease Control and Prevention (CDC) report states that perhaps half of all antibiotics used in humans are unnecessary, and other work suggests that around one-fourth of broad-spectrum antibiotics are prescribed for conditions that rarely indicate their use, such as viral upper respiratory infections.7,8 This overuse is likely exacerbated by patients pressuring caregivers for antibiotics (or other pharmaceutical solutions), with the expectation that someone who shows up sick at the doctor's office should leave with a prescription.9 Existing evidence suggests that this overuse may accelerate resistance. Each year in the United States alone, more than 2 million people are believed to develop a resistant infection, and more than 23,000 die as a direct result. The US experience is in no way unique: India, where antibiotics are often available without a prescription, has seen a precipitous increase in the incidence of resistant tuberculosis cases.8,10
Globally, multidrug-resistant (MDR) tuberculosis alone is estimated to infect around 630,000 people.11 While resistant pathogens first appeared almost exclusively in healthcare facilities, community-acquired resistant infections have recently been on the rise. The rise in resistance not only threatens to increase the number of untreatable infections that may lead to serious disability or death, but also threatens to limit the potential of many other medical procedures. Surgical and nonsurgical procedures that are currently feasible because of effective prophylaxis may no longer be possible because of the risk of infection. Finally, the economic costs of resistance are substantial and growing.12
For instance, direct treatment costs are estimated at $20 billion per year in the United States alone, with indirect costs, such as lost wages, estimated at an additional $35 billion per year. These costs will grow even higher as future resistance renders more and more antibiotics ineffective. Overuse is not the only problem with antibiotic prescribing; there are also substantial clinical and economic costs associated with underuse of appropriate antibiotic therapy. Unlike most scenarios, in which (absent financial constraints) clinicians provide the most effective treatment available, clinicians often prescribe antibiotics known to be less effective in order to preserve the potency of alternative, more powerful drugs.13 One study suggests that clinicians avoid using newer, more powerful antibiotics in the face of resistant pathogens between 58% and 70% of the time, often citing a desire to preserve agents seen as highly effective against resistant pathogens.
Other evidence suggests that cost concerns play a large role in drug choices when treating infectious diseases, with 87% of hospital providers citing cost reduction as a major driver of antibiotic stewardship programs.14 Regardless of the underlying cause, a large body of research substantiates the conclusion that when an appropriate antibiotic regimen is delayed or never started, negative clinical consequences, including preventable deaths, result.15-18 Hesitancy to use the most effective medication out of concern for promoting resistance creates a unique and somewhat paradoxical circumstance, as this restraint reduces the financial incentives to develop novel agents.
Policies intended to limit use and minimize resistance, such as reserving the most effective agents for cases in which less effective agents fail to bring about a cure, are potentially valuable, but they must be balanced against the competing concern of underutilization. The key question is how to define the optimal utilization rate. One approach, albeit a hypothetical extreme, would be to ignore resistance and presume that innovation will always produce effective new agents.