Real-world longitudinal data suggest that in CLL, regular immunoglobulin replacement therapy may not reduce, and may even correlate with increased, serious infection rates during treatment periods.
A significant new study from Australia is prompting clinicians to reconsider the use of immunoglobulin replacement therapy (IgRT) in patients with chronic lymphocytic leukemia (CLL). The findings, published in Blood Advances, suggest that while IgRT use has increased substantially over time, it was not associated with a reduced risk of serious infection-related hospitalizations—and may even correlate with higher infection rates in some cases.1
Drawing on data from 6217 patients diagnosed between 2008 and 2022 across Victoria, this retrospective, real-world analysis was led by researchers at Monash University. Among the entire cohort, 753 patients (12.1%) received at least one IgRT treatment, and of these, 524 patients (69.6%) received IgRT regularly, defined as having at least 3 treatments within 5 months with no treatment gap exceeding 3 months.
Over the follow-up period, the proportion of patients experiencing serious infections doubled to 3.9%, while the proportion receiving IgRT quadrupled from 2.0% in the first year after diagnosis to 8.8% by year 14, peaking at 12.0% at the 10-year mark. The 524 patients treated at regular intervals, as recommended by clinical guidelines, represented 8.4% of the full cohort.
Paradoxically, within this subgroup, the incidence of serious infections was higher during periods of active IgRT (0.056 per person-month; 95% CI, 0.052-0.060) compared to periods off therapy (0.038 per person-month; 95% CI, 0.035-0.042). This unexpected finding persisted even after adjusting for age, comorbidities, anticancer treatment, and prior infections, with a hazard ratio of 1.30 (95% CI, 1.12-1.50) for infection during IgRT periods compared to non-treatment periods.
The results contrast with a recent US study by Soumerai et al., which reported decreased infection rates following IgRT initiation.2 However, that study employed a before-and-after analysis, making it vulnerable to regression to the mean, a potential source of bias that the present study sought to mitigate by comparing infection rates during and outside of IgRT exposure periods.
The temporal relationship between infections and IgRT treatment decisions was also notable. Patients who had a serious infection in the prior 30 days were significantly more likely to start IgRT (HR, 31.91; 95% CI, 24.94-40.82), discontinue IgRT (HR, 3.81; 95% CI, 2.71-5.35), or restart it following a cessation (HR, 11.61; 95% CI, 8.00-16.85). The authors suggest that clinicians may interpret infections during IgRT as treatment failure, prompting discontinuation, only to reinitiate treatment if subsequent infections occur.
Interestingly, the median survival from CLL diagnosis was approximately 10 years (121.5 months), while median survival from the first IgRT treatment was notably shorter, at 72.7 months. In patients with a serious infection in the prior month, the 30-day mortality rate was 9.0 per 100 person-months, compared to 0.8 per 100 person-months in those without a recent infection.
The reasons for these results appear multifactorial, involving both clinical policy and patient selection. The authors note that IgRT is a high-cost, limited-supply blood product, and its use in CLL has nearly doubled in recent years. “In our cohort, 12.1% of CLL patients received IgRT at some time during the follow-up, almost double the proportion of IgRT use in CLL patients recently reported in a US longitudinal study. The higher uptake in our study might be influenced by the public funding of IgRT in Australia.” In addition, patients who continued regular IgRT may have been inherently at higher risk for infections or may have had more aggressive disease, possibly necessitating ongoing therapy.
CLL, the most common form of adult leukemia in the Western world, is frequently accompanied by immune dysfunction arising from both the underlying malignancy and its therapeutic interventions.3 Hypogammaglobulinemia, defined by low serum IgG levels, is common in this population and contributes to recurrent infections, which are a leading cause of CLL-related mortality.
Reflecting on these complex dynamics, the authors noted that the study “highlights a disconnect between infection-risk and IgRT use over the disease journey.” Serious infections were associated not only with IgRT initiation and reinitiation but also with cessation. Given the increasing use and cost of IgRT, these results emphasize the need to evaluate the causal association and to determine which patient subgroups, and at what point in their disease course, may benefit most from IgRT.
References: