The Prevalence of Glaucomatous Risk Factors in Patients From a Managed Care Setting: A Pilot Evaluation
Ervin N. Fang, MD; Simon K. Law, MD, PharmD; John G. Walt, MBA; Tina H. Chiang, PharmD, MBA; and Erin N. Williams, RN
The importance of risk factor (RF) awareness in ophthalmology has been underscored in recent years by the increased availability of validated predictive models estimating the risk of conversion from ocular hypertension (OHT) to glaucoma.
The goals of RF analysis are to identify patients who will most likely benefit from early treatment, make judicious decisions regarding when to initiate treatment, and determine how aggressively treatment goals should be set.1
The Ocular Hypertension Treatment Study (OHTS), a pivotal prospective trial comparing the outcomes of treatment versus observation in subjects with OHT, demonstrated that reducing intraocular pressure (IOP) by at least 20% decreased the cumulative probability of developing glaucoma after 5 years from 9.5% to 4.4%.2,3
The results of this study indicated that only a relatively small percentage of untreated patients with OHT will develop glaucoma. Yet, multiple population studies have estimated that 3 million to 6 million individuals in the United States alone (including 4%-10% of adults older than 40 years) harbor IOPs of at least 21 mm Hg without overt clinical signs of glaucomatous damage.4-7
Extrapolating the likelihood of glaucoma progression to these figures, it is estimated that without intervention, nearly 0.3 million to 0.6 million Americans are potentially at risk for glaucomatous injury within a 5-year period.
In addition to ascertaining the link between reduction in IOP and the prevention or delay of glaucoma onset, a landmark contribution of the OHTS was the identification of significant RFs for progressing from OHT to glaucoma. RF recognition by the clinician is paramount to instituting timely treatment as well as avoiding the potential risks, inconvenience, and expense of unwarranted intervention. The manner in which ophthalmologists apply their knowledge of RFs in clinical practice, however, continues to evolve. Managing patients with glaucoma has traditionally relied on the following sequence of events: (1) detecting structural or functional damage; (2) establishing a target IOP to reduce or deter damage; (3) initiating treatment; and (4) monitoring for disease progression.1
Because glaucomatous damage is irreversible, however, treatment decisions based reactively on indicators of disease progression rather than RFs may result in substantial vision loss for high-risk patients who might have otherwise avoided glaucomatous injury through timely intervention.
We conducted a retrospective review and analysis of medical records for 1189 patients attending an ophthalmology clinic of a large, urban, managed care organization in Los Angeles, California. The aim of this study was to determine the percentage and characteristics of patients presenting with glaucomatous RFs in this sample and to calculate the average predicted 5-year probability of glaucoma conversion for patients with OHT. For the former analysis, we explored and described the potential associations between demographic, clinical, and ocular RFs for glaucoma progression using cross-tabulation statistics and regression methods. For the latter, we employed the predictive model developed by the OHTS group, which was derived from the findings of the OHTS. This model was developed based on attributes of the observation group in the OHTS and validated recently in the placebo group of the European Glaucoma Prevention Study (EGPS).2,4,8
The pooled predictive model comprises a simple risk scoring system that may be feasibly adopted by the clinician and that has been found to be reasonably discriminatory for estimating the 5-year risk of conversion from OHT to glaucoma.
Methods and Materials
The electronic medical records database was searched, and all medical records for patients treated under the International Classification of Diseases, Ninth Revision
(ICD-9) global code index 365.x for glaucoma and glaucoma-related diagnoses (eg, OHT and glaucoma suspect) were eligible for inclusion. All patients had been examined on at least 1 occasion with the initial visit occurring between June 2000 and May 2005.
Based on the evidence from the OHTS and other published clinical trials,2,9-17
we identified 15 RFs that have been reputedly associated with glaucoma progression (Table 1
). The medical records were screened for the following RF information: age, racial descent, family history of glaucoma (ie, parents and/or siblings), and documented diagnoses of coexistent diabetes mellitus (DM), systemic hypertension, cardiovascular disease, migraine, or vasospasm. Cardiovascular disease, systemic hypertension, DM, and migraine were considered present if a clear documentation of diagnosis or related treatment was recorded on the medical chart. Additionally, we collected data on reported ocular measurements of IOP, vertical cup-to-disc ratio (CDR), central corneal thickness (CCT), and visual field indices, including pattern standard deviation (PSD), and mean deviation (MD). Documentation of the presence or absence of myopia greater than –3 diopters, pseudoexfoliation, and optic disc hemorrhage was also recorded.
All statistical analyses were performed with SPSS for Windows, version 14.0 (SPSS, Inc, Chicago, IL). Missing data were handled using listwise deletion when the pattern of missing data was determined to be random and the number of missing cases did not exceed 5%. For data not meeting these criteria, an expectation maximization (EM) algorithm (SPSS, 1999) was employed. The EM algorithm consists of a 2-step process: first, the expected values of the missing observations are computed using regression equations based on the observed data; then, the missing values are replaced by the conditional means derived from the regression equations.18
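As an illustration of the 2-step process described above, the regression-based conditional-mean replacement can be sketched as follows. This is a simplified, hypothetical sketch (the function name and iteration count are our own, and SPSS's EM routine additionally re-estimates a covariance matrix at each iteration), not the actual SPSS implementation.

```python
import numpy as np

def em_impute(X, n_iter=20):
    """Illustrative EM-style imputation: for each column with missing
    values, regress that column on the remaining columns using the
    observed rows, then replace the missing cells with the conditional
    means from the fitted regression; iterate to stabilize the fills."""
    X = X.astype(float).copy()
    miss = np.isnan(X)
    # Step 0: initialize missing cells with column means (ignoring NaNs)
    col_means = np.nanmean(X, axis=0)
    X[miss] = np.take(col_means, np.where(miss)[1])
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            rows = miss[:, j]
            if not rows.any():
                continue
            others = np.delete(np.arange(X.shape[1]), j)
            A = np.column_stack([np.ones(len(X)), X[:, others]])
            # Step 1: expected values via regression on observed data
            beta, *_ = np.linalg.lstsq(A[~rows], X[~rows, j], rcond=None)
            # Step 2: replace missing values with conditional means
            X[rows, j] = A[rows] @ beta
    return X
```

For data with a strong linear structure, the imputed cells converge to the regression-predicted values rather than the cruder column means.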
Statistical significance was defined as P <.05.
We applied descriptive statistics to indicate the frequency of demographic and clinical RFs. Measurements of age and ocular indices, including IOP, MD, vertical CDR, CCT, and PSD, were reported as the mean ± 1 standard deviation (SD) and analyzed as continuous variables. For cases documenting ocular metrics for the right and left eyes obtained on 1 or more visits, the calculated mean ocular measurements represented the average of the mean respective values for the right and left eyes. Demographic and clinical parameters, which were characterized as categorical or nominal data, were expressed as number counts and the percentages of the total number of patients.
In exploring the data, we addressed the following research questions for the study sample:
1. Are there any identifiable patterns of association among demographic, clinical, and ocular RFs?
2. What is the relative influence of nonocular and ocular RFs in predicting visual field loss and optic nerve damage as quantified by MD and CDR, respectively?
3. What is the 5-year risk of progression to glaucoma for patients with OHT?
In addressing the first question, we determined the distribution of RFs among patients by dividing the study sample into subgroups using the presence or absence of demographic, clinical, and ocular RFs as stratification variables. Threshold values for defining the presence of risk versus the absence of risk for ocular RFs were based on standards established in the published literature2,9-17
or by expert opinion provided by practicing ophthalmologists19
(Table 1). A preliminary series of univariate analyses was performed, and nonocular and ocular RFs demonstrating statistically significant associations were cross-tabulated in separate two-by-two contingency tables. The strength and significance of associations between the presence of these demographic and/or clinical RFs and the presence of ocular RFs were assessed using χ2 statistics and expressed as a phi coefficient for nominal variables.
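For a 2-by-2 contingency table, the phi coefficient and the chi-square statistic are directly related (χ² = n·φ²), so both measures of association can be computed in a few lines. A minimal sketch (the function name is our own):

```python
import math

def phi_and_chi2(a, b, c, d):
    """Phi coefficient and (uncorrected) chi-square for the 2x2 table
    [[a, b], [c, d]], where a..d are the four cell counts.
    For a 2x2 table, chi2 = n * phi**2."""
    n = a + b + c + d
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    phi = (a * d - b * c) / denom  # ranges from -1 to 1
    return phi, n * phi * phi
```

A perfectly concordant table (all cases on the main diagonal) yields phi = 1, while independent row and column factors yield phi near 0.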
For the second investigation, 2 separate multiple regression models, Model 1 and Model 2, were analyzed using a stepwise selection method. The explanatory variables consisted of the RFs from the prior analysis. The mean bilateral MD and vertical CDR were modeled as dependent variables in Model 1 and Model 2, respectively, and each was controlled for in the alternate model. These ocular measures were selected as dependent variables because of their dual role as indicators of disease severity and because, rather than being true RFs, these parameters herald signs of early glaucomatous damage.4
Nominal data for demographic and clinical RFs were coded as binary variables, with “0” and “1” signifying the absence and presence of the RF, respectively, whereas age and ocular RFs were analyzed as continuous variables. Goodness of fit and the assumption of independent errors were assessed via R2
and the Durbin-Watson statistic, respectively.
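Both diagnostics are simple functions of the fitted values and residuals; a minimal sketch (function names are our own):

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination: the share of outcome variance
    explained by the fitted values."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def durbin_watson(resid):
    """Durbin-Watson statistic on regression residuals: values near 2
    are consistent with independent errors; values toward 0 or 4
    suggest positive or negative serial correlation, respectively."""
    resid = np.asarray(resid, float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
```

A perfect fit gives R² = 1, and strongly alternating residuals push the Durbin-Watson statistic toward 4.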
For the final inquiry, we calculated the 5-year risk for progression to glaucoma for the subset of patients with OHT by applying the validated risk scoring system developed by Medeiros and colleagues.4,8
This predictive model has been recently updated and validated by the OHTS group and the EGPS group, and details of the scoring method have been previously published.4
For each patient with a documented diagnosis of OHT, we assigned points ranging from 0 (denoting lowest risk) to 4 (denoting highest risk) to each of 5 predictors: baseline age, IOP, CCT, vertical CDR, and Humphrey visual field PSD. Ocular measurements represented the average of the mean values for the right and left eyes. All points for the 5 RFs were summed, and the 5-year risk for conversion to glaucoma was determined based on the composite risk score (Table 2
). Additionally, we calculated the average composite risk score and the average 5-year risk for conversion from OHT to glaucoma for the total OHT sample.
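The scoring procedure described above (0-4 points per predictor, summed into a composite score that is mapped to a 5-year conversion risk) can be sketched as follows. The function name and the score-to-risk bands used in the example are hypothetical placeholders; in practice, the published cut-points and risk values from Table 2 would be supplied.

```python
def five_year_risk(points, risk_bands):
    """points: dict mapping each of the 5 predictors (baseline age, IOP,
    CCT, vertical CDR, PSD) to its assigned 0-4 points.
    risk_bands: list of (low, high, risk) tuples mapping composite-score
    ranges to a 5-year conversion risk (supply the published Table 2
    values in practice; any bands shown elsewhere here are placeholders).
    Returns (composite score, 5-year risk)."""
    expected = {"age", "IOP", "CCT", "CDR", "PSD"}
    if set(points) != expected:
        raise ValueError(f"expected points for {sorted(expected)}")
    if any(not 0 <= p <= 4 for p in points.values()):
        raise ValueError("each predictor contributes 0-4 points")
    score = sum(points.values())  # composite risk score
    for low, high, risk in risk_bands:
        if low <= score <= high:
            return score, risk
    raise ValueError("composite score outside the risk table")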
Results
Risk Factor Prevalence and Patterns of Association
Medical records were available for 1189 patients who were evaluated between June 2000 and May 2005 for conditions classified under the ICD-9 global code index 365.x for glaucoma. One patient, aged 4 years, was deemed an outlier and excluded from the analyses. Tables 3 and 4 list the demographic, clinical, and ocular characteristics of the sample. The largest percentages of missing data were evident for the age (28.7%) and race (31.6%) parameters and measurements of CCT (81.0%).
Based on the available ages for 847 patients, the sample generally represented an elderly cohort with a mean age of 63.0 years (SD 11.9); approximately one third of these patients exceeded the age of 70 years. Specific glaucoma-related diagnoses were recorded for 1120 patients, of whom 79 (7.1%) carried a documented diagnosis of glaucoma. Glaucoma suspects comprised the majority (69.5%), followed by patients with OHT (18.6%) and those with primary open-angle glaucoma (POAG [3.2%]). Most patients (81.3%) reported no familial history of glaucoma. Hypertension and DM were the most prevalent comorbidities affecting 38.8% and 23.5% of the participants, respectively. Approximately 18% of patients with available data were of African American descent, while Latinos and Caucasians represented 25.8% and 18.6% of the cohort, respectively.
Data on ocular measurements were also inconsistently documented across patients (Table 4). However, 1178 (99%) patients had recorded measurements of IOP on at least 1 occasion. The mean IOP for both eyes was 18.0 (SD 11.9) mm Hg. On average, the vertical CDR, based on measurements for 1048 patients, was 0.52 (SD 0.18). Visual field testing revealed a mean PSD of 2.62 (SD 1.8) dB and a mean MD of −2.20 (SD 3.2) dB based on 861 and 863 documented observations, respectively. For the 227 patients with measurements of CCT, the mean value was 553.0 (SD 33.6) μm. None of the patients had pseudoexfoliation; optic disc hemorrhage and high myopia occurred infrequently.
Univariate regression analyses examining the relationships between nonocular and ocular RFs demonstrated that when age and DM were considered alone, both were significant predictors of MD, PSD, vertical CDR, IOP, and CCT (all, P
<.05; each based on the F-statistic and 1 degree of freedom). Age was positively associated with these ocular parameters, whereas the presence of DM was negatively associated. None of the remaining demographic or nonocular clinical RFs shared significant associations with ocular RFs. (Because of the negligible numbers of patients with cardiovascular disease and migraine, we did not include these RFs as predictive variables.)
Based on the results of this preliminary analysis, we stratified the sample using the following criteria: age >70 or ≤70 years; DM or no DM; mean IOP >21 or ≤21 mm Hg in both eyes; mean vertical CDR >0.8 in either eye or ≤0.8 in both eyes; mean CCT <536 μm in either eye or ≥536 μm in both eyes; and mean MD <−10 dB in either eye or ≥−10 dB in both eyes. These criteria were selected because of their reputed association with RFs or indicators of glaucoma progression, as reported in the published literature.20-23
The criterion for disease progression as indicated by mean MD was based on expert opinion provided by practicing ophthalmologists.19
Applying cross-tabulation statistics, we found negligible, but significant, associations between age >70 years and both the presence of elevated IOP >21 mm Hg (phi –0.10; P = .004) and the presence of greater visual field loss (phi 0.16; P <.001). Coexistent DM was also weakly, but inversely and significantly, associated with thinner central corneas (<536 μm) (phi –0.15; P = .029) and higher vertical CDR (>0.8) (phi –0.082; P = .008) (Table 5).
To address the second research question, we investigated the relative contribution of nonocular RFs and ocular RFs in predicting visual field loss and optic nerve damage (as measured by MD and CDR, respectively) by analyzing 2 separate multiple regression models using stepwise methods. In both models, based on the results of the prior analyses, we assigned the nonocular RFs (age and DM) and the ocular RFs (mean IOP and mean CCT) as explanatory variables while controlling for mean MD or mean vertical CDR in the alternate model. Both final regression models were determined to be adequately fitted.
In Model 1, age, mean CCT, and mean vertical CDR were significant predictors of MD, cumulatively accounting for 11.2% of the variance in visual field measurement. Of the 3 explanatory variables, the extent of visual field loss, as measured by mean MD, was most sensitive to variances in age. This was evident in comparing the standardized beta coefficients (Table 6
), which showed that increasing age by 1 SD (ie, ~12 years) in the study sample worsened the mean MD by 0.34 SD (1.8 dB). By comparison, increases in the mean CCT and vertical CDR by 1 SD worsened the mean MD by 0.12 SD and 0.09 SD, respectively. Because of the inverse relationship between mean CCT and mean MD, indicating worsening visual field defect with thicker central corneas, we evaluated the independent variables in Model 1 for collinearity, since correlations between predictor variables may affect the direction of relationships between independent and dependent variables. In this analysis, we observed a moderate, significant association between mean CCT and vertical CDR (Pearson’s r = –0.522; P
<.001). Although knowledge of the mean CCT and vertical CDR significantly improved the overall predictive value of the model, these ocular RFs jointly accounted for approximately 1% of the variance in mean MD.
In Model 2, age, mean CCT, mean IOP, and mean MD were collectively responsible for 28.9% of the variance in mean CDR, with all variables contributing significantly to the predictive merits of the model (Table 6). Among the RFs, mean CCT exerted the greatest influence over the extent of optic nerve injury. The standardized beta coefficient for this variable indicated that an increase in the mean CCT of 1 SD (33.6 μm) was associated with a decrease in the mean vertical CDR of 0.50 SD (0.09). Compared with the prior model, age was a less reliable predictor, accounting for 0.3% of the variance in the mean vertical CDR.
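The interpretation of standardized beta coefficients used in Models 1 and 2 (a 1-SD increase in a predictor corresponds to a change of beta SDs in the outcome, holding the other predictors fixed) can be sketched by fitting ordinary least squares to z-scored variables. The function name is illustrative:

```python
import numpy as np

def standardized_betas(X, y):
    """OLS fit after z-scoring all variables; each returned coefficient
    is the SD change in the outcome per 1-SD increase in the
    corresponding predictor, holding the others fixed."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    z = (y - y.mean()) / y.std()
    design = np.column_stack([np.ones(len(Z)), Z])
    beta, *_ = np.linalg.lstsq(design, z, rcond=None)
    return beta[1:]  # drop the intercept (zero after centering)
```

If the outcome depends entirely on one predictor, that predictor's standardized beta approaches 1 while the others shrink toward 0.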
5-Year Risk of Conversion to Glaucoma for OHT Patients
Medical records for 42 of the 220 (19.1%) cases with an ICD-9 diagnosis of OHT rendered sufficient data for the risk conversion analysis. The average age of this subset was 60.7 (SD 9.2) years, with 9 of the 42 (21.4%) patients older than 70 years. The mean IOP for both eyes was 21.9 (SD 3.1) mm Hg. The mean vertical CDR, CCT, and PSD were 0.53 (SD 0.18), 551 (SD 31.9) μm, and 2.3 (SD 1.4) dB, respectively. Compared with the 155 and 220 OHT subjects who had available data on age and mean IOP, respectively, the subset appeared to be similar in age (mean age 60.7 years for subset vs 62.1 years for sample; P = .326) and mean IOP (21.9 mm Hg for subset vs 21.0 mm Hg for sample).
Applying the risk scoring system for the OHTS predictive model,4,8 the mean composite score for these 42 patients was 9.7 (SD 3.2), signifying a 15% cumulative 5-year risk for developing glaucoma. Overall, scores ranged from 1.0 to 16.0. More than 71% of the sample harbored a 5-year risk of ≥15%; for 9 (21.4%) of these patients, the cumulative probability of conversion to glaucoma was ≥33%. The Figure
illustrates the distribution of patients grouped by risk scores and the associated 5-year risk for glaucomatous progression.
Discussion
Given that the majority of patients treated for ICD-9 glaucoma-related diagnoses in this managed care ophthalmology practice were either glaucoma suspects or patients with OHT, our observations, albeit retrospective, imply that there may be a trend toward preventive screening for patients at risk for glaucomatous progression. Alternatively, the higher proportions of glaucoma suspects and patients with OHT may have reflected outdated diagnoses arising from the failure to update the initial recorded diagnoses. However, the documented ocular measurements (Table 4) also confirmed that, on average, patients in this sample did not exhibit clinical evidence of advanced disease. In contrast, nonocular comorbidities were prevalent, with almost one fourth of the cases having DM and over one third afflicted with systemic hypertension.
Exploring the potential relationship between nonocular and ocular RFs, we found that the results of the preliminary univariate analyses mirrored the findings of earlier studies demonstrating significant positive associations between older age and MD, PSD, CDR, IOP, and CCT.3,4,8,13,24,25 Coexistent DM, which has been associated with an increased risk of POAG with some equivocality,26-28
was weakly, but inversely and significantly, associated with thinner central corneas (<536 μm) and higher vertical CDR (>0.8). In other terms, patients with DM in this sample tended to have thicker central corneas and lower vertical CDR. Regarding the relationship between DM and CCT, the Barbados Eye Studies observed thicker corneas among participants who had a history of DM.24 More recently, Özcura and Aydin (2007) surmised that the risk of POAG in diabetic patients is equal to or less than that of normal individuals, because patients in the former group tend to have thicker corneas and, thus, higher measured IOPs than the latter, which may lead to a false diagnosis of glaucoma.28 The cross-tabulation analysis in our study also appeared to support this postulation. Compared with patients without DM, a significantly lower percentage of patients with DM had central corneal measurements <536 μm, but a greater percentage had IOPs >21 mm Hg (Table 5).
Quantifying the relative contribution of nonocular and ocular RFs in predicting MD and CDR in this sample was compromised by the large percentages of missing data, particularly for the measurements of CCT; only approximately 19% of the total sample had recorded measurements for this RF. Using the EM algorithm to replace missing data, the multivariate regression analyses identified age, mean CCT, and mean vertical CDR as significant predictors of MD. These explanatory RFs collectively accounted for only 11.2% of the variance in MD; however, this low predictive power was not unexpected given the fact that, by definition of diagnosis, visual field defects would not be prevalent among glaucoma suspects, who comprised the majority of the sample.
In Model 2, 4 of the 5 RFs (specifically age, mean CCT, mean IOP, and mean MD) were significant predictors of the extent of optic disc injury. The strongest predictor was mean CCT, with a standardized beta coefficient of approximately –0.50. As in the prior regression analysis, the absence or presence of DM did not contribute significantly to the model’s predictive value.
Based on the OHTS risk scoring system, the final RF analysis showed that, on average, the patients with OHT in our sample had a 15% cumulative 5-year risk for glaucomatous progression. This risk estimate was higher than the 11.6% cumulative 5-year risk reported for the Diagnostic Innovations in Glaucoma Study (DIGS) cohort in Medeiros et al.8
In that pilot study, the risk scoring system was derived from the pooled hazard ratios for both untreated and treated patients from the OHTS and validated longitudinally in untreated patients from the DIGS. For the present study, we used the updated OHTS predictive model, which relied on the pooled data for untreated cohorts from the OHTS and the EGPS.4
We did not collect information on treatment history, which represents a significant limitation of this analysis. It might be noted, however, that the OHTS failed to identify any appreciable differences between the predictive factors for treated and untreated patients.4,8,29
Other limitations of our present study include its retrospective approach and our inability to confirm clinical diagnoses and ocular measurements by direct testing. Additionally, we did not collect information on the methods used to obtain ocular measurements or longitudinal data; thus, the variables represented reported point-in-time assessments. Lastly, it is difficult to determine the extent to which the small subset of OHT cases represented the total sample of subjects diagnosed with OHT in this study; when compared with the total OHT sample, however, the subset did not appear to differ significantly in mean age or mean IOP.
Notwithstanding these limitations, this study provides further insight into the prevalence and nature of glaucomatous RFs in an urban managed care population. The retrospective review of medical records revealed that known demographic, clinical, and ocular RFs for glaucomatous progression were present in up to approximately one third of patients treated for ICD-9 glaucoma-related diagnoses. Although the results of the statistical analyses confirmed the predictive value of these RFs, it is evident that existing models for determining the risk of glaucomatous progression may not account for several important RFs. Three of the 5 most prevalent RFs in this sample, namely systemic hypertension, positive family history, and Latino ancestry, were not included in the OHTS risk scoring system. These RFs did not enhance the explanatory power of the OHTS model; however, ascertainment of these factors was based on patient history without confirmation of chart review or direct testing.4
Additionally, a number of RFs described as predictive in other studies, such as exfoliation syndrome and pigment dispersion,25,30-32
were not evaluated in the OHTS model because of the small numbers of affected patients. The percentages of missing data in our analyses suggested that current predictive models might also rely on certain ocular indices, such as PSD or CCT, which may not be routinely or practically measured in clinical practice.
Finally, and perhaps most important, for a risk scoring system to be clinically useful, clinicians must be educated on interpreting the results and offered guidance on how to use the information optimally in making treatment decisions. For example, using various methods, several studies evaluating the long-term outcomes of treated and untreated patients have generated data suggesting a clinically relevant threshold value (ie, the risk above which the expected benefits of treatment outweigh the possible risks) for the risk of conversion from OHT to glaucoma and/or to blindness or disability.2,3,33-36
Although one study estimated that between 12 and 83 patients with OHT will require treatment to prevent 1 patient from progressing to unilateral blindness over a 15-year period,33
this information may be less useful for the practitioner who must prospectively render decisions for the individual patient on a daily basis. Thus, future studies should aim to develop predictive models incorporating these considerations and tailor existing models for greater ease and practicality of use and interpretation in the clinical setting.