https://www.ajmc.com/journals/issue/2015/2015-vol21-n6/impact-of-patient-centered-medical-home-on-veterans-experience-care
Impact of the Patient-Centered Medical Home on Veterans' Experience of Care

Ashok Reddy, MD; Anne Canamucio, MS; and Rachel M. Werner, MD, PhD

Strong primary care systems with services dedicated to providing patient-centered, continuous, comprehensive, and coordinated healthcare may improve patient health outcomes and lower costs.1-4 The patient-centered medical home (PCMH) is a widely adopted healthcare delivery model that seeks to strengthen primary care. As of 2013, the National Committee for Quality Assurance (NCQA) had recognized over 5000 practices as medical home practice sites.5 In addition, practice transformation to the medical home model is being tied to payment by several insurers, including CMS, which is investing millions of dollars in medical home practices to achieve the triple aim—improved patient experiences of care, improved quality of care, and reduced costs.6
 
However, early evidence on adoption of PCMH has demonstrated limited success in achieving these goals.7,8 Furthermore, there is limited evidence on whether patient experiences of care have improved in this “patient-centered” intervention. Patient experience of care is increasingly being recognized as an important measure of healthcare quality, as patient-centered care is associated with improved patient satisfaction, adherence to physician recommendations, self-management, and health status among individuals with chronic diseases,9-14 as well as rehospitalization rates and mortality among hospitalized patients.15,16 Despite the evidence in favor of providing patient-centered care, few studies have investigated the effect of the medical home model on patient-centered care.
 
To our knowledge, only 3 studies of adult populations have shown a small but positive effect of medical home adoption on elements of patient care experience,17-19 and 2 other studies did not show any effect.20,21 However, these studies have been limited by small patient samples, limited measures of the medical home intervention, and a lack of evidence on how well the medical home principles are being implemented across practice sites or on the “dose” of the medical home intervention.
 
To address several of these challenges, our study uses data from one of the largest national experiments with medical home adoption to date—medical home adoption by the Veterans Health Administration (VHA). The VHA began implementation of the medical home model in April 2010, which it called the Patient Aligned Care Team, or PACT, initiative. The VHA dedicated over $1 billion nationally to PACT implementation.
 
The PACT initiative’s main goals for primary care are for it to become more comprehensive, coordinated, and patient centered.22 While the PACT initiative is similar in focus to NCQA medical home recognition, the NCQA recognition tool may not be appropriate for the VHA setting in several areas. In fact, the VHA has been a leader in several NCQA medical home domains, such as health information technology infrastructure, electronic prescribing, patient registries, and quality performance measurement.23,24 Thus, a major focus for evaluating the PACT initiative has been on how effectively these resources are being implemented.25
 
To measure the effect of this implementation on patient experience of care, we used a mixed-methods approach, linking data from a series of structured interviews with staff at more than 50 primary care sites on the extent and success of PACT implementation with data on patient experience of care for more than 30,000 veterans.
 
METHODS
Overview
To examine the effect of PACT implementation on patient experience of care, we used 2 sources of variation: the timing and the effectiveness of PACT implementation across study sites. In doing so, we measured the impact of having a PACT primary care provider (PCP) on a patient’s experience of care, and the impact of how effectively a clinic has implemented the PACT model on that same experience. Using a repeated cross-sectional design, we conducted patient-level analyses, with patients clustered within PCPs and sites of care, to test whether changes in healthcare delivery in the VHA under the PACT transformation led to changes in patient experience of care.
 
Study Population
Our study was based in a large mid-Atlantic region of the VHA (Veterans Integrated Service Networks [VISN] 4), which includes 56 primary care sites providing care for more than 300,000 veterans. Our study cohort included patients who responded to the Survey of Healthcare Experiences of Patients (SHEP) between July 2010 and October 2012 within VISN 4. SHEP is mailed monthly to a random sample of veterans with an outpatient visit in the previous 30 days, stratified by clinic site and physician type (primary care vs specialist).26 The national response rate for the outpatient SHEP in the 2010 survey was 53.2%.27
 
Measures
Main independent variables. Our independent variables are derived from detailed interview-based qualitative data conducted in VISN 4 on PACT implementation. Below, we have included a detailed overview of the 3 methods we used to measure PACT implementation. A full description of the mixed-methods methodology and interview guides used to derive each independent variable has been published previously.28
 
Measure of timing of PACT implementation. The first measure of PACT implementation was based on the dates that each PCP in the VISN became a PACT provider. We created a binary variable that equaled 1 when a provider became a PACT provider, and 0 before. Providers were considered to be PACT providers once they had started the PACT training process.
 
Measure of structural change to support PACT implementation. The second measure of PACT implementation measured whether and when specific structural changes in primary care delivery were made. We conducted site visits and structured interviews with key informants at each site with the goal of identifying key structural elements of PACT implementation. Key informants were the persons at each site charged with day-to-day responsibilities related to PACT implementation. In cases where the initial contact was unable to answer all of the questions, we identified a second contact.
 
Structured interviews were based on an interview guide asking about structural changes to support PACT implementation in the following 10 areas: 1) accessing and using data for quality improvement; 2) care management of high-risk patients; 3) nurse medication protocols; 4) transitions from the emergency department; 5) transitions from the hospital; 6) alternatives to single-provider face-to-face visits; 7) changes to enhance access; 8) multidisciplinary teams; 9) team communication and functioning; and 10) using patient-centered methods (see eAppendix 1, available at www.ajmc.com, for interview guide). Five sets of interviews were conducted at 6-month intervals over the 2.5-year period of this study (July 2010 to December 2012 [the end month for the data analyzed]).
 
We summarized the interview data by creating binary variables for 9 of the 10 structural changes, indicating whether the site used any of the specific structural changes in each 6-month period—we did not include responses to queries about accessing and using data for quality improvement as respondents were often confused by this question. For example, in asking about changes to support enhanced access, we created a variable equal to 1 if a clinical site answered “yes” to any of the following questions in each time period: Are any strategies in place for enhanced access? Are scheduling scrubbing methods in place? Are you extending visit intervals when appropriate? Are you using any other methods to enhance access?
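The any-“yes” aggregation described above can be sketched in a few lines; this is an illustrative sketch only, and the dictionary keys are hypothetical stand-ins for the interview guide questions, not actual data fields:

```python
# Sketch of the any-"yes" aggregation used to build the binary
# structural-change variables. Question keys are hypothetical.
def structural_change_flag(answers):
    """Return 1 if the site answered 'yes' to any question in the domain,
    0 otherwise."""
    return int(any(a.strip().lower() == "yes" for a in answers.values()))

# Example for the enhanced-access domain in one 6-month period:
enhanced_access = structural_change_flag({
    "strategies_for_enhanced_access": "no",
    "schedule_scrubbing_in_place": "yes",
    "extending_visit_intervals": "no",
    "other_access_methods": "no",
})
# One "yes" answer is sufficient, so enhanced_access == 1
```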
 
Measure of the overall quality of PACT implementation. Finally, we created one scale variable measuring the overall quality or effectiveness of PACT implementation at each site. Based on the responses to the questions on implementation of the structural measures, the interviewer was asked to rate the effectiveness of the implementation on a 5-point Likert scale ranging from 0 (if a particular structural change had not been made) to 4 (if fully implemented) for each of 10 measures, consisting of the 9 structural measures (after dropping the question on data access) and a measure of support from leadership. We then summed these ratings across the 10 questions, resulting in a summary score with a range of 0 to 40. Previous factor analysis demonstrated that the 10 items function as a summative scale, with a Cronbach’s alpha for the 10 items greater than 0.75 in 4 time periods.
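As a minimal sketch of the scoring and reliability check described above (the ratings shown are hypothetical, not actual site data), the summary score is simply the sum of the ten 0–4 ratings, and Cronbach’s alpha can be computed from item and total-score variances:

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for a summative scale.
    rows: one list of item ratings per observation (here, per site)."""
    k = len(rows[0])
    item_vars = [variance(col) for col in zip(*rows)]  # per-item variance
    total_var = variance([sum(r) for r in rows])       # variance of sums
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# A site's summary score is the sum of its ten 0-4 ratings (range 0-40).
site_ratings = [3, 4, 2, 4, 3, 1, 0, 2, 3, 4]  # hypothetical site
summary_score = sum(site_ratings)               # 26, within 0-40
```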
 
Dependent Variables
Our primary outcome variables include 5 measures of patient care experience: how well doctors/nurses communicate, rating of personal doctor/nurse, getting needed care, overall rating of Veterans Affairs (VA) healthcare, and getting care quickly. We used a standardized method to aggregate and dichotomize SHEP responses (eAppendix 2). For example, a survey respondent was asked the following questions: “A personal doctor or nurse is the one you would see if you need a checkup, want advice about a health problem, or get sick or hurt. Do you have a personal VA doctor or nurse?” (Response options: yes, no); and “Using any number from 0 to 10, where 0 is the worst personal doctor/nurse possible and 10 is the best personal doctor/nurse, what number would you use to rate your personal VA doctor/nurse?” The respondent was counted only if they had a personal VA doctor or nurse. Next, we created a variable equal to 1 if the respondent gave a score of 9 or 10.
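The dichotomization rule above (counting only respondents with a personal VA doctor/nurse, then flagging top-box ratings of 9 or 10) can be sketched as follows; the function name and signature are illustrative:

```python
def top_box(has_personal_doctor, rating):
    """Dichotomize a 0-10 rating: 1 if the respondent rated 9 or 10,
    0 otherwise. Respondents without a personal VA doctor/nurse are
    excluded from the measure (returned as None)."""
    if not has_personal_doctor:
        return None
    return int(rating >= 9)

# Examples: a rating of 10 counts as a positive response; a rating of 8
# does not; a respondent with no personal doctor/nurse is excluded.
positive = top_box(True, 10)
negative = top_box(True, 8)
excluded = top_box(False, 10)
```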
 
We analyzed SHEP survey responses from July 2010 through fiscal year 2012 in the VHA, which ended in September 2012. In the VHA, patients are assigned a primary care provider at the time of enrollment. These data were linked to the PACT implementation data based on the provider, clinic site, and date of the encounter.
 
Covariates
For each SHEP survey respondent, we obtained age, sex, ethnicity, and race from the self-reported survey data. We linked the respondent’s zip code with 2012 American Community Survey data from the US Census Bureau to obtain median household income. In addition, we used the RiskSmart Diagnostic Cost Group (DCG) files at the VA Austin Information Technology Center from the same fiscal year as the SHEP survey date to account for illness severity. All patient cohort data and covariates are listed in Table 1.
 
Statistical Analysis
We conducted patient-level analyses, with patients clustered within PCPs, using linear probability models to test whether changes under PACT transformation were associated with changes in patient experience of care. We used the following general form to test our hypotheses:
Outcome_(i,j,t) = α·PACT_(j,t) + β·X_i + γ_j + τ_t + ε_(i,j,t)
 
In this regression, the outcome variable is 1 of the 5 defined patient experience outcomes, indexed by patient (i), PCP (j), and 6-month time period (t). The coefficient of interest is alpha, representing the effect of a PCP changing PACT implementation status on the outcome of interest. We modeled PACT implementation in 3 ways, as defined above: 1) a dummy variable indicating whether each PCP was a PACT provider in that study period; 2) a scale variable measuring the quality of PACT implementation in each study period for PACT providers (non-PACT providers were assigned a value of zero); and 3) a vector of 9 dummy variables indicating whether PACT providers had implemented each structural change in each study period. We thus estimated the above equation 15 times, using the 5 outcome variables in combination with each of the 3 PACT implementation variables.
 
In our analysis, we controlled for patient-level covariates (ie, age, gender, income, ethnicity, race, and DCG risk score). In addition, we included PCP fixed effects (controlling for time-invariant differences across providers, allowing us to identify the effect of providers changing PACT status and allowing each PCP to serve as a control for him or herself), 6-month time period fixed effects (controlling for secular changes in the outcomes that are common to PACT and non-PACT providers), and a mean 0 random error component. All standard errors were adjusted for clustering within PCP using Huber-White estimators of variance.29,30
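The core of this specification, identifying alpha from within-PCP changes in PACT status, can be illustrated with the fixed-effects “within” transformation. This is a simplified sketch under stated assumptions: it handles a single regressor and omits the patient covariates, time fixed effects, and clustered standard errors used in the full model, and the record layout is hypothetical:

```python
from collections import defaultdict

def within_estimate(records):
    """Estimate alpha by demeaning the outcome and PACT status within each
    PCP (the fixed-effects 'within' transformation), then running OLS on
    the demeaned values. records: iterable of (pcp_id, pact, outcome)."""
    groups = defaultdict(list)
    for pcp, x, y in records:
        groups[pcp].append((x, y))
    num = den = 0.0
    for obs in groups.values():
        mx = sum(x for x, _ in obs) / len(obs)  # PCP mean of PACT status
        my = sum(y for _, y in obs) / len(obs)  # PCP mean of outcome
        for x, y in obs:
            num += (x - mx) * (y - my)
            den += (x - mx) ** 2
    return num / den

# Two hypothetical PCPs whose outcomes each rise by 0.1 when they become
# PACT providers; the within estimator recovers alpha = 0.1 even though
# the two PCPs have different baseline outcome levels.
alpha = within_estimate([
    ("pcp_a", 0, 0.2), ("pcp_a", 1, 0.3),
    ("pcp_b", 0, 0.5), ("pcp_b", 1, 0.6),
])
```

Because each PCP serves as their own control, any time-invariant difference across providers (such as the differing baselines above) drops out of the estimate.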
 
RESULTS
Between July 2010 and September 2012, 30,849 patient experience surveys were completed in VISN 4 that were linked to a provider and clinic site. Descriptions of the patient characteristics are shown in Table 1. A majority of our cohort of veterans were white males 65 years and older. The overall results of the SHEP survey demonstrate that veterans had a favorable patient experience of care in 3 domains: overall rating of VHA healthcare, overall rating of personal doctor/nurse, and how well doctor/nurse communicates (Figure 1).
 
During the 2.5-year study period of PACT implementation, there was a 10-fold increase in veterans who had a PACT provider (Figure 2). In addition, the percentage of PCPs who implemented specific elements of the PACT model increased in 8 of 9 measures (Table 2). For example, the percentage of PCPs who adopted high-risk registries increased from nearly 7% in the first time period to 64% in the last period. In contrast, we did not see an increase in the use of nurse medication protocols. During the study period, we also found that the quality of PACT implementation increased from 1 to 14 among PCPs in the cohort.
 
Our regression models examined the effect of PACT implementation, measured in the 3 different ways, on 5 domains of patient experience of care. Although we saw a substantial increase in the number of veterans with a PACT provider, in the PACT quality level, and in the adoption of PACT structural measures, we found little impact of PACT implementation on patient experience of care (Table 3). For example, having a PACT provider did not have an effect in any of the 5 patient care experience domains. In these models, the adjusted difference in positive responses to patient experience of care between having a PACT provider and not having a PACT provider was less than 1 percentage point and not statistically significant. Similarly, there was no effect of a 10-point increase in the PCP PACT quality scale on any patient care experience measures.
 
Finally, we found that the effect of specific structural changes on patient care experience was generally small and not statistically significant. For 2 structural measures—alternatives to face-to-face visits, and team communication and functioning—we found inconsistent results. We found that having alternatives to face-to-face visits was associated with a nearly 7-percentage-point (95% CI, –14.6 to –0.6; P = .03) worse rating in getting needed care, but we also discovered that policies related to team communication and functioning were associated with a 4-percentage-point (95% CI, 0.2-9.5; P = .04) higher overall rating of the VHA.
 
DISCUSSION
A key foundation of the PCMH is improving patient experiences of care. However, despite wide adoption of the PCMH, there is little evidence of the impact of medical home transformation on patient experience of care.
 
We examined the impact of medical home implementation in the VHA on patient experiences of care. Over a 2.5-year time period, we found that significant structural changes were made to improve primary care delivery. In a majority of structural measures of PACT implementation, primary care providers increased their adoption of these PCMH elements more than 7-fold. However, in our primary analysis, having a PACT provider, or having PACT more effectively implemented, was not associated with higher ratings in 5 major domains of patient care experience. We also found that a majority of structural measures were not associated with patient care experience, with 2 exceptions. Because we made multiple comparisons, these exceptions may in part be due to chance alone.
 
Our results represent an important contribution to the evidence on PCMH implementation and patient care experience. First, we provide evidence from one of the largest PCMH implementation initiatives in the country using a large cohort of patients. Second, we did not measure medical home implementation simply as an on-off switch; we also evaluated whether there was a dose effect of the medical home on patient care experiences. By using qualitative data, we evaluated not only whether successful implementation matters, but also which structural changes to support the medical home matter, if any. Although we did not find an association between medical home implementation and patient experience, this may simply reflect the complexity of measuring and implementing the medical home in practice.
 
On one hand, our results may seem surprising. We expected that improvement in patient experiences of care would be a core outcome of PCMH transformation. However, improving patient experiences of care is complex and influenced by an array of patient and social factors, including previous healthcare interactions, expectations, and attitudes that exist prior to any current experience within a healthcare system.20,31 It may be difficult for medical home transformation to influence the myriad of characteristics required to impact patient experience of care, especially over a relatively short period of time.
 
Limitations
Our study has several important limitations. First, the medical home model may have significant lag effects. The PCMH, like other innovations, may take time to have a significant impact on patient experiences of care. During this early phase of primary care transformation, we may be measuring the effect of unanticipated disruptions needed for practices to become functional medical homes. Second, as with most survey data, we may have response bias. For example, we had a limited sample of racial and ethnic minorities, which can limit the generalizability of our findings in this population. In addition, given the changes to primary care delivery, patients with strong opinions—positive or negative—may have undue influence on our findings. Third, this is an observational study, and as such, causality cannot be inferred. Finally, while we use 3 unique measures of medical home implementation, most based on qualitative data, this method has limits. For example, we assume providers became fully trained in PACT implementation on the date they became a PACT provider; however, providers may in fact take weeks to years to become proficient at implementing the principles of the medical home. And although interviewees were assured of the confidentiality of the interviews, they may have overstated the progress their clinic and providers had made in medical home transformation.
 
CONCLUSIONS
Despite several limitations, our study provides important insights on the impact of medical home implementation on patient experiences of care. Most medical home implementation efforts have focused on the establishment of key structural elements of the model without sufficient emphasis on the interpersonal aspects of primary care that contribute to improving patients’ experience of care. Increasing focus on these relational aspects around improved communication and trust may be central to improving patient care experience. As we move forward with the medical home model, it will be key to obtain more qualitative and quantitative data on what patients want in a medical home or a primary care practice more generally. As we focus on primary care transformation to improve healthcare delivery, we need to find ways to incorporate the patient’s voice and input into these transitions. 
