
Call Center Performance Affects Patient Perceptions of Access and Satisfaction

The American Journal of Managed Care, September 2019, Volume 25, Issue 9

Greater telephone wait times, but not abandonment rates, were associated with lower patient perceptions of their ability to obtain urgent care in a timely manner.

ABSTRACT

Objectives: There is little research on the relationship between call center performance and patient-centered outcomes. In this study, we quantified the relationships between 2 measures of telephone access, average speed of answer (ASA) and abandonment rate (AR), and patient satisfaction outcomes within the Veterans Health Administration (VHA).

Study Design: We analyzed 2015 and 2016 data from the Survey of Healthcare Experiences of Patients and linked them with administrative data to gather features of the patient visit and monthly measures of telephone access for each medical center.

Methods: We used mixed effects logistic regression models to estimate the effects of ASA and AR on a variety of access and satisfaction outcomes. Models were adjusted for patient-level demographics, time-varying facility-level characteristics, features of the patient visit, and facility-level random effects to control for care quality and case mix differences.

Results: The VHA made substantial strides in both access measures between 2015 and 2016. We found that a center’s ASA was inversely associated with patients’ perceptions of their ability both to access urgent care appointments and to do so in a timely manner. In contrast, telephone AR was not associated with any of the patient satisfaction outcomes.

Conclusions: Our results associate decreased telephone waits with improved perceptions of urgent care access even without concomitant decreases in observed appointment waits. These findings may have important implications for regulators as well as for healthcare organizations that must decide resource levels for call centers, including hospitals, federal health insurance exchanges, and insurers.

Am J Manag Care. 2019;25(9):e282-e287

Takeaway Points

Using survey data from 2015 and 2016, we conclude that telephone access has important consequences for patient satisfaction. We show that:

  • Longer telephone wait times were associated with decreases in patients’ perceived ability to access urgent care appointments and to do so in a timely manner.
  • There was no clear association between call abandonment rates and patients’ perceptions of healthcare access or satisfaction with their care.
  • If hospitals and providers are to become more patient centered, attention must be paid to how patients are served when they call to schedule appointments or ask medical questions.

For many patients, picking up the telephone is the first step in their engagement with the healthcare system. Hospitals and payers each have independent call centers to assist patients with medical and administrative questions, appointments, billing, and more. A 2015 survey of healthcare call center leadership found that many of these call centers are long established, and an overwhelming majority (93%) are managed in house. Most respondents predicted that service levels, staffing, and the importance of telephone-based services will grow in coming years.1 Additionally, the Affordable Care Act imposed requirements for state and federal health insurance exchanges to operate telephone hotlines for citizens, under guidelines promulgated by HHS.2

Two of the largest efforts to collect data on call center performance, at least for public programs, are operated by CMS and the Veterans Health Administration (VHA). CMS has the authority to monitor call centers for Medicare Advantage organizations, Prescription Drug Plan sponsors, and Medicare/Medicaid insurers under 42 CFR 432.128(d)(1). CMS employs secret shoppers to collect a variety of call center performance metrics and conducts quarterly “timeliness studies” to evaluate each center. A call center receives a passing grade if its average hold time is less than 2 minutes and fewer than 5% of its calls are disconnected. Additionally, summary reports of other performance metrics are provided back to the call center but are not used for assessment purposes.3

VHA, the largest integrated healthcare system in the United States, also provides timely telephone services, including 24/7 telephone access to clinical staff trained to provide healthcare advice and information, to all veterans receiving care at its facilities. Facility-level telephone data and a variety of care quality measures are used to summarize and improve medical center performance. To accelerate improvement, VHA established a nationwide initiative to improve telephone access in 2009-2010. The initiative included components such as the installation of automatic call distribution systems, improved training and monitoring of call center teams, and the creation of multidisciplinary teams at participating medical centers to test and implement quality improvement strategies.4 Timely telephone services are currently assessed by 2 measures: average speed of answer (ASA) and abandonment rate (AR). The VHA’s goal for each facility is to have an ASA of less than or equal to 30 seconds and an AR of less than or equal to 5%.5

There is little research on the relationship between specific measures of call center performance and patient-centered outcomes. Within the healthcare literature, studies have found that hold times, staff courtesy, whether staff provided requested medical information or help,6,7 and the number of transfers4 were all related to overall patient satisfaction with care. Other measures cited as important in the nonhealthcare literature include first-call resolution, AR, ASA, total call volume, and average talk time, among others.8,9 These earlier efforts had several limitations. For instance, they generally involved relatively small sample sizes and a single outcome measure (overall patient satisfaction with care).

In this study, we quantified the relationships between multiple measures of telephone access and patient satisfaction within the VHA. Using Survey of Healthcare Experiences of Patients (SHEP) data on primary care visits to the VHA during federal fiscal years (FYs) 2015 and 2016, we examined whether improvements in clinic-level telephone access measures were associated with concomitant improvements in a wide range of patient-reported outcomes. To our knowledge, no previous work has taken advantage of this large, national data set to examine these relationships. It is important for hospitals, payers, and other healthcare organizations such as CMS to understand whether telephone access is meaningfully associated with patient perceptions of care quality and access. Without such evidence, insufficient resources may be directed to call centers, and interventions to improve call center performance may omit the metrics that matter most for patient satisfaction.

METHODS

Sample Selection

These analyses used VHA telephone access administrative measures to predict self-reported patient satisfaction with care. Data for this study came from the FYs 2015 and 2016 SHEP outpatient cohorts, which were the most recent years of data available during the study period. We did not use data from before 2015 due to changes in the SHEP sampling methodology that could confound the analysis. SHEP is a nationwide mail survey, distributed to veterans after a visit to a VHA facility, that seeks to obtain veterans’ perceptions of their care. For outpatient care, a simple random sample of patients with completed appointments at every VHA facility nationwide is selected each month. Thus, SHEP may be considered a repeated cross-sectional survey. The overall response rates in 2015 and 2016 were 43% and 41%, respectively.

Survey responses were linked back to VHA administrative data to gather features of the patient visit. Respondents were included in the sample if they completed at least part of the SHEP survey, were not missing demographic information (age, race/ethnicity, or gender), and had an appointment for primary care. For the 6.2% of patients with more than 1 survey response, only the first response was included in the sample.

Telephone data that were missing or had a value of 0 recorded for either AR or ASA were removed from the data set, as were unrealistic outliers and apparent data entry errors, defined as values in the top or bottom 2.5% for either AR or ASA.10 This cutoff threshold performed well in sensitivity analyses. The final sample included 252,145 unique patients across 285 medical facilities.
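
A minimal sketch of this cleaning step is shown below, assuming a facility-month data frame named phone with columns asa and ar; the object and column names are illustrative, not the study’s actual variable names.

```r
# Illustrative cleaning step: drop facility-months with missing or zero
# values for either measure, then trim the top and bottom 2.5% of ASA
# and AR as likely data entry errors or unrealistic outliers.
phone <- subset(phone, !is.na(asa) & !is.na(ar) & asa > 0 & ar > 0)

asa_bounds <- quantile(phone$asa, probs = c(0.025, 0.975))
ar_bounds  <- quantile(phone$ar,  probs = c(0.025, 0.975))

phone <- subset(phone,
                asa >= asa_bounds[1] & asa <= asa_bounds[2] &
                ar  >= ar_bounds[1]  & ar  <= ar_bounds[2])
```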

Telephone Access Measures

VHA collects 2 types of telephone access data for all incoming calls. Response time is measured by the average length of time elapsed before a caller reaches a staff member:

ASA = Sum(Answer speed × Call volume) / Sum(Call volume)

VHA medical centers also collect data on call abandonment, regardless of call length. Telephone AR is defined as the percentage of calls coming into a telephone system that are terminated by the person originating the call before being answered by a staff person. AR is measured as:

AR = Sum(Abandoned calls) / Sum(Call volume)
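
As a hedged illustration of the two formulas, the facility-month metrics could be computed from sub-period (eg, daily) call summaries as follows; the data frame and column names are assumptions for the sketch, not VHA field names.

```r
# Illustrative computation of the two monthly access measures for one
# facility from daily call summaries; column names are assumed.
# ASA: call-volume-weighted average answer speed, in seconds.
asa <- sum(calls$answer_speed * calls$call_volume) / sum(calls$call_volume)

# AR: abandoned calls as a percentage of all incoming calls.
ar <- 100 * sum(calls$abandoned_calls) / sum(calls$call_volume)
```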

Both telephone metrics are tracked locally by individual VHA medical centers, and monthly averages are then entered by staff members into a national database. Centers that have been granted waivers are not required to make monthly submissions. Additionally, some smaller medical centers do not have their own call centers; these are referred to as “covered” facilities. Data from covered facilities are monitored and collected by a larger nearby facility, and in these cases only a combined set of telephone metrics is reported. For example, a VHA medical center may handle the telephone data collection for its affiliated community-based outpatient clinics. Approximately 11% of SHEP responses are from visits to covered facilities. In our regression models, we included a dummy variable taking on a value of 1 if the facility was a covered facility and 0 otherwise. Because staff enter these data manually, we visually inspected histograms of monthly telephone ARs and wait times for evidence of manipulation of the access measures near the VHA’s performance thresholds.

Patient Satisfaction Measures

Table 1 contains a listing of the specific outcome measures used in this analysis along with their respective answer formats. Satisfaction measures were selected and calculated following previous work in this area.11 Respondents were asked questions regarding their ability to obtain appointments for urgent care and routine care as soon as they needed and their ability to get medical questions answered within the same day. Responses for these measures included always, usually, sometimes, and never; we dichotomized responses into always/usually compared with sometimes/never. Additionally, SHEP included questions regarding how long patients had to wait for urgent care appointments, with responses on a 5-point scale from “same day” to “more than 7 days.” We dichotomized responses into wait times of 1 day or less versus more than 1 day. Lastly, satisfaction with their healthcare provider was measured by asking respondents to provide a rating on a scale of 0 to 10. We dichotomized responses into ratings of 9 or 10 compared with less than 9.
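
A brief sketch of this dichotomization is shown below; the variable names and response labels are assumptions about how the raw SHEP items might be coded, not the survey’s actual field names.

```r
# Illustrative recoding of SHEP responses into binary outcomes.
shep$urgent_asap        <- as.integer(shep$urgent_care_asap   %in% c("Always", "Usually"))
shep$routine_asap       <- as.integer(shep$routine_care_asap  %in% c("Always", "Usually"))
shep$questions_sameday  <- as.integer(shep$questions_answered %in% c("Always", "Usually"))
shep$urgent_within_1day <- as.integer(shep$urgent_wait        %in% c("Same day", "1 day"))
shep$provider_9_or_10   <- as.integer(shep$provider_rating >= 9)
```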

Control Variables and Model Specification

For all models, the unit of analysis was the individual SHEP response. Self-reported physical and mental health were measured on separate 5-point scales ranging from poor to excellent. To control for survey context effects, 2 binary variables were included indicating whether the respondent received help completing the survey and whether the survey was administered in English. Additional facility-level characteristics, which are time-varying and may affect satisfaction, were included, such as SHEP nonresponse rate, average wait time for a primary care appointment, average number of days between patient visit and survey return date, covered facility status, and volume of primary care visits and phone calls during the preceding month. Models also included an overall time trend to control for secular changes in wait times and a VHA medical center random effect to control for facility quality and case mix differences.

Our key predictor variables, ASA and AR, were converted into quartiles using cutoffs based on telephone performance data reported by all VHA centers in October 2014, the first month of our data. The cutoff points for these quartiles (in seconds for ASA and percentages for AR) are contained in Table 2.
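
A minimal sketch of this quartile conversion, again with assumed object and column names, might look like the following.

```r
# Illustrative conversion of ASA and AR into quartiles, using cutoffs
# from October 2014, the first month of data; names are assumed.
baseline <- subset(phone, month == "2014-10")

asa_cuts <- quantile(baseline$asa, probs = c(0.25, 0.50, 0.75))
ar_cuts  <- quantile(baseline$ar,  probs = c(0.25, 0.50, 0.75))

# Open-ended outer breaks so later months outside the baseline range
# still fall into the first or fourth quartile.
phone$asa_quartile <- cut(phone$asa, breaks = c(-Inf, asa_cuts, Inf),
                          labels = paste0("Q", 1:4))
phone$ar_quartile  <- cut(phone$ar,  breaks = c(-Inf, ar_cuts, Inf),
                          labels = paste0("Q", 1:4))
```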

We estimated separate adjusted mixed effects logistic regression models for each patient satisfaction outcome. As a sensitivity analysis, we repeated this process using fixed instead of random effects for each medical center. Analyses were performed using R version 3.3.2 (R Foundation for Statistical Computing; Vienna, Austria).
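
A sketch of one such model is shown below, using the lme4 package (the article does not name the package used) with a facility-level random intercept; the outcome shown is one of the five, and the covariate names are illustrative stand-ins for the controls described above.

```r
library(lme4)

# Adjusted mixed effects logistic regression for one outcome
# (ability to get an urgent care appointment as soon as needed),
# with a random intercept for each VHA medical center.
fit <- glmer(
  urgent_asap ~ asa_quartile + ar_quartile +
    age + gender + race_ethnicity + phys_health + mental_health +
    survey_help + survey_english +
    nonresponse_rate + avg_pc_wait + days_to_survey_return +
    covered_facility + pc_visit_volume + call_volume + time_trend +
    (1 | facility_id),
  data = analytic, family = binomial
)

# Odds ratios for the fixed effects, with Wald 95% CIs.
exp(fixef(fit))
exp(confint(fit, parm = "beta_", method = "Wald"))
```

The same specification would be rerun for each of the remaining outcomes, and the fixed-effects sensitivity analysis would replace the random intercept with facility indicator variables.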

RESULTS

Characteristics of our study sample are described in Table 3. The SHEP respondents in our sample were representative of the larger VHA population. Respondents were generally older, overwhelmingly male, and mostly white. Patients were generally highly satisfied with their providers; approximately 2 of 3 rated their provider a 9 or 10 of 10 possible points. Nearly 4 of 5 patients stated they were usually or always able to receive appointments for urgent care as soon as they needed them. For routine care, this proportion fell to 3 of 5. Approximately 37% of respondents who sought an urgent care appointment were seen within 1 day, whereas only 28% were able to receive answers to medical questions within the same day.

VHA showed improvements in telephone access measures between October 2014 and September 2016 (FY 2015 and FY 2016) (Figure). ASA declined slightly, from an average of 87 seconds in October 2014 to 69 seconds in September 2016. Average AR fell from 12.0% to 8.3% during the same time period. There was significant variation in telephone access measures by VHA center. It is possible that the manual transposition of telephone performance data from VHA centers to the national database could introduce human error into the data-generating process. Additionally, gaming of metrics is a concern when there are institutional incentives to meet or exceed specified performance targets.12 Because of the human role in collecting and transmitting these data, we conducted a graphical inspection of histograms of the telephone access measures to check for strategic behavior. We found no evidence of “bunching” or rounding down of ASA or AR to the VHA performance thresholds of 30 seconds or 5%, respectively. The distributions of telephone access metrics are multimodal, with small peaks occurring at many whole numbers for ASA and tenths of a percentage point for AR. This suggests that individual centers sometimes rounded their access metrics before transferring them to the national database, despite greater possible precision.
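
A simple version of this inspection might look like the sketch below, with assumed column names; a spike at or just below a threshold with a gap immediately above it would suggest strategic reporting.

```r
# Visual check for bunching near the VHA performance thresholds
# (ASA <= 30 seconds, AR <= 5%).
hist(phone$asa, breaks = seq(0, max(phone$asa) + 5, by = 5),
     main = "Monthly average speed of answer", xlab = "Seconds")
abline(v = 30, lty = 2)  # ASA performance threshold

hist(phone$ar, breaks = seq(0, max(phone$ar) + 0.5, by = 0.5),
     main = "Monthly abandonment rate", xlab = "Percent")
abline(v = 5, lty = 2)   # AR performance threshold
```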

Our findings reveal negative and significant associations between ASA and 2 of the 5 outcome measures (Table 4), and these relationships exhibited a generally decreasing gradient. For instance, patients who made appointments for urgent care were less likely to respond that they could usually or always get appointments as soon as they needed if they visited a VHA center with ASA in the third quartile (odds ratio [OR], 0.93; 95% CI, 0.86-1.02) or fourth quartile (OR, 0.85; 95% CI, 0.76-0.95) compared with a VHA center in the first quartile. Patients were also less likely to report being able to get an urgent care appointment within 1 day if they visited centers in the second quartile (OR, 0.93; 95% CI, 0.88-0.99) or fourth quartile (OR, 0.84; 95% CI, 0.77-0.92) for ASA. The effects of longer telephone wait times on provider ratings were in the expected direction and showed a similar dose-response relationship but were not statistically significant. There was no clear relationship between ASA and the ability to get medical questions answered or to receive routine care appointments.

The regression results did not demonstrate a clear, significant relationship between AR and any of the 5 outcomes. Additional data on CIs and P values for both our main predictors of interest and covariates are included in the eAppendix Table (available at ajmc.com). Sensitivity analysis results using medical center fixed effects were not significantly different from the random effects models, as determined by Hausman tests, supporting the decision to prefer the random effects specifications due to their increased efficiency in the absence of bias.13

DISCUSSION

In this study, we quantified the relationship between the average speed of answering telephone calls and patient satisfaction with urgent care access. As our results show, patients at facilities that were slower to answer phone calls reported a lower perceived ability both to access urgent care appointments and to do so in a timely manner.

In contrast, there was no clear relationship between AR and the study outcomes. One potential issue is that the VHA includes all callers in its calculation of AR, although callers who hang up shortly after dialing have most likely misdialed and are not likely to have hung up due to exasperation with their wait time. This “noise” in the data may attenuate the effect of AR on satisfaction and access outcomes. Best practices in the industry suggest that calls shorter than 5 seconds should not be included.14

VHA made substantial strides in reducing telephone wait times and abandoned calls between 2015 and 2016, with average wait times and ARs decreasing by approximately 7% and 22%, respectively. Despite these improvements, nearly 80% of centers still did not meet 1 or both of the VHA’s stated performance thresholds by the end of the study period. Based on our findings, further improvements in telephone access could be expected to lead to improvements in patients’ self-reported access and satisfaction.

These results demonstrate that decreased telephone waits are associated with improved patient perceptions of urgent care access even without concomitant decreases in observed appointment waits. The strength of this association may occur because telephone call centers are often the first point of contact for patients, and this contact immediately precedes the patient visit. However, we did not find evidence that either AR or ASA was associated with patient ratings of routine care access, of their ability to get medical questions answered the same day, or of their providers. These limited findings may have implications for insurers, CMS, and other healthcare organizations that must make resourcing decisions for call centers, including those serving hospitals and federal health insurance exchanges. Specifically, if these organizations are to become more patient centered and improve patient satisfaction, managing the promptness of service for patients who call to schedule appointments or ask medical questions is likely to be an important component of a successful strategy to improve particular satisfaction measures.

Limitations

This study has several limitations. Due to the observational nature of the analysis, the significant relationships between telephone access measures and patient satisfaction may reflect omitted demographic or time-varying facility-level variables and thus may not be causal. However, these results comport with prior research demonstrating associations between telephone access measures and caller satisfaction.4,6-9 Further, the relationships we found with ASA were similar to those found in prior research validating access metrics against health outcomes and self-reported patient satisfaction.11,15-17 For instance, Prentice et al (2014)11 and Pizer et al (2017)15 found similar negative relationships between satisfaction and longer average appointment wait times for new patients and consult wait times for returning patients.

The SHEP sample consists predominantly of middle-aged and elderly white men, so the results may not generalize to other demographic groups. Further, patients with multiple visits in a short period may not always remember which appointment the survey was referencing.

The VHA does not collect data on other telephone access measures, such as the number of transfers, first-call resolution, and average talk time, that have been cited as important in the literature. Given the widespread availability of technology to track these additional measures, we recommend that the VHA and other organizations collect these data so that future research may study their effects on patient satisfaction as well.

CONCLUSIONS

Our study is the first to look at the relationships between 2 measures of telephone access, wait times and ARs, and patients’ satisfaction with their care experience. Within the VHA system, longer ASA was associated with lower patient ratings of their ability to access urgent care appointments and to do so in a timely manner. We observed these effects even after controlling for a variety of demographic characteristics, institutional factors, and average medical center wait times. However, ASA was not associated with patient ratings of their ability to access routine care or their providers. AR was associated with neither patients’ perceptions of their ability to access care nor satisfaction with their care. Our results suggest that hospitals and providers could achieve modest improvements in patient satisfaction by reducing the time patients spend on hold.

Author Affiliations: Partnered Evidence-Based Policy Resource Center, Veterans Health Administration (KNG, SDP), Boston, MA; Department of Health Law, Policy & Management, Boston University School of Public Health (KNG, SDP), Boston, MA; Center for Healthcare Organization & Implementation Research, Veterans Health Administration (DL, JCP), Boston, MA; Office of Veterans Access to Care, Veterans Health Administration (MLD), Washington, DC; Department of Psychiatry, Boston University School of Medicine (JCP), Boston, MA.

Source of Funding: Funding for this research was provided by the Department of Veterans Affairs, Veterans Health Administration Office of Veterans Access to Care. The views expressed in this article are those of the authors and do not necessarily represent the position or policy of the Department of Veterans Affairs, Boston University, or Northeastern University.

Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (KNG, MLD, SDP, JCP); acquisition of data (KNG, DL, JCP); analysis and interpretation of data (KNG, DL, MLD, SDP, JCP); drafting of the manuscript (KNG); critical revision of the manuscript for important intellectual content (KNG, MLD, SDP, JCP); statistical analysis (KNG, DL, SDP); provision of patients or study materials (MLD); obtaining funding (MLD, SDP, JCP); administrative, technical, or logistic support (MLD, SDP, JCP); and supervision (SDP, JCP).

Address Correspondence to: Kevin N. Griffith, MPA, VA Boston Healthcare System, 150 S Huntington Ave, Jamaica Plain Campus, Bldg 9, Boston, MA 02122. Email: kevin.griffith@va.gov.

REFERENCES

1. Fell D. 2015 healthcare call center survey results. Oral presentation at: 27th Annual Conference of Healthcare Call Centers; June 10-12, 2015; Charlotte, NC.

2. Patient Protection and Affordable Care Act, HR 3590, 111th Cong, 2nd Sess (2010).

3. Prescription drug coverage contracting. CMS website. cms.gov/Medicare/Prescription-Drug-Coverage/PrescriptionDrugCovContra/index.html. Updated April 3, 2019. Accessed March 9, 2018.

4. LaVela SL, Gering J, Schectman G, Locatelli SM, Weaver FM, Davies M. Improving the quality of telephone-delivered health care: a national quality improvement transformation initiative. Fam Pract. 2013;30(5):533-540. doi: 10.1093/fampra/cmt020.

5. Veterans Health Administration. Telephone service for clinical care. US Department of Veterans Affairs website. va.gov/vhapublications/ViewPublication.asp?pub_ID=1605. Published October 11, 2007. Accessed October 5, 2017.

6. LaVela SL, Gering J, Schectman G, Weaver FM. Optimizing primary care telephone access and patient satisfaction. Eval Health Prof. 2012;35(1):77-86. doi: 10.1177/0163278711411479.

7. Moscato SR, Valanis B, Gullion CM, Tanner C, Shapiro SE, Izumi S. Predictors of patient satisfaction with telephone nursing services. Clin Nurs Res. 2007;16(2):119-137. doi: 10.1177/1054773806298507.

8. Feinberg RA, Kim IS, Hokama L, de Ruyter K, Keen C. Operational determinants of caller satisfaction in the call center. Int J Serv Ind Manag. 2000;11(2):131-141. doi: 10.1108/09564230010323633.

9. Antonides G, Verhoef PC, van Aalst M. Consumer perception and evaluation of waiting time: a field experiment. J Consum Psychol. 2006;12(3):193-202. doi: 10.1207/S15327663JCP1203_02.

10. Hellerstein JM. Quantitative data cleaning for large databases. UC Berkeley Database Group website. db.cs.berkeley.edu/jmh/papers/cleaning-unece.pdf. Published February 27, 2008. Accessed October 3, 2018.

11. Prentice JC, Davies ML, Pizer SD. Which outpatient wait-time measures are related to patient satisfaction? Am J Med Qual. 2014;29(3):227-235. doi: 10.1177/1062860613494750.

12. Prentice JC, Frakt AB, Pizer SD. Metrics that matter. J Gen Intern Med. 2016;31(suppl 1):70-73. doi: 10.1007/s11606-015-3559-0.

13. Wooldridge JM. Introductory Econometrics: A Modern Approach. Mason, OH: South-Western; 2006.

14. Berger S. 5 pitfalls when measuring abandonment rate. Fonolo website. fonolo.com/blog/2016/02/5-pitfalls-when-measuring-abandonment-rate/. Published February 16, 2016. Accessed February 10, 2017.

15. Pizer SD, Davies ML, Prentice JC. Consult coordination affects patient experience. Am J Accountable Care. 2017;5(1):23-28.

16. Prentice JC, Pizer SD. Delayed access to health care and mortality. Health Serv Res. 2007;42(2):644-662. doi: 10.1111/j.1475-6773.2006.00626.x.

17. Prentice JC, Pizer SD. Waiting times and hospitalizations for ambulatory care sensitive conditions. Health Serv Outcomes Res Methodol. 2008;8(1):1-18. doi: 10.1007/s10742-007-0024-5.
