Comparison of the Use of Top-Ranked Cancer Hospitals Between Medicare Advantage and Traditional Medicare

The American Journal of Managed Care, October 2021, Volume 27, Issue 10

Medicare Advantage enrollees, particularly those with lower out-of-network benefits, may have restricted access to top-ranked hospitals for complex cancer care compared with traditional Medicare enrollees.

ABSTRACT

Objectives: To compare the use of top-ranked cancer hospitals for complex cancer surgery between Medicare Advantage (MA) and traditional Medicare fee-for-service (FFS) enrollees.

Study Design: Cross-sectional analysis of Medicare claims and enrollment data.

Methods: The study used Medicare Provider Analysis and Review files to compare differences in use of top-ranked cancer hospitals for complex cancer surgery (lobectomy, esophagectomy, gastrectomy, colectomy, and the Whipple procedure [pancreaticoduodenectomy]) between MA and FFS enrollees 65 years and older who underwent the surgery in 2015 to 2017.

Results: After adjusting for demographic characteristics and county fixed effects, MA enrollees were less likely to use top-ranked cancer hospitals than FFS enrollees by 6.0 percentage points (95% CI, 4.7-7.2) overall; the difference varied from 3.5 percentage points (95% CI, 2.5-4.6) for colectomy to 14.3 percentage points (95% CI, 10.9-17.8) for the Whipple procedure. The difference in cancer surgery rate at a top-ranked cancer hospital between MA and FFS enrollees was larger for MA plans without out-of-network (OON) benefits (–7.5 percentage points; 95% CI, –9.1 to –5.9) than for MA plans with OON benefits (–2.3 percentage points; 95% CI, –2.9 to –1.7).

Conclusions: MA enrollees were less likely to use top-ranked cancer hospitals for complex cancer surgery than FFS enrollees. This difference was larger for MA plans with more restrictive OON policies. These findings suggest that MA enrollees, particularly those with lower OON benefits, may have restricted access to top-ranked hospitals for cancer care compared with FFS enrollees.

Am J Manag Care. 2021;27(10):e355-e360. https://doi.org/10.37765/ajmc.2021.88766

_____

Takeaway Points

Medicare Advantage (MA) plans often limit provider networks as a cost-containment strategy. These networks may limit enrollees’ access to higher-quality providers, particularly for high-cost and complex cancer care.

  • Analysis of Medicare Provider Analysis and Review files from 2015 to 2017 indicated that MA enrollees were less likely than fee-for-service enrollees to use top-ranked cancer hospitals for complex cancer surgery.
  • The difference was larger for MA plans with more restrictive out-of-network policies.
  • It is important for MA enrollees to understand the consequences of choosing plans that restrict the network of care and access to high-quality providers, particularly for complex cancer care.

_____

The Medicare Advantage (MA) program has been rapidly expanding over the past 16 years, with an increase in enrollment from 5.3 million in 2004 to 24.1 million in 2020, which accounts for 39% of all Medicare beneficiaries.1 In the MA program, private plans receive capitated payments to cover their enrollees’ care.2,3 Whereas traditional fee-for-service (FFS) Medicare does not typically limit the choice of provider, MA plans are able to create selective provider networks as a strategy to improve care quality and to contain costs.4 Although in some cases these networks may steer enrollees away from lower-quality care,5 there is concern that these networks may limit enrollees’ access to higher-quality providers.6-8 Prior work has found that the average MA plan includes just more than half of the hospitals in its county, and 41% of MA plans with a National Cancer Institute–designated cancer center in their county exclude the center from their network.9

Narrow networks may be of particular concern to enrollees with complex and highly specialized care needs, such as those undergoing cancer surgery.10 Outcomes for patients with cancer undergoing complex surgical procedures vary greatly across hospitals11-14 and are often strongly associated with hospitals’ surgical volume,15-20 particularly for high-risk and technically difficult surgeries such as esophagectomy21,22 and the Whipple procedure (pancreaticoduodenectomy).23 Patients with cancer have expressed strong demand for comparative information on hospital quality and surgical volume, with nearly three-fourths of patients who underwent cancer surgery reporting being very or somewhat likely to use a list of the best hospitals for cancer surgery to select the site of their procedure.24 However, patients’ choices of the site of complex cancer surgery are constrained by the hospitals available in their insurance plan’s network. If hospitals with better reputations charge higher fees for their services, it may be costly for MA plans to include them in their networks.25 More than half of surveyed patients cited an insurance restriction (eg, a limited network) as the main barrier to using a higher-quality cancer hospital for complex cancer surgery.26

It is currently unknown to what extent MA enrollees have access to high-quality hospitals for high-cost and complex cancer care. In this study, we compare the use of top-ranked cancer hospitals (ie, top 50 cancer hospitals as reported in US News & World Report) between MA and FFS enrollees for complex cancer surgery. We focused on the US News & World Report ranking given its prominence and the limited number of other publicly available reports on the quality of cancer care in US hospitals.27-29

METHODS

Study Design

This cross-sectional study compared the proportion of MA enrollees who underwent complex cancer surgery at a top-ranked cancer hospital with that of FFS enrollees living in the same county. We then conducted these MA-FFS comparisons separately for MA plans with and without out-of-network (OON) benefits, which allow enrollees to receive care from hospitals outside their plan’s network. The study was approved by the Brown University Institutional Review Board.

Data Sources and Study Population

The primary data sources were the 100% CMS Medicare Provider Analysis and Review (MedPAR) file, the Master Beneficiary Summary File (MBSF), and Medicare Advantage Plan Benefit Package (PBP) data. We used the MedPAR file to identify Medicare beneficiaries who underwent complex cancer surgery in the period from January 1, 2015, to December 31, 2017. The MedPAR file contains hospital inpatient claims at the stay level for both MA and FFS enrollees. For MA enrollees, the MedPAR file includes claims for the enrollees who were admitted to hospitals that receive disproportionate share hospital payments or graduate medical education payments. A recent study found that hospitals that submit MedPAR data for their MA patients accounted for 92% of Medicare discharges.30 Indeed, our sample of 2154 hospitals that reported cancer surgery for MA enrollees in the 2015 MedPAR file accounts for 98% of all MA admissions for cancer surgery in the 2015 MA encounter data that contain all hospital admissions of MA enrollees. The MBSF was used to identify each Medicare beneficiary’s MA or FFS status at the time of admission to a hospital. The MBSF contains beneficiaries’ demographic characteristics such as age, sex, race/ethnicity, dual eligibility for Medicaid, and zip code. The MA PBP data provide detailed information on each MA plan’s characteristics and benefits, including an OON benefit that allows enrollees to receive care from providers out of the plan’s network, and the plan’s monthly premium. We linked information on OON benefits and monthly premiums at the contract level to MA contract identifiers for each enrollee in the MBSF.

The study population included 181,406 Medicare beneficiaries (56,117 MA enrollees and 125,289 FFS enrollees) 65 years and older who underwent complex cancer surgery at 3383 hospitals between 2015 and 2017.

Measures

The primary outcome was receipt of a complex cancer surgery (lobectomy, colectomy, gastrectomy, esophagectomy, or pancreaticoduodenectomy [Whipple procedure]) at a top-ranked cancer hospital (ie, a top-50 hospital as reported in the 2015 US News & World Report). The rankings are based on a composite measure including hospital reputation, patient volume, patient safety and mortality, and nursing staffing.29,31 International Classification of Diseases, Ninth Revision and Tenth Revision codes were used to identify the 5 surgical procedures (eAppendix Table 1 [eAppendix available at ajmc.com]).27 These surgical procedures are common, high-risk, complex cancer surgeries for which the quality of performing hospitals significantly affects patient outcomes (ie, morbidity and mortality).20,32-34 For beneficiaries who underwent multiple complex cancer surgical procedures in a given calendar year, we used the first procedure.

The primary explanatory variable of interest was an indicator for enrollment in an MA plan at the time of hospital admission for undergoing complex cancer surgery. Covariates included age, sex, race/ethnicity (non-Hispanic White, non-Hispanic Black, Hispanic, Asian/Pacific Islander, American Indian, other), dual eligibility for Medicaid, and the reason for Medicare entitlement. The Elixhauser comorbidity measures were used to identify enrollees’ comorbid conditions.35 Distance from an enrollee’s residence to the nearest top-ranked cancer hospital was calculated based on geodetic distances—the length of the shortest curve between 2 points along the surface of the earth—from patient zip code centroid to hospital zip code centroid.36
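The distance covariate can be approximated with a great-circle (haversine) calculation between zip code centroids. The sketch below is illustrative only: the coordinates and helper names are assumptions, and the cited geodetic method36 accounts for the Earth’s ellipsoidal shape, so its results differ slightly from a spherical approximation.

```python
from math import radians, sin, cos, asin, sqrt

def greatcircle_km(lat1, lon1, lat2, lon2):
    """Haversine approximation to geodetic distance, in kilometers,
    treating the Earth as a sphere of radius 6371 km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def distance_to_nearest_km(patient_centroid, hospital_centroids):
    """Distance from a patient's zip centroid to the nearest hospital centroid,
    mirroring the 'nearest top-ranked cancer hospital' covariate."""
    return min(greatcircle_km(*patient_centroid, *h) for h in hospital_centroids)
```

At the equator, one degree of longitude is about 111.2 km, which provides a quick sanity check on the formula.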

Statistical Analysis

We used multivariable linear regression models in which the outcome was a binary indicator for whether a Medicare beneficiary underwent complex surgery at a top-ranked cancer hospital from 2015 to 2017. All models were estimated for all 5 procedures combined (any 1 of the 5 procedures undertaken) and separately for each procedure. We stratified all analyses by MA plan type: plans with and without OON benefits that allow enrollees to receive care from OON hospitals. We adjusted for patients’ demographic characteristics (age, sex, race/ethnicity, and dual eligibility for Medicare and Medicaid) and comorbid conditions, with fixed effects for year and quarter of hospital admission and county of residence. To better account for geographic factors that might affect differential use of top-ranked cancer hospitals, we also adjusted for distance from the patient’s residence to the nearest top-ranked cancer hospital. Heteroskedasticity-robust standard errors were clustered at the county level to allow for unrestricted correlation within county.
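Adjusting for county fixed effects is equivalent to demeaning the outcome and covariates within each county before running ordinary least squares, so the MA coefficient is identified only by within-county MA-FFS contrasts. A minimal sketch of that within-county estimator follows, with entirely illustrative data (a hypothetical MA effect of −6 percentage points); the study’s actual models also include patient covariates, time fixed effects, and county-clustered standard errors, which are omitted here.

```python
def within_county_slope(y, x, county):
    """OLS slope of y on x after demeaning both within county,
    numerically equivalent to OLS with county fixed effects."""
    yd, xd = [], []
    for c in set(county):
        ys = [yi for yi, ci in zip(y, county) if ci == c]
        xs = [xi for xi, ci in zip(x, county) if ci == c]
        ym, xm = sum(ys) / len(ys), sum(xs) / len(xs)
        yd += [yi - ym for yi in ys]
        xd += [xi - xm for xi in xs]
    return sum(a * b for a, b in zip(xd, yd)) / sum(a * a for a in xd)

# Illustrative data: two counties with different baseline rates of
# top-ranked-hospital use and an assumed MA effect of -0.06.
county = [0, 0, 0, 0, 1, 1, 1, 1]
ma = [1, 1, 0, 0, 1, 0, 0, 0]
base = {0: 0.30, 1: 0.50}
y = [base[c] - 0.06 * m for c, m in zip(county, ma)]
```

In this constructed example, `within_county_slope(y, ma, county)` recovers −0.06 regardless of the county baselines, whereas a pooled regression without fixed effects would be confounded by cross-county differences in baseline use.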

Sensitivity Analysis

In sensitivity analyses, we conducted the analyses for each top-ranked cancer hospital, comparing differences between MA and FFS enrollees in the use of that specific hospital relative to non–top-ranked hospitals located in the same hospital referral region (HRR) as the top-ranked cancer hospital. In addition, we restricted inclusion to enrollees residing in counties with a high MA penetration rate (greater than the median of county-level MA penetration rates in 2015, 20.2%). We also included HRR and patient zip code fixed effects instead of county fixed effects. Further, we applied a logit specification to the main regression model to account for the binary outcome: whether a top-ranked cancer hospital was used for cancer surgery. In addition, we conducted the analyses using the 2015 MA encounter data and separately using the 2015 MedPAR data to ensure that the results were not driven by our use of MedPAR data for MA enrollees. We also estimated cancer surgery rates at top-ranked cancer hospitals among MA enrollees by their plans’ monthly premium to determine whether enrollees in high-premium plans (with premiums in the fourth quartile), which are more likely to provide OON benefits, used top-ranked cancer hospitals more than those in low-premium plans (with premiums in the first quartile). Finally, we plotted time trends of cancer surgery rates at top-ranked cancer hospitals for MA and FFS enrollees from 2011 to 2017 to determine whether MA-FFS differences in the use of these hospitals changed over time.

RESULTS

The study population of 181,406 Medicare beneficiaries included 56,117 (31%) MA enrollees (mean [SD] age, 75.3 [6.8] years; 52.1% women; 73.8% non-Hispanic White; 16.1% dually eligible for Medicaid) and 125,289 (69%) FFS enrollees (mean [SD] age, 75.7 [7.2] years; 52.2% women; 84.2% non-Hispanic White; 12.4% dually eligible for Medicaid) 65 years and older who underwent complex cancer surgery between January 1, 2015, and December 31, 2017 (Table 1). Of the 56,117 MA enrollees, 40,197 (72%) were enrolled in MA plans without OON benefits (mostly health maintenance organization plans). MA enrollees in plans with OON benefits (mostly preferred provider organization plans) were more similar to FFS enrollees than were MA enrollees in plans without OON benefit. Colectomy was the most frequently undertaken procedure (60.6% of MA enrollees; 59.9% of FFS enrollees) among the 5 procedures, followed by lobectomy (25.4% of MA enrollees; 25.9% of FFS enrollees).

Table 2 presents unadjusted differences in the percentage of enrollees who underwent cancer surgery at top-ranked cancer hospitals between MA and FFS enrollees, and the adjusted differences controlling for patient characteristics with county fixed effects. MA enrollees were less likely to use top-ranked cancer hospitals for complex cancer surgery than FFS enrollees by 6.0 percentage points (95% CI, 4.7-7.2) overall. This difference varied by procedure, from 3.5 percentage points (95% CI, 2.5-4.6) for colectomy to 14.3 percentage points (95% CI, 10.9-17.8) for the Whipple procedure. The difference in cancer surgery rate at a top-50 cancer hospital between MA and FFS enrollees was larger for MA plans without OON benefits (–7.5 percentage points; 95% CI, –9.1 to –5.9) than for MA plans with OON benefits (–2.3 percentage points; 95% CI, –2.9 to –1.7).

eAppendix Figure 1 plots adjusted differences in surgery rate between MA and FFS enrollees at each top-ranked cancer hospital relative to non–top-ranked cancer hospitals located in the same HRR as the top-ranked cancer hospital. The figure shows that the difference in surgery rate at a top-ranked cancer hospital between MA and FFS enrollees was larger for higher-ranked cancer hospitals (1st to 25th) and smaller for lower-ranked hospitals (26th to 50th), with a differential of 9.6 percentage points (P = .007). Consistent with the results shown in Table 2, most of these differences were driven by MA plans without OON benefits (eAppendix Figure 2).

In the Figure, we present the cancer surgery rates at top-ranked cancer hospitals by distance from the enrollee’s residence to the nearest top-ranked cancer hospital. Overall, enrollees who lived closer to a top-ranked cancer hospital were more likely to use top-ranked cancer hospitals. More importantly, among those who lived at the same distance from a top-ranked cancer hospital, MA enrollees in plans without OON benefits were least likely to use top-ranked cancer hospitals, followed by those in MA plans with OON benefits. At all distances from the nearest top-ranked cancer hospital, FFS enrollees were more likely than MA enrollees with and without OON benefits to use top-ranked cancer hospitals (P < .001 for comparisons). In eAppendix Figure 3, we show the same pattern for each of the 5 procedures, with larger differences for the Whipple procedure and esophagectomy and a smaller difference for colectomy. We also estimated cancer surgery rates at top-ranked cancer hospitals among MA enrollees by their plans’ monthly premium (eAppendix Figure 4). The proportion of MA enrollees in plans in the lowest quartile of monthly premiums, which were mostly MA plans without OON benefits, who received surgery in a top-ranked cancer hospital was lower than that of enrollees in plans in the highest quartile of monthly premiums (4.7% vs 9.3%; P < .001).

Our findings were consistent in analyses that restricted inclusion to enrollees residing in counties with a high MA penetration rate (eAppendix Table 2), that included HRR and zip code fixed effects (eAppendix Table 3), and that modeled use of a top-ranked cancer hospital with a logit rather than a linear probability model (eAppendix Table 4). The analysis based on the 2015 MA encounter data that contain all hospital claims of MA enrollees yielded virtually the same results as those based on the 2015 MedPAR data (eAppendix Table 5). In eAppendix Figure 5, we plotted the time trends of quarterly surgery rates at top-ranked cancer hospitals for MA and FFS enrollees from 2011 to 2017. The figure shows that the differences in the surgery rates have been persistent over time.

DISCUSSION

MA enrollees were substantially less likely than FFS enrollees to use a top-ranked cancer hospital for 5 cancer-related surgical procedures (ie, common, high-risk complex cancer surgeries for which the quality of performing hospitals significantly affects patient outcomes).20,32-34 The largest difference was for the Whipple procedure and the smallest difference was for colectomy. These differences persisted when accounting for enrollees’ area of residence and their proximity to the nearest top-ranked cancer hospital. The difference was smaller for MA enrollees in plans that permit OON hospital coverage and for those enrolled in higher-premium plans.

Although we cannot determine if the differences in the use of top-ranked cancer hospitals are due to differences in MA hospital networks or differences in patient preferences, the difference between plans that allow and do not allow OON hospital use is indicative of the former. MA enrollees in plans with higher premiums were more likely to use top-ranked hospitals. This could indicate that enrollees may be willing to pay a higher premium to have access to a higher-quality network; however, enrollees in more expensive plans may also have greater financial means to seek care at a top-ranked hospital or differ in other unobserved ways from those in plans with lower premiums.

There are several possible explanations for these findings. MA plans have been found to pay lower rates to hospitals than FFS.25 If the top-ranked cancer hospitals charge higher rates than other hospitals, MA plans may avoid them. Ten of the top-ranked hospitals are prospective payment system–exempt facilities, which have been found to charge higher rates.37 MA plans may therefore face incentives to exclude these facilities from their networks. Our findings align with those of prior studies that have reported that, compared with FFS enrollees, MA enrollees are less likely to receive care from higher-quality nursing homes and home health providers.7,8

Plans with narrow networks typically contract with low-cost providers,38 which may include low-quality providers.10 A previous study showed that narrower provider networks are more likely to exclude high-quality cancer care providers in the individual health insurance exchange market.39 Although in this study we are unable to assess differences in treatment outcomes between MA and FFS enrollees as a result of differential use of top-ranked cancer hospitals, to the extent that top-ranked hospitals provide better outcomes than other hospitals,27 this may have negative consequences for MA enrollees. Our finding of larger differences in the use of top-ranked cancer hospitals between MA and FFS enrollees for esophagectomy and the Whipple procedure, for which the quality differential is larger than for colectomy and lobectomy,15,20 also suggests likely negative consequences for MA enrollees.

It is noteworthy that our findings apply to patients with cancer who underwent complex cancer surgery. In other settings, narrow-network MA plans may enhance quality of care while reducing costs by selectively contracting with high-quality and cost-effective providers. Further research is needed to determine the impact of narrow networks in other clinical contexts.

Limitations

This study has several limitations. First, there is some criticism of hospital ratings as calculated by US News & World Report.40,41 For instance, the rankings rely largely on a subjective measure of hospital reputation based on a survey of specialists, although objective measures such as patient volume and mortality are also scored in the rankings.31,42 Also, the rankings do not completely align with other quality indicators, such as Commission on Cancer (CoC) accreditation: some non–top-ranked hospitals have earned CoC accreditation. Despite these concerns, there is evidence that top-ranked hospitals do perform better than those that are unranked,27,43-46 and the variation in subjective reputation in the rankings is explained by objective quality measures among cancer hospitals.29 Our data showed that the annual volume of the 5 cancer surgeries, as a proxy for quality, was 178 in top-ranked cancer hospitals, compared with 22 in other cancer hospitals, during the study period. Second, detailed hospital network data for Medicare Advantage plans are not readily available, so we cannot determine with certainty whether the differences we detect are due to differences in networks or in enrollee preferences. Of note, our distance analyses suggest that the differences in the use of top-ranked hospitals are not explained by enrollees’ residential proximity to these facilities. Third, we could not account for each patient’s stage of cancer at diagnosis because the study data (MedPAR) do not contain this information. However, large differences in clinical characteristics between MA and FFS patients are unlikely given our focus on patients who underwent 1 of 5 specific cancer surgeries. Fourth, this cross-sectional study may be susceptible to bias arising from unobserved factors that affect enrollees’ choices of hospital for complex cancer surgery.

CONCLUSIONS

MA enrollees are significantly less likely than FFS enrollees to use top-ranked cancer hospitals for complex cancer surgery. This difference is larger for MA plans that do not allow OON hospital use. As the MA program continues to grow, it is important for enrollees to understand the consequences of choosing plans that restrict the network of care and access to high-quality providers, particularly for complex cancer care. CMS could consider requiring MA plans to provide more transparent and detailed information on the breadth and quality of provider networks to facilitate beneficiaries’ informed decisions when they choose between MA and FFS enrollment or across MA plans.

Author Affiliations: Department of Health Services, Policy, and Practice, Brown University (DK, DJM, MR, ANT), Providence, RI; Providence VA Medical Center (ANT), Providence, RI.

Source of Funding: The study was supported by the National Institute on Aging of the National Institutes of Health (P01AG027296).

Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (DK, DJM, MR, ANT); acquisition of data (ANT); analysis and interpretation of data (DK, MR, ANT); drafting of the manuscript (DK, DJM, MR, ANT); and critical revision of the manuscript for important intellectual content (DK, DJM, MR).

Address Correspondence to: Daeho Kim, PhD, Department of Health Services, Policy, and Practice, Brown University, 121 S Main St, Providence, RI 02903. Email: Daeho_Kim@brown.edu.

REFERENCES

1. Freed M, Damico A, Neuman T. A dozen facts about Medicare Advantage in 2020. Kaiser Family Foundation. January 13, 2021. Accessed February 3, 2021. https://www.kff.org/medicare/issue-brief/a-dozen-facts-about-medicare-advantage-in-2020/

2. Neuman P, Jacobson GA. Medicare Advantage checkup. N Engl J Med. 2018;379(22):2163-2172. doi:10.1056/NEJMhpr1804089

3. Medicare Managed Care Manual: Chapter 8 – payments to Medicare Advantage organizations. CMS. September 19, 2014. Accessed August 1, 2019. https://www.cms.gov/Regulations-and-Guidance/Guidance/Manuals/Downloads/mc86c08.pdf

4. Meyers DJ, Rahman M, Trivedi AN. Narrow primary care networks in Medicare Advantage. J Gen Intern Med. Published online January 19, 2021. doi:10.1007/s11606-020-06534-2

5. Baker LC, Bundorf MK, Kessler DP. The effects of Medicare Advantage on opioid use. J Health Econ. 2020;70:102278. doi:10.1016/j.jhealeco.2019.102278

6. Meyers DJ, Trivedi AN, Mor V, Rahman M. Comparison of the quality of hospitals that admit Medicare Advantage patients vs traditional Medicare patients. JAMA Netw Open. 2020;3(1):e1919310. doi:10.1001/jamanetworkopen.2019.19310

7. Meyers DJ, Mor V, Rahman M. Medicare Advantage enrollees more likely to enter lower-quality nursing homes compared to fee-for-service enrollees. Health Aff (Millwood). 2018;37(1):78-85. doi:10.1377/hlthaff.2017.0714

8. Schwartz ML, Kosar CM, Mroz TM, Kumar A, Rahman M. Quality of home health agencies serving traditional Medicare vs Medicare Advantage beneficiaries. JAMA Netw Open. 2019;2(9):e1910622.

9. Jacobson G, Trilling A, Neuman T, Damico A, Gold M. Medicare Advantage hospital networks: how much do they vary? Kaiser Family Foundation. June 20, 2016. Accessed January 22, 2020. https://www.kff.org/medicare/report/medicare-advantage-hospital-networks-how-much-do-they-vary/

10. Schleicher SM, Mullangi S, Feeley TW. Effects of narrow networks on access to high-quality cancer care. JAMA Oncol. 2016;2(4):427-428. doi:10.1001/jamaoncol.2015.6125

11. Fox JP, Tyler JA, Vashi AA, Hsia RY, Saxe JM. A variation in the value of colectomy for cancer across hospitals: mortality, readmissions, and costs. Surgery. 2014;156(4):849-856, 860. doi:10.1016/j.surg.2014.06.011

12. Spencer BA, Miller DC, Litwin MS, et al. Variations in quality of care for men with early-stage prostate cancer. J Clin Oncol. 2008;26(22):3735-3742. doi:10.1200/JCO.2007.13.2555

13. Brooks GA, Li L, Sharma DB, et al. Regional variation in spending and survival for older adults with advanced cancer. J Natl Cancer Inst. 2013;105(9):634-642. doi:10.1093/jnci/djt025

14. Brooks GA, Li L, Uno H, Hassett MJ, Landon BE, Schrag D. Acute hospital care is the chief driver of regional spending variation in Medicare patients with advanced cancer. Health Aff (Millwood). 2014;33(10):1793-1800. doi:10.1377/hlthaff.2014.0280

15. Begg CB, Cramer LD, Hoskins WJ, Brennan MF. Impact of hospital volume on operative mortality for major cancer surgery. JAMA. 1998;280(20):1747-1751. doi:10.1001/jama.280.20.1747

16. Birkmeyer JD, Siewers AE, Finlayson EVA, et al. Hospital volume and surgical mortality in the United States. N Engl J Med. 2002;346(15):1128-1137. doi:10.1056/NEJMsa012337

17. Reames BN, Ghaferi AA, Birkmeyer JD, Dimick JB. Hospital volume and operative mortality in the modern era. Ann Surg. 2014;260(2):244-251. doi:10.1097/SLA.0000000000000375

18. Auerbach AD, Maselli J, Carter J, Pekow PS, Lindenauer PK. The relationship between case volume, care quality, and outcomes of complex cancer surgery. J Am Coll Surg. 2010;211(5):601-608. doi:10.1016/j.jamcollsurg.2010.07.006

19. Bach PB, Cramer LD, Schrag D, Downey RJ, Gelfand SE, Begg CB. The influence of hospital volume on survival after resection for lung cancer. N Engl J Med. 2001;345(3):181-188. doi:10.1056/NEJM200107193450306

20. Finks JF, Osborne NH, Birkmeyer JD. Trends in hospital volume and operative mortality for high-risk surgery. N Engl J Med. 2011;364(22):2128-2137. doi:10.1056/NEJMsa1010705

21. van Lanschot JJ, Hulscher JB, Buskens CJ, Tilanus HW, ten Kate FJ, Obertop H. Hospital volume and hospital mortality for esophagectomy. Cancer. 2001;91(8):1574-1578. doi:10.1002/1097-0142(20010415)91:8<1574::aid-cncr1168>3.0.co;2-2

22. Swisher SG, Deford L, Merriman KW, et al. Effect of operative volume on morbidity, mortality, and hospital use after esophagectomy for cancer. J Thorac Cardiovasc Surg. 2000;119(6):1126-1132. doi:10.1067/mtc.2000.105644

23. Birkmeyer JD, Finlayson SR, Tosteson AN, Sharp SM, Warshaw AL, Fisher ES. Effect of hospital volume on in-hospital mortality with pancreaticoduodenectomy. Surgery. 1999;125(3):250-256.

24. Yang A, Chimonas S, Bach PB, Taylor DJ, Lipitz-Snyderman A. Critical choices: what information do patients want when selecting a hospital for cancer surgery? J Oncol Pract. 2018;14(8):e505-e512. doi:10.1200/JOP.17.00031

25. Baker LC, Bundorf MK, Devlin AM, Kessler DP. Medicare Advantage plans pay hospitals less than traditional Medicare pays. Health Aff (Millwood). 2016;35(8):1444-1451. doi:10.1377/hlthaff.2015.1553

26. Resio BJ, Chiu AS, Hoag JR, et al. Motivators, barriers, and facilitators to traveling to the safest hospitals in the United States for complex cancer surgery. JAMA Netw Open. 2018;1(7):e184595. doi:10.1001/jamanetworkopen.2018.4595

27. Hoag JR, Resio BJ, Monsalve AF, et al. Differential safety between top-ranked cancer hospitals and their affiliates for complex cancer surgery. JAMA Netw Open. 2019;2(4):e191912. doi:10.1001/jamanetworkopen.2019.1912

28. Pope DG. Reacting to rankings: evidence from “America’s Best Hospitals.” J Health Econ. 2009;28(6):1154-1165. doi:10.1016/j.jhealeco.2009.08.006

29. Prasad V, Goldstein JA. US News and World Report cancer hospital rankings: do they reflect measures of research productivity? PLoS One. 2014;9(9):e107803. doi:10.1371/journal.pone.0107803

30. Huckfeldt PJ, Escarce JJ, Rabideau B, Karaca-Mandic P, Sood N. Less intense postacute care, better outcomes for enrollees in Medicare Advantage than those in fee-for-service. Health Aff (Millwood). 2017;36(1):91-100. doi:10.1377/hlthaff.2016.1027

31. Sehgal AR. The role of reputation in U.S. News & World Report’s rankings of the top 50 American hospitals. Ann Intern Med. 2010;152(8):521-525. doi:10.7326/0003-4819-152-8-201004200-00009

32. Sheetz KH, Dimick JB, Nathan H. Centralization of high-risk cancer surgery within existing hospital systems. J Clin Oncol. 2019;37(34):3234-3242. doi:10.1200/JCO.18.02035

33. Sheetz KH, Massarweh NN. Centralization of high-risk surgery in the US: feasible solution or more trouble than it is worth? JAMA. 2020;324(4):339-340. doi:10.1001/jama.2020.2953

34. Sheetz KH, Chhabra KR, Smith ME, Dimick JB, Nathan H. Association of discretionary hospital volume standards for high-risk cancer surgery with patient outcomes and access, 2005-2016. JAMA Surg. 2019;154(11):1005-1012. doi:10.1001/jamasurg.2019.3017

35. Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27. doi:10.1097/00005650-199801000-00004

36. Banerjee S. On geodetic distance computations in spatial modeling. Biometrics. 2005;61(2):617-625. doi:10.1111/j.1541-0420.2005.00320.x

37. Payment methods for certain cancer hospitals should be revised to promote efficiency. Government Accountability Office. March 23, 2015. Accessed January 22, 2020. https://www.gao.gov/products/gao-15-199

38. Atwood A, Lo Sasso AT. The effect of narrow provider networks on health care use. J Health Econ. 2016;50:86-98. doi:10.1016/j.jhealeco.2016.09.007

39. Yasaitis L, Bekelman JE, Polsky D. Relation between narrow networks and providers of cancer care. J Clin Oncol. 2017;35(27):3131-3135. doi:10.1200/JCO.2017.73.2040

40. Lascano D, Finkelstein JB, Barlow LJ, et al. The correlation of media ranking’s “best” hospitals and surgical outcomes following radical cystectomy for urothelial cancer. Urology. 2015;86(6):1104-1112. doi:10.1016/j.urology.2015.07.049

41. Rothberg MB, Morsi E, Benjamin EM, Pekow PS, Lindenauer PK. Choosing the best hospital: the limitations of public quality reporting. Health Aff (Millwood). 2008;27(6):1680-1687. doi:10.1377/hlthaff.27.6.1680

42. Green J, Wintfeld N, Krasner M, Wells C. In search of America’s best hospitals: the promise and reality of quality assessment. JAMA. 1997;277(14):1152-1155. doi:10.1001/jama.1997.03540380066033

43. Wang DE, Wadhera RK, Bhatt DL. Association of rankings with cardiovascular outcomes at top-ranked hospitals vs nonranked hospitals in the United States. JAMA Cardiol. 2018;3(12):1222-1225. doi:10.1001/jamacardio.2018.3951

44. Lichtman JH, Leifheit EC, Wang Y, Goldstein LB. Hospital quality metrics: “America’s best hospitals” and outcomes after ischemic stroke. J Stroke Cerebrovasc Dis. 2019;28(2):430-434. doi:10.1016/j.jstrokecerebrovasdis.2018.10.022

45. Mehta R, Merath K, Farooq A, et al. U.S. News and World Report hospital ranking and surgical outcomes among patients undergoing surgery for cancer. J Surg Oncol. 2019;120(8):1327-1334. doi:10.1002/jso.25751

46. Chen J, Radford MJ, Wang Y, Marciniak TA, Krumholz HM. Do “America’s Best Hospitals” perform better for acute myocardial infarction? N Engl J Med. 1999;340(4):286-292. doi:10.1056/NEJM199901283400407