The Michigan Value Collaborative has created a claims-based algorithm that categorizes claims into episode components. This manuscript describes the validation of this algorithm.
Objectives: Although hospitals face increasing pressure from payers to improve the efficiency of healthcare delivery beyond the index hospitalization, they often lack information on postdischarge events. The Michigan Value Collaborative (MVC) developed a claims-based algorithm to provide hospitals with data on events that occur beyond the index hospitalization. Herein, we discuss the validation of MVC’s claims-based algorithm.
Study Design: Retrospective analysis of a claims-based algorithm’s ability to identify specific medical events, such as index hospitalizations, 30-day readmissions, emergency department visits, skilled nursing facility admissions, home health visits, and rehabilitation services. The claims-based events were validated via primary medical record review at 63 hospitals.
Methods: We selected 1830 Blue Cross Blue Shield of Michigan episodes from MVC data and asked 63 Michigan hospitals to query their medical records for the presence or absence of specific events. We then calculated agreement statistics and improved our algorithm using feedback from hospitals.
Results: All 63 hospitals participated in the validation process and successfully identified 99% of episodes in their medical records. The initial agreement between our algorithm and medical records was moderate for 4 postdischarge events (kappa ranging from 0.62 to 0.78) and poor for rehabilitation services (0.16). Many of the disagreements occurred because hospitals could not identify postdischarge events occurring outside of their hospital systems; others stemmed from hospital coding practices. Through this analysis, the claims-based algorithm was improved to better reflect real-world coding practice.
Conclusions: Our findings suggest that the MVC claims-based algorithm identifies and classifies claims with high fidelity and outperforms medical records in the identification of postdischarge events. These findings provide important insight to policy makers, payers, and hospital administrators about the value of claims-based data for the implementation of episode-based programs.
Am J Manag Care. 2017;23(11):e382-e386

Takeaway Points
The Michigan Value Collaborative (MVC)’s claims-based algorithm identifies postdischarge events with high fidelity, often outperforming medical records.
Hospitals are increasingly being held accountable for services and expenditures that occur beyond the hospitalization through episode-based performance measures.1-6 Postdischarge expenditures, such as postacute care and readmissions, have been cited as the fastest growing spending categories over the last 2 decades and have been the target of many national programs focused on reducing healthcare costs.3 For example, CMS recently implemented the Comprehensive Care for Joint Replacement bundled payment program, which will hold hospitals financially accountable for expenditures occurring from admission through 90 days post discharge.7 In addition, accountable care organizations were developed to reduce costs that occur both outside of and during hospitalizations.8 Based on the prevalence and growth of episode-based payment programs, it is evident that many payers believe the key to reducing healthcare expenditures is to hold hospitals responsible for efficiency along the entire patient care episode.
Despite enthusiasm for increased episode efficiency, identifying specific high-cost events, such as readmissions, can be challenging for hospitals for several reasons. First, it is difficult for hospitals to track events outside of the initial hospitalization. In fact, hospitals are often not even aware of postdischarge events that occur at outside facilities. Second, unless directly affiliated with the hospital, postdischarge providers are not incentivized to report utilization patterns to hospitals. Furthermore, many small hospitals may not have internal resources to monitor and track postdischarge events and spending.
In Michigan, one response to these challenges was the development of the Michigan Value Collaborative (MVC).9 Established in 2012 and funded by Blue Cross Blue Shield of Michigan (BCBSM), MVC’s mission is to provide hospitals with episode-level data and promote high-quality care at the lowest reasonable costs.10 One particular area of interest has been postacute care, specifically rehabilitation. MVC hospitals have begun to use episode-level data to monitor rehabilitation expenditures, especially for conditions such as acute myocardial infarction (AMI) and hip replacement, where more than one-third of all patients are discharged to rehabilitation facilities (Table 1). However, a necessary step in this process is to ensure that MVC’s claims-based identification of postdischarge services is accurate.
In this context, we describe a large-scale medical record-based validation of the algorithm used by MVC to define clinical episodes of care in commercial claims. We believe that the MVC validation experience will provide useful insight to hospitals and payers about the advantages and limitations of using claims to track events that occur after hospitalization.
Methods
The MVC collects claims data for BCBSM beneficiaries admitted to 1 of 63 Michigan hospitals for 21 medical and surgical conditions. We consulted clinical experts, who defined the conditions using International Classification of Diseases, Ninth Revision (ICD-9) diagnosis/procedure and Current Procedural Terminology (CPT) codes. We then used variables such as revenue codes, diagnosis-related groups (DRGs), facility identification, and CPT codes to classify individual claims into 7 categories: inpatient, skilled nursing facility (SNF), home health, emergency department (ED), inpatient rehabilitation, outpatient rehabilitation, and general outpatient. Cases with readmissions to hospitals other than the index facility were excluded from the readmissions validation step. eAppendix A (eAppendices available at ajmc.com) further details the entire attribution process. Table 1 presents the characteristics of episodes identified from claims data by condition type, including use of postdischarge services. This study was deemed exempt from review by the Michigan Institutional Review Board.
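As a concrete illustration, this classification step can be thought of as an ordered set of rules applied to each claim's fields. The sketch below is ours, not MVC's actual specification (eAppendix A): the field names, revenue-code prefixes, and rule order are hypothetical placeholders for how such an algorithm is commonly structured.

```python
# Illustrative rule-based claim classifier. All field names and code
# prefixes below are assumptions for the sketch, not MVC's actual lists.
CATEGORIES = ["inpatient", "snf", "home_health", "ed",
              "inpatient_rehab", "outpatient_rehab", "outpatient"]

def classify_claim(claim: dict) -> str:
    """Assign one claim to an episode component using revenue code,
    facility type, and DRG (hypothetical rules, evaluated in order)."""
    rev = claim.get("revenue_code", "")
    if rev.startswith("045"):             # 045x: emergency room (illustrative)
        return "ed"
    if claim.get("facility_type") == "SNF":
        return "snf"
    if rev.startswith("057"):             # 057x: home health (illustrative)
        return "home_health"
    if claim.get("drg") is not None:      # a DRG implies a facility inpatient claim
        return "inpatient_rehab" if claim.get("rehab_unit") else "inpatient"
    if rev.startswith("042"):             # 042x: physical therapy (illustrative)
        return "outpatient_rehab"
    return "outpatient"                   # everything else: general outpatient
```

In practice, the rule order matters: more specific signals (revenue code, facility type) are checked before broad ones (presence of a DRG), so that, for example, an ED claim at an inpatient facility is not swallowed by the inpatient category.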
Our validation process occurred in 2 phases (pilot and full validation). During the pilot, 6 hospitals were asked to review 10 to 20 BCBSM preferred provider organization (PPO) cases from between January 1, 2013, and October 31, 2014. For matching purposes, the clinical condition, national provider identifier, date of birth, gender, admission date, and discharge date for each patient were provided. Specifically, participants were instructed to indicate if the patient listed had records demonstrating that a specified event occurred within 90 days of their discharge date. These events included 30-day readmissions, ED visits, SNF admissions, home health visits, and rehabilitation services (inpatient and outpatient).
The lessons learned from the pilot were used to inform the full validation. Here, we distributed the same key variables to all 63 MVC participants. Each hospital was provided 30 cases to review, with the exception of hospitals that participated in the pilot (which were asked to review 20 cases). We selected conditions based on volume, prevalence of associated postdischarge services, and suggestions from clinical experts. The conditions included colectomy, coronary artery bypass graft, AMI, pneumonia, congestive heart failure, hip replacement, knee replacement, cesarean delivery, vaginal delivery, trauma, and spine surgery. In total, 1830 BCBSM PPO cases were selected for data validation from these 11 conditions.
We identified areas of agreement and disagreement between MVC’s claims-based algorithm and medical records. First, we looked for agreement that these episodes occurred and were attributed to the correct hospital and to the correct condition. Next, we used a kappa statistic to assess agreement for postdischarge services (eAppendix B). For each disagreement in a case, 2 members of the MVC team, a clinician and an analyst, reviewed the specific claims. After determining the cause of the discordance, we adjusted our algorithm to better capture hospital events and re-evaluated the level of agreement.
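For a binary event (present or absent in each data source), the kappa statistic reduces to simple arithmetic on the 2x2 table of claims-versus-record counts. A minimal sketch, with cell labels of our choosing (the actual computation is detailed in eAppendix B):

```python
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Cohen's kappa for a 2x2 agreement table on a binary event.

    a: claims yes, medical record yes
    b: claims yes, medical record no
    c: claims no,  medical record yes
    d: claims no,  medical record no
    """
    n = a + b + c + d
    p_observed = (a + d) / n                      # raw agreement rate
    # Agreement expected by chance, from the marginal totals.
    p_claims_yes = (a + b) / n
    p_record_yes = (a + c) / n
    p_expected = (p_claims_yes * p_record_yes
                  + (1 - p_claims_yes) * (1 - p_record_yes))
    # Kappa: agreement beyond chance, scaled to the maximum possible.
    return (p_observed - p_expected) / (1 - p_expected)
```

For example, a table with 40 yes/yes, 5 yes/no, 10 no/yes, and 45 no/no cases has 85% raw agreement but kappa of 0.70, because half of that agreement would be expected by chance alone.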
In the cases where MVC reported a postdischarge event that did not match a hospital’s medical records, we used confirmatory evidence from the claims to improve our confidence that the event occurred. Specifically, we examined the following items: 1) Did the postdischarge event have more than 2 claim line items with dates following the index admission?; 2) Did the place of service designation on the claims support the assignment of the claim to the postdischarge service?; and 3) Did the revenue center code on the claims support the assignment of the claim to the postdischarge service?
Results
One hundred percent of hospitals (n = 63) participated in the validation process. Hospitals matched 1812 of the 1830 (99%) MVC episodes to records in their medical charts.
The agreement for the occurrence of postdischarge services ranged from 0.16 to 0.78, with rehabilitation services having the poorest agreement. eAppendix B contains details of the validation process.
Readmissions
Using administrative claims, we identified 183 readmissions among 1812 episodes (10% readmission rate). Of these, there were 15 cases in which MVC observed a 30-day readmission that was not evident in the hospital data (Table 2). These were readmissions to a hospital different from the index facility and, per the Methods, were excluded. There were 24 discordant readmissions that hospitals identified but MVC did not; these cases were classified as observation unit stays in the MVC data (eAppendix B).
Emergency Department Visits
We identified 452 ED visits (25% of episodes). Of these visits, 292 (65%) were identified by hospitals. Importantly, there were 160 of 452 cases (35%) in which MVC observed an ED visit that was not reported in the hospital clinical data (Table 2). Of the 160 discordant cases, 122 (76%) occurred at a different facility than the index admission. All 160 cases had confirmatory evidence of a valid ED visit.
There were 85 episodes (5%) in which a hospital reported an ED visit that was not evident in the MVC data. Upon review, many of these were classified elsewhere. For example, 38 preceded an index admission or a readmission and the ED services are grouped with this hospitalization in the MVC episode (eAppendix B).
Skilled Nursing Facility Admissions
We identified 64 SNF admissions (4% of episodes). Of these, 43 (67%) admissions were identified by hospitals. There were 21 cases (33%) in which MVC identified a SNF admission that was not reported by hospitals (Table 2). We found evidence in the claims data to confirm that all 21 visits occurred. There were only 9 cases in which hospitals reported care in a SNF but we found no evidence of a SNF admission in the MVC claims data (eAppendix B).
Home Health Visits
We identified 949 home health visits (52% of episodes). Of these visits, 781 (82%) were identified by hospitals (Table 2). There were 168 cases (21%) in which MVC observed a home health visit that was not evident in the hospital clinical data. There were 99 cases (10% of episodes) in which a hospital observed a home health visit that was not evident in the MVC data (eAppendix B).
Rehabilitation Services
The algorithm identified 1223 rehabilitation visits (67% of episodes). Of these visits, 350 (29%) were reported by hospitals. There were 873 cases (71%) in which MVC observed a rehabilitation visit that was not evident in the hospital clinical data (Table 2). Of the 873 discordant cases, 851 (97%) had confirmatory evidence in the MVC claims for utilization of rehabilitation services. There were only 42 cases (3% of episodes) in which a hospital observed a rehabilitation visit that was not evident in the MVC data (eAppendix B).
Improvements to the Claims-Based Algorithm
After reviewing all cases of discordance between the claims-based algorithm and the medical records, we identified several areas for improvement, the most important being to update the ICD-9 codes used to identify postdischarge claims to better reflect the coding practices used by hospitals. For example, ICD-9 code V57.8 (care involving other specified rehabilitation procedure) was initially not considered by MVC as a related diagnosis code for patients after joint replacement. However, this code was used for a large number of rehabilitation services after joint replacement. After making this and other improvements, we re-evaluated the level of agreement between our algorithm and the medical records and found improvement in agreement for all services (eAppendix B).
Discussion
In this study, we validated the MVC’s claims-based algorithm for the identification and classification of postdischarge events. During the process, we found a high episode match rate between MVC data and medical records. Much of the disagreement was due to the inability of hospitals to identify readmissions, rehabilitation services, and other postdischarge events. Collectively, these findings suggest that this claims-based algorithm outperforms medical records in identifying postdischarge events.
Previous investigators have convincingly demonstrated that variation in episode spending is largely due to postdischarge events.3,11-16 Preventable readmissions and variations in postacute care could indicate areas to improve hospital efficiency and outcomes. Others have also validated and demonstrated on a health-system level the success of claims-based algorithms in identifying hospital events.17-19 Existing literature has demonstrated the effectiveness of utilizing these tools to identify high-cost inpatient events and improve the value of care.20 It is reasonable to believe that by using this same approach to identify postdischarge events, institutions could potentially achieve similar results outside the hospital setting. The current study findings demonstrate, on a large statewide scale, that accurately identifying and measuring postdischarge utilization may be difficult for hospitals using medical records alone. A claims-based algorithm could better identify postdischarge events, especially those that occur outside hospitals’ networks. With the current national focus on episode efficiency, identification of these events is imperative to driving high-value care.
Our study has several limitations. First, we only included BCBSM patients in our validation process. Due to CMS privacy restrictions, we were unable to validate our algorithm with Medicare beneficiaries. Second, the MVC algorithm may not be generalizable for other commercial payers, although BCBSM is the largest commercial payer in Michigan. Third, we did not have hospitals look for readmissions that occurred outside of the hospital where the index event occurred. This decision was primarily made to reduce the chart review burden to hospitals; we received early feedback that hospitals could not identify these particular events. Finally, we did not validate our classification algorithm for other postdischarge events (eg, outpatient procedures) and intensity of services (eg, SNF length of stay). However, this study was focused on validating the occurrence of major postdischarge services. Although we did not end with perfect agreement between the MVC data and the medical records, there were only a few events identified by hospitals not seen in MVC claims (0.5%-4%).
Our findings will help stakeholders understand the opportunities and challenges of using a claims-based algorithm to measure episode spending. Relevant to hospital administrators, the finding that the claims-based algorithm used in this study outperformed medical records suggests that such data provide more complete intelligence about the postdischarge period. This finding should encourage hospital administrators to obtain additional claims data by participating in a statewide, regional, or health-system collaboration and by asking payers to share these data. Without these claims data, hospitals will be limited in their ability to measure and optimize services provided outside of their facilities. This is particularly important as CMS and commercial payers are increasingly using episode-based performance measurement and payment bundling.
Moving forward, research in this area should focus on how these data can be refined to provide more granular information to hospitals. For instance, providing hospitals with data on the average length of stay and intensity of services provided at SNFs may help providers understand the efficiency of facilities where patients are sent after discharge. Ultimately, the value of episode-based performance measurement and bundled payment programs as mechanisms to drive high-value care will strongly depend on the accurate measurement of episode-level payments and utilization.
The authors thank Dr Vinita Bahl (University of Michigan), Kelly Rice (Beaumont Health System), Steve Lewis (St. Joseph Mercy Health System), Meghan Coughlin (MidMichigan Medical Center), and John Robertson (Hillsdale Community Health Center) for their comments and insight during the course of this research. They would also like to thank Dr David Share, Ellen Ward, Tom Leyden, and the Value Partnerships at BCBSM for their ongoing support of the Michigan Value Collaborative and this manuscript.

Author Affiliations: Institute for Healthcare Policy and Innovation (CE, DCM, JMD), Dow Division of Health Services Research, Department of Urology (CE, JDS, DCM, JMD), and Michigan Value Collaborative (CE, JDS, BV, VG, DCM, JMD), University of Michigan, Ann Arbor, MI.
Source of Funding: This research was supported by the Agency for Healthcare Research and Quality (1F32HS024193-01 to Dr Ellimoottil). Dr Miller receives salary support from Blue Cross Blue Shield of Michigan for his role as the director of the Michigan Urological Surgery Improvement Collaborative and the Michigan Value Collaborative. Dr Dupree receives salary support from Blue Cross Blue Shield of Michigan for his role as the co-director of the Michigan Value Collaborative and his involvement in the Michigan Urological Surgery Improvement Collaborative. Mr Voit receives salary support from the Michigan Value Collaborative as an account manager and is an employee of ArborMetrix.
Author Disclosures: The authors are employed by University of Michigan, which has a contract from Blue Cross Blue Shield of Michigan to operate the Michigan Value Collaborative.
Authorship Information: Concept and design (CE, JDS, DCM, JMD); acquisition of data (CE, JDS, BV, DCM); analysis and interpretation of data (CE, JDS, BV, VG, DCM, JMD); drafting of the manuscript (CE, JDS, BV, VG, DCM); critical revision of the manuscript for important intellectual content (CE, JDS, BV, VG, DCM, JMD); statistical analysis (JDS, VG); provision of patients or study materials (JMD); obtaining funding (JMD); administrative, technical, or logistic support (CE, JDS, VG, DCM, JMD); and supervision (JDS, JMD).
Address Correspondence to: Chad Ellimoottil, MD, MS, University of Michigan, 2800 Plymouth Rd, Bldg 16, 1st Fl, Room 100S, Ann Arbor, MI 48109-2800. E-mail: email@example.com.

REFERENCES
1. Chen LM, Meara E, Birkmeyer JD. Medicare’s Bundled Payments for Care Improvement initiative: expanding enrollment suggests potential for large impact. Am J Manag Care. 2015;21(11):814-820.
2. Newcomer LN, Gould B, Page RD, Donelan SA, Perkins M. Changing physician incentives for affordable, quality cancer care: results of an episode payment model. J Oncol Pract. 2014;10(5):322-326. doi: 10.1200/JOP.2014.001488.
3. Chandra A, Dalton MA, Holmes J. Large increases in spending on postacute care in Medicare point to the potential for cost savings in these settings. Health Aff (Millwood). 2013;32(5):864-872. doi: 10.1377/hlthaff.2012.1262.
4. Newcomer LN. Trying something new: episode payments for cancer therapy. Am J Manag Care. 2012;18(spec no 1):SP5.
5. Rosen AB, Aizcorbe A, Ryu AJ, Nestoriak N, Cutler DM, Chernew ME. Policy makers will need a way to update bundled payments that reflects highly skewed spending growth of various care episodes. Health Aff (Millwood). 2013;32(5):944-951. doi: 10.1377/hlthaff.2012.1246.
6. Cutler DM, Ghosh K. The potential for cost savings through bundled episode payments. N Engl J Med. 2012;366(12):1075-1077. doi: 10.1056/NEJMp1113361.
7. Mechanic RE. Mandatory Medicare bundled payment—is it ready for prime time? N Engl J Med. 2015;373(14):1291-1293. doi: 10.1056/NEJMp1509155.
8. Ackerly DC, Grabowski DC. Post-acute care reform—beyond the ACA. N Engl J Med. 2014;370(8):689-691. doi: 10.1056/NEJMp1315350.
9. Michigan Value Collaborative website. michiganvalue.org. Published 2015. Accessed March 13, 2016.
10. Share DA, Campbell DA, Birkmeyer N, et al. How a regional collaborative of hospitals and physicians in Michigan cut costs and improved the quality of care. Health Aff (Millwood). 2011;30(4):636-645. doi: 10.1377/hlthaff.2010.0526.
11. Grenda TR, Pradarelli JC, Thumma JR, Dimick JB. Variation in hospital episode costs with bariatric surgery. JAMA Surg. 2015;150(12):1109-1115. doi: 10.1001/jamasurg.2015.2394.
12. Cram P, Ravi B, Vaughan-Sarrazin MS, Lu X, Li Y, Hawker G. What drives variation in episode-of-care payments for primary TKA? an analysis of Medicare administrative data. Clin Orthop Relat Res. 2015;473(11):3337-3347. doi: 10.1007/s11999-015-4445-0.
13. Schoenfeld AJ, Harris MB, Liu H, Birkmeyer JD. Variations in Medicare payments for episodes of spine surgery. Spine J. 2014;14(12):2793-2798. doi: 10.1016/j.spinee.2014.07.002.
14. Bozic KJ, Ward L, Vail TP, Maze M. Bundled payments in total joint arthroplasty: targeting opportunities for quality improvement and cost reduction. Clin Orthop Relat Res. 2014;472(1):188-193. doi: 10.1007/s11999-013-3034-3.
15. Miller DC, Gust C, Dimick JB, Birkmeyer N, Skinner J, Birkmeyer JD. Large variations in Medicare payments for surgery highlight savings potential from bundled payment programs. Health Aff (Millwood). 2011;30(11):2107-2115. doi: 10.1377/hlthaff.2011.0783.
16. Birkmeyer JD, Gust C, Baser O, Dimick JB, Sutherland JM, Skinner JS. Medicare payments for common inpatient procedures: implications for episode-based payment bundling. Health Serv Res. 2010;45(6 pt 1):1783-1795. doi: 10.1111/j.1475-6773.2010.01150.x.
17. Horwitz LI, Grady JN, Cohen DB, et al. Development and validation of an algorithm to identify planned readmissions from claims data. J Hosp Med. 2015;10(10):670-677. doi: 10.1002/jhm.2416.
18. Sacks GD, Dawes AJ, Russell MM, et al. Evaluation of hospital readmissions in surgical patients: do administrative data tell the real story? JAMA Surg. 2014;149(8):759-764. doi: 10.1001/jamasurg.2014.18.
19. Wijeysundera DN, Austin PC, Hux JE, Beattie WS, Buckley DN, Laupacis A. Development of an algorithm to identify preoperative medical consultations using administrative data. Med Care. 2009;47(12):1258-1264. doi: 10.1097/MLR.0b013e3181bd479c.
20. Lee VS, Kawamoto K, Hess R, et al. Implementation of a value-driven outcomes program to identify high variability in clinical costs and outcomes and association with reduced cost and improved quality. JAMA. 2016;316(10):1061-1072. doi: 10.1001/jama.2016.12226.