Choosing Wisely Clinical Decision Support Adherence and Associated Inpatient Outcomes

August 15, 2018
Andrew M. Heekin, PhD; John Kontor, MD; Harry C. Sax, MD; Michelle S. Keller, MPH; Anne Wellington, BA; Scott Weingarten, MD

Volume 24, Issue 8


ABSTRACT

Objectives: To determine whether utilization of clinical decision support (CDS) is correlated with improved patient clinical and financial outcomes.

Study Design: Observational study of 26,424 patient encounters. In the treatment group, the provider adhered to all CDS recommendations. In the control group, the provider did not adhere to CDS recommendations.

Methods: An observational study of provider adherence to a CDS system was conducted using inpatient encounters spanning 3 years. Data comprised alert status (adherence), provider type (resident, attending), patient demographics, clinical outcomes, Medicare status, and diagnosis information. We assessed the associations between alert adherence and 4 outcome measures: encounter length of stay, odds of 30-day readmission, odds of complications of care, and total direct costs. The associations between alert adherence and the outcome measures were estimated using 4 generalized linear models that adjusted for potential confounders, such as illness severity and case complexity.

Results: The total encounter cost increased 7.3% (95% CI, 3.5%-11%) for nonadherent encounters versus adherent encounters. We found a 6.2% (95% CI, 3.0%-9.4%) increase in length of stay for nonadherent versus adherent encounters. The odds ratio for readmission within 30 days was 1.14 (95% CI, 0.998-1.31) for nonadherent versus adherent encounters, and the odds ratio for complications was 1.29 (95% CI, 1.04-1.61).

Conclusions: Consistent improvements in measured outcomes were seen in the treatment group versus the control group. We recommend that provider organizations consider the introduction of real-time CDS to support adherence to evidence-based guidelines, but because we cannot determine the cause of the associations between CDS interventions and improved clinical and financial outcomes, further study is required.

Am J Manag Care. 2018;24(8):361-366

Takeaway Points

This analysis examined the associations between adherence to Choosing Wisely recommendations embedded into clinical decision support (CDS) alerts and 4 measures of resource use and quality.

  • Encounters in which providers adhered to all alerts had significantly lower total costs, shorter lengths of stay, a lower probability of 30-day readmissions, and a lower probability of complications compared with nonadherent encounters.
  • Full adherence to Choosing Wisely alerts was associated with savings of $944 from a median encounter cost of $12,940.
  • Health systems should consider real-time CDS interventions as a method to encourage improved adoption of evidence-based guidelines.

The Health Information Technology for Economic and Clinical Health Act, an important component of the American Recovery and Reinvestment Act, enabled the federal government to subsidize hospitals, health systems, and physicians $40 billion1 to implement electronic health records (EHRs) and imposed significant penalties on nonadopters. This investment was expected to result in up to $470 billion in inpatient cost savings alone2 through reduced patient length of stay,3 reduced utilization of services, and other outcomes.4 Today, certified EHRs are operational in 96% of nonfederal acute care hospitals and health systems in the United States,5 but the expected cost savings have not yet been realized.6 The evidence that EHRs improve quality and patient outcomes has been mixed. Some studies have found improved quality of care in the ambulatory care setting,7 higher guideline adherence, fewer medication errors, and decreased adverse drug effects.8 However, other studies have found that EHR use is not associated with decreased readmissions9,10 or lower rates of mortality.8

In 2012, the ABIM Foundation introduced the Choosing Wisely (CW) initiative, a voluntary effort by more than 70 physician subspecialty societies to identify commonly used low-value services.11 The intent of this publicly promoted initiative was to stimulate provider-patient discussions about appropriate care and thereby reduce low-value tests and treatments.7 Although the primary aim of CW is not to lower costs, reducing inappropriate care could lead to lower costs for both patients and payers. To date, CW may not have achieved clinically significant changes in reducing low-value care.12-14 Public promotion alone does not appear to be sufficient to achieve widespread adoption.10 A 2015 claims-based analysis of 7 CW recommendations found that use of 2 low-value services declined, but the decreases were not clinically significant.10 In their recommendations, the authors called for innovative methods to disseminate CW recommendations.10 Provider difficulty interpreting guidelines and evaluating patient risk,15,16 patient need for reassurance,13 and provider fear of malpractice litigation17 pose additional obstacles.

Ideally, an EHR infrastructure could overcome these obstacles and provide real-time computerized clinical decision support (CDS) to inform healthcare providers when their care deviates from evidence-based guidelines. CDS comprises a variety of tools, including computerized alerts and reminders with information such as diagnostic support, clinical guidelines, relevant patient information, diagnosis-specific order sets, documentation templates, and drug-drug interactions.18 CDS provides the ability to modify tests and treatments based on context- and patient-specific information presented at the point of care. Utilizing CDS can help providers avoid ordering a low-value test or intervention that could lead to additional nontherapeutic interventions or harm. CDS has been shown to improve a variety of processes, including prescribing practices,19 appropriate use of diagnostic radiology,20 adherence to quality measures,21 and conformance to evidence-based care.19 Systems that automate CDS, provide tailored recommendations based on patient characteristics, and prompt clinicians to provide a reason for overriding recommendations have been shown to be significantly more likely to succeed than systems that provide only patient assessments.19

We implemented select CW recommendations in the EHR at a large academic health system in the form of 92 alert-based CDS interventions, both inpatient and ambulatory. Inpatient alerts selected for study were those deemed the most technically feasible to deploy accurately and with a sufficient number of relevant orders that would trigger an alert, thus providing a sufficient volume of alerted encounters to evaluate. When initiating a potentially inappropriate order, a provider received real-time notification of deviation from a CW recommendation. That provider then had the option to cancel, change, or justify the order, if he or she agreed with the alert’s recommendation in the context of the individual patient. The objective of this study was to evaluate the relationships between providers who adhered to CW alerts and measurable outcomes.

METHODS

Study Setting

We conducted an observational study of provider adherence to the 18 highest-volume CW alerts utilizing a commercially available EHR-embedded CDS system at Cedars-Sinai Health System, a nonprofit tertiary 886-bed hospital and multispecialty academic health science center located in Los Angeles, California. The medical staff is pluralistic and includes employed and independent physicians in private practice, physician extenders, and residents.

This study included inpatient encounters from October 22, 2013, to July 31, 2016. The study protocol was approved by the Cedars-Sinai Medical Center (CSMC) Institutional Review Board.

Study Population and Data Sources

Data for the study were collected from 3 sources: data sent from the EHR to the CDS analytics platform, which included the category of the provider triggering the alert (eg, resident, attending) and clinical data allowing for the assessment of adherence or nonadherence to the alert during the encounter; claims data, which included patient demographics (eg, age, gender), diagnoses, services provided, admit and discharge dates, Medicare Severity-Diagnosis Related Group codes, and costs; and direct cost data associated with the patient care department, which we describe below. The unit of analysis is the patient encounter; this covers the entire inpatient visit, and there is only 1 encounter per visit. Data were matched using a common encounter identifier. Encounters in which the providers were considered adherent included all encounters that received CW alerts and for which providers adhered to all alerts. Alerts were considered adhered to when the order flagged by the CDS as potentially conflicting with CW was not signed within 1 hour after an alert was shown to a provider. The nonadherent group included encounters where providers received CW alerts and for which they did not adhere to any (complete nonadherence). Approximately 1400 encounters that did not meet either of these criteria were excluded because they included partial adherence to some but not all of the alerts. The Elixhauser index was computed as an unweighted sum of comorbidities present during all encounters for a given patient to estimate the morbidity burden.22
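The encounter-level adherence rule described above can be sketched in code. This is a minimal illustration, not the study's actual pipeline; the field names and timestamps are hypothetical, and it assumes (per the text) that an alert counts as adhered to when the flagged order is not signed within 1 hour of the alert, with encounters classified as adherent only when every alert was adhered to, nonadherent only when none were, and excluded otherwise.

```python
from datetime import datetime, timedelta

ADHERENCE_WINDOW = timedelta(hours=1)

def alert_adhered(alert_time, order_sign_time):
    """An alert counts as adhered to when the flagged order was never signed,
    or was signed more than 1 hour after the alert was shown."""
    if order_sign_time is None:
        return True
    return order_sign_time - alert_time > ADHERENCE_WINDOW

def classify_encounter(alerts):
    """alerts: list of (alert_time, order_sign_time or None) for one encounter.
    Returns 'adherent' (all alerts adhered), 'nonadherent' (none adhered),
    or 'excluded' (partial adherence, dropped from the analysis)."""
    results = [alert_adhered(a, s) for a, s in alerts]
    if all(results):
        return "adherent"
    if not any(results):
        return "nonadherent"
    return "excluded"
```

Under this rule, roughly 1400 mixed-adherence encounters fell into the "excluded" bucket in the study.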

Alert Selection and Development

In 2013, we integrated CW recommendations as CDS alerts into the Epic EHR at CSMC. A clinical informatics team enabled the translation of the CW recommendations through a standardized process. First, clinicians reviewed the primary sources cited in each recommendation to define inclusion and exclusion criteria for the CDS rule. Once defined, the clinical logic was deployed in the EHR using standardly available alert tools. Finally, the team reviewed patient charts from encounters in which alerts were triggered and identified opportunities to refine the logic and reduce false positives.

To define the alerts for inclusion in the study, we initially reviewed all inpatient CW alerts that were active in the CSMC EHR at any point during the study period. For this analysis, we eliminated any low-volume alerts that fired an average of less than once per month. In general, an alert is adhered to when a provider, advised against taking a particular action, complies with that recommendation. Specifically, because our adherence criteria are used to evaluate EHR data to determine whether a particular order was signed within an hour after seeing an alert, we cannot accurately categorize adherence to alerts that either make recommendations about the appropriateness of individual orders within a series of identical orders (ie, repeat or standing laboratory testing) or that do not flag a particular order as inappropriate and instead are reminders unrelated to avoiding unnecessary care (eg, “Don’t delay palliative care for patients with advanced gynecological cancer.”). All remaining alerts were included in the data set (eAppendix Table 1 [eAppendix available at ajmc.com]).

Outcomes

We assessed the associations between alert adherence and 4 outcome measures: encounter length of stay, 30-day readmissions, complications of care, and total direct costs. We defined 30-day readmission as an inpatient readmission to the same facility for any cause occurring within 30 days of discharge that was unplanned and deemed unavoidable. Complications of care were defined using the Agency for Healthcare Research and Quality (AHRQ) Healthcare Cost and Utilization Project (HCUP) classification system for complication codes.23 Total direct costs were defined as expenses directly associated with patient care, such as labor (wages, salaries, agency, and employee benefits), supplies (medical, implant, and nonmedical), professional fees, contracted services, equipment, and equipment depreciation.24 We selected these 4 outcome measures due to their relevance to patients, health systems, and payers. As the industry shifts from fee-for-service to value-based contracts, cost containment and quality have become critical priorities for healthcare providers. Length of stay, readmission rates, and complication rates also merit evaluation, given their potential impact on patient outcomes and hospital value-based payment programs.25,26 Given that many low-value tests and procedures can result in a chain of additional tests and procedures, we theorized that reducing inappropriate and low-value services may lead to shorter lengths of stay, lower 30-day readmission rates, and lower complication rates.

Statistical Analysis

The adherent and nonadherent encounter groups were compared based on demographic characteristics, number of diagnoses, and case severity. The χ2 test was used for categorical variables, and the Wilcoxon rank sum test was used for continuous variables.

We estimated the association between alert adherence and the outcome measures using 4 generalized linear models. Alert adherence was measured as a dichotomous predictor. We adjusted for potential confounders, such as illness severity and case complexity, using demographic and clinical variables, such as gender, age, All Patient Refined Diagnosis Related Group (APR-DRG) severity level, number of diagnoses, expected length of stay, Elixhauser comorbidity index, Medicare status, and case mix index. A subset of all independent variables was used in each regression model to maximize the quality of fit of the model. Variable selection was performed using a backward stepwise method while minimizing the Akaike information criteria. In addition to alert adherence, variables were included in all models to adjust for the differences between the characteristics of the 2 groups. The continuous covariates generally had skewed distributions and were transformed prior to inclusion in the models. All statistical analyses were performed using R version 3.3.127 and the following packages: glm2,28 caret,29 and sqldf.30
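The backward stepwise selection the paper describes (dropping covariates while the Akaike information criterion improves) can be sketched as follows. The study was done in R (where `step()` provides this directly); this Python version with synthetic data is only an illustration of the procedure, using a plain Gaussian OLS fit for simplicity rather than the paper's full set of generalized linear models, and a greedy drop-any-improving-variable rule.

```python
import numpy as np

def ols_aic(y, X):
    """Gaussian AIC for an OLS fit with intercept: n*log(RSS/n) + 2*(k)."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    rss = np.sum((y - Xc @ beta) ** 2)
    n, k = Xc.shape
    return n * np.log(rss / n) + 2 * k

def backward_aic(y, X, names):
    """Greedy backward elimination: repeatedly drop any column whose removal
    lowers the AIC, until no single removal improves the fit."""
    cols = list(range(X.shape[1]))
    best = ols_aic(y, X[:, cols])
    improved = True
    while improved and len(cols) > 1:
        improved = False
        for c in list(cols):
            trial = [j for j in cols if j != c]
            aic = ols_aic(y, X[:, trial])
            if aic < best:
                best, cols, improved = aic, trial, True
    return [names[j] for j in cols]
```

With a strong true predictor and pure-noise columns, the true predictor survives elimination while noise columns tend to be dropped.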

Multiple logistic regression was used to estimate the odds of patient outcomes that were dichotomous (ie, 30-day readmissions and complications of care). The 2 continuous outcomes, length of stay and total cost, were estimated using multiple linear regression models, with the dependent variable log-transformed to correct for significant right skew in the distribution of each outcome. The outcome variables also appeared as independent variables in other models. Statistical tests were 2-sided, with P <.05 considered statistically significant. More detailed discussion of the regression models is included in the eAppendix.

RESULTS

A total of 26,424 encounters were included in the analysis out of a total of approximately 100,000 encounters. In 1591 (6%) of these encounters, providers adhered to all alerts (an “adherent encounter”); in the remaining 24,833 (94%) encounters, no alerts were adhered to (a “nonadherent encounter”) (Table 1). Patients in the adherent and nonadherent encounter groups were similar with respect to age (P = .32) and total diagnoses (P = .26). Additionally, both encounter groups were comparable with respect to the proportion of patients whose primary payer was Medicare (P = .94). There were significant differences in APR-DRG severity levels (P = .01), with sicker patients in the nonadherent group (a greater proportion of nonadherent patients classified at level 4, extreme). Additionally, there were differences with respect to Elixhauser index scores (P = .04), case mix index values (P = .02), gender (P = .05), and expected length of stay (P <.001) (Table 1).

With respect to outcomes, bivariate analyses indicated that patient encounters in the group in which providers did not adhere to CW recommendations had longer unadjusted actual lengths of stay (P <.001) and higher complication rates (P <.001), 30-day readmission rates (P = .02), and direct costs (P <.001).

Overall, adherent encounters had significantly lower total costs, shorter lengths of stay, and lower odds of complications compared with nonadherent encounters. The adherence coefficient in the 30-day readmission model did not reach statistical significance. After adjusting for patient characteristics, nonadherent encounters showed a 7.3% (95% CI, 3.5%-11%; P <.001) increase in total direct costs versus adherent encounters, representing an increase of $944 for a nonadherent encounter versus an adherent encounter (Table 2).
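The dollar figure and the percent estimate above are mutually consistent, which can be checked with a line of arithmetic (both numbers are the paper's reported values; the median encounter cost of $12,940 appears in the Takeaway Points and Discussion):

```python
median_cost = 12_940   # reported median encounter cost, dollars
extra_cost = 944       # reported added cost of a nonadherent encounter

implied_pct = extra_cost / median_cost * 100
print(round(implied_pct, 1))  # 7.3, matching the adjusted 7.3% cost increase
```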

We found a 6.2% (95% CI, 3.0%-9.4%; P <.001) increase in length of stay for nonadherent versus adherent encounters (Table 3). We found that the odds of a patient having a readmission within 30 days were 1.14 (95% CI, 0.998-1.31; P = .0503) times higher in nonadherent encounters (Table 4). The odds of a patient having complications were 1.29 (95% CI, 1.04-1.61; P = .02) times higher in nonadherent encounters (Table 5).
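Because length of stay and total cost were modeled with a log-transformed dependent variable, the percent differences above come from exponentiating the adherence coefficient. A minimal sketch of that conversion, with an illustrative coefficient chosen to land near the paper's 6.2% length-of-stay result (the actual fitted coefficients are not reported in this section):

```python
import math

def pct_change_from_log_coef(beta):
    """In a regression of log(outcome) on a binary predictor, the coefficient
    beta implies a multiplicative effect on the outcome:
    percent change = (exp(beta) - 1) * 100."""
    return (math.exp(beta) - 1.0) * 100.0

# Illustrative: beta of about 0.0602 corresponds to roughly a 6.2% increase.
print(round(pct_change_from_log_coef(0.0602), 1))
```

The binary outcomes (readmission, complications) were instead fit with logistic regression, where exponentiating the coefficient yields the odds ratios (1.14 and 1.29) reported above.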

DISCUSSION

To our knowledge, this is the first study to evaluate the association between adherence to multiple CW guidelines delivered via CDS and changes in clinical and financial outcomes. Previous studies have established that effective CDS can impact provider behavior and contribute to improved patient outcomes for specific CDS interventions.31 This study contributes to the established body of research indicating that adherence to effective CDS alerts is associated with improved outcomes, such as length of stay,32,33 complication rates,34,35 and overall cost.36,37 Our analysis provides new evidence of the effect that a more comprehensive collection of alerts has on high-level patient and financial outcomes, including shorter length of stay (0.06 days), lower complication rates (odds ratio, 1.29), and reduced cost (7.3%) per adhered patient episode.

Our results suggest that the difference in cost savings is statistically and clinically significant. Adherent encounters resulted in approximately $944 in savings from the median encounter cost of $12,940. A previous study examined the prevalence of 28 low-value services in a large population of commercially insured adults and identified an average potential cost savings of approximately $300 for each patient who received a single low-value service.10 Our findings surpass this estimate and imply significant cost-savings opportunities through improved and broader utilization of CDS.

Our results also confirm the association between alert adherence and odds of complications as defined by AHRQ’s HCUP.18 The majority of studies do not specifically analyze the effect of CDS interventions on complication rates for patients; rather, they identify undesired outcomes, such as adverse drug events26 and mortality rates.25 Although it is plausible that the reduction in the utilization of low-value services resulting in lower inpatient lengths of stay may lead to reduced complication rates, we did not evaluate potential causation between specific complications and avoided interventions. Additional research to confirm these findings and, more specifically, to delineate the causal pathway is indicated.

Previous studies’ findings have shown a positive correlation between CDS implementation and patient length of stay.23,24 However, to our knowledge, no analyses have established a correlation between CDS content targeting unnecessary care and improved lengths of stay. Our findings demonstrate an association between adherence to guideline-based alerts and reduction in unnecessary care with shortened inpatient length of stay.

Limitations

One limitation is our strict definition of “alert compliance”: In order to be in the adherent encounter group, providers had to be adherent to all of the CW CDS-related alerts. Patient episodes in which clinicians followed some but not all of the CW alerts that fired were considered “mixed-adherence” episodes and were excluded from analysis. This strict inclusion criterion limits our understanding of the clinical and financial impacts that patients with partially adherent episodes may have experienced. Similarly, we were unable to differentiate the impact of specific alerts on our studied outcomes. Although there appear to be some differences between individual alerts, our study did not have enough power to make inferences due to small per-alert sample sizes. Another key limitation of this study is the lack of control for provider effects. The analysis did not include provider characteristics and thus could not examine confounding on the provider level; it is possible that some providers are more likely to trigger alerts or are more likely to be nonadherent to alerts, even though we found no overall correlation between provider acceptance rate and provider outcomes. Additionally, providers who are more likely to adhere to evidence-based guidelines, including CW, may be more likely to subscribe to other system-based approaches and practices consistent with value-based patient care. We need to better understand the differences in characteristics and practice patterns of providers who adhere to CW recommendations compared with providers who do not.

Future analyses should examine the role of specific physician and alert characteristics on adherence to CDS and the effects on outcomes. The Elixhauser index computation was based on all relevant diagnoses made on all encounters for a given patient within our data set. It is possible, however, that the patient could have received additional relevant diagnoses outside of the time frame or hospital system applicable to this study.

Although our regression models adjusted for severity of illness, it is possible that the model did not control for all differences in patient severity or characteristics. Moreover, this study did not seek to establish causation between CW adherence and improved patient and financial outcomes. Many factors determine whether a single alert is adhered to or ignored, including alert fatigue,38 provider familiarity with the guideline presented,39 fear of malpractice,13 or need to reassure one’s patient through further diagnostic tests.16 We were unable to capture some relevant data, including professional billing fees and cost and readmissions data from other facilities, limiting our outcomes analyses. Finally, our demonstrated correlations between adherence and outcomes cannot necessarily be generalized to all CDS interventions, as the alerts evaluated in this study were implemented in the inpatient setting, were deemed the most technically feasible to deploy accurately, and had sufficient volume to evaluate.

CONCLUSIONS

We recommend that health systems consider real-time CDS interventions as a method to encourage improved adoption of CW and other evidence-based guidelines. A meta-analysis of CDS systems concluded that by providing context-specific information at the point of care, the odds of providers adopting guideline recommendations are 112 times higher.19 CDS enables the provision of context-specific information at the point of care and could help to overcome several known barriers to CW guideline adoption.

Our findings contribute to the evidence base surrounding the use of CDS and improvements in patient clinical and financial outcomes. Formal prospective cohort studies and randomized CDS intervention trials, perhaps randomizing providers assigned to receive CDS interventions, should be prioritized to help guide future provider strategies in regard to reducing low-value care.

Acknowledgments

The authors gratefully acknowledge the administrative and material support of Georgia Hoyler and Marin Lopci.

Author Affiliations: Optum (AMH, JK), Washington, DC; Cedars-Sinai Medical Center (HCS, MSK, SW), Los Angeles, CA; Stanson Health (AW, SW), Los Angeles, CA.

Source of Funding: No funding beyond existing employment agreements was provided for this research.

Author Disclosures: Drs Heekin and Kontor are employed by Optum, which is a licensed reseller of Stanson Health, including its Choosing Wisely alert content evaluated in this study. Dr Sax is employed by Cedars-Sinai, which is the major shareholder of Stanson Health; he is not directly involved in or on the board of Stanson. Ms Keller is employed by Cedars-Sinai Medical Center, which is the employer of Stanson Health’s founders Scott Weingarten and Darren Dworkin. Ms Wellington was employed by and owns stock in Stanson Health; she is now employed at Cedars-Sinai Medical Center, which is a major shareholder of Stanson Health. Dr Weingarten is Chairman of the Board of and owns stock in Stanson Health.

Authorship Information: Concept and design (AMH, JK, HCS, SW); acquisition of data (AW); analysis and interpretation of data (AMH, JK, HCS, MSK, AW, SW); drafting of the manuscript (AMH, JK, HCS, MSK, AW, SW); critical revision of the manuscript for important intellectual content (JK, HCS, MSK, SW); statistical analysis (AMH); and provision of patients or study materials (SW).

Address Correspondence to: John Kontor, MD, Optum, 2445 M St NW, Washington, DC 20001. Email: kontorj@advisory.com.

REFERENCES

1. Kayyali B, Knott D, Van Kuiken S. The big-data revolution in US health care: accelerating value and innovation. McKinsey & Company website. mckinsey.com/industries/healthcare-systems-and-services/our-insights/the-big-data-revolution-in-us-health-care. Published April 2013. Accessed December 2, 2016.

2. Hillestad R, Bigelow J, Bower A, et al. Can electronic medical record systems transform health care? potential health benefits, savings, and costs. Health Aff (Millwood). 2005;24(5):1103-1117. doi: 10.1377/hlthaff.24.5.1103.

3. Tierney WM, Miller ME, Overhage JM, McDonald CJ. Physician inpatient order writing on microcomputer workstations: effects on resource utilization. JAMA. 1993;269(3):379-383. doi: 10.1001/jama.1993.03500030077036.

4. Wang SJ, Middleton B, Prosser LA, et al. A cost-benefit analysis of electronic medical records in primary care. Am J Med. 2003;114(5):397-403. doi: 10.1016/S0002-9343(03)00057-3.

5. Henry J, Pylypchuk Y, Searcy T, Patel V. Adoption of electronic health record systems among U.S. non-federal acute care hospitals: 2008-2015 [ONC data brief no. 35]. Office of the National Coordinator for Health Information Technology website. dashboard.healthit.gov/evaluations/data-briefs/non-federal-acute-care-hospital-ehr-adoption-2008-2015.php. Published May 2016. Accessed October 27, 2016.

6. Adler-Milstein J, Everson J, Lee S-YD. EHR adoption and hospital performance: time-related effects. Health Serv Res. 2015;50(6):1751-1771. doi: 10.1111/1475-6773.12406.

7. Kern LM, Barrón Y, Dhopeshwarkar RV, Edwards A, Kaushal R; HITEC Investigators. Electronic health records and ambulatory quality of care. J Gen Intern Med. 2013;28(4):496-503. doi: 10.1007/s11606-012-2237-8.

8. Campanella P, Lovato E, Marone C, et al. The impact of electronic health records on healthcare quality: a systematic review and meta-analysis. Eur J Public Health. 2016;26(1):60-64. doi: 10.1093/eurpub/ckv122.

9. Lammers EJ, McLaughlin CG, Barna M. Physician EHR adoption and potentially preventable hospital admissions among Medicare beneficiaries: panel data evidence, 2010-2013. Health Serv Res. 2016;51(6):2056-2075. doi: 10.1111/1475-6773.12586.

10. Patterson ME, Marken P, Zhong Y, Simon SD, Ketcherside W. Comprehensive electronic medical record implementation levels not associated with 30-day all-cause readmissions within Medicare beneficiaries with heart failure. Appl Clin Inform. 2014;5(3):670-684. doi: 10.4338/aci-2014-01-ra-0008.

11. Wolfson D, Santa J, Slass L. Engaging physicians and consumers in conversations about treatment overuse and waste: a short history of the Choosing Wisely campaign. Acad Med. 2014;89(7):990-995. doi: 10.1097/ACM.0000000000000270.

12. Berwick DM, Hackbarth AD. Eliminating waste in US health care. JAMA. 2012;307(14):1513-1516. doi: 10.1001/jama.2012.362.

13. Rosenberg A, Agiro A, Gottlieb M, et al. Early trends among seven recommendations from the Choosing Wisely campaign. JAMA Intern Med. 2015;175(12):1913-1920. doi: 10.1001/jamainternmed.2015.5441.

14. Reid RO, Rabideau B, Sood N. Low-value health care services in a commercially insured population. JAMA Intern Med. 2016;176(10):1567-1571. doi: 10.1001/jamainternmed.2016.5031.

15. Zikmund-Fisher BJ, Kullgren JT, Fagerlin A, Klamerus ML, Bernstein SJ, Kerr EA. Perceived barriers to implementing individual Choosing Wisely recommendations in two national surveys of primary care providers. J Gen Intern Med. 2017;32(2):210-217. doi: 10.1007/s11606-016-3853-5.

16. Krouss M, Croft L, Morgan DJ. Physician understanding and ability to communicate harms and benefits of common medical treatments. JAMA Intern Med. 2016;176(10):1565-1567. doi: 10.1001/jamainternmed.2016.5027.

17. Colla CH, Kinsella EA, Morden NE, Meyers DJ, Rosenthal MB, Sequist TD. Physician perceptions of Choosing Wisely and drivers of overuse. Am J Manag Care. 2016;22(5):337-343.

18. Clinical decision support: more than just ‘alerts’ tipsheet. CMS website. cms.gov/regulations-and-guidance/legislation/EHRincentiveprograms/downloads/clinicaldecisionsupport_tipsheet-.pdf. Published 2014. Accessed October 25, 2016.

19. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765. doi: 10.1136/bmj.38398.500764.8F.

20. Goldzweig CL, Orshansky G, Paige NM, et al. Electronic health record-based interventions for improving appropriate diagnostic imaging: a systematic review and meta-analysis. Ann Intern Med. 2015;162(8):557-565. doi: 10.7326/m14-2600.

21. Raja AS, Gupta A, Ip IK, Mills AM, Khorasani R. The use of decision support to measure adherence to a national imaging quality measure. Acad Radiol. 2014;21(3):378-383. doi: 10.1016/j.acra.2013.10.017.

22. Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27.

23. Clinical Classifications Software (CCS) for ICD-9-CM. Healthcare Cost and Utilization Project website. hcup-us.ahrq.gov/toolssoftware/ccs/ccs.jsp. Published October 2016. Accessed December 2, 2016.

24. Lawthers AG, McCarthy EP, Davis RB, Peterson LE, Palmer RH, Iezzoni LI. Identification of in-hospital complications from claims data: is it valid? Med Care. 2000;38(8):785-795.

25. Readmissions Reduction Program (HRRP). CMS website. cms.gov/medicare/medicare-fee-for-service-payment/acuteinpatientpps/readmissions-reduction-program.html. Published April 2016. Accessed December 2, 2016.

26. Hospital-acquired conditions. CMS website. cms.gov/medicare/medicare-fee-for-service-payment/hospitalacqcond/hospital-acquired_conditions.html. Published August 2015. Accessed December 2, 2016.

27. The R project for statistical computing. R project website. r-project.org. Published 2016. Accessed August 16, 2016.

28. glm2: fitting generalized linear models. R project website. CRAN.R-project.org/package=glm2. Published 2014. Accessed August 16, 2016.

29. caret: classification and regression training. R project website. CRAN.R-project.org/package=caret. Published 2016. Accessed August 16, 2016.

30. sqldf: manipulate R data frames using SQL. R project website. CRAN.R-project.org/package=sqldf. Published 2014. Accessed August 16, 2016.

31. Schedlbauer A, Prasad V, Mulvaney C, et al. What evidence supports the use of computerized alerts and prompts to improve clinicians’ prescribing behavior? J Am Med Inform Assoc. 2009;16(4):531-538. doi: 10.1197/jamia.M2910.

32. Dimagno MJ, Wamsteker EJ, Rizk RS, et al. A combined paging alert and web-based instrument alters clinician behavior and shortens hospital length of stay in acute pancreatitis. Am J Gastroenterol. 2014;109(3):306-315. doi: 10.1038/ajg.2013.282.

33. Vicente V, Svensson L, Wireklint Sundström B, Sjöstrand F, Castren M. Randomized controlled trial of a prehospital decision system by emergency medical services to ensure optimal treatment for older adults in Sweden. J Am Geriatr Soc. 2014;62(7):1281-1287. doi: 10.1111/jgs.12888.

34. Panella M, Marchisio S, Di Stanislao F. Reducing clinical variations with clinical pathways: do pathways work? Int J Qual Health Care. 2003;15(6):509-521. doi: 10.1093/intqhc/mzg057.

35. Wolfstadt JI, Gurwitz JH, Field TS, et al. The effect of computerized physician order entry with clinical decision support on the rates of adverse drug events: a systematic review. J Gen Intern Med. 2008;23(4):451-458. doi: 10.1007/s11606-008-0504-5.

36. Lin Y-C, Chang C-S, Yeh C-J, Wu Y-C. The appropriateness and physician compliance of platelet usage by a computerized transfusion decision support system in a medical center. Transfusion. 2010;50(12):2565-2570. doi: 10.1111/j.1537-2995.2010.02757.x.

37. Bayati M, Braverman M, Gillam M, et al. Data-driven decisions for reducing readmissions for heart failure: general methodology and case study. PLoS One. 2014;9(10):e109264. doi: 10.1371/journal.pone.0109264.

38. van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. J Am Med Inform Assoc. 2006;13(2):138-147. doi: 10.1197/jamia.M1809.

39. Halm EA, Atlas SJ, Borowsky LH, et al. Understanding physician adherence with a pneumonia practice guideline: effects of patient, system, and physician factors. Arch Intern Med. 2000;160(1):98-104. doi: 10.1001/archinte.160.1.98.