Electronic Health Record Feedback to Improve Antibiotic Prescribing for Acute Respiratory Infections

Jeffrey A. Linder, MD, MPH; Jeffrey L. Schnipper, MD, MPH; Ruslana Tsurikova, MSc, MA; D. Tony Yu, MD, MPH; Lynn A. Volk, MHS; Andrea J. Melnikas, MPH; Matvey B. Palchuk, MD, MS; Maya Olsha-Yehiav, MS

An electronic health record–based feedback program, the Acute Respiratory Infection Quality Dashboard, did not lead to an overall change in antibiotic prescribing in primary care.

Objective: To examine whether the Acute Respiratory Infection (ARI) Quality Dashboard, an electronic health record (EHR)–based feedback system, changed antibiotic prescribing.

 

Study Design: Cluster randomized, controlled trial.

 

Methods: We randomly assigned 27 primary care practices to receive the ARI Quality Dashboard or usual care. The primary outcome was the intent-to-intervene antibiotic prescribing rate for ARI visits. We also compared antibiotic prescribing between ARI Quality Dashboard users and nonusers.

 

Results: During the 9-month intervention, there was no difference between intervention and control practices in antibiotic prescribing for all ARI visits (47% vs 47%; P = .87), antibiotic-appropriate ARI visits (65% vs 64%; P = .68), or non–antibiotic-appropriate ARI visits (38% vs 40%; P = .70). Among the 258 intervention clinicians, 72 (28%) used the ARI Quality Dashboard at least once. These clinicians had a lower overall ARI antibiotic prescribing rate (42% vs 50% for nonusers; P = .02). This difference was due to less antibiotic prescribing for non–antibiotic-appropriate ARIs (32% vs 43%; P = .004), including nonstreptococcal pharyngitis (31% vs 41%; P = .01) and nonspecific upper respiratory infections (19% vs 34%; P = .01).

 

Conclusions: The ARI Quality Dashboard was not associated with an overall change in antibiotic prescribing for ARIs, although when used, it was associated with improved antibiotic prescribing. EHR-based quality reporting, as part of “meaningful use,” may not improve care in the absence of other changes to primary care practice.

 

(Am J Manag Care. 2010;16(12 Spec No.):e311-e319)

Quality reporting is one of the criteria for the "meaningful use" of electronic health records. However, introduction of a quality report about antibiotic prescribing for acute respiratory infections, the Acute Respiratory Infection Quality Dashboard, was not associated with improved quality of care.

 

  • Quality reporting, by itself, is frequently insufficient to improve the quality of care.

 

  • To be effective, quality reporting likely needs to be coupled with other interventions like clinician detailing, clinical decision support, patient education, or financial incentives.

 

  • Meaningful use criteria should be evaluated for effectiveness as they are implemented.

Electronic health records (EHRs) have been touted as a way to improve the quality of healthcare in the United States.1,2 The Health Information Technology for Economic and Clinical Health (HITECH) Act, which authorizes unprecedented incentives for EHR adoption, requires eligible physicians to engage in “meaningful use” of EHRs. One of the meaningful use “menu” criteria is the ability to “generate lists of patients by specific conditions” for, among other things, quality improvement.3 Generating such lists may help clinicians understand patterns of care and improve the quality of care, but the effectiveness of this capability is largely untested.

Acute respiratory infections (ARIs) are the most common symptomatic reason for ambulatory visits and account for about half of antibiotic prescriptions in the United States.4,5 Despite guidelines generally discouraging antibiotic prescribing for ARIs, especially for non–antibiotic-appropriate ARIs, about half of antibiotic prescriptions for ARIs are inappropriate.6,7 Inappropriate antibiotic prescribing is clinically ineffective, increases medical costs, increases the prevalence of antibiotic-resistant bacteria, and unnecessarily exposes patients to adverse drug events.8 Most interventions to decrease inappropriate antibiotic prescribing for ARIs have been, at best, modestly effective.9

To examine whether providing EHR-based feedback improves the quality of care and reduces inappropriate antibiotic prescribing for ARIs, we developed the ARI Quality Dashboard, an EHR-integrated, clinician-level report that details antibiotic prescribing for ARIs. We evaluated the effectiveness of the ARI Quality Dashboard in a cluster randomized, controlled clinical trial in primary care practices.

METHODS

Partners HealthCare System is an integrated regional healthcare delivery network in eastern Massachusetts. The main EHR used in Partners HealthCare ambulatory clinics is the Longitudinal Medical Record (LMR). The LMR is an internally developed, full-featured, Certification Commission for Healthcare Information Technology–approved EHR (2006) including primary care and subspecialty notes, problem lists, medication lists, coded allergies, and laboratory test and radiographic study results. The practices in this study began using the LMR between 1999 and 2003.

ARI Quality Dashboard

The ARI Quality Dashboard contains views of clinicians’ antibiotic prescribing and billing practices for ARI visits (Figure 1). Each view displays a clinician’s performance against his or her clinic peers and against national benchmarks. The ARI Quality Dashboard includes the proportion of ARI visits at which antibiotics were prescribed; the proportion of individual ARI diagnoses (eg, pneumonia, sinusitis, acute bronchitis) at which antibiotics were prescribed; the proportion of broader-spectrum antibiotic prescribing; the distribution of ARI visits by evaluation and management billing codes (eg, level 1 through 5); and individual patient visit details, including date of service, antibiotic prescribed, antibiotic class, date of prescription, diagnosis codes, and evaluation and management billing codes. We designed the ARI Quality Dashboard based on the recommendations of the Centers for Disease Control and Prevention and the American College of Physicians.10
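
For illustration, the per-clinician proportions the dashboard displays could be derived from a visit-level extract along the following lines; this is a minimal sketch in Python, and the column names and sample rows are assumptions, not the actual Quality Data Warehouse schema.

import pandas as pd

# Illustrative visit-level extract: one row per ARI visit (assumed layout).
visits = pd.DataFrame({
    "clinician_id":          ["A", "A", "A", "B", "B", "C"],
    "clinic_id":             [1, 1, 1, 1, 1, 2],
    "ari_diagnosis":         ["acute bronchitis", "sinusitis", "URI",
                              "sinusitis", "URI", "pneumonia"],
    "antibiotic_prescribed": [1, 1, 0, 1, 0, 1],
})

# Proportion of ARI visits at which an antibiotic was prescribed, per clinician.
by_clinician = (visits.groupby("clinician_id")["antibiotic_prescribed"]
                      .mean().rename("ari_antibiotic_rate"))

# Peer comparison: the overall prescribing rate in each clinic.
by_clinic = visits.groupby("clinic_id")["antibiotic_prescribed"].mean()

# Breakdown by individual ARI diagnosis (eg, sinusitis, acute bronchitis).
by_diagnosis = (visits.groupby(["clinician_id", "ari_diagnosis"])
                      ["antibiotic_prescribed"].mean())

print(by_clinician, by_clinic, by_diagnosis, sep="\n\n")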

We included billing data to provide a sense of a financial incentive to clinicians. Because the dashboard displayed evaluation and management billing codes, clinicians could learn whether they were under-billing for ARI visits, which generally make up about 10% of all visits, compared with their peers. However, clinicians had no direct financial incentive to view the ARI Quality Dashboard. Clinicians’ salaries were overwhelmingly productivity based. Pay-for-performance incentives were in place, which accounted for about 5% of clinicians’ salary, but none were related to antibiotic prescribing.

Clinicians accessed the ARI Quality Dashboard from the EHR Reports Central area, which contained about 10 other reports about preventive and chronic disease management. A clinician could “drill down” to any patient’s medical record directly from the ARI Quality Dashboard to review patient details and export the report for additional follow-up or analysis. We used ASP.NET technology to build the ARI Quality Dashboard. Reports were constructed and viewed using Crystal Reports XI, with data from the Partners HealthCare Quality Data Warehouse, which aggregates data from various sources. The ARI Quality Dashboard displayed visit and prescribing data for the previous year and was automatically updated monthly.

We previously piloted the ARI Quality Dashboard and found that pilot users accessed the ARI Quality Dashboard and found it useful for understanding their antimicrobial prescribing patterns.11-13 Pilot users also found it convenient to be able to validate the ARI Quality Dashboard reports with primary data from the EHR by drilling down to individual patient charts.

Practice Matching, Randomization, and Intervention Implementation

We randomly assigned 27 primary care clinics associated with Partners HealthCare that use the LMR to receive the ARI Quality Dashboard or to usual care. We matched clinics on the basis of size. Matched pairs were randomized, with 1 practice from each pair assigned to receive the intervention and the other assigned to usual care. The Human Research Committee of Partners HealthCare approved the study protocol.

The intervention period was from November 27, 2006, to August 31, 2007. Throughout the intervention period, we sent monthly e-mails reminding clinicians about the ARI Quality Dashboard. Beyond these e-mails, there was no coordinated effort to educate the EHR support team and no formal release of the ARI Quality Dashboard functionality to EHR users. The research team provided application and user support for the ARI Quality Dashboard.

Outcomes

The primary outcome was the antibiotic prescribing rate for ARIs, based on electronic prescribing using the EHR, in an intent-to-intervene analysis, adjusted for clustering by practice. We considered ARIs in aggregate to avoid the potential problem of “diagnosis shifting” in which clinicians might prescribe antibiotics, but select more antibiotic-appropriate diagnoses to mask inappropriate prescribing.14 Secondary outcomes included the antibiotic prescribing rate for antibiotic-appropriate diagnoses and non–antibiotic-appropriate diagnoses (see Data Collection and Analysis, below) and for individual ARI diagnoses.
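
As a rough sketch of such a cluster-adjusted, intent-to-intervene comparison, a generalized estimating equations (GEE) logistic model can be fit at the visit level with practices as clusters; the Python code below uses simulated data and assumed variable names, not the study data, and statsmodels in place of the SAS procedure used in the study.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulate visit-level data: alternating practices assigned to intervention.
rng = np.random.default_rng(0)
rows = []
for practice in range(20):
    arm = practice % 2                        # 0 = control, 1 = intervention
    practice_rate = rng.normal(0.47, 0.05)    # practice-level variation
    for _ in range(100):
        rows.append({"practice": practice,
                     "intervention": arm,
                     "antibiotic": int(rng.random() < practice_rate)})
df = pd.DataFrame(rows)

# GEE logistic regression, exchangeable correlation within practice.
model = smf.gee("antibiotic ~ intervention", groups="practice", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())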

We also performed an “as-used” analysis by comparing antibiotic prescribing between intervention clinicians who used the ARI Quality Dashboard at least once and intervention clinicians who did not, adjusted for clustering by clinician. Because of the practice-level randomization, we excluded the control clinicians from the as-used analysis.

Data Collection and Analysis

We identified ARI visits using administrative data coded using International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes. We considered antibiotic-appropriate ARI visits those with an ICD-9-CM code for pneumonia (481-486), streptococcal pharyngitis (034.0), sinusitis (461 and 473), and otitis media (381 and 382). We considered non–antibiotic-appropriate ARI visits those with an ICD-9-CM code for nonstreptococcal pharyngitis (462 and 463), influenza (487), acute bronchitis (466 and 490), and nonspecific upper respiratory infection (460, 464, and 465). These administrative data have a sensitivity of 98%, a specificity of 96%, and a positive predictive value of 96% for diagnosing ARIs compared with medical record review.15 If a patient had multiple ARI diagnoses at a visit, we counted that visit only once, giving preference to more antibiotic-appropriate diagnoses.
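
A minimal sketch of this classification, with the ICD-9-CM groupings above encoded as code prefixes and the single-count preference rule applied per visit (the helper names and visit format are illustrative):

# Three-digit ICD-9-CM prefixes from the groupings above (034.0 is a full code).
ANTIBIOTIC_APPROPRIATE = {
    "481", "482", "483", "484", "485", "486",  # pneumonia (481-486)
    "034.0",                                   # streptococcal pharyngitis
    "461", "473",                              # sinusitis
    "381", "382",                              # otitis media
}
NON_ANTIBIOTIC_APPROPRIATE = {
    "462", "463",         # nonstreptococcal pharyngitis
    "487",                # influenza
    "466", "490",         # acute bronchitis
    "460", "464", "465",  # nonspecific upper respiratory infection
}

def classify_code(icd9):
    """Return the ARI category of a single ICD-9-CM code, if any."""
    prefix = icd9.split(".")[0]
    if icd9 in ANTIBIOTIC_APPROPRIATE or prefix in ANTIBIOTIC_APPROPRIATE:
        return "antibiotic-appropriate"
    if prefix in NON_ANTIBIOTIC_APPROPRIATE:
        return "non-antibiotic-appropriate"
    return None

def classify_visit(codes):
    """Count a visit once, preferring the more antibiotic-appropriate diagnosis."""
    categories = {classify_code(c) for c in codes} - {None}
    if "antibiotic-appropriate" in categories:
        return "antibiotic-appropriate"
    if "non-antibiotic-appropriate" in categories:
        return "non-antibiotic-appropriate"
    return None

# A visit coded with both acute bronchitis (466.0) and pneumonia (481)
# counts once, as antibiotic-appropriate.
print(classify_visit(["466.0", "481"]))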

We defined antibiotic use as the EHR prescription of an orally administered antibiotic agent within 3 days of an ARI visit. We previously found that the sensitivity of EHR antibiotic prescribing (ie, the proportion of all antibiotic prescriptions that were generated using the EHR) increased rapidly from 2000 to 2003.15 During the intervention period, it was the policy of study practices that clinicians write all prescriptions using the EHR.
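
As a sketch, the visit-level outcome can then be defined by linking each ARI visit to any oral antibiotic e-prescribed within the 3-day window; the record layout below is an assumption for illustration.

from datetime import date, timedelta

def antibiotic_within_window(visit_date, prescriptions, window_days=3):
    """True if any oral antibiotic prescription falls within 3 days of the visit."""
    window_end = visit_date + timedelta(days=window_days)
    return any(rx["is_oral_antibiotic"] and visit_date <= rx["date"] <= window_end
               for rx in prescriptions)

rxs = [{"is_oral_antibiotic": True, "date": date(2007, 1, 12)}]
print(antibiotic_within_window(date(2007, 1, 10), rxs))  # True: 2 days after visit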

We considered clinicians who saw patients in both intervention and control practices (7% of 573 clinicians) to be intervention clinicians and assigned them to the intervention practices at which they had the most visits. These clinicians had an ARI Quality Dashboard use rate similar to that of clinicians overall. A secondary analysis excluding clinicians who saw patients in both the intervention and control practices did not change the results substantively. We removed data for 3 physicians who were involved in the design or implementation of the ARI Quality Dashboard.

We compared characteristics between the control and intervention practices, clinicians, and patients. In the intervention practices, we compared clinicians who used the ARI Quality Dashboard at least once with clinicians who never used the ARI Quality Dashboard.

Statistical Analysis and Power Calculation

We used standard descriptive statistics to compare clinicians and patients. To account for the level of randomization, we adjusted statistical analyses—the χ2 test for categorical variables and the t test for continuous variables—for clustering by practice using PROC GENMOD in SAS version 9.1 (SAS Institute, Inc, Cary, NC).16 For the comparison of antibiotic prescribing between intervention clinicians who did and did not use the ARI Quality Dashboard, we adjusted for clustering by clinician. Two-sided P values less than .05 were considered to be significant. Assuming a baseline antibiotic prescribing rate for ARIs of 35%, an α of .05, and an intraclass correlation coefficient of 0.10, 1798 visits in each group were required to have 80% power to detect a 7% absolute reduction in the antibiotic prescribing rate, a difference we thought would be clinically significant.15
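
As a sketch of this power calculation, the standard two-proportion sample-size formula can be inflated by the design effect 1 + (m − 1) × ICC used for cluster-randomized designs; the average cluster size m below is an assumption (it is not reported in the text), so the output only approximates the 1798-visit figure.

from math import ceil, sqrt
from statistics import NormalDist

def visits_per_group(p1, p2, alpha, power, icc, cluster_size):
    """Two-proportion sample size per group, inflated by the cluster design effect."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    n_simple = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
                / (p1 - p2) ** 2)
    return ceil(n_simple * (1 + (cluster_size - 1) * icc))

# 35% baseline rate, 7% absolute reduction, alpha .05, 80% power, ICC 0.10;
# cluster_size is an assumed average number of ARI visits per practice.
print(visits_per_group(0.35, 0.28, alpha=0.05, power=0.80, icc=0.10, cluster_size=17))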

RESULTS

Practice, Clinician, and Patient Characteristics

Practices ranged in size from 4 to 36 clinicians (mean = 18 [SD = 10]). During the 9-month intervention period, 136,633 patients made 296,548 primary care visits, including 18,488 ARI visits, to 573 clinicians (Figure 2). There was no significant difference between intervention and control practices in number of years using the EHR, mean visits per year, the baseline antibiotic prescribing rate, or the baseline antibiotic prescribing rate for ARIs (data not shown). There were no significant differences in clinician or patient characteristics between intervention and control practices (Table 1 and Table 2).

ARI Quality Dashboard Use

 