Electronic Alerts and Clinician Turnover: The Influence of User Acceptance

Sylvia J. Hysong, PhD; Christiane Spitzmuller, PhD; Donna Espadas, BS; Dean F. Sittig, PhD; and Hardeep Singh, MD, MPH
Poor user acceptance of electronic health record-based asynchronous alerts can erode provider satisfaction, increase intentions to quit, and, ultimately, contribute to turnover.
The VA is the largest integrated healthcare system in the United States and one of the most advanced in terms of fully functional EHR use,36 with EAS-capable EHRs having been in place at all medical facilities for almost a decade.37 In addition, the VA provides various national-level resources to support providers in their use of the Computerized Patient Record System (CPRS, the VA’s EHR), including nationally developed training modules, clinical application coordinators whose role is to assist clinicians in using CPRS, and a national performance measurement and quality reporting system. Despite these nationally available resources, however, previous research has shown considerable variability in how standardized national resources are implemented across VA facilities38 (eg, computerized clinical reminders),39 and in some cases this local variability has significantly affected quality and performance.40 It is this local variation and degree of adaptation among facilities, despite the availability of national-level resources, that inspired the facility-level analyses in this study.

Electronic alert system features. CPRS features an inbox-style electronic alert notification system (“View Alerts”).41 The View Alert window is displayed when a provider logs into CPRS, notifying the user of clinically significant events such as abnormal diagnostic test results (see Figure 2; a full taxonomy of the available alert/notification types within CPRS has been described elsewhere).12 The user can customize how alerts are displayed in several ways, via features such as sorting and turning off nonmandatory notifications.31 Alerts stay in the View Alert window for a prespecified time or until the user specifically acknowledges the alert (ie, clicks the alert to read it). Thus, View Alert is an asynchronous communication system used nationally in the VA to facilitate communication among the multiple members of a patient’s care team. A core set of CPRS functionalities is determined at the national VA level; in addition, individual facilities have the flexibility to alter some CPRS settings. For instance, some facilities opt to have providers receive a larger number of relevant alerts, whereas others alter settings so providers view only those alert types considered “mandatory” at the institution level. A sketch of this configuration model follows.
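To make the configuration model concrete, the following minimal Python sketch shows one way an inbox-style alert system with nationally mandated alert types, facility-level opt-outs for nonmandatory notifications, and acknowledge-or-expire retention could be structured. The type names, retention window, and opt-out mechanics here are illustrative assumptions, not the actual CPRS implementation.

from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum

class AlertType(Enum):
    ABNORMAL_RESULT = "abnormal diagnostic test result"
    IMAGING_RESULT = "imaging result"
    ADMIN_NOTE = "administrative notification"

# Hypothetical national core: alert types no facility may disable.
NATIONAL_MANDATORY = {AlertType.ABNORMAL_RESULT}

@dataclass
class Alert:
    alert_type: AlertType
    patient_id: str
    created: datetime
    acknowledged: bool = False

@dataclass
class FacilityConfig:
    # Facilities may suppress nonmandatory types but not mandatory ones.
    disabled_types: set = field(default_factory=set)
    retention: timedelta = timedelta(days=14)  # illustrative retention window

    def is_visible(self, alert: Alert, now: datetime) -> bool:
        if (alert.alert_type in self.disabled_types
                and alert.alert_type not in NATIONAL_MANDATORY):
            return False  # facility opted out of this nonmandatory type
        # Alerts persist until acknowledged or until the window lapses.
        return not alert.acknowledged and now - alert.created < self.retention

def view_alerts(inbox: list, config: FacilityConfig, now: datetime) -> list:
    """Return the alerts a provider would see at login, newest first."""
    return sorted(
        (a for a in inbox if config.is_visible(a, now)),
        key=lambda a: a.created,
        reverse=True,
    )

In this sketch, only nonmandatory alert types can be suppressed by the facility configuration, mirroring the article’s description of settings that vary by institution atop a fixed national core.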

Participants and Procedure

Details of the survey’s development are reported elsewhere.32 In brief, using a nationwide VA administrative database (the VA Primary Care Management Module), we invited all VA primary care providers (PCPs) with a practice panel of at least 250 patients (N = 5001); 2590 (51.8%) responded, representing 131 different VA facilities. Figure 3 displays the distribution of respondents per facility across all 131 facilities. Respondents were 55.4% female, 31.1% nonwhite, and 31.5% nonphysician providers (eg, physician assistants, nurse practitioners); 82.1% had 2 or more years in VA practice (Table 1). Within VA primary care, nonphysician providers function largely as physicians do: they have their own patient panels, do at least 85% of the same work as physicians (the differences being largely administrative),42 and use CPRS in largely the same way. Consequently, physicians and nonphysician providers were treated as a single population and identified as PCPs for study purposes.


Our study was reviewed and approved by our local institutional review board. Participants were recruited as follows: we first asked the chiefs of primary care at each facility to email information about the project and the upcoming survey to the PCPs at their respective sites. We then invited all participants via a personalized email from the study’s principal investigator; this email described the study and provided a link to the Web-based survey. To increase response rates, invitation emails and subsequent reminders were followed by telephone attempts to reach nonrespondents.

Measures

Table 2 contains a list of constructs, construct definitions, sample items, and response scales used for the current study.

User Acceptance Factors. Measures for EAS Supportive Norms, Monitoring/Feedback and Training Infrastructure, and PPOV were developed specifically for this study based on a literature review43-50 and refined through pilot-testing with PCPs. Descriptive statistics, correlations, and reliability coefficients for these measures appear in Table 3. Details on the development of the survey instrument are reported elsewhere.33

Provider Satisfaction and Intention to Quit. Satisfaction and intention to quit were each measured via a single survey item, as suggested by Cortese and Quaglino.51

Turnover. Facility-level voluntary turnover rates for 2010 were obtained from the Veterans Health Administration Service Support Center Human Resources Cube, a large administrative database repository maintained centrally by the VA.

Data Analysis

Facility-Level Aggregation. Because our primary outcome, turnover, is a facility-level variable, all predictors (monitoring/feedback, supportive norms, PPOV, training, intention to quit, and provider satisfaction) had to be aggregated to the facility level (n = 131) to assess their impact on facility-level turnover. To test whether within-facility responses were sufficiently homogeneous to justify aggregation, we calculated rwg, a measure of interrater agreement,52 for each relevant variable for each facility. The average rwg was 0.71, suggesting sufficient agreement to warrant aggregation.
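For a single item rated on an A-point scale, the standard single-item rwg compares the observed within-group variance with the variance expected if raters responded uniformly at random: rwg = 1 - s^2 / sigma_EU^2, where sigma_EU^2 = (A^2 - 1)/12. The Python sketch below illustrates that calculation and the subsequent facility-level averaging; it follows the common James-Demaree-Wolf formulation, which may differ in detail from the study’s exact computation, and the data shown are hypothetical.

import numpy as np
import pandas as pd

def rwg(ratings, n_options):
    """Single-item interrater agreement for one facility.

    Returns 1 - s^2 / sigma_EU^2, where sigma_EU^2 = (A^2 - 1) / 12 is the
    variance expected if raters answered uniformly at random on an A-point
    scale. Values near 1 indicate high within-facility agreement.
    """
    s2 = np.var(ratings, ddof=1)            # observed within-facility variance
    sigma2_eu = (n_options ** 2 - 1) / 12   # null (uniform-response) variance
    return 1 - s2 / sigma2_eu

# Hypothetical provider-level data: one satisfaction rating per PCP.
df = pd.DataFrame({
    "facility": ["A", "A", "A", "B", "B", "B"],
    "satisfaction": [4, 4, 5, 2, 3, 2],      # 5-point scale
})

# Agreement per facility; aggregate (mean) only where agreement is adequate.
agreement = df.groupby("facility")["satisfaction"].apply(rwg, n_options=5)
facility_means = df.groupby("facility")["satisfaction"].mean()
print(agreement, facility_means, sep="\n")   # rwg ~0.83 for both facilities here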

Model Test. We used structural equation modeling to test our hypothesized path model. Figure 1 presents the model tested: purple and green lines denote the initial model; red and green lines denote the final, best-fitting model. We additionally computed simple bivariate correlations to further explore the data. All analyses were conducted using SPSS Amos 17.53
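The authors fit the model in SPSS Amos 17; purely as an illustration, an analogous path model could be specified in Python with the open-source semopy package, which uses lavaan-style model syntax. The variable names below are placeholders for the study constructs, and facility_level_data.csv is a hypothetical file of the aggregated facility-level scores.

import pandas as pd
import semopy

# Hypothesized model: each user-acceptance factor predicts satisfaction and
# intention to quit, which intercorrelate and in turn predict turnover.
desc = """
satisfaction ~ supportive_norms + monitoring_feedback + training + ppov
intent_to_quit ~ supportive_norms + monitoring_feedback + training + ppov
turnover ~ satisfaction + intent_to_quit
satisfaction ~~ intent_to_quit
"""

data = pd.read_csv("facility_level_data.csv")  # hypothetical aggregated data (n = 131)
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())             # path coefficients and standard errors
print(semopy.calc_stats(model))    # fit indices, including RMSEA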

RESULTS

Descriptive Statistics

Table 3 presents means, standard deviations, and correlations among study variables. Of note, bivariate correlations indicated a significant positive relationship between intention to quit and facility-level turnover (r = 0.169, P <.05) and a significant negative relationship between provider satisfaction and facility-level turnover (r = –0.167, P <.05). Additionally, supportive norms and PPOV each correlated with provider satisfaction (r = 0.286, P <.01 and r = 0.495, P <.01, respectively) and with intention to quit (r = –0.170, P <.05 and r = –0.383, P <.01, respectively), whereas monitoring/feedback correlated only with intention to quit (r = 0.185, P <.05). These significant correlations suggest that testing the complete model is warranted.


Test of Hypothesized Model

Initial model fit. The hypothesized relationships (ie, each factor independently predicts provider satisfaction and intention to quit, which intercorrelate and in turn predict turnover; depicted by the purple and green lines in Figure 1) resulted in poor fit when tested as a cohesive model (RMSEA = 0.21, PCLOSE <.001). Of note, facility-level turnover was unrelated to either provider satisfaction or intention to quit.

