Poor user acceptance of electronic health record-based asynchronous alerts can negatively impact provider satisfaction, intentions to quit, and ultimately turnover.
Use of certain components of electronic health records (EHRs), such as EHR-based alerting systems (EASs), might reduce provider satisfaction, a strong precursor to turnover. We examined the impact of factors likely to influence providers’ acceptance of an alerting system, designed to facilitate electronic communication in outpatient settings, on provider satisfaction, intentions to quit, and turnover.
Study Design and Methods
We conducted a cross-sectional Web-based survey of EAS-related practices from a nationwide sample of primary care providers (PCPs) practicing at Department of Veterans Affairs (VA) medical facilities. Of 5001 invited VA PCPs, 2590 completed the survey.
We relied on Venkatesh’s Unified Theory of Acceptance and Use of Technology to create survey measures of 4 factors likely to impact user acceptance of EASs: supportive norms, monitoring/feedback, training, and providers’ perceptions of the value (PPOV) of EASs to provider effectiveness. Facility-level PCP turnover was measured via the VA’s Service Support Center Human Resources Cube. Hypotheses were tested using structural equation modeling.
After accounting for intercorrelations among predictors, monitoring/feedback regarding EASs significantly predicted intention to quit (b = 0.30; P <.01), and PPOV of EASs predicted both overall provider satisfaction (b = 0.58; P <.01) and facility-level provider turnover levels (b = -0.19; P <.05), all without relying on any intervening mechanisms.
Design, implementation, and use of EASs might impact provider satisfaction and retention. Institutions should consider strategies to help providers perceive greater value in these clinical tools.
Am J Manag Care. 2014;20(11 Spec No. 17):SP520-SP530
How electronic health record (EHR)-based alerting systems are implemented, accepted, and used in real-world clinical settings can impact not just quality of care, but also provider satisfaction and retention.
Retaining primary care providers (PCPs) is critical to ensuring healthcare access and quality. However, PCPs are moving to other specialty areas or leaving medicine altogether, a significant threat to high-quality care in many US regions. Recent healthcare legislation and initiatives such as the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, the Affordable Care Act of 2010, and the Patient-Centered Medical Home make explicit resource provisions such as training, additional staff, and resources to implement electronic health records (EHRs), all of which could make primary care more attractive. While PCPs’ decisions to seek alternate employment might be determined by a multitude of factors, provider dissatisfaction with the implementation and meaningful use of EHRs may pose unique retention challenges despite the HITECH Act’s strong incentives for their use.
Implementing a full-service EHR constitutes a major organizational intervention; it presents a significant change in clinician-to-clinician communication and in some instances can require additional skills beyond those needed in paper-based systems to deliver care of comparable quality. For example, providers spend an average of 49 minutes per day reviewing and responding to electronic alert notifications, yet nearly half of these notifications do not contain messages that providers perceive to contain “high value” information. This volume of information is significantly higher than what would be expected in a paper system (given the additional resources involved with physical mail and messaging), and thus requires a different work strategy for accurate and timely handling. In addition, research has also shown marked differences in workflow efficiency between paper-based and EHR-based systems; paper-based clinical information often gets “lost in the shuffle” and becomes untrackable. The needed changes in communication and workflow attributable to EHR use are significant enough that despite their benefits, 12% of pediatric urologists reported they would retire if EHR use were mandated.
Recent research has demonstrated a relationship between the use of health information technology (HIT) and physician satisfaction, although different components
of HIT have different effects on physician satisfaction. For example, compared with traditional, paper-based forms of communication, PCPs who communicated electronically with patients and other providers, and who shared their visit notes electronically with patients, were more likely to report higher satisfaction levels; in contrast, PCPs who wrote prescriptions electronically were less likely to report high satisfaction levels. Ensuring physicians’ satisfaction with their work is important because poor satisfaction often leads to several undesirable results, including turnover, mental health concerns (eg, anxiety, depression, burnout), poorer relationships with patients, and reduced quality of care.
Within the Accountable Care Organization model, EHRs are expected to facilitate communication and coordination, especially in the outpatient setting. Increasingly, practices are relying on EHR-based alerting systems (EASs) within their EHR to track, route, and communicate clinical information such as test results. This electronic communication may occur through an “asynchronous” alert notification inbox, much like email, where the sender and recipient need not be simultaneously engaged. Notifications transmitted through these systems could include test results, referrals, status updates on patients, and other provider-to-provider communications. Although many commercial EHRs already feature EAS functionality for communication, its use is expected to grow. For example, results management is one of the core EHR functionalities and a key criterion for achieving Stage 2 meaningful use due to its potential to reduce lag time in recognition and treatment of medical problems, reduce redundant testing, and improve appropriate and timely follow-up. Unlike other EHR components, however, the impact of EASs on provider satisfaction and turnover is not well documented.
PCPs’ utilization of EHR-based EASs has become an integral job characteristic, especially because PCPs utilizing EASs tend to spend significant amounts of time interfacing with them. Consequently, problems related to EASs could potentially impact PCP job attitudes such as satisfaction and intentions to quit, both known antecedents of turnover. For example, our previous research suggests providers using EASs receive an average of 56 to 63 alert notifications per day; in addition, providers do little to customize their alerts interface to optimize efficiency and effectiveness. Rather, they employ varying strategies for managing these notifications, with mixed success, leading to information overload.
PCPs have requested new visualization tools such as color coding and advanced filtering to address some of these problems. Thus, without modifications, EAS use could lead to a combination of high-volume, low-value work that could function as a driver of turnover for PCPs, rather than a source of retention. As there is no prior research exploring the relationship between EHR variables and provider outcomes, our objective was to conduct an initial examination of how providers’ perceptions of the use of EASs may impact their satisfaction and intention to quit. This research could inform strategies for altering those perceptions if needed, and can also serve as a stepping-stone for the integration of EHR research with turnover research.
To answer our research questions, we took guidance from 2 theoretical models to ultimately derive the model depicted in Figure 1. The Job Demands Resource Model of burnout (JDRM) posits that job demands (ie, aspects of the job such as, potentially, the use of EASs) that require sustained physical or mental effort and lead to increased workload can lead to negative outcomes such as low satisfaction, intentions to quit one’s job, and eventually actual turnover. The model further proposes that job resources (ie, aspects of the job that are functional in achieving work goals) that reduce job demands, or stimulate personal growth and development, relate to positive attitudinal outcomes and lower levels of withdrawal.
With respect to EASs, it is unclear whether PCPs perceive EASs as a demand or a resource. The second model, Venkatesh’s Unified Theory of Acceptance and Use of Technology (UTAUT), sheds some light on what may drive this decision for providers. The UTAUT proposes several factors that could impact the aforementioned outcomes; among these are performance expectancy (the extent to which the user believes the system will help attain gains in performance); social influence (the extent to which the user perceives that important others, such as family and friends, believe the system should be used), and facilitating conditions (the extent to which the user perceives that technical and organizational resources exist to support system use). We thus examined 4 specific examples of the types of factors proposed by Venkatesh and their impact on physician satisfaction, intention to quit, and turnover: (1) EAS-supportive norms, such as the extent to which colleagues use and see value in the notifications (an example of social influence), (2) whether providers receive feedback about their use of EASs, (3) whether providers receive training on the use of notifications (both examples of facilitating conditions), and (4) the perceived contribution of EASs to provider effectiveness (an example of performance expectancy, henceforth referred to as provider perceptions of value [PPOV]). Based on the JDRM and UTAUT, we hypothesized that each of the 4 factors will positively impact provider satisfaction, and inversely relate to intention to quit. Furthermore, intention to quit and provider satisfaction will significantly impact turnover.
The present study is part of a larger cross-sectional Web-based survey of EAS practices conducted between June and November 2010 on a nationwide sample of PCPs practicing at Department of Veterans Affairs (VA) medical facilities.
The VA is the largest integrated healthcare system in the United States and one of the most advanced in terms of fully functional EHR use, with EAS-capable EHRs having been in place at all medical facilities for almost a decade. In addition, the VA provides various national-level resources to support providers in their use of the Computerized Patient Records System (CPRS, the VA’s EHR), including nationally developed training modules, clinical application coordinators whose role is to assist clinicians in using CPRS, and a national performance measurement and quality reporting system. Despite these nationally available resources, however, previous research has shown considerable variability in the implementation of standardized national resources across VA facilities (eg, computerized clinical reminders), and in some cases this local variability has significantly impacted quality and performance. It is this local variation and degree of adaptation among facilities, despite the availability of national-level resources, which inspired the facility-level analyses in this study.
Electronic alert system features.
CPRS features an inbox-style electronic alert notification system (“View Alerts”). The View Alert window is displayed when a provider logs into CPRS, notifying the user of clinically significant events such as abnormal diagnostic test results (a full taxonomy of the available alert/notification types within CPRS has been described elsewhere). The user can customize how alerts are displayed in several ways via features such as sorting and turning off nonmandatory notifications. Alerts stay in the View Alert window for a prespecified time or until the user specifically acknowledges the alert (ie, clicks the alert to read it). Thus, the View Alert system is a system of asynchronous communication used nationally in the VA to facilitate communication among multiple members of the patient’s care team. A core set of CPRS functionalities is determined at the national VA level. In addition, individual facilities have the flexibility to alter some of the CPRS settings. For instance, some facilities opt to have providers receive a larger number of relevant alerts, while other facilities alter settings so providers only view certain types of alerts that are considered “mandatory” at the institution level.
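The acknowledgment, expiration, and filtering behavior described above can be sketched as a minimal data model. This is our own hypothetical illustration of the general mechanics of such an inbox, not CPRS internals; all field names and the retention period are assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Alert:
    # Hypothetical model of a View Alert-style notification
    patient_id: str
    kind: str                        # eg, "abnormal-lab", "consult-update"
    created: datetime
    mandatory: bool = False          # facilities can designate mandatory types
    acknowledged: bool = False       # set when the user clicks the alert

@dataclass
class AlertInbox:
    retention: timedelta             # unacknowledged alerts persist this long
    alerts: list = field(default_factory=list)
    show_nonmandatory: bool = True   # users may suppress nonmandatory types

    def visible(self, now: datetime):
        """Alerts still displayed: unacknowledged, unexpired, not filtered out."""
        return [a for a in self.alerts
                if not a.acknowledged
                and now - a.created < self.retention
                and (a.mandatory or self.show_nonmandatory)]

inbox = AlertInbox(retention=timedelta(days=14))
now = datetime(2010, 6, 1)
inbox.alerts.append(Alert("p1", "abnormal-lab", now, mandatory=True))
inbox.alerts.append(Alert("p2", "consult-update", now - timedelta(days=30)))
print(len(inbox.visible(now)))  # the 30-day-old nonmandatory alert has expired
```

The sketch captures the two pathways by which an alert leaves the inbox (explicit acknowledgment or silent expiration); the latter is one reason unacknowledged alerts can represent missed information.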
Participants and Procedure
Details of the survey’s development are reported elsewhere. In brief, using a nationwide VA administrative database (VA Primary Care Management Module), we invited all VA PCPs with a minimum practice panel size of 250 patients (N = 5001). Of 5001 PCPs invited, 2590 (51.8%) responded, representing data from 131 different VA facilities; the distribution of respondents within a facility across all 131 facilities is displayed in the accompanying figure. Respondents were 55.4% female, 31.1% nonwhite, and 31.5% nonphysician providers (eg, physician assistants, nurse practitioners); 82.1% had 2 or more years in VA practice. Within VA primary care, nonphysician providers behave largely as physicians do: they have their own patient panels, do at least 85% of the same work as physicians (differences being largely administrative), and use CPRS in largely the same way. Consequently, physicians and nonphysician providers were treated as a single population and identified as PCPs for study purposes.
Our study was reviewed and approved by our local institutional review board. Participants were recruited as follows: we first asked chiefs of primary care at each facility to email information about the project and the upcoming survey to the PCPs at their respective sites. We subsequently invited all participants via a personalized email from the study’s principal investigator; this email described the study and provided a link to the Web-based survey. To increase response rates, invitation emails and subsequent reminders were followed by telephone attempts to reach nonrespondents.
Table 1 contains a list of constructs, construct definitions, sample items, and response scales used for the current study.
User Acceptance Factors.
Measures for EAS Supportive Norms, Monitoring/Feedback and Training Infrastructure, and PPOV were developed specifically for this study based on a literature review and refined through pilot-testing with PCPs. Descriptive statistics, correlations, and reliability coefficients for these measures appear in Table 3. Details on the development of the survey instrument are reported elsewhere.
Provider Satisfaction and Intention to Quit.
Satisfaction and intention to quit were each measured via a single survey item, as suggested by Cortese and Quaglino.
Facility-level voluntary turnover rates for 2010 were obtained from the Veterans Health Administration Service Support Center Human Resources Cube, a large administrative database repository maintained centrally by the VA.
Because our primary outcome, turnover, is a facility-level variable, it was necessary to aggregate all predictors (monitoring/feedback, supportive norms, PPOV, training, intention to quit, and provider satisfaction) to the facility level (n = 131) in order to assess their impact on facility-level turnover. To test whether within-facility responses were sufficiently homogeneous to justify aggregation, we calculated rwg, a measure of interrater agreement, for each relevant variable for each facility. The average rwg score was 0.71, suggesting sufficient agreement to warrant aggregation.
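The agreement check above can be sketched as follows. This assumes the single-item James et al. (1984) rwg index, which compares observed within-facility variance against the variance expected under a uniform (no-agreement) response distribution; the formula variant, the 5-point scale, and the example ratings are our assumptions, not details reported in the article.

```python
import statistics

def rwg(ratings, num_options):
    """Single-item interrater agreement index (James et al., 1984 variant).

    rwg = 1 - (observed within-group variance / uniform-null variance),
    where the uniform-null variance for A response options is (A^2 - 1) / 12.
    """
    observed_var = statistics.pvariance(ratings)      # within-facility variance
    expected_var = (num_options ** 2 - 1) / 12.0      # variance if raters answered at random
    return 1 - (observed_var / expected_var)

# Hypothetical facility: five PCPs rating one 5-point survey item
print(round(rwg([4, 4, 5, 4, 4], num_options=5), 2))  # high agreement, near 1
```

Values near 1 indicate near-identical responses within a facility; by convention, values above roughly 0.70 (such as the average of 0.71 reported here) are taken to justify aggregating individual responses to the group level.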
We used structural equation modeling to test our hypothesized path model. Figure 1 presents the model tested; purple lines and green lines denote the initial model; red and green lines denote the final, best-fitting model. We additionally computed simple bivariate correlations to further explore the data. All analyses were conducted using SPSS Amos 17.
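The analyses were run in SPSS Amos, which we do not reproduce here. As a rough, hypothetical illustration of the underlying logic, a recursive path model with observed variables can be estimated equation-by-equation with ordinary least squares; the data below are synthetic, and the variable names and generating coefficients (chosen to echo the paper's reported paths) are our own.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 131  # one row per facility, as in the aggregated analysis

# Synthetic, standardized facility-level predictors (names are ours)
ppov = rng.standard_normal(n)        # perceived value of EASs
feedback = rng.standard_normal(n)    # monitoring/feedback

# Synthetic outcomes constructed to mirror the signs of the final model
satisfaction = 0.58 * ppov + 0.8 * rng.standard_normal(n)
quit_intent = 0.30 * feedback + 0.9 * rng.standard_normal(n)
turnover = -0.19 * ppov + rng.standard_normal(n)

def path_coef(y, x):
    """OLS slope (with intercept) for a single-predictor path."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Recovers a slope close to the generating 0.58, up to sampling error
print(round(path_coef(satisfaction, ppov), 2))
```

A full SEM additionally estimates all paths and predictor covariances simultaneously and yields global fit indices such as RMSEA; this sketch only conveys what a single direct path coefficient represents.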
Table 3 presents means, standard deviations, and correlations among study variables. Of note, bivariate correlations indicated a significant positive relationship between intention to quit and facility-level turnover (r = 0.169; P <.05), and a significant negative relationship between provider satisfaction and facility-level turnover (r = -0.167; P <.05). Additionally, supportive norms and PPOV each correlated with provider satisfaction (r = 0.286; P <.01 and r = 0.495; P <.01, respectively) and intention to quit (r = -0.170; P <.05 and r = -0.383; P <.01, respectively), whereas monitoring/feedback correlated only with intention to quit (r = 0.185; P <.05). These significant correlations suggest testing a complete model is warranted.
Test of Hypothesized Model
Initial model fit.
The hypothesized relationships (ie, each factor independently predicts provider satisfaction and intention to quit, both of which intercorrelate and in turn predict turnover, depicted in Figure 1 in the purple and green lines) resulted in poor fit (RMSEA = 0.21, PCLOSE <.001) when tested as a cohesive model. Of note, facility-level turnover was unrelated to provider satisfaction or intentions to quit.
An important feature of the original model is that the factors in the model were considered orthogonal, independent predictors of satisfaction, intention to quit, and turnover. Bivariate correlations, however, suggested this was an incorrect assumption. Consequently, based on the initial model results and the simple bivariate correlations, we trimmed unnecessary relationships from the model and allowed the predictors to covary. The resulting model showed good fit (RMSEA = 0.04, PCLOSE = 0.47), and is presented in Figure 1 (depicted by the green and red lines). As can be seen from the figure, and consistent with the bivariate correlation analyses, the 4 factors are significantly correlated, and thus cannot be treated as independent predictors of provider satisfaction. After accounting for intercorrelations among the independent variables, monitoring/feedback significantly predicted intention to quit (b = 0.30; P <.01), and PPOV predicted both provider satisfaction (b = 0.58; P <.01) and facility-level turnover (b = -0.19; P <.05), all without relying on either provider satisfaction or intention to quit as intermediary mechanisms. Of note, high levels of monitoring and feedback were associated with greater intentions to quit.
This study sought to examine the impact of user acceptance factors of electronic health record-based alert notification systems on the satisfaction, intentions to quit, and turnover of providers who used them. Contrary to existing theory (both the JDRM and the UTAUT), we found that monitoring/feedback on EAS practices, training on the use of EASs, and supportive norms about EASs had little impact on provider satisfaction. However, monitoring/feedback was associated with increased intention to quit.
Our results suggest that EASs, and by extension EHRs, could become catalysts for turnover unless providers clearly understand their value to delivering high-quality care effectively and efficiently. As evidenced by the nonsignificant relationship between monitoring/feedback and provider satisfaction, as well as the nonsignificant relationships between training and both satisfaction and intention to quit, our data suggest that the aforementioned facilitating conditions may be insufficient to accomplish this goal, though we have no specific details in our data about the quality of the feedback or training. More importantly, when providers do not perceive the value of these electronic aids to their practice, they might become dissatisfied with their work environment and potentially seek work elsewhere.
EASs likely represent one of the most frustrating components of EHRs for providers: compared with paper communication systems, they are perceived to “increase the number of work items, inflate the time to process each, and divert work previously done by office staff to them.” Other work has shown that providers perceive many of the alerts they receive to be unnecessary, and has documented variable physician acceptance of features like computerized reminders and electronic alerts. Therefore, future work should target the problem from multiple angles, such as content and design of feedback, effectiveness of training, and social influence factors, in addition to already ongoing efforts to optimize EAS design, so that it is inherently perceived as valuable by providers. The United States already has a shortage of primary care providers, and research shows dissatisfied providers are leaving primary care for other specialties or leaving medicine completely.
Several possible reasons might explain the positive effect of monitoring/feedback on intention to quit. First, participants might have reacted more strongly to the monitoring aspect than to the feedback aspect of this construct. Second, the nature of the feedback provided could minimize feedback’s impact on satisfaction; feedback characteristics can have a significant impact on its effectiveness at changing cognitions and behavior. Our ongoing research in another domain has found that feedback is often delivered primarily via written reports providing only numeric scores without correct solution information (one of the most powerful single characteristics of feedback interventions). Third, both feedback delivery mechanisms and providers’ perceptions of being monitored constantly by the organization could have led to the observed result.
In contrast, PPOV showed a direct positive relationship to provider satisfaction (providers who perceived greater value in electronic notifications were more likely to be satisfied); a direct negative relationship to turnover (providers who perceived greater value in electronic notifications were less likely to quit); and an indirect link to intention to quit via provider satisfaction (providers who perceived greater value in alert notifications were more likely to be satisfied, and in turn less likely to express intentions to quit). The relationship between provider satisfaction and intention to quit is not surprising, as it has been well documented in the literature. The more novel finding in this research is the direct, negative relationship between PPOV and turnover (ie, providers at facilities with higher provider turnover rates have lower perceptions of value for EASs). We are not aware of any studies directly linking these types of perceptions to turnover, particularly at the organizational level with a national sample as large as this one: 2590 respondents at 131 facilities. From a scientific perspective, this finding links the JDRM and UTAUT: if users do not perceive EASs to be of value, EASs are more likely to be considered a demand rather than a resource (and thereby less likely to be accepted), thus leading to increased turnover. From a practical perspective, knowing that EASs must be perceived as performance enhancing by physicians in order not to negatively affect turnover should signal facility leadership to take care regarding how such systems are designed, marketed within the facility, and supported.
In addition to this important finding, we are also not aware of any studies simultaneously examining the effects of satisfaction, intention to quit, and turnover in the healthcare setting. Understanding the interrelationships among user acceptance of technological tools intended to help providers, factors that impact this acceptance, and provider outcomes can help the design and implementation of HIT tools with which providers will want to work.
First, the study was conducted within the VA system, one of the largest and most sophisticated healthcare systems in the United States. Hence, in the spirit of constructive replication and to enhance external validity, we recommend that our findings be replicated in subsequent studies at other facilities without a centralized organizational structure. Nevertheless, alerting systems such as the one we studied are increasingly used across commercial EHRs. Second, the structure of the archival turnover data obtained for this study limits turnover analysis over time and prevents the application of statistical techniques such as survival analysis that yield the most informative results for turnover-type data. Third, our sample consisted of employees who were all using the same EAS-capable EHR, limiting our ability to generalize results to other commonly used EAS-capable EHR systems (eg, Epic Systems, Verona, Wisconsin) that are officially certified (by an Office of the National Coordinator for Health Information Technology-Authorized Testing and Certification Body). Hence, we recommend that future research focus on more heterogeneous samples, examining different types of EASs and EHRs. Finally, although this study identified a new, very specific source of dissatisfaction and potential turnover among providers, future studies should examine the incremental contribution of this source in the context of more traditional predictors of provider satisfaction such as supervisory relations, availability of resources, and work environment conditions. We further encourage future research to closely investigate how providers’ perceptions of EHR variables develop over time, and whether system characteristics or more distal factors (eg, supervisory behavior) impact these perceptions.
We conclude that effective design and implementation of EHR-based notification systems can no longer simply be assumed to guarantee efficiency, safety, or quality of care; how these systems are implemented, accepted, and used in real-world practice, as our research shows, might also impact provider satisfaction and retention. Given the recent HITECH stimulus and the new healthcare law, EHRs will be a reality nationwide in a few short years and will connect members of the healthcare team like never before. In fact, one reason for the heavy emphasis on EHR adoption is to improve communication. Depending on how the EHR is designed and implemented, it can become a source of competitive advantage (or turnover) for clinical practices. In addition, how an organization creates and manages its internal policies can make or break both the safety and efficiency of the clinicians’ work. As EHRs become more widespread and providers increasingly communicate clinical information through EASs, institutions should consider strategies to help providers perceive greater value in these vital clinical tools.
Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey VA Medical Center, Houston, TX; Baylor College of Medicine, Houston, TX (SJH, DE, HS); University of Houston, TX (CS); University of Texas Health Science Center at Houston (DFS).
Source of Funding:
This work was supported by the VA National Center for Patient Safety and partially supported by the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development, and the Center for Innovations in Quality, Effectiveness and Safety (#CIN 13-413).
The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.
Concept and design (SJH, CS, HS, DFS); acquisition of data (SJH, CS, DE, DFS, HS); analysis and interpretation of data (SJH, CS, DE, DFS, HS); drafting of the manuscript (SJH, CS, DE, DFS, HS); critical revision of the manuscript for important intellectual content (SJH, CS, DFS, HS); statistical analysis (SJH, CS); obtaining funding (HS); administrative, technical or logistic support (SJH, CS, DE, DFS); and supervision (HS).
Address correspondence to:
Sylvia J. Hysong, PhD, Center for Innovations in Quality, Effectiveness and Safety (152), Michael E. DeBakey VA Medical Center, 2002 Holcombe Blvd, Houston, TX 77030. E-mail: firstname.lastname@example.org.
1. Felice ME. Reflections on why pediatrics does not have a primary care physician shortage at present.
2. Association of American Medical Colleges. Physician Shortages to Worsen Without Increases in Residency Training. 2010. https://www. aamc.org/download/15316
/data/physician_shortages_to_worsen_ without_increases_in_residency_tr.pdf. Published 2010. Accessed April 26, 2013.
3. Landon BE, Reschovsky JD, Pham HH, Blumenthal D. Leaving medicine: the consequences of physician dissatisfaction. 2006;44:234-242.
4. Rabinowitz HK, Diamond JJ, Markham FW, Paynter NP. Critical factors for designing programs to increase the supply and retention of rural primary care physicians. 2001;286:1041-1048.
N Engl J Med.
5. Blumenthal D. Stimulating the adoption of health information technology. 2009;360:1477-1479.
6. Singer S, Shortell SM. Implementing accountable care organizations: ten potential mistakes and how to learn from them. 2011;306: 758-759.
N Engl J Med.
7. Barnes KA, Kroening-Roche JC, Comfort BW.The developing vision of primary care. 2012;367:891-893.
Handbook of Industrial and Organizational Psychology
Health Care Manage Rev.
8. Johns G.The psychology of lateness, absenteeism, and turnover. In: Anderson N, Ones DS, Sinangil HK, Viswesvaran C, eds. . London: Sage; 2001:232-252. 9. Menachemi N, PowersTL, Brooks RG.The role of information tech- nology usage in physician practice satisfaction. 2009;34:364-371.
10. Sittig DF, Singh H. Rights and responsibilities of users of electronic health records. 2012;184:1479-1483.
11. Ryan AM, Bishop TF, Shih S, Casalino LP. Small Physician Practices In NewYork Needed Sustained HelpTo Realize Gains In Quality From Use Of Electronic Health Records. 2013;32:53-62.
Am J Med.
12. Murphy DR, Reis B, Sittig DF, Singh H. Notifications received by primary care practitioners in electronic health records: a taxonomy and time analysis. 2012;125:209-207.
Am J Manag Care.
13. Jha AK, Burke MF, DesRoches C et al. Progress toward meaningful use: hospitals’ adoption of electronic health records. 2011;17:SP117-SP124.
Int J Med Inform.
14. McAlearney AS, Robbins J, Hirsch A, Jorina M, Harrop JP. Perceived efficiency impacts following electronic health record implementation: an exploratory study of an urban community health center network. 2010;79:807-816.
Ann Intern Med.
15. Chaudhry B, Wang J, Wu S et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. 2006;144:742-752.
Ann Fam Med.
16. Elder NC, Vonder MM, Cassedy A. The identification of medical errors by family physicians during outpatient visits. 2004;2: 125-129.
Qual Saf Health Care.
17. Hickner J, Graham DG, Elder NC et al.Testing process errors and their harms and consequences reported from family medicine practices: a study of the American Academy of Family Physicians National Research Network. 2008;17:194-200.
18. Canon SJ, Purifoy JA, Heulitt GM et al. Results: Survey of pediatric urology electronic medical records-use and perspectives. 2011; 186:1740-1744.
Ann Intern Med.
19. Delbanco T, Walker J, Bell SK et al. Inviting patients to read their doctors’ notes: a quasi-experimental study and a look ahead. 2012;157:461-470.
Perspect Health Inf Manag.
20. Elder KT, Wiltshire JC, Rooks RN, Belue R, Gary LC. Health information technology and physician career satisfaction. 2010;7.
Health Care Manage Rev.
21. Williams ES, Skinner AC. Outcomes of physician job satisfaction: a narrative review, implications, and directions for future research. 2003;28:119-139.
22. Bitton A, Flier LA, Jha AK. Health information technology in the era of care delivery reform: to what end? 2012;307:2593-2594.
23. Singh H,Thomas E, Mani S et al.Timely follow-up of abnormal diagnostic imaging test results in an outpatient setting: are elec-
Arch Intern Med.
tronic medical records achieving their potential? 2009;169:1578-1586.
24. Committee on Patient Safety and Health InformationTechnology, Board on Health Care Services. Health IT and Patient Safety: Building Safer Systems for Better Care. Washington, DC: National Academies Press; 2011. RefType: Report
25. US Department of Health and Human Services (HHS). Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology. 2012;77:13845. 45 CFR Part 170, RIN 0991-AB82.
26. Best R, Hysong SJ, Moore FI, Pugh JA. Evidence-based approaches to primary care staffing [final report 01-185]. San Antonio, TX: Veterans Evidence-Based Research Dissemination and Implementation Center; 2005.
27. Hysong SJ, Best RG, Pugh JA, Moore FI. Are we underutilizing the talents of primary care personnel? a job analytic examination. Implement Sci. 2007;2:1-13.
28. Hysong SJ, Amspoker A, Khan M, Johnson K, Gribble G. VISN 6 Ambulatory Care System Redesign Improvement Capability Project: Evaluation. Final report for fiscal year 2011 to the VA Mid Atlantic Health Care Network; September 21, 2011.
29. Harrison DA, Newman DA, Roth PL. How important are job attitudes? meta-analytic comparisons of integrative behavioral outcomes and time sequences. Acad Manag J. 2006;49:305-325.
30. Singh H, Spitzmueller C, Petersen NJ, Sawhney MK, Sittig DF. Information overload and missed test results in electronic health record-based settings. JAMA Intern Med. 2013;1-3.
31. Hysong SJ, Sawhney M, Wilson L, et al. Provider management strategies of abnormal test result alerts: a cognitive task analysis. J Am Med Inform Assoc. 2010;17:71-77.
32. Singh H, Spitzmueller C, Petersen N, Sawhney M, Sittig D. Socio-technical predictors of missed test results in EHR-based settings: a national survey of primary care practitioners. Arch Intern Med. In press.
33. Singh H, Spitzmueller C, Petersen NJ, et al. Primary care practitioners’ views on test result management in EHR-enabled health systems: a national survey. J Am Med Inform Assoc. 2012.
34. Demerouti E, Bakker AB, Nachreiner F, Schaufeli WB. The job demands-resources model of burnout. J Appl Psychol. 2001;86:499-512.
35. Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. MIS Q. 2003;27:425-478.
36. Hsiao C, Beatty PC, Hing ES, et al. Electronic medical record/electronic health record use by office-based physicians: United States, 2008 and preliminary 2009. Hyattsville, MD: Division of Health Care Statistics, National Center for Health Statistics; December 1, 2009.
37. Longman P. Best Care Anywhere: Why VA Health Care Would Work Better for Everyone. 3rd ed. San Francisco, CA: Berrett-Koehler Publishers; 2012.
38. Hynes DM, Whittier ER, Owens A. Health information technology and implementation science: partners in progress in the VHA. Med Care. 2013;51:S6-S12.
39. Fung CH, Woods JN, Asch SM, Glassman P, Doebbeling BN. Variation in implementation and use of computerized clinical reminders in an integrated healthcare system. Am J Manag Care. 2004;10:878-885.
40. Hysong SJ, Pugh JA, Best RG. Clinical practice guideline implementation patterns in VHA outpatient clinics. Health Serv Res. 2007;42:84-103.
41. Brown SH, Lincoln MJ, Groen PJ, Kolodner RM. VistA: U.S. Department of Veterans Affairs national-scale HIS. Int J Med Inform. 2003;69:135-156.
42. Best RG, Hysong SJ, Pugh JA, Ghosh S, Moore FI. Task overlap among primary care team members: opportunity for system redesign? J Healthc Manag. 2006;51:295-307.
43. Ash JS, Gorman PN, Seshadri V, Hersh WR. Computerized physician order entry in U.S. hospitals: results of a 2002 survey. J Am Med Inform Assoc. 2004;11:95-99.
44. Boohaker EA, Ward RE, Uman JE, McCarthy BD. Patient notification and follow-up of abnormal test results: a physician survey. Arch Intern Med. 1996;156:327-331.
45. Campbell EG, Regan S, Gruen RL, et al. Professionalism in medicine: results of a national survey of physicians. Ann Intern Med. 2007;147:795-802.
46. Cutler DM, Feldman NE, Horwitz JR. U.S. adoption of computerized physician order entry systems. Health Aff (Millwood). 2005;24:1654-1663.
47. Jha AK, Ferris TG, Donelan K, et al. How common are electronic health records in the United States? A summary of the evidence. Health Aff (Millwood). 2006;25:w496-w507.
48. Lyons SS, Tripp-Reimer T, Sorofman BA, et al. VA QUERI informatics paper: information technology for clinical guideline implementation: perceptions of multidisciplinary stakeholders. J Am Med Inform Assoc. 2005;12:64-71.
49. Poon EG, Gandhi TK, Sequist TD, Murff HJ, Karson AS, Bates DW. “I wish I had seen this test result earlier!”: dissatisfaction with test result management systems in primary care. Arch Intern Med. 2004;164:2223-2228.
50. Wahls TL, Cram PM. The frequency of missed test results and associated treatment delays in a highly computerized health system. BMC Fam Pract. 2007;8:32.
51. Cortese C, Quaglino G. The measurement of job satisfaction in organizations: a comparison between a facet scale and a single-item measure. Testing, Psychometrics, Methodology in Applied Psychology. 2006;13:305-316.
52. James LR, Demaree RG, Wolf G. Estimating within-group interrater reliability with and without response bias. J Appl Psychol. 1984;69:85-98.
53. SPSS [computer program]. Version 17.0. Chicago, IL: SPSS Inc; 2006.
54. Russ AL, Zillich AJ, McManus MS, Doebbeling BN, Saleem JJ. Prescribers’ interactions with medication alerts at the point of prescribing: a multi-method, in situ investigation of the human-computer interaction. Int J Med Inform. 2012;81:232-243.
55. Saleem JJ, Russ AL, Justice CF, et al. Exploring the persistence of paper with the electronic health record. Int J Med Inform. 2009;78:618-628.
56. Saleem JJ, Russ AL, Neddo A, Blades PT, Doebbeling BN, Foresman BH. Paper persistence, workarounds, and communication breakdowns in computerized consultation management. Int J Med Inform. 2011;80:466-479.
57. McDonald CJ, McDonald MH. Electronic medical records and preserving primary care physicians’ time: comment on “electronic health record-based messages to primary care providers”. Arch Intern Med. 2012;172:285-287.
58. Hysong SJ, Sawhney MK, Wilson L, et al. Understanding the management of electronic test result notifications in the outpatient setting. BMC Med Inform Decis Mak. 2011;11:22.
59. Fung CH, Tsai JS, Lulejian A, et al. An evaluation of the Veterans Health Administration’s clinical reminders system: a national survey of generalists. J Gen Intern Med. 2008;23:392-398.
60. Hysong SJ. Meta-analysis: audit and feedback features impact effectiveness on care quality. Med Care. 2009;47:356-363.
61. Kluger AN, DeNisi A. The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull. 1996;119:254-284.
62. Hysong SJ, Teal CR, Khan MJ, Haidet P. Improving quality of care through improved audit and feedback. Implement Sci. 2012;7:45.
63. Conway N, Briner RB. Full-time versus part-time employees: understanding the links between work status, the psychological contract, and attitudes. J Vocat Behav. 2002;61:279-301.
64. Turnley WH, Feldman DC. Re-examining the effects of psychological contract violations: unmet expectations and job dissatisfaction as mediators. J Organ Behav. 2000;21:25-42.
65. Hysong SJ, Best RG, Bollinger M. The impact of VA’s intramural research program on physician recruitment and retention. Poster presented at: Special Network Directors’ Poster Session, 2007 Annual Meeting of the Veterans Administration Health Services Research and Development Service; 2007.
66. Buchbinder SB, Wilson M, Melick CF, Powe NR. Primary care physician job satisfaction and turnover. Am J Manag Care. 2001;7:701-713.