The American Journal of Managed Care
Special Issue: Health Information Technology - Guest Editors: Michael F. Furukawa, PhD; and Eric Poon, MD, MPH
Volume 17
Issue SP

Qualitative Evaluation to Explain Success of Multifaceted Technology-Driven Hypertension Intervention

A qualitative process evaluation attributes the success of a technology-driven hypertension intervention to the combination of multiple intervention components framed as quality improvement.

Objectives:

This study sought to examine the implementation of an electronic health record-based intervention to improve quality of hypertension care in community health centers. The primary goal was to use qualitative analysis to explain how different components of the intervention contributed to positive patient-level outcomes.

Study Design:

Qualitative process evaluation.

Methods:

The intervention included alerts, order sets, templates, clinical reminder algorithms, and provider performance feedback. Semi-structured interviews were conducted with primary care providers before (n = 16) and after (n = 16) intervention, and with key staff and leadership involved in the implementation (n = 6). The research team applied an iterative systematic qualitative coding process to identify salient themes. Several constructs from IT implementation theories guided the analysis.

Results:

The analysis focused on: (1) satisfaction and perceived usefulness of intervention components, (2) perceived proximal changes resulting from intervention, and (3) perceived facilitators of change. Different participants found different components useful. Proximal impact manifested in multiple ways (eg, more aggressive follow-up appointments and prescribing) and in increased overall attention to hypertension. Facilitators of success included leadership, organizational culture, provider engagement, rigorous implementation process, framing of intervention as quality improvement (QI), and health center capacity to process data.

Conclusions:

We attribute the success of the intervention to a multifaceted approach in which the combination of multiple intervention components resulted in across-the-board change in hypertension care practices. In contrast with research that attempts to isolate the impact of circumscribed health information technology (HIT) tools, our experience suggests that HIT can achieve success in patient outcomes when rigorously implemented as a multifaceted intervention and framed as a QI activity.

(Am J Manag Care. 2011;17(12 Spec No.):SP95-SP102)

Qualitative process evaluation of a multifaceted technology-driven hypertension intervention in community health centers resulted in the following key findings and conclusions:

  • Multiple intervention components contributed to changes in hypertension care practices, supporting the multifaceted approach to health information technology implementation.

  • Framing the technology-driven intervention as quality improvement was a facilitator of success.

  • Other facilitators included leadership, organizational culture, provider engagement, rigorous implementation process, and health center capacity to process electronic health record data.

  • Qualitative process evaluation was found to be a feasible and useful method for investigating reasons for the success of a multifaceted technology-based intervention.

Hypertension affects one-third of the American adult population.1 Appropriate management of hypertension presents a significant opportunity to improve cardiovascular health outcomes through interventions in primary care settings. The costs of healthcare, medication, and missed days of work from hypertension-related symptoms and complications were $76.6 billion in 2010.1 Improved control of blood pressure among those diagnosed with hypertension would pay off in lowered incidence of stroke and heart disease.

Health information technologies (HITs) are a promising tool for improving quality of care in primary care settings, including underserved settings such as community health centers (CHCs).2 Much of HIT research tests the efficacy of isolated technology components, such as a decision support tool or computerized order entry. Less is known about the effectiveness of HIT-based interventions in community-based settings that combine multiple intervention components in a comprehensive effort to improve quality-of-care outcomes.3 Evidence of effectiveness of HIT to improve hypertension outcomes is mixed.4-6

Effective interventions to improve community health outcomes call for complex multi-level and multi-component approaches.7 However, it is challenging to design studies that can attribute quantitative health outcomes to particular intervention components.8 Instead, the outcomes indicate success of the entire “package.” Process evaluation is an avenue to explore questions about the effective “ingredients” of such interventions. One important purpose of process evaluation is to explain the results of an outcome evaluation,9 also called “interpretive evaluation.”10 Process evaluation using qualitative methods may be particularly well-suited for understanding why a complex intervention led to its outcomes.

In the domain of information technology (IT) implementation, several models posit factors that explain success of IT. The Technology Acceptance Model (TAM)11-13 builds on general social-behavioral theories and proposes constructs such as perceived usefulness and perceived ease of use as predictors of end-user acceptance and use of IT. The DeLone & McLean Information System Success Model14 predicts organizational and individual impact of IT systems from system use, user satisfaction, system quality, and information quality. A third model proposed by Øvretveit et al15 identifies factors that help or hinder HIT implementation, including characteristics of the HIT system, the implementation process, leadership, resources, and organizational culture and climate. These models are helpful in categorizing and labeling factors of importance in HIT implementation research.

This qualitative study examined a multi-component quality improvement (QI) intervention in a CHC with several practice sites in New York that included clinical decision support within an electronic health record (EHR) and provider feedback. The intervention had a positive impact on provider adherence to hypertension guidelines16 and hypertension control among patients.17 The goal of the qualitative analysis was to explain how different facets of the multi-component intervention contributed to the positive impact of improved hypertension control.

METHODS

Setting and Patient Population

The study setting was Open Door Family Medical Centers, a federally qualified CHC with 4 primary care sites located in suburban communities in New York. The CHC provides primary care to approximately 40,000 patients annually. It is a safety-net provider with a patient population approximately 74% Hispanic, 15% non-Hispanic white, and 9% non-Hispanic black; 35% of the patients have Medicaid, 4% Medicare, 4% private insurance, and 57% no insurance. In May 2007, the CHC started using an EHR system (eClinicalWorks). The leadership of the CHC has a strong interest in systematic QI activities and was eager to use the new EHR as a QI tool. At baseline of the study, 14% of all CHC patients were diagnosed with hypertension and an additional 5% were found to have undiagnosed hypertension. About half of hypertensive patients had controlled blood pressure at their last visit.18

Intervention

The CHC leadership had identified hypertension as a QI target because it was the most prevalent adult chronic condition, a significant contributor to leading causes of mortality, and appeared feasible to improve through appropriate management. The CHC had recently completed a successful diabetes QI program and wanted to turn its attention to hypertension. The intervention was designed by a collaborative team that included the administrative and medical staff from the CHC and public health researchers. Components of the intervention were informed by pre-intervention interviews with all providers practicing at the 4 study sites. Several key components of the intervention were implemented in the CHC’s EHR system, including alerts for high blood pressure readings as well as templates, order sets, and clinical reminder algorithms for hypertension management. The templates and order sets offered standard sets of items to be included in hypertension visits, while the clinical reminder algorithm prompted specific actions based on patient characteristics. Providers also received training sessions and quarterly report cards (performance feedback) indicating levels of hypertension control among their panels of patients. The report cards were discussed in group meetings and followed up with one-on-one meetings with the CHC Medical Director for providers who performed below average. Additional details of all components of the intervention are described elsewhere.17
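The reminder algorithm is described here only at a high level. Purely as an illustration of how a rule of this kind might prompt actions based on patient characteristics, the sketch below uses assumed field names, the JNC 7 control threshold of 140/90 mm Hg, and assumed follow-up intervals; it is not the CHC’s actual eClinicalWorks logic.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List, Optional

# Illustrative sketch only: field names, thresholds beyond the JNC 7 cutoff,
# and follow-up intervals are assumptions, not the CHC's actual reminder rules.

@dataclass
class PatientSnapshot:
    systolic: int                  # most recent systolic BP (mm Hg)
    diastolic: int                 # most recent diastolic BP (mm Hg)
    has_hypertension_dx: bool      # hypertension on the problem list
    last_ekg: Optional[date]       # date of most recent EKG on record
    next_followup: Optional[date]  # next scheduled follow-up visit

def hypertension_reminders(p: PatientSnapshot, today: date) -> List[str]:
    """Return reminder prompts of the kind a decision support rule might show."""
    reminders = []
    elevated = p.systolic >= 140 or p.diastolic >= 90  # JNC 7 control cutoff
    if elevated and not p.has_hypertension_dx:
        reminders.append("Elevated BP without a hypertension diagnosis: re-check and consider updating the problem list.")
    if p.has_hypertension_dx and p.last_ekg is None:
        reminders.append("No EKG on record for hypertensive patient: consider ordering an EKG.")
    if elevated and (p.next_followup is None or p.next_followup > today + timedelta(days=90)):
        reminders.append("BP not at goal: schedule follow-up within 3 months.")
    return reminders
```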

The outcome evaluation design compared repeated measures of hypertension control before and after the intervention. The proportion of visits in which hypertension was controlled was 51% at baseline and increased significantly to 61% in the post-intervention period.17 The process evaluation included quantitative surveys of providers and qualitative interviews of providers and key informants. The quantitative surveys were primarily used to measure provider attitudes pre-intervention and to inform intervention design. This paper reports results based on the qualitative interviews.
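The underlying visit counts are not reported here, and the outcome analysis used repeated measures; purely as an illustration of comparing two proportions such as the 51% and 61% control rates, the sketch below uses hypothetical visit counts and a simple pooled two-proportion z-test that ignores clustering of visits within patients.

```python
from math import sqrt

# Minimal sketch of a two-proportion z-test. The visit counts are hypothetical;
# the source reports only the proportions (51% baseline, 61% post-intervention),
# and the published analysis accounted for repeated measures, which this does not.

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """Return the z statistic for H0: p1 == p2, using a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical counts chosen only to reproduce the reported 51% and 61% rates.
z = two_proportion_z(x1=510, n1=1000, x2=610, n2=1000)
print(f"z = {z:.2f}")  # larger |z| corresponds to a smaller p-value
```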

Provider and Key Informant Interviews

All providers practicing at the 4 health centers (n = 16) were interviewed about 6 months prior to and at 3 to 4 months following the launch of the intervention. The interview questions focused on perceptions and experiences regarding hypertension care, hypertension guidelines, EHRs, organizational environment, intervention components, and intervention implementation. Key informants (n = 6) representing leadership and staff actively involved in implementation of the intervention were interviewed about the implementation experience about 5 to 6 months after the intervention began. The semi-structured interviews were conducted by the evaluators and lasted about 30 to 45 minutes. They were audio-recorded and transcribed for analysis. Human subject participation was approved by the Institutional Review Boards at New York University and Columbia University Medical Center. Informed consent was obtained from all interview participants.

Data Analysis

A team of 4 researchers conducted an iterative process of identifying a set of thematic codes and then applying them systematically to all interview transcripts. The team first reviewed key concepts in the 3 theoretical models (TAM, DeLone & McLean’s model, and Øvretveit’s model). They then read all interview transcripts, identifying salient themes and notable quotations, and collaboratively drafted a hierarchically organized list of thematic codes. The final coding manual included a total of 78 themes and sub-themes. Using Atlas.ti qualitative analysis software, the researchers applied the codes systematically to all transcripts. To verify agreement, 10% of transcripts were coded by 2 researchers. For this paper, the codes were further reorganized under key process evaluation questions, as described below and shown in the Table.
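The paper states that 10% of transcripts were dual-coded to verify agreement but does not name an agreement statistic. As one common illustration, Cohen’s kappa for a single thematic code could be computed as in the following sketch; the segment-level judgments shown are hypothetical.

```python
from typing import Sequence

# Illustrative only: the source does not specify which agreement statistic was
# used for the dual-coded transcripts. Cohen's kappa for one binary thematic
# code (1 = theme present in a segment) is shown as one common choice.

def cohens_kappa(coder_a: Sequence[int], coder_b: Sequence[int]) -> float:
    """Kappa for two coders assigning a single binary code to the same segments."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    p_a1 = sum(coder_a) / n
    p_b1 = sum(coder_b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

# Hypothetical segment-level judgments for one of the 78 codes.
a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```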

RESULTS

The findings were organized into 3 main domains that emerged as we examined the data in light of process evaluation research questions and implementation theories:

1. Satisfaction and perceived usefulness of intervention components

2. Perceived proximal changes resulting from the intervention

3. Perceived facilitators of change

Each of the 3 domains contributes answers to why the intervention had a successful outcome. The Table shows how the 3 domains relate to key functions of process evaluation and to major constructs of the 3 theoretical models used to guide the interpretation of the data.

1. Satisfaction and Perceived Usefulness of Intervention Components

Assessing satisfaction among participants (or users) is a staple of process evaluation. The TAM construct of “perceived usefulness,” which is widely used in IT research,11 broadens an assessment of satisfaction to include reasons why the person likes an IT tool, and in particular how it facilitates work activities. When asked why they liked the different intervention components, the providers phrased their answers in terms of perceived usefulness. The interviews prompted respondents about their satisfaction with each of the 5 major intervention components: alert, order set, template, clinical reminders, and provider report card. In addition, respondents reported on their satisfaction with the training associated with the intervention and the development of hypertension guidelines tailored for the CHC.

Alert. Having elevated blood pressure readings listed in red was generally well accepted and considered highly effective in bringing attention to blood pressure.

“It does help. Sometimes you may just glance over, but when it’s red it does stand out much more. Especially when the blood pressure was just, say, 130 over 90. A lot of times we would have just ignored it, but having that red does help.”

“The alerts are superb.”

Order Set. The hypertension order set was not universally endorsed, but 9 of the 16 providers discussed ways they found it to be useful. It was reported as helpful for remembering recommended orders and validating the treatment plan:

“I sometimes try to remember to use that order set because I do find that that helps me track when labs were done. That to me is the most helpful.”

“I like to be validated in what I do, and since this is not my typical patient that I see, I like to see that. The little hint for the labs, the immunizations, and the appointments are all the pros of it.”

Template. Most providers elected not to use the hypertension template, but 3 of the 16 providers reported they had become regular users. Whether a provider used the template, the order set, or neither varied with the provider’s documentation style and level of expertise.

Clinical Reminder Algorithm. One-half of the providers discussed ways they found the reminders triggered by the decision support algorithm helpful. They were found effective in catching what a provider might have missed.

“It gives you the real little tips, the hints, and what you missed, which is great.”

Performance Feedback. Satisfaction with the provider report cards was mixed. One-fourth of the providers raised questions about the validity and reliability of the methodology, particularly in terms of which patients got counted for each provider. Others acknowledged the benefits of the performance feedback, and at least 1 provider felt that this was the most effective intervention component. The report cards were seen as promoting vigilance and aggressive action with hypertension patients.

“If I see I’m not performing as well as my colleagues they must be doing something more than I am, so I need to be more aggressive.”

“I don’t think that those report cards are that fair or accurate.”

Other Components: Training and Definition of Hypertension Guidelines. The training sessions were seen as helpful not only in conveying information but also in rallying everyone around a common goal to address hypertension control and in motivating buy-in across the organization.

“The training session that we had was a big thing. We all came together as a team. It was a whole-team approach.”

Providers also felt that it was important that “the standards are put on paper for us,” referring to the process of defining hypertension guidelines that were tailored to their health center. Together with the trainings, the guidelines facilitated getting everyone in the organization “on the same page.”

While it was clear that different providers saw value in different intervention components, some components received more consistent praise than others and were sometimes credited with a major role in the project’s overall success. The blood pressure alerts emerged as a particularly popular part of the intervention and were seen by several providers as the most powerful component:

“I think that’s the number-one intervention we actually have. I think without it our project would probably not have shown results.”

2. Perceived Proximal Changes Resulting From the Intervention

This domain of the results corresponds with questions evaluators ask about short-term or intermediate outcomes. Logic models19 are often used to depict how change in these outcomes is expected to lead to the intervention’s ultimate outcomes, such as improved hypertension control among patients. The logic model in the Figure organizes our process evaluation findings to depict the causal progression toward the patient outcomes. We turned to the constructs of perceived individual impact and perceived organizational impact from the DeLone & McLean model of IT implementation14 to conceptualize proximal change in provider and organizational behavior. The pertinent interview segments were those where providers described what they were doing differently in hypertension care as a result of the intervention.

More Aggressive Care. The interview responses consistently indicated that providers believed the intervention led them to be more aggressive in their management of hypertension. This was sometimes described in general terms:

“I am a lot more aggressive in hypertension care. We’re all in that mode. That’s my vibe, that we’re really trying to get this in control. I guess I’m like the hypertensive police now, in that mode.”

Other providers discussed specific ways in which they had become more aggressive, including, for example, bringing patients back after 1 elevated blood pressure reading, prescribing more medications, giving more referrals to the nutritionist, and giving out patient-education handouts:

“In some ways, I’m a little more aggressive with the multiple medications.”

“I’ve been on this kick now that I have to give patients printouts. And it’s available on that order set in English and in Spanish.”

More Systematic, Consistent Care. A pattern closely related to more aggressive patient management was the provision of care that was more consistent with guideline recommendations. For example, providers reported ordering electrocardiograms (EKGs) more regularly for hypertensive patients and being more likely to schedule follow-up visits at recommended intervals.

“We’re doing more EKGs. I don’t think we were routinely doing that for every hypertensive patient. Now we are trying to have one on record for every single hypertensive patient, so that is something that’s new.”

“I think the follow-ups—that’s really impacted. Because it’s all spelled out when they should come back. I think before I wasn’t really thinking of, oh, in 3 months rather than 6 months or 4 months.”

More Overall Attention to Hypertension. Several respondents spontaneously commented that the success of the intervention was due to a combination of many factors coming together.

“I think it’s a combination of everything. Medical assistants are reminding us, oh, he’s a hypertensive and he hasn’t done an EKG yet. The patient advocates. We’re more aware of abnormal blood pressures. It’s a combination of everything. I don’t think it’s just 1 or 2 things.”

Some also noted that the intervention generally increased the level of attention that the providers and the health centers as a whole gave to hypertension.

“What we’ve been concluding is that there are many different pieces to this sort of package that we’re implementing here and that it’s just overall these things together plus just paying more attention to hypertension. That seems to be making a difference here.”

3. Perceived Facilitators of Change

Information technology implementation models, such as the TAM11 and Øvretveit’s model,15 posit factors that serve as facilitators of successful implementation. Drawing on these models, we identified 3 facilitators important for our implementation: leadership, organizational culture, and provider engagement. In addition, we discovered 3 facilitator themes that emerged from the interview data: rigorous implementation process, framing of intervention as QI, and health center capacity to process data.

Leadership. Full buy-in and participation by top leadership of the health centers was seen as critical for implementation success. This ensured that the project remained a priority and that frequent communication was provided to staff:

“It does seem like the leadership has the trust of the staff. And so the approach works because there’s a lot of communication between the providers and staff, and the management.”

Organizational Culture. The health centers had a culture that, in several ways, was seen as a facilitator of the implementation. Several participants noted that their organization was “amenable to change” and particularly hospitable to QI:

“I think it’s helpful in our organization to have clinicians who generally like to work and organize themselves around quality when possible.”

Provider Engagement. Provider input was actively sought during intervention design and implementation. The implementation team, including the health center leadership, recognized that provider buy-in was essential for success.

“If they don’t own it in their hearts, it’s just not going to happen. Or it’s going to be a very painful process.”

“If it’s something that doesn’t help their work, they’re not going to do it.”

The leadership struck a balance between mandating use of the intervention components and leaving providers independent in choosing which components to use.

“I do not think it would be okay to just leave it there and see if anybody tried to use it. I think it should be mandated that they at least try.”

“It’s discussed fairly regularly at meetings and we’re encouraged to use them.”

Rigorous Implementation Process. The design and implementation process was perceived as more rigorous compared with other QI initiatives at the CHC. Participants described the process as “methodical,” “systematic,” “comprehensive,” and “persistent.” They reported that more time and resources were allocated than in typical QI activities.

Framing of Intervention as QI. The leadership and providers at the health centers regarded the project as “a special QI project” implemented by the CHC in consultation with the academic partners. It was described as closer to a rigorous internal quality initiative than to a controlled research study in which academic partners use the health center as a setting to test the efficacy of a specific HIT intervention.

Health Center Capacity to Process Data. In the context of HIT research, the feasibility of obtaining data from the “back-end” of the IT system is not a simple proposition. What emerged as a critically important facilitator of success was the health center’s interest in developing internal capacity to process the wealth of data that gets captured in the EHR. About 1 year into the EHR implementation, the CHC became determined to get useful data back from the EHR and sought out the help of an IT consultant/programmer. With the assistance of the consultant, it was possible to produce clean and reliable “provider report cards,” as well as data the evaluation researchers used to measure patient outcomes of the intervention.

“Over time it’s improved. I think in the beginning we just couldn’t get anything out of it. We didn’t know how to get information out, but as we learned how to structure the data that we’re putting in, we can get that out in a better way, and we’re continuing to improve our information gathering out of it. I think that it’s been a slow but steady process of being able to get information out, and I think we’re pretty happy with the way we’re able to manipulate our reports to meet our needs.”

In addition to the programming expertise of the consultant, the CHC received guidance from the academic evaluation partners of the hypertension intervention project who reinforced the importance of structured data elements and strategies for critical assessment of data quality.
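The structure of the CHC’s EHR back-end data is not described in this paper. As a minimal sketch of the kind of report-card calculation discussed above (the share of each provider’s hypertensive patients with controlled blood pressure at the last visit), the following assumes a hypothetical CSV export and column names rather than the actual eClinicalWorks schema.

```python
import csv
from collections import defaultdict
from typing import Dict

# Minimal sketch of a "provider report card" computation: the share of each
# provider's hypertensive patients whose most recent visit showed controlled BP.
# The CSV layout and column names are hypothetical, not the CHC's actual schema.
# Visit dates are assumed to be ISO strings (YYYY-MM-DD) so string comparison
# orders them correctly.

def report_card(visits_csv: str) -> Dict[str, float]:
    last_visit = {}  # (provider_id, patient_id) -> (visit_date, controlled)
    with open(visits_csv, newline="") as f:
        for row in csv.DictReader(f):
            if row["hypertension_dx"] != "1":
                continue  # restrict to visits by diagnosed hypertensive patients
            key = (row["provider_id"], row["patient_id"])
            controlled = int(row["systolic"]) < 140 and int(row["diastolic"]) < 90
            visit_date = row["visit_date"]
            if key not in last_visit or visit_date > last_visit[key][0]:
                last_visit[key] = (visit_date, controlled)

    totals = defaultdict(lambda: [0, 0])  # provider_id -> [controlled, total]
    for (provider, _), (_, controlled) in last_visit.items():
        totals[provider][1] += 1
        totals[provider][0] += int(controlled)
    return {provider: c / t for provider, (c, t) in totals.items()}

# Example usage: print(report_card("hypertension_visits.csv"))
```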

Because the intervention had a positive impact on the desired patient outcome, we have emphasized above the factors explaining its success. However, as with any implementation in a real-world setting, many challenges were also reported. Unreliability of the hardware platform was perhaps the most acute challenge. Participants reported considerable frustration with the slow, crash-prone system, to the point where it sometimes discouraged them from using the additional decision support components that consumed more time and computer processing power. The problem was ameliorated during the study period by the purchase of a new server. Another notable challenge was the limited ability to design customized intervention components when the software vendor was not available to provide programming resources. Finally, several providers felt that additional one-on-one, hands-on training in navigating the EHR-based intervention components was needed.

DISCUSSION

A systematic qualitative process evaluation was able to illuminate the reasons why a multifaceted HIT-based intervention improved hypertension control in CHCs. Incorporating multiple components into the intervention was important for success. Among the EHR tools, there was something that worked for everyone. Our evidence suggests it was the combination and synergy of the intervention components that led to the positive outcomes. Similarly, impact on care processes was perceived to manifest in many different ways. The changes participants described in care practices were consistent with quantitative process measure outcomes.17 The impact of the QI project was characterized as an increase in “overall attention to hypertension” and being generally “more aggressive.” Again, participants believed that it was the combination of several changes that contributed to the overall outcome. We interpret these findings to mean that it is precisely this multifaceted approach that explains the success in improving the outcome of hypertension control.

HIT tools were embedded in a QI framework that allowed for a comprehensive organizational approach to improving hypertension care. HIT tools often fail to improve health outcomes,2 but that may be partly because they are implemented as circumscribed interventions that do not become part of the context of a QI effort. Several other factors were identified as facilitators of success. Consistent with previous findings in the field of HIT implementation,20,21 they included organizational culture, leadership, a rigorous implementation process, and provider engagement. Strength in these areas enhances the success of HIT interventions.

Our experience suggests that the healthcare setting’s ability to process patient data is an additional important facilitator of implementation success for EHR-based QI interventions. CHCs should be offered access to resources that help them use EHR data. Such resources include guidance in the use of structured data elements, strategies for critical assessment of data quality, and programmers who can extract data files from EHR systems. Many CHCs are actively seeking such resources as they strive for compliance with the recently defined Meaningful Use criteria for obtaining federal incentives for EHR adoption.22
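As a minimal illustration of the kind of data-quality assessment suggested here, the sketch below counts missing and implausible blood pressure entries in a hypothetical EHR export; the file layout, column names, and plausibility ranges are assumptions rather than any specific CHC’s checks.

```python
import csv
from typing import Dict

# Illustrative sketch of a simple EHR data-quality audit: completeness and
# plausibility of structured blood pressure fields. Column names, file layout,
# and plausibility ranges are hypothetical; values are assumed to be numeric
# strings when present.

PLAUSIBLE_SYSTOLIC = range(60, 261)   # mm Hg
PLAUSIBLE_DIASTOLIC = range(30, 151)  # mm Hg

def audit_bp_records(path: str) -> Dict[str, int]:
    """Count rows with missing or implausible blood pressure values."""
    counts = {"rows": 0, "missing": 0, "implausible": 0}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts["rows"] += 1
            sbp, dbp = row.get("systolic", ""), row.get("diastolic", "")
            if not sbp or not dbp:
                counts["missing"] += 1
            elif int(sbp) not in PLAUSIBLE_SYSTOLIC or int(dbp) not in PLAUSIBLE_DIASTOLIC:
                counts["implausible"] += 1
    return counts

# Example usage: print(audit_bp_records("bp_export.csv"))
```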

The process analysis encompassed: satisfaction and perceived usefulness of intervention components, perceived proximal changes resulting from the intervention, and perceived facilitators of change. This kind of analysis can be particularly helpful when attempting to understand how different factors contribute to positive outcomes of multicomponent interventions. An intervention logic model depicting causal linkages between the process factors and the outcomes helped organize the qualitative findings. The coding was guided by constructs from 3 different IT implementation theories that each helped conceptualize a different segment in the intervention logic. We analyzed perceived usefulness11 of intervention components, proximal changes in terms of individual and organizational impact,14 and facilitating conditions.11,15 Our findings corroborate key constructs in these models, including the role of leadership, organizational culture, and provider engagement as specific facilitating conditions. We also discovered facilitators that have received less attention in the field of HIT implementation, namely, framing of the project as QI and agency capacity to process EHR data.

While our qualitative study provides insight into processes common across healthcare settings, it has limited generalizability. In particular, the intervention was implemented in a setting that stands out for its innovativeness and positive QI culture among similar CHCs. We argue that qualitative interviews of participants can be a highly useful and valid methodology to understand why a multi-component intervention worked, but we also acknowledge that they have limitations stemming from the subjectivity with which participants interpret the causes of success. Another methodological limitation is the lack of quantitative data on use of the EHR-based intervention components. Future studies should pursue access to such data from the back end of the EHR system.

We used qualitative process evaluation to understand why our intervention worked. We cannot attribute the success of the intervention to any particular component or set of components. Rather, our experience suggests that HIT can be successful if implemented using an approach that offers multiple components, promotes overall attention to the problem, and is framed as QI in a way that is meaningful to community healthcare providers.

Acknowledgment

The authors would like to acknowledge the contributions of their research assistant Rachel Ferat, MA, and the providers and staff at Open Door Family Medical Center.

Author Affiliations: From Department of Clinical Sociomedical Sciences (MM), Mailman School of Public Health, Columbia University, New York, NY; Division of General Internal Medicine (DS, T-YT), New York University School of Medicine, New York, NY; Open Door Family Medical Center (DW, PF), New York, NY; Primary Care Development Corporation (HK), New York, NY.

Funding Source: Agency for Healthcare Research and Quality (1R18HS017167-01).

Author Disclosures: The authors (MM, DS, DW, PF, T-YT, HK) report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (MM, DS, DW, PF, T-YT, HK); acquisition of data (MM, DS, PF, HK); analysis and interpretation of data (MM, DW, PF, T-YT, HK); drafting of the manuscript (MM, DS); critical revision of the manuscript for important intellectual content (MM, DS); provision of study materials or patients (DW, PF); obtaining funding (DS, HK); administrative, technical, or logistic support (DW, PF, T-YT, HK); and supervision (DW, HK).

Address correspondence to: Mari Millery, PhD, Assistant Professor of Clinical Sociomedical Sciences, Mailman School of Public Health, Columbia University, 722 W 168th St, Rm 543, New York, NY 10032. E-mail: mm994@columbia.edu.

1. Lloyd-Jones D, Adams RJ, Brown TM, et al. Heart disease and stroke statistics—2010 update: a report from the American Heart Association. Circulation. 2010;121(7):e46-e215.

2. Millery M, Kukafka R. Health information technology and quality of health care: strategies for reducing disparities in underresourced settings. Med Care Res Rev. 2010;67(5 suppl):268S-298S.

3. Persell SD, Kaiser D, Dolan NC, et al. Changes in performance after implementation of a multifaceted electronic-health-record-based quality improvement system. Med Care. 2011;49(2):117-125.

4. Hicks LS, Sequist TD, Ayanian JZ, et al. Impact of computerized decision support on blood pressure management and control: a randomized controlled trial. J Gen Intern Med. 2008;23(4):429-441.

5. Bosworth HB, Olsen MK, Dudley T, et al. Patient education and provider decision support to control blood pressure in primary care: a cluster randomized trial. Am Heart J. 2009;157(3):450-456.

6. Choma NN, Huang RL, Dittus RS, Burnham KE, Roumie CL. Quality improvement initiatives improve hypertension care among veterans. Circ Cardiovasc Qual Outcomes. 2009;2(4):392-398.

7. Stokols D. Translating social ecological theory into guidelines for community health promotion. Am J Health Promot. 1996;10(4):282-298.

8. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.

9. Linnan L, Steckler A. Process evaluation for public health interventions and research. In: Steckler A, Linnan L, eds. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass; 2002.

10. Stetler CB, Legro MW, Wallace CM, et al. The role of formative evaluation in implementation research and the QUERI experience. J Gen Intern Med. 2006;21(suppl 2):S1-S8.

11. Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. MIS Quart. 2003;27(3):425-478.

12. Venkatesh V, Davis FD. A theoretical extension of the technology acceptance model: four longitudinal field studies. Management Sci. 2000;46(2):186-204.

13. Holden RJ. The technology acceptance model: its past and its future in health care. J Biomed Inform. 2010;43(1):159-172.

14. DeLone WH, McLean ER. The DeLone and McLean model of information systems success: a ten-year update. J Manag Inform Syst. 2003;19(4):9-30.

15. Øvretveit J, Scott T, Rundall TG, Shortell SM, Brommels M. Implementation of electronic medical records in hospitals: two case studies. Health Policy. 2007;84(2-3):181-190.

16. US Department of Health and Human Services; National Heart, Lung, and Blood Institute. The Seventh Report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure—Complete Report. http://www.nhlbi.nih.gov/guidelines/hypertension/jnc7full.htm. Published 2004. Accessed August 15, 2011.

17. Shelley D, Tseng T, Matthews A, et al. Technology-driven intervention to improve hypertension outcomes in community health centers. Am J Manag Care. 2011;17(12 Spec No.):SP103-SP110.

18. Shelley D, Tseng T, Andrews H, et al. Controlled hypertension among a low income diverse population with access to care in community health centers. Am J Hypertens. In press.

19. Frechtling JA. Logic Modeling Methods in Program Evaluation. San Francisco, CA: Jossey-Bass; 2007.

20. Lorenzi NM, Riley RT. Managing Technological Change: Organizational Aspects of Health Informatics. New York, NY: Springer; 2004.

21. Ludwick DA, Doucette J. Adopting electronic medical records in primary care: lessons learned from health information systems implementation experience in seven countries. Int J Med Inform. 2009;78(1):22-31.

22. Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records. N Engl J Med. 2010;363(6):501-504.
