
Variation in Implementation and Use of Computerized Clinical Reminders in an Integrated Healthcare System

Constance H. Fung, MD, MSHS; Juliet N. Woods, MS; Steven M. Asch, MD, MPH; Peter Glassman, MBBS, MSc; and Bradley N. Doebbeling, MD, MSc

Objectives: To identify patterns of use of computerized clinical reminders (CCRs) across an integrated healthcare system and describe institutional factors associated with their implementation.

Study Design: Cross-sectional study.

Methods: At a national electronic health record (EHR) meeting, we surveyed 261 participants from 104 Veterans Health Administration (VHA) healthcare facilities regarding the number and types of CCRs available at each facility. Potential explanatory measures included perceived utility and ease of use of CCRs, training and personnel support for computer use, EHR functionalities, and performance data feedback to providers at each facility.

Results: The number of conditions with CCRs in use at a facility ranged from 1 to 15; most facilities reported implementation of reminders for 10 of the 15 conditions surveyed. The most commonly implemented CCRs, used in more than 85% of facilities, were for conditions with VHA national performance measures (eg, tobacco cessation, immunizations, diabetes mellitus). The least commonly implemented CCRs were for post-deployment health evaluation and management, medically unexplained symptoms, and erectile dysfunction. Facilities that had implemented greater numbers of clinical reminders had providers who reported greater ease of use and utility of the reminders (P = .01).

Conclusions: VHA facilities vary markedly in their implementation of CCRs. This variation may be partly explained by the greater incorporation of clinical reminders for conditions with performance measures. Further study is needed to determine how best to implement clinical reminders and which institutional factors are important in their use.

(Am J Manag Care. 2004;10(part 2):878-885)

Computerized clinical reminders (CCRs) have been widely publicized as potential tools for changing behavior1 and improving quality of care.2-4 They have been particularly effective in improving adherence to preventive care and screening guidelines,2,5 monitoring diabetes,6 and treating hypertension.7 However, factors such as workload, time, and perceived reduction of the quality of the provider-patient interaction may be barriers to effective use of computerized reminders.8

The Veterans Health Administration (VHA), the United States' largest integrated healthcare delivery system, has invested heavily in the informatics infrastructure necessary to support CCRs. It has been at the forefront of developing these tools and incorporating them into the Computerized Patient Record System (CPRS), the VHA's electronic health record (EHR). CPRS, an application used throughout VHA facilities, enables clinicians to review and analyze patient data and supports clinical decision making.9

Computerized clinical reminders may be developed and distributed at the national, regional, or local level. Although VHA mandates use of a few clinical reminders such as assessing hepatitis C risk and possible sexual trauma during military service, most CCRs have been locally initiated.10 After reminders are created, they are tested and activated. Some reminders are applicable to all patients, while others apply to a particular group of patients. VHA facilities can generate clinical reminder reports, which provide information, for example, about the number of patients in a clinic with completed clinical reminders and the number of patients eligible for the reminder.
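To make the arithmetic behind such a report concrete, the sketch below reduces a clinical reminder report to its two counts and computes a completion rate; the counts and function name are illustrative, not drawn from the article.

```python
# Hypothetical sketch: a clinical reminder report provides two counts,
# from which a facility can compute a completion rate for the reminder.

def completion_rate(completed: int, eligible: int) -> float:
    """Fraction of eligible patients whose reminder has been completed."""
    return completed / eligible if eligible else 0.0

# Illustrative counts for a single clinic's tobacco-cessation reminder.
print(f"{completion_rate(412, 530):.1%}")  # 77.7%
```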

Although CCR technology is in widespread use in the VHA, data regarding patterns of CCR use and the institutional factors associated with that use have not been available. A description of the different types of reminders implemented in the VHA may help other large organizations interested in promoting this technology, and identifying the factors that influence widespread implementation of CCRs is a necessary step toward promoting their dissemination. Hence, our primary objective was to characterize variation in the numbers and types of CCRs in use across VHA facilities nationally. Our secondary objective was to identify institutional factors associated with increased implementation of CCR technology.

METHODS

Study Population and Data

We assessed CCR use across the VHA by distributing a survey instrument to participants at the national Camp CPRS meeting held in Georgia in May 2003. This meeting focused on the VHA's EHR and, in 2003, included 1304 representatives from 136 of the 142 VHA medical facilities participating in the VHA's External Peer Review Program (EPRP). The representatives, who are nominated by their facilities, may be clinical staff (eg, physicians, nurses), administrative personnel (eg, chief of staff), or informatics experts. Many attendees are opinion leaders at their facilities and have extensive experience with local CPRS capabilities, either as users or as developers of clinical applications for the EHR and its decision support tools.

We received responses from 261 participants (20%) representing 104 VHA facilities (76%). Nonrespondents were not tracked because only those who volunteered were given the survey instrument. Sixty-five of the 104 facilities that responded to our survey had more than 1 respondent.

The American Hospital Association database provided additional background information about the VHA facilities, including geographic distribution, academic affiliation, and facility size.

Questionnaire Development

After discussions with VHA staff who oversee, develop, or perform research on clinical reminders, including members of the National Clinical Practice Guideline Council and its ad hoc Clinical Reminders Committee, we chose to evaluate facilities' use of 15 different types of CCRs. We selected reminders representing a broad range of conditions clinicians might encounter in various clinic settings. Some reminders were for conditions with VHA national performance measures (eg, addressing tobacco cessation), whereas others (eg, low back pain) were for conditions without such measures.

The remainder of the 77 items in the survey instrument focused on institutional factors hypothesized from our ongoing studies to be important in the implementation of clinical guidelines11 or CCRs. The instrument assessed different forms of computer use (provider education, performance feedback, and clinical support such as the ability to retrieve radiological images), perceived utility and ease of use of CCRs, adequacy of computer and CCR training, organizational support, hospital culture/climate, and availability of feedback mechanisms for modifying CCRs. Responses for this portion of the survey instrument were either dichotomous (yes or no) or measured on a 5-point Likert-type scale. Responses of "don't know" were recoded as "missing" or "no." The questionnaire is available, upon request, from Dr. Doebbeling.
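As a minimal sketch of this recoding rule, assuming the survey items are loaded into a pandas DataFrame (the item names and the missing-versus-"no" assignments below are hypothetical):

```python
import pandas as pd

# Illustrative raw item responses; column names are hypothetical.
raw = pd.DataFrame({
    "uses_cprs_consults": ["yes", "no", "don't know"],  # dichotomous item
    "ccr_ease_of_use": [4, "don't know", 2],            # 5-point Likert-type item
})

# Dichotomous items: treat "don't know" as "no".
dichotomous = raw["uses_cprs_consults"].replace({"don't know": "no"})

# Likert-type items: treat "don't know" as missing so it drops out of means.
likert = pd.to_numeric(raw["ccr_ease_of_use"], errors="coerce")

print(dichotomous.tolist())  # ['yes', 'no', 'no']
print(likert.mean())         # 3.0, averaged over the 2 valid responses
```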

The University of Iowa/Iowa City VA institutional review boards approved the study protocol. The data management plan underwent subsequent review and approval from the institutional review board of the VA Greater Los Angeles Healthcare System.

Outcome Measures

Outcome measures were obtained from questionnaire responses and aggregated at the facility level. Although 39 facilities (37.5%) had only a single respondent, the other 65 facilities had multiple respondents: 29 facilities with 2 respondents, 23 with 3, 6 with 4, 1 with 5, 3 with 6, and 3 with 10 or more. When more than 1 response per facility was available, we used mean scores for questions with Likert-type response scales. For dichotomous variables, we assumed that respondents, while knowledgeable about their home facility, would be unlikely to report seeing or using a clinical reminder they had never seen or used. In contrast, a respondent might not have seen or used a clinical reminder that was available at the facility, even though a colleague there had. Because our goal was to determine whether a clinical reminder existed at a given facility, we weighted "yes" responses more heavily than "no" responses, reasoning that the union rather than the intersection of positive responses would more accurately reflect, for example, the actual number of conditions with clinical reminders in place at a particular facility. Therefore, we treated any positive answer among a facility's respondents as a true-positive facility response. For example, if there were 3 responses and 2 stated that CPRS was used for requesting consults and 1 did not (or vice versa), the facility was scored as a "yes." In a sensitivity analysis, we instead weighted negative responses more heavily than positive responses.
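A minimal sketch of this aggregation rule, assuming respondent-level data with a facility identifier (all column names are hypothetical): the union rule for dichotomous items reduces to a per-facility maximum, and Likert-type items are averaged.

```python
import pandas as pd

# Hypothetical respondent-level data; several respondents per facility.
df = pd.DataFrame({
    "facility": ["A", "A", "A", "B"],
    "has_diabetes_ccr": [1, 0, 1, 0],  # dichotomous: 1 = "yes", 0 = "no"
    "ccr_ease_of_use": [4, 3, 5, 2],   # 5-point Likert-type item
})

facility = df.groupby("facility").agg(
    # Union rule: any "yes" at a facility yields a facility-level "yes".
    has_diabetes_ccr=("has_diabetes_ccr", "max"),
    # Likert-type items: mean across the facility's respondents.
    ccr_ease_of_use=("ccr_ease_of_use", "mean"),
)
print(facility)
# Facility A is scored "yes" (2 of its 3 respondents said yes); B is "no".
# The sensitivity analysis would swap "max" for "min", so any "no" wins.
```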

The first outcome measure identified which CCRs were available at each facility. The second outcome measure, a "facility clinical reminder score" (minimum possible score = 0, maximum possible score = 15), was created by summing "yes" responses to questions asking whether a facility had at least 1 clinical reminder for a particular health condition. This variable represented the level of implementation of CCR technology at VHA facilities. Although some conditions, such as diabetes, may have more than 1 clinical reminder (eg, glycosylated hemoglobin, microalbumin/creatinine ratio), we counted each condition only once in the facility clinical reminder score. We used t tests to determine whether responses to each question differed significantly between facilities with complete and incomplete responses. Of the 104 facilities, 72 had complete data and 32 had incomplete data (missing data or "don't know" responses) for the questions used to construct the facility clinical reminder score. The mean facility clinical reminder score was 9.1 (95% confidence interval [CI]: 8.6, 9.7) for facilities with complete responses versus 7.4 (95% CI: 6.3, 8.5) for those with incomplete responses (a difference of 1.7; 95% CI: 0.7, 2.8). Table 1 compares VHA facilities that participated in Camp CPRS and responded to the survey with those that did not. Facilities with complete and incomplete responses were similar with respect to number of acute-care beds, trainee and resident full-time equivalents, academic affiliation, and urban versus rural location (P > .05 for all comparisons).
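To illustrate, the sketch below constructs a facility clinical reminder score from per-condition indicators and runs the complete-versus-incomplete comparison; all values are simulated, not the article's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated facility-level indicators for the 15 surveyed conditions
# (1 = at least one reminder in place for that condition).
n_facilities, n_conditions = 104, 15
indicators = rng.integers(0, 2, size=(n_facilities, n_conditions))

# Facility clinical reminder score: each condition counts once (range 0-15).
scores = indicators.sum(axis=1)

# Simulated complete/incomplete split, roughly matching the 72/32 reported.
complete = rng.random(n_facilities) < 0.7
t_stat, p_value = stats.ttest_ind(scores[complete], scores[~complete])
print(f"mean (complete) = {scores[complete].mean():.1f}, "
      f"mean (incomplete) = {scores[~complete].mean():.1f}, p = {p_value:.2f}")
```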

Potential Explanatory Measures

To identify potential explanatory variables, we developed scales from questionnaire items. The process of developing scales also served as a method of data reduction. Existing literature and conceptual models, as well as factor analysis, guided development of the scales.12,13 We relied on a priori hypotheses developed from literature review, our conceptual framework, and clinical experience in the identification and labeling of a 5-factor solution. The resulting domains included:

  1. Computer training and personnel support, measuring perceived adequacy of training and personnel support (8 items; alpha = .84).
  2. EHR functionality, measuring the number of features available online to clinicians at the point of care and at the facility in general (12 items; alpha = .86).
  3. Clinical reminders utility and ease of use (6 items; alpha = .75).
  4. Graphical data feedback, measuring availability of graphical display of individual and clinic performance (2 items; alpha = .95).

For each scale, higher scores indicated greater perceived support, ease of use/utility, functionality, or availability (Appendix).
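The alpha values above are Cronbach's alpha, a standard internal-consistency statistic. As a sketch, it can be computed directly from an item-response matrix; the simulated data below are illustrative only and merely mirror the size of the 6-item utility and ease-of-use scale.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 5-point Likert-type responses for a 6-item scale: a shared
# latent trait induces the inter-item correlation that alpha measures.
rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))
items = np.clip(np.round(3 + trait + rng.normal(scale=0.8, size=(200, 6))), 1, 5)
print(f"alpha = {cronbach_alpha(items):.2f}")
```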

Statistical Analysis

 