Improving Laboratory Monitoring of Medications: An Economic Analysis Alongside a Clinical Trial

Although some interventions may enhance medication safety, an electronic medical record reminder to providers may not be an efficient use of resources.

Published Online: May 07, 2009
David H. Smith, RPh, PhD; Adrianne C. Feldstein, MD; Nancy A. Perrin, PhD; Xiuhai Yang, MS; Mary M. Rix, RN; Marsha A. Raebel, PharmD; David J. Magid, MD; Steven R. Simon, MD; and Stephen B. Soumerai, ScD

Objective: To test the efficiency and cost-effectiveness of interventions aimed at enhancing laboratory monitoring of medication.


Study Design: Cost-effectiveness analysis.


Methods: Patients of a not-for-profit, group-model HMO were randomized to 1 of 4 interventions: an electronic medical record reminder to the clinician, an automated voice message to patients, pharmacy-led outreach, or usual care. Patients were followed for 25 days to determine completion of all recommended baseline laboratory monitoring tests. We measured the rate of laboratory test completion and the cost-effectiveness of each intervention. Direct medical care costs to the HMO (repeated testing, extra visits, and intervention costs) were determined using trial data and a mix of other data sources.


Results: The average cost of patient contact was $5.45 in the pharmacy-led intervention, $7.00 in the electronic reminder intervention, and $4.64 in the automated voice message reminder intervention. The electronic medical record intervention was more costly and less effective than other methods. The automated voice message intervention had an incremental cost-effectiveness ratio (ICER) of $47 per additional completed case, and the pharmacy intervention had an ICER of $64 per additional completed case.


Conclusions: Using the available data to compare strategies for enhancing baseline monitoring, direct clinician messaging was not an efficient use of resources. Depending on a decision maker’s willingness to pay, automated voice messaging and pharmacy-led outreach can be efficient choices to prompt therapeutic baseline monitoring.


(Am J Manag Care. 2009;15(5):281-289)

Patients of a not-for-profit, group-model HMO were randomized to 1 of 4 interventions: an electronic medical record reminder to the clinician, an automated voice message to patients, pharmacy-led outreach, or usual care.

  • The electronic medical record intervention was more costly and less effective than other methods.
  • Depending on a decision maker’s willingness to pay, pharmacy-led outreach or automated voice messages to patients can be cost-effective methods to enhance laboratory monitoring of medications at initiation of therapy compared with usual care.
Recent studies indicate that laboratory monitoring of medications at initiation of therapy is below the level recommended by guidelines, with as many as 39% of patients not receiving recommended testing.1 Lack of monitoring is a concern because of potential adverse events (eg, hyperkalemia associated with angiotensin-converting enzyme inhibitors) and because of failure to achieve therapeutic benefit due to inadequate blood levels of medication. Additionally, failure to establish baseline levels makes it difficult to determine trends in a patient’s laboratory values.

These concerns regarding patient safety and clinical effectiveness have led researchers to test methods to enhance laboratory-based medication monitoring. Several types of interventions (including pharmacy-led efforts, electronic reminders to clinicians, and automated telephone call reminders to patients) to improve laboratory monitoring of medications at therapy initiation have been effective in randomized trials.1,2 Although all these interventions improve monitoring, the most efficient intervention methods are not clear, and no economic analyses have been done to inform policy makers in this area. Some efforts may be particularly resource intensive but could be worth the added expenditure when the potential adverse outcome is severe. Without careful analysis of the balance between costs and benefits, one cannot determine which (if any) interventions ought to be funded by healthcare payers. The efficiency of alternative approaches to therapeutic monitoring is of growing importance to healthcare providers because this monitoring is now a focus of quality measurement.3 To help with decision making regarding laboratory-monitoring interventions, we undertook a preplanned cost-effectiveness analysis of a randomized trial that tested several interventions aimed at enhancing laboratory monitoring of medication.2

METHODS

Trial Design

Complete details of the trial design are available elsewhere.2 The study was conducted at a not-for-profit, group-model HMO and was approved by its institutional review board. All HMO patients who had not received baseline laboratory tests (defined as within 6 months before or 5 days after a newly dispensed study medication) were randomized to 1 of 4 conditions: an electronic medical record reminder to the patient’s primary care provider (EMR arm), an automated voice message to patients (AVM), pharmacy team outreach (Pharmacy), or usual care (UC) (Figure 1). In the EMR intervention, a patient-specific electronic message was sent to the primary care clinician from the chair of the HMO’s patient safety committee stating that computer records indicated the patient had received a new medication, that laboratory monitoring was recommended, and that the patient had not received the test(s) between 6 months before and 5 days after the dispensing. The message referenced internal and external guideline resources, recommended specific tests, and provided a sample letter the clinician could send to the patient. The AVM intervention included telephone messages advising the patient that laboratory tests were required for a medication the patient had received; the patient was advised that the testing had been ordered and could be completed at any HMO laboratory. The Pharmacy intervention began with a telephone call from a nurse in the pharmacy department to the patient to encourage laboratory testing. If the nurse successfully contacted the patient, a follow-up letter reminded the patient to obtain the laboratory test(s). If telephone contact was not successful, the nurse sent a letter suggesting that the patient go in for testing. If patients had questions or concerns about their medication during the contacts, a pharmacist was available for consultation.

Study medications (and lab tests required) were angiotensin-converting enzyme inhibitors or angiotensin receptor blockers (serum creatinine, serum potassium), allopurinol (serum creatinine), carbamazepine (aspartate aminotransferase [AST] or alanine aminotransferase [ALT], complete blood count, serum sodium), diuretics (serum creatinine, serum potassium), metformin (serum creatinine), phenytoin (AST/ALT, complete blood count), pioglitazone (AST/ALT), potassium supplements (serum potassium, serum creatinine), statins (AST/ALT), and terbinafine (AST/ALT, serum creatinine). The primary outcome was laboratory test completion, defined as the proportion of patients with all recommended baseline laboratory monitoring tests completed at 25 days after the intervention date. In the year before randomization, the laboratory-monitoring rates at the initiation of therapy (those who had initiated a study medication and had completed all recommended baseline laboratory testing) were similar in the study groups (about 60%). Other characteristics of the study groups also were similar, but the AVM group had a smaller proportion of female primary care physicians (24% vs ~40%). A total of 961 patients were included in the clinical trial. By day 25 after the intervention, 22.4% (53 of 237 patients) in the UC arm, 48.5% (95 of 196 patients) in the EMR arm, 66.3% (177 of 267 patients) in the AVM arm, and 82.0% (214 of 261 patients) in the Pharmacy arm had completed recommended monitoring (P <.001). A total of 72 abnormal test results were found among the 961 patients (7.5%).

We followed best practice in economic evaluation as outlined by the US Public Health Service.4 Our economic analysis examined the incremental cost per additional case completed (defined as enrollees who had all guideline-specified laboratory tests completed) and the incremental cost per abnormal case detected. We calculated the incremental cost-effectiveness ratio (ICER) by dividing the difference in cost by the difference in cases completed (or abnormal cases detected); interventions with lower ICERs are a better value for the money. Interventions with a higher ICER also may be cost-effective, depending on a decision maker’s willingness to pay for each additional unit of effect. Interventions were ranked on cost, and dominated options (ie, more costly but less effective) were identified. To account for the uncertainty due to sampling variation in cost-effectiveness analysis, we plotted cost-effectiveness acceptability curves5; these curves show the probability of each intervention being cost-effective at a given willingness to pay for an additional completed case (or abnormal case detected). All analyses were conducted using Stata release 9.0 (StataCorp, College Station, TX) and Microsoft Excel 2003 (Microsoft, Redmond, WA).
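The ranking-and-dominance logic described above can be sketched in a few lines of Python. The completion rates below come from the trial results; the per-patient arm costs are illustrative placeholders (the trial's full arm costs include testing and visit costs), and the sketch checks only strong dominance, not extended dominance.

```python
# Sketch of the incremental cost-effectiveness ranking described above.
# Effectiveness values are the trial's completion rates; the per-patient
# costs are hypothetical placeholders, not the trial's total arm costs.

def icer_ranking(arms):
    """Sort arms by cost, flag strongly dominated options, compute ICERs."""
    arms = sorted(arms, key=lambda a: a["cost"])
    frontier = []  # non-dominated arms in increasing cost and effect
    for arm in arms:
        # An arm is dominated if a cheaper arm is at least as effective.
        if frontier and arm["effect"] <= frontier[-1]["effect"]:
            arm["dominated"] = True
            continue
        arm["dominated"] = False
        if frontier:
            prev = frontier[-1]
            arm["icer"] = (arm["cost"] - prev["cost"]) / (arm["effect"] - prev["effect"])
        else:
            arm["icer"] = None  # cheapest option has no comparator
        frontier.append(arm)
    return arms

arms = [
    {"name": "UC",       "cost": 0.00, "effect": 53 / 237},
    {"name": "EMR",      "cost": 7.00, "effect": 95 / 196},   # cost hypothetical
    {"name": "AVM",      "cost": 4.64, "effect": 177 / 267},  # cost hypothetical
    {"name": "Pharmacy", "cost": 5.45, "effect": 214 / 261},  # cost hypothetical
]
for arm in icer_ranking(arms):
    print(arm["name"], arm["dominated"], arm.get("icer"))
```

Even with these placeholder costs, the EMR arm is flagged as dominated, mirroring the trial's finding that it was more costly and less effective than the alternatives.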

Most of the cost data were collected directly from the trial, with some expert opinion based on formal data-gathering techniques as described below. The perspective of the analysis was the HMO. Thus, we included only direct medical care costs incurred by the HMO. The scope of the analysis included the costs within the 25 days after the intervention of (1) all recommended laboratory tests (including repeated testing), (2) extra visits associated with abnormal tests (validated from pharmacist chart review), and (3) performing the intervention. The analysis does not include potential offsets of poor outcomes averted (eg, lactic acidosis, liver toxicity) because those data were too sparse to answer those questions effectively. Also excluded from the analysis were development costs and patient costs, such as travel time and copayments. To maintain consistency with the efficacy analysis, the primary outcome was the cost per completed case within 25 days of dispensing, with a secondary analysis of the cost per additional enrollee with 1 or more abnormal laboratory tests within 25 days of dispensing.

Costs

Table 1 details the unit costs (ie, prices), activities, data sources, and resource assumptions used in the analysis. To improve generalizability to other systems, salary costs were taken from sources reflecting the prevailing wage rate in Portland, Oregon, with a fringe benefit rate of 30% and overhead rate of 20% added to fully allocate the costs. Laboratory testing costs come from the HMO’s laboratory accounting system and include patient intake, phlebotomy, testing, and reporting. Mailing costs were applied based on estimates for bulk mailing, and costs for clinic visits came from the HMO’s cost structure.
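The salary cost build-up described above can be made concrete with a small sketch. The text does not specify whether the 20% overhead rate applies to the base wage alone or to wage plus fringe; this sketch assumes the latter (sequential loading), which is a common but here unconfirmed convention.

```python
# Sketch of the fully allocated labor cost described above.
# Assumption: overhead is applied on top of wage + fringe (sequential
# loading); the article does not state which convention was used.

FRINGE_RATE = 0.30    # fringe benefit rate from the article
OVERHEAD_RATE = 0.20  # overhead rate from the article

def loaded_cost_per_minute(hourly_wage):
    """Hourly wage -> fully loaded cost per minute of staff time."""
    loaded_hourly = hourly_wage * (1 + FRINGE_RATE) * (1 + OVERHEAD_RATE)
    return loaded_hourly / 60.0
```

Under this convention, a $60/hour wage yields a fully loaded cost of $93.60/hour, or $1.56 per minute of staff time.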

Resources used in the performance of all the interventions included chart review to ensure patient eligibility, tracking systems for patient follow-up, and noting the intervention delivery in the patient’s medical record. Additional tasks were study arm specific. In the Pharmacy arm they included time for mailings and outreach phone calls and their documentation. The AVM intervention required time to upload files. The cost of maintaining the automated telephone system was embedded in the vendor charge to the HMO. The EMR intervention costs included nurse time to send the message and clinician follow-up activities. Because existing EMR functionality was used to provide the EMR messages, no incremental programming resources were necessary to provide the intervention. The time (in minutes) taken to complete these tasks was recorded for a sample of patients in the Pharmacy, AVM, and EMR arms. To establish patient-level resource use, each patient in the appropriate arm was assigned an imputed value randomly from the sample, preserving the sample’s underlying (observed) distribution. Analyst time to create and maintain the patient lists from automated data was taken from the trial.
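The imputation step above amounts to resampling from the empirical distribution of observed task times. A minimal sketch, with hypothetical timing data:

```python
# Sketch of the patient-level imputation described above: each patient in an
# arm is assigned a task time drawn at random from the times recorded for a
# timed sample of that arm's patients, so the imputed values preserve the
# sample's empirical distribution. The observed times below are hypothetical.

import random

def impute_task_times(observed_minutes, n_patients, seed=None):
    """Draw one task time per patient, with replacement, from the sample."""
    rng = random.Random(seed)
    return [rng.choice(observed_minutes) for _ in range(n_patients)]

observed = [3.0, 4.5, 5.0, 6.5, 8.0]  # hypothetical timed sample (minutes)
imputed = impute_task_times(observed, n_patients=261, seed=1)
```

Because draws are made with replacement from the observed values themselves, the imputed times can take only values that were actually recorded, which is what "preserving the sample's underlying (observed) distribution" implies.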
