The development and deployment of autopend functionality within an existing electronic health maintenance reminder system took more than 3 years and cost $201,500 (2013 US$).
Objectives: Sutter Health developed a novel autopend, or automated laboratory test ordering, clinical decision support (CDS) tool to coordinate the patient and physician process of completing preventive services. This study estimated the costs of developing and implementing the autopend functionality within an existing electronic health maintenance (HM) reminder system.
Study Design: Sutter Health is an integrated health care delivery network with more than 12,000 physicians across 100 communities serving 3 million patients. Activity-based costing methodology was used to divide the implementation into activities and the human resources required to complete them.
Methods: Human resource time was measured by triangulating in-depth key informant interviews with Microsoft Outlook Calendar metadata (meetings attended) for managers and hourly data from a time-based project management tool (Project Web App) for Epic programmers. Employee time spent was multiplied by Bureau of Labor Statistics California state hourly wages.
Results: Developing and implementing the autopend CDS took more than 3 years, involved 6 managers and 3 Epic programmers, and cost $201,500 (2013 US$) (2670 total hours), which excluded the costs of implementing the initial HM reminder system. Managers accounted for 90.5% of the total costs (86.6% of total hours) through work integrating autopend into the health system, compared with the 9.5% of total costs (13.4% of total hours) spent programming the functionality.
Conclusions: The autopend CDS might be similarly costly for other organizations to implement if their managers need to complete comparable activities. However, electronic health record vendors could include autopend as a standard package to reduce development costs and improve the uptake of this promising CDS tool.
Am J Manag Care. 2020;26(7):e232-e236. https://doi.org/10.37765/ajmc.2020.43766
Autopend is a novel, effective clinical decision support (CDS) tool that coordinates the patient and physician process of completing preventive health maintenance services.
Innovative clinical decision support (CDS) offers a promising approach to improve preventive medicine in ambulatory settings. Meaningful use requirements, changing payment models, and the drive for value are generating both pressure and opportunities for practices to adopt novel CDS.1,2 However, practices face limited budgets and human resources, raising questions about which technologies to invest in. One barrier to the adoption of novel CDS is lack of knowledge about the implementation requirements and costs.3 Implementation costs are defined as the “start-up” or 1-time costs associated with acquiring and implementing a system.4-6 From a practice’s perspective, CDS implementation is typically not billable to third-party payers and requires organizational investment with uncertain outcomes, and cost savings from successful implementation may accrue to payers or patients, not the practice.7,8 In the absence of significant cost savings to practices or new reimbursement streams, implementation costs may remain a significant barrier to the adoption of new CDS. However, greater knowledge of implementation costs can improve the prioritization of potential investment opportunities and their economic evaluations.6
Little is known about the costs to practices of implementing health information technology (HIT) in ambulatory settings.9 The literature mainly documents the costs associated with implementing electronic health records (EHRs)10-13 and computerized physician order entry14; the only studies related to the costs of implementing CDS are from countries other than the United States.15,16 The literature utilizes a variety of costing methodologies and measures, limiting the comparability and generalizability of cost estimates across technologies, settings, and implementation practices.
The deployment of autopend CDS in a large integrated health care delivery network offers an interesting case study to document practice-related implementation costs in the United States. Autopend CDS is a novel EpicCare EHR application that aimed to improve the completion of 6 guideline-based routine preventive tests—creatinine, glycohemoglobin, lipid screening, microalbumin, potassium, and thyroid hormone—through nudging patients and providers by simplifying workflow, removing barriers, and coordinating actions. Specifically, autopend (1) routed provider alerts to a separate electronic folder, (2) automatically populated preauthorization forms, and (3) linked the timing and content of electronic patient health maintenance (HM) topic reminders to the provider authorization. A previous study includes an in-depth description of the functionality and found that autopend CDS improved glycated hemoglobin test completion rates for patients with diabetes by up to 33.9%.17 In this study, we used a mixed-methods approach to retrospectively record the human resources required to develop and implement autopend CDS in an existing electronic HM reminder system.
The study was conducted at Sutter Health, an integrated health care delivery network with more than 12,000 physicians serving 3 million patients in 100 communities in Northern California. The autopend CDS was activated on November 13, 2012, and implemented into a fully integrated EpicCare EHR with existing electronic HM topic patient reminders.
Activity-Based Costing Methodology
Our study is grounded in activity-based costing (ABC), also known as micro-costing, methodology.18,19 ABC has been used to estimate HIT implementation costs because it can achieve better accuracy than standard accounting in costing organizational change and it provides greater visibility into the organizational process, providing stakeholders with transparency to inform their own implementation cost estimates.10 ABC first identifies activities associated with the implementation. In this case, the cost of each implementation activity is then calculated as the sum of all time spent by each employee multiplied by the employee’s hourly wage. However, methodological challenges associated with this approach include identifying appropriate activities and measuring employee time. Because there is no standard implementation cost instrument, the current practice is that each study develops its own.20 We followed a mixed-methods approach triangulating qualitative information from clinic key informants (eg, how clinics complete the implementation process, which personnel are involved, how employee time is accounted for) with quantitative data (eg, budgets, project planning documents). Our ABC includes only costs that are not billable to third-party payers and excludes employee benefits and building maintenance costs.10 The time horizon includes the early discovery stage in November 2010; the systemwide deployment on November 13, 2012; and post rollout through December 2013.
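The per-activity calculation described above—summing each employee's hours on an activity and multiplying by that employee's hourly wage—can be sketched in a few lines. The activity names, hours, and wages below are illustrative placeholders, not the study's data:

```python
# Activity-based costing (ABC) sketch: the cost of each implementation
# activity is the sum over employees of (hours on activity) x (hourly wage).
# All names and figures below are hypothetical, not the study's data.
from collections import defaultdict

# (activity, employee type, hours) records, as might be reconstructed from
# interviews, calendar metadata, or project-management logs
time_records = [
    ("general discovery", "manager", 40.0),
    ("provider training", "manager", 25.0),
    ("programming", "epic_programmer", 60.0),
]

# BLS-style average hourly wages by employee type (hypothetical values)
hourly_wage = {"manager": 70.0, "epic_programmer": 45.0}

def activity_costs(records, wages):
    """Sum hours x wage for each activity across all employees."""
    costs = defaultdict(float)
    for activity, employee_type, hours in records:
        costs[activity] += hours * wages[employee_type]
    return dict(costs)

costs = activity_costs(time_records, hourly_wage)
total = sum(costs.values())
```

Grouping records by activity first, rather than by employee, mirrors the ABC emphasis on making each implementation activity's cost visible to stakeholders.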
We used purposive sampling to identify key employees involved in implementation. We also used snowball sampling by asking respondents to name others. Our sample includes stakeholders (N = 9) from the HIT and business departments.
Qualitative and Quantitative Data
L.P. and C.D.S. interviewed stakeholders using a semistructured guide to obtain their explanations of activities, other participants, and recalled estimates of time spent on each task. Sample questions included “What were your responsibilities in the development of autopend?” with a follow-up probe of “How much time did it take to complete each responsibility?” (eAppendix [available at ajmc.com]). Interviews were 30 to 60 minutes and participants were not compensated. We also collected Microsoft Outlook Calendar metadata (eg, meeting dates, length, and titles)21 for some managers and the number of project-specific hours for Epic programmers from a web-based tool, Project Web App. Data were collected from October 2013 to March 2015. Bureau of Labor Statistics (BLS) 2013 California state averages were used to determine costs of employee time.22
Employee types. Employee job titles were linked to the BLS employee types and categorized as “managers” and “Epic programmers.”
Hours. Self-reported time spent pre- vs post deployment for 9 stakeholders and additionally self-reported time spent on specific activities for 2 stakeholders were recorded.
Costs. Cost of employee self-reported time was calculated as number of hours multiplied by the hourly wage for the specific employee type.
Interviews were audio-recorded and transcribed for detailed analysis. The researchers used a descriptive grounded theory approach23 to construct codes and categories from the data. A qualitative sociologist (C.D.S.) created initial codes for activities, which were later merged into categories based on common activity groupings determined by the study team. These codes were then applied and refined on subsequent transcripts. When similar comments and themes emerged from coding, we concluded that thematic saturation had been achieved.24 The interview findings were triangulated with downloaded Microsoft Outlook and Project Web App data.
Total Time and Costs
We divided the 9 respondents into 2 mutually exclusive groups: managers and Epic programmers (Table 1). The 6 managers ranged from a business analyst to a member of senior leadership and represented business, clinical, and technology subject areas. Five of the 6 managers were involved predeployment, with 4 involved post deployment. The 3 Epic programmers included 1 senior information analyst and 2 information system analysts. The senior information analyst participated solely during the predeployment period, whereas the 2 information system analysts were involved both before and after deployment.
The total cost of developing and implementing the autopend CDS over the 3-year period was $201,500 (2670 hours). The managerial cost was 90.5% ($182,408 / $201,500) of the total; the director and the project manager alone accounted for 76.1% ($153,219 / $201,500) of the total. Approximately 76.1% ($153,389 / $201,500) of the total costs was spent during the 2-year predeployment period. Within each employee type, 75.5% ($137,696 / $182,408) of the managers’ total cost was spent during the predeployment period, whereas 82.2% ($15,693 / $19,092) of the Epic programmers’ total cost was spent during this time.
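The cost shares reported above follow directly from the dollar figures; a short check makes the arithmetic explicit:

```python
# Reproducing the cost shares from the dollar figures reported in the text
total = 201_500            # total implementation cost, 2013 US$
managers = 182_408         # managerial cost
programmers = 19_092       # Epic programmer cost (total minus managerial)
pre_managers = 137_696     # managers' predeployment cost
pre_programmers = 15_693   # programmers' predeployment cost

# The two employee-type totals sum to the overall total
assert managers + programmers == total

manager_share = managers / total                            # ~0.905 (90.5%)
predeploy_share = (pre_managers + pre_programmers) / total  # ~0.761 (76.1%)
manager_pre_share = pre_managers / managers                 # ~0.755 (75.5%)
programmer_pre_share = pre_programmers / programmers        # ~0.822 (82.2%)
```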
Managers’ Activities
In the predeployment period, general discovery (Table 1) included determining opportunities “to leverage the technology [of the patient portal] and see what we could do using technology to just improve care in general.” Managers felt that autopend CDS could “assist the physicians in leveraging their time…[as] it’s hard for them with their busy practices to make sure that they’re keeping all their patient panel up-to-date with their screenings and other quality measures.”
During the development and deployment of the updated HM reminder messages and the autopend order functionality, managers participated in meetings that included planning for implementing autopend into clinics, coordinating the test ordering process with clinic laboratories, and training providers. Preparing the communication and training materials “took an intense amount of time” due to the complexity of the functionality from the user’s perspective. The managers created “frequently asked questions” for general use, a Microsoft PowerPoint presentation for clinic leaders, and an implementation toolkit; they also held a live WebEx training. Working with the clinics was important because it was “definitely not a 1-size-fits-all” approach, as there were “different nuances in every location.” The managers also coordinated with the team responsible for online patient portal messaging, prepared steering team and quality committee meeting report-outs, and held legal team meetings. The director and the project manager divided up their time and costs by the stage of the project over the 3-year period (Table 2). Their joint effort alone contributed $44,375 to the costs of developing the updated HM reminder message and $39,989 to the development of the autopend order functionality.
Post deployment, managers met about systemwide performance issues and check-ins with the clinics. Setting up the maintenance model required relatively little of the director’s and the project manager’s effort ($5017) (Table 2). One manager reported that overall, providers did have some “hesitation” about using autopend, but once they heard about “P4P [pay for performance] measures” and “patient satisfaction,” they found it to be a “triple win and really successful.”
Epic Programmers’ Activities
Predeployment, the senior information analyst spent 220 hours mostly programming autopend because it was quite an “intricate, nuanced functionality.” Once this work was completed, 2 other information system analysts made it “more scalable…to keep the implementation [in clinics] as simple as possible.” The programmers also maintained “every enhancement and expansion” to the functionality as it was implemented in each clinic, testing components before each change and fixing “breakages along the way,” so that autopend is now “operational” and on “autopilot.”
Post deployment, the 2 information system analysts continued to work on enhancements to the algorithm and coded the maintenance model. Overall, the programmers felt that it has been “a major success” because they “have not gotten much negative feedback from any users.”
To our knowledge, we provide the first measure of the human resource costs involved in implementing a novel CDS in the United States. We found that developing and implementing an autopend CDS into an existing HM reminder system took more than 3 years; involved at least 9 employees from HIT, clinical, and business departments; and cost $201,500 (2013 US$). We also found that 90.5% of the total costs (86.6% of the total hours) was spent by managers integrating autopend into the health system compared with 9.5% of the total costs (13.4% of the total hours) spent programming the functionality.
We encountered numerous challenges in applying ABC to retrospectively measure the implementation costs, generating several limitations to our study. First, self-reported hours may be subject to recall bias; the validity and reliability of Microsoft Outlook Calendar metadata and Project Web App hourly data have not been established; and Microsoft Outlook Calendar metadata do not capture whether an employee actually attended a meeting. Second, collecting data for ABC was extremely resource intensive, and we were unable to gather time estimates for legal department personnel and provider trainings. Thus, we provide a lower bound of the total implementation cost. Better methodology and data collection tools are needed to provide more reliable cost estimates of implementing HIT.
Limited conclusions can be drawn from comparing our results with the sparse previous literature on the costs of implementing CDS in ambulatory settings. O’Reilly et al report a single lump sum cost of CAD$486,699 (2010 CAD$) for developing and implementing a CDS linked to evidence-based treatment recommendations for type 2 diabetes in 47 primary care practices in Ontario.15 Poley et al found that developing and implementing an evidence-based CDS for ordering blood tests in 118 practices in the Netherlands cost €79,000 (€2007) and took 2545 personnel hours.16 These authors also provided the hours according to activities and reported that 75.5% (1921) of the total hours was spent integrating the CDS into the practices, whereas developing the software took approximately 24.5% (624) of the total hours.
The findings of this case study suggest that implementing autopend CDS might be similarly costly for other organizations if their managers need to complete comparable activities to integrate the technology into their health system. However, EHR vendors could include autopend as a standard package to reduce development costs and improve the uptake of this promising CDS tool.
The authors gratefully acknowledge Paul Gotz for his patient explanation of the autopend order functionality.

Author Affiliations: Hutchinson Institute for Cancer Outcomes Research, Fred Hutchinson Cancer Research Center (LP), Seattle, WA; Palo Alto Medical Foundation Research Institute, Sutter Health (CDS, ASC), Palo Alto, CA; Stanford Center for Biomedical Informatics Research, Stanford School of Medicine, Stanford University (ASC), Stanford, CA; University of California, San Diego, School of Medicine (MT-S), La Jolla, CA.
Source of Funding: Agency for Healthcare Research and Quality R03 HS022631.
Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.
Authorship Information: Concept and design (LP, CDS, MT-S); acquisition of data (LP, CDS); analysis and interpretation of data (LP, CDS, MT-S); drafting of the manuscript (LP, CDS); critical revision of the manuscript for important intellectual content (CDS, ASC, MT-S); obtaining funding (MT-S); administrative, technical, or logistic support (ASC); and supervision (ASC, MT-S).
Address Correspondence to: Laura Panattoni, PhD, Hutchinson Institute for Cancer Outcomes Research, Fred Hutchinson Cancer Research Center, 1100 Fairview Ave N, Seattle, WA 98109. Email: email@example.com.

REFERENCES
1. Clinical decision support: more than just ‘alerts’ tipsheet. CMS. 2014. Accessed August 14, 2019. https://www.cms.gov/regulations-and-guidance/legislation/EHRincentiveprograms/downloads/clinicaldecisionsupport_tipsheet-.pdf
2. Hillary W, Justin G, Bharat M, Jitendra M. Value based healthcare. Adv Manage. 2016;9(1):1-8.
3. Ross J, Stevenson F, Lau R, Murray E. Factors that influence the implementation of e-health: a systematic review of systematic reviews (an update). Implement Sci. 2016;11(1):146. doi:10.1186/s13012-016-0510-7
4. Arlotto P, Oakes J. Return on Investment: Maximizing the Value of Healthcare Information Technology. HIMSS; 2003.
5. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34(4):228-243. doi:10.1016/s1553-7250(08)34030-6
6. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implement Sci. 2008;3:26. doi:10.1186/1748-5908-3-26
7. Donahue KE, Newton WP, Lefebvre A, Plescia M. Natural history of practice transformation development and initial testing of an outcomes-based model. Ann Fam Med. 2013;11(3):212-219. doi:10.1370/afm.1497
8. Gill JM, Bagley B. Practice transformation? opportunities and costs for primary care practices. Ann Fam Med. 2013;11(3):202-205. doi:10.1370/afm.1534
9. Bassi J, Lau F. Measuring value for money: a scoping review on economic evaluation of health information systems. J Am Med Inform Assoc. 2013;20(4):792-801. doi:10.1136/amiajnl-2012-001422
10. Fleming NS, Culler SD, McCorkle R, Becker ER, Ballard DJ. The financial and nonfinancial costs of implementing electronic health records in primary care practices. Health Aff (Millwood). 2011;30(3):481-489. doi:10.1377/hlthaff.2010.0768
11. Miller RH, West C, Brown TM, Sim I, Ganchoff C. The value of electronic health records in solo or small group practices. Health Aff (Millwood). 2005;24(5):1127-1137. doi:10.1377/hlthaff.24.5.1127
12. Patil M, Puri L, Gonzalez CM. Productivity and cost implications of implementing electronic medical records into an ambulatory surgical subspecialty clinic. Urology. 2008;71(2):173-177. doi:10.1016/j.urology.2007.09.024
13. Wang SJ, Middleton B, Prosser LA, et al. A cost-benefit analysis of electronic medical records in primary care. Am J Med. 2003;114(5):397-403. doi:10.1016/s0002-9343(03)00057-3
14. Johnston D, Pan E, Walker J. The value of CPOE in ambulatory settings. J Healthc Inf Manag. 2004;18(1):5-8.
15. O’Reilly D, Holbrook A, Blackhouse G, Troyan S, Goeree R. Cost-effectiveness of a shared computerized decision support system for diabetes linked to electronic medical records. J Am Med Inform Assoc. 2011;19(3):341-345. doi:10.1136/amiajnl-2011-000371
16. Poley MJ, Edelenbos KI, Mosseveld M, et al. Cost consequences of implementing an electronic decision support system for ordering laboratory tests in primary care: evidence from a controlled prospective study in the Netherlands. Clin Chem. 2007;53(2):213-219. doi:10.1373/clinchem.2006.073908
17. Panattoni L, Chan A, Yang Y, Olson C, Tai-Seale M. Nudging physicians and patients with autopend clinical decision support to improve diabetes management. Am J Manag Care. 2018;24(10):479-483.
18. Kaplan RS, Cooper R. Cost & Effect: Using Integrated Cost Systems to Drive Profitability and Performance. Harvard Business School Press; 1998.
19. Canby JB 4th. Applying activity-based costing to healthcare settings. J Healthc Financ Manage. 1995;49(2):50-52, 54-56.
20. Estimating the costs of primary care transformation: a practical guide and synthesis report. Agency for Healthcare Research and Quality. February 2017. Accessed January 15, 2019. https://www.ahrq.gov/ncepcr/grants/estimating-costs-grants/practical-guide/practical-guide.html
21. Panattoni L, Dillon E, Hurlimann L, Durbin M, Tai-Seale M. Cost estimates for designing and implementing a novel team care model for chronically ill patients. J Ambul Care Manage. 2018;41(1):58-70. doi:10.1097/JAC.0000000000000209
22. May 2014 occupational employment estimates: California. Bureau of Labor Statistics. Accessed June 6, 2020. https://www.bls.gov/oes/tables.htm
23. Charmaz K. Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis. Sage; 2006.
24. Glaser BG, Strauss AL. The Discovery of Grounded Theory: Strategies for Qualitative Research. Aldine; 1967.