Contributor: How Payers Are Applying AI to Solve the Problem of Incomplete Data—3 Use Cases

In this column, Calum Yacoubian, MD, outlines how artificial intelligence, specifically natural language processing, can help address gaps in patient data.

Amid constantly shifting regulations and ever-escalating costs, keeping members healthy is an ongoing challenge for payers. Adding to that burden is the need to manage messy, incomplete patient data.

For payers, comprehensive, accurate member data is a must for evaluating patients' health status, predicting future risk, and identifying care gaps that must be closed. However, patient data that fits these criteria is rare, and part of the challenge in harnessing health care data lies in its immense volume: the typical hospital generates roughly 50 petabytes of data per year, the equivalent of about 11,000 4K movies.

While most in the health care field are aware of the enormous amount of data the industry creates, few realize how much of it is unstructured or semi-structured: up to 80%, according to a report in Healthcare Informatics Research. That means a substantial portion of the industry's data is effectively locked up in the notes sections of electronic health record (EHR) systems and not readily accessible to guide clinical decision-making. This semi-structured and unstructured data often contains a range of important information, such as patients' symptoms, disease progression, lifestyle factors, and lab results.

Recently, payers have come under regulatory pressure to deliver patients seamless access to their own data as a result of the Interoperability and Patient Access final rule. Payers have understandably focused initial compliance efforts on providing access to structured data, but they should now include unstructured data to make that access more useful to patients.

NLP Automation, Not Manual Chart Reviews

Rather than expending resources on time-consuming manual chart reviews, an increasing number of payers are looking to tools powered by artificial intelligence (AI), such as natural language processing (NLP), to overcome the limitations of searching through mountains of data by hand.

NLP augments chart review and information extraction by helping machines "read" text, simulating the human ability to understand natural language and enabling consistent, unbiased analysis of vast amounts of text-based data without fatigue. Essentially, NLP allows computers to understand the nuanced meaning of clinical language within a given body of text, such as distinguishing among a patient who is a smoker, a patient who says she quit smoking 5 years ago, and a patient whose record says she is trying to quit.
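To make that distinction concrete, the following is a minimal, purely illustrative rule-based sketch in Python. Production clinical NLP engines rely on far richer linguistic models; this only shows the kind of mapping from free-text language to a discrete status described above, and the rules and labels are hypothetical.

    import re

    # Illustrative sketch only: a hand-rolled rule set that maps free-text
    # smoking language in a note snippet to a discrete status. Real clinical
    # NLP engines use much more sophisticated language understanding.
    def classify_smoking_status(note: str) -> str:
        text = note.lower()
        if re.search(r"\b(quit|stopped) smoking\b", text):
            return "former smoker"
        if re.search(r"\btrying to quit\b", text):
            return "current smoker, attempting to quit"
        if re.search(r"\b(smokes|smoker|tobacco use)\b", text):
            return "current smoker"
        return "smoking status not documented"

    print(classify_smoking_status("Patient reports she quit smoking 5 years ago."))
    # former smoker
    print(classify_smoking_status("She is a smoker and is trying to quit."))
    # current smoker, attempting to quit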

Payers have leveraged these previously hidden insights to uncover real-world evidence that feeds predictive models, ultimately enhancing patient outcomes, reducing costs, and improving risk adjustment. The following 3 use cases describe how.

Enhancing Medicare Advantage Risk Adjustment

The member health information used to ensure that Medicare Advantage plans receive appropriate funding to care for their patients is captured in hierarchical condition category (HCC) codes. Inaccurate HCC coding can result in significant costs for health plans, prompting many payers to employ large teams of chart reviewers who manually comb through records for documentation that supports or adds to these codes.

Confronted with this challenge, Independence Blue Cross has deployed NLP to augment its chart review program. The health plan set 2 clear goals for its NLP initiative: first, to accelerate the review process so chart reviewers can assess more documents per hour; and second, to capture diagnoses associated with HCC codes that chart review teams may have missed.

An initial pilot identified features for HCC codes with over 90% accuracy while processing per-patient documents of 45 to 100 pages. NLP now helps Independence Blue Cross process hundreds of thousands of complex medical records, speeding up chart reviews and making reviewers more efficient and productive.

Predicting Patient Risk of Diabetic Foot Ulcers

Financial challenges created by the COVID-19 pandemic have convinced the health care industry of the importance of predictive modeling. For example, in a late 2020 PwC survey of health care executives, nearly 75% of respondents said their organizations would invest more in predictive modeling in 2021. As one PwC executive noted, “The pandemic amplified the presence of unprocessed data and the lack of effort to do enough with it.”

One health plan leveraged NLP to mine unstructured data to feed a model that predicts patient risk of developing diabetic foot ulcers, a costly condition that can lead to amputation if left untreated. The payer’s data science team mined patient EHR notes for indicators of impending risk, such as body mass index, lifestyle factors, comments on medications, and documented foot diseases. Thus far, the model has identified 155 at-risk patients who could be proactively managed, translating to potential annual savings of $1.5 million to $3.5 million from prevented amputations, according to internal payer data.
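To make the approach concrete, here is a minimal sketch of how NLP-derived features might feed such a risk model. The feature set, training data, and model choice below are hypothetical illustrations, not the payer's actual system.

    # Minimal sketch, assuming NLP has already pulled structured risk features
    # (BMI, documented neuropathy, prior foot disease, smoking status) out of
    # chart notes; the features and data are invented for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row: [BMI, neuropathy documented (0/1), prior foot disease (0/1), current smoker (0/1)]
    X_train = np.array([
        [24.0, 0, 0, 0],
        [31.5, 1, 0, 1],
        [28.2, 0, 1, 0],
        [35.0, 1, 1, 1],
        [22.8, 0, 0, 1],
        [29.9, 1, 1, 0],
    ])
    y_train = np.array([0, 1, 0, 1, 0, 1])  # 1 = member later developed a foot ulcer

    model = LogisticRegression().fit(X_train, y_train)

    # Score a new member whose features were extracted from unstructured notes
    new_member = np.array([[33.1, 1, 0, 1]])
    risk = model.predict_proba(new_member)[0, 1]
    print(f"Estimated foot ulcer risk: {risk:.0%}")  # flag for outreach above a chosen threshold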

Solving Social Determinants of Health (SDoH) Challenges

Social determinants of health, such as access to housing, food, transportation, and employment, play an essential role in patients’ overall health. However, much of the data pertaining to these factors is locked in unstructured sources such as admission, discharge, and progress notes.

One organization has used NLP to search the unstructured notes in records of patients with prostate cancer to identify those at risk of social isolation, with 90% accuracy. By adopting NLP to screen for this vital social characteristic, payers can establish outreach campaigns to connect with patients deemed at risk of missing appointments and experiencing unchecked disease progression.

Despite laudable industry-wide efforts toward greater interoperability of health care information systems, the industry is unlikely to escape the problem of fragmented, disparate data any time soon. To get full value from the data at their fingertips, improve patient outcomes, and reduce costs, payers will increasingly look to AI-powered technologies like NLP.
