Improving Quality Measure Maintenance: Navigating the Complexities of Evolving Evidence

An exploration of the potential negative effects of delayed measure maintenance when changes in clinical evidence affect measure use found that such delays may affect patient care and outcomes.

ABSTRACT

To be effective, healthcare quality measures must communicate clear, evidence-based standards to promote improved quality of care and outcomes. When the evidentiary foundation for measures changes, revisions must be made quickly and communicated clearly; otherwise, measures can confuse providers who are trying to reconcile the evidence-based care they deliver with outdated measure specifications. Outdated measures can also affect clinical decision making, potentially harming patients if the measures promote care that is not the best treatment for their condition according to the most recent evidence.

This case study focuses on 2 measures for which the evidence base changed, yet implementation of revised specifications lagged and subsequently affected the payment programs in which the measures are used. The case study is shared to motivate collaboration among quality measurement stakeholders to advance shared responsibility for timely measure updates when evidence changes and to avoid confusion in measure implementation.

Multiple parties share the responsibility for ensuring that measures are updated and aligned with evidence and practice recommendations. Issues of coordination among clinical experts, measure developers or stewards, and program implementers, including health plans, are not unique to any steward or implementer. The timing of new evidence releases and guidelines for the condition, service, or product being measured will always vary regardless of the measure update cycle for any one program. Changes to measure maintenance processes cannot totally negate these underlying challenges but can mitigate their impact. This case study calls for a national conversation to address opportunities for measure update process improvements.

Am J Manag Care. 2019;25(6):e188-e191

Takeaway Points

Quality measures play an essential role in driving improvement; however, measurement is fraught with challenges that make consistent application difficult. In this article, we review how approval of the heart failure drug sacubitril/valsartan tablets introduced complexities in the measure life cycle for 2 value-based payment heart failure measures.

  • Effective measures must communicate clear, evidence-based standards of quality care so that providers can understand their role in achieving quality.
  • Outdated measures can confuse providers who must reconcile the evidence-based care they deliver with obsolete specifications.
  • Three principles should be considered to promote coordination between measure developers and implementers: communication, flexibility, and streamlining.

Quality measures are important tools for guiding, assessing, and rewarding improvement in healthcare delivery and outcomes. For measures to be effective, they must communicate clear, evidence-based standards of quality care so that the providers being measured can understand what they must do to meet those standards. When the evidentiary foundation for a measure changes, revisions to the measure must be made quickly and communicated clearly; otherwise, measures can confuse providers who are trying to reconcile the evidence-based quality of care they deliver with outdated specifications. Beyond frustrating providers, outdated measures can affect clinical decision making, potentially harming patients or preventing them from receiving treatment that could optimize clinical outcomes. This case study examines the paths of 2 measures used in value-based payment programs in which specification revisions lagged significantly behind clinical guideline updates and subsequently generated frustration and confusion from misalignment.

The responsibility for ensuring that measures are updated and aligned with evidence and practice recommendations is shared by multiple parties. Responsibility may begin with measure stewards but ultimately requires action on the part of program implementers, including health plans, to ensure that providers are reporting on currently accurate specifications. Ideally, the timelines and processes of the measure stewards who maintain measures and those who implement measures in programs are coordinated. However, synchronization between measure stewards and implementers has proved difficult to achieve in practice. Stewards have their own update schedules and rigorous processes for evaluating evidence, guidelines, and feedback about technical issues with their measures. Likewise, each program has its own schedule and means for receiving updates from different stewards, often on an annual or semiannual basis. Measures may come in multiple formats (eg, claims based or electronic health record [EHR] based), each of which must be updated and implemented in slightly different ways, adding further complexity. With so many variables at play, updates needed to keep measures current can be delayed, frustrating providers rather than facilitating higher-quality care.

This case study illustrates how an update to 2 measures prompted by new guidelines was delayed, resulting in implementation confusion. The case focuses on 2 heart failure measures included in Medicare programs, but it is relevant across the healthcare system, including to health plans. The situation presented could recur anytime the evidence base for a measure is significantly altered. In addition, there may be multiple opportunities to improve coordination among measurement stakeholders, and this case study aims to prompt broader collaboration to shorten the time lag between the emergence of evidence and its application to measures. The case highlights the necessity for measure stewards and implementers to communicate more effectively, remain flexible, and streamline processes.

CASE STUDY

Medicare Quality Measurement Programs

The Medicare Access and CHIP Reauthorization Act of 2015 requires eligible clinicians to report quality data to CMS to earn financial incentives under the Quality Payment Program (QPP). One track of the QPP is the Merit-based Incentive Payment System (MIPS). MIPS participants earn a performance-based payment adjustment, in part based on their performance on quality measures. Clinicians can choose which quality measures to report from a measure set selected by CMS. The overarching objective of the QPP is to improve Medicare outcomes by rewarding clinicians for higher-quality care and outcomes.

Updating the measure sets for value-based payment programs, such as MIPS, is a complex process that is primarily dictated by CMS’ annual rulemaking and standardized specification release. Given the complexity of the measure update processes, coordination between measure stewards and CMS is critical to ensure that updates are made in a timely manner and communicated to the providers participating in the quality programs.

Heart Failure Measures

The MIPS measure set includes 2 measures designed to promote better outcomes for patients with reduced cardiac ejection fraction by assessing use of appropriate drug therapy (see Table). These measures are designated as 0066 and 0081 by the National Quality Forum (NQF). Specifically, the measures address the prescription of angiotensin-converting enzyme (ACE) inhibitors and angiotensin receptor blockers (ARBs), which are drug classes indicated for lowering blood pressure and increasing blood flow in patients with heart failure.1 Both measures have been endorsed by the NQF.2,3 The NQF endorses and updates measures through a voluntary consensus process that includes rigorous endorsement criteria.

The data for both measures are reportable to CMS through qualified registries or qualified clinical data registries, such as the American College of Cardiology (ACC) PINNACLE clinical data registry.4 The data for measure 0081 are also reportable as an electronic clinical quality measure (eCQM),5 meaning that the data can be obtained and reported through a clinician’s EHR system. Both measures are reviewed on regular cycles by their respective stewards to update their specifications when necessary.

For eCQM reporting, the lists of drugs that are considered part of the ACE inhibitor and ARB classes are maintained in value sets. Value sets include the specific codes (eg, RxNorm, Current Procedural Terminology, and International Classification of Diseases, Tenth Revision) used to calculate the measure and are housed in the Value Set Authority Center (VSAC).6 However, for registry reporting, the lists of drugs are coded within the registry itself, and registries do not always use the value sets found within the VSAC.
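To make the value-set mechanics concrete, the sketch below shows how an eCQM engine might test whether a prescribed drug satisfies the measure numerator via value-set membership. This is illustrative only: the drug codes shown are hypothetical placeholders, not the actual VSAC value set contents for measure 0081.

```python
# Illustrative sketch of eCQM value-set membership checking.
# All RxNorm codes below are hypothetical placeholders, not the
# actual VSAC value sets for measure 0081.

ACE_ARB_VALUE_SET = {
    "198188",  # hypothetical RxNorm code for an ACE inhibitor product
    "979480",  # hypothetical RxNorm code for an ARB product
}

ARNI_CODE = "1656340"  # hypothetical RxNorm code for sacubitril/valsartan

def meets_numerator(prescribed_rxnorm_codes, value_set):
    """A patient meets the numerator if any prescribed drug is in the value set."""
    return any(code in value_set for code in prescribed_rxnorm_codes)

# Before the value-set update: an ARNI prescription does not count.
print(meets_numerator({ARNI_CODE}, ACE_ARB_VALUE_SET))  # False

# After the update: adding the ARNI code to the value set makes it count.
updated_value_set = ACE_ARB_VALUE_SET | {ARNI_CODE}
print(meets_numerator({ARNI_CODE}, updated_value_set))  # True
```

This also shows why registry reporting can drift from eCQM reporting: a registry that hard-codes its own drug list, rather than drawing from the VSAC value set, must be updated separately.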

The New Evidence and Need for Reassessment

In July 2015, the FDA approved a new drug, sacubitril/valsartan tablets, for use in patients with chronic heart failure (New York Heart Association class II-IV) and reduced ejection fraction. In a clinical trial, sacubitril/valsartan was found to significantly reduce the risk of cardiovascular death and hospitalizations related to heart failure. This new class of drug is called an angiotensin receptor-neprilysin inhibitor (ARNI). An ARNI is an ARB with an additional active ingredient to further relax blood vessels and decrease sodium and fluid in the body. Sacubitril/valsartan is the only product currently in the ARNI class.7

As is customary as evidence evolves, in May 2016, the ACC/American Heart Association (AHA) Task Force on Clinical Practice Guidelines issued a focused update on pharmacological therapy for heart failure with reduced ejection fraction, based on study data that had been recently published.8 The guideline update indicated that ARNIs could be appropriately prescribed in place of ACE inhibitors or ARBs. The recommendation was class I (strong) and was supported by moderate-quality evidence.

Guideline updates often motivate review of existing performance measures to ensure alignment of the evidentiary foundation between measures and clinical guidelines. The ACC/AHA focused update presented an opportunity for the stewards of measures 0066 and 0081 to consider a specification revision. If ARNIs are appropriate treatment in place of ACE inhibitors or ARBs under the new guidelines, how should an ARNI prescription be captured by clinical quality measures?

The Measure Update Process

Acting in accordance with their established processes and after critical review of available evidence, the stewards of both measures determined that because an ARNI is similar to an ARB and because it is applied to the same population for the same indications, prescription of an ARNI should satisfy the requirements of the measures. To ensure accurate reporting of the measures, both the value set used for the eCQM and the numerator definition for the registry measures required updating so that clinicians reporting on either or both measures could count their prescription of ARNIs as compliant.

For the eCQM version of measure 0081, the stewards updated the value set to include ARNIs as part of the annual measure update process with CMS, which concluded in March 2016. Although the value set update meant the eCQM version of the specification was aligned with the guidelines for CMS’ May 2017 publication of the eCQM MIPS specifications for 2017 MIPS reporting, CMS’ update to the registry specification lagged until fall 2017 for 2018 MIPS reporting.

For the registry specification, the stewards or program implementers needed to add sacubitril/valsartan to the numerator drug list table or provide specific guidance to clinicians that an ARNI should be captured as an ARB when reporting measure results. In late 2016, the Physician Consortium for Performance Improvement, the steward for measure 0081, convened its technical expert panel and determined that it was premature to add the ARNI language to the measure numerator statement. A full year later, in late 2017, CMS, the program implementer, completed the second and critical step for clarifying reporting for the registry measures with the following statement:

NUMERATOR NOTE: Eligible clinicians who have given a prescription to the patient or whose patient is currently taking a combination medication therapy, which contains either an angiotensin converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) (eg, angiotensin receptor neprilysin inhibitor [ARNI, sacubitril/valsartan], ACEI+diuretic, ARB+diuretic, ACEI+calcium channel blocker) would meet performance for this measure…9

Summary of the Problem

For the registry-reported measures, more than a year elapsed between publication of the updated guideline indicating that ARNIs could be used in place of ACE inhibitors and ARBs (May 2016) and full integration of the 2 updated measures into MIPS by CMS (late 2017). In the interim, clinicians who referenced the specifications while reporting the measures were confronted with the question of how to account for patients who were prescribed ARNIs. Some may have claimed a denominator exception by reporting that these patients did not receive an ACE inhibitor/ARB for medical reasons, although this is inaccurate. Clinicians who did not exclude patients receiving an ARNI saw understated performance results, because those prescriptions did not count toward the numerator. This scenario is illustrated in the Figure.
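The arithmetic behind this misalignment can be sketched as follows. The patient counts are invented for illustration, and the rate calculation is simplified (actual MIPS scoring also handles exclusion categories not shown here), but it shows how both workarounds distort the reported rate.

```python
# Illustrative sketch of how an outdated numerator distorts a reported
# performance rate. Patient counts are invented for illustration.

def performance_rate(numerator, denominator_exceptions, eligible):
    """Simplified rate: numerator / (eligible patients minus exceptions)."""
    return numerator / (eligible - denominator_exceptions)

eligible = 100       # patients in the measure denominator
on_ace_or_arb = 70   # counted by both old and new specifications
on_arni = 20         # guideline-concordant, but not counted before the update

# Outdated specification: ARNI patients look like failures.
outdated = performance_rate(on_ace_or_arb, 0, eligible)           # 0.70

# Inaccurate workaround: claiming a denominator exception for ARNI patients.
workaround = performance_rate(on_ace_or_arb, on_arni, eligible)   # 0.875

# Updated specification: ARNI prescriptions satisfy the numerator.
updated = performance_rate(on_ace_or_arb + on_arni, 0, eligible)  # 0.90

print(outdated, workaround, updated)
```

Under the outdated specification, a clinician delivering fully guideline-concordant care reports 70% instead of 90%, and the exception workaround still misstates both the rate and the clinical record.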

This situation created additional confusion for clinicians because the text descriptions of the measures did not specifically refer to ARNIs; the descriptions referred only to ACE inhibitors and ARBs. Therefore, even after the value set and registry specifications were updated, it was not obvious to clinicians that prescription of ARNIs met the standard of the measure. As a result, clinicians may have been less likely to prescribe ARNIs due to the perceived impact on their performance results.

Principles for Improvement

Issues of coordination among measure stewards and program implementers are not unique to CMS or any particular steward, health plan, or other measurement program. Any federal, state, commercial, employer, or internal healthcare quality program that incorporates measures from multiple stewards is subject to coordination issues for each measure included in the program. The timing of new evidence and guideline updates for a condition, service, or product that is being measured will always vary regardless of the update cycle for any one program. Changes to measure maintenance processes cannot totally negate these underlying challenges but can help mitigate their impact.

This case study demonstrates a need for a broader dialogue among measurement stakeholders about mechanisms to ensure that quality measures are maintained to keep pace with emerging evidence. CMS’ efforts toward reducing provider burden and promoting meaningful measures are congruent with this conversation. Measures are only meaningful and worth reporting when they are aligned with current evidence and practice standards. The principles of open communication, flexibility in program operations, and streamlining program processes should be considered in pursuit of closer coordination among measure stewards, health plans, and measurement program implementers.

CONCLUSIONS

Quality measures that are important, evidence-based, methodologically sound, feasible, usable, and harmonized are the cornerstone of a value-based healthcare delivery and payment system. As such, it is imperative that measures are maintained in a timely manner to ensure that they reflect the most current evidence base. Measure stewards, CMS, health plans, other measurement program implementers, clinicians, and all other stakeholders invested in the promise of quality measurement must work together to continuously improve the processes for maintaining measures.

This case study focuses on just 2 measures, and it is not known how often or how many measures may be caught in similar circumstances. To tackle the numerous issues raised by the case study, participants in a multistakeholder forum, perhaps convened by CMS or NQF, could continue this discussion and develop mechanisms for ensuring that measures are maintained based on a strong evidentiary foundation throughout their life cycle. Additional questions that could be explored in such a forum include: What is the extent of the coordination problem in terms of numbers of measures affected? What are the impacts on patient care and outcomes? What are the root causes that need solutions? Who should coordinate all the interested stakeholders? Should this be CMS’ role or another organization’s? Are issues different for private payers compared with CMS? How can measurement stakeholders efficiently and continuously work together?

Author Affiliations: Discern Health (TBV, SS, DMS), Baltimore, MD; Novartis (JVM), Washington, DC.

Source of Funding: Novartis Pharmaceuticals.

Author Disclosures: Dr Valuck, Ms Sampsel, and Dr Sloan are employed by Discern Health, a consulting firm that contracts with life sciences companies, and received payment for involvement in the preparation of this manuscript as a condition of that employment. Dr Van Meter is employed by and holds stock in Novartis, which manufactures and sells sacubitril/valsartan, and received payment for involvement in the preparation of this manuscript as a condition of that employment.

Authorship Information: Concept and design (TBV, SS, DMS, JVM); analysis and interpretation of data (SS, DMS); drafting of the manuscript (TBV, SS, DMS, JVM); critical revision of the manuscript for important intellectual content (TBV, SS); obtaining funding (TBV, SS, JVM); and supervision (TBV).

Address Correspondence to: Thomas B. Valuck, MD, JD, Discern Health, 1120 N Charles St, Ste 200, Baltimore, MD 21201. Email: tvaluck@discernhealth.com.

REFERENCES

1. Hunt SA; American College of Cardiology; American Heart Association Task Force on Practice Guidelines (Writing Committee to Update the 2001 Guidelines for the Evaluation and Management of Heart Failure). ACC/AHA 2005 guideline update for the diagnosis and management of chronic heart failure in the adult: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines (Writing Committee to Update the 2001 Guidelines for the Evaluation and Management of Heart Failure) [erratum in J Am Coll Cardiol. 2006;47(7):1503-1505]. J Am Coll Cardiol. 2006;46(6):e1-e82. doi: 10.1016/j.jacc.2005.08.022.

2. 0066: coronary artery disease (CAD): angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) therapy - diabetes or left ventricular systolic dysfunction (LVEF <40%). National Quality Forum website. qualityforum.org/QPS/0066. Updated December 9, 2016. Accessed October 11, 2017.

3. 0081: heart failure (HF): angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) therapy for left ventricular systolic dysfunction (LVSD). National Quality Forum website. qualityforum.org/QPS/0081. Updated March 28, 2017. Accessed October 11, 2017.

4. Outpatient registries. American College of Cardiology Quality Improvement for Institutions website. cvquality.acc.org/NCDR-Home/Registries/Outpatient-Registries. Accessed October 11, 2017.

5. Heart failure (HF): angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) therapy for left ventricular systolic dysfunction (LVSD). Electronic Clinical Quality Improvement Resource Center website. ecqi.healthit.gov/ecqm/measures/cms135v6. Updated May 3, 2018. Accessed May 10, 2019.

6. Search value sets. Value Set Authority Center website. vsac.nlm.nih.gov/valueset/expansions?pr=all. Accessed October 11, 2017.

7. FDA approves new drug to treat heart failure [news release]. Silver Spring, MD: FDA; July 7, 2015. wayback.archive-it.org/7993/20180126023455/https://www.fda.gov/NewsEvents/Newsroom/PressAnnouncements/ucm453845.htm. Accessed October 11, 2017.

8. Yancy CW, Jessup M, Bozkurt B, et al. 2016 ACC/AHA/HFSA focused update on new pharmacological therapy for heart failure: an update of the 2013 ACCF/AHA guideline for the management of heart failure: a report of the American College of Cardiology/American Heart Association Task Force on Clinical Practice Guidelines and the Heart Failure Society of America [erratum in J Am Coll Cardiol. 2016;68(13):1495. doi: 10.1016/j.jacc.2016.08.013]. J Am Coll Cardiol. 2016;68(13):1476-1488. doi: 10.1016/j.jacc.2016.05.011.

9. 2018 quality measure specifications [zip file]. CMS Quality Payment Program website. qpp-cm-prod-content.s3.amazonaws.com/uploads/114/2018+Quality+Measure+Specifications.zip. Accessed May 10, 2019.