Overcoming Challenges and Concerns of Using CER

Although there is a lot of promise for comparative effectiveness research (CER) as a tool to help healthcare providers, policy makers, and patients make better decisions, there are a number of issues to be resolved, according to speakers at the Academy of Managed Care Pharmacy’s 27th Annual Meeting & Expo.

During the session “Using Comparative Effectiveness to Improve the Credibility of Evidence for Payers,” the speakers discussed payer evidence needs, developing a framework to improve that payer evidence, evaluating evidence credibility, and overcoming barriers to implementation of CER.

Peter M. Penna, PharmD, president of US managed care at Formulary Resources, explained that payers have 4 basic questions:

  1. Is the technology under review more effective, less effective, or about the same as comparator technologies?
  2. Is the technology under review safer and more tolerable, less safe and tolerable, or about as safe and tolerable as comparator technologies?
  3. Are there identifiable subpopulations?
  4. If all else is equal, which technology offers the better value?

However, concerns remain, he said, including the old issue of death panels (although these concerns are fading), legislative restrictions on the use of CER, and the slow rollout of results.

Rafael Alfonso-Cristancho, MD, PhD, MSc, director of value evidence analytics at GlaxoSmithKline, discussed building an analytic framework that accounts not only for patient factors but also for factors such as hospital technology capabilities and hospital volume. In addition, models can be used not only independently but also combined to answer multiple questions.

“There is not a right or wrong analytic framework,” he said. “It depends on … whatever is relevant for that specific research question.”

David Veenstra, PharmD, PhD, professor at the University of Washington, highlighted the challenges of evaluating CER data and discussed the GRACE (Good ReseArch for Comparative Effectiveness) Initiative, which has created principles that lay out the elements of good practice for the design, conduct, analysis, and reporting of observational CER studies, as well as an 11-point checklist for screening the quality of observational comparative effectiveness studies.
