
Overcoming Challenges and Concerns of Using CER

Although comparative effectiveness research (CER) shows considerable promise as a tool to help healthcare providers, policy makers, and patients make better decisions, a number of issues remain to be resolved, according to speakers at the Academy of Managed Care Pharmacy's 27th Annual Meeting & Expo.

During the session “Using Comparative Effectiveness to Improve the Credibility of Evidence for Payers,” the speakers discussed payer evidence needs, developing a framework to improve that payer evidence, evaluating evidence credibility, and overcoming barriers to implementation of CER.

Peter M. Penna, PharmD, president of US managed care at Formulary Resources, explained that payers have 4 basic questions:

  1. Is the technology under review more effective, less effective, or about the same as comparator technologies?
  2. Is the technology under review safer and more tolerable, less safe and tolerable, or about as safe and tolerable as comparator technologies?
  3. Are there identifiable subpopulations?
  4. If all else is equal, which technology offers the better value?

However, concerns remain, he said, including the lingering issue of "death panels" (though that concern is fading), legislative restrictions on the use of CER, and the slow rollout of results.

Rafael Alfonso-Cristancho, MD, PhD, MSc, director of value evidence analytics at GlaxoSmithKline, discussed building an analytic framework that accounts not only for patient factors but also for factors such as hospital technology capabilities and hospital volume. He noted that models can be used independently or combined to answer multiple questions.

“There is not a right or wrong analytic framework,” he said. “It depends on … whatever is relevant for that specific research question.”

David Veenstra, PharmD, PhD, professor at the University of Washington, highlighted the challenges of evaluating CER data and discussed the GRACE (Good ReseArch for Comparative Effectiveness) Initiative, which has created principles that lay out the elements of good practice for the design, conduct, analysis, and reporting of observational CER studies, as well as an 11-point checklist for screening the quality of observational comparative effectiveness studies.
