The American Journal of Accountable Care December 2016

Quality in Healthcare and National Quality Awards

Beth M. Beaudin-Seiler, PhD, and Kieran Fogarty, PhD
Hospitals receiving national quality awards showed better performance in selected outcomes variables compared with hospitals that had not received these awards.

Objectives: To examine the differences in 11 selected key variables between the quality outcomes of national quality award–winning hospitals and those of non–national quality award–winning hospitals.

Study Design: Data from the Medicare Hospital Compare database were collected for more than 3000 hospitals across the United States. National quality award–winning hospitals are defined as those that received either the Malcolm Baldrige National Quality Award for Health Care or the Healthgrades Distinguished Hospital for Clinical Excellence Award in 2012, 2013, or 2014. Hospitals that received neither of these 2 awards were categorized as non–national quality award–winning. Variables from the Medicare Hospital Compare database were refined to include only key variables in clinical quality.

Methods: This quantitative study used the Mann-Whitney U nonparametric test to examine the differences between hospitals that have and have not received national quality awards and their clinical performance outcomes.

Results: Hospitals that had received a national quality award showed statistically significant better performance on 6 of 11 identified quality variables compared with those that had not. One variable (emergency department wait time) had statistically significant better performance in non–national quality award–winning hospitals. The remaining 4 variables did not show a statistically significant difference in performance between hospitals that had and had not won a national quality award.

Conclusions: The findings suggest that national quality award–winning hospitals have better performance outcomes on 6 of 11 specific quality variables compared with hospitals that did not win 1 of the 2 defined national quality awards.

The American Journal of Accountable Care. 2016;4(4):26-32
The Institute of Medicine’s (IOM) 1999 report, To Err Is Human, as well as subsequent follow-up reports in 2005 and 2009, indicates that patient safety and quality continue to be problems within the US healthcare industry.1-3 One in 3 patients experiences adverse events (AEs) during the course of their hospital stay,4 and the most recent research suggests that upwards of 440,000 individuals die each year due to medical errors, which can cost the US economy as much as $980 billion.5,6 Empirical evidence of medical errors within the hospital setting is not always transparent, a fact that may deter the identification and remediation of AEs. Hospitals may fear reprisals if they publish data on medical errors; however, as patient safety researchers such as Marty Makary (a surgeon at Johns Hopkins Hospital and professor at the Johns Hopkins Bloomberg School of Public Health) and others have argued, concealing such information holds no benefit.7

Although logic suggests that the hospitals receiving national quality awards have lower levels of medical errors and have better performance on quality indicators, thus making the cost in resources to obtain or publicize these awards worthwhile, the literature does not indicate whether this is truly the case.8 Performance measurements have shown an increase in patient safety, but arguably not as much as is wanted or needed. In the 2014 National Healthcare Quality & Disparities report from the Agency for Healthcare Research and Quality (AHRQ), the median change in quality in 2012 was only 3.6% per year among measures of patient safety, and only 1.7% per year among measures of effective treatment.9  

This study examined over 3000 hospitals across the United States, looking at 11 specific variables from the timely and effective care category of the Medicare Hospital Compare database8 in order to better understand whether hospitals that have won national quality awards perform differently from hospitals that have not. The question of whether it is worthwhile for hospitals to invest resources in obtaining and marketing national quality awards is also discussed. In addition, a connection to highly reliable organizations (HROs) is a theme drawn through the conclusions. Our findings support the research of Weick and Sutcliffe (2007): HROs make continuous efforts to analyze their systems and look for errors; they recognize near misses and the dangers of complacency, and they continually attend to the sensitivity of their operations, developing situational awareness for their organization.10

The Joint Commission is the United States’ oldest and largest accrediting and standard-setting organization in healthcare, with a number of quality initiatives and programs offered to healthcare organizations.11 Nevertheless, the quality metrics of The Joint Commission were purposefully not used in this study; instead, quality awards from sources independent of the accrediting body were chosen. A number of independent quality awards are given; the Malcolm Baldrige National Quality Award (Malcolm Baldrige Award) and the Healthgrades Distinguished Hospital for Clinical Excellence Award (Healthgrades Award) were selected because they are nationally recognized marks of quality.

Study Design

A quantitative study using secondary data sources from the Medicare Hospital Compare database was used to determine performance outcomes on selected variables. Hospitals having received either a Malcolm Baldrige Award or a Healthgrades Award were categorized as national quality award–winning; all other hospitals were categorized as non–national quality award–winning. Data housed in the Medicare Hospital Compare database were collected through a number of sources, including voluntary reporting via the CMS Abstraction and Reporting Tool, Medicare enrollment and claims data, Veterans Affairs administrative data, the CDC’s National Healthcare Safety Network, and AHRQ Patient Safety Indicators, and are listed as 86 separate variables.12 The variables were categorized into multiple areas, including: patients’ experience; timely and effective care; readmissions, complications, and deaths; use of medical imaging; payment and value of care; and number of Medicare patients.12

This study focused on variables categorized in timely and effective care. These variables indicated the following: percentage of patients who received best-practice treatments for their condition, timeliness of treatment for patients with certain medical emergencies, and recommendations for preventive treatments. The measures reported reflect the accepted standard of care based on current scientific evidence; they are regularly reviewed and revised to ensure they are current. The measures do not have a risk adjustment calculation, but are reported as percentages.12 There are 50 variables listed in the timely and effective care category.13 A refinement process of cross-referencing quality indicators from various national organizations and empirical research14-22 reduced the number to 11 key quality variables (Table 1).

There are 4861 healthcare organizations that report to the Medicare Hospital Compare database. Not all, however, offer the same services; therefore, any reporting organization that was missing data for 6 or more of the 11 selected key quality variables was excluded from the analysis. This exclusion criterion left 3118 healthcare organizations eligible for analysis in this study.8
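As an illustrative sketch (the hospital names and values below are invented, not drawn from the actual Hospital Compare extract), the exclusion criterion amounts to a simple filter on the count of missing values per organization:

```python
# Exclude any hospital missing 6 or more of the 11 key quality variables.
# None marks a missing value; the records below are purely illustrative.
def eligible_hospitals(hospitals, max_missing=5):
    """Return names of hospitals with at most `max_missing` missing variables."""
    kept = []
    for name, values in hospitals.items():
        missing = sum(1 for v in values if v is None)
        if missing <= max_missing:
            kept.append(name)
    return kept

sample = {
    "Hospital A": [97.8, 96.5, None, 98.0, 97.3, 95.8, 3.4, 299.0, 97.9, 96.7, 98.9],
    "Hospital B": [None] * 7 + [280.0, 96.0, 95.0, 97.0],  # 7 of 11 missing -> excluded
}
print(eligible_hospitals(sample))  # ['Hospital A']
```

Under this rule a hospital may still be missing up to 5 of the 11 variables, so each variable is compared on the subset of hospitals that report it.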

The healthcare organizations were then separated into 2 groups: national quality award–winning (those hospitals having received the Malcolm Baldrige Award and/or the Healthgrades Award) and non–national quality award–winning (all hospitals that had not won at least 1 of the aforementioned awards). When a Malcolm Baldrige Award was won by a system (ie, more than 1 hospital was included in the application), all hospitals within that system were counted as national quality award–winners. A total of 493 national quality award–winning organizations and 2625 non–national quality award–winning organizations were identified. Of the 493 national quality award–winning organizations, 11 received both the Malcolm Baldrige and Healthgrades awards.


Each variable was evaluated separately using a Mann-Whitney U test at the .05 level of significance. The Mann-Whitney U test was selected because the variable data violated the assumptions of the independent t test. However, the data for all the variables in this project satisfied the assumptions of the Mann-Whitney U test (ie, the dependent variable is continuous, there is 1 independent variable with 2 categories, cases cannot be in both categories, and the distributions of the 2 groups have the same shape). Distributions of the scores for national quality award–winning hospitals and nonwinning hospitals were similar, as assessed by visual inspection.
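The comparison described above can be sketched in a few lines. This is an illustrative re-implementation of the Mann-Whitney U test using the large-sample normal approximation (midranks for ties, no tie correction), not the authors’ actual analysis code, and the sample values are invented:

```python
import math

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation.

    Returns (U, p) for group x vs group y. Ties receive midranks; the
    tie correction to the variance is omitted for brevity.
    """
    data = sorted(x + y)
    # Assign midranks: tied values share the average of their rank positions.
    ranks = {}
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        ranks[data[i]] = (i + j + 1) / 2  # average of ranks i+1 .. j
        i = j
    n1, n2 = len(x), len(y)
    r1 = sum(ranks[v] for v in x)          # rank sum of group x
    u1 = r1 - n1 * (n1 + 1) / 2            # U statistic for group x
    mu = n1 * n2 / 2                       # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u1, p

# Invented percentages for 2 small hospital groups:
u, p = mann_whitney_u([97.8, 97.5, 98.1], [96.5, 96.9, 96.0])  # u = 9.0, p ≈ 0.05
```

In the actual study each of the 11 variables was tested this way across thousands of hospitals, where the normal approximation is very accurate; standard statistical packages apply the same procedure with the tie correction included.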

In order to establish homogeneity among hospital subjects, several delimitations were selected. The entire population of hospitals that report to the US Medicare Hospital Compare database was initially selected; however, hospitals that had missing data for half or more of the refined variables were excluded. The Medicare Hospital Compare database collects information on 86 variables, but only those in the timely and effective care category were selected, because they were unaltered in any way. Next, the variables were refined from 86 to 11 through a search of quality indicators from the literature and from professional societies and associations; however, selection of different variables might yield different results. Finally, it is also noted that hospitals that self-selected to apply for a national quality award, such as the Malcolm Baldrige Award, may have better funding, more staff, and more resources that are not necessarily directly related to quality and that were not measured in this study.8


Results

Findings indicated that, overall, hospitals that had received national quality awards performed statistically significantly better on more variables than those that had not. The variables specified in this study reflect either the percentage of correct procedures or the number of minutes to execution of an event. For the majority of the percentage variables, a higher percentage means the treatment occurred in the correct time and in the correct manner more often; the exception is the obstetrics variable, for which better performance is reflected in a lower percentage. For the variables tracking minutes, a lower number of minutes to execution of an event was considered better performance.

In a comparison of hospitals that had received a national quality award with those that had not, on average, a higher percentage of patients in award-winning hospitals received appropriate medications for heart failure (97.8% vs 96.5%, respectively; P = .036) and pneumonia (97.3% vs 95.8%; P <.001), had their urinary catheters removed after surgery in a timely manner (97.9% vs 96.7%; P = .003), received appropriate medication to prevent blood clots after surgery (98.9% vs 98.0%; P <.001), and received antibiotics prior to surgery in a timely manner (97.8% vs 96.7%; P = .018).

In addition, the percentage of babies delivered too early was statistically significantly lower in hospitals that had won national quality awards (3.4% vs 5.0%, respectively; P = .001). However, the number of minutes spent in the emergency department (ED) was statistically significantly higher in hospitals that had received a national quality award than in those that had not (299.3 vs 277.3 minutes, respectively; P <.001).

Table 2 provides an overview of the statistical analysis for the identified key quality variables in this study.


Discussion

Overall, the findings of the study were expected, as it is logical that the hospitals that received national quality awards would be better-performing than those that did not receive an award. The only unexpected finding was that the number of minutes spent in the EDs of national quality award–winning hospitals was statistically significantly higher than the number of minutes spent in the EDs of nonwinning hospitals. We speculate there may be a couple of reasons for this finding: 1) national quality award–winning hospitals have a reputation for quality and are therefore the hospital of choice for many patients, causing longer waits; and/or 2) national quality award–winning hospitals may be functioning more as HROs, refusing to simplify and, in doing so, increasing time in the ED.

Pope (2015) found that hospitals that had won the Malcolm Baldrige Award were not quite yet HROs, but were closer to reflecting an HRO than other hospitals.23 However, when the data were analyzed a step further and the type of national award received (ie, either the Malcolm Baldrige or the Healthgrades award) was examined, findings indicated more variables with statistically significantly better performance among Healthgrades Award recipients than among Malcolm Baldrige Award recipients. This may be because the Malcolm Baldrige Award is a holistic process focusing on the entire organization, whereas the Healthgrades Award focuses on outcomes only.
