
Actions to Improve Quality: Results From a National Hospital Survey

The American Journal of Managed Care, December 2021, Volume 27, Issue 12

Hospitals reported widespread adoption of quality improvement (QI) changes to improve on CMS quality measures, and QI adoption was associated with improved performance on quality measures.

ABSTRACT

Objectives: CMS measures and reports hospital performance to drive quality improvement (QI), but information on actions that hospitals have taken in response to quality measurement is lacking. We aimed to develop national estimates of QI actions undertaken by hospitals and to explore their relationship to performance on CMS quality measures.

Study Design: Nationally representative cross-sectional survey of acute care hospitals in 2016 (n = 1313 respondents; 64% response rate).

Methods: We assessed 23 possible QI changes. Using multivariate linear regression, we estimated the relationship between reported QI changes and performance on composite measures derived from 26 Hospital Inpatient Quality Reporting Program measures (scaled 0-100), controlling for case mix and facility characteristics.

Results: Hospitals reported implementing a mean of 17 QI changes (median [interquartile range], 17 [15-20]). Large hospitals reported significantly higher adoption rates than small hospitals for 18 QI changes. Most hospitals that reported making QI changes (63%-96% for the 23 changes) responded that the specific change made helped improve performance. In multivariate regression analyses, adoption of 92% of QI changes (90th percentile among hospitals), compared with adoption of 50% of QI changes (10th percentile), was associated with a 2.3-point higher overall performance score (95% CI, 0.7-4.0) and higher process (8.7 points; 95% CI, 5.7-11.7) and patient experience (3.0 points; 95% CI, 0.1-5.9) composite scores.

Conclusions: Hospitals reported widespread adoption of QI changes in response to CMS quality measurement and reporting. Higher QI adoption rates were associated with modestly higher process, patient experience, and overall performance composite scores.

Am J Manag Care. 2021;27(12):554-551. https://doi.org/10.37765/ajmc.2021.88793

_____

Takeaway Points

Hospitals reported widespread adoption of quality improvement (QI) changes to improve on CMS quality measures, and higher QI adoption was associated with higher process, experience, and overall quality performance scores.

  • Most hospitals (ranging from 63% to 96% for the 23 changes) that reported making QI changes responded that the specific change made helped improve performance.
  • The strongest association between QI adoption and performance was seen with clinical process measures.
  • Given that CMS measurement programs have shifted away from process measures toward outcomes, hospitals need to determine which QI actions provide the greatest impact on improving outcomes.

_____

Annually, CMS measures the performance of more than 3000 acute care hospitals as part of the Hospital Inpatient Quality Reporting (IQR) Program with the goal of driving improved outcomes, better care, and lower costs. CMS publicly reports the hospital performance results on the CMS Hospital Compare website.1-4 To further encourage improvement, CMS adjusts hospitals’ reimbursements to reflect the facilities’ performance on measures used by programs such as the Hospital Value-Based Purchasing (HVBP) Program, Hospital-Acquired Condition Reduction Program, and Hospital Readmissions Reduction Program.5-7 The collective measurement, reporting, and payment activities aim to encourage hospital quality improvement (QI) interventions or changes, which include efforts to change the structure, process, and/or outcomes of care using an organizational or structural change.8 QI also encompasses the combined efforts of providers, patients and families, payers, and other stakeholders to continually improve outcomes, system performance, and education.9 Previous studies showed that several QI changes improved hospital quality. 
For example, adopting electronic health record (EHR) systems was associated with reduced readmissions.10,11 EHR use was also associated with improved performance on measures of overall process of care,12 patient satisfaction,13,14 venous thromboembolism prevention,15 and mortality,14 but not ischemic stroke or surgical care.16,17 Another study found that hospitals receiving assistance from Medicare QI Organizations had greater improvement on 19 of 21 hospital quality measures.18 Most hospitals have implemented Lean practice improvement strategies (with variable intensity and maturity), and most reported Lean practices to be helpful in improving performance.19,20 However, although hospital leaders reported efforts to reduce readmissions in response to the Hospital Readmission Reduction Program,10 there has been limited study on whether hospitals are undertaking a wide variety of QI actions in response to CMS programs and whether those efforts are associated with improved quality performance.

Our study sought to develop national estimates of the number and type of QI changes (among 23 options) that hospitals reported making in response to CMS measurement programs using a cross-sectional national survey of nonfederal acute care hospitals.21 To investigate whether reported QI changes were associated with improvements in performance on quality scores, we described hospitals’ self-assessments of whether QI changes were helpful in improving performance and explored the association between hospital-reported QI change adoption and performance on composite measures derived from 26 CMS quality measures.21,22

METHODS

We used data from the 2016 National Hospital Provider Survey, which we conducted for CMS as part of its 2018 National Impact Assessment of CMS Quality Measures.21 The survey assessed how hospitals were responding to CMS quality measurement, reporting, and payment initiatives. We developed the survey content based on a review of the hospital QI literature, formative interviews with hospitals, and iterative pilot testing to ensure that respondents correctly interpreted the terminology and intent of the survey questions.21 Based on the literature review and formative interviews,6,10-17,19,20,23-26 we identified 23 QI changes that we grouped into 7 categories: organizational culture, health information technology (IT), care process redesign, provider incentives, changes to staffing levels or responsibilities, performance monitoring, and measure-specific QI initiatives and technical assistance. (See eAppendix A for survey questions [eAppendices available at ajmc.com].) The survey asked hospitals to report whether they had implemented any of the 23 potential QI changes to improve performance on CMS quality measures and whether the reported changes were perceived to be helpful in improving performance on the CMS measures.21,22

The survey was sent to a nationally representative sample of US nonfederal acute care hospitals that reported Hospital IQR Program quality measures to be displayed on CMS Hospital Compare as of December 2015.2,27 We excluded hospitals without sufficient performance data for computing an overall performance score (n = 138) and hospitals that faced significantly different CMS quality reporting requirements: critical access hospitals (n = 1264), hospitals in US territories (n = 54), children’s hospitals (n = 22), and veterans hospitals (n = 129). After these exclusions, the universe eligible for sampling included 3198 hospitals, of which we sampled 2045. We applied a stratified random sample design that used as strata bed count (< 100, 101-299, and ≥ 300 beds, based on the Medicare Provider of Services File from December 2015) and quality performance (described below), given that QI efforts might vary based on hospital size (a proxy for greater resources) and quality.28

Because no overall composite performance measure existed for all eligible hospitals to stratify hospital performance as low, medium, or high performing for sampling, we constructed an overall quality performance score using an approach similar to the fiscal year (FY) 2016 Total Performance Score (TPS) used by CMS for its HVBP Program.29 We drew on 26 CMS quality measures from the 4 measure domains CMS used in 2016: clinical processes, outcomes (including patient safety outcomes), patient experience (from the Hospital Consumer Assessment of Healthcare Providers and Systems survey), and efficiency.1 (See Table 1 for quality measures and performance domains.) We transformed each measure score into a percentile, which reduces the influence of outliers30; we then calculated composite domain scores by taking the mean of the transformed measure scores within each of the 4 domains. Eligible hospitals had to have at least 2 measures each in at least 3 domains, a less stringent criterion than the FY 2016 TPS requirement of at least 4 measures in 2 domains. We computed a weighted average of the 4 domain scores using the FY 2016 TPS weights: 10% for clinical processes, 40% for outcomes, and 25% each for patient experience and efficiency. The overall composite performance measure was highly correlated (correlation coefficient = 0.83) with the HVBP Program TPS but allowed 157 additional hospitals to be included in the universe for sampling.
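The percentile transformation and TPS-style domain weighting described above can be sketched as follows. This is an illustrative reconstruction, not the study's code: the function names are ours, and the renormalization of weights when a domain is missing is our assumption.

```python
from typing import Optional

import pandas as pd

# FY 2016 TPS-style domain weights, as described in the text.
WEIGHTS = {"process": 0.10, "outcomes": 0.40, "experience": 0.25, "efficiency": 0.25}


def percentile_transform(scores: pd.Series) -> pd.Series:
    """Convert raw measure scores into percentile ranks (0-100) across
    hospitals, which reduces the influence of outliers."""
    return scores.rank(pct=True) * 100


def composite_score(domain_scores: dict) -> Optional[float]:
    """Weighted average of the available domain scores (each a mean of that
    domain's percentile-transformed measures). Hospitals with fewer than
    3 scored domains are treated as ineligible; renormalizing the weights
    over the available domains is our assumption."""
    available = {d: s for d, s in domain_scores.items() if s is not None}
    if len(available) < 3:
        return None  # insufficient domains for an overall score
    total_weight = sum(WEIGHTS[d] for d in available)
    return sum(WEIGHTS[d] * s for d, s in available.items()) / total_weight
```

With all 4 domains present the weights sum to 1, so a hospital scoring 50 in every domain receives an overall score of 50.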

We fielded the survey between June 1, 2016, and January 6, 2017, targeting senior quality leaders responsible for quality measurement (eg, leaders with titles such as vice president for quality).21,22 Each sampled hospital’s quality leader was invited by email to complete a web-based survey, drawing on the individual’s understanding of quality measurement and improvement activities at the hospital. To increase response rates, we sent phone and email reminders and mailed paper surveys to nonresponders31-35; we also designed the survey to be completed in less than 1 hour to reduce respondent burden, although this limited our ability to determine the intensity of QI implementation.

Analysis

For analysis, respondents needed to have completed at least 7 substantive items. To generate nationally representative estimates, we applied sampling weights, which are the product of nonresponse weights and design weights (ie, weights that account for the stratified sampling design).36 We used logistic regression to derive nonresponse weights, based on the following hospital-level characteristics from the Medicare Master Beneficiary Summary File, the Medicare Provider of Services File, and Hierarchical Condition Category (HCC) file (all represent administrative data collected by CMS in 2015): mean age, mean HCC community risk score, proportion Black, proportion Hispanic, proportion female, proportion dually eligible for Medicare and Medicaid, proportion with end-stage renal disease, proportion enrolled for disability, mean income, nonprofit status, urban status (metropolitan/mid-size city, based on the hospital’s zip code), facility size, major teaching status, disproportionate share hospital (DSH) quintile,37 and cardiac surgery capabilities.
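The weight construction above can be illustrated with a toy example. The data are hypothetical; the logistic regression itself is omitted, with `p_respond` standing in for its predicted response propensities.

```python
import pandas as pd

# Hypothetical frame: one row per sampled hospital. `design_weight` is the
# inverse of the stratum sampling fraction; `p_respond` stands in for the
# response propensity predicted by the nonresponse logistic regression.
hospitals = pd.DataFrame({
    "stratum": ["small-low", "small-low", "large-high", "large-high"],
    "design_weight": [4.0, 4.0, 1.5, 1.5],
    "p_respond": [0.50, 0.80, 0.64, 0.64],
    "responded": [True, True, True, False],
})

# Nonresponse weight: inverse of the predicted response propensity,
# computed for responders only.
resp = hospitals[hospitals["responded"]].copy()
resp["nonresponse_weight"] = 1.0 / resp["p_respond"]

# The final analysis weight is the product of the two components.
resp["sampling_weight"] = resp["design_weight"] * resp["nonresponse_weight"]
```

The first hospital, with a design weight of 4.0 and a predicted response propensity of 0.50, receives a final weight of 8.0: it represents itself and similar sampled hospitals that did not respond.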

We estimated adoption rates of each of the 23 QI changes in the overall hospital population. In addition, we calculated weighted means within subgroups defined by size and quality performance, and we used F tests to test for equality of the means among subgroups being compared.

Next, we explored whether adoption of QI changes was associated with performance. We first calculated the proportions of hospitals (among those adopting each change) that reported that the change was “definitely or somewhat” helpful in improving performance on 1 or more measures. We then used multivariate linear regression analyses to estimate the relationship between QI changes reported and hospital performance. We controlled for differences across hospitals in case mix, patient socioeconomic characteristics, and facility characteristics described above; we also controlled for having at least 1 (self-reported) competitor hospital. These features have previously been used to account for performance variation independent of QI efforts.18,24,38-40

In the multivariate analysis, the outcome was the overall composite performance score recomputed using August 2016 Hospital Compare data to align the time frame of survey responses with performance scores available to survey respondents. The primary explanatory variable of interest was the proportion of all 23 potential QI changes that hospitals reported adopting. The coefficient on this variable can be interpreted as the point difference in performance associated with adopting all QI changes versus none, accounting for patient or facility characteristics that might affect performance independent of QI actions. To facilitate interpretation of the findings, we also estimated the change in performance associated with a hospital moving from the 10th percentile of QI changes adopted to the 90th percentile.
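A minimal simulation shows how the all-vs-none coefficient and the 10th-to-90th percentile contrast relate in a design like this one. Everything here is invented for illustration (the covariate, the true effect of 6 points, the noise level); it is not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated hospitals: performance rises with the proportion of the 23 QI
# changes adopted, plus a stand-in facility characteristic and noise.
prop_adopted = rng.uniform(0.3, 1.0, n)
large = rng.integers(0, 2, n)  # hypothetical indicator for large hospitals
score = 50 + 6.0 * prop_adopted + 3.0 * large + rng.normal(0, 5, n)

# OLS via least squares; the coefficient on `prop_adopted` is the point
# change associated with adopting all 23 changes versus none.
X = np.column_stack([np.ones(n), prop_adopted, large])
beta = np.linalg.lstsq(X, score, rcond=None)[0]

# 10th-to-90th percentile contrast, as in the text (50% -> 92% adopted):
# the all-vs-none coefficient scaled by the difference in proportions.
contrast = beta[1] * (0.92 - 0.50)
```

The contrast simply rescales the all-vs-none coefficient by the gap in adoption proportions between the two percentiles, which is why it is smaller than the all-vs-none estimate.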

Additionally, we estimated the association between the proportion of QI changes adopted and each of the 4 performance composite domains (clinical processes, outcomes, patient experience, and efficiency). Each analysis controlled for the same patient and hospital characteristics as the primary multivariate analysis described above.

Analyses were performed in SAS version 9.4 (SAS Institute) and R version 3.4.0 (R Foundation for Statistical Computing).

RESULTS

The survey response rate was 64.2% (n = 1313 responses from 2045 sampled hospitals); no significant differences in response rates were observed by hospital size or quality performance. Table 2 shows unweighted characteristics of respondents.

QI Change Adoption Rates

More than 99% of hospitals reported implementing at least 1 QI change, with a median of 17 (interquartile range, 15-20 QI changes). Among the 23 potential QI changes, the most frequently adopted were providing routine feedback to physicians and other clinical staff on performance and implementing standardized care protocols, which were each reported by 97% of hospitals. In addition, 95% of hospitals reported implementing a culture of safety. In contrast, only 29% of hospitals described implementing changes to deployment of nursing staff (Table 3).

In subgroup analyses, large hospitals reported significantly higher adoption rates than small hospitals for 18 of the 23 QI changes, but no QI changes were adopted at higher rates among small hospitals compared with large hospitals (Table 4). For example, 51% of large hospitals reported implementing pay based on performance for clinical staff, compared with 30% of small hospitals; 64% of large hospitals reported implementing internal incentives for senior leaders, compared with 34% of small hospitals. Hospitals in the highest DSH quintile reported significantly higher adoption rates than hospitals in the lowest quintile for 19 QI changes; no individual QI change was adopted at a higher rate among hospitals in the lowest DSH quintile (results not shown). QI change adoption was not significantly greater among hospitals in the high-performance stratum compared with those in the low- and medium-performance strata (eAppendix B).

Perceived Helpfulness of QI Changes in Improving Performance

High proportions of hospitals that reported making individual QI changes (ranging from 63% to 96% for the 23 changes) perceived the changes to be “definitely or somewhat” helpful in improving performance on 1 or more CMS measures (Table 3). The 2 QI changes that hospitals perceived to be most helpful were the implementation of standardized care protocols (96%) and QI initiatives for specific measures (96%).

Association Between QI Changes Undertaken and Quality Performance

Implementation of all 23 QI changes (vs no changes) was associated with a 5.9-point higher composite measure of performance (95% CI, 1.7-10.0 points) (Figure). For context, the range for the overall hospital performance composite was 13.2 to 92.2 points, and an increase of 5.9 points would shift a hospital at the 50th percentile to the 62nd percentile of performance. We also estimated that adoption of 92% of QI changes (90th percentile among hospitals), compared with adoption of 50% of QI changes (10th percentile), was associated with a 2.3-point higher overall performance score (95% CI, 0.7-4.0 points).

The Figure also shows results examining the associations between QI changes made by hospitals and the 4 subcomposite domains of performance: clinical processes, outcomes, patient experience, and efficiency. The clinical processes domain showed a 22-point increase (95% CI, 14.3-29.6 points) in scores associated with implementing all QI changes vs no changes. Increasing adoption of QI changes from the 10th percentile (50% of changes) to the 90th percentile (92% of changes) was associated with an 8.7-point increase in performance (95% CI, 5.7-11.7 points) on the clinical processes composite.

We found that implementing all changes (vs none) was associated with a 7.6-point increase in the patient experience composite score (95% CI, 0.3-14.8 points) and a statistically nonsignificant 5.4-point increase in the outcomes composite score (95% CI, –0.4 to 11.3 points). These increases can be translated into shifts from the 50th percentile to the 61st and 63rd percentiles of their respective composite measure domains. Increasing adoption of QI changes from the 10th percentile (50% of changes) to the 90th percentile (92% of changes) was associated with smaller increases: an increase in performance of 3.0 points (95% CI, 0.1-5.9 points) on the patient experience composite and 2.2 points (95% CI, –0.2 to 4.5 points) on the outcomes composite score.

DISCUSSION

We found that hospitals report having adopted a substantial number of QI changes in response to CMS quality measurement initiatives, with the median hospital adopting 17 of 23 potential QI changes. Larger hospitals were more likely to report having adopted a greater number of QI changes, which may be due to having more resources to make investments (such as EHR implementation) and greater structural capacity. We also found that hospitals in the highest DSH quintile adopted QI changes at higher rates than those in the lowest quintile. Although hospitals in higher DSH quintiles are generally considered to be disadvantaged and might seem less likely to adopt high-cost QI changes, such hospitals are also often large hospitals and may have additional resources for implementing QI changes.19

In investigating the association between having adopted QI changes and quality performance, we found that the majority of hospitals adopting each QI change reported the change to have been helpful in improving their performance on CMS quality measures. Although the latter findings are self-reported, in quantitative multivariate analyses we found that greater adoption of QI changes was associated with statistically significant but modestly higher performance on an overall composite score. Implementing QI changes was most strongly associated with better performance on clinical process measures, with a smaller improvement on patient experience measures; associations with the outcome and efficiency composites were not statistically significant. It is possible that QI changes reported by hospitals (such as care protocols and health IT order entry prompts) are better suited to improving adherence to clinical processes of care than to improving clinical outcomes and efficiency, which is consistent with findings of prior studies suggesting that providers improve performance on process measures more easily than on other measure types.5,41 Hospitals that have already implemented half of the 23 QI changes might experience only modest performance improvements with additional QI change adoption compared with hospitals that have not yet adopted any QI changes. In addition, improving overall performance might become more challenging as process measures are deemphasized in programs such as the HVBP Program. However, other QI strategies beyond those measured in the survey, or more robust implementations of QI strategies, may show a relationship with performance on outcome and efficiency measures.29

Previous studies identified wide use of several QI changes (especially implementation of health IT capabilities and organizational changes) and showed modest associations between some QI changes and performance on individual performance measures.12,23,24,42 For example, one study found organizational features such as computerized physician order entry and a focus on identifying system errors to be more prevalent in high-performing hospitals than in low-performing hospitals, but it did not evaluate the effects of multiple simultaneous interventions.24 Another study noted that direct employment of physicians (a QI change related to staffing) was not associated with improved quality.43 Our findings add to the literature by describing the extent to which hospitals nationwide are using multiple QI changes to influence their performance and by showing an association between implementation of QI changes and quality performance.

Limitations

This study has several limitations. First, the survey captured a cross-sectional (rather than longitudinal) perspective on changes self-reported by hospitals. As a result, the associations between QI changes and outcomes are exploratory; they cannot show definitive causal relationships between QI changes and improved performance. Second, our estimates of QI changes adopted do not capture variation in the duration or intensity of QI implementation by hospitals, so we may have underestimated the benefits of full QI implementation.19 Third, we did not control for case-specific volume or financial resources, which are associated with quality performance25,40,44; however, we controlled for hospital size and urbanicity, which are related characteristics. Fourth, although it was the best method available, DSH quintile does not precisely identify safety-net hospitals; safety-net hospitals may face barriers to implementing QI changes that our results do not identify. Fifth, our study does not address the overall cost-effectiveness of QI activities. Hospitals would incur substantial costs implementing a large number of QI changes directed toward quality measures, which might reduce the resources available for potentially more clinically important areas.21,45,46 Finally, the study uses self-reported data, so QI adoption may be lower than reported because of social desirability bias, and the perceived helpfulness of QI changes may have been overestimated. To mitigate bias, we assured respondents of confidentiality. In addition, another study (using the same survey data) identified potential unintended consequences of measurement, which suggests that respondents were not strongly subject to social desirability bias.21

CONCLUSIONS

Using quality measures, CMS has encouraged hospitals to make QI changes to improve clinical care and outcomes. In this study, hospitals reported broad investments in QI changes, and most reported each change to be helpful. Furthermore, implementation of substantially more QI changes was associated with modestly better performance on CMS quality measures. Although QI changes were associated with improved clinical outcomes and patient experience, their strongest association was with clinical process measures. Given that CMS measurement programs for hospitals have shifted away from process measures toward outcomes of care, hospitals will need to identify QI changes that are more effective for improving outcomes. Analyses that identify the costs and benefits from individual QI actions or combinations thereof could help guide hospital investment of limited resources.

Acknowledgments

The authors thank Mary Vaiana at RAND for her review of the draft manuscript; Julie Lai at RAND for research programming; Alice Kim at RAND for her assistance with identifying previous research studies; Eric Gilbertson, Susan Jentz, Patti McKay, and Kristen Turner at the Health Services Advisory Group (HSAG) and Jeff Whitley at Mathematica for assisting with data used in the analyses; and Denise Remus and Eric Metcalf at HSAG for review of the draft manuscript. The authors also are sad to note that their colleague and coauthor Ann Clancy passed away prior to their submitting this study.

Author Affiliations: RAND Corporation (KDS, MWR, AAT, CLD), Santa Monica, CA; Health Services Advisory Group, Inc (KNC, AMC), Phoenix, AZ; CMS (NB, MD), Baltimore, MD.

Source of Funding: This study was funded through HHSM-500-2013-13007I by CMS.

Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (KDS, AMC, NB, MD, CLD); acquisition of data (KDS, KNC, AMC, CLD); analysis and interpretation of data (MWR, AAT, KNC, CLD); drafting of the manuscript (KDS, MWR, NB, MD, CLD); critical revision of the manuscript for important intellectual content (KDS, MWR, AAT, KNC, AMC, CLD); statistical analysis (KDS, MWR, AAT, CLD); obtaining funding (KNC, NB, MD, CLD); administrative, technical, or logistic support (KDS, KNC); and supervision (KNC, CLD).

Address Correspondence to: Cheryl L. Damberg, PhD, RAND Corporation, 1776 Main St, Santa Monica, CA 90401. Email: damberg@rand.org.

REFERENCES

1. CMS, HHS. Medicare program; hospital inpatient prospective payment systems for acute care hospitals and the long-term care hospital prospective payment system policy changes and fiscal year 2016 rates; revisions of quality reporting requirements for specific providers, including changes related to the electronic health record incentive program; extensions of the Medicare-dependent, small rural hospital program and the low-volume payment adjustment for hospitals; final rule. Fed Regist. 2015;80(158):49325-49886.

2. Hospital Compare. CMS. Accessed October 24, 2017. https://www.cms.gov/medicare/quality-initiatives-patient-assessment-instruments/hospitalqualityinits/hospitalcompare.html

3. Damberg CL, Sorbero ME, Lovejoy SL, et al. An evaluation of the use of performance measures in health care. RAND Corporation. 2011. Accessed October 23, 2019. https://www.rand.org/pubs/technical_reports/TR1148.html

4. Mehrotra A, Damberg CL, Sorbero ME, Teleki SS. Pay for performance in the hospital setting: what is the state of the evidence? Am J Med Qual. 2009;24(1):19-28. doi:10.1177/1062860608326634

5. National Impact Assessment of the Centers for Medicare & Medicaid Services (CMS) Quality Measures Report: 2015 report. CMS. Accessed October 23, 2019. https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/QualityMeasurementImpactReports.html#2015NIA

6. Mendelson A, Kondo K, Damberg C, et al. The effects of pay-for-performance programs on health, health care use, and processes of care: a systematic review. Ann Intern Med. 2017;166(5):341-353. doi:10.7326/M16-1881

7. Hospital-Acquired Condition Reduction Program. CMS. Accessed November 9, 2019. https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/HAC-Reduction-Program

8. Danz MS, Rubenstein LV, Hempel S, et al. Identifying quality improvement intervention evaluations: is consensus achievable? Qual Saf Health Care. 2010;19(4):279-283. doi:10.1136/qshc.2009.036475

9. Batalden PB, Davidoff F. What is “quality improvement” and how can it transform healthcare? BMJ Qual Saf. 2007;16(1):2-3. doi:10.1136/qshc.2006.022046

10. Joynt KE, Figueroa JF, Orav EJ, Jha AK. Opinions on the Hospital Readmission Reduction Program: results of a national survey of hospital leaders. Am J Manag Care. 2016;22(8):e287-e294.

11. Charles D, Gabriel M, Furukawa MF. Adoption of electronic health record systems among US non-federal acute care hospitals: 2008-2013. ONC data brief No. 16. May 2014. Accessed November 8, 2021. https://www.healthit.gov/sites/default/files/oncdatabrief16.pdf

12. Jarvis B, Johnson T, Butler P, et al. Assessing the impact of electronic health records as an enabler of hospital quality and patient satisfaction. Acad Med. 2013;88(10):1471-1477. doi:10.1097/ACM.0b013e3182a36cab

13. Kazley AS, Diana ML, Ford EW, Menachemi N. Is electronic health record use associated with patient satisfaction in hospitals? Health Care Manage Rev. 2012;37(1):23-30. doi:10.1097/HMR.0b013e3182307bd3

14. Restuccia JD, Cohen AB, Horwitt JN, Shwartz M. Hospital implementation of health information technology and quality of care: are they related? BMC Med Inform Decis Mak. 2012;12:109. doi:10.1186/1472-6947-12-109

15. Jones SS, Rudin RS, Perry T, Shekelle PG. Health information technology: an updated systematic review with a focus on meaningful use. Ann Intern Med. 2014;160(1):48-54. doi:10.7326/M13-1531

16. Joynt KE, Bhatt DL, Schwamm LH, et al. Lack of impact of electronic health records on quality of care and outcomes for ischemic stroke. J Am Coll Cardiol. 2015;65(18):1964-1972. doi:10.1016/j.jacc.2015.02.059

17. Thirukumaran CP, Dolan JG, Reagan Webster P, Panzer RJ, Friedman B. The impact of electronic health record implementation and use on performance of the Surgical Care Improvement Project measures. Health Serv Res. 2015;50(1):273-289. doi:10.1111/1475-6773.12191

18. Rollow W, Lied TR, McGann P, et al. Assessment of the Medicare quality improvement organization program. Ann Intern Med. 2006;145(5):342-353. doi:10.7326/0003-4819-145-5-200609050-00134

19. Shortell SM, Blodgett JC, Rundall TG, Kralovec P. Use of Lean and related transformational performance improvement systems in hospitals in the United States: results from a national survey. Jt Comm J Qual Patient Saf. 2018;44(10):574-582. doi:10.1016/j.jcjq.2018.03.002

20. Shortell SM, Rundall TG, Blodgett JC. Assessing the relationship of the human resource, finance, and information technology functions on reported performance in hospitals using the Lean management system. Health Care Manage Rev. 2021;46(2):145-152. doi:10.1097/HMR.0000000000000253

21. National Impact Assessment of the Centers for Medicare & Medicaid Services (CMS) Quality Measures Report: 2018 report. CMS. Accessed October 19, 2019. https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/National-Impact-Assessment-of-the-Centers-for-Medicare-and-Medicaid-Services-CMS-Quality-Measures-Reports#2018NIA

22. Hospital National Provider Survey. Office of Management and Budget. 2015. Accessed February 4, 2019. https://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=201506-0938-008

23. Jones SS, Adams JL, Schneider EC, Ringel JS, McGlynn EA. Electronic health record adoption and quality improvement in US hospitals. Am J Manag Care. 2010;16(suppl 12):SP64-SP71.

24. Vina ER, Rhew DC, Weingarten SR, Weingarten JB, Chang JT. Relationship between organizational factors and performance among pay-for-performance hospitals. J Gen Intern Med. 2009;24(7):833-840. doi:10.1007/s11606-009-0997-6

25. Brand CA, Barker AL, Morello RT, et al. A review of hospital characteristics associated with improved performance. Int J Qual Health Care. 2012;24(5):483-494. doi:10.1093/intqhc/mzs044

26. McConnell KJ, Chang AM, Maddox TM, Wholey DR, Lindrooth RC. An exploration of management practices in hospitals. Healthc (Amst). 2014;2(2):121-129. doi:10.1016/j.hjdsi.2013.12.014

27. Hospital inpatient measures. CMS. Accessed August 30, 2020. https://www.qualitynet.org/inpatient/measures

28. Provider of Services current files. CMS. Accessed October 19, 2019. https://www.cms.gov/Research-Statistics-Data-and-Systems/Downloadable-Public-Use-Files/Provider-of-Services/index.html

29. CMS, HHS. Medicare program; hospital inpatient prospective payment systems for acute care hospitals and the long-term care hospital prospective payment system and policy changes and fiscal year 2018 rates; quality reporting requirements for specific providers; Medicare and Medicaid electronic health record (EHR) incentive program requirements for eligible hospitals, critical access hospitals, and eligible professionals; provider-based status of Indian health service and tribal facilities and organizations; costs reporting and provider requirements; agreement termination notices. Fed Regist. 2017;82(155):37990-38589.

30. Robbins MW. The utility of nonparametric transformations for imputation of survey data. J Off Stat. 2014;30(4):675-700. doi:10.2478/jos-2014-0043

31. Baruch Y, Holtom BC. Survey response rate levels and trends in organizational research. Hum Relat. 2008;61(8):1139-1160. doi:10.1177/0018726708094863

32. Blendon RJ, Schoen C, DesRoches CM, Osborn R, Zapert K, Raleigh E. Confronting competing demands to improve quality: a five-country hospital survey. Health Aff (Millwood). 2004;23(3):119-135. doi:10.1377/hlthaff.23.3.119

33. Cycyota CS, Harrison DA. What (not) to expect when surveying executives: a meta-analysis of top manager response rates and techniques over time. Organ Res Meth. 2006;9(2):133-160. doi:10.1177/1094428105280770

34. Weissman JS, Annas CL, Epstein AM, et al. Error reporting and disclosure systems: views from hospital leaders. JAMA. 2005;293(11):1359-1366. doi:10.1001/jama.293.11.1359

35. Dillman DA, Smyth JD, Christian LM. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons; 2009.

36. Särndal CE. The calibration approach in survey theory and practice. Surv Methodol. 2007;33(2):99-119.

37. Disproportionate share hospital (DSH). CMS. Accessed November 27, 2019. https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/dsh

38. Desai NR, Ross JS, Kwon JY, et al. Association between hospital penalty status under the Hospital Readmission Reduction Program and readmission rates for target and nontarget conditions. JAMA. 2016;316(24):2647-2656. doi:10.1001/jama.2016.18533

39. Gupta A, Allen LA, Bhatt DL, et al. Association of the Hospital Readmissions Reduction Program implementation with readmission and mortality outcomes in heart failure. JAMA Cardiol. 2018;3(1):44-53. doi:10.1001/jamacardio.2017.4265

40. Hoyer EH, Padula WV, Brotman DJ, et al. Patterns of hospital performance on the hospital-wide 30-day readmission metric: is the playing field level? J Gen Intern Med. 2018;33(1):57-64. doi:10.1007/s11606-017-4193-9

41. Rice S. CMS data reinforces that process measures are easier to improve than outcomes. Modern Healthcare. March 3, 2015. Accessed November 8, 2021. https://www.modernhealthcare.com/article/20150303/NEWS/150309965/cms-data-reinforces-that-processes-are-easier-to-improve-than-outcomes

42. Adler-Milstein J, Everson J, Lee SY. EHR adoption and hospital performance: time-related effects. Health Serv Res. 2015;50(6):1751-1771. doi:10.1111/1475-6773.12406

43. Scott KW, Orav EJ, Cutler DM, Jha AK. Changes in hospital-physician affiliations in U.S. hospitals and their effect on quality of care. Ann Intern Med. 2017;166(1):1-8. doi:10.7326/M16-0125

44. Leow JJ, Leong EK, Serrell EC, et al. Systematic review of the volume-outcome relationship for radical prostatectomy. Eur Urol Focus. 2018;4(6):775-789. doi:10.1016/j.euf.2017.03.008

45. Lindenauer PK, Lagu T, Ross JS, et al. Attitudes of hospital leaders toward publicly reported measures of health care quality. JAMA Intern Med. 2014;174(12):1904-1911. doi:10.1001/jamainternmed.2014.5161

46. Werner RM, Asch DA. The unintended consequences of publicly reporting quality information. JAMA. 2005;293(10):1239-1244. doi:10.1001/jama.293.10.1239
