Hospital Responses to DSRIP Program Reforms in New Jersey

July 2, 2020

This study examines the New Jersey Delivery System Reform Incentive Payment (DSRIP) program using hospital web surveys and key informant interviews and finds progress toward data-driven population health management for low-income patients.

ABSTRACT

Objectives: The Delivery System Reform Incentive Payment (DSRIP) program was designed to move hospitals toward more effective and accountable care for their Medicaid and charity care patients. Assessing potential changes among DSRIP-participating hospitals in community partnerships, data analytic capabilities related to quality measurement, and engagement in the DSRIP model may provide information for future value-based payment strategies.

Study Design: Statistical analysis of linked hospital data from 2 web surveys, supplemented with findings from 2 rounds of key informant interviews.

Methods: Data from 2 New Jersey hospital web surveys in 2015 and 2018 were analyzed to assess changes in hospital activities, practices, and perceptions related to the DSRIP program. Survey measures included the number of partnerships that hospitals developed with community organizations and outpatient providers, hospitals’ ability to share patient data with outpatient partners, hospitals’ analytic capability to leverage patient data for quality assessment, and perceptions of hospital representatives of the value of their DSRIP projects. Stakeholder feedback from key informant interviews conducted in 2014 and 2017 provided context for survey findings.

Results: Hospital partnerships with community providers did not grow significantly over the DSRIP implementation period, which may be due to constraints in program design. Hospital capacity for collection of data to fulfill DSRIP reporting requirements increased over the study period. Data exchanges with outpatient partners facilitated use of rapid-cycle evaluation tools, and the value of data sharing for improving quality of care and population health was perceived more positively over time. The perceived benefit of DSRIP project activities overall, for patient access to care and overall health, also increased between the 2 surveys.

Conclusions: In New Jersey, available evidence suggests that DSRIP was successful in catalyzing many aspects of a hospital industry transformation toward data-driven population health management. These findings can inform policy makers’ decisions about how to structure future initiatives to help hospitals progress further in delivering value-based care.

The American Journal of Accountable Care. 2020;8(2):4-12

The Delivery System Reform Incentive Payment (DSRIP) program, a 2010 federal initiative negotiated between states and CMS under section 1115 waiver authority, provides a financing mechanism for providers to transform care for beneficiaries of Medicaid, the Children’s Health Insurance Program, and charity care.1 By tying funds to measurable quality outcomes, DSRIP establishes incentives for providers (mainly hospitals) to proactively manage the health of their low-income patients, making these providers regional anchors in the Triple Aim quest for better care, better health, and lower costs. DSRIP or DSRIP-like programs have been implemented in 12 states to date and vary in size, reach, provider participation, and funding pools.2 While the original program structure is evolving, the concepts underlying DSRIP are poised to be an integral part of the future Medicaid delivery system landscape as value-based payment structures grow in prominence for health care providers.

Although some early assessments of DSRIP’s impact on patient outcomes and population health indicators are emerging,3-5 opportunities to learn from the implementation experiences of participating providers are more quickly available. Understanding these dynamics can yield valuable information to inform framing and implementation of future policies.6 Some available evidence includes case studies that discuss DSRIP program design and lessons learned from implementation in several states.7-10 These studies demonstrate the positive role that DSRIP played in providing resources and harnessing organizational motivation for improving provider data capabilities and building effective community partnerships.3,8 Providing incentives for these changes was a core DSRIP objective because sustainable, cost-effective health improvements for low-income populations are predicated on integrated and accountable delivery systems.7,11 Consequently, assessing progress made by providers in building infrastructure to support redesigned processes of care delivery is an important metric for evaluating DSRIP’s impact, yet this has not been studied in a systematic way.

In addition, DSRIP promotes engaging community and social services organizations in improving population health by providing funding for services that are not traditionally reimbursed for Medicaid patients. Under DSRIP and similar projects in other states, providers have partnered with such organizations to identify and connect patients directly to community resources addressing social needs.12 The program encouraged hospitals to address patient needs as far “upstream” as possible, shifting their efforts to innovative population health missions, for example, by developing home interventions for children with asthma or arranging transportation for patients with diabetes to aerobics classes. An assessment of hospitals’ perceptions of the effectiveness of the DSRIP model is therefore important for understanding hospital engagement in this reoriented model of patient care.

This study utilizes quantitative and qualitative information about the New Jersey program to examine hospital operational changes brought about by DSRIP and perceptions of hospital leaders about the value of these changes for improving patient health. Specifically, we describe the impact of DSRIP on New Jersey hospitals in terms of (1) increasing collaboration with clinical and community partners, (2) improving data infrastructure and analytic capabilities for meeting data reporting requirements, and (3) strengthening the perceived value and effectiveness of patient care models structured for population health management. New Jersey provides a good venue for such a study because all hospitals in the state were eligible to participate in the program, not just the safety net hospitals (SNHs) with a high proportion of Medicaid and uninsured patients.


The New Jersey DSRIP Program

DSRIP utilized a pay-for-performance (P4P)/reporting system to encourage hospitals to adopt a population health management approach to improving care for New Jersey’s low-income population. Hospitals had to utilize data to identify a chronic condition relevant for their patient population, adopt a care management project, form partnerships with community organizations and outpatient providers to help establish a continuum of care, and specify relevant validated outcome metrics so that the impact of DSRIP activities on patient and population health could be measured.13 Elements of hospital projects such as improved case management, discharge planning, patient education, and a focus on outcome measurement reflect the population health management principles embedded in DSRIP projects.

After state and CMS approval of care management projects, New Jersey DSRIP-participating hospitals took part in 4 overlapping program stages with defined activities determining incentive payments across demonstration years (DYs): developing technology, tools, and human resources infrastructure (stage 1); piloting, testing, and implementing a chronic disease care project (stage 2); achieving project-based quality improvements and reporting quality metrics capturing the impact of DSRIP projects (stage 3); and reporting prespecified population-focused quality metrics (stage 4). An important stage 1 activity involved engaging project partners, such as schools, doctors’ offices, or federally qualified health centers, to support hospitals with their DSRIP projects. Stage 3 and 4 activities required hospitals to report on several project-specific and population health-related quality metrics for an attributed population of Medicaid and charity care patients. Whereas metrics based on administrative data were prepared by the state, those based on paper or electronic health records (EHRs) were calculated by hospitals, sometimes in coordination with outpatient partners. Collaboration for such clinical data sharing was thus strongly encouraged under DSRIP, and those project partners sharing data with the hospital were designated as data reporting partners. As the program progressed, a larger share of payments was tied to P4P, requiring improvements in project-related outcomes over baseline for hospitals’ attributed population. Hospitals could also receive universal performance pool (UPP) payments for maintaining quality of care in areas outside their project focus.

Leveraging patient data to guide care and assess outcomes was an integral component of the program’s data-driven population health orientation. Data related to patient utilization were used to assess key health challenges in an area, identify patient populations for directing interventions, and calculate metrics for determining incentive payments. Data reporting partnerships, in particular, were expected to promote care integration and shared accountability between hospitals and community health care providers. These data sharing relationships were seen as potentially increasing the effectiveness of hospitals’ rapid-cycle improvement tools by providing more information on patients’ progress toward health improvement goals.

Methods

We use data collected from 2 hospital web surveys administered as part of the New Jersey DSRIP evaluation that were fielded during April 2015 (DY3), shortly after implementation of hospitals’ DSRIP projects, and then again 3 years later, after the end of the first round of the demonstration in February 2018 (DY6). The first survey was sent to managers at all DSRIP-eligible hospitals in the state (n = 64), regardless of whether they participated in the program, and the response rate was 65%. The second survey was sent only to hospitals that participated in the DSRIP program (n = 49). The response rate was 86%.

To examine the effects of DSRIP on hospitals over time, our statistical analysis compares data linked over both surveys (n = 22). We used repeated-measures analysis of variance to examine changes in data infrastructure capabilities, community partnerships, and perceptions of the value of the DSRIP program in improving patient health. We also present cross-sectional results from some questions asked only on the second survey. Pairwise deletion was used to deal with item nonresponse.
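The paired design can be illustrated with a minimal Python sketch (the hospital scores and effect size below are simulated and purely hypothetical; the actual analyses were run in SPSS). With only 2 survey waves, a repeated-measures analysis of variance reduces to a paired t test (F = t²), so the two approaches yield the same P value:

```python
import numpy as np
from scipy import stats

# Illustrative only: simulated scores standing in for a survey measure
# (e.g., a data-capability rating) observed for the same n = 22 hospitals
# in the 2015 (wave 1) and 2018 (wave 2) surveys.
rng = np.random.default_rng(0)
wave1 = rng.normal(3.0, 0.8, size=22)
wave2 = wave1 + rng.normal(0.4, 0.5, size=22)  # assumed improvement

# With only 2 repeated measures, repeated-measures ANOVA is equivalent to
# a paired t test (F = t^2), so ttest_rel gives the same P value.
t_stat, p_value = stats.ttest_rel(wave2, wave1)
print(f"t = {t_stat:.2f}, P = {p_value:.3f}, significant: {p_value < 0.05}")
```

The paired test uses only the within-hospital differences between waves, which is what makes linking the two surveys at the hospital level essential to the design.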

To determine whether hospitals included in the paired analytic sample differ from those excluded, we compare hospital safety net status and DSRIP performance data for these 2 groups of hospitals using unpaired t tests. We define SNH status based on receipt of Hospital Relief Subsidy Fund dollars in 2011. Performance data are from payment summaries posted to the DSRIP website detailing the number of stage 3, stage 4, and UPP quality metrics for which hospitals were awarded payment out of the total attempted.14 All analyses were done in SPSS with α set at 0.05. P values are reported in the results when differences were significant.
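The unpaired comparison of included vs excluded hospitals can be sketched the same way; the group means below echo the reported 62% vs 69% UPP achievement, but the individual hospital values are invented for illustration:

```python
import numpy as np
from scipy import stats

# Simulated DY5 UPP achievement (share of metrics paid) for the paired
# analytic sample (n = 22) and excluded DSRIP hospitals (n = 27); group
# means mirror the reported 62% vs 69%, but the values are hypothetical.
rng = np.random.default_rng(1)
in_sample = np.clip(rng.normal(0.62, 0.15, size=22), 0.0, 1.0)
excluded = np.clip(rng.normal(0.69, 0.15, size=27), 0.0, 1.0)

# Classic unpaired (independent-samples) t test with alpha = .05.
t_stat, p_value = stats.ttest_ind(in_sample, excluded)
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")
```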

To provide context for our findings, we draw on 2 rounds of key informant interviews (KIIs) with DSRIP stakeholders, including selected DSRIP personnel in participating hospitals, DSRIP program advisory committee members, officials from the New Jersey Department of Health, outpatient providers who had partnered with DSRIP hospitals, and hospital industry association representatives. We selected stakeholders who we believed were best equipped to answer questions on DSRIP’s impact, potential, and sustainability. The first round of interviews was conducted from October 2014 to February 2015 and consisted of 12 interviews with 13 key informants. The second round was conducted from October to December 2017 and consisted of 10 interviews with 29 key informants. Semistructured interview guides were used in both rounds. Research team members independently analyzed the interviews and came to a consensus on core themes.

This research was approved by the institutional review board of the authors’ home institution, Rutgers, The State University of New Jersey, and informed consent was obtained from all participants.

Results

Of the 64 general acute care hospitals in New Jersey, 9 (14.1%) elected not to participate in DSRIP and 6 (9.4%) dropped out during the implementation period. None of these 15 hospitals were SNHs. Of the 49 hospitals that participated in both rounds of DSRIP, 22 (44.9%) responded to both surveys and were included in the paired sample. The remaining 27 either did not participate in 1 or both survey rounds or were unlinkable due to inadequate identifying information. SNHs were overrepresented in our paired sample compared with DSRIP-participating hospitals not in our analysis (64% vs 44% SNH, respectively). We further examined differences in hospital performance by survey participation that may reflect differences in hospital experience and engagement. We calculated the percentage of stage 3 and stage 4 metrics for which payment incentives were earned in DY5 and found no statistically significant differences; however, UPP achievement in DY5 was slightly lower in our paired sample than among excluded hospitals (62% vs 69%), although this difference fell short of statistical significance (Table 1).
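One way to gauge the SNH imbalance (64% vs 44%) is a chi-square test on counts back-calculated from those shares; this is an illustrative alternative to the unpaired t tests actually used, and the rounded counts below are our own reconstruction, not reported data:

```python
import numpy as np
from scipy import stats

# Counts back-calculated from the reported shares: ~64% of the 22 paired
# hospitals were SNHs (14 of 22) vs ~44% of the other 27 (12 of 27).
table = np.array([[14, 22 - 14],
                  [12, 27 - 12]])

# Chi-square test of independence: SNH status by inclusion in the sample.
chi2, p_value, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, P = {p_value:.3f}")
```

With only 49 hospitals split across two groups, such a test is underpowered, which is consistent with descriptive differences of this size not reaching significance.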

Relationships With Clinical and Community Partners

Hospital informants expressed some uncertainty about program requirements for clinical and community partners during the first round of interviews, possibly explaining limited hospital engagement with partners early on. The survey indicated that hospital partnerships did increase slightly over the DSRIP implementation period. On average, hospitals had 4.7 project partners for their DSRIP programs in the beginning of the implementation period, and this increased to 5.0 by DY6 (Table 2). The strategy of recruiting physician practices as partners doubled over the study period, from 24% of hospitals to 48% of hospitals, and in DY6, physician practices were the most common partner type reported by hospitals, with an average of 1.7 practices per hospital (Table 2). An increasing proportion of survey respondents reported recruiting other (nonclinical) community organizations as partners, going from just under 40% in the first survey to about 60% in the second survey.

The average number of data reporting partners did not increase over the study period. However, we observe an increase in the average number of organizations with which hospitals would have liked to partner for data sharing but could not because of the complexity of data sharing requirements or because a potential partner was already in a data sharing relationship with another hospital (Table 2). The DSRIP program did not provide funding for the extra work required of data reporting partners, thus constraining hospitals’ ability to forge partnerships, according to the second round of KIIs. Finally, it is worth noting that a majority of hospitals surveyed (more than 75% in both rounds) were working with outpatient partners before DSRIP. Of those with existing clinical partners, approximately 70% felt that DSRIP had strengthened these relationships and none felt those relationships were weakened.

Data/Analytic Capabilities

Both rounds of KIIs revealed that preparing the chart-based stage 3 and stage 4 metrics was a challenge for all hospitals, particularly those lacking sophisticated EHR systems. Some stakeholders described the considerable efforts made to extract the data needed to comply with reporting requirements. These learning experiences may also explain improvements in the analytic capabilities of hospitals in our sample over the DSRIP implementation period. Hospitals reported that a higher percentage of stage 4 inpatient or emergency department chart-based metrics were obtainable from EHRs in the second survey than in the first (P = .03) (Table 3). Hospitals also reported less difficulty in the second survey than in the first in collection of inpatient and outpatient metrics to fulfill reporting requirements (P < .01).

Changes in the data capabilities of outpatient reporting partners were less evident, although this is based on the smaller sample of responding hospitals with a data reporting partner. The overall average number of reporting partners with an interoperable EHR did increase slightly from the first survey to the second, and the percentage of stage 4 metrics obtainable from reporting partners’ EHRs increased over this time period. Finally, most hospitals were using rapid-cycle evaluation tools (≥90%), and the percentage of hospitals agreeing that this was facilitated by real-time data exchanges with partners increased from 11% to 42% (P = .04).

Perceived Value and Effectiveness

Survey results showed that hospital respondents’ perceptions of DSRIP’s value and effectiveness were generally positive, and some indicators improved significantly with DSRIP implementation. Hospitals’ reasons for applying for and remaining in the DSRIP program were almost all unchanged, except that DSRIP was increasingly viewed as an opportunity for more financial resources (P = .04) (Table 4). In the first survey, hospital respondents felt that disease management programs under DSRIP had the greatest impact on quality of care and population health outcomes, and this further increased in the second survey. As the program progressed, hospital respondents reported a more positive assessment of the various DSRIP-prescribed activities in improving quality of care and population health outcomes (Table 4). They increasingly saw the value of sharing data with reporting partners, reporting stage 4 metrics, and using rapid-cycle assessment and improvement tools; by the second survey, every aspect was rated as having at least a moderately positive impact, except for reporting on stage 4 metrics. These findings are consistent with both KII rounds in which interviewees expressed enthusiasm for the chronic disease management aspect of DSRIP but saw little value in reporting measures beyond those related to their specific intervention.

The survey assessed whether hospitals regarded DSRIP activities as improving access to care, quality of care, and health, and we observed improvement in perceptions in some areas. In the first survey, hospital respondents perceived that their DSRIP activities improved patient access to health services and patient health only modestly, if at all, with an average rating just shy of “some improvement.” In the second survey, respondents attributed at least “some improvement” in all queried aspects of patient health and care to the beneficial effect of DSRIP activities. Among these, DSRIP’s effects on patient access and overall patient health were statistically significant (P = .02 and .01, respectively) (Table 4).

The second survey further assessed hospital respondent impressions of DSRIP’s overall value and effectiveness in 5 areas: program incentives, payment methodologies, quality metrics, disease management, and community partnerships. The average rating of these statements indicated modest agreement. This was corroborated by findings from the second round of KIIs, in which some of these positive perceptions were also communicated by stakeholders. Hospital informants felt that their DSRIP initiatives underscored the importance of connecting with the community outside the hospital, although progress was sometimes slow because the outpatient component was not emphasized from the beginning of the program. Hospital informants mentioned the value in being able to use DSRIP funds flexibly to pay for services not traditionally reimbursed. By the second set of KIIs, informants were also expressing positive effects on health outcomes from their interventions.

Discussion

New Jersey was 1 of 12 states implementing a DSRIP or DSRIP-like program between 2010 and 2018, and we have highlighted salient ways in which New Jersey hospitals responded to DSRIP over the first demonstration period. Our findings suggest small, positive improvements in partnership relationship building, data capabilities, and perceived impact on patient care and population health. We found statistically significant increases in hospitals’ capability for quality metric reporting and in belief by respondents that their DSRIP programs were improving patient access and health. Our results also point to areas where there is room for improvement in the design of incentive-based delivery system reforms.

Our surveys revealed that although hospitals had, on average, 4 to 5 project partners from the time of project initiation, there was not significant growth in such partnerships over the DSRIP implementation period. Nevertheless, the majority of hospitals agreed that DSRIP fostered community partnerships that had a positive impact on social determinants of health. As for data reporting partnerships, aspects of DSRIP’s design likely kept these partnerships from reaching their full potential. Foremost among them was the lack of financial resources to compensate outpatient providers for the additional work around data collection or for the ongoing reporting that would be required of them. Future DSRIP and DSRIP-like programs should be designed with an awareness that partners may require external funding to enable their participation, a lesson also echoed in an examination of DSRIP in New York.7 Some of the administrative restrictions that limited outpatient collaboration to just 1 hospital may have also impeded new partnerships.

Our analysis revealed increased hospital ability and greater ease over time in extracting quality metric data from EHRs. Although this may reflect in part an industry-wide trend of improved EHR sophistication, it also suggests that DSRIP spurred growth in data and analytic capabilities for participating hospitals. Moreover, perception of the positive impact of data sharing on quality of care and health outcomes improved as the program progressed. It seems that as data sophistication and analytic capabilities grew, so too did appreciation of their value in enhancing care.

Hospitals’ survey responses indicated that their belief in the value and effectiveness of the DSRIP program in improving access to care and health increased over the implementation period. This is notable given that DSRIP represented a major transition point for the hospital industry to value-based payment arrangements under Medicaid. Notwithstanding this positive outlook, the survey indicated considerable room for improvement in the operationalization of the P4P structure. Hospital respondents only “somewhat agreed” that DSRIP used payment methodologies that “fairly” incentivized hospitals’ investments in chronic disease management processes. Also, the importance of DSRIP as a source of financial resources increased from the first survey to the second. This could indicate greater financial pressure among DSRIP hospitals following investment in their chronic disease management programs and the first P4P and UPP results. Although these findings may be driven in part by the overrepresentation of more resource-constrained SNHs in our sample and hospitals with slightly lower UPP payments, they are also consistent with the findings from KIIs that were based on the overall hospital industry.

Limitations

This study is the first to quantify changes in hospital practices and perceptions using repeated survey data on DSRIP implementation experiences supplemented by broad stakeholder KIIs. Although our statistical power was limited due to nonresponse and the inability to link data for some hospitals, the findings provide insights into the association between DSRIP program features and the observed changes in hospital operations around patient care and population health improvement. Without similar information for nonparticipating hospitals, we cannot draw inferences about the extent to which DSRIP may have contributed to the changes we observe, particularly in the growth of analytic capabilities, which may reflect an industry-wide trend. We conducted a sensitivity analysis to determine if there was a dose-response relationship between the DSRIP funding level of each hospital and the changes in analytic capabilities, but we did not find a clear pattern. Additionally, caution should be used in extrapolating our findings to all hospitals, as SNHs were overrepresented in our analytic sample. Still, these hospitals are an important target of DSRIP, so their strong response rate is a strength of our analysis.

Conclusions

The DSRIP program was designed to orient hospitals to population health management using a data-driven approach to identifying community health needs, delivering targeted interventions using evidence-based models of care, and promoting an accountable payment structure based on hospital quality performance and reporting. In New Jersey, the program was successful in catalyzing many aspects of this industry transformation. Processes around data collection for quality metric reporting became better established. Exchanging data with outpatient partners has facilitated use of rapid-cycle evaluation tools, and hospital respondents reported that data sharing and DSRIP activities have positively affected the patients and communities that they serve. Results also point to areas where improvements could be made in the operationalization of a P4P structure.

Acknowledgments

The authors are grateful to officials at the New Jersey Department of Human Services and the New Jersey Department of Health for contextual information relating to the DSRIP program and to Bram Poquette, MLIS, for assistance in preparing this manuscript. We also appreciate the helpful comments and suggestions provided by the editor and anonymous reviewers of this manuscript.

Author Affiliations: Center for State Health Policy, Institute for Health, Health Care Policy and Aging Research (KL, SC, SB, JF, JCC), and Edward J. Bloustein School of Planning and Public Policy (JCC), Rutgers University, New Brunswick, NJ.

Source of Funding: Funding for this work was provided by the New Jersey Department of Human Services (NJDHS) and the Robert Wood Johnson Foundation (RWJF). This work was prepared for NJDHS. Any opinions expressed in this report are those of the authors and do not necessarily represent the view of NJDHS or RWJF.

Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (KL, SC); acquisition of data (KL, SC, SB, JF); analysis and interpretation of data (KL, SC, SB, JF, JCC); drafting of the manuscript (KL, SC, JF, JCC); critical revision of the manuscript for important intellectual content (KL, SC, SB, JF, JCC); statistical analysis (KL, SC, SB); administrative, technical, or logistic support (KL, JF); and supervision (SC, JCC).

Send Correspondence to: Sujoy Chakravarty, PhD, Center for State Health Policy, Institute for Health, Health Care Policy and Aging Research, Rutgers University, 112 Paterson St, 5th Floor, New Brunswick, NJ 08901. Email:

References

1. Heider F, Kartika T, Rosenthal J. Exploration of the evolving federal and state promise of Delivery System Reform Incentive Payment (DSRIP) and similar programs. Medicaid and CHIP Payment and Access Commission. August 2017. Accessed January 23, 2019.

2. Delivery System Reform Incentive Payment programs. Medicaid and CHIP Payment and Access Commission. April 2020. Accessed May 14, 2020.

3. Pourat N. California public hospitals improved quality of care under Medicaid waiver program. University of California, Los Angeles, Center for Health Policy Research. 2017. Accessed January 23, 2019.

4. Baller JB, Woerheide J, Lane K, Verbitsky-Savitz N, Wrobel MV. Delivery System Reform Incentive Payments: interim evaluation report. Mathematica Policy Research. January 31, 2018. Accessed May 23, 2018.

5. Chakravarty S, Lloyd K, Brownlee S, Farnham J. A summative evaluation of the New Jersey DSRIP program: findings from stakeholder interviews, hospital surveys, Medicaid claims data, and reported quality metrics. Rutgers Center for State Health Policy. April 2018. Accessed May 28, 2020.

6. Gusmano MK, Thompson FJ. Medicaid Delivery System Reform Incentive Payments: where do we stand? Health Affairs. September 28, 2018. Accessed January 23, 2019.

7. Roby DH, Louis CJ, Cole MMJ, et al. Supporting transformation through Delivery System Reform Incentive Payment programs: lessons from New York State. J Health Polit Policy Law. 2018;43(2):185-228. doi:10.1215/03616878-4303527

8. Shaikh U, Kizer KW. Observations from California’s Delivery System Reform Incentive Payment program. Am J Med Qual. 2018;33(1):14-20. doi:10.1177/1062860617696579

9. Begley C, Hall J, Shenoy A, et al. Design and implementation of the Texas Medicaid DSRIP program. Popul Health Manag. 2017;20(2):139-145. doi:10.1089/pop.2015.0192

10. Chakravarty S, Lloyd K, Farnham J, Brownlee S. Medicaid DSRIP in New Jersey: trade-offs between broad hospital participation and safety net viability. J Health Polit Policy Law. 2019;44(5):789-806. doi:10.1215/03616878-7611659

11. Gusmano MK, Thompson FJ. An examination of Medicaid Delivery System Reform Incentive Payment initiatives under way in six states. Health Aff (Millwood). 2015;34(7):1162-1169. doi:10.1377/hlthaff.2015.0165

12. Crumley D, Lloyd J, Pucciarello M, Stapelfeld B. Addressing social determinants of health via Medicaid managed care contracts and section 1115 demonstrations. Center for Health Care Strategies, Inc. December 2018. Accessed December 21, 2018.

13. Delivery System Reform Incentive Payment (DSRIP) planning protocol. New Jersey Department of Health. August 9, 2013. Accessed February 11, 2020.

14. DSRIP DY5 payment schedules and guidance document: DY5 performance summary. New Jersey Department of Health. July 31, 2017. Accessed March 28, 2019.