The authors evaluated a brief assessment tool that accountable care organizations can use to help elementary schools improve student nutrition and increase physical activity.
Objectives: HealthPartners developed a checklist, the School Environment Index (SEI), that it uses to help elementary schools identify opportunities to improve student nutrition and increase physical activity. The objective in this pilot study was to assess whether the SEI, as administered, can be used to measure the progress of these programs.
Study Design: The authors focused their evaluation on the National Quality Forum measure evaluation components of reliability and validity; feasibility; and use and usability to assess the SEI’s performance.
Methods: The authors used data from 214 SEIs completed by the 69 schools that participated in the PowerUp School Challenge in at least 1 of the years 2015 through 2019. Between 29 and 53 schools participated in a particular year.
Results: Cronbach’s α was 0.79, intraclass correlation was 0.36 (95% CI, 0.22-0.53), and sensitivity to change was 0.41 (95% CI, 0.17-0.66) per 1-year change in the standardized SEI score. The median (interquartile range) time required to complete the survey was 11 (7-21) minutes. On only 8 surveys was an entire domain of the SEI skipped or only a single response to the domain recorded.
Conclusions: The SEI shows adequate internal consistency and sensitivity to change in this pilot evaluation. It is also feasible and useful to identify opportunities to improve practices and policies related to student nutrition and physical activity in partnership with the participating elementary schools. However, it lacks reliability as used. Increasing the number of respondents per school might moderate the impact of individual respondents and thereby increase reliability.
Am J Manag Care. 2021;27(11):e366-e371. https://doi.org/10.37765/ajmc.2021.88779
Because accountable care organizations (ACOs) address the root causes of poor health through accountable health community initiatives, they may choose to promote the health and well-being of children.1,2 Obesity is a barrier to this goal, and in the United States the prevalence of pediatric obesity has been increasing since at least 1976.3 Now, nearly 20% of children aged 6 to 11 years are obese.4 Poor nutrition and inadequate physical activity throughout entire communities are contributing to this trend. Therefore, potential solutions need to include programs, policies, and environments in addition to interventions designed to help individuals change their behaviors.
Wanting to thwart the obesity epidemic, but also recognizing that all children—obese or not—benefit from health-promoting environments,5,6 HealthPartners has sponsored a community collaborative, PowerUp, since 2013.7-10 Informed by authoritative documents such as Institute of Medicine reports11-13 and the recommendations of expert committees,14 PowerUp produces and promotes community-wide programs, policies, systems, and environments that make high-quality nutrition and physical activity attractive and easy. Promoting nutrition and physical activity avoids the stigmatizing and other negative effects that result from focusing on a child’s body mass index (BMI).15
School-based interventions are a key component of PowerUp because they have been shown to reduce the prevalence of childhood obesity16 and they provide opportunities to promote healthy behaviors among entire cohorts of children, their siblings, and their parents. Near the end of each school year, PowerUp asks the coordinator in each participating school to complete a checklist, the School Environment Index (SEI), that captures the school’s programs, policies, practices, and physical environments that relate to nutrition and physical activity. In the following year, PowerUp discusses the responses with each school to identify opportunities for improvement. PowerUp provides feedback to the schools in both tabular format, as in Table 1, and in graphic format, as in the Figure.
To measure progress, PowerUp keeps a count of its contacts and the number of community activities that occur. It also assesses parents’ knowledge of PowerUp and their behaviors relative to PowerUp goals with a survey of randomly selected households where elementary-aged children are likely to live.
The authors evaluated whether school policies could be used to measure progress toward PowerUp goals but did not find them useful. Typically, the policies provide only broad, district-level recommendations that are not measurable, and implementation at the school level is inconsistent.
It is possible that the data PowerUp collects with the SEI could be used to track changes in programs, policies, and environments in the participating schools. If the tool is not valid and reliable, however, or if schools will not use it, relying on the SEI as a measure of success could incorrectly suggest that PowerUp is not effective when, in fact, it is succeeding. Therefore, the authors have used the National Quality Forum (NQF) measure evaluation components of scientific acceptability (ie, reliability and validity), feasibility, and use and usability17 to assess whether the SEI can detect improvements in school nutrition and physical activity policies and environments in PowerUp communities.
The data that the authors used to assess the preliminary performance of the SEI were collected for program implementation and quality improvement; thus, this secondary analysis was not subject to review by an institutional review board.
The authors characterized the schools that participated in the school challenge by state (Minnesota or Wisconsin); as public, private, or charter; and by community characteristic (urban, suburban, or rural). We also asked the schools to provide us with the number of students who attend the school, and we used Department of Education public data or data provided by the schools to calculate the proportion of students who are eligible for free or reduced-cost lunch at each school. If less than 33% of students were eligible, we considered the school to be in a high-income community; if 33% to 66% of students were eligible, we considered the school to be in a medium-income community; and if more than 66% of students were eligible, we considered the school to be in a low-income community.
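The income-category rule above can be sketched as a small function. The thresholds are taken from the text; the function name and the representation of eligibility as a proportion between 0 and 1 are our own assumptions.

```python
def income_category(frl_share):
    """Classify a school's community income level from the proportion of
    students eligible for free or reduced-cost lunch (0.0-1.0)."""
    if frl_share < 0.33:
        return "high-income"
    elif frl_share <= 0.66:
        return "medium-income"
    else:
        return "low-income"
```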
SEI Scoring and Time Stamps
When completing the SEI, the respondent is offered 5 potential responses to each statement: “we do this always/in all classrooms”; “we do this most of the time/in most classrooms”; “we do this sometimes/in some classrooms”; “we do not do this yet but are planning to”; and “we don’t do this and are not interested.” Statements were organized in 4 domains: physical activity, food and beverage, screen time, and rewards and celebrations.
The maximum score assigned to each SEI statement is based on how much the policy or activity described by the statement is thought to contribute to good nutrition or adequate physical activity. The score is weighted as follows: 100% for “we do this always/in all classrooms”; 75% for “we do this most of the time/in most classrooms”; 50% for “we do this sometimes/in some classrooms”; 25% for “we do not do this yet but are planning to”; and 0% for “we don’t do this and are not interested” or if the item response was missing. For example, if the maximum score was 3, the potential scores would be 3, 2.25, 1.5, 0.75, and 0. The overall SEI score can range from 0 to 100 as described in Table 1.
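As a sketch, this weighting scheme might be implemented as follows. The response labels and weights come from the text; the shorthand dictionary keys and the function name are ours.

```python
# Response weights described in the text, keyed by shorthand labels of our own.
WEIGHTS = {
    "always": 1.00,          # "we do this always/in all classrooms"
    "most": 0.75,            # "we do this most of the time/in most classrooms"
    "sometimes": 0.50,       # "we do this sometimes/in some classrooms"
    "planning": 0.25,        # "we do not do this yet but are planning to"
    "not_interested": 0.00,  # "we don't do this and are not interested"
}

def item_score(max_score, response):
    """Weight an item's maximum score by the response; missing items score 0."""
    if response is None:  # a missing response is scored as 0
        return 0.0
    return max_score * WEIGHTS[response]
```

For a statement with a maximum score of 3, this yields the potential scores of 3, 2.25, 1.5, 0.75, and 0 described above.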
Because the SEI is an online survey, a time stamp is generated when the survey is opened and a second time stamp is generated when the survey is closed. We used the time that a survey was open as a measure of feasibility. We also considered the skipping of an entire domain of a survey or providing a response to only a single statement in a domain as an indication that the survey was not feasible.
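A minimal sketch of this feasibility calculation using Python's standard library is shown below; the open/close timestamp pairs are invented for illustration.

```python
from datetime import datetime
from statistics import quantiles

# Invented open/close timestamp pairs, one per submitted survey.
timestamps = [
    (datetime(2019, 5, 1, 9, 0),  datetime(2019, 5, 1, 9, 7)),
    (datetime(2019, 5, 1, 9, 30), datetime(2019, 5, 1, 9, 39)),
    (datetime(2019, 5, 2, 8, 0),  datetime(2019, 5, 2, 8, 11)),
    (datetime(2019, 5, 2, 8, 40), datetime(2019, 5, 2, 8, 55)),
    (datetime(2019, 5, 3, 7, 0),  datetime(2019, 5, 3, 7, 21)),
]

# Time each survey link was open, in minutes.
minutes_open = [(closed - opened).total_seconds() / 60
                for opened, closed in timestamps]

q1, med, q3 = quantiles(minutes_open, n=4)  # quartiles; med is the median
```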
The test of reliability is the intraclass correlation18 of school SEI scores across all years in which an SEI was submitted. To estimate the intraclass correlation, the authors used a linear mixed model with all available SEI scores per school as the outcome variable, year as a fixed effect, and a random intercept for school. We used the ICC9 macro19 to estimate the 95% CI. We evaluated construct validity by calculating the mean of Cronbach’s α for each item’s score relative to the total score and concurrent validity by comparing the content of the SEI with that of the CDC’s School Health Index23 and School Health Policies and Practices Study (SHPPS) survey instrument.20 To calculate sensitivity to change, we created a standardized SEI score that is the SEI score divided by the SD. We evaluated feasibility using the median time and interquartile range (IQR) that the survey link was open for schools that submitted SEIs. We also counted the number of SEIs in which no responses or only 1 response was entered for an entire domain. The test of use and usability is the proportion of schools participating in the PowerUp School Challenge that completed the SEI.
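For the internal-consistency component, a pure-Python sketch of Cronbach's α is given below. In practice the item-by-respondent scores would come from the SEI data; the toy matrices in the usage note are invented, and the function name is ours.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha from a list of per-item score lists
    (one inner list per item, one entry per respondent)."""
    k = len(item_scores)
    # Total score for each respondent across all items.
    totals = [sum(scores) for scores in zip(*item_scores)]
    # Sum of the individual item variances.
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))
```

Two perfectly correlated items yield α = 1; weaker agreement among items lowers α toward (and potentially below) 0.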
The analysis is based on 214 SEIs submitted by the 69 schools that participated in the PowerUp School Challenge at least once in the years 2015 through 2019. The number of schools in a particular year ranged from 29 to 53, with more schools participating in the more recent years. Although the majority of schools that completed an SEI were public schools in Minnesota, approximately 19% were Wisconsin public schools (Table 2). Only 9 schools were charter or private schools in either state. More than half of the schools were in suburban communities and half were in high-income communities as defined by the proportion of students eligible for free or reduced-cost lunch. Although 3 of the schools had fewer than 100 students and 2 schools had more than 700, the number of students in each school was distributed across this range fairly evenly. Fifty-four schools completed an SEI in at least 2 years.
Performance of the SEI as a Measure of the School Environment
Reliability. Reliability, as measured by the intraclass correlation, is 0.36 (95% CI, 0.22-0.53) (Table 3).
Validity. The mean Cronbach’s α comparing individual statements with the overall score is 0.79. Sensitivity to change is 0.41 (95% CI, 0.17-0.66) per 1-year change in the standardized SEI score. Statements in the SEI survey include environmental structures that are the attributes of a health-promoting school environment and are in concordance with other school health surveys.20-22 The similarity of the SEI statements to those of the CDC’s School Health Index23 and SHPPS20 is evidence of concurrent validity. The SEI statements have been accepted as valid by the personnel of the schools participating in the PowerUp School Challenge.
Feasibility. The median (IQR) time to complete and submit the survey was 11 (7-21) minutes. Only in 8 SEIs was an entire domain left totally blank or only a single response to the domain recorded. This situation occurred for the responses to the statements about screen time in 7 SEIs and to the statements about rewards and celebrations in 1 SEI.
Use and usability. Each year, schools are invited to complete the SEI at the end of their participation in PowerUp. From 2015 through 2019, 69 schools completed the SEI at least once. The completion rate rose from 51% of the 57 invited schools in 2015 to 62% of 60 in 2016, 79% of 62 in 2017, and 85% of 62 in 2018, before declining slightly to 79% of the 58 invited schools in 2019.
HealthPartners developed the SEI to engage elementary schools in the process of improving nutrition and increasing opportunities for physical activity for their students. We conducted the current pilot analysis to learn whether we could also use it as a program evaluation tool. Based on components of the NQF measure evaluation framework, we conclude that the SEI is valid, feasible, usable, and used. However, as we are using the SEI, reliability is unacceptably low.
Exactly why the reliability is low in this data set cannot be determined. It is possible that the statements on the SEI are ambiguous. The 1-year interval between submissions creates the possibility that policy and programmatic changes within the schools—both for better and for worse—reduced reliability. We know that the SEIs were not always submitted by the same individual year to year; this could lead to differences in scores caused by differences in perspective or knowledge of the schools’ policies and programs. It is also possible that respondents were willing to answer the survey without knowing the correct answers. Whatever the source of the low reliability, we conclude that, until we better understand why reliability is low and find ways to increase it, we should not use the SEI for program evaluation. The hazard created by using the SEI for program evaluation at this time is that the user might not identify improvements in nutrition and physical activity that have been made by the participating schools and thus may conclude that PowerUp is failing when, in fact, it is succeeding.
Two of the NQF measure evaluation criteria that we did not focus on are importance and related and competing measures. The SEI is important because the Child Nutrition Reauthorization Act requires schools to set nutrition and physical activity goals and adopt a plan for measuring implementation if they wish to receive federal funding for their student nutrition programs.24
We were able to find 3 related tools that might be considered to measure the same constructs as the SEI: the CDC’s School Health Index,23 the SHPPS survey instrument,20 and the School Physical Activity and Nutrition Environment Tool (SPAN-ET).22,25 Although the CDC has tested its School Health Index for readability and user-friendliness, it has not tested it for validity and reliability because the CDC considers the School Health Index to be a worksheet rather than an evaluation tool.23 The CDC developed the SHPPS survey to create a national record of school policies and has tested it for validity and reliability.20,26 However, the length of the SHPPS could challenge usability: it is composed of 5 separate questionnaires, and PowerUp has not been able to convince the participating schools to complete it. The SPAN-ET requires face-to-face and/or telephone interviews with key informants, on-site direct observations, and content review of various forms of documentation. The resources required to complete the SPAN-ET would be too great for either the PowerUp program or the participating schools.
Evaluating community programs such as PowerUp is challenging for an ACO because community funding agencies are willing to spend only limited resources on evaluation. HealthPartners is tracking trends in BMI, overweight, and obesity because it has access to children’s medical records through its care group, but these indicators are of limited value in assessing intermediate outcomes and program progress: they lag behind program activities, they provide no process information, and the medical record does not indicate which school a child attends.
Failure to invest in evaluation can create a scenario in which scarce resources and time are wasted. A classic example is Project D.A.R.E. (Drug Abuse Resistance Education).27,28 Although very popular with adults, this program was found to have no measurable effect on adolescent drug use.28 Failure to invest in evaluation can also result in an effective program being discontinued.
Because HealthPartners lacks the resources to research novel interventions, it uses the results of successful trials and authoritative reports11-14 to generate a menu of interventions that show promise of success in the community. A recent meta-analysis of school-based trials suggests that this strategy is correct: Effective programs combined diet and physical activity interventions; involved the school teacher in program delivery; had a duration of greater than 8 months; and involved parents in the intervention, the education sessions, and school food modifications.29 Some trials promoting healthier nutrition and increased physical activity have concluded that a program associate who can coach the schools may be essential for success.30,31 HealthPartners uses all of these learnings, insights, and conditions when it partners with the schools that participate in the PowerUp School Challenge.
A significant limitation of this pilot evaluation of the SEI as an assessment tool is that the data were collected for programmatic intervention rather than for a formal evaluation. A stronger design would be to conduct a concurrent evaluation with a tool that is known to be valid and reliable. Another approach that might increase reliability would be to ask every classroom teacher in participating schools to complete an SEI; this would tend to reduce the impact of extreme or biased scores. ACOs conducting school-based interventions in other parts of the country might not find the SEI to be useful in engaging schools in policy and environment change.
Although the SEI has been useful for identifying opportunities to improve practices and policies that promote student nutrition and physical activity in the partnering elementary schools, this pilot study indicates that the team needs to develop additional procedures to increase the reliability of the instrument before using it to assess whether the PowerUp School Challenge is achieving its goals.
Author Affiliations: HealthPartners (TEK), Minneapolis, MN; HealthPartners Institute (TEK, GV-B), Minneapolis, MN; Lakeview Health Foundation (HSK, MMC, AJZ, ACA), Stillwater, MN.
Source of Funding: HealthPartners and Lakeview Health Foundation were the sole sources of funding.
Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.
Authorship Information: Concept and design (TEK, HSK, MMC, AJZ, GV-B); acquisition of data (MMC, AJZ, ACA); analysis and interpretation of data (TEK, HSK, MMC, AJZ, ACA, GV-B); drafting of the manuscript (TEK, HSK, MMC, GV-B); critical revision of the manuscript for important intellectual content (TEK, MMC, ACA, GV-B); statistical analysis (HSK, GV-B); provision of patients or study materials (MMC); administrative, technical, or logistic support (ACA); and supervision (TEK).
Address Correspondence to: Thomas E. Kottke, MD, MSPH, HealthPartners, 8170 33rd Ave S, MS 21110X, Minneapolis, MN 55425. Email: Thomas.e.kottke@HealthPartners.com.
1. Gratale DJ, Counts NZ, Hogan L, et al. Accountable communities for health for children and families: approaches for catalyzing and accelerating success. National Academy of Medicine. January 13, 2020. Accessed April 11, 2020. https://nam.edu/accountable-communities-for-health-for-children-and-families-approaches-for-catalyzing-and-accelerating-success/
2. Gratale D, Chang D. Defining an accountable community for health for children and families. National Academy of Medicine. October 30, 2017. Accessed April 11, 2020. https://nam.edu/defining-an-accountable-community-for-health-for-children-and-families/
3. Gortmaker SL, Dietz WH Jr, Sobol AM, Wehler CA. Increasing pediatric obesity in the United States. Am J Dis Child. 1987;141(5):535-540. doi:10.1001/archpedi.1987.04460050077035
4. Ogden CL, Fryar CD, Martin CB, et al. Trends in obesity prevalence by race and Hispanic origin—1999-2000 to 2017-2018. JAMA. 2020;324(12):1208-1210. doi:10.1001/jama.2020.14590
5. Ploughman M. Exercise is brain food: the effects of physical activity on cognitive function. Dev Neurorehabil. 2008;11(3):236-240. doi:10.1080/17518420801997007
6. García-Hermoso A, Ramírez-Vélez R, García-Alonso Y, Alonso-Martínez AM, Izquierdo M. Association of cardiorespiratory fitness levels during youth with health risk later in life: a systematic review and meta-analysis. JAMA Pediatr. 2020;174(10):952-960. doi:10.1001/jamapediatrics.2020.2400
7. PowerUp. Accessed November 20, 2020. http://www.powerup4kids.org/Home
8. Canterbury M, Pronk N, Kottke TE, Zimmerman D. Case study: the power of community in population health: PowerUp for kids. In: Nash DB, Skoufalos A, Fabius RJ, Oglesby WH, eds. Population Health: Creating a Culture of Wellness. 3rd ed. Jones & Bartlett; 2019:427-436.
9. Canterbury M, Hedlund S, Zimmerman D. PowerUp in the St. Croix Valley (MN/WI) case study. In: Institute of Medicine. Cross-Sector Responses to Obesity: Models for Change: Workshop Summary. The National Academies Press; 2015:109-114.
10. Canterbury M, Hedlund S. The potential of community-wide initiatives in the prevention of childhood obesity. Diabetes Spectr. 2013;26(3):165-170. doi:10.2337/diaspect.26.3.165
11. Institute of Medicine. Cross-Sector Responses to Obesity: Models for Change: Workshop Summary. The National Academies Press; 2015.
12. Institute of Medicine. Early Childhood Obesity Prevention Policies. The National Academies Press; 2011.
13. Institute of Medicine. Accelerating Progress in Obesity Prevention: Solving the Weight of the Nation. The National Academies Press; 2012.
14. Barlow SE; Expert Committee. Expert committee recommendations regarding the prevention, assessment, and treatment of child and adolescent overweight and obesity: summary report. Pediatrics. 2007;120(suppl 4):S164-S192. doi:10.1542/peds.2007-2329C
15. Madsen KA, Thompson HR, Linchey J, et al. Effect of school-based body mass index reporting in California public schools: a randomized clinical trial. JAMA Pediatr. 2020;175(3):251-259. doi:10.1001/jamapediatrics.2020.4768
16. Gonzalez-Suarez C, Worley A, Grimmer-Somers K, Dones V. School-based interventions on childhood obesity: a meta-analysis. Am J Prev Med. 2009;37(5):418-427. doi:10.1016/j.amepre.2009.07.012
17. NQF’s history. National Quality Forum. Accessed November 20, 2020. http://www.qualityforum.org/about_nqf/history/
18. Adams JL. The Reliability of Provider Profiling: A Tutorial. RAND Corporation; 2009.
19. Hertzmark E, Spiegelman D. The SAS ICC9 Macro. Harvard University; 2010.
20. School Health Policies and Practices Study (SHPPS). CDC. Accessed April 11, 2020. https://www.cdc.gov/healthyyouth/data/shpps/index.htm
21. SHPPS School Health Policies and Practices Study: 2016 overview. CDC. 2016. Accessed September 7, 2020. https://www.cdc.gov/healthyyouth/data/shpps/pdf/2016factsheets/Overview-SHPPS2016.pdf
22. SPAN-ET: School Physical Activity and Nutrition Environment Tool. Oregon State University Extension Service. Accessed September 7, 2020. https://extension.oregonstate.edu/span-et
23. CDC. Your Guide to Using the School Health Index. HHS; 2019. Accessed September 7, 2020. https://www.cdc.gov/healthyschools/shi/pdf/FINAL_School-Health-Index-Guide-112619_revd-120319_508tag.pdf
24. Child Nutrition and WIC Reauthorization Act of 2004, Pub L No. 108-265, 118 Stat 729 (2004).
25. John DH, Gunter K, Jackson JA, Manore M. Developing the School Physical Activity and Nutrition Environment Tool to measure qualities of the obesogenic context. J Sch Health. 2016;86(1):39-47. doi:10.1111/josh.12348
26. Brener ND, Kann L, Smith TK. Reliability and validity of the School Health Policies and Programs Study 2000 questionnaires. J Sch Health. 2003;73(1):29-37. doi:10.1111/j.1746-1561.2003.tb06556.x
27. Bureau of Justice Assistance. Implementing Project DARE: Drug Abuse Resistance Education. US Department of Justice, Office of Justice Programs; 1988. Accessed September 7, 2020. https://www.ojp.gov/ncjrs/virtual-library/abstracts/implementing-project-dare-drug-abuse-resistance-education-0
28. West SL, O’Neal KK. Project D.A.R.E. outcome effectiveness revisited. Am J Public Health. 2004;94(6):1027-1029. doi:10.2105/ajph.94.6.1027
29. Singhal J, Herd C, Adab P, Pallan M. Effectiveness of school-based interventions to prevent obesity among children aged 4 to 12 years old in middle-income countries: a systematic review and meta-analysis. Obes Rev. 2021;22(1):e13105. doi:10.1111/obr.13105
30. Staten LK, Teufel-Shone NI, Steinfelt VE, et al. The School Health Index as an impetus for change. Prev Chronic Dis. 2005;2(1):A19.
31. Pearlman DN, Dowling E, Bayuk C, Cullinen K, Thacher AK. From concept to practice: using the School Health Index to create healthy school environments in Rhode Island elementary schools. Prev Chronic Dis. 2005;2(Spec No):A09.