Association between multi-component initiatives and physical activity-related behaviors: interim findings from the Healthy Schools Healthy Communities initiative

Abstract

Background

Although multi-component initiatives (MCIs) can be successful, their assessment proves challenging. Further, rigorous evaluations may not be viable, especially when assessing the impact of MCIs on long-term population-level behavior change (e.g., physical activity (PA)) and health outcomes (e.g., childhood obesity). The purpose of this study was to use intensity scoring to assess whether higher-intensity MCIs implemented as part of Healthy Schools Healthy Communities (HSHC) were associated with improved physical activity and reduced sedentary behaviors among youth (dependent variables).

Methods

PA-related interventions were assigned point values based on three characteristics: 1) purpose of the initiative; 2) duration; and 3) reach. An MCI intensity score across all strategies was calculated for each school district and its respective community. Multivariate longitudinal regressions were applied, controlling for measurement period, cohort, and student enrollment size.

Results

Strategy intensity scores ranged from 0.3 to 3.0, with 20% considered “higher-scoring” (score of 2.1 or above) and 47% considered “lower-scoring” (1.2 or below). Average MCI intensity scores more than tripled over the evaluation period, rising from 14.8 in the first grant year to 32.1 in year 2, 41.1 in year 3, and 48.1 in year 4. For each additional point increase in average MCI intensity score, the number of days per week that students reported PA for at least 60 min increased by 0.010 days (p < 0.01), and the number of hours per weekday that students reported engaging in screen time decreased by 0.006 h (p < 0.05). An increase of 50 points in MCI intensity score was associated with an average increase of 0.5 days per week of being physically active, and an increase of 55 points was associated with an average decrease of 20 min of sedentary time per weekday.

Conclusions

We found a correlation between MCI intensity and both PA and sedentary time: higher-intensity MCIs were associated with increased PA and reduced sedentary time. While additional research is warranted, practitioners implementing MCIs, especially those with limited resources (and limited access to population-level behavior data), may consider intensity scoring as a realistic and cost-effective way to assess their initiatives. At a minimum, the use of intensity scoring as an evaluation method can provide justification for, or against, the inclusion of an individual strategy in an MCI, as well as identify ways to increase the likelihood of the MCI impacting population-health outcomes.

Background

The health benefits of regular engagement in moderate-to-vigorous physical activity (PA) are well-established [1, 2]. Meeting the recommended 60 min of PA per day has musculoskeletal and cardiovascular health benefits and supports the maintenance of a healthy body weight among children [3, 4]. More specifically, regular PA decreases the likelihood of becoming overweight or obese [1], with some estimates suggesting a reduction of as much as 70% [5]. In contrast, Tremblay et al. [5] found that sedentary behavior significantly increases the likelihood of becoming overweight or obese, by as much as 61%. Furthermore, evidence has emerged that sedentary behaviors are ubiquitous and represent a health risk independent of PA [6].

Traditional public health efforts have primarily targeted the individual, emphasizing education and behavioral skills training, and have not resulted in sustained behavior change [7, 8]. Consequently, there is an increasing focus on multi-component initiatives (MCIs), efforts composed of numerous strategies that work on multiple levels (i.e., individual, interpersonal, organizational, and community) at the same time and, in the best case, are synchronized across levels [9]. Such initiatives tend to target multiple settings, such as schools, parks, and streets, simultaneously and in a coordinated manner to create greater intensity, effect, and sustainable change [9,10,11,12,13,14,15,16,17,18,19].

To ensure success, MCIs must include evidence-based strategies, as well as extensive community engagement and attention to community needs, wants, and strengths [20]. This kind of organization and coordination of strategies across different settings is challenging, yet there are a number of successful MCIs [10,11,12,13, 15,16,17, 21]. For example, Romon et al. [13] found that, over a long period of time, MCIs with various strategies (e.g., new sporting facilities built, educators employed to promote PA in schools, and family strategies organized) had a synergistic effect on overweight prevalence. Shape Up Somerville, a comprehensive intervention that included a variety of school and community-wide initiatives, was found to decrease BMI z-score in children at high risk for obesity [22].

Assessment of MCIs proves to be very challenging [18, 23], and rigorous evaluations (e.g., randomized controlled trials, quasi-experiments) may not be viable, especially when assessing the long-term effectiveness of MCIs. Moreover, while funders and other stakeholders want to see population-level health improvements (e.g., reduced obesity), these changes may not be detectable in the short term [24, 25]. Efforts to assess strategies and their likely positive impact in the long term are warranted and have been increasingly explored [15, 26,27,28,29,30]. Cheadle et al. [31] developed and assessed various attributes (e.g., reach, efficacy, and strength) of single strategies, and found those with higher reach and strength were correlated with improved health behaviors.

Numerous other researchers collaborated to explore the relationship between child obesity and characteristics of 130 MCIs across the U.S. using three predictors of population health: purpose (the way the strategy is intended to impact behavior), duration (the length of time of the strategy), and reach (the number of people exposed to a strategy) [15, 27]. Each strategy was assessed for purpose, duration, and reach, and an overall MCI “intensity” score was calculated, based on evidence suggesting that lower-scoring strategies are less likely to produce sustained population-level behavior change than higher-scoring strategies. Strategies that improve access (e.g., installation of a walking/running trail), reduce barriers (e.g., increased time in physical education classes), or change broader conditions (e.g., lighting repairs and maintenance of a neighborhood playground) are higher in intensity than those aiming to educate or enhance skills (e.g., information on how to be physically active) [32,33,34]. Evidence also suggests that the more people are exposed to a strategy, and the longer they are exposed, the greater the likelihood that the strategy will lead to desired outcomes [27, 28, 31].

These types of studies have made an important contribution to the field by establishing a systematic way of measuring MCIs. That said, further exploration of these methodologies as tools for assessing MCIs, determining the intensity necessary to achieve health improvements, and making strategic improvements is important.

Purpose and objectives

The purpose of this study was to use intensity scoring, as described by Fawcett et al. [28] and Collie-Akers et al. [27], as a realistic and cost-effective way to assess MCIs, and to examine whether higher-intensity PA-related MCIs implemented as part of the Healthy Schools Healthy Communities (HSHC) initiative were associated with improved physical activity-related behaviors among youth.

Methods

Intervention approach

With funding from the Missouri Foundation for Health, HSHC was implemented across 34 Missouri school districts and their respective communities to address childhood obesity, targeting thousands of children and youth in grades K–8. Funding was intended to build on existing school and community assets to stimulate implementation of new and/or advanced efforts for increasing access to healthy food and PA in vulnerable communities throughout the foundation’s catchment area (see https://mffh.org/the-foundation/where-we-work/). As per the logic model, technical assistance and increased linkages within and across grantees, resources, and funding were intended to lead to short-term outcomes (e.g., establishment of strong, durable partnerships; regular collaboration and communication), intermediate outcomes (e.g., increased capacity, improved perceptions and behaviors regarding PA), and ultimately, long-term outcomes (e.g., increased percentage of youth at a healthy weight). HSHC began in 2013 with a cohort of 13 school districts across 12 communities (Cohort 1). The Missouri Foundation for Health enrolled 12 new school districts (adding one new community) (Cohort 2) in 2014, and 9 new school districts (Cohort 3) in 2015. Overall, 33 school districts across 13 communities were included in the analysis.

School and community coordinators conducted wellness assessments and created action plans to achieve the long-term goal of reducing childhood obesity. School action plans were guided by the Alliance for a Healthier Generation’s Healthy Schools Program Framework of Best Practices (Alliance Framework) [35] and addressed Comprehensive School Physical Activity Program (CSPAP) components. Community action plans were informed by the YMCA’s Community Healthy Living Index (CHLI) [36]. Across schools and communities, both the action plans and the stakeholders implementing the strategies varied greatly; however, they all included some combination of CSPAP strategies and aimed to: 1) increase knowledge and awareness, enhance skills, support behavior change, and motivate the community, and 2) modify broader conditions. Common strategies were walk-to-school days, health and wellness fairs, joint-use agreements, and installation of playground equipment or walking trails. Table 1 provides examples and an overview of these strategies, as recorded by school and community coordinators in real time, by year.

Table 1 Activity Types Implemented Over the First Four Years, Missouri HSHC 2013–2017

John Snow, Inc. (JSI), a research and consulting firm specializing in the implementation and evaluation of community-wide initiatives, was contracted by the Missouri Foundation for Health to conduct a mixed-methods evaluation during the first half of the HSHC initiative (2013–2017). This study was reviewed by the John Snow, Inc. IRB (OHRP IRB00009069) and deemed exempt. The evaluation was guided by the HSHC logic model and assessed the strategies set forth in each district and respective community’s action plans. Various methods were used to capture both quantitative and qualitative data, including: 1) an online data platform which allowed grantees to document their strategies in real time; 2) interviews with grantees; and 3) surveys administered at school to all students in 5th through 8th grades at baseline (in the fall) and once a year thereafter (in the spring) to assess PA behaviors and perceptions.

Regardless of how intentional or coordinated, each strategy was included in the evaluation if it was reported by the coordinators. For this investigation, however, strategies were only included in the analysis if they specifically targeted PA and/or sedentary activity across the participating schools and communities (Table 1). Strategies that only addressed healthy eating and those that were not reported by local champions between September 1, 2013 and July 31, 2017, were not included in these analyses (see Fig. 1 for an evaluation overview).

Fig. 1 Evaluation Overview

Independent variables

Strategy intensity score

Individually, four evaluation team members assigned point values to every strategy based on the three attributes used in previous research [27]: 1) purpose (i.e., providing information, enhancing skills or services, modifying access, or changing broader conditions); 2) duration (i.e., occurring just once, several times, or ongoing); and 3) reach or penetration of the strategy (i.e., the proportion of the total city/town population that either participated in or could have been exposed to the strategy based on where the intervention was implemented). To ensure inter-rater reliability, agreement of at least 80% across the study team was achieved. Each attribute was scored 0.1 (minimum), 0.55 (medium), or 1.0 (maximum), and the three values were summed to calculate an intensity score for every strategy (purpose value + duration value + reach value). Strategy scores ranged from 0.3 (weakest and potentially of less influence on longer-term outcomes) to 3.0 (strongest and potentially more sustainable and of greater influence). Strategies that were a more permanent fixture (e.g., new playground equipment) or a policy that would take a process to change (e.g., a joint-use agreement) were scored as “ongoing” (assigned a duration score of 1.0). Strategies defined as ongoing were treated as “active” in each subsequent year after their reported adoption/installation, unless otherwise reported as ended. For example, if a new park was installed in 2014, it would be assigned a duration score of 1.0 and its intensity score would be included as a separate strategy in the 2015, 2016, and 2017 grant years. Scoring examples are provided in Table 2.

Table 2 Protocol for Assigning Intensity Score for Each Strategy, Missouri HSHC 2013–2017
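To make the scoring rule concrete, the following is a minimal sketch (in Python; not part of the original evaluation) of how a single strategy’s intensity score could be computed. The attribute labels and the example strategy are illustrative assumptions, while the point values (0.1, 0.55, 1.0) and the 0.3–3.0 range come from the protocol above.

```python
from dataclasses import dataclass

# Point values for each attribute level, as described in the Methods:
# 0.1 = minimum, 0.55 = medium, 1.0 = maximum.
ATTRIBUTE_POINTS = {"min": 0.1, "med": 0.55, "max": 1.0}

@dataclass
class Strategy:
    name: str
    purpose: str   # "min" = information/skill-building, "med" = enhanced services, "max" = access/broader conditions
    duration: str  # "min" = once, "med" = several times, "max" = ongoing
    reach: str     # "min" = small share of the population (<5%), "med" = moderate, "max" = large

def strategy_intensity(s: Strategy) -> float:
    """Sum the purpose, duration, and reach values (possible range 0.3-3.0)."""
    return round(
        ATTRIBUTE_POINTS[s.purpose]
        + ATTRIBUTE_POINTS[s.duration]
        + ATTRIBUTE_POINTS[s.reach],
        2,
    )

# Hypothetical example: a new walking trail (modifies access, ongoing, moderate reach).
trail = Strategy("walking trail", purpose="max", duration="max", reach="med")
print(strategy_intensity(trail))  # 2.55 -> a "higher-scoring" strategy (2.1 or above)
```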

MCI intensity score

A composite intensity score, calculated as the sum of the intensity scores of all strategies reported for each school district and its respective community, was reported as the MCI intensity score. In cases where multiple school districts were located in the same community, or were served by the same community-based organization funded to implement HSHC strategies, the community-level intensity scores were included in each school district’s MCI intensity score. Separate MCI intensity scores were calculated for each district for every grant year of participation.
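As a concrete illustration of this aggregation (a hypothetical sketch, not the authors’ code), the per-district MCI intensity score can be computed by summing strategy intensity scores within each district and grant year, with community-level strategies credited to every co-located district:

```python
from collections import defaultdict

def mci_intensity_scores(strategy_records):
    """Sum strategy intensity scores by (district, grant_year).

    strategy_records: iterable of (district, grant_year, intensity) tuples, where a
    community-level strategy is repeated under each district in that community.
    """
    totals = defaultdict(float)
    for district, year, intensity in strategy_records:
        totals[(district, year)] += intensity
    return {key: round(total, 2) for key, total in totals.items()}

# Hypothetical records for two districts sharing one community-level strategy.
records = [
    ("District A", 1, 2.55),  # walking trail (community-level)
    ("District A", 1, 0.75),  # one-time health fair (school-level)
    ("District B", 1, 2.55),  # same trail, also credited to the co-located district
]
print(mci_intensity_scores(records))  # {('District A', 1): 3.3, ('District B', 1): 2.55}
```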

District cohort

HSHC began with a cohort of 13 school districts across 12 communities in fall 2013 (Cohort 1) and was expanded by the Missouri Foundation for Health over the next 2 years. In 2014, HSHC grew to include 12 new school districts (adding one new community) (Cohort 2); in 2015, an additional 9 school districts were added (Cohort 3).

Time since enrolling in HSHC (Grant year)

Grant year accounts for the number of years that the cohort, or group of school districts, had been enrolled in HSHC. Because school districts were enrolled in HSHC between the fall of 2013 and the fall of 2015, the baseline of zero grant years (and each subsequent year) corresponds to different calendar years depending on each school district’s cohort. The first grant year was 2013 for Cohort 1, 2014 for Cohort 2, and 2015 for Cohort 3. For example, in 2014 the time since enrolling in HSHC was 1 for Cohort 1 and 0 for Cohort 2.
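The mapping from calendar year to grant year is a simple difference; a minimal sketch (cohort start years taken from the text; the function name is illustrative):

```python
COHORT_START_YEAR = {1: 2013, 2: 2014, 3: 2015}

def grant_year(cohort: int, calendar_year: int) -> int:
    """Years since the district's cohort enrolled in HSHC (0 = first/baseline grant year)."""
    return calendar_year - COHORT_START_YEAR[cohort]

print(grant_year(1, 2014))  # 1
print(grant_year(2, 2014))  # 0
```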

Student enrollment size

The total number of students enrolled in the targeted grades for HSHC (K–8) was obtained annually from the Missouri Department of Elementary and Secondary Education’s website (https://mcds.dese.mo.gov/quickfacts/Missouri School Directory). School districts were classified on a scale ranging from 1 to 63, with one unit per 100 students (enrollment sizes of 1–100 students were assigned a value of 1, sizes of 101–200 students a value of 2, etc.).
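A minimal sketch of this binning (assuming the value is the enrollment rounded up to the nearest 100, consistent with the examples above; the function name is illustrative):

```python
import math

def enrollment_size_value(n_students: int) -> int:
    """Map K-8 enrollment to the 1-63 covariate scale (one unit per 100 students)."""
    return math.ceil(n_students / 100)

print(enrollment_size_value(100))   # 1
print(enrollment_size_value(101))   # 2
print(enrollment_size_value(6250))  # 63
```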

Dependent variables

PA

A self-reported survey was administered by classroom teachers to all 5th through 8th grade students enrolled in the school districts. The survey was conducted in the spring of the first year of enrollment into HSHC (which differed by cohort) and in each subsequent school year. Standard questions on PA behaviors and perceptions were incorporated into the survey. Prior to administration, the reading level of each survey question was verified (averaging a 6th grade reading level) and the survey was piloted with a number of 5th and 6th graders. Students were asked, “During the past 7 days, on how many days were you physically active for a total of at least 60 minutes per day? Add up all of the time you spent in any kind of physical activity that increased your heart rate and made you breathe hard for some of the time.” PA time was defined as the number of days students reported engaging in PA for at least 60 min.

Screen time

Using the same self-reported survey described above, students were asked about their screen time (a major form of sedentary behavior) on an average school day and an average weekend day. The questions were informed by the literature documenting recommended assessments of sedentary behaviors [37]. The two questions were identical except for the day referenced: “On an average [school/weekend] day, how many hours do you watch TV, play video games or computer games, or use a computer for something that is not school work? Count the time spent on things such as Xbox, PlayStation, an iPod, an iPad or other tablet, a smartphone, YouTube, Facebook or other social networking tools like Twitter or Pinterest, and the internet.” Screen time was defined as the number of hours per school day and per weekend day spent engaging with such technology.

Outcomes were operationalized as the district-level average number of days students reported being physically active for a total of at least 60 min per day during the past 7 days, and the district-level average number of hours of screen time per school day and per weekend day.
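A minimal sketch (with hypothetical file and column names; not the evaluation’s actual pipeline) of aggregating student responses into these district-level outcome measures:

```python
import pandas as pd

# Hypothetical student-level survey extract.
surveys = pd.read_csv("student_surveys.csv")  # columns: district, school_year, pa_days, weekday_screen_hr, weekend_screen_hr

# District-level averages by school year, matching the operationalized outcomes.
district_year = (
    surveys.groupby(["district", "school_year"])[
        ["pa_days", "weekday_screen_hr", "weekend_screen_hr"]
    ]
    .mean()
    .reset_index()
)
print(district_year.head())
```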

Analysis

Data were aggregated at the school-district level for each school year from September 1, 2013 through July 31, 2017, including both school and community strategies. Descriptive statistics were generated to describe the MCI intensity scores and each of the three outcomes over time and by cohort. ANOVAs were applied to detect significant differences between cohorts on the outcome measures at baseline. Pearson’s correlation was used to evaluate the crude association between MCI intensity scores and the three outcomes. To account for the correlation of repeated measurements within school districts, the number of days per week physically active and the two screen time outcomes were analyzed using linear mixed-effects models for repeated measures (repeated observations nested within districts). Models included a random effect for district. All models controlled for district cohort (1, 2, or 3), student enrollment size, and time since enrolling in HSHC. Controlling for these factors is important given the variation among schools and communities and the times at which they entered HSHC. A variance components covariance structure was used under the assumption of independence across measures and because the independent variables were measured on different scales. With time since enrolling in HSHC as a fixed effect, the Bonferroni correction was applied to adjust for multiple comparisons over time. Analyses were conducted in SAS 9.4 (SAS Institute, Inc., Cary, NC).
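For readers who want to reproduce the general modeling approach, the following is an analogous sketch in Python using statsmodels (the authors used SAS 9.4; the data file, column names, and model details here are illustrative assumptions, not the original code):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical district-by-year analysis file with one row per district per grant year.
df = pd.read_csv("district_year_analysis.csv")

# Random intercept for district accounts for repeated measures within districts;
# fixed effects mirror the covariates named in the text.
model = smf.mixedlm(
    "pa_days_per_week ~ mci_intensity + C(cohort) + enrollment_size + grant_year",
    data=df,
    groups=df["district"],
)
result = model.fit()
print(result.summary())  # the coefficient on mci_intensity is the association of interest
```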

Results

PA-related strategies by CSPAP

There were 2174 PA-related strategies implemented over the 4 years of study (Fig. 2); see the supplementary table for a breakdown of the PA-related strategies by community and school/childcare across time. Almost one-third (31%) of the strategies were community-based; 19% were related to before- and after-school physical activity, with most occurring after school; and 16% were related to increased physical activity during school. Fifteen percent of the strategies were environmental or policy changes that occurred in the school and/or community.

Fig. 2 Total Number and Percentage of Strategies by Comprehensive School Physical Activity Program (CSPAP) Intervention. The CSPAP framework organizes school physical activity interventions into five categories: 1) physical activity before and after school, 2) physical activity during school, 3) family and community engagement, 4) physical education, and 5) school staff involvement

PA-related score by attribute

Out of the 2174 PA-related strategies implemented from 2013 to 2017, 56.2% were low-scoring in purpose (0.1), 31.6% were medium-scoring (0.55), and 12.2% were high-scoring (1.0). In other words, over half of the strategies aimed to increase knowledge or enhance skills, roughly one-third enhanced services, and a little over 10% modified access or changed broader conditions. Almost half (49.2%) of all strategies were low-scoring in duration, meaning they happened only once; 39.1% were medium-scoring, happening more than once but not ongoing; and 11.7% were high-scoring, more permanent strategies. Finally, for the reach attribute, 51.7% of strategies were low-scoring, meaning they reached less than 5% of the total population, while roughly one-quarter each were medium- and high-scoring (24.4 and 24.0%, respectively), reaching higher percentages of the population.

PA-related score by strategy

The sum of each individual strategy’s attribute values, or PA-related strategy score, ranged from 0.3 (lowest) to 3.0 (highest). Twenty percent of strategies were considered “higher-scoring” (2.1 or higher), indicating an increased likelihood of a greater impact on long-term positive behavior change and health outcomes, while almost half of all strategies (47%) were “lower-scoring” (1.2 or below). Across all years and districts, the average individual strategy score was 1.33.

MCI intensity scores

The MCI intensity scores, or the per-district sums of all strategy scores within a school district and its respective community, averaged across districts, rose from 14.8 in the first grant year to 32.1 in year 2, 41.1 in year 3, and 48.1 in year 4 (Table 3). The mean MCI intensity score for Cohort 1 districts increased with every subsequent year in HSHC (except the last year, when it held steady at the previous year’s level), rising from 15.4 in the first year to 48.1 in the fourth year. Cohort 1 was involved in HSHC for the longest time, and its scores were higher in every grant year (2014–2017) than those of Cohorts 2 and 3. Where comparisons were possible, this trend held for Cohort 2, which was enrolled in HSHC for the second-longest period and had higher scores than Cohort 3.

Table 3 Characteristics of Districts, Missouri HSHC 2013–2017

Across all years, Cohort 1 had higher average MCI intensity scores than Cohorts 2 and 3, and it implemented a considerably higher number of strategies (Fig. 3). While the number of Cohort 1 strategies more than doubled in the second year compared to the first (n = 377 vs. 163), the average intensity score per strategy increased only slightly (from 1.24 to 1.28). In the third year, Cohort 1 implemented 478 strategies compared to Cohort 2’s 252, but its average intensity score per strategy was lower (1.37 vs. 1.48, respectively). Cohort 1 was the only cohort involved in all 4 years of the study, implementing 454 strategies in year 4 with an average intensity of 1.38 per strategy. In summary, the large number of strategies implemented each year drove up Cohort 1’s average MCI intensity score.

Fig. 3 Total Number of Activities and Average Intensity Score per Activity

Unlike Cohort 1, Cohort 2 implemented fewer strategies, but those it implemented had a higher intensity per strategy. The number of Cohort 2 strategies did not quite double in year 2 compared to year 1 (n = 248 vs. 144), but the average intensity score per strategy was higher in the second year than in the first (from 1.27 to 1.48). The number of Cohort 3 strategies almost doubled in year 2 compared to year 1 (n = 211 vs. 110), but those strategies had the lowest average intensity scores in both year 1 (1.04) and year 2 (1.12) compared to the other cohorts’ year 1 and 2 scores.

PA and screen time behavior outcomes

Strategies were likely implemented prior to the launch of HSHC; however, they were not documented and were therefore assumed to be zero. The mean number of days per week students reported PA for at least 60 min increased from 4.4 days at baseline to 4.8 days at year 4 (Table 3). While Cohort 1 showed a 0.6-day increase from baseline to year 4, Cohorts 2 and 3 showed a 0.1-day increase from baseline to years 3 and 2, respectively. The mean number of screen time hours per weekday and weekend day remained relatively stable across all study years for all cohorts.

As indicated in Table 3, there were no significant differences at baseline across cohorts in mean number of days per week that students reported PA for at least 60 min per day (p = 0.1004), or number of screen time hours reported per school (p = 0.4743) or weekend day (p = 0.2711).

At baseline, the average number of days students reported engaging in PA for 60 min or more per day over the past 7 days was 4.4; Cohorts 1 and 3 reported fewer days (4.2) than Cohort 2 (4.7). The number increased for each cohort after its first year in HSHC (average 4.8 days compared to 4.4 days at baseline) and remained higher than baseline each year thereafter. The number of hours engaged in screen time on an average weekday increased slightly from baseline, with reported cohort averages ranging from 2.3 to 2.6 across all cohorts and survey years. Students reported engaging in more hours of screen time on a weekend day than on a weekday, with averages ranging from 2.9 to 3.1.

Relationship between MCI intensity scores and PA and screen time

Multivariate longitudinal regressions were applied to better understand the association between average MCI intensity score and PA and screen time, controlling for measurement period, cohort, and student enrollment size (Table 4). There was a statistically significant positive correlation between the average MCI intensity score and the number of days per week that students reported PA for at least 60 min per day. For each additional point increase in average MCI intensity score, the number of days per week students reported PA for at least 60 min increased by 0.010 days (p = 0.004). In other words, holding cohort, student enrollment size, and grant time constant, a modeled increase of 50 points in the average MCI intensity score (achievable through implementation of multiple MCI strategies within the same district) is associated with an average increase of 0.5 days per week of PA for at least 60 min.

Table 4 Summary of multiple longitudinal regression analysis, Missouri HSHC 2013–2017

There was also a statistically significant negative correlation between the average MCI intensity score and the number of hours per weekday students reported engaging in screen time. For each additional point increase in average MCI intensity score, the number of hours per weekday students reported engaging in screen time decreased by 0.006 h (p = 0.016). In other words, holding cohort, student enrollment size, and grant time constant, a modeled increase of 55 points in the average MCI intensity score (achievable through implementation of multiple MCI strategies within the same district) is associated with an average decrease of 20 min of screen time per weekday. Even though the average number of weekday hours spent on screen time increased slightly over time, the relationship between higher intensity scores and lower screen time was significant. There was also a negative correlation between weekend screen time hours per day and intensity score, but it was not significant (p = 0.098).
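The arithmetic behind these practical interpretations is straightforward (a worked check using the coefficients reported above; variable names are illustrative):

```python
# Regression coefficients reported in the text.
beta_pa_days = 0.010     # days/week of >=60-min PA per MCI intensity point
beta_screen_hr = -0.006  # weekday screen-time hours per MCI intensity point

print(50 * beta_pa_days)         # 0.5 -> about half a day more of PA per week
print(55 * beta_screen_hr * 60)  # -19.8 -> roughly 20 fewer minutes of weekday screen time
```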

Discussion

The comprehensive nature of MCIs, with multi-purpose PA strategies, shows promise in addressing childhood obesity; yet such initiatives are extremely challenging to evaluate. Given the timing of the initiative (i.e., its first 4 years) and the challenges of detecting changes in long-term outcomes such as obesity, the purpose of this study was to use intensity scoring to assess whether higher-intensity MCIs implemented as part of a statewide initiative were associated with improved PA and reduced screen time among youth (dependent variables). Similar to previous research [25, 26, 29], we found a statistically significant relationship between higher MCI intensity scores and increased PA and decreased screen time. Specifically, children living in a community with a higher average MCI intensity score had increased PA and decreased screen time. These findings are consistent with Pate et al. [38], who found that a higher intensity scoring index for PA strategies was positively associated with non-Hispanic white children’s PA.

From a practical standpoint, an increase of 50 points was associated with an average 0.5-day increase in the number of days per week physically active, and an increase of 55 points was associated with an average decrease of 20 min of screen time per weekday. Yet strategies contribute differently (between 0.3 and 3.0 points) to the overall MCI intensity score, so the effort needed to reach 50 to 55 points and see these results can also vary. The average contribution of each intervention across all districts and years was a medium score of 1.33 points (on a scale where the lowest possible score is 0.3 and the highest is 3.0). Based on our findings, an MCI would need to include roughly 38 medium-scoring strategies to increase 60 min of daily physical activity by 0.5 days per week, and roughly 42 medium-scoring strategies to decrease screen time by 20 min per weekday. Higher MCI intensity scores (> 50 points) can be achieved by implementing 1) strategies that modify access or change broader conditions and reach more people over longer periods of time (e.g., policy/environmental changes), 2) a greater number of strategies that increase knowledge or enhance skills but reach fewer people for shorter periods of time, or 3) a combination of both. Given that resources are often limited, the strategies within any given MCI should be chosen carefully to maximize the likelihood of long-term impact on population-level health.
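The strategy counts above follow directly from dividing the target point increase by the average per-strategy score (a worked check under the figures reported in this paper):

```python
import math

AVG_STRATEGY_SCORE = 1.33  # average intensity contribution per strategy, all districts/years

def medium_strategies_needed(target_points: float) -> int:
    """Approximate number of average-scoring strategies needed to add target_points to an MCI score."""
    return math.ceil(target_points / AVG_STRATEGY_SCORE)

print(medium_strategies_needed(50))  # 38 -> associated with +0.5 PA days/week
print(medium_strategies_needed(55))  # 42 -> associated with -20 min weekday screen time
```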

In a review of three MCI case studies, Mikkelsen et al. [39] found that using the full range of strategies is key to successful implementation, and that efforts to increase knowledge, enhance services, modify access, and change broader conditions should all be included. The HSHC MCIs included a variety of educational events/programs as well as policy, practice, and environmental changes, but there were over three times more programs and events than policy, practice, and environmental changes (1703 vs. 471, respectively). Although programs and events are important, over half of the HSHC strategies scored low in purpose. Based on our findings, a more even distribution of strategies, with fewer low-scoring and more medium-to-high-scoring strategies, would have led to higher overall MCI intensity scores and likely greater improvements in physical activity and screen time outcomes. Furthermore, while the goal of HSHC was to reduce childhood obesity in targeted communities and the action plans were informed by evidence-based guides, this study did not assess whether strategies within the MCI reinforced each other. Future evaluations of MCIs should look not only at the intensity of the strategies but also at whether they are synchronized and reinforcing.

Practitioners implementing MCIs all face the challenge of demonstrating impact on population-level health behaviors and outcomes while also remaining open to participatory approaches and co-creation with local stakeholders. Mikkelsen et al. [39] found that, in general, program evaluators focus on overall effects. Yet, unlike a controlled setting where all possible confounders can be eliminated, MCIs are unpredictable in timing and scope, and populations are exposed (or not) to a potentially causal factor or factors (e.g., a new trail, a policy to ensure children have recess). Moreover, rigorous evaluation designs are often not feasible for local practitioners to conduct. As such, research suggests the importance of gathering and reporting evaluative data through methods that are suitable and acceptable in terms of scientific standards and support a project with timely feedback [40], while also ensuring that they are low-cost and easy to administer [39]. We found that scoring evidence-informed attributes of all strategies within an MCI and calculating an intensity score addresses these factors. Assessing the intensity of strategies can demonstrate progress toward reaching long-term goals, which can take years to be realized, and provide timely feedback to guide program improvements or redirect resources (e.g., to increase the intensity of a strategy or strategies).

Within HSHC, over half of the strategies aimed to increase knowledge or enhance skills, 49.2% happened only once, and 51.7% reached less than 5% of the total population. Findings such as these can help stakeholders and funders improve strategies within an MCI (e.g., implement them more frequently, work with partners to increase the reach of a program), or explain why a low-intensity strategy may be consuming substantial resources with little return in terms of likely long-term impact. Such adjustments may free up resources to implement fewer, more intense efforts that are more likely to lead to improved outcomes. Future evaluations should consider the use of intensity scoring both for assessing and redirecting action plans and for understanding the impact of an MCI on youth physical activity and sedentary behavior.

This study is not without limitations. First, data were analyzed at the school-district level and were limited by the small overall sample of districts and the number of districts per year. Different types of interventions have varying levels of impact, and a district-level MCI intensity score summed across all strategies cannot differentiate between intervention types or combinations. Second, student behaviors were self-reported, and reading comprehension levels vary between 5th and 8th graders. We took steps to ensure the appropriateness of the survey: it was piloted with 5th and 6th graders, and the survey questions were adapted from valid and reliable sources. Each question averaged a 6th grade reading level and was deemed appropriate. While we could have used more age-appropriate versions for each grade, we felt it was more important to reduce the potential for error from coordinators and/or teachers administering the wrong survey to the wrong students (e.g., the 8th grade version given to 5th graders). Additionally, using the same survey version made comparisons across grades over time more methodologically sound. Third, many previous studies show differences in the level of physical activity and sedentary behavior by age and sex. We controlled for grade, which we felt was reasonable given that there were no significant outliers by age per grade, but we did not control for sex given the number of controls already in our model; future studies should consider including this variable. Fourth, PA levels were based on a question asking about activity that “increased your heart rate and made you breathe hard for some of the time.” While participants might have underreported the time spent in moderate activity, we found no significant differences between self-reported PA and data collected via pedometers on 5th grade students (not reported here). Future research should consider including multiple sources of data.

Fifth, the data used to calculate intensity were self-reported by various school and community stakeholders. They may have been incomplete and/or subjective, and there were likely variances in reporting across individuals. The evaluation team did, however, take steps to ensure consistent, high-quality data collection by providing guidance and training in various formats (e.g., webinars, protocols, one-on-one technical assistance) and reviewing the data regularly. Upon review, the evaluation team followed up with HSHC coordinators for clarification as needed; it was, however, left to the HSHC coordinator to provide the final data. Sixth, HSHC’s primary goal was to reduce childhood obesity, and it focused on MCIs that included both physical activity and healthy eating strategies. The purpose of this investigation was to identify associations between intensity and PA/sedentary behavior, two more intermediate outcomes, rather than the longer-term outcome of obesity. Given this focus, we only considered the strategies that targeted physical activity-related behaviors (even if they also included healthy eating). It is possible that the healthy eating-only strategies, which were being implemented at the same time, may have confounded the results. Finally, the evaluation team was not involved in the enrollment of school districts and communities, which further challenged an evaluation design with a comparison/control group. The number of districts expanded to a total of 33 by 2017 but covered most of the same communities as in the first grant year. Children attending schools enrolled into HSHC later may have been exposed to community-level interventions implemented before their school was on-boarded, which may have influenced their behaviors and, consequently, the baseline data. Moreover, the time during which any child could have been exposed to a strategy may have varied for similar reasons. It is difficult to avoid confounding factors in an evaluation such as this, where the evaluators have no control over enrollment and implementation and where funding is limited. Nevertheless, the limitations listed above are not unique to MCIs being implemented and evaluated in “real time.”

Regardless of these limitations, this study adds to the growing literature on MCIs addressing physical activity and screen time behavior across multiple settings in various communities, with stakeholders facilitating and co-constructing the strategies. The findings expand upon others’ efforts to identify a realistic and cost-effective way to scientifically evaluate these complex MCIs. While additional research is warranted, practitioners implementing MCIs, especially those with limited resources (and limited access to data on population-level behaviors), may consider intensity scoring as a realistic and cost-effective way to assess their initiatives. At a minimum, the use of intensity scoring as an evaluation method can provide justification for, or against, the inclusion of an individual strategy in an MCI, as well as identify ways to increase the likelihood of the MCI impacting population-health outcomes.

Conclusions

MCIs are difficult to evaluate given the varied contexts in which they are implemented. Findings from this study suggest the value of a systematic scoring approach in assessing MCIs aimed at addressing physical activity and screen time behavior. In addition to providing a scientific way to evaluate complex initiatives, intensity scores can provide justification for, or against, an individual strategy and identify ways to increase the likelihood of the MCI impacting population-health outcomes.

Availability of data and materials

The authors do not wish to make the dataset available for two reasons: 1) it is the intellectual property of the authors, and 2) the data were collected at the school level. Some of the schools were in rural areas, and it is important that the schools (and students) remain anonymous.

Abbreviations

B/A PA: PA before and after school
CDC: Centers for Disease Control and Prevention
CHLI: Community Healthy Living Index
CSPAP: Comprehensive School Physical Activity Program
HSHC: Healthy Schools and Healthy Communities
JSI: John Snow, Inc.
MCI: Multi-component initiative
PA: Moderate-to-vigorous physical activity
U.S.: United States

References

  1. U.S. Department of Health and Human Services. Physical Activity Guidelines Advisory Committee Report, 2008. Hyattsville: U.S. Department of Health and Human Services; 2008. http://www.health.gov/paguidelines

  2. Tremblay MS, et al. Canadian 24-hour movement guidelines for children and youth: an integration of physical activity, sedentary behaviour, and sleep. Appl Physiol Nutr Metab. 2016;41(6 Suppl 3):S311–27.

  3. Janssen I, Leblanc AG. Systematic review of the health benefits of physical activity and fitness in school-aged children and youth. Int J Behav Nutr Phys Act. 2010;7:40.

  4. Tremblay MS, et al. Systematic review of sedentary behaviour and health indicators in school-aged children and youth. Int J Behav Nutr Phys Act. 2011;8:98.

  5. Tremblay MS, Willms JD. Is the Canadian childhood obesity epidemic related to physical inactivity? Int J Obes Relat Metab Disord. 2003;27(9):1100–5.

  6. Colley RC, et al. Physical activity of Canadian children and youth: accelerometer results from the 2007 to 2009 Canadian health measures survey. Health Rep. 2011;22(1):15–23.

  7. Sallis JF, Owen N. Physical Activity and Behavioral Medicine. Thousand Oaks: Sage Publications; 1999.

  8. Dishman R. Increasing and maintaining exercise and physical activity. Behav Ther. 1991;22:345–78.

  9. McLeroy KR, et al. An ecological perspective on health promotion programs. Health Educ Q. 1988;15(4):351–77.

  10. Economos CD, et al. A community intervention reduces BMI z-score in children: shape up Somerville first year results. Obesity (Silver Spring). 2007;15(5):1325–36.

  11. Taylor RW, et al. APPLE project: 2-y findings of a community-based obesity prevention program in primary school age children. Am J Clin Nutr. 2007;86(3):735–42.

  12. Sanigorski AM, et al. Reducing unhealthy weight gain in children through community capacity-building: results of a quasi-experimental intervention program, be active eat well. Int J Obes. 2008;32(7):1060–7.

  13. Romon M, et al. Downward trends in the prevalence of childhood overweight in the setting of 12-year school- and community-based programmes. Public Health Nutr. 2009;12(10):1735–42.

  14. American Dietetic Association. Position of the American Dietetic Association: individual-, family-, school-, and community-based interventions for pediatric overweight. J Am Diet Assoc. 2006;106(6):925–45.

  15. Arteaga SS, et al. The healthy communities study: its rationale, aims, and approach. Am J Prev Med. 2015;49(4):615–23.

  16. Chomitz VR, et al. Healthy living Cambridge kids: a community-based participatory effort to promote healthy weight and fitness. Obesity (Silver Spring). 2010;18(Suppl 1):S45–53.

  17. Phillips MM, et al. The evaluation of Arkansas act 1220 of 2003 to reduce childhood obesity: conceptualization, design, and special challenges. Am J Community Psychol. 2013;51(1–2):289–98.

  18. Hunter CM, McKinnon RA, Esposito L. News from the NIH: research to evaluate "natural experiments" related to obesity and diabetes. Transl Behav Med. 2014;4(2):127–9.

  19. Verheijden MW, Kok FJ. Public health impact of community-based nutrition and lifestyle interventions. Eur J Clin Nutr. 2005;59(Suppl 1):S66–75 discussion S76.

  20. Collie-Akers VL, Fawcett SB, Schultz JA. Measuring progress of collaborative action in a community health effort. Rev Panam Salud Publica. 2013;34(6):422–8.

  21. Samuels SE, et al. The California Endowment's healthy eating, active communities program: a midpoint review. Am J Public Health. 2010;100(11):2114–23.

  22. Economos CD, Curtatone JA. Shaping up Somerville: a community initiative in Massachusetts. Prev Med. 2009;50(Suppl 1):S97–8.

  23. Komro KA, et al. Research design issues for evaluating complex multicomponent interventions in neighborhoods and communities. Transl Behav Med. 2016;6(1):153–9.

  24. Holston D, et al. Implementing policy, systems, and environmental change through community coalitions and extension partnerships to address obesity in rural Louisiana. Prev Chronic Dis. 2020;17:E18.

  25. Compernolle S, et al. A RE-AIM evaluation of evidence-based multi-level interventions to improve obesity-related behaviours in adults: a systematic review (the SPOTLIGHT project). Int J Behav Nutr Phys Act. 2014;11(1):147.

  26. Cheadle A, et al. Using the concept of “population dose” in planning and evaluating community-level obesity prevention initiatives. Am J Eval. 2013;34:71–84.

  27. Collie-Akers VL, et al. Measuring the intensity of community programs and policies for preventing childhood obesity in a diverse sample of US communities: the healthy communities study. Pediatr Obes. 2018;13(Suppl 1):56–63.

  28. Fawcett SB, et al. Measuring community programs and policies in the healthy communities study. Am J Prev Med. 2015;49(4):636–41.

  29. Harner LT, et al. Using population dose to evaluate community-level health initiatives. Am J Prev Med. 2018;54(5 Suppl 2):S117–23.

  30. Kuo ES, et al. Dose as a tool for planning and implementing community-based health strategies. Am J Prev Med. 2018;54(5 Suppl 2):S110–6.

  31. Cheadle A, et al. Kaiser Permanente's community health initiative in northern California: evaluation findings and lessons learned. Am J Health Promot. 2012;27(2):e59–68.

  32. Institute of Medicine. Preventing Childhood Obesity: Health in the Balance. Washington, DC: The National Academies Press; 2005.

  33. Institute of Medicine (IOM). Progress in Preventing Childhood Obesity: How Do We Measure Up? Washington DC: National Academies of sciences; 2006.

  34. Koplan J, Institute of Medicine (U.S.) Committee on Progress in Preventing Childhood Obesity. Progress in preventing childhood obesity: how do we measure up? Washington, D.C.: National Academies Press; 2007. p. 475.

  35. Alliance for a Healthier Generation, Healthy Schools Program Framework of Best Practices. 2016.

  36. YMCA, Community Healthy Living Index (CHLI). https://www.ymca.net/chli-tools.

  37. Prince SA, et al. Measurement of sedentary behaviour in population health surveys: a review and recommendations. PeerJ. 2017;5:e4130.

  38. Pate RR, et al. Associations between community programmes and policies and children's physical activity: the healthy communities study. Pediatr Obes. 2018;13(Suppl 1):72–81.

  39. Mikkelsen BE, Novotny R, Gittelsohn J. Multi-level, multi-component approaches to community based interventions for healthy living-a three case comparison. Int J Environ Res Public Health. 2016;13(10):1023.

  40. Patton MQ. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. 3rd ed. New York: The Guilford Press; 2011.

Acknowledgments

This study was supported by the Missouri Foundation for Health; however, the findings and conclusions in this article are those of the authors and do not necessarily represent the official position of the Missouri Foundation for Health. The authors wish to express appreciation to the Missouri Foundation for Health and GMMB for support with the survey development. No financial disclosures were reported by the authors of this paper.

Funding

The evaluation of HSHC was funded by the Missouri Foundation for Health; however, the evaluation and the preparation of the manuscript were the work of the authors. Missouri Foundation for Health staff were not involved.

Author information

Authors and Affiliations

Authors

Contributions

TVC provided lead direction on the MCI study, oversaw the intensity scoring, and led the development of the manuscript. NS analyzed the intensity scoring data and contributed to the overall development of the manuscript. LR assisted with the scoring and the overall development of the manuscript. AR assisted with the scoring and the overall development of the manuscript. CW assisted with the scoring and the overall development of the manuscript. AH provided analysis of the intensity scoring. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Tamara Vehige Calise.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the John Snow, Inc. IRB (OHRP IRB00009069). Per the IRB, the evaluation was deemed exempt and therefore parental consent was not required.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Table S1.

2174 MCI community (N = 982) and school/childcare (N = 1192) related strategies over time by cohort/school district.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Calise, T.V., Spitzer, N., Ruggiero, L. et al. Association between multi-component initiatives and physical activity-related behaviors: interim findings from the Healthy Schools Healthy Communities initiative. BMC Public Health 21, 340 (2021). https://doi.org/10.1186/s12889-021-10312-y

