Address for correspondence: Jini E. Puma, PhD, Rocky Mountain Prevention Research Center, Department of Community and Behavioral Health, Colorado School of Public Health, University of Colorado Denver, 13001 E 17th Ave, Campus Box B-119, Denver, CO 80045
Rocky Mountain Prevention Research Center, Department of Community and Behavioral Health, Colorado School of Public Health, University of Colorado Denver, Denver, CO
Objective
To understand Supplemental Nutrition Assistance Program-Education (SNAP-Ed) Implementing Agencies' (SIAs) use of the SNAP-Ed Evaluation Framework (Framework), a tool comprising 51 indicators that SNAP-Ed programs can use to measure their success, in the first 5 years after its release.
Methods
A repeated cross-sectional study design was utilized to administer electronic surveys to between 124 and 154 SIAs who received SNAP-Ed funding in fiscal years 2017, 2019, and 2021. Analyses included descriptive statistics and tests of proportions.
Results
Most SIAs indicated that they used the Framework to inform both data collection instruments and program planning decisions, and rates remained relatively constant (> 80%) over the 3 time points. The most common specific use of the Framework across all 3 time points was to define, count, or measure the work accomplished, but this use decreased significantly from 2017 (76%) to 2021 (57%) (z-score = 3.31; P < 0.001).
Conclusions and Implications
The results of this analysis confirmed that 5 years after its introduction, uptake and use of the Framework was high and that, as a whole, SIAs focused on priority indicators set by the US Department of Agriculture, with no notable increases in addressing and measuring longer-term, multisector, and population-wide outcomes. The systematic study of the Framework's usability over time has a broader application to other national health promotion initiatives with shared frameworks.
The Supplemental Nutrition Assistance Program-Education (SNAP-Ed) is the nutrition education and promotion arm of the US Department of Agriculture's (USDA) Supplemental Nutrition Assistance Program (SNAP). Overseen by the USDA Food and Nutrition Service, SNAP-Ed is the nation's largest federal nutrition education effort. Supplemental Nutrition Assistance Program-Education aims to help individuals and families experiencing poverty align household dietary choices with current Dietary Guidelines for Americans
by promoting healthy eating, physical activity, and obesity prevention. Supplemental Nutrition Assistance Program-Education develops and implements evidence-based programming in locations that reach many low-income people.
The SNAP-Ed programs overlay complementary direct education programming, social marketing campaigns, and policy, system, and environmental interventions, emphasizing multisector initiatives to leverage resources and achieve larger-scale impact.
The USDA Food and Nutrition Service funds SNAP State Agencies in all 50 states, the District of Columbia, and the Territory of Guam. In 2021, State Agencies contracted with more than 160 diverse SNAP-Ed Implementing Agencies (SIAs) to provide comprehensive SNAP-Ed programs in each state. SIAs include Cooperative Extension and other university programs, nonprofits, public health agencies, state departments, and tribal organizations. Many SIAs subcontract with other organizations to provide local and specialized services. The SIAs engage with community partners to reach low-income people where they live, work, shop, learn, eat, and play. Each year from 2017 through 2019, more than 4 million people participated in direct education classes, and more than 500 additional environmental and social marketing interventions with more than 30,000 partner organizations were conducted in low-resource settings.
In 2020, the most recent year for which data are available, direct education dropped about one-third, and partners dropped about 10% because of the coronavirus disease 2019 (COVID-19) pandemic; however, the number of multicomponent interventions increased.
The program, however, has been challenged to report comprehensively on longer-term and larger-scale impacts.
Because of the large scope of SNAP-Ed, state and federal partners identified the need for a set of unifying outcomes tailored to low-resource settings that would capture the work and potentially allow the results of SNAP-Ed partners to be compiled across the country. Beginning in 2013, work began to develop a comprehensive framework with a companion interpretive guide to help SIAs measure and track changes resulting from program efforts. In 2017, the framework was adopted by USDA for national use.
The Framework incorporates multiple levels of influence along with reach, effectiveness, adoption, implementation, and maintenance elements. It is a tool with which all SNAP-Ed programs can select and use 51 indicators to measure the program's success and effectiveness. The Framework and Interpretive Guide provide details, background, research, and measurement instructions for each indicator.
The Framework's 51 indicators can be used to measure change at 4 levels of influence (individual, environmental settings, sectors of influence, and population results) and over different time frames (short-, medium-, and longer-term), and the Framework emphasizes the importance of reach, effectiveness, adoption, implementation, and maintenance (Figure 1). For example, the individual level of the Framework represents the foundation of SNAP-Ed: individual, group, and family nutrition education, physical activity promotion, and related interventions. Short-term indicators illustrate goals and intentions that motivate or demonstrate readiness for behavior change but fall short of action. Medium-term indicators represent immediate outcomes following program completion, and long-term indicators are measured at a minimum of 6 months postintervention. The Framework's menu of options is flexible and intended to capture outcomes across the wide variety of topic areas, geographies, and organizational settings in which SNAP-Ed is implemented.
Figure 1SNAP-Ed Evaluation Framework: Nutrition, physical activity, and obesity prevention indicators. SNAP-Ed indicates Supplemental Nutrition Assistance Program Education. Source: US Department of Agriculture, Food and Nutrition Service.
The Framework is designed to provide evaluation metrics that SIAs can choose on the basis of their priorities, skills, and resources, within a collective structure for measuring impacts on public health outcomes. Development and uses of the Framework, as well as first-year uptake, have been reported elsewhere.
To our knowledge, this is the only nationally adopted, comprehensive, multilevel framework being used for planning and evaluation in any federal nutrition, physical activity, and/or dietary change program; it exhibits components of all the dietary quality, physical activity, and behavior change frameworks, as well as aspects of the public health promotion frameworks, identified in a scoping review of evaluation frameworks by Fynn and colleagues.
Their scoping review resulted in a classification of evaluation framework types, and our study examines the adoption of a comprehensive evaluation framework among practitioners.
A census of SIAs was first conducted in 2017 to understand the use of the Framework and to provide insights about barriers to its use and gaps in metrics. This census was conducted by the Association of SNAP Nutrition Education Administrators (ASNNA) and has been repeated twice since the original collection effort (in 2019 and 2021). Continually reevaluating the work of the field is an important part of assessing adherence to a practitioner framework and is needed to identify areas for refinement of its metrics.
The main objectives of the ongoing census efforts are to better understand: (1) how SIAs are using the Framework and how this use has changed over time, (2) what barriers to evaluating indicators in the Framework exist and what evaluation-related technical assistance needs remain, and (3) how COVID-19 has affected SIAs' intent to affect and evaluate the indicators of the Framework.
METHODS
Instrument
Core survey content and administration of the census were kept consistent for all 3 years. The survey was organized by the 4 levels of influence used in the Framework: individual, environmental settings, sectors of influence, and population results. The SIAs were asked about their intent to affect and evaluate each of the 51 indicators in exactly the same way across all years. For example, SIAs were asked: Is your program intending to affect the ST1: Healthy Eating indicator? (response options: yes, no, I don't know). If yes, are you evaluating the ST1: Healthy Eating indicator in any way? (response options: yes, no, I don't know). Definitions of each indicator from the Interpretive Guide were provided to respondents.
Depending on the year of inquiry, additional questions were included that pertained to SIAs' characteristics, barriers to use, training and technical needs related to program planning and evaluation, and the impact the COVID-19 pandemic had on program planning and evaluation. The additional survey questions were the only survey modifications made across the years. For example, in the 2017 baseline survey, the closed-ended barrier questions were broad. In the later surveys, barriers more specific to each level of influence were asked. In 2021, 2 closed-ended questions were added to the survey asking if COVID-19 had affected SIAs' intent to affect and evaluate indicators in the Framework. If SIAs responded yes to either of these questions, a follow-up open-ended question asked them to describe how. There were 115 questions in 2017, 119 in 2019, and 138 in 2021. The census instrument was pilot-tested before the administration in 2017. Any new questions added in subsequent years were pilot-tested before administration to SIAs.
Sample and Data Collection
Each year, the respondents were the program directors of the SIA or designees (n = 135 in 2017; n = 142 in 2019, and n = 164 in 2021). Depending on the year of administration, census respondents were asked to use their federal fiscal year 2017, 2019, or 2021 SNAP-Ed plan and its evaluation activities to inform their responses. Each SIA was asked to report only on its activities and those of its subrecipients and contractors, so only 1 survey was completed by each SIA in a state.
Review by the Institutional Review Board was not required for this study because human subjects were not involved, as per US Department of Health and Human Services guidelines (http://www.hhs.gov/ohrp/policy/checklists/decisioncharts.html#c1). The data collection methods were the same for all 3 years.
The ASNNA census workgroup identified the directors of each SIA by contacting the USDA Regional Offices and SNAP State Agencies and by cold-calling new SIAs. The ASNNA listserv was used to encourage response. The survey was administered through a Research Electronic Data Capture database link sent between October and December of each year. The data collection protocol followed the Tailored Design Method
and included an advance-notice e-mail, a survey link e-mailed 1 week after the notice, and up to 3 follow-up e-mails (1 per week) with the survey link to nonresponders. After 4 e-mail attempts, nonresponders were contacted personally by ASNNA colleagues serving on the census workgroup to encourage their participation. Participants were not paid to complete the survey at any time point.
Data Analysis
All quantitative data were exported from Research Electronic Data Capture,
a secure web platform for building and managing online databases and surveys, into Microsoft Excel or SPSS (version 26, IBM, 2019) for analysis. Basic descriptive statistics (ie, frequencies, means, and SDs) were computed. An "I don't know" response was treated as missing data. There was a small amount of missing data for individual items across the years of administration (< 5%), so missing data were handled using listwise deletion when univariate analyses were run. To assess whether the intent to affect or evaluate the indicators in the Framework differed across years, 2-sample tests of proportions were conducted using z-scores. Proportions were computed only for yes responses to the questions about intent to affect an indicator and intent to evaluate it. A more stringent alpha (α = 0.01) was selected to minimize the possibility of making a type I error given the multiple tests that were run.
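The pooled 2-sample test of proportions described here can be sketched in a few lines. The function below is an illustrative implementation, not the authors' analysis code; it uses the rounded percentages reported in the Results to reproduce one of the reported z-scores:

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Pooled z-statistic for a 2-sample test of proportions."""
    # Pool the two proportions under the null hypothesis p1 == p2
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Example with rounded figures from the Results: use of the Framework
# "to define, count, or measure" work fell from 76% (n = 124 in 2017)
# to 57% (n = 154 in 2021).
z = two_proportion_z(0.76, 124, 0.57, 154)
print(round(z, 2))  # 3.31, matching the reported z-score
```

At the study's two-sided α = 0.01, |z| must exceed roughly 2.58 to reach significance, which is why values such as 2.09 in Table 2 are not flagged as significant.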
The qualitative COVID-19 data were analyzed by a trained evaluator using rapid qualitative analysis, a systematic method to organize and inventory data.
Matrix analyses, which involve summarizing and analyzing qualitative data in a table of rows and columns, were used to identify patterns and themes in the data.
The resulting matrices were then reviewed and confirmed by the lead author.
RESULTS
Sample
Of the total population of SIAs targeted in each year (n = 135 in 2017, n = 142 in 2019, and n = 164 in 2021), the survey response rates for each census were high: 91% in 2017, 90% in 2019, and 94% in 2021. SIAs in all 50 states, the District of Columbia, and Guam were represented each year, except that in 2019, Guam and Idaho did not participate, and in 2021, West Virginia did not participate. Although total federal funding was flat except for increases tied to the Consumer Price Index,
the number of SIAs increased by nearly 25% over 5 years. The characteristics of our sample across the years are included in Table 1. The proportion of SIAs from the state/local government sector grew, nonprofits remained constant, and Cooperative Extensions and other universities dropped slightly. There was a slight drop in the percentage of Indian Tribal Organizations over the years, but this reflects nonresponse rather than fewer tribal-serving organizations being funded by SNAP-Ed. In 2021, of the nearly 80% of SIAs that knew when their agency was first funded, 18% were relatively new to SNAP-Ed, having participated for 3 or fewer years. Approximately a quarter of SIAs had an external evaluator across the years.
Table 1. Characteristics of Responding SNAP-Ed Implementing Agencies, 2017–2021

                                          Percentage of Respondents (%)
Agency Characteristic                   2017 (n = 124)  2019 (n = 129)  2021 (n = 154)
Agency type
  University, Cooperative Extension           41              41              35
  University, other                           13               9              11
  Nonprofit organization                      23              23              23
  State or local government                   15              20              26
  Indian tribal organization                   6               5               3
  Other                                        2               1               2
Year received initial funding
  Peer education model (≤ 1996)               24              23              19
  Networks, marketing (1997–2010)             37              22              22
  Comprehensive (2011–2017)                   23              28              20
  Framework (2018–2021)                        –               5              18
  Unknown                                     16              23              22
Has an external evaluation contractor         22              25              23

SNAP-Ed indicates Supplemental Nutrition Assistance Program Education.
Use of the Framework and Interpretive Guide Over Time
Most SIAs indicated that they used the Framework and Interpretive Guide to inform data collection instruments and program planning decisions. Table 2 shows that rates remained relatively constant over the 3 time points (data collection: 2017 = 88%, 2019 = 86%, and 2021 = 94%; program planning: 2017 = 90%, 2019 = 89%, and 2021 = 89%). No differences were statistically significant, indicating that use of the Framework for data collection and program planning was consistent across the years sampled.
Table 2. Uses of the SNAP-Ed Evaluation Framework and Interpretive Guide Reported by SIAs, 2017–2021

                                Percentage of Respondents (%)                 Difference in Test of Proportions: Z-Score
Uses                          2017 (n = 124)  2019 (n = 129)  2021 (n = 154)   2017–2019   2019–2021   2017–2021
Data collection instruments         88              86              94           −0.48        2.09         1.59
Program planning decisions          90              89              89           −0.30        0.13        −0.19

SIAs indicates Supplemental Nutrition Assistance Program Education Implementing Agencies; SNAP-Ed, Supplemental Nutrition Assistance Program Education.
Note: Analysis performed was a 2-sample test of proportions. Results were not statistically significant (α = 0.01).
Figure 2 shows the percentage of SIAs who reported using the Framework and Interpretive Guide for specific program planning uses. The most common program planning use of the Framework and Interpretive Guide in all years was to define, count, or measure the work accomplished (2017 = 76%, 2019 = 69%, 2021 = 57%). Responses on the census survey over time revealed that 3 of the 5 specific uses of the Framework decreased significantly over time: (1) to define, count, or measure the work accomplished (2017 [76%] to 2021 [57%]; z-score = 3.31; P < 0.001), (2) to report results that can be used nationally (2017 [61%] to 2021 [41%]; z-score = 3.31; P < 0.001), and (3) to showcase the larger mission and scope of SNAP-Ed (2017 [58%] to 2021 [40%]; z-score = 2.99; P < 0.01).
Figure 2Specific uses of the SNAP-Ed Evaluation Framework and Interpretive Guide to inform program planning by SIAs from 2017–2021. SIAs indicates Supplemental Nutrition Assistance Program Education Implementing Agencies; SNAP-Ed, Supplemental Nutrition Assistance Program Education. Note: These were questions on the census survey. The analysis performed was a 2-sample test of proportions. ⁎⁎P <0.01; ⁎⁎⁎P <0.001.
Figure 3 shows the percentage of SIAs that indicated their intent to affect each indicator. The results in 2019 and 2021 mirror those found at baseline in 2017: more SIAs reported intending to affect indicators at the individual and environmental levels than at the sectors of influence and population levels of the Framework, and more SIAs intended to affect and evaluate short- or medium-term indicators than long-term indicators. The average number of the Framework's 51 indicators that SIAs intended to affect and evaluate remained similar across the years: SIAs intended to affect 19 indicators and evaluate 12 in 2017, to affect 20 and evaluate 12 in 2019, and to affect 18 and evaluate 13 in 2021.
Figure 3Percentage of SIAs that intend to impact each indicator in 2017, 2019, and 2021. LT indicates long-term; MT, medium-term; R, results at the population level; SIA, Supplemental Nutrition Assistance Program Education Implementing Agencies; ST, short-term. *Indicates a US Department of Agriculture priority indicator.
In each level of the Framework for the 3 census years, the 7 indicators designated as priorities by USDA were addressed by the greatest number of SIAs. These priority indicators included the following: (1) individual-level indicators (MT1: healthy eating [range, 91% to 95%]; MT2: food resource management [range, 73% to 79%]; MT3: physical activity and reduced sedentary behavior [range, 75% to 79%]), (2) environmental settings indicators (ST7: organizational partnerships [range, 74% to 81%]; MT5: nutrition supports [range, 71% to 86%]), (3) the sectors of influence indicator (ST8: multisector partnerships and planning [range, 49% to 60%]), and (4) the population results indicator (R2: fruits and vegetables [range, 45% to 50%]).
Table 3 shows that, for SIAs as a group, only 1 of the 51 indicators showed a significant change in intent to affect across the time points: a decrease for MT5: nutrition supports. In 2017, 86% of SIAs intended to affect this indicator; in 2021, only 71% did (z-score = −2.98; P < 0.01). This medium-term indicator at the environmental settings level refers to the number and estimated reach of sites/organizations that report adopting new policy, system, and environment changes and complementary promotion/marketing that make it easier to eat healthily and are known to be effective in improving food choices. It is also a USDA priority indicator.
Table 3Differences in Percentages of SIAs Intending to Affect and Evaluate the Indicators of the Framework Over Time
MT indicates medium-term; MT5, Number and estimated reach of sites/organizations that report adopting new policy, system, and environment changes and complementary promotion/marketing that make it easier to eat healthily and are known to be effective in improving food choices; SIAs, Supplemental Nutrition Assistance Program Education Implementing Agencies; ST, short-term; ST4, Intent to use food safety behaviors recommended by MyPlate principles.
There was also only 1 of the 51 indicators with significant changes in intent to evaluate across the time points, and that was for ST4: food safety. In 2019, 20% of SIAs intended to evaluate this indicator, and in 2021, 36% of SIAs intended to do so (z-score = 2.99; P < 0.01). This is an individual-level, short-term indicator that relates to the evaluation of the use of food safety behaviors recommended by MyPlate principles.
Barriers to Evaluating Indicators in the Framework and Evaluation-Related Technical Assistance Needs Over Time
Barriers to evaluating indicators in the Framework were not assessed in 2017 but were assessed, by level of influence, in 2019 and 2021. Cross-cutting evaluation barriers that SIAs cited across levels of influence and census years included: not enough evaluation staff time/personnel (range, 14% to 49%); lack of outside funds to pay respondents or for comparison studies (range, 8% to 29%); general budget constraints (range, 10% to 31%); difficulty choosing or using evaluation instruments for some indicators (range, 4% to 23%); lack of training/expertise in evaluation (range, 7% to 26%); and some indicators not being a priority for the program/state/stakeholders (range, 7% to 22%). Unique evaluation barriers for indicators in the Sectors of Influence and Population Results sections included: secondary data sources are unavailable (range, 4% to 11%) and outcomes cannot be linked to SNAP-Ed programming (range, 4% to 11%). Some barriers were only assessed in 2021. The most frequently endorsed of these new barriers was that a 1-year/annual time frame is too short to measure meaningful change (range, 12% to 20%), followed by high respondent burden (range, 6% to 17%) and limited evaluation instruments/tools (range, 7% to 16%).
Figure 4 shows the technical assistance (TA) needs in evaluating and reporting over the years. Across all years, SIAs indicated that their biggest need was help choosing or using evaluation instruments for some indicators (2017, 57%; 2019, 48%; 2021, 34%). However, this need significantly decreased over time (2017 to 2021: z = −4.22; P < 0.001). In 2017, the next highest needs were for identifying or using reporting systems (42%) and aligning existing programming activities with the Framework (40%), but by 2021, these dropped to 16% and 22%, respectively, both statistically significant changes (z = −4.79; P < 0.001 and z = 3.58; P < 0.001, respectively). In 2017 and 2019, there was a similar need for creating final reports and communicating the results (although this question changed slightly from 2017 to 2019) (37% and 35%, respectively), but this decreased significantly in 2021 to 22% (2017 to 2021: z = −3.15; P < 0.01). In 2019, the question regarding TA needs related to creating SNAP-Ed plans was amended to include multiyear evaluation plans, and the need increased from 11% to 36% (z = 3.89; P < 0.001) but then declined in 2021 to 22% (z = 2.82; P < 0.01). Two new TA questions were introduced in 2019 related to methodological issues (sampling, data analysis, etc) and conducting the SNAP-Ed annual/multiyear needs assessment. In 2019, just more than a third of respondents (36%) indicated TA needs related to methodological issues, and 23% indicated needs related to conducting annual/multiyear needs assessments. In 2021, the need related to methodological issues decreased to 24%, whereas the need related to conducting the SNAP-Ed annual/multiyear needs assessment remained at about a quarter of the sample (25%). Neither of these changes was statistically significant.
Figure 4Technical assistance needs related to SNAP-Ed evaluation and reporting reported by SIAs over time. SIAs indicates Supplemental Nutrition Assistance Program Education Implementing Agencies; SNAP-Ed, Supplemental Nutrition Assistance Program Education; wm, wording of the question was slightly modified from the first year of administration. The analysis performed was a 2-sample test of proportions. ⁎⁎P < 0.01; ⁎⁎⁎P < 0.001.
The 2021 federal fiscal year began on October 1, 2020, so the 2021 state plans used for the 2021 census had been prepared in the summer of 2020 and reflected changes that SIAs expected on the basis of experience since the COVID-19 outbreak began early in 2020. Most SIAs (60%) indicated that the pandemic would affect their intent to affect some indicators in the Framework. The predominant theme that emerged when SIAs described how was reductions in in-person direct education or shifts to virtual delivery. Fewer SIAs reported impacts on partnerships, collaborations, engagement opportunities, changes in priorities or scope of work, and evaluation challenges.
DISCUSSION
There is little research on the usability and applicability of evaluation frameworks in nutrition and public health.
This study focused on 3 objectives: (1) examining SIAs' use of the Framework and how it has changed over time, (2) identifying barriers to evaluating indicators in the Framework and evaluation-related TA needs, and (3) assessing the influence of COVID-19 on affecting and evaluating indicators in the Framework.
There were no major changes in the use of the Framework over time. The results of this analysis confirmed that 5 years after its introduction, uptake and use of the Framework and its Interpretive Guide
for data collection and program planning was high, exceeding 80% of SIAs across all years (Table 2). The most common program planning use of the Framework and Interpretive Guide in all years was to define, count, or measure the work SIAs were hoping to accomplish (2017, 76%; 2019, 69%; 2021, 57%), but this use decreased over time. This statistically significant decrease likely reflects increased familiarity with the Framework and its indicators. The percentage of SIAs that intended to affect and evaluate the indicators of the Framework over the 3 years remained stable, with only 2 of the 51 indicators (nutrition supports and food safety) showing any meaningful changes. As a whole, SIAs focused on priority indicators set by the USDA, with no notable increases in addressing and measuring longer-term, multisector, and population-wide outcomes.
Three types of training and TA emerged as priorities across all Framework levels: choosing and using evaluation instruments, conducting needs assessments and planning, and compiling and communicating results. The high need for TA in planning and evaluation reported in 2017 was dramatically reduced in the years following the implementation of the Framework. However, intent to affect and evaluate longer-term and more distant outcomes remained lower than for medium-term and sectors of influence indicators across the 3 census years, suggesting that more support is needed in these areas. Potential influences on Framework uptake include unmodifiable factors, such as the misalignment between annual grant time frames and multiyear outcomes, combined with a reported lack of practical evaluation tools to detect sustained change in environmental and sectors of influence outcomes. In addition, the high respondent burden for SNAP-Ed participants and partners is an inherent challenge to rigorous evaluation and the attribution of outcomes to SNAP-Ed. Future TA efforts should focus on ways to reduce these challenges.
Most SIAs (60%) indicated that COVID-19 affected their intent to affect and evaluate indicators in the Framework. However, differences in intent to affect most indicators between 2019 and 2021 were not significant. Modifications to the delivery of in-person programming were the most frequently mentioned. The evidence base regarding the virtual delivery of nutrition education continues to emerge, demonstrating how SNAP-Ed programs were able to successfully pivot their efforts to maintain programming.
There are several limitations to this study. First, to examine the uptake of the national Framework and its indicators, these data focused on the intended evaluation scope rather than evaluation outcomes or methods. As such, they do not speak to actual SNAP-Ed program results or the degree of rigor in evaluation design; these topics are worthy of investigation but are outside the scope of the current study. Second, these data reflect activities at the implementing agency level and therefore do not describe the uptake of the Framework at the state level. As a result of these limitations, no inferences should be made about program outcomes or Framework uptake by states. Finally, many statistical tests were run, increasing the chance of making a type I error. Although a more stringent alpha was set to help prevent this, some statistically significant findings may be due to chance.
IMPLICATIONS FOR RESEARCH AND PRACTICE
The SNAP-Ed program and its guiding documents must stay current with the evidence base and emerging federal and community priorities while responding to the dynamic contexts impacting Americans eligible for SNAP-Ed.
Three implications for research, policy, and practice that are applicable for SNAP-Ed, as well as other nutrition and public health programs, are as follows:
Continually Update Framework Indicators, Metrics, and Guiding Documents
As the context for programming shifts and expands, so must the priorities and objectives of the Framework. For example, incorporating an equity lens would help ensure that SNAP-Ed opportunities and intended outcomes are equitable across the communities and populations eligible for programming. It is important to incorporate community voice and ownership so that SNAP-Ed indicators are meaningful at the local level and to states and so that work can be done collaboratively using community-driven processes. In addition, the priority indicators include only a few longer-term, multisector, and population outcomes. Based on the results of this study, more SIAs are likely to report affecting and evaluating these indicators if future program guidance prioritizes them. This may require TA for SIAs not currently engaged with these indicators, whether in program planning, implementation, or evaluation.
To support real-world applicability, with the inherent flexibility to address emergent crises like a worldwide pandemic, it is important that all updates to the Framework be “empirically robust and practically meaningful”
and involve a wide range of practitioners, scientists, funders, and, most importantly, those served by SNAP-Ed. Because of the variability of practitioner resources and expertise, guiding documents must meaningfully support the use of the Framework and its indicators and be updated accordingly. As demonstrated by the findings of this study, practitioners require a range of ongoing support and resources to address the dynamic challenges associated with program implementation and Framework application in low-resource settings while minimizing respondent burden.
Prioritize Research that Yields Evidence for Updating the Framework and Its Guidance
Funders and practitioners should work together to develop an action-oriented research agenda that focuses efforts, tests long-standing and emerging interventions, and disseminates findings to help scale up effective programming.
A comprehensive, purposive research agenda would yield evidence for updating the Framework and its guidance, assuring effective program theory, and maximizing funds available for program delivery.
Encourage the Systematic Study of Frameworks Designed to Influence Program Theory Uptake
The analysis of the Framework and its uptake has the potential to inform policymakers and practitioners working on SNAP-Ed, as well as similar programs with comparable public health goals and shared program theory frameworks. The biennial practitioner-driven evaluation informs gaps and areas for future development and provides ongoing data regarding the usability and applicability of a comprehensive public health evaluation framework. Findings demonstrate the importance of ongoing user input, experience, and contextual feedback. This evaluation provides a replicable process that can be used by other nutrition and public health programs.
In summary, the Framework is designed to inform program planning and evaluation of SNAP-Ed programs. This study examined 1 aspect of the Framework's implementation: engagement with indicators in terms of intent to affect and evaluation. Examining this level of engagement is valuable because, ultimately, a robust evaluation framework is an essential foundation for demonstrating the impact of SNAP-Ed programming nationally.
ACKNOWLEDGMENTS
The authors wish to extend a sincere thank you to all the Supplemental Nutrition Assistance Program-Education State Implementing Agencies and Supplemental Nutrition Assistance Program State Agencies, without whom this project would not have been possible. We would also like to thank the Association of Supplemental Nutrition Assistance Program Nutrition Education Administrators for funding the open access fee, so this article could be freely available to all.
REFERENCES
US Department of Agriculture, US Department of Health and Human Services. Dietary Guidelines for Americans, 2020–2025. 9th ed. USDA, US Department of Health and Human Services; 2020.

Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis.

Association of SNAP Nutrition Education Administrators. Recommendations for Implementing the Nutrition Education and Obesity Prevention Grant Program (SNAP-Ed) Provisions of the 2018 Farm Bill: A Position Paper of the Association of SNAP Nutrition Education Administrators (ASNNA). ASNNA; 2020. https://asnna.us.org/wp-content/uploads/2021/09/2018-ASNNA-FB-PP-FINAL-FOR-WEB-02012021.pdf. Accessed October 1, 2021.