Improving design rigor for participatory evaluation: Lessons from a multi-site nutrition intervention program (Abstract 243713)

Monday, October 31, 2011: 3:24 PM

Carolyn Kitzmann Rider, MA , Research & Evaluation Unit, Network for a Healthy California, California Department of Public Health, Sacramento, CA
Jennifer Gregson, MPH, PhD , Research & Evaluation Unit, Network for a Healthy California, California Department of Public Health, Sacramento, CA
Amanda Linares, MS , Research & Evaluation Unit, Network for a Healthy California, California Department of Public Health, Sacramento, CA
Sharon Sugerman, MS, RD , Research & Evaluation Unit, Network for a Healthy California, California Department of Public Health, Sacramento, CA
Andrew Fourney, DrPH , Formerly of Research & Evaluation Unit, Network for a Healthy California, California Department of Public Health, Sacramento, CA
Introduction: As part of the Network for a Healthy California (Network) social marketing nutrition education campaign, the 44 largest local programs evaluate their own interventions. The Network is funded through the United States Department of Agriculture's Supplemental Nutrition Assistance Program, and a major goal is increasing fruit and vegetable (FV) consumption. A key challenge is a shortage of proven, effective interventions, especially those appropriate for diverse populations; quality evaluation of Network interventions can therefore inform the nutrition education field about effective techniques. Since 2004, local programs have received evaluation capacity-building training within a participatory evaluation framework and are asked annually to increase the rigor of their evaluations.

Methods: Evaluation capacity-building tools were developed from 2004 to 2009 and augmented in 2010 with new administrative procedures and evaluation tools. Procedures included individual technical assistance on sample size, control groups, and random assignment. Evaluation tools included validated, standardized FV consumption surveys with modules that can be tailored to local evaluations, enhanced data analysis templates, and new meta-analysis techniques.

Results: In FFY 2010, local partners completed more rigorous evaluations of their programs, as measured by the use of adequate sample sizes and more complex designs. Specific program examples will be presented. Meta-analysis of aggregated data detected changes with greater sensitivity and demonstrated that, statewide, the local program delivery channel significantly changes behavior.

Discussion: Specific examples of practical tools that build evaluation capacity will be discussed. Local program evaluation benefited from a combination of autonomy and more structured evaluation guidance.

Learning Areas:
Administration, management, leadership
Conduct evaluation related to programs, research, and other areas of practice

Learning Objectives:
Identify standard evaluation elements created for participatory evaluation of multi-site interventions.
Demonstrate how administrative procedures can improve rigor in evaluation design.
Demonstrate how evaluation tools can improve rigor in evaluation design.

Keywords: Evaluation, Challenges and Opportunities

Presenting author's disclosure statement:

Qualified on the content I am responsible for because: I am qualified to present because I oversee programs such as statewide evaluation of local nutrition education programs and statewide surveillance of health behaviors.
Any relevant financial relationships? No

I agree to comply with the American Public Health Association Conflict of Interest and Commercial Support Guidelines, and to disclose to the participants any off-label or experimental uses of a commercial product or service discussed in my presentation.