4335.0: Tuesday, October 23, 2001 - 9:00 PM

Abstract #24842

Building a cross-site evaluation tool - 48 cooks with 47 recipes

Maya Y Mauch, MPH1, Rodolfo Vega, PhD1, Thomas Mangione, PhD2, Hortensia Amaro, PhD3, Anita Raj, PhD3, and Lucille Perez, MD4. (1) JSI Research and Training, 44 Farnsworth Street, Boston, MA 02210, 617-482-9485, mmauch@jsi.com, (2) Survey Research Group, JSI Research and Training Institute, Inc, 44 Farnsworth St, Boston, MA 02210, (3) Department of Social and Behavioral Sciences, Boston University School of Public Health, 715 Albany Street, T2W, Boston, MA 02118, (4) Medical and Clinical Affairs, Center for Substance Abuse Prevention, SAMHSA, 5600 Fishers Lane, Rockwall II, Rockville, MD 20857

This presentation will focus on the processes used by the Project Coordinating Center to build consensus around a scientifically valid evaluation design and measurement tool. We will identify processes that went well and point to elements that could be improved.

Three phases will be described. The first phase involved reaching consensus on a research design, determining the number of versions of the cross-site tool to create, and soliciting ideas for topics. In the second phase, topics were turned into questions and a draft sequence was prepared; suggestions for modifications were then solicited. In the third phase, the draft instrument was evaluated and pretested, and revisions were made. These steps are not dramatically different from ordinary instrument development. The key difference was the challenge of building consensus among 47 community-based programs with diverse interventions, expectations, organizational capacities, target populations, and evaluation experience. Some programs had never received a federal grant before. Many were familiar with service delivery but not with research protocols, and most were unfamiliar with the elements of a rigorous scientific evaluation, which required IRBs, comparison groups, and follow-up surveys.

Consensus on the instruments was obtained through participatory processes, small representative working groups, and external evaluation. Of great importance was the inclusion of consumer feedback, youth involvement, and local and national experts. Great care was also taken to develop instruments that were culturally appropriate, sensitive to gender and sexual identity, and youth-friendly. The effectiveness of these additional elements will be critically examined.

Learning Objectives: Participants will be able to: 1) Articulate strategies to engage community stakeholders in the development of evaluation tools; 2) Implement consensus-building strategies around evaluation activities with community stakeholders; 3) Identify challenges faced by community-based organizations in the implementation of a collaborative evaluation approach.

Awards: Honorable Mention

Presenting author's disclosure statement:
Organization/institution whose products or services will be discussed: Substance Abuse and Mental Health Services Administration/Center for Substance Abuse Prevention (SAMHSA/CSAP)
I do not have any significant financial interest/arrangement or affiliation with any organization/institution whose products or services are being discussed in this session.

The 129th Annual Meeting of APHA