Evidence-Based Public Health: A Course in Chronic Disease Prevention MODULE 9: Evaluating the Program or Policy Ross Brownson, Anjali Deshpande, Darcy Scharff March 2013
Learning Objectives • Understand the basic components of program evaluation. • Describe the differences and unique contributions of quantitative and qualitative evaluation. • Understand the various types of evaluation designs useful in program evaluation. • Understand the concepts of measurement validity and reliability. • Understand some of the advantages and disadvantages of various types of qualitative data. • Understand some of the steps involved in conducting qualitative evaluations. • Describe organizational issues in evaluation.
Evidence-based decision-making draws on: • Best available research evidence • Environment and organizational context • Population characteristics, needs, values, and preferences • Resources, including practitioner expertise
Based on evaluation results, a program may be: • Disseminated widely • Retooled • Discontinued
What is program evaluation? "a process that attempts to determine as systematically and objectively as possible the relevance, effectiveness, and impact of activities in light of their objectives." (A Dictionary of Epidemiology, 2008) The best evaluations often "triangulate": the combination of quantitative and qualitative methods, like looking into a room from two windows. Prominent example: the evaluation of California Prop 99
Evaluation is basically … a process of measurement & comparison
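A minimal sketch of that idea in Python: measure an indicator at baseline and at follow-up, then compare. The variable names and numbers are invented for illustration, not from any program in this module.

```python
# Hypothetical illustration: evaluation as measurement and comparison.
# Baseline and follow-up values are made-up numbers.
baseline = 120    # e.g., mean weekly cigarettes among participants at program start
follow_up = 90    # the same measure after the program

absolute_change = follow_up - baseline
percent_change = 100 * absolute_change / baseline

print(f"Absolute change: {absolute_change}")      # -30
print(f"Percent change: {percent_change:.1f}%")   # -25.0%
```

The comparison step is what distinguishes evaluation from bare measurement: a single number means little until it is set against a baseline, a comparison group, or a target.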
Why evaluate? • Improve existing programs • Measure effectiveness • Demonstrate accountability • Share effective strategies and lessons learned • Ensure funding and sustainability Evaluation is a tool that can both measure and contribute to the success of your program.
Evaluation versus research Evaluation • Controlled by stakeholders • Flexible design • Ongoing • Used to improve programs Research • Controlled by investigator • Tightly controlled design • Specific timeframe • Used to further knowledge
Do you have… • A research/evaluation person on staff? • Time and other resources? • Staff to assist? • Necessary skills? From Mattessich, 2003
What are the most significant challenges you face in program evaluation? • Program personnel may be threatened by the evaluation • Need for personnel involvement vs. objectivity • Comprehensive evaluation versus nothing at all • “10% Rule” as you design and implement programs
In program planning, when should you begin planning an evaluation?
Logic Model (Analytic Framework) Worksheet: Evidence-Based Public Health Program Title: __________________________________________ Goal / Long-Term Objective: • What are the evidence-based determinants? → Intermediate Objectives at each level (Individual, Social, Environmental, Govt./Org.) • Based on an evidence review, what activities will address these determinants? What do you do? → Activities for each objective • How long will it take? How much will it cost? What other resources are needed? → Costs for each activity • Evaluation spans from Process (activities) to Impact (objectives). Instructions: First discuss your target population. Using data, evidence-based recommendations (the Community Guide or others), your own knowledge, and group discussion, develop a program strategy for controlling diabetes in your community. Define the goal, objectives, activities, and costs, and describe them in this sample logic model.
Some important questions: What are sources of data? How might program evaluation differ from policy evaluation?
Some notes on policy evaluation • Same principles apply • Lack of control over the intervention (policy) • Time frame may be much shorter • No evaluation is completely “objective,” value-free, or neutral
Logic Model: Program planning flows from Goal to Objectives to Activities. Evaluation maps onto these in reverse: Outcome evaluation (goal), Impact evaluation (objectives), and Formative/Process evaluation (activities).
Evaluation Framework (Adapted from Green et al., 1980) • Process: the program itself (instructors? content? methods? time allotments? materials?) • Impact: behavior/cognition (knowledge gain? attitude change? habit change? skill development?) • Outcome: health (mortality? morbidity? disability? quality of life?)
Types of Evaluation Formative evaluation • Is an element of a program or policy (e.g., materials, messages) feasible, appropriate, and meaningful for the target population? • Often, in the planning stages of a new program • Often, examining contextual factors
Types of Evaluation • Considerations for formative evaluation 1. Sources of data • (pre-) program data 2. Limitations of data (completeness) 3. Time frame 4. Availability & costs • Examples • Attitudes among school officials toward a proposed healthy eating program • Barriers in policies toward healthy eating
Types of Evaluation Process evaluation • “Field of Dreams” evaluation • shorter-term feedback on program implementation, content, methods, participant response, practitioner response • what is working, what is not working
Types of Evaluation Process evaluation (cont) • direct extension of action planning in previous module • uses quantitative or qualitative data • data usually involves counts, not rates or ratios
Types of Evaluation • Considerations for process evaluation 1. Sources of data • program data 2. Limitations of data (completeness) 3. Time frame 4. Availability & costs • Examples • Satisfaction with a diabetes self-management training • How resources are being allocated
[Chart] California Local Health Department Funding by Core Indicator, 2001/04 and 2004/07 (indicator categories: Cess., Spon., SSD, License, Bars, Outdoor)
Types of Evaluation Impact evaluation • long-term or short-term feedback on knowledge, attitudes, beliefs, behaviors • uses quantitative or qualitative data • also called summative evaluation • probably more realistic endpoints for most public health programs and policies
Types of Evaluation • Considerations for impact evaluation 1. Sources of data • surveillance or program data 2. Limitations of data (validity and reliability) 3. Time frame 4. Availability & costs • Example • Smoking rates (tobacco consumption) in California
[Chart] California and U.S.-minus-California adult per capita cigarette pack consumption (packs/person), 1984/1985-2004/2005, with the $0.02, $0.25, and $0.50 tax increases marked. Source: California State Board of Equalization (packs sold) and California Department of Finance (population); U.S. Census, Tax Burden on Tobacco, and USDA. Note that data are by fiscal year (July 1-June 30). Prepared by: California Department of Health Services, Tobacco Control Section, February 2006.
Types of Evaluation Outcome evaluation • long-term feedback on health status, morbidity, mortality • uses quantitative data • also called summative evaluation • often used in strategic plans
Types of Evaluation Considerations for outcome evaluation 1. Sources of data • routine surveillance data 2. Limitations of data (validity and reliability) 3. Time frame 4. Availability & costs - often the least expensive to find Example • Geographic dispersion of heart disease
Acute myocardial infarction rates, Missouri, 2010-2011 (age-adjusted)
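The "age-adjusted" label on rates like these comes from direct standardization: each age stratum's rate is weighted by a standard population's age distribution rather than the local one, so communities with different age structures can be compared. A hedged sketch with entirely hypothetical numbers:

```python
# Sketch of direct age standardization (the method behind "age-adjusted" rates).
# All counts, populations, and standard-population shares are hypothetical.
# Each tuple: (observed events, local population, standard-population share)
strata = [
    (10, 50_000, 0.60),   # under 45
    (40, 30_000, 0.25),   # 45-64
    (90, 20_000, 0.15),   # 65+
]

# Crude rate ignores the age structure of the population.
crude = 100_000 * sum(e for e, p, w in strata) / sum(p for e, p, w in strata)

# Age-adjusted rate weights each stratum-specific rate by the standard population.
adjusted = 100_000 * sum(w * (e / p) for e, p, w in strata)

print(f"Crude rate: {crude:.1f} per 100,000")          # 140.0
print(f"Age-adjusted rate: {adjusted:.1f} per 100,000") # 112.8
```

Here the adjusted rate is lower than the crude rate because this hypothetical community is older than the standard population; the comparison direction depends on the standard chosen.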
Program Evaluation Designs Reasons for research on causes • to identify risks associated with health-related conditions type 1 evidence Reasons for evaluating programs • to evaluate the effectiveness of public health interventions type 2 evidence
Program Evaluation Designs [two charts: a disease rate plotted over time, with the point at which the intervention was initiated marked] Can we conclude that the intervention is effective?
Program Evaluation Designs Experimental • randomized controlled trial • group randomized trial Quasi-experimental • pre-test / post-test with external control group (non-randomized trial) • pre-test / post-test without external control group (before-after or time series) Observational • cohort • case-control • cross-sectional
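For the quasi-experimental pre-test/post-test design with an external control group, one common way to summarize the result is a difference-in-differences: subtract the comparison group's change from the intervention group's change. A sketch with invented values (this is a general technique, not the analysis of any specific study cited in this module):

```python
# Hypothetical pre-test/post-test with external control group,
# summarized as a simple difference-in-differences. Values are invented.
intervention = {"pre": 30.0, "post": 22.0}  # e.g., % smoking in program county
control      = {"pre": 29.0, "post": 27.0}  # comparison county, no program

change_intervention = intervention["post"] - intervention["pre"]  # -8.0
change_control = control["post"] - control["pre"]                 # -2.0

# Net effect attributable to the program, assuming the two groups
# would have followed parallel trends without the intervention.
diff_in_diff = change_intervention - change_control               # -6.0
print(f"Estimated program effect: {diff_in_diff:+.1f} percentage points")
```

The control group's change (-2.0) estimates what would have happened anyway; only the remaining -6.0 points are credited to the program, and that credit rests on the parallel-trends assumption.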
Program Evaluation Designs [flow diagram]: Population → Eligibility (vs. ineligibility) → Participation (vs. no participation) → Randomization? → Intervention Group and No-Intervention Group → Outcome(s) measured in each group
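The eligibility-then-randomization flow can be sketched as follows; the eligibility rule, sample size, and assignment method here are all hypothetical placeholders:

```python
# Minimal sketch of the population -> eligibility -> randomization flow.
# The eligibility rule and group sizes are hypothetical.
import random

random.seed(42)  # fixed seed so the illustration is reproducible

population = [{"id": i, "eligible": i % 3 != 0} for i in range(12)]
eligible = [p for p in population if p["eligible"]]  # 8 of 12 qualify here

# Simple randomization: each eligible participant assigned by coin flip.
for p in eligible:
    p["arm"] = "intervention" if random.random() < 0.5 else "control"

groups = {"intervention": 0, "control": 0}
for p in eligible:
    groups[p["arm"]] += 1
print(groups)
```

Real trials usually use block or stratified randomization to keep arm sizes balanced; a bare coin flip, as here, can produce uneven groups in small samples.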
Follow-up July-Sept 2004 Pretest-Posttest with comparison group Reference: Brownson et al. Preventive Medicine 2005;41:837-842
Pretest-Posttest with comparison group Reference: Brownson et al. Preventive Medicine 2005;41:837-842
Pre-test / Post-test without external control group: Employees & visitors at JHMC, measured before and after a new smoke-free policy on three measures: cigarettes smoked per day, cigarette remnant counts per day, and nicotine concentrations. Reference: Stillman et al. JAMA 1990;264:1565-1569
[Chart] % people smoking. Reference: Stillman et al. JAMA 1990;264:1565-1569
[Chart] Avg. daily cigarette remnant counts. Reference: Stillman et al. JAMA 1990;264:1565-1569
[Chart] Median nicotine vapor concentrations (µg/m³). Reference: Stillman et al. JAMA 1990;264:1565-1569
Pre-test / Post-test without external control group: California residents' per capita cigarette sales in three periods: 1980-1982 (before federal tax), 1983-1988 (before state tax), and 1989-1990 (after state tax). References: Emery et al. AJPH 2001;21(4):278-283; Flewelling et al. AJPH 1992;82:867-869; Siegel et al. AJPH 2000;90:372-379
[Chart] California and U.S.-minus-California adult per capita cigarette pack consumption (packs/person), 1984/1985-2004/2005, with the $0.02, $0.25, and $0.50 tax increases marked. Source: California State Board of Equalization (packs sold) and California Department of Finance (population); U.S. Census, Tax Burden on Tobacco, and USDA. Note that data are by fiscal year (July 1-June 30). Prepared by: California Department of Health Services, Tobacco Control Section, February 2006.
Program Evaluation Designs Quality of evidence from program evaluation depends on … • type of program evaluation design • execution of the program evaluation • generalizability of program evaluation results
Consider for 'generic' evaluation design [flow diagram]: Population → Eligibility (vs. ineligibility) → Participation (vs. no participation) → Randomization? → Intervention Group and Control Group → Outcome(s) measured in each group
Concepts of validity and reliability and their importance (evaluation "threats")