Using the CDC Evaluation Framework to Avoid "Minefields" on the Road to Good Evaluation
By: Thomas J. Chapel, MA, MBA
Office of the Director, Office of Program Planning and Evaluation
Centers for Disease Control and Prevention
Presented to: 2002 National Asthma Conference, October 24, 2002
Why We Evaluate
"...The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason, that there was no punishment more severe than eternally futile labor..."
– The Myth of Sisyphus (quoted in the MMWR Framework for Program Evaluation in Public Health)
Defining Evaluation
• Evaluation is... the systematic investigation of the merit, worth, or significance of an "object" – Michael Scriven
• Program is... any organized public health action/activity
Research vs. Program Evaluation
• A continuum, not a dichotomy, but at the far ends the two may differ in:
  • Framework and steps
  • Decision making
  • Standards
  • Key questions
  • Design
  • Data collection sources and measures
  • Analysis timing and scope
  • Role of values in making judgments
  • Centrality of attribution as a conclusion
  • Audiences for dissemination of results
The Continuum
• Efficacy... does my effort work in ideal circumstances?
• Effectiveness... does my effort work in real-world settings, and work the same way across settings?
• Implementation fidelity... is my (efficacious and effective) effort being implemented as intended?
Today's Focus
Top Minefields on the Road to Conducting Good Evaluation!
Minefield # 8
Not linking planning and evaluation…

Minefield # 7
Evaluating only what you can measure…

You Get What You Measure…
"…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world's heaviest furniture…" (New York Times, 3/4/99)

Minefield # 6
Thinking evaluatively only at the end…

When to Evaluate…
Good program evaluation shifts our focus from "Did it (my effort) work?" to "Is it (my effort) working?"
Minefield # 5
Not asking "who (else) cares?"…
Minefield # 4
Neglecting intermediate outcomes…

Minefield # 3
Neglecting process evaluation…

Minefield # 2
Confusing attribution and contribution…

Minefield # 1
Using more "sticks" than "carrots"…
Standards for Effective Evaluation
• Not HOW TO do an evaluation, but help direct choices among options at each step
• At each step, the standards ask which choice(s):
  • Utility (7): Best serve the information needs of intended users
  • Feasibility (3): Are most realistic, prudent, diplomatic, and frugal given resources
  • Propriety (8): Best meet law, ethics, and due regard for the welfare of those involved and affected
  • Accuracy (12): Best reveal and convey technically accurate information
Broadening Our Thinking About Evaluation
• What to evaluate
• When to evaluate
• Who should be involved in evaluation
• How to evaluate
Why Involve Stakeholders
• Smoke out disagreements in…
  • Definition of the problem
  • Activities and priorities of the program
  • Outcomes that equate to success
  • What constitutes "proof" of success
• Get their help with…
  • Credibility of findings
  • Access to key players
  • Follow-up
  • Dissemination of results
Using Logic Models for Evaluation
• Clarity on:
  • What are the activities
  • What are the intended effects
  • What is the sequence/order of intended effects
  • Which activities are to produce which effects
• Consensus with stakeholders on all of the above
• Focus the evaluation design
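As a rough illustration only (not part of the original presentation), a logic model can be captured as a simple data structure so the activity-to-effect chain can be written down, sequenced, and reviewed with stakeholders. The Python sketch below uses hypothetical activity and outcome names for a school asthma effort; any real program would substitute its own elements.

    # Minimal sketch: a logic model as a list of elements, each naming the
    # intended effect(s) it is expected to lead to. All names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class LogicModelElement:
        name: str
        leads_to: list = field(default_factory=list)  # downstream intended effects

    logic_model = [
        LogicModelElement("Train school nurses", ["Nurses follow asthma action plans"]),
        LogicModelElement("Nurses follow asthma action plans", ["Fewer asthma episodes at school"]),
        LogicModelElement("Fewer asthma episodes at school", ["Fewer emergency department visits"]),
    ]

    # Print the activity -> effect sequence for stakeholder review.
    for element in logic_model:
        for effect in element.leads_to:
            print(f"{element.name} -> {effect}")

Writing the chain out this way makes it easier to check which activities are expected to produce which effects, and in what order, before choosing an evaluation focus.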
Some Factors That Influence Choice of Evaluation Focus
• Users and uses – Who wants the information and what are they interested in?
• Accountability to (other) stakeholders – For what effects are key stakeholders expecting to see results?
• Resources – Time, money, expertise
• Stage of development – How long has the program been in existence?
• "Ripple effect" – How far out would an intervention of this intensity reasonably be expected to have an effect?
Setting Evaluation Focus: Some "Process" Issues
• What are the likely key challenges to "implementation fidelity"?
• "Dropped baton" issues are key
  • Partner failed to do their part
  • Client/family/patient failed to fulfill their referral
• Other common challenges
  • Inadequate dosage
  • Bad access
  • Failure to retain participants
  • Wrong match of staff and participant
Evidence Gathering: Choosing a Design
• What intervention was actually delivered?
• Were impacts and outcomes achieved?
• Was the intervention responsible for the impacts and outcomes?
Justifying Claims About Intervention Effectiveness
• Performance vs. a comparison/control group
• Time sequence
• Plausible mechanisms (or pathways toward change)
• Accounting for alternative explanations
• Similar effects observed in similar contexts
Choosing Data Collection Methods
• A function of:
  • Time
  • Cost
  • Sensitivity of the issue
  • "Hawthorne effect"
  • Ethics
  • Validity
  • Reliability
Maximizing Use of Results: Key Questions
• Who is the audience?
• What will be of greatest importance to them?
• How will they use the information provided?
• How much time will they be willing to spend reading and assimilating the material?
• What type of vocabulary will express the information most clearly?
Some CDC Asthma Examples
• Comprehensive School-Based Asthma Project
• Controlling Asthma in American Cities (CAAP) Project