Program Evaluation Webinar Series Part 1: “Top Roadblocks on the Path to Good Evaluation – And How to Avoid Them” Presented by: Tom Chapel
Top Roadblocks on the Path to Good Evaluation – And How to Avoid Them Thomas J. Chapel, MA, MBA Chief Performance Officer (Acting) CDC/Office of the Director/OCOO Presented November 20, 2008 Tchapel@cdc.gov 404-498-6073
Objectives • Program evaluation and typical “roadblocks” in doing good evaluation. • CDC’s Evaluation Framework as a way to surmount those roadblocks.
Key Points • In today’s session we will discuss: • What is important about CDC’s framework? • Why does it lead to better use of findings? [Framework diagram] Steps: Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned. Standards: Utility, Feasibility, Propriety, Accuracy.
Why We Evaluate… • “... The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. • They thought, with some reason…
Why We Evaluate… …there was no punishment more severe than eternally futile labor....” The Myth of Sisyphus
The Problem • The stuff I do doesn't make a difference! • Why don't things get better?!
Implementing Program Evaluation • How do I motivate? • Not this… This… • What gets in the way?
Today’s Focus Top Roadblocks on the Road to Good Evaluation
Defining Evaluation • Evaluation is the systematic investigation of the merit, worth, or significance of any “object”. Michael Scriven
Use the Findings! • If the findings don’t get used… the program will not improve.
What is “Evaluation?” • Evaluation is not… a specific set of tools or techniques. • Evaluation is… an orientation to your program; the idea of continuous reflection.
What is a “Program”? • Not only: • Big training programs • Community interventions • But also: • Recommendations and guidelines • Surveillance systems • In other words, a program is anything with an intended outcome.
Roadblock #6 Not understanding where evaluation “fits in” …
The Integrated or “CQI” Model • To achieve “continuous quality improvement” planners, performance measurers, and evaluators must communicate with each other.
The Customer is the Key • Program evaluation must: • See planning, performance measurement, and evaluation as being integrated. • Start with the idea of having a customer or an intended user of findings. • Direct the evaluation with the customer in mind.
Roadblock #5 Making the “perfect” the enemy of the “good”.
Roadblock #5 What if you said, “To be cardiovascularly fit, you must run a marathon”?
Thanks, but… • That's not me. I don't have that expertise. I don't have the money to do that. I don't have those skills.
Do What You Can! • There’s always an evaluation worth doing. • The biggest mistake is doing nothing because you can only do a little. • Even a little bit is going to yield some benefit.
Roadblock #4 • Evaluating only what you can “measure”… … because those are the things we can measure with validity, reliability, and accuracy.
Upstream Questions • How many brochures? How many trainees? • How many people showed up? Did we get a lot of product out there?
Downstream Questions • What have you done for me lately? • How has it mattered? • What have you done for public health?
Measuring the Right Thing… “…Sometimes, what counts can’t be counted. And what can be counted doesn’t count….” Albert Einstein
Evaluation Starts By Saying… • What are the important things that need to be measured? • Can I measure them with enough rigor to meet the needs of this situation this time? • Sometimes the answer is “NO!”
You Get What You Measure… “…In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped. As a result, today Poles have the world’s heaviest furniture…” (New York Times, 3/4/99)
Roadblock #3: Neglecting Intermediate Outcomes… • Nothing has advanced the evaluation cause in public health more than preaching this idea of intermediate outcomes.
Intermediate Outcomes Contribute to Downstream Success • How is it that my program will make a contribution to that downstream outcome? We call these “intermediate outcomes”.
What is the Program Logic? • What needs to happen to achieve the desired outcome? • What is the “program logic”? [Diagram: My action → Desired outcome]
Don’t just ask: Did it work? • How many tomatoes did I get?
Ask: Is it working? What are the markers that tell me I’m on the right road? • Are planting, watering, and weeding taking place? • Have the blossoms “set”? • Are there nematodes on the plants?
Research Model: Develop Theory → Program Activities → Measure Outcome • If I achieved the outcome – great! • If I didn’t achieve the outcome – why?
Evaluation Unpacks the “Black Box” [Diagram: My action → ? → Desired outcome]
Focus on Intermediate Outcomes • Can we: • pass the ball? • spread out? • spend more time on the opponent’s side of the field?
Forgetting Intermediate Outcomes • [Cartoon: ScienceCartoonsPlus.com]
What’s In the Box? • My program (training, technical assistance, funding, partnerships) → Intermediate outcomes → Desired outcome (less morbidity, less mortality)
The Power of Evaluation • Establishing intermediate outcomes allows you to determine if you are making progress in the right direction.
Why Intermediate Outcomes? • I’m making progress in the right direction. • I am contributing to the downstream outcome.
Identifying Intermediate Outcomes • What is the ultimate outcome I’m seeking? • Who (besides me) needs to take action to achieve it? • What action do they need to take? These are the intermediate outcomes that populate the “black box” or the “program logic”.
Roadblock #2 Confusing attribution and contribution… “I can’t make the case that my program was responsible for that change.”
The Role of Public Health • Public health is not… a direct deliverer of services. • Public health is… a mobilizer and convener. Based on: The Future of Public Health, Institute of Medicine, 1988.
“Networked” Interventions [Diagram: Agencies A through D each run programs (A-1…A-n, B-1, C-1…C-n, D-1…D-n); their outputs lead to short-term and long-term outcomes that converge on a shared system outcome.]
Attribution [Same networked-interventions diagram: many agencies and programs contribute to the system outcome.]