Utilization-Focused Evaluation
Michael Quinn Patton
The Field of Program Evaluation
• "What has to be done to get results that are appropriately and meaningfully used?"
• "First, there's not enough money to do all the things that need doing; and second, even if there were enough money, it takes more than money to solve complex human and social problems."
• "Can evaluation contribute to program effectiveness?"
Evaluation through the Ages
Traditional empirical evaluation:
• Rigorous methodology
• Experimental design
• Sophisticated statistics
• No regard for whether decision-makers understood the analysis
Utilization-focused evaluation:
• Utility: provide practical information
• Feasibility: be realistic, frugal, diplomatic
• Propriety: be legal and ethical for all involved
• Accuracy: reveal and convey technically adequate information about what determines the worth of the program
"UFE": A Definition
• Evaluations should be judged by their utility and actual use.
• Evaluators should facilitate the full process and design, from beginning to end.
• The focus is on intended use by intended users!
• Move from possible audiences and potential use to real and specific users with explicit commitments.
Program Evaluation
Systematic collection of information about the activities, characteristics, and outcomes of programs (page 22) to:
• Make judgments about programs,
• Improve program effectiveness, and/or
• Inform decisions about future programming
Instrumentation-1
• Individuals do not all share the same values; examining beliefs and testing actual goal attainment is neither natural nor widespread
• Early on, assess incentives for, and barriers to, reality testing within the program culture
• The irony of life in the information age is that we are surrounded by so much misinformation and act on so many untested assumptions!
Instrumentation-2
• Brainstorm early: define program goals and consider ways that these goals might be measured
• Program staff may be waiting to be told what to do
• Use program staff's questions and concerns, phrased in their terms, with their nuances, and with local circumstances in mind
• Selecting stock "measurement instruments" at the outset is not wise
The Questions
• Data can exist to address the question(s): an empirical issue exists
• There is more than one possible answer to the question: it is not predetermined
• The primary intended user wants information to help answer the question: they care
• The primary intended user wants to answer the question for themselves, not just for someone else
• The intended user can indicate how they would use the answer: its relevance to future action
Frames of Reference
• Evaluation is like a camera: it captures only what it's pointed at
• Evaluation is like an empty envelope: you can use it to send someone a message
• Evaluation is like a toothbrush: if used correctly, you get the particles out; if used lightly, some gunk stays and rots the teeth
Frames of Reference-2
The Cool Hand Luke problem is often in operation when discussions arise between evaluators and non-evaluators: "What we've got here is a failure to communicate." (Paul Newman, 1967)
Getting to the Stakeholders
• Surface those who want to know
• It may be better to work with lower-level, hands-on staff than with complacent administrators
• Be strategic and sensitive when asking busy people for their time and involvement
Getting to the Stakeholders-2
• Work to build and sustain interest in evaluation
• Use skill in building relationships, facilitating groups, managing conflict, walking political tightropes, and using effective interpersonal communication
• Some evaluations have multiple layers of stakeholders, so prepare for multi-layer stakeholder involvement and multiple methodologies (p. 53)
Some Don'ts for Evaluators
• Don't be the primary decision-maker
• Don't work with a passive audience
• Don't select organizations as target users; choose people
• Don't focus on decisions; focus on the decision-makers
• Don't assume the funding source is a primary stakeholder
• Don't wait for research findings to identify users
• Don't stand above the fray of people and politics
Evaluation Uses: Judgment-Oriented Evaluation
• Determine the merit of the project: how effectively it meets the needs of those it is intended to serve (audit)
• Determine the worth of the project: its extrinsic value to those outside the program, the community
• "Does the program 'work'? Does it attain its goals? Is it 'accountable'?"
• "Summative evaluation" (pp. 65-68)
Evaluation Uses: Improvement-Oriented Evaluation
• "Formative evaluation": quality enhancement, TQM, the start-up period
• Making things better rather than reaching a conclusive judgment
• The program's strengths and weaknesses, progress toward goals and outcomes
• Unexpected events
• "When the cook tastes the soup, that's formative. When the customer tastes the soup, that's summative."
Evaluation Uses: Knowledge-Oriented Evaluation
• Influence thinking about issues in a general way (theory building)
• Enhance communication; facilitate and share perceptions
• What did participants really experience?
• Bring forth enlightenment!
• Improve understanding of "what" and "why" (policy making)
• Contribute to implementation theory; help develop "best practices"
Evaluation Relationships
Knowledge-oriented, improvement-oriented, and judgment-oriented evaluation (see diagram, p. 76)
The Point of UFE
• Utilization-focused evaluation can hasten change, or provide a new impetus to get things rolling
• Uncertainty reduction
• Impacts are probably observed as ripples, not waves
• Don't assume high or low expectations: determine users' expectations, then negotiate a shared reality and mutual commitment
Increasing the Impact of Evaluation: Ask the Following Questions
• Clarify the decision upon which the evaluation should be focused
• What's at stake?
• When are decisions made? By whom?
• Clarify the other factors involved: politics, personalities, values, promises
• What is the anticipated influence of the evaluation, and how can it be maximized?
• How will you know, after the fact, whether the evaluation took hold?
Less-Than-Obvious Impacts
• Sharing and clarifying project missions and goals
• Shared understandings between managers and line staff that focus on OUTCOMES
• Data collection that is integral to the program can strengthen the intervention and enhance sustainability
Skills Gained by Project Staff
• Problem identification
• Criteria specification
• Data collection
• Analysis
• Interpretation
• A feeling of ownership of, and membership in, the project
Why Integrate Data Collection into the Project-1
• Reinforces and strengthens the program intervention
• Can be cost-effective and efficient
• Enhances the sustainability of evaluation by not being a temporary add-on
• Not unlike the instructional-design principle of telling learners what they will learn, so they are prepared to learn and are reinforced as the learning takes place
Why Integrate Data Collection into the Project-2
• Programs and projects typically exist to intervene, to make a change!
• It usually doesn't matter to intended users how much of the change is due to pre-sensitization versus actual learning
• It's the results that count!
• Logically and meaningfully interject data collection in ways that enhance achievement of program outcomes