
PA546 16 February 2010




Presentation Transcript


  1. PA546 16 February 2010 Identifying outcomes Measuring outcomes Monitoring performance (a few highlights) Internal vs external evaluators

  2. Road Map • Up to 7:15 - Review of proposals • What is the role of client input in evaluation? [Morris case] • What to consider in measuring client satisfaction • Working with outcome indicators • Urban Institute Project • Review of assignment due 2/23

  3. Review of proposals • Team: Present on • Your understanding of the evaluation’s purpose • Proposed research question(s) • Proposed methodology • Class: [take the role of a consulting group] • Ask questions to clarify • Make suggestions: approach, resources, methodology

  4. The Collaboration Case • What is the problem in the case? • What arguments support involving consumer feedback? • What arguments suggest it isn’t necessary right now? • What lessons can you draw from the case & commentaries? • About evaluator/client roles in defining the research question? • About evaluator/client roles in defining informants? • About getting customer feedback? • What are your opinions about when to get consumer feedback & what questions to ask?

  5. What clients look for • Tangibles: Appearance of facilities, equipment, staff, materials. • Reliability: Able to perform as promised, dependably and accurately. • Responsiveness: Willingness to help clients; provide prompt service. • Competence: Possess required skills & knowledge to perform services. • Courtesy: Politeness, respect, consideration, friendliness of contact staff. • Credibility: Trustworthiness, believability, honesty of provider. • Security: Freedom from danger, risk or doubt. • Access: Approachability and ease of contact. • Communication: Keeping customers informed in language they can understand, and listening to them. • Understanding the Client: Making the effort to know clients and their needs. Source: http://www.tbs-sct.gc.ca/eval/pubs/pubs-to-1995/satis-satis_e.asp Indicators listed in Annex II [Canadian Comptroller Gen’l]

  6. More on measuring satisfaction • Indicators for services with children • % of caretakers who (1) understand program services, (2) rate them as effective, efficient, coordinated, and responsive in meeting (a) their child’s needs, (b) their family’s needs • % of caretakers satisfied with their involvement in program decision making • % of children who enjoy their participation in the program setting Source: http://ag.arizona.edu/sfcs/cyfernet/nowg/satisfac.html (a collaboration between U of Arizona and Cooperative Extension) • Rossi et al. (p. 226) include satisfaction with results • Example: Has the home-delivered meal program been helpful to you in maintaining your health and nutrition? (leading question?) • See L. L. Martin & P. M. Kettner (1996), Measuring the Performance of Human Service Programs. Note: recommended search term “satisfaction with program services”
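Indicators like those above reduce to simple proportions over survey responses. A minimal sketch, using made-up 5-point ratings and an assumed "top-two-box" cutoff (ratings of 4 or 5 count as satisfied); both the data and the cutoff are illustrative, not from the slide:

```python
# Turning hypothetical 5-point satisfaction ratings into the kind of
# indicator on the slide ("% of caretakers satisfied").
# The 4-or-5 cutoff ("top-two-box") is a common but arbitrary choice.

ratings = [5, 4, 2, 5, 3, 4, 1, 5, 4, 4]  # 1 = very dissatisfied, 5 = very satisfied

satisfied = sum(1 for r in ratings if r >= 4)
pct_satisfied = satisfied / len(ratings)
print(f"{satisfied}/{len(ratings)} caretakers satisfied ({pct_satisfied:.0%})")
```

Note that the reported percentage can shift noticeably if the cutoff is moved (e.g., counting only 5s), which is one reason to report the full rating distribution alongside the headline figure.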

  7. Outcome Indicators: Useful terms • Outcome level: status of outcome at a point in time (% teenagers who smoke) • Outcome change: difference between outcomes at two points in time • Program effect: portion of change that can be attributed to program • Indicators of program outcomes • Characteristics of target population or social condition
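The three terms on this slide can be illustrated with a small worked example. The sketch below uses hypothetical smoking-rate figures and estimates the program effect as a simple difference-in-differences (the program group's change minus a comparison group's change), which is one common way to attribute a portion of the change to the program:

```python
# Worked sketch of "outcome level", "outcome change", and "program effect"
# using hypothetical smoking-rate data for a program group and a
# comparison group measured at the same two points in time.

program_before, program_after = 0.20, 0.14        # % teenagers who smoke
comparison_before, comparison_after = 0.20, 0.18

outcome_level = program_after                      # status at a point in time
outcome_change = program_after - program_before    # change between two points
secular_change = comparison_after - comparison_before
program_effect = outcome_change - secular_change   # portion attributable to program

print(f"Outcome level:  {outcome_level:.0%}")
print(f"Outcome change: {outcome_change:+.0%}")
print(f"Program effect: {program_effect:+.0%}")
```

With these numbers, smoking fell 6 points in the program group but 2 points in the comparison group, so only 4 points of the change are attributed to the program; the rest would have happened anyway.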

  8. Outcome Indicators Project I • Urban Institute & Ctr for What Works • In response to demands for nonprofit accountability • Identified quality indicators for 14 program areas • Criteria for indicators • Specific (unique/unambiguous) • Observable • Understandable • Relevant • Time bound • Valid

  9. Outcome Indicators Project II • Indicators actually used by practitioners • Indicators that have worked for specific program areas • Common framework for outcomes that are • Program-centered (reach, participation, satisfaction) • Client-centered (knowledge/learning/attitude, behavior, condition/status) • Community-centered • Organization-centered [incomplete]

  10. Examining a program • Outcome sequence chart & outcomes • Note: satisfaction is not viewed as having a place within the sequence • List of outcomes includes data sources • Follow up • Involve stakeholders in deciding what to track • Start with a few indicators at first • Add indicators as org learns what is useful • Tabulate data by client grouping, age, sex
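The follow-up advice above, tabulating data by client grouping, can be sketched with hypothetical client records; the grouping fields (age group, sex) and the binary outcome flag are illustrative assumptions, not part of the Outcome Indicators Project materials:

```python
from collections import defaultdict

# Hypothetical client records: (age_group, sex, achieved_outcome).
records = [
    ("under 18", "F", True), ("under 18", "M", False),
    ("18-64", "F", True),    ("18-64", "M", True),
    ("65+", "F", False),     ("65+", "M", True),
    ("18-64", "F", False),   ("under 18", "F", True),
]

totals = defaultdict(int)     # clients in each (age_group, sex) cell
successes = defaultdict(int)  # clients in the cell who achieved the outcome
for age_group, sex, achieved in records:
    key = (age_group, sex)
    totals[key] += 1
    successes[key] += achieved  # True counts as 1

for key in sorted(totals):
    rate = successes[key] / totals[key]
    print(f"{key[0]:>8} {key[1]}: {successes[key]}/{totals[key]} ({rate:.0%})")
```

Subgroup tables like this are exactly what supports the next slide's point: they can show which categories of clients the program appears to work for, though small cell counts make individual percentages unstable.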

  11. Interpreting the outcome data • Suggests what works well and what doesn’t • Can identify categories of clients that the program works for & those it doesn’t • Interpretation improves if accompanied by information on client demographics, program process, utilization (think of these as independent variables) • Pre-post data may make a more powerful (but possibly misleading) statement [consider threats of history & maturation]; better if linked to benchmarking data

  12. Pitfalls with outcome monitoring • Goal displacement • Corruption of measures • Misinterpreting outcome data • Knowing the outcomes does not demonstrate program effectiveness or explain why the outcome/change in outcome occurred

  13. Class 2/23 • Presentation of proposals • Morris 5 & Rossi 8 • Short paper “Applying Outcome Measures to [program type]” • Select a program area from the Outcome Indicators Project • Identify & read an evaluation report on the program area • Indicate: bibliographic information; purpose of evaluation; research question(s); methodology (sample, type of evaluation, key variables) • Write a short essay considering the relevance of outcome indicators to the evaluation report

  14. Short paper (now due 2/23) • Select an outcome indicator at http://www.urban.org/center/cnp/projects/outcomeindicators.cfm • At Google Scholar, enter “evaluation [name of outcome indicator, e.g., youth mentoring programs]” • Read the article: (1) note the evaluation criteria used; (2) use outcome indicators to help assess the article. [Assignment is to introduce you to practice and familiarize you with the Outcome Indicators Project; it may be the basis for a mini-study]
