
Evaluation: Asking the Right Questions & Using the Answers

Presentation Transcript


  1. Evaluation: Asking the Right Questions & Using the Answers. Presented by Annemarie Charlesworth, MA, UCSF National Center of Excellence in Women’s Health, November 3, 2006

  2. Part 1 - Evaluation Overview Part 2 - Steps to Program Planning and Evaluation Part 3 - The Logic Model: A Tool for Planning and Evaluation

  3. Part 1 - Evaluation Overview: What is Evaluation? • Process of collecting information about your program in order to make some decisions about it. • Complements program management by improving and accounting for program effectiveness.

  4. How is Evaluation Helpful? • Gain insight • Change practice • Assess effects • Affect participants

  5. Gain Insight • Assess needs, desires, and assets of community members. • Identify barriers and facilitators to service use. • Learn how to describe and measure program activities and effects.

  6. Change Practice • Refine plans for introducing a new service. • Characterize the extent to which plans were implemented. • Improve the content of educational materials. • Enhance the program's cultural competence.

  7. Change Practice (cont.) • Verify that participants' rights are protected. • Set priorities for staff training. • Make midcourse adjustments for improvement. • Improve the clarity of health communication messages. • Mobilize community support for the program.

  8. Assess Effects • Assess skills development by program participants. • Compare changes in provider behavior over time. • Compare costs with benefits. • Find out which participants do well in the program. • Decide where to allocate new resources.

  9. Assess Effects (cont.) • Document the level of success in accomplishing objectives. • Demonstrate that accountability requirements are fulfilled. • Aggregate information from several evaluations to estimate outcome effects for similar kinds of programs. • Gather success stories.

  10. Affect Participants • Reinforce program/intervention messages. • Stimulate dialogue/raise awareness regarding health issues. • Broaden consensus among coalition members regarding program goals. • Teach evaluation skills to staff and other stakeholders. • Support organizational change and development.

  11. Types of Program Evaluation • Goals-based evaluation (identifying whether you’re meeting your overall objectives) • Process-based evaluation (identifying your program’s strengths and weaknesses) • Outcomes-based evaluation (identifying benefits to participants/clients)

  12. The type of evaluation depends on what you want to learn… Start with: 1) What you need to decide (why are you doing this evaluation?); 2) What you need to know to make that decision; 3) How best to gather and understand that information!

  13. Key questions to consider when designing a program evaluation: 1. For what purposes is the evaluation being done, i.e., what do you want to be able to decide as a result of the evaluation? 2. Who are the audiences for the information from the evaluation (e.g., funders, board, management, staff, clients, etc.)? 3. What kinds of information are needed to make the decision you need to make and/or enlighten your intended audiences?

  14. Key questions (cont.) 4. From what sources should the information be collected (e.g., employees, customers, clients, etc.)? 5. How can that information be collected in a reasonable fashion (e.g., questionnaires, interviews, examining documentation, etc.)? 6. When is the information needed (i.e., by when must it be collected)? 7. What resources are available to collect the information?

  15. Evaluation should be considered during program planning and implementation…Not just at the end!

  16. It is not enough to have a goal… Goals exist because some action is needed. However, you can’t argue for an action without a deep understanding of the problem. Problem → Need → Action → Goal

  17. Part 2 - Steps to Program Planning and Evaluation

  18. 10 Steps to Planning a Program (and its evaluation!) 1. Needs and assets • Extent, magnitude, and scope of the problem • Summary of what’s already being done • Gaps between needs and existing services • Community support 2. Goals and objectives • Long-term goals specific to the target population • Link short-term objectives to goals 3. Defining the intervention/treatment • Program components to accomplish objectives and goals • One or two activities should support each objective

  19. 10 Steps to Planning a Program (and its evaluation!) 4. Developing the program/logic model 5. Choosing the type(s) of data collection (e.g., surveys, interviews, etc.) 6. Selecting your evaluation design (e.g., one-group pre/posttest vs. comparison-group pre/posttest; see the sketch below)
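
The two designs named in step 6 differ mainly in how the resulting data are analyzed. Below is a minimal sketch in Python (our illustration, not part of the original slides); all scores are hypothetical placeholders, and a real evaluation would also need adequate sample sizes and a power calculation.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post knowledge scores for program participants.
pre = np.array([12, 15, 11, 14, 13, 16, 10, 12])
post = np.array([16, 18, 14, 17, 15, 19, 13, 15])

# Design A: one-group pre/posttest -- did participants change?
# A paired t-test compares each participant to themselves.
t_paired, p_paired = stats.ttest_rel(post, pre)
print(f"One-group pre/post: t={t_paired:.2f}, p={p_paired:.4f}")

# Design B: comparison-group pre/posttest -- did participants change
# *more than* a similar group that did not receive the program?
comparison_pre = np.array([13, 14, 12, 15, 11, 16, 12, 13])
comparison_post = np.array([13, 15, 12, 16, 12, 16, 13, 13])

change_program = post - pre
change_comparison = comparison_post - comparison_pre

# One simple analysis: an independent t-test on the change scores.
t_ind, p_ind = stats.ttest_ind(change_program, change_comparison)
print(f"Comparison-group design: t={t_ind:.2f}, p={p_ind:.4f}")
```

The comparison-group design is the stronger of the two because it helps rule out changes that would have happened without the program (maturation, outside events).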

  20. 10 Steps to Planning a Program(and its evaluation!) 7. Pilot test tools 8. Collect data 9. Analyze data 10. Report, share, and act on the findings

  21. Part 3 - The Logic Model: A Tool for Planning and Evaluation • Picture of how your organization does its work • Communicates its “rationale” • Explains hypotheses and assumptions about why the program will work • Links outcomes with activities

  22. Logic models help you chart the course ahead… They allow you to better understand: • Challenges • Resources available • Timetable • The big picture as well as the smaller parts

  23. Basic Logic Model: 1. Resources/Inputs → 2. Activities → 3. Outputs → 4. Outcomes → 5. Impact. Components 1–2 are your Planned Work; components 3–5 are your Intended Results. *From the W.K. Kellogg Foundation Logic Model Development Guide
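
To make the chain concrete, here is a minimal sketch (our illustration, not from the Kellogg guide) that records the five components as plain data; every field value is a hypothetical placeholder loosely matching the free-clinic example on slide 25.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """Five Kellogg components: 1-2 are planned work, 3-5 are intended results."""
    resources: list[str]   # 1. Resources/Inputs
    activities: list[str]  # 2. Activities
    outputs: list[str]     # 3. Outputs
    outcomes: list[str]    # 4. Outcomes
    impact: list[str]      # 5. Impact

# Hypothetical example for a free clinic serving uninsured residents.
model = LogicModel(
    resources=["volunteer clinicians", "donated space", "grant funding"],
    activities=["staff weekly walk-in clinic", "community outreach"],
    outputs=["300 patient visits per year", "12 outreach events"],
    outcomes=["uninsured residents receive regular primary care"],
    impact=["improved community health status"],
)

for step in ("resources", "activities", "outputs", "outcomes", "impact"):
    print(f"{step}: {getattr(model, step)}")
```

Writing the model down this explicitly makes it easy to check that every intended result traces back to a planned activity and resource.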

  24. Basic Logic Model [diagram of the five components; image not transcribed]

  25. Example: Logic model for a free clinic to meet the needs of the growing number of uninsured residents (Mytown, USA). Produced by the W.K. Kellogg Foundation.

  26. S.M.A.R.T. Outcomes and impacts should be: • Specific • Measurable • Action-oriented • Realistic • Timed
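
One way to make “Measurable” and “Timed” concrete is to record each outcome with an explicit indicator, numeric target, and deadline rather than as free text. A hypothetical sketch (field names and values are ours, not from the slides):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SmartOutcome:
    # Specific: a single, concrete statement of what will change.
    statement: str
    # Measurable: the indicator watched and the numeric target to hit.
    indicator: str
    target: float
    # Action-oriented / Realistic: who acts, judged against resources.
    responsible: str
    # Timed: the date by which the target should be met.
    deadline: date

outcome = SmartOutcome(
    statement="Increase screening rates among clinic patients",
    indicator="percent of eligible patients screened",
    target=60.0,
    responsible="clinic outreach staff",
    deadline=date(2007, 6, 30),
)
print(outcome)
```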

  27. One size does not fit all! • Many different types of logic models • Experiment with models that suit your program and help you think through your objectives

  28. Useful for all parties involved (funder, board, administration, staff, participating organizations, evaluators, etc.) • Conveys the purpose of the program • Shows why it’s important • Shows what will result • Illustrates the actions that will lead to the desired results • Provides a basis for determining whether actions will lead to results! • Serves as a common language Enhances the case for investment in your program!

  29. Strengthen community involvement • Created in partnership, logic models give all parties a clear roadmap • Help to build community capacity and strengthen community voice • Help all parties stay on course or intentionally decide to go off course • Their visual nature communicates well with diverse audiences

  30. Logic models are used throughout the life of your program • Planning • Program implementation • Program evaluation They may change throughout the life of the program! • Fluid; a “working draft” • Responsive to lessons learned along the way • Reflecting ongoing evaluation of the program

  31. The Role of the Logic Model in Program Design/Planning • Helps develop strategy and create structure/organization • Helps explain and illustrate concepts for key stakeholders • Facilitates self-evaluation based on shared understanding • Requires examination of best-practices research

  32. The Role of the Logic Model in Program Implementation • Backbone of the management plan • Helps identify and monitor necessary data • Helps improve the program • Forces you to achieve and document results • Helps prioritize critical aspects of the program for tracking

  33. The Role of the Logic Model in Program Evaluation • Provides information about progress toward goals • Teaches about the program • Facilitates advocacy for program approach • Helps with strategic marketing efforts

  34. References • W.K. Kellogg Foundation, Logic Model Development Guide: http://www.wkkf.org/pubs/tools/evaluation/pub3669.pdf • Schmitz, C. & Parsons, B.A. (1999). “Everything You Wanted to Know About Logic Models But Were Afraid to Ask”: http://www.insites.org/documents/logmod.pdf • University of Wisconsin Cooperative Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html • CDC Evaluation Working Group: http://www.cdc.gov/eval/logic%20model%20bibliography.PDF • CDC/MMWR, Framework for Program Evaluation in Public Health: http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm • McNamara, C. (1998). “Basic Guide to Program Evaluation”: http://www.managementhelp.org/evaluatn/fnl_eval.htm
