
Evaluation Planning and Management John Wooten ~ Lynn Keeys



  1. WELCOME Evaluation Planning and Management John Wooten ~ Lynn Keeys June 2012

  2. Session 1: Course Introduction Objectives: • Introduce the course, facilitators, participants • Set protocols • Overview course objectives and logistics • Conduct a pre-test

  3. What We Remember “The more you can hear it, see it, say it, and do it, the easier it is to learn.” Colin Rose, Accelerated Learning Action Guide On average, we remember: • 20% of what we read • 30% of what we hear • 40% of what we see • 60% of what we do • 90% of what we read, hear, say and do

  4. Rules/Guidelines • Cell Phones OFF, please • Respect the speaker • Keep questions relevant to topic • Start on time • Actively participate

  5. Participants’ Introductions • Name • Office and role • Level of evaluation planning and management experience (Lo-Med-Hi) • Two course expectations • How will fulfilling these expectations impact your job/career?

  6. Course Objectives USAID’s iterative approach to EPM, including new project design and evaluation policy guidance Basic terms, concepts and methodological challenges The importance of performance baselines in evaluation All phases of EPM Planning and managing different types of evaluations

  7. Course Objectives (continued) Data collection methods and tools Evaluation statements of work Data analysis and use The importance and content of evaluation follow-up plans to reporting, disseminating and using evaluation findings Common pitfalls in EPM Flexible evaluation checklists

  8. Logistics • Class time • Breaks • Parking lot • Small group areas • Small group assignments • Courtesy rules • Special needs? • Course evaluation • Proactive note-taking

  9. Pre-Test • Closed “book & mouth”, please  • Questions: • Multiple choice • Fill in the blanks • Cross references

  10. Session 2: USAID Evaluation Planning and Management Objectives: • Understand why we evaluate • Review USG policy on evaluation and USAID contexts • Review highlights of the revised USAID project design and evaluation policies • Introduce some key terms and types of evaluations • Overview USAID’s program cycle and context for evaluation • Review some key values to guide evaluation

  11. Why Evaluate? “People and their managers are working so hard to be sure things are done right, that they hardly have time to decide if they are doing the right things.” Stephen R. Covey Author

  12. Why Evaluate? Early 1990s, U.S. Congress found: • Waste/inefficiency undermine confidence in government and reduce its ability to address vital public needs • Federal managers disadvantaged due to insufficient articulation of program goals and inadequate information on performance • Congress seriously handicapped by insufficient attention to program performance and results

  13. Why Evaluate? …It’s the law of the land! Government Performance and Results Act of 1993 (GPRA) • Holds the entire USG accountable for achieving results • Focuses on results, service quality and customer satisfaction • Requires objective information on effectiveness and efficiency in achieving objectives • Improves the internal management of the USG • Requires Strategic Plans per agency with regular performance assessments and program evaluations • http://www.whitehouse.gov/omb/mgmt-gpra/gplaw2m

  14. But Why Else Evaluate? “Beware the watchman…” Sir Josiah Stamp

  15. USAID and Donor Evaluation Experiences USAID • Rich performance management and evaluation history and culture • Evaluation leader among donors • Past decade, quality and leadership slipped • Recent efforts to reclaim leadership as a “learning institution” Other Donors Paris Declaration on Aid Effectiveness and Accra Agenda for Action • Ownership • Alignment • Harmonization • Results • Mutual accountability • Inclusive partnerships • Delivering results

  16. Reinvigorated Project Designs and Evaluations New Project Design Guidance • Designs informed by evidence, supported by analytical rigor • Promote gender equality, female empowerment • Strategically apply innovative technologies • Selectively target and focus on investments with highest probability of success • Design with evaluation in mind, rigorously measure and evaluate performance and impact…

  17. Reinvigorated Project Designs and Evaluations New Project Design Guidance • Design with clear sustainability objectives • Apply integrated/multi-disciplinary approaches • Strategically leverage or mobilize “solution-holders” and partners • Apply analytic rigor, utilize best available evidence • Broaden the range of implementing options…

  18. Reinvigorated Project Designs and Evaluations New Project Design Guidance • Incorporate continuous learning for adaptive management (re-examining analytic basis) • Implement peer review processes • Promote collaboration and mutual accountability • Demonstrate USAID staff leadership in the project design effort

  19. Reinvigorated Project Designs and Evaluations New Evaluation Policy • More and higher quality evaluations (2 types) • Evidence-based evaluation and decision-making • Generating knowledge for the development community • Increased transparency on return on investments • Evaluation as an integral part of managing for results • Designing with evaluation in mind • Building local evaluation capacity…

  20. Reinvigorated Project Designs and Evaluations New Evaluation Policy • Management actions: • More training • Evaluation Audits • DEC submissions • Peer SOW Reviews • Annual Evaluation Plan • Evaluation Point-of-Contact • At least one opportunity for an impact evaluation per DO • Evaluating all large and all pilot projects • Thematic or meta evaluations • Best affordable evaluation designs • Collection/storage of quantitative data

  21. Reinvigorated Project Designs and Evaluations “Meaning for ME?” More + More + Much More • More aggressive, direct involvement of USAID staff • More carefully integrated, systemic approach • Much more rigorous evidence-based planning and decision-making throughout the entire program cycle • “Unprecedented transparency” (A/AID)

  22. “Meaning for ME?” More + More + Much More USAID expects you to… • Define and organize your work around the end results you seek to accomplish. This requires: • Making intended results clear and explicit • Ensuring agreement among partners, customers, and stakeholders that proposed results are worthwhile (relevant and realistic) • Organizing your work/interactions to achieve results effectively

  23. Some Key Terms and Definitions • Evaluation • Performance Indicators • Performance Monitoring • Performance Management -- Managing for Results (MFR) • Evaluation Design • Performance Evaluations • Impact Evaluations • Attribution • Counterfactual

  24. Types of Evaluations Performance Evaluation (Normative) • Reviews performance against agreed standards • Assesses mgmt. structure, performance, resource use • Reviews project design/development hypothesis • Reviews progress, constraints and opportunities • Assesses likelihood of achieving targets • Provides notional judgments on project’s perceived value Evaluation design challenges • Clarity/flexibility of project design • Appropriateness of a few evaluation questions

  25. Types of Evaluations Impact Evaluation (Summative) • Probes/answers ‘cause-effect’ questions testing the development hypothesis • Requires a comparison group (counterfactual), baselines and end-line indicator data • Extrapolates broader lessons and policy implications Evaluation design challenges • Timing • Internal/external validity (ruling out “noise”) • Availability, adequacy, comparability of baseline and end-line data
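The counterfactual logic above can be illustrated with a minimal difference-in-differences calculation: compare the change in the project group against the change in the comparison group over the same period. This is a hypothetical sketch, not part of the course materials; the function name and all indicator values are invented for illustration.

```python
# Hypothetical illustration of the counterfactual comparison behind an
# impact evaluation: a difference-in-differences estimate using baseline
# and end-line indicator data for a project (treatment) group and a
# comparison group. All numbers below are invented.

def diff_in_diff(treat_baseline, treat_endline, comp_baseline, comp_endline):
    """Change in the project group minus change in the comparison group."""
    return (treat_endline - treat_baseline) - (comp_endline - comp_baseline)

# Invented indicator values (e.g., average household income index):
effect = diff_in_diff(treat_baseline=100, treat_endline=140,
                      comp_baseline=100, comp_endline=115)
print(effect)  # 25: the portion of the change plausibly attributable to the project
```

The comparison group supplies the counterfactual: without it, the project group's full 40-point gain could be mistaken for project impact, even though 15 points of it occurred anyway.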

  26. Evaluation within USAID Program Context USAID Program Cycle • http://www.usaid.gov/our_work/policy_planning_and_learning/documents/ProgramCycleOverview.pdf

  27. Evaluation within USAID Program Context • Strategy Implementation Roadmap • Project Design and Implementation Roadmap • Evaluation Roadmap

  28. Evaluation within USAID Program Context

  29. Evaluation within USAID Program Context Conceptual → Analytical → Approval

  30. Evaluation within USAID Program Context

  31. Evaluation within USAID Program Context

  32. Values for Planning and Managing Evaluations • Designing for Learning • Best Methods • Local Capacity Building/Reinforcing • Unbiased Accountability • Participatory Collaboration • Evidence-based Decision-making • Transparency (revisited)

  33. Values for Planning and Managing Evaluations “Unprecedented transparency…” • Deciding on an evaluation design • Disseminating the evaluation report upon completion • Registration Requirement • Statement of Differences • Standard Reporting and Dissemination • DEC Submissions • Data Warehousing

  34. Evaluation Planning and Management John Wooten ~ Lynn Keeys  • Thank you~ (jwootenjr@yahoo.com) (lynnkeeys50@yahoo.com)
