An evaluation of the Draft Guidance for better-quality outcomes in the CPPB (conflict prevention and peacebuilding) field, assessed along the criteria of relevance, effectiveness, and sustainability, together with proposed revisions to address the needs of evaluators and evaluation managers. Key points include managing evaluations, conducting evaluations, conflict analysis, and sustainability measures.
Reflections on Revising the Guidance: An Evaluation
Dr. Thania Paffenholz
Oslo, 17 February 2011
[Results-chain diagram] Input: DAC Networks + Experts → Output: Draft Guidance → Outcome 1: Awareness for/of the EVAL + CPPB fields → Outcome 2: Improved Evaluation Practice → Impact: Contribution to better quality in CPPB work
Evaluation along Criteria
• Relevance
  • Is the Draft Guidance responding to the needs of the CPPB field?
  • Is the Draft Guidance responding to the needs of the evaluation field?
• Effectiveness
  • Intended outcomes: How effectively has the Draft Guidance responded to the needs of its two main target groups, evaluators and evaluation managers?
  • Unintended outcomes: What other outcomes (positive and negative) has the Draft Guidance produced so far?
• Sustainability
  • How can the Guidance be used for sustainable learning in CPPB? What processes have been built in to ensure follow-up?
Revision: Main Points
• Structure of the Guidance: needs to serve the purposes of its audiences
  • Evaluation managers
  • Evaluators (DEV + CPPB)
  • The broader CPPB field
• New overall structure
  • Introduction to the CPPB context
  • Introduction to evaluations
    • General
    • Specificities of CPPB evaluation (incl. conflict sensitivity as an evaluation goal, transversal theme, or conduct issue)
  • Managing/preparing an evaluation
  • Conducting an evaluation
  • Preconditions for evaluations → planning for results, evaluability, and closing the strategic gap
Revision: Main Points
• Chapter 2, Managing/Preparing an Evaluation: points to be added or changed
  • The evaluation's general focus
  • Evaluation criteria (DAC + 3C) + transversal themes (e.g. conflict sensitivity, gender)
  • Process design
    • Elaborate on the conflict analysis topic
    • Built-in quality control
    • Phases + reporting (distinguished by type of evaluation)
    • Politics and other real-world risks
    • Built-in reference/steering group + ombudsperson
    • Feedback, dissemination, learning, etc.
  • TORs + how they will be adapted after the inception phase!
  • (Flexible) budgets
  • Request that (potential) evaluators develop a proposal covering evaluation design, approaches, methodologies, and feasibility
Revision: Main Points
• Chapter 3, Conducting an Evaluation: much more focus on HOW
  • Overall evaluation designs
    • Distinction between different types + scopes of evaluations
  • Evaluation approaches: purposes + HOW + best practice
  • Linking elements + methodology to criteria
    • Relevance: need for conflict analysis + theory of change + HOW to do it, with a set of options + examples (incl. sampling)
    • Effectiveness (theory of change), etc.
    • Clarification about impact assessment (impact versus outcomes versus conflict effects)
  • Core challenges
    • Data gathering under constraints, including overreliance on interviews + reality-based options on HOW, incl. sequencing + feasibility
    • Politics
Revision: Main Points
• Conflict/context analysis
  • Transparency about HOW, ownership, and USE in the evaluation
  • Adjusting the type of analysis to the evaluation's goals
  • Elements: historical and socio-economic context, etc.; national + local levels
  • Conflict analysis alone is insufficient; more elements are needed:
    • Analysis of the peacebuilding context and of short-, medium-, and long-term needs
    • Assessing the conflict sensitivity of activities: general + context-specific definition and assessment (+ options for HOW, e.g. coverage/partners, power relations => link to conflict analysis)
    • Assessing conflict-monitoring capacity/performance
    • Assessing adaptation capacities
  • 'Conflict' is not always the right term!
Sustainability
• Draft Guidance follow-ups
  • Revision
  • Dissemination in different forms
  • Capacity building/training for evaluation managers + (potential) evaluators
• Work on evaluation culture
  • Awareness building in different communities
  • DAC EVAL Net to draft a harmonised SUPER Guidance
• How to use the Guidance for learning in CPPB
  • INCAF to make use of policy lessons
  • An ongoing feedback loop needs to be institutionalised