Development agency support for impact evaluation
Harry Jones, ODI
Impact Evaluation Conference 2009
Recent ODI work
1. Comparative study on evaluation policies & practices in development agencies
2. Improving impact evaluation production & use
Comparative study on evaluation policies and practices in development agencies
Marta Foresti with C. Archer, T. O’Neil, R. Longhurst
December 2007
Overview of study
Scope and objectives
• A descriptive comparative study of evaluation policies and practices in key agencies, to inform the AFD reform process:
• key features of the evaluation function (e.g. mandate, position, management, roles)
• main aspects of evaluation systems, processes and tools
• practices involved in commissioning, managing and supporting evaluation
Activities
• Desk case studies: DANIDA, EU, OXFAM, IMF (evaluation units)
• Full case studies: DFID, SIDA, WB, AfDB, KfW (evaluation units)
• Plus key informant interviews
Outputs
• Case study reports (AFD)
• Final comparative report
• Workshops: mid-term (AFD internal); dissemination: AFD internal, ODI lunchtime meeting (Feb 08) and DAC network in March 08
Profiles of Evaluation Units - Overview
• Variability in budget and staffing
• Many evaluation policies being reviewed, updated or created
• Mandate not always clear in policies: lack of clarity across the organisation
• No single/unified methodology
Independence vs Integration
• Most EUs sit outside the management structure or operational departments, reporting to ministers, boards etc.
• Position of the unit is important, but so are rules for budget allocation, appointment of staff, and disclosure (WB, IMF)
• All recognise the tension between independence and integration: ‘being involved’ is as important as ‘being detached’
• Reliance on ‘usual consultants’: are they ‘really independent’ and ‘free’ to be critical?
Staff capacity, roles and responsibilities
• Main responsibilities: tendering, contracts and managing evaluation processes, not doing evaluation
• Different levels and intensity of consultation with other departments: more on implementation and dissemination, less at the planning/decision phase
• Capacity and evaluation skills of EU staff are a major constraint (DFID and others); focus often on specific sectoral skills (e.g. economists at KfW)
• ‘New’ roles and responsibilities: KM and learning, communication, dissemination and capacity building
Communication and dissemination
• Of increasing importance: moving beyond ‘dissemination of findings’ towards effective communication, reach and active engagement of clients/stakeholders (big push at WB)
• Disclosure policies and transparency: all reports on the website
• Products go beyond reports: syntheses, briefs, seminars, internet etc.
• Limited feedback and weak evidence on utilisation (AfDB)
Improving Impact Evaluation Production and Use
Nicola Jones, Harry Jones, Liesbet Steer, and Ajoy Datta
March 2009
Overview of study
Scope and objectives
• Commissioned by DFID to inform discussions on IE production and use, particularly within NONIE:
• to determine how amenable various methods for IE are to different types of projects, programmes and policies;
• to assess the dynamics around commissioning, production and delivery of IEs;
• to analyse how IEs are disseminated and communicated;
• to assess the use and influence of IEs; and
• to make recommendations to improve the production and use of IEs.
Activities
• Scoping study
• Literature review
• Annotated database of IEs
• Sector case studies
• Synthesis
Outputs
• ODI Working Paper
• Opinion piece
Methodologies: Suitability and opportunities for IE
Similarities between sectors
• Projects with simple impact pathways
• Methodological innovation
• Call for pluralism
Differences across sectors
• Sector history of IE
• Gestation of impact
• Coverage and relevance
Demand and supply: commissioning, production and delivery
• Largely supply-driven
• Upward accountability
• Less for downward accountability and learning
• Some exceptions in social development: range of implementing agencies, Southern government demand
Communication, Use and Influence
• Communication varied but difficult at the national level; greater interest and initiatives at the international level
Use:
• Some direct use
• ‘Legitimation’ the most common function
• Indirect and ‘enlightenment’ use
Emerging messages
• Common aims but a diversity of practices, sector experiences and evaluation questions
• Need for a plural approach to (I)E quality, and a balance between rigour and coverage for accountability
• Improving agency learning from IEs is difficult, but crucial to improving programmes
• Disconnect between the rhetoric on the strategic importance of development evaluation and practice in development agencies: an ‘institutional gap’, with a need to invest in the institutional role of evaluation at different levels
• How to strengthen demand for development evaluation?
Thank you!
h.jones@odi.org.uk
Other relevant ODI work:
• Development effectiveness: the role of qualitative research in impact evaluation – Martin Prowse (RPGG)
• Re-thinking the impact of humanitarian aid – Karen Proudlock and Ben Ramalingam (ALNAP)