The ALNAP Meta-evaluation Tony Beck Presentation for the IDEAS Conference, Delhi, 14th April 2005
Outline • Background • The ALNAP Quality Proforma • Agency visits • Findings from the agency visits • Findings from the Quality Proforma
What are ALNAP and its meta-evaluation? • An overview of the quality of evaluation of humanitarian action (EHA) • Identification of strengths and weaknesses • Recommendations for improvement across the sector and in individual agencies
Process • Review of evaluation reports against a set of standards • Visits to and interaction with agency evaluation offices Focus: • 2001-2002: Accountability • 2003-2005: Accountability, plus good practice, dialogue and interaction
The ALNAP Quality Proforma • ALNAP’s meta-evaluation tool • Draws on good practice in EHA and evaluation in general • Revised and peer reviewed in 2004
The ALNAP Quality Proforma Made up of six sections: • Terms of reference • Methods, practice and constraints • Contextual analysis • Analysis of intervention • Assessing the report • Overall comments
The ALNAP Quality Proforma • 4-point rating scale: A = good, B = satisfactory, C = unsatisfactory, D = poor • Guidance notes for meta-evaluators. Eg: Consideration given to confidentiality and dignity? Guidance: The evaluation report should detail how the overall approach and methods will protect confidentiality and promote respect for stakeholders’ dignity and self-worth.
The ALNAP Quality Proforma Coverage: • 2001-2005: 197 evaluations Process: • 2 meta-evaluators • Reconciliation of ratings • Analysis by section
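Purely as an illustrative aside, the process on this slide (ratings on a four-point A-D scale from two meta-evaluators, reconciliation of disagreements, then analysis by section) could be sketched roughly as below. This is not ALNAP's actual procedure or tooling: the numeric mapping of the scale, the sample ratings and the simple averaging are assumptions made only for illustration.

```python
# Illustrative sketch only -- not ALNAP's actual tooling.
# Assumes the A-D scale maps to 4..1 and that disagreements between
# the two meta-evaluators are flagged for discussion rather than resolved here.

SCALE = {"A": 4, "B": 3, "C": 2, "D": 1}  # A = good ... D = poor

# Hypothetical ratings: {report: {section: (rater 1, rater 2)}}
ratings = {
    "report_01": {
        "Terms of reference": ("B", "B"),
        "Methods, practice and constraints": ("C", "B"),
    },
    "report_02": {
        "Terms of reference": ("A", "A"),
        "Methods, practice and constraints": ("D", "D"),
    },
}

def reconcile(r1, r2):
    """Return the agreed rating, or None to flag the pair for discussion."""
    return r1 if r1 == r2 else None

# Reconcile each pair of ratings and collect numeric scores by section.
by_section = {}
for report, sections in ratings.items():
    for section, (r1, r2) in sections.items():
        agreed = reconcile(r1, r2)
        if agreed is None:
            print(f"{report} / {section}: {r1} vs {r2} -> needs reconciliation")
            continue
        by_section.setdefault(section, []).append(SCALE[agreed])

# Section-level analysis: mean score across the rated reports.
for section, scores in sorted(by_section.items()):
    mean = sum(scores) / len(scores)
    print(f"{section}: mean score {mean:.1f} over {len(scores)} reports")
```

The sketch only flags disagreements, since the slide indicates that reconciliation was carried out through discussion between the two meta-evaluators.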
Mainstreaming of the Quality Proforma • By ECHO to revise ToRs (lesson learning, protection, identification of users, prioritisation, time frame and users of recommendations etc) • DEC Southern Africa evaluation (rated 7 agency reports) • Groupe URD (for planning of evaluations)
Agencies included in dialogue: 2003-4 CAFOD, Danida, ECHO, ICRC, OCHA, OFDA, Oxfam, SC-UK, SIDA, UNHCR, and WHO
Purpose of agency dialogue • Agency response to the initial two years of use of the Quality Proforma • To discuss Quality Proforma ratings and agency strengths and weaknesses • To discuss processes leading to good evaluation practice • To discuss good practice
Findings from dialogue with evaluation managers • Areas affecting evaluation quality are not currently captured by the QP, eg • Evaluation quality depends on subtle negotiations within agencies • Evaluation funds in most cases are not being allocated for follow-up • Follow-up to recommendations is complex • More agencies are using tracking matrices
Findings from dialogue with evaluation managers: the EHA market • The main constraint to improved evaluation quality is agencies’ difficulty in accessing evaluators with appropriate skills • Does the EHA market need further regulation?
Findings from the Proforma - 2005 • Improvement of between 10 and 30 per cent in most of the areas noted above • Too early to disaggregate or suggest why this improvement has taken place • Still a number of areas of generic weakness
Conclusions Process: • Meta-evaluations need to include interaction with those being meta-evaluated • Agency visits have been important in discussing constraints to improved evaluation quality • Meta-evaluations need to maintain an appropriate balance between accountability functions and the need to improve evaluation quality through lesson learning
Conclusions: findings • EHA demonstrates some areas of strength, and improvement over four years, eg use of most of the DAC criteria, analysis of HR • Many evaluative areas need to be strengthened, eg gender, identification of use and users, participation of primary stakeholders, transparency of methodologies used