Strategies for improving Monitoring and Evaluation. South African Monitoring & Evaluation Association Inaugural Conference, 28 to 30 March 2007. Associate Professor Patricia Rogers, CIRCLE at RMIT University, Australia. Sign at the Apartheid Museum, Johannesburg.
Good evaluation can help make things better • Bad evaluation can be useless – or worse • Findings that are too late, not credible or not relevant • False positives (wrongly conclude things work) • False negatives (wrongly conclude things don’t work) • Destructive effect of poor processes
Overview of presentation The ‘Big Five’ problems in M & E Seven possible strategies for improving the quality of M & E
Questions for you The ‘Big Five’ problems in M & E Are these relevant for South Africa? Are there others? Which are most important to address? Seven possible strategies for improving the quality of M & E Are these relevant for South Africa? Are there others? Which are most important to enact – and how?
1. Presenting a limited view • Only in terms of stated objectives and/or targets • Only from the perspectives of certain groups and individuals • Only certain types of data or research designs • Bare indicators without explanation
2. Unfocused • Trying to look at everything – and looking at nothing well • Not communicating clear messages
3. Unrealistic expectations Expecting • too much too soon and too easily • definitive answers • immediate answers about long-term impacts
4. Not enough good information • Poor measurement and other data collection • Poor response rate • Inadequate data analysis • Sensitive data removed • Pressure to fill in missing data
5. Waiting till the end to work out what to do with what comes out • Collecting lots of data – and then not being sure how to analyse it • Doing lots of evaluations – and then not being sure how to use them
Avoiding the Big 5: LIMITED VIEW • UNFOCUSED • UNREALISTIC • GAPS IN DATA • WHAT TO DO WITH IT?
Seven strategies • Better ways to think about M & E • Training and professional development • Organisational infrastructure • Supportive networks • External review processes • Strategies for supporting use • Building knowledge about what works in evaluation in particular contexts
1. Better ways to think about M & E Useful definitions Models of what evaluation is, and how it relates to policy and practice
1. Better ways to think about M & E Useful definitions • Not just measuring whether objectives have been met • Articulating, negotiating: • What do we value? • How is it going?
1. Better ways to think about M & E Models of what evaluation is, and how it relates to policy and practice • Different types of evaluation at different stages of the program/policy cycle – rather than a final activity • The effect of M & E • Iteratively building evaluation capacity
Common understandings of M & E • Including definitions and models in major documents, not just training manuals • Having these familiar to managers, staff and communities, not just to evaluators • Also recognising the value of different definitions and conceptualisations
Conceptual model for evaluation: Evaluation is often seen as a final activity (image: Mainstreaming Social Inclusion, http://www.europemsi.org/index.php). But this can lead to: • Leaving it all to the end (no baselines) • Not being evaluative early on
Conceptual model for evaluation: Different types of evaluation at different stages of the program/policy cycle • Needs analysis • Program or policy design • Implementation of activities and ongoing management • Outcomes evaluation & performance monitoring • Continuous improvement. Based on Funnell 2006, Designing an evaluation.
Conceptual model for evaluation: The effect of M & E – the underlying programme logic of the South African Public Service Commission M & E system. Public service monitoring leads to REPORTING (problem areas identified; good practice by others is identified and promoted; priority areas in public administration are communicated; departments reflect on their own performance), which leads to FOLLOW UP (problems are addressed; learning from good practice examples takes place; departments focus on priority areas; achievements are affirmed and promoted). Overall result: better governance and service delivery in South Africa. Public Service Commission, 2003.
Simple model of building evaluation capacity: Various activities → Build skills and knowledge in M & E → Application of new capacity → Improved programs → Better outcomes for the public
Conceptual model for evaluation: Iterative model of building evaluation capacity. Various activities → Identify existing capacity and build new capacity (types of capital: human, economic, social, organisational) → Opportunities to deploy the capacity → Development of systems to apply evaluation capacity to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring → Improved programs (through improved implementation, better resource allocation, or improved selection of programs) → Better outcomes for the public. Rogers, 2002.
2. Training and professional development WHO is to receive training? HOW will training be undertaken? WHAT will training cover? WHO will control content, certification and accreditation?
Training and professional development - WHO WHO is to receive training? • Those formally named as evaluators? • Those with formal responsibility for doing evaluation? • Those who will commission or require evaluation? • Those who will use evaluation (eg program managers, policy makers)? • Citizens and citizen advocates?
Training and professional development - HOW HOW will training be undertaken? • Timing – before working in evaluation, or as ongoing professional development? • Duration – a few days, a few weeks, a few years? • Intensity – concentrated, weekly, annually, “sandwich”? • Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced? • Level – short course, certificate, graduate program (Master’s, Graduate Diploma, PhD)? • Customisation – generic, sector-specific, organisation-specific?
Training and professional development - WHAT WHAT will training cover? • An integrated package – or a specific topic? • Methods for identifying the type of M & E required and Key Evaluation Questions? • Evaluation designs? • Specific types or a range? • Methods of data collection? • Specific types or a range – especially mixed qualitative and quantitative? • Methods of data analysis? • Specific types or a range? • Focus on critical thinking? • Approaches to reporting and supporting use? • Managing evaluation – including participatory processes? • Identifying and including existing skills and knowledge?
Example of suggested evaluation competencies (Stevahn et al., 2006)
Training and professional development – Short course examples • University of Zambia M & E course • IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org • CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute/06SIhome.asp • The Evaluators Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com • CDRA (Community Development Resource Association) Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za • Pre-conference workshops: AfrEA – African Evaluation Association, www.afrea.org; AEA – American Evaluation Association, www.eval.org; SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za; AES – Australasian Evaluation Society, www.aes.asn.au; EES – European Evaluation Society, www.europeanevaluation.org; CES – Canadian Evaluation Society, www.evaluationcanada.ca; UKES – United Kingdom Evaluation Society, www.evaluation.org.uk
Training and professional development – Graduate programs • Centre for Research on Science and Technology, University of Stellenbosch, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode, with face-to-face courses interspersed with self-study. • School of Health Systems and Public Health (SHSPH), University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in the Master of Public Health degree program. Courses taught in modules of one to three weeks, with a six-month internship and individual research. • Graduate School of Public & Development Management (P&DM), University of the Witwatersrand, Johannesburg: electives on monitoring and evaluation as part of the Master’s degree programmes in Public and Development Management and in Public Policy. • Centre for Program Evaluation, University of Melbourne, Australia: Master of Assessment and Evaluation, available by distance education, www.unimelb.edu.au/cpe • CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research. • Western Michigan University, USA: interdisciplinary PhD residential coursework program.
Training and professional development – On-line material • Self-paced courses • Manuals • Guidelines
Training and professional development – Key Questions • Who controls the curriculum, accreditation of courses and certification of evaluators? • What are the consequences of this control?