
Strategies for improving Monitoring and Evaluation



Presentation Transcript


  1. Strategies for improving Monitoring and Evaluation South African Monitoring & Evaluation Association Inaugural conference 28 to 30 March 2007 Associate Professor Patricia Rogers CIRCLE at RMIT University, Australia

  2. Sign at the Apartheid Museum, Johannesburg

  3. Good evaluation can help make things better • Bad evaluation can be useless – or worse • Findings that are too late, not credible or not relevant • False positives (wrongly conclude things work) • False negatives (wrongly conclude things don’t work) • Destructive effect of poor processes
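
To make the false-positive/false-negative risk concrete, a minimal simulation sketch follows (assuming Python with numpy and scipy; the effect size, sample size and significance threshold are illustrative assumptions, not figures from the presentation): a programme with a real but modest effect is evaluated with a small sample, and most replications wrongly conclude it does not work.

# Minimal sketch: an under-powered evaluation of a programme that genuinely works.
# All numbers are illustrative assumptions, not figures from the presentation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

true_effect = 0.3      # real improvement, in standard-deviation units
sample_size = 20       # participants per group in each hypothetical evaluation
n_replications = 1000  # repeat the evaluation to estimate how often it misses

misses = 0
for _ in range(n_replications):
    control = rng.normal(0.0, 1.0, sample_size)
    treated = rng.normal(true_effect, 1.0, sample_size)
    _, p_value = stats.ttest_ind(treated, control)
    if p_value >= 0.05:   # the evaluation wrongly concludes "no effect" (false negative)
        misses += 1

print(f"False-negative rate with n={sample_size} per group: {misses / n_replications:.0%}")

Larger samples, better measurement and appropriate analysis shrink this miss rate, which is one reason "not enough good information" (problem 4 below) so often turns into false negatives.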

  4. Overview of presentation • The ‘Big Five’ problems in M & E • Seven possible strategies for improving the quality of M & E

  5. Questions for you • The ‘Big Five’ problems in M & E: Are these relevant for South Africa? Are there others? Which are most important to address? • Seven possible strategies for improving the quality of M & E: Are these relevant for South Africa? Are there others? Which are most important to enact – and how?

  9. 1. Presenting a limited view • Only in terms of stated objectives and/or targets • Only from the perspectives of certain groups and individuals • Only certain types of data or research designs • Bare indicators without explanation

  14. 2. Unfocused • Trying to look at everything – and looking at nothing well • Not communicating clear messages

  17. 3. Unrealistic expectations Expecting • too much too soon and too easily • definitive answers • immediate answers about long-term impacts

  20. 4. Not enough good information • Poor measurement and other data collection • Poor response rate • Inadequate data analysis • Sensitive data removed • Pressure to fill in missing data

  23. 5. Waiting till the end to work out what to do with what comes out • Collecting lots of data – and then not being sure how to analyse it • Doing lots of evaluations – and then not being sure how to use them

  29. Avoiding the Big 5 • LIMITED VIEW • UNFOCUSED • UNREALISTIC • GAPS IN DATA • WHAT TO DO WITH IT?

  30. Seven strategies • Better ways to think about M & E • Training and professional development • Organisational infrastructure • Supportive networks • External review processes • Strategies for supporting use • Building knowledge about what works in evaluation in particular contexts

  31. 1. Better ways to think about M & E • Useful definitions • Models of what evaluation is, and how it relates to policy and practice

  32. 1. Better ways to think about M & E Useful definitions • Not just measuring whether objectives have been met • Articulating, negotiating: • What do we value? • How is it going?

  33. 1. Better ways to think about M & E Models of what evaluation is, and how it relates to policy and practice • Different types of evaluation at different stages of the program/policy cycle – rather than a final activity • The effect of M & E • Iteratively building evaluation capacity

  34. Common understandings of M & E • Including definitions and models in major documents, not just training manuals • Having these familiar to managers, staff and communities, not just to evaluators • Also recognising the value of different definitions and conceptualisations

  35. Conceptual model for evaluation: Evaluation is often seen as a final activity (Mainstreaming Social Inclusion, http://www.europemsi.org/index.php)

  36. Conceptual model for evaluation: Evaluation is often seen as a final activity. But this can lead to: • Leaving it all to the end (no baselines) • Not being evaluative early on

  37. Conceptual model for evaluation: Different types of evaluation at different stages of the program/policy cycle • Needs analysis • Program or policy design • Implementation of activities and ongoing management • Outcomes evaluation & performance monitoring • Continuous improvement (Based on Funnell 2006, Designing an evaluation)

  38. Conceptual model for evaluation: The effect of M & E – the underlying programme logic of the South African Public Service Commission M & E system: Public service monitoring → REPORTING (Problem areas identified; Good practice by others is identified and promoted; Priority areas in public administration are communicated; Departments reflect on their own performance) → FOLLOW UP (Problems are addressed; Learning from good practice examples takes place; Departments focus on priority areas; Achievements are affirmed and promoted) → Overall result: Better governance and service delivery in South Africa (Public Service Commission, 2003)

  39. Simple model of building evaluation capacity: Various activities → Build skills and knowledge in M & E → Application of new capacity → Improved programs → Better outcomes for the public

  40. Conceptual model for evaluation: Iterative model of building evaluation capacity • Various activities • Identify existing capacity and build new capacity (types of capital: human, economic, social, organisational) • Opportunities to deploy the capacity • Development of systems to apply evaluation capacity to undertake, oversee and use discrete evaluations, ongoing evaluative activity and monitoring • Improved programs (through improved implementation, better resource allocation, or improved selection of programs) • Better outcomes for the public (Rogers, 2002)

  41. 2. Training and professional development WHO is to receive training? HOW will training be undertaken? WHAT will training cover? WHO will control content, certification and accreditation?

  42. Training and professional development - WHO WHO is to receive training? • Those formally named as evaluators? • Those with formal responsibility for doing evaluation? • Those who will commission or require evaluation? • Those who will use evaluation (eg program managers, policy makers)? • Citizens and citizen advocates?

  43. Training and professional development - HOW HOW will training be undertaken? • Timing – before working in evaluation, or as ongoing professional development? • Duration – a few days, a few weeks, a few years? • Intensity – concentrated, weekly, annually, “sandwich”? • Method – face to face, distance (email, webinars, teleconference, videoconference), self-paced? • Level – short course, certificate, graduate program (Master’s, Graduate Diploma, PhD)? • Customisation – generic, sector-specific, organisation-specific?

  44. Training and professional development - WHAT WHAT will training cover? • An integrated package – or a specific topic? • Methods for identifying the type of M & E required and Key Evaluation Questions? • Evaluation designs? • Specific types or a range? • Methods of data collection? • Specific types or a range – especially mixed qualitative and quantitative? • Methods of data analysis? • Specific types or a range? • Focus on critical thinking? • Approaches to reporting and supporting use? • Managing evaluation – including participatory processes? • Identifying and including existing skills and knowledge?

  45. Example of suggested evaluation competencies (Stevahn et al, 2006)

  46. Training and professional development – Short course examples • University of Zambia M & E course • IPDET (International Program for Development Evaluation Training), Independent Evaluation Group of the World Bank and Carleton University, http://www.ipdet.org • CDC (Centers for Disease Control) Summer Institute, USA, http://www.eval.org/SummerInstitute/06SIhome.asp • The Evaluators Institute, San Francisco, Chicago, Washington DC, USA, www.evaluatorsinstitute.com • CDRA (Community Development Resource Association), Developmental Planning, Monitoring, Evaluation and Reporting, Cape Town, South Africa, www.cdra.org.za • Pre-conference workshops: AfrEA – African Evaluation Association, www.afrea.org; AEA – American Evaluation Association, www.eval.org; SAMEA – South African Monitoring and Evaluation Association, www.samea.org.za; AES – Australasian Evaluation Society, www.aes.asn.au; EES – European Evaluation Society, www.europeanevaluation.org; CES – Canadian Evaluation Society, www.evaluationcanada.ca; UKES – United Kingdom Evaluation Society, www.evaluation.org.uk

  47. Training and professional development – Graduate programs • Centre for Research on Science and Technology, the University of Stellenbosch, Cape Town, South Africa: Postgraduate Diploma in Monitoring and Evaluation Methods. One-year course delivered in intensive mode of face-to-face courses interspersed with self-study. • School of Health Systems and Public Health (SHSPH), the University of Pretoria, South Africa, in collaboration with the MEASURE Evaluation Project: M&E concentration in their Master of Public Health degree program. Courses taught in modules of one to three weeks, with a six-month internship and individual research. • Graduate School of Public & Development Management (P&DM), the University of the Witwatersrand, Johannesburg: Electives on monitoring and evaluation as part of their Masters degree programmes in Public and Development Management as well as in Public Policy. • Centre for Program Evaluation, University of Melbourne, Australia: Masters of Assessment and Evaluation. Available by distance education. www.unimelb.edu.au/cpe • CIRCLE, Royal Melbourne Institute of Technology, Australia: Masters and PhD by research. • Western Michigan University, USA: Interdisciplinary PhD residential coursework program.

  48. Training and professional development – On-line material • Self-paced courses • Manuals • Guidelines

  49. Training and professional development – Key Questions • Who controls the curriculum, accreditation of courses and certification of evaluators? • What are the consequences of this control?
