The Presidency: Department of Performance Monitoring and Evaluation
Draft National Evaluation Policy Framework
Provincial M&E Forum, 18 August 2011
Process on the Framework
• Evaluation as a core part of the GWM&ES
• More emphasis to date on monitoring
• Draft policy framework produced 6 months ago
• Study tour to Mexico/Colombia/US focusing on evaluation (with DBE/DSD/OPSC/GCIS)
• Writeshop with the same departments plus GP (FS also invited)
• Draft framework developed together – DPME edited
• Comments by 31 August
Structure of the Framework
Part A: Introduction
1 Background
2 Why evaluate?
3 Approach to evaluation
Part B: How we do evaluation
4 Uses and types of evaluations
5 Assuring credible and quality evaluations
6 The process of evaluation
7 Assuring follow-up
Part C: How we make this happen
8 Institutionalising evaluation in the Government system
9 Management and coordination of evaluation across government
1 Background
Challenges:
• Lack of clear policy and strategic direction around evaluation;
• A need to promote the use of knowledge from both evaluation and research;
• Confusion about what constitutes evaluation, performance auditing, research, etc.;
• Evaluation work exists but is not necessarily known within departments or externally;
• Lack of coordination between organisations and fragmentation of approaches;
• Inadequate use of evaluation, leading to a perception that it is a luxury, and a lack of institutionalisation.
Problem: evaluation is applied sporadically and does not sufficiently inform planning, policy-making and budgeting, so we are missing the opportunity to improve Government's effectiveness, efficiency and impact.
Focus of document
Focus of this policy framework:
• A common language and conceptual base for evaluation in Government;
• An institutionalised system across Government linking to planning and budgeting;
• Clear roles and responsibilities;
• Improved quality of evaluations;
• Utilisation of evaluation findings to improve performance.
Target group:
• Political principals and senior managers in the public sector, who must improve their performance and incorporate evaluation into what they do;
• Other actors who need to be involved in the evaluation process, such as potential evaluators (including academics and other service providers);
• Training institutions, which will have to ensure that public servants understand evaluation and that there is a wider cadre of potential evaluators with the required skills and competences.
2 Why evaluate?
• Judging the merit or worth of something: was the programme successful? Was it effective? Did the intended beneficiaries receive the intervention? Did it impact on their lives?
• Improving policy or programme performance (evaluation for learning): this aims to provide feedback to programme managers. Questions could be: was this the right intervention for this objective, was it the right mix of outputs, what is the most effective way to do X?
• Evaluation for improving accountability: where is public spending going? Is this spending making a difference?
• Evaluation for generating knowledge (for research): increasing knowledge about what works and what does not with regard to a public policy, programme, function or organisation.
3 Approach to evaluation
For this Evaluation Policy Framework, evaluation is defined as:
• The systematic collection and objective analysis of evidence on public policies, programmes, projects, functions and organisations, to assess issues such as relevance, performance (effectiveness and efficiency) and value for money, and to recommend ways forward.
It is differentiated from monitoring:
• Monitoring involves the continuous collecting, analysing and reporting of data in a way that supports effective management. Monitoring aims to provide managers with regular feedback on progress in implementation and results, and early indicators of problems that need to be corrected. It usually reports on actual performance against what was planned or expected (adapted from the Policy Framework on the GWM&ES).
Priority for existing programmes/policies
Evaluations should be prioritised for programmes/policies that are:
• Large (eg over R500 million – figure to be confirmed) or covering a large proportion of the population, and which have not had a major evaluation for 5 years. This figure can diminish with time;
• Of strategic importance, and for which it is important that they succeed. If these have not been evaluated for 3 years or more, an implementation evaluation should be undertaken;
• Innovative, from which lessons are needed – in which case an implementation evaluation should be conducted;
• Of significant public interest – eg key front-line services.
In addition, any programme for which there are real concerns about its design should have a design evaluation conducted.
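As an informal illustration (not part of the draft framework), the prioritisation criteria above can be read as a small set of checks applied to each programme when compiling an evaluation plan. The sketch below encodes them in Python. The Programme record, its field names and the mapping of each criterion to an evaluation type are assumptions made for illustration only; the thresholds (over R500 million, 5 years without a major evaluation, 3 years for strategically important programmes) are taken from the criteria listed above.

```python
from dataclasses import dataclass

@dataclass
class Programme:
    """Hypothetical record of a government programme (illustrative only)."""
    name: str
    budget_rand: float               # annual budget in Rand
    covers_large_population: bool
    years_since_last_evaluation: float
    strategically_important: bool
    innovative: bool
    significant_public_interest: bool
    design_concerns: bool

def suggested_evaluations(p: Programme, large_budget_threshold: float = 500e6) -> list[str]:
    """Return the evaluation types the draft criteria would suggest for a programme.

    The thresholds mirror the framework's indicative figures; the mapping of
    criteria to evaluation types is an assumption for illustration.
    """
    suggestions = []
    # Large programmes (by budget or coverage) with no major evaluation in 5 years
    if (p.budget_rand > large_budget_threshold or p.covers_large_population) \
            and p.years_since_last_evaluation >= 5:
        suggestions.append("major evaluation")
    # Strategically important programmes not evaluated for 3 years or more
    if p.strategically_important and p.years_since_last_evaluation >= 3:
        suggestions.append("implementation evaluation")
    # Innovative programmes from which lessons are needed
    if p.innovative:
        suggestions.append("implementation evaluation")
    # Programmes of significant public interest, eg key front-line services
    if p.significant_public_interest:
        suggestions.append("evaluation (public interest)")
    # Real concerns about programme design
    if p.design_concerns:
        suggestions.append("design evaluation")
    return sorted(set(suggestions))

# Example: a large, innovative programme last evaluated 6 years ago
# suggested_evaluations(Programme("X", 600e6, False, 6, False, True, False, False))
# -> ['implementation evaluation', 'major evaluation']
```

The point of the sketch is only that the criteria overlap rather than exclude one another: a single programme can trigger more than one type of evaluation.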
Internal/external
• Balancing ownership and credibility
8 Institutionalising evaluation
• Legal framework
• Evaluation plan: a 3-year and annual evaluation plan developed by DPME (with partners), starting with 2012/13. It specifies from a national perspective what needs to be done; government institutions can choose to do additional evaluations.
• Roles and responsibilities: departments and public institutions have a responsibility to incorporate evaluation into their management functions as a way to continuously improve their performance. They need to:
  • Ensure there is an evaluation budget in all programmes (see 8.4) and a plan over 3-5 years for which evaluations will be undertaken, and the form of evaluation;
  • Ensure there are specific structures within the organisation entrusted with the evaluation role, and with the required skills. This could be an M&E unit, a research unit or a policy unit;
  • Ensure that the results of evaluations are used to inform planning and budget decisions, as well as general decision-making processes. The results of evaluations must therefore be discussed in management forums and used to guide decision-making.
Other roles and responsibilities
• DPME: the custodian of the evaluation function in Government. This includes standard setting, pooling of knowledge, quality assurance, capacity building and technical assistance, and promotion.
• National Treasury: assures value for money when allocating budgets. It should see that plans and budgets are informed by evidence, including from evaluations, and ensure that cost-effectiveness analyses are undertaken and suitable methodologies employed.
• DPSA: sees that the results of evaluations which raise questions about the performance or structure of the public service are addressed.
• OPSC: has a specific independent role in the evaluation process, reporting directly to Parliament, and is a source of expertise in helping to build the evaluation system.
• Auditor-General: an independent body and an important player through its role in performance audit.
• PALAMA: responsible for developing capacity-building programmes around M&E across government.
• Universities: provide tertiary education that includes evaluation, as well as skills development; supply many of the evaluators, particularly where sophisticated research methodologies are needed; and undertake research which is closely allied to evaluation and can help to inform research processes.
• SAMEA (the South African M&E Association): supports the development of systems and capacities, and is an important forum for learning and sharing.
Other issues
• Budgeting: 1-5% of programme budgets for evaluation
• Using standardised systems
• Donor-funded evaluations following the government system
• Optimising limited capacity:
  • Technical capacity in DPME to support departments on methodology and quality;
  • Outsourcing of evaluations to external evaluators using an accredited panel;
  • Training using short courses, including through PALAMA, universities and private consultants;
  • Building on international partnerships with similar countries (eg Mexico and Colombia) and international organisations, eg 3ie or the World Bank.
9 Management
• Champion: DPME, with a specific technical unit created to provide support
• Evaluation Working Group to build on strengths in government and ensure commitment across government, including provincial experts
• OoPs (Offices of the Premier) to provide leadership and coordination at provincial level