This presentation explores why development projects should focus on results and offers insights on what results mean, the characteristics of effective monitoring systems, the risks of results management, and actions that can improve results.
Why and How Should We Focus on Results? Susan Stout, Manager, Results Secretariat, OPCS. November 2006
Topics for Today
• Why is it important to focus on results?
• What do we mean by results?
• Characteristics of an effective monitoring system
• Risks in results management
Why is it important to focus on Results?
• Knowledge and information are power
  – For implementing agencies and for consumers/recipients
• Countries are in the driver's seat; results provide the steering wheel
• The goal is to improve the quality of management and decision making
  – When accountability to the client/consumer is strong, results are strong
• Learning is fundamental to 'scaling up' and to easing absorptive-capacity constraints
• Money is neither useful nor used without ideas and data on results
• Donors are more concerned with accountability and results than ever
  – The Results Agenda
  – IDA 13 and IDA 14 commitments
The Results Agenda: encouraging Managing for Development Results at all levels
• Country Pillar
  – Encourage countries to focus on results
  – Encourage the use of donor resources to strengthen results management at the country level
  – Strengthen statistical capacity
  – Work with other donors and country partners to help countries better link resources to results (results/resources meetings replace the traditional Consultative Group)
• Bank Pillar
  – Strengthen Bank capacity to report on and learn from results
  – Strengthen the focus on results in country programming and portfolio management (lending and AAA)
  – Use Bank capacities and instruments to strengthen country capacity to collect and use information on results
• Global Pillar
  – Collaborate with efforts to improve aid effectiveness across donor agencies
  – MDB Working Group on MfDR
  – OECD/DAC Joint Venture on MfDR
  – Biannual international meetings hosted by the Bank
[Slide diagram: the agenda spans all levels, from clients/consumers 'on the ground', civil society and the private sector, and local, state/provincial, and central government, up to Bank management, the Regions/Networks, and shareholders/donors.]
Some general lessons on 'Results'
• "M and E" is recognized as strategic, but donors are better at saying it is important than at explaining how to do it
• "Indicatory": indicators are only part of the problem; put them in a decision-making context
• Incentives matter! Distinguish and balance "M and E" for reporting and for managing, and use it to guide budgeting and planning
• Focus on who is doing the learning: not just reporting to donors or to the national level
• The goal is to improve the quality, relevance, and effectiveness of project implementation and to create added value
What do we mean by ‘Results’ ? • Results -- sustainable improvements in country outcomes • If the intervention is successful, what will be ‘the difference’ for the primary target group ? Examples -- • Children are learning more • Municipalities are more efficient • Firms are earning more • HIV transmission from IDU to general population reduced • Managing for results -- using information to improve decision-making and steer country-led development processes toward clearly defined goals • The ‘art’ of Results Management is defining outcomes that are meaningful to BOTH provider and client/consumer, are measurable in a credible way and are used in decision making
What are the characteristics of an effective monitoring system?
• It is more than a list of 'indicators'!
• Clarity on who is going to use the information, and for what kinds of decision making
• The system is 'usable' at each level of decision making: not every level needs every indicator!
• The system is 'operational': clear on who is to collect and report what data, by when, and to whom
• Where capacities for monitoring and evaluation are limited, how will they be established? Improved capacity to collect and use information is itself a 'result'
Characteristics (2)
• Use simplicity and common sense as guides: better to do a few things well than all things perfectly
• Keep a strong link between what is measured and the desired outcome
• Some project outcomes are 'inputs' to others; don't try to solve all problems in one project
• Qualitative data (e.g. on organizational/institutional change) are as important as quantitative data
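As an illustrative aside that is not part of the original presentation: the 'operational' requirement above (being clear on who collects and reports which data, by when, and to whom) can be pictured as a simple indicator registry. The sketch below is a minimal, hypothetical example in Python; all field names and the sample entry are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One entry in a hypothetical monitoring-plan registry."""
    name: str            # what is measured
    linked_outcome: str  # the desired result the indicator speaks to
    collected_by: str    # who gathers and reports the data
    reported_to: str     # who receives it and uses it for decisions
    frequency: str       # how often it is collected and reported
    decision_use: str    # the planning or budgeting decision it informs

# Keep the registry deliberately short: better to track a few indicators
# well than to build a 'measurement bureaucracy'.
registry = [
    Indicator(
        name="Grade 5 learning assessment score",
        linked_outcome="Children are learning more",
        collected_by="District education office",
        reported_to="Ministry of Education planning unit",
        frequency="annual",
        decision_use="Allocation of the teacher-training budget",
    ),
]

def not_yet_operational(indicators):
    """Return names of indicators that still lack a collector or a user."""
    return [i.name for i in indicators if not i.collected_by or not i.reported_to]

print(not_yet_operational(registry))  # prints [] once every indicator has both
```

The only point of the sketch is that every indicator carries an owner, a user, and a decision it informs, echoing the 'few things well' principle above.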
Results Cycle and Project Cycle
• The Results Cycle (RC) focuses on the country program as a whole, as a portfolio of activities, and feeds into global learning; it has recurrent stages and many simultaneous processes.
• Projects are embedded in the RC; the project cycle has clear-cut stages and processes.
Risks in Results Management
• Defining results without participation and ownership
• Over-promising on results: trying to deliver more than is feasible
• Creating a 'measurement bureaucracy'
• If what gets measured gets done, be sure about the validity of the measures: volume does not equal effectiveness
• Use independence and verification to reduce upward bias
How can Bank staff help?
• Drive the design according to results: M and E is not an 'end-of-appraisal add-on'
• Focus on defining a PDO (project development objective) that
  – is clear on who or what is the target group for the intervention
  – is clear on what problem or behavior of this group will be changed
• Recognize the role of incentives when designing systems
  – Distinguish and balance "M and E" for reporting and for managing/learning
  – Who needs to learn about what is working? Who wins what if the indicator changes?
• Make sure there is capacity and time to collect and communicate the data
Some closing thoughts
• The most often cited reasons for weak 'M and E':
  – "Nobody cares": remember that the ultimate consumer/target group does. Don't approach this as a problem of Bank accountability, but as a way to motivate greater responsiveness.
  – "No time, no resources to design and oversee": remember that fiduciary responsibilities are means to an end; results management is about making sure the end is in sight.
• Encourage comparisons of results achieved across implementing agencies: peer pressure works
• Focus on defining and reporting on achievements, not just failures