Institutional Arrangements for PRS Monitoring: Lessons from Experience
Markus Goldstein, Poverty Reduction Group
From Bedi, Coudouel, Cox, Goldstein and Thornton (2006), “Beyond the Numbers: Understanding the Institutions for Monitoring Poverty Reduction Strategies”, World Bank
Content
• Expectations and realities
• Organizing monitoring activities
• Making use of PRS monitoring
• Organizing participation
1. Expectations (and realities)
• Objectives of a poverty monitoring system:
  • Supports decision-making
  • Supports accountability to the public
  • Promotes evidence-based dialogue
  • Supports reporting to donors for their own accountability
• Functions of the PRS-MS:
  • Poverty monitoring
  • PRS implementation monitoring
  • Expenditure tracking
• Focus on the entire results chain, linking the various elements
1. Expectations (and realities)
The PRS-MS mainly has institutional functions:
• Coordinating actors (not duplicating)
• Developing a set of indicators and targets
• Building capacity where it is deficient
• Organizing information flows
• Compiling data
• Linking the elements of the results chain
• Organizing analysis and evaluation
• Generating reports
• Disseminating findings
• Organizing the participation of civil society
1. (Expectations and) realities
• Modest achievements: few systems have established functioning links between monitoring and decision-making
• Common obstacles:
  • Practical issues with data collection, especially routine administrative data
  • Difficulties in coordination, duplication, redundancies
  • “Turf battles”
  • No incentives to participate (and relinquish space)
  • Formal plans are not translated into actual practice
1. (Expectations and) realities
• Common obstacles (cont.):
  • Shortcomings in the PRSs themselves:
    • Lack of operational detail
    • Lack of costing
    • Lack of prioritization
    • Inadequate indicators and targets
  • Deficit in evaluation and analysis
  • Limited budget planning and PEM systems
  • Weak demand (interest?) from decision-makers
  • Donor requirements typically not aligned
2. Organizing monitoring activities
• Usually, formal plans exist… but are not implemented
• The problem may lie in the design process:
  • Often narrow: some stocktaking, short consultations, design (by a consultant?)… no stakeholder analysis, no real participation
  • Details of the system not worked out – roles, responsibilities, standards, modalities for cooperation
  • Limited buy-in from actors
  • Limited accountability or compliance
• Systems are consensual in nature and function only if participants find them useful and legitimate
• Without a common purpose, formal obligations don’t work
Need a more organic design and a common commitment
2. Organizing monitoring activities
Common building blocks:
• Steering Committee: political support and oversight
• Coordination Unit or Secretariat: convening meetings, managing processes, compiling data, drafting reports
• Inter-agency committees and working groups: promote dialogue, inclusive membership, debate results
• National Statistics Institute: key data producer, plus a normative and technical-assistance role
• Line ministries: liaison point (M&E Unit or individual)
Key issues are relationships and modalities
2. Organizing monitoring activities
Lessons/considerations:
• Leadership
• Coordination
• Liaison with line ministries
• Role of national statistical agencies
• Involving local governments
2.1. Leadership
• The choice of institutional lead is critical
• The lead should be close to the center of government and the budget process
• Range of locations:
  • Ministry of Finance (Mali, Niger, Uganda): close to the budget
  • Ministry of Planning (Malawi, Mauritania): better analysis
  • Office of the (Vice-)President (Tanzania): greater authority
• Leadership is more effective when vested in a single agency rather than an inter-agency committee
• A champion is important, but there is a danger that the system becomes tied to a personality
• In any case, leadership may need to change over time, so flexibility is needed
2.2. Coordination – the greatest challenge
• Typically a series of inter-agency committees (13 in Mali), but:
  • The committee system is often over-elaborate
  • Committees run out of steam
  • Incentives work against coordination
  • Recommendations are often not concrete
• Technical secretariats typically suffer from high turnover and limited resources and skills
Avoid burdensome structures; build working relationships
An effective secretariat is key to organizing dialogue, working through the issues, and assisting its members
Process, advocacy, and political leadership are critical
Donors can:
• Limit parallel demands, which create the wrong incentives
• Support the system by providing incentives
2.3. Liaison with line ministries
• Most PRS-MSs are “second-tier” systems: they rely on routine data from line ministries
• Usually there is a “liaison person” in each ministry, but often without the authority, time, or incentives to play that role effectively
• The quality of sectoral data is often an issue
• Project- and donor-specific reporting often takes precedence
• Promote monitoring within line ministries (for their own management purposes)
• Change incentives (and build capacity)
• Choose liaison persons with a higher profile
• Align PRS-MS requirements with sectoral information systems
• Donors should align their reporting requirements
2.4. Role of statistical agencies
Often the most institutionally advanced element of the PRS-MS, but issues remain:
1: PRS-MS arrangements sometimes duplicate existing statistical structures (master plan). Potential rivalry between the statistical system and the PRS-MS. Limited links between the central agency and line ministries.
Ensure complementarity with existing systems and plans
2: The agency’s role in setting standards, technical assistance, and capacity building is often not fully played. Survey and administrative data are often not compatible.
Funding mechanisms should leave space for this role. Donors should move away from supporting activities, toward supporting plans.
3: Existing data are typically not fully utilized outside the central agency.
More dissemination; more training and statistical literacy
2.5. Involving local governments
• Communication within a sector is often an issue
• Incentives differ with the degree of decentralization
• Limited capacity (and numerous reporting obligations)
• No “best practice” examples
Limit indicators to reduce the burden (make it easier to comply)
Central quality-control mechanisms
Support and capacity building
Provide feedback to the local level
Build on local civil society (?)
Encourage local accountability (dissemination)
Options:
• Decentralized monitoring (e.g. Uganda, linked to a grant mechanism)
• Central monitoring of local governments (when local capacity is too low)
3. Making use of PRS monitoring
In addition to organizing the supply of data, the PRS-MS must build demand
• Establish linkages with entry points in decision-making processes:
  • Budget
  • MTEF
  • Planning
  • Review/update of the PRS
  • Parliamentary sessions
  • Public dialogue
  • Donor strategies and operations
These processes lie outside the PRS-MS, but they should guide its activities:
• Analysis and evaluation
• Outputs and dissemination
• Linking PRS monitoring and the budget
• Role of parliament
3.1. Analysis and evaluation
• Analysis is key to the effective use of data
• An area of great deficit:
  • Lack of capacity
  • Lack of incentives (weak accountability)
  • Focus on APR production, without much analytical content
• Often a dedicated analytical unit (e.g. Tanzania, Uganda):
  • Works when close to government
  • Works when focused only on analysis
  • Issue of funding and sustainability
• Need greater capacity (and incentives) in sectoral agencies
• Option: joint work with donors (e.g. PERs)
3.2. Outputs and dissemination
• Information must be disseminated to have an impact
• Within government: pushing information back to
  • central agencies
  • local and regional governments
  • service providers
• Outside government:
  • Parliament
  • Media and the general public
  • Donors, etc.
• Outputs are often not accessible, and the main focus is often donors
• Ensure the right format and content for users, including the public
• Ensure the right timing for key moments
• Develop a dissemination strategy
3.3. Linking with budget/planning
The most likely incentive for evidence-based policy-making
• In practice, the link is often weak
• Experience to date:
  • Requirements in the rules for budget preparation (usually in countries with an MTEF – Uganda, Tanzania)
  • A challenge function around budget preparation
  • The ability to “sanction” is often limited
• Caution:
  • Results can take time or can be due to exogenous factors
  • Link funds to the ability to monitor or to the ability to deliver?
  • Incentives to mis-report?
  • Incentives to under-commit?
• Difficult to operationalize; depends on the maturity of the MTEF and PEM systems
• Donors should strengthen the budget process rather than bypass it (wrong incentives)
3.4. Links with parliament
• Relatively low participation in the PRS process in most countries
• A missed opportunity for the oversight function
• Low capacity of committees for analysis
• Low resources
Capacity building, economic literacy, committees
4. Organizing participation
Participation belongs to both the supply side and the demand side:
• A means to strengthen the PRS-MS (producer)
• A means to increase accountability (user)
Experience varies greatly
• Issues of capacity and representativeness
• Forms of participation:
  • Carrying out monitoring activities (including “action-oriented” monitoring)
  • Participating in PRS-MS structures
  • Analyzing and providing policy advice
  • Disseminating information
• Typically, participation is not very formalized
Further lessons from experience
• We asked staff in PRS units or national statistics agencies (with responsibility for poverty monitoring): What are the main barriers you see to getting data effectively used in your country?
Main issues from Sub-Saharan Africa
• Political will/leadership (29%)
• Capacity building, local and central (19%)
• Coordination at the central level (13%)
• Coordination between central and local levels (13%)
• M&E link to the budget (10%)
• M&E budget (9%)
• Legislation/regulation (4%)
• Engagement with civil society (2%)
Issues faced in the Balkans
• Lack of capacity within the statistics agency (22%)
• Coordination between central and local levels (18%)
• Coordination at the central level (17%)
• Political will/leadership (9%)
• Inadequate budget (9%)
• Missing census/data quality (9%)
• Uneducated users (8%)
• Overly technical dissemination (3%)
• Legislation (3%)
• Data access (1%)
Conclusions
• Do not start from a blank slate… build on what exists
• Change won’t happen overnight… aim for gradual improvement
• The goal is not an ideal system… but a process of change
• The context evolves… build flexible arrangements
• Focus on relations, incentives, and activities
• Demand needs to be stimulated… identify entry points
• Users differ and need different formats and content
• Donors can support or distort…
Thank you!