Understand the institutional and technical challenges of implementing a PRS monitoring system. Learn how to define what should be monitored and what should be evaluated, why both "M" and "E" are essential, and how to ensure accountability and learning. Explore the importance of being SMART with monitoring indicators to maximize effectiveness and reliability.
Data for PRS Monitoring: Institutional and Technical Challenges
Institutional Preconditions
• A PRS strategy is in place and objectives have been clearly set; without clear objectives, monitoring effort is wasted (garbage in, garbage out)
• Evaluation can reveal wasted effort
• Agreement on where monitoring should occur:
  • Are policies in place?
  • Are inputs being provided?
  • Are outputs being achieved?
  • Are outcomes being realized?
• Define the specifics of what should be monitored and what should be evaluated
Why "M" and "E"?
Management perspective
• Oversight and accountability
  • Decisions about future funding based on the results (desired effects, implementation)
  • Assure the public and stakeholders that money is spent appropriately
  • Ensure staff or contractor accountability
• Learning
  • Enhance learning among stakeholders
Project/program perspective
• Design and implementation
  • Feed results back to improve implementation
Some Definitions: "M"
• A continuing function that uses systematic collection of data on specified indicators
• Provides management with indications of:
  • the extent of progress toward milestones and targets (results, e.g. the MDGs)
  • achievement of outputs and outcomes
  • use of funds
• Generally comprehensive for, and conducted by, the business unit
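The "extent of progress toward milestones and targets" mentioned above is, at its core, a simple calculation against a baseline. The sketch below illustrates it; the indicator, baseline, and target values are hypothetical examples, not figures from the source.

```python
# Hypothetical sketch: measuring progress toward a monitoring target.
# The enrolment figures below are illustrative, not from the source.

def progress_toward_target(baseline: float, current: float, target: float) -> float:
    """Return progress as a fraction of the baseline-to-target distance."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Example: net enrolment rose from 60% to 75% against a 90% target,
# i.e. half of the remaining distance to the target has been covered.
print(round(progress_toward_target(60.0, 75.0, 90.0), 2))  # 0.5
```

Reporting progress as a fraction of the baseline-to-target distance, rather than as a raw level, makes indicators with different units comparable in a single progress report.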
Some Definitions: "E"
• A periodic function that uses systematic assessment of the intended and unintended consequences of a specific intervention
• Provides management and the main stakeholders with indications of:
  • quality, effectiveness, efficiency
  • attribution of effects to the intervention
  • key information for decision-making
• Conducted by independent evaluators
Defining the Scope
• Purpose: how will M&E data be used? Accountability? Planning? Learning? Better implementation? Budgeting?
• Prioritization: what data are important to collect?
• Organization: who will collect the data?
• Periodicity: when will data be collected?
• Reporting: how will the data be turned into usable information?
• Costs: what are the cost implications for data collection and use?
• Relevance, reliability, and validity of data: how will they be ensured?
• Coordination of data: what are the mechanisms for interagency collaboration?
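The scope questions above can be treated as a structured checklist: each dimension becomes a field that must be filled in before monitoring begins. The sketch below is one way to record such decisions; all field names and example values are illustrative assumptions, not part of the source.

```python
# Hypothetical sketch: recording M&E scope decisions as a structured record.
# Field names and example values are illustrative, not from the source.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MonitoringScope:
    purpose: List[str]              # accountability, planning, learning, budgeting
    priority_data: List[str]        # what is important to collect
    collecting_agency: str          # who will collect the data
    periodicity: str                # when data will be collected
    reporting_format: str           # how data become usable information
    quality_checks: List[str]       # relevance, reliability, validity safeguards
    coordination_mechanism: str     # interagency collaboration arrangement

scope = MonitoringScope(
    purpose=["accountability", "learning"],
    priority_data=["net enrolment rate", "health expenditure share"],
    collecting_agency="national statistics office",
    periodicity="quarterly",
    reporting_format="annual PRS progress report",
    quality_checks=["validity review", "reliability audit"],
    coordination_mechanism="interagency M&E working group",
)
print(scope.periodicity)  # quarterly
```

Writing the scope down as a record like this makes gaps visible early: a field that cannot be filled in signals an unresolved institutional question.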
Being "SMART" About Monitoring Indicators
• Specific
• Measurable
• Achievable
• Relevant
• Timely
Indicators should also be:
• Valid: measure what they set out to measure
• Reliable: different people/agencies must produce the same statistic
• Cost-effective
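The SMART criteria above can be applied as a simple screening step when a candidate indicator is proposed. The sketch below shows one possible screening; the field names and the mapping from fields to criteria are illustrative assumptions, not an official checklist.

```python
# Hypothetical sketch: a minimal SMART screening for a candidate indicator.
# Field names and the field-to-criterion mapping are illustrative assumptions.

def smart_check(indicator: dict) -> list:
    """Return the SMART criteria the candidate indicator fails to document."""
    criteria = {
        "specific":   bool(indicator.get("definition")),           # clearly defined
        "measurable": indicator.get("unit") is not None,           # has a unit
        "achievable": indicator.get("target") is not None,         # has a target
        "relevant":   bool(indicator.get("linked_objective")),     # tied to a PRS goal
        "timely":     indicator.get("reporting_frequency") is not None,
    }
    return [name for name, ok in criteria.items() if not ok]

candidate = {
    "definition": "net primary enrolment rate",
    "unit": "percent",
    "target": 90,
    "linked_objective": "universal primary education",
    # reporting_frequency is missing, so the indicator fails "timely"
}
print(smart_check(candidate))  # ['timely']
```

A screening like this only checks that each criterion is documented; judging validity, reliability, and cost-effectiveness still requires the substantive review described above.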