Data for PRS Monitoring: Institutional and Technical Challenges

Presentation Transcript


  1. Data for PRS Monitoring: Institutional and Technical Challenges

  2. Institutional Preconditions
  • A PRS (Poverty Reduction Strategy) is in place and its objectives have been clearly set
  • Without clear objectives, monitoring effort is wasted: garbage in, garbage out
  • Evaluation can reveal wasted effort
  • Agreement on where monitoring should occur:
    • Are policies in place?
    • Are inputs being provided?
    • Are outputs being achieved?
    • Are outcomes being realized?
  • Define the specifics of what should be monitored and what should be evaluated

  3. Why “M” and “E”?
  Management perspective
  • Oversight and accountability
    • Decisions about future funding based on the results
      • desired effects
      • implementation
    • Assure the public and stakeholders that money is spent appropriately
    • Ensure staff or contractor accountability
  • Learning
    • Enhance learning among stakeholders
  Project/program perspective
  • Design and implementation
    • Feedback of results to improve implementation

  4. Some Definitions: “M”
  • Monitoring is a continuing function that uses systematic collection of data on specified indicators
  • It provides management with indications of:
    • the extent of progress toward milestones and targets (results, e.g. the MDGs) – see the sketch after this slide
    • achievement of outputs and outcomes
    • use of funds
  • It is generally comprehensive for, and conducted by, the business unit
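
To make the “M” definition concrete, here is a minimal sketch of an indicator record with a progress check against its target. It is a hypothetical illustration in Python: the class and field names (Indicator, baseline, target, and so on) are assumptions, not something the presentation specifies.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One monitored indicator: a hypothetical structure reflecting the
    slide's idea of systematic data collection on specified indicators."""
    name: str
    baseline: float                      # value at the start of the PRS period
    target: float                        # milestone or MDG-style target value
    observations: dict[str, float] = field(default_factory=dict)  # period -> value

    def record(self, period: str, value: float) -> None:
        """Systematic collection: store the value observed in a period."""
        self.observations[period] = value

    def progress(self) -> float:
        """Fraction of the baseline-to-target distance covered so far."""
        if not self.observations:
            return 0.0
        latest = self.observations[max(self.observations)]   # most recent period
        span = self.target - self.baseline
        return (latest - self.baseline) / span if span else 1.0

# Invented example: net primary enrolment moving from a 70% baseline to 100%.
enrolment = Indicator("net primary enrolment (%)", baseline=70.0, target=100.0)
enrolment.record("2003", 76.0)
enrolment.record("2004", 82.0)
print(f"{enrolment.progress():.0%} of the way to the target")   # -> 40%
```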

  5. Some Definitions: “E”
  • Evaluation is a periodic function that uses systematic assessment of the intended and unintended consequences of a specific intervention
  • It provides management and the main stakeholders with indications of:
    • quality, effectiveness, efficiency
    • attribution of effects to the intervention – see the sketch after this slide
    • key information for decision-making
  • It is conducted by independent evaluators
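
Attribution is the hardest item on this slide. One standard technique, not named in the presentation but commonly used by evaluators, is a difference-in-differences comparison between areas covered by the intervention and comparable areas that were not. The sketch below uses invented numbers purely for illustration.

```python
# Difference-in-differences: attribute a change to the intervention by
# subtracting out the background trend observed in a comparison group.
# All figures here are invented for illustration.

# Mean outcome (e.g. household consumption) before and after the intervention.
treated_before, treated_after = 100.0, 130.0   # districts covered by the programme
control_before, control_after = 100.0, 115.0   # comparable uncovered districts

treated_change = treated_after - treated_before   # 30.0 where the programme ran
control_change = control_after - control_before   # 15.0 background trend

# The effect attributable to the intervention is the excess change.
attributable_effect = treated_change - control_change
print(f"Effect attributable to the intervention: {attributable_effect}")   # 15.0
```

This only identifies the intervention’s effect if both groups would otherwise have followed the same trend; a real evaluation must argue for that assumption.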

  6. Defining the Scope
  • Purpose – how will M&E data be used? Accountability? Planning? Learning? Better implementation? Budgeting?
  • Prioritization – what data are important to collect?
  • Organization – who will collect the data?
  • Periodicity – when will data be collected?
  • Reporting – how will the data be turned into usable information?
  • Costs – what are the cost implications for data collection and use?
  • Relevance, reliability, and validity of data – how will they be ensured?
  • Coordination of data – what are the mechanisms for interagency collaboration?
  (A sketch of these scope decisions captured as a single structure follows.)
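
The scope questions above can be answered in one structured record, for example a small configuration object. This is a hypothetical sketch: the field names mirror the slide’s bullets and every value is invented.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MonitoringScope:
    """One record answering the slide's scope questions (hypothetical)."""
    purpose: tuple[str, ...]              # how the M&E data will be used
    priority_indicators: tuple[str, ...]  # what data are important to collect
    collecting_agency: str                # who will collect the data
    periodicity: str                      # when data will be collected
    reporting_product: str                # how data become usable information
    annual_cost_usd: float                # cost implications of collection and use
    coordination_mechanism: str           # interagency collaboration arrangement

scope = MonitoringScope(
    purpose=("accountability", "budgeting"),
    priority_indicators=("net primary enrolment (%)", "under-5 mortality"),
    collecting_agency="national statistics office",
    periodicity="annual",
    reporting_product="PRS annual progress report",
    annual_cost_usd=250_000.0,
    coordination_mechanism="interagency M&E working group",
)
```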

  7. Being “SMART” About Monitoring Indicators
  • Specific
  • Measurable
  • Achievable
  • Relevant
  • Timely
  Beyond SMART, indicators should also be:
  • Valid – they measure what they set out to measure
  • Reliable – different people/agencies must produce the same statistic (see the sketch after this slide)
  • Cost-effective
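
Reliability, in the slide’s sense, can be checked mechanically: two agencies producing the same statistic independently should agree within a small tolerance. The function, the 2% threshold, and the data below are all assumptions made for illustration.

```python
import math

def reliable(series_a: dict[str, float], series_b: dict[str, float],
             rel_tol: float = 0.02) -> bool:
    """True if two independently produced versions of the same statistic
    agree within rel_tol on every common period (hypothetical threshold)."""
    common = series_a.keys() & series_b.keys()
    return bool(common) and all(
        math.isclose(series_a[p], series_b[p], rel_tol=rel_tol) for p in common
    )

# Invented example: the statistics office and the education ministry both
# report enrolment; small rounding differences pass, large gaps would not.
nso = {"2003": 76.0, "2004": 82.0}
ministry = {"2003": 76.5, "2004": 81.4}
print(reliable(nso, ministry))   # True
```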
