
Layers: Adaptations for DP’s M&E needs





Presentation Transcript


  1. Layers: Adaptations for DP’s M&E needs Food and Nutrition Technical Assistance II Project (FANTA-2) Academy for Educational Development 1825 Connecticut Ave., NW Washington, DC 20009 Tel: 202-884-8000 Fax: 202-884-8432 E-mail: fanta2@aed.org Website: www.fanta-2.org

  2. Goal of session • Discussed in this session: • What Layers is • Types of questions asked in Layers • Where Layers is currently being implemented • Also: • Is Layers adaptable to MYAPs’ M&E systems? • It is highly structured to address Mission needs • Can it serve PVO needs? • What issues may be addressed? Outcomes? MTEs?

  3. Session format • Part 1: • Presentation • Q&As • Part 2: • Working group session: potential for use of Layers by MYAPs • Part 3: • Plenary wrap up

  4. What is Layers? • Layers is a monitoring system developed to help USAID Missions fulfill their responsibilities to monitor Food for Peace Act, Title II Programs. • According to DCHA/FFP enabling regulation: “USAID Missions are expected to monitor CS's management of the commodities and use of grant funds. (…) Oversight and monitoring should include regularly scheduled visits to distribution centers and warehouses…”

  5. Typical (pre-Layers) Title II monitoring • Title II sites are periodically visited by USAID staff to verify compliance on commodity storage, disposal and distribution. • If discrepancies are found, the partner is notified. Weaknesses of this approach: • Only commodities monitored, not program activities. • Site selection based on convenience--no systematic sampling. • Only sites visited are assessed; no program-wide monitoring. • Designed for USAID’s use; little helpful feedback to partners. • No methodological guidance to field monitors (sampling, indicators).

  6. What Layers brings: • Allows monitoring of program activities as well as commodity management • Draws from a random sample of sites: results are representative of the entire program • Uses standardized indicators • Shares findings with partners to improve program performance

  7. What is LQAS (Lot Quality Assurance Sampling)? • LQAS is a sampling method developed to control the quality of manufactured goods produced in ‘lots.’ • LQAS takes a small random sample and tests the sample for quality. • The sample tells whether program activities (agriculture, health, etc.) are meeting a performance benchmark or not. • The sample size is chosen so that there is a high probability of correctly determining which indicators in a given activity meet or miss the performance benchmark. • A MYAP may get a “No” on some indicators and a “Yes” on others within the same activity type.

  8. Limitations and words of caution about LQAS and Layers • Layers only shows whether a CS has met the benchmark for an indicator or not (e.g., does it meet a 95% “fail free” standard?). It does not provide point estimates (i.e., “% of sites where the standard is met”). • LQAS is good at accurately identifying a CS that is above the upper benchmark or below the lower benchmark. Classification errors are infrequent but may occur when CS results are “borderline” (fall between the UT and LT). • The planned sample size must be met; otherwise, error rates increase, which may lead to the erroneous conclusion that the CS performs below the benchmark.

  9. Sampling for LQAS • A popular sample size is n=19, but this is not fixed: the final n is determined by the benchmark level, the upper and lower thresholds (UT/LT), and the acceptable error rates. • Calculating the right sample size for an LQAS survey is done with easy-to-use tools found on the FANTA-2 website, at http://www.fantaproject.org/layers/reference.shtml • The tool tells us: • How many sites to sample (e.g., 25) • How many of those sites must receive a “yes” (e.g., 21) in order to meet the benchmark for that indicator (called the “decision rule”) • The alpha and beta error rates, based on the sample size and the upper and lower thresholds
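The binomial arithmetic behind such tools can be sketched in a few lines. The Python below is an illustration, not the FANTA-2 tool itself (function names are invented, and the 80%/50% thresholds are example values): for a fixed sample size n, it picks the decision rule d that minimises the larger of the two LQAS classification errors.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def decision_rule(n, p_upper, p_lower):
    """For a fixed sample size n, choose the decision rule d minimising the
    larger of the two classification errors:
      alpha: P(fewer than d 'yes' sites | true proportion = p_upper)
             -> a CS at the upper threshold is wrongly failed
      beta:  P(at least d 'yes' sites   | true proportion = p_lower)
             -> a CS at the lower threshold is wrongly passed
    Returns (d, alpha, beta).
    """
    best = None
    for d in range(n + 1):
        alpha = binom_cdf(d - 1, n, p_upper)
        beta = 1.0 - binom_cdf(d - 1, n, p_lower)
        if best is None or max(alpha, beta) < max(best[1], best[2]):
            best = (d, alpha, beta)
    return best

d, alpha, beta = decision_rule(19, 0.80, 0.50)
print(d, round(alpha, 3), round(beta, 3))  # → 13 0.068 0.084
```

For the popular n=19 with an 80% upper and 50% lower threshold, this reproduces the familiar decision rule of 13, with both error rates below 10%.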

  10. Indicators Used by Layers • Layers can monitor many aspects of a program to ensure the overall success of the CS in delivering services. • Several modules of indicators/questionnaires already exist. Those are modified and adapted to each country/program. • New indicators can also be created.

  11. Examples of Common Layers Indicators by Type of Activity

  12. Data entry using PDAs Use of PDAs facilitates the work of enumerators • PDAs reduce the likelihood of error • Automatically select the form/module/question to ask • Verify coherence with earlier responses • Minimize coding errors and poor transcription • Eliminate data entry errors • PDAs are economical • No need to print paper forms • Changes to the questionnaires are easily made mid-course • Data entry is eliminated, and info is immediately usable • PDAs are flexible • GPS and other electronic equipment can be added • Reference tools can be included (manuals, calculators, etc.) • Decision-making tools can be included (drill-down, decision/diagnosis trees) BUT • PDAs require programming, specialized training and IT support

  13. Layers Data Analysis • Analysis with LQAS is simple: • Count the number of “yes” responses for each indicator. • Apply the decision rule in each case. • # of “Yes” responses ≥ decision rule: YES – this indicator meets the benchmark • # of “Yes” responses < decision rule: NO - this indicator does not meet the benchmark
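In code, the analysis step reduces to a count and a comparison. A minimal Python sketch (the function name is invented; the 25-site sample and decision rule of 21 are the example figures from the sampling slide, and the per-site results are hypothetical):

```python
def classify(yes_count, decision_rule):
    """Apply the LQAS decision rule to one indicator.
    True  -> indicator meets the benchmark
    False -> indicator does not meet the benchmark
    """
    return yes_count >= decision_rule

# Hypothetical per-site results for one indicator (1 = "yes" at the site)
site_results = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 0,
                1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1]

yes_count = sum(site_results)   # 21 "yes" responses from 25 sampled sites
print(classify(yes_count, 21))  # True: 21 >= 21, benchmark met
print(classify(19, 21))         # False: 19 < 21, benchmark not met
```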

  14. Setting up Layers: what’s involved? • Staff involved: • Mission FFPO • PVO (IT Manager, M&E Manager, COP) • FANTA-2, TANGO • Steps: • Initial planning • Sampling (create TUS, choose sites) • Adapt model (questionnaire by sector, etc.) • Train data collectors • Collect, tabulate and analyze data • Communicate results and recommendations to Awardees • Entire process: ~7 months for the first run; shorter afterward

  15. Timeline for setting up Layers in a new country* *Assumes 1 month of data collection via an external contractor

  16. Final Step: Sample LAYERS Report Letter 1. Findings for commodity warehouses (documentation, management and storage) Successes: • Routine documentation procedures at MCHN sites are followed appropriately. • Storage management is appropriate: storage sites are clean, weatherproof and safe, and the food stored in those sites is kept correctly. Challenges: • The ledgers did not concur with existing inventories at many sites. This is the most serious finding and needs urgent attention. • Minor problems related to ventilation, rodent infestation and use of the warehouse for other purposes were detected in a few sites. The letter ends with a Summary of Recommendations. PVOs are expected to reply and state how they will address the recommendations.

  17. Where is Layers used? • Currently used in Haiti, Guatemala, Madagascar, Ethiopia • Currently undergoing preparation in Chad, Mali, Uganda, Malawi, Bangladesh • Will be progressively implemented in all FFP focus countries, as they turn over to new Development Programs

  18. Extensions of Layers • Outcome Monitoring • Same general approach as Layers • Adds the following dimensions: • Focuses on Outcome level indicators (random sample of target population, not sites) • Implemented in several “supervision areas” • parametric results possible (means, etc) • Identifies areas that need reinforcement • Can feed into other activities (Mid Term Evaluations? Operations Research? Etc)

  19. [Results-chain diagram] INPUTS (human resources, financial resources, physical facilities, equipment, etc.) → PROCESSES (training, setting up systems, supervision of activities, networking) → OUTPUTS (number of mothers trained, number of working cold chains, FP methods distributed) → OUTCOMES (EBF, ORS, immunization rate, CPR) → IMPACTS (better nutrition, reduced morbidity, smaller families). EXTERNAL PROCESSES (rainfall, prices, policies, etc.) act on the whole chain. Inputs through outputs are program-level information tracked by monitoring indicators (yearly measurements); outcomes and impacts are beneficiary-level information tracked by evaluation indicators (BL, MTE, Final Eval). Outcome Monitoring focuses on the outcome-level indicators.

  20. Why outcome and not impact? Relates to the expected size of change between years: outcomes are expected to shift measurably from year to year, while impact-level indicators change too slowly for annual monitoring.

  21. OM Indicators are highly streamlined

  22. Use of PMAs (Program Management Areas) [Diagram: single region sampled with n=21] This design allows us to say whether the region as a whole passes or fails the benchmark.

  23. What PMAs Add [Diagram: five PMA subsamples of n=21 each, pooling to n=105] Dividing the total area into PMAs allows us to: (i) state whether each PMA fulfills the benchmark, which orients corrective action; (ii) compare PMAs to one another to identify “positive deviants”; (iii) generate parameters from the total sample (n=105) to estimate coverage (means, etc.).
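The three uses of PMAs can be sketched in a few lines of Python. The “yes” counts and the decision rule of 16 below are hypothetical, and the pooled confidence interval uses a simple normal approximation that ignores the stratified design:

```python
from math import sqrt

def pma_summary(pma_yes, n_per_pma, rule):
    """Per-PMA pass/fail plus a pooled coverage estimate with a 95% CI.

    pma_yes:   list of 'yes' counts, one per PMA
    n_per_pma: sites sampled in each PMA (e.g. 21)
    rule:      LQAS decision rule applied within each PMA
    """
    passes = [y >= rule for y in pma_yes]          # (i) each PMA vs benchmark
    n_total = n_per_pma * len(pma_yes)             # (iii) pooled sample, e.g. 105
    p = sum(pma_yes) / n_total
    half = 1.96 * sqrt(p * (1 - p) / n_total)      # normal-approximation 95% CI
    return passes, p, (max(0.0, p - half), min(1.0, p + half))

# Hypothetical "yes" counts for 5 PMAs of n=21 each
counts = [18, 15, 19, 12, 17]
passes, coverage, ci = pma_summary(counts, 21, rule=16)
best = counts.index(max(counts))                   # (ii) "positive deviant" PMA
```

Here `passes` flags which PMAs need corrective action, `best` points to the strongest performer to learn from, and `coverage` with its interval is the program-wide point estimate that a single LQAS sample could not provide.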

  24. Sampling design: dictated by key indicators* *Recall: LQAS requires that all samples be completed

  25. Putting the pieces together: Example of Sampling Matrix with 5 PMAs, 5 Subsamples

  26. Part 2: Adapting Layers to Awardees’ needs • How do we adapt these concepts to the needs of T-II Awardees? • Yearly monitoring of outputs? outcomes? • Use in MTEs? • Use in BL/FE? • Use in special studies? • Issues to keep in mind: • Integration between Mission and MYAP data • Why sample? (if PVOs must visit all sites) • Monitor what? (Outputs, Outcomes, Impacts?)

  27. Method for Part 2 • Breakout in working groups. • Each WG discusses a possible use of Layers. Examples of options (others welcome): • Yearly monitoring • Mid-term evaluations • Baseline/Final evaluations • Special studies • Report back in plenary afterward; the audience reviews and critiques each approach.

  28. This presentation is made possible by the generous support of the American people through the support of the Office of Health, Infectious Disease and Nutrition, Bureau for Global Health, and the Office of Food for Peace, Bureau for Democracy, Conflict and Humanitarian Assistance, United States Agency for International Development (USAID) under terms of Cooperative Agreement No. GHN-A-00-08-00001-00, through the Food and Nutrition Technical Assistance II Project (FANTA-2), managed by the Academy for Educational Development (AED). The contents are the responsibility of AED and do not necessarily reflect the views of USAID or the United States Government. Food and Nutrition Technical Assistance II Project (FANTA-2) Academy for Educational Development 1825 Connecticut Ave., NW Washington, DC 20009 Tel: 202-884-8000 Fax: 202-884-8432 E-mail: fanta2@aed.org Website: www.fanta-2.org
