Building New Institutional Capacity in M&E: The Experience of National AIDS Coordinating Authority
V F Kemerer
Getting To Results, IDEAS Global Assembly
Johannesburg, South Africa, 18 March 2009
Overview
• Background
• Use of Assessment Tools for Strategic Planning
• NACA-Specific Capacity-Building Interventions
• Measurement and Determination of Effectiveness and Outcomes
Background
Diagram: the NAC M&E Unit coordinates the national HIV DMRS and leads national M&E system development. Partner agencies (national agencies, GFATM, the UN family, USG and PEPFAR, and miscellaneous bi- and multi-laterals) provide capacity building through technical assistance, using various tools and strategies to establish and/or improve performance.
Strategic Vision: Build effective and sustainable M&E systems to provide comprehensive, high-quality information
Use of Assessment Tools for Strategic Planning
• SI Assessment
• MESST
• DQA/RDQA
SI Assessment Activities
• Map national HIV output indicators and identify data flow
• Outline challenges in the DMRS
• Assess M&E structures
• Stakeholder analysis: National M&E Framework
• Stakeholder analysis: HIV researchers, organisations, and research
Linking SI Capacity and SI Planning with SI Products
Diagram elements: SI systems assessments; unified SI strategic planning and coordinated SI support; building of national SI systems; timely, high-quality strategic information products; ongoing monitoring and feedback.
Use of Assessment Tools for Strategic Planning
• SI Assessment
• MESST
• DQA/RDQA
A New Approach to M&E Assessments: FROM … TO …
• From donor-driven to country-driven
• From a focus exclusively on the capacities of specific silos to integration in the wider M&E systems
• From a general approach to a more systematic approach (based on a checklist of items)
• From strictly an assessment tool to a broader assessment, management, planning and capacity-building tool
How does the M&E Strengthening Tool work?
Step 1: Diagnosis. Diagnosis through completion of a checklist:
• answers to statements/questions
• provision of comments/explanations
Step 2: Analysis. A summary dashboard is automatically generated that:
• displays the overall distribution of answers for each of the checklist's sections
• provides a more detailed analysis of the specific gaps, helping respondents complete the action plan (an illustrative sketch of this aggregation follows below)
Step 3: Action Plan. Development of a costed action plan with:
• identification of strengths and weaknesses
• definition of strengthening measures: responsibility, timeline, funding implications, and TA needs
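The Step 2 dashboard is, at its core, an aggregation over checklist responses. Below is a minimal illustrative sketch in Python, not the actual MESST workbook (which is spreadsheet-based); the section names and answer categories are assumed purely for illustration.

```python
from collections import Counter

# Hypothetical checklist responses as (section, answer) pairs.
# The real MESST defines its own sections and answer scale; these are assumed.
responses = [
    ("M&E Plan", "Yes, completely"),
    ("M&E Plan", "Mostly"),
    ("Data Management Processes", "Partly"),
    ("Data Management Processes", "No, not at all"),
    ("Human Resources", "Mostly"),
]

def dashboard(responses):
    """Summarise the distribution of answers within each checklist section."""
    summary = {}
    for section, answer in responses:
        summary.setdefault(section, Counter())[answer] += 1
    return summary

# Print the per-section distribution so respondents can see where the
# action plan needs strengthening measures.
for section, counts in dashboard(responses).items():
    total = sum(counts.values())
    breakdown = ", ".join(f"{answer}: {n}/{total}" for answer, n in counts.items())
    print(f"{section}: {breakdown}")
```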
Expected Benefits
• Better identify M&E capacity gaps and corresponding strengthening measures, including through technical assistance (TA)
• Guide investments in M&E (within the recommended range of 5%-10% of the overall budget; a worked example follows below)
• Ensure that such investments contribute to the strengthening of the national systems (avoiding parallel reporting systems)
• Enhance the quality of programmatic data to improve program management
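As a quick worked example of the 5%-10% guideline (the programme budget figure below is hypothetical):

```python
def me_budget_range(total_budget, low=0.05, high=0.10):
    """Recommended M&E allocation: 5-10% of the overall programme budget."""
    return total_budget * low, total_budget * high

low, high = me_budget_range(20_000_000)  # hypothetical USD 20M programme
print(f"Recommended M&E investment: USD {low:,.0f} to USD {high:,.0f}")
# -> Recommended M&E investment: USD 1,000,000 to USD 2,000,000
```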
Use of Assessment Tools for Strategic Planning
• SI Assessment
• DDIU Tools
• MESST
• DQA/RDQA
DQA Methodology – 2 Protocols
The methodology for the DQA includes two protocols:
1. Data Verifications (Protocol 1): quantitative comparison of recounted to reported data, and review of the timeliness, completeness, and availability of reports (a worked sketch of this arithmetic follows below).
2. Assessment of Data Management Systems (Protocol 2): qualitative assessment of the strengths and weaknesses of the data-collection and reporting system.
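Protocol 1 reduces to simple ratio checks at each site. The sketch below illustrates the arithmetic; the function names, thresholds, and figures are assumptions for illustration, not the official DQA templates.

```python
def verification_factor(recounted, reported):
    """Ratio of recounted to reported results for one indicator at one site.
    A value below 1.0 suggests over-reporting; above 1.0, under-reporting."""
    if reported == 0:
        return None  # cannot compute a ratio against a zero reported value
    return recounted / reported

def reporting_completeness(reports_received, reports_expected):
    """Share of expected reports actually received in the reporting period."""
    return reports_received / reports_expected

# Illustrative (hypothetical) site figures.
vf = verification_factor(recounted=184, reported=200)
completeness = reporting_completeness(reports_received=11, reports_expected=12)
print(f"Verification factor: {vf:.2f}")               # 0.92 -> more reported than recounted
print(f"Reporting completeness: {completeness:.0%}")  # 92%
```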
Protocol 2: Assessment of Data Management and Reporting Systems
Purpose: identify potential risks to data quality created by the data-management and reporting systems at:
• the M&E Management Unit;
• the Service Delivery Points; and
• any intermediate aggregation level (e.g. district or region).
The DQA assesses both (1) the design and (2) the implementation of the data-management and reporting systems. The assessment covers 8 functional areas (HR, training, data management processes, etc.).
Objective of the Data Quality Assessment Tool
The Data Quality Assessment (DQA) Tool is designed to:
• verify the quality of reported data for key indicators at selected sites; and
• assess the ability of data-management systems to collect, manage, and report quality data.
The DQA Tool is not intended to assess the entire M&E system; it focuses in depth on component 10 (supportive supervision and data auditing) of the "Organizing Framework for a Functional National HIV M&E System".
NACA-Specific Capacity-Building Interventions
• MESST
• RDQA
• TA for DQ
• SI Assessment M&E Road Map
• Quarterly mentoring (SHAPMoS)
SI Assessment M&E Road Map Implementation
• Incorporation of data flow findings into SHAPMoS
• Incorporation of the Institutional Inventory List into the National Programmatic Database based at the NACA
• Supporting the primary efforts of WHO's P-SAM Exercise
• Participation in the launch of the National HIV M&E Strategy
SI Assessment M&E Road Map Implementation
• ToT on M&E basic concepts
• 4 quarterly mentoring and coaching visits to Regional M&E Officers
• Support to the NACA to produce the QSCR
Institutional Capacity Strengthened – Regional M&E Officers
• Internal monitoring systems developed and in place
• Trace and verification conducted at SDPs in each region
• Regional QSCRs produced
• Information products developed for feedback to SDPs
Data Quality Timeline: TA to Malawi
Timeline diagram elements: M&E Plan revision; plan for MESST; plan for DQ TA; MESST (Malaria); MESST (HIV); discussions with NAC on next steps; data quality.
RDQA Tool to Operationalize the Statement of Work
• Support finalization of the draft Data Quality Protocol
• Assist in rolling out the data quality protocol
• Conduct data quality training(s) for key implementing agencies
Drafted outputs:
• Services and Indicators Matrix
• Register(s) to be used by all CBO SDPs
• Clear instructions
• DQ policies
• Data steps performed at each level of the DMRS
• Guidelines for quality controls
• Written back-up procedures
Data quality training conducted in 5 zones.
Institutional Capacity to Support Data Quality Developed
• Data quality plan written that includes guidance, reporting forms, and an indicator matrix
• Key stakeholders at national and sub-national levels oriented to, and reviewed, the new data quality documents
• NACA M&E Officers trained to train on data quality
• Key staff in all 5 zones trained by NACA on data quality
Conclusions
• NACA M&E Unit institutional capacity strengthened at national and sub-national levels
• Data reporting more timely, with increasingly accurate, reliable and complete data
• Foundation for effective data use laid
• Documented systems establish institutional memory
MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) through Cooperative Agreement GPO-A-00-03-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina in partnership with Futures Group, John Snow, Inc., Management Sciences for Health, ORC Macro International, and Tulane University. Visit us online at http://www.cpc.unc.edu/measure.