Overview of M&E System
Outline
• Results Framework and KPIs
• Program Monitoring
• Program Evaluation
• Support Activities
• Operations and Management
Mirrors the M&E Sub-manual.
Why M&E?
To know whether we achieved our objective: communities in target municipalities are empowered to achieve improved access to services and to participate in more inclusive local planning, budgeting, implementation, and disaster risk reduction and management.
Monitoring vs. Evaluation
Monitoring
• A continuing function that uses systematic collection of data on specified indicators to give management and the main stakeholders of an ongoing development intervention indications of the extent of progress, the achievement of objectives, and the use of allocated funds.
• Answers "when, where, how much."
Evaluation
• A periodic, systematic analysis of project or program performance against higher-level objectives, used to draw lessons for improving design during implementation or for applying to other projects or programs.
• Answers "why, how."
Results Framework and KPIs
Program Development Objective:
• "Communities in the target municipalities empowered to achieve improved access to basic services and to participate in more inclusive local planning, budgeting, implementation and disaster risk reduction and management."
Results Framework and KPIs
• 6 end-of-program outcome indicators for the Program Development Objective
• 22 intermediate outcome indicators:
• 6 indicators for Component 1: Barangay Grants
• 10 indicators for Component 2: Capacity Building and Implementation Support
• 6 indicators for Component 3: Program Management and M&E
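The indicator hierarchy above can be modeled as a small data structure. This is an illustrative sketch only: the class names, field names, and placeholder indicator labels are assumptions, while the counts (6 outcome, 22 intermediate split 6/10/6) come from the slide.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str   # placeholder label, not an actual KC-NCDDP indicator
    level: str  # "outcome" or "intermediate"

@dataclass
class Component:
    name: str
    indicators: list

# Hypothetical model of the results framework; only the counts mirror the slide.
framework = {
    "pdo_outcome_indicators": [
        Indicator(f"EOP outcome {i + 1}", "outcome") for i in range(6)
    ],
    "components": [
        Component("Component 1: Barangay Grants",
                  [Indicator(f"C1 indicator {i + 1}", "intermediate") for i in range(6)]),
        Component("Component 2: Capacity Building and Implementation Support",
                  [Indicator(f"C2 indicator {i + 1}", "intermediate") for i in range(10)]),
        Component("Component 3: Program Management and M&E",
                  [Indicator(f"C3 indicator {i + 1}", "intermediate") for i in range(6)]),
    ],
}

# The component-level indicators sum to the slide's 22 intermediate indicators.
intermediate_total = sum(len(c.indicators) for c in framework["components"])
print(intermediate_total)  # 22
```

A structure like this makes it straightforward to roll indicator counts up to the component or PDO level when building status reports.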
Types of Monitoring
• Community Monitoring
• Operations Monitoring
• Results Monitoring
• Grievance Monitoring
• Geo-tagging
Data Flow (field level to national level)
• Field/ACT level: geo-tagging activities, gathering of forms, GRS, etc.; post-process data; encode on office desktops; submit data to the KC-NCDDP server.
• SRPMT: check submitted data for clearance; send feedback to the municipal level.
• Regional level (RPMO): thorough quality checking; approval and consolidation; send feedback to the SRPMT.
• NPMO: feed data and manage information in the server; randomly quality-check the approved data; process and publish it into reports or maps; send feedback to the regional level.
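The bottom-up submission and feedback flow can be sketched as a chain of processing stages. The stage names follow the slide; the function behavior and the record fields are illustrative assumptions, not the actual system logic.

```python
def act_collect(record):
    # Field/ACT level: geo-tagging, forms, and GRS entries are encoded and submitted.
    record["encoded"] = True
    return record

def srpmt_clear(record):
    # SRPMT: checks submitted data for clearance; feedback goes to the municipal level.
    record["cleared"] = record.get("encoded", False)
    return record

def rpmo_consolidate(record):
    # Regional level: thorough quality check, approval and consolidation; feedback to SRPMT.
    record["approved"] = record.get("cleared", False)
    return record

def npmo_publish(record):
    # National level: random quality checks on approved data, then reports/maps;
    # feedback is sent back to the regional level.
    record["published"] = record.get("approved", False)
    return record

# A record only reaches publication after passing every lower stage.
record = npmo_publish(rpmo_consolidate(srpmt_clear(act_collect({"id": "BRGY-001"}))))
print(record["published"])  # True
```

The point of the chain is that each level gates the next: skipping a stage leaves the downstream flags false, which mirrors the slide's emphasis on clearance and approval before national publication.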
Convergence
Opportunities:
• Data sharing
• Avoids duplication
• Venue to clarify guidelines and directives
• Monitoring of higher-level results
Challenges:
• Development of indicators, systems and tools
• Multiple stakeholders
• New approach for DSWD
• High expectations
Data Quality Assessment
Comes in the form of:
• Document and record reviews
• Site visits
• Key informant interviews
• Focus group discussions
Purpose:
• Assess the actual forms and procedures used
• Review data collection and handling
• Check the sufficiency and timeliness of data
• Check other concerns that affect data validity, accuracy and integrity
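Two of the purposes above, checking sufficiency and timeliness of data, lend themselves to simple automated screens before a field DQA. This is a minimal sketch under assumed record fields (`id`, `value`, `submitted`); the thresholds and field names are illustrative, not from the M&E Sub-manual.

```python
from datetime import date

def check_timeliness(records, reporting_date, max_lag_days=30):
    # Flag records submitted more than max_lag_days after the reporting date.
    return [r for r in records if (r["submitted"] - reporting_date).days > max_lag_days]

def check_sufficiency(records, required_fields):
    # Flag records missing any required field (None counts as missing).
    return [r for r in records if any(r.get(f) is None for f in required_fields)]

# Hypothetical monitoring records for a January reporting period.
records = [
    {"id": 1, "value": None, "submitted": date(2024, 1, 15)},  # incomplete
    {"id": 2, "value": 10, "submitted": date(2024, 3, 1)},     # late
]
late = check_timeliness(records, date(2024, 1, 1))
incomplete = check_sufficiency(records, ["value"])
print([r["id"] for r in late], [r["id"] for r in incomplete])  # [2] [1]
```

Screens like these narrow a DQA's site visits and record reviews to the records most likely to have validity or integrity problems.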
SUPPORT ACTIVITIES
Program Information Management System and Capability Building
PIMS
• Desktop Application
• Web Application
• Business Intelligence
• Geo-database/GIS
• Android Applications
• Interactive Website
• Web Forum
• LGU Dashboard
Capacity Building
Staff capacity building:
• Trainings
• TA and coaching
LGU and community trainings on M&E
Roles in M&E
• NPMO
• RPMO
• SRPMT
• ACT
• LGU counterpart staff
M&E Tools and Instruments
• GRS
• Operational Results
• Talakayan
• External IE and Studies
• Forms and Geo-tagging
• SET
• KPI
• CM and AR
• Community Monitoring
• Geo-tagging
• Third Party
The Power of Measuring Results
• If you do not measure results, you cannot tell success from failure.
• If you cannot see success, you cannot reward it.
• If you cannot reward success, you are probably rewarding failure.
• If you cannot see success, you cannot learn from it.
• If you cannot recognize failure, you cannot correct it.
• If you can demonstrate results, you can win public support.
Adapted from Osborne & Gaebler, 1992