This document presents lessons learned and best practices from the monitoring and evaluation of the EPAG Project, covering the development of the M&E system, gaining buy-in from service providers, building the monitoring team, and conducting random classroom visits. It also emphasizes the importance of a well-designed M&E plan and the principles of mutual respect, transparency, equity, and capacity building, and highlights the M&E team's role, in collaboration with the service providers, in ensuring quality of service.
MINISTRY OF GENDER & DEVELOPMENT • Economic Empowerment of Adolescent Girls & Young Women (EPAG) Project • Monitoring & Evaluation System • Wednesday, 29 August 2012 • AGI Technical Meeting, Monrovia, Liberia • Dala T. Korkoyah, Jr., EPAG M&E Director
OVERVIEW • Monitoring & Evaluation efforts are often undermined by two inhibitors: • Compliance syndrome – a system built mainly to satisfy donor requirements • Vertical posture – a rigid system that uses a top-down approach, intimidating and policing end-users.
OBJECTIVE • To share lessons learned in the monitoring and evaluation of the EPAG Project. • Main focus: • Developing the EPAG M&E System • Gaining the buy-in of the service providers • Building the monitoring team • Conducting random, unannounced classroom monitoring visits • Problem solving with service providers
DESIGNING AN M&E SYSTEM • The development of the EPAG M&E System involved the following core activities: • Reviewing the EPAG Operational Manual • Revising the Results Framework • Developing an M&E Plan
KEY LESSONS • 1. Operational Manual • The Operational Manual (OM) was somewhat detached from prevailing realities: • No capacity building plan for SPs, • Inconsistent score weights across indicators (e.g., training venues should not be in a noisy area), • No tools for monitoring critical design elements (child care, job/business performance, employment verification, etc.). • The Operational Manual must be harmonized with the prevailing operational needs of the project.
KEY LESSONS • 2. Results Framework • Different versions of the results framework were in circulation: • Language inconsistencies (%, share of), • Impractical indicators (e.g., an annual report to Parliament). • A single, standardized results framework should be used by all stakeholders.
KEY LESSONS • 3. Monitoring & Evaluation Plan • The above findings informed the M&E Plan, which: • Aligned the project results framework with the Poverty Reduction Strategy (PRS) and Millennium Development Goal 3, • Integrated a capacity building strategy for key project stakeholders, • Supported the drafting of monitoring tools for important project elements, • Promoted a realistic monitoring plan that accommodates volunteer monitors. • An M&E Plan is critical to developing an effective M&E System – simplicity & participation
FRAMEWORK OF EPAG M&E SYSTEM • [Diagram: EPAG M&E coordinates the service providers' M&E teams and the quality monitors, who jointly oversee training services] • EPAG M&E: coordination, quality control, capacity building, technical assistance & skills transfer, feedback to SPs • Service providers' M&E teams: internal monitoring, quality control, training site visits, implementation timeline, reporting • Quality monitors: QM visits, venue assessment, classroom observation, trainee interviews, verification • Training services: trainers, attendance logs, trainee performance, reporting
OTHER CORE M&E ROLES • In addition to M&E of training services, the M&E team supported the impact evaluation (IE) survey firm and the Ministry of Gender & Development (MoGD) M&E Unit. • IE Survey Firm: • Coordinating activities between the WB, the IE firm, and the service providers, • Participating in the revision of survey instruments, • Conducting quality control visits during data collection and entry, • Ensuring compliance with the recruitment strategy, • Reviewing reports, etc. • MoGD: • Technical assistance • Coaching • Capacity building
SERVICE PROVIDERS' BUY-IN • Monitoring and evaluation can become a fulfilling relationship when built on the following principles: • Mutual respect and trust – the monitor is not a boss or supervisor, and should serve with integrity. • Transparency and equity – all SPs should be appraised against a common, agreed set of standards. • Opportunity for capacity building and support – be available to help find solutions, provide technical assistance, and offer moral support. • When service providers realize that you serve in their best interest, they tend to cooperate.
BUILDING THE MONITORING TEAM • The overall team characteristics have a great influence on the quality of service: • Recruit and contract qualified monitors – academic background, previous experience working with a similar target group (monthly stipend provided at a daily rate). • Team capacity building – organize training sessions to help the team understand the project (goals, objectives, timeline, etc.). • Involve the team in other project activities – engage team members at various levels: meetings, project launch, social events, etc.
BUILDING THE MONITORING TEAM CONT'D • Involve monitors in the definition of indicators and the development of monitoring tools. • Set clear boundaries for the roles and responsibilities of the monitors (define the scope of work). • Involve service providers' staff (M&E officers, supervisors, trainers, etc.) in the training for the monitors. • A strong collaboration between the monitoring team and the service providers ensures: • Synergy of efforts by all parties • Service providers' familiarity with the tools • Effective internal monitoring
RANDOM, UNANNOUNCED SPOT CHECKS • The strategy that kept service providers on 'their toes,' sending a clear message about quality improvement: • Develop checklists – together with the service providers, a common, agreed set of indicators and scoring scales was developed (see the sketch after this slide). • Develop a monitoring schedule – ensuring monitoring schedules fit into the monitors' existing plans. • Appoint a Monitors' Supervisor – coordinates fieldwork: distributes checklists, maintains working tools and equipment, collects scored checklists, writes monthly activity reports, etc.
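As a minimal sketch of how agreed indicators and scoring scales could be encoded for scoring, the Python snippet below uses hypothetical indicator names, weights, and a 0–5 scale; the actual EPAG checklists are not reproduced here.

```python
# Minimal sketch of a classroom observation checklist.
# Indicator names, weights, and the 0-5 scale are illustrative
# assumptions, not the actual EPAG instrument.

CHECKLIST = {
    "trainer_present_on_time": {"weight": 2, "max_score": 5},
    "lesson_plan_followed":    {"weight": 3, "max_score": 5},
    "venue_free_of_noise":     {"weight": 1, "max_score": 5},
    "attendance_log_updated":  {"weight": 2, "max_score": 5},
}

def visit_score(scores: dict) -> float:
    """Return a weighted percentage score for one classroom visit."""
    earned = sum(CHECKLIST[k]["weight"] * scores[k] for k in CHECKLIST)
    possible = sum(CHECKLIST[k]["weight"] * CHECKLIST[k]["max_score"]
                   for k in CHECKLIST)
    return 100.0 * earned / possible

# One monitor's scored form for a single visit
print(visit_score({
    "trainer_present_on_time": 5,
    "lesson_plan_followed": 4,
    "venue_free_of_noise": 3,
    "attendance_log_updated": 5,
}))  # -> 87.5
```

Weighting the indicators reflects the lesson from the Operational Manual review: not all indicators deserve the same score weight.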
RANDOM, UNANNOUNCED SPOT CHECKS • Weekly classroom monitoring – the monitoring plan allowed for two days of monitoring per week. • A typical visit – monitors visit classes in pairs: both observe training delivery for about 15 minutes; one scores the indicators while the other interviews two trainees selected at random (a selection sketch follows below). • Data analysis and reporting – both monitors tally their scores and submit the completed forms to the Supervisor, who forwards the data to the M&E Director.
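An unbiased pick of the two interviewees can be done with standard sampling; the roster format below is an illustrative assumption, not an EPAG data format.

```python
# Minimal sketch of selecting two trainees for interview at random;
# the roster format is an illustrative assumption.
import random

def pick_interviewees(roster: list[str], k: int = 2) -> list[str]:
    """Select k distinct trainees from the class roster without bias."""
    return random.sample(roster, k=min(k, len(roster)))

roster = ["Trainee A", "Trainee B", "Trainee C", "Trainee D"]
print(pick_interviewees(roster))  # e.g. ['Trainee C', 'Trainee A']
```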
RANDOM, UNANNOUNCED SPOT CHECKS • Data analysis and reporting – completed monitoring forms are reviewed by the Director: • Reviewed forms are submitted to data clerks for entry into a computerized database, • Data are analyzed and quality performance scores calculated (a consolidation sketch follows below), • A consolidated monthly quality monitoring report is written, showing the quality performance of all service providers, • The report is shared with the service providers for feedback. • Sharing the consolidated quality monitoring report promotes positive competition among the service providers.
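A minimal sketch of the consolidation step, assuming visit scores have been entered into a simple CSV export of the database; the file name and column layout are hypothetical.

```python
# Minimal sketch of consolidating visit scores into a monthly
# per-provider quality report. The CSV layout (columns: provider,
# score) and the file name are illustrative assumptions.
import csv
from collections import defaultdict

def monthly_report(path: str) -> dict[str, float]:
    """Average each service provider's visit scores for the month."""
    scores: dict[str, list[float]] = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["provider"]].append(float(row["score"]))
    return {p: sum(v) / len(v) for p, v in scores.items()}

# Rank providers so the shared report spurs positive competition
if __name__ == "__main__":
    report = monthly_report("visit_scores_aug2012.csv")
    for provider, avg in sorted(report.items(), key=lambda x: -x[1]):
        print(f"{provider}: {avg:.1f}")
```

Ranking the providers in the shared report is one simple way to make the quality comparison, and hence the positive competition, visible.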
PROBLEM SOLVING • Engaging service providers on monitoring findings requires diplomacy – enforcing compliance with standards while remaining respectful and supportive. • Ensure monitors comply with guidelines • Develop follow-up procedures • Maintain regular communication with field teams • Review completed forms immediately
PROBLEM SOLVING • Communicate promptly with project managers to highlight problems (e.g., an unsanitary, untidy childcare facility), and require immediate action within a defined timeline. • Follow up – by phone call, at the next visit, or in a report. • The Coordinator engages Executive Directors or Country Directors for further follow-up. • The performance-based nature of the contracts gave service providers an incentive to submit deliverables on time; payments were based on satisfying agreed requirements.
EPAG MONITORING TOOLS • A list of the monitoring tools used by EPAG: • Community Assessment Tool • Training Venue Assessment Tool • Classroom Observation Checklist • Trainee Interview Form • Trainees Attendance Tracking Tool • EPAG Trainees Directory • Job Performance Assessment Tool • Business Performance Assessment Tool • Job/Business Verification Tool • Quality Contests Tool
HIGH-QUALITY TRAINING • Two (2) mechanisms help keep training quality high: • Project quality monitoring • Withheld incentive payments for service providers (as sketched below)
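As an illustration of the withheld-payment mechanism, the gate below releases an incentive tranche only when agreed requirements are met; the threshold and criteria are hypothetical, not the actual contract terms.

```python
# Minimal sketch of performance-based payment: an incentive tranche
# is released only when agreed requirements are met. The threshold
# and criteria are illustrative assumptions, not the contract terms.

def release_incentive(deliverables_on_time: bool,
                      quality_score: float,
                      threshold: float = 80.0) -> bool:
    """Release the withheld tranche only if requirements are satisfied."""
    return deliverables_on_time and quality_score >= threshold

print(release_incentive(True, 87.5))   # True  -> pay the tranche
print(release_incentive(True, 72.0))   # False -> withhold
```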