PILOTING ANNUAL REPORTING ON CWB: WV UGANDA'S EXPERIENCE Godfrey Senkaba
Preparation to report on CWB Our Experience • Review of the NO strategy results framework • Mapping alignment of programs to CWBT • Communicating and training staff on CWB • NO DME capacity
Lessons learned: preparation to report on CWB • NO strategies with poorly stated outcome indicators, or indicators which cannot easily be aligned to any of the CWBO, make it difficult to interpret program data for the CWB report. • Having national office strategy objectives aligned to CWB does not guarantee that the respective programme and project interventions automatically contribute to CWB. • Grading programs as contributing to CWBO/T based only on whether their interventions describe benefits to children, or whether their outcomes/indicators speak to children, is not adequate.
Lessons learned: preparation to report on CWB • For programs graded as contributing to CWBO or CWBT, there should be a consistent link between their outcomes, outcome indicators, interventions and measurement. • A communication plan on annual reporting on CWB should specify the sponsors, champion(s) and lead. • Capacity-building efforts for preparing the CWB report should target not only DME staff but technical specialists as well.
Preparation to report on CWB Our experience: key steps followed • Gathering all FY11 program reports • Design of data entry and analysis tool • Data entry and validation • Data analysis • Preparation of the NO strategy objective summary fact sheets • Stakeholder Meeting • Writing the final CWB report
Lessons learned: Methodology • Uploading completed program reports onto PMIS/HORIZON is as important as completing the DME process/activity for which the report is prepared. • A CWB data entry/analysis tool that focuses only on the NO strategy's priority CWBT indicators limits the depth of analysis of an NO's contribution to CWB. • Differentiating the 'unique' individuals reached/served from all people who benefited from the program interventions helps to eliminate double counting of beneficiaries.
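The deduplication point above can be illustrated with a minimal sketch, assuming each program report carries a list of beneficiary identifiers (the field names and ID format here are hypothetical, not WV Uganda's actual schema):

```python
# Minimal sketch of eliminating double counting: a person served by
# several interventions is counted once among 'unique' individuals,
# while every service contact is still tallied. IDs are hypothetical.

def count_unique_beneficiaries(program_reports):
    """program_reports: list of per-program beneficiary ID lists.
    Returns (unique individuals, total service contacts)."""
    seen = set()
    total_service_contacts = 0
    for report in program_reports:
        for beneficiary_id in report:
            total_service_contacts += 1
            seen.add(beneficiary_id)  # set membership deduplicates
    return len(seen), total_service_contacts

reports = [
    ["UG-001", "UG-002", "UG-003"],  # e.g. a health programme
    ["UG-002", "UG-003", "UG-004"],  # e.g. an education programme
]
unique, contacts = count_unique_beneficiaries(reports)
print(unique, contacts)  # prints "4 6": 4 unique people, 6 service contacts
```

The same idea applies at any scale: summing per-program beneficiary counts gives the 6, while a shared identifier is what allows the 4 to be reported without double counting.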
Lessons learned: Methodology • The high level of awareness and understanding of CWB among staff is not reflected in the documentation and reporting of programmes' contribution to CWB. • It is important for the NO to ensure all programs adhere to common guidelines and operational definitions for key programming terms; otherwise, comparison across programs doing similar interventions is not feasible. • Reporting on programs implementing interventions in the 'mainstreaming' sectors, e.g. advocacy, is too subjective, but can be improved with standardized guidance on the CWBO/T they contribute to.
Lessons learned: Methodology • The quality of monitoring reports, e.g. program annual reports, is better for programs that have good monitoring systems than for those that don't. • Quality tests of all program reports, to gauge the extent of methodological flaws, identify data inconsistencies, clarify scope, etc., improve the level of confidence in using the reports to inform the CWB report. • Contribution to CWB by grant and project sectors that WV Uganda considers cross-cutting themes, such as advocacy and peace building, can be accurately reported on if the grant interventions are mainstreamed in programmes and progress is captured in the ADP reports.
Lessons learned: Methodology • Preparation of summary fact sheets aligned to the NO strategy/priority sectors is good, but might be interpreted by staff to mean that some sectors contribute more to CWB than others (based on the evidence presented). • In addition, feedback from some of the technical teams in the NO and the Region, as well as some SO program officers, may hinge on their understanding of the purpose of the CWB report (is it for alignment to NO strategy? Performance? Accountability? Etc.)
Recommendations and utilization of the CWB report • Selection of CWBT in the NO strategy (in addition, CWB indicators prioritized in the BSC). • [working draft] national M&E system (annual target setting and measuring performance). • Prioritizing support to ADPs that are redesigning/conducting baselines in FY11 to adopt and plan to measure CWBT.
Recommendations and utilization of the CWB Report – cont… • Increase involvement of technical specialists in the redesign process as well as in the preparation of program reports. • Changed the National Approval Committee process for reviewing and approving the program annual management reports. • Guidelines for baselines and contextualization of CWBT tools. • Piloting annual monitoring of program outcomes (using the LQAS methodology).
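The LQAS (Lot Quality Assurance Sampling) pilot mentioned above rests on a simple decision rule: draw a small sample from each supervision area ("lot") and compare the number of respondents meeting an indicator against a pre-set threshold. A minimal sketch of that rule follows; the classic n = 19 sample size is standard in LQAS practice, but the decision threshold d shown here is illustrative, not WV Uganda's actual parameter:

```python
# Minimal sketch of an LQAS classification rule: a lot is judged to
# reach the coverage target when at least d of the n sampled
# respondents meet the indicator. d = 13 here is illustrative.

def lqas_classify(successes, n=19, d=13):
    """Classify one lot (supervision area) from its sample results."""
    if not 0 <= successes <= n:
        raise ValueError("successes must be between 0 and n")
    return "adequate" if successes >= d else "below target"

print(lqas_classify(15))  # prints "adequate"
print(lqas_classify(10))  # prints "below target"
```

The appeal of the method for annual outcome monitoring is exactly this simplicity: field teams only need a tally and a lookup table of decision rules, not a full survey analysis, to flag which areas are below target.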
Recommendations and utilization of the CWB Report – cont… • Training and mentoring staff on program impact documentation. • Performance objectives for staff, especially technical specialists (a CWB-reporting-related objective). • Strengthen the NO program reports review system. • There is a need to develop, measure and report on qualitative indicators in the CWB report.
Issues for further discussion • Annual reports prepared within an NO, e.g. the NO annual report vs the CWB report. • NO-led annual surveys/monitoring of CWBO vs program reports (evaluations, baselines, monitoring) in light of preparing a CWB report. • The process map for CWB reporting.