Multiple Indicator Cluster Surveys Survey Design Workshop
MICS Evaluations and Lessons Learned from MICS4
Part 1: MICS Evaluations
MICS Evaluations • MICS1 Evaluation • MICS2 – No evaluation • MICS3 Evaluation – John Snow, Inc. • Quality comparable with DHS and other survey programs • Fulfills an important mission in global monitoring • Mismatch between where technical expertise sits (HQ) and where technical decisions are taken (country level), leading to communication problems • Shortcuts taken in training, listing, and fieldwork • Limited human resources an impediment
MICS Evaluations • MICS4 Evaluation, Cluster 1 and Cluster 2 • Cluster 1 completed
MICS4 Evaluation - Findings • Significant expansion of the envelope of technical support resources: regional coordinators, support by experts, UNICEF MICS Consultants (UMCs), and a more structured process of technical support and quality assurance • Organizational structure, communication channels, and decision-making authorities remain unchanged – suboptimal for the objectives, e.g. country offices (COs) not complying with guidelines and quality assurance processes (large samples, additional questions) • Not on the agenda of senior managers at HQ or regional office (RO) levels
MICS4 Evaluation - Findings • Universal adherence to training guidelines (duration) • No evidence of interview observations or spot-checks • Field check tables an important tool, but used inconsistently • Sample sizes and survey teams larger than recommended and manageable levels • Shorter time for production of final reports
MICS4 Evaluation - Findings • Dramatic improvement in data quality • MICS4 and DHS have comparable quality on most indicators • Quality of some MICS data still needs improvement
MICS4 Evaluation - Recommendations • COs to be compelled to hire UMCs • Increase the regional consultant pool • Fully integrate technical review and quality assurance processes into key documents • When MICS reports lag, an additional implementing agency or consultant to finalize the report – to be included in the Memorandum of Understanding (MoU) • UNICEF should invest more in other data collection efforts that generate data at lower administrative levels, without hampering MICS or other household surveys
MICS4 Evaluation - Recommendations • Additional data processing staff needed • Strengthen use of field check tables • Increase guidance to ROs to gauge risks in advance of MoUs, and for course-correction and withdrawal from the global MICS program • Dos and don'ts for CO and RO managers • Tools to be developed to ensure consistency in the work of regional consultants • Documentation of sample design and implementation
MICS4 Evaluation - Recommendations • Spot checks and observations • Measures to further improve anthropometric data quality • Better documentation of Global MICS Consultations • Regional coordinator turnover – overlaps needed
Looking at data quality – Why? • Confidence in survey results • Identify limitations in results • Inform dissemination and policy formulation • All surveys are subject to errors
Data quality • Two types of errors in surveys • Sampling errors • Non-sampling errors: all other types of errors, arising at any stage of the survey process other than sample design • All survey stages are interconnected and play a role in non-sampling errors
Data quality • Sampling errors can be anticipated before data collection and measured after data collection • Non-sampling errors are far more difficult to control and/or identify
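MICS reports typically estimate sampling errors with jackknife repeated replication over sample clusters. The sketch below illustrates the delete-one-cluster idea on simulated data; the number of clusters, the cluster size, and the 40% prevalence are all invented for the example.

```python
import random

# Minimal sketch: delete-one-cluster jackknife for the sampling error of a
# proportion in a cluster survey. All data below are simulated, not from
# any actual MICS dataset.
random.seed(42)

# Simulate 30 clusters of 20 households each; 1 = household has the
# characteristic of interest, 0 = it does not.
clusters = [[1 if random.random() < 0.4 else 0 for _ in range(20)]
            for _ in range(30)]

def proportion(cl):
    """Overall proportion across all households in the given clusters."""
    values = [v for cluster in cl for v in cluster]
    return sum(values) / len(values)

p_full = proportion(clusters)

# Delete-one-cluster jackknife replicates: re-estimate with each cluster
# removed in turn.
k = len(clusters)
replicates = [proportion(clusters[:i] + clusters[i + 1:]) for i in range(k)]

# Jackknife variance estimate around the full-sample estimate.
var = (k - 1) / k * sum((r - p_full) ** 2 for r in replicates)
se = var ** 0.5

print(f"estimate p = {p_full:.3f}, jackknife SE = {se:.4f}")
print(f"approx. 95% CI: ({p_full - 2 * se:.3f}, {p_full + 2 * se:.3f})")
```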
Data quality • We have discussed several features and recommendations for quality assurance that minimize non-sampling errors • Failure to comply with the principles behind these recommendations leads to data quality problems
Data quality analyses • Looking at: • Departures from recommended procedures/protocols • Internal consistency • Completeness
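As one illustration of a completeness check, the sketch below tallies the share of missing or "don't know" values for a few key variables. The records, field names, and missing-value codes are invented for the example; real MICS datasets define their own codes.

```python
# Minimal sketch of a completeness check: the share of missing or
# "don't know" (DK) values per key variable. Records, field names, and
# the missing codes used here are hypothetical, for illustration only.
records = [
    {"age": 27, "education": 2, "weight_kg": 61.0},
    {"age": 99, "education": 2, "weight_kg": None},   # 99 = age missing
    {"age": 34, "education": 8, "weight_kg": 58.5},   # 8 = education DK
    {"age": 41, "education": 3, "weight_kg": None},
]

# Codes treated as missing/DK for each field (hypothetical).
MISSING = {
    "age": {97, 98, 99, None},
    "education": {8, 9, None},
    "weight_kg": {None},
}

for field, codes in MISSING.items():
    n_missing = sum(1 for r in records if r[field] in codes)
    print(f"{field}: {n_missing / len(records):.0%} missing/DK")
```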
[Diagram] From monitoring priorities to standard survey instruments: countries, UNICEF, interagency groups, and partners in development set monitoring priorities, goals and targets, and indicators; operationalization through validation, testing, and piloting in national surveys yields standard survey instruments (questionnaires, data processing tools, sampling considerations, analysis plans, dissemination strategies).
[Diagram] The same flow, annotated: bypassing operationalization – fielding non-validated, untested survey instruments instead of the standard survey instruments – is a major source of poor data quality.
Household Completion Rates • Household completion rate = Completed / Selected
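A quick illustration of the ratio, with made-up counts:

```python
# Minimal sketch: household completion rate as completed interviews over
# selected households. The counts below are invented for illustration.
selected_households = 6250   # households selected into the sample
completed_interviews = 5980  # households with a completed interview

completion_rate = completed_interviews / selected_households
print(f"Household completion rate: {completion_rate:.1%}")  # 95.7%
```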
[Slide graphic] MICS protocols vs. field observations: selection
Serious Business • Out-transference (recording eligible respondents outside the eligible age range) • Omission (leaving eligible household members off the roster)
[Chart] Out-transference from age 15: excess of household members recorded at age 14, deficit at age 15
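A common way to detect such displacement is an age-ratio check around the eligibility boundary. The sketch below uses an invented single-year age distribution to show the classic signature: a ratio well above 1.0 at age 14 and well below 1.0 at age 15.

```python
# Minimal sketch of a boundary displacement check: compare the count of
# household members recorded at each age with the average of its two
# neighbours. The age distribution below is made up to mimic
# out-transference from age 15 to age 14.
age_counts = {12: 410, 13: 405, 14: 545, 15: 280, 16: 402, 17: 398}

def age_ratio(counts, age):
    """Ratio of the count at `age` to the average of its two neighbours.
    Values far from 1.0 suggest heaping or displacement."""
    return counts[age] / ((counts[age - 1] + counts[age + 1]) / 2)

for age in (13, 14, 15, 16):
    print(f"age {age}: ratio = {age_ratio(age_counts, age):.2f}")
# A ratio well above 1.0 at 14 and well below 1.0 at 15 is the classic
# signature of out-transference across the eligibility boundary.
```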