Federal Ministry of Health
Routine Data Quality Assessment: Preliminary Report
FMOH and RHB in collaboration with Development Partners
Policy Plan Directorate (FMOH)
May 2006 EC, Jigjiga, Somali Regional State
Outline
• Introduction
• Objectives
• Methods
• Assessment Protocols
• Results
• Limitations
• Conclusion and Recommendations
Introduction
• Health information systems depend on multiple sources of data:
  • Surveys,
  • Vital registration,
  • Census, and
  • Routine health information (M&E data).
• Routinely collected M&E data can provide important and timely information related to the delivery of national health programs when the data are of high quality.
Cont..
• Data quality assessment should always be undertaken to understand how much confidence can be placed in the reported health data.
• The DQA is designed for use by external audit teams, while the RDQA is designed for more flexible use, notably by programs and projects.
• Data quality is a complex construct that comprises multiple dimensions.
Cont..
• HSDP IV includes the HMIS as one of the key strategic objectives of the health sector, for:
  • Monitoring program goals and objectives,
  • Guiding evidence-based program management, and
  • Ensuring appropriate policy formulation and resource allocation.
Cont..
• As the HMIS is a main source of information for health program monitoring, it must comply with standards for:
  • Accuracy,
  • Completeness, and
  • Timeliness.
• Expected levels of performance (see the sketch below):
  • Report completeness > 85%,
  • HMIS reporting timeliness > 85%,
  • HMIS data quality > 90%,
  • Compliance with performance monitoring standards (meetings held vs. meetings expected): 100%.
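As an illustration of how these performance targets might be checked against a quarter's reports, a minimal sketch follows; the metric definitions, counts and names used here are assumptions for illustration, not values or formulas taken from the HMIS guideline.

```python
# Illustrative check of HMIS reporting-performance targets (metric definitions assumed)
TARGETS = {"completeness": 0.85, "timeliness": 0.85, "data_quality": 0.90}

def rate(numerator: int, denominator: int) -> float:
    """Simple proportion, e.g. reports received / reports expected."""
    return numerator / denominator if denominator else 0.0

# Hypothetical quarter: 95 reports expected, 88 received, 80 of them on time
observed = {
    "completeness": rate(88, 95),  # reports received / reports expected
    "timeliness": rate(80, 95),    # timely reports / reports expected
    "data_quality": 0.92,          # placeholder: share of verified values judged accurate
}

for metric, target in TARGETS.items():
    status = "meets" if observed[metric] >= target else "below"
    print(f"{metric}: {observed[metric]:.0%} ({status} target of {target:.0%})")
```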
General Objective
• To verify the quality of reported data for key indicators and the capacity of the information systems to collect, manage and report quality data.
Specific Objectives
1. Assess the existence of HMIS data management and reporting system processes.
2. Assess the level of technical determinants related to procedures, manuals and forms, and software.
3. Assess the level of data quality (accuracy, completeness and timeliness).
4. Assess the level of information use for decision making.
Methods
• Study area: all regions
• Study design: cross-sectional
• Sampling method: simple random sampling (SRS)
  • WorHOs were randomly selected
  • Health facilities implementing HMIS for > 6 months were selected
• Study period: Nov – Dec 2013
• Reporting period: fourth quarter of 2005 EFY
Sample Size
• Sample size was determined using the single population proportion method (a worked sketch follows below).
• A total of 321 health institutions were included in the study:
  • 11 regions,
  • 95 WorHOs, and
  • 214 health facilities (32 hospitals and 182 health centers).
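For reference, the standard single population proportion formula is n = Z² · p(1 − p) / d². A minimal sketch is shown below; the parameter values used are common illustrative defaults, not figures stated in this report.

```python
from math import ceil

def single_population_sample_size(p: float, d: float, z: float = 1.96) -> int:
    """Single population proportion formula: n = Z^2 * p * (1 - p) / d^2."""
    return ceil(z ** 2 * p * (1 - p) / d ** 2)

# Illustrative values only: p = 0.5 (maximum variability) and d = 0.05 (5% margin
# of error) are common defaults, not parameters stated in this assessment.
print(single_population_sample_size(p=0.5, d=0.05))  # -> 385, before any adjustments
```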
Study Protocols
Three protocols were used:
• Data verification protocol
  - Cross-check the reported results against other data sources
  - Reporting performance: timeliness, completeness, availability (intermediate level and higher)
• Data management and reporting system assessment protocol
• Information use protocol
Data Verification
• The data verification took place in two stages:
  • In-depth verification at service delivery sites
  • Verification at the intermediate aggregation levels (WorHO and RHB)
• Verification factor = recounted / reported:
  • < 0.85 (85%) indicates over-reporting,
  • 0.85 – 1.15 (85 – 115%) indicates an acceptable accuracy level,
  • > 1.15 (115%) indicates under-reporting.
• A bar chart displays the quantitative data generated from the data verifications (a computation sketch follows below).
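A minimal sketch of how a verification factor can be computed and classified against these thresholds; the function names and example counts are illustrative, not part of the RDQA toolkit.

```python
def verification_factor(recounted: int, reported: int) -> float:
    """Verification factor = recounted value / reported value."""
    if reported == 0:
        raise ValueError("Reported value must be non-zero")
    return recounted / reported

def classify(vf: float) -> str:
    """Classify a verification factor against the RDQA accuracy thresholds."""
    if vf < 0.85:
        return "over-reporting"       # recount falls well below what was reported
    if vf <= 1.15:
        return "acceptable accuracy"  # within +/- 15% of the reported value
    return "under-reporting"          # recount exceeds what was reported

# Illustrative example: 90 clients recounted in the register vs. 120 reported
vf = verification_factor(recounted=90, reported=120)
print(f"{vf:.2f} -> {classify(vf)}")  # 0.75 -> over-reporting
```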
Cont..
• System assessment (answers whether all elements are in place to ensure quality reporting):
  • M&E structure
  • Availability and use of guidelines
  • Data collection and use
  • Data management processes
  • Links with the national system
Cont..
• System component scores are coded and categorized as follows:
  • 2.5 – 3.0 (Green) indicates full system strength,
  • 1.5 – 2.5 indicates partial system strength, and
  • less than 1.5 (Red) indicates no system in place.
• A spider graph displays the qualitative data generated from the assessment of the data collection and reporting system and can be used to prioritize areas for improvement (a banding sketch follows below).
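A minimal sketch of the score banding described above, assuming scores on the 0 – 3 scale; the function name is illustrative, and the example values are taken from the national averages reported later in this assessment.

```python
def system_strength(score: float) -> str:
    """Band a system-assessment score (0-3 scale) against the thresholds above."""
    if score >= 2.5:
        return "full system strength (Green)"
    if score >= 1.5:
        return "partial system strength"
    return "no system in place (Red)"

# Example using the national averages reported in the overall data management assessment
for component, score in {"M&E structure": 2.01, "Reporting forms": 2.48}.items():
    print(component, "->", system_strength(score))
```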
Data Collection Instruments
• Customized standard WHO RDQA checklist and question guide:
  • Document review
  • Interviews
  • Observations
• Documents reviewed include:
  • Registers, tally sheets, reporting formats, medical records, etc.
  • Laboratory and pharmacy registers (for cross-checking)
  • Administrative health offices' data aggregation and reporting formats
  • Performance monitoring log book
Indicators Eight indicators were selected
Data Verifications - Site Level Average
Nearly 20% over-reporting was observed for CAR (79%); all other indicators fall within the acceptable accuracy limits.
Data Verifications - by Region and Service Sites
• TBCD is under-reported in Tigray, Amhara, Somali, Harari and Dire Dawa
• CAR is over-reported in all regions except Harari, Dire Dawa and AA
• Penta 3 is over-reported in Tigray, Benishangul and Gambella, and under-reported in AA
Data Verifications - District Level Average
Except for CAR and CBA, the remaining indicators were over-reported: by nearly 40% for Measles and PMTCT, and by more than 20% for Penta 1, Penta 3 and TBCD.
Verification Factors by Region and District Sites
• District-level data for the selected indicators in Tigray, AA and SNNPR were relatively accurate
• All indicators are over-reported in the remaining regions, except for under-reporting of PMTCT in Somali region
Data Verifications - Regional Level Average
All indicators are relatively accurate at the regional level.
Data Verifications - Regional Sites
• TBCD is highly over-reported in SNNPR and under-reported in Somali
• CAR is over-reported by more than 70% in Somali and by nearly 30% in Gambella
Data Verifications - Overall Average by Indicator, Ethiopia, 2014
Nationally, CAR, Measles and ART were over-reported by 21%, 19% and 19%, respectively.
Summary of Data Verification
• A tendency to over-report in FP and EPI, but to under-report in TBCD
• A gap in recording pertinent data elements for monitoring and documentation as per the standard (especially at service sites and WorHOs)
• Relatively better performance at the regional level than at service sites and WorHOs
• Reporting dates differ from the standard in SNNPR and Tigray
Documentation Review – Overall Service Delivery Site Level Average
Overall Average Document Review
• Tigray, SNNPR and DD have source documents available and also reported on time
• Amhara, Tigray and SNNPR have complete reports
Trend in Variables of Document Review at Service Delivery Sites
• Improvement in completeness of source documents
• Decline in availability of source documents and in maintaining the reporting date within the standard (Tigray and SNNPR use a different date)
Document Review Summary
• Service level
  • PMTCT (77%) and ART (60%) indicators do not have proper documentation
  • None of the indicators were reported in the reporting period
• District level
  • Only in Tigray, SNNPR and Harari do the available reports meet reporting timeliness and completeness
• Regional level
  • Only Tigray, Amhara, SNNPR and Harari meet all reporting performance standards
Data Management Assessment - Overall Average
All system components were found to be partially strong, with scores ranging from a minimum of 2.01 for M&E structure, functions and capabilities to a maximum of 2.48 for use of data collection and reporting forms.
M&E Structure, Functions and Capabilities - National Summary
• 61% of facilities have an HMIS focal person (25% are HITs)
• < 25% trained staff; 39% new-staff refresher training
• 77% of HFs have an integrated program
• 42% of HFs do not provide 24-hour service
• 13% of HFs do not have a card room worker
Data Management Processes
• A quarter (25.2%) of facilities had at least 8 complete client records out of the expected ten.
• In 18.8% of facilities, none of the client records were complete.
• Eight out of ten facilities record in registers promptly upon service delivery.
• Only 36% of facilities perform LQAS.
Data Collection and Reporting Forms, Tools and Guidelines
• 52% (cards) and 54% (registers/tally sheets) available in all health facilities
• 71% of HFs use HMIS codes
• 41% of focal persons are able to calculate indicators
• 72% of HCs and 93% of hospitals use the tools consistently
• 51% use additional "unofficial" forms
Links with the National Reporting System
Almost all facilities use the official HMIS forms, but multiple reporting channels exist in more than 25% of facilities.
Information Use
Information use was assessed along several dimensions:
• Use of demographic data
• Analysis of plan vs. achievement
• Developing and disseminating action plans
• Documenting and following up execution
• Displaying information
• Identification and tracing mechanisms for "drop outs"