
External DQA Methodology and Implementation


Presentation Transcript


  1. External DQA Methodology and Implementation • Mozambique Strategic Information Project (MSIP) • JSI Research & Training Institute, Inc. (JSI), in collaboration with UCSF and I-TECH • Prepared by: Dália Traça • November 4, 2015

  2. Objectives of the eDQA strategy

  3. Strategic Approach
  • Promote the alignment of the existing reporting systems (PEPFAR and SIS-MA / Módulo Básico)
  • Create a sustainable Data Quality Assessment system that is affordable, accepted, owned, and scalable by the MoH
  • Prioritize the inclusion of MoH staff in all steps of the development, piloting, and implementation of the eDQA strategy

  4. DQA Objectives
  • To assess the quality of data registered in primary sources and data reported to the upper levels, verifying the following sources (a sketch of this cross-level check follows this slide):
    • Daily registers vs. monthly reports (Health Facility)
    • National database "Módulo Básico" (District/NED)
    • "Módulo Básico" (Province/NEP)
    • "Módulo Básico" (Central level)
  • To assess the data management and reporting systems at the HF and NED levels
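  A minimal sketch of the cross-level verification described above, assuming each reporting level's value for one indicator and period is available in a dict. The field names and data access are hypothetical illustrations, not MSIP's actual tooling:

```python
# Flag every reporting level whose value differs from the level above it.
# Assumption: the chain HF recount -> HF report -> Modulo Basico at
# district, province, and central mirrors the sources listed on slide 4.

def check_reporting_chain(values: dict) -> list:
    """Return a discrepancy message for each mismatched pair of levels."""
    chain = ["HF register recount", "HF monthly report",
             "Modulo Basico (district)", "Modulo Basico (province)",
             "Modulo Basico (central)"]
    issues = []
    for lower, upper in zip(chain, chain[1:]):
        if values[lower] != values[upper]:
            issues.append(f"{lower} = {values[lower]} vs {upper} = {values[upper]}")
    return issues

example = {
    "HF register recount": 130,
    "HF monthly report": 130,
    "Modulo Basico (district)": 128,  # hypothetical transcription loss at district entry
    "Modulo Basico (province)": 128,
    "Modulo Basico (central)": 128,
}
for issue in check_reporting_chain(example):
    print(issue)
```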

  5. Assessed Indicators

  6. Overall DQA Implementation Methodology
  • Calendar of DQA implementation agreed with the MoH (including site selection)
  • MoH informs the Provincial Health Departments (DPS) of DQA implementation dates and facilities
  • DPS informs the District Health Directorates (DDS) and Health Facilities (HF) of DQA implementation and dates
  • Training of MoH central staff (prior to departure to the provinces)
  • Training for DPS and Implementing Partner (IP) staff at the province
  • DQA implementation (with debrief at HF level)
  • DQA debrief at province level for DPS and IP
  • National debrief at MoH central level

  7. Methodology for Data Collection
  ART indicator: random selection of 30 active patients on ART (see the sampling sketch after this slide)
  • Confirmation of patient status, within the review period, based on:
    • Drug pick-up date in the Pharmacy Register Book and the individual drug pick-up form (FILA)
    • Last medical consultation, based on the individual patient file
  CTX indicator: random selection of 30 active patients eligible for CTX
  • Confirmation that eligible patients received CTX, using individual patient files
  ANC, MAT, PCR, CT, and VMMC indicators: comparison of recounts from source documents with reported data at the various levels
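  A minimal sketch of the patient-sampling step, assuming patient records are loaded as a list of dicts with hypothetical "patient_id" and "status" fields; the slides do not show the actual register structure:

```python
import random

SAMPLE_SIZE = 30

def sample_active_patients(register, seed=None):
    """Randomly select up to 30 active patients for chart verification."""
    active = [p for p in register if p["status"] == "active"]
    rng = random.Random(seed)  # a fixed seed makes the selection reproducible
    # Assumption: if a small HF has fewer than 30 active patients, verify all.
    return active if len(active) <= SAMPLE_SIZE else rng.sample(active, SAMPLE_SIZE)

# Each sampled patient's status would then be confirmed against the
# Pharmacy Register Book, the FILA form, and the individual patient file.
```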

  8. Data Analysis: Calculation of the Deviation (ART, CTX, ANC, MAT, PCR, CT, and VMMC)
  Deviation <10%: good quality data; 10% to 20%: moderate data quality; >20%: poor quality data
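  A minimal sketch of the deviation calculation and quality classification. The slide states only the thresholds, not the exact formula; the denominator choice below (the recounted value) is a common DQA convention and is an assumption here:

```python
def deviation_pct(recounted: int, reported: int) -> float:
    """Percentage deviation of the reported value from the recount.

    Assumption: deviation = |recounted - reported| / recounted * 100.
    """
    if recounted == 0:
        return 0.0 if reported == 0 else float("inf")
    return abs(recounted - reported) / recounted * 100

def classify(dev: float) -> str:
    """Apply the eDQA thresholds: <10% good, 10-20% moderate, >20% poor."""
    if dev < 10:
        return "good quality data"
    if dev <= 20:
        return "moderate data quality"
    return "poor quality data"

# Hypothetical example: the HF monthly report lists 118 ANC visits,
# but the recount of the daily register finds 130.
dev = deviation_pct(recounted=130, reported=118)
print(f"{dev:.1f}% -> {classify(dev)}")  # 9.2% -> good quality data
```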

  9. DQA Results: Round 2014

  10. ART Indicator

  11. Participatory Approach to DQA activity

  12. DQA Lessons Learned (1)

  13. DQA Lessons Learned (2)

  14. On the right track?

  15. Comparison between Round 2014 and 2015

  16. Quality Improvement during eDQA

  17. Key Findings
  eDQA preparatory phase:
  1. Training with MoH central: training with central-level MoH staff prior to departure for the DQA, so they could accompany the team in the field
  2. Training with DPS: training of all DPS and IP staff on the DQA implementation

  18. Key Findings (cont.)
  At the Health Facility, during data collection:
  • Clarification on the filling in of registers (missing data, misplaced data, etc.)
  • Poorly kept registers
  • Counting clarification during recounting by the eDQA team
  • Clarification of indicator definitions and data collection
  During debriefing (technical and management):
  • Highlighting and discussing the specific strong and weak points of the HF observed during the DQA activity
  • Leaving observations and recommendations in writing, including preliminary deviations for each indicator
  • Reinforcing the importance of data use for better management

  19. Key Findings (cont.)
  Debrief at province level with DPS and IP:
  1. Discussion of the strong and weak points observed during the DQA activity, highlighting the more problematic indicators
  2. Clarification of the calculated deviations per indicator
  3. Leaving observations and recommendations in writing, in the form of a detailed PowerPoint presentation

  20. Conclusion "You cannot do DQA without doing QI." Even though the DQA had a specific purpose, we realized that the participatory methodology used makes it possible to introduce Quality Improvement elements along the process, as well as to build the capacity of and empower MoH staff to pursue better quality data at all levels of the hierarchy.

  21. Dália Monteiro Traça, Chief of Party, MSIP, Maputo, Mozambique • dtraca@mz.jsi.com • Obrigada! (Thank you!)
