Using Information for Health Management; Part I - Health Information Systems Strengthening
Learning objectives • the information cycle; tools and processes for turning data into action • the relationship between data use and data quality • hierarchy of standards / essential data set • common reasons for compromised data quality, and various counter measures • different information products for communicating different meanings
Reflecting on the data that you have been working with… What do you think are the steps that were taken to get the data into the Kenya HMIS?
Tally sheets An easy way of counting identical events that do not have to be followed up (e.g. headcounts, children weighed)
Registers Record data that need follow-up over long periods, such as antenatal care (ANC), immunisation, family planning and tuberculosis (TB)
Key issues • Registers for CoC • Tally sheets • Tick registers
Reports weekly, monthly, quarterly
Data set based on a minimum indicator set • Standard definitions • Data sources & tools • Interpret information: comparisons, trends • Decisions based on information • Actions • Data quality checks • Data analysis: indicators • Tables • Graphs • Reports
Linking Planning with Information The planning cycle and the information cycle are linked through INDICATORS
Data Collection and Collation in a Health Facility (Zambia HMIS Procedure Manual)
Data collection – at the source of data creation (point of care) Service data collected by nurses and doctors in-between attending to patients Usually several (manual) steps before it is in any database/storage • Tally sheets • Tally sheet totals at end of month • Monthly summary forms which are reported to the next level Often too much to collect for already overworked staff
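The manual aggregation chain above (tally marks → end-of-month tally totals → monthly summary form) can be sketched as a simple roll-up. This is a hypothetical illustration with made-up event names, not an actual HMIS tool:

```python
from collections import Counter

# Step 1: tally sheets record one mark per identical event, per day
daily_tallies = {
    "2024-03-01": ["child_weighed", "child_weighed", "anc_visit"],
    "2024-03-02": ["child_weighed", "anc_visit", "anc_visit"],
}

# Step 2: total each tally sheet at the end of the month
monthly_totals = Counter()
for events in daily_tallies.values():
    monthly_totals.update(events)

# Step 3: the totals become the monthly summary form reported to the next level
summary_form = dict(monthly_totals)
print(summary_form)  # {'child_weighed': 3, 'anc_visit': 3}
```

Each arrow in the paper-based chain is a transcription step, and each step is an opportunity for error, which is why later slides focus on reducing the number of manual hand-offs.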
What data elements should be collected? • Cannot be obtained elsewhere (e.g. from a survey) • Are easy to collect (cost vs usefulness) • Do not require much additional work or time • Can be collected relatively accurately • Form part of one or more indicators
Essential data sets (EDS) Hierarchy of standards
Example of a National Data Dictionary • Key issues • Top-down vs bottom-up approaches • Who to involve in discussions • Maximalist vs minimalist approaches • ZA National Data Dictionary
EDS Example: vaccination data Input (community and facility levels) • Staff attendance, vaccines, to whom, when, where Process (district) • # Children vaccinated Output (province) • Coverage of child immunisation Outcome (national) • Decreased incidence of vaccine-preventable diseases Impact (international) • Decreased mortality, healthier children
Prioritising data in the EDS: • Finagle’s Law: • The information you have is not what you want; • The information you want is not what you need; • The information you need is what you can get; • The information you can get costs more than you want to pay!
Comparability of collected data Stable standardised definitions • To ensure spatial comparability between different facilities, districts, provinces and nations • To ensure comparability over time • What do you think about this statement: • “Revising poor indicators / data sets / data elements may not be advisable due to cost and loss of backward comparability”
Where do we get data from? Routine data collection • Routine health unit and community data • Activity data about patients seen and programmes run, routine services and epidemiological surveillance • Semi-permanent data about the population served, the facility itself and the staff that run it • Civil registration (vital events being integrated with health, e.g. India) • Non-routine data collection • Surveys • Population census (headcounts, proportions per facility catchment area) • Quantitative or qualitative rapid assessment methods
In Summary: Data collection Input: Using data sources and tools to collect quality data Common problems: • Too much to collect • Poor understanding of data collection tools • Timeliness of reporting • Low data quality Output: relevant data
Data Processing: • What observations can you make about your experience in processing the data so far?
Processing: assuring data quality and calculating indicators • Turning data into information • How to assess data quality? • What are indicators, and why do we need them?
Why is checking data vital? • Use of inaccurate data leads to • Wrong priorities (focus on the wrong data) • Wrong decisions (not applying the right actions) • Garbage in = garbage out • Producing data is expensive • Collecting poor data is a waste of resources
Routine data should be… Reliable: correct, complete, consistent Timely: fixed deadlines for reporting Actionable: no action = throw the data away Comparable: the same numerator and denominator definitions used by all data processors BUT striving for comparability can compromise local relevance
Complete data? • Spatial: submission by all (most) reporting facilities • Timely: is the data available within the required time • Temporal: can you do analysis over time?
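Spatial completeness is usually summarised as a reporting rate: the share of expected reporting units whose reports actually arrived. A minimal sketch, using made-up facility names:

```python
# Hypothetical district: four facilities are expected to report each month
expected_facilities = {"clinic_a", "clinic_b", "clinic_c", "clinic_d"}
reports_received = {"clinic_a", "clinic_c", "clinic_d"}

# Reporting rate = reports received / reports expected
reporting_rate = len(reports_received & expected_facilities) / len(expected_facilities)

# Naming the non-reporters is what makes the figure actionable
missing = expected_facilities - reports_received

print(f"Reporting rate: {reporting_rate:.0%}")  # Reporting rate: 75%
print(f"Missing: {sorted(missing)}")            # Missing: ['clinic_b']
```

The same calculation, run against submission timestamps instead of a received/not-received flag, gives the timeliness rate.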
Timely data? • Late reports weaken the potential for comparison and may make action come too late, but they are still useful for documenting trends; • Better to use the data that you have, even if incomplete: “Perfection is the enemy of good”
Correct data? • Are we collecting the data we need? • Do the data values seem sensible/plausible? • Is the same definition applied uniformly? • Are there any preferential end digits?
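Preferential end digits (e.g. measurements heaped on values ending in 0 or 5) can be screened for by tabulating terminal digits: with honest measurement each digit should appear roughly 10% of the time. A minimal sketch with invented birth weights:

```python
from collections import Counter

# Made-up recorded birth weights in grams; values heaped on a trailing 0
weights = [3050, 3214, 3500, 2500, 3000, 3127, 2800, 3000, 3500, 3453]

end_digits = Counter(w % 10 for w in weights)
total = len(weights)

# Under no digit preference, each terminal digit appears ~10% of the time
for digit in range(10):
    share = end_digits.get(digit, 0) / total
    print(f"digit {digit}: {share:.0%}")
```

Here the digit 0 accounts for 70% of values, a strong hint that weights are being rounded rather than read off the scale.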
Consistent data? Data in a similar range to this time last year, or to comparable organisation units No large gaps or missing data No multiplicity of data (the same data from multiple sources – which one to trust?)
What are the causes of poor data quality? • Too many forms to fill out that are not useful to health workers • Absent data collection tools (Nigeria) • Data collection tools are poorly designed and hard to understand • Too many steps of manual aggregation and transfer of figures (next slide) • Limited feedback on data quality to those who collect it • Data is not used
Mate KS, Bennett B, Mphatswe W, Barker P, Rollins N (2009) Challenges for Routine Health System Data Management in a Large Public Programme to Prevent Mother-to-Child HIV Transmission in South Africa. PLoS ONE 4(5): e5483. doi:10.1371/journal.pone.
Data quality affected by (doctor or nurse interacts with patient; data transcribed at each step):
• Step 1 – Patient record: incomplete, illegible, undated data
• Step 2 – Manual recording; sub-set of data recorded in register and/or tally sheet: multiplicity of DCTs, duplicated, non-standardised
• Step 3 – Monthly summary report compiled: inability to collate data accurately
• Step 4 – Monthly summaries collated: inability to collate data accurately
• Step 5 – Data capture in DHIS: data capture errors, incorrect data elements activated, validation not done
• Step 6 – Data analysis and feedback: no feedback, little data analysis by programme managers
Strategies to improve DQ (training and skills development, financial, technology, supervision), by step:
• Step 1 – Patient record (incomplete, illegible, undated data): supervision; in-service training and formal courses
• Step 2 – Sub-set of data recorded in register and/or tally sheet (multiplicity of DCTs, duplicated, non-standardised): in-service training and formal courses; supervision; 1) use of DHIS daily data capture and eTools, 2) electronic sign-off of data, 3) facility-level capture of ART & TB data; 1) improved printing of DCTs, 2) hardware & software at facilities, 3) HIS staffing
• Step 3 – Monthly summary report compiled (inability to collate data accurately): in-service training and formal courses; supervision
• Step 4 – Monthly summaries collated (inability to collate data accurately): supervision
• Step 5 – Data capture in DHIS (data capture errors, incorrect data elements activated, validation not done): formal courses on data validation, feedback, checking, etc.; data capture forms; correct data element activation; supervision
• Step 6 – Data analysis and feedback (no feedback, little data analysis by programme managers): in-service training and formal courses; auto-reports as “push” feedback
eTool Scenarios: Excel Aggregation
• Step 1 – Patient record (doctor or nurse interacts with patient; data transcribed)
• Step 2 – Manual recording: sub-set of data recorded in register and/or tally sheet
• Step 3 – Monthly summary report compiled: Excel aggregation tool in facilities
• Step 4 – Monthly summaries collated (electronic data transfer)
• Step 5 – Data capture in DHIS
• Step 6 – Data analysis and feedback
Easy to install and scale across facilities with computers on site
eTool Scenarios: DDC in DHIS 1.4
• Step 1 – Patient record (doctor or nurse interacts with patient; data transcribed)
• Step 2 – Manual recording: daily data capture on DHIS 1.4 in facilities; sub-set of data recorded in register and/or tally sheet
• Step 3 – Monthly summary report compiled
• Step 4 – Monthly summaries collated (electronic data transfer)
• Step 5 – Data available in DHIS
• Step 6 – Data analysis and feedback
• Already available for Midnight Census in hospitals • Requires DHIS in facilities – useful for some larger PHC facilities
eTool Scenarios: DDC in DHIS2
• Step 1 – Patient record (doctor or nurse interacts with patient; data transcribed)
• Step 2 – Manual recording: daily data capture on DHIS2 on a central server; sub-set of data recorded in register and/or tally sheet
• Step 3 – Monthly summary report compiled
• Step 4 – Monthly summaries collated (electronic data transfer)
• Step 5 – Data available in DHIS
• Step 6 – Data analysis and feedback
• Revolutionises the availability of data and feedback processes • Aligns NIDS with data collection tools immediately • Use of tablets could replace paper registers
eTool Scenarios: EPR systems
• Step 1 – Electronic patient record (doctor or nurse interacts with patient; data transcribed)
• Step 2 – Sub-set of data recorded in register and/or tally sheet
• Step 3 – Monthly summary report compiled (electronic data transfer)
• Step 4 – Monthly summaries collated
• Step 5 – Data capture in DHIS
• Step 6 – Data analysis and feedback
• Potential expansion of EPR systems to accommodate all kinds of chronic illnesses
What can be done to improve data quality? 1. Assess the cause by using the Information Cycle as the basis 2. Programmatic Issues • Essential dataset • Feedback routines • Use of Information 3. Database validation mechanisms • Min/Max rules in software • Data validation rules, check for consistency in logic of data • Completeness and timeliness reports
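The database validation mechanisms above can be sketched in a few lines: min/max rules flag implausible values, and a validation rule checks logical consistency between data elements. Thresholds, element names and figures here are hypothetical, not DHIS defaults:

```python
# Min/max rules per data element (assumed plausible ranges)
min_max_rules = {
    "bcg_doses_given": (0, 600),
    "children_vaccinated_bcg": (0, 600),
}

# A monthly record with a likely data-entry error
monthly_record = {"bcg_doses_given": 120, "children_vaccinated_bcg": 1350}

violations = []
for element, value in monthly_record.items():
    lo, hi = min_max_rules[element]
    if not lo <= value <= hi:
        violations.append(f"{element}={value} outside plausible range [{lo}, {hi}]")

# Validation rule: children vaccinated cannot exceed doses given
if monthly_record["children_vaccinated_bcg"] > monthly_record["bcg_doses_given"]:
    violations.append("children_vaccinated_bcg exceeds bcg_doses_given")

for v in violations:
    print(v)
```

Run automatically at data entry, checks like these give the immediate feedback that the earlier slides identify as missing from most manual workflows.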
Indicators – measure service COVERAGE and QUALITY Calculated by combining two or more pieces of data, so that they can • measure trends over time • provide a yardstick by which facilities / teams can compare themselves to others (spatial, organisational) • monitor progress towards defined targets • measure change To do this, indicators need a numerator and a denominator
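The numerator/denominator structure can be shown with a coverage indicator; the figures below are illustrative, not real programme data:

```python
def coverage(numerator: int, denominator: int) -> float:
    """Coverage indicator as a percentage (e.g. immunisation coverage)."""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return 100 * numerator / denominator

children_immunised = 870    # numerator: service output (routine data)
children_under_one = 1000   # denominator: target population (census estimate)

print(f"Immunisation coverage: {coverage(children_immunised, children_under_one):.1f}%")
# Immunisation coverage: 87.0%
```

Because the raw count (870) is divided by the target population, the result is comparable across facilities of different sizes and across years, which raw counts alone are not.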