Future directions for CHA’s Benchmarking Member Service
Performance
• Dashboard indicators are being collected consistently across CHA, although gaps remain.
• Performance needs to be analysed within the appropriate context.
• Hospitals treat patients with different casemix and severity, which can affect outcomes.
• To make meaningful and fair comparisons, future studies need to allow for risk adjustment, including age, sex, co-morbidity, socio-economic status and the need for transfers (a minimal sketch of one common approach follows this list).
• Many outcome indicators are more meaningful when measured at the clinical procedure or condition level, rather than hospital-wide.
• Better to compare ‘like’ patient groups with similar risks and outcomes.
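As a minimal sketch of what risk adjustment could look like in practice, the snippet below uses indirect standardisation: fit a pooled risk model on casemix covariates, then compare each hospital's observed outcomes with the outcomes expected for its own patient mix. This is only an illustration, not the CHA dashboard method; the column names (age, sex, comorbidity_score, outcome, hospital) are assumed for the example.

```python
# Illustrative sketch only: indirect standardisation of an adverse-outcome
# rate. Column names and the choice of covariates are assumptions, not the
# CHA dashboard schema.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def risk_adjusted_ratios(df: pd.DataFrame) -> pd.DataFrame:
    """Return observed vs expected adverse-outcome counts per hospital."""
    features = ["age", "sex", "comorbidity_score"]  # hypothetical casemix fields
    X = pd.get_dummies(df[features], drop_first=True)
    y = df["outcome"]  # 1 = adverse outcome, 0 = no adverse outcome

    # Fit a single pooled risk model across all member hospitals.
    model = LogisticRegression(max_iter=1000).fit(X, y)
    df = df.assign(expected=model.predict_proba(X)[:, 1])

    summary = df.groupby("hospital").agg(
        observed=("outcome", "sum"),
        expected=("expected", "sum"),
        patients=("outcome", "size"),
    )
    # Ratio > 1 suggests worse-than-expected outcomes for that casemix.
    summary["o_e_ratio"] = summary["observed"] / summary["expected"]
    return summary
```

The observed/expected ratio gives a casemix-aware comparison between hospitals; a crude rate comparison would penalise hospitals that treat sicker or more complex patients.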
CHA Dashboard Indicators Report 2010–11
• Currently some 60 indicators.
• The 2011 meeting gave feedback on improvements.
• Need to think about improving structure and adding value:
  • Fit within a broader framework.
  • Start to report trends by year (control charts; a minimal sketch follows this list).
  • Would be helpful to map indicators to existing collections to show overlap, e.g. performance agreements, accreditation and other collections.
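For the control-chart idea mentioned above, the sketch below shows one simple option: a Shewhart p-chart that flags years whose indicator rate falls outside 3-sigma limits around the pooled rate. It assumes each year's indicator is reported as a numerator/denominator pair; the example figures are illustrative only, not CHA data.

```python
# Minimal p-chart sketch for a yearly dashboard indicator, assuming the
# indicator is a proportion reported as (numerator, denominator) per year.
import math

def p_chart(years, numerators, denominators):
    """Yield (year, rate, lower_limit, upper_limit, signal) for each year."""
    p_bar = sum(numerators) / sum(denominators)  # centre line: pooled rate
    for year, num, den in zip(years, numerators, denominators):
        rate = num / den
        sigma = math.sqrt(p_bar * (1 - p_bar) / den)
        lower = max(0.0, p_bar - 3 * sigma)
        upper = min(1.0, p_bar + 3 * sigma)
        # 'signal' is True when the year sits outside the control limits.
        yield year, rate, lower, upper, not (lower <= rate <= upper)

# Usage with illustrative (made-up) yearly counts:
for row in p_chart([2008, 2009, 2010, 2011],
                   [42, 55, 38, 71],
                   [1200, 1250, 1180, 1300]):
    print(row)
```

A chart like this distinguishes ordinary year-to-year variation from changes large enough to warrant follow-up, which is the value of reporting trends rather than single-year snapshots.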
Aligning performance measurement
• Fit within a broader framework:
  • National
  • Jurisdictional
• Ensure relevance to paediatrics.
• Influence how we deliver care.
• Avoid duplication.
Key questions
• Do the current indicators reflect paediatric priorities? (Y/N)
  • Where are the gaps?
  • Collection difficulties, e.g. infection control, mental health.
• Are the current indicators of value to member hospitals? (Y/N)
  • What can be done to improve their utility?
  • Large vs small hospitals.
  • Other indicators, e.g. HR.
• How can the dashboard indicator program be improved? How could the presentation of the data be improved?
The approach
• Divide into groups.
• Each group appoints a scribe.
• All groups address the same question (5 minutes).
• All move to a new table except the scribe.
• Repeat the process for the remaining questions in sequence.
• Collate responses.
• Summary feedback to the group.
Feedback: Question 1 – Do the current indicators reflect paediatric priorities?
• The indicators do not necessarily reflect agreed paediatric priorities.
• They are driven by performance frameworks in each state, national frameworks, ACHS, and quality and safety requirements.
• They are not always relevant to paediatrics.
• Many other important collections by craft groups are not reflected.
• We currently collect too many indicators, yet have emerging gaps in areas such as the new health reforms, ambulatory/outpatient care, mental health, immunisation, obesity, diabetes and respite care.
Feedback: Question 2 – Are the current indicators of value to member hospitals?
• A better understanding is developing with each collection.
• The data set is becoming increasingly dependable and provides basic reassurance, but is still patchy.
• It is helpful to know hospitals are comparing like with like, but concern was raised that members do not necessarily apply consistent criteria.
• Some indicators lack clarity and tend to be more administrative than clinical; a balance will need to be struck.
• The level of detail is insufficient for more comprehensive comparative or advocacy purposes.
• The collection is a burden for smaller hospitals, which would get value from a smaller data set.
• Feedback needs to be timely to be of value.
• Other new measures: as per Question 1, plus sick leave.
• Overlap with other CHA data collections (benchmarking).
Feedback: Question 3 – How can the dashboard indicator program be improved?
• Agree on paediatric priority areas and the indicator set to monitor.
• Consider a single annual collection, with consolidation of CHA databases, to reduce duplication of effort and facilitate integrated reporting.
• Reduce to a relevant KPI set for paediatrics that addresses agreed existing and emerging priorities.
• Use the program as an opportunity to influence measures in other collections.
• Foster clinician leadership and engagement at each hospital, and encourage clinician feedback on relevant data and performance measures, e.g. diabetes.
• Redefine facility size groupings into realistic groups for both data collection and reporting.
• Enhance reporting to flag trends and significant changes, and report selectively at a specialty level.
• Learn from innovation at other sites: relate clinical models to indicators and benchmark data; relate to other (specialty) indicators.
• Improve the consistency, timeliness and process of collections at sites.
• Increase the information-management skill base in hospitals.
Next steps
• Circulate to participants for review and confirmation.
• Refer to the Dashboard Indicator Steering Group for action.
• Brief the CHA Board.