
Analysing mortality data derived from Secondary User Services

This analysis aims to identify unexpected variation and areas of concern in mortality rates, enabling informed decision-making for improvement. It includes national benchmarking and the use of indicators such as the Hospital Standardised Mortality Ratio (HSMR) and the Summary Hospital Level Mortality Indicator (SHMI).





  1. Analysing mortality data derived from Secondary User Services

  2. Purpose Of Mortality Indicators
  • Highlight unexpected variation and areas of concern for further investigation
  • Enable the Trust to make more informed decisions to drive change and improvement
  • Demonstrate progress towards a reduction in avoidable deaths
  • Understanding variation in mortality rates leads to the spread of best practice
  • A suite of indicators should always be used when analysing and interpreting mortality data

  3. National Benchmarking
  • Enables the Trust to compare itself against peers and provides expected rates
  • Uses data from a full financial year
  • Currently based on 2012/13 Secondary User Services (SUS) data

  4. Hospital Standardised Mortality Ratio (HSMR)
  • The ratio of observed deaths to expected deaths for 56 diagnosis groups, often expressed as a percentage – if greater than 100% then mortality has exceeded the expected level
  • These groups account for 80% of inpatient deaths
  • Expected deaths are calculated using crude mortality data adjusted for the profile of a hospital’s patients
  • Factors influencing that adjustment include primary diagnosis, age, sex, co-morbidity, deprivation and method of admission

  5. Hospital Standardised Mortality Ratio (HSMR)
  • It is important not to use HSMRs in isolation, and confidence intervals are vital
  • HSMRs can be distorted by changes in coding practice, e.g. coding of:
  • Primary diagnosis
  • Inclusion or exclusion of palliative care codes
  • Depth of coding (co-morbidities)
  • Place of death (more patients may die in hospital if community alternatives are limited)
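As an illustration of why confidence intervals matter, the sketch below computes an HSMR as observed/expected × 100 together with a 95% confidence interval using Byar's approximation to the exact Poisson limits. The figures are the septicaemia numbers quoted on slide 6; the choice of Byar's method is an assumption for illustration, as the slides do not state which interval method is used:

```python
import math

def hsmr(observed: int, expected: float) -> float:
    """Standardised mortality ratio as a percentage: >100 means more deaths than expected."""
    return 100 * observed / expected

def byar_ci(observed: int, expected: float, z: float = 1.96):
    """Approximate 95% confidence interval for an HSMR/SMR using Byar's method.

    Byar's approximation to the exact Poisson limits on the observed death
    count, divided by the expected count and scaled to a percentage.
    Assumes observed > 0.
    """
    o = observed
    lower = o * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3
    upper = (o + 1) * (1 - 1 / (9 * (o + 1)) + z / (3 * math.sqrt(o + 1))) ** 3
    return 100 * lower / expected, 100 * upper / expected

# Septicaemia figures from slide 6: 75 observed vs 54.45 expected deaths
ratio = hsmr(75, 54.45)         # ≈ 137.74, matching the slide
low, high = byar_ci(75, 54.45)  # whole interval sits above 100 → significantly high
```

Because the lower limit stays above 100, the excess cannot be dismissed as random variation, which is why the slides flag it for investigation rather than relying on the point estimate alone.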

  6. HSMR December 2013 – November 2014
  • For the rolling 12 month period HSMR is 86.45 (significantly lower than expected)
  • Shows a reducing trend
  • WHHT is one of 9 Trusts (out of 17) across the region with a lower than expected HSMR
  • Significant difference between the weekday and weekend HSMR for emergency admissions (84.99 and 91.86), but neither is higher than expected
  • However the HSMR for septicaemia (except in labour) is 137.74 and is significantly higher than expected (75 deaths vs 54.45 expected); analysis has shown this is due to coding mistakes

  7. HSMR Trend December 2010 – November 2014

  8. HSMR Trend December 2013 – November 2014

  9. Standardised Mortality Rate (SMR)
  • Ratio of observed to expected deaths
  • Expected deaths are calculated for a typical area with the same case mix adjustment
  • May be quoted as a percentage
  • If higher than 100%, then observed deaths are higher than expected

  10. SMR December 2013 – November 2014
  • All diagnosis SMR is 83.63, which is lower than expected
  • There are however two diagnosis groups with a higher than expected SMR:
  • Septicaemia (except in labour)
  • Rehabilitation care, fitting of prostheses and adjustment of devices
  • SMR by Division

  11. SMR Divisional Diagnosis Group Outliers

  12. All Diagnosis SMR Trend December 2010 – November 2014

  13. All Diagnosis SMR Trend December 2013 – November 2014

  14. All Diagnosis SMR Peer Comparison December 2010 – November 2011

  15. All Diagnosis SMR Peer Comparison December 2011 – November 2012

  16. All Diagnosis SMR December 2012 – November 2013

  17. All Diagnosis SMR Peer Comparison December 2013 – November 2014

  18. Summary Hospital Level Mortality Indicator (SHMI)
  • Mortality at Trust level across NHS England
  • Published quarterly by the Health & Social Care Information Centre (HSCIC) since October 2011
  • It is the ratio between the number of patients who die following hospitalisation at the Trust and the number of patients expected to die
  • All deaths in hospital or within 30 days post discharge are counted
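The 30-day counting rule above can be sketched as a simple predicate. This is an illustrative simplification: the published SHMI methodology defines the cohort, spells and exclusions in far more detail, and the function name here is hypothetical:

```python
from datetime import date
from typing import Optional

def counts_towards_shmi(discharge: date,
                        death: Optional[date],
                        died_in_hospital: bool) -> bool:
    """A death counts towards SHMI if it occurred in hospital
    or within 30 days of discharge."""
    if death is None:
        return False          # patient survived
    if died_in_hospital:
        return True           # in-hospital deaths always count
    return (death - discharge).days <= 30

# A death exactly 30 days after discharge counts; one at 31 days does not
assert counts_towards_shmi(date(2014, 1, 1), date(2014, 1, 31), False)
assert not counts_towards_shmi(date(2014, 1, 1), date(2014, 2, 1), False)
```

The post-discharge window is what makes SHMI less sensitive than HSMR to local differences in where patients die, a point the comparison slide below returns to.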

  19. Summary Hospital Level Mortality Indicator
  • Expected deaths are based on the England average, given the characteristics of the patients
  • SHMI may also be expressed as a percentage; if greater than 100% then mortality has exceeded the expected level

  20. Differences Between SHMI and HSMR
  • SHMI includes deaths occurring outside of hospital
  • HSMR only includes in-hospital deaths
  • SHMI includes deaths from all Clinical Classification System (CCS) groups
  • HSMR includes deaths from 56 CCS groups
  • Variables used in the statistical model to calculate estimated deaths differ, e.g.
  • SHMI does not include adjustment for palliative care codes or deprivation
  • HSMR does include adjustment for palliative care codes and deprivation

  21. SHMI June 2013 – June 2014
  • SHMI 90.33 – within expected range
  • SHMI (in hospital) 88.75 – significantly lower than expected
  • SHMI (adjusted for palliative care) 90.51 – significantly lower than expected
  • 3 diagnosis groups with a significantly higher than expected SHMI:
  • Septicaemia 178.59
  • Cancer of breast 193.32
  • Leukaemias 194.67
  • Hence all deaths are now coded by a consultant

  22. SHMI Trend July 2010 – June 2014

  23. SHMI Trend Quarter 2 2011 – Quarter 1 2014

  24. SHMI and HSMR By Peers For All Admissions: July 2010 – June 2011 and July 2011 – June 2012

  25. SHMI and HSMR By Peers For All Admissions: July 2012 – June 2013 and July 2013 – June 2014

  26. Cumulative Sum Analysis (CUSUM)
  • The CUSUM chart provides an early warning system for changing mortality rates
  • Plots patients’ actual outcomes against their expected outcomes sequentially over time; the chart has upper and lower thresholds, and breaching these thresholds triggers an alert
  • Can reveal when a change occurred
  • Is used by the CQC to monitor Trust performance
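A minimal sketch of the idea, assuming a simplified observed-minus-expected CUSUM with a single upper threshold. Real monitoring schemes such as the CQC's use risk-adjusted log-likelihood weights and calibrated thresholds; the threshold value and function name here are illustrative only:

```python
def cusum_alerts(outcomes, risks, threshold=2.0):
    """Upper CUSUM over a patient sequence.

    outcomes: 1 if the patient died, 0 otherwise
    risks:    each patient's predicted probability of death
    Accumulates observed-minus-expected deaths; an alert fires when the
    running sum breaches the threshold, after which it resets to zero.
    """
    alerts, c = [], 0.0
    for i, (y, p) in enumerate(zip(outcomes, risks)):
        c = max(0.0, c + (y - p))   # excess deaths so far; never below zero
        if c > threshold:
            alerts.append(i)        # flag the point where the threshold was breached
            c = 0.0                 # reset and continue monitoring
    return alerts

# Three early deaths among low-risk patients trip the alarm at the fourth patient
cusum_alerts([1, 0, 1, 1, 0, 0, 0, 0], [0.1] * 8)  # → [3]
```

Because the statistic accumulates sequentially, the index of the breach points to roughly when performance changed, which is what makes CUSUM useful as an early warning rather than a retrospective summary.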

  27. CUSUM December 2013 – November 2014

  28. Patient Safety Indicators
  • Currently two metrics are available through Dr Foster
  • Death in low-risk diagnosis groups, which is as expected
  • Death after surgery, which is also as expected

  29. Septicaemia Tracking
  • Relative risk remains significantly higher than expected for the rolling 12 month period, at 136.93
  • But relative risk is reducing month on month, and the latest rolling 6 month picture shows that relative risk is as expected, at 71.59
  • Correction of a coding error during the last 6 months has led to an improvement on the previous 6 months’ indicators
  • Amongst peers the Trust is improving its position

  30. Septicaemia Trend December 2013 – November 2014

  31. Septicaemia Trend December 2010 – November 2014

  32. Septicaemia vs. Peers December 2010 – November 2011

  33. Septicaemia vs. Peers December 2011 – November 2012

  34. Septicaemia vs. Peers December 2012 – November 2013

  35. Septicaemia vs. Peers – Current 6 Months (June 2014 – November 2014) vs. Previous (December 2013 – May 2014)

  36. Fractured Neck Of Femur (#NOF) Tracking
  • #NOF relative risk is statistically as expected at 108.62
  • An improvement from 2 months earlier, when relative risk was higher than expected at 118.57
  • 6 month data shows relative risk within the expected range at 90.16
  • Significantly higher than expected number of deaths on a Sunday and for those admitted on a Sunday, though this is not reflected across the 6 month data
  • Amongst peers the Trust is improving its position

  37. #NOF Trend December 2010 – November 2014

  38. #NOF Trend December 2013 – November 2014

  39. #NOF vs. Peers December 2010 – November 2011

  40. #NOF vs. Peers December 2011 – November 2012

  41. #NOF vs. Peers December 2012 – November 2013

  42. #NOF vs. Peers – Current 6 Months (June 2014 – November 2014) vs. Previous (December 2013 – May 2014)

  43. In Summary
  • The picture is one of general improvement
  • The Trust is performing well within its peer group
  • Several areas for further focus have been highlighted, including the difference between mortality on weekdays and weekend days
  • The data highlights the importance of correct coding and demonstrates the impact of coding errors on performance analysis (septicaemia)
