Workshop: Lighthouse Data
Building a Lighthouse in a Sea of Data
IHI APAC Forum, New Zealand – September 2012
Richard Hamblin – HQSC; Andrew Terris – Patients First
Workshop • By the end of this session you will be • able to identify the focal points to drive a small number of measures that will influence quality • equipped with an approach to ensure measures have broad input while remaining consistent in definition and purpose • aware of some of the pitfalls of selecting and combining measures to stimulate improvement
Answering the Key Questions How do you know you are any good? How do you rank with the best? How do you know you are improving?
Agenda Why measurement matters Why it can go wrong How to avoid this How Health Quality Measures New Zealand helps
The four stages of denial • The data isn’t right! • The data is right but it isn’t relevant to me! • The data is right and it is relevant to me but there’s nothing I can do about it! • The data is right • It is relevant to me • I can do something about it • Better get on with it
Why measurement matters ‘We can only be sure to improve what we can actually measure.’ Raleigh and Foot 2010 ‘Evidence suggests that publicly releasing performance data stimulates quality improvement activity at the hospital level.’ Fung et al 2008
How is reporting performance supposed to work? [Diagram: publicly reported performance data works through knowledge and motivation to drive selection and change, improving performance – effectiveness of care, safety, patient-centredness – with possible unintended consequences. From Berwick, James and Coye 2003]
Links to incentives • Intrinsic – altruism/professionalism • 60k want to do the right thing • Implicit – kudos and censure • It’s not my DHB, is it? • Indirect – market share/career enhancement • Gongs and gold • Direct – P4P, targets with explicit rewards and sanctions • Punished by reward?
But sometimes the change is perverse NHS ambulance response times New York Cardiac Surgery P4P in California Mid Staffordshire Hospital
Why? Rarely because of bad people (Bevan and Hood, 2005)
So, why does it go wrong? Wrong approach Wrong framing Wrong system incentives Wrong measure Wrong construction
A brief word about data quality: “data quality” is often a cop-out. Get the rest wrong and perfect data will still not work; get the rest right and imperfect data will still be useful.
Wrong approach Framing systems help to avoid this e.g. Donabedian (structure, process, outcome) Carter and Klein (tin-openers and dials)
What structure, process and outcome actually are: Structure is what you put into a system. Process is what you actually do. Outcome is what happens as a result. So: use of a specific electronic system is not a process, and clinical actions are not outcomes.
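Not from the slides: a minimal sketch of how a measure catalogue might tag each measure with its Donabedian type. The measure names are invented examples, chosen only to echo the distinction above.

```python
from enum import Enum

class DonabedianType(Enum):
    STRUCTURE = "what you put into a system"
    PROCESS = "what you actually do"
    OUTCOME = "what happens as a result"

# Hypothetical measure catalogue: having an electronic system in place is
# structure (not process), and a clinical action is process (not outcome).
measures = {
    "Electronic prescribing system in place": DonabedianType.STRUCTURE,
    "Antibiotics given within one hour of diagnosis": DonabedianType.PROCESS,
    "30-day post-operative mortality": DonabedianType.OUTCOME,
}

for name, kind in measures.items():
    print(f"{name}: {kind.name.lower()} ({kind.value})")
```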
How outcome contextualises process • “Hitting the target and missing the point” • Failure to recognise improvement • Gaming and other nasty things • But did it make any difference? [Chart: a process measure swinging between elation and despair]
Conclusion: process is not a proxy for outcome – doing the “right” thing may not lead to the desired outcome, and you may end up incentivising some pretty damaging behaviour.
NHS ambulance response times: 75% of life-threatening calls responded to within 8 minutes – a clinically relevant process
NHS ambulance response times: 75% of life-threatening calls responded to within 8 minutes – a clinically relevant process. ‘Corrections’ only 2% to 6%. [Chart: distributions of recorded response times, with a spike just below the 8-minute threshold – 75% < 8 minutes.] Source: http://www.chi.nhs.uk/eng/cgr/ambulance/index.shtml
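To make the point concrete, here is an illustrative sketch (invented response times, not the CHI data) of how small “corrections” to times recorded just over the boundary can move the headline 75%-within-8-minutes figure.

```python
# Illustrative only: invented response times in minutes, not real ambulance data.
recorded = [4.5, 6.0, 7.9, 8.1, 8.2, 9.5, 7.0, 8.05, 6.5, 12.0]

def pct_within(times, threshold=8.0):
    """Headline figure: percentage of calls responded to within the threshold."""
    return 100 * sum(t <= threshold for t in times) / len(times)

print(f"As recorded: {pct_within(recorded):.0f}% within 8 minutes")

# 'Correct' any time logged just over the boundary back under it.
corrected = [7.99 if 8.0 < t <= 8.3 else t for t in recorded]
print(f"After corrections: {pct_within(corrected):.0f}% within 8 minutes")
```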
Variations in outcome • At least four causes • Genuine variation in quality • Unadjustable variation in casemix • Local issues of organisation and recording • Statistical noise • Or some combination thereof – and you don’t know which is causing the variation from the data alone
Implications: 1) setting targets for outcomes is impossible; 2) simplistic benchmarking of outcomes is counter-productive
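One practical way to act on this (not prescribed by the slides) is to check each unit’s rate against funnel-plot-style control limits before treating the variation as genuine. A minimal sketch using approximate 95% binomial limits, with invented unit names and numbers:

```python
import math

# Hypothetical units: (name, adverse events, cases). Numbers are invented.
units = [("DHB A", 18, 400), ("DHB B", 9, 150), ("DHB C", 55, 500)]

overall_rate = sum(e for _, e, _ in units) / sum(n for _, _, n in units)

for name, events, n in units:
    rate = events / n
    # Approximate 95% binomial control limits around the overall rate
    se = math.sqrt(overall_rate * (1 - overall_rate) / n)
    lower, upper = overall_rate - 1.96 * se, overall_rate + 1.96 * se
    flag = ("outside limits - worth a closer look"
            if not (lower <= rate <= upper) else "consistent with noise")
    print(f"{name}: {rate:.1%} vs limits [{lower:.1%}, {upper:.1%}] -> {flag}")
```

A unit flagged as outside the limits is not thereby “good” or “bad”; it is simply where the tin-opener questions start.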
How outcome contextualises process – the process/outcome matrix:
• Process improved, outcome improved: looks to be working (but keep watch out for confounders!)
• Process improved, outcome not improved: hitting the target and missing the point? Is there a new problem?
• Process not improved, outcome improved: what else is happening? Regression to the mean?
• Process not improved, outcome not improved: get on with it!
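A minimal sketch (my reading of the matrix above, not from the slides) mapping whether the process and outcome measures have improved to the corresponding interpretation:

```python
def interpret(process_improved: bool, outcome_improved: bool) -> str:
    """Read off the process/outcome matrix above."""
    if process_improved and outcome_improved:
        return "Looks to be working (but keep watch out for confounders!)"
    if process_improved and not outcome_improved:
        return "Hitting the target and missing the point? Is there a new problem?"
    if not process_improved and outcome_improved:
        return "What else is happening? Regression to the mean?"
    return "Get on with it!"

print(interpret(True, False))
```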
Tin openers and dials • Concept from Carter and Klein (e.g. 1992) • Tin openers open up cans of worms • Dials measure things • Most of the time you need to ask the right questions as much as you need to get the right answers
Wrong framing Four frames of measurement Aggregation, aggroupation and synecdoche
Aggregation, Aggroupation, Synecdoche Summary in a sea of data Aggregation – some sort of arithmetic summary of multiple pieces of data Aggroupation – organisation to present a (usually visual) analogue Synecdoche – assuming the part represents the whole
Aggregation – risks
Hospital X Quality Scorecard
• HSMR: 97
• Patient satisfaction rating: top quartile
• Days since CLAB: 38
• Overall grade: B+
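To illustrate the risk, a hypothetical sketch of how an “overall grade” like the one above might be built, and how averaging lets a serious failure on one measure hide behind good results elsewhere. The measures, scores and weights are all invented.

```python
# Hypothetical component scores on a common 0-100 scale, with invented weights.
scorecard = {
    "Mortality (HSMR)":          {"score": 85, "weight": 0.4},
    "Patient satisfaction":      {"score": 95, "weight": 0.3},
    "Infection control (CLAB)":  {"score": 40, "weight": 0.3},  # a serious problem
}

composite = sum(m["score"] * m["weight"] for m in scorecard.values())
print(f"Composite score: {composite:.1f}/100")  # a respectable-looking overall figure

# The aggregate hides the failing component unless you look underneath it.
for name, m in scorecard.items():
    if m["score"] < 50:
        print(f"Masked by the aggregate: {name} scored only {m['score']}")
```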
Even if you’re sophisticated you still have the same problem
Synecdoche What if the part isn’t the whole? e.g. Low waiting times do not equal high quality
Wrong system incentives There are times when intrinsic motivation isn’t enough
Measures – for what? Judgement or improvement?
Measures For? http://www.kingsfund.org.uk/publications/quality_measures.html
So, how do you view the three faces of performance measurement? As research, as improvement, or as judgement?
…but MRSA too [Chart: MRSA rates, baseline year vs target year]
Wrong measure Doesn’t measure what it sets out to Impossibility of data collection Ambiguity of interpretation Over-interpretation
Examples Unacknowledged use of structural measures as proxies for outcomes Average LOS as a measure of whole system productivity Mortality rates as direct measures of quality
Wrong construction Technical checklist Threshold effects
Technical checklist Numerator/denominator consistency Standardisation – when and how Exclusions Attribution
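As an example of the standardisation item in the checklist above, a minimal sketch of an indirectly standardised ratio – the same construction as an HSMR: observed events divided by the events expected given the casemix, times 100. The strata, counts and reference rates are invented.

```python
# Hypothetical casemix strata: (label, cases in this hospital, reference death rate)
strata = [
    ("low risk",    800, 0.01),
    ("medium risk", 300, 0.05),
    ("high risk",   100, 0.20),
]

observed_deaths = 41
expected_deaths = sum(n * rate for _, n, rate in strata)  # 8 + 15 + 20 = 43

smr = 100 * observed_deaths / expected_deaths
print(f"Expected deaths: {expected_deaths:.0f}")
print(f"Standardised ratio: {smr:.0f}  (100 = as expected given casemix)")
```

Numerator/denominator consistency matters here too: the observed deaths must be counted over exactly the same cases, period and attribution rules as the expected deaths.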
Threshold effects • Ambulances • ED/A&E time limits • Measure distributions as well as measure effects • Seriously consider whether directly incentivising around the threshold is a good idea
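A minimal sketch (invented ED lengths of stay, and an assumed 6-hour limit) of the “measure distributions” point: compare the headline percentage under the limit with a quick look at how much of the compliant activity bunches just under the threshold.

```python
# Invented ED lengths of stay in hours; illustrative only.
stays = [1.2, 2.5, 3.0, 5.8, 5.9, 5.95, 5.97, 6.4, 7.5, 4.0, 5.85, 2.0]

threshold = 6.0
within = [t for t in stays if t <= threshold]
print(f"Headline: {100 * len(within) / len(stays):.0f}% within {threshold:.0f} hours")

# How much of the 'success' sits just under the threshold?
just_under = [t for t in within if t > threshold - 0.25]
print(f"{len(just_under)} of {len(within)} compliant stays fall in the last "
      "15 minutes before the threshold - a hint of threshold-driven behaviour")
```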
Exercise 2 – taken offline in the interests of time
The Approach
• Creating measures that matter – the “pull” of demand from various health sector stakeholders
• Democratising the process – balancing top-down consistency with bottom-up feedback and design
• Aligning information – road-testing measures
• Considering the system view – balancing process, system and outcome measures
• The information considerations – dataset, utility and relevance at local and aggregate levels
Patients First Framework [Diagram: Outcomes 1–3, informed by models that create the “pull” through a quality lens (measurement, evaluation, prioritisation, quality improvement), which in turn informs and is informed by the information layer – inter-operability, dataset, shared care, PMS requirements, GP2GP, pathways.]
Key Principles Collaborative/peer to peer Share experience/expertise Promote transparency Be sustainable Avoid duplication Create a system that can evolve Web based