Explore board trustees' role in ensuring continuous quality improvement in healthcare, addressing past impediments and emphasizing clinical leadership. Learn about quality metrics, targets, and patient-centered care approaches for efficient, high-quality services.
Continuous Improvement in the Quality of Care – What is the role of the board?
Keith Palmer
April 22nd 2008
keith.palmer32@btinternet.com
The three questions for today
• What can boards of trusts (PCTs and providers) do to ensure greater attention is paid to continuous improvements in the quality of care?
• What have been the impediments to greater focus on quality in the past and how can they be overcome going forward?
• Why is clinical leadership central to quality improvement and how can it be brought about?
Why focus on quality?
• PCTs should be focused on the quality of care provided, the quality of patient outcomes and on quality improvement
• Hospital trusts should be focused on delivering a comprehensive quality improvement agenda – much more than 'must do' targets
• In the past, national targets focused on access times, a very narrow range of quality indicators and financial balance
• Healthcare Commission assessments focused primarily on processes rather than outcomes
• More transparency about the quality of care is essential for good commissioning and effective patient choice
• Need to address the concern that financial pressures could erode the quality of patient care
Quality, Efficiency and Funding
[Diagram: the three interlinked elements – quality, efficiency and funding]
High quality care should be affordable if the provider is efficient and funding is cost-reflective
What do we mean by high quality?
There will be different but overlapping quality metrics for commissioners and providers
Provider quality domains
• Clinical quality
• Patient safety
• Patient satisfaction
• Convenient care
• Timely care
Define quality metrics at service level eg cancer, cardiac, stroke, maternity etc – and aggregate at Trust level
There will be different key quality metrics for different services
Services provided by tertiary hospital
• Cancer, heart disease and stroke – the three big killers – account for 30-40% of activity by value
• Emergency medicine accounts for 15-20% of activity
• Women's and children's services account for c 10-15% of activity
• Management of kidney disease accounts for 5-10% of activity
• Elective surgery accounts for c 10% of activity
Appropriate quality metrics will differ considerably depending on the nature of each service
What do we mean by quality improvement? Quality improvement is measured by progress from baseline metric values to ‘best practice’ metric values
Setting quality improvement targets
• International and national evidence-based benchmarks – adjusted for local circumstances
• Agree desired medium term values and determine achievable trajectory for improvement (a simple sketch of such a trajectory follows below)
[Chart: metric value improving year by year from the baseline value towards the best-practice value, with annual plan targets marking points along the trajectory]
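As a rough illustration of how an achievable trajectory might be expressed, the sketch below interpolates annual plan targets between a baseline metric value and an agreed best-practice value. The metric and all figures are hypothetical, not real benchmarks.

```python
# Minimal sketch of an improvement trajectory from a baseline metric value
# towards a best-practice value. All figures are hypothetical illustrations.

def annual_targets(baseline: float, best_practice: float, years: int) -> list[float]:
    """Return evenly spaced annual plan targets from baseline to best practice."""
    step = (best_practice - baseline) / years
    return [round(baseline + step * year, 1) for year in range(1, years + 1)]

if __name__ == "__main__":
    # Example: 30-day stroke mortality (%), improving from a baseline of 25%
    # to a hypothetical best-practice value of 18% over four annual plans.
    targets = annual_targets(baseline=25.0, best_practice=18.0, years=4)
    for year, target in enumerate(targets, start=1):
        print(f"Year {year} plan target: {target}%")
```

Locally agreed trajectories need not be linear; the point is simply that each annual plan target can be written down and monitored against the best-practice end point.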
Clinical quality metrics (I)
Outcome measures
• Mortality rates (gross and risk adjusted) – see the sketch below
• Post-discharge morbidity (illness) – quality of life measures
• Time to discharge – shorter stay often equals better outcome?
Examples
• Heart attack – survival rate/ALOS/post-discharge 'normal' life <X weeks
• Stroke – survival rate/ALOS/post-discharge quality of life measures
• Maternity – survival rate/% serious complications
• Cancer – 5 year survival rate/post-treatment morbidity
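One standard way to risk-adjust mortality is a standardised mortality ratio: observed deaths divided by the deaths expected from each patient's predicted risk. The sketch below is a minimal illustration of that calculation under hypothetical figures; the patient-level predicted risks would come from a case-mix model, which is not shown here.

```python
# Minimal sketch of gross vs risk-adjusted mortality for one service.
# Predicted risks would come from a case-mix model; the values below
# are hypothetical.

def gross_mortality(deaths: int, admissions: int) -> float:
    """Crude mortality rate as a percentage of admissions."""
    return 100.0 * deaths / admissions

def standardised_mortality_ratio(observed_deaths: int,
                                 predicted_risks: list[float]) -> float:
    """Observed deaths divided by expected deaths (sum of predicted risks).
    A value above 1.0 suggests worse-than-expected outcomes for the case mix."""
    expected_deaths = sum(predicted_risks)
    return observed_deaths / expected_deaths

if __name__ == "__main__":
    predicted_risks = [0.02, 0.10, 0.35, 0.05, 0.08]  # one entry per admission
    observed_deaths = 1
    print(f"Gross mortality: {gross_mortality(observed_deaths, len(predicted_risks)):.1f}%")
    print(f"SMR: {standardised_mortality_ratio(observed_deaths, predicted_risks):.2f}")
```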
Clinical quality metrics (II)
'Input' measures
• Compliance with National Service Frameworks
• Compliance with best practice models of care
• Compliance with NICE guidance
Examples
• Heart attack – thrombolysis – call to needle < [-] hours (%)
• Stroke – CT scan/thrombolysis when appropriate – call to treatment < [-] hours
• Maternity – 1:1 midwife care/min consultant obstetrician presence
• Cancer – max 2 week wait diagnosis to treatment for all patients (%)/adoption of best practice models of care for all patients (%)
Patient safety metrics
Trust-wide and service-level measures
Currently a relative paucity of quantifiable measures of clinical safety
• NHSLA (CNST) levels achieved
• SUIs (target number) + evidence of learning lessons [specify metrics]
• Infection rates + deaths from infection + evidence of root cause analysis/actions resulting
• Compliance with clinical guidelines (esp by junior doctors)
• Strong culture of clinical audit + evidence of learning lessons (targets)
• Do all surgeons undertake the minimum number of relevant procedures per annum? (reports cf clinical guidance)
• Medication errors (%)/patient falls (no)/deaths in ITU/deaths from non-elective procedures (%) – all with reference to evidence of risk adjusted best practice
Patient satisfaction metrics
The aim – all patients to be treated as you would wish your family to be treated
• Choose and book (%)
• Few late cancellations (%)
• Friendly, attentive, efficient staff (score on patient surveys – service level)
• Few/no mixed sex wards (no of patients per service)
• Telephone waiting times (minutes/rings)
• One stop shop service where appropriate (target per service)
• Choice offered where appropriate eg birthing, end of life care
• Psychological/pastoral aspects of care addressed (target per service)
• Good information provided to patients and family (target per service)
Convenient care metrics
The aim – minimise disruption for patient and family caused by illness, especially long term illness
• Local access to 'frequent' care eg kidney dialysis, chemotherapy (max travel distance targets)
• Local access to post-discharge rehabilitation eg stroke, COPD
• Access to care in the home when appropriate eg birthing, end of life care
• Local access to non-complex diagnostic tests/rapid results
• Access to day case/ambulatory care when appropriate (% by service)
Timely care metrics
Not just about waiting times
Early presentation, diagnosis and treatment – crucial to good outcomes eg cancer
• Target intervention rates based on predicted disease incidence [use to address ethnic incidence variation] – see the sketch below
• Relevant services include cancer, cardiac, diabetes, sexual health
• % of first diagnoses considered 'early presentation'
Further progress on waiting times
• Entire patient journey for all episodic care < [X] weeks. Set average and maximum target waiting times for each service [need to agree with PCTs]
• Eg max cancer waiting time from referral to treatment for all cancers of [2] weeks, immediate access chest pain clinics
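To illustrate the idea of target intervention rates based on predicted incidence, the sketch below compares observed diagnoses with the number expected from a local incidence rate. The incidence rate, population and counts are hypothetical; a low ratio could prompt questions about late or missed presentation in a locality.

```python
# Minimal sketch comparing observed diagnoses with cases expected from
# predicted disease incidence. All figures are hypothetical.

def expected_cases(incidence_per_100k: float, population: int) -> float:
    """Expected annual cases implied by the predicted incidence rate."""
    return incidence_per_100k * population / 100_000

def detection_ratio(observed: int, expected: float) -> float:
    """Observed / expected; well below 1.0 may indicate late or missed presentation."""
    return observed / expected

if __name__ == "__main__":
    exp = expected_cases(incidence_per_100k=55.0, population=320_000)
    ratio = detection_ratio(observed=140, expected=exp)
    print(f"Expected cases: {exp:.0f}, detection ratio: {ratio:.2f}")
```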
Quality improvement – the example of stroke
Clinical quality/safety
• Survival rates are easily measured – how does your trust stack up?
• Close correlation between compliance with 'best practice' care guidelines and survival/morbidity
• ALOS in acute stroke unit is a good measure of post-stroke morbidity
Convenience of care/patient satisfaction
• Is the patient supported to return home/close to home promptly?
• Does the patient receive sustained support/rehab post-stroke episode?
• Does the patient/family receive appropriate psychological support?
Timely care
• Strong correlation between timeliness of care and survival/morbidity
• Does your trust adopt the best practice model of care for stroke?
Metrics can be developed that capture the most important dimensions of the quality of stroke care
Baseline values and 'best practice' values for each metric can be specified by clinicians (see the sketch below)
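A minimal sketch of how clinician-specified baseline and best-practice values for stroke could be held side by side, so a board can see where the largest gaps lie. The metric names and all numbers are hypothetical illustrations, not published benchmarks.

```python
# Minimal sketch: compare baseline stroke metrics with clinician-specified
# best-practice values and report the gap. Names and numbers are hypothetical.

STROKE_METRICS = {
    # metric: (baseline value, best-practice value, higher_is_better)
    "30-day survival (%)":              (78.0, 88.0, True),
    "CT scan within 3 hours (%)":       (55.0, 95.0, True),
    "ALOS in acute stroke unit (days)": (21.0, 14.0, False),
}

def gaps(metrics: dict) -> list[tuple[str, float]]:
    """Return metrics sorted by the size of the gap to best practice."""
    rows = []
    for name, (baseline, best, higher_is_better) in metrics.items():
        gap = (best - baseline) if higher_is_better else (baseline - best)
        rows.append((name, gap))
    return sorted(rows, key=lambda row: row[1], reverse=True)

if __name__ == "__main__":
    for name, gap in gaps(STROKE_METRICS):
        print(f"{name}: gap to best practice = {gap:g}")
```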
What have been the impediments to greater focus on quality?
• Lack of transparency about the current quality of care and outcomes
• Concern that quality metrics would be used to drive a central process of 'name and shame' (league tables)
• Concern that quality metrics would not adequately take account of need for risk adjustment and local circumstances
• Priority given to national agenda, eg waiting times and financial balance (at expense of quality improvement agenda?)
• Often expressed view that it is 'too hard' to measure quality and quality improvement
How can the impediments be overcome going forward?
• Renewed focus by commissioners and providers on defining measures of quality and establishing baseline values for key aspects of quality performance
• Start simple – with an initial focus on measures which are of particular importance for patients and where there is already consensus on useable metrics eg heart attacks, stroke
• Stress the responsibility of PCT and provider boards to drive this agenda forward
Why is clinical leadership central to quality improvement?
• Only clinicians can develop useful quality performance metrics and establish 'best practice' values for each metric
• 'Clinicians' means doctors, nurses and other clinical staff
• Need 'external' involvement to overcome local prejudices and pressures
• Shared agenda across all trusts – a clear case for collaboration in defining appropriate service level metrics and best practice values
National or local quality improvement targets?
• National collaborative work (within the NHS) to agree most useful quality metrics and best practice metric values – there is already some work underway in some services
• However quality improvement targets should be set locally reflecting different starting points, different local circumstances and different local priorities – within a framework where all providers must comply with national minimum standards
• Transparency around quality performance of providers combined with patient choice and 'comparative' benchmarking (ie each provider seeking to do better than peers) will add powerful impetus to quality improvement. Suggests that at some point trusts should be required to publish quality performance metrics (a simple sketch of peer benchmarking follows below)
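A sketch of the comparative benchmarking point: given published metric values for peer trusts, a provider can see where it sits relative to the peer median. The trust names and values below are hypothetical.

```python
# Minimal sketch of comparative benchmarking against peer trusts.
# Provider names and values are hypothetical.
from statistics import median

PEER_VALUES = {  # eg % of stroke patients scanned within 3 hours
    "Trust A": 62.0,
    "Trust B": 71.0,
    "Trust C": 58.0,
    "Trust D": 80.0,
}

def position_vs_peers(own_name: str, values: dict) -> str:
    """Describe one provider's value relative to the peer median."""
    peer_median = median(values.values())
    own = values[own_name]
    relation = "above" if own > peer_median else "at or below"
    return f"{own_name}: {own:g} vs peer median {peer_median:g} ({relation} median)"

if __name__ == "__main__":
    print(position_vs_peers("Trust C", PEER_VALUES))
```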
Is quality improvement affordable?
• Some quality improvement is cost reducing or cost neutral eg laparoscopic (keyhole) surgery, higher day case rates where clinically appropriate – there are already incentives in PbR to adopt
• Some quality improvement is cost increasing for efficient providers eg 'best practice' stroke and heart attack centres. Needs to be funded in tariffs
• Some quality improvement is cost neutral to the NHS but redistributes costs across provider trusts – commissioning and PbR need to support this sort of quality improvement
• There are unresolved issues around funding of quality improvement by efficient providers – average cost PbR tariffs often under-fund provision of high quality care and over-fund provision of 'average' quality care (illustrated in the sketch below)
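A simple worked illustration of the final point, with entirely hypothetical figures: an average-cost tariff pays every provider the same price, so an efficient provider whose unit cost is higher because it delivers best-practice care runs a deficit per case, while a provider delivering 'average' quality care is over-funded.

```python
# Minimal worked example of an average-cost tariff vs the cost of
# high-quality care. All figures are hypothetical illustrations.

TARIFF = 4_000              # average-cost PbR tariff per spell (hypothetical)
COST_AVERAGE_CARE = 3_800   # efficient provider, 'average' model of care
COST_BEST_PRACTICE = 4_300  # efficient provider, best-practice model of care

def margin_per_spell(tariff: int, cost: int) -> int:
    """Surplus (+) or deficit (-) per spell at the given unit cost."""
    return tariff - cost

if __name__ == "__main__":
    print(f"Average-quality care margin: {margin_per_spell(TARIFF, COST_AVERAGE_CARE):+d} per spell")
    print(f"Best-practice care margin:   {margin_per_spell(TARIFF, COST_BEST_PRACTICE):+d} per spell")
```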
Concluding thoughts
• Need for renewed focus on evidence-based quality metrics – what does not get measured does not get managed!
• Need to define relevant quality metrics at service level and establish baseline values – both commissioners and providers
• Need for evidence-based values for 'best practice' quality of care. Requires national and local clinical engagement and patient involvement
• Need for explicit monitoring of progress from baseline values to best practice over time – eventually include as contractual obligations on providers
• Need to revisit funding arrangements to support quality improvement where the costs of efficient provision either increase or are redistributed across trusts