
Feeding back Clinical Outcomes to Frontline Teams


Presentation Transcript


  1. Feeding back Clinical Outcomes to Frontline Teams. UKRCOM, 22nd January 2015

  2. Outcomes Analyses • Have wanted to embed the routine measurement, analysis and feeding back of clinical outcomes to frontline teams for several years • Improve clinical effectiveness through reflective practice, shared learning, identifying gaps in service, training needs etc • Have had to deal with re-organisations and loss of data, changing priorities etc

  3. CQUINs • Have used CQUINs to promote the use of outcome data – this has ensured the Trust devoted resource from the Information Team to develop analyses • CQUINs initially required recording of HoNOS scores at certain events eg acceptance to service, admission, discharge, CPA review • This year’s CQUIN required evidence that analyses of outcomes were actively fed back to teams

  4. 2014/15 CQUIN for CNWL & WLMHT

  5. CQUIN Milestones

  6. The Presentations: “How much do we help our patients?” • How do we know whether the interventions we provide are effective? Eg: • An antipsychotic? • A “brief intervention” eg a course of CBT? • An admission to an acute ward? • A 2 year admission to a rehabilitation/forensic unit? • How interested are we in whether we make a difference to our patients’ health and quality of life?

  7. Why would we want to know if we were being clinically effective? • Delivering clinically effective interventions is arguably the most important thing we do for patients! • GPs, patients, NHSE, Monitor, CQC all want to know whether we provide a good (effective) service to patients • Commissioners want to know that actual clinical outcomes for patients using our services do improve • Measurement and analysis of outcomes provides this evidence • Really importantly, there is also clinical utility to measuring outcomes: • Systematic analyses of outcomes provide evidence of teams’ clinical effectiveness • Enrich clinicians’ and managers’ understanding of morbidity in their locality

  8. Is there any evidence we make a difference? • For several years we have been recording HoNOS scores at key times during patients’ pathway through our services: • At first assessment • When there is a significant change in need eg admission • At CPA • At discharge • Comparing a patient’s scores from eg point 1 to point 4 gives us a measure of our effectiveness
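A minimal sketch of the paired comparison this describes; the scores below are invented for illustration, not real patient data:

```python
# Illustrative only: 12 HoNOS items, each rated 0-4, recorded at two points on the pathway.
point_1 = [3, 2, 0, 4, 1, 2, 3, 1, 2, 3, 2, 3]   # eg first assessment
point_4 = [1, 1, 0, 2, 0, 1, 1, 0, 1, 1, 1, 1]   # eg discharge

change = sum(point_4) - sum(point_1)
print(f"Total HoNOS: {sum(point_1)} -> {sum(point_4)} (change {change})")
```

A negative change indicates a fall in total HoNOS score, ie an improvement in rated severity.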

  9. Outcomes analyses • After years of collecting HoNOS scores, we now want to analyse these at a team level, identifying how we are doing • What are we doing well? • In what areas are we providing the most help for our patients? • Where are we doing less well and could we benefit from further training, a different staff mix, etc? • We have several analyses of outcomes scores in different formats and are really interested in your views as to which (if any) you find most helpful to understand whether you are delivering clinically effective care

  10. METHODOLOGY Paired HoNOS scores for selected Service-lines per CCG covering the period April 2013 to September 2014 were analysed using the following method: • Scores were extracted from JADE at point 1 and point 2 for each selected patient • For new patients, Point 1 consisted of first assessment scores. For existing patients, Point 1 consisted of scores at the start of a new cluster episode. • Point 2 consisted of scores on discharge to the GP or at the start of a new cluster episode There are four potential pathway scores: 1. New to Discharge 2. New to Review 3. Review to Discharge 4. Review to Review Sufficient paired HoNOS scores were found for pathways 1 and 4.
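A sketch of how the four pathway types could be derived from the events at either end of a paired score; the event labels and function below are hypothetical, not the actual JADE extract logic:

```python
# Hypothetical pathway classification. "new", "review" and "discharge" are
# illustrative event labels, not real JADE field values.
def pathway(point1_event: str, point2_event: str) -> str:
    start = "New" if point1_event == "new" else "Review"
    end = "Discharge" if point2_event == "discharge" else "Review"
    return f"{start} to {end}"

print(pathway("new", "discharge"))    # pathway 1: New to Discharge
print(pathway("review", "review"))    # pathway 4: Review to Review
```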

  11. DATA ANALYSES Analysis of HoNOS scoring will consist of comparing aggregated mean scores for patients at point 1 and point 2 using: • Mean total HoNOS scores at point 1 and point 2 and the difference – for each sub-sample. • HoNOS Four Factor model showing differences in scores between point 1 and point 2 – for each sub-sample. • The HoNOS Categorical Change model. HoNOS scales were rated 0 to 2 as LOW and 3 to 4 as HIGH. Scores were then classified as follows, from point 1 to point 2: - Low score to Low score - Low score to High score - High score to Low score - High score to High score • Mean individual HoNOS scores at point 1 and point 2 – for each sub-sample.
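A sketch of the Categorical Change classification described above, applied item by item (0-2 = LOW, 3-4 = HIGH); the paired scores are invented for illustration:

```python
from collections import Counter

def category(score: int) -> str:
    """Classify a single HoNOS item score: 0-2 is Low, 3-4 is High."""
    return "Low" if score <= 2 else "High"

# Invented paired item scores for one patient (12 items at point 1 and point 2).
point_1 = [3, 2, 0, 4, 1, 2, 3, 1, 2, 3, 2, 3]
point_2 = [1, 1, 0, 2, 0, 1, 4, 0, 1, 1, 1, 1]

changes = Counter(f"{category(a)} to {category(b)}" for a, b in zip(point_1, point_2))
print(changes)   # eg Counter({'Low to Low': 7, 'High to Low': 4, 'High to High': 1})
```

The same paired data also feed the mean total score and mean individual item comparisons, aggregated across patients rather than items.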

  12. Note: The four factor score is derived using the sum of the items in each factor/dimension. Note that item 12 (problems with occupation and activities) appears in both the personal and social wellbeing factors, because this item contributes equally to both.
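A sketch of how the factor scores could be computed. The actual item-to-factor assignments of the four factor model are not listed in these slides, so the groupings below are placeholders; only the inclusion of item 12 in both Personal and Social Wellbeing reflects the note above:

```python
# Placeholder factor definitions (item numbers are illustrative, not the Trust's model),
# except that item 12 deliberately appears in both Personal and Social Wellbeing.
FACTORS = {
    "Personal Wellbeing": [9, 10, 11, 12],
    "Emotional Wellbeing": [7, 8],
    "Social Wellbeing": [1, 3, 12],
    "Severe Disturbance": [2, 4, 5, 6],
}

def factor_scores(items):
    """Sum the relevant HoNOS items (1-indexed) for each factor."""
    return {name: sum(items[i - 1] for i in idx) for name, idx in FACTORS.items()}

print(factor_scores([1, 0, 2, 3, 1, 0, 2, 1, 3, 2, 1, 2]))
```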

  13. SAMPLE

  14. MEAN TOTAL HoNOS SCORES

  15. FOUR FACTOR CHANGE (PWB: Personal Wellbeing; EWB: Emotional Wellbeing; SWB: Social Wellbeing; SD: Severe Disturbance)

  16. ABT [K&C] CATEGORICAL CHANGE

  17. ABT [K&C] HoNOS PROFILE CHANGE

  18. How do we compare with other ABTs? • Different demographics, but in comparison with the other ABTs, how are we doing?

  19. SAMPLE

  20. Paired HoNOS Categorical Change: CLUSTERS 1-5

  21. Paired HoNOS Categorical Change: CLUSTERS 6-8

  22. Paired HoNOS Categorical Change: CLUSTERS 10-15

  23. Conclusion of Presentations: • Providing interventions which make a genuine, positive contribution to our patients’ lives is (or should be!) our top clinical priority • It is not always easy to determine how successfully we are achieving our aims • Systematic measurement and analysis of outcomes can help us to understand where we as individuals and teams are doing well and where we might need more development • Please let us have your thoughts on the utility of outcome measurement, so we can improve how scores are analysed & fed back to teams in the future

  24. Staff were asked to evaluate the sessions: • How useful was it to receive an analysis of team outcomes using the 4 models? • Which model was most helpful? • Are there alternative ways of presenting outcomes which might be more useful? • How often should outcomes analyses be presented to teams? • Which other staff might benefit from being fed back outcomes analyses?

  25. Results of Evaluation (n=26) How interested are you in finding out whether the patients you treat get better? • Very Interested - 24/26 • Interested - 2/26 • Not sure/ Not Interested - 0/26 How useful was it to receive an analysis of team outcomes using the 4 models? • Very Useful - 10/22 • Useful - 11/22 • Not Sure - 1/22 • Not Useful - 0/22

  26. Which models were useful in helping you understand changes in patients’ symptoms?

  27. Evaluation How else could outcomes analyses be presented? (free text responses) • Benchmarking against other teams, to identify service or demographic differences, or to highlight where teams are doing well/less well • Compare results with patient/carer responses • Use GP feedback • Undertake further analysis for patients whose scores remain high despite treatment • Separate out by diagnosis (as well as cluster) eg do personality disorder patients do differently? • A function on JADE to produce individual change graphs which can be shown to patients • Use case studies alongside outcomes analyses Who else would benefit from attending presentations on outcomes analyses? (free text) • Whole of the team including admin, managers/Senior Management Team • Service user groups...Commissioners

  28. Conclusions • Results showed all staff who completed the feedback forms were interested in knowing whether the patients they treated improved as a result of their interventions. • The outcomes analyses that were shared with teams looked at paired HoNOS scores using 4 different models. All but one responder found the HoNOS analyses useful or very useful. • Most staff were unfamiliar with the models before the presentation. However, responders found all four of the models either useful or very useful in helping them understand their outcomes (total score change 83%, 4 factor model 65%, categorical change model 87%, profile change 88%). The 4 factor model had the highest proportion of staff being unsure or finding it not useful (35%).

  29. Conclusions • Responders preferred feedback to be given at either 3 monthly intervals (40%) or 6 monthly intervals (36%). Although numbers were relatively small, rehab staff had a preference for longer periods between presentations (6-12 monthly) • Additional outcomes analyses which staff thought would be useful included patient-completed measures; triangulation with PROMs would help add validity to clinician-rated measures • Added contextual information such as diagnosis and demographics was thought to be helpful • Staff thought all members of the team, including admin staff and senior managers, should be presented with analyses of outcomes. Some also supported presenting outcomes analyses to commissioners

  30. Next Steps • Roll out outcomes analyses to all frontline mental health teams during the course of 2015. • Analyses should be actively presented to all members of teams (eg for 30 minutes during an MDT), by outcomes leads who understand the models and can facilitate discussions on what analyses mean.

  31. Next Steps • Each Divisional Medical Director to identify an outcomes lead for their mental health teams. The role of the leads will include: • To liaise with the Information Team to ensure the correct analyses are being prepared for their allocated clinical teams • To attend training on the models and on how to facilitate a feedback session. • To develop a programme to deliver presentations to each of their allocated teams during the course of 2015.

  32. Questions? Advice?
