
Detecting Error and Clinical Consequences


Presentation Transcript


  1. Detecting Error and Clinical Consequences. Professor Geoff Delaney, South Western Sydney Cancer Services, October 2012.

  2. Aims • My interest in error • Significance of error in radiation oncology • Clinical detection of error • Incident reporting and review methodology • Radiotherapy incident reporting system • Relate this to other disciplines

  3. What stimulates me about error? • Managers are held responsible for errors • Radiation oncology relies heavily on other staff doing their jobs, with a strong element of trust • Difficult problem solving • Root cause analysis • Didn’t like the old (blame-based) system • A recent treatment error in the department • Error is not isolated to less competent or less experienced staff members • Suggested a new system that has stimulated my thinking/reading about error • Several mandatory reporting responsibilities • Don’t like to make mistakes, but keen to learn from them

  4. Hands Up: Who has made an error at work?


  6. “Everyone makes mistakes, that’s why they put erasers on top of pencils” • Lenny Leonard, The Simpsons

  7. Who has seen a likely error before it occurred?

  8. “Mistakes are a fact of life. It’s the response to the error that counts” Nikki Giovanni • “With hindsight, it is easy to see a disaster waiting to happen. We need to develop the capability to achieve the much more difficult – to spot one coming” DoH, UK 2001.

  9. Hindenburg • Titanic • Three Mile Island • Chernobyl • Challenger space shuttle disaster • Paddington train crash • Walkerton water contamination • World Trade Centre fatalities. They all involved human error and failures of communication, and all could have been prevented.

  10. The “Swiss cheese” model of human error causation (adapted from Reason, 1990).
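A toy way to see why the aligned-holes picture matters: if each defensive layer independently misses an error with some small probability, an error reaches the patient only when every layer fails at once. A minimal sketch with made-up layer probabilities (not measured ones):

```python
# Swiss cheese model, toy version: an error reaches the patient only if it
# slips through a "hole" in every defensive layer. Probabilities below are
# purely illustrative, not measured values.
layer_miss_prob = [0.10, 0.05, 0.08, 0.02]  # chance each layer fails to catch it

p_through = 1.0
for p in layer_miss_prob:
    p_through *= p  # independent layers: multiply the miss probabilities

print(f"P(error penetrates all layers) = {p_through:.6f}")  # 0.000008
# Removing (or thinning) any one layer multiplies the residual risk
# by 1/p for that layer, which is why weakening QA is so costly.
```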

  11. Medical systems • Error-prone • Under-resourced • Accessed at all hours • High demand • Chronic under-staffing • Multiple hand-offs/communication barriers • Differing groups of people interacting • A significant amount of communication is verbal • 44,000–98,000 people die per annum in the US due to medical error (equal to the number of breast cancer deaths) • Cost of error ≈ US$37.6 × 10⁹ (US$37.6 billion)

  12. Errors in radiation oncology • Random: often treatment-delivery based; may be compensated for in future treatment • Systematic within one patient’s treatment: usually an error in dose calculations or set-up instructions • Systematic across a number of patients: usually due to calculation errors, e.g. TPS, calibration factors, linac calibration, brachy source factors (a simulation sketch follows below)
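To make the random-versus-systematic distinction concrete, here is a small simulation sketch (hypothetical numbers: 25 fractions of 2 Gy, 3% error magnitudes): random day-to-day errors largely average out over a fractionated course, while a systematic error, such as a wrong calibration factor, accumulates in full.

```python
import random

# Illustrative only (not clinical software): total dose over a fractionated
# course under random vs. systematic delivery errors.
DOSE_PER_FRACTION = 2.0  # Gy, hypothetical prescription
FRACTIONS = 25

def total_dose(systematic_bias=0.0, random_sd=0.0):
    """Sum delivered dose when each fraction carries a fixed fractional bias
    (e.g. a wrong calibration factor) plus independent random noise
    (e.g. daily set-up variation)."""
    return sum(
        DOSE_PER_FRACTION * (1 + systematic_bias + random.gauss(0, random_sd))
        for _ in range(FRACTIONS)
    )

random.seed(1)
print(f"Prescribed:          {DOSE_PER_FRACTION * FRACTIONS:.1f} Gy")
print(f"Random 3% errors:    {total_dose(random_sd=0.03):.1f} Gy")        # close to 50 Gy
print(f"Systematic +3% bias: {total_dose(systematic_bias=0.03):.1f} Gy")  # ~51.5 Gy, every fraction biased
```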

  13. The radiotherapy planning and treatment process: Decision to give radiotherapy → Data transfer → CT scan → Treatment field placement → Dosimetry → Data transfer → Treatment

  14. When radiotherapy can go seriously wrong • Multiple people/work groups hand work off to others • A systematic process relying on the process and equipment being correct • If the process or equipment is incorrect, the error can be transferred across many patients until detected • The most severe radiotherapy side effects can take years to manifest • Each patient has multiple treatments, so there are multiple errors if the original data is accepted • Computer data transfer has reduced some error, but automation also encourages risk-taking • Medical physicists (MPs) and the QA of XRT are pivotal, and most large errors have occurred where there were inadequate MP resources, inadequate QA, or poor equipment installation or maintenance; yet MPs are often not viewed as front-line staff by bureaucrats • Similar to the aerospace industry • Thankfully this does not happen often, although publicity around errors might suggest otherwise • A lot of QA goes into radiotherapy

  15. Complications of radiotherapy • An expected part of treatment (on occasion) • Acute (< 6 months): usually related to rapidly dividing tissue, e.g. mucosa, skin; usually self-limiting • Sub-acute (6–12 months) • Chronic (12+ months): often organ-specific, e.g. liver, kidney, bone, brain; may worsen with time

  16. Consequences of late radiotherapy effects • Overdose: death, nerve dysfunction, organ dysfunction, fibrosis and pain, ulceration • Underdose: failure to control the tumour

  17. Examples of major accidents in radiotherapy • UK 1982: 1,045 patients underdosed • Panama 2000: 28 patients overdosed, 8 patients died • Beatson Cancer Centre, Glasgow: one patient severely overdosed with brain XRT; 39 others discovered on RCA • France 2005: 23 patients overdosed • Coffs Harbour: 4% overdose in 68 patients • Royal Adelaide: 869 patients underdosed • Liverpool and Campbelltown: 2 patients overdosed • POWH: 10 patients underdosed with palliative oesophageal brachytherapy

  18. Incidents at Liverpool and Campbelltown • Overdose of XRT to the brain for brain metastases in 1995 • Overdose of a woman with early breast cancer, leading to fibrosis and pain in the breast • Major findings: errors occurred with experienced staff; problems related to process and calculation errors; multiple errors occurred leading to the final outcome • 2 major treatment mishaps in 16 years • ~1,600 patients are currently treated with radiotherapy in our departments each year

  19. How might incidents be discovered? • Before the error takes place: QA (near miss); incidental (by chance) pick-up, often by more junior members of the team • After the error takes place: patient comment; toxicity review, systematic or one-off • Underdose is particularly hard to pick up: lesser toxicity or local control is really difficult to monitor without a specific audit

  20. Experiences as a manager when an error occurs • Can lead to panic (and further errors if snap decisions are made) • Lots of staff emotions • Dealing with the patient: open disclosure, support staff, patient options • What to do about the patient’s care? • Medical defence? • Is this the only patient affected? • Analysis of what went wrong.

  21. Processes for investigating errors • Process mapping • Root cause analysis • Task analysis (swim-lane diagrams or Sequential Timed Events Plotting) • Hazard identification • Risk surveys and audits • Failure Modes and Effects Analysis (FMEA) • Hazard and Operability Study (HAZOP) (an FMEA scoring sketch follows below)
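To give one concrete example from this list: FMEA typically scores each potential failure mode for severity, occurrence and detectability, and ranks them by Risk Priority Number (RPN = S × O × D). A minimal sketch with invented failure modes and scores:

```python
# Minimal FMEA sketch. Failure modes and 1-10 scores below are invented for
# illustration; a real analysis would use the department's own process map.
failure_modes = [
    # (description,                            severity, occurrence, detection)
    ("Wrong calibration factor entered in TPS",       9,          2,         4),
    ("Set-up instructions mistranscribed",            7,          4,         3),
    ("Verbal handoff misheard at simulation",         5,          5,         6),
]

# Rank by Risk Priority Number: RPN = severity * occurrence * detection.
for desc, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    print(f"RPN {s * o * d:3d}  {desc}")
```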

  22. Experiences of performing external review • Often a lot of defensiveness when a review is undertaken, which can hamper the review • Lots of different emotions and concerns: personal, institutional and professional • Media interest • Patient interest • Care needs to be taken not to knee-jerk • Management of the people involved (staff, incident reporter, managers, patients, carers)

  23. The goals of the reviews • To find the root cause(s) • To identify solutions to prevent recurrence • To walk away with a better process than existed prior to the incident

  24. Important aspects of an investigation • Keep an open mind, even when reading documentation • There are always two sides (or more) to a story • Overcome resistance with reassurance • The target is the process, not the individual • Don’t be bullied by parochialism or a protectionist response • Open communication • A multidisciplinary team is important; diverse backgrounds are good

  25. Interviews • Department of Health: establish terms of reference • Establish an appropriate team • Whistle-blower: what occurred, what do you want to get out of it, what do you think should have occurred? • Staff involved, with or without support people: very stressful for all staff, who don’t like making errors and are suspicious about what might happen based on what they say • Review of documentation (or lack of documentation) • Review of processes, culture, etc.

  26. Lessons learned • Everyone makes mistakes • Open-mindedness is important • It is of great interest to probe another department’s procedures, and it makes you look at your own department more critically • Makes you hyper-alert to error • It is important to communicate and be open about the findings, but this requires DoH co-operation.

  27. Two basic approaches to human error • Personal approach • Systematic approach

  28. Components of a Safety Culture • A focus on system improvement instead of blaming • Reporting and learning from errors • Infrequent unsafe acts • Attitudes about teamwork (all have input, anyone can speak up) • Good communication between groups • Cohesion

  29. The system • Encouragement to report errors and near misses • Blameless reporting • Staff given the opportunity to suggest solutions • De-identified reports and trend analysis presented to the RTQI committee • Action/investigation for recurrent or serious errors (a sketch of a de-identified record follows below)
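A minimal sketch of what "de-identified reports and trend analysis" can look like in practice; the record fields and category names here are hypothetical, not the department's actual schema:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical de-identified incident record: no patient identifiers, just
# enough structure to support trend analysis for a quality committee.
@dataclass
class Incident:
    category: str    # e.g. "data transfer", "set-up", "dose calculation"
    near_miss: bool  # True if caught before reaching the patient
    stage: str       # process step where it was detected

reports = [
    Incident("data transfer", True, "dosimetry"),
    Incident("set-up", False, "treatment"),
    Incident("data transfer", True, "treatment"),
]

# Trend analysis: tally categories and the near-miss fraction.
print(Counter(r.category for r in reports))
print(f"near-miss fraction: {sum(r.near_miss for r in reports) / len(reports):.0%}")
```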

  30. Incident report

  31. What our system shows

  32. Figure 1: Reduction in errors and near misses, 04/05–06/07 (data recovered from the chart):

  Year  | No. of attendances | Actual errors | Near misses
  04/05 | 21,788             | 63            | 184
  05/06 | 38,134             | 58            | 199
  06/07 | 55,006             | 34            | 150
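Because attendances grew roughly two-and-a-half-fold over the same period, the raw counts understate the improvement. Normalising the Figure 1 numbers per 1,000 attendances (a quick sketch using the recovered data):

```python
# Figure 1 counts normalised per 1,000 attendances, so the trend is
# comparable across years with very different workloads.
data = {
    "04/05": {"attendances": 21788, "errors": 63, "near_misses": 184},
    "05/06": {"attendances": 38134, "errors": 58, "near_misses": 199},
    "06/07": {"attendances": 55006, "errors": 34, "near_misses": 150},
}

for year, d in data.items():
    per_k = 1000 / d["attendances"]
    print(f"{year}: {d['errors'] * per_k:.2f} errors, "
          f"{d['near_misses'] * per_k:.2f} near misses per 1,000 attendances")
```

On this normalisation the actual-error rate falls from roughly 2.9 to 0.6 per 1,000 attendances across the three years.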

  33. Published reports on error rates in radiation oncology

  34. Benchmarking for error rates • Difficult to know what a “correct” rate of error is, as it depends on the length of the process, the types of errors and the reporting culture • It would worry me if we had no errors • A reporting department looks worse than a non-reporting department • However, pooling of results is useful for looking at trends, or at least at “common” errors

  35. Interpretation of error rate • Good in comparison • Other possible reasons for a better rate: less reporting, less scope of reporting, technology improvements

  36. What I discovered • Our system is quite good • Other systems are in use • There is room for improvement • No benchmark data (how many incidents is the ideal number?) • Many challenges in maintaining enthusiasm for reporting • Many challenges in getting this accepted more broadly

  37. The plan • Encourage all departments to have a quality plan that includes a reporting culture and mechanisms for change • Refinement of the reporting system • Australian endorsement and use, with centralised reporting • An Australian “incident in radiotherapy” workshop

  38. Conclusions • Radiotherapy already has a lot of QA, and the error incidence is low • Need to ensure QA is always focused on error, not on process for the sake of process • Vigilance in the clinic • Reporting tools and a reporting culture improve the actual error rate • No matter how much you work at this, there is a background error rate; after all, we are only human
