Risk Theory Matt Hill Consultant Anaesthetist Plymouth Hospitals NHS Trust
Error myths • Error equates to incompetence • the more experienced you are, the bigger the mistakes you will make • Errors are intrinsically bad • Just world hypothesis • Errors occur out of the blue
Today • Why is medicine dangerous? • Error classification • How our minds work • Sources of error • The individual • The system • The culture • What we can do about it
Why is medicine dangerous? Perrow 1984
Levels of performance control • Automatic control in routine situations: skill-based performance • Mixed control for trained-for problems: rule-based performance • Conscious control for novel problems: knowledge-based performance
Interacting with the long-term memory base • Similarity matching and frequency gambling are automatic, unconscious and continually operative • What barks, has four legs, wags its tail and cocks its leg at lampposts? • This is similarity matching: the cues are matched against stored attributes • Now think of examples of a four-legged animal • This is frequency gambling, as the answer is guided by familiarity with the animal
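The two retrieval heuristics above can be sketched in code. This is a minimal toy model, not from the talk: the feature sets and frequency weights are invented for illustration.

```python
# Toy model of similarity matching plus frequency gambling when
# retrieving a concept from long-term memory. All data invented.

memory = {
    "dog":  {"features": {"barks", "four legs", "wags tail", "cocks leg"}, "frequency": 90},
    "cat":  {"features": {"four legs", "purrs", "wags tail"},              "frequency": 70},
    "wolf": {"features": {"barks", "four legs", "wags tail"},              "frequency": 5},
}

def recall(cues):
    """Similarity matching: score candidates by overlapping features.
    Frequency gambling: break near-ties in favour of the familiar item."""
    def score(item):
        overlap = len(cues & memory[item]["features"])
        return (overlap, memory[item]["frequency"])
    return max(memory, key=score)

print(recall({"barks", "four legs", "wags tail", "cocks leg"}))  # → dog
print(recall({"barks", "four legs", "wags tail"}))  # dog beats wolf on frequency
```

With the full cue set, "dog" wins on similarity alone; with the reduced set, "dog" and "wolf" match equally well and familiarity decides, which is the frequency gamble.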
Mistake types • Definitions: • Mistake: planning failure (the plan itself is inadequate) • Lapse: storage (memory) failure • Slip: action not carried out as planned • Violation: deliberate deviation from normal behaviour
Errors • Not achieving the desired outcome does not necessarily signify a mistake • A plan has two elements • A process • An outcome
Error classification based on action • Omissions • Intrusions: appearance of unintended actions • Repetitions • Wrong objects: correct actions but on wrong object • Mis-orderings: right actions but in wrong order • Mis-timings: right actions but at wrong time • Blends: unintended merging of two action sequences meant to serve different goals
Contextual factors. • Anticipations and perseverations • Errors shaped by what has happened or what is coming up • Priming • Interruptions and distractions • May believe we were further along in task or lead to repetition • Stress
The individual as a source of error. We are free agents capable of choosing between safe and unsafe behaviours
Human as hazard versus human as hero. • Emphasis on the errors and violations that we make, with very little emphasis on the insights, recoveries, adjustments, adaptations, compensations and improvisations performed daily.
The downside of human as hero • Local workarounds • Normalisation of deviance • Doing too much with too little • Forgetting to be afraid
Swiss Cheese Active failures Latent conditions Reason 2000
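The Swiss cheese model (Reason 2000) can be illustrated with a short simulation: an accident trajectory succeeds only when a hazard passes through a hole in every defensive layer, so the accident rate is roughly the product of the per-layer hole probabilities. The layer probabilities here are invented for illustration.

```python
import random

# Hedged toy simulation of the Swiss cheese model: an accident occurs
# only when the hazard finds a hole in every defensive layer.
random.seed(1)

def hazard_penetrates(layer_hole_probs):
    """Return True if the hazard passes through a hole in every layer."""
    return all(random.random() < p for p in layer_hole_probs)

layers = [0.1, 0.2, 0.05]   # invented per-layer hole probabilities
trials = 100_000
accidents = sum(hazard_penetrates(layers) for _ in range(trials))

# Expected rate is the product of the probabilities: 0.1 * 0.2 * 0.05 = 0.001
print(f"accident rate ~ {accidents / trials:.4f}")
```

The point of the model survives the simplification: no single layer need be very leaky for accidents to occur, and adding or tightening any one layer multiplies down the overall rate.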
System problems • Uncoordinated approach • Bolt on • Normalisation of deviance • Doing too much with too little • Arrogance • Forgetting to be afraid
Vulnerable system syndrome • Three pathological entities • Blame • Denial • Pursuit of the wrong kind of excellence
Person and system models: getting the balance right • Personal qualities of diligence and attention to detail matter • Can’t fall prey to learned helplessness – it’s the system • Institutional resilience • Error wisdom on the frontline
Why do people violate safety rules? • I can handle it • I can get away with it • I can’t help it • Everyone does it • It’s what they (the company) really want • They’ll turn a blind eye
Violations • Who is most likely to violate? • Young men • Having a high opinion of their work skills relative to others • Who may be relatively experienced and not especially error prone • Who are more likely to have a history of incidents and accidents • Who are significantly less constrained by what other people think and by negative beliefs about outcomes
Mental economics of Violating • Costs and benefits of non-compliance vs compliance • Often easier way of working and brings no obvious bad effects • Benefits are immediate and the costs are seemingly remote and unlikely
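This cost-benefit asymmetry can be made concrete with a toy expected-value comparison. All numbers are invented for illustration and are not from the talk.

```python
# Hedged toy model of the "mental economics" of violating: the benefit is
# immediate and certain, the cost is large but perceived as very unlikely.

def expected_value(benefit, cost, p_cost):
    """Expected payoff of a choice: sure benefit minus the expected cost."""
    return benefit - p_cost * cost

# Cutting a corner saves 5 units of effort now; the bad outcome is severe
# (1000 units) but seen as a 0.1% chance.
violate = expected_value(benefit=5.0, cost=1000.0, p_cost=0.001)
comply = expected_value(benefit=0.0, cost=0.0, p_cost=0.0)

print(f"violate: {violate:+.1f}, comply: {comply:+.1f}")
```

On these numbers the violation looks like the better bet on every single occasion, which is why non-compliance can become routine until the rare cost finally lands.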
Performance control problems • Long-term memory, containing mini-theories rather than bald facts, leaves us liable to tunnel vision and confirmation bias • Limited attentional resources, necessary for coherent planned action, leave us prey to inattention and information overload
“We know there are known knowns: there are things we know we know. We also know there are known unknowns: that is to say we know there are things we know we don't know. But there are also unknown unknowns — the ones we don't know we don't know” (Donald Rumsfeld, 2002). What did he mean?
Three bucket model • Individual mindfulness: assess the Self, the Context and the Task • Recognise situations which have high error-provoking potential
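A minimal sketch of the three-bucket self-test follows. The 1-to-3 rating scale and the advice thresholds are assumptions for illustration, not Reason's published scoring.

```python
# Toy three-bucket check: rate self, context and task from 1 (favourable)
# to 3 (bad); the fuller the buckets, the higher the error-provoking
# potential. Scale and thresholds are invented for this sketch.

def three_bucket_score(self_state, context, task):
    for name, value in (("self", self_state), ("context", context), ("task", task)):
        if not 1 <= value <= 3:
            raise ValueError(f"{name} must be rated 1-3")
    return self_state + context + task

def advice(score):
    # Illustrative thresholds, not prescriptive guidance.
    if score <= 4:
        return "low error-provoking potential"
    if score <= 6:
        return "caution: think before proceeding"
    return "high risk: seek help or defer the task"

print(advice(three_bucket_score(1, 1, 2)))  # low error-provoking potential
print(advice(three_bucket_score(3, 3, 2)))  # high risk: seek help or defer the task
```

The value of the exercise is less in the number than in the pause: it prompts the frontline worker to notice when a tired self, a chaotic context and an unfamiliar task have all filled up at once.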
Rubber band model [Figure: compensatory corrections keep the system within the safe operating zone; a stable system is perturbed, then corrected; beyond the zone lies danger]
Resources [Figure: protective and productive resources balance around the safe operating zone; a stable system is perturbed, then corrected]
Culture. • Definition: • “those attitudes, assumptions and values which condition the way in which individuals and the organisation work” • Kennedy 2001 • Is it important?
Safety culture • Pathological: muzzle, malign or marginalise whistle-blowers, shirk collective responsibility, punish or cover up failures • Bureaucratic • Generative: encourage individuals and groups to observe, to inquire, to make their conclusions known and, where observations concern important aspects of the system, actively bring them to the attention of higher management
HROs (high-reliability organisations) • Pre-occupation with failure • Reluctance to simplify • Sensitivity to operations • Commitment to resilience • Deference to expertise Weick and Sutcliffe 2001
A prolonged period without an adverse event does not signal “safe enough”; it is more correctly seen as a period of heightened danger
Briefings • Pre-operative briefings: • reduce the number of communication failures and promote proactive and collaborative teamwork (Sexton et al, 2006; Lingard et al, 2008) • reduce unexpected delays through a decrease in communication problems (Nundy et al, 2008) • ensuring everyone knows each other's names facilitates communication and breaks down the hierarchical gradient (Singer et al, 2009) • improve teamwork (Sexton et al, 2006) • reduce peri-operative mortality (Mazzocco et al, 2009)
Communication • Surgeons reporting better communication and collaboration had an association with lower patient mortality rates (Davenport) • Improved communication resulted in better patient outcomes (de Leval, 2000) • Improved teamwork counteracts some of the negative effects of fatigue on performance (Sexton et al, 2006)
Checklists • Reduction in morbidity and mortality (Haynes et al, 2010) • CVC checklist reduced infection rates (Pronovost et al, 2008) • Is it the checklist, or is it the associated change in culture? • It is alright to challenge someone who is not doing a task correctly
Management • Managers have a more positive view of safety than frontline workers (Singer et al, 2008) • Improved safety climate through safety walkrounds by executives (Thomas et al, 2005) • Perceptions of management inversely correlate with ICU outcomes (Huang et al, 2010) • Decreased sick rate with improved safety climate (Kivimaki et al, 2001)
The greater the danger, the greater is the need for error wisdom. James Reason