Presentation Transcript


  1. Identifying Risk via Confidential Reporting Systems. RAe Soc HF Group EMSG Conference, 10th October 2007. Dr Mike Rejman, Director, CIRAS

  2. Recent Career, 1988 - present • Head of Human Factors Unit, UK Army Air Corps (accident investigator; set up their confidential reporting system) • Principal Consultant, DERA and QinetiQ • Assistant Director of Patient Safety at the National Patient Safety Agency (accident investigation training, reporting system, risk assessment) • Director of CIRAS, the confidential reporting system for the UK railways

  3. Understanding the Problem • ~80% of accidents are attributable to human factors issues, at the individual level, the organisational level, or, more commonly, both • This is probably a conservative figure, and it holds irrespective of domain • To manage this we need to identify and understand the risks • Without this understanding we can’t put appropriate remedial action in place

  4. Reason’s Swiss Cheese Model: deficiencies in the organisation and its processes create “latent failures” (precursors, pre-existing enabling conditions); “active failures” (skill, rule and knowledge errors) then find the holes in multiple defences, and an accident or incident results.
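
The model itself is qualitative, but a rough numeric gloss (my illustration, not from the slide) shows why layered defences work: if an accident requires all k defences to fail, and defence i fails independently with probability p_i, then P(accident) = p_1 × p_2 × … × p_k. Three defences that each fail once in a hundred demands give a combined chance of only 10^-6 per demand. Latent failures matter because they quietly raise several of the p_i at once, and common causes undermine the independence assumption.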

  5. The Accident Iceberg (Heinrich, Bird, and others): for every 1 accident there are unknown multiples (? : ? : ?) of serious incidents, incidents, and near misses & concerns, the layers below the waterline often going unreported.
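
For context (the slide deliberately leaves the ratios as question marks): Heinrich’s much-quoted 1931 figures suggested roughly 1 major injury : 29 minor injuries : 300 no-injury accidents, and Bird’s 1969 follow-up roughly 1 : 10 : 30 : 600 (serious injury : minor injury : property damage : near miss). The exact numbers are contested, but the shape is the point: the near misses at the base are by far the largest, and least reported, pool of risk information.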

  6. Prior Indicators of Risk • Herald of Free Enterprise, 1987 • Numerous accounts of previous sailings with the bow doors open • Economic pressure to reduce turnaround time in port

  7. Prior Indicators of Risk • Challenger Space Shuttle • Ample previous evidence of the O-ring seals losing resilience in cold temperatures • Concerns voiced by some engineers • Political pressure to launch

  8. Prior Indicators of Risk • Kings Cross Fire, 1987 • Numerous previous fires; bundles of rubbish under wooden escalators with greased tracks; some records kept, but nothing done • Complacency, and no system to evaluate risk

  9. Prior Indicators of Risk • Hillsborough, 1989 • Two police officers on horseback had prevented a similar incident the previous year by blocking access to the same terrace area • No organisational memory

  10. Investigating Accidents with the Army Air Corps • Experience from conducting investigations showed that the workforce held a great deal of important information that could have been used to prevent the accident • The culture didn’t encourage reporting, and there was no ‘safe’ route for this information to be passed on and assessed

  11. AAC views on confidential reporting • (Some) Senior Command: blame & punishment; it would subvert the chain of command • Others: a system was required, but the RAF-run Condor system was inappropriate

  12. AAC maintenance issues • Gazelle maintenance team change: habit, slip • Lynx tail rotor gearbox problem and cannibalisation: culture, knowledge error, perceptual problem, assumptions • (Some of these issues could have emerged via a confidential reporting system)

  13. The situation in the NHS • Prof Sir Liam Donaldson, CMO England • ‘Organisation with a Memory’ (2000) • National Patient Safety Agency formed (2001) • NHS staff views, e.g. in Primary Care: pharmacists, GPs, dentists

  14. Patient safety risk

  15. The situation in the NHS • The NHS is a labour-intensive industry • leading to a large number of human interactions • increasing the risks of decision-making errors • and increasing the risk of communication errors

  16. The national reporting and learning system • Positive: the system now has well over one million reports • Negative: however, the quality of much of the data is not high • The system was originally set up to be anonymous, rather than confidential

  17. NHS example • ~130 types of infusion pump available • A study revealed that 47 different types were present in one Trust • And 6 different types were found on one ward • Leading to a high level of performance errors

  18. NHS examples • Removal of the wrong kidney • Chemotherapy drug vincristine delivered intrathecally instead of intravenously

  19. CIRAS, the rail industry’s Confidential Incident Reporting and Analysis System • Pilot study in Scotland, 1996 • Rolled out nationally after the Ladbroke Grove accident and the Cullen Inquiry, 2000 • 3 regional offices run by contractors, now one in-house team in London • In 10 years of operation there has never been a breach of the confidentiality guarantee for reporters

  20. UK Rail • Is one of the safest forms of transport, both for travellers and staff • But it could be even safer, as the accident in Cumbria in February 2007 (the Grayrigg derailment) showed

  21. CIRAS reports on non-compliance with rules

  22. Non-compliance by sector

  23. Causal factors for each sector

  24. Quotes from reporters • “We’ve refused to do the job on the grounds of safety … and we’ve been threatened with disciplinary action …” • “I’ve been told that if I didn’t want to do the job [on safety grounds] then I shouldn’t bother coming to work tomorrow”

  25. How can we best identify risks? • Reactive/retrospective: accident and incident investigation; Root Cause Analysis (what, who, how, WHY; worked example below); reporting systems for incidents • But we need to move people from ‘fire-fighting’ the last error to trying to prevent the next one • Proactive/prospective: confidential reporting systems for near misses and safety concerns; hazard identification and prospective risk assessments
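
A hypothetical worked example of the ‘WHY’ chain in Root Cause Analysis (illustrative, not a case from the talk): a signal is passed at danger (what); the driver missed it (how); he was at the end of a double shift (why); rostering gaps are routinely covered with overtime (why); recruitment was frozen to cut costs (why). Stopping at the first answer blames an individual; the later answers point to organisational remedies.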

  26. Barriers to Reporting • Fear (including concerns re confidentiality): blame culture, job loss, commercial issues • Practicalities: time to report, complex forms • Ignorance: what to report (definitions, e.g. ‘near miss’) • Apathy: lack of perceived benefit to the individual vs the potential cost; lack of faith in the system to change things; lack of any feedback (‘black hole’ syndrome)

  27. Reporting rates and triangulating risk • For these reasons, incidents, near misses and safety concerns will always be under-reported, whatever the system • But there are ways to increase reporting, by targeting specific issues, and there are other ways to triangulate risk, including surveys

  28. Reporting rates and triangulating risk • More reports = good news, not bad news. With more information to analyse we can act as an Early Warning System • The industry must therefore encourage and reward reporting, not penalise it • CIRAS has recently moved beyond the model of the ‘passive post-box’ and is seeking to be more ‘proactive’ in engaging industry groups e.g. with workshops and surveys
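
As a minimal sketch of the ‘Early Warning System’ idea, one could count reports per category per period and flag categories whose latest count jumps well above their own history. This is illustrative only: the data layout, category names and threshold below are assumptions, not CIRAS practice.

from collections import Counter
from statistics import mean, stdev

def flag_emerging_risks(reports, threshold_sd=2.0):
    # reports: list of (period, category) pairs, e.g. ("2007-Q3", "fatigue")
    counts = Counter(reports)                    # (period, category) -> count
    periods = sorted({p for p, _ in reports})    # "YYYY-Qn" sorts chronologically
    flagged = []
    for cat in sorted({c for _, c in reports}):
        series = [counts[(p, cat)] for p in periods]
        history, latest = series[:-1], series[-1]
        if len(history) < 2:
            continue                             # too little history to judge
        spread = max(stdev(history), 1.0)        # floor avoids a zero threshold
        if latest > mean(history) + threshold_sd * spread:
            flagged.append(cat)
    return flagged

reports = [("2007-Q1", "SPAD"), ("2007-Q2", "SPAD"), ("2007-Q3", "SPAD"),
           ("2007-Q3", "fatigue"), ("2007-Q3", "fatigue"), ("2007-Q3", "fatigue")]
print(flag_emerging_risks(reports))  # ['fatigue']: 3 reports against a history of 0

A category flagged this way is a prompt for investigation and triangulation (e.g. against survey results), not proof that safety is worsening: as the slide says, more reports may simply mean more people trust the system.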

  29. Confidential Reporting Systems • In an ideal world we wouldn’t need confidential reporting systems. Staff would be happy to volunteer information about errors they made, or safety concerns they had, without fear of blame or victimisation, and management would willingly address all the issues raised by their staff …

  30. Confidential Reporting Systems • But we don’t live in an ideal world! Consequently, in virtually all safety-critical industries, here and abroad, it has been necessary to incorporate confidential reporting systems as part of the suite of reporting systems operating within safety management systems.

  31. Identifying Risk, by Donald Rumsfeld • “As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns, the ones we don’t know we don’t know.”

  32. Confidential reporting systems • Are there to access all these categories • Are an indispensable part of the process of identifying risk in any safety-critical industry • Should be an integral part of any safety management system

  33. mike.rejman@ciras.org.uk
