Human Factors Risk Management Services Department
Are You Perfect? • Have you ever pushed the wrong button on a soda machine, left your car headlights on, or unintentionally deleted a file on your computer? • Wonder how often these (and more serious) errors occur?
Laws of Nature We accept and design for the laws of nature. Example: If a bridge falls down, we don’t list “gravity” as the root cause. Example: If there is an asphyxiation, we don’t list “people need oxygen” as the root cause of an injury.
Human Error • Law of nature: HUMANS MAKE MISTAKES! • DON’T BLAME IT…PLAN FOR IT!
How Often Do Humans Make Mistakes? • Trained, not under stress, not fatigued or overloaded, and enough time: Error occurs about 1 in every 100 times the operation is done
How Often Do Humans Make Mistakes? • Not trained, or under stress, or overloaded, or given too little time: Error occurs about half the time (or even every time) the operation is done
How Often Do Humans Make Mistakes? • Trained, not under stress, not fatigued or overloaded, given enough time, AND with built-in feedback: Error occurs about 1 in every 1,000 times the operation is done
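To see what these rates mean in practice, a quick back-of-the-envelope calculation helps. The task counts below are illustrative assumptions, not figures from this presentation:

```python
# Rough arithmetic on the error rates quoted above.
# Assumed scenario: one operator repeats the same manual task
# 20 times per shift, 3 shifts per day, 350 operating days per year.
tasks_per_year = 20 * 3 * 350  # 21,000 opportunities for error

error_rates = {
    "trained, rested, unhurried":        1 / 100,
    "untrained, stressed, or rushed":    1 / 2,     # can be as high as 1 in 1
    "trained, unhurried, with feedback": 1 / 1000,
}

for condition, p in error_rates.items():
    print(f"{condition}: roughly {tasks_per_year * p:,.0f} errors per year")
```

Even under the best conditions the expected error count is never zero, which is why the slides that follow focus on designing for the error rather than blaming the operator.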
What is Feedback? • Buzzer when you leave your lights on • Bell if the keys are in the ignition when the car door is opened • Control system asking you to confirm that the charge amount you entered was correct and showing the proper pumps and valves are open/closed If you can see that you are doing the right thing, then you can be sure that you did it.
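In software terms, the simplest form of feedback is to show the operator exactly what the system is about to do and require an explicit confirmation before anything happens. A minimal sketch, where the tank-charge scenario and names are hypothetical rather than taken from any specific control system:

```python
# Hypothetical "confirm before commit" feedback step for a charge entry.
def confirm_charge(amount_kg: float, valves_to_open: list) -> bool:
    """Show the operator what will happen and require explicit confirmation."""
    print(f"About to charge {amount_kg} kg")
    print("Valves that will open: " + ", ".join(valves_to_open))
    reply = input("Type YES to proceed: ")
    if reply.strip().upper() != "YES":
        print("Cancelled. No valves were moved and no material was charged.")
        return False
    print("Confirmed. Sending commands to the control system...")
    return True

# Example: the operator meant 250 kg but typed 2500 kg; the echo above
# gives them a chance to catch the slip before it becomes an overcharge.
confirm_charge(2500, ["V-101", "V-102"])
```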
Can a Human Check a Human? • Principle: If a person knows that someone else checked, they are not likely to reliably recheck • Human checking is not generally a reliable safeguard against errors made by other humans (Exception: the airline industry, although it is not 100% reliable…)
Helios Plane Crash, Aug. 2005 • Three checks by two pilots missed the cabin pressurization switch left in the wrong position • Ineffective response to the loss of cabin pressure; the crew became incapacitated
Is Technology the Panacea? Principle: If a safety system is installed to protect against human error, the human will depend on it. Then the safety system becomes the only layer of protection. Principle: All mechanical things break. Safety systems need to be tested to ensure that they are working properly.
Real-Life Example • An operator loading a tank overflowed the tank • Management put a high level shutoff on the pump • The operator relied on the switch and did not watch the tank level closely • One day, the switch failed and the tank overflowed
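A rough layers-of-protection calculation shows why this matters. The probabilities below are illustrative assumptions, not data from the incident:

```python
# Illustrative layers-of-protection arithmetic (assumed probabilities).
p_operator_misses_level = 1 / 100   # operator fails to watch the tank level
p_switch_fails_on_demand = 1 / 100  # high-level shutoff fails when needed

# Both layers active and independent: an overflow needs BOTH to fail.
both_layers = p_operator_misses_level * p_switch_fails_on_demand  # ~1 in 10,000

# Operator relies on the switch, so the switch is the ONLY layer.
switch_only = p_switch_fails_on_demand                            # ~1 in 100

print(f"Two independent layers: about {both_layers:.2%} chance of overflow per filling")
print(f"Switch as the only layer: about {switch_only:.0%} chance of overflow per filling")
```

The shutoff switch only adds protection if the operator's own vigilance stays in place and the switch is proof-tested so that hidden failures are found.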
BP Texas City • Operators did not fully understand Raffinate Splitter Tower operation • Startup procedures not fully followed • Material fed to column but did not exit; critical valve not opened during startup • Level exceeded safe limits; level device failed; not recognized • Level instrumentation in blow-down tank failed, but not repaired • Blow-down tank overflowed, material reached an ignition source, and a vapor cloud explosion resulted
Caveat Any system human beings devise to prevent failure can be overcome by human beings with sufficient determination and authority. If there is a will, there is a way!
Guiding Principles for Preventing Human Error • Humans and systems designed by them are vulnerable to error • Existing facilities contain many traps that can cause human error • Designers can provide systems to facilitate error/deviation detection and to enable recovery before the error/deviation becomes serious
Design Considerations • Ergonomics – Can the operator reach what he needs to and work safely? • Operability – Is the work flow designed to minimize taking shortcuts? • Procedures – Are they clear and easy to follow, and do they explain the consequences of deviations? • Maintenance – Is there access and capability to maintain equipment? • Simplify – a simpler design leaves less chance of error
Design Considerations • Be consistent – orient valves the same way, use computer diagrams that look like the equipment layout • Human limitations – consider color-blind operators, different heights • Safety systems – make sure they can’t be bypassed • Alarm management – Don’t shower the operator with alarms he can’t process at once!
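For the alarm-management point, one common tactic is to rank alarms by priority and cap how many the operator sees at once, so that a flood of low-priority alarms cannot bury the one that matters. A minimal sketch, where the tags, messages, and priorities are invented for illustration:

```python
# Hypothetical alarm prioritization: show the operator only what needs action now.
from dataclasses import dataclass

@dataclass
class Alarm:
    tag: str
    message: str
    priority: int  # 1 = act immediately, 2 = act soon, 3 = informational

def alarms_to_display(active, max_shown=3):
    """Highest-priority alarms first, capped at max_shown."""
    return sorted(active, key=lambda a: a.priority)[:max_shown]

flood = [
    Alarm("AH-001", "HVAC filter change reminder", 3),
    Alarm("FT-412", "Flow transmitter drift", 3),
    Alarm("LT-101", "Tank level high-high", 1),
    Alarm("PT-330", "Pump seal pressure low", 2),
    Alarm("TT-205", "Reboiler temperature deviation", 2),
]

for alarm in alarms_to_display(flood):
    print(alarm.priority, alarm.tag, alarm.message)
```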
Chernobyl, Soviet Union 1986 • Nuclear meltdown resulted in 56 direct deaths, relocation of 336,000 people, and a plume of radioactive fallout • Significant design flaws in reactor • Safety systems switched off • Operator errors/training • Alarm showers confused the operators (also at Three Mile Island)
Cultural Stereotypes • GREEN is on, RED is off…but not in Japan! • H is hot water, C is cold…except in non-English-speaking countries (a “C” tap can mean hot: chaud in French, caliente in Spanish) • Light switch is up for on…except in the UK!
Human Factors Philosophy • Make the right way THE ONLY WAY • Make the right way THE EASIEST WAY • Give the operators feedback that it was done the wrong way • Provide safeguards for when it is done the wrong way
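The same philosophy can be written directly into control logic: the wrong sequence is simply not accepted, and the operator is told why. A minimal sketch, where the pump and valve are hypothetical:

```python
# Hypothetical interlock: the right way is the only way, with feedback.
class PumpStation:
    def __init__(self):
        self.suction_valve_open = False

    def open_suction_valve(self):
        self.suction_valve_open = True
        print("Suction valve opened.")

    def start_pump(self):
        if not self.suction_valve_open:
            # Feedback that it was done the wrong way, and a safeguard:
            # the pump cannot be started dry no matter what the operator does.
            print("BLOCKED: open the suction valve before starting the pump.")
            return False
        print("Pump started.")
        return True

station = PumpStation()
station.start_pump()          # wrong order: blocked, with an explanation
station.open_suction_valve()
station.start_pump()          # right order: allowed
```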
Remember Other Operations… Don’t forget about maintenance, startup, and shutdown. These are the most risky times in a process. There must be EHS reviews, management of change, permitting procedures, training and communication systems to avoid human error.
Piper Alpha, North Sea, UK • Operators switched on a pump that was undergoing maintenance – poor lockout/tagout and communications • Significant leak/fire ensued • Piper Alpha was destroyed • 167 fatalities, loss of millions in revenue per day
Safety Culture A safety culture that promotes and reinforces safety as a fundamental value is inherently safer than one which does not. • Do we have to follow the standards? • Do we really have to shut down? • Do we have to install this safety system? If these questions are asked, it is an indication of a poor safety culture!
Summary • Human error is a fact of nature – plan on it • Design process to minimize “traps” • Provide training and clear guidance • Provide feedback that the operator action taken is right/wrong • Don’t expect humans to check humans • Provide safety systems • Remember to consider startup, shutdown, and maintenance • Support an interdependent safety culture