Human Failure Modes
Dr. Azad M. Madni
Professor, Epstein Department of Industrial and Systems Engineering
Director, SAE Program; Co-Director, CSSE
March 6, 2012
Outline • Human Failure Modes • Demanding System Requirements • Implications for Humans • Evolving Human Roles • Systems Engineering Mindset • The Remarkable Human Brain • Human Error Sources • Potential Remedies and Opportunities
Human Failure • Comprises • human errors, which are unintentional behaviors • violations, which are the willful disregard of rules and regulations • Human errors fall into specific categories • slips and lapses of memory • mistakes in following rules and procedures • mistakes in understanding
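For readers who log or classify incidents, this taxonomy maps naturally onto a small data structure. The following is an illustrative sketch only: the category names follow the slide, while the enum layout and the is_error helper are assumptions introduced here.

```python
# Illustrative encoding of the slide's failure taxonomy; the structure
# and helper below are assumptions, not part of the original material.
from enum import Enum

class HumanFailure(Enum):
    SLIP_OR_LAPSE = "slip or lapse of memory (unintentional)"
    RULE_MISTAKE = "mistake in following rules and procedures"
    KNOWLEDGE_MISTAKE = "mistake in understanding"
    VIOLATION = "willful disregard of rules and regulations"

def is_error(failure: HumanFailure) -> bool:
    """Per the slide's distinction: errors are unintentional,
    violations are deliberate."""
    return failure is not HumanFailure.VIOLATION

print(is_error(HumanFailure.SLIP_OR_LAPSE))  # True
print(is_error(HumanFailure.VIOLATION))      # False
```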
Demanding System Requirements • Adaptability • Reconfigurability • Composability • Resilience These requirements pose formidable challenges for humans who work with and within complex systems.
Implications for Humans • System adaptability implies changing contexts and potential changes to human-system interactions • System reconfigurability implies potential changes to human roles and human-system function allocation • System resilience implies potential dynamic changes to human roles and attendant changes to cognitive load • System composability (as in SoSs) implies potential changes in collaborators (lack of a shared conceptual model) These changes can increase the likelihood of human error.
Evolving Human Roles • From that of operator outside the system to that of agent within an adaptable system • decision maker • supervisor • monitor with override authority • re-assignable participant (peer, assistant) These roles require new behaviors.
Systems Engineering Mindset • Humans are suboptimal job performers who need to be shored up and compensated for during task performance • This perception leads to systems that are inherently incompatible with human conceptualization of work • The resulting mismatch inevitably creates human reliability issues that show up as human error This mindset fails to capitalize on human ingenuity.
The Remarkable Human Brain • Yuor Barin Can Raed This • For emaxlpe, it deson't mttaer in waht oredr the ltteers in a wrod aepapr, the olny iprmoatnt tihng is taht the frist and lsat ltteer are in the rghit pcale. The rset can be a toatl mses and you can sitll raed it wouthit pobelrm. • S1M1L4RLY, Y0UR M1ND 15 R34D1NG 7H15 4U70M471C4LLY W17H0U7 3V3N 7H1NK1NG 4B0U7 17. • How? • Source: LiveScience.com
Human Error Sources (Examples) • Erroneous/Incomplete Mental Model • often traceable to poor design; results in mistakes • lack of complete information causes the user to make unwarranted assumptions about system state • also results from misrecognition of cues/state information • Drop in Vigilance/Arousal during Monitoring • occurs with infrequent stimuli, leading to missed cue detection • Loss of Focus during Task Performance • results in slips (execution errors) arising from inattention • Cognitive Overload • causes: multi-tasking, context switching, decision making under stress • can lead to suboptimal behaviors and human errors (mistakes and slips)
Key Findings • Humans change cognitive strategies under overload • Inverted-U relationship between stress and performance • Humans are unable to distribute attention under stress • Adaptability of a human-in-the-loop system is upper-bounded by the acceptable human error rate • System inspectability facilitates human intervention and avoids forcing erroneous assumptions • For robust performance • need to minimize multitasking and context switching • employ alerting/automation to monitor and flag rare events • need to understand cognitive strategies under overload for effective aiding (a toy model of the first findings is sketched below)
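The inverted-U finding and the error-rate bound on adaptability can be made concrete with a toy model. This is a minimal sketch under assumed functional forms, not material from the slides: the quadratic performance curve, the linear error model, and every parameter value are illustrative assumptions.

```python
# Toy model of two findings above. Both functional forms are assumptions
# made for illustration; the slides assert only the qualitative shapes.

def performance(arousal, peak=0.5, width=0.5):
    """Inverted-U: performance peaks at moderate arousal.
    All quantities are normalized to [0, 1]."""
    return max(0.0, 1.0 - ((arousal - peak) / width) ** 2)

def max_changes_per_hour(errors_per_change, acceptable_errors_per_hour):
    """Upper bound on system adaptation rate such that the expected
    human error rate stays acceptable (hypothetical linear model)."""
    return acceptable_errors_per_hour / errors_per_change

if __name__ == "__main__":
    for a in (0.1, 0.5, 0.9):  # low, moderate, high stress/arousal
        print(f"arousal={a:.1f} -> performance={performance(a):.2f}")
    # e.g., 0.02 errors per context change, 0.1 acceptable errors/hour
    print("max adaptations/hour:", max_changes_per_hour(0.02, 0.1))
```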
Potential Remedies • Design human work to avoid multi-tasking and frequent context switching to the extent possible • Assign rare-event monitoring to automation or alerting mechanisms • Provide decision aiding and performance support for decision making under stress • Design appropriate incentives to counter the risk-compensation tendency • Employ automation and dynamic function allocation to assure manageable cognitive load (see the sketch below) Most complex problems will require a combination of many of these remedies.
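To illustrate the last remedy, here is a minimal sketch of dynamic function allocation: tasks shift from the human to automation whenever estimated cognitive load exceeds a threshold. The task names, load scores, threshold, and greedy policy are all hypothetical choices made for this example, not a design from the presentation.

```python
# Sketch of dynamic function allocation (all tasks, loads, and the
# threshold are hypothetical): offload automatable tasks to automation
# until the human's remaining cognitive load is manageable.

LOAD_THRESHOLD = 1.0  # assumed acceptable total load on the human

tasks = {  # task name -> (cognitive load on human, automatable?)
    "communicate": (0.3, False),
    "navigate": (0.4, True),
    "monitor_rare_events": (0.5, True),   # also a candidate for alerting
    "decide_route_change": (0.6, False),
}

def allocate(tasks, threshold=LOAD_THRESHOLD):
    """Greedily hand the heaviest automatable tasks to automation until
    the human's remaining load fits under the threshold."""
    human, automated = dict(tasks), []
    for name in sorted(tasks, key=lambda t: -tasks[t][0]):
        if sum(load for load, _ in human.values()) <= threshold:
            break
        if tasks[name][1]:  # automatable
            automated.append(name)
            del human[name]
    return human, automated

human, automated = allocate(tasks)
print("human keeps:", sorted(human))   # ['communicate', 'decide_route_change']
print("automation takes:", automated)  # ['monitor_rare_events', 'navigate']
```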
Potential Opportunities • Exploit human ingenuity and creativity in: • adapting to shifting contexts • generalizing from specifics • recognizing novelty and improvising • aggregating information in the absence of an algorithm • detecting and filling gaps (e.g., in narratives) Most complex problems will require a combination of human creativity and ingenuity.
My References • Madni, A.M., “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Keynote Presentation, 22nd Annual Systems & Software Technology Conference, Salt Lake City, Utah, April 26–29, 2010. • Madni, A.M., “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Systems Engineering (INCOSE journal), Vol. 13, No. 3, 2010. • Madni, A.M., “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Keynote Presentation, INCOSE 2010 LA Mini-Conference, Loyola Marymount University, October 16, 2010. • Madni, A.M., “Integrating Humans With and Within Software and Systems: Challenges and Opportunities,” (Invited Paper) CrossTalk, The Journal of Defense Software Engineering, “People Solutions” issue, May/June 2011.