
Human Failure Modes

Explore human error sources, demanding system requirements, and evolving human roles in the context of systems engineering. Learn about potential remedies and opportunities to address human failure in complex systems.


Presentation Transcript


  1. Human Failure Modes Dr. Azad M. Madni Professor, Epstein Department of Industrial and Systems Engineering Director, SAE Program Co-Director, CSSE March 6, 2012

  2. Outline • Human Failure Modes • Demanding Systems Requirements • Implications for Humans • Evolving Human Roles • Systems Engineering Mindset • The Remarkable Human Brain • Human Error Sources • Potential Remedies and Opportunities

  3. Human Failure • Comprises • human errors, which are unintentional behaviors • violations, which are willful disregard of rules and regulations • Human errors fall into specific categories • slips, lapses of memory • mistakes in following rules and procedures • mistakes in understanding
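
For readers who encode such taxonomies in software (e.g., for incident logging), here is a minimal Python sketch of the classification on this slide. The type and field names are illustrative choices, not part of the original presentation.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class FailureKind(Enum):
    ERROR = auto()      # unintentional behavior
    VIOLATION = auto()  # willful disregard of rules and regulations

class ErrorCategory(Enum):
    SLIP_OR_LAPSE = auto()      # slips and lapses of memory
    RULE_MISTAKE = auto()       # mistakes in following rules and procedures
    KNOWLEDGE_MISTAKE = auto()  # mistakes in understanding

@dataclass
class HumanFailureEvent:
    description: str
    kind: FailureKind
    category: Optional[ErrorCategory] = None  # only meaningful for errors

# Hypothetical logged event, for illustration only
event = HumanFailureEvent(
    description="Checklist step skipped under time pressure",
    kind=FailureKind.ERROR,
    category=ErrorCategory.SLIP_OR_LAPSE,
)
print(event.kind.name, event.category.name)
```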

  4. Demanding System Requirements • Adaptability • Reconfigurability • Composability • Resilience These requirements pose formidable challenges for humans who work with and within complex systems.

  5. Implications for Humans • System adaptability implies changing contexts and potential changes to human-system interactions • System reconfigurability implies potential changes to human roles and human-system function allocation • System resilience implies potential dynamic changes to human role and attendant changes to cognitive load • System composability (as in SoSs) implies potential changes in collaborators (lack of shared conceptual model) These changes can increase the likelihood of human error.

  6. Evolving Human Roles • From that of operator outside the system to that of agent within an adaptable system • decision maker • supervisor • monitor with override authority • re-assignable participant (peer, assistant) These roles require new behaviors.

  7. Systems Engineering Mindset • Humans are suboptimal job performers that need to be shored up and compensated for during task performance • This perception leads to systems that are inherently incompatible with human conceptualization of work • The resulting mismatch inevitably creates human reliability issues that show up as human error This mindset fails to capitalize on human ingenuity.

  8. The Remarkable Human Brain • Yuor Barin Can Raed This • For emaxlpe, it deson't mttaer in waht oredr the ltteers in a wrod aepapr, the olny iprmoatnt tihng is taht the frist and lsat ltteer are in the rghit pcale. The rset can be a toatl mses and you can sitll raed it wouthit pobelrm. • S1M1L4RLY, Y0UR M1ND 15 R34D1NG 7H15 4U70M471C4LLY W17H0U7 3V3N 7H1NK1NG 4B0U7 17. • How? • Source: LiveScience.com

  9. Human Error Sources (Examples) • Erroneous/Incomplete Mental Model • often traceable to poor design; results in mistakes • lack of complete info causes user to make unwarranted assumptions about system state • also results from misrecognition of cues/state info • Drop in Vigilance/Arousal during Monitoring • occurs with infrequent stimulus, leading to missed cue detection • Loss of Focus during Task Performance • results in slips (execution errors) arising from inattention • Cognitive Overload • causes: multi-tasking, context switching, decision making under stress • can lead to suboptimal behaviors and human errors (mistakes and slips)
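
The vigilance-decrement source above lends itself to a toy quantitative illustration: if one assumes (purely for demonstration; the slide asserts only the qualitative effect) that detection probability for an infrequent cue decays exponentially with time on task, the monitor's miss rate grows the longer nothing happens. All parameter values below are invented.

```python
import math
import random

def detection_probability(minutes_on_task: float,
                          p_initial: float = 0.95,
                          decay_rate: float = 0.02) -> float:
    """Illustrative vigilance-decrement model: the probability of
    detecting a rare signal decays exponentially with time spent
    monitoring. Shape and parameters are assumptions, not data."""
    return p_initial * math.exp(-decay_rate * minutes_on_task)

random.seed(0)
for t in (0, 15, 30, 60):
    p = detection_probability(t)
    detected = random.random() < p  # simulate one rare-cue presentation
    print(f"t={t:3d} min  p(detect)={p:.2f}  cue detected: {detected}")
```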

  10. Key Findings • Humans change cognitive strategies under overload • Inverted-U relationship: performance & stress • Humans unable to distribute attention under stress • Adaptability of human-in-the-loop system is upper-bounded by acceptable human error rate • System inspectability facilitates human intervention and avoids having to make erroneous assumptions • For robust performance • need to minimize multitasking and context switching • employ alerting/automation to monitor and flag rare events • need to understand cognitive strategies under overload for effective aiding
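
The inverted-U finding (often associated with the Yerkes-Dodson law) can be illustrated with a toy model. The Gaussian shape and the parameter values below are assumptions chosen for demonstration; the slide claims only the qualitative relationship that performance peaks at moderate stress and degrades at both extremes.

```python
import math

def performance(arousal: float, optimum: float = 0.5, width: float = 0.2) -> float:
    """Illustrative inverted-U curve: relative performance peaks at a
    moderate arousal/stress level and falls off on either side.
    The Gaussian form and parameters are assumed, not from the slides."""
    return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

for a in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"arousal={a:.1f} -> relative performance={performance(a):.2f}")
```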

  11. Potential Remedies • Design human work to avoid multi-tasking and frequent context switching to the extent possible • Assign rare event monitoring to automation or alerting mechanisms • Provide decision aiding and performance support for decision making under stress • Design appropriate incentives to counter risk compensation tendency • Employ automation and dynamic function allocation to assure manageable cognitive load Most complex problems will require a combination of many of these remedies.
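
The last remedy, dynamic function allocation, can be sketched as a simple load-threshold policy: when the operator's estimated workload exceeds a ceiling, automatable tasks are shifted to automation. Everything below, including the load scale, the ceiling value, and the sample task list, is a hypothetical illustration rather than a prescribed design.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    load: float        # assumed cognitive load contribution on a 0..1 scale
    automatable: bool  # whether automation can take this task over

@dataclass
class FunctionAllocator:
    """Hypothetical sketch: keep total human load under a ceiling by
    reassigning the heaviest automatable tasks to automation."""
    load_ceiling: float = 0.7
    human_tasks: list = field(default_factory=list)
    automated_tasks: list = field(default_factory=list)

    def rebalance(self) -> None:
        total = sum(t.load for t in self.human_tasks)
        # Offload automatable tasks, largest load first, until under the ceiling.
        for task in sorted(self.human_tasks, key=lambda t: t.load, reverse=True):
            if total <= self.load_ceiling:
                break
            if task.automatable:
                self.human_tasks.remove(task)
                self.automated_tasks.append(task)
                total -= task.load

alloc = FunctionAllocator(human_tasks=[
    Task("monitor rare alarms", 0.3, automatable=True),
    Task("route planning", 0.4, automatable=True),
    Task("final go/no-go decision", 0.2, automatable=False),
])
alloc.rebalance()
print("human:", [t.name for t in alloc.human_tasks])
print("automated:", [t.name for t in alloc.automated_tasks])
```

Note how the non-automatable go/no-go decision stays with the human, consistent with the slide's earlier point that humans retain override authority.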

  12. Potential Opportunities • Exploit human ingenuity and creativity in: • adapting to shifting contexts • generalizing from specifics • recognizing novelty and improvising • aggregating information in the absence of an algorithm • detecting and filling gaps (e.g., in narratives) Most complex problems will require a combination of human creativity and ingenuity.

  13. So… Is Human Error a Cause or Consequence? Thank You

  14. My References • Madni, A.M. “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Keynote Presentation, 22nd Annual Systems & Software Technology Conference, Salt Lake City, Utah, April 26–29, 2010. • Madni, A.M. “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Systems Engineering (INCOSE journal), Vol. 13, No. 3, 2010. • Madni, A.M. “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Keynote Presentation, INCOSE 2010 LA Mini-Conference, Loyola Marymount University, October 16, 2010. • Madni, A.M. “Integrating Humans With and Within Software and Systems: Challenges and Opportunities,” invited paper, CrossTalk: The Journal of Defense Software Engineering, “People Solutions” issue, May/June 2011.
