PSY205s: The psychology of aviation - Situational Awareness. Dave Nunez, MPhil, Department of Psychology, University of Cape Town
Admin info • I am in room 4.22 (650 4606) • Paper: Endsley 1999, from the Work Return Room • Slides for the time being: http://www.cs.uct.ac.za/~dnunez/teaching (on the course web page later)
Why study aviation in psychology? • Airplanes get more complex… • …but the people who fly them remain the same. [Images: Bleriot Model XI - 1909; Aerospatiale-BAC Concorde - 1969]
Why study aviation in psychology? • Goal: Increase safety (purely applied) • Aviation is a potentially dangerous activity • Increase safety by engineering (better engines, etc.) • What about the people doing the flying? How safe is flying? Deaths per year by cause (USA 1981-1994): commercial flight: 100 • electrical current: 850 • bicycle riding: 1000 • pedestrian: 8000 • falling: 12000 • auto accidents: 46000. So flying to Joburg should be about 460 times safer than driving there.
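The "460 times safer" figure is simply the ratio of the two annual totals above (a rough comparison of raw death counts, not exposure-adjusted risk):

\[
\frac{46\,000 \ \text{(auto accidents)}}{100 \ \text{(commercial flight)}} = 460
\]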
A little history • During the 'psychometrics boom' (1920s-1930s), psychologists got involved • Pilot selection tests • During WW2, psychologists began to look at aviation losses • More aircraft lost to accidents than to the enemy! • Bartlett, Chapanis, Craik, Gibson and other 'big names' got involved
Pilot error vs. Human error • General rule was: if the plane didn't fail, it was "pilot error" • Pejorative phrase; laid blame • Implication – 'not good enough' • Evidence from Chapanis and others showed it was actually "human error" • Acknowledges the limits of human beings • Certain system features create situations where an error is more likely • Problem becomes worse under certain environmental conditions. Alphonse Chapanis (1917-2002) was a leading figure in the psychology of aviation safety from the 1940s onwards
Example: Human error (and a solution) • Problem: Pilots shut down engines in flight by pulling the wrong lever • Chapanis' solution: shape coding the lever handles (also color coded by function) [Images: Douglas DC-4 (early 1940s); Beechcraft Duchess (late 1970s)]
Human Factors (Ergonomics) • Understand patterns of errors • Many errors can occur regardless of experience • What about being human makes us likely to commit errors? • Examine cognitive processes to identify 'danger spots' • Goal: To create systems which reduce the probability of errors • 'Pilot friendly' aircraft which reduce errors and create a better working environment
A major contribution: SA • A recent important contribution: Situational awareness (SA) • Combination of mental models, working memory and situated cognition theory • Tries to predict how and when errors can happen • Applied to the operation of complex systems (nuclear power plants, ships, aircraft, cars) • Much researched, and taught to pilots • Increases their awareness of when things can go wrong
A quick recap of the info-processing model • [Herbert Simon's model] • Attention, WM (STM), LTM • Attention filters out irrelevant (unexpected) stimuli • Stimuli are transformed in WM according to active rules and schemata (from LTM) • Contents of WM in turn activate rules and schemata as required by the data • Behaviour/consciousness is based on the contents of WM and active scripts
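Purely as an illustration (not part of the lecture, and not Simon's own formulation), here is a minimal sketch of one cycle of this model; every name in it (Schema, attend, process_cycle, the 'fuel low' example) is invented for the sketch:

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Schema:
    """A toy LTM structure: a trigger plus a rule applied to WM contents."""
    trigger: str
    rule: Callable[[List[str]], List[str]]

def attend(stimuli, expectations):
    """Attention: only stimuli matching current expectations get through."""
    return [s for s in stimuli if s in expectations]

def process_cycle(stimuli, expectations, ltm, wm):
    """One pass: attend -> load WM -> activate matching schemata -> transform WM."""
    wm = wm + attend(stimuli, expectations)       # filtered stimuli enter WM
    active = [s for s in ltm if s.trigger in wm]  # WM contents activate schemata
    for schema in active:
        wm = schema.rule(wm)                      # active schemata transform WM
    return wm                                     # behaviour is driven by WM contents

# Hypothetical example: a 'low fuel' schema adds a diversion plan to WM
ltm = [Schema(trigger="fuel low", rule=lambda wm: wm + ["plan diversion"])]
print(process_cycle(stimuli=["fuel low", "birdsong"],
                    expectations={"fuel low", "altitude"},
                    ltm=ltm, wm=[]))   # -> ['fuel low', 'plan diversion']

The only point of the sketch is the flow of control: attention gates what reaches WM, and structures retrieved from LTM do the transforming.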
SA: definition • “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future” (Endsley 1998) • Cognitive task (probably expertise bound) • Dynamic (over time and space) • Several levels of processing (perception, comprehension, prediction)
Levels of SA (1) • Level 1: Perception of the environment • Other aircraft, terrain, own aircraft systems, navigation, radios • Deals with the present – loads perceptual buffers, attention and working memory Mica Endsley (of SA Technologies, previously of MIT) is one of the leading experts on situational awareness in aviation
Levels of SA (2) • Level 2: Comprehension • Synthesis of disjoint level 1 elements (‘big picture’) • Assign importance to each element (goal-directed) • Forms a holistic understanding of what is happening now • Reaching this level requires experience • It is mostly a top-down task; loads working memory, and requires LTM
Levels of SA (3) • Level 3: Projection (predicting) • Requires both Level 1 and Level 2 SA • Also expertise bound – More experienced pilots spend more time predicting what will occur • Effectively gives the pilot more time for decision making The heavier an aircraft, the longer it takes to respond to a pilot’s input. In such situations, Level 3 SA is essential. This is partly why airlines pick their most experienced pilots to fly such aircraft.
Cognition and SA levels • All three levels require some WM and attention • Level 1: Speech comprehension, decoding system interfaces; attentional filtering (mostly bottom-up) • Level 2: Activating mental models and schemata (mostly top-down) • Level 3: Take LTM information and apply it to active models (mostly top-down) • So WM and attention loads can be high during SA • But: Experts will use less (better chunking strategies, better at filtering out irrelevant elements)
Information load: Example • Data sources: • Outside: terrain, weather, aircraft • Inside: gauges, maps, checklists, crew • Aural: crew, control, aircraft, alarms • Haptic: controls, buffets, etc.
Individual factors in SA (1) • Limits of attention • Novices or experts in novel situations require more attention to be placed on the environment • Information overload can exceed the capacity – ‘miss’ important relevant information • Giving more attention to one SA task reduces it on another • Serious problem: NTSB review – 31% of human errors due to problems acquiring relevant data
Individual factors in SA (2) • Can be overcome by ‘sampling’ information • Learn a way to ‘scan’ the world (avoids fixation) • Strongly trained patterns become habitual • Sampling can fail: • Non-optimal strategy (focus on the wrong things) • Visual dominance (forget other inputs) • Memory failures (forget relative importance of elements) • In overload, ‘leave out’ certain elements
Individual factors in SA (3) • Attention limits can be helped by expertise • Top-down knowledge creates expectations, which can increase processing speed • BUT: if unexpected information occurs, more likely to make an error (‘superiority’ effects) • WM is used mostly for integration and projection (Levels 2 & 3) • If much new info is being processed, little WM will be left for integration (and vice-versa!) • Projection places a particularly heavy load on WM (need to store multiple states)
Coping mechanisms (1) • All is not lost – cognitive strategies/structures exist to deal with this • 'coping mechanisms' (not really) • Normal info sorting/learning structures • Generally: use previous knowledge to order the world • Some trained (automaticity); some developed (mental models) • Generally automatic, subconscious processes
Coping mechanisms (2) • Structured knowledge from experience (LTM) • Schemata, scripts & mental models • ‘fill in’ missing info (default values) • Help with structuring & comprehension (reduce WM & attention used) • Increase accuracy of predicting the future • Can be a ‘fuzzy fit’ • Almost essential for higher levels of SA
Coping mechanisms (3) • Goal-driven processing • Goals also determine how resources are allocated • Goals provide a structure in which to process (allows higher levels of SA) • Automaticity (habitual responses) – 'scripted' • Allows processing with minimal attention • Can miss novel stimuli • Safe for routine situations (is there such a thing?)
Factors which reduce SA – Stress (1) • Physical stressors • Noise, vibration, lighting, temperature, fatigue, 'jet lag' • Social/psychological stressors • Anxiety/fear, uncertainty, self-esteem, career advancement, time pressure • Stress effects are complex – a little can help (Yerkes-Dodson law). Stress produces many physical and psychological effects which can reduce SA and undermine a pilot's ability to act correctly.
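The Yerkes-Dodson law has no single standard equation; purely as an illustration (symbols invented here), performance can be pictured as an inverted U over arousal:

\[
\text{Performance}(a) \approx P_{\max} - k\,(a - a^{*})^{2}
\]

where a is arousal, a* the optimal arousal for the task (lower for complex tasks), and k how steeply performance falls away from that optimum: a little stress pushes a towards a*, too much pushes it past.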
Factors which reduce SA – Stress (2) • Why does stress affect SA? • Attentional narrowing (high arousal/anxiety) • Oversampling of dominant cues • Scan patterns disrupted • Premature closure (hasty decisions) • Reduction in WM capacity / LTM retrieval (affects Level 2 & 3 most) • Training reduces these effects • Automaticity reduces attention and WM requirements • More cues, better associations improve retrieval
Factors which reduce SA – under/overload • Mental overload • WM and attention limits reached • Incomplete/erroneous perception • Stressor (being ‘behind the plane’) • Mental underload • No active search for info • Low vigilance/motivation Air traffic controllers (ATCs) also require high levels of SA. In busy sectors (such as London, Atlanta or Tokyo) the volume of traffic can lead to mental overload.
Factors which reduce SA – bad systems • The aircraft’s interface can present information poorly • Presenting too much can lead to overload • Hiding too much can lead to unawareness • The layout of information can interfere with the scan • Recently: ‘Smart’ planes (‘glass cockpit’) • Aware of the information required in a flight phase • Show what is necessary, but watch for problems in the background • Alert the crew if a problem exists (speech, icons, etc)
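A toy sketch (entirely hypothetical; the names, items and thresholds are all invented) of the display policy just described: show only what the current flight phase needs, but keep checking hidden items and alert the crew if one goes out of range:

# Toy 'glass cockpit' display policy: phase-relevant items are shown,
# everything else is monitored in the background and raises an alert if abnormal.
PHASE_ITEMS = {
    "cruise":  {"altitude", "heading", "fuel"},
    "landing": {"altitude", "airspeed", "gear", "flaps"},
}
LIMITS = {"fuel": (20, 100), "airspeed": (110, 250)}   # acceptable ranges (invented)

def display(phase, readings):
    """Return (items to show, alerts to raise) for the current flight phase."""
    shown = {k: v for k, v in readings.items() if k in PHASE_ITEMS[phase]}
    alerts = [k for k, (lo, hi) in LIMITS.items()
              if k in readings and not lo <= readings[k] <= hi]
    return shown, alerts

shown, alerts = display("cruise", {"altitude": 35000, "heading": 90,
                                   "fuel": 55, "airspeed": 460})
print(shown)    # only the cruise-relevant items
print(alerts)   # ['airspeed'] - flagged even though it is not on display in cruise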
Improvements in interfaces [Images: Boeing 737-200Adv (late 1960s); Boeing 737-800 (early 2000s)]
Factors which reduce SA - Complexity • Aircraft keep getting more complex • Technology demands • Increases workload – more system components, more interactions • Effectively increases the number of goals and tasks • An expert in these systems will be protected (a little) • Pilots vary widely in their self-reported understanding of the systems • A difficult road to becoming an expert!
Factors which reduce SA – Automation (1) • Habitual procedures can take crews ‘out of the loop’ • Reduce vigilance, increase complacency • Become a passive recipient of information • Automatic states have bad cognitive consequences • Pilots are slower to detect problems • Slower to re-orient after realizing the problem (schemata de-activation/re-activation)
Factors which reduce SA – Automation (2) • But: automation can help SA also • Computers can monitor many variables • Remove unnecessary manual work (navigation) • Can present many variables already integrated (for Level 2) • The trick is: create systems which aid but do not promote complacency
How serious is a failure in SA? • Jones & Endsley looked at accidents in the USA over a 4-year period (major carriers) • 77% had a substantial human error component • Of those, 88% were due to a failure of SA • SA failures were not evenly distributed among levels • Level 1: 72% • Level 2: 22% • Level 3: 6%
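Composing those percentages (my own arithmetic, assuming the figures can simply be multiplied through):

\[
0.77 \times 0.88 \approx 0.68, \qquad 0.68 \times 0.72 \approx 0.49
\]

i.e. roughly 68% of the accidents reviewed involved an SA failure, and nearly half of all the accidents involved a Level 1 (perception) failure.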
Teamwork: SA in CRM • Most aircraft are flown by a team • Do other people increase or decrease SA? • Spread the work: Effectively have more WM and attention • But: Is that enough for collaboration? Can too many cooks spoil this broth?
Sharing data • To work together, people must share info • Keep mental models etc. aligned • Each must know what information the other needs • Must also share higher-level understanding and projection (Levels 2 and 3) • Essential: shared mental models • High-functioning crews communicate less than low-functioning crews