You: The most important piece of the bulk power system – Human factors in supporting reliability and recovery from physical and cyber events. Mike Legatt, Ph.D., Principal Human Factors Engineer, Electric Reliability Council of Texas, Inc. Michael.Legatt@ercot.com
Introduction • This exercise is intended to prepare you for things that don’t seem “quite right” – how you can communicate and collaborate to identify events and reduce their impacts • Furthermore, it’s intended to serve as a brief primer on maintaining human performance by tracking stress and accuracy, and keeping cognitive biases in check
Objectives You will: • Identify the systematic strengths and weaknesses of people, technology, and their interactions • Recognize the role of your “gut feelings” in operations • Identify when and how to share these feelings within and outside your organization, and guard against biases
Definitions • Running estimate • Common operational picture (COP) • Cognitive bias • Situation awareness • Selective attention • Ego depletion • Hyperstress / hypostress
PATTERN RECOGNITION: The core human activity
In which state of stress are you most likely to make a mistake in an emergency? • Hypostress • Hyperstress • Hypostress before the emergency, then hyperstress when it happens • Being in the “zone of maximum adaptation”
Human Performance Under Stress • Stress and performance, from Hancock (2008)
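A rough way to picture the relationship this slide describes: the short Python sketch below plots a notional inverted-U curve of performance against stress, with a shaded band standing in for the “zone of maximum adaptation.” The curve shape, the band limits, and every numeric value are illustrative assumptions, not Hancock’s (2008) data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Notional stress axis: 0 = hypostress (under-load), 1 = hyperstress (over-load).
stress = np.linspace(0.0, 1.0, 200)

# Illustrative inverted-U performance curve; the Gaussian shape and its width
# are assumptions for illustration only, not fitted to any published data.
performance = np.exp(-((stress - 0.5) ** 2) / (2 * 0.18 ** 2))

plt.plot(stress, performance)
plt.axvspan(0.35, 0.65, alpha=0.2,
            label='"zone of maximum adaptation" (illustrative)')
plt.xlabel("Stress level (hypostress -> hyperstress)")
plt.ylabel("Relative performance")
plt.title("Notional stress-performance curve (after Hancock, 2008)")
plt.legend()
plt.show()
```

The point of the picture is simply that mistakes become more likely toward either end of the curve, not just under overload.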
In which state of stress are you most likely to make a mistake in an emergency? • Hypostress • Hyperstress • Hypostress before the emergency, then hyperstress when it happens • Being in the “zone of maximum adaptation”
If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause? • Inattention • Misinterpretation of a rule • Inaccurate mental model • Organizational bias
How we make mistakes • From: NERC Cause Analysis Methods
If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause? • Inattention • Misinterpretation of a rule • Inaccurate mental model • Organizational bias
In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? • Ego depletion • Semmelweis reflex • Outgroup homogeneity • Hindsight bias
Ego Depletion • Self-control is a limited resource, and like a muscle, it tires out.
Cognitive Biases (a sampling) • “Apparently, when you publish your social security number prominently on your website and billboards, people take it as an invitation to steal your identity.” – Zetter, K. “LifeLock CEO’s Identity Stolen 13 Times.” Wired.com, April 2010.
There’s an emergency, and you have an idea how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? • Zero-risk bias • IKEA effect • Organizational bias • Confirmation bias
You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then? • Cognitive dissonance avoidance • Google effect • IKEA effect • Attentional bias
Cognitive Biases (a sampling) • Anchoring – something you’ve seen before seems like the benchmark (e.g., the first price you paid for gas) • Attentional bias – you’re more likely to see something if you’re thinking about it • Cognitive dissonance – it’s uncomfortable to hold two conflicting thoughts • Confirmation bias – you pay attention to things that support your existing belief
Cognitive Biases (a sampling) • Diffusion of responsibility – “someone else will take care of it” • Google effect – easy to forget things that are easily available electronically • Groupthink – people less likely to contradict ideas in a large group • Hindsight bias – the past seems perfectly obvious
Cognitive Biases (a sampling) • IKEA effect – things you’ve built seem more valuable to you than things others have built • Illusion of transparency – you expect others to understand your thoughts/feelings more than they can • Loss aversion – you’re more strongly motivated to avoid a loss than to achieve an equivalent gain
Cognitive Biases (a sampling) • Organizational bias – you’re likely to think ideas within your organization are better • Outgroup homogeneity – you’re likely to think that people in another group all think the same • Semmelweis reflex – rejecting new ideas that conflict with older, established ones • Zero-risk bias – likely to choose worse overall solutions that seem less risky
There’s an emergency, and you have an idea how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? • Zero-risk bias • IKEA effect • Organizational bias • Confirmation bias
You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then? • Cognitive dissonance avoidance • Google effect • IKEA effect • Attentional bias
In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? • Ego depletion • Semmelweis reflex • Outgroup homogeneity • Hindsight bias
Group exercise: Scenario 1 • Substation X • Camera malfunction • Low oil level alarm on a transformer • Dispatch troubleshooter • Bullet holes in camera and transformer • Random act of vandalism, ploy or directed threat?
Group exercise: Scenario 2 • Substation Y • Communications vaults for 2 providers damaged (AT&T and Level 3) • > 100 shots fired at transformers, oil leaks in several transformers (> 51k gallons spilled) • Only energized transformers shot • Attackers never entered the substation • Initial assumption: vandalism? • Dress rehearsal for future attacks? • It happened: April 16, 2013, Metcalf Substation
Group exercise: Scenario 3 • Utility control room • Telemetry doesn’t look quite right – not sure why • Sees significant flow into substation without a load, then goes away • RTU failure, manipulated data, cyberattack?
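As a toy illustration of how “doesn’t look quite right” telemetry might be flagged, the Python sketch below checks whether telemetered flows into and out of a bus roughly balance. The bus names, the 5 MW tolerance, and the data layout are hypothetical assumptions for illustration; a real EMS/state estimator performs far more rigorous consistency checks.

```python
from dataclasses import dataclass

@dataclass
class BusTelemetry:
    name: str        # hypothetical bus identifier
    mw_in: float     # total telemetered MW flowing into the bus
    mw_out: float    # total telemetered MW flowing out (lines + load)

def flag_suspect_buses(buses, tolerance_mw=5.0):
    """Return buses whose telemetered in/out flows don't balance within tolerance."""
    return [b for b in buses if abs(b.mw_in - b.mw_out) > tolerance_mw]

if __name__ == "__main__":
    readings = [
        BusTelemetry("SUB_A_138", mw_in=120.0, mw_out=119.4),  # balances: looks fine
        BusTelemetry("SUB_B_138", mw_in=85.0, mw_out=0.0),     # flow in, no load: suspect
    ]
    for bus in flag_suspect_buses(readings):
        print(f"Check telemetry at {bus.name}: "
              f"imbalance {bus.mw_in - bus.mw_out:+.1f} MW "
              f"(RTU failure, bad data, or something worse?)")
```

An automated flag like this only tells you that something is off; deciding whether it is an RTU failure, manipulated data, or a cyberattack still depends on the human judgment and communication this exercise is about.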
Group exercise: Scenario 4 • ISO Control Room • News report of civil unrest in an area • Call from utility: substation transformer • Call from utility: telemetry issues • Several other “below the line” calls • To whom do you share this information?
Summary • Value of communication and collaboration when “things are not quite right.” • Reporting structure for handling incidents • Remember – your data may just be part of something larger
References • NERC CAP Annex D, Phase 0 (draft) • NERC CIPC Report to Texas RE MRC • NERC Cause Analysis Methods • Macmillan, N.A., & Creelman, C.D. (1991). Detection theory: A user’s guide. New York: Cambridge University Press. • Hancock, P.A., & Szalma, J.L. (Eds.). (2008). Performance under stress. Chichester, England: Ashgate.
Questions?