Decision Making chapter 8
Columbus Day? • Officially, Tuesday classes don’t meet next week (Monday meets on Tuesday). • However, the syllabus has us meeting next Tuesday. • If we do not meet next week, and move everything back a week, does this conflict with anyone’s schedule? • Homework 2 & 3
Daniel Kahneman • Daniel Kahneman (e.g. Kahneman & Tversky, Kahneman & Treisman) won the 2002 Nobel Prize in Economics along with GMU Professor Vernon Smith. • “Kahneman has integrated insights from psychology into economics, especially concerning human judgment and decision-making under uncertainty,” the Royal Swedish Academy of Sciences said in its citation. – CNN.com
What makes a decision good? • Maximize the expected value of the return • Good decisions produce good outcomes; bad decisions produce bad outcomes. • Expertise -- Experts tend to produce better decisions than novices.
What makes a decision good? • Maximizing expected value of return? • Over time, make sure the average payoff of a series of decisions is as high as possible? • Problems include: • the value/cost of a payoff might be hard to estimate • expected value is only realized over the long term; value of individual decisions may vary • it may be more important to avoid loss than to secure a benefit (“loss aversion”)
What makes a decision good? • Example • An investor might believe the higher return of a stock investment is less valuable than the security of a savings account. • The expected value of a series of roulette bets is negative, but a single bet might return a positive value
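The roulette claim can be checked directly. A minimal sketch, assuming American roulette rules (38 pockets, 18 red), since the slide does not specify the wheel:

```python
# Expected value of a $1 even-money bet on red in American roulette:
# 18 winning pockets, 20 losing pockets (18 black + 2 green).
p_win = 18 / 38
p_lose = 20 / 38
ev = p_win * 1 + p_lose * (-1)
print(round(ev, 4))  # -0.0526: negative on average, though any single bet may win
```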
What makes a decision good? • Producing a good result for a given decision? • problem: since outcomes are probabilistic, a single good result might not reflect the value of average, long-term results. • Example: a single winning bet on red on a roulette wheel does not indicate that betting red consistently is a good strategy.
What makes a decision good? • Producing a good result for a given decision? • problem: since outcomes are probabilistic, a single good result might not reflect the value of average, long-term results. • Example: • the USS Stark did not fire on an incoming contact, and was badly damaged by an Exocet missile. • the USS Vincennes did fire on an incoming contact, and shot down an innocent Iranian airliner.
What makes a decision good? • Domain experts say that it’s good? • Problem: experts’ opinions about what is a good decision might conflict with the first two criteria. • Example: Expert Systems (computer programs) can usually diagnose diseases better than doctors. • An expert doctor from 1600 might do worse than a novice from today equipped with an internet connection to WebMD.
What Constitutes a Good Decision? • Maximizing Expected Value • Producing a ‘good’ outcome • Meeting experts’ criteria • Expected Value • To calculate expected value: • List all potential outcomes, along with their value • Calculate probability of each outcome • Multiply probability of each outcome by value of each outcome. • Sum the resulting values.
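The four calculation steps above can be sketched as a small function (a minimal sketch; representing outcomes as `(probability, value)` pairs is an assumption, not from the slides):

```python
def expected_value(outcomes):
    """Steps 1-4: given a list of (probability, value) pairs covering
    all potential outcomes, multiply each pair and sum the products."""
    return sum(p * v for p, v in outcomes)

# A fair-coin bet: win $1 on heads, lose $1 on tails.
print(expected_value([(0.5, 1.0), (0.5, -1.0)]))  # 0.0
```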
Assume that in a lottery, 6 numbers are randomly chosen from the range 1-52. To win, a player must match all 6 numbers. The prize is $1 million. Q: What is the expected value of a $1 lottery ticket with 2 chances to win? • Probability of winning = .0000001 • Value of winning = $999,999 • Probability of losing = .9999999 • Value of losing = -$1 • Expected Value of $1 bet = (.0000001 x $999,999) + (.9999999 x -$1) = -$0.90 “Buying lottery tickets is a tax on people who aren’t very good at math” – Garrison Keillor
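The lottery arithmetic can be reproduced with exact combinatorics instead of the rounded probabilities (a sketch using the standard library's `math.comb`):

```python
from math import comb

n_tickets = comb(52, 6)          # 20,358,520 possible 6-number combinations
p_win = 2 / n_tickets            # two chances to win, roughly .0000001
ev = p_win * 999_999 + (1 - p_win) * (-1)   # $1M prize net of the $1 ticket
print(n_tickets, round(ev, 2))   # 20358520 -0.9
```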
Assume that two people agree to bet on tosses of a coin. For every toss that lands heads, Person A wins $1. For every toss that lands tails, Person A loses $1. Q: What is the expected value of a coin toss to Person A? P(heads) = .5 Value(heads) = $1 P(tails) = .5 Value(tails) = -$1 E(coin toss) = (.5 x $1) + (.5 x -$1) = $0.50 - $0.50 = $0 Note that after any given coin toss, one person or the other may be momentarily ahead of the other
Perception of Cues • To make a good decision, people need to be able to properly assess the situation. • That is, they need to look for cues that will guide them in their decision making. • However, humans being humans, there are some inherent biases and weaknesses in their ability to correctly perceive cues…
Perception of Cues • Humans are good at • estimating the mean of multiple values • estimating proportions that aren’t too extreme • Humans are poorer at: • estimating extreme proportions • If I have seen 99 normal parts, then detecting 1 abnormal part will have more of an impact than the 100th normal part
Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends
Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends • [Figure: people’s estimate undershoots the true nonlinear trend]
Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends • estimating variance • estimations are affected by overall magnitude • tend to estimate ratio of variance to mean magnitude
Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends • estimating variance • estimations are affected by overall magnitude • tend to estimate ratio of variance to mean magnitude • [Figure: the same variance is misjudged as greater variance at a different overall magnitude]
Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends • estimating variance • estimating degree of correlation in scatter plots • tend to overestimate small correlations and underestimate large correlations
Cues and Cue Integration • Observer must attend to & integrate cues to diagnose a situation & establish a hypothesis about the state of the world. • Cues can be characterized by 3 properties: • Diagnosticity • How much evidence does the cue provide for a given hypothesis? • Reliability • How much can the cue be trusted? • Salience • How conspicuous is the cue?
Cues and Cue Integration • Cues can be characterized by 3 properties: • Diagnosticity • If trying to distinguish between stomach flu and food poisoning, vomiting as a symptom does not provide much diagnostic info. • Reliability • Salience
Cues and Cue Integration • Cues can be characterized by 3 properties: • Diagnosticity • Reliability • Independent of Diagnosticity • unreliable witness says “Joe did it” • highly diagnostic, low reliability. • engine sensor might be faulty, giving erroneous readings on occasion. • Information Value = Diagnosticity x Reliability • Salience
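The Information Value product can be illustrated in a few lines. The cue names and the numeric scores below are invented for illustration, not from the slides:

```python
# Information Value = Diagnosticity x Reliability for each cue.
# Both cues and their (diagnosticity, reliability) scores are hypothetical.
cues = {
    "unreliable witness": (0.9, 0.2),   # highly diagnostic, low reliability
    "engine sensor":      (0.6, 0.7),   # moderately diagnostic, fairly reliable
}
info_value = {name: d * r for name, (d, r) in cues.items()}
best_cue = max(info_value, key=info_value.get)
print(best_cue)  # the sensor's 0.42 beats the witness's 0.18
```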
Cues and Cue Integration • Cues can be characterized by 3 properties: • Diagnosticity • Reliability • Salience • How conspicuous (physically) is the cue? • Salience bias: • Cues that are highly conspicuous are processed more than others
Cues and Cue Integration Example: A financial website predicts that some company’s stock will rise. Should you consider investing in this stock? • Diagnosticity • Does prediction say that rise is almost certain, or is only a bit more likely than not? • Reliability • Have this website’s predictions been accurate in the past? • Salience • Is prediction written in big letters at the top of the page or in smaller letters near bottom of page?
Quality of situation assessment? • How well a person assesses the situation is dependent on: • Comprehensiveness of Info • Is important information missing? • Quantity of information • too little? too much? • Relative Salience of cues • Are some cues more salient than others • Relative weighting (importance) given to cues • Are some cues considered more important?
Quality of situation assessment? • Comprehensiveness of info • Important information might be lacking • a good decision maker should know what info is lacking • Example • A computer user calling for tech support might neglect to report a valuable symptom of his computer's problem. • An experienced technician should recognize that symptom has not been discussed and ask for relevant info.
Quality of situation assessment? • Quantity of information • Cues rarely have an Information Value of 1.0, so we need additional cues to help us make a decision. • However, there might be too much info for the decision maker to attend to and remember. • After about two cues, ability to integrate additional info declines • Decision makers might seek more info than they can effectively utilize • Example • a financial web site might provide minutiae which cannot be read & comprehended.
Quality of situation assessment? • Relative Salience • Some cues might be bigger/brighter/louder than other info, regardless of relative value. • Salient cues tend to be weighted higher in decision making • Example: • website might place info at top of page, above other info of equal value. • brochure might place some info in small type at bottom of page.
Quality of situation assessment? • Relative weighting given to cues • DM might fail to properly discount low value cues. • Often do not give more reliable cues enough weighting. • People tend to weight cues of equal salience equally. • Example: • Nurses might make diagnoses based on number of symptoms present, rather than on Diagnosticity of symptoms
Cue Conclusions • Humans good at estimating: mean, non-extreme proportions • Humans bad at estimating: • extreme proportions • extrapolating non-linear trends • estimating variance • estimating degree of correlation in scatter plots • Cues have 3 characteristics • Diagnosticity • Reliability • Salience • Cue Assessment (person side of things) • Comprehensiveness of info • Quantity of Info • Relative Salience • Relative Weighting
Expertise and Automaticity Expertise generally gives 3 advantages: 1) Better Cue Sampling 2) Recognition-Primed decision making 3) Better risk and probability calibration
Expertise and Automaticity 1) Better Cue Sampling • The expert is more likely to have a better mental model. • That, in turn, will guide the expert in where to look for cues. • “My car won’t start. It makes a grinding noise when I turn the key.” • An expert mechanic would look at the starter motor and not the battery.
Expertise and Automaticity 2) Recognition-Primed DM • Rather than making a calculated decision, experts sometimes employ pattern matching to make rapid decisions. • That is, “I’ve encountered this in the past, and here is the solution I used” • Because detailed analysis is not performed, matching can sometimes occur in error and lead to incorrect or suboptimal choices. • (but it is fast!)
Expertise and Automaticity 3) Better risk and probability calibration • An expert surgeon will know that the risks of a certain surgery may outweigh the possible benefits.
Heuristics in Decision Making • Decision makers seem to behave as if guided by a number of Heuristics or “rules of thumb” • simplify decision making • don’t always produce a correct decision, but might be good enough most of the time
Example • If a coin is flipped 6 times, which of the following outcomes is most likely? • H T T H T H • H H H T T T • H H H H H T All are equally likely. Gambler’s fallacy: assuming that independent events are correlated.
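The equal-likelihood claim can be verified by enumeration (a minimal sketch):

```python
from itertools import product

# Enumerate every 6-toss sequence of a fair coin; each specific
# sequence (including H H H H H T) has identical probability 0.5**6.
sequences = list(product("HT", repeat=6))
p_each = 0.5 ** 6
print(len(sequences), p_each)  # 64 0.015625
```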
Example 2 • A group of people contains 70 engineers and 30 lawyers. A person is drawn at random. This person is a bit shy, and enjoys math and science. What is the likelihood that this person is a lawyer? • A group of people contains 30 engineers and 70 lawyers. A person is drawn at random. This person is a bit shy, and enjoys math and science. What is the likelihood that this person is a lawyer?
Example 2 • A group of people contains 70 engineers and 30 lawyers. A person is drawn at random. This person is a bit shy, and enjoys math and science. What is the likelihood that this person is a lawyer? • 30% • A group of people contains 30 engineers and 70 lawyers. A person is drawn at random. This person is a bit shy, and enjoys math and science. What is the likelihood that this person is a lawyer? • 70%, but your representation/stereotype of engineers might say that the person is probably an engineer.
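Bayes' rule makes the base-rate point explicit. A sketch assuming, purely for illustration, that the "shy, enjoys math and science" description is equally likely for both professions:

```python
def p_lawyer_given_description(base_rate_lawyer, p_desc_lawyer, p_desc_engineer):
    """Bayes' rule: P(lawyer | description) from the base rate and the
    probability of the description under each profession."""
    numerator = base_rate_lawyer * p_desc_lawyer
    denominator = numerator + (1 - base_rate_lawyer) * p_desc_engineer
    return numerator / denominator

# If the description is non-diagnostic (equally likely for both groups),
# the posterior is simply the base rate -- which people tend to ignore.
print(round(p_lawyer_given_description(0.30, 0.5, 0.5), 2))  # 0.3
print(round(p_lawyer_given_description(0.70, 0.5, 0.5), 2))  # 0.7
```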
Heuristics & Biases in DM • Representativeness Heuristic • “If it walks like a duck, and quacks like a duck, it’s probably a duck.” • Assessments of situation based on similarity to mental representation of hypothesized situation. • Ignores base rate of events (e.g. what if ducks are rare?) • Example: • If a patient has 4 symptoms that match disease A and 2 symptoms that match disease B, doctor might diagnose A even though B is much more common.
Heuristics & Biases in DM • Availability Heuristic • judged likelihood of event might be based on the ease with which event comes to mind. • That is, your estimate for the frequency of an event will be biased towards items that easily come to mind. • might be biased by irrelevant characteristics (salience, recency, or simplicity of diagnosis). • Example: • people tend to overestimate the frequency of deaths by an exciting cause (e.g. airplane, sniper) and underestimate frequency of death by mundane cause (e.g. heart disease).
Heuristics & Biases in DM • Anchoring Heuristic • after forming a belief, people are biased not to abandon it. • in other words, primacy of info increases its weighting in judgments • Example: • When surprised by reported earnings, analysts naturally anchor to their old earnings estimates until they are convinced the earnings change is due to permanent rather than temporary factors.
Heuristics & Biases in DM • Anchoring Heuristic • after forming a belief, people are biased not to abandon it. • in other words, primacy of info increases its weighting in judgments • Example: • When showed intelligence suggesting that there was no evidence that Iraq was trying to acquire uranium from Niger, administration officials stuck to the belief that Iraq was trying to acquire uranium.
Heuristics & Biases in DM • Anchoring Heuristic • When a long series of simple cues must be integrated, primacy effects are shown. • When cues are complex (detailed, unfamiliar), recency effects are more likely. • Punchline: • Beliefs might depend on order in which info is presented. • Where info should be equally weighted, should be presented simultaneously.
Heuristics & Biases in DM • Confirmation Bias • After forming a belief, people tend to seek evidence consistent with that belief, and discount inconsistent evidence. • Example: • A person that believes in the abilities of a psychic will tend to pay attention to the successes and ignore the failures.
Heuristics & Biases in DM • Overconfidence bias • People tend to assume that their judgments are much more accurate than is true. • Example: • of all instances when a given stock picker estimates a stock is 90% likely to climb, it might actually climb only 60% of the time.
Heuristics & Biases in DM • Overconfidence bias • People tend to assume that their judgments are much more accurate than is true. • Example: • the average driver believes that they are within the top 25% of drivers….
Heuristics & Biases Conclusions • Experts might rely on automaticity (time-stressed situations) • Heuristics & Biases • Representativeness • Availability • Anchoring • Confirmation Bias • Overconfidence Bias
Choice of Action • Certain Choice • Results of action are known with certainty • A DM can optimize choice: • List attributes of potential choices, decide how important each attribute is • For each potential choice, multiply value of attributes by importance of attribute. • Sum the products, and choose the option which maximizes this sum.
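The three steps above can be sketched as a weighted-sum choice (the attribute names, weights, and scores below are invented for illustration):

```python
# Weighted-sum choice under certainty: score each option by summing
# importance-weighted attribute values, then pick the maximum.
weights = {"price": 0.5, "quality": 0.3, "speed": 0.2}   # importance of each attribute

options = {
    "A": {"price": 7, "quality": 9, "speed": 4},
    "B": {"price": 9, "quality": 5, "speed": 6},
}

def score(attrs):
    """Multiply each attribute value by its importance and sum."""
    return sum(weights[a] * v for a, v in attrs.items())

best = max(options, key=lambda name: score(options[name]))
print(best)  # B (score 7.2 vs. A's 7.0)
```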