
Decision Making


Presentation Transcript


  1. Decision Making, Chapter 8

  2. Daniel Kahneman • Dan Kahneman (e.g. Kahneman & Tversky, Kahneman & Treisman) won the 2002 Nobel Prize in Economics along with GMU Professor Vernon Smith. “Kahneman has integrated insights from psychology into economics, especially concerning human judgment and decision-making under uncertainty,” the Royal Swedish Academy of Sciences said in its citation. – CNN.com

  3. What makes a decision good? • Maximize the expected value of the return • Good decisions produce good outcomes; bad decisions produce bad outcomes. • Expertise -- Experts tend to produce better decisions than novices.

  4. What makes a decision good? • Maximizing expected value of return? • Over time, make sure the average payoff of a series of decisions is as high as possible? • Problems include: • the value/cost of a payoff might be hard to estimate • expected value is only realized over the long term; value of individual decisions may vary • it may be more important to avoid loss than to secure a benefit (“loss aversion”)

  5. What makes a decision good? • Example • An investor might believe the higher return of a stock investment is less valuable than the security of a savings account. • The expected value of a series of roulette bets is negative, but a single bet might return a positive value

  6. What makes a decision good? • Producing a good result for a given decision? • problem: since outcomes are probabilistic, a single good result might not reflect the value of average, long-term results. • Example: a single winning bet on red on a roulette wheel does not indicate that betting red consistently is a good strategy.

  7. What makes a decision good? • Domain experts say that it’s good? • Problem: experts’ opinions about what is a good decision might conflict with the first two criteria • Example: circumstances might meet the criteria for a police officer to fire a weapon, even if the suspect is not actually a threat.

  8. What Constitutes a Good Decision? • Maximizing Expected Value • Producing a ‘good’ outcome • Meeting experts’ criteria • Expected Value • To calculate expected value: • List all potential outcomes, along with their value • Calculate probability of each outcome • Multiply probability of each outcome by value of each outcome. • Sum the resulting values.
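The four-step recipe on this slide maps directly onto a few lines of Python. This is a minimal sketch; the gamble, probabilities, and values below are invented for illustration.

```python
# Minimal sketch of the expected-value recipe above.
# The gamble and its numbers are hypothetical, chosen only for illustration.

def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs; probabilities should sum to 1."""
    return sum(p * v for p, v in outcomes)

# Hypothetical gamble: 30% chance to win $50, 70% chance to lose $10.
gamble = [(0.30, 50.0), (0.70, -10.0)]
print(expected_value(gamble))  # 0.3*50 + 0.7*(-10) = 8.0
```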

  9. Assume that in a lottery, 6 numbers are randomly chosen from the range 1-52. To win, a player must match all 6 numbers. The prize is $1 million. Q: What is the expected value of a $1 lottery ticket with 2 chances to win? • Probability of winning = .0000001 • Value of winning = $999,999 • Probability of losing = .9999999 • Value of losing = -$1 • Expected Value of $1 bet = (.0000001 x $999,999) + (.9999999 x -$1) ≈ -$0.90 “A tax on people who aren’t very good at math” – Garrison Keillor
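The lottery probability on this slide can be computed exactly from the number of possible 6-of-52 combinations. The sketch below (which treats the prize as a flat $1 million and ignores taxes and shared jackpots) reproduces the roughly -$0.90 figure.

```python
from math import comb

combos = comb(52, 6)        # 20,358,520 possible 6-number tickets
p_win = 2 / combos          # two chances to win per $1 ticket
p_lose = 1 - p_win

# Net value of winning is the $1,000,000 prize minus the $1 ticket price.
ev = p_win * 999_999 + p_lose * (-1)
print(round(ev, 2))         # about -0.90
```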

  10. Assume that two people agree to bet on tosses of a coin. For every toss that lands heads, Person A wins $1. For every toss that lands tails, Person A loses $1. Q: What is the expected value of a coin toss to Person A? P(heads) = .5 Value(heads) = $1 P(tails) = .5 Value(tails) = -$1 E(coin toss) = (.5 x $1) + (.5 x -$1) = $0.50 - $0.50 = $0 Note that after any given coin toss, one person or the other may be momentarily ahead.
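A quick simulation illustrates the closing note: the expected value is $0, yet Person A's running total wanders above and below zero over any finite run. The number of tosses and the seed below are arbitrary.

```python
import random

random.seed(1)                       # arbitrary seed, for repeatability
balance = 0                          # Person A's running total
history = []
for _ in range(1000):
    balance += 1 if random.random() < 0.5 else -1
    history.append(balance)

print(history[-1])                   # final balance is usually near, but rarely exactly, 0
print(max(history), min(history))    # Person A is momentarily ahead and behind along the way
```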

  11. Perception of Cues • To make a good decision, people need to be able to properly assess the situation. • That is, they need to look for cues that will guide them in their decision making. • However, humans being humans, there are some inherent biases and weaknesses in their ability to correctly perceive cues…

  12. Perception of Cues • Humans are good at • estimating the mean of multiple values • estimating proportions that aren’t too extreme • Humans are poorer at: • estimating extreme proportions • If I have seen 99 normal parts, then detecting 1 abnormal part will have more of an impact than the 100th normal part

  13. Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends

  14. Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends • estimating variance • estimates are affected by overall magnitude • people tend to estimate the ratio of variance to mean magnitude

  15. Perception of Cues • Humans are poorer at: • estimating extreme proportions • extrapolating nonlinear trends • estimating variance • estimating degree of correlation in scatter plots • tend to overestimate small correlations and underestimate large correlations

  16. Cues and Cue Integration • Observer must attend to & integrate cues to diagnose a situation & establish an hypothesis about the state of the world. • Cues can be characterized by 3 properties: • Diagnosticity • How much evidence does cue provide for a given hypothesis? • Reliability • How much can the cue be trusted? • Information Value = Diagnosticity x Reliability • Salience • How conspicuous is the cue?
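The relation Information Value = Diagnosticity x Reliability is easy to operationalize. The cue names and numbers below are hypothetical, chosen only to show that a sober but reliable cue can outrank a flashier one.

```python
# Information Value = Diagnosticity x Reliability (salience affects attention, not value).
# The cues and their scores (on a 0-1 scale) are hypothetical.

def information_value(diagnosticity, reliability):
    return diagnosticity * reliability

cues = {
    "website prediction": (0.8, 0.4),   # fairly diagnostic, not very reliable
    "quarterly earnings": (0.6, 0.9),   # less diagnostic, quite reliable
}
for name, (d, r) in cues.items():
    print(name, information_value(d, r))
# quarterly earnings (0.54) carries more information than the flashier prediction (0.32)
```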

  17. Cues and Cue Integration Example: A financial website predicts that some company’s stock will rise. Should you consider investing in this stock? • Diagnosticity • Does prediction say that rise is almost certain, or is only a bit more likely than not? • Reliability • Have this website’s predictions been accurate in the past? • Salience • Is prediction written in big letters at the top of the page or in smaller letters near bottom of page?

  18. Quality of situation assessment? • How well a person assesses the situation is dependent on: • Comprehensiveness of Info • Is important information missing? • Quantity of information • too little? too much? • Relative Salience of cues • Are some cues more salient than others? • Relative weighting (importance) given to cues • Are some cues considered more important?

  19. Quality of situation assessment? • Comprehensiveness of info • Important information might be lacking • a good decision maker should know what info is lacking • Example • A computer user calling for tech support might neglect to report a valuable symptom of his computer's problem. An experienced technician should recognize that the symptom has not been discussed and ask for the relevant info.

  20. Quality of situation assessment? • Quantity of information • Cues rarely have an Information Value of 1.0, so we need additional cues to help us make a decision. • However, there might be too much info for the decision maker to attend to and remember. • After about two cues, the ability to integrate additional info declines • Decision makers might seek more info than they can effectively utilize • Example • a financial web site might provide minutiae which cannot all be read & comprehended.

  21. Quality of situation assessment? • Relative Salience • Some cues might be bigger/brighter/louder than other info, regardless of relative value. • Salient cues tend to be weighted higher in decision making • Example: • website might place info at top of page, above other info of equal value. • brochure might place some info in small type at bottom of page.

  22. Quality of situation assessment? • Relative weighting given to cues • DM might fail to properly discount low value cues. • Often do not give more reliable cues enough weighting. • People tend to weight cues of equal salience equally. • Example: • Nurses might make diagnoses based on number of symptoms present, rather than on Diagnosticity of symptoms

  23. Cue Conclusions • Humans good at estimating: mean, non-extreme proportions • Humans bad at estimating: • extreme proportions • extrapolating non-linear trends • estimating variance • estimating degree of correlation in scatter plots • Cues have 3 characteristics • Diagnosticity • Reliability • Salience • Cue Assessment (person side of things) • Comprehensiveness of info • Quantity of Info • Relative Salience • Relative Weighting

  24. Expertise and Automaticity • Recognition-Primed DM • Rather than making a calculated decision, experts sometimes employ pattern matching to make rapid decisions. • That is, “I’ve encountered this in the past, and here is the solution I used” • Because detailed analysis is not performed, matching can sometimes occur in error and lead to incorrect or suboptimal choices.

  25. Heuristics in Decision Making • Decision makers seem to behave as if guided by a number of Heuristics or “rules of thumb” • simplify decision making • don’t always produce a correct decision, but might be good enough most of the time

  26. Example • If a coin is flipped 6 times, which of the following outcomes is most likely? • H T T H T H • H H H T T T • H H H H H T All are equally likely. Gambler's fallacy: assumes independent events are correlated.
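A short simulation backs up the answer: every specific 6-toss sequence occurs with probability (1/2)^6 ≈ 0.0156, no matter how patterned it looks. The trial count and seed below are arbitrary.

```python
import random

random.seed(0)                        # arbitrary seed, for repeatability
targets = ["HTTHTH", "HHHTTT", "HHHHHT"]
counts = {t: 0 for t in targets}
trials = 200_000

for _ in range(trials):
    seq = "".join(random.choice("HT") for _ in range(6))
    if seq in counts:
        counts[seq] += 1

for t in targets:
    print(t, counts[t] / trials)      # each is close to (1/2)**6 ≈ 0.0156
```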

  27. Example 2 • A group of people contains 70 engineers and 30 lawyers. A person is drawn at random. This person is a bit shy, and enjoys math and science. What is the likelihood that this person is a lawyer? • A group of people contains 30 engineers and 70 lawyers. A person is drawn at random. This person is a bit shy, and enjoys math and science. What is the likelihood that this person is a lawyer?
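Bayes' rule gives the normative answer to both versions of this question, and it shows exactly how much the 70/30 split should matter. The likelihoods below (how probable the "shy, enjoys math and science" description is for each profession) are made-up assumptions, used only to illustrate the effect of the base rate.

```python
# Assumed likelihoods P(description | profession); hypothetical numbers chosen
# only to show how the base rate shifts the posterior.
p_desc_given_engineer = 0.60
p_desc_given_lawyer = 0.20

def p_lawyer(base_rate_lawyer):
    """Posterior probability the person is a lawyer, via Bayes' rule."""
    base_rate_engineer = 1 - base_rate_lawyer
    numerator = p_desc_given_lawyer * base_rate_lawyer
    denominator = numerator + p_desc_given_engineer * base_rate_engineer
    return numerator / denominator

print(p_lawyer(0.30))   # 30 lawyers, 70 engineers -> about 0.13
print(p_lawyer(0.70))   # 70 lawyers, 30 engineers -> about 0.44
# Same description, very different answers: ignoring the base rate is the error
# people typically make (the representativeness heuristic on the next slide).
```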

  28. Heuristics & Biases in DM • Representativeness Heuristic • “If it walks like a duck, and quacks like a duck, it’s probably a duck.” • Assessments of situation based on similarity to mental representation of hypothesized situation. • Ignores base rate of events (e.g. what if ducks are rare?) • Example: • If a patient has 4 symptoms that match disease A and 2 symptoms that match disease B, doctor might diagnose A even though B is much more common.

  29. Heuristics & Biases in DM • Availability Heuristic • judged likelihood of event might be based on the ease with which event comes to mind. • incorporates base-rate info, because common events come to mind more easily • might be biased by irrelevant characteristics (salience, recency, or simplicity of diagnosis). • Example: • people tend to overestimate the frequency of deaths by an exciting cause (e.g. airplane, sniper) and underestimate frequency of death by mundane cause (e.g. heart disease).

  30. Heuristics & Biases in DM • Anchoring Heuristic • after forming a belief, people are biased not to abandon it. • in other words, primacy of info increases its weighting in judgments • Example: • When surprised by reported earnings, analysts naturally anchor to their old earnings until they are convinced the earnings change is due to permanent rather than temporary factors.

  31. Heuristics & Biases in DM • Anchoring Heuristic • When a long series of simple cues must be integrated, primacy effects are shown. • When cues are complex (detailed, unfamiliar), recency effects are more likely. • Punchline: • Beliefs might depend on the order in which info is presented. • Where info should be equally weighted, it should be presented simultaneously.

  32. Heuristics & Biases in DM • Confirmation Bias • After forming a belief, people tend to seek evidence consistent with that belief, and discount inconsistent evidence. • Example: • A person that believes in the abilities of a psychic will tend to pay attention to the successes and ignore the failures.

  33. Heuristics & Biases in DM • Overconfidence bias • People tend to assume that their judgments are much more accurate than they actually are • Example: • of all the instances when a given stock picker estimates a stock is 90% likely to climb, it might actually climb only 60% of the time.

  34. Heuristics & Biases Conclusions • Experts might rely on automaticity (time-stressed situations) • Heuristics & Biases • Representativeness • Availability • Anchoring • Confirmation Bias • Overconfidence Bias

  35. Choice of Action • Certain Choice • Results of action are known with certainty • A DM can optimize choice: • List attributes of potential choices, decide how important each attribute is • For each potential choice, multiply the value of each attribute by the importance of that attribute. • Sum the products, and choose the option which maximizes this sum.

  36. optimize choice
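As a concrete sketch of the weighted-sum procedure from slide 35: the cars, attribute scores, and importance weights below are hypothetical.

```python
# Weighted-sum ("optimize") choice from slide 35. The cars, attribute scores
# (0-10), and importance weights are hypothetical.
weights = {"price": 0.5, "reliability": 0.3, "fun": 0.2}

options = {
    "BMW":   {"price": 3, "reliability": 7, "fun": 9},
    "Mazda": {"price": 8, "reliability": 8, "fun": 7},
}

def weighted_score(attributes):
    return sum(weights[a] * attributes[a] for a in weights)

for name, attributes in options.items():
    print(name, weighted_score(attributes))      # BMW 5.4, Mazda 7.8
print("choose:", max(options, key=lambda n: weighted_score(options[n])))
```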

  37. Choice of Action Or more likely… • Satisfice, or pick a choice which might not be optimal, but is good enough. • BMW dealer is 100 miles away, Mazda dealer is down the street… • or • choose by the heuristic of elimination by aspects: choose a single attribute, and discard any choices which do not meet the criterion for that attribute. • Example: • I only have $20k to spend on a vehicle. Therefore, the BMW is out of my price range and the Miata is iffy…
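Elimination by aspects amounts to filtering on one attribute at a time until few options remain; the vehicles, prices, and cutoffs below are invented to loosely match the $20k example.

```python
# Elimination by aspects: screen on one attribute at a time.
# Vehicles, prices, and cutoffs are hypothetical.
vehicles = {
    "BMW 3-series": {"price": 42_000, "seats": 5},
    "Mazda Miata":  {"price": 27_000, "seats": 2},
    "Mazda 3":      {"price": 19_000, "seats": 5},
}

def eliminate(options, attribute, passes):
    """Keep only the options whose value on this attribute passes the criterion."""
    return {name: attrs for name, attrs in options.items() if passes(attrs[attribute])}

remaining = eliminate(vehicles, "price", lambda p: p <= 20_000)   # budget aspect first
remaining = eliminate(remaining, "seats", lambda s: s >= 4)       # then a seating aspect
print(remaining)   # only the Mazda 3 survives
```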

  38. Choice of Action • Uncertain Choice • Consequences of choice are not certain • That is, the consequences are probabilistic • Maximize expected utility of outcome • Utility = subjective value • Utility is not always the same as objective value

  39. Distortions of Value and Cost [Figure: utility as a function of value, for gains and losses] • People underestimate gains in value. • Potential losses are perceived as having greater subjective consequences.

  40. Uncertain Choice • Biases in setting Utility • people are loss averse: they prefer to avoid the risk of a loss rather than gamble on a gain. • Example: • Imagine that you're a contestant on a TV game show. You have just won $10,000. The host offers you a choice: You can quit now and keep the $10,000, or you can play again. If you play again, there is a 0.5 probability that you will win again, and wind up with $20,000. If you play again and lose, you lose your $10,000 and take home nothing. You quickly calculate that the expected value of playing again is $10,000, the same as sticking with the $10,000 you have won so far. Which do you choose?

  41. Distortions of Value and Cost [Figure: utility as a function of value, for gains and losses] • There are progressively smaller gains in utility as value increases. A gift of $100 seems like a lot if you are a poor graduate student, but is a pittance if you are a movie star.
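Slides 39-41 describe a value function that is concave for gains and steeper for losses. One standard way to sketch such a curve is the Tversky & Kahneman (1992) parameterization; the code below uses their published parameters (not anything from this chapter) and applies the curve to the game-show choice from slide 40.

```python
# Prospect-theory-style value function with Tversky & Kahneman (1992) parameters
# (alpha = beta = 0.88, lambda = 2.25). An illustrative sketch, not the chapter's model.
ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def value(x):
    """Concave for gains; steeper for losses (loss aversion)."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

# Game-show choice from slide 40: a sure $10,000 vs. a 50/50 shot at $20,000 or $0.
sure_thing = value(10_000)
gamble = 0.5 * value(20_000) + 0.5 * value(0)
print(round(sure_thing), round(gamble))   # ~3311 vs ~3047: the sure thing feels better

print(round(value(100)), round(value(-100)))   # ~+58 vs ~-130: a loss hurts more than an equal gain pleases
```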

  42. Perceptions of Probability [Figure: subjective probability vs. stated probability] • Probability of rare events is overestimated • e.g. the probability of needing to file an insurance claim is low

  43. Perceptions of Probability • Example: Insurance • The expected value of purchasing insurance is negative: on average, you will lose money (otherwise the insurer would not make money). • Example: • the stock market might appear more risky than a savings account. However, with an interest rate below the rate of inflation, a savings account guarantees a loss.

  44. Perceptions of Probability [Figure: subjective probability vs. stated probability] 2. Reduced sensitivity to probability changes at low P() • e.g. sluggish beta, representativeness heuristic, & ignorance of base rates

  45. Perceptions of Probability [Figure: subjective probability vs. stated probability] 3. Perceived probability is less than real probability • e.g. the probability of a risky but high-payoff outcome might be underestimated, and a surefire (but low-paying) outcome might be chosen instead.
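Slides 42-45 describe a subjective-probability curve that overweights rare events and underweights larger ones. A common way to sketch such a curve is the Tversky & Kahneman (1992) weighting function; the parameter below comes from their paper, not from this chapter.

```python
# One-parameter probability weighting function, w(p) = p^g / (p^g + (1-p)^g)^(1/g),
# with g = 0.61 (Tversky & Kahneman, 1992). Illustrative sketch only.
GAMMA = 0.61

def w(p):
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(p, round(w(p), 3))
# Rare events are overweighted (w(0.01) ≈ 0.055 > 0.01), while large probabilities
# are underweighted (w(0.99) ≈ 0.91 < 0.99), matching the slides' observations.
```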

  46. Uncertain Choice • Biases in setting Utility • Perception of an outcome as a gain or loss is determined by the perceived neutral point. This is the framing effect. • Example: • people see a treatment with a 90% survival rate as being preferable to a treatment with a 10% mortality rate, even though the two describe the same outcome.

  47. Biases in Uncertain Choice • Direct Retrieval/automaticity • Experts might automatically retrieve a solution when the cues fit a given pattern, rather than analyzing the situation. • Error example: • Policeman fires his gun when a person reaches into his pocket.

  48. Choice Conclusions • Certain Choice • outcomes are known • Optimize choice • Satisfice • Elimination by aspect • Uncertain Choice • outcomes are probabilistic • maximize expected utility • gain is underestimated • loss is overestimated • rare events are overestimated • if not rare, overall probability is underestimated • Direct Retrieval/automaticity

  49. Improving Decision Making • Practice in DM doesn’t necessarily make perfect. • Expertise in a DM task doesn’t make one immune from the effects of biases and heuristics.
