Utility Theory
Here the focus is on Utility Theory. Before we begin, let's consider an example. Say one option available to you is a bet that pays $5,000,000 if a flipped coin comes up tails and $0 if it comes up heads. The other option is that you get $2,000,000 with certainty. (Say your grandmother will give you $2,000,000 if you do not bet.)

EMV of the bet = .5(5,000,000) + .5(0) = 2,500,000
EMV of the sure deal = 1(2,000,000) = 2,000,000

Choosing the option with the highest EMV has been our decision rule. But when a sure alternative is available, we may decide to avoid the risky one. Would you take a sure $2,000,000 over a risky $5,000,000?
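The EMV comparison above can be sketched in a few lines of Python. The `emv` helper name is ours, not from the text; it simply weights each payoff by its probability.

```python
def emv(outcomes):
    """Expected monetary value of a list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

bet = [(0.5, 5_000_000), (0.5, 0)]   # tails pays $5M, heads pays $0
sure = [(1.0, 2_000_000)]            # grandmother's certain $2M

print(emv(bet))   # 2500000.0
print(emv(sure))  # 2000000.0
```

The bet has the higher EMV, yet many people would still take the sure $2,000,000, which is exactly the gap utility theory addresses.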
Utility Theory is a methodology that incorporates our attitude toward risk into the decision-making process. It is useful to employ a graph in our analysis. In the graph we consider a rule, or function, that translates monetary values into utility values. The utility values are our subjective views of preference for monetary values. Typically we assume higher monetary values have higher utility.

[Graph: utility value on the vertical axis, monetary value on the horizontal axis]
Say we observe a person always buying chocolate ice cream over vanilla ice cream when both are available, whether chocolate costs the same, less, or even more. So by observing what people do we can get a feel for what is preferred over other options. When we assign utility numbers to options, the only real rule we follow is that higher numbers mean more preference, or utility. Even with financial options, we can study past behavior to get a feel for our preferences. The book we use goes through an elaborate story for assigning utility values. It is just one story, and while it is valid, other approaches have validity as well. Our point is to become aware of the method and see how it works, assuming the values assigned are realistic for the problem at hand.
In general we say people have one of three attitudes toward risk: people can be risk avoiders, risk seekers (or risk lovers), or indifferent toward risk (risk neutral).

[Graph: utility value vs. monetary value, showing three curves: risk avoider (concave), risk neutral (straight line), and risk lover (convex)]

Utility values are assigned to monetary values, and the general shape of the curve for each type of person is shown above. Note that for equal increments in dollar value, utility rises at a decreasing rate (avoider), a constant rate (neutral), or an increasing rate (lover).
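The three curve shapes can be illustrated with hypothetical utility functions over a 0-to-100 money scale (the specific formulas below are our own illustration, not from the text): a concave square root for the avoider, a straight line for the neutral person, and a convex square for the lover.

```python
def u_avoider(x):   # concave: utility rises at a decreasing rate
    return x ** 0.5

def u_neutral(x):   # linear: utility rises at a constant rate
    return x / 10

def u_lover(x):     # convex: utility rises at an increasing rate
    return (x / 10) ** 2

# For equal $25 increments, the gain in utility shrinks, stays flat, or grows:
for u in (u_avoider, u_neutral, u_lover):
    gains = [round(u(x + 25) - u(x), 2) for x in (0, 25, 50, 75)]
    print(u.__name__, gains)
# u_avoider [5.0, 2.07, 1.59, 1.34]
# u_neutral [2.5, 2.5, 2.5, 2.5]
# u_lover [6.25, 18.75, 31.25, 43.75]
```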
Say we have the following payoff table:

                 States of Nature
Alternatives     s1        s2        s3
d1               30000     20000    -50000
d2               50000    -20000    -30000
d3                   0         0         0

If we have P(s1) = .3, P(s2) = .5, and P(s3) = .2, then the EMV of each alternative is:

d1: .3(30000) + .5(20000) + .2(-50000) = 9000
d2: .3(50000) + .5(-20000) + .2(-30000) = -1000
d3: 0

So here the best option is d1. Next we calculate the expected utility of each option. The calculation is similar to the expected value calculation.
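The EMV calculation over the payoff table can be sketched as follows (the dictionary layout is just one convenient representation):

```python
probs = {"s1": 0.3, "s2": 0.5, "s3": 0.2}
payoffs = {
    "d1": {"s1": 30_000, "s2": 20_000, "s3": -50_000},
    "d2": {"s1": 50_000, "s2": -20_000, "s3": -30_000},
    "d3": {"s1": 0, "s2": 0, "s3": 0},
}

# EMV of each alternative: probability-weighted sum across states of nature.
emv = {d: round(sum(probs[s] * v for s, v in row.items()), 2)
       for d, row in payoffs.items()}
print(emv)                    # {'d1': 9000.0, 'd2': -1000.0, 'd3': 0.0}
print(max(emv, key=emv.get))  # d1
```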
Say we have been given the utility value for each monetary value, so we change all the values in our table to utility values:

                 States of Nature
Alternatives     s1      s2      s3
d1               9.5     9.0     0
d2              10       5.5     4.0
d3               7.5     7.5     7.5

We had P(s1) = .3, P(s2) = .5, and P(s3) = .2. Next we calculate the expected utility of each option. The calculation is similar to the expected value calculation. The expected utility of each is:

d1: .3(9.5) + .5(9.0) + .2(0) = 7.35
d2: .3(10) + .5(5.5) + .2(4.0) = 6.55
d3: .3(7.5) + .5(7.5) + .2(7.5) = 7.5

Option d3 has the best expected utility, so we choose d3. Option d3 is to do nothing here. Option d1, while having the best EMV, also has a 20% chance of losing 50000. Expected utility theory incorporates our subjective view of that loss, and here it rules out the risk in d1.
Once you have the utility values associated with the money values, graphing them shows what type of person you have: a risk avoider, a risk lover, or a person indifferent to risk. What I do in the rest of this section is expand your "feel" for this method. You can probably stop here to work on problems, but I carry on if you need that feel.
Say we have an opportunity that is uncertain. We could get 50000 with probability p, and we could get -50000 with probability (1-p). The expected value of the uncertain opportunity is defined as p(50000) + (1-p)(-50000). Once we have a value for p we can calculate the expected value. As an example, say p = .3. Then the expected value is .3(50000) + .7(-50000) = -20000. Now let's think about the three types of people again. Say each has to compare a certain -20000 with the uncertain opportunity just mentioned.
On the next slide I show the utility function for each type of person. Note that at -20000 I have a higher number assigned (by height of curve) for the risk avoider, then the risk-neutral person, then the risk lover. The number just represents preference, or utility, for a person. Higher numbers for a person mean higher utility for that person. But we cannot compare numbers across people and say that because one person has a higher number than another, that person prefers the option more. We will not compare across people.
Remember the uncertain option had possible outcomes 50000 or -50000 and an expected value of -20000 (because we had p = .3). The certain option was to get -20000 (lose 20000).

[Graph: the three utility curves over monetary values from -50 to 50 (in thousands), with points a, b, and c marking the utility of -20000 on each curve]

For the certain option, the utility is just read off the utility function, or curve, for each person. The values are represented by the letters a, b, and c for the risk avoider, risk-neutral, and risk-lover types, respectively.
For the uncertain option we have to talk about expected utility, because we do not know for sure what will happen. The expected utility of an uncertain opportunity is a weighted average of the utility of each component of the opportunity, where the weights are the probabilities of occurrence of each outcome. In our example this is .3U(50000) + .7U(-50000). In the graph, the expected utility is found as the height of the straight line connecting the two uncertain outcomes, evaluated above the expected value. Let's go back to the graph. For the risk avoider, the sure loss of 20000 has more utility than the expected utility of the uncertain opportunity (a is higher than b). For the risk lover, the utility of the certain outcome is less than the expected utility (c is less than b). For the risk-neutral person the two are the same.
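The comparison can be made concrete with three hypothetical utility functions (our own illustration), each scaled so that U(-50000) = 0 and U(50000) = 10. For every type, the expected utility of the gamble is .3(10) + .7(0) = 3, so only the certain utility differs:

```python
def u_avoider(x):   # concave: risk avoider
    return 10 * ((x + 50_000) / 100_000) ** 0.5

def u_neutral(x):   # linear: risk neutral
    return 10 * (x + 50_000) / 100_000

def u_lover(x):     # convex: risk lover
    return 10 * ((x + 50_000) / 100_000) ** 2

for u in (u_avoider, u_neutral, u_lover):
    # Gamble: 50,000 with p = .3, -50,000 with .7; certain option: -20,000.
    eu_gamble = 0.3 * u(50_000) + 0.7 * u(-50_000)
    print(u.__name__, round(u(-20_000), 2), round(eu_gamble, 2))
# u_avoider 5.48 3.0   -> certain loss preferred (a above b)
# u_neutral 3.0 3.0    -> indifferent
# u_lover 0.9 3.0      -> gamble preferred (c below b)
```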
The risk avoider prefers certain amounts over uncertain amounts with the same expected value. The risk lover prefers uncertain amounts that have an expected value equal to the certain amount. The risk neutral person is indifferent, or has a tie in terms of preference, between the two. This does not mean the risk avoider will never take risk (choose an opportunity where the payout is uncertain). It just means the expected value of the uncertain option has to be higher than a sure amount by an amount sufficient to induce them to take the risk. This also does not mean the risk lover will never take a certain option. But the value of the certain option should be sufficiently high. I elaborate on the risk avoider next.
Here we show a generic example with a risk avoider. Two monetary values of interest are, say, X1 and X2, and those values have utility U(X1) and U(X2), respectively.

[Graph: a concave utility curve with U(X1) and U(X2) marked on the vertical axis above X1 and X2 on the dollar axis]
Say the outcome of a risky decision is to have X1 occur with probability p and X2 occur with probability (1 - p). Then the EMV is p(X1) + (1 - p)(X2). The expected utility of the risky decision is found in a similar way, and without proof I tell you the expected utility lies along the straight line connecting the two points on the curve, directly above the EMV for the decision. We have the expected utility as

EU = pU(X1) + (1 - p)U(X2)

[Graph: the concave utility curve with a straight line from (X1, U(X1)) to (X2, U(X2)); EU is the height of that line above the EMV]
The decision maker may also have an option that is certain. If so, its utility is simply read off the utility curve. In the diagram we see that any sure amount greater than Y has a utility greater than the expected utility of the risky option.

[Graph: the same curve and straight line, with Y marking the dollar amount at which the curve's height equals EU]
Any opportunity that has a certain payoff between Y and the EMV would have a monetary value lower than the expected value of the uncertain option, but would have higher utility and so would be preferred. If we call such a certain amount C, then EMV minus C is the risk premium: the amount the decision maker is willing to "pay" (give up) to avoid the chance of getting the low value of the uncertain option.
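A sketch of the risk premium, under an assumed square-root utility (our illustration, scaled so U(-50000) = 0 and U(50000) = 10): invert the utility function to find the certain amount C whose utility equals the gamble's EU, then take EMV minus C.

```python
def u(x):
    # Assumed concave utility of a risk avoider.
    return 10 * ((x + 50_000) / 100_000) ** 0.5

def u_inverse(util):
    # Dollar amount whose utility equals `util` (inverts u above).
    return 100_000 * (util / 10) ** 2 - 50_000

p = 0.3                                      # probability of the high payoff
emv = p * 50_000 + (1 - p) * (-50_000)       # expected monetary value: -20,000
eu = p * u(50_000) + (1 - p) * u(-50_000)    # expected utility of the gamble
ce = u_inverse(eu)                           # certainty equivalent C
premium = emv - ce                           # risk premium
print(round(ce), round(premium))             # -41000 21000
```

Here the avoider treats the gamble as worth a certain -41000, so a sure loss of anything less than 41000 beats gambling; the 21000 gap is the premium paid to avoid risk.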
Example. Say we have the following payoff table:

                 States of Nature
Alternatives     s1        s2        s3
d1               30000     20000    -50000
d2               50000    -20000    -30000
d3                   0         0         0

Also say P(s1) = .3, P(s2) = .5, and P(s3) = .2. Next we go through a process used to assign a utility value to each value in the payoff table. In general we say U(x) is the utility value assigned to the payoff value x. Note that 50000 is the highest payoff and -50000 is the lowest payoff in this problem.
Let's assign the number 10 as the utility for the highest payoff, 50000. Thus U(50000) = 10. Let's assign 0 as the utility for the lowest payoff, -50000. So U(-50000) = 0. We could have used other numbers to represent the utility, but the main point is that higher payoffs get higher utility numbers. Next we consider a method to assign utility values to all the other payoff values in the table. First consider a lottery. Say there is probability p that if the lottery is played the manager will win 50000 (the highest value in the payoff table), and probability (1-p) that the manager will win -50000 (the lowest value in the payoff table). The lottery has expected utility pU(50000) + (1-p)U(-50000) = p(10) + (1-p)(0) = 10p.
Remember that the expected utility lies on the straight line between the two uncertain points on the utility function. That straight line would itself be the utility function for a risk-neutral person. The curved line shown is for a risk avoider, and the straight line also serves as the expected utility line for the risk avoider.

[Graph: monetary values from -50 to 50 (in thousands); the curve has utility 10 at 50 and utility 0 at -50, with the straight line connecting those endpoints]
Next we have to assign utility values to each of the other payoff values in the table. Let’s start with the value 30000. We will assume for a short time that 30000 can be had with certainty. If p, the probability of getting the 50000 in the lottery, is 1 or close to 1, then the person is very likely to get the 50000 and thus the lottery is likely to be preferred to the certain 30000. But if p is close to 0 the person is likely to get -50000 and thus the certain 30000 would be preferred. There is a value of p such that the person would have an equal preference for the lottery and the certain 30000. The individual would then be said to be indifferent between the lottery and the certain value. p is then called the indifference probability.
The 30000 is the certain deal. For the risk avoider, the utility of 30000 is on the curve above 30000. The expected value of the lottery is p(50000) + (1-p)(-50000) = 100000p - 50000. This expected value would equal 30000 if p = .8. Now look at the expected utility of the lottery again: it was 10p. If p = .8, the expected utility of the lottery would be 8, and this would be the height of the straight line above 30000. So the utility curve above 30000 has to be assigned a value above 8 if the person is risk averse (a risk avoider). There is some value of p above .8 but less than 1 that would make the lottery exactly as attractive as the certain 30000. This probability is the indifference probability. In the real world the decision maker has to grapple with what this value is; we will take it as a given piece of information.
For the certain value 30000, say the indifference probability is .95. With p = .95, the expected utility of the lottery is .95(10) + .05(0) = 9.5, and since the certain option must have the same utility, U(30000) = 9.5. Similarly, say for all the other values in our payoff table we get the indifference probabilities, and hence utility values, seen in the table below:

Payoff value    Indifference probability    Utility
30000           .95                         9.5
20000           .90                         9.0
0               .75                         7.5
-20000          .55                         5.5
-30000          .40                         4.0
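Because U(50000) = 10 and U(-50000) = 0, each utility is just the indifference probability times 10: U(x) = p(10) + (1-p)(0) = 10p. A minimal sketch of the table's conversion:

```python
# Indifference probability for each certain payoff (from the table above).
indifference = {30_000: 0.95, 20_000: 0.90, 0: 0.75, -20_000: 0.55, -30_000: 0.40}

# U(x) = 10p, since the lottery's endpoints have utilities 10 and 0.
utilities = {x: round(10 * p, 2) for x, p in indifference.items()}
print(utilities)
# {30000: 9.5, 20000: 9.0, 0: 7.5, -20000: 5.5, -30000: 4.0}
```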
So we take the original payoff table:

                 States of Nature
Alternatives     s1        s2        s3
d1               30000     20000    -50000
d2               50000    -20000    -30000
d3                   0         0         0

and change all the values to utility values:

                 States of Nature
Alternatives     s1      s2      s3
d1               9.5     9.0     0
d2              10       5.5     4.0
d3               7.5     7.5     7.5

We had P(s1) = .3, P(s2) = .5, and P(s3) = .2. Next we calculate the expected utility of each option. The calculation is similar to the expected value calculation.
The expected utility of each is:

d1: .3(9.5) + .5(9.0) + .2(0) = 7.35
d2: .3(10) + .5(5.5) + .2(4.0) = 6.55
d3: .3(7.5) + .5(7.5) + .2(7.5) = 7.5

Option d3 has the best expected utility, so we choose d3.
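The whole expected-utility decision rule can be sketched end to end, mirroring the EMV calculation but over the utility table:

```python
probs = {"s1": 0.3, "s2": 0.5, "s3": 0.2}
utils = {
    "d1": {"s1": 9.5, "s2": 9.0, "s3": 0.0},
    "d2": {"s1": 10.0, "s2": 5.5, "s3": 4.0},
    "d3": {"s1": 7.5, "s2": 7.5, "s3": 7.5},
}

# Expected utility: probability-weighted sum of utilities for each alternative.
eu = {d: round(sum(probs[s] * u for s, u in row.items()), 2)
      for d, row in utils.items()}
print(eu)                   # {'d1': 7.35, 'd2': 6.55, 'd3': 7.5}
print(max(eu, key=eu.get))  # d3 -- do nothing, despite d1 having the best EMV
```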