Quantitative Analysis for Management
Chapter 4: Decision Trees
Chapter Outline
4.1 Introduction
4.2 Decision Trees
4.3 How Probability Values Are Estimated by Bayesian Analysis
Learning Objectives
Students will be able to:
• Develop accurate and useful decision trees
• Revise probability estimates using Bayesian analysis
Introduction
Decision trees enable one to analyze decisions:
• with many alternatives and states of nature
• that must be made in sequence
Decision Trees
A graphical representation of a decision problem, built from two kinds of nodes:
• a decision node, from which one of several alternatives may be chosen
• a state-of-nature node, out of which exactly one state of nature will occur
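As a concrete illustration, here is a minimal Python sketch of these two node types. The class and field names are hypothetical, not from the text; the slides define the concepts, not this code.

```python
from dataclasses import dataclass

@dataclass
class DecisionNode:
    # Maps each alternative's name to the child node it leads to
    # (or a numeric payoff at a leaf); the decision maker chooses
    # exactly one alternative.
    alternatives: dict

@dataclass
class StateOfNatureNode:
    # List of (probability, child node or payoff) pairs; exactly one
    # state of nature occurs, so the probabilities sum to 1.
    outcomes: list
```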
Thompson's Decision Tree (Fig. 4.1)
[Figure: a decision node with three alternatives — Construct Large Plant, Construct Small Plant, and Do Nothing. The two plant alternatives lead to state-of-nature nodes 1 and 2, each branching into Favorable Market and Unfavorable Market.]
Five Steps to Decision Tree Analysis
1. Define the problem
2. Structure or draw the decision tree
3. Assign probabilities to the states of nature
4. Estimate payoffs for each possible combination of alternatives and states of nature
5. Solve the problem by computing expected monetary values (EMVs) for each state-of-nature node
Thompson's Decision Tree with Probabilities and Payoffs (Fig. 4.2)
[Figure: the same tree as Fig. 4.1, now annotated with probabilities and payoffs.]
• Construct Large Plant → node 1: Favorable Market (0.5) $200,000; Unfavorable Market (0.5) −$180,000; EMV = $10,000
• Construct Small Plant → node 2: Favorable Market (0.5) $100,000; Unfavorable Market (0.5) −$20,000; EMV = $40,000
• Do Nothing: $0
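To make step 5 concrete, here is a minimal Python sketch that computes the EMVs shown in Fig. 4.2. The variable names are ours; the probabilities and payoffs come directly from the slide.

```python
# EMV of an alternative = sum over states of nature of
# (probability of the state) * (payoff of that combination).
alternatives = {
    "Construct Large Plant": [(0.5, 200_000), (0.5, -180_000)],
    "Construct Small Plant": [(0.5, 100_000), (0.5, -20_000)],
    "Do Nothing":            [(1.0, 0)],
}

for name, branches in alternatives.items():
    emv = sum(p * payoff for p, payoff in branches)
    print(f"{name}: EMV = ${emv:,.0f}")

# Prints EMVs of $10,000, $40,000, and $0; the best decision here is
# Construct Small Plant with EMV = $40,000.
```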
Expected Value of Sample Information
EVSI = (expected value of the best decision with sample information, assuming no cost to gather it) − (expected value of the best decision without sample information)
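As a sketch, the definition reduces to a simple difference. The function name and the illustrative numbers below are ours, not from the slides.

```python
def evsi(ev_with_sample_info: float, ev_without_sample_info: float) -> float:
    """Expected value of sample information: the EV of the best decision
    with sample information (gathered at no cost) minus the EV of the
    best decision without it."""
    return ev_with_sample_info - ev_without_sample_info

# Hypothetical illustration: if a survey raises the best expected value
# from $40,000 to $59,200, the information is worth up to $19,200.
print(evsi(59_200, 40_000))  # 19200.0
```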
Estimating Probability Values by Bayesian Analysis
Bayes' theorem combines prior probabilities with new data to produce posterior probabilities.
Prior probabilities may come from:
• Management experience or intuition
• History
• Existing data
We need to be able to revise these probabilities when new data become available.
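For two states of nature — favorable market (FM) and unfavorable market (UM) — Bayes' theorem takes the standard form applied in Tables 4.2 and 4.3 below:

P(FM | survey result) = P(survey result | FM) · P(FM) / [ P(survey result | FM) · P(FM) + P(survey result | UM) · P(UM) ]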
Table 4.1: Survey reliability — the conditional probabilities P(survey result | state of nature) used in Tables 4.2 and 4.3
• P(positive | FM) = 0.70    P(negative | FM) = 0.30
• P(positive | UM) = 0.20    P(negative | UM) = 0.80
Table 4.2: Probability Revisions Given a Positive Survey

State of Nature | Conditional Probability P(positive | State) | Prior Probability | Joint Probability   | Posterior Probability
FM              | 0.70                                        | × 0.50            | = 0.35              | 0.35 / 0.45 = 0.78
UM              | 0.20                                        | × 0.50            | = 0.10              | 0.10 / 0.45 = 0.22
                |                                             |                   | P(positive) = 0.45  | Total = 1.00
Table 4.3: Probability Revisions Given a Negative Survey

State of Nature | Conditional Probability P(negative | State) | Prior Probability | Joint Probability   | Posterior Probability
FM              | 0.30                                        | × 0.50            | = 0.15              | 0.15 / 0.55 = 0.27
UM              | 0.80                                        | × 0.50            | = 0.40              | 0.40 / 0.55 = 0.73
                |                                             |                   | P(negative) = 0.55  | Total = 1.00
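A minimal Python sketch that reproduces the revisions in Tables 4.2 and 4.3. All numbers come from the slides; the variable names are ours.

```python
# Bayesian revision: posterior = joint / P(survey result), where
# joint = P(survey result | state) * P(state).
priors = {"FM": 0.50, "UM": 0.50}
reliability = {
    "positive": {"FM": 0.70, "UM": 0.20},   # Table 4.2
    "negative": {"FM": 0.30, "UM": 0.80},   # Table 4.3
}

for result, conditional in reliability.items():
    joint = {s: conditional[s] * priors[s] for s in priors}
    total = sum(joint.values())             # P(survey result)
    for s in priors:
        print(f"P({s} | {result}) = {joint[s]:.2f}/{total:.2f}"
              f" = {joint[s] / total:.2f}")
# Output matches the tables: 0.78 and 0.22 for a positive survey,
# 0.27 and 0.73 for a negative survey.
```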