Data Analysis – Workshop Decision Making and Risk Spring 2006 Partha Krishnamurthy
Outline • Introduction to Decision Analysis • Review Decision Trees • Terminology/Orientation • Introduction to DA software • Workshop (hands-on) • Basic decision tree • Sensitivity analysis • Incorporating costs and utilities
Basic Decision Tree • Nodes: • Decision node • Chance node • Terminal node • Branches, which connect nodes • Outcomes: labels and probabilities (fixed or variable) • Values/Utilities
Data Analysis Conventions • Distal/Downstream • Upstream/Proximal • What is the relevance of this distinction?
Analysis – Basic Principle • Evaluation of trees typically proceeds from the terminal nodes back toward the decision node, i.e., upstream. • At each chance node, the expected value is calculated in the usual fashion: EVj = Σ (i = 1 to n) pij × Uij • The expected value serves as the "Utility" at the chance node as the analysis proceeds upstream. • This process is called average-out/fold-back.
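The average-out/fold-back procedure above can be sketched in a few lines of Python. The tree encoding and the toy numbers are illustrative only, not taken from the workshop software:

```python
# A minimal sketch of average-out/fold-back (rollback) on a decision tree.
# The dict-based node encoding is an illustrative assumption.

def rollback(node):
    """Return the expected value of a tree node, working upstream."""
    kind = node["kind"]
    if kind == "terminal":
        return node["value"]
    if kind == "chance":
        # Average out: EVj = sum over i of pij * Uij
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        # Fold back: pick the branch with the highest expected value
        return max(rollback(child) for child in node["options"])
    raise ValueError(f"unknown node kind: {kind}")

# A toy tree: decide between a sure 10 and a gamble.
tree = {
    "kind": "decision",
    "options": [
        {"kind": "terminal", "value": 10.0},
        {"kind": "chance", "branches": [
            (0.6, {"kind": "terminal", "value": 20.0}),
            (0.4, {"kind": "terminal", "value": 0.0}),
        ]},
    ],
}

print(rollback(tree))  # gamble EV = 0.6 * 20 = 12.0 beats the sure 10
```

Note that the recursion naturally starts at the terminal nodes and carries each chance node's expected value upstream as its "utility", exactly as the slide describes.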
The Use of Variables • Both outcomes (payoffs) and probabilities can be specified as raw numbers or as variables. • Specifying them as variables facilitates sensitivity analysis. • You can also specify correlations among them. • Let us look at a simple decision tree.
Market Intervention Example • Problem: Declining sales. • Possibilities: The product is out of sync with the market, or nothing is wrong (seasonality; things will get better). • Interventions: Launch a promotion or do nothing. • Data: • Outcomes • 22.5m if the product succeeds, with or without the promotion. • 12.5m if it fails without the promotion, and 10m if it fails with the promotion. • Probabilities • p(out of sync) = 0.3 • p(promo effective | out of sync) = 0.86 • p(failure | out of sync) = 0.90
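Under one plausible reading of this tree, the expected values can be computed directly. The branch structure below is an assumption (only the payoffs and probabilities come from the slide): in-sync products succeed regardless, out-of-sync products fail 90% of the time without the promotion, and an effective promotion rescues an out-of-sync product.

```python
# One possible structure for the market-intervention tree.
# The numbers are from the slide; how they connect is an assumption.
p_outofsync = 0.30
p_promo_effective = 0.86        # given the product is out of sync
p_fail_given_outofsync = 0.90   # without the promotion
success, fail_no_promo, fail_promo = 22.5, 12.5, 10.0  # payoffs, in millions

# Do nothing: in-sync products succeed; out-of-sync ones mostly fail.
ev_nothing = (1 - p_outofsync) * success + p_outofsync * (
    p_fail_given_outofsync * fail_no_promo
    + (1 - p_fail_given_outofsync) * success)

# Promote: an effective promo turns an out-of-sync product into a success.
ev_promo = (1 - p_outofsync) * success + p_outofsync * (
    p_promo_effective * success
    + (1 - p_promo_effective) * fail_promo)

print(round(ev_nothing, 3), round(ev_promo, 3))  # 19.8 vs. 21.975
```

On this reading the promotion has the higher expected value, but the assumed structure should be checked against the actual tree built in the workshop.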
Remainder of the Session • We will do the following with the castle decision problem: • Structure the tree. • Perform sensitivity analysis. • We will all walk through the same decision problem, step by step.
Reading Break • Take 10 minutes and read the first three pages of the castle decision problem.
Application Break • Take 10 minutes to answer the questions in the handout.
Structuring the Tree in DATA • In this segment, we are going to follow the steps in the handout (pages 4 through 8).
Sensitivity Analysis • What if our assumptions about the probabilities and/or payoffs are different? How would the decision change? • Conceptually, what does sensitivity analysis help us accomplish?
Mechanics • Specify how many variables you want to analyze. • Specify the range you want analyzed. • Specify the number of intervals. • DATA computes expected values at the interval points only, and linearly interpolates the expected values in between. • More intervals: smoother curve. • More intervals: more computation.
Performing Sensitivity Analysis • In this segment, we will go through the steps in the handout session 2.
Insights from Sensitivity Analysis • Can you generate some insights about the castle decision by performing different types of sensitivity analyses?
Modeling Costs Separately • Previously, we modeled the payoff for each outcome state as a single net revenue. • It is more reasonable to think of the payoffs as having two components: a revenue component and a cost component. • Your decision may be sensitive to your revenue assumptions as well as your cost assumptions. • Modeling your decision as a cost-benefit tree allows you to gauge the importance of the revenue and cost assumptions separately.
Decision Context • Refer to the Castle product introduction decision. • The payoff for each outcome state has two components, a revenue component and a cost component. • We modeled only the revenue component. • Our goal is to assess what happens to the decision if the costs of introduction (now and later) and the development cost if the product fails in 6 months are modeled explicitly.
Modeling Strategy • First, change the calculation method to "Benefit-Cost". • Second, specify the costs of introduction now and later, and the cost of development if the product fails in 6 months. • Initially, set all costs to zero at the root node and recover the model's expected value (it should be the same as before). • Later, use sensitivity analysis to find the impact of these three variables on the decision.
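The zero-cost sanity check in the strategy above can be sketched as follows. All variable names and figures here are placeholders, not the castle-case numbers:

```python
# Hedged sketch of benefit-cost evaluation: each terminal node carries a
# revenue and a cost, and the rollback works on the net value.

def net_value(revenue, cost):
    """Benefit-cost payoff at a terminal node."""
    return revenue - cost

# Illustrative placeholder figures (not from the castle case).
p_success = 0.6
revenue_success, revenue_failure = 30.0, 5.0
cost_intro = 0.0  # set all costs to zero first, as the slide suggests

ev = (p_success * net_value(revenue_success, cost_intro)
      + (1 - p_success) * net_value(revenue_failure, cost_intro))
revenue_only_ev = (p_success * revenue_success
                   + (1 - p_success) * revenue_failure)
print(ev == revenue_only_ev)  # True: zero costs recover the original EV
```

Once the zero-cost model reproduces the revenue-only expected value, the real costs can be switched on and swept in a sensitivity analysis.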
Application Break • Go back to the castle decision tree (if you have saved the tree, good; if not, let me know and I will give you the file). • Follow the steps for cost-benefit modeling in handout session 3.
Generalized Multi-Attribute Models • What if every outcome state had payoffs that do not necessarily behave like costs and benefits? • Notice that costs and benefits are traded off using a positive and a negative coefficient: equally important and opposite in effect. • This is where multi-attribute models come into play. • You can tell the DA program how to combine multiple payoffs and how to evaluate them.
Warrant for Multi-Attribute Models • Each outcome has more than one attribute. For example: • Revenue • Market Share • Strategic Fit • Profit • Decisions have to tackle multiple attributes at the same time.
Market Segmentation Decision - Example • Segments under consideration • Cash Cow • 10 on revenue, 5 on market share growth, 3 on strategic fit, 6 on profitability. • Star of the Future, Dog for now. • 3 on revenue, 7 on market share growth, 8 on strategic fit, 2 on profitability. • Multi-segment • 5 on revenue, 4 on market share growth, 7 on strategic fit, 6 on profitability.
Modeling Strategy • Create three branches off of the decision node, one for each segment. • Define each choice as a terminal node, and enter the four payoffs for each choice. • Change the model to generalized multi-attribute. • Tell DATA how to combine the attributes. • Set the importance of each attribute. • Specify each attribute's importance coefficient as a variable. • Set each attribute's importance to, say, 0.25 (equally weighted). • Then perform sensitivity analyses to see how shifting decision criteria changes the decision.
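The attribute scores from the segmentation example can be combined with a weighted sum. The equal 0.25 weights follow the strategy above; the weighted-sum combination rule itself is an assumption about how DATA would be configured:

```python
# Sketch of a generalized multi-attribute evaluation. The four attribute
# scores per segment come from the segmentation slide; the combination
# rule (weighted sum) is an illustrative assumption.
attributes = ["revenue", "share_growth", "strategic_fit", "profitability"]
segments = {
    "Cash Cow":           [10, 5, 3, 6],
    "Star of the Future": [3, 7, 8, 2],
    "Multi-segment":      [5, 4, 7, 6],
}
weights = [0.25, 0.25, 0.25, 0.25]  # equally weighted to start

scores = {name: sum(w * a for w, a in zip(weights, vals))
          for name, vals in segments.items()}
best = max(scores, key=scores.get)
print(scores, best)  # Cash Cow scores highest under equal weights

# Sensitivity: shift weight toward strategic fit and the ranking can flip.
weights2 = [0.1, 0.1, 0.7, 0.1]
scores2 = {name: sum(w * a for w, a in zip(weights2, vals))
           for name, vals in segments.items()}
print(max(scores2, key=scores2.get))  # now Star of the Future wins
```

Treating each weight as a variable, as the strategy suggests, makes exactly this kind of ranking flip visible in a sensitivity analysis.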