EML4550 - Engineering Design Methods: Decision Theory (Hyman, Chapter 9)
Decision theory
• Optimization
  • Well-defined variables (we know what to 'manipulate')
  • Well-defined objective function (we know what to minimize/maximize)
  • Strict mathematical framework (we know what we are doing)
• Economic analysis
  • Well-defined costs or economic benefits
  • The decision reduces to a single criterion expressed in dollars (like optimization, a single criterion)
• If at all possible, design decisions should be based on models amenable to optimization or economic analysis. More often than not, however, this is not possible, and the designer must reach a decision without such simple models
Decision theory
• Many times a designer will be faced with decisions that cannot be easily (or appropriately) reduced to an optimization or economic analysis model:
  • Multiple criteria
  • Non-quantifiable variables
  • Uncertainty (probabilistic variables)
  • 'Apples and oranges' comparisons
• There are many approaches to decision-making. We will cover:
  • Multiple criteria: decision matrices (already partially covered in concept selection)
  • Decision under uncertainty (probabilistic analysis)
  • Risk-based decision-making (utility functions)
Multiple criteria
• Consider the example below (car bumper):
  • 3 design options, 4 decision criteria
  • Loosely defined metric of 'goodness' on each criterion
• How can a decision be made?
Decision Matrix (Unweighted Pugh Matrix)
• Assign: excellent '+', adequate '0', and poor '-'
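A minimal sketch of how such an unweighted tally can be computed, assuming hypothetical +/0/- ratings (the actual matrix entries from the slide are not reproduced here):

```python
# Unweighted Pugh-matrix tally. The three bumper options and their
# +/0/- ratings below are hypothetical placeholders, not the values
# from the original slide.
RATING_VALUE = {"+": 1, "0": 0, "-": -1}  # excellent / adequate / poor

options = {
    "A": ["+", "0", "+", "-"],  # one rating per decision criterion
    "B": ["0", "+", "0", "0"],
    "C": ["-", "-", "+", "0"],
}

for name, ratings in options.items():
    net = sum(RATING_VALUE[r] for r in ratings)
    print(f"Option {name}: net score {net:+d}")
```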
Decision matrix
• Identify criteria
  • Develop criteria BEFORE the options are clear (prune later if necessary)
  • Include only those attributes for which a differentiation exists
  • Refine criteria as options are clarified or further knowledge is gained (e.g., eliminate, combine, etc.)
• Develop criteria metrics
  • Tie criteria to quantifiable variables (e.g., 'cost' ---> dollars, 'durability' ---> fatigue limits, etc.)
  • Some criteria will be hard to tie to a metric; for these, define a self-consistent scale (e.g., 'excellent', 'very good', ..., 'poor')
  • The metrics differ in kind ('apples and oranges'), so a consistent, common evaluation scale is needed
Decision matrix (cont'd)
• Evaluation scales
  • Assign verbal values on a common scale (e.g., 'excellent', 'adequate', and 'poor')
  • Tie these scales to a metric (e.g., cost < $1000 ---> 'excellent')
• Numerical scales
  • Assign numbers (e.g., from 1 to 10) instead of categories
  • Even though this opens the possibility of quantitative comparison, it does not remove the subjective nature of assigning values
  • A strong correlation between the evaluation scale and the corresponding metric is needed
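A sketch of tying a word scale to a metric. Only the "cost < $1000 ---> excellent" threshold comes from the slide; the remaining cut-offs are assumptions chosen for illustration:

```python
# Map the quantitative 'cost' metric onto the common word scale.
# The 2500-dollar cut-off is an assumed value, not from the slide.
def cost_rating(cost_dollars: float) -> str:
    if cost_dollars < 1000:
        return "excellent"
    elif cost_dollars < 2500:   # assumed cut-off
        return "adequate"
    else:
        return "poor"

print(cost_rating(800))    # excellent
print(cost_rating(1800))   # adequate
print(cost_rating(3200))   # poor
```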
Example
• Power transmission between two shafts
• Hierarchical tree structure (objective tree) to determine criteria (and associated metrics)
• A total of 15 criteria to consider; two of them are singled out in the scoring example that follows
Example: connecting the evaluation scale with metrics
• Specify end points
  • End points have been defined for each metric (e.g., the highest achievable specification)
  • Numerical values (and corresponding 'word' values) have been assigned to each range
Comparison (decision) matrix
• Assign values (from the scale) to each option and for each criterion
• Compile scores ('raw' scores)
• Add up the score for each option and normalize it
• Torque metric: max 50,000 maps to 10, min 1,500 maps to 0; increment (50,000 - 1,500)/10 = 4,850, so a torque of 35,000 scores (35,000 - 1,500)/4,850 = 6.91
• Load metric: max 5,000 maps to 10, min 500 maps to 0; increment 450, so a load of 4,200 scores (4,200 - 500)/450 = 8.22
• Raw scores sum to 15 + 14 + 5 = 34, so the option with a raw score of 15 normalizes to 15/34 = 0.44
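A sketch of the metric-to-scale conversion and normalization worked above. The end points, sample inputs (35,000 and 4,200) and raw totals 15, 14, 5 are the slide's numbers; the linear mapping function and the assignment of the totals to options A, B, C are assumptions:

```python
def scale_value(x: float, lo: float, hi: float, steps: int = 10) -> float:
    """Linearly map a metric value from [lo, hi] onto a 0..steps scale."""
    increment = (hi - lo) / steps
    return (x - lo) / increment

print(round(scale_value(35_000, 1_500, 50_000), 2))  # 6.91  (torque, increment 4,850)
print(round(scale_value(4_200, 500, 5_000), 2))      # 8.22  (load, increment 450)

# Raw totals from the slide; which total belongs to which option is assumed.
raw_scores = {"A": 15, "B": 14, "C": 5}
total = sum(raw_scores.values())                      # 34
normalized = {name: s / total for name, s in raw_scores.items()}
print({name: round(v, 2) for name, v in normalized.items()})
# {'A': 0.44, 'B': 0.41, 'C': 0.15}
```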
Relative importance of criteria
• The previous example shows how to eliminate an option (option C is clearly inferior and can be discarded), but options A and B are very similar
• Question: should all criteria be treated as equally important?
• Before proceeding with a method that includes all 15 criteria (in the example), we must determine the relative importance of each criterion and assign 'weights' to them
  • How do we determine the relative importance of criteria?
  • How do we assign a 'weight' to each one?
Relative importance of criteria: Pairwise Comparison
• Returning to the car bumper example
• Build a matrix with all four criteria in both rows and columns
• In each row, enter a '1' in every column where the row criterion is more important than the column criterion (e.g., if damage control is more important than cost, assign a 1 in the damage-control row under the cost column)
• Add the '1's in each row to assess the relative importance of each criterion
• Normalize the row totals over the grand total for all rows (e.g., row totals 1 + 2 + 0 + 3 = 6, so a row total of 1 gives a weight of 1/6)
Relative importance of criteria: Pairwise comparisons
• There are 16 elements in the matrix; the 4 on the diagonal are not needed. Of the 12 remaining elements, not all are independent: in this 4x4 example, only 6 independent comparisons are required
• For N criteria, N(N-1)/2 comparisons are needed for self-consistency (e.g., the 15 criteria in the shaft transmission example require 15(15-1)/2 = 105 comparisons)
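A minimal sketch of pairwise-comparison weighting. Only the row totals (1 + 2 + 0 + 3 = 6) and the resulting 1/6 weight come from the slide; the individual matrix entries and criterion labels are illustrative assumptions:

```python
# M[i][j] = 1 means criterion i is judged more important than criterion j.
# The entries are illustrative but self-consistent: exactly one of each
# pair (i, j) gets a 1, so the grand total is N(N-1)/2 = 6 for N = 4.
criteria = ["criterion 1", "criterion 2", "criterion 3", "criterion 4"]
M = [
    [0, 0, 1, 0],  # row total 1
    [1, 0, 1, 0],  # row total 2
    [0, 0, 0, 0],  # row total 0
    [1, 1, 1, 0],  # row total 3
]

row_totals = [sum(row) for row in M]
grand_total = sum(row_totals)          # 6
weights = [t / grand_total for t in row_totals]

for name, w in zip(criteria, weights):
    print(f"{name}: weight {w:.3f}")   # 0.167, 0.333, 0.000, 0.500
```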
Use the objective tree
• Instead of making all 105 comparisons (tedious and error-prone), use the objective tree
• Establish the relative importance of criteria only within each subgroup; in this example that reduces the total number of pairwise comparisons from 105 to 23
• Per-subgroup counts from the tree: 10 at level 2 (5 criteria, 5(4)/2 = 10), 3 + 1 + 1 + 1 + 3 at level 3 (e.g., a 3-criterion subgroup needs 3(2)/2 = 3), and 3 + 1 at level 4, giving 10 + 9 + 4 = 23
Using objective trees to assign weights
• Within each subgroup, the local weights k sum to 1 (e.g., 0.4 + 0.5 + 0.1 = 1.0 for a three-criterion subgroup)
• Example: at level 2 there are 5 criteria with the following weight assignment: maintenance (k = 0.1), geometry (k = 0.1), health/safety (k = 0.3), operating conditions (k = 0.25), power/load rating (k = 0.25); Sum(k_i) = 0.1 + 0.1 + 0.3 + 0.25 + 0.25 = 1.0
• Individual weight = group weight * k
  • Example 1: at level 2, power/load rating has k = 0.25 and group weight from level 1 w_level1 = 1.0, so w = (1.0)(0.25) = 0.25
  • Example 2: at level 4, speed flexibility has k = 0.3 and group weight from level 3 w_op.speed = 0.162, so w = (0.162)(0.3) = 0.049
Using objective trees to assign weights (cont'd)
• Compute each option's weighted total score and normalize over the sum for all options: Sum = 0.925 + 0.825 + 0.35 = 2.1, so the top option normalizes to 0.925/2.1 = 0.44
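A sketch of cascading objective-tree weights (w = parent group weight * k) and normalizing the options' weighted totals, using the slide's numbers. Which total (0.925, 0.825, 0.35) belongs to which option is assumed here for illustration:

```python
def node_weight(parent_weight: float, k: float) -> float:
    """A sub-criterion's weight is its local factor k times its parent group's weight."""
    return parent_weight * k

print(node_weight(1.0, 0.25))             # 0.25  (power/load rating at level 2)
print(round(node_weight(0.162, 0.3), 3))  # 0.049 (speed flexibility at level 4)

# Weighted totals from the slide; option labels are assumed.
weighted_totals = {"A": 0.925, "B": 0.825, "C": 0.35}
total = sum(weighted_totals.values())     # 2.1
print({name: round(v / total, 2) for name, v in weighted_totals.items()})
# {'A': 0.44, 'B': 0.39, 'C': 0.17}
```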
Analytical Hierarchy Process (AHP)
• The previous processes for assigning weights (pairwise comparison or objective trees) are still subjective and open to inconsistencies
• They do not take into account the options to be evaluated (the weights and scale end points are selected 'a priori')
• The scale may be too coarse for the options at hand, or one or more options may be 'off the scale'
• For projects of very large dollar value, it may pay to do some preliminary engineering and apply AHP
• See Hyman, Section 9.3, for further details on this method
Decision-making under uncertainty
• Usually the decision to go with one design option or another has to be made in the middle of the design process, when not all information is available and knowledge is fragmentary
• Probability also plays a role: sometimes the 'best' decision has to be based on the odds of certain events happening (or not)
• The designer (like a politician) must be able to make decisions under uncertainty; it helps to understand the decision-making process