Selection Methods
• Factors to consider
• System component costs
• Methods
FACTORS
• Cost: often hard to estimate
  • If the technology can be enabled without modifications, development costs are easier to estimate
  • Still hard to know the benefit impact
• Intangibles
  • Market share, better customer service
  • Better corporate image
  • Better access to data
Hidden Outcomes
• Organizational power shifts
• Imposition of work methods
• Get more done with fewer people
• Big impact on how people work
• Subject to technology changes
Hinton & Kaye (1996) • Two capital budgeting approaches: • Capital investment • Apply rigorous cost/benefit analysis • Revenue enhancement • Disregard cost/benefit details IT tends to be treated as a capital investment Maybe it shouldn’t be • If critical to firm strategy • If needed to keep up with competitors
IT Budgeting Techniques
• FINANCIAL
  • Discounted cash flow – long term; reflects the time value of money
  • Cost-benefit analysis – does not need to include the time value of money
  • Payback – quick and dirty, but usually sufficient
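A minimal sketch of the "quick and dirty" payback calculation named above; the cash-flow figures are hypothetical and not taken from the slides:

```python
def payback_period(initial_cost, yearly_net_benefits):
    """Return the first year in which cumulative net benefits recover the initial cost."""
    cumulative = 0.0
    for year, benefit in enumerate(yearly_net_benefits, start=1):
        cumulative += benefit
        if cumulative >= initial_cost:
            return year
    return None  # does not pay back within the horizon considered

# Hypothetical ERP project: $3M up front, growing yearly net benefits
print(payback_period(3_000_000, [500_000, 1_000_000, 1_500_000, 1_500_000]))  # -> 3
```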
IT Risk Factors
• Project manager ability (often hire a consultant)
• Experience (probably won't have it with ERP)
• Experience with programming
• Availability of critical equipment and software (probably not a problem with ERP)
• Project team completeness (make it complete)
• Personnel turnover (especially after adoption of ERP)
• Project team size (make sure it is large enough)
• Relative control of the project manager over the project team
IS/IT Project Evaluation Technique Use – Bacon [1992]: Financial
IS/IT Project Evaluation Technique Use – Bacon [1992]: Managerial
IS/IT Project Evaluation Technique Use – Bacon [1992]: Developmental
Cost Benefit Example
• Adopt a small ERP ($3 million)
• Year 1: BPR
• Year 2: internal team cost doubles, consulting costs double, more hardware
• Year 3: finish implementation
• Year 4: ERP operation begins
• Cost of capital: 20% per year
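As a rough illustration of discounting this kind of cash-flow stream at the 20% cost of capital, here is a sketch; the yearly amounts are invented for illustration and are not given on the slide:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is year 0, later entries are years 1, 2, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical profile: license/BPR up front, implementation costs through year 3,
# operating benefits from year 4 on (amounts are illustrative only)
flows = [-3_000_000, -1_500_000, -1_000_000, -500_000, 1_800_000, 1_800_000, 1_800_000]
print(round(npv(0.20, flows)))  # negative here, so benefits would need to run longer or be larger
```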
Value Analysis – Keen (1981)
• DSS benefits are usually very nebulous
  • Unfair to apply cost-benefit analysis, because benefit estimates are unreliable
• Costs: identify as in cost-benefit analysis
• Benefits: leave in subjective terms
• Managerial decision: are you willing to pay this much for that set of benefits?
SMART: Simple Multi-Attribute Rating Technique
• Develop a hierarchy, score alternatives, weight the criteria
• Value = weight × score
Terminology
• Objectives – what you want to accomplish
• Attributes – features of a thing
• Criteria – measures of things of value
• Tradeoffs – one alternative better on one attribute, the other better on another attribute
  • Vendor: fast, better perceived quality, higher price
  • ASP: fast, but less control
  • In-House: slow, risky, but best fit
SMART Technique
1. Identify the person whose utilities are to be maximized
2. Identify the issue or issues
3. Identify the alternatives to be evaluated
4. Identify the relevant dimensions of value for evaluating alternatives (attribute scales)
5. Rank the dimensions in order of importance
6. Rate the dimensions in importance, preserving ratios
7. Sum the importance ratings and divide each by the total to get weights (w_i)
8. Measure how well each alternative does on each dimension (s_ij)
9. Compute the value of each alternative: U_j = Σ_i w_i × s_ij
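Steps 7–9 can be expressed as a short function. This is a minimal sketch of the weighted-sum formula above; the function and argument names are my own, not from the slides:

```python
def smart_value(raw_weights, scores):
    """U_j = sum_i w_i * s_ij for one alternative.

    raw_weights: {criterion: importance rating}        (step 6)
    scores:      {criterion: 0-1 score for this alternative}  (step 8)
    """
    total = sum(raw_weights.values())  # step 7: normalize so the weights sum to 1
    return sum((w / total) * scores[c] for c, w in raw_weights.items())  # step 9
```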
Points
• In Step 4, limit the number of criteria
  • There are only so many things a human can keep track of at one time; about 8 is plenty
• If a weight is extremely low, drop that criterion
Methodology
• Step 3: identify the alternatives – Vendor, ASP, In-House
• Step 4: identify the criteria – cost, quality, control
• Step 5: rank order the criteria – cost > quality > control
• Step 6: rate the dimensions in importance (least important = 10)
  • control = 10, quality = 35, cost = 50
Methodology
• Step 7: sum and divide by the total
  • cost = 50/95 = .526, quality = 35/95 = .368, control = 10/95 = .105
• SWING WEIGHTING (check): give the most important criterion 100, the others proportional values
  • cost = 100, quality = 60, control = 25
  • cost = 100/185 = .541, quality = .324, control = .135
• Maybe average the two: cost .53, quality .35, control .12
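The weight arithmetic on this slide can be checked directly; the ratings are the slide's own values:

```python
ratio = {"cost": 50, "quality": 35, "control": 10}   # step 6 ratings
swing = {"cost": 100, "quality": 60, "control": 25}  # swing-weighting check

def normalize(raw):
    total = sum(raw.values())
    return {k: v / total for k, v in raw.items()}

ratio_w, swing_w = normalize(ratio), normalize(swing)
avg_w = {k: (ratio_w[k] + swing_w[k]) / 2 for k in ratio}
print({k: round(v, 2) for k, v in avg_w.items()})
# -> {'cost': 0.53, 'quality': 0.35, 'control': 0.12}
```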
Purpose of Swing Weighting
• The weight input is admittedly an approximation
• Assigning values from a different perspective gives an additional check
• Should yield greater accuracy
Scores
• Step 8: score each alternative on each criterion
  • Need as objective a scale as you can get
  • It doesn't have to be linear
• COST scale: maximum feasible cost = 0, minimum available cost = 1.0
  • Vendor about 0.8, ASP (outsourced) 1.0, In-House 0
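One way to build that kind of cost scale is a simple linear rescaling. The dollar figures below are hypothetical, chosen only so the Vendor lands near the 0.8 shown above:

```python
def cost_score(cost, max_feasible, min_available):
    """Map a raw cost onto the 0-1 scale: maximum feasible -> 0, minimum available -> 1."""
    return (max_feasible - cost) / (max_feasible - min_available)

# Hypothetical bids: worst acceptable $3.0M, cheapest option $1.25M, Vendor bid $1.6M
print(round(cost_score(1_600_000, 3_000_000, 1_250_000), 2))  # -> 0.8
```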
Scores
• QUALITY: Vendor excellent 1.0, ASP less 0.4, In-House good 0.8
• CONTROL: Vendor average 0.3, ASP low 0.1, In-House maximum 1.0
Calculation of Value: U_j = Σ_i w_i × s_ij

            COST    QUALITY   CONTROL   TOTAL
weights      .53      .35       .12
Vendor       0.8      1.0       0.3     0.810
ASP          1.0      0.4       0.1     0.682
In-House     0.0      0.8       1.0     0.400

Recommends the Vendor system
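Running the table's weights and scores through the weighted sum reproduces the totals; this check uses only the values shown on the slide:

```python
weights = {"cost": 0.53, "quality": 0.35, "control": 0.12}
scores = {
    "Vendor":   {"cost": 0.8, "quality": 1.0, "control": 0.3},
    "ASP":      {"cost": 1.0, "quality": 0.4, "control": 0.1},
    "In-House": {"cost": 0.0, "quality": 0.8, "control": 1.0},
}
totals = {alt: round(sum(weights[c] * s[c] for c in weights), 3) for alt, s in scores.items()}
print(totals)  # -> {'Vendor': 0.81, 'ASP': 0.682, 'In-House': 0.4}
```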
SMART
• Provides a very workable means to implement the principles of MAUT
  • In fact, it can be MORE accurate than MAUT (more realistic scores and tradeoffs)
• Identify criteria
• Develop scores over the criteria
• Identify the available alternatives and measure their scores
• Simple calculation