Cost Estimation Toolkit (Version 1) Kathy Fontaine, NASA; Greg Hunolt, Bud Booth, Mel Banks, SGT ES DSWG 2nd Joint Working Group Meeting October 18, 2004
Agenda • CET Goals and Approach • Concept for use of the CET • How the CET is Implemented • Evaluation, Testing, Performance of the CET • CET Status
Goal and Approach • Goal: Provide the capability for NASA-funded principal investigators or NASA staff to estimate the life cycle costs of implementing and operating a new “data activity”. • A new “data activity” can be a ‘stand-alone’ data service provider such as a DAAC, SIPS, ESIP, etc., or… • A new “data activity” can be a data service provider function within a flight project or other science or applications project. • A new “data activity” can be wholly new or an addition to an existing base (e.g. a DAAC taking on a data service provider role for a flight project). • Approach: • Adopt a Cost Estimation by Analogy methodology: base life cycle cost estimates on experience with existing data activities (see the sketch below); • Develop a reference model; • Collect information from DAACs, SIPSs, ESIPs, and other DAAC-like activities (space science, NOAA); • Build the database, develop and test the estimating tool.
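The slide above names Cost Estimation by Analogy as the methodology. As a rough illustration of that idea only (not the CET's actual algorithm, which is implemented in Excel VBA), the Python sketch below scores a proposed data activity against hypothetical comparables and averages the staffing of the closest matches; the activity names, workload parameters, and FTE figures are all invented for the example.

```python
# Illustrative sketch of estimation by analogy; the real CET is an Excel VBA
# application and its estimating logic is not reproduced here.
from dataclasses import dataclass

@dataclass
class DataActivity:
    name: str
    archive_volume_tb: float   # hypothetical workload driver
    products_per_year: int     # hypothetical workload driver
    annual_fte: float          # observed staffing (comparables only)

def similarity(a: DataActivity, b: DataActivity) -> float:
    """Smaller is more similar; compares relative workload differences."""
    dv = abs(a.archive_volume_tb - b.archive_volume_tb) / max(b.archive_volume_tb, 1e-9)
    dp = abs(a.products_per_year - b.products_per_year) / max(b.products_per_year, 1)
    return dv + dp

def estimate_by_analogy(new_activity: DataActivity,
                        comparables: list[DataActivity],
                        k: int = 3) -> float:
    """Average the staffing of the k most similar existing activities,
    scaled by the ratio of archive volumes (a crude workload adjustment)."""
    nearest = sorted(comparables, key=lambda c: similarity(new_activity, c))[:k]
    scaled = [c.annual_fte * (new_activity.archive_volume_tb / c.archive_volume_tb)
              for c in nearest]
    return sum(scaled) / len(scaled)

# Hypothetical comparables database (not the real CDB contents, which are hidden):
cdb = [
    DataActivity("Site A", 120.0, 40, 25.0),
    DataActivity("Site B", 60.0, 15, 14.0),
    DataActivity("Site C", 300.0, 90, 48.0),
]
proposed = DataActivity("New PI activity", 90.0, 30, annual_fte=0.0)
print(f"Analogy estimate: {estimate_by_analogy(proposed, cdb):.1f} FTE/year")
```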
Concept for Use of CET • For the PI planning a new Data Activity: • To help the PI consider the full range of items that will contribute to the life cycle cost of a new data activity (Processing function example follows); • To assess which items are significant cost drivers and to examine sensitivities to possible changes in workload or schedule, e.g. by running ‘what-if’s (see the sketch below); • To make an estimate that the PI can compare to estimates produced by other means. • For NASA program staff: • To perform a ‘sanity check’ on PI proposals and to identify areas for follow-up with the PI; • To make overall program budget estimates. • Cautions for Users: • A CET estimate is not definitive or “The Answer”. • Current estimates are based on what is, which may or may not be what ‘should be’. • Estimates five to ten years into the future are fraught with hazard: • Shifts in operating paradigms (driven by technology or changing user needs) may make estimates based on even recent experience less accurate.
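One use named above is running ‘what-if’s to examine sensitivity to workload or schedule changes. The sketch below shows that pattern with a hypothetical stand-in estimator (a simple linear formula, not the CET's estimating logic): sweep one workload driver and compare the resulting staffing estimates.

```python
# Illustrative 'what-if' sweep: vary one workload driver and watch how the
# estimated staffing responds. The estimator below is a hypothetical stand-in.
def estimate_annual_fte(archive_volume_tb: float, products_per_year: int) -> float:
    # Hypothetical coefficients chosen only to make the example run.
    return 5.0 + 0.08 * archive_volume_tb + 0.2 * products_per_year

baseline_volume_tb = 90.0
products = 30

for scale in (0.5, 1.0, 1.5, 2.0):   # what if the archive grows half as fast, or twice as fast?
    volume = baseline_volume_tb * scale
    fte = estimate_annual_fte(volume, products)
    print(f"volume = {volume:6.1f} TB  ->  estimated staffing = {fte:5.1f} FTE/year")
```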
How the CET is Implemented • Cost Estimation Toolkit (CET): • Excel VBA (Visual Basic for Applications) project; • The CET workbook includes Activity Dataset worksheets that the user builds – these describe the user’s data activity (or activities) or ‘what-if’ variants to be estimated – plus other internal worksheets; • Estimator, Reviewer, and CDB Analyzer implemented in VBA software; • Runs on PC or Macintosh platforms. • Comparables Database (CDB): • Excel workbook, one worksheet for each data activity (see the sketch below); • The identity of the CDB data activities is hidden. • Package Available for Users: • CET and CDB workbooks; the user needs a PC or Mac with Excel 97 or newer; • Users’ Guide, Technical Description Document; • All available on CD from Kathy Fontaine, kathy.fontaine@nasa.gov; • Points of contact for help provided.
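For readers who want a feel for the "one worksheet per data activity" organization of the CDB, here is a hypothetical Python/openpyxl sketch. The real CET and CDB are Excel VBA workbooks; the file name and cell positions used below are assumptions made only for illustration and are not the documented layout.

```python
# Illustrative only: assumes a hypothetical layout (activity name in A1,
# annual FTE in B2) purely to show the one-worksheet-per-activity structure.
from openpyxl import load_workbook

def summarize_cdb(path: str) -> None:
    wb = load_workbook(path, data_only=True)   # read cached values, not formulas
    for sheet_name in wb.sheetnames:           # one worksheet per data activity
        ws = wb[sheet_name]
        activity = ws["A1"].value              # hypothetical cell positions
        annual_fte = ws["B2"].value
        print(f"{sheet_name}: activity={activity!r}, annual FTE={annual_fte!r}")

# summarize_cdb("CDB.xlsx")   # hypothetical filename
```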
Independent Testing Results • Results are based on testing with 21 CDB sites: • Eight DAAC data activities, seven SIPSs, three ESIPs, one NOAA DAAC-like activity (SAA), and two space science DAAC-like activities (STScI, NSSDC). • Test results for Version 1 CET, September 2004: • The typical annual error of estimate is 2.78 FTE (an average absolute error, so positive and negative errors don’t cancel); the typical error as a percentage of actual is 21.2% (see the metric sketch below). • The standard deviation of the typical error is 1.96. • The overall annual average error across the 21 sites is –0.43 FTE, or –3.3%, showing low overall bias. • For the individual estimates for the 21 data activities: • 11 have errors less than 20%; • 15 have errors less than 25%; • 16 have errors less than 30%; • 17 have errors less than 50%. • So if you need accuracy within 20%, this suggests you have roughly a 50% chance of getting it; within 25%, roughly a 70% chance; within 30%, roughly a 75% chance; and within 50%, roughly an 80% chance.
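The "typical annual error" quoted above is an average absolute error, and the "overall annual average error" is a signed mean, which is why the two figures differ. A minimal sketch of those two metrics, run on invented numbers rather than the actual 21-site test data:

```python
# "Typical error" = mean absolute error, so positive and negative errors do
# not cancel; "overall bias" = signed mean error. Figures below are invented.
def mean_absolute_error(estimated: list[float], actual: list[float]) -> float:
    return sum(abs(e - a) for e, a in zip(estimated, actual)) / len(actual)

def mean_signed_error(estimated: list[float], actual: list[float]) -> float:
    return sum(e - a for e, a in zip(estimated, actual)) / len(actual)

# Hypothetical annual FTE figures for three sites:
est = [12.0, 20.0, 8.0]
act = [10.0, 22.0, 8.5]
print(f"typical (absolute) error: {mean_absolute_error(est, act):.2f} FTE")
print(f"overall bias (signed):    {mean_signed_error(est, act):+.2f} FTE")
```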
Progress with CET Independent Testing Performance • Improving average annual typical error for the CET Estimator, by version: • Working Prototype, May 2003: 6.08 FTE (39.2%); • IOC, September 2003: 4.89 FTE (34.6%); • Beta Test, May 2004: 3.29 FTE (24.4%); • Version 1, September 2004: 2.78 FTE (21.2%).
CET Status • CET Version 1 delivered September 2004: • Full ‘operational’ version of the toolkit: • CET Estimator – takes a data activity description and produces a life cycle cost estimate, runs ‘what-if’s; • CET Reviewer – the user can refine an estimate to reflect technology re-use, building on an existing base, institutional support, and practical constraints (see the sketch below); • CDB Analyzer – provides a summary of CDB contents. • CET Users’ Guide: • ‘How to’ on installation and operation of the CET; • Guidance on building your data activity description; • Guidance on interpretation and use of CET estimates; • Background on the DSP reference model and functional areas (drop Working Paper 3). • CET Technical Description Document: • Detailed description of the estimating processes used by the CET; • Overall approach information (drop Working Paper 2); • Parameter information (drop Working Paper 4). • Opportunities for use are welcome – especially from those who submitted information initially. • CET and CDB will be sustained as needed to support users. Next scheduled full release is September 2005.
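As a rough picture of the Reviewer step described above (refining an estimate for technology re-use, an existing base, and institutional support), here is a hypothetical sketch; the multiplicative-factor approach and all numbers are assumptions for illustration, not the CET Reviewer's actual logic.

```python
# Hypothetical Reviewer-style refinement: apply user-supplied adjustment
# factors (1.0 = no change) to the Estimator's baseline figure.
def refine_estimate(baseline_fte: float, adjustments: dict[str, float]) -> float:
    refined = baseline_fte
    for reason, factor in adjustments.items():
        refined *= factor
        print(f"  {reason}: x{factor:.2f} -> {refined:.1f} FTE")
    return refined

baseline = 18.0   # hypothetical Estimator output, FTE/year
refined = refine_estimate(baseline, {
    "technology re-use": 0.90,          # hypothetical factors
    "building on existing base": 0.85,
    "added institutional support": 0.95,
})
print(f"Refined estimate: {refined:.1f} FTE/year")
```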
Today’s Demo • You’ll see a demo of the toolkit, and will have an opportunity to test drive it using your own data.