Reducing Estimation Uncertainty with Continuous Assessment: Tracking the "Cone of Uncertainty"
Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm
{aroonvat, sinthop, boehm} @usc.edu
November 2, 2010
Outline
• Introduction and Motivation
• Framework Model
• Experiment
• Results
• Conclusion and Future Work
© USC-CSSE
The Cone of Uncertainty
• Also applies to project estimation accuracy
• [Chart: estimation uncertainty cones for inexperienced vs. experienced teams]
Definition
• Inexperience
  • Inexperienced in general
  • Experienced, but in a new domain
  • Anything new, with little knowledge or experience
The Problem
• Experienced teams can produce better estimates
  • Use "yesterday's weather"
  • Past projects of comparable size
  • Past data on the team's productivity
  • Knowledge of accumulated problems and solutions
• Inexperienced teams do not have this luxury
• No tools or data exist to monitor a project's progression within the cone of uncertainty
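For illustration, the "yesterday's weather" heuristic that experienced teams rely on can be sketched as below. This is a hypothetical sketch; the function name and the sample sizes and efforts are invented, not taken from the presentation.

```python
# "Yesterday's weather": estimate a new project's effort from the team's
# measured productivity on past projects of comparable size.
# All numbers below are illustrative only.

def yesterdays_weather_estimate(past_sizes_sloc, past_efforts_pm, new_size_sloc):
    """Estimate effort (person-months) for a new project using the team's
    average productivity (SLOC per person-month) on past projects."""
    productivity = sum(past_sizes_sloc) / sum(past_efforts_pm)
    return new_size_sloc / productivity

estimate = yesterdays_weather_estimate(
    past_sizes_sloc=[12000, 15000, 10000],  # sizes of three past projects
    past_efforts_pm=[24, 30, 20],           # efforts spent on those projects
    new_size_sloc=13000,                    # expected size of the new project
)
# → 26.0 person-months (productivity is 500 SLOC per person-month)
```

An inexperienced team has no such historical data to feed this calculation, which is exactly the gap the framework targets.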
The Problem
• Imprecise project scoping: overestimating vs. underestimating
• Manual assessments are tedious, complex, and discouraging
• Project estimates are not revisited
  • Insufficient data to perform predictions
  • Project uncertainties are not adjusted
• Limitations in software cost estimation: models cannot fully compensate for a lack of knowledge and understanding
The Goal
• Develop a framework that addresses these issues
  • Helps unprecedented projects track their progression
  • Reduces the uncertainties in estimation
  • Achieves eventual convergence of estimates and actuals
• Must be quick and easy to use
Benefits
• Improved project planning and management of resources and goals
• Improved product quality control
• Actual project progress tracking
  • Better understanding of project status
  • Reports based on actual progress
Estimation Model
• Integration of the Unified Code Count tool and the COCOMO II estimation model
• Size estimates adjusted with REVL (requirements evolution and volatility)
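The integration above can be sketched with the published COCOMO II.2000 effort equation, where counted size is inflated by REVL before the exponent and multipliers are applied. This is a minimal sketch: the calibration constants A and B are the standard COCOMO II.2000 values, but the sample inputs (20 KSLOC, 10% REVL, nominal scale factors) are illustrative, not the experiment's data.

```python
# COCOMO II effort equation with REVL-adjusted size:
#   PM = A * Size^E * product(EM),  E = B + 0.01 * sum(SF)
# A and B are the COCOMO II.2000 calibration constants.

A, B = 2.94, 0.91

def cocomo_effort_pm(size_ksloc, revl_pct, scale_factors_sum, effort_multipliers_product):
    """Effort in person-months, with size inflated by REVL (percent)."""
    adjusted_size = size_ksloc * (1 + revl_pct / 100.0)
    exponent = B + 0.01 * scale_factors_sum
    return A * adjusted_size ** exponent * effort_multipliers_product

# e.g. 20 KSLOC counted by the code counter, 10% REVL,
# nominal scale factor sum and nominal effort multipliers
effort = cocomo_effort_pm(20, 10, scale_factors_sum=18.97,
                          effort_multipliers_product=1.0)
```

Re-running this computation at each assessment, with freshly counted size and an updated REVL, is what lets the estimate track the project as uncertainty shrinks.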
Experiment Setup
• Performed a simulation on 2 projects from the USC software engineering course
• Project similarities
  • Real client
  • SAIV: 24 weeks
  • Architected agile process, 8-member teams
  • Similar size, type, and complexity
• Products
  • E-services
  • Web content management systems
  • JSP, MySQL, Tomcat
Obtaining Data
• Source code files retrieved from the Subversion server
• Assessments simulated weekly
• Both teams were closely involved
  • Provided estimates of module completion, with rationale
Results
• [Charts for the two projects, each showing the initial estimate (annotated ~18% and ~50%), the adjusted estimate, and the accumulated effort over time]
Results
• Project progress reaches 100%, reflecting reality
• Estimation errors are reduced to 0%
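The convergence described above can be illustrated numerically. The weekly figures below are hypothetical, not the experiment's data; they only show how the relative error between the adjusted estimate and the actual effort shrinks to zero as the project completes.

```python
# Relative estimation error between an adjusted estimate and the actual
# effort at completion; 0.0 means the estimate matches the actual.

def estimation_error(estimate, actual_at_completion):
    return abs(estimate - actual_at_completion) / actual_at_completion

final_actual_pm = 160                    # hypothetical actual effort at completion
weekly_estimates = [100, 120, 150, 160]  # adjusted estimate after each assessment
errors = [estimation_error(e, final_actual_pm) for e in weekly_estimates]
# → [0.375, 0.25, 0.0625, 0.0]: the error narrows each week,
#   mirroring the closing cone of uncertainty
```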
Conclusion
• Both teams demonstrated the same phenomenon
  • Gaps in estimation error decrease, mirroring the "cone of uncertainty"
• The estimation framework reflects the reality of the project's progress
• The assessment process was quick and simple
  • Requires few inputs and little analysis
• The assessment framework helps inexperienced teams improve project tracking and estimation
Future Work
• Tool development currently in progress
• Determine the frequency of assessments required
  • Find the "sweet spot"
  • Observe prediction accuracy
• Experiment on larger-scale projects
• Experiment on projects of different types
• Apply value-based concepts
  • Weight each software module in the calculation by its priority and criticality
  • Determine how to adjust COCOMO parameters
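The value-based idea in the future work above can be sketched as a weighted average. This is a hedged sketch of the direction, not a defined method: the weights and completion figures are invented, and how weights would actually be assigned is exactly the open question the slide raises.

```python
# Value-based progress: weight each module's completion by its
# priority/criticality rather than treating all modules equally.
# Weights and completion fractions below are illustrative only.

def value_weighted_progress(completions, weights):
    """Weighted completion: sum(w_i * c_i) / sum(w_i)."""
    return sum(w * c for c, w in zip(completions, weights)) / sum(weights)

# a critical module at 90% outweighs two low-priority modules at 20%
progress = value_weighted_progress([0.9, 0.2, 0.2], [5, 1, 1])
# → ~0.7, versus ~0.43 if all modules counted equally
```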
References
• Boehm, B., Abts, C., Brown, A. W., Chulani, S., Clark, B. K., Horowitz, E., Madachy, R., Reifer, D. J., and Steece, B. Software Cost Estimation with COCOMO II, Prentice-Hall, 2000.
• Cohn, M. Agile Estimating and Planning, Prentice-Hall, 2005.
• DeMarco, T. Controlling Software Projects: Management, Measurement, and Estimation, Yourdon Press, 1982.
• Fleming, Q. W. and Koppelman, J. M. Earned Value Project Management, 2nd edition, Project Management Institute, 2000.
• Jorgensen, M. and Boehm, B. "Software Development Effort Estimation: Formal Models or Expert Judgment?" IEEE Software, March-April 2009, pp. 14-19.
• Nguyen, V., Deeds-Rubin, S., Tan, T., and Boehm, B. "A SLOC Counting Standard," COCOMO II Forum, 2007.
• Stutzke, R. D. Estimating Software-Intensive Systems, Pearson Education, 2005.
Backup Slides
Related Work
• Software estimation methods
  • Estimating Software-Intensive Systems [Stutzke, 2005]
  • Expert judgment vs. parametric models [Jorgensen, 2009]
  • Agile estimation [Cohn, 2005]
• Software estimation uncertainty
  • PERT sizing methods [Nguyen, 2007]
  • Wideband Delphi estimate distributions [Boehm, 2000]
• Software project tracking methods
  • Controlling Software Projects [DeMarco, 1982]
  • Earned Value Management [Fleming, 2000]