
Next-Generation Software Sizing and Costing Metrics Workshop Report

Presentation Transcript


  1. Next-Generation Software Sizing and Costing Metrics Workshop Report Wilson Rosa, Barry Boehm, Ray Madachy, Brad Clark USC CSSE Annual Research Review March 9, 2010

  2. Topics • AFCAA Study Overview • Current and Future Challenges for Software Cost Estimation and Data Collection • Proposed Metrics Definition Highlights • Productivity Data Analysis and Issues for Discussion This work is sponsored by the Air Force Cost Analysis Agency USC CSSE Annual Research Review - Mar 2010

  3. Project Background • Goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing. • Project led by the Air Force Cost Analysis Agency (AFCAA) working with service cost agencies, and assisted by the University of Southern California and the Naval Postgraduate School • We will publish the AFCAA Software Cost Estimation Metrics Manual to help analysts and decision makers develop accurate software cost estimates quickly and easily for avionics, space, ground, and shipboard platforms. USC CSSE Annual Research Review - Mar 2010

  4. AFCAA Software Cost Estimation Metrics Manual • Chapter 1: Software Estimation Principles • Chapter 2: Product Sizing • Chapter 3: Product Growth • Chapter 4: Effective SLOC • Chapter 5: Historical Productivity • Chapter 6: Model Calibration • Chapter 7: Calibrated SLIM-ESTIMATE • Chapter 8: Cost Risk and Uncertainty Metrics • Chapter 9: Data Normalization • Chapter 10: Software Resource Data Report • Chapter 11: Software Maintenance • Chapter 12: Lessons Learned USC CSSE Annual Research Review - Mar 2010

  5. Stakeholder Communities • Research is collaborative across heterogeneous stakeholder communities who have helped us refine our data definition framework and domain taxonomy and have provided project data. • Government agencies • Tool Vendors • Industry • Academia SLIM-Estimate™ TruePlanning® by PRICE Systems USC CSSE Annual Research Review - Mar 2010

  6. Current and Future DoD Cost Estimation Challenges • Emergent requirements: cannot prespecify requirements, cost, schedule, EVMS; need to estimate and track early concurrent engineering • Rapid change: long acquisition cycles breed obsolescence; DoD Inst 5000.02 emphasis on evolutionary acquisition • Net-centric systems of systems: incomplete visibility and control of elements • Model, COTS, service-based, Brownfield systems: new phenomenology, counting rules • Always-on, never-fail systems: need to balance agility and high assurance USC CSSE Annual Research Review - Mar 2010

  7. Rapid Change Creates a Late Cone of Uncertainty • Need evolutionary/incremental vs. one-shot development • Uncertainties in competition, technology, organizations, mission priorities USC CSSE Annual Research Review - Mar 2010

  8. Incremental Development Productivity Decline (IDPD) • Example: Site Defense BMD Software • 5 builds, 7 years, $100M; operational and support software • Build 1 productivity over 300 LOC/person-month • Build 5 productivity under 150 LOC/PM • Including Build 1-4 breakage, integration, rework • 318% change in requirements across all builds • IDPD factor = 20% productivity decrease per build • Similar trends in later unprecedented systems • Not unique to DoD: key source of Windows Vista delays • Maintenance of full non-COTS SLOC, not ESLOC • Build 1: 200 KSLOC new; 200 KSLOC reused @ 20% = 240 KSLOC ESLOC • Build 2: 400 KSLOC of Build 1 software to maintain, integrate USC CSSE Annual Research Review - Mar 2010
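The ESLOC arithmetic and the per-build productivity decline on this slide can be sketched in a few lines. This is illustrative only: the 20% reuse weight and the 20% IDPD factor come from the slide, while the function names are ours.

```python
def esloc(new_ksloc, reused_ksloc, reuse_weight=0.20):
    """Equivalent SLOC: reused code counts at a fraction of new code."""
    return new_ksloc + reuse_weight * reused_ksloc

def build_productivity(initial_productivity, build, idpd=0.20):
    """Productivity after successive builds, declining by the IDPD factor each build."""
    return initial_productivity * (1 - idpd) ** (build - 1)

# Build 1 of the Site Defense example: 200 KSLOC new + 200 KSLOC reused @ 20%
print(esloc(200, 200))                    # 240 KSLOC equivalent
# Build 5 productivity from a 300 LOC/PM start with 20% IDPD per build
print(round(build_productivity(300, 5)))  # ~123 LOC/PM, consistent with "under 150"
```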

  9. SRDR Data Source USC CSSE Annual Research Review - Mar 2010

  10. Proposed Metrics Definition Highlights • Data quality and standardization issues • No reporting of equivalent “new” code size inputs: Design Modified, Code Modified, Integration Testing Modified, Software Understandability, Programmer Unfamiliarity • No common SLOC reporting – logical, physical, etc. • No standard definitions – Application Domain, Build, Increment, Spiral,… • No common effort reporting – analysis, design, code, test, CM, QA,… • No reporting of quality measures – defect density, defect containment, etc. USC CSSE Annual Research Review - Mar 2010

  11. Proposed Metrics Definition Highlights • Limited empirical research within DoD on other contributors to productivity besides effort and size: • Operating Environment, Application Domain, and Product Complexity • Personnel Capability • Required Reliability • Quality – Defect Density, Defect Containment • Integrating code from previous deliveries – Builds, Spirals, Increments, etc. • Converting to Equivalent SLOC • Reported code sizes for Modified, Unmodified/Reused add no value to a cost estimate unless they translate into equivalent “new” SLOC • This research and the resulting Cost Metrics Manual will discuss and address these issues USC CSSE Annual Research Review - Mar 2010

  12. SRDR Data • Missing domains: Internet, Maintenance and Diagnostics, Spacecraft bus • Note: SRDR = Software Resources Data Report USC CSSE Annual Research Review - Mar 2010

  13. Data Analysis Issues • Preliminary Results (Do Not Use!) • PM = 15 × (EKSLOC)^0.62 USC CSSE Annual Research Review - Mar 2010
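The preliminary cost estimating relationship on this slide is a simple power law in equivalent KSLOC. A minimal sketch follows; the coefficients 15 and 0.62 are the slide's, the function name is ours, and the slide itself flags these results as not for use:

```python
def effort_pm(eksloc, a=15.0, b=0.62):
    """Preliminary power-law CER from the slide: PM = 15 * (EKSLOC)^0.62.
    Marked 'Do Not Use!' in the source; shown for illustration only."""
    return a * eksloc ** b

print(effort_pm(100))  # ~260.7 person-months for 100 EKSLOC
```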

  14. Sizing Issues -1 • NCSS to Logical SLOC conversion ratios: Ada 45%, C/C++ 61%, C# 61%, Java 72% USC CSSE Annual Research Review - Mar 2010
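The conversion ratios above can be applied as simple per-language multipliers. A sketch, treating the slide's percentages as rough, language-dependent factors (table and function names are ours):

```python
# Approximate non-commented source statement (NCSS) to logical SLOC ratios
# from the slide; rough conversion factors, not exact counting rules.
NCSS_TO_LOGICAL = {"Ada": 0.45, "C/C++": 0.61, "C#": 0.61, "Java": 0.72}

def logical_sloc(ncss, language):
    """Convert an NCSS count to an estimated logical SLOC count."""
    return ncss * NCSS_TO_LOGICAL[language]

print(logical_sloc(10_000, "Java"))  # ~7200 logical SLOC
```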

  15. Sizing Issues -2 • No Modified Code parameters • Percent Design Modified (DM) • Percent Code Modified (CM) • Percent Integration and Test Modified (IM) • Software Understanding (SU) • Programmer Unfamiliarity (UNFM) • Program interviews provided parameters for some records USC CSSE Annual Research Review - Mar 2010
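The COCOMO II reuse model combines the DM, CM, and IM percentages above into an adaptation adjustment factor, AAF = 0.4*DM + 0.3*CM + 0.3*IM. A minimal sketch (function names are ours; SU, UNFM, and the assessment-and-assimilation terms of the full model are omitted for brevity):

```python
def aaf(dm, cm, im):
    """COCOMO II adaptation adjustment factor (percent),
    from Percent Design/Code/Integration Modified, each in percent."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def equivalent_sloc(adapted_sloc, dm, cm, im):
    """Simplified equivalent-new SLOC; ignores SU, UNFM, and AA terms."""
    return adapted_sloc * aaf(dm, cm, im) / 100.0

# 100 KSLOC of adapted code with 20% design, 30% code, 50% I&T modified
print(equivalent_sloc(100_000, 20, 30, 50))  # 32% AAF -> ~32,000 ESLOC
```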

  16. Effort Issues • Missing effort reporting for different lifecycle phases • Software requirements analysis (REQ) • Software architectural design (ARCH) • Software coding and testing (CODE) • Software integration (INT) • Software qualification testing (QT) • Software management, CM, QA, etc. (Other – very inconsistent) USC CSSE Annual Research Review - Mar 2010

  17. Application Domain Issues USC CSSE Annual Research Review - Mar 2010

  18. Proposed SRDR Changes -1 USC CSSE Annual Research Review - Mar 2010

  19. Proposed SRDR Changes -2 USC CSSE Annual Research Review - Mar 2010

  20. Concluding Remarks • Goal is to publish a manual to help analysts develop quick software estimates using empirical metrics from recent programs • Additional information is crucial for improving data quality across DoD • We want your input on Productivity Domains and Data Definitions • Looking for collaborators • Looking for peer-reviewers • Need more data USC CSSE Annual Research Review - Mar 2010
