
AFCAA Database and Metrics Manual



Presentation Transcript


  1. AFCAA Database and Metrics Manual. Ray Madachy, Brad Clark, Barry Boehm, Thomas Tan; Wilson Rosa, Sponsor. USC CSSE Annual Research Review, March 8, 2011

  2. Agenda • Introduction • SRDR Overview • Review of Accepted Changes • SRDR Baseline Issues • Data Analysis • Manual Reviewers Website

  3. Project Overview • The goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing. • The project is led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School. • The Metrics Manual will present data analysis from existing final SRDR data. • Additional information is crucial for improving data quality. • Data homogeneity is important.

  4. Software Cost Databases • Purpose: to derive estimating relationships and benchmarks for size, cost, productivity and quality • Previous and current efforts to collect data from multiple projects and organizations • Data Analysis Center for Software (DACS) • Software Engineering Information Repository (SEIR) • International Software Benchmarking Standards Group (ISBSG) • Large Aerospace Mergers (attempts to create company-wide databases) • USAF Mosemann Initiative (Lloyd Mosemann, Asst. Sec. USAF) • USC CSSE COCOMO II repository • DoD Software Resources Data Report (SRDR) • All have faced common challenges such as data definitions, completeness and integrity

  5. Research Objectives • Using SRDR data, improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing. • Characterize different Application Domains and Operating Environments within DoD • Analyze collected data for simple Cost Estimating Relationships (CER) within each domain • Develop rules-of-thumb for missing data • Make collected data useful to oversight and management entities [Figure: Data Records → Data Analysis → CERs, with CERs of the form Cost = a * X^b]
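As an illustration of the CER form above, here is a minimal sketch (not from the manual) that fits Effort = a * Size^b by least squares in log-log space; the size and effort values are invented for demonstration.

```python
# Minimal sketch: fit a power-law CER, Effort = a * Size^b, in log-log space.
# The data points below are invented for illustration only.
import numpy as np

size_ksloc = np.array([5.0, 12.0, 30.0, 75.0, 150.0])    # size in KSLOC (illustrative)
effort_pm = np.array([14.0, 28.0, 60.0, 130.0, 240.0])   # effort in person-months (illustrative)

# ln(Effort) = ln(a) + b * ln(Size)  ->  ordinary least squares on the logs
b, ln_a = np.polyfit(np.log(size_ksloc), np.log(effort_pm), 1)
a = np.exp(ln_a)

print(f"CER: Effort = {a:.2f} * Size^{b:.2f}")
```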

  6. Agenda • Introduction • SRDR Overview • Review of Accepted Changes • SRDR Baseline Issues • Data Analysis • Manual Reviewers Website

  7. Software Resources Data Report The Software Resources Data Report (SRDR) is used to obtain both the estimated and actual characteristics of new software developments or upgrades. Both the Government program office and, later on after contract award, the software contractor submit this report. For contractors, this report constitutes a contract data deliverable that formalizes the reporting of software metric and resource data. All contractors developing or producing any software development element with a projected software effort greater than $20M (then-year dollars) on major contracts and subcontracts within ACAT I and ACAT IA programs, regardless of contract type, must submit SRDRs. The data collection and reporting applies to developments and upgrades whether performed under a commercial contract or internally by a government Central Design Activity (CDA) under the terms of a memorandum of understanding (MOU). Reports are mandated for Initial Government, Initial Developer, and Final Developer submissions.

  8. Submittal Process [Diagram: submittal flow via the Government Program Office] Data is accessed through the Defense Cost and Resource Center (DCARC), http://dcarc.pae.osd.mil. The DCARC's Defense Automated Cost Information Management System (DACIMS) is the database of current and historical cost and software resource data used for independent, substantiated estimates.

  9. Agenda • Introduction • SRDR Overview • Review of Accepted Changes • SRDR Baseline Issues • Data Analysis • Manual Reviewers Website

  10. Proposed SRDR Modifications Data analysis problems with project types, size and effort normalization, and multiple builds motivated some of these changes:

  11. Proposed SRDR Modifications

  12. Proposed SRDR Modifications …

  13. Proposed SRDR Modifications …

  14. Modified Final Developer Form (1/3)

  15. Modified Final Developer Form (2/3)

  16. Modified Final Developer Form (3/3)

  17. Product and Development Description Our recommendation for one operating environment and one application domain was not incorporated.

  18. Staffing Our expanded number of experience levels was adopted for Personnel Experience.

  19. Size (1/2) • Accepted these recommendations: • Requirements volatility changed from a relative scale to a percentage • Reused code with Modifications is our Modified code (report DM, CM, IM) • Reused code without Modifications is our Reused code (DM = CM = 0, report IM)

  20. Size (2/2) Our recommendation for deleted code was accepted.

  21. Resource and Schedule We recommended the breakout of QA, CM, and PM, which were previously reported under “Other”.

  22. Quality We recommended more common defect metrics: Number of Defects Discovered and Number of Defects Removed. Gone are the Mean Time to Serious or Critical Defect (MTTD) and Computed Reliability.

  23. Agenda • Introduction • SRDR Overview • Review of Accepted Changes • SRDR Baseline Issues • Data Analysis • Manual Reviewers Website

  24. SRDR Revisions • Two-year update cycle • We submitted our recommendations in Spring 2010 • Received a draft of the updated SRDR DID in Fall 2010 from the committee, reflecting our changes and more • Adoption schedule unclear • The DCARC website shows the 2007 version posted at http://dcarc.pae.osd.mil/Policy/CSDR/csdrReporting.aspx • Issues: what version(s) to cover in the manual for collection and analysis • E.g., we previously had guidance to mitigate shortcomings of the 2007 version

  25. Agenda • Introduction • SRDR Overview • Review of Accepted Changes • SRDR Baseline Issues • Data Analysis • Manual Reviewers Website

  26. SRDR Raw Data (520 observations) PM = 1.67 * KSLOC^0.66
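For reference, the raw-data relationship above can be evaluated directly; here is a small sketch using the coefficients shown on the slide.

```python
# Evaluate the raw-data CER shown above: PM = 1.67 * KSLOC^0.66
def person_months(ksloc: float) -> float:
    """Estimated person-months for a given size in thousands of SLOC."""
    return 1.67 * ksloc ** 0.66

print(round(person_months(100.0), 1))  # estimate for a 100 KSLOC project
```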

  27. Data Conditioning • Segregate data • Normalize sizing data • Map effort distribution

  28. SRDR Data Segmentation

  29. SRDR Data Fields • Used • Contractor (assigned OID) • Component description (sanitized) • Development process • Percentage personnel experience • Peak staffing • Amount of relative requirements volatility • Primary & secondary lang. • Months of duration by activity • Hours of effort by activity & comments • Lines of code: new, modified, unmodified, auto-generated • Missing • Our Application and Environment classification • Build information • Effort in Person Months • Adapted code parameters: DM, CM & IM • Equivalent KSLOC

  30. Derive Equivalent Size • Normalize the SLOC counting method to Logical SLOC • Physical SLOC counts converted to Logical SLOC counts by programming language • Non-comment SLOC counts converted to Logical SLOC counts by programming language • Convert Auto-Generated SLOC to Equivalent SLOC (ESLOC) • Use the AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3) • DM = CM = 0; IM = 100 • Convert Reused SLOC to ESLOC with the AAF formula • DM = CM = 0; IM = 50 • Convert Modified SLOC to ESLOC • Use the AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3) • Default values: Low – Mean – High based on a 90% confidence interval • Create the Equivalent SLOC count and scale to thousands (K) to derive EKSLOC • (New + Auto-Gen + Reused + Modified) / 1000 = EKSLOC • Remove all records with EKSLOC below 1.0
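The size-normalization steps can be captured in a short function. This is a sketch of our reading of the slide: the AAF weights and the auto-generated (IM = 100%) and reused (IM = 50%) defaults come from the bullets above, while the function and parameter names are ours.

```python
# Sketch of the equivalent-size calculation described above.
# AAF = 0.4*DM + 0.3*CM + 0.3*IM, with DM/CM/IM expressed as fractions (0-1).

def aaf(dm: float, cm: float, im: float) -> float:
    """Adaptation Adjustment Factor."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def eksloc(new_sloc: float, auto_gen_sloc: float, reused_sloc: float,
           modified_sloc: float, dm: float, cm: float, im: float) -> float:
    """Equivalent KSLOC using the slide's defaults:
    auto-generated code: DM = CM = 0, IM = 100%; reused code: DM = CM = 0, IM = 50%."""
    esloc = (new_sloc
             + auto_gen_sloc * aaf(0.0, 0.0, 1.0)
             + reused_sloc * aaf(0.0, 0.0, 0.5)
             + modified_sloc * aaf(dm, cm, im))
    return esloc / 1000.0  # scale to thousands

# Example record; per the slide, records below 1.0 EKSLOC would be dropped.
e = eksloc(new_sloc=20000, auto_gen_sloc=5000, reused_sloc=10000,
           modified_sloc=8000, dm=0.2, cm=0.3, im=0.5)
print(f"EKSLOC = {e:.2f}")
```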

  31. Convert Modified Size to ESLOC Use the AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3). There are problems with missing DM, CM & IM values in the SRDR data. Program interviews provided the parameters for some records. For the remaining records, use the records that have data in all fields to derive recommended values for the missing data.
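A sketch of that last step, assuming a simple mean of the complete records is used as the recommended value (the slide does not name the statistic); the records and field names are illustrative.

```python
# Sketch: derive fill-in DM/CM/IM values from records that report all three.
# The records below are invented; the slide does not state which statistic to use,
# so a simple mean is shown here.
records = [
    {"dm": 0.25, "cm": 0.30, "im": 0.60},
    {"dm": 0.15, "cm": 0.20, "im": 0.50},
    {"dm": None, "cm": 0.10, "im": 0.40},   # incomplete record -> excluded
]

complete = [r for r in records if all(r[k] is not None for k in ("dm", "cm", "im"))]
defaults = {k: sum(r[k] for r in complete) / len(complete) for k in ("dm", "cm", "im")}
print(defaults)  # recommended values for records missing DM/CM/IM
```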

  32. Map Effort Distribution We currently do not use Software Requirements and Developmental Test hours. • Labor hours are reported for 7 activities: • Software Requirements • Software Architecture (including Detailed Design) • Software Code (including Unit Testing) • Software Integration and Test • Software Qualification Test • Software Developmental Test & Evaluation • Other (Mgt, QA, CM, PI, etc.) • Create effort distribution percentages for records that have hours in the requirements, architecture, code, integration, and qualification test phases (developmental test & evaluation and other phases may or may not be blank) • Using the activity distribution to backfill missing activities makes results worse
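A sketch of how the distribution percentages might be computed for qualifying records; the activity field names are ours, and only the five core activities listed above are included.

```python
# Sketch: effort-distribution percentages over the five core activities.
# A record qualifies only if it reports hours in all five; field names are illustrative.
CORE = ["requirements", "architecture", "code", "integration_test", "qualification_test"]

def effort_distribution(hours: dict):
    """Return the percentage of effort per core activity, or None if the record
    is missing hours for any core activity."""
    if any(hours.get(activity) in (None, 0) for activity in CORE):
        return None
    total = sum(hours[activity] for activity in CORE)
    return {activity: 100.0 * hours[activity] / total for activity in CORE}

record = {"requirements": 400, "architecture": 900, "code": 1800,
          "integration_test": 700, "qualification_test": 300, "other": 250}
print(effort_distribution(record))
```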

  33. Team Experience • SRDR Data Definition • Report the percentage of project personnel in each category • Highly Experienced in the domain (three or more years of experience) • Nominally Experienced in the project domain (one to three years of experience) • Entry-level Experienced (zero to one year of experience) • Need to include Team Experience (TXP) in CERs to estimate cost • After analyzing the data, the following quantitative values are assigned: • Highly experienced: 0.60 • Nominally experienced: 1.00 • Entry-level experienced: 1.30
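A sketch of rolling the three reported percentages up into a single Team Experience (TXP) multiplier; the rating values come from the slide, while the percentage-weighted-average form is our assumption.

```python
# Sketch: a team-experience (TXP) multiplier as a percentage-weighted average
# of the slide's ratings. The weighted-average form is an assumption.
RATINGS = {"high": 0.60, "nominal": 1.00, "entry": 1.30}

def team_experience(pct_high: float, pct_nominal: float, pct_entry: float) -> float:
    """Percentages of project personnel in each category; they should sum to 100."""
    total = pct_high + pct_nominal + pct_entry
    return (pct_high * RATINGS["high"]
            + pct_nominal * RATINGS["nominal"]
            + pct_entry * RATINGS["entry"]) / total

print(team_experience(50, 40, 10))  # e.g. 50% highly, 40% nominally, 10% entry-level experienced
```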

  34. Analysis Steps

  35. Derive EKSLOC

  36. Size vs. Effort Effort for small software configuration items (CIs) appears high; for larger CIs, it appears more normal.

  37. Productivity Conundrum • We can confirm the unusually high effort by comparing productivity to size. • We see that productivity is lower for smaller sizes (5 to 40 EKSLOC) than for larger sizes. • This is counter to what we believe to be true: productivity should be lower for larger software CIs (all other factors being held constant).

  38. Large Project CER • We can also see that modeling this domain above 50 EKSLOC produces a model showing that more effort is required for larger CIs. This is what we expect. • This model can be used in analysis to help quantify the amount of “extra” effort in projects below 50 EKSLOC.
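A sketch of fitting the large-project model on the subset above 50 EKSLOC, again by least squares in log-log space; all data values are invented.

```python
# Sketch: fit a CER only on records above 50 EKSLOC (values invented for illustration).
import numpy as np

eksloc = np.array([8, 20, 35, 60, 90, 140, 220], dtype=float)   # equivalent size
pm = np.array([60, 90, 120, 180, 260, 380, 560], dtype=float)   # effort in person-months

large = eksloc > 50.0                                            # large-project subset
b, ln_a = np.polyfit(np.log(eksloc[large]), np.log(pm[large]), 1)
a = np.exp(ln_a)
print(f"Large-project CER: PM = {a:.2f} * EKSLOC^{b:.2f}")
```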

  39. Overhead Function • If we use the model from the previous slide on the data below 50 EKSLOC, we can express the difference between “what we would expect” (the model) and “what we observe” (the data) as a function of size. • Yet another model can be created to express this decreasing difference with increasing size. • Call this model the “Overhead Function”.
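Continuing the sketch: the overhead for each small record is the observed effort minus the large-project prediction, and that gap can itself be fit as a decreasing function of size. A power-law form is assumed here since the slide does not name one; the coefficients and data are invented.

```python
# Sketch: fit an "overhead function" on records below 50 EKSLOC.
# The large-project CER coefficients and all data values are invented.
import numpy as np

a, b = 2.8, 0.80                                   # hypothetical large-project CER: PM = a * EKSLOC^b
small_eksloc = np.array([3.0, 6.0, 10.0, 20.0, 40.0])
observed_pm = np.array([60.0, 58.0, 56.0, 60.0, 70.0])

overhead = observed_pm - a * small_eksloc ** b     # "what we observe" minus "what we would expect"

# Model the decreasing overhead as a power law of size (functional form assumed).
d, ln_c = np.polyfit(np.log(small_eksloc), np.log(overhead), 1)
c = np.exp(ln_c)
print(f"Overhead(EKSLOC) = {c:.1f} * EKSLOC^{d:.2f}")  # the exponent d comes out negative here
```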

  40. Derive Effort

  41. Communications Domain After applying the overhead function to the observed effort (by subtracting the overhead function effort from the observed effort), we get an overall Cost Estimating Relationship that seems reasonable.
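The final step as a sketch: subtract the overhead-function value from each observed effort below 50 EKSLOC, then refit the domain CER on the adjusted data. All data values and the overhead coefficients are invented for illustration.

```python
# Sketch: adjust observed effort by the overhead function, then refit the domain CER.
# The overhead-function coefficients and data values are invented for illustration.
import numpy as np

def overhead(eksloc):
    return 80.0 * eksloc ** -0.45           # hypothetical overhead function (person-months)

eksloc = np.array([3.0, 6.0, 10.0, 20.0, 40.0, 60.0, 90.0, 140.0])
observed_pm = np.array([60.0, 58.0, 56.0, 60.0, 70.0, 180.0, 260.0, 380.0])

# Only records below 50 EKSLOC carry the "extra" overhead effort.
adjusted_pm = np.where(eksloc < 50.0, observed_pm - overhead(eksloc), observed_pm)

b, ln_a = np.polyfit(np.log(eksloc), np.log(adjusted_pm), 1)
print(f"Adjusted CER: PM = {np.exp(ln_a):.2f} * EKSLOC^{b:.2f}")
```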

  42. Command & Control Domain

  43. Agenda • Introduction • SRDR Overview • Review of Accepted Changes • SRDR Baseline Issues • Data Analysis • Manual Reviewers Website

  44. Manual Reviewers Website http://csse.usc.edu/afcaa/manual_draft/ Review and Discussion

  45. Questions? For more information, contact: Ray Madachy (rjmadach@nps.edu), Brad Clark (bkclark@csse.usc.edu), or Wilson Rosa (Wilson.Rosa@pentagon.af.mil)
