AFCAA Database and Metrics Manual
Ray Madachy, Brad Clark, Barry Boehm, Thomas Tan
Wilson Rosa, Sponsor
USC CSSE Annual Research Review, March 8, 2011
Agenda
• Introduction
• SRDR Overview
• Review of Accepted Changes
• SRDR Baseline Issues
• Data Analysis
• Manual Reviewers Website
Project Overview
• Goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
• Project is led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School.
• The Metrics Manual will present data analysis of existing final SRDR data.
• Additional information is crucial for improving data quality.
• Data homogeneity is important.
Software Cost Databases
• Purpose: to derive estimating relationships and benchmarks for size, cost, productivity, and quality.
• Previous and current efforts to collect data from multiple projects and organizations:
  • Data Analysis Center for Software (DACS)
  • Software Engineering Information Repository (SEIR)
  • International Software Benchmarking Standards Group (ISBSG)
  • Large aerospace mergers (attempts to create company-wide databases)
  • USAF Mosemann Initiative (Lloyd Mosemann, Asst. Sec. USAF)
  • USC CSSE COCOMO II repository
  • DoD Software Resources Data Report (SRDR)
• All have faced common challenges such as data definitions, completeness, and integrity.
Research Objectives
• Using SRDR data, improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
• Characterize the different Application Domains and Operating Environments within DoD.
• Analyze the collected data for simple Cost Estimating Relationships (CERs) within each domain.
• Develop rules of thumb for missing data.
• Make the collected data useful to oversight and management entities.
[Diagram: Data Records → Data Analysis (Cost = a * X^b) → CERs]
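A CER of this form can be calibrated by ordinary least squares in log-log space. The sketch below is a minimal illustration under that assumption; the function name and the sample size/effort values are hypothetical, not SRDR data.

import numpy as np

def fit_power_law_cer(size_ksloc, effort_pm):
    """Fit Effort = a * Size^b by least squares in log-log space."""
    x = np.log(np.asarray(size_ksloc, dtype=float))
    y = np.log(np.asarray(effort_pm, dtype=float))
    b, log_a = np.polyfit(x, y, 1)   # slope is b, intercept is ln(a)
    return np.exp(log_a), b

# Hypothetical (KSLOC, person-month) pairs, not actual SRDR records:
sizes = [12, 25, 40, 80, 150, 300]
efforts = [9, 18, 35, 60, 120, 210]
a, b = fit_power_law_cer(sizes, efforts)
print("CER: PM = %.2f * KSLOC^%.2f" % (a, b))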
SRDR Overview
Software Resources Data Report
The Software Resources Data Report (SRDR) is used to obtain both the estimated and actual characteristics of new software developments or upgrades. Both the Government program office and, after contract award, the software contractor submit this report. For contractors, the report constitutes a contract data deliverable that formalizes the reporting of software metric and resource data. All contractors developing or producing any software development element with a projected software effort greater than $20M (then-year dollars) on major contracts and subcontracts within ACAT I and ACAT IA programs, regardless of contract type, must submit SRDRs. The data collection and reporting apply to developments and upgrades whether performed under a commercial contract or internally by a government Central Design Activity (CDA) under the terms of a memorandum of understanding (MOU). Reports are mandated at three points: Initial Government, Initial Developer, and Final Developer.
Submittal Process
[Diagram: submittal flow through the Government Program Office]
Data is accessed through the Defense Cost and Resource Center (DCARC), http://dcarc.pae.osd.mil. The DCARC's Defense Automated Cost Information Management System (DACIMS) is the database of current and historical cost and software resource data used for independent, substantiated estimates.
Review of Accepted Changes
Proposed SRDR Modifications
Data analysis problems with project types, size and effort normalization, and multiple builds motivated some of these proposed modifications.
Modified Final Developer Form (1/3)
Modified Final Developer Form (2/3)
Modified Final Developer Form (3/3)
Product and Development Description
Our recommendation to report one operating environment type and one application domain was not incorporated.
Staffing
Our expanded number of experience levels was adopted for Personnel Experience.
Size (1/2)
Accepted recommendations:
• Requirements volatility changed from a relative scale to a percentage
• Reused code with modifications is our Modified code (report DM, CM, IM)
• Reused code without modifications is our Reused code (DM = CM = 0, report IM)
Size (2/2)
Our recommendation for deleted code was accepted.
Resource and Schedule
We recommended the breakout of QA, CM, and PM, which were previously reported under "Other".
Quality
We recommended more common defect metrics: Number of Defects Discovered and Number of Defects Removed. The Mean Time to Serious or Critical Defect (MTTD) and Computed Reliability measures were dropped.
SRDR Baseline Issues
SRDR Revisions
• Two-year update cycle.
• We submitted our recommendations in Spring 2010.
• Received a draft of the updated SRDR DID in Fall 2010 from the committee, reflecting our changes and more.
• Adoption schedule is unclear: the DCARC website still shows the 2007 version posted at http://dcarc.pae.osd.mil/Policy/CSDR/csdrReporting.aspx.
• Issue: which version(s) to cover in the manual for collection and analysis (e.g., we previously had guidance to mitigate shortcomings of the 2007 version).
Data Analysis
SRDR Raw Data (520 observations)
PM = 1.67 * KSLOC^0.66
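Read as a rough rule of thumb, the fit can be applied directly to a size estimate. A minimal illustration for a hypothetical 100 KSLOC component, using the raw data fit before any conditioning:

# Applying the raw-data fit PM = 1.67 * KSLOC^0.66 to a hypothetical 100 KSLOC component.
effort_pm = 1.67 * 100 ** 0.66
print(round(effort_pm, 1))   # roughly 35 person-months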
Data Conditioning
• Segregate the data
• Normalize the sizing data
• Map the effort distribution
SRDR Data Segmentation
SRDR Data Fields
Used:
• Contractor (assigned OID)
• Component description (sanitized)
• Development process
• Percentage of personnel experience
• Peak staffing
• Amount of relative requirements volatility
• Primary and secondary language
• Months of duration by activity
• Hours of effort by activity, and comments
• Lines of code: new, modified, unmodified, auto-generated
Missing:
• Our Application and Environment classification
• Build information
• Effort in person-months
• Adapted code parameters: DM, CM, and IM
• Equivalent KSLOC
Derive Equivalent Size
• Normalize the SLOC counting method to logical SLOC:
  • Physical SLOC counts are converted to logical SLOC counts by programming language
  • Non-comment SLOC counts are converted to logical SLOC counts by programming language
• Convert Auto-Generated SLOC to Equivalent SLOC (ESLOC) using the AAF formula (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3), with DM = CM = 0 and IM = 100
• Convert Reused SLOC to ESLOC with the AAF formula, with DM = CM = 0 and IM = 50
• Convert Modified SLOC to ESLOC with the AAF formula (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3); default values (Low – Mean – High) are based on a 90% confidence interval
• Create the Equivalent SLOC count and scale to thousands (K) to derive EKSLOC: (New + equivalent Auto-Generated + equivalent Reused + equivalent Modified) / 1000 = EKSLOC
• Remove all records with an EKSLOC below 1.0
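A minimal sketch of this normalization, assuming logical SLOC counts per category and the AAF weights and IM defaults listed above; the function names and sample counts are illustrative:

def aaf(dm_pct, cm_pct, im_pct):
    """Adaptation Adjustment Factor; DM, CM, IM given as percentages."""
    return (dm_pct * 0.4 + cm_pct * 0.3 + im_pct * 0.3) / 100.0

def equivalent_ksloc(new, auto_gen, reused, modified, dm_pct, cm_pct, im_pct):
    esloc = (new
             + auto_gen * aaf(0, 0, 100)    # auto-generated: DM = CM = 0, IM = 100
             + reused * aaf(0, 0, 50)       # reused: DM = CM = 0, IM = 50
             + modified * aaf(dm_pct, cm_pct, im_pct))
    return esloc / 1000.0                   # scale to thousands (EKSLOC)

# Hypothetical logical SLOC counts and adaptation percentages:
print(equivalent_ksloc(new=20000, auto_gen=5000, reused=10000,
                       modified=8000, dm_pct=25, cm_pct=40, im_pct=100))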
Convert Modified Size to ESLOC
• Use the AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
• Problem: DM, CM, and IM are missing in much of the SRDR data
• Program interviews provided the parameters for some records
• For missing data, use the records that have values in all fields to derive recommended default values
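One way to derive those recommended defaults is sketched below, assuming the records are simple dictionaries with dm, cm, and im fields; using the 5th and 95th percentiles to form the Low/High bounds is an assumption, not necessarily the manual's exact method:

import numpy as np

def default_adaptation_values(records):
    """Derive Low / Mean / High defaults for DM, CM, IM from complete records."""
    complete = [r for r in records
                if all(r.get(k) is not None for k in ("dm", "cm", "im"))]
    defaults = {}
    for key in ("dm", "cm", "im"):
        vals = np.array([r[key] for r in complete], dtype=float)
        defaults[key] = {"low": np.percentile(vals, 5),    # one way to span a 90% interval
                         "mean": vals.mean(),
                         "high": np.percentile(vals, 95)}
    return defaults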
Map Effort Distribution
• Labor hours are reported for 7 activities:
  • Software Requirements
  • Software Architecture (including Detailed Design)
  • Software Code (including Unit Testing)
  • Software Integration and Test
  • Software Qualification Test
  • Software Developmental Test & Evaluation
  • Other (Mgt, QA, CM, PI, etc.)
• Software Requirements and Developmental Test hours are currently not used.
• Create effort distribution percentages for records that have hours in the requirements, architecture, code, integration, and qualification test phases (Developmental Test & Evaluation and Other may or may not be blank).
• Using the activity distribution to backfill missing activities makes the results worse.
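A minimal sketch of the distribution step, assuming per-activity hours are available as fields on each record (the field names are illustrative); it keeps only records that report hours in all five used activities:

USED_ACTIVITIES = ["requirements", "architecture", "code",
                   "integration_test", "qualification_test"]

def effort_distribution(record):
    """Return per-activity effort percentages, or None if any activity hours are missing."""
    hours = {a: record.get(a) for a in USED_ACTIVITIES}
    if any(h is None or h <= 0 for h in hours.values()):
        return None
    total = sum(hours.values())
    return {a: 100.0 * h / total for a, h in hours.items()}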
Team Experience
• SRDR data definition: report the percentage of project personnel in each category:
  • Highly Experienced in the domain (three or more years of experience)
  • Nominally Experienced in the project domain (one to three years of experience)
  • Entry-Level Experienced (zero to one year of experience)
• Need to include Team Experience (TXP) in the CERs to estimate cost.
• After analyzing the data, the following quantitative values are assigned:
  • Highly experienced: 0.60
  • Nominally experienced: 1.00
  • Entry-level experienced: 1.30
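A simple way to fold these values into a single team multiplier is a percentage-weighted average, as sketched below; the weighting scheme is an assumption and may differ from the manual's treatment:

TXP_VALUES = {"high": 0.60, "nominal": 1.00, "entry": 1.30}

def team_experience_multiplier(pct_high, pct_nominal, pct_entry):
    """Weighted-average team experience (TXP) multiplier from reported percentages."""
    total = pct_high + pct_nominal + pct_entry
    return (pct_high * TXP_VALUES["high"]
            + pct_nominal * TXP_VALUES["nominal"]
            + pct_entry * TXP_VALUES["entry"]) / total

# Example: 50% highly experienced, 30% nominal, 20% entry level -> 0.86
print(round(team_experience_multiplier(50, 30, 20), 2))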
Analysis Steps
Derive EKSLOC
Size vs. Effort
Effort for small software configuration items (CIs) appears high. For larger CIs, it appears more normal.
Productivity Conundrum
• We can confirm the unusually high effort by comparing productivity to size.
• Productivity is lower for smaller sizes (5 to 40 EKSLOC) than for larger sizes.
• This is counter to what we believe to be true: productivity should be lower for larger software CIs (all other factors held constant).
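The check itself is straightforward to reproduce: compute productivity as equivalent size per person-month and compare means across size bins. A sketch under that definition of productivity, with illustrative bin edges:

import numpy as np

def productivity_by_size_bin(eksloc, effort_pm, edges=(1, 5, 40, 100, 1000)):
    """Mean productivity (EKSLOC per person-month) within illustrative size bins."""
    eksloc = np.asarray(eksloc, dtype=float)
    effort_pm = np.asarray(effort_pm, dtype=float)
    productivity = eksloc / effort_pm
    bins = np.digitize(eksloc, edges)
    return {"%s-%s EKSLOC" % (edges[i - 1], edges[i]): productivity[bins == i].mean()
            for i in range(1, len(edges)) if np.any(bins == i)}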
Large Project CER
• Modeling this domain above 50 EKSLOC produces a model in which more effort is required for larger CIs, which is what we expect.
• This model can be used in the analysis to help quantify the amount of "extra" effort in projects below 50 EKSLOC.
Overhead Function
• If we apply the model from the previous slide to the data below 50 EKSLOC, we can express the difference between "what we would expect" (the model) and "what we observe" (the data) as a function of size.
• Yet another model can be created to express this decreasing difference with increasing size.
• Call this model the "Overhead Function".
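A sketch of this idea, assuming the large-project CER is available as a function of size; the exponential-decay form chosen here is only an illustration of an overhead that shrinks with size, not the actual model fitted in the analysis:

import numpy as np
from scipy.optimize import curve_fit

def overhead_model(size_eksloc, c, k):
    """Illustrative decaying form for the 'extra' effort at small sizes."""
    return c * np.exp(-k * size_eksloc)

def fit_overhead_function(size_eksloc, observed_pm, large_cer):
    """large_cer(size) is the >50 EKSLOC model's predicted effort in person-months."""
    size = np.asarray(size_eksloc, dtype=float)
    gap = np.asarray(observed_pm, dtype=float) - large_cer(size)   # observed minus expected
    (c, k), _ = curve_fit(overhead_model, size, gap, p0=(max(gap.max(), 1.0), 0.05))
    return lambda s: overhead_model(np.asarray(s, dtype=float), c, k)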
Derive Effort
Communications Domain
After applying the overhead function to the observed effort (by subtracting the overhead-function effort from the observed effort), we get an overall Cost Estimating Relationship that seems reasonable.
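A sketch of that adjustment, reusing the illustrative overhead-function and power-law fitting sketches from the earlier slides:

import numpy as np

def adjusted_domain_cer(size_eksloc, observed_pm, overhead_fn, fit_cer):
    """Subtract the overhead-function effort, then fit the domain CER on the result."""
    size = np.asarray(size_eksloc, dtype=float)
    adjusted_pm = np.asarray(observed_pm, dtype=float) - overhead_fn(size)
    adjusted_pm = np.clip(adjusted_pm, 1e-6, None)   # keep effort positive for the log-log fit
    return fit_cer(size, adjusted_pm)                # e.g. returns (a, b) for PM = a * EKSLOC^b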
Command & Control Domain
Manual Reviewers Website
http://csse.usc.edu/afcaa/manual_draft/
Review and Discussion
Questions?
For more information, contact:
Ray Madachy, rjmadach@nps.edu
Brad Clark, bkclark@csse.usc.edu
Wilson Rosa, Wilson.Rosa@pentagon.af.mil