SLOC and Size Reporting
Pongtip Aroonvatanaporn [Material by Ray Madachy]
CSCI577b Spring 2011
February 4, 2011
(C) USC-CSSE
Outline
• Size Reporting Process
• SLOC Counting Rules
• Reused and Modified Software
• COCOMO Model
• The Unified Code Count Tool
• Conclusion
Goal of Presentation
• Understand the size data required at IOC
• Why?
  • Important historical data on the 577 process
    • Process performance
  • Can be used for COCOMO calibration
    • A specially calibrated COCOMO for 577
    • The current COCOMO is calibrated with 200+ projects
  • Helps identify COCOMO deficiencies and additional needs
Size Reporting Process
• Determine what you produced and quantify it
  • Code developed new, reused, and modified
• Apply a code counter to the system modules
• Apply reuse parameters to all reused and modified code to get equivalent size
Size Reporting Process
• Identify any software not counted
  • Provide as much background as possible so someone can follow up and fill in the gaps
  • It is acceptable to use function point counts as a last resort
• Problem: COTS code cannot be counted
  • Provide COCOTS inputs if doing COTS-intensive development
  • COTS development contributes the majority of the effort, but its size cannot be counted
Size Reporting Process
• Finalizing the report
  • Add up all equivalent lines of code
    • The same top-level size measure that COCOMO uses
  • Count by module
    • The modules should be consistent with your COCOMO estimate; otherwise it is nearly impossible to compare actuals with estimates
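The finalizing step above can be sketched in code. A minimal illustration (the module names and equivalent-SLOC counts are hypothetical): sum the per-module equivalent SLOC into the single top-level size measure that COCOMO uses, assuming reused and modified modules have already been converted to equivalent size by the reuse model.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SizeReport {
    // Sum per-module equivalent SLOC into the top-level size measure.
    static int totalEsloc(Map<String, Integer> eslocByModule) {
        return eslocByModule.values().stream().mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        // Hypothetical modules, consistent with the COCOMO estimate's modules.
        Map<String, Integer> esloc = new LinkedHashMap<>();
        esloc.put("ui", 1200);          // developed new
        esloc.put("persistence", 540);  // reused, after equivalent-size conversion
        esloc.put("reports", 860);      // modified, after equivalent-size conversion
        System.out.println("Total equivalent SLOC: " + totalEsloc(esloc)); // 2600
    }
}
```

Keeping the breakdown by module, rather than reporting only the total, is what makes the actuals comparable to the original COCOMO estimate.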
Lines of Code
• Source lines of code (SLOC) are logical source statements, NOT physical lines
• Logical source statements include:
  • Data declarations: non-executable statements that affect an assembler's or compiler's interpretation
  • Executable statements: cause runtime actions
Lines of Code Examples

Example 1 (8 physical lines, logical LOC = 1):

    String[] command =
    {
        "cmd.exe",
        "/C",
        "-arg1",
        "-arg2",
        "-arg3"
    };

Example 2 (3 physical lines, logical LOC = 5):

    int arg1=0; int arg2=4; String ans;
    ans = "Answer is";
    System.out.println(ans+arg1+arg2);
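The difference between physical and logical counting can be illustrated with a deliberately crude sketch: counting statement terminators (`;`) outside string literals. This is an assumption-laden toy, not how a real counter works; tools like UCC apply full per-language rules (comments, `for` headers, compound statements) that this sketch ignores.

```java
public class LogicalCount {
    // Crude approximation of logical SLOC for a Java fragment:
    // count ';' characters that are not inside a string literal.
    // Does NOT handle comments, char literals, escapes, or for-headers.
    static int logicalSloc(String code) {
        int count = 0;
        boolean inString = false;
        for (int i = 0; i < code.length(); i++) {
            char c = code.charAt(i);
            if (c == '"') {
                inString = !inString;
            } else if (c == ';' && !inString) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // Example 2 from the slide: three physical lines, five logical statements.
        String ex2 = "int arg1=0; int arg2=4; String ans;\n"
                   + "ans = \"Answer is\";\n"
                   + "System.out.println(ans+arg1+arg2);";
        System.out.println(logicalSloc(ex2)); // 5
    }
}
```

Even this toy correctly separates the two slide examples: Example 1's multi-line array initializer is a single statement, while Example 2's three physical lines contain five.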
SLOC Counting Rules
• Standard definition for counting lines
  • Based on the SEI definition
  • Modified for COCOMO
• When a line or statement contains more than one type, classify it as the type with the highest precedence
SLOC Counting Rules
[Tables of statement types, precedence order, and include/exclude counting rules not reproduced in this export]
Reused and Modified Software
• Also categorized as "adapted software"
• Problem: the effort for adapted software is not the same as for new software
  • How do we compare the effort for reused and modified software with that for new software?
• Counting approach: convert adapted software into an equivalent size of new software
Reuse Size-Cost Model
• Non-linear because small modifications generate disproportionately large costs
  • Cost of understanding the software
  • Relative cost of interface checking
• Does not cross the origin: assessing, selecting, and assimilating reusable components costs roughly 5% even with no modification
COCOMO Reuse Model
• Non-linear estimation model
• Converts adapted software into an equivalent size of new software:

    AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
    AAM = [AA + AAF(1 + 0.02(SU)(UNFM))] / 100,  for AAF <= 50
    AAM = [AA + AAF + (SU)(UNFM)] / 100,         for AAF > 50
    Equivalent SLOC = Adapted SLOC × AAM

  where DM = Percent Design Modified, CM = Percent Code Modified,
  IM = Percent of effort for integration and test of the modified software,
  AA = Assessment and Assimilation effort, SU = Software Understanding,
  UNFM = Unfamiliarity, AAF = Adaptation Adjustment Factor, and
  AAM = Adaptation Adjustment Multiplier
Reuse Model Parameters
• DM – Percent Design Modified
  • Percentage of the adapted software's design modified to fit the new objectives
• CM – Percent Code Modified
  • Percentage of the reused code modified to fit the new objectives
• IM – Percent of effort for integration and test of the modified software, relative to new software of comparable size:

    IM = 100 × I&T Effort (modified software) / I&T Effort (new software)
Reuse Model Parameters (AA)
• Assessment & Assimilation effort: the effort needed to
  • Determine whether fully reused software is appropriate
  • Integrate its description into the overall product description
Reuse Model Parameters (SU)
• Software Understanding effort
• When the code is not modified (DM = 0 and CM = 0), SU = 0
• Otherwise, take a subjective average of three categories: structure, application clarity, and self-descriptiveness
Reuse Model Parameters (UNFM)
• Unfamiliarity: the effect of the programmer's relative unfamiliarity with the software
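With all of the reuse parameters defined, the conversion to equivalent SLOC can be sketched end to end. This follows the standard COCOMO II reuse formulas (AAF, AAM); the class and method names, the example component, and its parameter values are illustrative assumptions, not part of the deck.

```java
public class ReuseModel {
    // Adaptation Adjustment Factor: weighted sum of the modification
    // percentages DM, CM, and IM (each expressed as 0..100).
    static double aaf(double dm, double cm, double im) {
        return 0.4 * dm + 0.3 * cm + 0.3 * im;
    }

    // Adaptation Adjustment Multiplier per the COCOMO II reuse model.
    // AA is in percent of effort, SU is the understanding increment
    // (0 when DM = CM = 0), and UNFM ranges from 0 to 1.
    static double aam(double aaf, double aa, double su, double unfm) {
        if (aaf <= 50) {
            return (aa + aaf * (1 + 0.02 * su * unfm)) / 100.0;
        }
        return (aa + aaf + su * unfm) / 100.0;
    }

    // Equivalent new-code size of an adapted component.
    static double equivalentSloc(double adaptedSloc, double dm, double cm,
                                 double im, double aa, double su, double unfm) {
        return adaptedSloc * aam(aaf(dm, cm, im), aa, su, unfm);
    }

    public static void main(String[] args) {
        // Hypothetical component: 2000 adapted SLOC, 10% design and 20% code
        // modified, 30% relative I&T effort, AA = 2, SU = 30, UNFM = 0.4.
        double esloc = equivalentSloc(2000, 10, 20, 30, 2, 30, 0.4);
        System.out.printf("Equivalent SLOC = %.1f%n", esloc);
    }
}
```

Note the non-linearity the earlier slide describes: even with modest DM/CM values, the SU and UNFM terms inflate AAM well beyond the raw modification percentages.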
Improved Reuse Model
• Unified model for both reuse and maintenance
• New calibration performed by Dr. Vu Nguyen
• SLOC modified and deleted are considered equivalent to SLOC added
Reuse Parameter Guidelines
[Table of rating guidelines for the reuse parameters not reproduced in this export]
Data Collection
[Size data collection form not reproduced in this export]
Data Collection
• Refer to the COCOMO model definition for details on the various parameters (DM, CM, IM, etc.)
• Indicate the counting method you used: manual or automated
• Available code counters:
  • CSSE code counter: UCC
  • The code counter developed as part of the CSC 665 Advanced Software Engineering project
  • Third-party counters, provided the counting rules are consistent
The Unified Code Count Tool
• Developed at USC-CSSE
• Based on the counting-rule standards established by SEI
• Has evolved to count all major languages, including web platforms
• Can be used to determine modified code (changed and deleted)
  • Use this data to find the equivalent "new" code
Conclusion
• Software sizing and reporting is more than simple line counting
  • Actual effort is derived from equivalent sizing
  • Only logical source code contributes to effort
• Accurate reporting is essential
  • For research purposes: process performance evaluation and calibration
  • For future planning and productivity predictions
• Give background on any software pieces not counted