Software Sizing, Estimation, and Tracking
Pongtip Aroonvatanaporn
CSCI577 Spring 2012
February 10, 2012
Outline
• Terms and Definitions
• Software Sizing
• Software Estimation
• Software/Project Tracking
Terms and Definitions
• Software Sizing
  • Mechanism to estimate size and complexity
• Software Estimation
  • Mechanism to estimate effort, time, and duration
• Software/Project Tracking
  • Mechanism to manage project progress
Software Sizing
• Agile techniques
  • Story points
  • Planning Poker
• Traditional techniques
  • Expert judgment
  • Function points
  • Application points
• Uncertainty treatment
  • PERT sizing
  • Wideband Delphi
  • COCOMO-U
Story Points
Story Points: What?
• Estimation mechanism based on user stories
  • Each feature/capability is assigned story points
• Often used by Scrum teams
  • Strong focus on the agile process
• A way to estimate difficulty
  • Without committing to a time duration
  • Measures size and complexity
  • Essentially, how hard it is
Story Points: Why?
• Better than hours
  • Humans are not good at estimating hours
  • Standish Group survey: 68% of projects failed to meet their original estimates
  • Hours completed tell nothing
    • No useful information for clients/customers
• Story points can provide a roadmap of the capabilities to be delivered
• Less variation
Story Points: How To?
• Involves the entire team
• Process
  • Look at the backlog of features
  • Pick the easiest
  • Give a score to that feature (e.g., 2)
  • Estimate other features relative to that point
• Cohn Scale (a modified Fibonacci sequence)
  • 0, 1, 2, 3, 5, 8, 13, 20, 40, 100
Story Points: How To?
• Velocity
  • For the first sprint, velocity is a guess
  • After 2-3 sprints, average the story points completed
  • Velocity is then used for planning future iterations
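The arithmetic behind velocity-based planning is simple enough to sketch. Below is a minimal Python sketch; the function names and sprint numbers are illustrative, not from the slides:

```python
import math

def average_velocity(points_per_sprint):
    """Average story points completed per sprint over past sprints."""
    return sum(points_per_sprint) / len(points_per_sprint)

def sprints_remaining(backlog_points, velocity):
    """Rough count of sprints needed to burn the remaining backlog."""
    return math.ceil(backlog_points / velocity)

# After three sprints completing 18, 22, and 20 points:
velocity = average_velocity([18, 22, 20])   # 20.0
print(sprints_remaining(95, velocity))      # 95 points left -> 5 sprints
```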
Story Points: The Good
• Estimate the backlog
• Focus on the product, not tasks
  • Items that are valuable to clients/customers
  • Track progress based on results delivered
• Hours are bad
  • 1 hour for the most productive team is not 1 hour for the least productive team
• In industry
  • Story point estimation cuts estimation time by 80%
  • More estimation and tracking than in typical waterfall projects
  • 48 times faster than traditional waterfall estimation
Story Points: The Bad
• Publishing vs. development
  • Less effort for publishing
• Complexity vs. time
  • Some stories are intellectually complex
  • Some stories are simply time consuming
  • Less complex but repetitive tasks get lower numbers
• Not accurate about the actual effort required
  • Some developers prefer hours and days
• Difficult to determine completion time without velocity
Story Points: Example
• Students can purchase monthly parking passes online
• Parking passes can be paid via credit cards
• Parking passes can be paid via PayPal
• Professors can input student marks
• Students can obtain their current seminar schedule
• Students can order official transcripts
• Students can only enroll in seminars for which they have prerequisites
• Transcripts will be available online via a standard browser
http://www.agilemodeling.com/artifacts/userStory.htm
Planning Poker
Planning Poker: What?
• A mechanism to
  • Introduce estimation
  • Invoke discussions
• Like playing poker
  • Each person has cards
  • Reveal cards at the same time
Planning Poker: Why?
• Multiple expert opinions
  • Knowledgeable
  • Best suited for estimation tasks
• Estimates require justification
• Improves accuracy
  • Better compensates for missing information
• Good for story point estimation
  • Averaging the estimates gives better results
Planning Poker: How?
• Include all developers
• Process
  • Each estimator is given a deck of cards
  • For each user story, the moderator reads the description
  • Discuss the story until all questions are answered
  • Each estimator selects a card representing his/her estimate
  • Everyone shows their cards at the same time (avoids bias)
  • High and low estimators explain their estimates
  • Discuss the estimates
  • Estimators re-select cards
  • If estimates converge, take the average; if not, repeat the process
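As a rough illustration of one round's bookkeeping, here is a toy Python sketch. The convergence rule (all cards within one adjacent step on the Cohn scale) is an assumption made for illustration; the slides only say to average once estimates converge:

```python
# The convergence rule below (all cards within one adjacent step on the
# Cohn scale) is an assumption; the slides do not define one.
COHN_SCALE = [0, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def converged(cards):
    """True if all revealed cards are at most one scale step apart."""
    positions = [COHN_SCALE.index(c) for c in cards]
    return max(positions) - min(positions) <= 1

def round_result(cards):
    if converged(cards):
        return sum(cards) / len(cards)  # converged: take the average
    return None                         # outliers explain, then re-estimate

print(round_result([5, 5, 8, 5]))   # 5.75 -- converged
print(round_result([2, 13, 5, 5]))  # None -- discuss and repeat
```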
Planning Poker: How?
• Done at two different times
• First
  • Before the project begins
  • Estimate a large number of items
  • Initial set of user stories
• Second
  • At the end of each iteration
  • Estimate for the upcoming iteration
Planning Poker: The Good
• Fun and enjoyable
• Convergence of estimates
  • More accurate
  • Justifications
• Invokes group discussion
  • Improves understanding
  • Improves perspectives
Planning Poker: The Bad
• Easy to get into excessive amounts of discussion
• Inaccurate estimates lead to bad results
• Requires a high level of expertise
  • Opinions
  • Analogies
• High and low estimators may be viewed as “attackers”
Function Points
Function Points: What?
• Quantify the functionality
  • Measure development and maintenance
  • Independent of technology
  • Consistent across all projects
• A unit of measure representing functional size
  • Application size = number of functions delivered
• Based on the user's perspective
  • What the user asked for, not what is delivered
• Low cost and repeatable
• Good for estimating use cases
Function Points: How?
• Process
  • Determine function counts by type
  • Determine complexity levels: classify each function count by complexity level
  • Apply complexity weights
  • Compute Unadjusted Function Points: add all the weighted function counts to get one number (UFP); a computational sketch follows the Complexity Levels slide below
Function Points: How?
• Data functions
  • Internal Logical Files
  • External Interface Files
• Transactional functions
  • External Inputs
  • External Outputs
  • External Inquiries
Internal Logical Files
• Data that is stored and maintained within your application
  • Data that your application is built to maintain
• Examples
  • Tables in a database
  • Flat files
  • Application control information
    • Configuration
    • Preferences
  • LDAP data stores
External Interface Files
• Data that your application uses/references
  • But that is not maintained by your application
  • Any data that your application needs
• Examples
  • Same as Internal Logical Files
  • But not maintained by your system
External Inputs
• Unique user data or user control input that enters the application
  • Comes from outside the application boundary
• Examples
  • Data entry by users
  • Data or file feeds from external applications
External Outputs
• User data or control information that leaves the application
  • Crosses the external boundary
  • Presents information
    • Retrieval of data or control
• Examples
  • Reports
  • Data displayed on screen
External Inquiries
• Unique input-output combinations
  • The input causes/generates immediate output
• No mathematical formulas or calculations
  • No derived data created
  • No Internal Logical Files maintained during processing
  • Behavior of the system is not altered
• Examples
  • Reports that do not involve derived data (direct queries)
Complexity Levels
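The weight table on this slide did not survive extraction, so the sketch below substitutes the standard IFPUG complexity weights used with COCOMO II; treat the exact values as an assumption to verify against your reference. It walks the process described earlier: count functions by type, classify each by complexity, apply the weights, and sum:

```python
# Unadjusted Function Point (UFP) computation. The weights are the
# standard IFPUG/COCOMO II complexity weights (an assumption here,
# since the original slide's table was not preserved).
WEIGHTS = {
    "EI":  {"low": 3, "avg": 4,  "high": 6},   # External Inputs
    "EO":  {"low": 4, "avg": 5,  "high": 7},   # External Outputs
    "EQ":  {"low": 3, "avg": 4,  "high": 6},   # External Inquiries
    "ILF": {"low": 7, "avg": 10, "high": 15},  # Internal Logical Files
    "EIF": {"low": 5, "avg": 7,  "high": 10},  # External Interface Files
}

def unadjusted_fp(counts):
    """counts maps (type, complexity) to a count, e.g. {("EI", "low"): 4}."""
    return sum(WEIGHTS[t][c] * n for (t, c), n in counts.items())

# Illustrative counts, not from the slides:
print(unadjusted_fp({("EI", "low"): 4, ("EO", "avg"): 3,
                     ("ILF", "high"): 2, ("EQ", "low"): 5}))  # 72 UFP
```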
Function Points to SLOC
• COCOMO II has a built-in calibration for converting Unadjusted Function Points to SLOC
  • First specify the implementation language/technology
  • Then apply the multiplier (SLOC/UFP)
• More information can be found in the COCOMO II book
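Continuing the sketch above, the conversion itself is a single multiplication. The SLOC/UFP ratios below are commonly cited COCOMO II backfiring figures and are illustrative assumptions; confirm them against the calibration table in the COCOMO II book:

```python
# Illustrative SLOC/UFP multipliers (verify against the COCOMO II book).
SLOC_PER_UFP = {"C": 128, "C++": 55, "Java": 53}

def estimate_sloc(ufp, language):
    """Convert Unadjusted Function Points to estimated source lines."""
    return ufp * SLOC_PER_UFP[language]

print(estimate_sloc(72, "Java"))  # 72 UFP in Java -> 3816 SLOC
```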
Function Points: The Good
• Independent of programming language and technology
• Help derive productivity and quality performance indicators
  • Benchmarking
  • Productivity rate
  • Cost/FP
• Guard against increases in scope
  • Function creep
Function Points: The Bad
• Requires subjective evaluation
  • A lot of judgment involved
• Many cost estimation models do not support function points directly
  • Need to be converted to SLOC first
• Not as much research data available compared to LOC
• Can only be performed after design specification
Estimating with Uncertainty?
Uncertainty Treatment
• PERT sizing
  • Uses a distribution
  • Specify pessimistic, optimistic, and most likely sizes
  • Possibly biased?
• Wideband Delphi
  • Experts discuss, then estimate individually
  • Discussions focus on the points where estimates vary widely
  • Reiterate as necessary
• COCOMO-U
  • An extension of COCOMO II
  • Uses a Bayesian Belief Network to address uncertain parameters
  • Provides a range of possible values
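PERT sizing conventionally combines the three sizes with the beta-distribution approximation, mean = (O + 4M + P) / 6 and standard deviation = (P − O) / 6. A minimal sketch; the KSLOC figures are illustrative:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Standard PERT approximation: weighted mean and spread."""
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return mean, std_dev

# A module sized between 2 and 10 KSLOC, most likely 4 KSLOC:
mean, sd = pert_estimate(2, 4, 10)
print(round(mean, 2), round(sd, 2))  # 4.67 KSLOC expected, 1.33 spread
```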
Workshop time!
Scenario
• Develop a software system for effort reporting
  • Sound familiar?
• Software requirements
  • User authentication
  • User capabilities
    • Select week
    • Submit weekly effort
    • View/update weekly effort
    • View weekly total
  • Admin capabilities
    • View grade report by user (on-time submission)
    • Add/view/edit effort categories
Outline
• Terms and Definitions
• Software Sizing
• Software Estimation
• Software/Project Tracking
Project Tracking
• Goal-Question-Metric
• PERT Network Chart
• Gantt Chart
• Burn Up and Burn Down Charts
Goal-Question-Metric: What?
• Developed by Victor Basili, University of Maryland and NASA
• A software metric approach
• Captures measurement on three levels
  • Conceptual level (goal)
    • Defined for an object of measurement
  • Operational level (question)
    • Defines models of the object of study
  • Quantitative level (metric)
    • Metrics associated with each question in a measurable way
Goal-Question-Metric: Why?
• Used within the context of software quality improvement
• Effective for the following purposes:
  • Understanding an organization's software practices
  • Guiding and monitoring software processes
  • Assessing new software engineering technologies
  • Evaluating improvement activities
Goal-Question-Metric: How?
• Six-step process
  • Develop a set of corporate, division, and project business goals
  • Generate questions that define those goals
  • Specify the measures that must be collected to answer the questions
  • Develop mechanisms for data collection
  • Collect, validate, and analyze the data; provide feedback in real time
  • Analyze the data post mortem; provide recommendations for future improvements
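A GQM hierarchy is naturally tree-shaped, so it can be kept as structured data. A minimal Python sketch follows; the goal, questions, and metrics are an invented example, not taken from the slides:

```python
# Invented GQM example: one goal, two questions, metrics per question.
gqm = {
    "goal": "Improve the reliability of the delivered product",
    "questions": [
        {"question": "What is the current defect density?",
         "metrics": ["defects per KSLOC", "defects found per test cycle"]},
        {"question": "Is reliability improving release over release?",
         "metrics": ["post-release defect trend", "mean time between failures"]},
    ],
}

for q in gqm["questions"]:
    print(q["question"], "->", ", ".join(q["metrics"]))
```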
Goal-Question-Metric: The Good
• Aligns with the organization's environment
  • Objectives and goals
  • Project context
• Flexible
Goal-Question-Metric: The Bad
• Only useful when used correctly
  • Must specify the right goals, questions, and metrics to measure
• Requires experience and a high level of knowledge to use
• No explicit support for integrating with higher-level business goals and strategies
• Some things cannot be measured
GQM+Strategies: What?
• An extension of GQM
  • Built on top of it
• Links software measurement goals to higher-level goals
  • Of the software organization
  • Of the entire business
GQM+Strategies: Example
• Want: increase customer satisfaction
• Strategy: improve product reliability
  • Both hardware and software
• Software development contribution
  • Reduce defect slippage
  • Improve the testing process
• Team leaders decide on a set of actions to take
  • Implement improvements
  • Measure the results of the improvements
• This ties test defect data to customer satisfaction
GQM+Strategies: Example
Other Project Management Methods
PERT Network Chart
• Identifies critical paths
• Nodes are updated to show progress
• Grows quickly
  • Becomes unusable when large
  • Especially in smaller agile environments
  • Eventually gets thrown away
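Identifying the critical path amounts to a longest-path computation over the activity graph. A minimal sketch; the task names, durations, and dependencies are invented for illustration:

```python
from functools import lru_cache

# Invented activity DAG: durations (in days) and dependencies.
durations = {"design": 5, "build_ui": 8, "build_db": 6,
             "integrate": 4, "test": 3}
depends_on = {"build_ui": ["design"], "build_db": ["design"],
              "integrate": ["build_ui", "build_db"], "test": ["integrate"]}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Earliest finish time: all predecessors must complete first."""
    start = max((earliest_finish(p) for p in depends_on.get(task, [])),
                default=0)
    return start + durations[task]

print(max(earliest_finish(t) for t in durations))  # project length: 20 days
```

The critical path itself (design -> build_ui -> integrate -> test here) falls out by backtracking through whichever predecessor has the maximal finish time.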
“Burn” Charts
• Burn Up and Burn Down
• Effective in tracking progress
  • Good for story points
• Not good at responding to major changes
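The data behind a burn-down chart is just the remaining scope after each sprint (a burn-up chart instead plots cumulative completed points against the total-scope line). A tiny sketch with illustrative numbers:

```python
# Burn-down bookkeeping: remaining story points after each sprint.
total_points = 100
completed_per_sprint = [18, 22, 20, 15]  # illustrative values

remaining = total_points
for sprint, done in enumerate(completed_per_sprint, start=1):
    remaining -= done
    print(f"Sprint {sprint}: {remaining} points remaining")
# Sprint 1: 82 ... Sprint 4: 25
```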
Workshop Time!