Application Development Estimation Techniques and Tips
Austin, TX – February 2009
* * * Discussion document, Strictly confidential & proprietary * * *
Agenda … TODAY WE WILL DISCUSS TECHNIQUES AND TIPS FOR BETTER ESTIMATION ON OUR APPLICATION DEVELOPMENT PROJECTS
• Discussion of Our Current Estimation Practices
• Estimation Quiz
• Industry Estimation Statistics
• Common Problems with Estimates
• Techniques & Tips to Improve Our Estimates
• Estimate Size, Derive Duration
• Estimation Techniques
• How to Convert a Size Estimate into a Duration Estimate
• Estimation Checklist
• Other Discussion Topics
• Resources
Discussion of Our Current Estimation Practices … WE ESTIMATE PROJECTS AND TASKS ALL THE TIME – HOW DO WE DO IT TODAY?
• What is an estimate?
• How do we currently come up with our estimates?
• How accurate are we?
• What are the benefits of a good estimate?
• What is the cost of a bad estimate?
• What are some of the biggest challenges with estimating?
• How is an estimate different than a target?
Estimation Quiz … LET'S TAKE A QUIZ TO FIND OUT HOW GOOD WE ARE AT ESTIMATING – WE MIGHT NOT EVEN NEED TO FINISH THIS PRESENTATION …
Instructions
• For each question, fill in the upper and lower bounds that, in your opinion, give you a 90% chance of including the correct value
• Be careful not to make your ranges too narrow or too wide
ESTIMATION QUIZ [table of quiz questions shown on the slide]
Notes: 1) Source: Software Estimation by Steve McConnell
Estimation Quiz … HOW MANY OF YOUR RANGES INCLUDED THE CORRECT ANSWER?
ESTIMATION QUIZ – ANSWERS [answer table shown on the slide]
Notes: 1) Source: Software Estimation by Steve McConnell
Estimation Quiz … WHAT DID THAT QUIZ SHOW US? HOW CONFIDENT IS 90% CONFIDENT?
Conclusions
• Is our "gut feel" for 90% confidence accurate or just wishful thinking?
• How did we do compared to these people?
SAMPLE RESULTS [chart of sample quiz results shown on the slide]
Notes: 1) Source: Software Estimation by Steve McConnell
Industry Estimation Statistics … SOFTWARE DEVELOPMENT TEAMS ARE HORRIBLE AT ESTIMATION – AND WE ALMOST NEVER OVERESTIMATE
SOFTWARE PROJECT SUCCESS RATIOS
• Several different studies show that ~25% of projects finish on time / on budget, while ~75% are either late, over budget, or cancelled completely
• The larger the system / project, the smaller the chance for success
Notes: 1) Source: Chaos Report by The Standish Group 2) Source: Estimating Software Costs by Capers Jones
Common Problems with Estimates … HERE ARE SOME OF THE COMMON PROBLEMS WITH SOFTWARE DEVELOPMENT ESTIMATES
• We try to estimate really big projects / chunks: 86% of really large projects fail completely or are delivered late / over budget, vs. 8% of really small projects
• We don't put enough padding (in the form of ranges) into our estimates even though we know we will receive inaccurate or limited information about the project being estimated (Cone of Uncertainty…)
• We have an expert create the estimate and a beginner do the work
• We derive duration based on "ideal days" instead of "reality days" (e.g., we spend 2 hours per day in meetings, plus vacations, holidays, etc.)
• We omit key activities when thinking through our estimates
  • Non-feature requirements (e.g., development environment setup, build scripts, data migration, performance testing / resolution, usability changes, etc.)
  • Iteration overhead (e.g., defect resolution, demo preparation, Sprint Planning, knowledge transfer, documentation, code reviews / refactoring, support for QA, ramp-up time for new team members, etc.)
  • Non-working days (e.g., holidays, PTO, company meetings, etc.)
• We consistently underestimate "legacy system" and 3rd-party system integration, the impact of using new technologies / frameworks, etc.
• We provide single-point estimates, or ranged estimates that are too narrow
• We provide "off the cuff" estimates without thinking through the actual steps / components for that item
• We are overly optimistic (e.g., "we'll save some time here because this will be just like this other one…", etc.)
The Cone of Uncertainty … THE CONE OF UNCERTAINTY IS A MODEL FOR DETERMINING HOW ACCURATE YOUR ESTIMATES CAN BE, GIVEN THE CURRENT STATE OF THE PROJECT
• The "Cone of Uncertainty" represents the best possible accuracy that your estimates can have at a given stage of the project
• Do our ranged estimates provide an adequate range relative to the stage of the project we are in?
Techniques & Tips to Improve Our Estimates … WE SHOULD BE ABLE TO IMPROVE OUR ESTIMATES BY USING SOME OF THESE TECHNIQUES AND TIPS
• Estimate Size, Derive Duration
• When estimating size, use a couple of different estimation techniques (include Planning Poker as one of these) to validate the estimates
• Consider the difference between Ideal Time and Elapsed Time when forecasting velocity and determining duration
• Go through a quick checklist of commonly omitted tasks to catch more of those and include them in your estimate
• Understand the project's position in the "Cone of Uncertainty" and provide an appropriate range for your estimate
Estimate Size, Derive Duration … IT IS BETTER TO SEPARATE THE ESTIMATION OF SIZE FROM THE ESTIMATION OF DURATION – FIRST ESTIMATE "HOW BIG?" AND THEN DERIVE "HOW LONG?"
• Separate the estimating of size from the estimating of duration to improve the accuracy & consistency of your estimates
Desired Features → Estimate Size → Derive Duration → Determine Schedule
EXAMPLE
Suppose I am tasked with moving a big pile of dirt from my front yard to my backyard. I could look at the pile, assess my tools (shovel and wheelbarrow) and directly estimate that the job will take 3 hours (bypassing any estimate of size and not really based on anything other than a judgment or guess). Instead, what if I looked at the pile and, based on its rough dimensions, estimated that it was 300 cubic feet of dirt? (Now I need to derive a duration from that.) A label on the wheelbarrow says that it will hold 6 cubic feet. Dividing 300 by 6, I determine that it will take 50 trips with the wheelbarrow. I estimate that each trip will take 3 minutes to load the wheelbarrow, 2 minutes to walk to the backyard and dump, and 1 minute to walk back to the front yard (6 minutes total). Now I can calculate the duration by multiplying 50 trips by 6 minutes per trip to estimate that it will take 300 minutes (5 hours). How could you apply this to a software estimate? (A sketch follows below.)
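The dirt-pile arithmetic can be written out as a tiny calculation. This is a minimal sketch using the numbers from the example above, with a closing comment on how the same pattern maps onto a software estimate:

```python
# Step 1: estimate size ("how big?").
pile_cubic_feet = 300
# Step 2: derive duration ("how long?") from the size and a measured rate.
wheelbarrow_cubic_feet = 6
minutes_per_trip = 3 + 2 + 1               # load + walk and dump + walk back

trips = pile_cubic_feet / wheelbarrow_cubic_feet     # 50 trips
duration_hours = trips * minutes_per_trip / 60       # 5 hours
print(f"{trips:.0f} trips, {duration_hours:.0f} hours")

# Software analogue: size in story points or use cases instead of cubic feet,
# and velocity (points completed per Sprint) instead of wheelbarrow loads.
```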
Estimation Techniques … THERE ARE A NUMBER OF TECHNIQUES TO DETERMINE HOW BIG SOMETHING IS – USE MORE THAN ONE FOR VALIDATION
• Expert Judgment – Expert judgment is the most common approach (in industry and at Credera) but can be very inaccurate for a number of reasons (e.g., a different resource doing the work, omitted tasks, etc.). This is a pretty good approach if the "expert" is the person who will be performing the work.
• Estimate by Analogy – Estimating by analogy involves using similar past projects or tasks (informally) to estimate the current work. This can be very effective if you're sure it is similar (e.g., requirements, assumptions, environment, etc.) and you have a fairly accurate and detailed recollection of the actuals from the previous project.
• Disaggregation – Also known as "bottom up," this is the practice of breaking an estimate into multiple pieces, estimating each piece and then summing the individual estimates into an aggregate – it is easier to estimate small things, you'll catch some otherwise omitted tasks, and you benefit from the Law of Large Numbers (i.e., your individual estimate errors – high and low – will cancel each other out to a certain degree; see the sketch after this list). This can be very hard to do (with any real granularity) at the early stages of the "Cone of Uncertainty."
• Historical Data – If available, the use of historical data (estimates and actuals) can be very valuable in the estimation process.
• Planning Poker – Planning Poker is a collaborative estimation technique where the team determines the estimates together. This has a number of benefits (e.g., the people doing the work are estimating the work, it uncovers missed tasks or bad assumptions, estimates must be defended, etc.).
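The "errors cancel out" claim behind disaggregation can be checked with a quick simulation. This is a minimal sketch, assuming every estimate is off by a random, unbiased ±30%; the task count and task size are made up for illustration, and the benefit shrinks if all estimates share the same optimistic bias:

```python
import random

random.seed(42)
TRIALS = 10_000
TASKS = 40            # hypothetical number of small tasks
TASK_SIZE = 5         # true size of each task (e.g., ideal hours)
ERROR = 0.30          # every estimate is off by up to +/-30%

def noisy(value):
    """Return an estimate that is randomly off by up to +/-ERROR."""
    return value * (1 + random.uniform(-ERROR, ERROR))

true_total = TASKS * TASK_SIZE
single_shot_errors = []
disaggregated_errors = []

for _ in range(TRIALS):
    # One big estimate of the whole project.
    single_shot_errors.append(abs(noisy(true_total) - true_total) / true_total)
    # Sum of many small estimates: individual errors partially cancel out.
    total = sum(noisy(TASK_SIZE) for _ in range(TASKS))
    disaggregated_errors.append(abs(total - true_total) / true_total)

print(f"avg error, single estimate:      {sum(single_shot_errors) / TRIALS:.1%}")
print(f"avg error, sum of {TASKS} estimates: {sum(disaggregated_errors) / TRIALS:.1%}")
```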
Planning Poker … TEAM ESTIMATION CAN BE VERY TIME CONSUMING AND IS OFTEN CORRUPTED BY "ANCHORING" WHEN ONE TEAM MEMBER HEAVILY INFLUENCES ESTIMATES
1. The Project Manager or Product Owner (Mike) kicks off the Sprint Planning Session and asks, "How long will it take?"
2. The team thinks about the backlog item being estimated.
3. John thinks he knows exactly what to do, so he blurts out "3 days!" – this makes Bob and Mary doubt their own estimates.
4. Mike then asks for the remaining estimates.
[Slide illustration of the exchange between Mike (PM/PO) and John, Sarah, Bob and Mary (DEVs); the remaining estimates are anchored by John's number.]
Planning Poker … PLANNING POKER IS AN ITERATIVE APPROACH TO ESTIMATING ITEMS IN THE PRODUCT BACKLOG, INTENDED TO REDUCE ANCHORING AND TIME WASTING
1. With Planning Poker, each team member is given a deck of special cards (0, ½, 1, 2, 3, 5, 8, 13, 20, 40, 100, ?).
2. The Project Manager or Product Owner (Mike) kicks off the Sprint Planning Session and asks, "How long will it take?"
3. The team starts to think about the backlog item.
4. Mike then asks everyone to flip a card with their estimate.
5. Big difference … the team (John and Bob, in particular) discusses to determine why their estimates varied so much. It turns out John forgot about a critical detail and Bob didn't know there was some existing code that could be leveraged.
6. Iteration #2: everyone estimates again with the new information. (A small sketch of this reveal-discuss-re-estimate loop follows below.)
[Slide illustration of the cards revealed by John, Sarah, Bob and Mary in each round.]
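A minimal sketch of the reveal-and-discuss loop described above. The deck values follow the scale mentioned on the Estimation Checklist slide; the team names, card values, and the "spread is too wide" rule are made-up illustrations, not part of any Planning Poker standard:

```python
# Planning Poker deck (same scale as on the Estimation Checklist slide,
# plus the 0, 1/2 and "?" cards; "?" is omitted here for simplicity).
DECK = [0, 0.5, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def reveal(estimates):
    """Flip everyone's card at once; return True if the team has converged.

    The convergence rule (highest card no more than twice the lowest) is a
    hypothetical illustration - real teams simply discuss until they agree.
    """
    assert all(card in DECK for card in estimates.values()), "use deck values only"
    print(", ".join(f"{name}: {card}" for name, card in estimates.items()))
    low, high = min(estimates.values()), max(estimates.values())
    if high <= 2 * low:
        print("Close enough - agree on a value and move on.")
        return True
    print(f"Wide spread ({low} vs {high}): the low and high estimators explain "
          "their assumptions, then everyone re-estimates.")
    return False

# Round 1: hypothetical estimates that vary because of different assumptions.
reveal({"Dev A": 3, "Dev B": 8, "Dev C": 13, "Dev D": 8})
# Round 2: after discussing the forgotten detail and the reusable code.
reveal({"Dev A": 8, "Dev B": 8, "Dev C": 8, "Dev D": 8})
```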
Velocity & Ideal Time … BE SURE TO ACCOUNT FOR THE SOMETIMES LARGE DIFFERENCE BETWEEN IDEAL DAYS AND ACTUAL DAYS WHEN DERIVING DURATION
• Once we have estimated the size of the work to be done, we need to estimate our velocity so that we can derive the duration
• Velocity is the amount of work (e.g., story points, etc.) a team can complete during a given time period (e.g., we can complete 20 story points per Sprint, etc.)
• Option #1: Over time you can calculate a historical average velocity for a team, which can be used in forecasts
• Option #2: Wait until after the 1st Sprint to provide the total estimate, if the client will allow it – you can then use Sprint 1 velocity as an indicator for future Sprints
• Option #3: Estimate the total number of hours available to work on the project during the Sprint (the sum of each person's ideal time), break the features down into tasks with "ideal hour" estimates, and see how many "points" worth of features you could get done in the available ideal hours. Once you have that number, convert it to a ranged estimate. (A sketch of this calculation follows below.)
• Be sure to account for the difference between ideal time and actual elapsed time when forecasting velocity. Ideal time is defined as how long a task would take if:
  • It was all you worked on
  • You had no interruptions
  • You had everything you needed to complete the assignment
• How many ideal hours do you average in a real day?
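Option #3 boils down to a small capacity calculation. This is a minimal sketch with hypothetical numbers (team availability, ideal hours per day, and per-feature task hours are all made up):

```python
# Hypothetical Sprint capacity calculation (Option #3 above).
team_days_available = {
    "Dev 1": 10,   # working days in the Sprint after PTO, holidays, etc.
    "Dev 2": 10,
    "Dev 3": 8,
    "Dev 4": 9,
}
IDEAL_HOURS_PER_DAY = 5.5   # out of ~8: meetings, email, interruptions eat the rest

capacity_ideal_hours = sum(days * IDEAL_HOURS_PER_DAY
                           for days in team_days_available.values())
print(f"Sprint capacity: {capacity_ideal_hours:.0f} ideal hours")

# Break candidate features into tasks estimated in ideal hours and see how many
# "points" worth of features fit into the available capacity.
features = [  # (story points, estimated ideal hours across the feature's tasks)
    (5, 60), (3, 35), (8, 95), (2, 20),
]
committed_points, hours_used = 0, 0.0
for points, hours in features:
    if hours_used + hours <= capacity_ideal_hours:
        committed_points += points
        hours_used += hours

print(f"Forecast velocity: ~{committed_points} points per Sprint "
      f"({hours_used:.0f} of {capacity_ideal_hours:.0f} ideal hours used)")
```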
Derive Duration … AFTER WE HAVE ESTIMATED THE SIZE OF THE WORK AND FORECASTED VELOCITY WE CAN DERIVE THE DURATION TO COMPLETE THE TASKS
ESTIMATE SIZE / FORECAST VELOCITY = DERIVE DURATION
• If each Sprint was 2 weeks, when would you estimate that this project will be completed? (A worked example with hypothetical numbers follows below.)
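The actual size and velocity figures were shown in the slide graphic, so here is a worked example with made-up numbers, using a best-case / worst-case velocity to produce a ranged answer:

```python
import math

# Hypothetical numbers - the real figures were on the slide graphic.
backlog_size_points = 120          # estimated size of the remaining work
velocity_range = (15, 20)          # (worst case, best case) points per Sprint
SPRINT_LENGTH_WEEKS = 2

best_sprints = math.ceil(backlog_size_points / velocity_range[1])    # 6 Sprints
worst_sprints = math.ceil(backlog_size_points / velocity_range[0])   # 8 Sprints

print(f"Estimated duration: {best_sprints * SPRINT_LENGTH_WEEKS}"
      f"-{worst_sprints * SPRINT_LENGTH_WEEKS} weeks "
      f"({best_sprints}-{worst_sprints} Sprints)")
```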
Estimation Checklist … THIS ESTIMATION CHECKLIST SHOULD HELP US IMPROVE THE ACCURACY OF OUR ESTIMATES
• Estimate size – derive duration based on measured "Velocity" or projected velocity
• Get the entire development team involved in the estimation (e.g., Planning Poker, etc.)
• Break the project down into smaller pieces and then spot check
  • Split the project up into features / use cases / user stories and estimate each
  • Take a "medium sized" feature and break it into development tasks and estimate each – add these up and see how it compares to your estimate for that feature
  • Spot check a large feature and a small feature – adjust other estimates as needed based on what you learn about your estimates
• Estimate by Analogy (e.g., this use case is similar to the one we did on the last project that took about 10 days, etc.) and triangulate to provide higher confidence (e.g., same size as this other use case and twice as big as this one – feels right, etc.)
• Estimate the best case, worst case and most likely case for each use case to stimulate thinking about the full range of possible outcomes – consider using this formula (see the sketch after this list):
  Expected Case = [Best Case + (3 × Most Likely Case) + (2 × Worst Case)] / 6
• Add time for the tasks we commonly forget (e.g., environment setup, testing, defect resolution, data migration, etc.)
• Provide your estimate as a range rather than a precise number (e.g., 6-8 weeks vs. 6 weeks, etc.)
• Recognize where you are in the "Cone of Uncertainty" when determining how wide your range should be
• Use the following scale for your estimates: 1, 2, 3, 5, 8, 13, 20, 40, 100 – we are better at estimating things that are roughly the same size, and why argue about 22 units versus 23 units…
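The checklist's weighted-average formula (note it weights the worst case more heavily than the classic PERT weighting of (Best + 4 × Most Likely + Worst) / 6) is easy to apply per use case and then sum. A minimal sketch with hypothetical estimates:

```python
def expected_case(best: float, most_likely: float, worst: float) -> float:
    """Weighted average from the checklist: the worst case counts twice as much as the best."""
    return (best + 3 * most_likely + 2 * worst) / 6

# Hypothetical per-use-case estimates in days: (best, most likely, worst).
use_cases = {
    "User login": (2, 3, 6),
    "Search": (3, 5, 10),
    "Reporting": (5, 8, 15),
}

for name, estimate in use_cases.items():
    print(f"{name}: {expected_case(*estimate):.1f} days")

total = sum(expected_case(*estimate) for estimate in use_cases.values())
print(f"Total expected: {total:.1f} days")
```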
Other Discussion Topics … A FEW OTHER MISCELLANEOUS TOPICS TO DISCUSS
• How should we estimate differently for "Fixed Fee" and "Fixed Date" projects?
  • Best Case & Worst Case velocity should provide you with the range
  • Provide scope as a range on these projects (e.g., you will have all of these features and may have some of these, etc.)
  • We have to decide how much risk we are willing to take on to win the project
• Should we "pad" our estimates? What about Parkinson's Law?
• How do these estimation concepts apply to other types of projects (e.g., IT Strategy, Business Intelligence, etc.)?
• What other "levers" should we apply to our estimation model (e.g., onsite vs. offsite team members, dedicated QA team, dedicated creative resource(s), involvement of client development resources, etc.)?
Resources … THESE TWO BOOKS PROVIDE A LOT OF GOOD INSIGHT INTO SOFTWARE ESTIMATION AND PLANNING
• Software Estimation: Demystifying the Black Art – Steve McConnell
• Agile Estimating and Planning – Mike Cohn
Application Development Estimation Tips & Techniques
Austin, TX – February 2009
* * * Discussion document, Strictly confidential & proprietary * * *