Practical Software Project Estimation Black Art or Science? A Workshop
Acknowledgements Tony Rollo SMS - UK Terry Wright Multi Media Vic Michael Stringer SAGE Technology David Cleary Charismatek Software Engineering Australia
Definitions • ISBSG = International Software Benchmarking Standards Group • FP = Function Point • PDR = Project Delivery Rate = Hours per FP • PWE = Project Work Effort
What we will cover • Introduction - the track record • Factors affecting productivity • Estimates – how accurate • Using a project history repository for: • Measuring • Estimating Plus: • ISBSG Summary
What is our software delivery track record? • Only 26% of software projects are successful • 74% ended up in varying degrees of trouble • 32% of projects are terminated before delivering anything (average over-run = 87%) • Impact of poor estimates: • Missed delivery dates – loss of business • Resources wasted on failing projects • Cancelled projects – money spent – no business value • Business case for IT investment invalidated
The Cost of Failure • 26% of software projects are successful; 74% fail to some degree • Projects estimated using formal tools and methods are twice as likely to succeed • Impact: cancelled project = excess cost = invalid investment • Late delivery = loss of business • Failing project = lost opportunity
Other Industries can measure • Building Construction Industry • Metric = Cost per Metre of floor space to build • Building Engineer’s estimate • Building Function = Office Accom. • Location = CBD • Height = 20 floors • Cost per Square Metre = $1400 to $2300 • Variables • Theatrettes, Gym, computer rooms, facade • Quality • BUT only 2:1 price variation (in SE we are 10:1)
All ‘mature’ industries can measure their productivity …..why can’t we do it in software engineering?
Estimation Approaches – Macro-Estimation • Equation use – useful early ball park • Based on a depth of historical data • Too imprecise for accurate estimates • Comparison – provides a good guide • Based on representative experience • Experience MUST be relevant • Analogy – objective, repeatable • Based on past project attributes • Difficulty finding suitable past projects
Estimation Approaches – Micro-Estimation (Work Breakdown) • Project broken down into components or tasks • Each component or task is separately estimated • Results are aggregated to produce an estimate of the whole • Detailed and specific to the project • May overlook activities or items
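The aggregation step above can be sketched in a few lines. This is a minimal illustration only; the task names and effort figures are hypothetical, not from the ISBSG data.

```python
# Micro-estimation (work breakdown) sketch: each component or task is
# estimated separately, then results are aggregated into a whole-project
# estimate. All figures below are hypothetical person-hours.
tasks = {
    "requirements": 120,
    "design": 200,
    "build": 640,
    "test": 320,
    "deploy": 80,
}

# Aggregate the separate component estimates into the project estimate.
total_effort = sum(tasks.values())
print(f"Estimated project work effort: {total_effort} hours")
```

Note the slide's caveat applies here too: a breakdown like this may simply be missing activities, so the total can only be as complete as the task list.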
Important Notes • A formal project risk assessment is essential prior to estimating • Always apply your own knowledge and experience to adjust estimates
Do's and Don'ts • Never rely on a single estimate – use cross and sanity checks • Never give a single estimate – give most likely, least and greatest • If macro and micro estimates vary by more than 10–15%, identify why and rework the estimates
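The macro-versus-micro cross-check above can be expressed as a simple rule. A minimal sketch, using the 15% divergence threshold from the slide; the sample effort figures are illustrative only.

```python
# Cross-check sketch: flag the estimate pair for rework when the macro
# and micro estimates diverge by more than the 10-15% band the slides
# recommend (15% used here). Effort figures passed in are illustrative.
def estimates_diverge(macro_hours: float, micro_hours: float,
                      threshold: float = 0.15) -> bool:
    """Return True if the two estimates differ by more than `threshold`."""
    return abs(macro_hours - micro_hours) / min(macro_hours, micro_hours) > threshold

print(estimates_diverge(4245, 5520))  # ~30% apart -> True, rework needed
print(estimates_diverge(4245, 4500))  # ~6% apart  -> False, acceptable
```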
Factors Affecting Productivity (based upon statistical analysis) • The most influential factors are: • Primary programming language • Platform (mainframe, mid-range or PC) • Team size • For detailed analysis, see ISBSG Benchmark Release 6 • Consider these most influential factors when estimating
Computer Languages • One analysis provided by ISBSG is the productivity of individual languages. It is useful to look at these with regard to platform type • Mainframe languages tend to have a wide range of productivity figures • Mid-range languages usually have a narrow range of productivity figures • PC languages usually exhibit the highest productivity, though typically on small projects
Team Size • A software project is a team activity, and teamwork is an important aspect • Large teams are more difficult to manage, and communication is usually more time consuming • Teams of fewer than five give higher productivity
Accuracy of Estimates • The most useful estimates are of effort and duration • [Chart: proportion of projects that met both estimates, met one estimate, or met neither] • Good estimates tended to be for smaller projects with short durations • ISBSG data shows poor estimates for large projects, for new development, and for client-server, mid-range and portable projects with many users
Estimation Accuracy • Estimates based on function point sizing produce the most accurate estimates for delivery date, effort and cost • [Chart: proportion of projects using work breakdown only, function points only, or both techniques] • Estimation tools are used in about 10% of projects • A management directive sets the delivery date in 15%
Accurate Estimation • Projects using only work breakdown severely underestimate cost • Projects using both techniques get it right or slightly overestimate • Conclusion?
Project Size Macro-estimation techniques require the functional size of the proposed project.
Panic! • I don’t count function points, should I leave now? • No! • You can “cheat” by using the known relationships between FP components • If you have a logical data model, you can derive an estimated size
FP Breakdown – New Developments [Chart: proportion of total function point count by component type]
Using these component relationships Use the number of Logical Files or the number of Inputs as the base for estimating size. Outputs, Queries and Interface Files are too variable to use early in the lifecycle.
Estimating Size • 15 internal logical files (from a logical data model) • 7.4 function points per logical file (median from ISBSG) • 15 × 7.4 = 111 function points • Logical files ≈ 22.1% of a project's total size (from ISBSG chart) • 111 FP × 100 / 22.1 ≈ 502 FP project size
The Rule of the "Thirties" • Various organisations have come up with a rule of thumb: each logical file equates to "thirty something" unadjusted function points of total project size. The range is between 31 and 35. This lines up with our example: 15 × 34 = 510 FP project size
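The two sizing shortcuts above can be written out as a short calculation, using the figures quoted on the slides (7.4 FP per logical file, logical files at roughly 22.1% of total size, and the 31–35 rule of thumb).

```python
# Early sizing sketch from a logical data model, per the slides.
logical_files = 15

# Method 1: size the ILFs (7.4 FP each, ISBSG median), then scale up
# by their typical share of total project size (~22.1%).
ilf_fp = logical_files * 7.4             # 111 function points
project_size = ilf_fp * 100 / 22.1       # ~502 FP total

# Method 2: rule of the "thirties" -- each logical file equates to
# roughly 31-35 FP of total project size (34 used here).
rule_of_thumb_size = logical_files * 34  # 510 FP total

print(round(project_size), rule_of_thumb_size)  # 502 510
```

The two methods agreeing to within a few percent is itself a useful sanity check on the derived size.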
Project History Data Productivity Rate is best derived from your own projects because of the large number of attributes which influence productivity Use ISBSG data for project history: Industry wide data is useful when you have no relevant ‘experience’ Or as a Sanity check
ISBSG Software Project Database • Project data for >1,200 projects • Probably represents top 20% of industry • Primarily MIS Applications • Data from 20 countries • All (but 5) since 1990, 50% since 1998
Sanity Checks • Of completeness: is anything missing? [Table: typical FP component breakdown (ILF, EIF, EI, EO, EQ) – low and high industry ranges versus your project's breakdown] • Of the estimate: is your project estimate very different from ISBSG data for similar projects? See the Benchmark Release 6
Estimation Using Equations • Used to produce an initial ‘ball park’ • Based on regression equations • ISBSG Practical Project Estimation toolkit • Has details in appendix • Basic spreadsheet regression tool • ISBSG Release 7 CD has an estimation tool
Using equations • Regression equation tables are available for: • Productivity (person hours per function point) • Effort (person hours) • Duration (elapsed hours) • Speed of delivery (function points delivered per elapsed calendar month)
Estimation - Equation approach • Establish the project size • Establish key attributes (eg. language, platform, team size) • Select the formula • Look up the equation tables • Select the appropriate table values • Do the calculation
Equation Example • OK, let's have a go: • Software project: • Mainframe platform • Size of 460 function points
Equation example – using tables • Project Delivery Rate: PDR_RE = 14.35 × Size^(-0.072) = 14.35 × 460^(-0.072) ≈ 9.2 hours per function point • Project Work Effort: PWE_RE = 14.35 × Size^(0.928) = 14.35 × 460^(0.928) ≈ 4,245 hours
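The table lookup and calculation above reduce to two lines of arithmetic. A sketch using the mainframe coefficients quoted on the slide; remember the later warning that regression results are ball-park only.

```python
# Regression-equation estimate for a 460 FP mainframe project, using
# the coefficients from the ISBSG equation tables quoted on the slide.
size = 460  # functional size in function points

pdr = 14.35 * size ** -0.072  # project delivery rate, hours per FP
pwe = 14.35 * size ** 0.928   # project work effort, hours
# Note PWE = PDR * size, which is why the exponents differ by exactly 1.

print(f"PDR: {pdr:.1f} hours per FP")  # ~9.2
print(f"PWE: {pwe:.0f} hours")         # ~4,245
```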
Equation example - using ISBSG tools Two tools available: • Simple spreadsheet regression tool (ISBSG Tool Kit) • Reality Checker (ISBSG Data CD R7)
Warning ! • It is very important to treat estimates obtained from regression equations as “ball park” only.
Estimation - Using comparison • Comparison-based estimation involves selecting a group of similar completed projects from a project database, then using the average of the median values (e.g. of project delivery rate) across the matching attribute groups.
Estimation Using Comparison Comparison based estimation: • Define the platform and identify that subset of ISBSG data • Define the other attributes – language, team size etc. • For each attribute obtain the median value • Determine the average of the medians for PDR and delivery rate • Calculate the effort and duration • The result is your estimate
Example table - comparison estimation

Attribute                    Median PDR (hrs/FP)    Median delivery speed (FP/month)
Language – Cobol             15.2                   51.2
Business type – Financial    8.9                    44.5
Example comparison estimate • Project Delivery Rate: PDR_AC = average of category median project delivery rates = (15.2 + 8.9) / 2 ≈ 12 hours per FP • Project Work Effort: PWE_AC = PDR_AC × functional size = 12 × 460 = 5,520 hours
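The comparison steps can be sketched as follows, using the median PDRs from the example table (Cobol and Financial) and rounding the averaged PDR to 12 as the slide does.

```python
# Comparison-based estimate sketch: take the median PDR for each
# matching attribute category, average the medians, and multiply by
# the project's functional size, as in the worked example.
median_pdrs = {"Language: Cobol": 15.2, "Business type: Financial": 8.9}
size = 460  # function points

pdr = sum(median_pdrs.values()) / len(median_pdrs)  # average of medians
effort = round(pdr) * size                          # 12 * 460 hours

print(f"PDR: {pdr:.2f} hrs/FP, effort: {effort} hours")
```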
Example comparison estimate using ISBSG tool • Use spreadsheet comparison tool on the ISBSG Toolkit CD
Using analogy • Analogy estimation involves selecting from a project database one completed project that closely matches the attributes of your planned project, then using this 'analogue' as the basis of your estimate.
Analogy process • Establish the attributes of your planned project • Search a repository of completed projects If a suitable analogue is available: • Use the effort from the analogue as your base • Compare each attribute and adjust accordingly
Estimation by Analogy Using Project Attributes: • Find a project that matches If there is no matching project: • Eliminate one or more attributes as required If there are several projects: • Add more attributes if known • Or use averages of the multiple projects Estimate from the analogue project's attributes: • Speed of delivery, effort, duration etc.
Simple example New Project’s attributes: • Size 540FP (filter on 250-1000) • Mid Range • New Development • MIS • 3GL • C++ (Primary Programming Language)
Analogue Example Results A project with the following values: • 391FP • 5,793hrs • 14.8 hours per FP • 13 Months • 30.1 function points per month
Estimating from the chosen Analogue • Effort: take the actual size of the new project, 540 FP, × the Analogue's PDR of 14.8 hours per FP = 7,992 hours project work effort • Duration: divide 540 FP by the Analogue's speed of delivery of 30.1 FP per month ≈ 17.9 months
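The scaling above can be sketched directly, using the analogue's rates from the example results slide applied to the new project's size.

```python
# Analogy-based estimate sketch: scale the chosen analogue's delivery
# rate and speed to the new project's size, per the worked example.
new_size = 540        # FP, the planned project
analogue_pdr = 14.8   # hours per FP, from the analogue project
analogue_speed = 30.1 # FP delivered per elapsed month, from the analogue

effort = new_size * analogue_pdr      # project work effort, hours
duration = new_size / analogue_speed  # elapsed duration, months

print(f"Effort: {effort:.0f} hours, duration: {duration:.1f} months")
```

Note that effort scales by the PDR while duration scales by the speed of delivery; the two are independent rates taken from the same analogue.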
Caution ! • The best analogues will normally be found in your own project repository • The fewer the shared attributes between the analogue project and the target project, the more careful you must be.
Caution ! • Better project estimates are obtained by using a combination of work breakdown and macro-estimation techniques. • There is no “silver bullet” for project estimation. You must apply your knowledge and experience to fine tune any estimate derived using macro-estimation.
The ISBSG Repository helps with Project Estimating • Good estimating requires a measurement history. • Some teams have no history. • Many teams are going into new areas. • ISBSG data provides a metrics history. • ISBSG provides regression equations • ISBSG provides project delivery rate tables • Data can be used to build estimating frameworks.
Estimating Summary • There are good techniques and tools available for software estimation • Don’t rely on one approach or technique • Always do a risk assessment first • Always apply your own knowledge and experience
Summary - ISBSG Strengths • Not profit motivated • Provides project benchmarking • Allows direct access to the data • Allows networking • Broad representation of IT: technologies, organisation types, geography • An internationally based standard