
UKSMA 2005



Presentation Transcript


  1. UKSMA 2005 Lessons Learnt from introducing IT Measurement Peter Thomas – thomapf@uk.ibm.com

  2. Topics • The IT organisation • Measurement – in particular Function Points • Lessons Learnt • The future • Summary

  3. The IT organisation • As is • Dedicated local staff committed to customer departments • Little formal Project Management discipline; a need to react to customers' needs and wants • To be • Delivery split between own and IT Services companies • Move work to more formally controlled projects with a measured project delivery rate

  4. FPs enable Productivity Measurement, but other measurements are required to ensure overall delivered quality. • Function Points may be used to measure the productivity of the software delivery process • Productivity = FP/100 Hours of project effort • Note there is no measure against people • A basket of measurements should be put in place to indicate when action is required to keep the software delivery process in balance. • A minimum set of measurements in the basket is: • Schedule eg Schedule Days/1000 FP • Defect Rates eg Defects per delivered FP • Customer Satisfaction • Focusing on one element in the basket can lead to aberrant behaviour
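The basket of measures on this slide can be sketched as a short calculation. All the project figures below are hypothetical examples; only the formulas (FP per 100 hours, days per 1000 FP, defects per delivered FP) come from the slide:

```python
# Sketch of the measurement basket described on the slide.
# The project figures below are hypothetical examples.

function_points = 480   # delivered size (FP Light count)
effort_hours = 6000     # total measured project effort
schedule_days = 180     # elapsed calendar days
defects = 24            # defects found in the delivered software

# Productivity = FP per 100 hours of project effort (per the slide)
productivity = function_points / (effort_hours / 100)

# Supporting basket measures, to keep the delivery process in balance
schedule_rate = schedule_days / (function_points / 1000)  # days per 1000 FP
defect_rate = defects / function_points                   # defects per delivered FP

print(f"Productivity:  {productivity:.1f} FP / 100 hours")
print(f"Schedule rate: {schedule_rate:.0f} days / 1000 FP")
print(f"Defect rate:   {defect_rate:.3f} defects per FP")
```

Reporting the three numbers together, rather than productivity alone, is what guards against the "aberrant behaviour" the slide warns about.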

  5. Function Point Analysis has more than one flavour • A fully auditable value – expensive and not really required for tracking trend-line behaviour • A reasonable approximation – this is much easier and cheaper if all data and transactions are treated as Average (FP Light) • A rough guess – this is of limited value other than as an indicator of the effort needed to produce a better value

  6. FP Light was chosen as the output product Sizing Measure • Advantages • Less effort to count • Saving is around 50% compared to a full IFPUG count • Disadvantages • Less accurate as a sizing metric • Accuracy is within 15–20% • Comment: • The accuracy of the FP Light number is sufficient, since the accuracy of many other measures which go to make up the required metrics set is often even less.

  7. Each Data Flow and Data Store has a set number of FPs. [Slide diagram: an example application showing Administrator, Agent and Customer data flows against Policy and Product data stores.] Function Point Light definitions: data stores ILF = 10 FP, EIF = 7 FP; external data flows EI = 4 FP, EO = 5 FP, EQ = 4 FP
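Because FP Light treats every component as Average, a count reduces to multiplying component tallies by the fixed weights given on the slide. A minimal sketch; the component counts in the example call are hypothetical:

```python
# FP Light: every component of a given type gets the Average weight (per the slide).
FP_LIGHT_WEIGHTS = {
    "ILF": 10,  # Internal Logical File (internal data store)
    "EIF": 7,   # External Interface File (external data store)
    "EI": 4,    # External Input (data flow in)
    "EO": 5,    # External Output (data flow out)
    "EQ": 4,    # External Inquiry (request/response flow)
}

def fp_light(counts: dict) -> int:
    """Sum Average-weighted function points for the given component counts."""
    return sum(FP_LIGHT_WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical example: 3 ILFs, 1 EIF, 6 EIs, 4 EOs, 2 EQs
size = fp_light({"ILF": 3, "EIF": 1, "EI": 6, "EO": 4, "EQ": 2})
print(size)  # 30 + 7 + 24 + 20 + 8 = 89
```

This is what makes FP Light roughly half the effort of a full IFPUG count: no complexity assessment per component, just a tally.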

  8. Lessons Learnt - Training • FP training for IT professionals is available from a number of suppliers, but staff churn means that training is an on-going requirement

  9. Culture & Comparison • Culture • The management team using the metrics needs to provide sponsorship, promotion and direction to enable them to be successfully embedded in the culture • (This presentation is a variant of the one shared with them) • Comparison • When using productivity to assess the software process (and/or setting productivity targets), areas of the organisation with similar project attributes should be chosen. • There are hard and easy function points – compare with care • Many factors affect the productivity achieved on any given project: • Project Type, Platform, Architecture, Software framework, Non-coding effort, Skill and experience levels, Process, Tools, etc.

  10. Exceptions & Effort • Exceptions • FP Light is not a useful metric for every component or every project • The organisation must decide how to handle these exceptions • Project effort • It is important for consistency that the set of activities for which the effort is measured against FP size is the same on a project-by-project basis within component areas. • For accurate assessment of productivity the resource must correspond to the delivered FPs counted. • Effort should be recorded at a suitable level of detail to allow analysis by role and stage • Where appropriate, “non-countable” activities and their resource should be excluded from the measurement • The event/milestone which determines the start and end of the project and/or phase also needs agreeing. These should fit with the hand-off and hand-in of work given to the external teams.

  11. Process • To count FPs, the analysis workshops need to be built into the standard development lifecycles. • The metrics group must have a method of recognising when a project is due an FP count. • An FP number should be required before project closure is accepted. • The metrics group and FP Analysts provide but do not own the data. • They are not in a position to make decisions based on the measurements, ie to change the software development lifecycles. • Techniques to utilise the data must be developed • What are the questions? Is the data sufficient? • What can be safely compared & summed?

  12. Human Resources • If project staff fear that they are not producing enough Function Points, they will be tempted to inflate the value whenever possible to make the figures reflect on them in a positive light. • If FP counting is applied inappropriately it will lose credibility • The Metrics group will need management support to ensure • counts are performed according to the rules • exceptions are recognised • inappropriate comparisons are avoided • analysis is appropriate, not simply a global average

  13. Boundaries & documentation • Boundaries • FPs need to count flows across and data within application boundaries • The setting of a standard set of boundaries within NUL is therefore vital for consistency of FP counts • Process Documentation • FP counting is made easier with FP friendly documentation. • Every effort should be made to make the standard documentation set FP friendly.

  14. The future use of Function Points can be expanded to provide project support • Requirements validation • Function Point analysis tests the requirements (and high level design) documentation and models for usability and completeness. The Analyst can raise queries and issues which can avoid the project becoming troubled. • Need an early FP count and a repeat count for each major change • Help in the validation of project estimates. • Carrying out an early Function Point estimate will allow project estimates to be validated based on historical productivity figures. • Need a database of historical project data and an early FP count. • Help in tracking of projects using a modified Earned Value process. • Function points delivered during the project lifecycle may be reported against expected delivery.
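The estimate-validation idea above can be sketched as a simple check of a project estimate against historical productivity. The historical rate and tolerance below are hypothetical assumptions, not figures from the presentation:

```python
# Validate a project effort estimate against historical productivity.
# HISTORICAL_PRODUCTIVITY and TOLERANCE are hypothetical assumptions.

HISTORICAL_PRODUCTIVITY = 8.0   # FP per 100 hours, from a historical project database
TOLERANCE = 0.25                # flag estimates more than 25% away from the norm

def validate_estimate(fp_count: float, estimated_hours: float) -> str:
    """Compare an early FP-based effort expectation with the project's estimate."""
    expected_hours = fp_count / HISTORICAL_PRODUCTIVITY * 100
    deviation = (estimated_hours - expected_hours) / expected_hours
    if abs(deviation) <= TOLERANCE:
        return "estimate consistent with historical productivity"
    direction = "higher" if deviation > 0 else "lower"
    return f"estimate {abs(deviation):.0%} {direction} than history suggests - review"

print(validate_estimate(fp_count=480, estimated_hours=6100))
```

The same two inputs (an early FP count and the historical database) also support the modified Earned Value tracking the slide mentions, by comparing FPs delivered to date against the expected delivery profile.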

  15. Summary • The FP rollout is going well • The organisation now has the capability to size projects and software • It can therefore implement the correct measures to decide whether initiatives are improving or worsening the IT capability • Lessons have been learnt, and executive and management support can resolve the issues identified • The future holds additional benefits • Better estimating • Better project management discipline, particularly requirements & test management
