Agile Metrics that Matter September 1, 2011 Ian Savage Brian Kronstad
First… some questions for you • Size of your org: • <25 • 25-100 • 100-500 • >500 • Familiarity with metrics: • I confuse “metrics” with “matrix” • I know something about metrics • I have managed projects using explicit exit criteria • I have implemented a successful org-wide metrics program • Your expectations?
Some Agile Metrics* • Predictable, consistent delivery of business value • Customer involvement • Team cohesion, joy, happiness, trust • Status of customer service queues (SLA) • Burndown irregularities • Absolute velocity (total work done) • Relative velocity (total work done / plan) *from McAfee Program Managers – August 2011
Others’ Thoughts on Agile Metrics • “…primarily three metrics that provide the best measure of the state of an Agile project, namely progress, quality, and team morale.” http://www.projectsmart.co.uk/metrics-that-matter-in-agile-projects.html • “What do your stakeholders want?... All the information they need to make decisions and no more…” http://community.thoughtworks.com/files/1c2707ac7c/Agile_Metrics_that_Matter.pdf • “The number of graphs we generate from our application code is over 16,000. How do we collect so many metrics? We keep the process super simple.” http://www.mikebrittain.com/blog/2011/03/19/metrics-driven-engineering-at-etsy/
Metrics Matter • People change behaviors to meet the numbers • So choose metrics wisely… • Easy to collect: • KLOCs • bug counts • calendar days • progress against plan • Quality-enabling • Strategic • customer loyalty • earned value • cost of quality • Operational • early detections • unit test coverage and worth • value delivered
So? How do we choose metrics? And what’s this GQM and FURPS?
Goal-Question-Metric (GQM) • Dr. Victor Basili c. 1970 – University of Maryland • Goals • Business, product, or iteration results • Questions • Define the nature and scope of our inquiry • Metrics • Measurements that give the answers • Method* • Data collection, sorting, aggregating *my addition
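To make the GQM(+Method) chain concrete, here is a minimal Python sketch of one goal/question/metric record captured as plain data. The question, metric, target, and method shown are illustrative placeholders, not from the talk; the goal is borrowed from the McAfee list above.

```python
# A minimal sketch: one GQM(+Method) record captured as plain data.
# The question, metric, target, and method are illustrative placeholders.
gqm = {
    "goal": "Predictable, consistent delivery of business value",
    "questions": [
        {
            "question": "How much of its planned work does each sprint finish?",
            "metrics": [
                {
                    "metric": "relative velocity",  # total work done / plan
                    "target": ">= 0.9",
                    "method": "sum story points of accepted stories, "
                              "divided by points committed at sprint planning",
                }
            ],
        }
    ],
}

# Walk the chain to confirm every metric traces back to a goal.
for q in gqm["questions"]:
    for m in q["metrics"]:
        print(f"{gqm['goal']} -> {q['question']} -> "
              f"{m['metric']} (target {m['target']})")
```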
Software Attributes (FURPS) • Grady and Caswell - 1987 - Hewlett-Packard • Model for classifying software quality attributes • Functionality - feature set, capabilities, generality, security • Usability - human factors, aesthetics, consistency, documentation • Reliability - failure frequency, recoverability, predictability, accuracy • Performance - speed, efficiency, resource use, throughput, response • Supportability - testability, extensibility, adaptability, maintainability, compatibility, configurability, serviceability, installability, localizability, portability Software Metrics: Establishing a Company-wide Program, Prentice-Hall
Goals and Tradeoffs • Tim Lister: “If the organization does not measure, or in some other legitimate way determine, if a goal has been met, then the organization was never really serious about that goal.” PNSQC Proceedings - 1993 Keynote • Grady/Caswell: “Establishing priorities is important because of the tradeoffs involved between quality attributes. For example, adding a new function might improve functionality but decrease performance, usability, and/or reliability.”
A Working Definition of “Quality” Quality is a function of salient attributes: Q(total) = f(attribute(1), attribute(2), …, attribute(N))
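One common concrete choice for f is a weighted sum over normalized attribute scores, with the weights coming from the stakeholder rank-sort described next. A small Python sketch; the weights and scores are invented for illustration.

```python
# A hedged sketch: one common concrete choice for f is a weighted sum over
# normalized (0..1) attribute scores. The weights and scores are invented;
# in practice stakeholders would rank-sort the attributes and assign them.
weights = {"functionality": 0.30, "usability": 0.20, "reliability": 0.25,
           "performance": 0.15, "supportability": 0.10}
scores = {"functionality": 0.8, "usability": 0.7, "reliability": 0.9,
          "performance": 0.6, "supportability": 0.5}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights should sum to 1

q_total = sum(weights[a] * scores[a] for a in weights)
print(f"Q(total) = {q_total:.2f}")  # ~0.74 with the numbers above
```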
Metrics Development (1/4) Notes: • Stakeholders meet and rank sort the attributes. • Some attributes are left unmanaged.
Metrics Development (2/4) Notes: • Team discusses the attributes and records the questions we want answered. • These questions reflect values and drive decisions.
Metrics Development (3/4) Notes: • Team decides the metrics that will answer the questions. • And sets targets for each metric.
Metrics Development (4/4) Notes: • Team determines methods for generating those metrics. • Check and adjust: • Method → Metric • Metric → Questions • Questions → Goals
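A sketch of what the "check and adjust" step can look like mechanically, assuming the team records its GQM chain as simple records (field names and sample data are invented): flag any metric with no collection method and any question with no metric.

```python
# A sketch of the "check and adjust" step, assuming the team records its
# GQM chain as simple records (field names and sample data are invented).
questions = {"Q1": "Are we delivering planned scope?",
             "Q2": "Is reliability improving?"}
metrics = [
    {"name": "relative velocity", "answers": "Q1", "method": "sprint report"},
    {"name": "escaped defects", "answers": "Q2", "method": None},  # gap
]

for m in metrics:
    if m["answers"] not in questions:
        print(f"{m['name']}: answers no recorded question -- re-map or drop")
    if not m["method"]:
        print(f"{m['name']}: no collection method -- define one or drop it")

answered = {m["answers"] for m in metrics}
for qid in questions:
    if qid not in answered:
        print(f"{qid}: unanswered question -- add a metric or retire it")
```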
Agile Scrum Characteristics • “Short” cycles that deliver value • Emphasis on fixed time and flexible scope • “if the scope don’t fit, you must omit” • Continuous backlog planning • Tests driving (or closely aligned with) coding • Refactoring is an expectation • Daily scrum meetings
Implications for Reporting • Multiple levels to consider • Strategic – Product Backlog • Operational – Releases • Tactical – Sprints and scrums • Levels impact different project aspects • Decision-making, Planning, Measuring, Reporting • Stakeholders have different requirements • Project managers – e.g. Scope, Schedule, Budget, Quality • Sponsor – e.g. Features by release and date, TCO • Customers – e.g. Fit for use & purpose • May need to re-educate stakeholders • Agile terms, interpreting Agile reports, what’s significant to track (e.g. how to measure defect density given refactoring)
Extending GQMM + Reporting • Goals (conceptual level) • What results do we want? • W.r.t. : Products, Processes, Resources • Questions (operational level) • How will we know we’ve met those goals? • Metrics (quantitative level) • What measurement gives the answer? • Method • How do we collect / derive the data? • Reporting • How do we convey the information in a way that provides value to various stakeholders?
Performance Reporting Example Sprint and Release Impacts on Performance Significance: The performance metric indicates response time changes due to sprints and releases. Analysis: Response times at sprint completions are progressing toward the NFR target of 2 seconds. Recent sprints have had less impact on response times. Response: Team will have an independent architecture and design review. Findings will be addressed in a refactor sprint following the October release.
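A small sketch of the analysis behind a report like this: response times measured at each sprint completion, compared against the 2-second NFR target. The sample numbers are invented.

```python
# A sketch of the analysis behind this report: response times (seconds)
# measured at each sprint completion, compared against the 2-second NFR
# target. The sample numbers are invented.
TARGET_S = 2.0
sprint_response_s = [3.4, 3.1, 2.9, 2.6, 2.5, 2.45]  # one value per sprint

deltas = [b - a for a, b in zip(sprint_response_s, sprint_response_s[1:])]
print(f"latest: {sprint_response_s[-1]:.2f}s, target: {TARGET_S:.2f}s")
print(f"average improvement per sprint: {-sum(deltas) / len(deltas):.2f}s")
if abs(deltas[-1]) < abs(deltas[0]):
    print("recent sprints are improving response time more slowly")
```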
Common Agile Reports • Burn Down / Burn Up charts • Product, Release & Sprint • Amount of work completed • Cumulative Flow Diagrams • Earned Business Value • Velocity Table • Backlog Change Report
Example Burn Down Chart http://www.atlassian.com/software/greenhopper/tour/burndown-charts.jsp
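The data behind a sprint burn down chart is just remaining story points per day, compared against an ideal straight line; a sketch with invented numbers:

```python
# A sketch of the data behind a sprint burn down chart: story points
# remaining per day versus an ideal straight line. Numbers are invented.
sprint_days = 10
committed_points = 40
remaining = [40, 38, 36, 35, 30, 28, 25, 20, 12, 5, 0]  # days 0..10

for day, left in enumerate(remaining):
    ideal = committed_points * (1 - day / sprint_days)
    flag = " (behind)" if left > ideal else ""
    print(f"day {day:2}: remaining {left:2}, ideal {ideal:4.1f}{flag}")
```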
Cumulative Flow Chart (bands: Remaining, In Process, Completed) http://www.atlassian.com/software/greenhopper/tour/burndown-charts.jsp
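A cumulative flow diagram stacks story counts per state per day; a widening "In Process" band signals a bottleneck. A sketch of the underlying data, with invented counts:

```python
# A sketch of cumulative flow data: story counts per state per day.
# Stacking these counts gives the chart's bands. Numbers are invented.
days = ["d1", "d2", "d3", "d4", "d5"]
flow = {
    "Completed": [0, 2, 5, 8, 12],
    "In Process": [4, 5, 4, 5, 3],
    "Remaining": [16, 13, 11, 7, 5],
}

for i, day in enumerate(days):
    bands = ", ".join(f"{state} {counts[i]}" for state, counts in flow.items())
    print(f"{day}: {bands}")
```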
Earned Business Value Business value delivered per sprint and release Significance: The metric indicates the relative business value delivered per sprint and release. Analysis: While significant business value has been delivered, it has consistently fallen short of expectations. Response: Team will perform root cause analysis to identify contributing factors for lower business value (e.g. scope cut) and whether additional business value is necessary to achieve product ROI.
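A sketch of the arithmetic behind earned business value, assuming each backlog item carries a stakeholder-assigned value; the items and values below are invented.

```python
# A sketch of an earned-business-value calculation, assuming each backlog
# item carries a stakeholder-assigned business value. All data is invented.
planned = {"S1": 30, "S2": 25, "S3": 20, "S4": 15, "S5": 10}  # item: value
delivered_by_sprint = {1: ["S1"], 2: ["S2", "S4"], 3: ["S3"]}

total_value = sum(planned.values())
earned = 0
for sprint, items in sorted(delivered_by_sprint.items()):
    earned += sum(planned[i] for i in items)
    print(f"sprint {sprint}: earned {earned}/{total_value} "
          f"({100 * earned / total_value:.0f}% of planned value)")
```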
Basic Velocity Table* *More developed velocity tables use a complexity factor which relates to the risk of a particular story or set of stories (aka epic)
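A sketch of a velocity table with the kind of complexity/risk factor the footnote mentions; the weighting scheme here is an assumption for illustration, not necessarily the one the presenters used.

```python
# A sketch of a velocity table with the kind of complexity/risk factor the
# footnote mentions. The weighting scheme is an assumption for illustration.
sprints = [
    # (sprint number, story points completed, complexity factor; 1.0 = nominal)
    (1, 18, 1.0),
    (2, 22, 1.2),  # riskier epic, weighted up
    (3, 20, 1.0),
]

for number, points, factor in sprints:
    print(f"sprint {number}: raw {points}, adjusted {points * factor:.0f}")

adjusted = [p * f for _, p, f in sprints]
print(f"mean adjusted velocity: {sum(adjusted) / len(adjusted):.1f}")
```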
Backlog Change Report Backlog Features delivered per sprint and release Significance: Demonstrates the addition of backlog features to the product for each sprint and release. Analysis: 95% of backlog features delivered as of sprint 10. Trending toward completion by sprint 12. Response: PM will discuss remaining features to determine whether they are necessary to the end product.
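A sketch of the numbers behind a backlog change report: percent of features delivered, plus a naive completion projection from the average delivery rate (data invented):

```python
import math

# A sketch of the numbers behind a backlog change report: features delivered
# versus total, plus a naive completion projection. Data is invented.
total_features = 40
delivered_by_sprint = [3, 4, 4, 5, 4, 4, 4, 4, 3, 3]  # sprints 1..10

done = sum(delivered_by_sprint)
rate = done / len(delivered_by_sprint)  # average features per sprint
remaining = total_features - done
print(f"delivered: {done}/{total_features} ({100 * done / total_features:.0f}%)")
print(f"projected completion: sprint "
      f"{len(delivered_by_sprint) + math.ceil(remaining / rate)}")
```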
Some Lessons Learned • Understand how the information will be used • Does the report answer the Question? • Revalidate the Question • With experience the Question may change, new questions may arise • Understand how to deliver the information • Frequency, method (e.g. email), level of detail • Stakeholder preferences: tables/graphics, data/discussion • Include context for the information • Significance: Why is this information important • Analysis: How should this particular information be interpreted
Some (more) Lessons Learned • Don’t overburden the development team with metrics collection • Less is more; metrics should be a by-product of development • Tracking and reporting story points versus hours • Be clear on how they’re used and be consistent • Set Stakeholder expectations • Velocity will likely not stabilize until after a few sprints, so expect early estimates to be wrong • Flexible scope ≠ unmanaged scope
Caution • Don’t mix project metrics and people metrics • Project metrics are not for reviewing people • They can be gamed and the data will skew • Will hurt team morale
Additional Resources • The Goal Question Metric Approach, Basili, Caldiera, and Rombach • Software Metrics: Establishing a Company-wide Program, Grady and Caswell, Prentice-Hall, 1987 • Practical Software Metrics for Project Management and Process Improvement, Robert Grady, Prentice-Hall, 1992 • Earned Value and Agile Reporting, Anthony Cabri and Mike Griffiths, Quadrus Development Inc. • Establishing Metrics using Balanced Scorecard and Goal Question Metrics Technique For Organizational Prosperity, Deborah Devadason, Qualcon 2004 • GQM+Strategies, Basili et al.