This presentation proposes a standard set of measures, metrics, and reports for software testing. It covers key measures across the testing lifecycle, including planning, risk, test preparation, test execution, and defect analysis. Effective ways to present the measures are also discussed.
Specialist Interest Group in Software Testing
Seven Key Measures for Software Testing
Graham Thomas
RCOG, 15th June 2006
Abstract
Last year I came across the worst measurement example that I had seen in over 20 years' experience of IT. Part of the problem comes from the fact that there isn't a standard set of measures, so should we actually get upset when software testers measure the wrong thing, in the wrong way, and then report it badly? Actually, no! Not until there is a standard definition for software test measurement and reporting. So there is the challenge for this presentation: to present a standard set of measures, metrics and reports for software testing, so that there can no longer be any excuse. This presentation proposes 7 key measures across the software testing lifecycle, covering: Planning, Risk, Test Preparation, Test Execution and Defect Analysis. The presentation will also identify effective ways to present the 7 key measures in the form of a practical model.
Agenda
• An example Test Report
• Definition of Measurement and Metric
• Seven Key Measures
• Weekly Reporting
• More measures and metrics
• Tips for success
• Conclusion
An example Test Report
• Let's look at an example test report
• To summarise:
• Poor presentation
• Unable to quickly, simply and easily get a view of testing
• Too much information
• Difficult to compare data
• Real message obscured
• Actually unintelligible!
Definition: Measurement vs. Metric
• Measurement: "The act or process of measuring"
• Metric: "A calculated term or enumeration representing some aspect of biological assemblage, function, or other measurable aspect and is a characteristic of the biota that changes in some predictable way with increased human influence"
Ref: Minnesota Pollution Control Agency Biological Monitoring Program Glossary of Terms
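The distinction above can be sketched in code. This is an illustrative example, not taken from the slides: the figures and variable names are hypothetical; the point is that a measurement is a raw observation, while a metric is a value calculated from measurements.

```python
# Measurements: raw counts read directly from the test management tool
# (hypothetical figures for illustration)
tests_planned = 200
tests_executed = 150
tests_passed = 120

# Metrics: values derived from those measurements
execution_rate = tests_executed / tests_planned   # proportion of the plan run
pass_rate = tests_passed / tests_executed         # proportion of runs passing

print(f"Execution rate: {execution_rate:.0%}")    # 75%
print(f"Pass rate: {pass_rate:.0%}")              # 80%
```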
Coverage

Weekly Report (diagram: Config Status, Metrics Analysis, Risk)
More Measures and Metrics
• The Stroop Effect (demonstration slide: colour names printed in mismatching ink colours, showing how presentation can fight the message)
• Use these views to support the overall view and message:
• Defects by Type / Severity / Priority
• Defect Hot Spot Analysis
• Defect Age Analysis
• Causal Analysis
• Metrics:
• Defects / Faults per KLOC / KXLOC
• Defects per Requirement
• Days Test Effort per Requirement
• DDP
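Two of the metrics listed above can be sketched as follows. This is a hedged illustration: the slides do not give formulas, so these follow the common industry definitions (defect density as defects per thousand lines of code; DDP, Defect Detection Percentage, as the share of all known defects that testing caught before release), with hypothetical input figures.

```python
def defects_per_kloc(defects: int, lines_of_code: int) -> float:
    """Defect density: defects per thousand lines of code (KLOC)."""
    return defects / (lines_of_code / 1000)

def ddp(found_in_test: int, found_after_release: int) -> float:
    """Defect Detection Percentage: percentage of all known defects
    that were found by testing rather than after release."""
    total = found_in_test + found_after_release
    return 100.0 * found_in_test / total

# Hypothetical project figures
print(defects_per_kloc(45, 30_000))   # 1.5 defects per KLOC
print(f"{ddp(90, 10):.1f}% DDP")      # 90.0% DDP
```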
A Few Tips for Successful Charting
• Easily distinguishable colours
• Consistent look and feel
• If you shade, then: light at the top, dark at the bottom
• RED means DANGER!
• Cumulative totals enable trends to be spotted
• Remember it is the content that is important
Conclusions
• There are other things to measure than just defects
• Report trends rather than snapshot measurements
• Limit the amount that you report to increase impact
• Be consistent in your reporting
• Explain what you mean by your measurements - don't assume that others will automatically know
Contact Details
Graham Thomas
Independent Consultant
graham@badgerscroft.com
+44 7973 387 853
www.badgerscroft.com