Software Quality Week ‘97


Presentation Transcript


1. Software Quality Week ‘97
How to Apply Static and Dynamic Analysis in Practice
Otto Vinter
Manager, Software Technology and Process Improvement
Brüel & Kjaer Sound & Vibration Measurement
email: ovinter@bk.dk

2. Company Profile
Brüel & Kjaer
Skodsborgvej 307, DK-2850 Naerum, Denmark
Tel: +45 4580 0500, Fax: +45 4580 1405
High-Precision Electronic Instrumentation for
• Sound
• Vibration
• Condition Monitoring
• Gas Measurements

3. The PET Process Improvement Experiment
The Prevention of Defects through Experience-Driven Test Efforts (PET)
PET Objectives
• Extract knowledge on frequently occurring problems in the development process for embedded software
• Change the development process by defining the optimum set of methods and tools available to prevent these problems reappearing
• Measure the impact of the changes in a real-life development project
Funded by the CEC ESSI Programme (Project no. 10438)

4. Defect Analysis from Error Logs
Results of the Analysis of Error Reports
• no single bug class dominates embedded software development
• requirements problems, and requirements-related problems, are the prime bug cause (36%)
• problems due to lack of systematic unit testing are the second-largest bug cause (22%)

5. The PET Experiment
Actions to Improve Unit Testing
• introduction of static and dynamic analysis
• host/target tools
• basic set of metrics
Comparative Analysis
• assess a trial release of a product
• remove all static analysis anomalies
• increase test coverage to industry best practice (branch coverage > 85%)
• measure the effect after production release

6. Results of the PET Experiment
Applying Static and Dynamic Analysis:
• 75% reduction in production-release bugs, compared to the trial release
• 70% of the production-release bugs were requirements bugs, because bugs in the other categories dropped
• 46% increase in testing efficiency (bugs found per hour of testing)
• approx. 75% payback on tools and training, through reduced maintenance effort

7. Where to Apply Static and Dynamic Analysis
Test Activities
• Requirements Verification
• Design Verification
• Unit Test  } Static and Dynamic Analysis
• Integration Test
• Acceptance Test
• Beta Test

8. Static Analysis in Unit Testing
Static analysis is performed
• before unit testing starts
• and after any modification to a unit
Complexity Metrics
• use the McCabe complexity metric as an indication
• take loop nesting into account
• use visual inspection of data flow graphs for perceived complexity and focused code inspection
Analysis reports are reviewed / inspected
• determine special actions
• determine the test coverage profile and test coverage levels
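The slides illustrate complexity only through flowgraphs. As a rough, hypothetical C sketch (the code and names are not from the presentation), the McCabe metric can be counted by hand as decisions + 1, counting each if, loop, case label and short-circuit operator as one decision (the extended convention):

    int classify_sample(const int *buf, int len, int threshold)
    {
        int peaks = 0;

        if (buf == NULL || len <= 0)      /* decisions 1 and 2: if, || */
            return -1;

        for (int i = 0; i < len; i++)     /* decision 3: loop */
            if (buf[i] > threshold)       /* decision 4 */
                peaks++;

        switch (peaks) {
        case 0:  return 0;                /* decision 5: case label */
        case 1:  return 1;                /* decision 6: case label */
        default: return 2;
        }
    }   /* v(G) = 6 decisions + 1 = 7 */

Even this small function scores v(G) = 7; the flowgraphs on the following slides show how quickly real units reach 10-55.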

  9. Static Flowgraph (McCabe = 10)

  10. Static Flowgraph (McCabe = 20)

  11. McCabe = 46

  12. McCabe = 55

13. Visualisation Theme
“If it looks right, it could be wrong, but if it looks wrong, it can’t be right.”
(Uffa Fox, yacht designer)

14. Static Analysis in Unit Testing
Data Flow Analysis
• Undefined but referenced variables: must be removed from the code
• Variables defined but not used in scope: should be documented in the code
• Variables redefined with no use in between: should be documented if the two definitions are not close to each other
• Suspicious casting (loss of information, mismatches): if unavoidable, use explicit casts and document them
• Global variable anomalies (local overrides global): should be removed
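As an illustration, the deliberately broken C fragment below (hypothetical code, not from the presentation) contains one instance of each anomaly class above, roughly as a static analyser would flag them:

    int g_mode;                      /* global */

    int process(int raw)
    {
        int g_mode = 2;              /* local overrides global: should be removed */
        int unused_flag = 0;         /* defined but not used in scope: document or remove */
        int scaled;                  /* never given a value ...              */
        int result;

        result = raw * 10;           /* dead store: ...                      */
        result = raw * g_mode;       /* ... redefined with no use in between */

        return result + scaled;      /* ... yet referenced: must be fixed    */
    }

    short shrink(long big)
    {
        return big;                  /* suspicious implicit cast (loss of information);
                                        if unavoidable, write an explicit,
                                        documented (short)big */
    }

The fragment compiles, which is exactly why tool support matters here: none of these defects stops the build.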

15. Static Analysis in Unit Testing
Establish project standards for code
• no goto’s, breaks in all cases, limits on procedure size, etc.
Remove unused items
• unreachable code (incl. procedures)
• declared variables that are never used
Address design architecture problems
• use visual inspection of control flow graphs
• procedure parameter anomalies (referenced only, defined only, not used)
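A minimal sketch of such project standards (all names hypothetical): a break in every case, a handled default instead of a goto, and a single exit with no unreachable code after it:

    typedef enum { CMD_START, CMD_STOP, CMD_RESET } cmd_t;

    int start_measurement(void);     /* assumed to exist elsewhere */
    int stop_measurement(void);
    int reset_device(void);

    int dispatch(cmd_t cmd)
    {
        int rc;

        switch (cmd) {
        case CMD_START:
            rc = start_measurement();
            break;                   /* break in every case */
        case CMD_STOP:
            rc = stop_measurement();
            break;
        case CMD_RESET:
            rc = reset_device();
            break;
        default:
            rc = -1;                 /* handle the unexpected explicitly */
            break;
        }
        return rc;                   /* single exit; nothing after it */
    }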

16. Static Analysis Results
Data Flow Anomalies

Type of Bug                                   Distribution
Use of uninitialised variable                     15 %
Variable defined but not used in scope            24 %
Variable redefined with no use in between         36 %
Parameter mismatch                                 6 %
Unreferenced procedure                            18 %
Declared but not used variable                     0 %
Other types of static bugs                         0 %

17. Dynamic Analysis in Unit Testing
Test coverage strategy
• based on information from static analysis
• determined on a unit-by-unit basis
Unit test coverage must take into consideration
• perceived complexity, based on a visual inspection
• degree and type of reuse
• criticality (product and project issues)
• developer experience (domain, product, and unit)
• bug and change request history

18. Dynamic Analysis in Unit Testing
Dynamic analysis reports are reviewed / inspected
• after unit testing (and a static analysis)
• and after any modification to a unit
Test coverage achieved
• test coverage level (e.g. 85% of branches)
• visual inspection of dynamic flow graphs
• determine special actions
Static and dynamic analysis reports are saved in the configuration system along with the code

19. Problems when Introducing Static and Dynamic Analysis
Static analysis
• a very large number of anomalies to evaluate: more than 1 per 60 lines of code
Dynamic analysis
• best suited for automatic testing with documented test cases
• no automatic test case generation (it could only be structure-based)
• test cases are therefore selected by normal unit testing techniques: equivalence partitioning, boundary value analysis ...
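As a sketch of that selection step (the unit under test, clamp_gain, is hypothetical): the equivalence partitions lie below, inside, and above the valid range, and the boundary values also drive both outcomes of every branch:

    #include <assert.h>

    /* Hypothetical unit under test: clamps a gain setting to 0..100. */
    static int clamp_gain(int g)
    {
        if (g < 0)   return 0;
        if (g > 100) return 100;
        return g;
    }

    int main(void)
    {
        /* Boundary values -1, 0, 1 and 99, 100, 101 cover all three
         * partitions and force both outcomes of both branches. */
        assert(clamp_gain(-1)  == 0);
        assert(clamp_gain(0)   == 0);
        assert(clamp_gain(1)   == 1);
        assert(clamp_gain(99)  == 99);
        assert(clamp_gain(100) == 100);
        assert(clamp_gain(101) == 100);
        return 0;
    }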

20. Problems when Introducing Static and Dynamic Analysis
Tools break:
• code expands when instrumented (40%)
• a large amount of tracing data is generated (megabytes)
So:
• instrument only parts of the system
• handle frequently used parts separately
• run the regression test suite several times, with different parts instrumented
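The presentation does not say how its tools instrument the code; the hand-rolled C sketch below mimics the principle (one counter per branch outcome) and makes the growth plausible: every branch gains a probe, and every run appends trace data:

    #include <stdio.h>

    static unsigned long branch_hits[4];          /* 2 branches x 2 outcomes */
    #define PROBE(id) (branch_hits[id]++)

    static int sign(int x)                        /* the instrumented unit */
    {
        if (x < 0) { PROBE(0); return -1; } else { PROBE(1); }
        if (x > 0) { PROBE(2); return  1; } else { PROBE(3); }
        return 0;
    }

    int main(void)
    {
        sign(-5); sign(0); sign(7);

        int hit = 0;                              /* count outcomes exercised */
        for (int i = 0; i < 4; i++)
            if (branch_hits[i])
                hit++;
        printf("branch coverage: %d%%\n", hit * 100 / 4);   /* 100% here */
        return 0;
    }

Instrumenting only selected parts of the system, as the slide suggests, simply bounds how many probes and counters a single build has to carry.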

21. Conclusions on Static/Dynamic Analysis
Performance Improvement
• an efficient way to remove bugs
• marginal delay to the trial release date
• marginal increase in the testing resources required
• immediate payback on tools, training & implementation
• increased quality and reduced maintenance costs
• increased motivation
• applicable to the whole software development industry, incl. embedded software
