Verification and Validation Processes: Introduction
Adrian Marshall
Agenda
• Introduction
• Definitions
• V&V
• Test objectives
• Testing challenges
• Testing and the V model
• Testing approaches
• Testing levels
• Classes of test
• Risk-based testing
• Requirements-based testing
• Test methods
• Common testing types
• Testing tools
Intro
• Verification
  • IEEE 1012: the process of determining whether or not the products of a given phase of the software development lifecycle fulfil the requirements established during the previous phase
  • ISO 12207: confirmation by examination and provision of objective evidence that specified requirements have been fulfilled
Intro
• Validation
  • IEEE 1012: the process of evaluating software at the end of the software development process to ensure compliance with software requirements
  • ISO 12207: confirmation by examination and provision of objective evidence that particular requirements for a specific intended use have been fulfilled
Intro
• Put more simply:
  • We Verify that the output of each software phase meets its requirements, and
  • We Validate that the software, at the end of the development effort, meets the overall intended use
Types of V & V Activities
• Requirements analysis and traceability analysis
• Design analysis
• Interface analysis
• Implementation evaluation
  • static: reviews, inspections, structure analysis
  • dynamic: simulation, prototyping, execution-time analysis
  • formal: mathematical analysis of algorithms
• Testing
• Project & management analysis
Sources of Guidance on V & V
• V & V standards
  • IEEE 1012 - Software V & V Plans
  • IEEE 1059 - Guide for Software V & V Plans
  • IEEE 1028 - Software Reviews & Audits
  • IEEE 829 - Software Test Documentation
• Related standards
  • ISO 12207 - Software Lifecycle Processes
  • ISO 9126 - Software Quality Characteristics
• Text
  • V & V of Modern Software-Intensive Systems, Schulmeyer & Mackenzie, Prentice Hall, 2000
Pros and Cons of V & V
• Positive
  • early error detection
  • better product quality
  • better project planning
  • better adherence to standards, methods and practices
  • better decision-support information
  • cost of detection & prevention < cost of corrective action
• Negative
  • additional time and effort required for V&V activities
  • additional (visible) cost
  • independence of V&V can be hard for small organisations
Defect Introduction by Phase
What is known about the quality of software systems?
[Chart omitted] Source: Applied Software Measurement, 2nd Edition, by Capers Jones. McGraw-Hill, 1997. ISBN 0-07-032826-9
Cost of Removing Defects
What is known about the quality of software systems?
[Chart omitted] Source: Applied Software Measurement, 2nd Edition, by Capers Jones. McGraw-Hill, 1997. ISBN 0-07-032826-9
Testing Definitions (1)
• Testing is the process of executing a program with the intent of finding errors
• Testing is an activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results
• Testing is the process by which we understand the status of the benefits and the risks associated with release of a software system
• Testing includes all activities associated with the planning, preparation, execution, and reporting of tests
Testing Definitions (2)
• “Testing cannot guarantee that errors are not present, rather it demonstrates that errors are present”
Test Objectives
• Verifying the implementation of any or all products
• Requirements and solution validation
• Defect detection
• Provide assessment of deployment risks
• Provide performance & threshold benchmark data
• Establish testing processes, assets, data and skills for ongoing testing activities
Testing Challenges
• Complete testing is not possible
• Testing work is creative and difficult
• Testing is costly
• Testing is often not seen as a core activity
• Testers aim to find and report problems
Testing Challenges
• Technical personnel often do not want to become testers, leaving testing to non-technical system users
• Testing requires independence
• Testing is often a critical-path activity
• Testing is often trimmed to solve schedule or budget problems
Testing and the V Model
[Diagram omitted] The V model pairs each design activity (left side) with a testing activity (right side) at the same level, with build activities at the base:
• Business requirements level ↔ Acceptance level: determine business requirements; review requirements; analyse test requirements → test against business requirements; accept system
• System requirements level ↔ System level: determine system requirements; review solution; develop Master Test Plan → test against system requirements; install system
• Design level ↔ Integration level: design solution; develop Detailed Test Plan(s) → test integration of components; integrate components
• Component level: design component solution; buy/build components; test components
Testing Approaches
• Bottom-up
  • tests smallest components / sub-functions first
  • test drivers are required
• Top-down
  • tests major functional areas from the top down
  • stubs are used where lower levels are incomplete
• Functional thread
  • process / path-oriented approach which crosses unit boundaries
• Combined
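The driver/stub idea can be sketched in a few lines of Python. This is an illustrative example only; the function names and prices are invented, not taken from the slides.

```python
# Top-down sketch: the high-level function is tested before the real
# lower-level pricing service exists, using a stub in its place.

def price_lookup_stub(item_id):
    """Stub standing in for an unimplemented lower-level component."""
    return {"A1": 10.0, "B2": 25.0}.get(item_id, 0.0)

def order_total(item_ids, price_lookup):
    """High-level unit under test; depends on a lower-level lookup."""
    return sum(price_lookup(i) for i in item_ids)

# Top-down test: exercise order_total while the real lookup is incomplete.
assert order_total(["A1", "B2"], price_lookup_stub) == 35.0
```

In a bottom-up approach the roles reverse: the real low-level lookup is tested first, exercised by a small test driver, and the high-level function is integrated later.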
Testing Levels
• Unit / component testing: conducted to verify the implementation of the design for one software element (for example, unit, module, function, class instance, method) or a collection of software elements
• Integration testing: an orderly progression of testing in which software elements, hardware elements, or both are incrementally combined and tested until the entire system has been integrated
• System testing: the process of testing an integrated hardware and software system to verify that the system meets its specified requirements
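As a concrete illustration of the unit/component level, here is a minimal pytest-style unit test for a single function. The leap-year rule is just a convenient hypothetical example, not part of the slides.

```python
# Unit under test: one small function with a well-defined contract.
def leap_year(year):
    """Gregorian leap-year rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Unit tests: each verifies one aspect of the design for this one element.
def test_divisible_by_four_is_leap():
    assert leap_year(2024)

def test_century_is_not_leap():
    assert leap_year(1900) is False

def test_every_fourth_century_is_leap():
    assert leap_year(2000)
```

At the integration and system levels the same assertion style applies, but the subject grows from one element to combinations of elements and, finally, the whole system.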
Classes of Test
• White (or glass) box testing
  • designed with knowledge of how the system is constructed
  • aims to exercise the internal logical structure
  • statements, decisions, paths & exception handling evaluated
• Black box testing
  • designed without knowledge of how the system is constructed
  • verifies that functional & performance requirements have been satisfied
  • focuses on the external behaviour of the system
• Grey box testing
  • designed with some knowledge of how the system is constructed
White Box Testing
• White box testing techniques
  • Control-flow-based testing (e.g. decision & statement coverage testing)
    • Statement coverage: each statement is executed at least once
    • Decision coverage: each conditional statement is executed at least once each way
  • Complexity-based testing (e.g. the McCabe cyclomatic complexity measure): higher concentration of tests for more complex software
  • Boundary case and exception handling
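A small hypothetical example of the difference between statement and decision coverage (the classifier and its thresholds are invented for illustration):

```python
def classify(temp):
    if temp < 0:        # decision 1
        return "freezing"
    if temp > 30:       # decision 2
        return "hot"
    return "moderate"

# The inputs -5, 35 and 20 together execute every statement (statement
# coverage). They also satisfy decision coverage, because each decision
# evaluates both True and False:
#   -5 -> decision 1 True
#   35 -> decision 1 False, decision 2 True
#   20 -> decision 1 False, decision 2 False
assert [classify(t) for t in (-5, 35, 20)] == ["freezing", "hot", "moderate"]
```

Note that decision coverage is the stronger criterion: a test set can execute every statement without ever driving a given decision down its False branch.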
Black Box Testing
• Black box testing techniques
  • Equivalence partitioning
  • Boundary value analysis
  • Decision tables
  • Testing from formal specifications
  • Error guessing
  • Exploratory testing
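Equivalence partitioning and boundary value analysis can be shown together on a tiny invented specification: a validator that accepts ages 18 to 65 inclusive. The function and the range are assumptions for illustration only.

```python
# Spec (assumed): valid ages are 18..65 inclusive.
def eligible(age):
    return 18 <= age <= 65

# Equivalence partitioning: one representative per partition
# (below the range / inside the range / above the range).
partitions = {17: False, 40: True, 66: False}

# Boundary value analysis: values at and just inside each boundary.
boundaries = {18: True, 19: True, 64: True, 65: True}

for age, expected in {**partitions, **boundaries}.items():
    assert eligible(age) is expected
```

Both techniques are black box: the test values are derived purely from the specification, with no knowledge of how `eligible` is implemented.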
Risk-Based Testing
• Test areas of risk with more rigour (greater coverage of functionality and/or code)
• Product risks may include:
  • performance (capacity, throughput, accuracy, etc.)
  • safety
  • security (authentication…)
  • complexity
• Test areas of higher risk first
• Focus on consequences and likelihood
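One simple way to operationalise "focus on consequences and likelihood" is a risk-exposure score. The feature areas and the 1-5 ratings below are invented for illustration; real projects would rate risks in a workshop with stakeholders.

```python
# Risk exposure = likelihood x consequence (1-5 scales assumed here);
# the highest-exposure areas are tested first.
features = [
    # (area, likelihood, consequence)
    ("login/security", 4, 5),
    ("report layout",  2, 1),
    ("payments",       3, 5),
]

test_order = sorted(features, key=lambda f: f[1] * f[2], reverse=True)
assert [area for area, _, _ in test_order] == \
    ["login/security", "payments", "report layout"]
```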
Requirements-Based Testing
• Systematic requirements-based testing ensures the complete testing scope is analysed
• Focus areas (examples):
  • Functionality: Security, Accuracy, Regulatory Compliance, Technical Compliance
  • Reliability: Data Integrity, Error Handling, Fault Tolerance, Recoverability
  • Usability: User Friendliness, User Guidance, Adaptability, Clarity of Control, Error Handling, Conciseness, Ease of Learning, Documentation Quality, Ease of Installation
  • Performance: Throughput, Acceptable Response Time, Data Storage Requirements, Acceptable Memory Capacity, Acceptable Processing Speed
  • Portability: Portability to Different Hardware Platforms, Compatibility with Different Operating Systems, Conformance, Replaceability, Languages Supported
Test Methods
Inspections and Reviews
• Inspections and reviews require visual examination
• They can be conducted in the early definition phases and hence provide efficient defect rectification
• They can be influenced by the ability of the inspector/reviewer (use checklists to standardise)
Common Testing Types (1)
• Acceptance testing / User Acceptance Testing (UAT): testing a system’s behaviour against the customer’s requirements
• Alpha & beta testing: testing by a representative sample of users (internal = alpha, external = beta)
• Installation testing: testing a system after installation in the target environment
• Performance testing: testing against specified performance requirements (e.g. response time)
• Reliability testing: testing of stability, endurance, robustness, and recoverability
Common Testing Types (2)
• Regression testing: re-running previously executed tests to evaluate the impact that a software change may have on unaltered software components
• Security testing: testing a system’s ability to prevent unauthorised use or misuse, and to authenticate users
• Compatibility / interoperability testing: testing the ability of software to operate and coexist with other (application and system) software and hardware
• Stress testing: exercising a system at the maximum design load and beyond
• Usability testing: testing a system’s user-friendliness, ease of learning, and ease of use
Testing Tools
• Test management tools
  • information repositories
  • document generators
  • defect management tools
  • requirements traceability and test coverage tools
• Test execution tools
  • compilers, debuggers, link loaders
  • source code analysers (coverage & complexity)
  • GUI testers
  • functional record-and-replay tools / robots
  • performance / load and stress testing tools
  • security vulnerability analysis tools
Requirements
• Requirements definition through design
• Software specifications
• Requirement/specification reviews
Software Specification
[Chart omitted] Source: “Twelve Requirements Basics for Project Success”, Dr. Ralph R. Young, Northrop Grumman Information Technology Defense Group
Criteria for Good Requirements
[Chart omitted] Source: “Twelve Requirements Basics for Project Success”, Dr. Ralph R. Young, Northrop Grumman Information Technology Defense Group
Review for Testability
• Review criteria:
  • Concise
  • Complete
  • Unambiguous
  • Consistent
  • Verifiable
  • Traceable
Review
• V & V can provide continuous information about the quality of the system and the development effort
  • cost of detection & prevention < cost of corrective action
• Testing is a process by which we understand the status of the benefits and the risks associated with release of a software system
• There are many testing techniques available for developers and testers
• Risk-based testing is used to focus scarce testing resources
• Systematic requirements-based testing ensures the complete testing scope is analysed
• Automated testing tools may be used to assist test management and execution