Verification and Validation • Verification: checks that the program conforms to its specification. • Are we building the product right? • Validation: checks that the program as implemented meets the expectations of the user. • Are we building the right product?
Static Verification • Program inspection • Formal methods
Verification and Proofs of Correctness • Formally specify the desired functionality, then verify that the program is a correct implementation of the specification.
Hoare's Rules • Program fragments and assertions are composed into triples {P} S {Q}, • where P is the precondition assertion, Q is the postcondition assertion, and S is a program statement (or sequence of statements). • Interpretation: if P is true before S is executed, then when S terminates, Q is satisfied.
Proofs of Correctness • Partial correctness: if the precondition is true, and the program terminates, then the postcondition is satisfied. • Total correctness: is partial correctness, plus a proof of termination.
The Assignment Rule {P} x := f {Q} • where P is the same as Q, except that all the occurrences of x in Q have been replaced by f. • Backward substitution • {x = 5} x := x + 1 {x = 6} • {z > y + 50} x := z - 43 {x > y + 7}
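The two example triples can be checked mechanically by brute force over a small state space. A minimal sketch (the helper `check_triple` and the state encoding are our own, not from the slides):

```python
# Brute-force check of a Hoare triple {P} S {Q}: for every state satisfying
# the precondition, run S and check the postcondition.

def check_triple(pre, stmt, post, states):
    """Return True if every state satisfying pre leads to a state satisfying post."""
    for base in states:
        s = dict(base)
        if pre(s):
            stmt(s)              # execute S, mutating the state
            if not post(s):
                return False
    return True

states = [{"x": v, "y": w, "z": u}
          for v in range(-5, 6)
          for w in range(-60, 61, 10)
          for u in range(-60, 61, 10)]

# {x = 5} x := x + 1 {x = 6}
ok1 = check_triple(lambda s: s["x"] == 5,
                   lambda s: s.__setitem__("x", s["x"] + 1),
                   lambda s: s["x"] == 6, states)

# {z > y + 50} x := z - 43 {x > y + 7}
ok2 = check_triple(lambda s: s["z"] > s["y"] + 50,
                   lambda s: s.__setitem__("x", s["z"] - 43),
                   lambda s: s["x"] > s["y"] + 7, states)

print(ok1, ok2)  # True True
```

A brute-force check over a finite domain is evidence, not proof; the assignment rule's backward substitution establishes the triple for all states.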
Rule for Sequencing Statements • From {F1} S1 {F2} and {F2} S2 {F3}, infer {F1} S1; S2 {F3}.
Rules for Conditions and Loops • Conditional: from {P & C} S1 {Q} and {P & ~C} S2 {Q}, infer {P} if C then S1 else S2 endif {Q}. • Loop: from {I & C} S {I}, infer {I} while C do S od {I & ~C}, where I is the loop invariant.
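The while-rule can be illustrated on a concrete loop by asserting the invariant at exactly the points the rule names. A sketch under our own choice of example (summing 0..n-1 with invariant I: total == i*(i-1)//2 and i <= n):

```python
# {I & C} S {I}  ==>  {I} while C do S od {I & ~C}, demonstrated with asserts.

def summation(n):
    i, total = 0, 0
    inv = lambda: total == i * (i - 1) // 2 and i <= n
    assert inv()                     # I holds on loop entry
    while i < n:                     # C
        assert inv() and i < n       # {I & C} before the body S
        total += i
        i += 1
        assert inv()                 # I re-established: {I & C} S {I}
    assert inv() and not (i < n)     # on exit: I & ~C, hence total == n*(n-1)//2
    return total

print(summation(10))  # 45
```

The exit assertion shows how I & ~C yields the desired postcondition: i <= n together with not (i < n) gives i == n.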
Software Testing • Software Requirements Specifications • Describes the expected runtime behaviors of the software. • A Test Plan • Describes how to test each behavior. • The software (source code or executable)
Testing • Failure: • The departure of program operation from user requirements. • Fault: • A defect in a program that may cause a failure. • Error: • Human action that results in software containing a fault.
The Test Plan • A "living document": it is born with the system and evolves as the system evolves. It is what the key decision makers use to evaluate the system. • User objectives. • System Description and Traceability Matrices. • Special Risk Elements. • Required Characteristics -- operational and technical. • Critical Test Issues -- operational and technical.
Management Plan • Integrated Schedule • Roles and Responsibilities • Resources and Sharing
Verification Outline • Verification to Date • Previous Results • Testing Planned • Unresolved Issues • Issues arising during this phase • Scope of Planned tests • Test Objectives • Special Resources • Test Articles
Validation Outline • Validation to Date • Previous Results • Testing Planned • Unresolved Issues • Issues arising during this phase • Scope of Planned tests • Test Objectives • Special Resources • Test Articles
Test Results and Traceability • Test Procedures • Test Reporting • Development Folders
Types of Faults • algorithmic faults • computation and precision faults • documentation faults • stress or overloaded faults • capacity or boundary faults • timing or coordination faults • throughput or performance faults
IBM Orthogonal Defect Classification • Function: fault that affects capability, end-user interfaces, product interface with hardware architecture, or global data structures. • Interface: fault in interfacing with other components or drivers via calls, macros, control blocks, or parameter lists. • Checking: fault in program logic that fails to validate data and values properly before they are used. • Assignment: fault in data structure or code block initialization. • Timing/serialization: fault that involves timing of shared and real-time resources. • Build/package/merge: fault that occurs because of problems in repositories, change management, or version control. • Documentation: fault that affects publications and maintenance notes. • Algorithm: fault involving efficiency or correctness of an algorithm or data structure, but not design.
The Testing Process • Unit testing • Component testing • Integration testing • System testing • Acceptance testing
Testing Strategies • Top-down testing • Bottom-up testing • Thread testing • Stress testing • Back-to-back testing
Traditional Software Testing Techniques • Black box testing • program specifications: functional testing • operational profile: random testing, partition testing • White box testing • statement coverage • branch coverage • data flow coverage • path coverage • Others • Stress testing • Back-to-back testing
Defect Testing • Black-box testing • Interface testing • Structural testing
Black-Box Testing • Graph-based testing methods • Equivalence Partitioning • Boundary value analysis
Graph-Based Testing • Transaction flow modeling • Finite state modeling • Data flow modeling • Timing modeling
Partition Testing • If an input condition specifies a range, one valid and two invalid equivalence classes are defined. • If an input condition requires a specific value, one valid and two invalid equivalence classes are defined. • If an input condition specifies a member of a set, one valid and one invalid class are defined. • If an input condition is boolean, one valid and one invalid class are defined.
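The first rule can be made concrete with a small sketch that picks one representative per equivalence class for a range condition (the sample range 1..100 and the program under test `accepts_age` are illustrative, not from the slides):

```python
# Equivalence partitioning for an input condition that specifies a range:
# one valid class and two invalid classes (below and above the range).

def range_partitions(lo, hi):
    """Return one representative input per equivalence class."""
    return {
        "valid (lo <= x <= hi)": (lo + hi) // 2,
        "invalid (x < lo)":      lo - 1,
        "invalid (x > hi)":      hi + 1,
    }

def accepts_age(x):   # hypothetical program under test: valid ages are 1..100
    return 1 <= x <= 100

for label, rep in range_partitions(1, 100).items():
    print(label, rep, accepts_age(rep))
```

One test per class suffices under the partition-testing assumption that the program treats all members of a class alike.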
Boundary Value Analysis • If an input condition specifies a range bounded by values a and b, test cases should be designed with values a and b, and with values just above and just below a and b. • If an input condition specifies a number of values, test cases should be developed that exercise the minimum and maximum numbers. Values just above and below the minimum and maximum are also tested. • Apply guidelines 1 and 2 to output conditions. • If internal program data structures have prescribed boundaries, be certain to design a test case to exercise the data structure at its boundary.
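For integer inputs, the first guideline yields six test values per range; a minimal sketch (function name is ours):

```python
# Boundary value analysis for a range [a, b]: the boundaries themselves,
# plus the values just below and just above each boundary (integers assumed).

def boundary_values(a, b):
    return sorted({a - 1, a, a + 1, b - 1, b, b + 1})

print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```

The set removes duplicates, so narrow ranges (e.g. a == b) automatically produce fewer tests.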
Interface Testing • Parameter interfaces • Shared memory interfaces • Procedural interfaces • Message passing interfaces
Integration Testing • top-down integration • bottom-up integration • incremental testing
White-Box Testing • Statement Coverage Criterion • Branch coverage criterion • Data flow coverage criterion • Path coverage criterion
Statement Coverage

while (…) {
  ….
  while (…) {
    …..
    break;
  }
  …..
  break;
}
Branch Coverage

if (StdRec != null)
  StdRec.name = arg[1];
……
write (StdRec.name)
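A Python rendering of this fragment (the surrounding function and class are assumed context) shows why branch coverage is stronger than statement coverage here: one test with a non-null record executes every statement, but branch coverage also forces the false branch, which exposes the failure:

```python
# Statement coverage is satisfied with rec != None alone; branch coverage
# additionally requires rec == None, which makes the final use of rec fail.

class StdRec:
    def __init__(self, name=""):
        self.name = name

def update_and_write(rec, args):
    if rec is not None:
        rec.name = args[1]
    # ... other statements ...
    return rec.name          # AttributeError when rec is None

print(update_and_write(StdRec(), ["prog", "Alice"]))  # Alice
# update_and_write(None, ["prog", "Alice"]) raises AttributeError
```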
Data Flow • Data-flow graph: defined on the control-flow graph by defining sets of variables DEF(n), C-USE(n), and P-USE(n) for each node n. • Variable x is in DEF(n) if • n is the start node and x is a global, parameter, or static local variable, • x is declared in basic block n with an initializer, or • x is assigned to in basic block n with the =, op=, ++, or -- operator.
Data Flow Testing

read (x, y);
if x > 0 z = 1; else z = 0;
if y < 0 write (z); else write (y / z);

• Number of test requirements, where n is the number of conditional statements: • Path: 2^n • Data flow: n^4 • Branch: cyclomatic complexity (McCabe), CC(G) = #E - #N + 2P
C-Use • Variable x is in C-USE(n) if x occurs as a C-USE expression in basic block n as • a procedure argument, • an initializer in a declaration, • a return value in a return statement, • the second operand of =, • either operand of op=, • the operand of ++, --, or * (dereference), • the first operand of . or ->
P-Use • Variable x is in P-USE(n) if x occurs as a P-USE expression in basic block n • as the conditional expression in an if, for, while, do, or switch statement, or • as the first operand of the conditional expression operator (?:), the logical and operator (&&), or the logical or operator (||).
C-Use Coverage • A C-Use is a variable x and the set of all paths in the data-flow graph from node na to nb such that • x is in DEF(na), and • x is not in DEF(ni) for any other node ni on the paths (definition clear path), and • x is in C-USE(nb). • A C-Use is covered by a set of tests if at least one of the paths in the C-Use is executed when the test is run.
P-Use Coverage • A P-Use is a variable x and the set of all paths in the data-flow graph from node na to nb such that • x is in DEF(na), and • x is not in DEF(ni) for any other node ni on the paths (definition clear path), and • x is in P-USE(nb). • A P-Use is covered by a set of tests if at least one of the paths in the P-Use is executed when the test is run.
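The definitions above can be sketched on the slide deck's read(x, y) example. The data-flow graph is encoded by hand; the node numbering is ours (1: read(x, y), 2: z = 1, 3: z = 0, 5: write(y / z)), and the check is simplified to a single candidate path rather than the full set of paths in a C-use:

```python
# Minimal definition-clear path check for C-use coverage.
DEF  = {1: {"x", "y"}, 2: {"z"}, 3: {"z"}}
CUSE = {5: {"y", "z"}}

def covers_c_use(path, var):
    """True if var is defined at path[0], c-used at path[-1], and the
    path is definition-clear for var in between."""
    return (var in DEF.get(path[0], set())
            and var in CUSE.get(path[-1], set())
            and all(var not in DEF.get(n, set()) for n in path[1:-1]))

print(covers_c_use((2, 5), "z"))     # True: def of z at node 2 reaches its use at 5
print(covers_c_use((1, 2, 5), "y"))  # True: def of y at node 1 is not killed at 2
print(covers_c_use((1, 2, 5), "z"))  # False: z is not defined at node 1
```

A test set achieves coverage of this C-use once at least one such definition-clear path is actually executed.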
Path Coverage

read (x, y);
if x > 0 z = 1; else z = 0;
if y < 0 write (z); else write (y / z);

• Conditions: x > 0 and y < 0 • Test cases: • x = 1, y = 1 → 1 • x = -1, y = -1 → 0 • x = 0, y = 0 → error (division by zero)
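A direct Python translation of the example makes the third path's failure observable (write is replaced by a return value for testability):

```python
# The two conditions yield four paths; the path with x <= 0 and y >= 0
# computes y / z with z == 0.

def example(x, y):
    z = 1 if x > 0 else 0
    if y < 0:
        return z
    return y / z      # ZeroDivisionError when x <= 0 and y >= 0

print(example(1, 1), example(-1, -1))  # 1.0 0
# example(0, 0) raises ZeroDivisionError
```

Branch coverage alone can be satisfied by the first two tests; only a test driving the (x <= 0, y >= 0) path combination reveals the fault.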
Subsumption Hierarchy of Coverage Criteria (→ means "subsumes") • all-paths → all-du-paths → all-uses • all-uses → all-c-uses/some-p-uses → all-c-uses • all-uses → all-p-uses/some-c-uses → all-p-uses → branch → statement • all-c-uses/some-p-uses and all-p-uses/some-c-uses each → all-defs
Complexity of White-Box Testing • Branch coverage: McCabe's cyclomatic complexity, CC(G) = #E - #V + 2P for G = (V, E) • All-defs: M + I * V • All-p-uses, all-c-uses/some-p-uses, all-p-uses/some-c-uses, all-uses: N^2 • All-du-paths, all-paths: 2^N, where N is the number of conditional statements
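Cyclomatic complexity is easy to compute once the control-flow graph is written down. A sketch for the two-condition example program, with our own node names (one connected component, P = 1):

```python
# CC(G) = #E - #V + 2P for a control-flow graph G = (V, E).

def cyclomatic(edges, nodes, components=1):
    return len(edges) - len(nodes) + 2 * components

# CFG of: if C1 then t1 else e1; if C2 then t2 else e2
nodes = ["entry", "if1", "t1", "e1", "if2", "t2", "e2", "exit"]
edges = [("entry", "if1"), ("if1", "t1"), ("if1", "e1"),
         ("t1", "if2"), ("e1", "if2"),
         ("if2", "t2"), ("if2", "e2"),
         ("t2", "exit"), ("e2", "exit")]

print(cyclomatic(edges, nodes))  # 9 - 8 + 2 = 3
```

The result, 3, matches the rule of thumb "number of decisions + 1" for two if-statements, and equals the number of tests needed for branch coverage via linearly independent paths.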
Mutation Testing • A program under test is seeded with a single error to produce a "mutant" program. • A test covers (kills) a mutant if the output of the mutant and the program under test differ for that test input. • The mutation coverage measure for a test set is the ratio of mutants covered to total mutants.
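A toy run of this idea, with one hand-written mutant (the program under test and the seeded error are our own example): the mutant replaces <= with <, and only a test input containing a duplicate maximum distinguishes the two programs.

```python
# One seeded error produces one mutant; coverage = killed / total mutants.

def max_index(xs):                  # program under test
    best = 0
    for i in range(1, len(xs)):
        if xs[best] <= xs[i]:       # returns the LAST maximum
            best = i
    return best

def max_index_mutant(xs):           # seeded error: <= mutated to <
    best = 0
    for i in range(1, len(xs)):
        if xs[best] < xs[i]:        # returns the FIRST maximum
            best = i
    return best

tests = [[3, 1, 2], [1, 2, 3], [2, 2, 1]]
mutants = [max_index_mutant]
killed = [m for m in mutants
          if any(max_index(t) != m(t) for t in tests)]
print(len(killed) / len(mutants))  # 1.0: the test [2, 2, 1] kills the mutant
```

Dropping [2, 2, 1] from the test set would leave the mutant alive, lowering the mutation coverage to 0 and signaling a weakness in the tests.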
Debugging: Program Slicing Approach • Static slicing: decomposes a program by statically analyzing the data flow and control flow of the program. • A static program slice for a given variable at a given statement contains all the executable statements that could influence the value of that variable at the given statement. • The exact execution path for a given input is a subset of the static program slice with respect to the output variables at the given checkpoint. • Focus is an automatic debugging tool based on static program slicing to locate bugs.
Dynamic Slicing • Dynamic data slice: with respect to a given expression, location, and test case, the set of all assignments whose computations have propagated into the current value of the given expression at the given location. • Dynamic control slice: with respect to a given location and test case, the set of all predicates that enclose the given location.
Object-Oriented Testing Process • Unit testing - class • Integration testing - cluster • System testing - program
Class Testing • Inheritance • Polymorphism • Sequence
Class Testing Strategies • Testing inheritance • Testing polymorphism • State-oriented testing • Data Flow testing • Function dependence class testing
Inheritance • A subclass may re-define its inherited functions, and other functions may be affected by the re-defined functions. When this subclass is tested, which functions need to be re-tested?

class foo {
  int local_var;
  ...
  int f1() { return 1; }
  int f2() { return 1 / f1(); }
};

class foo_child : public foo {  // child class of foo
  int f1() { return 0; }        // inherited f2 now divides by zero
};
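A runnable Python rendering of the same example (class names kept from the slide, capitalization adapted to Python convention) shows why the inherited f2 must be re-tested in the subclass even though its code is unchanged:

```python
# Re-defining f1 in the subclass silently breaks the *inherited* f2,
# because f2's call to f1 dynamically dispatches to the subclass's version.

class Foo:
    def f1(self):
        return 1
    def f2(self):
        return 1 // self.f1()   # dispatches to the receiver's f1

class FooChild(Foo):
    def f1(self):
        return 0

print(Foo().f2())       # 1
# FooChild().f2() raises ZeroDivisionError: f2 must be re-tested in FooChild
```

This is the motivation for the "recursive methods: limited testing" guideline on the next slide: inherited methods that interact with redefined ones need re-testing in the subclass context.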
Testing Inheritance • "Incremental testing of object-oriented class structures," Harrold et al. (1992) • New methods: complete testing • Recursive methods: limited testing • Redefined methods: reuse test scripts
Polymorphism • An object may be bound to different classes at run time. Is it necessary to test all the possible bindings?

// beginning of function foo
{
  ...
  P1 p;
  P2 c;
  ...
  return (c.f1() / p.f1());
}
// end of function foo
Testing Polymorphism • "Testing the Polymorphic Interactions between Classes," McDaniel and McGregor (1994), Clemson University
State-Oriented Testing • "The state-based testing of object-oriented programs," 1992, C. D. Turner and D. J. Robson • "On Object State Testing," 1993, Kung et al. • "The testgraph methodology: Automated testing of collection classes," 1995, Hoffman and Strooper • The FREE approach: Binder, http://www.rbsc.com/pages/Free.html