Black-box conformance testing for real-time systems. Stavros Tripakis, VERIMAG. Joint work with Moez Krichen.
Black-box conformance testing. Does the SUT conform to the Specification? [Diagram: the Tester feeds inputs to the SUT (system under test), a black box, observes its outputs, and issues verdicts (pass/fail/?).]
Model-based testing • The specification is given as a formal model. • The SUT also behaves according to an unknown model (black-box). • Conformance of SUT to the specification is formally defined w.r.t. these models.
Real-time Testing • Our models of preference: Theory: timed automata; Practice: the IF language (www-verimag.imag.fr/~async/IF/). • The tester observes events and their time-stamps. [Diagram: the Tester exchanges inputs/outputs with the SUT and emits verdicts (pass/fail).]
Plan of talk • Specification model • Conformance relation • Analog & digital tests • Test generation • Tool and case studies
Specification model: general timed automata with input/output/unobservable actions • Timed automata = finite-state machines + clocks. • Input/output actions: interface with the environment and the tester. • Unobservable actions: • Model the partial observability available to the tester. • Good for compositional specifications.
Simple example 1: "Output b at most 4 time units after receiving input a." [Automaton: a? with reset x := 0, then b! with guard x ≤ 4.]
Compositional specifications • Compositional specifications with internal (unobservable) actions. [Diagram: components A, B, C composed in parallel, communicating via internal actions.]
Modeling assumptions on the environment • Compose the specification with a model of the environment. • Export the interactions between them (make them observable). [Diagram: the system (spec) composed with an environment model.]
Simple example 2: "Output b at most 4 time units after receiving input a, provided a is received no later than 10 time units." [Automaton: a? with guard x ≤ 10 and reset x := 0, then b! with guard x ≤ 4 and reset x := 0.] Constraints on the inputs model assumptions. Constraints on the outputs model requirements.
Simple example 2, a compositional modeling of the same example: the specification of example 1 (a? with reset x := 0, then b! with guard x ≤ 4), composed with an environment automaton that emits a! with guard y ≤ 10 and reset y := 0.
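As a rough illustration of how such a specification might be written down in a tool-independent way, here is a minimal Python sketch (the names and the encoding are hypothetical; the actual tool works on models in the IF language):

```python
# Hypothetical encoding of the single-automaton spec of example 2.
# A transition is (source, action, guard over clock x, reset x?, target);
# actions ending in "?" are inputs, those ending in "!" are outputs.
spec_example_2 = {
    "initial": "s0",
    "clocks": ["x"],
    "transitions": [
        ("s0", "a?", lambda x: x <= 10, True, "s1"),  # accept a within 10 time units, reset x
        ("s1", "b!", lambda x: x <= 4, True, "s0"),   # emit b at most 4 time units later
    ],
}
```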
Plan of talk • Specification model • Conformance relation • Analog & digital tests • Test generation • Tool and case studies
Conformance relation: tioco • A timed extension of Tretmans' ioco (input-output conformance relation). • Informally, A tioco B if every output of the implementation, including time delays, is allowed by the specification. • A: implementation/SUT (input-complete). • B: specification (not necessarily input-complete, to model environment assumptions).
Conformance relation • Formally: A tioco B (A: implementation, B: specification) iff ∀σ ∈ Traces(B): out(A after σ) ⊆ out(B after σ).
Conformance relation • where: A after σ = { s | ∃ρ ∈ Seq: s0 --ρ--> s and proj(ρ, Obs) = σ }, and out(S) = delays(S) ∪ outputs(S).
Conformance relation • where: delays(S) = { t ∈ R | ∃s ∈ S, ∃ρ ∈ UnobsSeq: time(ρ) = t and s --ρ--> }, and outputs(S) = { a ∈ Outputs | ∃s ∈ S: s --a--> }.
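Putting the last three slides together, the definition of tioco reads as follows (a restatement in one place of the formulas above, no new material):

```latex
\begin{align*}
A \ \mathrm{tioco}\ B \;&\iff\; \forall \sigma \in \mathrm{Traces}(B):\;
   \mathrm{out}(A \ \mathrm{after}\ \sigma) \subseteq \mathrm{out}(B \ \mathrm{after}\ \sigma) \\
A \ \mathrm{after}\ \sigma \;&=\; \{\, s \mid \exists \rho \in \mathrm{Seq}:\
   s_0 \xrightarrow{\rho} s \ \wedge\ \mathrm{proj}(\rho,\mathrm{Obs}) = \sigma \,\} \\
\mathrm{out}(S) \;&=\; \mathrm{delays}(S) \cup \mathrm{outputs}(S) \\
\mathrm{delays}(S) \;&=\; \{\, t \in \mathbb{R} \mid \exists s \in S,\ \exists \rho \in \mathrm{UnobsSeq}:\
   \mathrm{time}(\rho) = t \ \wedge\ s \xrightarrow{\rho} \,\} \\
\mathrm{outputs}(S) \;&=\; \{\, a \in \mathrm{Outputs} \mid \exists s \in S:\ s \xrightarrow{a} \,\}
\end{align*}
```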
Examples. Spec: "Output b at most 4 time units after receiving input a." [Automaton: a? with reset x := 0, then b! with guard x ≤ 4.] Impl 1: emits b exactly when x = 4. OK! Impl 2: emits b with x ≤ 2. OK!
Examples (continued). Same spec. Impl 3: emits b only when x = 5. NOT OK! Impl 4: accepts a? but never emits b. NOT OK!
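A toy sketch of why Impls 1 to 3 get these verdicts (a hypothetical check that only captures the timing of b after a; Impl 4 fails for a different reason, namely that, per the slides, the spec does not allow delaying past 4 time units without emitting b):

```python
# Spec: "output b at most 4 time units after receiving input a".
def spec_allows_b_at(delay_after_a: float) -> bool:
    return delay_after_a <= 4.0

print(spec_allows_b_at(4.0))  # Impl 1 emits b at x = 4  -> True  (OK)
print(spec_allows_b_at(2.0))  # Impl 2 emits b by x = 2  -> True  (OK)
print(spec_allows_b_at(5.0))  # Impl 3 emits b at x = 5  -> False (NOT OK)
```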
Plan of talk • Specification model • Conformance relation • Analog & digital tests • Test generation • Tool and case studies
Timed tests • Two types of tests: • Analog-clock tests: • Can measure real-time precisely • Difficult to implement for real-time SUTs • Good (flexible) for discrete-time SUTs with unknown time step • Digital-clock tests: • Can count “ticks” of a periodic clock/counter • Implementable for any SUT • Conservative (may say PASS when it’s FAIL)
Timed tests • Analog-clock tests: they can observe real-time precisely, e.g., a at time 1.3, b at 2.4, c at 2.7. • Digital-clock (or periodic-sampling) tests: they only have access to a periodic clock, e.g., the same events are observed only relative to clock ticks 1, 2, 3.
Note • Digital-clock tests do not mean that we discretize time: • The specification is still dense-time. • Only the observational capabilities of the tester are discrete-time. • Many dense-time traces will look the same to the digital observer (verdict approximation).
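A minimal sketch of this approximation, assuming a period-1 tick and a hypothetical `digitize` helper: different dense-time traces can collapse to the same digital observation.

```python
def digitize(trace, period=1.0):
    """Map (event, timestamp) pairs to (event, number of ticks seen so far)."""
    return [(event, int(t // period)) for event, t in trace]

# Two distinct dense-time traces that the digital observer cannot tell apart:
print(digitize([("a", 1.3), ("b", 2.4), ("c", 2.7)]))   # [('a', 1), ('b', 2), ('c', 2)]
print(digitize([("a", 1.9), ("b", 2.1), ("c", 2.95)]))  # [('a', 1), ('b', 2), ('c', 2)]
```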
Plan of talk • Specification model • Conformance relation • Analog & digital tests • Test generation • Tool and case studies
Untimed tests • Can be represented as finite trees ("strategies"). [Tree: after input i, each possible output o1…o4 either leads to fail or to a further input i1…i3, eventually ending in pass/fail verdicts.]
Digital-clock tests • Can be represented as finite trees, where tick models the tick of the tester's clock. [Tree: as for untimed tests, but every node has an additional tick branch alongside the outputs o1…o4.]
Analog-clock tests • Cannot be represented as finite trees: besides the outputs o1…o4, the tester would have to branch on an infinite number of unknown delays (0.1, 0.11, 0.2, …). • Solution: on-the-fly testing.
On-the-fly testing • Generate the testing strategy during test execution. • Symbolic generation. • Can be applied to digital-clock testing as well.
Test generation principle • Current estimate = set of possible states of the specification. • On each observation (event or delay), the next estimate is the set of states reached by runs matching the observation. • If the estimate becomes empty, FAIL.
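A minimal sketch of this loop in Python, assuming hypothetical helpers: `successors(states, obs)` computes successor states of the specification, while `observe()` and `choose_input()` stand for the test adapter; none of these names come from the tool.

```python
def on_the_fly_test(initial_states, successors, observe, choose_input, budget=100):
    estimate = set(initial_states)            # current estimate: possible spec states
    for _ in range(budget):
        obs = observe()                       # observation: event or delay from the SUT
        estimate = successors(estimate, obs)  # keep only runs matching the observation
        if not estimate:
            return "FAIL"                     # no spec run explains what was observed
        stimulus = choose_input(estimate)     # optionally send an input to the SUT
        if stimulus is not None:
            estimate = successors(estimate, stimulus)
    return "PASS"                             # no non-conformance found within the budget
```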
Test generation algorithmics • Sets of states are represented symbolically (standard timed automata technology, DBMs, etc.) • Updates amount to performing some type of symbolic reachability. • Implemented in verification tools, e.g., Kronos. • IF has more: dynamic clock creation/deletion, activity analysis, parametric DBMs, etc.
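For concreteness, a minimal sketch of the DBM idea mentioned above (a generic textbook version, not the IF implementation): a zone over clocks is a matrix of difference bounds, and tightening it is an all-pairs shortest-path pass.

```python
def canonicalize(D):
    """Tighten a DBM in place (Floyd-Warshall); D[i][j] bounds clock_i - clock_j,
    with index 0 standing for the constant 0."""
    n = len(D)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                D[i][j] = min(D[i][j], D[i][k] + D[k][j])
    return D

# Zone 0 <= x <= 4 over (0, x): D[1][0] bounds x - 0, D[0][1] bounds 0 - x.
zone = [[0, 0],
        [4, 0]]
print(canonicalize(zone))  # already canonical: [[0, 0], [4, 0]]
```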
Digital-clock test generation • Can be on-the-fly or static. • Same algorithms. • Trick: compose the original specification automaton with a "Tick" automaton (tick! when z = 1, reset z := 0) to obtain a new specification automaton in which tick is observable, then generate an "untimed" tester. • Can also model clock skew, etc., using other "Tick" automata.
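A sketch of the "Tick" automaton in the same hypothetical encoding used earlier (one location, one clock z, an observable tick! every time unit; in the tool this composition happens at the IF level):

```python
tick_automaton = {
    "initial": "t0",
    "clocks": ["z"],
    "invariants": {"t0": lambda z: z <= 1},  # time may not pass beyond z = 1
    "transitions": [
        ("t0", "tick!", lambda z: z == 1, True, "t0"),  # emit tick at z = 1, reset z
    ],
}
```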
Recent advances • Representing analog-clock tests as timed automata. • Coverage criteria.
Timed automata testers • On-the-fly testing needs to be fast: • The tester interacts with the SUT in real time. • BUT: reachability can be costly. • Can we generate a timed automaton tester? • Problem undecidable in general: • Non-determinizability of timed automata. • Pragmatic approach: • Fix the number of clocks of the tester. • Fix their reset positions. • Synthesize the rest: locations, guards, etc.
Timed automata testers • Example: Spec: a? with reset x := 0, then b! with guard 1 ≤ x ≤ 4. Tester: emit a! and reset x; on b? with 1 ≤ x ≤ 4, PASS; on b? with x < 1 or x > 4, FAIL.
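At run time, a tester of this shape boils down to something like the following sketch (hypothetical adapter functions `send(action)` and `wait_for_output(timeout)`, the latter returning the observed event and the elapsed time since the input was sent):

```python
def run_test(send, wait_for_output):
    send("a")                                # a!, conceptually resetting clock x
    event, x = wait_for_output(timeout=5.0)  # wait for b (or a timeout)
    if event == "b" and 1 <= x <= 4:
        return "PASS"
    return "FAIL"                            # b too early, too late, or missing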
Coverage • A single test is not enough. • Exhaustive test suite up to given depth: • Explosion: # of tests grows exponentially! • Coverage: few tests, some guarantees. • Various criteria: • Location: cover locations of specification. • Edge: cover edges of specification. • State: cover states (location,clocks) of spec. • Algorithms: • Based on symbolic reachability graph. • Performance can be impressive: 8 instead of 15000 tests.
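One plausible way to obtain the "8 instead of 15000 tests" effect is a greedy selection over the symbolic reachability graph; the sketch below assumes a hypothetical `covers(test)` that returns the set of specification locations (or edges, or symbolic states) a test visits.

```python
def select_covering_tests(all_tests, covers, targets):
    """Greedily pick tests until every coverage target is hit (or unreachable)."""
    selected, remaining = [], set(targets)
    while remaining:
        best = max(all_tests, key=lambda t: len(covers(t) & remaining))
        gained = covers(best) & remaining
        if not gained:
            break          # remaining targets cannot be covered by the given tests
        selected.append(best)
        remaining -= gained
    return selected
```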
Plan of talk • Specification model • Conformance relation • Analog & digital tests • Test generation • Tool and case studies
Implementation • TTG: Timed Test Generation. • Implemented on top of the IF environment.
Tool • Input language: IF timed automata • Dynamic creation of processes/channels. • Synchronous/asynchronous communication. • Priorities, variables, buffers, external data structures, etc. • Tool options: • Generate analog tester (or monitor). • Generate digital test/monitor suite: • Interactively (user guided). • Exhaustive up to given length. • Coverage (current work).
Real-time Monitoring/Testing [Diagram: a Monitor passively observes the SUT's outputs and emits verdicts (pass/fail); a Tester additionally feeds inputs to the SUT, observes its outputs, and emits verdicts (pass/fail).]
A sample test generated by TTG
Case studies • A number of simple examples tried out. • A simple light controller: • 15000 digital tests up to depth 8. • 8 tests suffice to cover the specification. • A larger example: the NASA K9 Rover executive. • SUT: 30000 lines of C++ code. • TA specification generated automatically from mission plans. • Monitors generated automatically from the TA specs. • Traces generated by NASA and tested by us.