Quasimodo: Model-Based Testing and Test-Based Modelling
Jan Tretmans, Embedded Systems Institute, Eindhoven, NL, and Radboud University, Nijmegen, NL

Overview
• Model-Based Testing
• Model-Based Testing with Labelled Transition Systems
• Model-Based Testing: A Wireless Sensor Network Node
• Test-Based Modelling
(Software) Testing
Checking or measuring some quality characteristics of an executing object, by performing experiments in a controlled way, w.r.t. a specification.
(diagram: a tester exercises the SUT, the System Under Test, against a specification)
Sorts of Testing
• phases: unit, module, integration, system
• accessibility: white box, black box
• aspects: functionality, reliability, usability, efficiency, maintainability, portability
Paradox of Software Testing
Testing is:
• important
• much practiced: 30-50% of project effort
• expensive and time-critical
• not constructive (but sadistic?)
But also:
• ad hoc, manual, error-prone
• hardly any theory / research
• little attention in curricula
• not cool: "if you're a bad programmer you might be a tester"
The attitude is changing: more awareness, more professionalism.
Testing Challenges: Trends in Software Development
• Increasing complexity: more functions, more interactions, more options and parameters
• Increasing size: building new systems from scratch is no longer possible; integration of legacy, outsourced, and off-the-shelf components
• Blurring boundaries between systems: more, and more complex, interactions between systems; systems dynamically depend on other systems (systems of systems)
• Blurring boundaries in time: requirements analysis, specification, implementation, testing, installation, and maintenance overlap; more different versions and configurations
What is a failure?
Formal Models
(figure: a coffee-machine labelled transition system with actions ?coin, ?button, !coffee, !alarm; illustration by Klaas Smit)
Developments in Testing 1
• Manual testing against the SUT (System Under Test); verdict: pass / fail
Developments in Testing 2
• Manual testing
• Scripted testing: test cases (e.g. in TTCN) are executed against the SUT; verdict: pass / fail
Developments in Testing 3
• Manual testing
• Scripted testing
• High-level scripted testing: a high-level test notation, with test execution against the SUT; verdict: pass / fail
Developments in Testing 4
• Manual testing
• Scripted testing
• High-level scripted testing
• Model-based testing: test cases (e.g. in TTCN) are generated from a system model and executed against the SUT; verdict: pass / fail
Model-Based . . . . .Verification, Validation, Testing, . . . . .
Validation, Verification, and Testing
(diagram: ideas and wishes are related to properties and models by validation; properties and abstract models, in the world of math, are related by verification; models are related to the SUT, the concrete realization, by testing)
Verification and Testing
• Model-based verification: formal manipulation; proves properties; performed on the model (formal world)
• Model-based testing: experimentation; shows errors; performed on the concrete system (concrete world)
Verification is only as good as the validity of the model on which it is based.
Testing can only show the presence of errors, not their absence.
Code Generation from a Model
A model is more (and less) than code generation:
• views
• abstraction
• testing of aspects
• verification and validation of aspects
Model-Based Testing
Model-based test generation: from a system model to test cases (e.g. in TTCN); test execution against the SUT; verdict: pass / fail.
MBT with Labelled Transition Systems
• the model is an LTS; ioco test generation yields a set of LTS tests
• the SUT behaves as an input-enabled LTS; LTS test execution yields pass / fail
• correctness criterion: input/output conformance, ioco
Models: Labelled Transition Systems
A Labelled Transition System is a tuple ⟨S, LI, LU, T, s0⟩:
• S: states, with initial state s0 ∈ S
• LI: input actions (prefixed with ?)
• LU: output actions (prefixed with !)
• T: transitions
(example: the coffee machine with ?coin, ?button, !coffee, !alarm)
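As a minimal executable sketch (not part of the slides), such an LTS can be written down as a plain transition table. The state names and the exact transition structure of the coffee machine are assumptions for illustration:

```python
# Hypothetical encoding of the coffee-machine LTS <S, LI, LU, T, s0>:
# states are strings, each state maps action labels to successor states,
# inputs are prefixed with '?' and outputs with '!'.

COFFEE_MACHINE = {
    "s0": {"?coin": "s1"},                    # wait for a coin
    "s1": {"?button": "s2"},                  # wait for the button
    "s2": {"!coffee": "s0", "!alarm": "s0"},  # produce coffee, or alarm
}
S0 = "s0"  # initial state

def inputs(lts):
    """The input alphabet LI: all labels starting with '?'."""
    return {a for trans in lts.values() for a in trans if a.startswith("?")}

def outputs(lts):
    """The output alphabet LU: all labels starting with '!'."""
    return {a for trans in lts.values() for a in trans if a.startswith("!")}
```

In this sketch each state has at most one successor per label; a nondeterministic LTS would map each label to a set of successors instead.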
Models: Generation of Test Cases
(figures: from the specification model, test cases are generated as trees; note the inversion of directions: the test supplies the SUT's inputs as outputs, e.g. !coin and !button, and observes the SUT's outputs, e.g. ?coffee and ?alarm, ending in verdicts pass or fail)
Conformance: ioco
i ioco s ⇔def ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
where
• quiescence: p is quiescent, written δ(p), iff ∀ !x ∈ LU ∪ {τ}: p −!x↛ (p can produce no output)
• Straces(s) = { σ ∈ (L ∪ {δ})* | s =σ⇒ }
• p after σ = { p′ | p =σ⇒ p′ }
• out(P) = { !x ∈ LU | p −!x→, p ∈ P } ∪ { δ | δ(p), p ∈ P }
Conformance: ioco
i ioco s ⇔def ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
Intuition: i ioco-conforms to s, iff
• if i produces output x after trace σ, then s can produce x after σ
• if i cannot produce any output after trace σ, then s cannot produce any output after σ (quiescence δ)
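The definitions above can be turned into a small executable sketch. The encoding (deterministic transition tables, '?'/'!' prefixes, a bounded trace depth instead of all of Straces(s)) is an assumption for illustration, not the slides' formalism:

```python
from itertools import product

DELTA = "δ"  # quiescence: the absence of outputs, observable as δ

def after(lts, states, sigma):
    """The state set  states after σ  for a suspension trace σ."""
    current = set(states)
    for act in sigma:
        nxt = set()
        for p in current:
            if act == DELTA:
                # a state is quiescent iff it enables no output
                if not any(a.startswith("!") for a in lts[p]):
                    nxt.add(p)
            elif act in lts[p]:
                nxt.add(lts[p][act])
        current = nxt
    return current

def out(lts, states):
    """out(P): outputs enabled in P, plus δ for quiescent states."""
    res = set()
    for p in states:
        outs = {a for a in lts[p] if a.startswith("!")}
        res |= outs if outs else {DELTA}
    return res

def ioco(imp, i0, spec, s0, depth=6):
    """Bounded check of  imp ioco spec : for every suspension trace σ of
    spec up to `depth`, out(imp after σ) ⊆ out(spec after σ)."""
    labels = sorted({a for t in spec.values() for a in t} | {DELTA})
    for n in range(depth + 1):
        for sigma in product(labels, repeat=n):
            s_states = after(spec, {s0}, sigma)
            if not s_states:          # σ is not a suspension trace of spec
                continue
            i_states = after(imp, {i0}, sigma)
            if i_states and not out(imp, i_states) <= out(spec, s_states):
                return False
    return True

# The coffee machine conforms to itself; a mutant that can also produce
# !tea after ?coin·?button does not (extra, forbidden output).
SPEC = {"s0": {"?coin": "s1"},
        "s1": {"?button": "s2"},
        "s2": {"!coffee": "s0", "!alarm": "s0"}}
IMP_BAD = {"s0": {"?coin": "s1"},
           "s1": {"?button": "s2"},
           "s2": {"!coffee": "s0", "!alarm": "s0", "!tea": "s0"}}
```

Here ioco(SPEC, "s0", SPEC, "s0") holds, while ioco(IMP_BAD, "s0", SPEC, "s0") fails on the trace ?coin·?button, where the mutant's out-set {!coffee, !alarm, !tea} is not a subset of the specification's {!coffee, !alarm}.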
Example: ioco
(figure: four candidate implementations of a coffee machine accepting ?dime and ?quart and producing !coffee, !tea, or !choc; some are ioco-correct w.r.t. the specification model and some are not)
Example: ioco
(figure: a specification model of a square-root component: on input ?x with x ≥ 0 it outputs !y with |y·y − x| < ε; SUT models differ in how they handle ?x with x < 0, e.g. ignoring it, outputting !error, or outputting !(−x))
LTS and ioco allow:
• non-determinism
• under-specification
• the specification of properties rather than construction
Example: ioco and quiescence
i ioco s ⇔def ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
(figure: two coin machines s and i serving tea and coffee after ?dub)
• out(i after ?dub·?dub) = out(s after ?dub·?dub) = { !tea, !coffee }
• out(i after ?dub·δ·?dub) = { !coffee } ⊆ { !tea, !coffee } = out(s after ?dub·δ·?dub)
So i ioco s, but s ioco̸ i: observing quiescence distinguishes the two machines.
Test Case
A test case is itself a labelled transition system:
• with a special 'quiescence' label for observing δ
• tree-structured, finite, deterministic
• with final states pass and fail
• from each state other than pass, fail: either one input !a, or all outputs ?x plus the quiescence observation
(figure: a test that supplies the coins !dub and !kwart and observes ?coffee and ?tea, ending in pass or fail)
Test Generation Algorithm: ioco
Algorithm to generate a test case t(S) from a transition-system state set S, with S ≠ ∅ (initially S = s0 after ε). Apply the following steps recursively, non-deterministically:
1. end the test case: verdict pass
2. supply an input !a and continue with t(S after ?a); meanwhile, allowed outputs ?x (with !x ∈ out(S)) lead to t(S after !x), forbidden outputs ?y (with !y ∉ out(S)) lead to fail
3. observe all outputs: allowed outputs ?x (with !x ∈ out(S)) lead to t(S after !x); forbidden outputs ?y (with !y ∉ out(S)) lead to fail
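The three recursive choices can be sketched as an on-the-fly testing loop, in the style of tools like TorX (an assumption for illustration, not the slides' literal algorithm): at each step the tester randomly either supplies an input or observes an output and checks it against out(S); running out of steps corresponds to choice 1, ending with pass.

```python
import random

DELTA = "δ"  # quiescence

def after1(lts, states, act):
    """One-step successor set; δ keeps exactly the quiescent states."""
    nxt = set()
    for p in states:
        if act == DELTA:
            if not any(a.startswith("!") for a in lts[p]):
                nxt.add(p)
        elif act in lts[p]:
            nxt.add(lts[p][act])
    return nxt

def out(lts, states):
    """Outputs enabled in the state set, plus δ for quiescent states."""
    res = set()
    for p in states:
        outs = {a for a in lts[p] if a.startswith("!")}
        res |= outs if outs else {DELTA}
    return res

def run_test(spec, s0, imp, i0, steps, rng):
    """One on-the-fly ioco test run of `steps` steps against a model of
    the SUT; returns 'pass' or 'fail'."""
    S, I = {s0}, {i0}
    for _ in range(steps):
        ins = sorted({a for p in S for a in spec[p] if a.startswith("?")})
        if ins and rng.random() < 0.5:
            a = rng.choice(ins)                 # choice 2: supply an input
            if after1(imp, I, a):               # ...if the SUT accepts it
                S, I = after1(spec, S, a), after1(imp, I, a)
        else:                                   # choice 3: observe outputs
            obs = rng.choice(sorted(out(imp, I)))
            if obs not in out(spec, S):
                return "fail"                   # forbidden output observed
            S, I = after1(spec, S, obs), after1(imp, I, obs)
    return "pass"                               # choice 1: end test case

# Illustration: the coffee machine and a mutant with a forbidden !tea.
SPEC = {"s0": {"?coin": "s1"},
        "s1": {"?button": "s2"},
        "s2": {"!coffee": "s0", "!alarm": "s0"}}
IMP_BAD = {"s0": {"?coin": "s1"},
           "s1": {"?button": "s2"},
           "s2": {"!coffee": "s0", "!alarm": "s0", "!tea": "s0"}}
```

Against the specification itself every run passes (soundness); against the mutant, runs that reach ?coin·?button and happen to observe !tea end in fail, which is where exhaustiveness in the limit comes from.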
Example: ioco Test Generation
(figures: stepwise construction of a test from the specification ?dime·?dime·!coffee)
• the test supplies !dime; at this point observing ?coffee or ?tea gives fail
• the test supplies a second !dime
• the test then observes: ?coffee gives pass, ?tea gives fail, quiescence gives fail
Test Result Analysis: Completeness
For every test t generated with the ioco test generation algorithm we have:
• Soundness: t will never fail with a correct implementation (i ioco s implies i passes t)
• Exhaustiveness: each incorrect implementation can be detected with some generated test t (i ioco̸ s implies ∃t: i fails t)
Completeness of MBT with ioco
• ioco test generation from an LTS model yields a set of LTS tests that is sound and exhaustive
• LTS test execution against an SUT behaving as an input-enabled LTS yields pass / fail
• SUT ioco model ⇔ SUT passes the tests
Testing Equivalences
S1 ≈ S2 ⇔def ∀e ∈ E: obs(e, S1) = obs(e, S2)
(two systems are equivalent iff no environment e can observe a difference between them)
MBT: Test Assumption
Test assumption: ∀SUT. ∃mSUT ∈ MODELS. ∀t ∈ TEST: SUT passes t ⇔ mSUT passes t
(every concrete SUT behaves as some model mSUT; the SUT and mSUT pass exactly the same tests)
Soundness and Completeness
With test generation gen: LTS → ℘(TTS) from an LTS model s:
• test assumption: ∀SUT ∈ IMP. ∃mSUT ∈ IOTS. ∀t ∈ TTS: SUT passes t ⇔ mSUT passes t
• prove soundness and exhaustiveness: ∀m ∈ IOTS: ( ∀t ∈ gen(s): m passes t ) ⇔ m ioco s
• then: SUT passes gen(s) ⇔ SUT conforms to s
MBT: Completeness
Does SUT passes Ts ⇔ SUT conforms to s ?
• define: SUT passes Ts ⇔def ∀t ∈ Ts: SUT passes t
• test hypothesis: ∀t ∈ TEST: SUT passes t ⇔ mSUT passes t, so ∀t ∈ Ts: SUT passes t ⇔ ∀t ∈ Ts: mSUT passes t
• prove: ∀m ∈ MOD: ( ∀t ∈ Ts: m passes t ) ⇔ m imp s, so ∀t ∈ Ts: mSUT passes t ⇔ mSUT imp s
• define: SUT conforms to s iff mSUT imp s
Together: SUT passes Ts ⇔ SUT conforms to s.
Genealogy of ioco
From Labelled Transition Systems and IOTS (IOA, IA, IOLTS), via:
• Trace Preorder
• Testing Equivalences (Preorders)
• Canonical Tester, conf
• Quiescent Trace Preorder
• Repetitive Quiescent Trace Preorder (Suspension Preorder)
• Refusal Equivalence (Preorder)
to ioco.
Variations on a Theme
• i ioco s ⇔ ∀σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
• i ior s ⇔ ∀σ ∈ (L ∪ {δ})*: out(i after σ) ⊆ out(s after σ)
• i ioconf s ⇔ ∀σ ∈ traces(s): out(i after σ) ⊆ out(s after σ)
• i iocoF s ⇔ ∀σ ∈ F: out(i after σ) ⊆ out(s after σ)
• i uioco s ⇔ ∀σ ∈ Utraces(s): out(i after σ) ⊆ out(s after σ)
• i mioco s: multi-channel ioco
• i wioco s: non-input-enabled ioco
• i eco e: environmental conformance
• i sioco s: symbolic ioco
• i (r)tioco s: (real) timed tioco (Aalborg, Twente, Grenoble, Bordeaux, ...)
• i rioco s: refinement ioco
• i hioco s: hybrid ioco
• i qioco s: quantified ioco
• i poco s: partially observable game ioco
• i stiocoD s: real time and symbolic data
• ...
Model-Based Testing: There is Nothing More Practical than a Good Theory
• arguing about validity of test cases and correctness of test generation algorithms
• explicit insight in what has been tested, and what not
• use of complementary validation techniques: model checking, theorem proving, static analysis, runtime verification, ...
• implementation relations for nondeterministic, concurrent, partially specified, loose specifications
• comparison of MBT approaches and their error-detection capabilities
Test Selection
• exhaustiveness is never achieved in practice
• test selection aims at confidence in the quality of the tested product: select the test cases best capable of detecting failures, and measure to what extent testing was exhaustive
• an optimization problem: the best possible testing within cost/time constraints