Applications of Automated Model Based Testing with TorX
Ed Brinksma, Course 2004
TorX Case Studies
• Conference Protocol (academic)
• EasyLink TV-VCR protocol (Philips)
• Cell Broadcast Centre component (CMG)
• “Rekeningrijden” Payment Box protocol (Interpay)
• V5.1 Access Network protocol (Lucent)
• Easy Mail Melder (CMG)
• FTP Client (academic)
• “Oosterschelde” storm surge barrier control (CMG)
The Conference Protocol Experiment
• Academic benchmarking experiment, initiated for test tool evaluation and comparison
• Based on actually testing different implementations
• Simple, yet realistic protocol (chatbox service)
• Specifications in LOTOS, Promela, SDL, EFSM
• 28 different implementations in C
  • one of them (assumed-to-be) correct
  • the others manually derived mutants
• http://fmt.cs.utwente.nl/ConfCase
The Conference Protocol
[Diagram: the Conference Service is provided by Conference Protocol Entities (CPEs), one per user, running on top of the UDP layer; service primitives: join, leave, send, receive]
Conference Protocol Test Architecture
[Diagram: the TorX tester connects to the CPE (the IUT) via the upper-tester PCO (UT-PCO = C-SAP, at the U-SAP) and via two lower-tester PCOs (LT-PCOs) on the UDP layer; A, B, C label the tester connections]
The Conference Protocol Experiments
• TorX - LOTOS, Promela: on-the-fly ioco testing
  Axel Belinfante et al., Formal Test Automation: A Simple Experiment, IWTCS 12, Budapest, 1999.
• Tau Autolink - SDL: semi-automatic batch testing
• TGV - LOTOS: automatic batch testing with test purposes
  Lydie Du Bousquet et al., Formal Test Automation: The Conference Protocol with TGV/TorX, TestCom 2000, Ottawa.
• PHACT/Conformance KIT - EFSM: automatic batch testing
  Lex Heerink et al., Formal Test Automation: The Conference Protocol with PHACT, TestCom 2000, Ottawa.
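The on-the-fly ioco testing that TorX performs with the LOTOS and Promela specifications can be sketched as a loop that, at each step, either stimulates the IUT with an input the specification allows or observes an output and checks it against the specification. A minimal sketch only; the transition system and action names below are invented for illustration:

```python
import random

# Toy specification as a labelled transition system (invented):
# '?' marks inputs to the IUT, '!' marks outputs from it.
SPEC = {
    "s0": {"?join": "s1"},
    "s1": {"?leave": "s0", "!answer": "s0"},
}

class CorrectImpl:
    """An implementation that mirrors the specification exactly."""
    def __init__(self):
        self.state = "s0"
    def input(self, action):
        self.state = SPEC[self.state][action]
    def output(self):
        action = next(a for a in SPEC[self.state] if a.startswith("!"))
        self.state = SPEC[self.state][action]
        return action

class Mutant(CorrectImpl):
    """A mutant that produces an unspecified output."""
    def output(self):
        self.state = "s0"
        return "!garbage"

def on_the_fly_test(impl, steps=100, seed=7):
    """Per step: either stimulate with a specified input, or observe
    an output and check out(impl) against out(spec)."""
    rng = random.Random(seed)
    state = "s0"
    for _ in range(steps):
        inputs = [a for a in SPEC[state] if a.startswith("?")]
        outputs = [a for a in SPEC[state] if a.startswith("!")]
        if inputs and (not outputs or rng.random() < 0.5):
            action = rng.choice(inputs)      # stimulate the IUT
            impl.input(action)
        else:
            action = impl.output()           # observe the IUT
            if action not in outputs:        # unspecified output: fail
                return "fail"
        state = SPEC[state][action]
    return "pass"
```

Note that the tester only ever stimulates with inputs the specification mentions, so behaviour after unspecified inputs is never exercised.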
Conference Protocol Results

                        fail   pass   “core dump”   passing mutants
  TorX LOTOS             25      3        0         000, 444, 666
  TorX Promela           25      3        0         000, 444, 666
  PHACT EFSM             21      6        1         000, 444, 666, 289, 293, 398
  TGV LOTOS random       25      3        0         000, 444, 666
  TGV LOTOS purposes     24      4        0         000, 444, 666, 332
Conference Protocol Analysis
• Mutants 444 and 666 react to PDUs from non-existent partners:
  • no explicit reaction is specified for such PDUs, so they are ioco-correct, and TorX does not test such behaviour
• So, for LOTOS/Promela with TGV/TorX: all ioco-erroneous implementations detected
• EFSM:
  • two “additional-state” errors not detected
  • one implicit-transition error not detected
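Why mutants like 444 and 666 survive can be made concrete with a toy version of the ioco check: out(impl after σ) ⊆ out(spec after σ) is only required for traces σ of the specification, so whatever the implementation does after an unspecified PDU is never inspected. The traces and actions below are invented, and quiescence is omitted for brevity:

```python
# out(spec after σ) for the specification's own traces only (invented).
SPEC_TRACES = {
    (): {"!answer"},
    ("?join",): {"!answer"},
}

def ioco(impl_out):
    """impl_out: trace -> set of outputs the implementation can show.
    Check out(impl after t) ⊆ out(spec after t) for spec traces only;
    traces outside the specification are never looked at."""
    return all(impl_out.get(t, set()) <= outs
               for t, outs in SPEC_TRACES.items())

# A mutant that misbehaves only after an unspecified PDU: ioco-correct.
mutant = {
    (): {"!answer"},
    ("?join",): {"!answer"},
    ("?pdu_from_unknown_partner",): {"!garbage"},   # never inspected
}
```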
Conference Protocol Analysis • TorX statistics • all errors found after 2 - 498 test events • maximum length of tests : > 500,000 test events • EFSM statistics • 82 test cases with “partitioned tour method” ( = UIO ) • length per test case : < 16 test events • TGV with manual test purposes • ~ 20 test cases of various length • TGV with random test purposes • ~ 200 test cases of 200 test events
EasyLink Case Study
• protocol between TV and VCR
• simple, but realistic
• features:
  • preset download
  • WYSIWYR (what you see is what you record)
  • EPG download
  • ...
[Diagram: TV and VCR connected by EasyLink; the TV is the object of testing, the EasyLink link the communication under observation]
EasyLink Test Architecture
• MBB (= Magic Black Box):
  • allows the PC to monitor the communication between TV and VCR
  • allows the PC to send messages to mimic the TV or the VCR
• TorX distributed over PC and workstation
[Diagram: TV and VCR linked through the MBB; the MBB is attached to the PC, which connects to a workstation; a remote control (RC) allows manual interaction with the TV]
Testing Preset Download Feature
• What?
  • check whether the TV correctly implements preset download, based on a Promela specification
• How?
  • let the PC play the role of the VCR and initiate preset download
  • receive settings from the TV
  • WHILE (TRUE) {
      let PC initiate preset download
      let PC non-deterministically stop preset download
      check for consistency in presets
    }
  • feature interaction: shuffle presets on the TV using the RC
• all under control of the PC
EasyLink Experiences
Results:
• the test environment influences what can be tested:
  • testing power is limited by the functionality of the MBB
• initially, the state of the TV is unknown:
  • the tester must be prepared for all possible states
• some “hacks” needed in the specification and tool architecture in order to decrease the state space
• automatic specification-based testing is feasible
• the tool architecture is also suitable to cope with user interaction
• some (non-fatal) non-conformances detected
CMG - CBC Component Test
• Test one component of the Cell Broadcast Centre
• LOTOS (process algebra) specification of 28 pp.
• Using an existing test execution environment
• Based on automatic generation of an “adapter” from IDL
• Comparison (simple):
                          existing test   TorX
  code coverage                82 %       83 %
  detected mutants /10           5          7
• Conclusion:
  • TorX is at least as good as conventional testing (with the potential to do better)
  • LOTOS is not nice (= terrible) for specifying such systems
“Rekeningrijden” Characteristics:
• Simple protocol
• Parallelism: many cars at the same time
• Encryption
• Real-time issues
• System passed the traditional testing phase
“Rekeningrijden”: Phases for Automated Testing
• IUT study
  • informal and formal specification
• Available tools study
  • semantics and openness
• Test environment
  • test architecture, test implementation, SUT specification
  • testing of the test environment
• Test execution
  • test campaigns, execution, analysis
“Rekeningrijden” Highway Tolling System
[Diagram: an Onboard Unit in each passing car communicates over a wireless link with the Road Side Equipment, which is connected via UDP/IP to the Payment Box (PB)]
“Rekeningrijden”: Test Architecture I
[Diagram: TorX, driven by the PB specification, is connected directly to the Payment Box at its PCO]
“Rekeningrijden”: Test Architecture II
[Diagram: the UDP/IP stack becomes part of the test context; the specification becomes PB + UDP/IP; TorX reaches the Payment Box (the SUT) through the UDP/IP layer; PCO and IAP mark the point of control and observation and the implementation access point]
“Rekeningrijden”: Test Architecture III
[Diagram: the test context is extended with an onboard-unit simulator (ObuSim), connected over TCP/IP; the specification becomes PB + ObuSim + TCP/IP + UDP/IP; the Payment Box remains the SUT, accessed through the PCO and IAP]
“Rekeningrijden”: Test Campaigns
• Introduction and use of test campaigns:
  • Management of test tool configurations
  • Management of IUT configurations
  • Steering of test derivation
  • Scheduling of test runs
  • Archiving of results
“Rekeningrijden”: Issues
• Parallelism:
  • very easy
• Encryption:
  • not all events can be synthesized, which reduces testing power
• Real-time:
  • How to cope with real-time constraints?
  • Efficient computation for on-the-fly testing?
  • Lack of theory: quiescence vs. time-out
“Rekeningrijden” Problem: Quiescence in ioco vs. time-out
[Diagram: message sequence charts between TorX and the PB contrasting abstract quiescence with concrete time-outs: after an input, the tester observes for at most tq; an expired timer is turned into an explicit Tick action that the enriched specification allows; Spec := Spec + Tick]
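On the tester side, the Spec := Spec + Tick workaround amounts to mapping an expired receive-timer tq onto an explicit Tick observation, which the enriched specification permits exactly where quiescence is intended. A minimal sketch over a UDP socket; the function and action names are assumptions:

```python
import socket

def observe(sock, tq):
    """Observe the Payment Box for at most tq seconds. If nothing
    arrives, report the explicit 'tick' action instead of a verdict,
    so the enriched specification (Spec := Spec + Tick) decides
    whether quiescence was allowed at this point."""
    sock.settimeout(tq)
    try:
        pdu, _addr = sock.recvfrom(4096)
        return ("output", pdu)               # concrete output observed
    except socket.timeout:
        return ("tick", None)                # timer tq expired
```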
“Rekeningrijden” Problem: Action Refinement
[Diagram: message sequence charts between TorX and the PB: the abstract action Input01 of the specification is refined into the concrete inputs Input0 and Input1; a buffer in the model collects the concrete inputs, covering Full, Unexpected and Error situations; Spec := Refine + Buffer]
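The Spec := Refine + Buffer idea can be sketched as an adapter that buffers the concrete half-inputs and only hands the abstract Input01 to the specification once a complete pair has arrived, reporting an error on unexpected orderings. A simplified sketch using the slide's action names:

```python
class RefinementBuffer:
    """Collect concrete inputs and deliver the abstract action the
    specification understands (Spec := Refine + Buffer)."""
    def __init__(self):
        self.pending = []

    def push(self, concrete):
        if concrete == "Input0" and not self.pending:
            self.pending.append(concrete)
            return None                      # wait for the second half
        if concrete == "Input1" and self.pending == ["Input0"]:
            self.pending.clear()
            return "Input01"                 # complete abstract action
        return "Error"                       # unexpected order / full
```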
“Rekeningrijden”: Issues
• Modelling language: LOTOS → Promela
• Spec for testing vs. spec for validation
• Development of the specification is an iterative process
• Development of the test environment is laborious
• Parameters are fixed in the model
  • preprocessing: M4/CPP
• Promela problem: guarded inputs
• Test campaigns for bookkeeping and control of experiments
• Probabilities incorporated
“Rekeningrijden”: Results
• Test results:
  • 1 error found during validation (a design error)
  • 1 error found during testing (a coding error)
• Automated testing:
  • beneficial: high volume and high reliability
  • many long tests executed ( > 50,000 test events )
  • very flexible: easy adaptation, many configurations
• A step ahead in formal testing of realistic systems