A Test Scenario Description Language
Bridging the gap between model-based testing and test execution languages such as TTCN-3
Dr. Andreas Ulrich, Siemens AG, Corporate Technology
andreas.ulrich@siemens.com
Overview
• Introduction and motivation
• Modeling test scenarios
• Conclusions
MBT in practice
Baris Güldali, Univ. Paderborn, on http://model-based-testing.info/, 2011-08-12:
“In the last weeks I have attended many industrial conferences where we conducted many talks on MBT. My impression is that MBT still didn’t reach into people’s hearts. At iqnite conference in Düsseldorf last week, visitors feedback has shown that only 2 % of attending companies are familiar with or have tried MBT.”
Scenario-based Testing – Motivation (functional testing of discrete I/O event systems)
Limits of current MBT approaches and tools
• Rely on models that are expensive to create
• Focus on structural coverage of the model, but not on fault detection
• Insufficient support for concurrent interactions
Ways out of the “MBT crisis”
• Simplify models to carry only the essential parts
• Support concurrency directly in the model
• Provide sound test implementations of known coverage
What is scenario-based testing?
Cem Kaner on Scenario Testing, STQE Magazine, Sep./Oct. 2003:
• The scenario is a story about someone trying to accomplish something with the product under test.
• Scenarios are useful to connect to documented software requirements, especially requirements modeled with use cases.
• A scenario test provides an end-to-end check on a benefit that the program is supposed to deliver.
Here we use scenarios to systematically test for the fulfillment of requirements on the software system.
The ScenTest meta-model – Basic components
• Test architecture: static view on the SUT with its external ports and events / messages.
• Test scenarios: set of scenarios that describe interactions at the SUT’s ports (black-box approach). Each scenario represents a test.
• Test scenario graph: optional graph that links scenarios together. Useful when describing choices over SUT inputs; used for generating tests across scenarios.
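These three components can be pictured as a small object model. The following is a minimal Java sketch only; the class and field names (ScenTestModel, TestArchitecture, etc.) are illustrative assumptions, not the actual classes of the ScenTest meta-model:

```java
import java.util.List;
import java.util.Optional;

// Hypothetical object model of the three basic components; all names are
// illustrative assumptions, not taken from the ScenTest meta-model itself.
record Port(String name, List<String> messageTypes) {}            // external port with its events / messages
record TestArchitecture(String sutName, List<Port> ports) {}      // static view on the SUT
record Scenario(String name, List<String> interactions) {}        // one scenario = one test
record ScenarioGraph(List<Scenario> nodes, List<int[]> edges) {}  // links scenarios together
record ScenTestModel(TestArchitecture architecture,
                     List<Scenario> scenarios,
                     Optional<ScenarioGraph> graph) {}            // the graph is optional
```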
The ScenTest meta-model – Test architecture
• Class diagram: type definitions for SUT, ports, messages. Allows for syntactical checks of the model.
• Component diagram: instantiation of the SUT with its ports. Used as objects in test scenario diagrams.
The ScenTest meta-model – Test architecture (cont.)
• SUT is modeled as a single instance, even if it consists of several distributed components
• All ports (i.e. interfaces) of the SUT that are exposed in testing must be defined together with their events / messages
– Points of control and observation: SUT inputs and outputs
– Points of observation: SUT outputs only
⇒ Multi-port system, black-box testing approach
• Assigning event / message types to port types enables validation of test scenario models, e.g. against misuse of messages at a given port
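Such a validation can be as simple as checking every scenario step against the declared port types. A hedged sketch under assumed names (PortType and ArchitectureChecker are invented for illustration and do not appear in the tool):

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustrative port typing; Direction separates points of control and observation
// (inputs and outputs) from points of observation (outputs only).
enum Direction { CONTROL_AND_OBSERVATION, OBSERVATION_ONLY }

record PortType(String name, Direction direction, Set<String> allowedMessages) {}

final class ArchitectureChecker {
    /** Flags every scenario step that uses a message at a port where it is not declared. */
    static List<String> check(Map<String, PortType> ports, List<String[]> steps) {
        // each step is a {portName, messageType} pair taken from a scenario
        return steps.stream()
                .filter(s -> !ports.containsKey(s[0])
                          || !ports.get(s[0]).allowedMessages().contains(s[1]))
                .map(s -> "misuse of message '" + s[1] + "' at port '" + s[0] + "'")
                .toList();
    }
}
```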
The ScenTest meta-model – Test scenarios
• Sequence diagram: test scenarios are described as sequence diagrams with object lifelines as defined in the test architecture.
• Packages: scenarios can be grouped using packages. A package contains at most one scenario.
The ScenTest meta-model – Test scenarios (cont.)
• A scenario describes the behavior of a (possibly distributed) SUT as it is observable at its (multiple) ports by an assumed ideal global tester
• A scenario describes the expected behavior of the SUT
– Derived from system requirements and use cases
– It is a system model, not a test-specific model
– Hence, in testing, any observed deviation is a failure
• A test scenario starts with an input to the SUT (received message)
– Guarantees control over the SUT by the tester
– Rules out testing of oscillating systems that do not stabilize
• One scenario relates to one executable test in test generation
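At run time, “one scenario relates to one executable test” can be read as: replay the stimuli, observe the outputs, and fail on any deviation. A minimal sketch under that reading; the Step types and the SutConnection adapter are assumptions made for illustration, not part of the described tooling:

```java
import java.util.List;

// Hypothetical run-time view of one scenario: a stimulus to the SUT followed by
// the outputs the tester expects to observe at the SUT's ports.
sealed interface Step permits Send, Expect {}
record Send(String port, String message) implements Step {}     // input to the SUT
record Expect(String port, String message) implements Step {}   // expected SUT output

interface SutConnection {                                        // assumed test adapter
    void send(String port, String message);
    String receive(String port);                                 // blocks until an output is observed
}

final class ScenarioRunner {
    /** Executes one scenario; any observed deviation from the expected behavior is a failure. */
    static boolean run(List<Step> scenario, SutConnection sut) {
        if (scenario.isEmpty() || !(scenario.get(0) instanceof Send)) {
            throw new IllegalArgumentException("a test scenario must start with an input to the SUT");
        }
        for (Step step : scenario) {
            switch (step) {
                case Send s -> sut.send(s.port(), s.message());
                case Expect e -> {
                    String observed = sut.receive(e.port());
                    if (!e.message().equals(observed)) {
                        return false;                            // deviation => the test fails
                    }
                }
            }
        }
        return true;
    }
}
```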
ScenTest test scenario modeling – Overview
• Basic concepts for behavioral modeling are taken from CSP – Communicating Sequential Processes (Hoare 1978)
– Sequence (MSC) ↔ prefixing, sequence (CSP)
– Loop (MSC) ↔ recursion (CSP)
– Alternative (MSC) ↔ non-deterministic choice (CSP)
– Parallel (MSC) ↔ concurrency / interleaving (CSP)
– Unless (MSC) ↔ interruption (CSP)
• Not all concepts are expressible in UML2/MSC!
• Some extensions to cope with testing
– Requirement tracing
– Ignore messages: ignore superfluous SUT outputs
– Unless: exceptional behavior within a defined scope
– Optional messages: variant of alternative
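Taken together, these operators form a small behavior algebra over scenarios. A sketch of how it might look as a data type, assuming nothing about ScenTest’s internal representation (all type names here are invented):

```java
import java.util.List;

// Illustrative behavior algebra mirroring the MSC/CSP operator pairs listed above,
// plus the testing extensions; the type names are invented for this sketch.
sealed interface Behavior permits Event, Seq, Loop, Alt, Par, Unless, Ignore, Opt {}
record Event(String port, String message, boolean toSut) implements Behavior {}  // single interaction
record Seq(List<Behavior> steps) implements Behavior {}                           // sequence / prefixing
record Loop(Behavior body, int maxIterations) implements Behavior {}              // loop / recursion (bounded for testing)
record Alt(List<Behavior> options) implements Behavior {}                         // alternative / non-deterministic choice
record Par(List<Behavior> branches) implements Behavior {}                        // parallel / interleaving
record Unless(Behavior scope, Behavior exception) implements Behavior {}          // interruption within a defined scope
record Ignore(Behavior scope, List<String> messages) implements Behavior {}       // ignore superfluous SUT outputs
record Opt(Behavior body) implements Behavior {}                                  // optional messages: variant of alternative
```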
The whole picture – How test scenarios fit into the test process with MBT
[Process diagram linking: system descriptions → state-based models / test scenario graph (activity diagram, HL-MSC) → generation of test scenarios → test scenarios (test scenario descriptions, also written directly / manually) → generation of test implementations → JUnit, TTCN-3, … (test execution languages)]
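For the JUnit target, a generated test for one scenario might take roughly the following shape. This is a sketch only: the scenario, the port names, and the scripted FakeSut stand-in are assumptions, not output of the actual generator.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.ArrayDeque;
import java.util.Deque;
import org.junit.jupiter.api.Test;

// Hypothetical shape of a JUnit test generated from a single scenario: stimulate
// the SUT at one port, then check the expected outputs port by port.  FakeSut is
// a scripted stand-in for the real test adapter so the example is self-contained.
class LoginScenarioTest {

    static final class FakeSut {
        private final Deque<String> outputs = new ArrayDeque<>();
        void send(String port, String message) {
            // in a real run this would drive the SUT; here the reaction is scripted
            outputs.add("loginAck");
            outputs.add("sessionStarted");
        }
        String receive(String port) {
            return outputs.poll();      // in a real run: observe the SUT's output at 'port'
        }
    }

    @Test
    void loginScenario() {
        FakeSut sut = new FakeSut();
        sut.send("ctrlPort", "loginRequest");                    // the scenario starts with an input
        assertEquals("loginAck", sut.receive("ctrlPort"));       // expected output at a point of control and observation
        assertEquals("sessionStarted", sut.receive("obsPort"));  // expected output at a point of observation
    }
}
```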
Test scenario descriptions – Summary
Test scenario descriptions…
• Focus on conformance testing for multi-port concurrent / distributed systems
• Are more abstract than TTCN-3 descriptions
• Could be considered a generalization of Graphical TTCN-3, but are descriptive rather than operational
• Allow for correctness checks at pre-compilation time
• Could serve as an intermediate description to link and unify different MBT tools
Next steps
• Complete missing language features for test scenarios – message templates, local actions, procedure calls
• Expand towards describing real-time constraints
• Offer a complementary textual representation
The ScenTest meta-model – Test scenario graph
• (Optional) Interaction overview diagram: dependencies between test scenarios are described here. In particular, the diagram captures choices over test scenarios.
The ScenTest meta-model – Test scenario graph (cont.)
• Optional graph to capture dependencies between test scenarios
• Supported in UML2 by an interaction overview diagram (variant of an activity diagram; similar to High-Level MSC)
• Offers choices over paths through scenarios ⇒ results in a fine-granular system specification
• Since a test scenario starts with an input ⇒ we actually specify choices over test inputs
• May contain loops – paths can be determined using various test generation criteria
• Opens up various new modeling facilities – not all have been elaborated yet, e.g. parallel scenarios (fork/join), nesting of activities, mixed activities / scenario references
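Because the graph may contain loops, path-based test generation has to bound loop unrolling. The sketch below shows one simple way to enumerate such paths (a depth-bounded DFS); the graph representation and the length bound are illustrative assumptions, not the coverage criteria actually used:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative path enumeration over a scenario graph: nodes are scenario names,
// edges are the choices over test inputs; loops are unrolled up to a fixed bound.
final class ScenarioGraphPaths {

    static List<List<String>> paths(Map<String, List<String>> edges,
                                    String start, String end, int maxLength) {
        List<List<String>> result = new ArrayList<>();
        dfs(edges, start, end, maxLength, new ArrayList<>(List.of(start)), result);
        return result;
    }

    private static void dfs(Map<String, List<String>> edges, String node, String end,
                            int maxLength, List<String> path, List<List<String>> result) {
        if (node.equals(end)) {
            result.add(List.copyOf(path));       // one path = one generated test across scenarios
        }
        if (path.size() >= maxLength) return;    // bound loop unrolling
        for (String next : edges.getOrDefault(node, List.of())) {
            path.add(next);
            dfs(edges, next, end, maxLength, path, result);
            path.remove(path.size() - 1);
        }
    }

    public static void main(String[] args) {
        // tiny example graph with a loop: login -> {browse, logout}, browse -> {browse, logout}
        Map<String, List<String>> g = Map.of(
                "login",  List.of("browse", "logout"),
                "browse", List.of("browse", "logout"));
        paths(g, "login", "logout", 5).forEach(System.out::println);
    }
}
```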
Modeling a scenario graph vs. the alternative approach
• With a scenario graph: the test generator generates tests according to the chosen coverage criterion.
• With individually modeled scenarios: the user models tests explicitly and keeps control over them.