OO System Testing: Behavioral test patterns. Automatic test synthesis from UML models
Outline • System testing • Behavioral test patterns • Generating behavioral test patterns
Testing product lines • Benefiting from the PL specificities • Testing commonalities • Deriving tests according to the variants • Specific tests • Reusing tests • Building test assets • Defining tests independently of the products • Using generic scenarios • Deriving product-specific test cases from those generic scenarios
System testing and UML • Distributed meeting
The use case scenarios • High level, simple, incomplete • Wildcards for genericity • Example, the Enter use case scenario (x is a scenario parameter): (a) nominal case: x:user sends enter(*, x) to :Server, which answers ok; (b) exceptional case: :Server answers nok
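To make the wildcard idea concrete, here is a minimal Java sketch showing how a generic step such as enter(*, x) could be matched against concrete calls during synthesis; the Step record and its matches method are illustrative assumptions, not part of any tool mentioned here.

```java
import java.util.List;

/**
 * Minimal sketch of matching a generic use-case scenario step, where "*" is a
 * wildcard, against a concrete call observed on a product. Illustrative only.
 */
public class WildcardScenario {

    record Step(String operation, List<String> args) {
        /** "*" in the scenario matches any concrete argument. */
        boolean matches(String concreteOp, List<String> concreteArgs) {
            if (!operation.equals(concreteOp) || args.size() != concreteArgs.size()) {
                return false;
            }
            for (int i = 0; i < args.size(); i++) {
                if (!args.get(i).equals("*") && !args.get(i).equals(concreteArgs.get(i))) {
                    return false;
                }
            }
            return true;
        }
    }

    public static void main(String[] args) {
        // Generic step from the Enter use case: enter(*, x), with the scenario
        // parameter x already bound to a concrete user for this example.
        Step generic = new Step("enter", List.of("*", "user1"));
        System.out.println(generic.matches("enter", List.of("aMtg", "user1"))); // true
        System.out.println(generic.matches("enter", List.of("aMtg", "user2"))); // false
    }
}
```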
System testing and UML • Scenarios per use case (nominal / rare exceptional / failure):
• A Plan: NA1, NA2 / - / EA1, EA2
• B Open: NB1 / - / EB1, EB2
• I Close: NI1 / RI1 / -
• C Consult: NC1 / - / EC1
• D Enter: ND1 / RD1 / ED1, ED2
• E Ask for the floor: NE1 / - / EE1
• G Speak: NG1, NG2 / RG1 / EG1, EG2
• H Leave: NH1 / - / EH1
• F Hand over the floor: NF1 / - / EF1, EF2
System testing and UML • Minimum criterion: cover each scenario with one test datum • Here, 27 test cases (one per scenario in the table above: 4 + 3 + 2 + 2 + 4 + 2 + 5 + 2 + 3 = 27) • Use-case combination coverage criterion • Prerequisite: an activity diagram of the use cases
Activity diagrams • Pros: a swimlane per actor; visually significant; within the UML notation; suitable for applying algorithms • Cons: difficult to build; hard or impossible to express certain behaviors; not suitable for use cases shared by actors
System testing and UML • Criterion: test each scenario of each use case within each elementary nominal sequence (one pass per loop)
System testing and UML • Test data to generate for the sequence A.B.I.H: 4 + 2x3 + 2x1x2 + 2x1x2x2 = 22 test cases must be generated through this "pattern" • Test target and combination of test cases: A: [NA1, NA2, EA1, EA2]; B: [NA1, NA2] x [NB1, EB1, EB2]; I: [NA1, NA2] x [NB1] x [NI1, RI1]; H: [NA1, NA2] x [NB1] x [NI1, RI1] x [NH1, EH1]
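The count above can be reproduced mechanically. The Java sketch below (class and variable names are assumptions made for illustration) multiplies, for each test target, the number of scenarios of the targeted use case by the number of "continuable" scenarios (nominal or rare exceptional) of the preceding use cases, and prints 4 + 6 + 4 + 8 = 22.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

/**
 * Counts the test cases required by the combination pattern for the use-case
 * sequence A.B.I.H, using the scenario table above. Illustrative sketch only.
 */
public class CombinationPatternCount {

    public static void main(String[] args) {
        // All scenarios of each use case on the path (from the scenario table).
        Map<String, List<String>> all = new LinkedHashMap<>();
        all.put("A", List.of("NA1", "NA2", "EA1", "EA2"));
        all.put("B", List.of("NB1", "EB1", "EB2"));
        all.put("I", List.of("NI1", "RI1"));
        all.put("H", List.of("NH1", "EH1"));

        // Scenarios after which execution can continue (nominal and rare
        // exceptional ones); failure scenarios (E*) end the sequence.
        Map<String, List<String>> continuable = new LinkedHashMap<>();
        all.forEach((uc, scs) -> continuable.put(uc,
                scs.stream().filter(s -> !s.startsWith("E")).toList()));

        int total = 0;
        int prefixCombinations = 1;
        for (Map.Entry<String, List<String>> e : all.entrySet()) {
            // Target this use case: every prefix combination x every scenario of it.
            int forTarget = prefixCombinations * e.getValue().size();
            System.out.printf("target %s -> %d test cases%n", e.getKey(), forTarget);
            total += forTarget;
            prefixCombinations *= continuable.get(e.getKey()).size();
        }
        System.out.println("total = " + total); // prints 22, matching the slide
    }
}
```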
System testing and UML • Avoiding redundancies: order the "test targets" • 10 test cases must be generated through this "pattern" • Test targets, in order: H, I, B, A, each combined with the scenario sets of the preceding use cases (e.g. [NA1, NA2], [NB1], [NI1, RI1], [NH1, EH1])
Behavioral Test Patterns • Based on the use case scenarios • high level • generic (use of wildcards) • incomplete • nominal or exceptional • A selection from among the scenarios: • an accept scenario (the test objective) • reject scenarios (optional) • prefix scenarios (initialisation, optional)
Benefits from test patterns • Generation of product-specific test cases from product-independent test patterns • But test patterns are tedious to build, especially for "basis tests" • Idea: being able to automatically build significant sets of test patterns
How to exploit use-case ordering? • Generate pertinent paths of use cases • In order to reach a test criterion • Issues: • an algorithm to assemble the use cases, taking into account their pre- and postconditions (see the sketch below) • defining pertinent test criteria
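As a rough illustration of the first issue, the following Java sketch chains use cases whose preconditions are satisfied by the postconditions accumulated so far. The condition names, the depth bound and the one-pass-per-use-case restriction are assumptions for the example, not the original algorithm.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

/**
 * Sketch of chaining use cases through pre/postconditions to generate
 * pertinent paths (e.g. plan -> open -> enter -> leave). Illustrative only.
 */
public class UseCasePaths {

    record UseCase(String name, Set<String> pre, Set<String> post) {}

    static final List<UseCase> USE_CASES = List.of(
            new UseCase("plan",  Set.of("connected"), Set.of("planned")),
            new UseCase("open",  Set.of("planned"),   Set.of("opened")),
            new UseCase("enter", Set.of("opened"),    Set.of("entered")),
            new UseCase("leave", Set.of("entered"),   Set.of()));

    /** Depth-first enumeration of use-case sequences whose preconditions hold. */
    static void paths(Set<String> state, List<String> path, int maxLen, List<List<String>> out) {
        if (!path.isEmpty()) {
            out.add(List.copyOf(path));
        }
        if (path.size() == maxLen) {
            return;
        }
        for (UseCase uc : USE_CASES) {
            if (state.containsAll(uc.pre()) && !path.contains(uc.name())) { // one pass per use case
                Set<String> next = new HashSet<>(state);
                next.addAll(uc.post());
                path.add(uc.name());
                paths(next, path, maxLen, out);
                path.remove(path.size() - 1);
            }
        }
    }

    public static void main(String[] args) {
        List<List<String>> out = new ArrayList<>();
        paths(new HashSet<>(Set.of("connected")), new ArrayList<>(), 4, out);
        out.forEach(System.out::println); // [plan], [plan, open], [plan, open, enter], ...
    }
}
```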
Conclusion • From early modeling to test cases: • from reusable and generic test patterns • to concrete test cases, specific to each product • Two ways of selecting test patterns: • manually (qualitative approach) • driven by use-case sequential dependencies (quantitative approach)
From system-level test patterns to specific test cases: application to product-line architectures
Product Line architectures • A product line: a set of systems that share a common software architecture and a set of reusable components • Building a product line aims at developing the common core of a set of products once, and reusing it for all the products • Defining a product family • variants and commonalities • reuse of assets • For our purpose: specify behavioural test patterns, which become reusable "test assets" of the product line
Product Line architectures: a key challenge • Use case scenarios cannot be used directly for testing • They are generic and incomplete • Parameters are not known, nor are object instances (scenarios concern roles) • They specify the general system functionality without knowing, at that stage, the exact sequence of calls/answers • Generating test cases from such test patterns for a given UML specification is thus one of the key challenges in software testing today
PL • Variants • optional, when a component can be present or not • alternative, when, at a variation point, one and only one component can be chosen among a set of components • multiple, when, at a variation point, several components can be chosen among a set of components • All the variants must appear in the architecture, but not all the possible combinations of variants • Extracting a product from the global product-line architecture: product instantiation
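A minimal sketch of how product instantiation could be checked against these variant kinds, using the Virtual Meeting variation points described later (meeting type: multiple; participant limitation: optional). The Java types and the validity rule are illustrative assumptions.

```java
import java.util.EnumSet;
import java.util.List;
import java.util.Set;

/**
 * Sketch of product instantiation against two variation points of the
 * Virtual Meeting product line. Illustrative only.
 */
public class ProductInstantiation {

    enum MeetingType { STANDARD, DEMOCRATIC, PRIVATE }

    // "types" is the multiple variant, "limited" is the optional variant.
    record Product(String name, Set<MeetingType> types, boolean limited) {
        /** Multiple variant: at least one meeting type must be chosen. */
        boolean isValid() {
            return !types.isEmpty();
        }
    }

    public static void main(String[] args) {
        List<Product> products = List.of(
                new Product("demonstration", Set.of(MeetingType.STANDARD), true),
                new Product("personal", EnumSet.allOf(MeetingType.class), true),
                new Product("enterprise", EnumSet.allOf(MeetingType.class), false));
        products.forEach(p -> System.out.println(p.name() + " valid: " + p.isValid()));
    }
}
```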
Product Line architectures: example • The Virtual Meeting Server PL offers simplified web conference services: • it aims at permitting several kinds of work meetings on a distributed platform (a general case of 'chat' software) • when connected to the server, a client can enter or exit a meeting, speak, or plan new meetings • Three types of meetings: • standard meetings, where the client who has the floor is designated by a moderator (nominated by the organizer of the meeting) • democratic meetings, which are standard meetings where the moderator is a FIFO robot (the first client to ask for permission to speak is the first to speak) • private meetings, which are standard meetings with access limited to a defined set of clients
The Virtual Meeting Example • Connection to the server • Planning of meetings • Participation in meetings • Moderation of meetings • Virtual meeting use case diagram (VirtualMtg): actors user, manager and moderator; use cases connect, plan, consult, open, close, enter, leave, speak and hand over
Product Line architectures: example • Due to marketing constraints, the Virtual Meeting PL is derivable into three products: • a demonstration edition: standard and limited • a personal edition: any type but limited • an enterprise edition: any type, no limitations • Two variants: meeting type (multiple) and participant limitation (optional) • (also OS, languages, interfaces, etc.)
The Virtual Meeting Example • Two main variants: • the kinds of meetings available • the limitation of the number of participants • Three products: • Demonstration edition • Personal edition • Enterprise edition
Testing product lines • Benefiting from the PL specificities • Testing commonalities • Deriving tests according to the variants • Specific tests • Reusing tests • Building test assets • Defining tests independently of the products • Using generic scenarios • Deriving product-specific test cases from those generic scenarios
A contradiction • Test scenarios must be expressed at a very high level • to be reusable • to be independent from the variants and the products • Generic scenarios are too vague and incomplete: they cannot be used directly on a specific product • Impossible to reuse generic test scenarios?
Behavioral Test Patterns • Based on the use case scenarios • high level • generic • product independent • nominal or exceptional • A selection from among the scenarios: • an accept scenario • reject scenarios • prefix scenarios
Testing a PL • A Behavioral Test Pattern (or Test Objective) is made of: • an accept scenario: it expresses the behavior that has to be tested, e.g. the successful exit of a participant from a meeting (the "leave a meeting" use case) • one or several (optional) reject scenarios: they express the behaviors that are not significant for the tester, e.g. consulting the state of a meeting does not interact with entering a meeting • one or several (optional) preamble (or prefix) scenarios that must precede the accept scenario; for example, a meeting must be opened before any participant can enter the virtual meeting
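The pattern structure can be summarized by a small data type. The sketch below keeps the scenarios as plain strings and only illustrates the accept / reject / prefix decomposition; the real patterns are UML scenarios, and the names used here are assumptions.

```java
import java.util.List;

/**
 * Sketch of a behavioral test pattern: one accept scenario, optional reject
 * scenarios, optional prefix scenarios. Illustrative only.
 */
public class BehavioralTestPattern {

    record TestPattern(String accept, List<String> rejects, List<String> prefixes) {}

    public static void main(String[] args) {
        TestPattern enterFullMeeting = new TestPattern(
                "enter(*, x) -> nok",                            // behaviour to test (S+)
                List.of("close(*, y)", "leave(*, y)"),           // interfering calls to exclude (S-)
                List.of("connect(x); plan(*, x); open(*, x)"));  // preamble
        System.out.println(enterFullMeeting);
    }
}
```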
An Example • Prefix: x:user connect(x) -> ok; plan(*, x) -> ok; open(*, x) -> ok • S+ (accept scenario): x:user enter(*, x) -> nok • S- (reject scenarios): y:user close(*, y); y:user leave(*, y)
The reject scenarios • Optional • Reduce the "noise" • Avoid calls irrelevant for the test • Exclude interfering calls
The prefix • Describes the preamble part of the test case • Guides the synthesis • A composition of use-case scenarios • Scenarios versus object diagram? • Prefix scenario: x:user connect(x) -> ok; plan(*, x) -> ok; open(*, x) -> ok • Object diagram: server:demoServer, user1:user, user2:user, user3:user, user4:user
Typical reject scenarios • Scenarios independent from the enter use case: added as reject scenarios • Some scenarios can be added automatically • Use of a boolean dependency matrix (a sketch follows below)
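A possible implementation of this automatic step, assuming a boolean dependency matrix whose values are invented here for illustration: every use case marked as independent from enter contributes its scenarios as reject scenarios.

```java
import java.util.List;
import java.util.Map;

/**
 * Sketch of deriving reject scenarios from a boolean dependency matrix:
 * use cases that do not interact with the targeted "enter" use case are
 * added as reject scenarios. Matrix values are illustrative.
 */
public class DependencyMatrixRejects {

    public static void main(String[] args) {
        // interactsWithEnter.get(uc) is true when uc interacts with "enter".
        Map<String, Boolean> interactsWithEnter = Map.of(
                "connect", true, "plan", true, "open", true,
                "consult", false, "close", false, "leave", false);

        List<String> rejects = interactsWithEnter.entrySet().stream()
                .filter(e -> !e.getValue())          // independent of "enter"
                .map(Map.Entry::getKey)
                .sorted()
                .toList();

        System.out.println("reject scenarios for the enter test pattern: " + rejects);
    }
}
```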
Typical reject / prefix scenarios • Use of the activity diagram • Accept scenario = the targeted scenario in a use case • Prefix = the previous scenarios in the path • Reject = all the scenarios of the use cases that are not involved in the path
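The path-based derivation can be sketched in a few lines. The path A.B.I.H and the scenario naming convention come from the earlier table; the code structure itself is only illustrative.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of path-based selection: given an ordered path of use cases from the
 * activity diagram and a targeted scenario, take the preceding nominal
 * scenarios as the prefix and the use cases outside the path as rejects.
 */
public class PathBasedPattern {

    public static void main(String[] args) {
        List<String> path = List.of("A", "B", "I", "H");
        List<String> allUseCases = List.of("A", "B", "C", "D", "E", "F", "G", "H", "I");
        String targetUseCase = "I";
        String acceptScenario = "RI1";                 // targeted scenario of use case I

        int idx = path.indexOf(targetUseCase);
        // Prefix: nominal scenarios of the use cases before the target on the path.
        List<String> prefix = new ArrayList<>();
        for (int i = 0; i < idx; i++) {
            prefix.add("N" + path.get(i) + "1");
        }
        // Rejects: every use case not involved in the path.
        List<String> rejects = allUseCases.stream()
                .filter(uc -> !path.contains(uc))
                .toList();

        System.out.println("accept  = " + acceptScenario);
        System.out.println("prefix  = " + prefix);      // [NA1, NB1]
        System.out.println("rejects = " + rejects);     // [C, D, E, F, G]
    }
}
```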
Generating test patterns: process overview • Use cases (UC1, UC2, ...), each with nominal and exceptional scenarios • Test pattern selection, manual or automated • Test pattern specification (test objective): an accept scenario, reject scenarios (optional), prefix scenarios (optional), giving test patterns TP1, TP2, ... • General design (main classes, interfaces, ...) and detailed design feed the test case synthesis • Product instantiation (P1, P2, P3) and evolution
Compiling the Test Pattern • Inputs coming from UML: • the detailed class diagram, with, as far as possible, a statechart for each active class of the system • an initial object diagram • the test pattern • the dynamic aspects are provided to TGV as an API • Output: • a detailed UML scenario describing all the precise calls to perform on the system and the expected verdicts, in order to observe the behavior specified in the pattern
Compiling the Test Pattern • accept+ = sequential composition of the prefix and the accept scenario • Scenarios making up the test case = accepted by accept+ and rejected by none of the reject scenarios • accept+ -> LTS S+ • reject scenarios {seqj-}, j in J -> LTSs {Sj-}, j in J • Test pattern -> LTS S+ \ Uj in J Sj-
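To illustrate the selection rule (accepted by accept+, accepted by no reject scenario) without the LTS machinery, here is a toy Java sketch working on flattened traces; the predicates are simplifications invented for the example, not what TGV actually computes.

```java
import java.util.List;
import java.util.function.Predicate;

/**
 * Sketch of test-pattern selection: a candidate trace is kept when it is
 * accepted by accept+ and by none of the reject scenarios. Traces are
 * flattened to call names here; the real objects are LTSs. Illustrative only.
 */
public class TestPatternSelection {

    public static void main(String[] args) {
        // accept+: prefix (connect, plan, open) followed by the accept step (enter -> nok).
        Predicate<List<String>> acceptPlus = t ->
                t.containsAll(List.of("connect", "plan", "open")) && t.contains("enter:nok");
        // Reject scenarios: traces containing close or leave are discarded.
        List<Predicate<List<String>>> rejects = List.of(
                t -> t.contains("close"),
                t -> t.contains("leave"));

        List<List<String>> candidates = List.of(
                List.of("connect", "plan", "open", "enter:nok"),
                List.of("connect", "plan", "open", "close", "enter:nok"));

        for (List<String> trace : candidates) {
            boolean keep = acceptPlus.test(trace) && rejects.stream().noneMatch(r -> r.test(trace));
            System.out.println(trace + " -> " + (keep ? "in the test pattern" : "filtered out"));
        }
    }
}
```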
Synthesis of the test case • Inputs of TGV: • Simulation API • LTS representing the Test Pattern • Which actions are internal? Which are inputs? Which are outputs? • Output of TGV: IOLTS representing a test case • UML test case derivation
Product Line architectures: example • (a) non-limited meetings • (b) limited meetings
An Example (recalled) • Prefix: x:user connect(x) -> ok; plan(*, x) -> ok; open(*, x) -> ok • S+ (accept scenario): x:user enter(*, x) -> nok • S- (reject scenarios): y:user close(*, y); y:user leave(*, y)
Test patterns and test cases • Synthesized test case (objects: server:demoServer, user1:user, user2:user, user3:user, user4:user) • Preamble: connect(user1) -> ok; plan(aMtg, user1) -> ok; open(aMtg, user1) -> ok; enter(aMtg, user1) -> ok; enter(aMtg, user2) -> ok; enter(aMtg, user3) -> ok • Test objective: enter(aMtg, user4) -> nok
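The same test case, written as a directly executable check against a hypothetical server API. The DemoServer class, its method signatures and the participant limit of 3 are assumptions made for illustration; the actual output of the synthesis is a UML scenario.

```java
/**
 * The synthesized test case above, as a plain executable check against a
 * hypothetical server API returning boolean verdicts. Illustrative only.
 */
public class EnterLimitedMeetingTest {

    /** Minimal stand-in for the demonstration edition: meetings limited to 3 participants. */
    static class DemoServer {
        static final int LIMIT = 3;
        private int participants = 0;

        boolean connect(String user)           { return true; }
        boolean plan(String mtg, String user)  { return true; }
        boolean open(String mtg, String user)  { return true; }
        boolean enter(String mtg, String user) {
            if (participants >= LIMIT) {
                return false;                  // limited meeting: no room left
            }
            participants++;
            return true;
        }
    }

    public static void main(String[] args) {
        DemoServer server = new DemoServer();
        // Preamble
        check(server.connect("user1"), true);
        check(server.plan("aMtg", "user1"), true);
        check(server.open("aMtg", "user1"), true);
        check(server.enter("aMtg", "user1"), true);
        check(server.enter("aMtg", "user2"), true);
        check(server.enter("aMtg", "user3"), true);
        // Test objective: the fourth participant is rejected (limited meeting)
        check(server.enter("aMtg", "user4"), false);
        System.out.println("test passed");
    }

    static void check(boolean actual, boolean expected) {
        if (actual != expected) {
            throw new AssertionError("verdict fail: expected " + expected);
        }
    }
}
```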
Conclusion • From early modeling to test cases: • from reusable and generic test patterns • to concrete test cases, specific to each product • Methodology "not fully automated" ...
Testability? • A quality factor • The property of a piece of software to be easily tested • A testability measurement • to estimate the testing effort • early in the software life-cycle • and, hopefully, to bring the design closer to a correct implementation
Objectives • Focus on OO-specific testing problems • In OO software, control is spread all over the design • object interactions • The UML class diagram is the main specification/design reference • The test criterion can be defined with respect to this reference
Objectives • The class diagram is often under-specified from a testing point of view • Many potential object interactions will never occur in the final software • Make the class diagram more complete to avoid hard-to-test interactions between objects
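As a toy illustration of the kind of class-diagram-level indicator this suggests, the sketch below counts the client/supplier class pairs that could interact through (possibly transitive) associations; both the metric and the example diagram are assumptions, not the measurement defined in this work.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

/**
 * Toy class-diagram indicator: number of class pairs that could interact,
 * directly or transitively, through directed associations. Illustrative only.
 */
public class InteractionCount {

    public static void main(String[] args) {
        // Directed associations of a small, invented class diagram.
        Map<String, List<String>> uses = Map.of(
                "Client", List.of("Server"),
                "Server", List.of("Meeting"),
                "Meeting", List.of("Moderator"),
                "Moderator", List.of());

        int potential = 0;
        for (String from : uses.keySet()) {
            Set<String> reachable = new HashSet<>();
            reach(from, uses, reachable);
            potential += reachable.size();
        }
        // The larger this number, the more interactions a tester may have to exercise.
        System.out.println("potential class interactions: " + potential);
    }

    static void reach(String from, Map<String, List<String>> uses, Set<String> seen) {
        for (String to : uses.get(from)) {
            if (seen.add(to)) {
                reach(to, uses, seen);
            }
        }
    }
}
```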