
  1. Outline • Protocol Verification / Conformance Testing • TTCN • Automated validation • Formal design and specification • Traffic Theory • Application Debugging

  2. Testing of the Telecommunications Systems • Functional Tests • Formal System Tests against system-level requirements • Protocol Verification • Performance/Load Tests • Interoperability Tests - IOT (Inter-working Tests) • Automatic Test Suites

  3. Protocol Testing • Conformance Testing • Performed utilising test equipment • Based on Conformance Test Suites/Cases specified by a standards body • Quite detailed in verifying the protocol implementation • Not well suited to performance/load testing • Inter-working Trials • Testing of real equipment inter-working • Good for performance/load testing • Hard to verify the protocol completely

  4. Conformance Testing • Based on ISO/IEC 9646 (also X.290), “Framework and Methodology of Conformance Testing of Implementations of OSI and CCITT Protocols” • Abstract Test Suites (ATS) consisting of Abstract Test Cases • Test Cases are defined using a “Black Box” model, i.e. only by controlling and observing the implementation through its external interfaces • Provide the basis for generic test tools and methods for verification of telecommunication standards and protocols

  5. ITU Conformance Testing Standards • X.290 - OSI conformance testing methodology and framework for protocol Recommendations • X.291 - Abstract test suite specification • X.292 - The Tree and Tabular Combined Notation (TTCN) • X.293 - Test realization • X.294 - Requirements on test laboratories and clients for the conformance assessment process • X.295 - Protocol profile test specification • X.296 - Implementation conformance statements

  6. Conformance Testing • Verification that protocols conform to the standard's requirements • PICS - Protocol Implementation Conformance Statement • Information provided by the protocol implementer on the extent of the implementation: which optional items are implemented, whether there are any restrictions … • Based on the PICS questionnaire (proforma) provided by the standards body • PIXIT - Protocol Implementation eXtra Information for Testing • Provides information on the physical configuration of the unit under test: e.g. telephone numbers, socket numbers ...

  7. Tree and Tabular Combined Notation (TTCN) • Defined as part of ISO/IEC 9646 (the X.290 series; TTCN itself is X.292) • Used as a language for the formal definition of test cases • Each test case is an event tree in which external behaviour such as "If we send the message 'connect request', either 'connect confirm' or 'disconnect indication' will be received" is described • Messages can be defined using either the TTCN-type notation or ASN.1

  8. TTCN Tests Reactive Systems • [Diagram: the tester applies a stimulus and observes the response through a PCO attached to the IUT] • PCO = Point of Control and Observation • IUT = Implementation Under Test

  9. A Typical Test Architecture • [Diagram: the Upper Tester accesses the IUT at its upper (U) interface, while the Lower Tester reaches the IUT's lower (L) interface through the Underlying Service Provider]

  10. Parallel Test Architecture • [Diagram: a Main Test Component coordinates several Parallel Test Components over Coordination Points; the PTCs exercise the IUT through the Underlying Service Provider(s)] • MTC = Main Test Component • PTC = Parallel Test Component • CP = Coordination Point

  11. TTCN Formats • TTCN is provided in two forms: • a graphical form (TTCN.GR) suitable for human readability • a machine processable form (TTCN.MP) suitable for transmission of TTCN descriptions between machines and possibly suitable for other automated processing.

  12. Main Components of TTCN • Test Suite Structure • Data declarations (mainly PDUs) • TTCN data types • ASN.1 data types • Constraints • i.e., actual instances of PDUs • Dynamic behaviour trees (Test Cases) • Send, Receive, Timers, Expressions, Qualifiers • Test Steps

  13. Test Suite Structure • A TTCN test suite consists of four major parts • suite overview part • declarations part • constraints part • dynamic part

  14. Suite Overview Part • The suite overview part is a documentary feature comprising indexes and page references • This is part of the heritage of TTCN as a paper-oriented language • It contains a table of contents and a description of the test suite; its purpose is mainly to document the test suite and increase clarity and readability • It gives the reader a quick overview of the entire test suite

  15. Test Suite Overview Proforma
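  The proforma itself is reproduced only as an image in the source deck. As a rough, hedged illustration (field names approximate, values invented), a Test Suite Structure table typically records entries along these lines:

    Suite Name    : ExampleTransportTests
    Standards Ref : the protocol Recommendation being tested
    PICS Ref      : PICS proforma for the protocol
    PIXIT Ref     : PIXIT proforma for the protocol
    Test Method   : e.g. distributed or remote test method
    Comments      : overall purpose and scope of the suite
    (followed by the Test Case Index, Test Step Index and Default Index)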

  16. Suite Declarations Part • The declarations part is used for declaring • types • variables • timers • points of control and observation (PCOs) • test components • The types can be declared using either TTCN or ASN.1 type notation • Graphical tables are used for the declarations
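  The declaration tables themselves appear only as images in the source. As a hedged sketch of the kind of content they hold (all names invented for illustration), PCO and timer declarations reduce to rows such as:

    PCO Declarations
      PCO Name: L    PCO Type: TSAP    Role: LT    -- PCO used by the Lower Tester
      PCO Name: U    PCO Type: TSAP    Role: UT    -- PCO used by the Upper Tester

    Timer Declarations
      Timer Name: T_resp    Duration: 5    Unit: s    -- response supervision timer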

  17. TTCN Data Types • There is no equivalent to pointer types and, as a consequence, types cannot be directly or indirectly recursive • Two structured types are specific to TTCN • protocol data unit (PDU) - a data packet sent between peer entities in the protocol stack • abstract service primitive (ASP) - a data type in which to embed a PDU for sending and receiving

  18. Example of ASN.1 Type Definition
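  The example itself is an image in the source deck. As a hedged sketch of what is meant (type and field names invented), an ASN.1 type definition for a PDU could look like:

    ConnectRequest ::= SEQUENCE {
        sourceRef    INTEGER (0..65535),       -- reference chosen by the initiator
        destAddress  OCTET STRING (SIZE (4)),  -- called address
        userData     OCTET STRING OPTIONAL     -- optional user data
    }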

  19. Example of TTCN Type Definition
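  Again the example is an image in the source. As a hedged sketch, a simple TTCN tabular type definition is essentially a name/definition pair (names invented):

    Simple Type Definition
      Type Name       : MessageLength
      Type Definition : INTEGER (0..65535)
      Comments        : length field carried in every PDU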

  20. Variable Declarations
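  The declaration table is shown only as an image in the source. A hedged one-row sketch of a test suite variable declaration (names and values invented):

    Test Suite Variable Declarations
      Variable Name: RetryCount    Type: INTEGER    Value: 0    -- counts retransmissions during the test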

  21. Constraints Part In this part constraints are used for describing the values to be sent or the values expected to be received. The instances used for sending must be complete, but for receiving there is the possibility to define incomplete values using wildcards, ranges and lists.
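  As a hedged sketch of how this looks in practice (PDU, field and constraint names invented), note '?' matching any value and '*' matching any value or an omitted field on the receive side:

    Constraint ConnReq_1 for PDU ConnectRequest      -- used for sending: fully specified
      sourceRef    : 1
      destAddress  : '0A0B0C0D'O
      userData     : 'FF'O

    Constraint ConnConf_Any for PDU ConnectConfirm   -- used for receiving: wildcards allowed
      sourceRef    : ?        -- any value accepted
      destAddress  : ?        -- any value accepted
      userData     : *        -- any value, or field absent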

  22. Dynamic Part • In this part the actual tests are defined • Contains the test suite organised into • test groups – a grouping of test cases. It might, for example, be convenient to group all test cases concerning connection establishment and transport into a separate test group • test steps – a grouping of test events, similar to a subroutine or procedure in other programming languages • test events – the smallest, indivisible unit of a test suite. Typically, it corresponds to sending or receiving a message and the operations for manipulating timers

  23. TTCN Behavior Tree • Suppose that the following sequence of events can occur during a test whose purpose is to establish a connection, exchange some data, and close the connection. • a) CONNECTrequest, CONNECTconfirm, DATArequest, DATAindication, DISCONNECTrequest. • Possible alternatives to “valid behaviour” are • b) CONNECTrequest, CONNECTconfirm, DATArequest, DISCONNECTindication. • c) CONNECTrequest, DISCONNECTindication.

  24. TTCN Behavior Tree
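  The tree itself appears only as an image in the source deck; reconstructed from the three sequences listed on the previous slide (indentation marks continuation, equal indentation marks alternatives), it looks roughly like this:

    !CONNECTrequest
        ?CONNECTconfirm
            !DATArequest
                ?DATAindication
                    !DISCONNECTrequest        -- end of sequence a), valid behaviour
                ?DISCONNECTindication         -- end of sequence b)
        ?DISCONNECTindication                 -- end of sequence c)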

  25. TTCN Behavior Table The behaviour tables build up the tree by defining the events in lines at different indentation levels. Rows at the same indentation level are alternative events, and rows at the next indentation level are the continuation of the previous line. A line can consist of one or more of the following: • send statement (!) – states that a message is being sent • receive statement (?) – states that a message is being received • assignment – an assignment of values • timer operations – start, stop or cancel a timer • time-out statement • Boolean expressions, qualifying the execution • attachment (+) – acts as a procedure call

  26. TTCN Behavior Table All the leaves in the event tree are assigned a verdict, which can be: • pass – the test case completed without the detection of any errors • fail – an error was detected • inconclusive – there was insufficient evidence for a conclusive verdict to be assigned, although the observed behaviour was valid A verdict can be final or preliminary; a preliminary verdict does not terminate the executing test case. To describe what is happening in the test case, the dynamic behaviour can be annotated in plain language.

  27. Dynamic Behaviour Table
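  The proforma is shown only as an image in the source. A hedged sketch of a dynamic behaviour table (PCO, PDU, constraint and timer names invented), pairing each behaviour line with a constraints reference and a verdict:

    Nr  Behaviour Description                Constraints Ref   Verdict
    1   L !ConnectRequest START T_resp       ConnReq_1
    2       L ?ConnectConfirm                ConnConf_Any      PASS
    3       L ?DisconnectIndication                            INCONC
    4       ?TIMEOUT T_resp                                    FAIL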

  28. TTCN MP Form

  29. Testing and validation. To test an implementation of a protocol against an SDL specification, the following approach can be taken (a sketch follows below). • Get the interface into the appropriate state using a preamble. • Send the appropriate message. • Check all outputs. • Check the new state the protocol is in. • Perform a postamble to return the protocol to the null state.
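  In TTCN this recipe maps naturally onto test step attachments. A hedged skeleton (step, constraint and timer names invented); the parenthesised verdict is preliminary, since the postamble still runs, and checking the resulting protocol state is typically done indirectly through the exchanges in the postamble:

    +PR_EstablishConnection                  -- preamble: drive the IUT into the required state
        L !DataRequest START T_resp          DataReq_1
            L ?DataIndication                DataInd_Any    (PASS)
                +PO_Release                  -- postamble: return the IUT to the null state
            L ?OTHERWISE                     FAIL
            ?TIMEOUT T_resp                  FAIL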

  30. TTCN - What do you need to know? • To understand the terminology • To understand the test principles and the test environment • You don’t need to know how to write TTCN scripts, but you have to understand them so you can read test reports and analyse problems

  31. Various Test Equipment • Numerous pieces of test equipment support TTCN • Protocol Analysers • Protocol Simulators • Protocol Emulators

  32. Protocol Analysers (Monitors) • Monitors the ‘live’ communication between equipment and interprets messages in human-readable form • The same H/W may support monitoring of different protocol types • It may provide some statistical analysis capability • It may have advanced search/filtering capabilities • You may define events on which logging starts or stops • [Diagram: a protocol analyser taps the link between two pieces of equipment]

  33. Protocol Simulators • Allows users to write scripts (similar to TTCN) which send messages and react to received messages • Can be used for testing various protocols • Writing the required set of scripts may be quite a big effort • Suppliers often provide a set of test scripts targeting a particular protocol • [Diagram: the protocol simulator connects directly to the equipment under test]

  34. Protocol Emulators • Emulates the operation of equipment (not a full implementation) • No script writing required • Provides some control of behaviour and configuration • Not as flexible as a simulator • Usually one box can contain a protocol analyser, simulator and emulator • [Diagram: the protocol emulator connects directly to the equipment under test]

  35. Interoperability Testing (IOT) • Tests what users may expect from the product, which may include more than is demanded by the standards: performance, robustness and reliability • Proper functioning in the system where the product is finally installed • Interoperability with applications which use the product • IOT is usually performed in addition to formal Protocol Conformance Testing

  36. Pros & Cons of IOT • Potential IOT problems • Does not cover all possible protocol scenarios • Detected problems are difficult to reproduce • Often little possibility to actively control the test environment, which makes investigation difficult • The pros of IOT • Standards may be ambiguous, so different manufacturers implement them differently • Performance/load aspects are easier to test • IOT may be cheaper and faster • It gives the user confidence in the system's operational capabilities

  37. The Case for Automated Validation • If a standard that has not been validated and contains errors is ratified by a standards body, the following may occur: • Some implementations of the protocol may exist with serious errors that will affect service operation and service assurance • Different implementations of the protocol that are bug-free will have proprietary solutions to overcome the protocol errors and thus may not inter-work • Some vendors may choose to implement only a subset of the standardised protocols, relying on proprietary protocols

  38. The Case for Automated Validation • Even with simple protocols containing a small number of functional entities, messages and states, the number of possibilities will be more than most people have time to verify by hand • Communications protocol implementations are so complex that it is likely to be impossible to test all protocol combinations on a system that has implemented a protocol • In reality, standards are designed by committee, and although some members perform automatic validation, most don't, and protocols are modified to cope with bugs as they appear

  39. State Explosion Problem • Even with simple protocols containing a small number of functional entities, messages and states, the number of possibilities will be more than most people have time to verify by hand • Some problems have more permutations than a computer can go through, due to limitations on processing speed and memory

  40. Travelling salesman problem • A salesman spends his time visiting n cities (or nodes) cyclically. In one tour he visits each city just once, and finishes up where he started. In what order should he visit them to minimise the distance travelled? • If every city is connected to every other city, then the number of possible tours is (n-1)!/2. For only 100 cities this is about 4.67 x 10^155 • There is a mathematical proof that, for the number of towns in the US, the time required to list all combinations would be greater than the estimated age of the universe, and doing so would require more memory than there are atoms in the universe, according to current physics theory.
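  As a quick check of the figure above using the stated formula: $\frac{(100-1)!}{2} = \frac{99!}{2} \approx \frac{9.33 \times 10^{155}}{2} \approx 4.67 \times 10^{155}$ possible tours.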

  41. Automated analysis tools • Through the use of automatic tools that traverse all the possible states, a number of errors can be detected easily: • livelock and deadlock • output with no receiver • output with multiple receivers • exceeding the maximum number of instances for a process • decision value not expected in the set of answers • unreachable states

  42. Automated analysis tools • Errors related to data access, for example: • variable usage not compliant with its type • array overflow • Automated tools will not be able to detect undesirable behaviour caused by a poor specification

  43. Formal Design vs Traditional Design • Traditional design involves the following design cycle • High Level Design • Requirements are usually written in English (or equivalent) • The protocol is then defined using formal techniques such as SDL, MSC and ASN.1 • Low Level Design • Coding and testing • Most of the effort occurs here; coding and debugging thus become the focus of the protocol design

  44. Early attempts at Automated Validation • The first attempt at automated validation was by IBM in Zurich in the late ‘70s. • It used a technique known as perturbation analysis, which involves starting at a state, determining all of the possible next states, and then using those next states as inputs for further perturbations.

  45. Early attempts at Automated Validation • Early attempts at automated validation had several drawbacks: • There was no high-level formal notation for defining protocols in a generic way that could be used as input to validation software • Automated validation software required a large amount of human input • Software had to be written for each new protocol being validated

  46. Formal Design • The later an error is detected, the more expensive it is to fix. • Formal design techniques are used to verify the correctness of the protocol during the high-level design phase. • This involves testing the protocol description as defined in a specification language such as SDL. • This requires • An unambiguous notation • Effective validation tools

  47. Formal Protocol Specification Languages and Protocol Validation • Formal protocol definition languages such as SDL can be used as input to protocol validation tools. • Three common description languages used for this approach are SDL, LOTOS and Estelle.

  48. FDTs (Formal Description Techniques) are also intended to satisfy objectives such as: • a basis for analyzing specifications for correctness, efficiency, etc.; • a basis for determining completeness of specifications; • a basis for verification of specifications against the requirements of the Recommendation; • a basis for determining conformance of implementations to Recommendations; • a basis for determining consistency of specifications between Recommendations; • a basis for implementation support.
