STV TSPC System Test Specification

Contents:
1. Location of the Test Specification
2. Sources of Test Cases
3. Test Cases from the Requirement Specs
4. Test Cases from the Design Documentation
5. Test Cases from the Usage Profile
6. Test Case Selection Criteria
7. Black-Box Test Methods
8. Data Flow Analysis
9. Work Flow Analysis
10. Boundary Value Analysis
11. Command Syntax Analysis
12. State Transition Analysis
13. Test Case Classes
14. Test Case Definition
15. Partitioning of Test Data
16. Test Case Tables
17. Test Cases for the Calendar Test
18. Test Script for the Calendar Test
19. Test Cases = Test Specification
20. Test Case Specification with TESTSPEC
21. Requirement-based Testing
22. A7 Flight Control Specification
23. A7 Test Path Specification
24. A7 Test Script
STV TSPC-1 Location of the Test Specification

Test specification is the activity of specifying test cases; the test cases are its result. It is located in the test process as follows:

Init → Test planning (Test Plan) → Test design (Test Design) → Test specification (Test Cases) → Test execution (Test Results) → Test evaluation (Test Documentation) → Closure
STV TSPC-2 Sources of Test Cases

Test cases are derived from three sources: the Requirement Specs, the Design Documentation, and the Usage Profile.
STV TSPC-3 Test Cases from the Requirement Specs

Use case steps:
1. Customer opens System Menu
2. Customer selects Function "Open an account"
3. Customer gives name, address and profession
4. Customer enters transaction

Test cases derived from the requirements:
5. If input is incorrect, the system reports an error; the customer must correct and repeat (1)
6. If the customer is a risk, the system rejects him (2)
7. If the customer already has an account, the function for an extra account is started (3)
8. If no account number is free, the transaction ends (4)
9. If the customer account exists, the function to update the account is started (5)
10. The system opens the new account (6:8)
STV TSPC-4 Test Cases from the Design Documentation

Decision table (Y = yes, N = no, - = irrelevant):

Conditions                           | R1 R2 R3 R4 R5 R6 R7 R8 R9
Name, Address or Profession invalid  |  Y  N  N  N  N  N  N  N  N
Customer is a risk                   |  -  Y  N  N  N  N  N  N  N
Customer has an account              |  -  -  Y  N  N  N  N  N  N
Account Number not free              |  -  -  -  Y  N  N  N  N  N
Account already opened               |  -  -  -  -  Y  N  N  N  N

Actions: Report input error; Reject customer; Report "already exists"; Report "no account number"; Report "account opened"; Open the account; Update account statistic; Confirm new account.
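A decision table like this one can be executed directly as code. The sketch below is an assumption, not the original system: the condition names and message strings are invented, and the conditions are checked in the rule order of the table, yielding one test case per rule.

```python
def derive_action(invalid_input, is_risk, has_account, no_number_free, already_opened):
    """Check the conditions in the rule order of the decision table (a sketch;
    the message strings are assumed, not taken from the original system)."""
    if invalid_input:
        return "Report input error"
    if is_risk:
        return "Reject customer"
    if has_account:
        return "Report already exists"
    if no_number_free:
        return "Report no account number"
    if already_opened:
        return "Report account opened"
    return "Open the account"

# One test case per rule: each tuple makes exactly one condition true.
test_cases = [
    ((True,  False, False, False, False), "Report input error"),
    ((False, True,  False, False, False), "Reject customer"),
    ((False, False, True,  False, False), "Report already exists"),
    ((False, False, False, True,  False), "Report no account number"),
    ((False, False, False, False, True),  "Report account opened"),
    ((False, False, False, False, False), "Open the account"),
]
for inputs, expected in test_cases:
    assert derive_action(*inputs) == expected
```

Each test case corresponds to one column of the table, so the rule set and the test set stay in step.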
STV TSPC-5 Test Cases from the Usage Profile

Screen mask "Open Account" with the input fields: Name, Address, Profession.
STV TSPC-6 Test Case Selection Criteria

Criteria for test case and test data selection (Bons & van Megen, SQS, 1983):

Selection criteria & priority:
- Representative selection
- Risk-based selection
- Error-probability selection

Classification of test cases:
- Black-box
- White-box

Selection criteria for test data:
- Normal values
- Extreme values
- "Invalid values"
STV TSPC-7 Black-Box Test Methods (from Boris Beizer, Black-Box Testing, John Wiley & Sons, New York, 1995):
- Data flow analysis
- Process flow analysis
- Value domain analysis
- Syntax analysis
- State transition analysis
STV TSPC-8 Data Flow Analysis (based on Data Flow Diagrams)

Example: account withdrawal. Data inputs: Card no., PIN, Account no., Amount. The process validates the inputs and generates the outputs. Data outputs: Invalid Card message, Invalid PIN message, New account balance, Receipt.
STV TSPC-9 Work Flow Analysis (based on Work Flow Diagrams)

Work flow process with one entry and several exits:
- Start Process
- certify Customer (exit: Customer is not credible)
- store Customer (exit: Customer already exists)
- assign AccountNo. (exit: No Account available)
- open Account (exit: Account already opened)
- End: New Account opened
STV TSPC-10 Boundary Value Analysis (based on Data Value Domains)

Date test: today's date falls into one of four domains: <1900, 1900:1949, 1950:1999, >=2000. The boundary values to be tested are therefore 1900, 1950 and 2000, each as both a lower and an upper boundary.
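These boundary values can be exercised mechanically: test each boundary together with its immediate neighbour on the other side. A minimal sketch, assuming the fourth domain is >=2000 and with an illustrative function name:

```python
def date_class(year):
    """Classify a year into the slide's four domains (boundaries assumed
    at 1900, 1950 and 2000)."""
    if year < 1900:
        return "<1900"
    if year < 1950:
        return "1900:1949"
    if year < 2000:
        return "1950:1999"
    return ">=2000"

# Boundary value analysis: each boundary and its immediate neighbours.
boundary_tests = {
    1899: "<1900",     1900: "1900:1949",
    1949: "1900:1949", 1950: "1950:1999",
    1999: "1950:1999", 2000: ">=2000",
}
for year, expected in boundary_tests.items():
    assert date_class(year) == expected
```

The point of the method is that off-by-one errors in the comparisons (< vs <=) are caught only by the neighbour values, not by values in the middle of a domain.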
STV TSPC-11 Command Syntax Analysis (based on the Grammar)

DOS Copy statement:
<drive>:\[{<directory>\}][File[.ext]] [,<drive>:\[{<directory>\}][File[.ext]]]

Test cases from the syntax elements, (1)-(9) on the target, (10) the comma, (11)-(19) on the source:
(1) target drive is missing         (11) source drive is missing
(2) target : is missing             (12) source : is missing
(3) target \ is missing             (13) source \ is missing
(4) target directory                (14) source directory
(5) target \ is missing             (15) source \ is missing
(6) n directories in target         (16) n directories in source
(7) target File is missing          (17) source File is missing
(8) target extension is missing     (18) source extension missing
(9) target is missing               (19) source is missing
(10) , comma is missing
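One common way to mechanize this kind of syntax analysis is element deletion: decompose a valid command into its grammar elements and drop one element at a time, which yields one negative test per numbered item above. A sketch with an assumed example path (the element list is illustrative, not the full COPY grammar):

```python
# Grammar elements of an assumed valid target "C:\dir\file.ext".
elements = ["C", ":", "\\", "dir", "\\", "file", ".", "ext"]

def mutants(parts):
    """Yield one command string per deleted grammar element
    (element-deletion negative tests)."""
    for i in range(len(parts)):
        yield "".join(parts[:i] + parts[i + 1:])

valid = "".join(elements)            # the syntactically valid command
negative_tests = list(mutants(elements))

# One negative test per element, and every mutant differs from the valid command.
assert len(negative_tests) == len(elements)
assert valid not in negative_tests
```

Elements that are optional in the grammar (directory, extension) produce mutants that should still be accepted; mandatory elements (drive, colon) produce mutants that must be rejected.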
STV TSPC-12 State Transition Analysis (based on State Transition Graph)

State transitions (rows = current state, X = allowed successor):

from \ to   Dust  Child  Single  Married  Divorced  Widowed
Dust          .     X      .       .        .         .
Child         X     .      X       .        .         .
Single        X     .      .       X        .         .
Married       X     .      .       .        X         X
Divorced      X     .      .       X        .         .
Widowed       X     .      .       X        .         .

Ashes to Ashes and Dust to Dust. Life is hard, then you die.
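The transition matrix can be turned into an executable test oracle for state transition testing. The successor sets below are reconstructed from the X marks on the slide; the exact column assignment is an assumption:

```python
# Successor sets reconstructed from the slide's matrix (column assignment assumed).
TRANSITIONS = {
    "Dust":     {"Child"},
    "Child":    {"Dust", "Single"},
    "Single":   {"Dust", "Married"},
    "Married":  {"Dust", "Divorced", "Widowed"},
    "Divorced": {"Dust", "Married"},
    "Widowed":  {"Dust", "Married"},
}

def is_legal(path):
    """A test path is legal when every consecutive pair is an allowed transition."""
    return all(b in TRANSITIONS[a] for a, b in zip(path, path[1:]))

# Positive test: a full life cycle; negative test: Dust cannot jump to Single.
assert is_legal(["Dust", "Child", "Single", "Married", "Divorced", "Married", "Dust"])
assert not is_legal(["Dust", "Single"])
```

State transition testing then consists of covering every allowed transition at least once and attempting a selection of the forbidden ones.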
STV TSPC-13 Test Case Classes
- System test cases
- Integration test cases
- Unit test cases
STV TSPC-14 Test Case Definition

Date: 01.01.98   Test Case No.: 20   Function: Order-Article   Test Object: Order Entry Use Case

INPUT                                      OUTPUT
Source Object   Input-Name  Input-Value   Output-Value  Output-Name  Target Object
Input Message   Field E1    4711          4711          Field A1     Output Message
Input Message   Field E2    Smith         Schmidt       Field A2     Output Message
Input Message   Field E3    10            11            Field A3     Output Message
Input Message   Field E4    Article       Article       Field A4     Output Message
Table B         Field B1    4711
Table B         Field B2    Köln          10            Field C3     Table B
Table B         Field B3    777
Table C         Field C1    4711
Table C         Field C2    Article
Table C         Field C3    80            70            Field C3     Table C
STV TSPC-15 Partitioning of Test Data

INPUTS - Order:
- OrderNumber
- CustomerNumber
- CustomerName
- OrderPosition (9)
  - ArticleNumber (9)
  - OrderAmount (9)

OUTPUTS - Messages:
- Order rejected
- OrderPosition rejected
- OrderPosition is deferred
- OrderPosition is fulfilled

OUTPUTS - Error Messages:
- Customer not known
- Customer not credible
- Article not available
- ArticleAmount insufficient

MASTER DATA:
- Customer: CustomerNumber (exists/absent), CustomerName, CustomerCredibility
- Article: ArticleNumber (exists/absent), ArticleName, ArticleAmount, ArticlePrice
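Partitioned inputs like these are typically combined into candidate test cases by taking the cross product of the equivalence classes. A sketch with assumed class labels (the partitions below follow the exists/absent and credibility distinctions of the master data, but the concrete labels are invented):

```python
from itertools import product

# Equivalence classes per input dimension (labels assumed for illustration).
partitions = {
    "CustomerNumber":      ["exists", "absent"],
    "CustomerCredibility": ["credible", "not credible"],
    "ArticleNumber":       ["exists", "absent"],
    "ArticleAmount":       ["sufficient", "insufficient"],
}

# Cross product of all classes = candidate test cases before pruning.
combos = list(product(*partitions.values()))
assert len(combos) == 2 * 2 * 2 * 2   # 16 combined classes
```

In practice the full cross product is pruned: combinations dominated by an earlier error (e.g. an absent customer) collapse into a single test case, which is how the slide's test case tables end up with far fewer columns than 16.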
STV TSPC-16 Test Case Tables

Order Data:
                TC-1    TC-2   TC-31  TC-32  TC-33  TC-4   TC-51  TC-52  TC-53  ...  TC-10
OrderNumber       1       2      3      3      3      4      5      5      5    ...    5
CustomerNumber  3333    6666   7777   7777   7777   8888   9999   9999   9999   ...  9999
CustomerName   Schmidt  Huber  Meyer  Meyer  Meyer  Kohl   Bahr   Bahr   Bahr   ...  Bahr
OrderPosition (9) 1       1      1      2      3      1      1      2      3    ...   10
ArticleNumber (9) 4710  4710   4710   4711   4712   4713   4712   4711   4715   ...  4716
OrderAmount (9)   1       2      3      9      4      2      6      7      4    ...   19

Customer Data:
                     TC-2   TC-3   TC-4   TC-5
CustomerNumber       6666   7777   8888   9999
CustomerName         Huber  Meyer  Kohl   Bahr
CustomerCredibility    0      1      1      1

Article Data:
                TC-31  TC-32  TC-4   TC-51  TC-52
ArticleNumber   4711   4712   4713   4715   4716
ArticleName     Buch   Heft   Stift  Karte  Stempel
ArticleAmount     7      9      9      8     20
ArticlePrice    99,99  99,99  99,99  99,99  99,99
STV TSPC-17 Test Cases for the Calendar Test

Inputs (B = bad, G = good, - = not used):

CalendarData    TC1 TC2 TC3 TC4 TC5 TC6 TC7 TC8 TC9 TC10 TC11
CalendarNumber   1   2   3   4   4   4   4   4   5   5    5
CalendarName     B   G   G   G   G   G   G   G   G   G    G
WeekNumber       -   0  53   1   1   1   1   1  52   52   52
WeekDay          -   -   -   B   G   G   G   G   G   G    G
Activity         -   -   -   -   1   2   3   4   1   2    13
StartTime        -   -   -   -   B   G   G   G   G   G    G
EndTime          -   -   -   -   G   B   G   G   G   G    G
Description      -   -   -   -   G   G   B   G   G   G    G
STV TSPC-18 Test Script for the Calendar Test

Testcase: TC_1
{ CustomerNumber = 1;
  CustomerName = " "; }

Testcase: TC_2
{ CustomerNumber = 2;
  CustomerName = "Meyer";
  WeekNumber = 0; }

Testcase: TC_3
{ CustomerNumber = 3;
  CustomerName = "Untermeier";
  WeekNumber = 53; }

Testcase: TC_4
{ CustomerNumber = 4;
  CustomerName = "Mittelmeier";
  WeekNumber = 1;
  WeekDay = "Monday"; }

Testcase: TC_5
{ CustomerNumber = 5;
  CustomerName = "Obermeier";
  WeekNumber = 1;
  WeekDay = "Monday";
  Activity (1)
  { StartTime = 0000;
    EndTime = 2400;
    Description = "sleep";
  }
  Activity (2)
  { StartTime = 2400;
    EndTime = 2430;
    Description = "sleep";
  }
}
STV TSPC-19 Test Cases = Test Specification

The Requirements serve as the ORACLE for deriving the Test Cases; the Program Specs (DESIGN) yield the Test Specs; the Programs are exercised by the Test Procedures. In all cases: test against the specs.
STV TSPC-20 Test Case Specification with TESTSPEC

Use Case list: UseCase A, UseCase B, UseCase C, ..., UseCase N. Each use case is specified in the form IF <Pre Condition> THEN <Post Condition>. Per use case, one test case is generated for every pre/post condition combination.
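The TESTSPEC rule, one test case per pre/post combination of a use case, can be sketched as follows (the use case and condition names are invented for illustration; only the generation scheme comes from the slide):

```python
# Assumed example: one use case with its pre- and postconditions.
use_cases = {
    "OpenAccount": {
        "pre":  ["input valid", "input invalid"],
        "post": ["account opened", "error reported"],
    },
}

def generate(use_cases):
    """One test case per (use case, precondition, postcondition) combination."""
    cases = []
    for name, uc in use_cases.items():
        for pre in uc["pre"]:
            for post in uc["post"]:
                cases.append((name, pre, post))
    return cases

cases = generate(use_cases)
assert len(cases) == 4   # 2 preconditions x 2 postconditions
```

Not every generated combination is satisfiable ("input valid" should never lead to "error reported"); infeasible pairs are discarded during review, the rest become the test specification.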
SYST TSPC-21 Requirement-based Testing

Verification against the SPEC = correctness, governed by testing standards (e.g. IEEE Std 829).
Validation of the TEST OBJECT = reliability, governed by an acceptance standard and human judgement, on the target computer in the operational environment.

Criteria: Usability, Security, Fault tolerance, Coverage measurement, Maintainability, Testability, Reliability.
SYST TSPC-22 NRL A-7 Specification of Inertial Measurement Scale Calibration
SYST TSPC-23 Path Expressions in Inertial Measurement Scale Calibration

A1 SET-IMSCAL-FINE
  IF (FLIGHT-MODE = *LAUTOCAL* OR FLIGHT-MODE = *LANDALN* OR FLIGHT-MODE = *01UPDATE*)
  IF (@T(IN-MODE) AND //IMSCAL// = $COARSE$);

A1 SET-IMSCAL-FINE
  IF (FLIGHT-MODE = *HUDALN*)
  IF (@T(IN-MODE) AND /IMSMODE/ = $GNDAL$ AND //IMSCAL// = $COARSE$);

A2 SET-IMSCAL-COARSE
  IF (FLIGHT-MODE = *HUDALN*)
  IF (@T(IN-MODE) AND (/IMSMODE/ = $NORM$ OR /IMSMODE/ = $INER$) AND //IMSCAL// = $FINE$);

A2 SET-IMSCAL-COARSE
  IF (FLIGHT-MODE = *SAUTOCAL* OR FLIGHT-MODE = *SINSALN* OR FLIGHT-MODE = *AIRALN*);

The conditions C1-C6 and the actions A1, A2 label the nodes of the corresponding path graph.
SYST TSPC-24 Test Script of Inertial Measurement Scale Calibration

FUNCTION: CHANGE-SCALE-FACTOR
ASSERT PRE FLIGHT-MODE SET (*LAUTOCAL*, *SAUTOCAL*, *LANDALN*, *SINSALN*, *01UPDATE*, *HUDALN*, *AIRALN*, *DIG*, *DI*, *I*);
ASSERT PRE /IMSMODE/ SET ($NORM$, $GNDAL$, $INER$, $MAGSL$, $GRID$);
ASSERT PRE //IMSSCAL// SET ($FINE$, $COARSE$);
IF (FLIGHT-MODE = *LAUTOCAL* OR FLIGHT-MODE = *LANDALN* OR FLIGHT-MODE = *01UPDATE*)
  IF (//IMSSCAL// = $COARSE$)
  THEN ASSERT POST //IMSSCAL// = $FINE$;
IF (FLIGHT-MODE = *HUDALN*)
  IF (/IMSMODE/ = $GNDAL$ AND //IMSSCAL// = $COARSE$)
  THEN ASSERT POST //IMSSCAL// = $FINE$;
IF (FLIGHT-MODE = *HUDALN*)
  IF ((/IMSMODE/ = $NORM$ OR /IMSMODE/ = $INER$) AND //IMSSCAL// = $FINE$)
  THEN ASSERT POST //IMSSCAL// = $COARSE$;
IF (FLIGHT-MODE = *SAUTOCAL* OR FLIGHT-MODE = *SINSALN* OR FLIGHT-MODE = *AIRALN*)
  THEN ASSERT POST //IMSSCAL// = $COARSE$;
END CHANGE-SCALE-FACTOR;
SYST TSPC-25 Test Cases for Inertial Measurement Scale Calibration

By comparing the preconditions to be tested against the possible preconditions, it can be determined that there are 10 * 5 * 2 = 100 possible states and that only 9 of them are tested, giving an input coverage of only 9 %. On the output side, however, there is only one variable, //IMSSCAL//, and both of its states, $COARSE$ and $FINE$, are achieved. Since there are only two elementary operations within the function and both are tested, the functional coverage is 100 %. Assuming a 1:1 correspondence between specified functions and implemented functions, this example demonstrates that functional coverage based on the specification is an even weaker measurement than branch coverage: it is possible to test all functions without testing all condition outcomes. State coverage, on the other hand, is a much stronger coverage measurement than path coverage, since there is an M:1 relationship between states and paths and some states may not have a corresponding path, as depicted in the example. State testing is really the only way of detecting errors of omission.
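The coverage figures in this argument can be recomputed directly from the value sets of the test script:

```python
# Input (state) coverage: tested precondition states over all possible states.
flight_modes, ims_modes, ims_scales = 10, 5, 2     # sizes of the three value sets
possible_states = flight_modes * ims_modes * ims_scales
tested_states = 9
input_coverage = tested_states / possible_states

assert possible_states == 100
assert input_coverage == 0.09                      # 9 % input coverage

# Functional coverage: both elementary operations (set FINE, set COARSE) are tested.
functions_tested, functions_specified = 2, 2
assert functions_tested / functions_specified == 1.0   # 100 % functional coverage
```

The gap between 9 % state coverage and 100 % functional coverage is exactly the point of the slide: the stronger measure exposes how much of the input space the weaker one silently ignores.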