
Comprehensive Data-Based Testing Methods for Software Components

Explore data flow analysis, test data generation, cause & effect analysis, boundary analysis, equivalence classes, and more in a software testing environment.




Presentation Transcript


  1. TEST DATA Contents: Data-based Testing; Data-based Test Cases; Environment of a Software Component; Data Flow Analysis; Test Data Sets; Defining the Input Data; Defining the Control Data; Random Data Generation; Test Data Derivation; Progression & Degression Data Generation; Boundary Analysis; Building Equivalence Classes; Inputs/Functions/Outputs; Sample Equivalence Classes; Sample Boundary Values; Cause & Effect Analysis; Cause & Effect Graph; Sample Cause & Effect Graph; Data Usage Types; Input/Output Specification; Validating System Databases; Validating System Outputs; Generation of an XML Interface; Validation of the XML Interface

  2. TEST DATA-1 Data-based Testing: testing programs against their data structures.

  3. TEST DATA-2 Data-based Test Cases. [Diagram: the test object receives inputs (Message_In with Fields E1-E3 and Params; Rec-B with Fields B1-B5; Tab-1; List-1 with Instances 1-3 carrying Fields 1-3) and produces outputs (Message_Out with Fields A1-A3 and Params). Each test (1. Test, 2. Test, 3. Test) pairs an input state with an output state.]

  4. TEST DATA-3 Environment of a Software Component. [Diagram: the horizontal data flow carries input parameters and incoming messages (XML/IDL/etc.) into the test object, and output parameters and outgoing messages (XML/IDL/etc.) out of it. The vertical data flow covers global data, import and export files, DB retrieves with return codes, DB stores via module calls, and pre-arguments/post-results.]

  5. TEST DATA-4 Data Flow Analysis. [Diagram: two data flows (a-flow and b-flow) traced through a graph of data nodes (D1-D6) and processing nodes (P1-P5), with the a- and b-edges numbered along each flow.]

  6. TEST DATA-5 Defining Test Data Sets
  S1 := set of defined Arguments (Inputs)
  S2 := set of possible Arguments (Inputs)
  S3 := set of specified Results (Outputs)
  S4 := set of actual Results (Outputs)
  X = F(Y): the function F maps the arguments Y onto the results X.
  F is correct when S1 ⊆ S2 ∧ S4 ⊆ S3 (only possible arguments are used, only specified results are produced).
  F is complete when S2 ⊆ S1 ∧ S3 ⊆ S4 (every possible argument is defined, every specified result is produced).
  F is consistent when S1 = S2 ∧ S3 = S4.

  7. TEST DATA-6 Defining Input Data: within the set of all data, the input data are the arguments and sending variables, e.g. Y in X = Y + 4; ADD Y TO X; X = SUB(Y); CALL SUB USING Y, X.

  8. TEST DATA-7 Defining Control Data: within the input data, the control data are the predicates, e.g. I and N in DO WHILE (I < N); IF I > 0 THEN.

  9. TEST DATA-8 Random Data Generation: a random-data generator fills the data storage (before image) from a data type definition table, for example:
  Binary fields: -32627, -1, 0, 1, 32627
  Packed fields: -1000, 0, 1000, 1000000
  Hexadecimal fields: 40, 00, A2, FF
  Character fields: ' ', 'XXXXX'
  Bit fields: 0, 1
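The table-driven generator described above can be sketched in Python. The value sets come from the slide; the TYPE_TABLE structure and the field names in the usage line are assumptions for illustration:

```python
import random

# Assumed data-type definition table: each entry maps a field type to a
# generator drawing randomly from the value sets listed above.
TYPE_TABLE = {
    "binary":      lambda: random.choice([-32627, -1, 0, 1, 32627]),
    "packed":      lambda: random.choice([-1000, 0, 1000, 1000000]),
    "hexadecimal": lambda: random.choice(["40", "00", "A2", "FF"]),
    "character":   lambda: random.choice([" ", "XXXXX"]),
    "bit":         lambda: random.choice([0, 1]),
}

def generate_record(field_types):
    """Build one random before-image record from (field name, type) pairs."""
    return {name: TYPE_TABLE[ftype]() for name, ftype in field_types}

record = generate_record([("Amount", "packed"), ("Flag", "bit"), ("Code", "hexadecimal")])
```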

  10. TEST DATA-9 Test Data Derivation Approaches: X = F(Y), where the arguments Y (1-n) produce the results X (1-n). The input value domains split into valid and invalid input values, the output value domains into valid and invalid output values. Derivation approaches:
  Equivalence Classes: representative value set, valid value set, invalid value set.
  Boundary Analysis: numeric value ranges, upper & lower bounds.
  Cause & Effect Analysis: events/starting states, results/state transitions.

  11. TEST DATA-10 Progression & Degression Data Generation: increasing & decreasing intervals, e.g.
  1, 2, 3, 4, 5, 6, 7, 8, 9, etc.
  100, 200, 300, 400, 500, 600, 700, 800, 900, etc.
  10.5, 15.5, 20.5, 25.5, 30.5, 35.5, etc.
  0, -1, -2, -3, -4, -5, -6, -7, etc.
  0, -100, -200, -300, -400, -500, -600, -700, etc.
  50.3, 40.3, 30.3, 20.3, 10.3, 0.3, -0.7, -10.7, etc.
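Series like those above can be produced by two small helpers; the names progression and degression are illustrative, a minimal sketch:

```python
def progression(start, step, count):
    """Increasing interval series, e.g. 1, 2, 3, ... or 100, 200, 300, ..."""
    return [start + i * step for i in range(count)]

def degression(start, step, count):
    """Decreasing interval series, e.g. 0, -1, -2, ... or 0, -100, -200, ..."""
    return [start - i * step for i in range(count)]
```

For example, progression(100, 100, 9) reproduces the second series above, and degression(0, 100, 8) the fifth.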

  12. TEST DATA-11 Boundary Analysis. [Diagram: each value range is tested at its lower and upper boundary; sample lower boundaries -100, -32,000 and 0 against sample upper boundaries +1,000, +50 and +64,000.]

  13. TEST DATA-12 Building Equivalence Classes. [Diagram: input domain with lower bound 100 and upper bound 200, split into a first subdomain 100:150 and a second subdomain 150:200 (sub-subdomains). Representative numeric values for testing: 110, 120, 130, 140, 160, 170, 180, 190, plus the bounds 100, 150 and 200; values < 100 and > 200 lie outside the input domain.]

  14. TEST DATA-13 Inputs/Functions/Outputs
  Function Stock_Entry:
  Inputs: Transaction (OrderPosition, TransactionType, OrderNumber, ArticleNumber, StockinAmount, OrderAmount); Article (ArticleNumber, ArticleName, ArticlePrice, ArticlesonStock, ArticleStatus)
  Outputs: Article (ArticlesonStock, ArticleStatus)
  Function Stock_Exit:
  Inputs: Transaction (OrderPosition, TransactionType, OrderNumber, ArticleNumber, StockoutAmount); Article (ArticleNumber, ArticlesonStock)
  Outputs: Article (ArticleNumber, ArticlesonStock); Protocol (ArticleName, ArticleNumber, ArticlePrice, OrderAmount, ArticlesonStock); OrderPosition (OrderNumber, ArticleNumber, StockAmount)

  15. TEST DATA-14 Sample Equivalence Classes
  Input Parameter            | Valid Values              | Invalid Values
  Transaction.TransType      | 'IN', 'OUT'               | not in ('IN', 'OUT')
  Transaction.ArticleNumber  | 101, 150, 177, 182        | < 100 OR > 200
  Transaction.StockOutAmount | 1, 99, 50                 | < 1 OR > 99
  Transaction.StockInAmount  | 1, 99, 500                | < 1 OR > 500
  Article.ArticleNumber      | in set of ArticleNumbers  | not in set of ArticleNumbers
  Article.ArticleName        | 'BOOK', 'DISC', 'CD'      | 'NAILS', 'SCREWS', ' '
  Article.ArticlePrice       | 5.00, 48.80, 99.99        | < 5.00 OR > 99.99
  Article.ArticlesonStock    | 1, 99, 500                | < 0 OR > 500
  OrderId                    | 'Y', 'N'                  | not in ('Y', 'N')
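A table like this can be expressed directly as predicates plus representative values. A sketch assuming the numeric bounds shown above; EQUIVALENCE_CLASSES and classify are illustrative names:

```python
# Representative valid values and validity predicates for selected parameters,
# taken from the equivalence-class table above.
EQUIVALENCE_CLASSES = {
    "ArticleNumber":  ([101, 150, 177, 182], lambda v: 100 <= v <= 200),
    "StockOutAmount": ([1, 99, 50],          lambda v: 1 <= v <= 99),
    "StockInAmount":  ([1, 99, 500],         lambda v: 1 <= v <= 500),
    "ArticlePrice":   ([5.00, 48.80, 99.99], lambda v: 5.00 <= v <= 99.99),
}

def classify(parameter, value):
    """Return 'valid' or 'invalid' for a value of the given input parameter."""
    _, predicate = EQUIVALENCE_CLASSES[parameter]
    return "valid" if predicate(value) else "invalid"
```

A test-data generator can then pick the representative values for valid test cases and any value failing the predicate for invalid ones.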

  16. TEST DATA-15 Sample Boundary Values
  Input Parameter            | Lower Boundary   | Upper Boundary
  Transaction.ArticleNumber  | 99, 100, 101     | 199, 200, 201
  Transaction.StockInAmount  | 0, 1, 2          | 499, 500, 501
  Transaction.StockOutAmount | 0, 1, 2          | 98, 99, 100
  Article.ArticleNumber      | 99, 100, 101     | 199, 200, 201
  Article.ArticlePrice       | 4.99, 5.00, 5.01 | 99.98, 99.99, 100.00
  Article.ArticlesonStock    | -1, 0, 1         | 499, 500, 501
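Each triple in the table is a boundary plus its two neighbours, so the values can be derived mechanically from the equivalence-class bounds; boundary_values is an illustrative name:

```python
def boundary_values(lower, upper, delta=1):
    """Return the test values just below, on and just above each boundary."""
    return {
        "lower": [lower - delta, lower, lower + delta],
        "upper": [upper - delta, upper, upper + delta],
    }

# ArticleNumber with bounds 100..200 yields 99, 100, 101 and 199, 200, 201.
article_number = boundary_values(100, 200)
```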

  17. TEST DATA-16 Cause & Effect Analysis: logical relations between inputs & outputs.
  Identity:          IF A THEN B
  Negation:          IF NOT A THEN B
  Disjunction (OR):  IF (A ∨ B) THEN C; IF NOT (A ∨ B) THEN NOT C
  Conjunction (AND): IF (A ∧ B) THEN C; IF NOT (A ∧ B) THEN NOT C
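Each relation can be checked exhaustively with a decision table over its causes; a minimal sketch, where decision_table is an illustrative name:

```python
from itertools import product

def decision_table(causes, effect):
    """Enumerate every combination of cause values and the resulting effect."""
    rows = []
    for values in product([False, True], repeat=len(causes)):
        assignment = dict(zip(causes, values))
        rows.append((assignment, effect(assignment)))
    return rows

# The AND relation from the slide: IF (A AND B) THEN C.
table = decision_table(["A", "B"], lambda c: c["A"] and c["B"])
```

Each row of the table is one candidate test case: a cause combination and the effect it must produce.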

  18. TEST DATA-17 Cause & Effect Graphs. [Diagram: input states 1-9 connected through intermediate nodes 21-22 and 31-34 to the output states; logical operators on the edges combine the causes into effects.]

  19. TEST DATA-18 Sample Cause & Effect Graph
  Causes: Order.CustomerNr in set of CustomerNrs (customer is valid); Cust_Credit > 2 (customer is credible); Order.ArticleNr in set of ArticleNrs (article on stock); Order.Amount < ArticlesonStock (article amount on stock is sufficient to fulfill the order).
  Effects: order rejected; order position rejected; order accepted; fulfill order (ArticlesonStock - Order.Amount).
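Read as code, the graph becomes a chain of guard conditions leading to one of the effects. The master-data sets below are invented stand-ins for the system's databases:

```python
# Hypothetical master data standing in for the system's databases.
CUSTOMER_NRS = {1001, 1002}
CUSTOMER_CREDIT = {1001: 3, 1002: 1}
ARTICLE_NRS = {101, 150}
ARTICLES_ON_STOCK = {101: 40, 150: 5}

def process_order(customer_nr, article_nr, amount):
    """Apply the causes from the graph and return the resulting effect."""
    if customer_nr not in CUSTOMER_NRS:          # customer is not valid
        return "order rejected"
    if CUSTOMER_CREDIT[customer_nr] <= 2:        # customer is not credible
        return "order rejected"
    if article_nr not in ARTICLE_NRS:            # article not on stock
        return "order position rejected"
    if amount >= ARTICLES_ON_STOCK[article_nr]:  # stock amount not sufficient
        return "order position rejected"
    ARTICLES_ON_STOCK[article_nr] -= amount      # fulfill order
    return "order accepted"
```

Test cases are then chosen so that every cause is exercised both true and false.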

  20. TEST DATA-19 Data Usage Types: inputs drive a function via predicates to outputs. Input driven: test cases are derived from the possible inputs. Output driven: test cases are derived from the predicted outputs.

  21. TEST DATA-20 Input/Output Specification
  Inputs: Transaction (Stock-Delivered, Stock-Ordered, Stock-NR), Stock Inventory.
  Processing: Update Stock Inventory.
  Outputs: Order-SW, Stock-Inventory.
  Order-SW = 1 if Stock-NR valid & Stock-Ordered <= Stock-Inventory
  Order-SW = 0 if Stock-NR not valid | Stock-Ordered > Stock-Inventory
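The switch rule above translates one-to-one into code; the set of valid stock numbers is an assumed example:

```python
VALID_STOCK_NRS = {4711, 4712}  # assumed set of valid Stock-NR values

def order_sw(stock_nr, stock_ordered, stock_inventory):
    """Return the Order-SW switch as specified above."""
    if stock_nr in VALID_STOCK_NRS and stock_ordered <= stock_inventory:
        return 1
    return 0
```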

  22. TEST DATA-21 Validation of Database Contents. [Diagram: SQL queries extract the test GU system's SQL database into CSV files containing the fields which map to the fields of the old database; FileTran converts the old GU system's VSAM files into CSV files; TabComp compares the two and writes a comparison protocol listing missing records, matching records and non-matching fields.]
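A TabComp-style comparison of the two CSV extracts can be sketched as follows; the column names and sample data are invented for illustration:

```python
import csv
from io import StringIO

def compare_csv(old_text, new_text, key):
    """Compare two CSV extracts on a key column and report missing records,
    matching records and records with non-matching fields."""
    old = {row[key]: row for row in csv.DictReader(StringIO(old_text))}
    new = {row[key]: row for row in csv.DictReader(StringIO(new_text))}
    protocol = {"missing": [], "matching": [], "non_matching": []}
    for k, old_row in old.items():
        if k not in new:
            protocol["missing"].append(k)
        elif new[k] == old_row:
            protocol["matching"].append(k)
        else:
            diffs = [f for f in old_row if new[k].get(f) != old_row[f]]
            protocol["non_matching"].append((k, diffs))
    return protocol

old_extract = "ArticleNr,Name\n101,BOOK\n150,DISC\n177,CD\n"
new_extract = "ArticleNr,Name\n101,BOOK\n150,TAPE\n"
protocol = compare_csv(old_extract, new_extract, "ArticleNr")
```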

  23. TEST DATA-22 Validation of System Outputs. [Diagram: REPTOXML converts the old GU system's reports into XML files; the test GU system produces XML files directly; XMLComp compares them and writes a comparison protocol listing missing objects, matching objects and non-matching results.]

  24. TEST DATA-23 Generation of an XML Interface

  <?xml version = "1.0" encoding = "ISO-8859-1"?>
  <!DOCTYPE xm059i SYSTEM "xm059i.xsd">
  <xm059i>
    <XM059-PARAMS>
      <P1>
        <P1-TT>12</P1-TT>
        <P1-MM>10</P1-MM>
        <P1-CE>19</P1-CE>
        <P1-JJ>77</P1-JJ>
      </P1>
      <P2>
        <LANG-CODE>1</LANG-CODE> <!-- 1 = German, 2 = French, 3 = Italian -->
      </P2>
      <P3>
        <DIRECTION>1</DIRECTION> <!-- 1 = left, 2 = right -->
      </P3>
      <P4>
        <DAY-NAME>XXXXXXXXXX</DAY-NAME>
      </P4>
      <RETURN-CODE>00</RETURN-CODE>
    </XM059-PARAMS>
  </xm059i>

  ____________________________________________________________________________________

  <?xml version = "1.0" encoding = "ISO-8859-1"?>
  <!--DOCTYPE XM059O SYSTEM "XM059O.xsd"-->
  <XM059O>
    <XM059-PARAMS>
      <RETURN-CODE>00</RETURN-CODE>
      <P1>
        <P1-TT>12</P1-TT>
        <P1-MM>10</P1-MM>
        <P1-CE>19</P1-CE>
        <P1-JJ>77</P1-JJ>
      </P1>
      <P2>
        <LANG-CODE>1</LANG-CODE>
      </P2>
      <P3>
        <DIRECTION>2</DIRECTION>
      </P3>
      <P4>
        <DAY-NAME>MERCOLEDI</DAY-NAME>
      </P4>
    </XM059-PARAMS>
  </XM059O>

  25. TEST DATA-24 Validation of the XML Interface

  +-----------------------------------------------------------------------+
  | WSDL Response Validation Report                                       |
  | Object: Kalender                    Date: 19.06.04                    |
  | Type  : XML                         System: TEST                      |
  | Key Fields of Response (actual, expected)                             |
  +-------------------------------------+---------------------------------+
  | MsgKey:DayofWeek = 12101977         |                                 |
  | Actual  : DayofWeek                 | Mercolodi                       |
  | Expected: DayofWeek                 | Mittwoch                        |
  +-------------------------------------+---------------------------------+
  | MsgKey:DayofWeek = 12101977         |                                 |
  | Actual  : Language                  | 2                               |
  | Expected: Language                  | 1                               |
  +-------------------------------------+---------------------------------+
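An XMLComp-style check behind such a report can be sketched by comparing the expected (Soll) and actual (Ist) documents leaf element by leaf element; the tags and sample values below are illustrative:

```python
import xml.etree.ElementTree as ET

def compare_xml(expected_text, actual_text):
    """Compare two XML documents leaf by leaf and return the non-matching
    results as (tag, expected, actual) triples."""
    def leaves(text):
        return {e.tag: (e.text or "").strip()
                for e in ET.fromstring(text).iter() if len(e) == 0}
    expected, actual = leaves(expected_text), leaves(actual_text)
    return [(tag, exp, actual.get(tag))
            for tag, exp in expected.items() if actual.get(tag) != exp]

soll = "<P4><DAY-NAME>Mittwoch</DAY-NAME><LANG-CODE>1</LANG-CODE></P4>"
ist = "<P4><DAY-NAME>Mercolodi</DAY-NAME><LANG-CODE>2</LANG-CODE></P4>"
diffs = compare_xml(soll, ist)
```

Each triple in diffs corresponds to one actual/expected row of the report above.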
