TEST CASE DESIGN Prepared by: Fatih Kızkun
OUTLINE • Introduction • Importance of Test • Essential Test Case Development • A Variety of Test Methods • Risk Based Testing • Use Case/UML • Finite State Model Based Technique • Conclusion
INTRODUCTION(1) • Failure: The program behaves differently from what the specification requires. • Error: The internal state of the program is invalid (e.g. a violated precondition or postcondition, an undefined pointer). • Fault: A static cause that allows errors to occur (e.g. a bug in the code, a design flaw, a hardware or system defect).
INTRODUCTION(2) • Faults • are static and can lead to error states • are observed, through the failures they cause, in black-box testing • Tests • try to detect failures • observe error states that are then tracked back to faults
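To make the fault/error/failure distinction concrete, here is a minimal Python sketch (the function and values are illustrative assumptions, not from the slides). The fault is the wrong divisor written into the code; when the function runs, its internal state becomes invalid (an error), and the returned value disagrees with the specification (a failure), which a test can detect.

    # Specification: average(values) must return sum(values) / len(values).

    def average(values):
        total = 0
        for v in values:
            total += v
        # FAULT: the divisor is off by one (a static defect in the code)
        count = len(values) - 1
        # ERROR: at this point the internal state (count) is invalid
        return total / count

    def test_average():
        # The test tries to provoke a FAILURE: observable behaviour
        # that differs from the specification.
        result = average([2, 4, 6])
        expected = 4.0
        assert result == expected, f"failure detected: got {result}, expected {expected}"

    try:
        test_average()
    except AssertionError as exc:
        print(exc)   # failure detected: got 6.0, expected 4.0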
INTRODUCTION(3) Types of Tests • Unit tests (classes, individual methods, modules and procedures) • units can be tested concurrently • locating faults is easier • Higher-order testing (testing the whole program) • detects errors caused by inconsistencies between units
INTRODUCTION(4) Performing a Good Test • Know the characteristics of a good test • Know what an equivalence class is • Find the equivalence classes • Select test cases from each equivalence class (see the sketch below) • Test state transitions • Test time dependencies • Test load limitations • Guess likely error-prone cases • Test function equivalence • Automate function-equivalence testing • General equivalence testing • Regression testing • Execute the tests
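As an illustration of selecting test cases from equivalence classes, here is a minimal Python sketch. It assumes a hypothetical validate_password function whose specification accepts passwords of 10 to 200 characters; the function and the limits are illustrative, not from the slides. One representative value is drawn from each class instead of testing every possible input.

    # Hypothetical specification: a password is valid iff 10 <= len(password) <= 200.
    def validate_password(password):
        return 10 <= len(password) <= 200

    # One representative test case per equivalence class.
    equivalence_classes = {
        "too short (len < 10)":   ("short",   False),
        "valid length (10..200)": ("a" * 50,  True),
        "too long (len > 200)":   ("a" * 201, False),
    }

    for name, (value, expected) in equivalence_classes.items():
        actual = validate_password(value)
        status = "OK" if actual == expected else "FAIL"
        print(f"{status}: {name} -> expected {expected}, got {actual}")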
TEST METHODS • Risk-Based Testing • Requirements-Based Testing • Use Case/UML • Keyword or Action Based Testing (ABT) methodology • State Transition / State Model Based Testing • Exploratory and Effective Ad Hoc Testing • Equivalence Class Partitioning and Boundary Value Analysis • Regression Testing • Fault Injection / Forced Error / Negative Testing • DAST (Diagnostic Approach to Software Testing) • Data Driven Testing
RISK BASED TESTING(1) • Make a prioritized list of risks • Perform testing that explores each risk • As risks evaporate and new ones emerge, adjust the test effort to stay focused on the current set of risks
RISK BASED TESTING(2) INSIDE-OUT • Begin with the details of the product • identify the risks associated with them • Three questions to ask for each component • Vulnerabilities: What weaknesses or possible failures are there in this component? • Threats: What inputs or situations could exploit a vulnerability and trigger a failure in this component? • Victims: Who or what would be impacted by potential failures, and how bad would that be?
RISK BASED TESTING(3) OUTSIDE-IN • Begin with a set of potential risks • Match the potential risks to the details of the situation • A more general approach than inside-out, and easier to apply • Three kinds of lists can be used • Quality criteria categories (capability, reliability, …) • Generic risk list (complex, new, critical, …) • Risk catalogs
RISK BASED TESTING(4) It helps to communicate and negotiate which components will get more test effort (a small prioritization sketch follows).
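A minimal sketch of the prioritization step in Python; the component names and the likelihood x impact scoring scheme are illustrative assumptions, not something prescribed by the slides. Each component gets a risk score, test effort is focused on the highest-ranked items first, and the list is re-ranked as risks evaporate or emerge.

    # Hypothetical risk items: (component, likelihood 1-5, impact 1-5).
    risks = [
        ("payment processing", 4, 5),
        ("report export",      2, 2),
        ("user login",         3, 5),
        ("help pages",         1, 1),
    ]

    # Score each risk and build the prioritized list that drives test effort.
    prioritized = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

    for component, likelihood, impact in prioritized:
        print(f"score {likelihood * impact:2d}  {component}")

    # As testing proceeds, remove risks that have evaporated, add new ones,
    # and re-sort so the effort stays focused on the current set of risks.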
Use Case(1) • A use case diagram shows the interaction between an actor and the system. • In a use case diagram the system is viewed as a black box: only its inputs, outputs and functionality matter. • The purposes of a use case may include: • promoting communication • understanding requirements • helping to identify “capsules” to encapsulate data • focusing on the “What” rather than the “How” • providing prototype test cases
USE CASE(2) Creating Test Cases From Use Cases • Identify all of the scenarios for the given use case • draw the alternative scenarios in a graph for each action • Create scenarios for • the basic flow, • one scenario covering each alternative flow, • and some reasonable combinations of alternative flows • Avoid combinations of alternative flows that would create infinite loops (a small scenario-enumeration sketch follows)
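A minimal sketch of the scenario-enumeration step in Python, assuming a hypothetical "purchase item" use case with a basic flow and two alternative flows (all flow names and steps are illustrative): one scenario for the basic flow, one covering each alternative flow, and one reasonable combination.

    # Hypothetical flows of a "purchase item" use case.
    basic_flow        = ["log in", "search item", "add to cart", "check out"]
    alternative_flows = {
        "A1 empty search result": ["log in", "search item (no hits)", "refine search"],
        "A2 payment rejected":    ["log in", "search item", "add to cart",
                                   "check out (payment rejected)", "retry payment"],
    }

    # One scenario for the basic flow, one covering each alternative flow,
    # plus a reasonable combination of alternatives.
    scenarios = {"S1 basic flow": basic_flow}
    for i, (name, flow) in enumerate(alternative_flows.items(), start=2):
        scenarios[f"S{i} {name}"] = flow
    scenarios["S4 A1 then A2"] = (alternative_flows["A1 empty search result"]
                                  + alternative_flows["A2 payment rejected"][1:])

    for name, steps in scenarios.items():
        print(name, "->", " / ".join(steps))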
USE CASE(3) Creating a Scenario • Identify the variables in each use case step • Identify significantly different options for each variable (password too long, too short, available, etc.) • Combine the options to be tested into test cases • Assign concrete values to the variables (see the table and the sketch below)
USE CASE(4) Variables and example options • Email: Regular, Illegal, 51 chars • Password: Regular, 10 chars, 11 chars • Search: Regular, 200 chars, 201 chars • Selection: Select first one, Select last one • Add: Add to cart
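A minimal Python sketch of combining the options above into test cases. A simple full cross product is shown here for clarity; in practice a pairwise subset is often used instead. The option values are the illustrative ones from the table.

    from itertools import product

    # Options for each variable, taken from the table above.
    options = {
        "email":     ["Regular", "Illegal", "51 chars"],
        "password":  ["Regular", "10 chars", "11 chars"],
        "search":    ["Regular", "200 chars", "201 chars"],
        "selection": ["Select first one", "Select last one"],
        "add":       ["Add to cart"],
    }

    # Full cross product: every combination of options becomes a test case.
    names = list(options)
    test_cases = [dict(zip(names, combo)) for combo in product(*options.values())]

    print(len(test_cases), "test cases generated")  # 3 * 3 * 3 * 2 * 1 = 54
    print(test_cases[0])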
Use Case(5) • Use cases are only useful for certain types of testing: • user acceptance testing • positive “business as usual” functional testing • manual black-box testing (some) • scripted, automated testing • They can also help types of testing that overlap with positive functional testing: • smoke testing • sanity testing • regression testing • ad-hoc testing
Use Case(6) • Kinds of bugs that would not be discovered by use-case-based tests alone are the ones found by: • system testing • integration testing • performance testing • load testing • software compatibility testing • hardware compatibility testing • exploratory testing
Finite State Model Based Technique (1) • Model-based testing generates software tests from explicit descriptions of an application’s behavior • Several good model-based test tools are currently available on the market • The techniques of model-based testing are not tied to any particular tool
Finite State Model Based Technique (2) • Create a finite state model of the application • Generate sequences of test actions from the model • Execute the test actions against the application • Determine whether the application worked correctly • Find bugs
Finite State Model Based Technique (3) [Screenshots: the Windows Clock application in Digital and Analog display modes]
Finite State Model Based Technique (4) The rules for these actions in the Clock application • Start • If the application is NOT running, the user can execute the Start command. • If the application is running, the user cannot execute the Start command. • After the Start command executes, the application is running. • Stop • If the application is NOT running, the user cannot execute the Stop command. • If the application is running, the user can execute the Stop command. • After the Stop command executes, the application is not running. • Analog • If the application is NOT running, the user cannot execute the Analog command. • If the application is running, the user can execute the Analog command. • After the Analog command executes, the application is in Analog display mode. • Digital • If the application is NOT running, the user cannot execute the Digital command. • If the application is running, the user can execute the Digital command. • After the Digital command executes, the application is in Digital display mode.
Finite State Model Based Technique (5) State diagram derived from the rules above • States: Not running/Analog, Not running/Digital, Running/Analog, Running/Digital • Start: Not running/Analog -> Running/Analog, Not running/Digital -> Running/Digital • Stop: Running/Analog -> Not running/Analog, Running/Digital -> Not running/Digital • Analog: Running/Digital -> Running/Analog (Running/Analog stays in Running/Analog) • Digital: Running/Analog -> Running/Digital (Running/Digital stays in Running/Digital)
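A minimal sketch of the first two steps (building the finite state model and generating a sequence of test actions from it), written in Python rather than the Visual Test language used on the following slides. The transition table encodes exactly the Clock rules and diagram above; the random-walk generator is an illustrative choice, not the only way to derive sequences from a model.

    import random

    # State = (running?, display mode). Transitions encode the Clock rules above.
    transitions = {
        ("not running", "analog"):  {"Start": ("running", "analog")},
        ("not running", "digital"): {"Start": ("running", "digital")},
        ("running", "analog"):  {"Stop": ("not running", "analog"),
                                 "Analog": ("running", "analog"),
                                 "Digital": ("running", "digital")},
        ("running", "digital"): {"Stop": ("not running", "digital"),
                                 "Analog": ("running", "analog"),
                                 "Digital": ("running", "digital")},
    }

    def generate_test_sequence(length=10, seed=1):
        """Random walk over the model: returns a list of legal test actions."""
        random.seed(seed)
        state = ("not running", "analog")
        actions = []
        for _ in range(length):
            action = random.choice(list(transitions[state]))
            actions.append(action)
            state = transitions[state][action]   # expected state, usable as a test oracle
        return actions

    print(generate_test_sequence())   # e.g. ['Start', 'Analog', 'Digital', 'Stop', ...]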
Finite State Model Based Technique (6) • Visual Test has a rich set of functions for interacting with the application you are testing
Finite State Model Based Technique (7)

ACTIONS

    open "test_sequence.txt" #infile
    while not (EOF(infile))
        line input #infile, action
        select case action
            case "Start"
                run("C:\clock.exe")
            case "Analog"
                WMenuSelect("Settings\Analog")
            case "Digital"
                WMenuSelect("Settings\Digital")
            case "Stop"
                WMenuSelect("Close")
        end select
        Test_oracle()
    wend

TEST ORACLE FOR DIGITAL/ANALOG

    if (system_mode = RUNNING) then
        if ( WFndWnd("Clock") = 0 ) then
            print "Error: Clock should be Running"
            stop
        endif
        if ( (setting_mode = ANALOG) AND NOT WMenuChecked("Settings\Analog") ) then
            print "Error: Clock should be in Analog mode"
            stop
        endif
        if ( (setting_mode = DIGITAL) AND NOT WMenuChecked("Settings\Digital") ) then
            print "Error: Clock should be in Digital mode"
            stop
        endif
    endif
Finite State Model Based Technique (8) • Example test sequence • Start • Maximize • Stop • Start • Minimize • Stop • Start • Restore • Stop
Other Test Methods • Equivalence Partitioning and Boundary Values Extraction • Regression Testing • Diagnostic Approach to Software Testing • Keyword or Action Based Testing (ABT) methodology • Fault Injection / Forced Error / Negative Testing
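Boundary value analysis complements the equivalence classes shown earlier: test values are taken at and just around each boundary of a class. A minimal Python sketch, reusing the hypothetical 10-200 character password rule from the earlier example (the rule and limits are illustrative assumptions):

    # Hypothetical rule (same as the earlier sketch): valid iff 10 <= len(p) <= 200.
    def validate_password(password):
        return 10 <= len(password) <= 200

    # Boundary values: just below, on, and just above each boundary.
    boundary_cases = [
        (9,   False),   # just below the lower boundary
        (10,  True),    # on the lower boundary
        (11,  True),    # just above the lower boundary
        (199, True),    # just below the upper boundary
        (200, True),    # on the upper boundary
        (201, False),   # just above the upper boundary
    ]

    for length, expected in boundary_cases:
        actual = validate_password("a" * length)
        status = "OK" if actual == expected else "FAIL"
        print(f"{status}: length {length} -> expected {expected}, got {actual}")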
CONCLUSION • There are many test case design methods; choose the ones that suit the project • Some of these methods also help to select the right test data • Selecting the right method makes it easier to detect faults • Test cases should be defined before the program is executed
LINKS • http://www.cs.rit.edu/~afb/20012/cs4/slides/testing-03.html • http://www.perl.com/pub/a/2005/07/14/bestpractices.html • http://www.cs.bsu.edu/homepages/metrics/cs639d/CS639WWW/chapter7-8/tsld001.htm • http://www.satisfice.com/articles/hrbt.pdf • http://www.satisfice.com/articles/rbt-trouble.pdf • http://www.testassured.com/docs/Dangers.htm • http://www-128.ibm.com/developerworks/rational/library/04/r-3217 • http://www.geocities.com/model_based_testing/shoestring.htm • http://people.bath.ac.uk/tjs20/introduction.htm (Chinese postman problem)