
Taxonomy of Test Oracles


Presentation Transcript


  1. Taxonomy of Test Oracles Mudit Agrawal Course: Introduction to Software Testing, Fall 2006

  2. Motivation • Shortcomings of automated tests (as compared to the human ‘eye-ball oracle’) • Factors influencing test oracles • Environmental (private, global) variables • State (initial and present) of the program • Dependence on the test case • Which kinds of oracles are best suited to which applications?

  3. Contents • Motivating Automated Tests – Challenges for test oracles • Generations of Automation and Oracles • Challenges for Oracles • Types of Oracles • Conclusion

  4. Automated Tests • Advantages: • No intervention needed after launching tests • Automatically sets up and/or records the relevant test environment • Evaluates actual against expected results • Reports analysis of pass/fail results • Limitations: • Less likely to uncover latent defects • Doesn’t do anything different each time it runs • The oracle is not ‘complete’

  5. Limited by Test Oracle • ‘Predicting’ and comparing the results • Like playing twenty questions with all the questions written down in advance! • Influenced by • Data • Program state • Configuration of the system environment

  6. Testing with the Oracle (diagram not reproduced in the transcript) Source: Douglas Hoffman, SQM, LLC

  7. Generations of Automation: First Generation • Automate existing tests by creating equivalent exercises • Self-verifying tests • Hard-coded oracles • Limitation: handling negative test cases (see the sketch below)
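A minimal sketch of a first-generation, self-verifying test: the expected results are hard-coded literals, and the negative case shows why such oracles handle failure behavior awkwardly. The function `add` is a hypothetical stand-in for the SUT.

```python
def add(a, b):
    return a + b  # hypothetical stand-in for the system under test (SUT)

def test_add_hard_coded():
    # The oracle is the literal value, written down in advance.
    assert add(2, 3) == 5
    assert add(-1, 1) == 0
    # Negative test case: the exact failure behavior must also be
    # predicted and hard-coded, which is where this style gets brittle.
    try:
        add(2, "3")
    except TypeError:
        pass  # expected: mixed operand types are rejected
    else:
        raise AssertionError("expected a TypeError")

test_add_hard_coded()
```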

  8. Second Generation Automation • Automated oracles • Emphasis on expected results – more exhaustive input variation, coverage • Increasing the frequency, intensity, and duration of automated test activities (load testing)

  9. Second Generation continued… • Random selection among alternatives • Partial domain coverage • Dependence of system state on previous test cases • A mechanism to determine whether the SUT’s behavior is expected • Pseudo-random number generator • Test recovery (sketch below)
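A sketch of these second-generation ideas, under assumptions of my own (a list-sorting stand-in for the SUT): pseudo-random selection among input alternatives, a fixed seed so a failing run can be replayed exactly (test recovery), and an explicit mechanism for deciding whether the SUT's behavior is expected.

```python
import random
from collections import Counter

SEED = 1234                      # log the seed; reusing it replays the run
rng = random.Random(SEED)

def sut_sort(xs):
    return sorted(xs)            # hypothetical stand-in SUT

for case in range(1000):
    # Random selection among alternatives: length and values both vary.
    xs = [rng.randint(-100, 100) for _ in range(rng.randint(0, 20))]
    out = sut_sort(xs)
    # Mechanism for deciding whether the behavior is expected: the output
    # must be ordered and must be a permutation of the input.
    assert all(a <= b for a, b in zip(out, out[1:])), (SEED, case, xs)
    assert Counter(out) == Counter(xs), (SEED, case, xs)
```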

  10. Third Generation Automation • Takes knowledge and visibility into account • Software instrumentation • Multi-threaded tests • Fall-back comparisons • Using other oracles

  11. Third Generation continued… • Heuristic Oracles • Fuzzy comparisons, approximations • Diagnostics • Looks for errors • Performs additional tests based on the specific type of error encountered
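A sketch of a heuristic oracle in this spirit, treating math.sin as the SUT purely for illustration: instead of predicting exact values, it makes fuzzy comparisons against cheap properties, and on failure it reports a diagnostic rather than a bare pass/fail.

```python
import math

def fuzzy_equal(a, b, tol=1e-9):
    # Approximate comparison: the heuristic oracle never demands exactness.
    return abs(a - b) <= tol

for x in [0.0, 0.5, 1.0, 2.0, 3.0]:
    y = math.sin(x)                          # SUT output
    assert -1.0 <= y <= 1.0, x               # range heuristic
    # Trigonometric identity as an approximate check, not a prediction.
    if not fuzzy_equal(y * y + math.cos(x) ** 2, 1.0):
        # Diagnostic step: report which terms misbehaved at which input.
        raise AssertionError(f"identity failed at x={x}: sin={y}, "
                             f"cos={math.cos(x)}")
```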

  12. Challenges for Oracles • Independence and completeness – difficult to achieve both • Independence from • Algorithms • Sub-programs, libraries • Platforms • OS • Completeness in the form of information • Comparing computed functions, screen navigations, and asynchronous event handling • Speed of predictions • Time of execution of the oracle

  13. Challenges for Oracles continued… • The better an oracle is, the more complex it becomes • Comprehensive oracles make for long test cases (DART paper) • The more an oracle predicts, the more dependent it is on the SUT • The more likely it is to contain the same faults

  14. Challenges for Oracles continued… • Legitimate oracle – an oracle that produces accurate rather than estimated outputs • Generates results based on the formal specs • What if the formal specs are wrong? • Very few errors cause noticeable abnormal test termination

  15. (Figure not reproduced in the transcript) Source: An automated oracle for software testing

  16. IORL – Input/Output Requirements Language • Graphics-based language • An optimal representation between the informal requirements and the target code

  17. Oracles for different scenarios • Transducers • Read an input sequence and produce an output sequence • Logical correspondence between I/O structures • e.g. native file format to HTML conversions in web applications • Solution – context-free grammars (CFGs) • A system translates a formal spec of the I/O files into an automated oracle (sketch below)
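A sketch of that idea: `txt_to_html` is a hypothetical native-format-to-HTML transducer, and the oracle checks grammar-level well-formedness of the output plus a structural correspondence with the input (one `<p>` element per paragraph), rather than predicting the exact output string.

```python
from html.parser import HTMLParser

def txt_to_html(text):
    # Stand-in SUT: wrap each blank-line-separated paragraph in <p>.
    paras = [p for p in text.split("\n\n") if p.strip()]
    return "".join(f"<p>{p}</p>" for p in paras)

class StructureOracle(HTMLParser):
    """Checks tag balance (the grammar) and counts <p> elements."""
    def __init__(self):
        super().__init__()
        self.stack, self.paragraphs = [], 0
    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)
        if tag == "p":
            self.paragraphs += 1
    def handle_endtag(self, tag):
        assert self.stack and self.stack.pop() == tag, "unbalanced tags"

source = "first paragraph\n\nsecond paragraph"
oracle = StructureOracle()
oracle.feed(txt_to_html(source))
assert not oracle.stack, "unclosed tags"
assert oracle.paragraphs == 2   # structural I/O correspondence
```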

  18. Embedded Assertion Languages [Oracles for different scenarios] • Asserts! • Problems: • Non-local assertions • Asserts for pre/post-condition pairs apply to a procedure as a whole • e.g. asserts for each method that modifies the object state • State caching • Saving part or all of the ‘before’ values
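A sketch of the state-caching point, assuming a hypothetical `Account` class: a decorator snapshots the 'before' state so a postcondition can relate the old and new object states to the call's arguments.

```python
import copy
import functools

def post(condition):
    """Check condition(old_self, new_self, result, *args) after a call."""
    def wrap(method):
        @functools.wraps(method)
        def inner(self, *args, **kwargs):
            old = copy.deepcopy(self)              # cache the 'before' value
            result = method(self, *args, **kwargs)
            assert condition(old, self, result, *args), "postcondition failed"
            return result
        return inner
    return wrap

class Account:
    def __init__(self, balance):
        self.balance = balance

    # The postcondition relates the cached 'before' state to the new one.
    @post(lambda old, new, result, amount: new.balance == old.balance + amount)
    def deposit(self, amount):
        self.balance += amount

Account(10).deposit(5)
```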

  19. Embedded Assertion Languages[Oracles for different scenarios] • Auxiliary variables • Quantification
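Both features named above can be sketched in plain Python (the sort routine is a hypothetical SUT): an auxiliary variable that exists only for the oracle's benefit, and quantified assertions written with `all` and `any`.

```python
def sort_and_check(xs):
    calls = 0                        # auxiliary variable: exists only for
    def key(v):                      # the oracle, not for the computation
        nonlocal calls
        calls += 1
        return v
    out = sorted(xs, key=key)        # hypothetical SUT call
    # Universal quantification: every adjacent pair is ordered.
    assert all(a <= b for a, b in zip(out, out[1:]))
    # "For all x in the input, there exists a matching output element."
    assert all(any(o == x for o in out) for x in xs)
    assert calls == len(xs)          # auxiliary variable used in a check
    return out

sort_and_check([3, 1, 2])
```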

  20. Extrinsic Interface Contracts [Oracles for different scenarios] • Instead of inserting asserts within the program, checkable specs are kept separate from the implementation • Extrinsic specs are written in notations less tightly coupled with the target programming language • Useful when the source code should not be touched
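A sketch of an extrinsic contract under these assumptions: the checkable spec lives in a table outside the implementation and is attached by wrapping, so the (hypothetical) third-party `library_sqrt` is never edited.

```python
import math

def library_sqrt(x):               # third-party code we cannot modify
    return math.sqrt(x)

CONTRACTS = {
    # The spec is written and maintained separately from the implementation.
    library_sqrt: {
        "pre":  lambda x: x >= 0,
        "post": lambda x, r: abs(r * r - x) < 1e-9,
    },
}

def with_contract(fn):
    spec = CONTRACTS[fn]
    def checked(x):
        assert spec["pre"](x), "precondition violated"
        r = fn(x)
        assert spec["post"](x, r), "postcondition violated"
        return r
    return checked

checked_sqrt = with_contract(library_sqrt)
checked_sqrt(2.0)
```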

  21. Pure Specification Languages [Oracles for different scenarios] • Problem with the older approaches: the specs were not pure • Z and Object-Z are model-based specification languages • Describe intended behavior using familiar mathematical objects • Free of the constraints of the implementation language

  22. Trace Checking [Oracles for different scenarios] • Uses a partial trace of events • Such a trace can be checked by an oracle derived from formal specs of externally observable behavior
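A sketch of a trace-checking oracle: a small state machine stands in for a formal spec of a file-like object's externally observable behavior, and a recorded partial trace of events is replayed through it.

```python
SPEC = {  # state -> event -> next state (stand-in for a formal spec)
    "closed": {"open": "open"},
    "open":   {"read": "open", "write": "open", "close": "closed"},
}

def check_trace(trace, start="closed"):
    state = start
    for i, event in enumerate(trace):
        if event not in SPEC[state]:
            raise AssertionError(f"event {event!r} illegal in state "
                                 f"{state!r} at position {i}")
        state = SPEC[state][event]
    return state

check_trace(["open", "read", "write", "close"])   # conforming trace
# check_trace(["read"])  # would fail: read before open
```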

  23. Types of Oracles Categorized based on oracle outputs • True Oracle • Faithfully reproduces all relevant results • Uses independent platforms, processes, compilers, code, etc. • Less commonality => more confidence in the correctness of results • e.g. sin(x) (problem: how is ‘all inputs’ defined?)
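For the sin(x) example, a true oracle might be an independently written implementation; a sketch, treating math.sin as the SUT and a hand-rolled Taylor series as the independent result (the less code the two share, the more a match means):

```python
import math

def sin_taylor(x, terms=20):
    # Independent algorithm: direct Taylor series with range reduction.
    x = math.fmod(x, 2 * math.pi)
    total, term = 0.0, x
    for n in range(terms):
        total += term
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return total

for x in [0.0, 0.1, 1.0, 2.5, -3.0]:
    assert abs(math.sin(x) - sin_taylor(x)) < 1e-9, x
```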

  24. Types of Oracles continued… • Stochastic Oracle • Statistically random input selection • Error-prone areas of the software are no more or less likely to be encountered • sin() – a pseudo-random generator selects the input values
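A sketch of the sin() case: a seeded pseudo-random generator picks inputs with no bias toward error-prone regions, and each sampled result is still checked, here via identities, since exact values are not predicted in advance.

```python
import math
import random

rng = random.Random(42)          # fixed seed: failing runs are replayable

for _ in range(10_000):
    x = rng.uniform(-1000.0, 1000.0)   # statistically random input selection
    # Oddness and periodicity serve as the per-sample checks.
    assert abs(math.sin(-x) + math.sin(x)) < 1e-12, x
    assert abs(math.sin(x + 2 * math.pi) - math.sin(x)) < 1e-6, x
```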

  25. Types of Oracles continued… • Heuristic Oracle • Reproduces selected results for the SUT • Remaining values are checked based on heuristics • No exact comparison • Sampling • Values are selected using some criteria (not random) • e.g. boundary values, midpoints, maxima, minima
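A sketch of sampling: exact expected values are supplied only at hand-picked points (boundaries, a maximum, a minimum); inputs between the samples would be covered by cheaper heuristic checks like those shown earlier.

```python
import math

SAMPLES = {                        # x -> known-good sin(x)
    0.0:             0.0,          # boundary
    math.pi / 2:     1.0,          # maximum
    math.pi:         0.0,          # zero crossing
    3 * math.pi / 2: -1.0,         # minimum
}

for x, expected in SAMPLES.items():
    assert abs(math.sin(x) - expected) < 1e-12, x
```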

  26. Types of Oracles continued… • Consistent Oracle • Uses results from one test run as the Oracle for subsequent runs • Evaluating the effects of changes from one revision to another
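A sketch of a consistent oracle as a golden-file regression check (the file name and SUT are hypothetical): the first run records outputs, and later runs diff against them to surface changes between revisions.

```python
import json
import os

GOLDEN = "golden_outputs.json"     # hypothetical golden-file name

def sut(x):
    return x * x                   # hypothetical stand-in SUT

outputs = {str(x): sut(x) for x in range(10)}

if not os.path.exists(GOLDEN):
    with open(GOLDEN, "w") as f:
        json.dump(outputs, f)      # the first run becomes the oracle
    print("golden file recorded")
else:
    with open(GOLDEN) as f:
        golden = json.load(f)
    diffs = {k: (golden[k], v) for k, v in outputs.items() if golden[k] != v}
    assert not diffs, f"behavior changed since the last revision: {diffs}"
```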

  27. Comparison (comparison table of oracle types not reproduced in the transcript)

  28. Who wants to be a Millionaire?

  29. Which category best describes GUI Testing? a. Heuristic b. Trace Checking c. Transducers d. None

  30. Thanks!
