
Software Testing

Learn how to test graph traversal algorithms in software development, including finding all paths from node A to node C satisfying specific conditions. Use Java to write effective testing code.


Presentation Transcript


  1. Software Testing COM 3220 www.ccs.neu.edu/home/lieber/ Testing/Spring 99

  2. Textbook • Brian Marick: The Craft of Software Testing: Subsystem Testing, Prentice Hall. • Also used: Testing Object-Oriented Software by David C. Kung, Pei Hsia, and Jerry Gao, IEEE Computer Society Press (contains 19 papers, 3 of which refer to work done at Northeastern University in the Demeter Research Group). Testing/Spring 99

  3. Goal of first lecture • Teach you enough so that you can do homework 1: testing a graph-theoretic problem: find all paths from node A to node C satisfying a simple property (for example, they must go through a node B). • Input: a directed acyclic graph G with nodes A, B, and C. • Output: all paths from A via B to C. Testing/Spring 99
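To make the traversal under test concrete, here is a minimal sketch in plain Java; it is not the Demeter/Java subsystem the homework actually exercises, and all class and method names are illustrative. It enumerates every A-to-C path in a DAG by depth-first search and keeps the paths that pass through B.

```java
import java.util.*;

// Illustrative sketch of the behavior under test: enumerate every path from a
// source to a target in a DAG, then filter for the paths that pass through "B".
class AllPathsViaNode {

    // graph is an adjacency map: node name -> list of successor names
    static List<List<String>> allPaths(Map<String, List<String>> graph,
                                       String from, String to) {
        List<List<String>> result = new ArrayList<>();
        Deque<String> path = new ArrayDeque<>();
        path.addLast(from);
        dfs(graph, from, to, path, result);
        return result;
    }

    private static void dfs(Map<String, List<String>> graph, String node, String to,
                            Deque<String> path, List<List<String>> result) {
        if (node.equals(to)) {            // reached the target: record a copy of the path
            result.add(new ArrayList<>(path));
            return;
        }
        for (String next : graph.getOrDefault(node, List.of())) {
            path.addLast(next);           // extend the path, recurse, then backtrack
            dfs(graph, next, to, path, result);
            path.removeLast();
        }
    }

    public static void main(String[] args) {
        Map<String, List<String>> g = Map.of(
            "A", List.of("B", "C"),
            "B", List.of("C"),
            "C", List.of());
        allPaths(g, "A", "C").stream()
            .filter(p -> p.contains("B"))         // keep only paths via B
            .forEach(System.out::println);        // prints [A, B, C]
    }
}
```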

  4. Three meanings of bug • error: a mistake made by a developer, mostly located in people's heads. • fault: an error may lead to one or more faults. Faults are located in the text of the program. • failure: execution of faulty code may lead to one or more failures. A failure occurs when there is a difference between the results of the correct and incorrect programs. Testing/Spring 99
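A tiny, made-up Java example (not from the course) may help fix the distinction: the fault below is a single wrong loop bound, and whether a failure is observed depends on the input.

```java
// Illustration only: one fault, visible as a failure only for some inputs.
class MaxOfPrefix {
    // intended specification: return the largest of the first n elements of a
    static int maxOfFirst(int[] a, int n) {
        int max = a[0];
        for (int i = 1; i < n - 1; i++) {   // FAULT: the bound should be i < n
            if (a[i] > max) max = a[i];
        }
        return max;
    }

    public static void main(String[] args) {
        int[] a = {3, 7, 5, 9};
        // the fault is present in both calls, but only the second one fails
        System.out.println(maxOfFirst(a, 3)); // prints 7, which is correct: no failure
        System.out.println(maxOfFirst(a, 4)); // prints 7 instead of 9: a failure
    }
}
```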

  5. Failure detection • Compare actual output to expected output. • Expected output comes from the specification. • Specification: any external, independent description of the program, including user documentation. • Specifications are often incomplete, incorrect, ambiguous, or contradictory. The specification may be wrong, not the program! Testing/Spring 99
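In code, failure detection is nothing more than this comparison. A minimal, hypothetical checker (the course does not prescribe a particular test harness, and the names below are illustrative):

```java
import java.util.Objects;

// Hypothetical helper: a test fails exactly when the actual output differs from
// the expected output taken from the specification.
class Check {
    static int failures = 0;

    static void expectEquals(String test, Object expected, Object actual) {
        if (!Objects.equals(expected, actual)) {
            failures++;
            System.out.println("FAIL " + test + ": expected " + expected + ", got " + actual);
        }
    }

    public static void main(String[] args) {
        // deliberately failing example, so the output shows what a failure report looks like
        expectEquals("visit count, A via B to C", 2, 1);
        System.out.println(failures + " failure(s)");
    }
}
```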

  6. A summary of subsystem testing • Build the test requirement checklist • Find clues • Expand clues into test requirements • Design the tests • Combine requirements into tests • Check tests for common testing mistakes • Supplement testing with code inspections Testing/Spring 99

  7. A summary of subsystem testing • Implement test support code • Implement tests • Evaluate and improve tests • use code coverage tool • find undertested or missing clues • find more test requirements • write more test specifications Testing/Spring 99

  8. Motivation • Derive tests from both the specification and the program. • Derivation is done by “predicting” likely programmer errors or likely program faults. • Use general rules, e.g., always test boundary conditions. Testing/Spring 99

  9. Motivation • Check for faults of omission: missed special cases. • Most common type of fault according to a study by Glass. • Experienced testers have a catalog of programming cliches and associated errors available. See Test Requirement Catalog (low-level omissions). Testing/Spring 99

  10. Motivation • First requirement of test design: Be methodical. Three stages: • Finding clues • sources for test requirements • Expanding them into test requirements • useful sets of inputs that should be considered • Writing test specifications • exact inputs and expected outputs Testing/Spring 99

  11. Clues • What needs testing? Collect from specification, program, bug reports, etc. • Create a checklist. Testing/Spring 99

  12. Test requirements • Create a test requirement catalog Testing/Spring 99

  13. Test specifications • A test specification describes the exact input and the expected output. Testing/Spring 99

  14. Supplementary code inspections • Code inspections catch some faults that testing is poor at detecting. Testing/Spring 99

  15. Test implementation • Avoid having to write a lot of support code. • It is better to test larger subsystems because less support code needs to be written. • Individual routines are exercised more. • Testing the tests: test coverage as a crude measure. • During test design do not pay attention to coverage criteria. Testing/Spring 99

  16. Test implementation • During test design, do not pay attention to coverage criteria. Test requirements derived from other sources should achieve coverage anyway. • Complete subsystem testing will usually result in high coverage. • Treat missed branches as clues about weaknesses in the test design. Testing/Spring 99

  17. [Diagram: the subsystem testing process. Labels: Subsystem Specification, Subsystem Code, Catalogued Past Experience, Bug Reports, Program and Specification Changes, Clues and Test Requirements, Test Specifications, Implemented Tests, Coverage.] Testing/Spring 99

  18. A broader view: dependability Testing/Spring 99

  19. Application • Graph algorithms: • Depth-first traversal • Finding all paths satisfying some restrictions. • Happens to be a subsystem of Demeter/Java. • You don’t have to know much about Demeter. You will learn the minimal things you need. Testing/Spring 99

  20. Use Java to write testing code • You will need to write some Java code for testing. Testing/Spring 99

  21. Part of Demeter/Java: graph traversal. [Diagram: the subsystem testing process applied to this subsystem. Labels: Subsystem Specification, Subsystem Code, Catalogued Past Experience, Bug Reports, Program and Specification Changes, Clues and Test Requirements, Test Specifications (use Java/Scope), Implemented Tests, Coverage.] Testing/Spring 99

  22. What we want to test • Given a directed acyclic graph G (no multi-edges), traverse all paths from A via B to C. • Given a directed acyclic graph G (no multi-edges), traverse all paths from A bypassing B to C. Testing/Spring 99
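The two traversals differ only in the condition placed on each complete A-to-C path. A small sketch of those two conditions (the helper names are illustrative, not Demeter/Java API):

```java
import java.util.List;
import java.util.function.Predicate;

// The "via B" and "bypassing B" traversals differ only in the path predicate.
class PathFilters {
    static Predicate<List<String>> via(String node)       { return p -> p.contains(node); }
    static Predicate<List<String>> bypassing(String node) { return p -> !p.contains(node); }

    public static void main(String[] args) {
        List<String> path = List.of("A", "X", "C");
        System.out.println(via("B").test(path));        // false: the path does not go through B
        System.out.println(bypassing("B").test(path));  // true: the path avoids B
    }
}
```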

  23. Notation for describing graphs • A = B C D. // node A has three successors • B = E. // node B has only one successor • E = . // node E has no successors • This information is put into a file program.cd. • Two program.beh files are given. They contain the traversal specification and count the visits of C. Testing/Spring 99
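Reading this notation is straightforward. The rough parser below is an illustrative sketch, not the reader Demeter/Java uses for program.cd, and it ignores the // comments shown on the slide.

```java
import java.util.*;

// Rough parser for the successor notation: "A = B C D." means A has successors
// B, C, and D; "E = ." means E has none. Illustrative sketch only.
class GraphNotation {

    static Map<String, List<String>> parse(String text) {
        Map<String, List<String>> graph = new LinkedHashMap<>();
        for (String def : text.split("\\.")) {          // each definition ends with '.'
            def = def.trim();
            if (def.isEmpty()) continue;
            String[] sides = def.split("=", 2);
            String node = sides[0].trim();
            List<String> successors = new ArrayList<>();
            if (sides.length == 2 && !sides[1].trim().isEmpty()) {
                successors.addAll(Arrays.asList(sides[1].trim().split("\\s+")));
            }
            graph.put(node, successors);
        }
        return graph;
    }

    public static void main(String[] args) {
        System.out.println(parse("A = B C D. B = E. E = ."));
        // prints {A=[B, C, D], B=[E], E=[]}
    }
}
```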

  24. How to call the program • demjava test • The program will print the paths it traverses and how often it visits C. Testing/Spring 99

  25. Clue list: from A via B to C • What does the program do if there is no path from A via B to C? • What if A, B, or C does not appear in the graph? • Check that paths from A to C not going through B are excluded: paths of length 1, 2, or 3. Testing/Spring 99

  26. Clue list: From A bypassing B to C • What does the program do if there is no path from A bypassing B to C? • What if A, B, or C does not appear in the graph? Is it OK if B does not appear? • Check that paths from A to C going through B are excluded: paths of length 1, 2, or 3. Testing/Spring 99

  27. Test specifications: From A via B to C • Graph: A = C B X. B = C X. C = . X = C. Expected: 2 visits of C. • Graph: A = C B. B = C. C = . Expected: 1 visit. • Graph: A = B. B = C. C = . Expected: 1 visit. [The slide also shows a drawing of each graph.] Testing/Spring 99

  28. Test specifications: From A via B to C • Graph: A = C B X Y. Y = B. B = C X. C = . X = C. Expected: 4 visits of C. [The slide also shows a drawing of the graph.] Testing/Spring 99

  29. Test specifications: From A bypassing B to C • Graph: A = C B X Y. Y = B. B = C X. C = . X = C. Expected: 2 visits of C. [The slide also shows a drawing of the graph.] Testing/Spring 99
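One way to carry these specifications into test code is to record each one as data: the input graph in the slide notation plus the expected visit count. The Spec record and catalog below are an illustrative sketch, not part of the course tooling; the expected values are the ones given on slides 27 through 29.

```java
import java.util.List;

// Test specifications as data: traversal kind, input graph, expected visits of C.
class TraversalTestSpecs {
    record Spec(String traversal, String graph, int expectedVisits) {}

    static final List<Spec> SPECS = List.of(
        new Spec("A via B to C",       "A = C B X. B = C X. C = . X = C.",         2),
        new Spec("A via B to C",       "A = C B. B = C. C = .",                     1),
        new Spec("A via B to C",       "A = B. B = C. C = .",                       1),
        new Spec("A via B to C",       "A = C B X Y. Y = B. B = C X. C = . X = C.", 4),
        new Spec("A bypassing B to C", "A = C B X Y. Y = B. B = C X. C = . X = C.", 2)
    );

    public static void main(String[] args) {
        SPECS.forEach(s -> System.out.println(
            s.traversal() + " on \"" + s.graph() + "\" -> expect "
            + s.expectedVisits() + " visit(s) of C"));
    }
}
```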

  30. Fundamental Assumptions of Subsystem Testing • Most errors are not very creative. Methodical, checklist-based approaches will have a high payoff. • Faults of omission, those caused by a failure to anticipate special cases, are the most important and most difficult type. • Specification faults, especially omissions, are more dangerous than code faults. Testing/Spring 99

  31. Fundamental Assumptions of Subsystem Testing • At every stage of testing, mistakes are inevitable. Later stages should compensate for them. • Code coverage is a good approximate measure of test quality. Must be used with extreme care. Testing/Spring 99

  32. A summary of subsystem testing • Build the test requirement checklist • Find clues • Expand clues into test requirements • Design the tests • Combine requirements into tests • Check tests for common testing mistakes • Supplement testing with code inspections Testing/Spring 99

  33. A summary of subsystem testing • Implement test support code • Implement tests • Evaluate and improve tests • use code coverage tool • find undertested or missing clues • find more test requirements • write more test specifications Testing/Spring 99

  34. Testing/Spring 99

  35. Test strategies • A test strategy is a systematic method used to select and/or generate the tests to be included in a test suite. • Effective: likely to reveal bugs. • Kinds: • behavioral = black-box = functional • structural = white-box = glass-box testing • hybrid Testing/Spring 99

  36. Testing strategies • behavioral = black-box = functional • based on requirements • structural = white-box = glass-box testing • based on program (coverages) • hybrid • use combination Testing/Spring 99

  37. Classification of bugs • unit/component bugs • integration bugs • system bugs Testing/Spring 99

  38. Generic Testing Principles • Define the graph • Design node-cover tests (tests that confirm that the nodes are there) • Design edge-cover tests (that confirm all required links and no more) • Design loop tests Testing/Spring 99

  39. Generic Testing Principles: Example • Define the graph • UML class diagram • Design node-cover tests (tests that confirm that the nodes are there) • Build at least one object of each class • Design edge-cover tests (that confirm all required links) • use each inheritance edge and association Testing/Spring 99

  40. Generic Testing Principles: Example • Define the graph • Finite state machine • Design node-cover tests (tests that confirm that the nodes are there) • Use each state at least once • Design edge-cover tests (that confirm all required links) • use each state transition at least once Testing/Spring 99
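For the finite-state-machine case, edge-cover tests can be written down directly: drive every transition at least once and check the resulting state. A self-contained sketch using a made-up two-state machine (not an example from the course):

```java
// Edge-cover testing of a tiny finite state machine: every transition is driven
// at least once and the resulting state is compared with the expected target.
class TurnstileFsm {
    enum State { LOCKED, UNLOCKED }
    enum Event { COIN, PUSH }

    static State step(State s, Event e) {
        if (s == State.LOCKED   && e == Event.COIN) return State.UNLOCKED;
        if (s == State.UNLOCKED && e == Event.PUSH) return State.LOCKED;
        return s;  // all other events leave the state unchanged
    }

    public static void main(String[] args) {
        // one test per transition (edge): start state, event, expected end state
        Object[][] edges = {
            { State.LOCKED,   Event.COIN, State.UNLOCKED },
            { State.LOCKED,   Event.PUSH, State.LOCKED   },
            { State.UNLOCKED, Event.PUSH, State.LOCKED   },
            { State.UNLOCKED, Event.COIN, State.UNLOCKED },
        };
        for (Object[] t : edges) {
            State actual = step((State) t[0], (Event) t[1]);
            System.out.println(t[0] + " --" + t[1] + "--> " + actual
                + (actual == t[2] ? " (as expected)" : " (FAILURE, expected " + t[2] + ")"));
        }
    }
}
```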

  41. Testing/Spring 99

  42. Quality factors • Correctness • conforms to the specification • Maintainability • the ease with which software can be changed • corrective: error fixing • adaptive: requirement changes (the majority) • perfective: improve the system • Portability Testing/Spring 99

  43. Quality factors • Testability • How easy is it to test? Are the requirements clear? • Usability • effort required to learn and operate the system • Reliability: mean time between failures • Efficiency: use of resources • Integrity, Security Testing/Spring 99

  44. Quality factors • Reusability • Interoperability • Write a Quality Manual to address these issues. Testing/Spring 99

  45. ISO 9000 Series of Standards (5 years old) • How can customers judge the competence of a software developer? • Adopted by 130 countries. • ISO 9001: Quality Systems - Model for Quality Assurance in Design, Development, Production, Installation and Servicing. (general design) Testing/Spring 99

  46. ISO 9000 Series of Standards (5 years old) • ISO 9000-3: Guidelines for the Application of ISO 9001 to the Development, Supply and Maintenance of Software. • ISO 9004-2: Quality Management and Quality System Elements Testing/Spring 99

  47. The end Testing/Spring 99

  48. Test coverage tool • For example: for each traversal, which fraction of the traversal methods is used? • How often is each adaptive method called? • Define global counters in the Main class. • Use an aspect language to instrument the code. Generate the code. • Testing tool development. Testing/Spring 99
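As a hand-written sketch of the counter idea: in the course the instrumentation is generated with an aspect language rather than written by hand, and the names below (Main, visitCounts, traverseC) are illustrative only.

```java
import java.util.Map;
import java.util.TreeMap;

// Global counters keyed by method name, bumped from inside each traversal method.
class Main {
    static final Map<String, Integer> visitCounts = new TreeMap<>();

    static void count(String method) {
        visitCounts.merge(method, 1, Integer::sum);
    }

    // example traversal method that records its own use
    static void traverseC() {
        count("traverseC");
        // ... the actual traversal work would go here ...
    }

    public static void main(String[] args) {
        traverseC();
        traverseC();
        System.out.println(visitCounts);  // prints {traverseC=2}
    }
}
```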
