
In Vivo Testing Approach for Efficient System Testing

Learn about the innovative in vivo testing approach, its implementation framework, and how it addresses testing challenges in deployment environments. Explore case studies on caching systems and the benefits compared to traditional unit testing.

Presentation Transcript


  1. The In Vivo Testing Approach Christian Murphy, Gail Kaiser, Ian Vo, Matt Chu Columbia University

  2. Problem Statement • It is infeasible to fully test a large system prior to deployment considering: • different runtime environments • different configuration options • different patterns of usage • This problem may be compounded by moving apps from single-CPU machines to multi-core processors

  3. Our Solution • Continually test applications executing in the field (in vivo) as opposed to only testing in the development environment (in vitro) • Conduct the tests in the context of the running application • Do so without affecting the system’s users

  4. /* Before instrumentation */
     int main ( ) {
         ...
         foo(x);
         ...
     }
     /* After instrumentation: an in vivo test of foo runs alongside the original call */
     int main ( ) {
         ...
         foo(x);
         test_foo(x);
         ...
     }

  5. Contributions • A new testing approach called in vivo testing designed to execute tests in the deployment environment • A new type of tests called in vivo tests • An implementation framework called Invite

  6. Related Work • Perpetual testing [Clarke SAS’00] • Skoll [Memon ICSE’04] • Gamma [Orso ISSTA’02] • CBI [Liblit PLDI’03] • Distributed In Vivo Testing [Chu ICST’08]

  7. Example of Defect: Cache
     private int numItems = 0, currSize = 0;  // number of items in the cache and their total size (in bytes)
     private int maxCapacity = 1024;          // maximum capacity, in bytes
     public int getNumItems() { return numItems; }
     public boolean addItem(CacheItem i) throws ... {
         numItems++;  // defect: should only be incremented within the “if” block
         if (currSize + i.size < maxCapacity) {
             add(i);
             currSize += i.size;
             return true;
         } else {
             return false;
         }
     }

  8. Insufficient Unit Test
     public void testAddItem() {
         Cache c = new Cache();
         assert(c.addItem(new CacheItem()));
         assert(c.getNumItems() == 1);
         assert(c.addItem(new CacheItem()));
         assert(c.getNumItems() == 2);
     }
     1. Assumes an empty/new cache
     2. Doesn’t take into account the various states that the cache can be in
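To see why a unit test like this passes while the deployed system still misbehaves, the following self-contained Java sketch replays the defect from slide 7. Class and field names follow the slide’s example rather than the real OSCache or JCS code, so this is an illustration only: the reported item count diverges from the actual contents only once the cache reaches capacity, a state the unit test above never exercises.

     import java.util.ArrayList;
     import java.util.List;

     // Minimal reconstruction of the defective cache from slide 7 (illustrative only).
     class CacheItem {
         final int size;                              // in bytes
         CacheItem(int size) { this.size = size; }
     }

     class Cache {
         private int numItems = 0, currSize = 0;
         private final int maxCapacity = 1024;        // in bytes
         private final List<CacheItem> items = new ArrayList<>();

         public int getNumItems() { return numItems; }
         public int actualItemCount() { return items.size(); }

         public boolean addItem(CacheItem i) {
             numItems++;                              // defect: also counts items rejected below
             if (currSize + i.size < maxCapacity) {
                 items.add(i);
                 currSize += i.size;
                 return true;
             } else {
                 return false;
             }
         }
     }

     public class CacheDefectDemo {
         public static void main(String[] args) {
             Cache c = new Cache();
             for (int k = 0; k < 5; k++) {
                 c.addItem(new CacheItem(256));       // the 4th and 5th items exceed capacity
             }
             System.out.println("reported items: " + c.getNumItems());     // 5
             System.out.println("actual items:   " + c.actualItemCount()); // 3
         }
     }

An in vivo test of addItem that happens to run while the deployed cache is near capacity would exercise exactly this state.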

  9. Defects Targeted • Unit tests that make incomplete assumptions about the state of objects in the application • Possible field configurations that were not tested in the lab • A legal user action that puts the system in an unexpected state • A sequence of unanticipated user actions that breaks the system • Defects that only appear intermittently

  10. Applications Targeted • Applications that produce calculations or results that may not be obviously wrong • “Non-testable programs” • Simulations • Applications in which extra-functional behavior may be wrong even if output is correct • Caching systems • Scheduling of tasks

  11. In Vivo Testing: Process • Create test code (using existing unit tests or new In Vivo tests) • Instrument application using Invite testing framework • Configure framework • Deploy/execute application in the field

  12. Model of Execution • When an instrumented function is about to be executed, the framework decides whether to run a test • If no, the function simply executes and the program continues • If yes, the framework forks, creates a sandbox, and runs the test in the forked process while the rest of the program continues; the original function then executes as usual
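The sketch below illustrates only this control flow in plain Java; the names are invented, and unlike the actual Invite framework (which forks a separate process and builds a sandbox, as shown on the next slides) it stands in a copy of the object state for the sandbox.

     import java.util.Random;

     public class ForkModelSketch {
         private static final double RHO = 0.25;       // probability that a call triggers a test
         private static final Random RNG = new Random();

         /** State the function under test operates on. */
         static class Counter {
             int value;
             Counter(int value) { this.value = value; }
             Counter copy() { return new Counter(value); }   // stands in for the sandbox
         }

         /** Function under test. */
         static int increment(Counter c, int x) {
             c.value += x;
             return c.value;
         }

         /** In vivo test: a property expected to hold in whatever state the object is in. */
         static boolean testIncrement(Counter c, int x) {
             int before = c.value;
             return increment(c, x) == before + x;
         }

         /** Wrapper the instrumentation step would generate around the original call. */
         static int instrumentedIncrement(Counter c, int x) {
             if (RNG.nextDouble() < RHO) {              // "Run a test?"
                 Counter sandbox = c.copy();            // "Create sandbox" (a state copy here,
                                                        //  in place of a forked process)
                 if (!testIncrement(sandbox, x)) {      // "Run test" against the current state
                     System.err.println("in vivo test failed for increment(" + x + ")");
                 }
             }
             return increment(c, x);                    // "Execute function": the user-facing call
         }

         public static void main(String[] args) {
             Counter c = new Counter(40);
             System.out.println(instrumentedIncrement(c, 2));  // prints 42; sometimes also runs the test
         }
     }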

  13. Writing In Vivo Tests
     /* Method to be tested */
     public boolean addItem(CacheItem i) { . . . }
     /* JUnit style test */
     public void testAddItem() {
         Cache c = new Cache();
         if (c.addItem(new CacheItem()))
             assert(c.getNumItems() == 1);
     }
     /* In Vivo style test: runs against the object’s current state */
     public boolean testAddItem(CacheItem i) {
         Cache c = this;
         int oldNumItems = c.getNumItems();
         if (c.addItem(i))
             return c.getNumItems() == oldNumItems + 1;
         else
             return true;
     }

  14. Instrumentation
     /* Method to be tested (original method, renamed during instrumentation) */
     public boolean __addItem(CacheItem i) { . . . }
     /* In Vivo style test */
     public boolean testAddItem(CacheItem i) { ... }
     /* Wrapper that replaces the original method */
     public boolean addItem(CacheItem i) {
         if (Invite.runTest("Cache.addItem")) {
             Invite.createSandboxAndFork();
             if (Invite.isTestProcess()) {
                 if (testAddItem(i) == false)
                     Invite.fail();
                 else
                     Invite.succeed();
                 Invite.destroySandboxAndExit();
             }
         }
         return __addItem(i);
     }

  15. Configuration • Each instrumented method has a set probability ρ with which its test(s) will run • To avoid bottlenecks, can also configure: • Maximum allowed performance overhead • Maximum number of simultaneous tests • Also, what action to take when a test fails
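As a purely hypothetical illustration of these knobs (the slides do not show Invite’s actual configuration format, and all names below are invented), the settings might be grouped like this:

     /** Hypothetical sketch only; not Invite's real configuration API. */
     public class InviteConfigSketch {
         enum FailureAction { LOG, NOTIFY_DEVELOPERS, DISABLE_TEST }

         double testProbability = 0.25;      // rho: chance that a call to an instrumented method runs its tests
         double maxOverheadPercent = 5.0;    // cap on allowed performance overhead
         int maxSimultaneousTests = 4;       // cap on tests running at the same time
         FailureAction onFailure = FailureAction.LOG;   // what to do when a test fails
     }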

  16. Case Studies • Applied testing approach to two caching systems • OSCache 2.1.1 • Apache JCS 1.3 • Both had known defects that were found by users (no corresponding unit tests for these defects) • Goal: demonstrate that “traditional” unit tests would miss these but In Vivo testing would detect them

  17. Experimental Setup • An undergraduate student created unit tests for the methods that contained the defects • These tests passed in “development” • Student was then asked to convert the unit tests to In Vivo tests • Driver created to simulate real usage in a “deployment environment”

  18. Discussion • In Vivo testing revealed all defects, even though unit testing did not • Some defects only appeared in certain states, e.g. when the cache was at full capacity • These are the very types of defects that In Vivo testing is targeted at • However, the approach depends heavily on the quality of the tests themselves

  19. Performance Evaluation • We instrumented three C and two Java applications with the framework and varied the value of ρ (the probability that a test is run) • Applications were run with real-world inputs on a dual-core 3GHz server with 1GB RAM • No limits were placed on the maximum allowable overhead or the number of simultaneous tests

  20. Experimental Results • [Chart: execution time in seconds vs. the percentage of function calls resulting in tests, from 0% to 100%]

  21. Discussion • Percent overhead is not a meaningful metric since it depends on the number of tests run • More tests = more overhead • Short-running programs with lots of tests will have significantly more “overhead” than long-running programs • For C, the overhead was 1.5ms per test • For Java, around 5.5ms per test
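For example, with a hypothetical workload: a long-running C service whose instrumented methods trigger 10,000 tests over a day pays roughly 10,000 × 1.5 ms = 15 seconds of extra work spread across the whole day, whereas a short-lived tool triggering the same 10,000 tests during a two-minute run would report those 15 seconds as a large percentage overhead, even though the per-test cost is identical.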

  22. Future Work • Ensure that test does not affect the external system state (database, network, etc.) • Adjust frequency of test execution based on context or resource availability (CPU usage, number of threads, etc.) • Apply approach to certain domains, e.g. security testing

  23. Conclusion • We have presented a new testing approach called in vivo testing designed to execute tests in the deployment environment • We have also presented an implementation framework called Invite • In Vivo testing is an effective technique for detecting defects not caught in the lab

  24. The In Vivo Testing Approach Christian Murphy, Gail Kaiser, Ian Vo, Matt Chu Columbia University
