Using Runtime Testing to Detect Defects in Applications without Test Oracles Chris Murphy Columbia University November 10, 2008
About Me • 3rd-year PhD student at Columbia University • Advised by Prof. Gail Kaiser • Research interests: • Software Testing • Computer Science Education • Computer-Supported Cooperative Work
Introduction • This thesis addresses the problem of testing complex, highly configurable systems, particularly those without “test oracles” that indicate what the correct output should be for arbitrary input • We adapt a technique that leverages built-in “pseudo-oracles” and perform testing in the deployment environment in order to address limitations regarding defects that reveal themselves only in certain states or with certain input data
Overview • Problem Statement & Requirements • Approach & Hypotheses • Model & Architecture • Feasibility & Preliminary Results • Related Work • Expected Contributions
Problem Statement • Some applications, such as in Machine Learning, do not have test oracles for the general case • Even if certain defects may be detectable, others can only be revealed as a result of particular input values, configurations, application states, or runtime environments that may not have been tested prior to deploying a software product
Observation • Even when there is no oracle in the general case, there can still be a limited subset of inputs such that: • they can at least reveal certain types of defects, e.g. catastrophic failures (crashes), and/or • the expected output can actually be known • These inputs may be generated based on past inputs and their respective outputs, as in “Metamorphic Testing” [Chen ’98]
Metamorphic Testing • Originally designed as an approach for creating follow-up test cases based on those that have not revealed any defects • If input x produces output f(x), then the function’s “metamorphic properties” are used to guide a transformation function t, which is applied to produce t(x) • We can then predict the expected value of f(t(x)) based on the known value of f(x) • If f(t(x)) is not as expected, then a defect exists
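To make the f(x) → t(x) → f(t(x)) pattern concrete, here is a minimal Java sketch (illustrative, not from the talk) using an average function whose exact output we treat as unknowable, together with two of its metamorphic properties:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class MetamorphicExample {
    // Function under test: there is no oracle for the "correct" average of
    // arbitrary input, but its metamorphic properties are known.
    static double average(List<Double> xs) {
        double sum = 0;
        for (double x : xs) sum += x;
        return sum / xs.size();
    }

    public static void main(String[] args) {
        List<Double> x = Arrays.asList(3.0, 1.0, 4.0, 1.0, 5.0);
        double fx = average(x);

        // Permutative property: t(x) = shuffle(x); expect f(t(x)) == f(x)
        List<Double> shuffled = new ArrayList<>(x);
        Collections.shuffle(shuffled);
        if (Math.abs(average(shuffled) - fx) > 1e-9)
            System.err.println("Defect: permutation changed the average");

        // Multiplicative property: t(x) = 2*x; expect f(t(x)) == 2*f(x)
        List<Double> doubled = new ArrayList<>();
        for (double v : x) doubled.add(2 * v);
        if (Math.abs(average(doubled) - 2 * fx) > 1e-9)
            System.err.println("Defect: doubling inputs did not double the average");
    }
}
```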
Metamorphic Testing Example • Anomaly-based network intrusion detection systems build a “model” of normal behavior • In some cases, the model may consider the byte distribution of data in the incoming network packet; anything deemed anomalous causes an alert • We cannot know a priori whether a particular packet should cause an alert • However, if we permute the order of the bytes, the result (anomalous or not) should be the same as for the original packet, since the system only considers the distribution of the bytes
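An executable sketch of this property; the Detector interface and isAnomalous method are hypothetical stand-ins, since the real NIDS API is not shown in the talk:

```java
import java.util.Random;

public class PayloadPermutationCheck {
    // Hypothetical stand-in for the anomaly detector, assumed to classify a
    // packet payload based only on its byte distribution.
    interface Detector {
        boolean isAnomalous(byte[] payload);
    }

    // Fisher-Yates shuffle: permutes byte order but preserves the
    // byte distribution (the histogram of values).
    static byte[] permute(byte[] payload, Random rnd) {
        byte[] copy = payload.clone();
        for (int i = copy.length - 1; i > 0; i--) {
            int j = rnd.nextInt(i + 1);
            byte tmp = copy[i];
            copy[i] = copy[j];
            copy[j] = tmp;
        }
        return copy;
    }

    // Metamorphic check: the verdict on the permuted payload must match the
    // verdict on the original; if not, the detector has a defect.
    static boolean check(Detector d, byte[] payload, Random rnd) {
        return d.isAnomalous(payload) == d.isAnomalous(permute(payload, rnd));
    }
}
```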
Proposed Approach • To address the problem of functions and/or applications that have no test oracle, we use Metamorphic Testing • In such applications, to reveal defects that are dependent on input data, configuration, application state, or the runtime environment, we continue Metamorphic Testing in the field, after deployment and during actual usage
Approach Details • Initial input/output pairs are taken from actual executions • We cannot know whether the output is correct, but we at least know that the input is something that arises in practice, and is therefore useful as a valid test case • We then apply “metamorphic properties” to derive test input whose output we should be able to predict • Although we cannot know whether the test output is correct either, if it is not as predicted then a defect exists • Since this runs in the field, we must ensure that users do not notice the testing, e.g. by seeing test output or experiencing a sudden performance lag
Hypotheses • For programs that do not have a test oracle, conducting Metamorphic Testing within the context of the application running in the field can reveal defects that would not otherwise be found • This can be done without affecting the application state from the users’ perspective, and with minimal performance overhead
Proposed Model • Automated Metamorphic System Testing: Conducts system-level Metamorphic Testing in the deployment environment • Metamorphic Runtime Checking: A separate testing technique that, for individual units (functions), supports the execution of Metamorphic Tests that are executed “from within” the context of the running application
Automated Metamorphic System Testing • Checks that the metamorphic properties of the entire system hold after execution • Treats the application as a black box • Multiple invocations run in parallel, and results are compared upon completion • User only sees output from the “original” invocation
Amsterdam: Automated Metamorphic System Testing Framework • Metamorphic properties are specified in XML • Input transformation • Runtime options • Output comparison • The framework provides out-of-the-box support for numerous transformation and comparison functions, and is extensible to support custom operations • Additional invocations are executed in parallel in separate sandboxes that have their own virtual execution environment
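Amsterdam itself is not reproduced here; the sketch below shows, in plain Java, the black-box pattern it automates, with a hypothetical application binary and input files, and with sequential invocations standing in for Amsterdam’s parallel sandboxes:

```java
import java.io.IOException;

public class SystemLevelMetamorphicRun {
    // Runs the application under test as a black box and captures its output.
    static String run(String inputFile) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("./app-under-test", inputFile)
                .redirectErrorStream(true)
                .start();
        String out = new String(p.getInputStream().readAllBytes());
        p.waitFor();
        return out;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical: transformed.csv was produced from original.csv by an
        // input transformation (e.g., permuting its rows).
        String original = run("original.csv");
        String followUp = run("transformed.csv");

        // For a permutative property the outputs should be identical; richer
        // comparison functions (scaling, reordering) would be plugged in here.
        if (!original.equals(followUp)) {
            System.err.println("Metamorphic property violated: outputs differ");
        }
    }
}
```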
Metamorphic Runtime Checking • For individual units (functions), we check whether the metamorphic properties hold as the application is running, using actual input from real executions and the application’s current state • Function arguments are modified according to the metamorphic properties • Function is called again (in an isolated sandbox) with the new input • Outputs are compared
(Flowchart) Function foo is about to be executed → Run a pre-test? If yes: create a sandbox for the test, execute preTestFoo, and record success/failure. Execute foo → Run a post-test? If yes: create a sandbox for the test, execute postTestFoo, and record success/failure. Program continues.
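A hedged Java sketch of this flow, with all names hypothetical; the real framework creates a forked sandbox for each test, whereas this sketch simply relies on foo being side-effect free:

```java
import java.util.Random;

public class InstrumentedFoo {
    static final Random RND = new Random();
    static final double TEST_PROBABILITY = 0.05; // hypothetical setting

    static int foo(int x) { return x * x; } // function under test

    // Mirrors the flowchart: optional pre-test, the real call, optional post-test.
    static int instrumentedFoo(int x) {
        if (RND.nextDouble() < TEST_PROBABILITY) preTestFoo(x);
        int result = foo(x);
        if (RND.nextDouble() < TEST_PROBABILITY) postTestFoo(x, result);
        return result; // the user sees only this original result
    }

    // Pre-test: negating the argument must not change the square.
    static void preTestFoo(int x) {
        record("preTestFoo", foo(-x) == foo(x));
    }

    // Post-test: relates the recorded output to a follow-up call.
    // Multiplicative property of squaring: foo(2x) == 4 * foo(x).
    static void postTestFoo(int x, int result) {
        record("postTestFoo", foo(2 * x) == 4 * result);
    }

    static void record(String test, boolean success) {
        System.err.println(test + (success ? " passed" : " FAILED"));
    }
}
```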
Foundation: In Vivo Testing • To facilitate testing “from within” a running program, we will extend In Vivo Testing [Chu ICST’08] • In Vivo Tests are analogous to unit tests but they test from within the context of the running application as it executes in the deployment environment, as opposed to a clean slate • They test that sequences of actions produce the expected results, no matter what the configuration, state, or runtime environment
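A minimal sketch of the idea with a hypothetical Cache interface: the test checks a property against whatever state the live object is in when the test fires, rather than against a freshly constructed fixture:

```java
public class CacheInVivoTest {
    // Hypothetical interface for the cache under test.
    interface Cache {
        void put(String key, Object value);
        Object get(String key);
    }

    // Property: immediately after a put, get must return the stored value,
    // regardless of how full or fragmented the cache currently is.
    static boolean testPutThenGet(Cache liveCache) {
        Object marker = new Object();
        liveCache.put("__invivo_probe__", marker);
        // A real In Vivo framework would run this in a forked sandbox so the
        // probe key never pollutes the state the user actually sees.
        return liveCache.get("__invivo_probe__") == marker;
    }
}
```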
Complementary Testing Approaches • Metamorphic Testing addresses a limitation of In Vivo Testing: • The need for a test oracle • In Vivo Testing addresses some limitations of Metamorphic Testing: • Availability of initial test data • Detecting defects that only appear in certain states, configurations, or environments; or occur only intermittently
Columbus: Metamorphic Runtime Checking Framework • Tests (specifications of metamorphic properties) are written by developers; selected components of the application are then instrumented with these tests at compile time • Configuration includes: the probability of running a test for each function; the maximum number of concurrent tests; the action to take when a test fails; and whether to assign tests to a separate processor/core • The test sandbox can be created by a simple “fork” or by creating a virtual execution environment
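Columbus’s actual configuration format is not shown in the talk; a hypothetical Java encoding of the options listed above might look like this:

```java
// Hypothetical configuration object mirroring the options the slide lists;
// the field names and defaults are assumptions, not Columbus's real API.
public class ColumbusConfig {
    enum FailureAction { LOG, NOTIFY_DEVELOPER, HALT }

    double testProbability = 0.01;  // chance of running a test per instrumented call
    int maxConcurrentTests = 4;     // cap on tests running at once
    FailureAction onFailure = FailureAction.LOG; // what to do when a test fails
    boolean useSeparateCore = true; // pin test sandboxes to another processor/core
}
```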
Preliminary Results • Identified categories of metamorphic properties in the domain of Machine Learning • Detected defects with Metamorphic System Testing • Detected defects with In Vivo Testing • Detected defects with Metamorphic Runtime Checking
Categories of Metamorphic Properties [Murphy SEKE’08] • Additive: Increase (or decrease) numerical values by a constant • Multiplicative: Multiply numerical values by a constant • Permutative: Randomly permute the order of elements in a set • Invertive: Reverse the order of elements in a set • Inclusive: Add a new element to a set • Exclusive: Remove an element from a set • ML apps such as ranking, classification, and anomaly detection exhibit these properties
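One possible Java encoding of these six categories as reusable input transformations over a list of numbers (illustrative; the names and element type are assumptions):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class MetamorphicTransforms {
    static List<Double> additive(List<Double> xs, double c) {       // x + c
        List<Double> out = new ArrayList<>();
        for (double x : xs) out.add(x + c);
        return out;
    }
    static List<Double> multiplicative(List<Double> xs, double c) { // x * c
        List<Double> out = new ArrayList<>();
        for (double x : xs) out.add(x * c);
        return out;
    }
    static List<Double> permutative(List<Double> xs) {              // random order
        List<Double> out = new ArrayList<>(xs);
        Collections.shuffle(out);
        return out;
    }
    static List<Double> invertive(List<Double> xs) {                // reversed order
        List<Double> out = new ArrayList<>(xs);
        Collections.reverse(out);
        return out;
    }
    static List<Double> inclusive(List<Double> xs, double elem) {   // add an element
        List<Double> out = new ArrayList<>(xs);
        out.add(elem);
        return out;
    }
    static List<Double> exclusive(List<Double> xs) {                // drop an element
        List<Double> out = new ArrayList<>(xs);
        if (!out.isEmpty()) out.remove(out.size() - 1);
        return out;
    }
}
```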
Feasibility: Metamorphic System Testing • We performed system-level metamorphic testing on various types of Machine Learning applications [Murphy SEKE’08] • Detected previously-unknown defects in a real-world network intrusion detection system • However, this testing was not automated: inputs were modified with one-off scripts and outputs were compared manually
Feasibility: In Vivo Testing • We have previously developed an implementation of the In Vivo Testing framework for Java applications called Invite [Chu ICST’08] • Targeted towards applications in which defects were not obvious to the user (not necessarily those without test oracles) • Detected known defects in OSCache that were not found by traditional unit tests • Uses “fork” to create new processes for sandbox
Feasibility: Metamorphic Runtime Checking • We have developed a system by which functions’ metamorphic properties are specified using an extension to JML • Specifications are converted into metamorphic unit tests by a tool called Corduroy [Murphy ’08] • Tests are run using JML Runtime Assertion Checking • Detected defects in the WEKA and RapidMiner machine learning toolkits
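Corduroy’s exact annotation syntax is not shown in the talk; the sketch below uses standard JML for the precondition, with a clearly hypothetical clause indicating where a metamorphic property would be declared:

```java
public class Stats {
    //@ requires xs != null && xs.length > 0;
    // Hypothetical metamorphic clause (not real Corduroy syntax):
    //   @meta average(\permuted(xs)) == \result
    // i.e., permuting the input must not change the result.
    public static double average(double[] xs) {
        double sum = 0;
        for (double x : xs) sum += x;
        return sum / xs.length;
    }
}
```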
Related Work: Absence of Oracles • Pseudo-oracles [Davis ACM’81] • Testing non-testable programs [Weyuker TCJ’82] • Overview of approaches [Baresi ’01] • Embedded assertion languages • Extrinsic interface contracts • Pure specification languages • Trace checking & log file analysis • Using metamorphic testing [Chen JIST’02]
Related Work: Testing in the Field • Perpetual Testing [Osterweil QW’96] • Gamma [Orso ISSTA’02] • Skoll [Memon ICSE’04] • Cooperative Bug Isolation [Liblit RAMSS’04] • Failure-Oblivious Computing [Rinard OSDI’04] • Security systems that monitor for errors
Methodology (1) • To further demonstrate feasibility, we will conduct Automated Metamorphic System Testing and Metamorphic Runtime Checking on real-world Machine Learning applications as they run under normal operation in the field • We will also show that certain defects would not ordinarily have been detected by using Metamorphic Testing (or other techniques) prior to deployment
Methodology (2) • To show that our testing approach advances the state of the art in testing applications that have no test oracle, we will compare it to other techniques that could be used to address the same problem • Symbolic execution • Model checking • Program invariants • Formal specification languages
Expected Contributions • Automated Metamorphic System Testing and a testing framework called Amsterdam • Metamorphic Runtime Checking and a testing framework called Columbus • A set of guidelines for assisting the formulation and specification of metamorphic properties
Using Runtime Testing to Detect Defects in Applications without Test Oracles Chris Murphy cmurphy@cs.columbia.edu