A lightweight framework for testing database applications • Joe Tang, Eric Lo • Hong Kong Polytechnic University
Our focus • System testing (or black-box testing) • A database application whose correctness/behavior depends on: • The application code + • The information in the database
How to test a database application? • Test preparation: • In a particular "correct" release • A tester 'plays' the system and the sequence of actions (e.g., clicks) is recorded as a test trace/case T • E.g., T1: A user queries all products • E.g., T2: A user adds a product • The output of the system is recorded as the "expected results" of that trace • For database applications, "the output of the system" depends on the database content • A test trace may modify the database content • For ease of managing multiple test traces, we reset the database content at the beginning of recording each test trace
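To make the preparation step concrete, here is a minimal Python sketch of recording one trace. It assumes two hypothetical helpers that are not named in the slides: a reset_db callable that restores the database content and a perform callable that replays one recorded action and returns the system output.

```python
import json
from typing import Callable, List

def record_trace(trace_id: str,
                 actions: List[str],
                 perform: Callable[[str], str],
                 reset_db: Callable[[], None]) -> None:
    """Record one test trace: reset the DB to a known state, replay the
    tester's actions, and store the observed output as the expected results."""
    reset_db()                                   # start from clean, known DB content
    expected = [perform(a) for a in actions]     # system output for each action
    with open(f"{trace_id}.json", "w") as f:
        json.dump({"actions": actions, "expected": expected}, f)

# Hypothetical usage, mirroring the slide's examples:
#   record_trace("T1", ["query all products"], perform, reset_db)
#   record_trace("T2", ["add a product"], perform, reset_db)
```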
How to test a database application? • Test execution: • For each test trace T • Reset the database content • Run the sequence of actions (e.g., clicks) recorded in T • Match the system output with the expected output • Problem: Resetting the DB content is expensive • Involves content recovery + log cleaning + thread resetting [ICSE04] • About 1-2 minutes for each reset • If 1000 test traces: ~2000 minutes (~33 hours)
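The naive execution loop below follows the slide literally and resets before every trace; with a 1-2 minute reset, 1000 traces spend roughly 2000 minutes on resets alone. It reuses the hypothetical perform/reset_db helpers from the sketch above.

```python
import json
from typing import Callable, List

def run_suite_naive(trace_ids: List[str],
                    perform: Callable[[str], str],
                    reset_db: Callable[[], None]) -> List[str]:
    """Run every recorded trace against a freshly reset database and return
    the ids of traces whose output differs from the expected output."""
    failures = []
    for trace_id in trace_ids:
        reset_db()                               # expensive: roughly 1-2 minutes each
        with open(f"{trace_id}.json") as f:
            trace = json.load(f)
        actual = [perform(a) for a in trace["actions"]]
        if actual != trace["expected"]:          # compare with the recorded output
            failures.append(trace_id)
    return failures
```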
DB application testing optimization • 1. Test automation • Execute the test traces (and DB resets) automatically (vs. manually one-by-one) • 2. Test execution strategies • 3. Test optimization algorithms • 2 + 3 aim to minimize the number of DB resets
Related work • 2. Test execution strategies • Optimistic [vldbj]: Execute resets lazily • E.g., run T1 T2 T3; if T3 fails, reset (R) and rerun T3 • 3. Test optimization algorithms • SLICE algorithm [vldbj]: • If the order T1 T2 required a reset (R) this time • Next time we try T2 T1 …
Problems • 2. Test execution strategies • Optimistic [vldbj]: Execute resets lazily • E.g., run T1 T2 T3; if T3 fails, reset (R) and rerun T3 • May introduce false positives • E.g., T2 covers a bug but the test says nothing! • 3. Test optimization algorithms • SLICE algorithm [vldbj]: • If the order T1 T2 required a reset (R) this time • Next time we try T2 T1 … • Large overhead to keep the swapping info • Gets worse when test traces are added/removed
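As a sketch of the OPTIMISTIC strategy described above (the helper names are mine, not the paper's): run the traces back to back without resets, and only when a trace's output mismatches, reset and rerun it to decide whether the mismatch came from leftover DB state or from a real bug. The trailing comment marks the weakness the slide points out.

```python
from typing import Callable, Dict, List

def run_suite_optimistic(traces: Dict[str, dict],
                         perform: Callable[[str], str],
                         reset_db: Callable[[], None]) -> List[str]:
    """Execute resets lazily: only reset when a trace's output mismatches,
    then rerun that trace on a clean database."""
    failures = []
    for trace_id, trace in traces.items():
        actual = [perform(a) for a in trace["actions"]]
        if actual != trace["expected"]:
            reset_db()                           # mismatch may be due to dirty DB state
            actual = [perform(a) for a in trace["actions"]]
            if actual != trace["expected"]:      # still wrong on a clean DB
                failures.append(trace_id)
        # Weakness raised on the slide: if a buggy trace happens to reproduce
        # its recorded output (e.g., the bug only corrupts DB content), this
        # loop stays silent about it.
    return failures
```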
This paper • Test execution strategy • SAFE-OPTIMISTIC • No false positives • Test optimization algorithm • SLICE* • No overhead • Comparable performance to SLICE • Better than SLICE when test traces are added/removed
Test execution strategy: SAFE-OPTIMISTIC • Also "executes resets lazily" • Test preparation: • Record not only the system output • + the query results • Test execution: • Match not only the system output • + Match the query results
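A minimal sketch of the extra check, under the assumption that each recorded trace also stores the query results observed during recording; the query_results field and the observe_queries context manager are placeholders I made up to keep the sketch self-contained, not part of the framework's API.

```python
from typing import Callable, ContextManager, Dict, List

def run_suite_safe_optimistic(traces: Dict[str, dict],
                              perform: Callable[[str], str],
                              observe_queries: Callable[[], ContextManager],
                              reset_db: Callable[[], None]) -> List[str]:
    """Lazy resets as in OPTIMISTIC, but a trace only passes if BOTH the
    system output and the observed query results match what was recorded."""
    failures = []
    for trace_id, trace in traces.items():
        for attempt in range(2):                     # 2nd attempt runs on a clean DB
            with observe_queries() as seen:          # capture query results during replay
                actual = [perform(a) for a in trace["actions"]]
            if (actual == trace["expected"]
                    and seen.results == trace["query_results"]):
                break                                # genuine pass, no false positive
            if attempt == 0:
                reset_db()                           # retry once from known DB content
        else:
            failures.append(trace_id)                # mismatch even on a clean DB
    return failures
```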
Test optimization algorithm: SLICE* • Collection of "slices" • If the execution was T1 T2 T3 (fail) R T3 T4 T5 • Then we know <T1 T2> and <T3 T4 T5> are good (reset-free) slices • Next time: swap the slices, and thus try: • T3 T4 T5 T1 T2
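A sketch of the slice bookkeeping described on the slide, using plain Python lists. The split/rotate policy is my reading of the example (swap the two reset-free slices), not the paper's full SLICE* algorithm.

```python
from typing import Dict, List, Set

def split_into_slices(order: List[str], reset_positions: Set[int]) -> List[List[str]]:
    """Group a previous execution order into 'slices': maximal runs of traces
    that executed without a database reset in between. `reset_positions`
    holds the indices in `order` before which a reset happened."""
    slices, current = [], []
    for i, trace_id in enumerate(order):
        if i in reset_positions and current:
            slices.append(current)
            current = []
        current.append(trace_id)
    if current:
        slices.append(current)
    return slices

def next_order(slices: List[List[str]]) -> List[str]:
    """Try the slices in rotated order next time, hoping the conflict at the
    old slice boundary disappears (e.g., <T1 T2><T3 T4 T5> -> T3 T4 T5 T1 T2)."""
    rotated = slices[1:] + slices[:1]
    return [t for s in rotated for t in s]

# Example from the slide: a reset was needed before T3.
slices = split_into_slices(["T1", "T2", "T3", "T4", "T5"], {2})
print(next_order(slices))   # ['T3', 'T4', 'T5', 'T1', 'T2']
```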
Evaluation • A real-world case study • An on-line procurement system • Test database: 1.5 GB • A database reset: ~1.9 min • Synthetic experiments • Vary the number of test cases • Vary the degree of "conflicts" between test cases • Vary the % of updates in the test suite
Conclusion • SLICE* and SAFE-OPTIMISTIC • Run tests on database applications • Efficiently • Safely (no false positives) • Able to deal with test suite updates
References • [vldbj] Florian Haftmann, Donald Kossmann, Eric Lo: A framework for efficient regression tests on database applications. VLDB J. 16(1): 145-164 (2007) • [ICSE04] R. Chatterjee, G. Arun, S. Agarwal, B. Speckhard, and R. Vasudevan: Using data versioning in database application development. ICSE 2004