
A lightweight framework for testing database applications

Joe Tang, Eric Lo

Hong Kong Polytechnic University

Our focus

• System testing (or black-box testing)

• A database application whose correctness/behavior depends on:
– The application code
– The information in the database


How to test a database application?

• Test preparation:
– In a particular “correct” release:

• A tester ‘plays’ the system and the sequence of actions (e.g., clicks) is recorded as a test trace/case T

– E.g., T1: A user queries all products

– E.g., T2: A user adds a product

• The output of the system is recorded as the “expected results” of that trace

– For database applications, “the output of the system” depends on the database content

• A test trace may modify the database content
– For ease of managing multiple test traces, we reset the database content at the beginning of recording each test trace (a sketch of the recorded artefacts follows below)
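To make the recorded artefacts concrete, below is a minimal sketch in Python of one way to represent a test trace; the names (Action, TestTrace) and their fields are illustrative assumptions, not the paper's data model.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Action:
    # One recorded user action, e.g. a click or a form submission.
    name: str
    params: Dict[str, str] = field(default_factory=dict)

@dataclass
class TestTrace:
    # A test trace T: the recorded sequence of actions plus the expected
    # output captured while the tester "played" a known-correct release.
    trace_id: str
    actions: List[Action]
    expected_output: List[str]

# The slide's examples, expressed with this hypothetical structure:
T1 = TestTrace("T1", [Action("query_all_products")], ["<list of all products>"])
T2 = TestTrace("T2", [Action("add_product", {"name": "widget"})], ["product added"])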


How to test a database application?

• Test execution:
– For each test trace T:

• Reset the database content
• Run the sequence of actions (e.g., clicks) recorded in T
• Match the system output with the expected output (this per-trace loop is sketched below)

• Problem: resetting the DB content is expensive
– Involves content recovery + log cleaning + thread resetting [ICSE 04]
– About 1-2 minutes for each reset
– With 1000 test traces: ~2000 minutes (about 33 hours) of resets
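A minimal sketch of the reset-per-trace loop above, reusing the illustrative TestTrace structure; reset_database and run_system are hypothetical callables standing in for the real test harness.

def naive_execute(traces, reset_database, run_system):
    # Baseline strategy: one expensive database reset per test trace.
    # run_system(trace) is assumed to replay the trace's recorded actions
    # and return the observed system output.
    failures = []
    for trace in traces:
        reset_database()                      # about 1-2 minutes per reset [ICSE 04]
        actual = run_system(trace)
        if actual != trace.expected_output:   # compare with the recorded output
            failures.append(trace.trace_id)
    return failures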


DBA testing optimization

1. Test automation
– Execute the test traces (and DB resets) automatically (vs. manually, one by one)

2. Test execution strategies

3. Test optimization algorithms

2 + 3 aim to minimize the number of DB resets


Related work

2. Test execution strategies

• Optimistic [vldbj]: execute resets lazily
– E.g., T1 T2 T3 R T3 (run traces back-to-back; when T3's output mismatches, reset and rerun T3; sketched below)

3. Test optimization algorithms

• SLICE algorithm [vldbj]:
– If T1 T2 R this time (the order T1 T2 needed a reset)
– Next time we try T2 T1 …
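The lazy-reset idea can be pictured as follows; this is a hedged illustration of the strategy as described on this slide (reset and rerun only on a mismatch), not the code from [vldbj], and run_system/reset_database are again hypothetical harness callables.

def optimistic_execute(traces, reset_database, run_system):
    # Lazy-reset strategy as described above: run traces back-to-back and
    # skip resets while outputs keep matching; on a mismatch, reset and
    # rerun the same trace, reporting a failure only if it still mismatches
    # on a freshly reset database.
    failures = []
    reset_database()
    for trace in traces:
        if run_system(trace) != trace.expected_output:
            reset_database()                  # the "R" in "T1 T2 T3 R T3"
            if run_system(trace) != trace.expected_output:
                failures.append(trace.trace_id)
    return failures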



Problems

2. Test execution strategies

• Optimistic [vldbj]: execute resets lazily
– E.g., T1 T2 T3 R T3

• May introduce false positives

• E.g., T2 covers a bug, but the test run reports nothing!

3. Test optimization algorithms

• SLICE algorithm [vldbj]:
– If T1 T2 R this time (the order T1 T2 needed a reset)
– Next time we try T2 T1 …

• Large overhead: has to keep swapping information
• Gets worse when test traces are added or removed (+/-)


This paper

• Test execution strategy
– SAFE-OPTIMISTIC

• No false positives

• Test optimization algorithm
– SLICE*

• No overhead
• Comparable performance to SLICE
• Better than SLICE when test traces are added or removed (+/-)


Test execution strategy: SAFE-OPTIMISTIC

• Also “executes resets lazily”
– Test preparation:

• Record not only the system output
• …but also the query results

– Test execution:

• Match not only the system output
• …but also the query results (a sketch of this check follows below)
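One way to realize this check, as a hedged sketch: each trace is assumed to additionally carry expected_query_results recorded during preparation, and run_queries is a hypothetical callable that re-issues the recorded queries. The framework's actual interfaces may differ.

def trusted_match(trace, actual_output, actual_query_results):
    # SAFE-OPTIMISTIC check: a run counts as a pass only if BOTH the system
    # output AND the re-issued query results match what was recorded.
    return (actual_output == trace.expected_output and
            actual_query_results == trace.expected_query_results)

def safe_optimistic_execute(traces, reset_database, run_system, run_queries):
    # Resets are still executed lazily, but a verdict is accepted only when
    # the query results confirm that the comparison is trustworthy; otherwise
    # the trace is rerun on a freshly reset database before it is reported.
    failures = []
    reset_database()
    for trace in traces:
        if not trusted_match(trace, run_system(trace), run_queries(trace)):
            reset_database()
            if not trusted_match(trace, run_system(trace), run_queries(trace)):
                failures.append(trace.trace_id)
    return failures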


Implementation


Test optimization algorithm: SLICE*

• Maintains a collection of “slices” (maximal runs of traces executed without a reset)

• If T1 T2 T3 R T3 T4 T5

• Then we know <T1 T2> and <T3 T4 T5> are good

• Next time: swap the slices, and thus try:

• T3 T4 T5 T1 T2 (this slice bookkeeping is sketched below)
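A sketch of the slice bookkeeping implied by this example; the exact reordering rule used by SLICE* is not given on the slide, so the rotation below is only illustrative.

def split_into_slices(trace_ids, reset_before):
    # A "slice" is a maximal run of traces that executed with no reset in
    # between.  reset_before holds the positions at which a reset happened
    # before that trace could run successfully.
    slices, current = [], []
    for i, tid in enumerate(trace_ids):
        if i in reset_before and current:
            slices.append(current)
            current = []
        current.append(tid)
    if current:
        slices.append(current)
    return slices

def reorder_slices(slices):
    # Illustrative reordering: rotate the slices so a different slice
    # boundary is tried on the next run (for two slices this is exactly the
    # swap shown on the slide).
    return slices[1:] + slices[:1]

# The slide's example: T1 T2 T3 R T3 T4 T5
slices = split_into_slices(["T1", "T2", "T3", "T4", "T5"], reset_before={2})
assert slices == [["T1", "T2"], ["T3", "T4", "T5"]]
assert reorder_slices(slices) == [["T3", "T4", "T5"], ["T1", "T2"]]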


Evaluation

• A real-world case study
– An on-line procurement system
– Test database: 1.5 GB
– A database reset takes 1.9 min

• Synthetic experiments
– Vary the number of test cases
– Vary the degree of “conflicts” between test cases
– Vary the % of updates in the test suite


Real case study


1000 test traces, 100K conflicts


Conclusion

• SLICE* and SAFE-OPTIMISTIC
– Run tests on database applications:

• Efficiently
• Safely (no false positives)
• Able to deal with test suite updates


References

• [vldbj] Florian Haftmann, Donald Kossmann, Eric Lo. A framework for efficient regression tests on database applications. VLDB Journal 16(1): 145-164, 2007.

• [ICSE 04] R. Chatterjee, G. Arun, S. Agarwal, B. Speckhard, and R. Vasudevan. Using data versioning in database application development. ICSE 2004.

