

  1. A Comprehensive Framework for Testing Database-Centric Software Applications
     Gregory M. Kapfhammer
     Department of Computer Science, University of Pittsburgh
     PhD Dissertation Defense, University of Pittsburgh, April 19, 2007

  2. Dissertation Committee
     Director: Dr. Mary Lou Soffa (University of Virginia)
     Dr. Panos Chrysanthis (University of Pittsburgh)
     Dr. Bruce Childers (University of Pittsburgh)
     Dr. Jeffrey Voas (SAIC)
     With support and encouragement from countless individuals!

  3. Motivation
     The Risks Digest, Volume 22, Issue 64, 2003: Jeppesen reports airspace boundary problems. About 350 airspace boundaries contained in Jeppesen NavData are incorrect, the FAA has warned. The error occurred at Jeppesen after a software upgrade when information was pulled from a database containing 20,000 airspace boundaries worldwide for the March NavData update, which takes effect March 20.
     Important Point: Practically all use of databases occurs from within application programs [Silberschatz et al., 2006, pg. 311].

  4. Research Contributions
     A comprehensive framework that tests a program's interaction with the complex state and structure of a database:
     Database interaction fault model
     Database-aware representations
     Test adequacy
     Test coverage monitoring
     Regression testing
     Worst-case analysis of the algorithms and empirical evaluation with six case study applications

  5. Traditional Software Testing
     [Figure: program P takes an input, executes as byte code on a virtual machine, and produces the output "Final Result: 45"; the execution environment includes the virtual machine, graphical interface, database, file system, and operating system]
     Defects (e.g., bugs, faults, errors) can exist in program P and all aspects of P's environment

  6. Testing Environment Interactions
     [Figure: program P interacting with its execution environment: virtual machine, database, operating system, and file system]
     Defects can also exist in P's interaction with its environment

  7. Focus on Database Interactions
     [Figure: method m of program P issuing insert, select, update, and delete statements against databases D_1 through D_e]
     Program P can view and/or modify the state of the database

  8. Types of Applications
     Database-centric applications are classified along two dimensions: interaction approach (embedded vs. interface) and program location (inside vs. outside the DBMS)
     Testing framework relevant to all types of applications
     Current tool support focuses on Interface-Outside applications
     Example: Java application that submits SQL Strings to the HSQLDB relational database using JDBC drivers
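The Interface-Outside pattern named on this slide can be sketched in Java. This is a minimal illustration, not code from the dissertation: the `UserInfo` table, column names, and helper methods are invented here, and actually submitting the statement would require the HSQLDB driver jar on the classpath.

```java
import java.sql.Connection;
import java.sql.Statement;

// Hedged sketch of an "Interface-Outside" database-centric application:
// a Java program outside the DBMS that submits SQL Strings through the
// JDBC interface. Table and column names are illustrative.
public class InterfaceOutsideExample {

    // Building the SQL command as a plain Java String is exactly the kind
    // of program/database interaction the testing framework must analyze.
    static String buildUpdate(String user, boolean locked) {
        return "UPDATE UserInfo SET locked = " + (locked ? 1 : 0)
             + " WHERE username = '" + user + "'";
    }

    // The interaction point: the String crosses the program/DBMS boundary here.
    static int submit(Connection conn, String sql) throws Exception {
        try (Statement stmt = conn.createStatement()) {
            return stmt.executeUpdate(sql);
        }
    }

    public static void main(String[] args) {
        String sql = buildUpdate("brian", true);
        System.out.println(sql);
        // With the HSQLDB jar available, the submission would look like:
        // Connection conn = DriverManager.getConnection("jdbc:hsqldb:mem:test", "sa", "");
        // submit(conn, sql);
    }
}
```

The String concatenation above is also why such interactions are hard to test statically: the SQL text only exists at run time.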

  9. Research Contributions
     Database Interaction Fault Model
     Test Adequacy Criteria
     Test Coverage Monitoring
     Regression Testing: Reduction, Prioritization

  10. Database Interaction Faults: (1-v)
     P uses insert or update to incorrectly modify items within the database
     Commission fault that violates database validity
     Database-aware adequacy criteria can support fault isolation
     [Figure: expected vs. actual database state, before and after the interaction]
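A type (1-v) fault can be made concrete with a small simulation. This is an invented example, not from the dissertation: the relation is modeled as a map, and the validity constraint (balances must be non-negative) is assumed for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged illustration of a (1-v) commission fault: an update incorrectly
// modifies the database into a state that violates validity. The
// "accounts" relation and its integrity constraint are invented here.
public class ValidityFaultExample {

    // Relation R(account -> balance); validity requires balance >= 0.
    static final Map<String, Integer> accounts = new HashMap<>();

    // Faulty update: subtracts without checking funds, so it can place
    // an invalid (negative) value into the database.
    static void withdraw(String account, int amount) {
        accounts.put(account, accounts.get(account) - amount);
    }

    // Database-aware check: does every record satisfy the constraint?
    static boolean isValid() {
        return accounts.values().stream().allMatch(b -> b >= 0);
    }

    public static void main(String[] args) {
        accounts.put("alice", 50);
        withdraw("alice", 80);           // the commission fault is exercised here
        System.out.println(isValid());   // the validity violation is now observable
    }
}
```

A test suite that never drives `withdraw` past the constraint boundary would miss this fault, which is the motivation for database-aware adequacy criteria.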

  11. Database Interaction Faults: (1-c)
     P uses delete to remove incorrect items from the database
     Commission fault that violates database completeness
     Database-aware adequacy criteria can support fault isolation
     [Figure: expected vs. actual database state, before and after the interaction]

  12. Data Flow-Based Test Adequacy
     Node n_3: delete from R where A > 100 — define(R)
     Node n_6: select * from R — use(R)
     The intraprocedural database interaction association ⟨n_3, n_6, R⟩ exists within method m_i
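The association on this slide can be sketched as a pairing of defining and using interaction points. This is a simplification assumed for illustration (straight-line code, no kill analysis); only the node numbers and the entity R mirror the slide.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch of enumerating database interaction associations:
// pair every node that defines a database entity with every later node
// in the same method that uses the same entity.
public class AssociationExample {

    record Interaction(int node, String entity, boolean defines) {}
    record Association(int defNode, int useNode, String entity) {}

    static List<Association> associations(List<Interaction> points) {
        List<Association> result = new ArrayList<>();
        for (Interaction def : points) {
            if (!def.defines()) continue;
            for (Interaction use : points) {
                // A use after a define of the same entity yields <n_def, n_use, entity>.
                if (!use.defines() && use.node() > def.node()
                        && use.entity().equals(def.entity())) {
                    result.add(new Association(def.node(), use.node(), def.entity()));
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Interaction> m_i = List.of(
            new Interaction(3, "R", true),   // delete from R where A > 100 : define(R)
            new Interaction(6, "R", false)); // select * from R             : use(R)
        System.out.println(associations(m_i)); // contains the association <3, 6, R>
    }
}
```

A test that covers this association must execute the delete at n_3 and then the select at n_6 along a path with no intervening redefinition of R.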

  13. Research Contributions
     Database Interaction Fault Model
     Test Adequacy Criteria
     Test Coverage Monitoring
     Regression Testing: Reduction, Prioritization

  14. Test Adequacy Component
     [Diagram: program P and criterion C enter an ICFG constructor and a database-aware representation constructor; the resulting database interaction ICFG (DI-ICFG) feeds a data flow analyzer that identifies database interaction entities and, finally, database interaction associations]
     Process: Create a database-aware representation and perform data flow analysis
     Purpose: Identify the database interaction associations (i.e., the test requirements)

  15. Database-Aware Representation
     Database interaction graphs (DIGs) are placed before the interaction point
     Multiple DIGs can be integrated into a single CFG
     Analyze the interaction in a control-flow sensitive fashion
     [Figure: DI-CFG fragment for a lockAccount method — qu_lck = "UPDATE UserInfo ..." + temp1 + ";"; update_lock = m_connect.createStatement(); DIG nodes G_r1 and G_r2 (with entries defining temp2 and temp3 and using temp4) spliced in before result_lock = update_lock.executeUpdate(qu_lck); if (result_lock == 1) completed = true; exit lockAccount]
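The splicing step described above can be sketched on a toy control flow graph. This is an assumed simplification, not the dissertation's representation: the CFG is a map of successor lists, and the DIG is reduced to a chain of placeholder node labels.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hedged sketch of DIG integration: before the node that performs the
// database interaction (the executeUpdate call on the slide), splice a
// chain of database-entity nodes into the CFG, so data flow analysis
// sees database definitions and uses in a control-flow sensitive fashion.
public class DigSpliceExample {

    // CFG as successor lists: node label -> labels of successor nodes.
    static void spliceBefore(Map<String, List<String>> cfg,
                             String interactionPoint, List<String> digChain) {
        // Redirect every edge that entered the interaction point
        // into the head of the DIG chain instead.
        for (List<String> succs : cfg.values()) {
            succs.replaceAll(s -> s.equals(interactionPoint) ? digChain.get(0) : s);
        }
        // Chain the DIG nodes together, then fall through to the interaction.
        for (int i = 0; i < digChain.size(); i++) {
            String next = (i + 1 < digChain.size()) ? digChain.get(i + 1) : interactionPoint;
            cfg.put(digChain.get(i), new ArrayList<>(List.of(next)));
        }
    }

    public static void main(String[] args) {
        Map<String, List<String>> cfg = new LinkedHashMap<>();
        cfg.put("createStatement", new ArrayList<>(List.of("executeUpdate")));
        cfg.put("executeUpdate", new ArrayList<>(List.of("exit")));
        // G_r1 and G_r2 stand in for the DIG nodes on the slide.
        spliceBefore(cfg, "executeUpdate", List.of("G_r1", "G_r2"));
        System.out.println(cfg);
    }
}
```

After splicing, the path createStatement -> G_r1 -> G_r2 -> executeUpdate makes the database entities visible to a standard data flow analysis.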

  16. Data Flow Time Overhead
     [Chart: data flow analysis time (sec) at increasing database granularities P, P+D, P+R, P+Rc, P+A, P+Av; times range from 20.50 s for P to 21.06 s for P+Av, with 20.74, 20.77, 20.78, and 20.94 s at the intermediate granularities]
     2.7% increase in time overhead from P to P+Av (TM)

  17. Research Contributions
     Database Interaction Fault Model
     Test Adequacy Criteria
     Test Coverage Monitoring
     Regression Testing: Reduction, Prioritization

  18. Database-Aware Coverage Monitoring
     [Diagram: the program, test suite, and adequacy criteria drive an instrumentation step; executing the instrumented program and instrumented test suite yields coverage results that, together with the test requirements, feed the adequacy calculation and produce adequacy measurements]
     Purpose: Record how the program interacts with the database during test suite execution

  19. Database-Aware Instrumentation
     Test coverage monitoring instrumentation is classified by interaction location (program vs. test suite) and interaction type (defining, using, defining-using)
     Efficiently monitor coverage without changing the behavior of the program under test
     Record coverage information in a database interaction calling context tree (DI-CCT)
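The DI-CCT named on this slide can be sketched as a calling context tree whose nodes also count database interactions. This is an assumed simplification; the shape of the real DI-CCT in the dissertation's tool may differ, and the method and entity names are illustrative.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hedged sketch of a database interaction calling context tree (DI-CCT):
// a calling context tree whose nodes record the database entities the
// program defined or used while that calling context was active.
public class DiCctExample {

    static class Node {
        final String method;
        final Map<String, Node> children = new LinkedHashMap<>();
        final Map<String, Integer> interactions = new LinkedHashMap<>();
        Node(String method) { this.method = method; }
    }

    final Node root = new Node("main");
    private Node current = root;

    // On a call, descend to (or create) the child context for the callee.
    void enter(String method) {
        current = current.children.computeIfAbsent(method, Node::new);
    }

    // Record a defining or using interaction, e.g. "define(UserInfo)",
    // against the currently active calling context.
    void interact(String label) {
        current.interactions.merge(label, 1, Integer::sum);
    }

    public static void main(String[] args) {
        DiCctExample cct = new DiCctExample();
        cct.enter("lockAccount");
        cct.interact("define(UserInfo)");  // the UPDATE from the earlier slide
        cct.interact("define(UserInfo)");
        System.out.println(cct.root.children.get("lockAccount").interactions);
    }
}
```

Because interactions are keyed by context, the same SQL statement executed from two different call chains is recorded separately, which a flat coverage counter cannot do.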

  20. Configuring the Coverage Monitor
     Instrumentation: static or dynamic; source code or bytecode; traditional or database-aware
     Tree format: binary or XML; standard or compressed
     Tree type: CCT or DCT, with database-aware variants DI-CCT and DI-DCT
     Interaction level: database, relation, attribute, record, attribute value
     Flexible and efficient approach that fully supports both traditional and database-centric applications

  21. Static Instrumentation: Time
     [Chart: static instrumentation time (sec) for the applications FF, PI, RM, ST, TM, GB, and All; individual applications take between 4.391 and 5.583 s, and instrumenting all of them takes 8.687 s]
     Attach probes to all of the applications in less than nine seconds
     Static approach is less flexible than dynamic instrumentation

  22. Static Instrumentation: Space
     [Chart: application size in bytes under the compression techniques ZIP, GZIP, and PACK for the GB application; reported sizes include 75782, 68456, 68475, 25084, 19419, and 9034 bytes]
     Increase in bytecode size may be large (space vs. time trade-off)

  23. Static vs. Dynamic Instrumentation
     [Chart: test coverage monitoring time (sec) for the GB application — Norm 6.939, Sta-CCT 7.626, Sta-DCT 8.026, Dyn-CCT 11.084, Dyn-DCT 11.435]
     Static is faster than dynamic / CCT is faster than DCT
     The coverage monitor is both efficient and effective

  24. Size of the Instrumented Applications
     Compr Tech   Before Instr (bytes)   After Instr (bytes)
     None         29275                  887609
     Zip          15623                  41351
     Gzip         10624                  35594
     Pack         5699                   34497
     Average static size across all case study applications
     Compress the bytecodes with general purpose techniques
     Specialized compressor nicely reduces space overhead

  25. Database Interaction Levels
     Interaction Level   CCT TCM Time (sec)   Percent Increase (%)
     Program             7.44                 12.39
     Database            7.51                 13.44
     Relation            7.56                 14.20
     Attribute           8.91                 34.59
     Record              8.90                 34.44
     Attribute Value     10.14                53.17
     Static instrumentation supports efficient monitoring
     53% increase in testing time at finest level of interaction
