SENSA: Sensitivity Analysis for Quantitative Change-impact Prediction


  1. SENSA: Sensitivity Analysis for Quantitative Change-impact Prediction
  Haipeng Cai, Siyuan Jiang, Raul Santelices, Ying-jie Zhang*, Yiji Zhang
  University of Notre Dame, USA; *Tsinghua University, China
  Supported by ONR Award N000141410037
  SCAM 2014

  2. What we do
  [Diagram] Given a program base with methods M1, M2, M3, M4, M5 and a candidate change location (M2), predictive dynamic change-impact analysis (CIA) produces a predicted impact set (M1, M2, M3, M5).
  - Challenge 1: coarse granularity (missing details)
  - Challenge 2: large impact sets (incurring prohibitive inspection costs)

  3. What we do
  Same scenario as the previous slide, now with our solutions:
  - Challenge 1: coarse granularity (missing details). Solution: statement-level analysis
  - Challenge 2: large impact sets (incurring prohibitive inspection costs). Solution: prioritize change impacts

  4. Technique: overview
  [Workflow diagram] Inputs: a program P, a candidate change location, and a test suite. SENSA instruments P, applies sensitivity analysis to produce execution histories, and applies execution differencing to them to produce quantified impacts.

  5. Technique: sensitivity analysis
  [Workflow diagram] Static phase: the SENSA instrumenter takes the program and the candidate change location and produces an instrumented program. Runtime phase: the instrumented program yields the original execution plus modified executions 1 through N, in which the value computed at the change location is replaced.
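To make the runtime phase concrete, here is a minimal Python sketch of the modify-and-rerun loop, not SENSA's real implementation (the tool itself works on instrumented Java programs). The helper run_instrumented and the value ranges are assumptions for illustration; the two modifier factories only mimic the spirit of the SENSA-RAND and SENSA-INC strategies mentioned later in the deck.

    # Illustrative sketch only: SENSA instruments real programs; this Python
    # code just mimics the modify-and-rerun loop described on the slide.
    import itertools
    import random

    def make_rand_modifier(low=-1000, high=1000):
        # Assumed SENSA-RAND-like strategy: replace the value computed at the
        # candidate change location with a random value (range is made up here).
        return lambda original_value: random.randint(low, high)

    def make_inc_modifier():
        # Assumed SENSA-INC-like strategy: replace the value with the original
        # plus a steadily increasing offset.
        counter = itertools.count(1)
        return lambda original_value: original_value + next(counter)

    def sensitivity_analysis(run_instrumented, test_suite, modifier, n_runs):
        # run_instrumented(test, modifier) is a hypothetical helper: it runs
        # the instrumented program on one test, applying `modifier` (if any)
        # to the value computed at the candidate change location, and returns
        # the execution history (a trace of executed statements and values).
        histories = []
        for test in test_suite:
            original = run_instrumented(test, modifier=None)
            modified = [run_instrumented(test, modifier=modifier)
                        for _ in range(n_runs)]
            histories.append((original, modified))
        return histories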

  6. Technique: execution differencing
  [Example] The original and modified execution histories are compared statement by statement, together with the values each statement computes. For a change at statement 6, statement 6 computes True in the original run but False in the modified run, and statements 7 and 17 appear in only one of the two runs; statements whose occurrences and values match (e.g., 20, 11, 12, 4) are unaffected. Result: the impact set for statement 6 is {6, 7, 17}.
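Below is a minimal sketch of this comparison, assuming an execution history is a list of (statement, value) pairs in execution order. It compares, per statement, the sequence of values produced in each run; the actual differencing used by SENSA is more careful about alignment and occurrences, so treat this only as an illustration of the idea.

    from collections import defaultdict

    def value_sequences(history):
        # history is assumed to be a list of (statement_id, value) pairs in
        # execution order; collect the sequence of values per statement.
        sequences = defaultdict(list)
        for statement, value in history:
            sequences[statement].append(value)
        return sequences

    def execution_differencing(original_history, modified_history):
        # A statement is reported as impacted when the sequence of values it
        # produces differs between the original and the modified execution
        # (including executing in only one of the two runs).
        original = value_sequences(original_history)
        modified = value_sequences(modified_history)
        return {statement
                for statement in set(original) | set(modified)
                if original.get(statement) != modified.get(statement)}

    # With data shaped like the slide's example (statement 6 computes True in
    # the original run but False in the modified one, and statements 7 and 17
    # execute in only one run), the impact set for statement 6 is {6, 7, 17}.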

  7. How SENSA works
  [Example] Execution differencing compares the original execution against multiple modified executions; impact quantification then counts how often each statement is impacted. For the running example (only one modified execution), the quantified impact set assigns impact frequency 1 to statements 6, 7, and 17 and frequency 0 to the remaining statements (e.g., 2, ..., 21).
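A minimal sketch of the quantification step, assuming each modified execution has already been reduced to an impact set by the differencing step. The relevance score used here (the fraction of modified executions that impact a statement) illustrates the frequency idea on the slide and is not necessarily SENSA's exact formula.

    from collections import Counter

    def quantify_impacts(impact_sets):
        # impact_sets: one set of impacted statements per modified execution.
        counts = Counter()
        for impact_set in impact_sets:
            counts.update(impact_set)
        total = len(impact_sets)
        # Score each statement by the fraction of modified executions that
        # impacted it, then rank statements from most to least relevant.
        scores = {stmt: counts[stmt] / total for stmt in counts}
        return sorted(scores.items(), key=lambda item: item[1], reverse=True)

    # For the slide's example with a single modified execution,
    # quantify_impacts([{6, 7, 17}]) scores statements 6, 7, and 17 at 1.0;
    # statements never impacted (e.g., 2 and 21) are simply not listed.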

  8. Subject programs and statistics
  - Schedule1 (priority scheduler): 290 lines of code, 2,650 tests, 7 changes
  - NanoXML (XML parser): 3,521 lines of code, 214 tests, 7 changes
  - XML-Security (encryption library): 22,361 lines of code, 92 tests, 7 changes
  - Ant (Java project build tool): 44,862 lines of code, 205 tests, 7 changes

  9. Experimental methodology
  [Workflow diagram] For each program, test suite, and statement S: SENSA produces the quantified impacts of S, while applying an actual change at S and computing its actual impacts yields the ground truth. The two impact sets are compared and the metrics are computed.

  10. Experimental methodology
  Same workflow as the previous slide, with a note: the actual change, the actual-impact computation, the impact-set comparison, and the metrics computation are only for evaluation; they are not part of SENSA.

  11. Experimental methodology
  Metrics:
  - Effectiveness: inspection effort, measured as a percentage of the worst-case inspection cost (a minimal sketch of this metric follows this slide)
  - Cost: computation time
  Two variants evaluated: SENSA-RAND and SENSA-INC
  Compared against: static slicing, dynamic slicing, and the ideal case
  - Ideal case: the best prediction possible, i.e., using the actual impact set as the prediction result
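The slides do not spell out the exact formula for inspection effort, so the following is only one plausible reading: the average rank at which the actually impacted statements are found in the ranked prediction, expressed as a percentage of the worst case of inspecting the whole ranking. Treat the function and its inputs as illustrative assumptions, not the paper's definition.

    def inspection_effort(ranked_statements, actual_impacts):
        # ranked_statements: prediction, ordered from most to least relevant.
        # actual_impacts: the ground-truth set of impacted statements.
        # Returns the average 1-based rank at which actual impacts are found,
        # as a percentage of the worst case (inspecting the entire ranking).
        ranks = [ranked_statements.index(stmt) + 1
                 for stmt in actual_impacts if stmt in ranked_statements]
        if not ranks:
            return 0.0
        average_rank = sum(ranks) / len(ranks)
        return 100.0 * average_rank / len(ranked_statements)

    # If the actual impacts sit near the top of the ranking, the effort is
    # small (close to the ideal case); a random ordering hovers around 50%
    # of the worst-case cost.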

  12. Results: inspection effort
  [Bar chart] Inspection effort (0% to 60% of the worst-case cost) for the ideal case, static slicing, dynamic slicing, SENSA-RAND, and SENSA-INC on Schedule1, NanoXML, XML-Security, Ant, and overall.

  13. Results: computation time
  - Schedule1: static analysis 6 sec, instrumented run 4,757 sec, post-processing 1,054 sec
  - NanoXML: static analysis 17 sec, instrumented run 773 sec, post-processing 10 sec
  - XML-Security: static analysis 179 sec, instrumented run 343 sec, post-processing 21 sec
  - Ant: static analysis 943 sec, instrumented run 439 sec, post-processing 7 sec
  Static analysis and post-processing cost little time; the runtime cost dominates the total cost and comes from the multiple modified executions. It can be greatly reduced by executing all modifications in parallel.

  14. Results: computation time
  (Same data as the previous slide.)
  Static analysis and post-processing cost little time; the runtime cost dominates the total cost and comes from the multiple modified executions. This part is highly parallelizable: it can be greatly reduced by executing all modifications in parallel (a minimal sketch follows).
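Since each modified execution is independent, the parallelization suggested on the slide can be sketched with a process pool. run_one_modification below is a hypothetical wrapper around a single instrumented, modified run and is not part of SENSA.

    from concurrent.futures import ProcessPoolExecutor

    def run_modifications_in_parallel(run_one_modification, modifications):
        # run_one_modification(modification) is a hypothetical callable that
        # performs one instrumented, modified execution and returns its
        # execution history. The modified runs are independent, so they can
        # be distributed across processes (or machines) to cut the dominant
        # runtime cost shown in the table above.
        with ProcessPoolExecutor() as pool:
            return list(pool.map(run_one_modification, modifications))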

  15. Conclusion
  Contributions:
  - A novel approach to quantifying dependencies and, based on it, a quantitative dynamic impact-prediction technique
  - An empirical study showing that the new approach is significantly more effective than slicing, at reasonable cost
  Future work:
  - Expand the study with more subjects and more types of changes
  - Apply the dependence-quantification approach to tasks other than impact analysis



  18. Controversial statements
  - Test-suite augmentation is irrelevant to alleviating the limitation of dynamic analysis that the execution set used does not fully represent the program's behavior.
  - Quantitative dependence analysis is more effective than traditional, non-quantified dependence analysis.
