

  1. THeME: A System for Testing by Hardware Monitoring Events
     Kristen R. Walcott-Justice (kwalcott@uccs.edu), University of Colorado - Colorado Springs
     Jason Mars (jom5x@cs.virginia.edu), University of California - San Diego
     Mary Lou Soffa (soffa@cs.virginia.edu), University of Virginia
     ISSTA 2012, July 17, Minneapolis, MN
     Wednesday, July 18, 12

  2. Developing Reliable Software
     Software lifecycle: Requirement → Design and Implementation → Testing → Release → Maintenance (patching, bug fixes)
     Measuring test quality today requires recompilation and incurs high run-time overheads and large code growth.

  3. Expense of Traditional Test Coverage Analysis
     Instrumentation (a probe that jumps to a payload) answers "Will B1 execute? Will B2 execute?" by recording each branch as it executes.
     Branch analysis overheads: time 10%-30%, code growth 60%-90%.
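The probe/payload idea on the slide above can be sketched as a toy source-level model (hypothetical code for illustration, not how real instrumenters such as gcov insert probes at compile time): each branch edge gets a probe whose payload records the edge in a coverage table.

```python
# Toy model of probe-based branch coverage instrumentation (illustrative only).

coverage = {}  # branch-edge id -> executed?

def probe(edge_id):
    """The 'probe' transfers control to a 'payload' that records the edge."""
    coverage[edge_id] = True  # payload: mark this branch edge as executed

def classify(n):
    if n % 2 == 0:        # conditional with two edges: B1 (taken), B2 (fall-through)
        probe("B1")
        return "even"
    else:
        probe("B2")
        return "odd"

classify(4)
print(coverage)  # only B1 has executed so far
classify(7)
print(coverage)  # now both edges are covered
```

The per-edge table lookup and store is exactly the kind of payload whose repeated execution produces the 10%-30% time and 60%-90% size overheads cited above.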

  4. Efficient Program Monitoring
     Optimization, profiling, and race detection have traditionally relied on software-level monitoring; hardware monitoring offers an alternative.

  6. What is a Hardware Mechanism?
     [Diagram: the hardware periodically captures samples of system state; software retrieves each sample with read().]

  7. Using Hardware Mechanisms
     • Developed for operating system performance analysis
     • Widely available on nearly all processors
     • Low overhead: short setup time (318µs), quick read time (3.5µs)
     • Samples can estimate profiles and reveal program execution behavior
     • Removes the need for instrumentation

  8. Hardware Mechanisms in Testing: Goals and Challenges
     • Structural testing requires more exact data than profiling
     • Can we capture ALL events with which we are concerned?
     • Can we capture ONLY the events with which we are concerned?
     • Tradeoff: amount of information collected vs. overhead of sampling
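The tradeoff on this slide can be illustrated with a small simulation (synthetic numbers of my own, not the paper's data): sampling a branch trace less often means lower overhead, but fewer distinct branches are ever observed.

```python
import random

def sample_trace(trace, period):
    """Keep every `period`-th event, as an interrupt-driven sampler would."""
    return trace[period - 1 :: period]

random.seed(0)
# Synthetic trace: 100,000 branch events drawn from 200 distinct branches,
# skewed so that a few branches are hot (as in real programs).
branches = list(range(200))
weights = [1.0 / (i + 1) for i in branches]  # Zipf-like skew
trace = random.choices(branches, weights=weights, k=100_000)

for period in (10, 100, 1000):
    seen = set(sample_trace(trace, period))
    print(f"period {period:>4}: {len(seen)} of {len(set(trace))} branches observed")
```

Shorter periods approach capturing ALL branches of interest but take many more samples; longer periods are cheap but miss the cold branches that structural testing cares about.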

  9. THeME: Testing by Hardware Monitoring Events
     Components: Branch Sampler and Program Modification → Hardware Sampling/Monitoring → Coverage Calculation

  13. Branch Vector Recording: Last Branch Record (LBR)
     • Hardware mechanism for partial branch profiling, intended for OS performance analysis and debugging
     • Tracks a set of executed branches: each entry records a branch source and destination
     • A sample is a set of branches: a "branch vector" of at most 16 branches
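The LBR's behavior can be modeled as a fixed-size ring buffer of (source, destination) pairs. This is only a sketch of the semantics; the real registers are read through tools such as libpfm4 or perf, not from user code like this.

```python
from collections import deque

class LastBranchRecord:
    """Toy model of an LBR: keeps only the last `size` taken branches."""

    def __init__(self, size=16):          # Intel Core i7: 16 entries
        self.stack = deque(maxlen=size)   # oldest entries are overwritten

    def record(self, source, destination):
        self.stack.append((source, destination))

    def sample(self):
        """A sample is the current branch vector (at most `size` branches)."""
        return list(self.stack)

lbr = LastBranchRecord()
for src, dst in [(0x400A10, 0x400B00), (0x400B08, 0x400A20)]:
    lbr.record(src, dst)
print(lbr.sample())          # the two branches recorded so far
for i in range(20):          # overflow: only the newest 16 survive
    lbr.record(i, i + 1)
print(len(lbr.sample()))
```

The overwrite-on-overflow behavior is why each sample yields at most 16 branches and why coverage must be accumulated across many samples.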


  15. Enabling Fall-through Visibility
     Challenge: hardware branch monitors see only one of a conditional branch's two outgoing edges (the taken edge, not the fall-through).
     • Possible methods: supplement with more samples; use static analysis to infer branches; minor program modification
     • Our solution: insert innocuous unconditional jumps, so that executing a fall-through path produces a recordable branch
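The transformation can be sketched on a toy CFG representation (hypothetical data structure of my own, not THeME's actual implementation): the conditional's fall-through edge is redirected through a fresh block that ends in an unconditional jump, which the branch recorder can observe.

```python
# Toy CFG: block -> (jump_target, fallthrough_target); None = no such edge.
cfg = {
    1: (2, 3),        # conditional: jump to block 2, fall through to block 3
    2: (4, None),
    3: (4, None),
    4: (None, None),
}

def make_fallthrough_visible(cfg, block):
    """Redirect `block`'s fall-through edge through a fresh block that ends
    in an innocuous unconditional jump, which the LBR can record."""
    jump, fallthrough = cfg[block]
    new_block = max(cfg) + 1
    cfg[new_block] = (fallthrough, None)   # unconditional jump to old target
    cfg[block] = (jump, new_block)

make_fallthrough_visible(cfg, 1)
print(cfg[1])   # (2, 5): fall-through now reaches block 3 via new block 5
print(cfg[5])   # (3, None): the inserted unconditional jump
```

Execution of the fall-through path now produces the recordable jump 5 → 3, so the edge is no longer invisible to the hardware, at the cost of the small time and size increases measured in the results slides.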


  20. Improving Branch Coverage
     • Sampling → some data is missed
     • Goal: improve coverage using static analysis
     • Dominator analysis: associate observed branches with the control flow graph
     • Branch b executed → branch c that dominates it also executed
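Dominator-based inference can be sketched as follows (a simplified model; THeME operates on machine-level CFGs): compute each node's dominators with the standard iterative dataflow algorithm, then mark every dominator of an observed node as executed too.

```python
def dominators(cfg, entry):
    """Iterative dataflow: dom(n) = {n} | intersection of dom(p) over preds(n)."""
    nodes = set(cfg)
    preds = {n: set() for n in nodes}
    for n, succs in cfg.items():
        for s in succs:
            preds[s].add(n)
    dom = {n: set(nodes) for n in nodes}   # start from the full node set
    dom[entry] = {entry}
    changed = True
    while changed:
        changed = False
        for n in nodes - {entry}:
            if not preds[n]:
                continue  # unreachable node
            new = {n} | set.intersection(*(dom[p] for p in preds[n]))
            if new != dom[n]:
                dom[n] = new
                changed = True
    return dom

# Small example CFG: 1 -> 2 -> {5, 7}; 5 -> 8; 7 -> 11; 8 -> 11
cfg = {1: [2], 2: [5, 7], 5: [8], 7: [11], 8: [11], 11: []}
dom = dominators(cfg, 1)

# A sample showed that node 8 executed; all of its dominators must have too.
observed = {8}
covered = set().union(*(dom[n] for n in observed))
print(sorted(covered))  # [1, 2, 5, 8]
```

One observed branch thus "pays for" every edge that dominates it, recovering some of the coverage lost to sampling at no runtime cost.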

  21. Experiment and System Design
     • Intel Core i7 860 quad-core processor, LBR size of 16 branches
     • Linux 2.6.34; hardware access tools: libpfm4 (user level), perf (kernel level)
     • SPEC2006 C benchmarks
     • Metrics: efficiency (time), code growth (size), effectiveness (branch coverage)
     • Instrumented vs. hardware monitoring

  22. Results: Enabling Fall-Through Visibility
     Impact: increases time overhead and code growth. How does it compare to instrumentation?
     Time overhead:
     Benchmark    Branch Cov.  Time (s)  Mod. Time (s)  Instr. Time (s)
     bzip2        64.20%       1499      1514           1599
     h264ref      35.72%       1753      1786           1890
     libquantum   39.07%       1056      1178           1236
     mcf          74.01%        529       539            575
     sjeng        48.87%       1028      1162           1312
     Average increase over base time: Mod. ~5%, Instr. ~14%

  23. Results: Enabling Fall-Through Visibility
     Code growth:
     Benchmark    Native Size (kB)  Mod. % Increase  Instr. % Increase
     bzip2         260              1.52             32.65
     h264ref      2892              0.69             18.39
     libquantum    208              0                20.00
     mcf           128              0                17.95
     sjeng         592              0.67             30.05
     Average increase: Mod. ~0.5%, Instr. ~24%
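The reported averages can be approximately reproduced from the table values. This recomputation is mine, not the authors'; rounding and the exact averaging method chosen by the authors explain the small discrepancies.

```python
# Per benchmark, from the two tables above:
# (base time s, modified time s, instrumented time s,
#  modified growth %, instrumented growth %)
data = {
    "bzip2":      (1499, 1514, 1599, 1.52, 32.65),
    "h264ref":    (1753, 1786, 1890, 0.69, 18.39),
    "libquantum": (1056, 1178, 1236, 0.00, 20.00),
    "mcf":        ( 529,  539,  575, 0.00, 17.95),
    "sjeng":      (1028, 1162, 1312, 0.67, 30.05),
}

def avg(xs):
    return sum(xs) / len(xs)

mod_time = avg([(m - b) / b * 100 for b, m, i, *_ in data.values()])
instr_time = avg([(i - b) / b * 100 for b, m, i, *_ in data.values()])
mod_growth = avg([g for *_, g, _ in data.values()])
instr_growth = avg([g for *_, g in data.values()])

print(f"time: modified +{mod_time:.1f}%, instrumented +{instr_time:.1f}%")
print(f"size: modified +{mod_growth:.2f}%, instrumented +{instr_growth:.1f}%")
```

The arithmetic means come out near the slides' ~5%/~14% time and ~0.5%/~24% size figures, confirming the roughly 3x to 40x gap between the modified binaries and full instrumentation.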

  24. Results: Testing on a Single Core - Effectiveness
     [Chart: branch coverage results on a single core.]

  26. Results: Testing on a Single Core - Efficiency
     [Chart: percent time overhead of the interrupt-driven approach on ref inputs, for sample periods of 500K, 1M, 5M, 10M, and 50M, across bzip2, h264ref, libquantum, mcf, and sjeng; overheads range from about -10% to 60%.]

  27. Results: Better Coverage at High Sample Rates
     [Chart; annotated coverage values: 71%, 72%, and up to 90%.]

  31. Results: Testing on Multiple Cores - Efficiency
     [Chart: percent time overhead when splitting inputs across cores, for sample periods of 500K to 50M, on bzip2 and h264ref; overheads range from about -20% to 40%.]

  32. Hardware Monitoring Benefits
     • Low-overhead, effective branch testing technique: up to 90% of branch coverage, ~2% time improvement, 0.5% code growth (compared to 60%-90% for instrumentation)
     • Test coverage approximation: testing on resource-constrained devices, "imprecise" tasks (e.g., regression test prioritization)
     • Partial program monitoring with significant benefits: enables testing on resource-constrained devices, generates a full picture of program execution

  33. Conclusions and Future Work
     • Extensible, portable system for single or multiple cores
     • Up to 11.13% improvement in time overhead; up to 90% of the coverage reported by instrumentation
     • Reduced time overhead (~2%) and negligible code growth
     • Future work: combine hardware monitoring with limited instrumentation; implement on a resource-constrained device; extend the system to other coverage metrics

  34. Thank You! Website: http://www.cs.virginia.edu/walcott Questions?
