

  1. How to Evaluate and Select a High-End Load Testing Tool
     Marquis Harding, Reality Test
     International Conference On Software Test Automation
     March 5-8, 2001, San Jose, CA, USA
     Session E6, Thursday, March 8, 2001, 11:30 AM

  2. A Methodology for Evaluating Alternative Load Testing Tools
     Marquis Harding, Reality Test

  3. The Selection Problem
     • Tool selection is a difficult choice
     • Many alternatives
     • Costly
     • Long evaluation period
     • No standard evaluation method
     • No standard evaluation criteria

  4. Agenda
     • Tool evaluation methodology
     • The experiment
     • The results
     • Technical environment
     • Technical skill set
     • Customer evaluation
     • Environment details
     • Evaluation methodology
     • Results

  5. What Is the Objective?
     • Predict, diagnose, and correct problems in the system under test (SUT) before deployment.
     [Chart: response time vs. number of users, marking the regions of incorrect behavior and unacceptable performance]

  6. What Tool Characteristics Matter?
     • Must scale on production-equivalent hardware
     • Must accurately represent the real workload
     • Must be maintainable and repeatable when SUT changes are tested
     • Must be cost effective
     [Charts: response time vs. users for the current SUT performance and for the reconfigured performance]

  7. Tool Evaluation - The Experiment
     • Tool evaluation is an experiment
     • You need to: gather information, identify materials, identify the methodology, identify metrics, execute, and analyze
     • The experiment must be repeatable: refresh the database, refresh the logs, reset (a harness sketch follows below)
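The repeatable loop above can be made concrete with a minimal harness sketch. Everything here is illustrative: restore_database(), clear_logs(), run_scenario(), and the tool names are hypothetical stand-ins to be wired to your own refresh mechanism and load tool.

```python
"""Repeatable evaluation harness (illustrative sketch only)."""
import time

def restore_database() -> None:
    # Restore the staged backup so every run starts from identical data.
    print("restoring staged database backup...")

def clear_logs() -> None:
    # Reset server and tool logs so results from separate runs don't mix.
    print("clearing logs...")

def run_scenario(tool: str, users: int) -> float:
    # Launch the load tool for `users` virtual users; return elapsed seconds.
    start = time.time()
    print(f"{tool}: driving {users} virtual users...")
    return time.time() - start

for tool in ("Tool 1", "Tool 2"):
    for users in (1, 50, 150, 300):   # the user loads used in this experiment
        restore_database()            # identical starting state every run
        clear_logs()
        elapsed = run_scenario(tool, users)
        print(f"{tool} @ {users} VUs finished in {elapsed:.1f} s")
```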

  8. Information Gathering
     • Vendor web sites
     • Vendor literature packs
     • Local user groups
     • Internet resources
     • Customer references

  9. Identify Materials
     • Materials required: the tool, the target system, a refresh mechanism, monitoring tools, analysis tools, and time
     • And most importantly: technical support and management support

  10. Determine Methodology
      • Determine the functions to test
      • One to three (or more) representative scenarios
      • Start with a read-only scenario, then add inserts and updates
      • Vary complexity
      • Create input data
      • Consider security
      • You can't test everything

  11. Determine Metrics
      • Quantitative metrics: memory usage, CPU usage
      • Qualitative metrics: ease of use, recording process, scripting, reporting, protocol support (a weighted-scoring sketch follows below)
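Qualitative criteria can still be compared systematically. A minimal weighted-scoring sketch follows; the weights and scores are invented for illustration and are not from the experiment.

```python
# Weighted scoring of qualitative criteria. All weights and scores are
# invented for illustration; substitute your own evaluation data.
weights = {"ease of use": 0.30, "recording": 0.20, "scripting": 0.20,
           "reporting": 0.15, "protocol support": 0.15}
scores = {  # 1 (poor) to 5 (excellent), per tool
    "Tool 1": {"ease of use": 4, "recording": 5, "scripting": 4,
               "reporting": 4, "protocol support": 3},
    "Tool 2": {"ease of use": 3, "recording": 3, "scripting": 4,
               "reporting": 2, "protocol support": 4},
}
for tool, s in scores.items():
    total = sum(weights[c] * s[c] for c in weights)
    print(f"{tool}: weighted score {total:.2f} / 5")
```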

  12. Executing the Test
      • Some things to consider: network load (day vs. night), system load, and the stress imposed by the measurement tools themselves
      • The test must be repeatable: refresh the database, refresh the logs, reset (a database-refresh sketch follows below)
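One way to script the database refresh is sketched below. The server name, database name, and backup path are placeholders, and the T-SQL targets modern SQL Server via sqlcmd; the experiment's SQL Server 6.5 would have used ISQL or Enterprise Manager for the same job.

```python
# Restore a staged backup before each run so every test starts from the
# same data. All names and paths are placeholders.
import subprocess

RESTORE_SQL = (
    "RESTORE DATABASE ProjectX "
    r"FROM DISK = 'D:\staged\projectx_baseline.bak' WITH REPLACE"
)

subprocess.run(["sqlcmd", "-S", "testserver", "-Q", RESTORE_SQL], check=True)
```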

  13. Analyze Results
      • Validate the run: invalid return results, dropped connections
      • Examine the timing data: tool data, external reporting data

  14. Technical Environment
      • Ample supply of driver machines
      • As much hard drive storage space as possible: keep staged database backups/dump files; keep all result files
      • Ample memory: budget 3 MB per VU (see the worked example below)
      • Double your worst-case time estimate:
        • Every error, omission, and oversight costs an hour
        • Server response times slow with additional users
        • User log-on time grows exponentially
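A worked example of the 3 MB-per-VU budget, using the 300-user load from this experiment; the OS-overhead figure is an assumption added for illustration.

```python
# Memory budgeting for driver machines at the 3 MB-per-VU rule of thumb.
MB_PER_VU = 3
virtual_users = 300                     # the largest load in this experiment
needed_mb = virtual_users * MB_PER_VU   # 900 MB for virtual users alone
driver_ram_mb = 128                     # e.g. the Pentium 166 driver machine
os_overhead_mb = 64                     # assumed headroom for OS and tool agent
vus_per_driver = (driver_ram_mb - os_overhead_mb) // MB_PER_VU
print(f"{virtual_users} VUs need ~{needed_mb} MB across the driver machines")
print(f"a {driver_ram_mb} MB driver holds ~{vus_per_driver} VUs at this budget")
```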

  15. Tool Implementation
      • Technical skill set: project management, networking, system-under-test architecture, Unix, HTTP, tool knowledge (PerformanceStudio), SQL, statistics, business processes, Windows NT, database management

  16. Customer Evaluation
      • After information gathering, the decision came down to evaluating performance testing tools on a real production system!
      • Good management support
      • Fair technical support
      • Other measurement aids: MS Perfmon (Windows NT), SQL Trace (SQL Server), WinDiff

  17. The Experiment
      • Project X: a SQL Server-driven application for a customer, used to track user maintenance
      • Evaluate performance testing tools on a real production system
      • All tools were shipping versions
      • Qualitative and quantitative analysis

  18. Project Time
      Phase         Total time elapsed   Active time spent
      Preparation   2 months             2 weeks
      Execution     6 days               5 days
      Analysis      5 days               5 days

  19. Technical Environment - Recording Environment
      • Application: customer service
      • Client: Gateway Pentium 200, Windows NT Server
      • Server: SQL Server 6.5 on a Dell Pentium II 450, 512 MB RAM
      • Tools: current shipping versions

  20. The Recording Process
      For a fair evaluation, the scripts had to be IDENTICAL.
      • Three scenarios identified: two focused on specific areas of concern, one a complex business process
      • The complex business process scenario was dropped: it proved redundant (the first two yielded sufficient data), and the script was complex enough that the additional effort was not justified

  21. The Recording Process, Cont.
      • Recorded the original scripts with one tool
      • Used tool-specific recording to capture the same workload with the other tool: played back one instance of the original script and captured the transactions
      • Both scripts were edited for data correlation
      • The tool output and the SQL Trace output were analyzed with WinDiff to ensure they were exactly the same (a diff sketch follows below)
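WinDiff did this comparison in the original experiment; a minimal stand-in using Python's difflib is sketched here. The file names are placeholders.

```python
# Compare two trace captures line by line, as WinDiff was used to verify
# that both tools drove identical SQL against the server.
import difflib
from pathlib import Path

tool_trace = Path("tool1_trace.txt").read_text().splitlines()
sql_trace = Path("sqltrace_capture.txt").read_text().splitlines()

diff = list(difflib.unified_diff(tool_trace, sql_trace,
                                 fromfile="tool", tofile="sqltrace",
                                 lineterm=""))
if diff:
    print("\n".join(diff))
else:
    print("traces are identical")
```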

  22. The Execution Process
      • Executed four tests: user loads of 1, 50, 150, and 300 virtual users
      • Scheduling difficulties: Tool 1's scheduling features supported random events, complex logon patterns, and user profiling
      • 5 days to execute, generally off hours

  23. Execution Hardware
      Driver machines:
        Gateway Pentium 166 MHz, 128 MB
        Gateway Pentium II 233 MHz, 256 MB
        Dell Pentium II 450 MHz, 512 MB
        Dell Pentium II 450 MHz, 512 MB
        Dell Pentium II 450 MHz, 512 MB
      Controller:
        Gateway Pentium II 233 MHz, 256 MB

  24. Analysis - Quantitative Results
      • Used NT Performance Monitor
        • Memory metric: Available Bytes
        • Processor metric: % Processor Time
      • Used SQL Trace to analyze the database and verify that all tools performed the same work (a sampling sketch follows below)
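On the original NT setup these counters came from Performance Monitor; a present-day sketch of the same sampling uses the third-party psutil package (an assumption, not part of the original toolkit).

```python
# Sample the same two quantities tracked with NT Performance Monitor:
# available memory (Available Bytes) and CPU load (% Processor Time).
# Requires: pip install psutil
import psutil

for _ in range(5):                           # five one-second samples
    mem = psutil.virtual_memory()
    cpu = psutil.cpu_percent(interval=1.0)   # % Processor Time analogue
    print(f"available: {mem.available / 2**20:.0f} MB  cpu: {cpu:.0f}%")
```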

  25. Tool 1 Processor & Memory Stats
      Dell Pentium II 450 MHz, 512 MB, 60 virtual users, Script 1
      [Charts: memory and processor utilization over the run]
      Average footprint: 1.60 MB/VU
      Average processor utilization: [value shown in chart]

  26. Tool 1 Processor & Memory Stats
      Gateway Pentium 166 MHz, 128 MB, 60 virtual users, Script 2
      [Charts: memory and processor utilization over the run]
      Average footprint: 0.52 MB/VU
      Average processor utilization: [value shown in chart]
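The footprint figures follow from the drop in available memory during a run. A worked illustration: the before/after values below are invented, while the 60-VU count and the 1.60 MB/VU result come from the slides.

```python
# Per-VU footprint = drop in available memory / number of virtual users.
# The before/after values are hypothetical; 60 VUs matches the slides.
virtual_users = 60
avail_before_mb = 350.0   # Available Bytes (in MB) before the run
avail_after_mb = 254.0    # Available Bytes (in MB) under full load
footprint = (avail_before_mb - avail_after_mb) / virtual_users
print(f"{footprint:.2f} MB/VU")   # -> 1.60 MB/VU, Script 1's figure
```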

  27. Tool 1 and SQL Server Statistics
      Dell Pentium II 450 MHz, 512 MB
      [Chart: logon statistics]

  28. Different Log-on Emulation
      • Tool 2 may not accurately emulate connections for the SUT
      • Tool 1 emulates connections as they were recorded

  29. Surprising Differences!
      Tool 1 found database locking, verified as a real problem by real-user testing, thanks to:
      • Accurate connection modeling
      • Accurate pacing

  30. Analysis - Qualitative Results
      • Ease of use
      • Script length using Tool 1: Script 1, 2,715 lines; Script 2, 2,032 lines; average, 2,374 lines
      • Script development time: Tool 1, 2 days per script
        Note: knowledge gained by scripting in other tools saved scripting time.

  31. Analysis - Qualitative Results
      • Features worth mentioning:
        • Data-smart recording
        • Network recording
        • Script splitting
        • Accurate script pacing
        • Timing of individual commands
        • Accurate connection emulation
        • Complex scheduling
        • On-line monitoring
        • Server error handling
        • Detailed reporting
        • Shared memory: the ability to pass information between virtual users
        • Support mechanism

  32. Lessons Learned
      • Tool choice matters!
      • Performance testing works!
        • Revealed application architecture deficiencies
        • Found deadlocks
        • Found redundant database code
        • Determined optimization points
      • Be prepared
        • Time estimates
        • Double your hard drive space
        • Off-hours availability

  33. Marquis Harding
      Marquis Harding has over twenty-five years of information technology and software quality assurance experience. His background includes development and QA of large and mid-range mainframe, client/server, and Internet systems, as well as senior management of QA and testing for large companies spanning the financial, telecommunications, and software industries. Mark has presented at international conferences on software development and testing. Marquis is a disabled Vietnam veteran. While at Microsoft Corporation, he held the positions of Group Quality Assurance/Test Manager for Windows.com, Windows Update.com, and Microsoft.com, and Test Manager for IT Sales & Marketing/Product Support Services. In six years at Charles Schwab & Co., Inc., he held the positions of Senior Test Manager, ITG, as well as Development Manager for Schwab's Financial Advisor Services division. Prior to this, he had a seventeen-year career at Pacific Telesis, where he was employed as Manager of Information Technology Support for the CFO and Executive Vice President of Operations.
