Quality Assurance: Test Development & Execution
Developing Test Strategy

Ian S. King
Test Development Lead
Windows CE Base OS Team
Microsoft Corporation

Elements of Test Strategy

Where is your focus?
• Test specification? The customer.
• Test plan? The customer.
• Test harness/architecture? The customer.
• Test case generation? The customer.
• Test schedule? The customer.
• Schedule and budget? The customer.
Requirements feed into test design.

Test Specifications
• What factors are important to the customer?
  • Reliability vs. security
  • Reliability vs. performance
  • Features vs. reliability
  • Cost vs. ?
• What are the customer's expectations?
• How will the customer use the software?
• What questions do I want to answer about this code? Think of this as experiment design.
• In what dimensions will I ask these questions?
  • Functionality
  • Security
  • Reliability
  • Performance
  • Scalability
  • Manageability

Test specification: goals
• Design issues
  • Do you understand the design and goals?
  • Is the design logically consistent?
  • Is the design testable?
• Implementation issues
  • Is the implementation logically consistent?
  • Have you addressed potential defects arising from implementation?

Test specification: example
• CreateFile method (a worked sketch appears at the end of this page)
  • Should return a valid, unique handle for:
    • the initial 'open' for an appropriate resource
    • subsequent calls for a shareable resource
    • for files, should create the file if it doesn't exist
  • Should return a NULL handle and set an error indicator if the resource is:
    • a nonexistent device
    • inappropriate for the 'open' action
    • in use and not shareable
    • unavailable because of an error condition (e.g. no disk space)
  • Must recognize valid forms of resource name
    • Filename, device, ?

Challenges: Methods of delivering software
• Enterprise/data center
  • Traditional: hardware vendor was software vendor
  • Support usually explicit and structured
• Embedded systems
  • Software is shipped as a built-in component
  • Often doesn't "look like" computing technology
• "Shrink wrap"
  • Software is often installed by the end user
  • Goal: minimal involvement post-sale
• Online 'update' – subscription
  • Minimal user involvement – goal is transparency

Challenges: Enterprise/Data Center
• Usually requires 24x7 availability
• Full system test may be prohibitively expensive – a second data center?
• Management is a priority
  • Predictive data to avoid failure
  • Diagnostic data to quickly diagnose failure
  • Rollback/restart to recover from failure

Challenges: Embedded Systems
• Software may be "hardwired" (e.g. mask ROM)
• End user is not prepared for upgrade scenarios
• Field service or product return may be necessary
• End user does not see hardware vs. software
• End user may not see software at all
  • Who wrote your fuel injection software?
• N.B.: there's no one "on the ground"

Challenges: Shrink Wrap Software
• Software compatibility matrix
  • Operating systems
  • Dependencies (expected and unexpected)
  • Conflicts with other software
• Hardware configuration issues
  • Dependencies (expected and unexpected)
  • Resource conflicts
  • Completely unrelated weirdness
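To make the "Test specification: example" concrete, here is a minimal sketch of two of those checks against the Win32 CreateFileA API. The file and device names are hypothetical, and note that Win32 actually signals failure with INVALID_HANDLE_VALUE and GetLastError() rather than a NULL handle:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Spec: the initial 'open' of an appropriate resource returns a
           valid, unique handle; for files, OPEN_ALWAYS creates the file
           if it doesn't exist. */
        HANDLE h = CreateFileA("C:\\testdata\\sample.txt",
                               GENERIC_READ | GENERIC_WRITE, FILE_SHARE_READ,
                               NULL, OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
        printf("open/create: %s\n",
               h != INVALID_HANDLE_VALUE ? "PASS" : "FAIL");
        if (h != INVALID_HANDLE_VALUE) CloseHandle(h);

        /* Spec: a nonexistent device must fail and set an error indicator. */
        HANDLE bad = CreateFileA("\\\\.\\NoSuchDevice42", GENERIC_READ, 0,
                                 NULL, OPEN_EXISTING, 0, NULL);
        printf("nonexistent device: %s (error %lu)\n",
               bad == INVALID_HANDLE_VALUE ? "PASS" : "FAIL",
               (unsigned long)GetLastError());
        if (bad != INVALID_HANDLE_VALUE) CloseHandle(bad);
        return 0;
    }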

Trimming the matrix: risk analysis in test design
• It's a combinatorial impossibility to test it all
  • Example: eight modules that can be combined, at one hour per test of each combination
  • 8! = 40,320 orderings comes to roughly twenty person-years (40-hour weeks, 2 weeks' vacation)
• Evaluate test areas and prioritize based on:
  • Customer priorities
  • Estimated customer impact
  • Cost of test
  • Cost of potential field service

Test Plans
• How will I ask my questions? Think of this as the "Methods" section.
• Understand domain and range
• Establish equivalence classes
• Address domain classes:
  • Valid cases
  • Invalid cases
  • Boundary conditions
  • Error conditions
  • Fault tolerance/stress/performance

Test plan: goals
• Enables development of tests
• Proof of testability – if you can't design it, you can't do it
• Review: what did you miss?

Test plan: example
• CreateFile method (see the table-driven sketch at the end of this page)
• Valid cases
  • execute for each resource supporting the 'open' action
    • opening an existing device
    • opening an existing file
    • opening (creating) a nonexistent file
  • execute for each such resource that supports sharing
    • multiple method calls in separate threads/processes
    • multiple method calls in a single thread/process
• Invalid cases
  • nonexistent device
  • file path does not exist
  • in use and not shareable
  • invalid form of name
• Error cases
  • insufficient disk space
  • permissions violation
• Boundary cases
  • e.g. execute to/past the system limit on open device handles
  • device name at/past the name length limit (MAX_PATH)
• Fault tolerance
  • execute on a failed/corrupted filesystem
  • execute on a failed but present device

Performance testing
• Test for performance behavior
  • Does it meet requirements?
    • Customer requirements
    • Definitional requirements (e.g. Ethernet)
• Test for resource utilization
  • Understand resource requirements
• Test performance early
  • Avoid costly redesign to meet performance requirements

Security Testing
• Is data/access safe from those who should not have it?
• Is data/access available to those who should have it?
• How is privilege granted/revoked?
• Is the system safe from unauthorized control?
  • Example: denial of service
• Collateral data that compromises security
  • Example: network topology
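One way to realize the equivalence classes in "Test plan: example" is a table-driven test, one row per class. This is only a sketch under assumed paths, not a production harness; TUX or a commercial harness would add setup, teardown, and logging around the same table:

    #include <windows.h>
    #include <stdio.h>

    typedef struct {
        const char *name;        /* equivalence class being exercised */
        const char *path;        /* input handed to CreateFileA */
        DWORD       disposition; /* OPEN_EXISTING, CREATE_ALWAYS, ... */
        BOOL        expectOk;    /* should a valid handle come back? */
    } TestCase;

    static const TestCase cases[] = {
        { "valid: existing file",      "C:\\testdata\\exists.txt", OPEN_EXISTING, TRUE  },
        { "valid: create nonexistent", "C:\\testdata\\new.txt",    CREATE_ALWAYS, TRUE  },
        { "invalid: missing path",     "C:\\no\\such\\dir\\f.txt", OPEN_EXISTING, FALSE },
        { "invalid: bad device name",  "\\\\.\\NoSuchDevice",      OPEN_EXISTING, FALSE },
    };

    int main(void)
    {
        int failures = 0;
        for (size_t i = 0; i < sizeof(cases) / sizeof(cases[0]); i++) {
            HANDLE h = CreateFileA(cases[i].path, GENERIC_READ,
                                   FILE_SHARE_READ, NULL, cases[i].disposition,
                                   FILE_ATTRIBUTE_NORMAL, NULL);
            BOOL ok = (h != INVALID_HANDLE_VALUE);
            printf("%-28s %s\n", cases[i].name,
                   ok == cases[i].expectOk ? "PASS" : "FAIL");
            if (ok) CloseHandle(h);
            if (ok != cases[i].expectOk) failures++;
        }
        return failures; /* nonzero exit code flags a failing pass */
    }

Boundary and fault-tolerance rows (MAX_PATH-length names, failed media) extend the same table without touching the loop.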

Stress testing
• Working stress: sustained operation at or near maximum capability
  • Goal: resource leak detection (a minimal sketch appears at the end of this page)
• Breaking stress: operation beyond expected maximum capability
  • Goal: understand failure scenario(s)
  • "Failing safe" vs. unrecoverable failure or data loss

Globalization
• Localization
  • UI in the customer's language
  • German overruns the buffers
  • Japanese tests extended character sets
• Globalization
  • Data in the customer's language
  • Non-US values ($ vs. Euro, ips vs. cgs)
  • Mars Climate Orbiter: mixed metric and SAE units

Test Cases
• The actual "how to" for individual tests, in English
• Expected results
• One level deeper than the Test Plan
• Automated or manual?
• Environmental/platform variables

Test case: example
• CreateFile method (see the first sketch at the end of this page)
• Valid case: open an existing disk file with an arbitrary name and full path, file permissions allowing access
  • create directory 'c:\foo'
  • copy file 'bar' to directory 'c:\foo' from the test server; permissions are 'Everyone: full access'
  • execute CreateFile('c:\foo\bar', ...)
  • expected: non-null handle returned

Test Harness/Architecture
• Test automation is nearly always worth the time and expense
• How to automate?
  • Commercial harnesses
  • Roll-your-own (TUX)
  • Record/replay tools
  • Scripted harness
• Logging/evaluation

Test Schedule
• Phases of testing
  • Unit testing (may be done by developers)
  • Component testing
  • Integration testing
  • System testing
• Dependencies – when are features ready?
  • Use of stubs and harnesses
• When are tests ready?
  • Automation requires lead time
• The long pole – how long does a test pass take?
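The steps in "Test case: example" map almost one-for-one onto Win32 calls. In this sketch the test-server share name is a stand-in, and the 'Everyone: full access' ACL step is assumed to be handled by lab setup outside the code:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Step 1: create directory 'c:\foo' (tolerate an existing one). */
        if (!CreateDirectoryA("C:\\foo", NULL) &&
            GetLastError() != ERROR_ALREADY_EXISTS) {
            printf("SETUP FAIL: CreateDirectory (error %lu)\n",
                   (unsigned long)GetLastError());
            return 1;
        }

        /* Step 2: copy file 'bar' from the (hypothetical) test server. */
        if (!CopyFileA("\\\\testserver\\share\\bar", "C:\\foo\\bar", FALSE)) {
            printf("SETUP FAIL: CopyFile (error %lu)\n",
                   (unsigned long)GetLastError());
            return 1;
        }

        /* Step 3: the test itself - open the copied file. */
        HANDLE h = CreateFileA("C:\\foo\\bar", GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);

        /* Expected result: a valid (non-null) handle. */
        printf("CreateFile('c:\\foo\\bar'): %s\n",
               h != INVALID_HANDLE_VALUE ? "PASS" : "FAIL");
        if (h != INVALID_HANDLE_VALUE) CloseHandle(h);
        return 0;
    }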
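And for the working-stress goal of resource leak detection, one simple sketch is a sustained open/close loop bracketed by process handle counts; GetProcessHandleCount requires Windows XP or later, and the target file is again hypothetical:

    #define _WIN32_WINNT 0x0501  /* for GetProcessHandleCount */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DWORD before = 0, after = 0;
        GetProcessHandleCount(GetCurrentProcess(), &before);

        /* Sustained operation at a high rate; a real stress pass would
           run for hours rather than a fixed iteration count. */
        for (int i = 0; i < 100000; i++) {
            HANDLE h = CreateFileA("C:\\testdata\\stress.txt", GENERIC_READ,
                                   FILE_SHARE_READ, NULL, OPEN_EXISTING,
                                   FILE_ATTRIBUTE_NORMAL, NULL);
            if (h != INVALID_HANDLE_VALUE)
                CloseHandle(h); /* omit this line and the leak shows below */
        }

        GetProcessHandleCount(GetCurrentProcess(), &after);
        printf("handles before=%lu after=%lu %s\n",
               (unsigned long)before, (unsigned long)after,
               after > before + 10 ? "possible leak" : "OK");
        return 0;
    }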

Where The Wild Things Are: Challenges and Pitfalls
• "Everyone knows" – hallway design
• "We won't know until we get there"
• "I don't have time to write docs"
• Feature creep/design "bugs"
• Dependency on external groups
