1. VALIDATION

• The goal of validation is to judge the quality of the software from the user's point of view; e.g., reliability.
• The goal of all validation techniques is:
  – to reveal failures,
  – to localize the faults that caused the failures, and
  – ultimately, to correct the faults,
  and thereby to achieve the highest possible confidence that the component conducts itself according to its specification.
• Validation can be considered an activity both to improve and to evaluate the software. However, the quality of poorly understood software cannot be improved via validation.

2. DIFFERENT TECHNIQUES

(A) Execution-based validation (testing)
    (A1) Black-box (functional) testing
         Use the specification to develop test cases (the code is invisible):
         (A11) by constructing equivalence classes
         (A12) by performing boundary-value analysis
         (ideally using both A11 and A12 together)
    (A2) White-box (structural) testing
         Use the source code to develop test cases (the specification may be invisible) in order to achieve:
         (A21) statement coverage
         (A22) control-path coverage
         (A23) control-construct coverage
         (A24) multiple-condition coverage
         (... and many more)
(B) Non-execution-based validation (reading)
    Really "verification," this is presented here as a contrasting example to the execution-based techniques.
    (B1) symbolic execution
    (B2) reading
         (B21) sequential
         (B22) control-flow oriented
         (B23) stepwise abstraction

3. READING VIA STEPWISE ABSTRACTION

Example:

  01: if x != 0 then
  02:   y := 5;
  03: else
  04:   z := z - x;
  05: endif
  06: if z > 1 then
  07:   z := z / x;
  08: else
  09:   z := 0;
  10: endif

Develop abstractions of this code using Mills' functional notation.

• Abstraction of lines 01–05:
    ( x ≠ 0 → y := 5 | z := z - x )
• Abstraction of lines 06–10:
    ( z > 1 → z := z / x | z := 0 )
• Abstraction of lines 01–10:
    ( x ≠ 0 → ( z > 1 → y := 5; z := z / x | y := 5; z := 0 )
            | ( z - x > 1 → z := (z - x) / x | z := 0 ) )
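To make the abstraction concrete, the fragment and its combined abstraction can both be transcribed to Python and compared on sample inputs; this is a sketch (the function names are ours), and note that both versions are undefined for x = 0 with z > 1, a latent fault that testing revisits later.

```python
def program(x, y, z):
    # Direct Python transcription of lines 01-10 above.
    if x != 0:
        y = 5
    else:
        z = z - x
    if z > 1:
        z = z / x          # divides even when x == 0
    else:
        z = 0
    return y, z

def abstraction(x, y, z):
    # The combined abstraction of lines 01-10, written as one function.
    if x != 0:
        return (5, z / x) if z > 1 else (5, 0)
    return (y, (z - x) / x) if z - x > 1 else (y, 0)
```

On every input where the program terminates normally, the abstraction yields the same result; for x = 0 and z > 1 both raise a division-by-zero error.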

4. PRINCIPLES OF THE CODE-READING PROCESS

1. Detect how the program would fail if run
   • Determine the meaning (behavior) of a component through reading: pgm → [pgm]
     – define the meaning of every elemental program part, and describe it as a conditional instruction.
       Remark: conditions and parallel instructions should be described as formally as possible.
     – aggregate the elemental functions according to the control flow.
   • Compare the abstracted meaning function with the given specification to detect possible failures: [pgm] =? f
     Remark: if specifications for single design parts exist, the diagnosis can proceed in a stepwise fashion.

2. Isolate the faults that would lead to failures
   • Search out the cause of the inappropriate behavior (i.e., the fault) in the code.
     Remark: the deep understanding of the code gained in the reading step should ease fault localization.
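The comparison [pgm] =? f can be illustrated by checking a derived meaning function against a specification pointwise over a finite domain; this is only a sketch (reading performs the comparison symbolically, and the example functions here are hypothetical).

```python
def agrees(meaning, spec, domain):
    """Pointwise check of [pgm] =? f over a finite input domain.
    (Reading compares the two symbolically; sampling is only a sketch.)"""
    return all(meaning(a, b) == spec(a, b) for a, b in domain)

def spec_min(a, b):           # hypothetical specification f: the minimum
    return a if a <= b else b

def meaning_ok(a, b):         # meaning [pgm] abstracted from a correct program
    return min(a, b)

def meaning_buggy(a, b):      # meaning abstracted from a faulty program
    return a if a < b else a  # fault: the else-branch returns a instead of b

domain = [(a, b) for a in range(-3, 4) for b in range(-3, 4)]
```

A disagreement on any point of the domain flags a possible failure and points the reader back at the offending elemental function.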

5. REQUIRED DOCUMENTS FOR CODE READING

                                 Specification   Source code   Executable
  Steps                                f             pgm        component
  1. Detect likely failures
     derive [pgm]                                     X
     compare [pgm] =? f                X              X
  2. Isolate faults                    X              X

6. TESTING

Goal: to define a finite set T of tests (T ⊆ input × output) where:
  • the probability of revealing all failures is high
  • the belief that all failures were revealed is strong

There are different criteria:
  • for choosing tests
  • for deciding about the "completeness" of the tests (i.e., when to stop testing)

Each test is specified through a pair T_i : (TF, TD) where:
  • TF represents the test case (i.e., a possible sequence of invocations of the component)
  • TD represents the test data (i.e., pairs (i ∈ input, o ∈ output))

Remark: o describes the expected result, according to the specification, for the input i.
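The pair T_i = (TF, TD) can be sketched as a small data structure; the class and field names below are ours, and the toy component is hypothetical.

```python
from dataclasses import dataclass
from typing import Any, Callable, List, Tuple

@dataclass
class Test:
    """One test T_i = (TF, TD) in the slide's sense (names are ours)."""
    purpose: str                        # comment documenting the test's purpose
    test_case: Callable[..., Any]       # TF: how the component is invoked
    test_data: List[Tuple[Any, Any]]    # TD: pairs (i in input, o expected)

def double(n: int) -> int:              # toy component under test
    return 2 * n

t = Test("doubling maps 0 to 0 and 2 to 4", double, [(0, 0), (2, 4)])
```

Keeping the expected result o next to the input i, rather than judging outputs by eye, is what later makes mismatch detection mechanical.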

7. TESTING (continued)

Test results: T_i : (o ∈ output, o′ ∈ [Pgm](i))
  • o : expected result
  • o′ : actual result

Remark: when testing is not done systematically, mismatches between o and o′ are frequently overlooked.

Documentation of a test case T_1:
  • comment (the test's purpose)
  • test data (input, expected result)
  • test result (actual result)

Myers: A successful test case is one that causes a failure!
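A minimal harness makes the o versus o′ comparison systematic instead of visual; this is a sketch (the harness and the faulty component are ours), and in Myers' sense a "successful" run is one with a non-empty failure list.

```python
def run_tests(component, test_data):
    """Run the component on each input i and compare the expected
    result o with the actual result o' = [Pgm](i).  Returns the list
    of mismatches, i.e. the revealed failures (empty = none revealed)."""
    failures = []
    for i, o_expected in test_data:
        try:
            o_actual = component(i)
        except Exception as exc:          # a crash counts as a failure too
            failures.append((i, o_expected, repr(exc)))
            continue
        if o_actual != o_expected:
            failures.append((i, o_expected, o_actual))
    return failures

def broken_abs(n):                        # hypothetical faulty component
    return n if n > 0 else n              # fault: forgets to negate
```

Running `run_tests(broken_abs, [(3, 3), (-3, 3)])` reveals exactly the mismatch on the negative input that eyeballing the output tends to miss.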

8. BLACK-BOX TESTING

Example (the specification, in Mills' notation):
  ( x ≠ 0 → ( z > 1 → y := 5; z := z / x | y := 5; z := 0 )
          | ( z - x > 1 → z := (z - x) / x | z := 0 ) )

Test data

1. Divide the inputs into equivalence classes for which identical behavior is expected from the component:
   (a) x ≠ 0 ∧ z > 1
   (b) x ≠ 0 ∧ z ≤ 1
   (c) x = 0 ∧ z > 1
   (d) x = 0 ∧ z ≤ 1

2. Select test data
   • Using equivalence classes: use one (or more) tests per equivalence class.
     { <x=1, z=4>, <x=1, z=0>, <x=0, z=4>, <x=0, z=0> }
   • Using boundary-value analysis: give special treatment to boundary values (the boundaries of the equivalence classes).
     (a) { <-∞, inc(1)>, <+∞, inc(1)>, <-∞, +∞>, <+∞, +∞> }
     (b) { <-∞, 1>, <+∞, 1>, <-∞, -∞>, <+∞, -∞> }
     (c) { <0, inc(1)>, <0, +∞> }
     (d) { <0, 1>, <0, -∞> }
     (Here +∞/-∞ correspond to the largest/smallest representable number, and inc(x) to the smallest representable number that is larger than x.)
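Running one representative input per equivalence class against the example component shows why class (c) matters; this is a sketch in Python (the transcription and helper names are ours).

```python
def component(x, z, y=0):
    # The example program from slide 3, transcribed to Python.
    if x != 0:
        y = 5
    else:
        z = z - x
    if z > 1:
        z = z / x
    else:
        z = 0
    return y, z

# One representative input per equivalence class (a)-(d):
class_inputs = {
    "(a) x!=0, z>1":  (1, 4),
    "(b) x!=0, z<=1": (1, 0),
    "(c) x==0, z>1":  (0, 4),
    "(d) x==0, z<=1": (0, 0),
}

def run(x, z):
    try:
        return component(x, z)
    except ZeroDivisionError:             # class (c) reveals this failure
        return "FAILURE: division by zero"
```

Classes (a), (b), and (d) behave as specified, while the single test from class (c) already reveals the division-by-zero failure.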

9. PRINCIPLES OF THE BLACK-BOX TESTING PROCESS

1. Detect failures
   • Define the tests (TF, TD) using the component's specification:
     – identify test cases
     – identify test data
     Criteria for completeness of the tests are:
     – at least one test is defined for each equivalence class
     – at least one test is defined for every vague point in the specification
   • The program is executed for the input part of each test (to obtain an actual result).
   • Failures are diagnosed in the output by comparing the expected result with the actual result.

2. Isolate faults
   • Search for the cause (i.e., the fault in the code) of the detected failure by reading/debugging.

10. REQUIRED DOCUMENTS FOR BLACK-BOX TESTING

                           Specification   Source code   Executable
  Steps                          f             pgm        component
  Generate test cases            X
  Execute test cases                                          X
  Diagnosis                      X                            X
  Isolate faults                 X              X             X

11. WHITE-BOX TESTING

[Figure: control-flow graph of the example program, with decision nodes "x != 0" and "z > 1".]

  01: if x != 0 then
  02:   y := 5;
  03: else
  04:   z := z - x;
  05: endif
  06: if z > 1 then
  07:   z := z / x;
  08: else
  09:   z := 0;
  10: endif

Test data (derived from the source code)

1. Statement coverage
   Select tests to execute each statement at least once:
   { <x=0, z=1>, <x=1, z=3> }
2. Control-path coverage
   Select tests to traverse each edge of the program's control-flow graph at least once:
   { <x=0, z=1>, <x=1, z=3> }
3. Complete control-path coverage
   Select tests to traverse each elementary path at least once:
   { <x=0, z=1>, <x=1, z=3>, <x=0, z=3>, <x=1, z=1> }
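Statement coverage can be measured by instrumenting the component with one probe per branch statement; a real tool instruments automatically, so this hand-instrumented sketch (names are ours) is only illustrative.

```python
executed = set()

def component(x, z, y=0):
    # The example program, instrumented with one probe per branch statement.
    if x != 0:
        executed.add("y := 5");     y = 5
    else:
        executed.add("z := z - x"); z = z - x
    if z > 1:
        executed.add("z := z / x"); z = z / x
    else:
        executed.add("z := 0");     z = 0
    return y, z

ALL_STATEMENTS = {"y := 5", "z := z - x", "z := z / x", "z := 0"}

def statement_coverage(tests):
    """Fraction (# executed statements / # existing statements)."""
    executed.clear()
    for x, z in tests:
        component(x, z)
    return len(executed) / len(ALL_STATEMENTS)
```

The single test <x=1, z=3> covers only half the statements; adding <x=0, z=1> reaches the slide's coverage of 1.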

12. REFINEMENT OF COVERAGE CRITERIA

Multiple-condition coverage

• In the case of combined boolean conditions (e.g., a ∧ b), make sure that all combinations are tested:

    a   b   a ∧ b
    t   t     t
    t   f     f
    f   t     f
    f   f     f

• In the case of comparison operators (e.g., a ≤ b), make sure that both possibilities are tested (a = b, a < b).
• In the case of loops (e.g., while <expr> do S end), make sure that the loop is executed 0 times, 1 time, and n > 1 times.
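The truth table above can be generated mechanically, which scales the criterion to any number of combined conditions; a minimal sketch (the function name is ours):

```python
from itertools import product

def condition_combinations(n):
    """All 2**n truth-value combinations required by multiple-condition
    coverage for n combined boolean conditions."""
    return list(product([True, False], repeat=n))

# The slide's truth table for a AND b:
truth_table = [(a, b, a and b) for a, b in condition_combinations(2)]
```

For two conditions this yields exactly the four rows on the slide, only one of which makes a ∧ b true.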

13. PRINCIPLES OF THE WHITE-BOX TESTING PROCESS

1. Detect failures
   • Define the tests (TF, TD) using the source code:
     – identify test cases
     – identify test data
     Criteria for completeness of the tests are:
     – according to the "statement-coverage" approach:
       (# executed statements / # existing statements) = 1
     – according to the "control-path-coverage" approach:
       (# executed paths / # existing paths) = 1
       where the number of linearly independent paths is given by the cyclomatic complexity v(G) := # edges - # nodes + 2
   • The program is executed for the input part of each test (to obtain an actual result).
   • Failures are diagnosed in the output by comparing the expected result with the actual result.
   • The coverage values attained are checked with a support tool.

2. Isolate faults
   • Search for the cause (i.e., the fault in the code) of the detected failure by reading/debugging.
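Applying v(G) = #edges - #nodes + 2 to the example program's control-flow graph can be sketched directly; the edge list and node names below are our reconstruction of the figure on slide 11.

```python
def cyclomatic_complexity(edges):
    """v(G) = #edges - #nodes + 2 for a single connected control-flow
    graph, given as a list of (source, target) edges."""
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# CFG of the two-if example program (node names are ours):
example_cfg = [
    ("x != 0?", "y := 5"),    ("x != 0?", "z := z - x"),
    ("y := 5", "z > 1?"),     ("z := z - x", "z > 1?"),
    ("z > 1?", "z := z / x"), ("z > 1?", "z := 0"),
    ("z := z / x", "end"),    ("z := 0", "end"),
]
```

With 8 edges and 7 nodes, v(G) = 3: two decisions give three linearly independent paths, even though the program has four elementary paths in total.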

14. REQUIRED DOCUMENTS FOR WHITE-BOX TESTING

                           Specification   Source code   Executable
  Steps                          f             pgm        component
  Generate test cases                           X
  Execute test cases            (X)                           X
  Diagnosis                      X             (X)            X
  Isolate faults                 X              X             X
