SAT based Abstraction-Refinement using ILP and Machine Learning Techniques

Edmund Clarke, Anubhav Gupta, James Kukula, Ofer Strichman
Abstraction in Model Checking

• Set of variables V = {x_1, ..., x_n}.
• Set of states S = D_{x_1} × ··· × D_{x_n}.
• Set of initial states I ⊆ S.
• Set of transitions R ⊆ S × S.
• Transition system M = (S, I, R).
Abstract Model

Abstraction function h : S → Ŝ. Abstract model M̂ = (Ŝ, Î, R̂).

[Figure: h maps concrete states of S to abstract states of Ŝ]

Ŝ = {ŝ | ∃s. s ∈ S ∧ h(s) = ŝ}
Î = {ŝ | ∃s. I(s) ∧ h(s) = ŝ}
R̂ = {(ŝ_1, ŝ_2) | ∃s_1. ∃s_2. R(s_1, s_2) ∧ h(s_1) = ŝ_1 ∧ h(s_2) = ŝ_2}
Model Checking

• Property: AG p, where p is a non-temporal propositional formula.
• p respects h if for all s ∈ S, h(s) |= p ⇒ s |= p.

[Figures: a labeling of concrete and abstract states where p respects h, and one where it does not]
Preservation Theorem

Let M̂ be an abstraction of M corresponding to the abstraction function h, and let p be a propositional formula that respects h. Then

M̂ |= AG p ⇒ M |= AG p

[Figure: the abstract model satisfies AG p, and so does the concrete model]
Converse of Preservation Theorem

M̂ ⊭ AG p ⇏ M ⊭ AG p

[Figure: an abstract counterexample with no corresponding concrete path]

Counterexample is spurious. Abstraction is too coarse.
Refinement

h′ is a refinement of h if
1. ∀s_1, s_2 ∈ S, h′(s_1) = h′(s_2) implies h(s_1) = h(s_2).
2. ∃s_1, s_2 ∈ S such that h(s_1) = h(s_2) and h′(s_1) ≠ h′(s_2).

[Figure: an abstract state is split so that the spurious counterexample disappears]
Abstraction-Refinement

1. Generate an initial abstraction function h.
2. Build the abstract machine M̂ based on h. Model check M̂. If M̂ |= ϕ, then M |= ϕ. Return TRUE.
3. If M̂ ⊭ ϕ, check the counterexample on the concrete model. If the counterexample is real, M ⊭ ϕ. Return FALSE.
4. Refine h, and go to step 2.
Abstraction Function

• Partition the variables V into visible (V) and invisible (I) variables, with V = {v_1, ..., v_k}.
• The partitioning defines our abstraction function h : S → Ŝ. The set of abstract states is Ŝ = D_{v_1} × ··· × D_{v_k}, and the abstraction function is h(s) = (s(v_1), ..., s(v_k)).

[Figure: concrete states over x1..x4 collapse to abstract states over the visible variables x1, x2]

• Refinement: Move variables from I to V.
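As a minimal sketch of the projection abstraction, assuming a toy encoding in which a concrete state is a dict from variable names to Boolean values (the variable names and partition below are illustrative, not from a real model):

```python
VISIBLE = ["x1", "x2"]    # V: visible variables, kept in the abstract model
INVISIBLE = ["x3", "x4"]  # I: invisible variables, abstracted away

def h(state):
    """Abstraction function: project a concrete state onto the visible variables."""
    return tuple(state[v] for v in VISIBLE)

# Two concrete states that differ only on invisible variables...
s1 = {"x1": 0, "x2": 0, "x3": 0, "x4": 1}
s2 = {"x1": 0, "x2": 0, "x3": 0, "x4": 0}

# ...collapse to the same abstract state.
print(h(s1), h(s2))  # (0, 0) (0, 0)
```

Refinement then simply moves names from INVISIBLE to VISIBLE, making h distinguish more states.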
Building the Abstract Model

M̂ can be computed efficiently if R is in functional form, e.g. for sequential circuits:

R(s, s′) = ∃i. ⋀_{j=1}^{m} x′_j = f_{x_j}(s, i)

R̂(ŝ, ŝ′) = ∃s_I. ∃i. ⋀_{x_j ∈ V} x̂′_j = f_{x_j}(ŝ, s_I, i)

[Figure: a circuit with latches x1..x6 and inputs i1..i3; the abstraction keeps only the visible latches x1, x2]
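For intuition, with explicit next-state functions the existential quantification over invisible state variables and inputs can be carried out by enumeration on a tiny example. The next-state functions below are invented for illustration, not taken from any real circuit:

```python
from itertools import product

def f_x1(x1, x2, x3, i1):
    """Next-state function of visible latch x1 (illustrative)."""
    return x2 ^ x3

def f_x2(x1, x2, x3, i1):
    """Next-state function of visible latch x2 (illustrative)."""
    return x1 & i1

def abstract_R(a, a_next):
    """R̂(â, â′): exists a value of invisible latch x3 and input i1 driving the step."""
    (x1, x2), (y1, y2) = a, a_next
    return any(f_x1(x1, x2, x3, i1) == y1 and f_x2(x1, x2, x3, i1) == y2
               for x3, i1 in product([0, 1], repeat=2))

print(abstract_R((0, 1), (1, 0)))  # True: choose x3 = 0
print(abstract_R((0, 0), (1, 1)))  # False: x2' = x1 & i1 = 0 can never be 1
```

In practice this quantification is done symbolically (e.g. on the circuit), not by enumeration; the sketch only shows what is being quantified.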
Checking the Counterexample

• Counterexample: ⟨ŝ_1, ŝ_2, ..., ŝ_m⟩.
• Set of concrete paths corresponding to the counterexample:

ψ_m = {⟨s_1 ... s_m⟩ | I(s_1) ∧ ⋀_{i=1}^{m−1} R(s_i, s_{i+1}) ∧ ⋀_{i=1}^{m} h(s_i) = ŝ_i}

• The right-most conjunct is a restriction of the visible variables to their values in the counterexample.
• Counterexample is spurious ⇐⇒ ψ_m is empty.
• Solve ψ_m with a SAT solver.
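An illustrative sketch of what ψ_m asks (not the SAT encoding itself): for a tiny explicit-state system we can decide whether an abstract counterexample is concretizable by enumerating concrete paths whose images under h match it state by state. The toy model below is made up for illustration:

```python
from itertools import product

h = lambda s: s[0]                            # abstraction: keep the visible bit
I = {(0, 0)}                                  # initial concrete states
R = {((0, 0), (0, 1)), ((0, 1), (1, 1))}      # concrete transition relation

def concretizable(abstract_trace):
    """True iff some concrete path maps, state by state, to the abstract trace."""
    states = set(product([0, 1], repeat=2))
    paths = [[s] for s in I if h(s) == abstract_trace[0]]
    for step in abstract_trace[1:]:
        paths = [p + [t] for p in paths for t in states
                 if (p[-1], t) in R and h(t) == step]
    return bool(paths)

print(concretizable([0, 0, 1]))  # True:  (0,0) -> (0,1) -> (1,1)
print(concretizable([0, 1]))     # False: the abstract trace <0, 1> is spurious
```

The SAT formulation replaces this enumeration with one satisfiability query over the unrolled transition relation.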
Checking the Counterexample

• Similar to BMC formulas, except
  – The path is restricted to the counterexample.
  – The values of (original) inputs that are assigned by the counterexample are also restricted.
• If ψ_m is satisfiable, we found a real bug.
• If ψ_m is unsatisfiable, refine.
Refinement

• Find the largest index f (the failure index), f < m, such that ψ_f is satisfiable.
• The set D of all states d_f such that there is a concrete path ⟨d_1 ... d_f⟩ in ψ_f is called the set of deadend states.

[Figure: abstract trace above a concrete trace that ends in the deadend states]

• There is no concrete transition from D to a concrete state in the next abstract state.
Refinement

• Since there is an abstract transition from ŝ_f to ŝ_{f+1}, there is a non-empty set of transitions φ_f from h^{-1}(ŝ_f) to h^{-1}(ŝ_{f+1}):

φ_f = {⟨s_f, s_{f+1}⟩ | R(s_f, s_{f+1}) ∧ h(s_f) = ŝ_f ∧ h(s_{f+1}) = ŝ_{f+1}}

• The set B of all states b_f such that there is a transition ⟨b_f, b_{f+1}⟩ in φ_f is called the set of bad states.

[Figure: the deadend states and the bad states lie inside the same abstract state ŝ_f]
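With an explicit toy model we can enumerate both sets directly: D contains the states reachable along the spurious prefix that have no transition into h^{-1}(ŝ_{f+1}), and B contains the states of h^{-1}(ŝ_f) (reachable or not) that do. All values below are made up for illustration; in practice both sets are characterized by SAT formulas:

```python
h = lambda s: s[0]                           # abstraction: visible bit
states = {(0, 0), (0, 1), (1, 0), (1, 1)}
R = {((0, 0), (0, 1)), ((0, 1), (1, 0))}     # toy transition relation
prefix_states = {(0, 0)}                     # concrete states at step f (from ψ_f)
s_f, s_f1 = 0, 1                             # abstract states ŝ_f, ŝ_{f+1}

def steps_into(s, abstract_target):
    """True iff s has a concrete successor inside the given abstract state."""
    return any((s, t) in R and h(t) == abstract_target for t in states)

B = {s for s in states if h(s) == s_f and steps_into(s, s_f1)}
D = {s for s in prefix_states if not steps_into(s, s_f1)}

print(sorted(D), sorted(B))  # [(0, 0)] [(0, 1)] — disjoint, yet in the same abstract state
```

Because D and B are disjoint but share the abstract state ŝ_f, the transition ŝ_f → ŝ_{f+1} is spurious, which is exactly what refinement must repair.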
Refinement

[Figure: the abstract counterexample, with the deadend states and the bad states shown inside the same abstract state]
Refinement

• There is a spurious transition from ŝ_f to ŝ_{f+1}.
• The transition is spurious because D and B lie in the same abstract state.
• Refinement: Put D and B in separate abstract states:

∀d ∈ D. ∀b ∈ B. h′(d) ≠ h′(b)

[Figure: the abstract state is split so that D and B are separated]
Refinement as Separation

Let S = {s_1 ... s_m} and T = {t_1 ... t_n} be two sets of states (binary vectors) of size l, representing assignments to a set of variables W, |W| = l.

(The state separation problem) Find a minimal set of variables U = {u_1 ... u_k}, U ⊆ W, such that for each pair of states (s_i, t_j), 1 ≤ i ≤ m, 1 ≤ j ≤ n, there exists a variable u_r ∈ U such that s_i(u_r) ≠ t_j(u_r).

Let H denote the separating set for D and B. The refinement h′ is obtained by adding H to V.

Proof: Since H separates D and B, for all d ∈ D, b ∈ B there exists u ∈ H such that d(u) ≠ b(u). Hence, h′(d) ≠ h′(b).
Refinement as Separation and Learning

• For systems of realistic size:
  – It is not possible to generate D and B, either explicitly or symbolically.
  – It is computationally expensive to separate large D and B.
• Instead, generate samples of D (denoted S_D) and B (denoted S_B), and try to infer the separating variables from the samples.
• State-of-the-art SAT solvers like Chaff can generate many samples in a short amount of time.
• Our algorithm is complete because a counterexample will eventually be eliminated in subsequent iterations.
Separation using Integer Linear Programming

Separating S_D from S_B as an Integer Linear Programming (ILP) problem:

Min Σ_{i=1}^{|I|} v_i
subject to:  (∀s ∈ S_D)(∀t ∈ S_B)   Σ_{1 ≤ i ≤ |I|, s(v_i) ≠ t(v_i)} v_i ≥ 1

• v_i = 1 if and only if v_i is in the separating set.
• There is one constraint per pair of states, stating that at least one of the variables that separates the two states must be selected.
Example

s_1 = (0, 1, 0, 1)    t_1 = (1, 1, 1, 1)
s_2 = (1, 1, 1, 0)    t_2 = (0, 0, 0, 1)

Min Σ_{i=1}^{4} v_i subject to:
v_1 + v_3 ≥ 1               /* Separating s_1 from t_1 */
v_2 ≥ 1                     /* Separating s_1 from t_2 */
v_4 ≥ 1                     /* Separating s_2 from t_1 */
v_1 + v_2 + v_3 + v_4 ≥ 1   /* Separating s_2 from t_2 */

The optimal value of the objective function is 3, corresponding to one of the two optimal solutions (v_1, v_2, v_4) and (v_2, v_3, v_4).
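The example can be checked by brute force, standing in for the ILP solver on this tiny instance: enumerate candidate subsets U in order of size and return the first one that separates every pair (s_i, t_j). Variable v_i corresponds to index i−1 below:

```python
from itertools import combinations

S = [(0, 1, 0, 1), (1, 1, 1, 0)]  # s_1, s_2 (deadend-state samples)
T = [(1, 1, 1, 1), (0, 0, 0, 1)]  # t_1, t_2 (bad-state samples)

def separates(U, S, T):
    """True iff every pair (s, t) differs on at least one variable index in U."""
    return all(any(s[i] != t[i] for i in U) for s in S for t in T)

def min_separating_set(S, T):
    """Smallest separating set, found by exhaustive search over subset sizes."""
    n = len(S[0])
    for k in range(n + 1):
        for U in combinations(range(n), k):
            if separates(U, S, T):
                return U
    return None

U = min_separating_set(S, T)
print(len(U), U)  # 3 (0, 1, 3) — i.e. {v_1, v_2, v_4}, matching the ILP optimum
```

Exhaustive search is exponential in |W|; the ILP formulation (or the sampling-based learning approach) is what makes the problem tractable at realistic sizes.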