

  1. Turbo-Charging Lemmas on Demand with Don’t Care Reasoning
  Aina Niemetz, Mathias Preiner and Armin Biere
  Institute for Formal Models and Verification (FMV)
  Johannes Kepler University, Linz, Austria
  http://fmv.jku.at/
  FMCAD 2014, October 21-24, 2014, Lausanne, Switzerland

  2. Introduction: Lemmas on Demand
  • a so-called lazy SMT approach
  • our SMT solver Boolector
    ◦ implements Lemmas on Demand for the quantifier-free theory of
      • fixed-size bit vectors
      • arrays
  • recently: Lemmas on Demand for Lambdas [DIFTS’13]
    ◦ generalization of Lemmas on Demand for Arrays [JSAT’09]
    ◦ arrays represented as uninterpreted functions
    ◦ array operations represented as lambda terms
    ◦ reads represented as function applications
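The lambda representation of arrays can be illustrated in miniature (a hypothetical Python sketch, not Boolector code: an array becomes a function from indices to elements, a write produces a new lambda term, and a read is just a function application):

```python
# Hypothetical sketch of the lambda view of arrays used by
# Lemmas on Demand for Lambdas: write(a, i, e) is represented
# as the lambda term  lambda j: ite(j = i, e, a(j)),
# and read(a, i) is the function application a(i).

def write(a, i, e):
    # A write returns a new "array" (function) that yields e at
    # index i and defers to the old array everywhere else.
    return lambda j: e if j == i else a(j)

base = lambda j: 0                    # constant-zero array
arr = write(write(base, 2, 7), 5, 9)  # two nested writes

print(arr(2), arr(5), arr(3))  # 7 9 0
```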

  3. Lemmas on Demand Workflow: Original Procedure (LOD)
  [Workflow diagram: Formula φ → Preprocessing → π → Abstraction → α(π); DP_B enumerates candidate models σ(α(π) ∧ ξ); consistent models pass to partial model extraction σp(α(π) ∧ ξ) and yield sat; inconsistent models trigger refinement ξ = {l} ∧ ξ; unsat on convergence]
  • bit vector formula abstraction (bit vector skeleton)
  • enumeration of truth assignments (candidate models)
  • iterative refinement with lemmas until convergence

  4. Lemmas on Demand Workflow: Original Procedure (LOD)
  [Same workflow diagram, with the full candidate model highlighted]
  → each candidate model is a full truth assignment of the formula abstraction α(π)
  → the full candidate model needs to be checked for consistency w.r.t. the theories

  5. Lemmas on Demand Workflow: Original Procedure (LOD)
  [Same workflow diagram]
  → abstraction refinement is usually the most costly part of LOD
  → the cost generally correlates with the number of refinements
  → checking the full candidate model is often not required
  → a small subset is responsible for satisfying the formula abstraction

  6. Lemmas on Demand Workflow: Optimized Procedure (LODopt)
  [Workflow diagram extended with a partial model extraction/optimization step σp(α(π) ∧ ξ) prior to the consistency check, yielding a partial candidate model]
  • focus LOD on the relevant parts of the input formula
  • exploit a posteriori observability don’t cares
  • partial model extraction prior to consistency checking
  → subsequently reduces the cost of consistency checking
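The abstraction/refinement loop described above can be sketched as a toy program (a hypothetical illustration, not Boolector's implementation: the brute-force search over assignments stands in for the SAT/SMT oracle DP_B, and a functional-congruence check stands in for the theory consistency check). The toy input is i = k ∧ f(i) ≠ f(k), which is unsatisfiable by congruence:

```python
from itertools import product

# Hypothetical lemmas-on-demand loop over 2-bit values.
# Input formula: i == k  AND  f(i) != f(k).
# Abstraction: replace f(i), f(k) by fresh variables a1, a2.

def abstraction(i, k, a1, a2):
    # Bit vector skeleton: function applications abstracted away.
    return i == k and a1 != a2

def consistent(i, k, a1, a2):
    # Theory consistency: functional congruence for f
    # (i == k must imply f(i) == f(k), i.e. a1 == a2).
    return not (i == k and a1 != a2)

def lemmas_on_demand(width=2):
    lemmas = []  # predicates over (i, k, a1, a2), added on refinement
    values = range(2 ** width)
    while True:
        # "DP_B": brute-force search for a candidate model of the
        # abstraction conjoined with all lemmas so far.
        candidate = None
        for i, k, a1, a2 in product(values, repeat=4):
            if abstraction(i, k, a1, a2) and all(l(i, k, a1, a2) for l in lemmas):
                candidate = (i, k, a1, a2)
                break
        if candidate is None:
            return "unsat"            # abstraction refuted: converged
        if consistent(*candidate):
            return "sat"              # candidate model is theory-consistent
        # Refine with the violated congruence lemma: i == k -> a1 == a2.
        lemmas.append(lambda i, k, a1, a2: i != k or a1 == a2)

print(lemmas_on_demand())  # unsat
```

The first iteration finds a spurious candidate (e.g. i = k = 0, a1 = 0, a2 = 1), the congruence lemma rules it out, and the second iteration converges to unsat.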

  7. Lemmas on Demand Example: Input Formula
  Example. ψ1 ≡ i ≠ k ∧ (f(i) = e ∨ f(k) = v) ∧ v = ite(i = j, e, g(j))
  [DAG of ψ1: and/or/ite/eq nodes over the variables i, j, k, e, v and the function applications apply1 = f(i), apply2 = f(k), apply3 = g(j)]

  8. Lemmas on Demand Example: Formula Abstraction
  Example. Bit Vector Skeleton
  [DAG of the abstraction α(ψ1): the function applications apply1, apply2, apply3 are replaced by fresh bit vector variables α(apply1), α(apply2), α(apply3)]

  9. Lemmas on Demand Example: Formula Abstraction
  Example. Full Candidate Model
  [DAG of α(ψ1) annotated with a full truth assignment: i = 00, j = 00, e = 00, v = 00, k = 01, α(apply1) = 00, α(apply2) = 00, α(apply3) = 00]

  10. Lemmas on Demand Example: Formula Abstraction
  Example. Full Candidate Model
  Check consistency: {apply1, apply2, apply3}
  [Same annotated DAG: under the full candidate model, all three function applications must be checked for consistency]

  11. Lemmas on Demand Example: Formula Abstraction
  Example. Partial Candidate Model
  Check consistency: {apply1}
  [Same DAG annotated with a partial assignment: don’t care nodes are marked X, so only apply1 must be checked for consistency]

  12. Partial Model Extraction
  Most intuitive: use a justification-based approach
  → justification-based techniques in the context of
  • SMT
    ◦ prune the search space of DPLL(T) [ENTCS’05, MSRTR’07]
  • Model checking
    ◦ prune the search space of BMC [CAV’02]
    ◦ generalize proof obligations in PDR [EénFMCAD’11, ChoFMCAD’11]
    ◦ generalize candidate counterexamples (CEGAR) [LPAR’08]

  13. Partial Model Extraction
  Our approach: dual propagation-based partial model extraction
  • exploits the duality of a formula abstraction ψ
    → assignments satisfying ψ (the primal channel) falsify its negation ¬ψ (the dual channel)
  • motivated by dual propagation techniques in QBF [AAAI’10]
    ◦ one solver with two channels (online approach)
    ◦ symmetric propagation between primal and dual channel
  • here: offline dual propagation
    ◦ two solvers, one solver per channel
    ◦ consecutive propagation between primal and dual channel
    → the primal solver generates a full assignment before the dual solver enables partial model extraction based on that primal assignment

  14. Partial Model Extraction: Dual Propagation-Based Approach
  Example. Boolean Level
  Primal channel: ψ2 ≡ (a ∧ b) ∨ (c ∧ d)
  Dual channel: ¬ψ2 ≡ (¬a ∨ ¬b) ∧ (¬c ∨ ¬d)

  15. Partial Model Extraction: Dual Propagation-Based Approach
  Example. Boolean Level
  Primal channel: ψ2 ≡ (a ∧ b) ∨ (c ∧ d)
  Dual channel: ¬ψ2 ≡ (¬a ∨ ¬b) ∧ (¬c ∨ ¬d)
  Primal assignment: σ(ψ2) ≡ {σ(a) = ⊤, σ(b) = ⊤, σ(c) = ⊤, σ(d) = ⊤}

  16. Partial Model Extraction: Dual Propagation-Based Approach
  Example. Boolean Level
  Primal channel: ψ2 ≡ (a ∧ b) ∨ (c ∧ d)
  Dual channel: ¬ψ2 ≡ (¬a ∨ ¬b) ∧ (¬c ∨ ¬d)
  Primal assignment: σ(ψ2) ≡ {σ(a) = ⊤, σ(b) = ⊤, σ(c) = ⊤, σ(d) = ⊤}
  Fix the values of the inputs via assumptions to the dual solver:
  Dual assumptions: {a = ⊤, b = ⊤, c = ⊤, d = ⊤}

  17. Partial Model Extraction: Dual Propagation-Based Approach
  Example. Boolean Level
  Primal channel: ψ2 ≡ (a ∧ b) ∨ (c ∧ d)
  Dual channel: ¬ψ2 ≡ (¬a ∨ ¬b) ∧ (¬c ∨ ¬d)
  Primal assignment: σ(ψ2) ≡ {σ(a) = ⊤, σ(b) = ⊤, σ(c) = ⊤, σ(d) = ⊤}
  Fix the values of the inputs via assumptions to the dual solver:
  Dual assumptions: {a = ⊤, b = ⊤, c = ⊤, d = ⊤}
  Failed assumptions: {a = ⊤, b = ⊤}
  → sufficient to falsify ¬ψ2
  → sufficient to satisfy ψ2

  18. Partial Model Extraction: Dual Propagation-Based Approach
  Example. Boolean Level
  Primal channel: ψ2 ≡ (a ∧ b) ∨ (c ∧ d)
  Dual channel: ¬ψ2 ≡ (¬a ∨ ¬b) ∧ (¬c ∨ ¬d)
  Primal assignment: σ(ψ2) ≡ {σ(a) = ⊤, σ(b) = ⊤, σ(c) = ⊤, σ(d) = ⊤}
  Fix the values of the inputs via assumptions to the dual solver:
  Dual assumptions: {a = ⊤, b = ⊤, c = ⊤, d = ⊤}
  Failed assumptions: {a = ⊤, b = ⊤} (the partial model)
  → sufficient to falsify ¬ψ2
  → sufficient to satisfy ψ2
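The Boolean-level example can be simulated in miniature (a hypothetical sketch: a brute-force smallest-subset search stands in for the dual solver's failed-assumption mechanism; note that an actual SAT solver returns a sufficient, not necessarily minimal, set of failed assumptions):

```python
from itertools import combinations

# psi2 = (a and b) or (c and d); the primal solver assigns all four
# variables True.  The "dual solver" below finds a smallest subset of
# that assignment which already forces psi2 to be true, i.e. falsifies
# its negation under every extension of the remaining variables.

def psi2(m):
    return (m["a"] and m["b"]) or (m["c"] and m["d"])

def failed_assumptions(full_model):
    names = sorted(full_model)
    for size in range(len(names) + 1):
        for subset in combinations(names, size):
            fixed = {n: full_model[n] for n in subset}
            free = [n for n in names if n not in subset]
            # The subset suffices iff psi2 holds for every assignment
            # of the free (don't care) variables.
            if all(psi2({**fixed,
                         **{n: bool((bits >> idx) & 1)
                            for idx, n in enumerate(free)}})
                   for bits in range(2 ** len(free))):
                return dict(fixed)
    return full_model

partial = failed_assumptions({"a": True, "b": True, "c": True, "d": True})
print(partial)  # {'a': True, 'b': True}
```

As on the slide, {a = ⊤, b = ⊤} is returned: it falsifies ¬ψ2 on its own, so c and d are observability don't cares.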

  19. Partial Model Extraction: Dual Propagation-Based Approach
  → structural don’t care reasoning simulated via the dual solver
  → no structural SAT solver necessary
  Example. (ctd.)
  Input formula: ψ2 ≡ (a ∧ b) ∨ (c ∧ d) ≡ ⊤
  Primal SAT solver: CNF(ψ2) ≡ (¬o ∨ x ∨ y) ∧ (¬x ∨ o) ∧ (¬y ∨ o) ∧ (¬x ∨ a) ∧ (¬x ∨ b) ∧ (¬a ∨ ¬b ∨ x) ∧ (¬y ∨ c) ∧ (¬y ∨ d) ∧ (¬c ∨ ¬d ∨ y) ≡ ?
  Dual SAT solver: CNF(¬ψ2) ≡ (¬a ∨ ¬b) ∧ (¬c ∨ ¬d) ≡ ⊥
  Dual assumptions: {a = ⊤, b = ⊤, c = ⊤, d = ⊤}
  Partial model: {a = ⊤, b = ⊤}
  → in contrast to partial model extraction techniques based on iterative removal of unnecessary assignments on the CNF level [FMCAD’13]
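The Tseitin-style CNF above can be checked by brute force (a hypothetical verification script; x and y are the Tseitin variables for the subterms a ∧ b and c ∧ d, and o is the output variable for ψ2):

```python
from itertools import product

# Verify that the primal CNF from the slide, with the output o asserted,
# is satisfiable (for some choice of the Tseitin variables x, y) exactly
# when psi2 = (a and b) or (c and d) holds.

def cnf(a, b, c, d, x, y, o):
    clauses = [
        (not o or x or y), (not x or o), (not y or o),   # o <-> (x or y)
        (not x or a), (not x or b), (not a or not b or x),  # x <-> (a and b)
        (not y or c), (not y or d), (not c or not d or y),  # y <-> (c and d)
    ]
    return all(clauses)

for a, b, c, d in product([False, True], repeat=4):
    want = (a and b) or (c and d)
    got = any(cnf(a, b, c, d, x, y, True)
              for x, y in product([False, True], repeat=2))
    assert got == want

print("CNF(psi2) with o asserted matches psi2")
```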

  20. Partial Model Extraction: Dual Propagation-Based Approach
  → we lift this approach to the word level
  Primal channel: Γ ≡ α(π) ∧ ξ ≡ α(π) ∧ l1 ∧ ... ∧ l(i−1)
  Dual channel: ¬Γ
  → one SMT solver per channel
  → one single dual solver instance to maintain ¬Γ over all iterations

  21. Partial Model Extraction: Dual Propagation-Based Approach
  Example. Word Level
  ψ1 ≡ i ≠ k ∧ (f(i) = e ∨ f(k) = v) ∧ v = ite(i = j, e, g(j))
  α(ψ1) ≡ i ≠ k ∧ (α(apply1) = e ∨ α(apply2) = v) ∧ v = ite(i = j, e, α(apply3))
  Primal solver: α(ψ1); dual solver: ¬α(ψ1) (the formula abstraction and its negation)
  Primal assignment: σ(α(ψ1)) ≡ {σ(i) = 00, σ(j) = 00, σ(e) = 00, σ(v) = 00, σ(k) = 01, α(apply1) = 00, α(apply2) = 00, α(apply3) = 00}
  Fix the values of the inputs via assumptions to the dual solver:
  Dual assumptions: {i = 00, j = 00, e = 00, v = 00, k = 01, α(apply1) = 00, α(apply2) = 00, α(apply3) = 00}
  Failed assumptions: {i = 00, j = 00, e = 00, v = 00, k = 01, α(apply1) = 00}
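As a sanity check on the word-level example (a hypothetical script, encoding the 2-bit vectors as the integers 0..3 and writing a1, a2, a3 for α(apply1), α(apply2), α(apply3)), the failed assumptions above indeed force α(ψ1) to hold for every value of the dropped variables α(apply2) and α(apply3):

```python
from itertools import product

# alpha(psi1) = i != k  and  (a1 = e or a2 = v)  and  v = ite(i = j, e, a3)
def alpha_psi1(i, j, k, e, v, a1, a2, a3):
    return i != k and (a1 == e or a2 == v) and v == (e if i == j else a3)

# Failed assumptions from the slide: i=00, j=00, e=00, v=00, k=01, a1=00.
fixed = dict(i=0, j=0, k=1, e=0, v=0, a1=0)

# a2 and a3 were dropped as don't cares: the abstraction must hold no
# matter how they are assigned.
assert all(alpha_psi1(a2=a2, a3=a3, **fixed)
           for a2, a3 in product(range(4), repeat=2))
print("failed assumptions suffice to satisfy alpha(psi1)")
```

Intuitively, i ≠ k holds since 00 ≠ 01, the disjunction holds through α(apply1) = e, and i = j selects the e branch of the ite, so α(apply2) and α(apply3) are observability don't cares.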
