
On the Benefits of Enhancing Optimization Modulo Theories with Sorting Networks for MaxSMT
Roberto Sebastiani, Patrick Trentin
roberto.sebastiani@unitn.it, trentin@disi.unitn.it
DISI, University of Trento
SMT Workshop, July 1st, 2016


  1. On the Benefits of Enhancing Optimization Modulo Theories with Sorting Networks for MaxSMT. Roberto Sebastiani, Patrick Trentin (roberto.sebastiani@unitn.it, trentin@disi.unitn.it), DISI, University of Trento. SMT Workshop, July 1st, 2016.

  2. Contents
     1. Background & Motivation
     2. Efficiency Issue
     3. Solution: OMT with Sorting Networks
     4. Experimental Evaluation

  3. Contents (current section: Background & Motivation)
     1. Background & Motivation
     2. Efficiency Issue
     3. Solution: OMT with Sorting Networks
     4. Experimental Evaluation

  4. Optimization Modulo Theories (OMT) [15, 16, 18]
     Optimization Modulo Theories with LA objectives [15, 16, 6, 7, 17, 13, 4, 5]: the problem of finding a model for ⟨ϕ, obj⟩ that minimizes the value of obj, where:
     - obj is an LA or Pseudo-Boolean cost function
     - maximization is the dual problem
     - extended to multiple objectives (linear combination, min-max, boxed, lexicographic, Pareto) [13, 4, 5, 20, 19]
     - incremental [5, 20]
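To make the OMT setting concrete, here is a minimal sketch using Z3's Python API (the νZ engine cited as [4]) rather than OptiMathSAT, which the slides use; the variables, constraints, and objective are invented for illustration and are not taken from the slides.

```python
# Minimal OMT example with a linear-arithmetic objective, using Z3's Optimize engine.
from z3 import Optimize, Real, sat

x, y = Real('x'), Real('y')

opt = Optimize()
opt.add(x + y >= 4, x >= 0, y >= 0)   # hard constraints (the formula phi)
h = opt.minimize(x + 2 * y)           # the objective term obj

if opt.check() == sat:
    print("optimal value:", opt.lower(h))   # minimum of obj under phi
    print("model:", opt.model())
```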

  5. OMT Applications
     Formal Verification:
     - formal verification of parametric systems [18, 14]
     - SW verification & synthesis [12, 14] (e.g. BMC, invariant generation, program synthesis, ...)
     - computation of the worst-case execution time of loop-free programs [11]
     - computation of the optimal structure of an undirected Markov network [9]
     - computation of abstract transformers [13]
     - ...

  6. OMT Applications
     Formal Verification:
     - formal verification of parametric systems [18, 14]
     - SW verification & synthesis [12, 14] (e.g. BMC, invariant generation, program synthesis, ...)
     - computation of the worst-case execution time of loop-free programs [11]
     - computation of the optimal structure of an undirected Markov network [9]
     - computation of abstract transformers [13]
     - ...
     Other:
     - Machine Learning: PyLMT [22], a Structured Learning Modulo Theories tool that performs inference and learning in hybrid domains
     - Requirement Engineering: CGM-Tool [1], a tool for computing the optimal realization of a Goal Model enriched with preferences and resources

  7. Partial Weighted MaxSMT [15, 6, 7, 16, 17]
     A pair ⟨ϕ_h, ϕ_s⟩, where:
     - ϕ_h: a set of "hard" T-clauses
     - ϕ_s: a set of positive-weighted "soft" T-clauses
     Goal: find ψ ⊆ ϕ_s such that ϕ_h ∪ ψ is T-satisfiable and ψ has maximum weight.

  8. Partial Weighted MaxSMT [15, 6, 7, 16, 17]
     A pair ⟨ϕ_h, ϕ_s⟩, where:
     - ϕ_h: a set of "hard" T-clauses
     - ϕ_s: a set of positive-weighted "soft" T-clauses
     Goal: find ψ ⊆ ϕ_s such that ϕ_h ∪ ψ is T-satisfiable and ψ has maximum weight.
     Approaches:
     - MaxSAT engine + SMT's T-Solvers
     - encoded as OMT with a Pseudo-Boolean objective

  9. Partial Weighted MaxSMT [15, 6, 7, 16, 17]
     A pair ⟨ϕ_h, ϕ_s⟩, where:
     - ϕ_h: a set of "hard" T-clauses
     - ϕ_s: a set of positive-weighted "soft" T-clauses
     Goal: find ψ ⊆ ϕ_s such that ϕ_h ∪ ψ is T-satisfiable and ψ has maximum weight.
     Approaches:
     - SMT + MaxSAT engine
     - OMT encoding (Pseudo-Boolean objective):
       ++ very efficient
       ++ can be used when the objective is given by the linear (or min-max) combination of Pseudo-Boolean (for pure MaxSMT) and arithmetic terms (e.g. LGDP [17], or [22])
       ++ can handle multiple objectives at the same time, as in [20]
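As a small illustration of the problem statement, the sketch below phrases a partial weighted MaxSMT instance through Z3's Optimize interface, whose add_soft method handles weighted soft clauses; the clauses and weights are invented for the example and are not from the slides.

```python
# Partial weighted MaxSMT sketch: hard T-clauses plus positively weighted
# soft T-clauses, solved via Z3's built-in MaxSAT/OMT machinery.
from z3 import Optimize, Int, Or, sat

x = Int('x')

opt = Optimize()
opt.add(x >= 0, x <= 10)                    # phi_h: hard T-clauses
opt.add_soft(x >= 5, weight=3)              # phi_s: soft T-clauses with weights
opt.add_soft(Or(x <= 2, x >= 8), weight=2)
opt.add_soft(x == 4, weight=4)              # conflicts with the others; carries less weight

if opt.check() == sat:
    print(opt.model())                      # a T-model maximizing the satisfied soft weight
```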

  10. OMT encoding
      Given ⟨ϕ_h, ϕ_s⟩, for each C_i ∈ ϕ_s introduce a fresh Boolean variable A_i:

          ϕ := ϕ_h ∪ { (A_i ∨ C_i) | C_i ∈ ϕ_s }        obj := Σ_{C_i ∈ ϕ_s} w_i · A_i        (1)

      Its OMT encoding is the pair ⟨ϕ, obj⟩ [17]:

          ϕ   := ϕ_h ∧ ⋀_i ((A_i → (x_i = w_i)) ∧ (¬A_i → (x_i = 0))) ∧ ⋀_i ((0 ≤ x_i) ∧ (x_i ≤ w_i)) (*)
          obj := Σ_i x_i,   with each x_i a fresh Real variable

      (*): the redundant term ⋀_i ((0 ≤ x_i) ∧ (x_i ≤ w_i)) + Early Pruning = improved efficiency

  11. OMT encoding
      Given ⟨ϕ_h, ϕ_s⟩, for each C_i ∈ ϕ_s introduce a fresh Boolean variable A_i:

          ϕ := ϕ_h ∪ { (A_i ∨ C_i) | C_i ∈ ϕ_s }        obj := Σ_{C_i ∈ ϕ_s} w_i · A_i        (1)

      Its OMT encoding is the pair ⟨ϕ, obj⟩ [17]:

          ϕ   := ϕ_h ∧ ⋀_i ((A_i → (x_i = w_i)) ∧ (¬A_i → (x_i = 0))) ∧ ⋀_i ((0 ≤ x_i) ∧ (x_i ≤ w_i)) (*)
          obj := Σ_i x_i,   with each x_i a fresh Real variable

      (*): the redundant term ⋀_i ((0 ≤ x_i) ∧ (x_i ≤ w_i)) + Early Pruning = improved efficiency

      Problem:
      -- performance bottleneck when dealing with Pseudo-Boolean objectives of the form
          w_1 · Σ_i A_i + ... + w_n · Σ_j A_j
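A small sketch of this OMT encoding built by hand on top of Z3's Optimize engine (rather than OptiMathSAT, which the slides use); only the structure of equation (1) and of ⟨ϕ, obj⟩ is taken from the slide, while the soft clauses and weights are invented.

```python
# OMT encoding of partial weighted MaxSMT: relaxation literals A_i,
# fresh Real costs x_i, and obj = sum_i x_i to be minimized.
from z3 import Optimize, Bool, Real, Int, Or, Not, Implies, And, Sum, sat

y = Int('y')
hard = [y >= 0, y <= 10]                       # phi_h
soft = [(y >= 5, 3), (Or(y <= 2, y >= 8), 2)]  # phi_s with weights w_i

opt = Optimize()
opt.add(hard)
xs = []
for i, (clause, w) in enumerate(soft):
    A = Bool('A_%d' % i)                       # fresh relaxation variable A_i
    x = Real('x_%d' % i)                       # fresh cost variable x_i
    opt.add(Or(A, clause))                     # (A_i or C_i)
    opt.add(Implies(A, x == w), Implies(Not(A), x == 0))
    opt.add(And(0 <= x, x <= w))               # redundant bounds (early pruning)
    xs.append(x)

obj = opt.minimize(Sum(xs))                    # obj := sum_i x_i
if opt.check() == sat:
    print("minimum cost:", opt.lower(obj))
    print(opt.model())
```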

  12. Contents (current section: Efficiency Issue)
      1. Background & Motivation
      2. Efficiency Issue
      3. Solution: OMT with Sorting Networks
      4. Experimental Evaluation

  13. Running Example: efficiency issue
      Problem: ⟨ϕ, min(obj)⟩, where obj := w · Σ_{i=0..n−1} A_i; currently obj = k · w.
      OPTIMIZATION STEP: learn ¬(k · w ≤ obj) and restart / jump back to decision level 0.
      Example: with k = 2, w = 1 and n = 4.

  14. Running Example: efficiency issue
      Problem: ¬(k · w ≤ obj) causes the inconsistency of every truth assignment that sets exactly k of the variables A_0, ..., A_{n−1} to true.
      Example: with k = 2, w = 1 and n = 4.

  15. Running Example: efficiency issue
      Problem: ¬(k · w ≤ obj) causes the inconsistency of every truth assignment that sets exactly k of the variables A_0, ..., A_{n−1} to true
      ⇒ the inconsistency is not revealed by Boolean Propagation.
      Example: with k = 2, w = 1 and n = 4.

  16. Running Example: efficiency issue
      Problem: up to C(n, k) (expensive) calls to the LA-Solver may be required.
      Example: with k = 2, w = 1 and n = 4.
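To make the blow-up tangible, the number of such assignments is the binomial coefficient; the n = 40 instance below is added purely for scale and is not from the slides.

```latex
% Assignments setting exactly k of the n relaxation variables to true,
% each potentially costing one LA-Solver call before the conflict is detected:
\binom{n}{k} = \frac{n!}{k!\,(n-k)!},
\qquad \binom{4}{2} = 6,
\qquad \binom{40}{20} \approx 1.4 \times 10^{11}.
```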

  17. Contents (current section: Solution: OMT with Sorting Networks)
      1. Background & Motivation
      2. Efficiency Issue
      3. Solution: OMT with Sorting Networks
      4. Experimental Evaluation

  18. Solution: Combine OMT with Sorting Networks
      Idea: enrich the encoding with bi-directional sorting networks [21, 10, 3, 2].
      Given ⟨ϕ, obj⟩ with obj := w · Σ_{i=0..n−1} A_i, and a sorting network relation C(A_0, ..., A_{n−1}, B_0, ..., B_{n−1}) such that:
      - k of the A_i's are ⊤      ⟺  B_0, ..., B_{k−1} are ⊤
      - m − k of the A_i's are ∗ (unassigned)  ⟺  B_k, ..., B_{m−1} are ∗
      - n − m of the A_i's are ⊥  ⟺  B_m, ..., B_{n−1} are ⊥
      then we encode it as ⟨ϕ′, obj⟩, where

          ϕ′ := ϕ ∧ C(A_0, ..., A_{n−1}, B_0, ..., B_{n−1})
                  ∧ ⋀_{i=0..n−1} (B_i ↔ ((i + 1) · w ≤ obj))
                  ∧ ⋀_{i=0..n−2} (B_{i+1} → B_i)
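The following sketch builds one concrete instance of this idea on top of Z3's Python API: an equivalence-based, bi-directional sequential-counter-style relation (a variant in the spirit of the O(n²) encoding of [21], not the exact clause set from the paper) plus the B_i ↔ ((i+1)·w ≤ obj) linking constraints. The toy instance and all names are invented.

```python
# Sorting-network-style enrichment of obj = w * sum(A_i), sketched with Z3.
from z3 import Optimize, Bool, Real, If, Sum, Implies, Or, And, BoolVal, sat

def sorted_outputs(opt, As):
    """Bidirectional sequential counter: returns B_0..B_{n-1} with
    B_j  <->  at least j+1 of the inputs As are true."""
    n = len(As)
    # s[i][j] <-> at least j+1 of A_0..A_i are true
    s = [[Bool('s_%d_%d' % (i, j)) for j in range(n)] for i in range(n)]
    opt.add(s[0][0] == As[0])
    for j in range(1, n):
        opt.add(s[0][j] == BoolVal(False))
    for i in range(1, n):
        opt.add(s[i][0] == Or(s[i - 1][0], As[i]))
        for j in range(1, n):
            opt.add(s[i][j] == Or(s[i - 1][j], And(As[i], s[i - 1][j - 1])))
    return s[n - 1]                                      # B_j = s[n-1][j]

n, w = 4, 1
As = [Bool('A_%d' % i) for i in range(n)]
opt = Optimize()

obj = Real('obj')
opt.add(obj == w * Sum([If(a, 1, 0) for a in As]))       # obj = w * sum_i A_i

Bs = sorted_outputs(opt, As)
for i in range(n):
    opt.add(Bs[i] == ((i + 1) * w <= obj))               # B_i <-> ((i+1)*w <= obj)
for i in range(n - 1):
    opt.add(Implies(Bs[i + 1], Bs[i]))                   # B_{i+1} -> B_i

opt.add(Or(As))                                          # some hard constraint phi
h = opt.minimize(obj)
if opt.check() == sat:
    print("min obj:", opt.lower(h))
```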

  19. OMT with Sorting Network Relation
      Properties: if (k · w ≤ obj) = ⊥, then by BCP ∀i ∈ [k, n]: B_{i−1} = ⊥.
      Example: with k = 2, w = 1 and n = 4.

  20. OMT with Sorting Network Relation
      Properties: if (k · w ≤ obj) = ⊥, then by BCP ∀i ∈ [k, n]: B_{i−1} = ⊥;
      as soon as k − 1 of the A_i are assigned ⊤ ⇒ all the other A_i are unit-propagated to ⊥.
      The dual holds if (k · w ≤ obj) = ⊤.
      Example: with k = 2, w = 1 and n = 4.
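A worked propagation chain for the slide's own instance (k = 2, w = 1, n = 4), spelled out under the assumption that the B_i ↔ ((i+1)·w ≤ obj) and B_{i+1} → B_i clauses from slide 18 are present:

```latex
\begin{align*}
\neg(2\cdot w \le \mathit{obj})
  &\;\Rightarrow\; \neg B_1 && \text{by } B_1 \leftrightarrow (2\cdot w \le \mathit{obj})\\
  &\;\Rightarrow\; \neg B_2,\ \neg B_3 && \text{by } B_{i+1} \rightarrow B_i\\
A_0 = \top
  &\;\Rightarrow\; A_1 = A_2 = A_3 = \bot && \text{unit propagation through } C(A,B),\ \text{since } B_1 = \bot
\end{align*}
```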

  21. Running Example: OMT with sorting networks
      OPTIMIZATION STEP: learn ¬(k · w ≤ obj) and restart / jump back to decision level 0.
      Example: with k = 2, w = 1 and n = 4.

  22. Running Example: OMT with sorting networks
      OPTIMIZATION STEP: learn ¬(k · w ≤ obj) and restart / jump back to decision level 0.
      As soon as k − 1 of the A_i are assigned ⊤ ⇒ all the other A_i are unit-propagated to ⊥.
      Example: with k = 2, w = 1 and n = 4.

  23. Solution: Combine OMT with Sorting Networks
      Possible encodings for the n × n Boolean relation C(A_0, ..., A_{n−1}, B_0, ..., B_{n−1}):
      - Bi-directional Sequential Counter [21], in O(n²): sum of the A_i's in unary representation
      - Bi-directional Cardinality Network [10, 3, 2], in O(n · log²(n)): based on the merge-sort algorithm idea

  24. Solution: Combine OMT with Sorting Networks
      Possible encodings for the n × n Boolean relation C(A_0, ..., A_{n−1}, B_0, ..., B_{n−1}):
      - Bi-directional Sequential Counter [21], in O(n²): sum of the A_i's in unary representation
      - Bi-directional Cardinality Network [10, 3, 2], in O(n · log²(n)): based on the merge-sort algorithm idea
      Generalization
      The same performance issue occurs for ⟨ϕ, obj⟩, where obj = τ_1 + ... + τ_m and

          ∀j ∈ [1, m].  (τ_j = w_j · Σ_{i=0..k_j} A_{ji}) ∧ (0 ≤ τ_j) ∧ (τ_j ≤ w_j · k_j)

      Solution (see the sketch after this slide):
      - use a separate sorting circuit for each term τ_j
      - add clauses of the form (w_j · i ≤ τ_j) → (w_j · i ≤ obj)
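Sticking with the earlier Z3-based sketch, the generalization can be assembled by instantiating one counter per term and adding the linking clauses; the per-term weights, sizes, and helper names are invented, and the counter is the same equivalence-based variant as before rather than the paper's exact clause set.

```python
# Generalization: one sorting circuit per term tau_j of obj = tau_1 + ... + tau_m,
# plus linking clauses (w_j * i <= tau_j) -> (w_j * i <= obj).
from z3 import Optimize, Bool, Real, If, Sum, Implies, Or, And, BoolVal, sat

def sorted_outputs(opt, As):
    """Equivalence-based sequential counter, as in the previous sketch."""
    n = len(As)
    s = [[Bool('s_%s_%d' % (As[i], j)) for j in range(n)] for i in range(n)]
    opt.add(s[0][0] == As[0])
    for j in range(1, n):
        opt.add(s[0][j] == BoolVal(False))
    for i in range(1, n):
        opt.add(s[i][0] == Or(s[i - 1][0], As[i]))
        for j in range(1, n):
            opt.add(s[i][j] == Or(s[i - 1][j], And(As[i], s[i - 1][j - 1])))
    return s[n - 1]

terms = [(3, 4), (5, 3)]        # (weight w_j, number of summands) -- invented instance

opt = Optimize()
obj = Real('obj')
taus = []
for j, (w, n) in enumerate(terms):
    As = [Bool('A_%d_%d' % (j, i)) for i in range(n)]
    tau = Real('tau_%d' % j)
    opt.add(tau == w * Sum([If(a, 1, 0) for a in As]))        # tau_j = w_j * sum_i A_ji
    opt.add(0 <= tau, tau <= w * n)                           # redundant bounds
    Bs = sorted_outputs(opt, As)                              # one circuit per term tau_j
    for i in range(n):
        opt.add(Bs[i] == ((i + 1) * w <= tau))                # B_i <-> ((i+1)*w_j <= tau_j)
    for i in range(n - 1):
        opt.add(Implies(Bs[i + 1], Bs[i]))
    for i in range(1, n + 1):
        opt.add(Implies(i * w <= tau, i * w <= obj))          # (w_j*i <= tau_j) -> (w_j*i <= obj)
    taus.append(tau)

opt.add(obj == Sum(taus))
# ... hard constraints over the A_ji would be added here ...
h = opt.minimize(obj)
if opt.check() == sat:
    print("min obj:", opt.lower(h))
```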

  25. Contents (current section: Experimental Evaluation)
      1. Background & Motivation
      2. Efficiency Issue
      3. Solution: OMT with Sorting Networks
      4. Experimental Evaluation

  26. Experimental Evaluation
      Test framework: two 8-core 2.20 GHz Xeon Linux machines with 64 GB of RAM.
      Tools:
      - OptiMathSAT, an OMT(LA) tool based on MathSAT5 [8]
      - νZ, a general OMT solver based on Z3 [4]
      (Partial) correctness check: all configurations agree on the optimal (lexicographic) solution; otherwise, Z3 is used to check that the returned model is both satisfiable and optimal.

  27. Experiment #1
      Benchmark set: Structured Learning Modulo Theories [22], inference in hybrid domains:

          cover = Σ_i w_i · A_i
          obj   = Σ_j w_j · B_j + cover − Σ_k w_k · C_k − |K − cover|

      500 formulas, 600 s timeout.
