Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques • Practical Algorithms • Results, Conclusions & Research Directions
Design Debugging [SMVLS’07]
[Figure: correct circuit (two AND gates) vs. faulty circuit (the second AND gate replaced by an OR gate), with inputs r, s and outputs y, z]
• Correct circuit: input stimuli ⟨r, s⟩ = ⟨0, 1⟩; valid output ⟨y, z⟩ = ⟨0, 0⟩
• Faulty circuit: input stimuli ⟨r, s⟩ = ⟨0, 1⟩; invalid output ⟨y, z⟩ = ⟨0, 0⟩
• The model:
– Hard clauses: input and output values
– Soft clauses: CNF representation of the circuit
• The problem:
– Maximize the number of satisfied clauses (i.e. circuit gates)
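To make the model concrete, a minimal sketch in Python using the PySAT library's RC2 MaxSAT engine (an assumption; the original work does not prescribe a particular solver). The gate connectivity is also an assumption read off the figure: y = AND(r, s) and z = AND(y, s) in the correct design, with the second gate mis-implemented as OR. Hard clauses fix the stimuli and the expected response; soft clauses encode the faulty circuit gate by gate, and the falsified soft clauses localize the fault.

```python
# A sketch, not the authors' tool: uses PySAT's RC2 MaxSAT engine.
from pysat.formula import WCNF
from pysat.examples.rc2 import RC2

r, s, y, z = 1, 2, 3, 4          # variable IDs

wcnf = WCNF()
# Hard clauses: input stimuli <r,s> = <0,1> and expected output <y,z> = <0,0>.
for lit in (-r, s, -y, -z):
    wcnf.append([lit])

# Soft clauses: Tseitin encoding of the *faulty* circuit
# (y = AND(r,s), z = OR(y,s)), one group of clauses per gate, weight 1 each.
gate_clauses = {
    'G1 (AND)': [[-y, r], [-y, s], [y, -r, -s]],      # y <-> r AND s
    'G2 (OR)':  [[-z, y, s], [z, -y], [z, -s]],       # z <-> y OR s (the buggy gate)
}
for gate, clauses in gate_clauses.items():
    for cl in clauses:
        wcnf.append(cl, weight=1)

with RC2(wcnf) as rc2:
    model = rc2.compute()
    # rc2.cost = number of gate clauses that had to be relaxed;
    # the falsified soft clauses point at gate G2.
    print('optimum cost:', rc2.cost)
```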
Software Package Upgrades with MaxSAT [MBCV’06,TSJL’07,AL’08,ALMS’09,ALBL’10]
• Universe of software packages: {p_1, ..., p_n}
• Associate x_i with p_i: x_i = 1 iff p_i is installed
• Constraints associated with package p_i: (p_i, D_i, C_i)
– D_i: dependencies (required packages) for installing p_i
– C_i: conflicts (disallowed packages) for installing p_i
• Example problem: Maximum Installability
– Maximum number of packages that can be installed
– Package constraints represent hard clauses
– Soft clauses: (x_i)
• Package constraints:
(p_1, {p_2 ∨ p_3}, {p_4})
(p_2, {p_3}, {p_4})
(p_3, {p_2}, ∅)
(p_4, {p_2, p_3}, ∅)
• MaxSAT formulation:
ϕ_H = { (¬x_1 ∨ x_2 ∨ x_3), (¬x_1 ∨ ¬x_4), (¬x_2 ∨ x_3), (¬x_2 ∨ ¬x_4), (¬x_3 ∨ x_2), (¬x_4 ∨ x_2), (¬x_4 ∨ x_3) }
ϕ_S = { (x_1), (x_2), (x_3), (x_4) }
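The formulation above can be run directly with an off-the-shelf MaxSAT solver; a short sketch, again assuming PySAT's RC2 is available:

```python
# A sketch of the Maximum Installability example, assuming the PySAT library.
from pysat.formula import WCNF
from pysat.examples.rc2 import RC2

x1, x2, x3, x4 = 1, 2, 3, 4       # x_i = 1 iff package p_i is installed

wcnf = WCNF()
# Hard clauses: dependencies and conflicts.
hard = [[-x1, x2, x3], [-x1, -x4],   # p1 needs p2 or p3, conflicts with p4
        [-x2, x3], [-x2, -x4],       # p2 needs p3, conflicts with p4
        [-x3, x2],                   # p3 needs p2
        [-x4, x2], [-x4, x3]]        # p4 needs p2 and p3
for cl in hard:
    wcnf.append(cl)
# Soft clauses: try to install every package.
for xi in (x1, x2, x3, x4):
    wcnf.append([xi], weight=1)

with RC2(wcnf) as rc2:
    model = rc2.compute()
    print('installed:', [v for v in model if v > 0],
          'uninstalled packages:', rc2.cost)   # optimum: install p1,p2,p3; cost 1
```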
Key Engine for MUS Enumeration
• MUS: irreducible unsatisfiable set of clauses
– MCS: irreducible set of clauses whose complement is satisfiable
– MSS: subset-maximal satisfiable set of clauses
• Enumeration of MUSes has many applications:
– Model checking with CEGAR, type inference & checking, etc. [ALS’08,BSW’03]
• How to enumerate MUSes? [E.g. LS’08]
– Use hitting set duality between MUSes and MCSes [E.g. R’87,BL’03]
◮ An MUS is an irreducible hitting set of the formula’s MCSes
◮ An MCS is an irreducible hitting set of the formula’s MUSes
– Can enumerate MCSes and then use them to compute MUSes
– Use MaxSAT enumeration for computing all MSSes
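A brute-force sketch of this duality for small formulas, assuming PySAT only for the satisfiability checks: it enumerates the MSSes of a set of soft clauses, takes their complements as the MCSes, and recovers the MUSes as the minimal hitting sets of the MCSes. Practical enumerators (e.g. CAMUS-style tools) replace the exhaustive subset loops with MaxSAT calls, but the underlying duality is the same. Sets of clause indices (positions in `soft`) are returned.

```python
from itertools import combinations
from pysat.solvers import Solver

def satisfiable(clauses):
    with Solver(bootstrap_with=list(clauses)) as s:
        return s.solve()

def enumerate_mcses_and_muses(soft):
    n = len(soft)
    idx = set(range(n))
    subsets = [set(c) for k in range(n + 1) for c in combinations(range(n), k)]
    sat_subsets = [s for s in subsets if satisfiable(soft[i] for i in s)]
    # MSS: satisfiable subset not strictly contained in another satisfiable subset.
    msses = [s for s in sat_subsets if not any(s < t for t in sat_subsets)]
    mcses = [idx - s for s in msses]               # MCS = complement of an MSS
    # By hitting set duality, the MUSes are the minimal hitting sets of the MCSes.
    hitting = [h for h in subsets if all(h & m for m in mcses)]
    muses = [h for h in hitting if not any(g < h for g in hitting)]
    return mcses, muses

# Tiny example: soft clauses (x1), (not x1), (x2), (not x2)
print(enumerate_mcses_and_muses([[1], [-1], [2], [-2]]))
```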
Many Other Applications
• Error localization in C code [JM’11]
• Haplotyping with pedigrees [GLMSO’10]
• Course timetabling [AN’10]
• Combinatorial auctions [HLGS’08]
• Minimizing Disclosure of Private Information in Credential-Based Interactions [AVFPS’10]
• Reasoning over Biological Networks [GL’12]
• Binate/unate covering
– Haplotype inference [GMSLO’11]
– Digital filter design [ACFM’08]
– FSM synthesis [e.g. HS’96]
– Logic minimization [e.g. HS’96]
– ...
• ...
Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques • Practical Algorithms • Results, Conclusions & Research Directions
Main Techniques
• Unit propagation
– For computing lower bounds in B&B MaxSAT
• Stochastic Local Search
– For computing upper bounds (e.g. for B&B MaxSAT)
• Unsatisfiable subformulas (or cores)
– Used in core-guided MaxSAT algorithms
• CNF encodings
– Cardinality constraints
– PB constraints
Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques (Cardinality Constraints, Pseudo-Boolean Constraints) • Practical Algorithms • Results, Conclusions & Research Directions
Cardinality Constraints
• How to handle cardinality constraints, ∑_{j=1}^{n} x_j ≤ k?
– How to handle AtMost1 constraints, ∑_{j=1}^{n} x_j ≤ 1?
– General form: ∑_{j=1}^{n} x_j ⋈ k, with ⋈ ∈ {<, ≤, =, ≥, >}
• Solution #1:
– Use a PB solver
– Difficult to keep up with advances in SAT technology
– For SAT/UNSAT, the best solvers already encode to CNF
◮ E.g. Minisat+, but also QMaxSat, MSUnCore, (W)PM2
• Solution #2:
– Encode cardinality constraints to CNF
– Use a SAT solver
Equals1, AtLeast1 & AtMost1 Constraints
• ∑_{j=1}^{n} x_j = 1: encode with (∑_{j=1}^{n} x_j ≤ 1) ∧ (∑_{j=1}^{n} x_j ≥ 1)
• ∑_{j=1}^{n} x_j ≥ 1: encode with (x_1 ∨ x_2 ∨ ... ∨ x_n)
• ∑_{j=1}^{n} x_j ≤ 1: encode with
– Pairwise encoding
◮ Clauses: O(n²); no auxiliary variables
– Sequential counter [S’05]
◮ Clauses: O(n); auxiliary variables: O(n)
– Bitwise encoding [P’07,FP’01]
◮ Clauses: O(n log n); auxiliary variables: O(log n)
– ...
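For concreteness, a clause generator for the pairwise AtMost1 encoding (a sketch; variables are positive integers, DIMACS style):

```python
from itertools import combinations

def atmost1_pairwise(xs):
    """Clauses for x_1 + ... + x_n <= 1: one binary clause per pair."""
    return [[-a, -b] for a, b in combinations(xs, 2)]

print(atmost1_pairwise([1, 2, 3]))   # [[-1, -2], [-1, -3], [-2, -3]]
```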
Bitwise Encoding
• Encode ∑_{j=1}^{n} x_j ≤ 1 with the bitwise encoding:
– Auxiliary variables v_0, ..., v_{r−1}, with r = ⌈log n⌉ (and n > 1)
– If x_j = 1, then v_{r−1} ... v_0 = b_{r−1} ... b_0, the binary encoding of j − 1:
x_j → ((v_0 = b_0) ∧ ... ∧ (v_{r−1} = b_{r−1})) ⇔ (¬x_j ∨ ((v_0 = b_0) ∧ ... ∧ (v_{r−1} = b_{r−1})))
– Clauses (¬x_j ∨ (v_i ↔ b_i)) = (¬x_j ∨ l_i), i = 0, ..., r − 1, where
◮ l_i ≡ v_i, if b_i = 1
◮ l_i ≡ ¬v_i, otherwise
– If x_j = 1, the assignment to the v_i variables must encode j − 1
◮ All other x variables must take value 0
– If all x_j = 0, any assignment to the v_i variables is consistent
– O(n log n) clauses; O(log n) auxiliary variables
• Example: x_1 + x_2 + x_3 ≤ 1
       j − 1   v_1 v_0   clauses
x_1:   0       0   0     (¬x_1 ∨ ¬v_1) ∧ (¬x_1 ∨ ¬v_0)
x_2:   1       0   1     (¬x_2 ∨ ¬v_1) ∧ (¬x_2 ∨ v_0)
x_3:   2       1   0     (¬x_3 ∨ v_1) ∧ (¬x_3 ∨ ¬v_0)
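A sketch of a generator for the bitwise AtMost1 encoding shown above (the convention that auxiliary variables start right after `top` is an assumption of this sketch):

```python
from math import ceil, log2

def atmost1_bitwise(xs, top):
    """Clauses for x_1 + ... + x_n <= 1 via the bitwise encoding.
    xs  : list of n > 1 variable IDs
    top : largest variable ID already in use (aux vars start at top + 1)
    """
    r = ceil(log2(len(xs)))
    v = [top + 1 + i for i in range(r)]          # auxiliary variables v_0..v_{r-1}
    clauses = []
    for j, x in enumerate(xs):                   # j plays the role of j-1 on the slide
        for i in range(r):
            bit = (j >> i) & 1
            lit = v[i] if bit else -v[i]         # l_i = v_i if b_i = 1, else -v_i
            clauses.append([-x, lit])
    return clauses, v

cls, aux = atmost1_bitwise([1, 2, 3], top=3)
print(cls)   # [[-1,-4],[-1,-5],[-2,4],[-2,-5],[-3,-4],[-3,5]]
```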
General Cardinality Constraints
• General form: ∑_{j=1}^{n} x_j ≤ k (or ∑_{j=1}^{n} x_j ≥ k)
– Sequential counters [S’05]
◮ Clauses/Variables: O(n k)
– BDDs [ES’06]
◮ Clauses/Variables: O(n k)
– Sorting networks [ES’06]
◮ Clauses/Variables: O(n log² n)
– Cardinality Networks [ANORC’09,ANORC’11a]
◮ Clauses/Variables: O(n log² k)
– Pairwise Cardinality Networks [CZI’10]
– ...
Sequential Counter
• Encode ∑_{j=1}^{n} x_j ≤ k with a sequential counter:
[Figure: chain of counter blocks; block i receives x_i and the unary count s_{i−1,1}, ..., s_{i−1,k}, and produces s_{i,1}, ..., s_{i,k} and an overflow output v_i]
• Equations for each block, 1 < i < n, 1 < j < k:
– s_i = ∑_{j=1}^{i} x_j, with s_i represented in unary as s_{i,1}, ..., s_{i,k}
– s_{i,1} = s_{i−1,1} ∨ x_i
– s_{i,j} = s_{i−1,j} ∨ (s_{i−1,j−1} ∧ x_i)
– v_i = (s_{i−1,k} ∧ x_i) = 0
Sequential Counter
• CNF formula for ∑_{j=1}^{n} x_j ≤ k:
– Assume: k > 0 ∧ n > 1
– Indices: 1 < i < n, 1 < j ≤ k
(¬x_1 ∨ s_{1,1})
(¬s_{1,j})
(¬x_i ∨ s_{i,1})
(¬s_{i−1,1} ∨ s_{i,1})
(¬x_i ∨ ¬s_{i−1,j−1} ∨ s_{i,j})
(¬s_{i−1,j} ∨ s_{i,j})
(¬x_i ∨ ¬s_{i−1,k})
(¬x_n ∨ ¬s_{n−1,k})
• O(n k) clauses & variables
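A sketch of a generator for exactly these sequential-counter clauses (the mapping of the s_{i,j} variables to IDs above `top` is an assumed convention):

```python
def atmostk_seqcounter(xs, k, top):
    """Sinz-style sequential counter clauses for x_1 + ... + x_n <= k (n > 1, k > 0)."""
    n = len(xs)
    s = lambda i, j: top + (i - 1) * k + j       # s_{i,j}, i in 1..n-1, j in 1..k
    x = lambda i: xs[i - 1]
    clauses = [[-x(1), s(1, 1)]]
    clauses += [[-s(1, j)] for j in range(2, k + 1)]
    for i in range(2, n):
        clauses += [[-x(i), s(i, 1)], [-s(i - 1, 1), s(i, 1)]]
        for j in range(2, k + 1):
            clauses += [[-x(i), -s(i - 1, j - 1), s(i, j)],
                        [-s(i - 1, j), s(i, j)]]
        clauses += [[-x(i), -s(i - 1, k)]]
    clauses += [[-x(n), -s(n - 1, k)]]
    return clauses

# x1 + x2 + x3 <= 1 over variables 1..3; aux variables start at 4
print(atmostk_seqcounter([1, 2, 3], k=1, top=3))
```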
Sorting Networks I
• Encode ∑_{j=1}^{n} x_j ≤ k with a sorting network:
– Unary representation
– Use odd-even merging networks [B’68,ES’06,ANORC’09]
– Recursive definition of merging networks:
◮ Base case: Merge(⟨a_1⟩, ⟨b_1⟩) ≜ (⟨c_1, c_2⟩, {c_2 = min(a_1, b_1), c_1 = max(a_1, b_1)})
◮ Let:
Merge(⟨a_1, a_3, ..., a_{n−1}⟩, ⟨b_1, b_3, ..., b_{n−1}⟩) ≜ (⟨d_1, ..., d_n⟩, S_odd)
Merge(⟨a_2, a_4, ..., a_n⟩, ⟨b_2, b_4, ..., b_n⟩) ≜ (⟨e_1, ..., e_n⟩, S_even)
◮ Then:
Merge(⟨a_1, a_2, ..., a_n⟩, ⟨b_1, b_2, ..., b_n⟩) ≜ (⟨d_1, c_2, ..., c_{2n−1}, e_n⟩, S_odd ∪ S_even ∪ S_mrg)
◮ Where:
S_mrg = ∪_{i=1}^{n−1} {c_{2i+1} = min(d_{i+1}, e_i), c_{2i} = max(d_{i+1}, e_i)}
– Note: min ≡ AND and max ≡ OR
Sorting Networks II
• Recursive definition of sorting networks:
– Base case (2n = 2): Sort(a_1, b_1) ≜ Merge(a_1, b_1)
– Inductive step (2n > 2):
◮ Let:
Sort(⟨a_1, ..., a_n⟩) ≜ (⟨d_1, ..., d_n⟩, S_D)
Sort(⟨a_{n+1}, ..., a_{2n}⟩) ≜ (⟨d′_1, ..., d′_n⟩, S′_D)
Merge(⟨d_1, ..., d_n⟩, ⟨d′_1, ..., d′_n⟩) ≜ (⟨c_1, ..., c_{2n}⟩, S_M)
◮ Then:
Sort(⟨a_1, ..., a_{2n}⟩) ≜ (⟨c_1, ..., c_{2n}⟩, S_D ∪ S′_D ∪ S_M)
– Let ⟨z_1, ..., z_n⟩ be the sorted output. The output constraint is: z_i = 0, for i > k
Sorting Networks III
• Sort ⟨a_1, a_2, a_3, a_4⟩:
[Figure: 4-input odd-even sorting network built from Merge blocks, producing sorted outputs c_1, ..., c_4]
where each Merge block contains one min (AND) and one max (OR) operator
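A sketch of the recursive odd-even construction from the last three slides, assuming the number of inputs is a power of two and emitting full equivalences for each min/max (AND/OR) pair; encodings tuned for ≤ k constraints keep only one direction of the implications.

```python
from itertools import count

class SortingNetwork:
    def __init__(self, top):
        self._fresh = count(top + 1)     # fresh variable IDs above `top`
        self.clauses = []

    def _comparator(self, a, b):
        """hi = max(a,b) = a OR b, lo = min(a,b) = a AND b (full equivalences)."""
        hi, lo = next(self._fresh), next(self._fresh)
        self.clauses += [[-a, hi], [-b, hi], [a, b, -hi]]   # hi <-> a v b
        self.clauses += [[-lo, a], [-lo, b], [-a, -b, lo]]  # lo <-> a ^ b
        return hi, lo

    def merge(self, A, B):
        if len(A) == 1:
            hi, lo = self._comparator(A[0], B[0])
            return [hi, lo]
        D = self.merge(A[0::2], B[0::2])   # merge a_1,a_3,... with b_1,b_3,...
        E = self.merge(A[1::2], B[1::2])   # merge a_2,a_4,... with b_2,b_4,...
        out = [D[0]]
        for d, e in zip(D[1:], E[:-1]):    # c_{2i} = max(d_{i+1}, e_i), c_{2i+1} = min(...)
            hi, lo = self._comparator(d, e)
            out += [hi, lo]
        return out + [E[-1]]

    def sort(self, A):
        if len(A) == 1:
            return A
        half = len(A) // 2
        return self.merge(self.sort(A[:half]), self.sort(A[half:]))

def atmostk_sortnet(xs, k, top):
    net = SortingNetwork(top)
    z = net.sort(xs)                          # z_1 >= z_2 >= ... (unary count)
    net.clauses += [[-zi] for zi in z[k:]]    # output constraint: z_i = 0 for i > k
    return net.clauses

print(len(atmostk_sortnet([1, 2, 3, 4], k=2, top=4)))
```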
Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques (Cardinality Constraints, Pseudo-Boolean Constraints) • Practical Algorithms • Results, Conclusions & Research Directions
Pseudo-Boolean Constraints
• General form: ∑_{j=1}^{n} a_j x_j ≤ b
– Operational encoding [W’98]
◮ Clauses/Variables: O(n)
◮ Does not guarantee arc-consistency
– BDDs [ES’06]
◮ Worst-case exponential number of clauses
– Polynomial watchdog encoding [BBR’09]
◮ Let ν(n) = log(n) · log(a_max)
◮ Clauses: O(n³ ν(n)); aux variables: O(n² ν(n))
– Improved polynomial watchdog encoding [ANORC’11b]
◮ Clauses & aux variables: O(n³ log(a_max))
– ...
Encoding PB Constraints with BDDs I
• Encode 3x_1 + 3x_2 + x_3 ≤ 3
• Construct the BDD
– E.g. analyze variables by decreasing coefficients
• Extract an ITE-based circuit from the BDD
[Figure: BDD for 3x_1 + 3x_2 + x_3 ≤ 3 (decisions on x_1, x_2, x_3, terminals 0/1) and the corresponding circuit of ITE gates, one per BDD node]
Encoding PB Constraints with BDDs II
• Encode 3x_1 + 3x_2 + x_3 ≤ 3
• Extract an ITE-based circuit from the BDD
• Simplify and create the final circuit:
[Figure: simplified circuit: a single ITE gate with selector x_1, choosing between NOR(x_2, x_3) (for x_1 = 1) and NAND(x_2, x_3) (for x_1 = 0)]
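A sketch of this BDD-based encoding via a memoized recursion on (variable index, remaining slack), with a Tseitin clause set per ITE node; real encoders such as Minisat+ apply further simplifications, but the construction is the same in spirit.

```python
from functools import lru_cache
from itertools import count

def pb_leq_to_cnf(coeffs, bound, top):
    """CNF for a_1*x_1 + ... + a_n*x_n <= bound via a BDD/ITE circuit.
    Variables are assumed to be 1..n, ideally ordered by decreasing coefficient;
    the root of the circuit is asserted with a final unit clause."""
    n = len(coeffs)
    fresh = count(top + 1)
    TRUE = next(fresh)                 # constant-true literal
    clauses = [[TRUE]]

    @lru_cache(maxsize=None)
    def node(i, slack):
        if slack < 0:
            return -TRUE               # constraint already violated
        if i == n or sum(coeffs[i:]) <= slack:
            return TRUE                # remaining variables cannot violate it
        hi = node(i + 1, slack - coeffs[i])   # branch x_{i+1} = 1
        lo = node(i + 1, slack)               # branch x_{i+1} = 0
        if hi == lo:
            return hi
        t, x = next(fresh), i + 1      # Tseitin variable for ITE(x_{i+1}, hi, lo)
        clauses.extend([[-t, -x, hi], [-t, x, lo], [t, -x, -hi], [t, x, -lo]])
        return t

    clauses.append([node(0, bound)])
    return clauses

print(pb_leq_to_cnf([3, 3, 1], bound=3, top=3))
```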
More on PB Constraints
• How about ∑_{j=1}^{n} a_j x_j = k?
– Can use (∑_{j=1}^{n} a_j x_j ≥ k) ∧ (∑_{j=1}^{n} a_j x_j ≤ k), but...
◮ ∑_{j=1}^{n} a_j x_j = k is a subset-sum constraint (special case of a knapsack constraint)
◮ Cannot find all consequences in polynomial time [S’03,FS’02,T’03]
• Example: 4x_1 + 3x_2 + 2x_3 = 5
– Replace by (4x_1 + 3x_2 + 2x_3 ≥ 5) ∧ (4x_1 + 3x_2 + 2x_3 ≤ 5)
– Let x_2 = 0
– Either constraint can still be satisfied, but not both
– (With x_2 = 0, 4x_1 + 2x_3 takes only the values 0, 2, 4, 6, so the equality forces x_2 = 1; propagating each inequality separately misses this consequence)
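A tiny brute-force check of this example (plain Python, no solver needed): with x_2 fixed to 0, each inequality alone still has a satisfying assignment, but their conjunction does not, so propagating the two encodings separately cannot derive the consequence x_2 = 1.

```python
from itertools import product

geq = lambda x1, x3: 4*x1 + 3*0 + 2*x3 >= 5      # x2 fixed to 0
leq = lambda x1, x3: 4*x1 + 3*0 + 2*x3 <= 5

assigns = list(product((0, 1), repeat=2))
print(any(geq(*a) for a in assigns),              # True:  ">= 5" alone still satisfiable
      any(leq(*a) for a in assigns),              # True:  "<= 5" alone still satisfiable
      any(geq(*a) and leq(*a) for a in assigns))  # False: the equality forces x2 = 1
```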
Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques • Practical Algorithms • Results, Conclusions & Research Directions
Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques • Practical Algorithms (Notation, B&B Search for MaxSAT & PBO, Iterative SAT Solving, Core-Guided Algorithms) • Results, Conclusions & Research Directions
Definitions
• Cost of an assignment:
– Sum of the weights of the unsatisfied clauses
• Optimum solution (OPT):
– Assignment with minimum cost
• Upper bound (UB):
– Cost of some assignment, hence not less than OPT
– E.g. ∑_{c_i ∈ ϕ} w_i + 1; the hard clauses may be inconsistent
• Lower bound (LB):
– A value such that no assignment has cost no larger than LB
– E.g. −1; it may be possible to satisfy all soft clauses
[Number line: LB < OPT ≤ UB]
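For reference, the cost function used by the algorithms that follow, as a small sketch (clauses are lists of DIMACS-style literals; `model` is the set of literals assigned true):

```python
def cost(model, soft):
    """Sum of weights of soft clauses falsified by `model`.
    soft  : list of (clause, weight) pairs
    model : set of literals assumed true, e.g. {1, -2, 3}
    """
    return sum(w for clause, w in soft
               if not any(lit in model for lit in clause))

print(cost({1, -2, 3}, [([2], 1), ([-1, 3], 2), ([-3], 4)]))   # 1 + 0 + 4 = 5
```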
Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques • Practical Algorithms (Notation, B&B Search for MaxSAT & PBO, Iterative SAT Solving, Core-Guided Algorithms) • Results, Conclusions & Research Directions
Branch-and-Bound Search for MaxSAT
• Unit propagation is unsound for MaxSAT [e.g. BLM’07]
{{x_1}, {¬x_1, ¬x_2}, {¬x_1, ¬x_3}, {x_2}, {x_3}}
– Here unit propagation forces x_1 = x_2 = x_3 = 1, which falsifies two clauses, whereas the optimum (x_1 = 0, x_2 = x_3 = 1) falsifies only one
• Standard B&B search [LMP’07,HLO’08,LHG’08]
– No unit propagation
◮ No conflict-driven clause learning
• Refine UBs on the number of empty clauses
• Estimate LBs
– Unit propagation provides LBs
– Bound the search when LB ≥ UB
• Inference rules to prune the search [HL’06,LMP’07]
• Optionally: use stochastic local search to identify UBs [HLO’08]
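A minimal branch-and-bound sketch for unweighted MaxSAT (plain Python, not any particular solver). The lower bound here is simply the number of clauses already falsified by the partial assignment; actual solvers strengthen it with unit-propagation-based bounds and the inference rules mentioned above.

```python
def maxsat_bnb(clauses, variables):
    best = [len(clauses) + 1]                       # UB_0: everything falsified, plus one

    def falsified(assign):
        # Clauses all of whose literals are already false under the partial assignment.
        return sum(all(-lit in assign for lit in c) for c in clauses)

    def search(assign, remaining):
        lb = falsified(assign)
        if lb >= best[0]:                           # bound: LB >= UB
            return
        if not remaining:
            best[0] = lb                            # full assignment: new UB
            return
        v, rest = remaining[0], remaining[1:]
        search(assign | {v}, rest)                  # branch v = 1
        search(assign | {-v}, rest)                 # branch v = 0

    search(frozenset(), list(variables))
    return best[0]                                  # optimum number of falsified clauses

clauses = [[1], [-1, -2], [-1, -3], [2], [3]]
print(maxsat_bnb(clauses, [1, 2, 3]))               # 1
```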
Branch-and-Bound Search for PBO

minimize ∑_{j ∈ N} w_j · x_j
subject to ∑_{j ∈ N} a_{ij} · l_j ≥ b_i,   l_j ∈ {x_j, x̄_j}, x_j ∈ {0, 1}, a_{ij}, b_i, w_j ∈ ℕ₀⁺

• Standard B&B search [MMS’02,MMS’04,MMS’06,SS’06,NO’06]
• Refine UBs on the value of the cost function
– Any model of the constraints refines the UB
• Estimate LBs
– Standard techniques: LP relaxations, MIS, etc.
– Bound the search when LB ≥ UB
• Native handling of PB constraints (optional)
• Integrate SAT techniques
– Unit propagation, clause learning, restarts, VSIDS, etc.
Outline: Boolean-Based Optimization • Example Applications • Fundamental Techniques • Practical Algorithms (Notation, B&B Search for MaxSAT & PBO, Iterative SAT Solving, Core-Guided Algorithms) • Results, Conclusions & Research Directions
Iterative SAT Solving
[Number line: LB < OPT ≤ UB]
• Iteratively refine the upper bound (UB) until UB = OPT
– Linear search SAT-UNSAT (or strengthening)
• Iteratively refine the lower bound (LB) until LB = OPT
– Linear search UNSAT-SAT
• Iteratively refine lower & upper bounds until LB_k = UB_k − 1
– Linear search by refining LB & UB
– Binary search on the cost of unsatisfied clauses
• By default (not for core-guided approaches!):
– All soft clauses relaxed: replace c_i with c_i ∪ {r_i}
– Cardinality/PB constraints represent the LBs & UBs
Iterative SAT Solving – Refine UB
[Number line: LB < UB_k = OPT ≤ UB_0]
• Require ∑ w_i r_i ≤ UB_0 − 1
• While SAT, refine the UB
– The new UB is given by the cost of the unsatisfied clauses, i.e. ∑ w_i r_i
• Repeat until the constraint ∑ w_i r_i ≤ UB_k − 1 becomes UNSAT
– UB_k denotes the optimum value
• Worst-case number of iterations is exponential in the instance size
• Example tools:
– Minisat+: CNF encoding of constraints [ES’06]
– SAT4J: native handling of constraints [LBP’10]
– QMaxSat: CNF encoding of constraints
– ...
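A sketch of this linear search SAT-UNSAT loop for unweighted MaxSAT, assuming PySAT for the SAT oracle and its CardEnc module for the AtMost-k constraint: every soft clause c_i gets a relaxation variable r_i, and the bound on ∑ r_i is tightened until the formula becomes unsatisfiable.

```python
from pysat.card import CardEnc, EncType
from pysat.formula import IDPool
from pysat.solvers import Solver

def maxsat_linear_sat_unsat(hard, soft):
    """Unweighted MaxSAT by iteratively tightening the UB (linear search SAT-UNSAT)."""
    pool = IDPool(start_from=max(abs(l) for c in hard + soft for l in c) + 1)
    relax = [pool.id(('r', i)) for i in range(len(soft))]
    clauses = list(hard) + [c + [r] for c, r in zip(soft, relax)]   # c_i U {r_i}
    ub, atmost = len(soft) + 1, []      # UB_0 = sum of weights + 1 (all weights are 1)
    while True:
        with Solver(bootstrap_with=clauses + atmost) as s:
            if not s.solve():
                return ub               # last UB is the optimum (UB_0 if hard part is UNSAT)
            model = set(s.get_model())
        # New UB = cost of the unsatisfied soft clauses under this model.
        ub = sum(1 for c in soft if not any(lit in model for lit in c))
        if ub == 0:
            return 0                    # all soft clauses satisfied
        # Require sum(r_i) <= UB - 1 in the next iteration.
        atmost = CardEnc.atmost(lits=relax, bound=ub - 1,
                                vpool=pool, encoding=EncType.seqcounter).clauses

# The package-upgrade example from earlier: optimum cost is 1 (p4 stays uninstalled).
hard = [[-1, 2, 3], [-1, -4], [-2, 3], [-2, -4], [-3, 2], [-4, 2], [-4, 3]]
soft = [[1], [2], [3], [4]]
print(maxsat_linear_sat_unsat(hard, soft))
```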
Iterative SAT Solving – Refine LB
[Number line: LB_0 ≤ LB_k < OPT ≤ UB]
• Require ∑ w_i r_i ≤ LB_0 + 1
• While UNSAT, refine the LB, i.e. LB_k ← LB_{k−1} + 1