The DPLL Algorithm

F = (x ∨ y) ∧ (a ∨ b) ∧ (a ∨ ¬b) ∧ (¬a ∨ ¬b)

Level  Dec.  Unit Prop.
0      ∅
1      ¬x    y
2      ¬a    b, ¬b, ⊥

Search loop (flowchart):
• Unassigned variables? If no: report Satisfiable
• Assign a value to a variable (decision)
• Unit propagation
• Conflict? If no: pick the next variable
• On conflict: can a decision be undone? If no: report Unsatisfiable
• Otherwise: backtrack & flip the variable
• Optional: pure literal rule
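The loop above can be sketched as a short recursive procedure. This is an illustrative Python sketch, not the tutorial's code; it encodes a clause as a list of signed integers (−v is the negation of variable v), a common CNF convention.

```python
def unit_propagate(clauses, assignment):
    """Repeatedly assign literals forced by unit clauses.
    Returns False on conflict (a falsified clause), True otherwise."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            unassigned = []
            satisfied = False
            for lit in clause:
                val = assignment.get(abs(lit))
                if val is None:
                    unassigned.append(lit)
                elif (lit > 0) == val:
                    satisfied = True
                    break
            if satisfied:
                continue
            if not unassigned:
                return False          # conflict: every literal is false
            if len(unassigned) == 1:  # unit clause: forced assignment
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return True

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    if not unit_propagate(clauses, assignment):
        return None                   # conflict under current decisions
    free = {abs(l) for c in clauses for l in c} - set(assignment)
    if not free:
        return assignment             # all variables assigned: satisfiable
    v = min(free)                     # naive branching heuristic
    for value in (True, False):       # decide, and flip on conflict
        result = dpll(clauses, {**assignment, v: value})
        if result is not None:
            return result
    return None                       # both branches failed: backtrack

# The slide's example, with x=1, y=2, a=3, b=4:
F = [[1, 2], [3, 4], [3, -4], [-3, -4]]
print(dpll(F) is not None)  # True: F is satisfiable
```

Recursion stands in for the explicit backtrack-and-flip step of the flowchart; the pure literal rule is omitted.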
Outline Basic Definitions DPLL Solvers CDCL Solvers What Next in CDCL Solvers? CNF Encodings
What is a CDCL SAT Solver?
• Extend DPLL SAT solver with: [DP60,DLL62]
  – Clause learning & non-chronological backtracking [MSS96,BS97,Z97]
    ◮ Exploit UIPs [MSS96,SSS12]
    ◮ Minimize learned clauses [SB09,VG09]
    ◮ Opportunistically delete clauses [MSS96,MSS99,GN02]
  – Search restarts [GSK98,BMS00,H07,B08]
  – Lazy data structures
    ◮ Watched literals [MMZZM01]
  – Conflict-guided branching
    ◮ Lightweight branching heuristics [MMZZM01]
    ◮ Phase saving [PD07]
  – ...
How Significant are CDCL SAT Solvers?

[Figure: results of the SAT competition/race winners on the SAT 2009 application benchmarks, 20 minute timeout; CPU time (in seconds) vs. number of problems solved. Solvers compared: Limmat (2002), Zchaff (2002), Berkmin (2002), Forklift (2003), Siege (2003), Zchaff (2004), SatELite (2005), Minisat 2 (2006), Picosat (2007), Rsat (2007), Minisat 2.1 (2008), Precosat (2009), Glucose (2009), Clasp (2009), Cryptominisat (2010), Lingeling (2010), Minisat 2.2 (2010), Glucose 2 (2011), Glueminisat (2011), Contrasat (2011), Lingeling 587f (2011); GRASP and DPLL shown as baselines.]
Outline Basic Definitions DPLL Solvers CDCL Solvers Clause Learning, UIPs & Minimization Search Restarts & Lazy Data Structures What Next in CDCL Solvers? CNF Encodings
Clause Learning

Level  Dec.  Unit Prop.
0      ∅
1      x
2      y
3      z     a, b, ⊥

Clauses: (¬x ∨ ¬z ∨ a), (¬z ∨ b), (¬a ∨ ¬b)
Resolution: (¬a ∨ ¬b) with (¬z ∨ b) yields (¬a ∨ ¬z);
            (¬a ∨ ¬z) with (¬x ∨ ¬z ∨ a) yields (¬x ∨ ¬z)

• Analyze conflict
  – Reasons: x and z
    ◮ Decision variable & literals assigned at lower decision levels
  – Create new clause: (¬x ∨ ¬z)
• Can relate clause learning with resolution
  – Learned clauses result from (selected) resolution operations
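The resolution operations behind clause learning can be sketched directly. This is an illustrative Python sketch (clauses as frozensets of signed integers, −v for ¬v), replaying the slide's derivation with the variable numbering x=1, z=2, a=3, b=4 as an assumed encoding.

```python
def resolve(c1, c2, var):
    """Resolve two clauses on variable var: drop var/-var, union the rest."""
    assert (var in c1 and -var in c2) or (-var in c1 and var in c2)
    return frozenset((c1 | c2) - {var, -var})

conflict = frozenset({-3, -4})          # (¬a ∨ ¬b), the falsified clause
reason_b = frozenset({-2, 4})           # (¬z ∨ b), reason of b
reason_a = frozenset({-1, -2, 3})       # (¬x ∨ ¬z ∨ a), reason of a

step1 = resolve(conflict, reason_b, 4)  # → (¬a ∨ ¬z)
step2 = resolve(step1, reason_a, 3)     # → (¬x ∨ ¬z), the learned clause
print(sorted(step2))  # [-2, -1]
```

Each step eliminates one propagated literal by resolving with its reason clause, which is exactly the "selected resolution operations" the slide refers to.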
Clause Learning – After Backtracking

Before:                      After:
Level  Dec.  Unit Prop.      Level  Dec.  Unit Prop.
0      ∅                     0      ∅
1      x                     1      x     ¬z
2      y
3      z     a, b, ⊥

• Clause (¬x ∨ ¬z) is asserting at decision level 1
• Learned clauses are always asserting [MSS96,MSS99]
• Backtracking differs from plain DPLL:
  – Always backtrack after a conflict [MMZZM01]
Unique Implication Points (UIPs)

Level  Dec.  Unit Prop.
0      ∅
1      w
2      x
3      y
4      z     a, b, c, ⊥

Clauses: (¬y ∨ ¬z ∨ a), (¬x ∨ ¬a ∨ b), (¬w ∨ ¬a ∨ c), (¬b ∨ ¬c)
Resolution: (¬b ∨ ¬c) with (¬w ∨ ¬a ∨ c) yields (¬w ∨ ¬a ∨ ¬b);
            (¬w ∨ ¬a ∨ ¬b) with (¬x ∨ ¬a ∨ b) yields (¬w ∨ ¬x ∨ ¬a);
            (¬w ∨ ¬x ∨ ¬a) with (¬y ∨ ¬z ∨ a) yields (¬w ∨ ¬x ∨ ¬y ∨ ¬z)

• Learn clause (¬w ∨ ¬x ∨ ¬y ∨ ¬z)
• But a is a UIP
  – Dominator in the implication graph (DAG) for level 4
• Learn clause (¬w ∨ ¬x ∨ ¬a) instead
Multiple UIPs

Level  Dec.  Unit Prop.
0      ∅
1      w
2      x
3      y
4      z     r, a, c, s, b, ⊥

• First UIP:
  – Learn clause (¬w ∨ ¬x ∨ ¬a)
• But there can be more than one UIP
• Second UIP:
  – Learn clause (¬x ∨ ¬z ∨ a)
• In practice, smaller clauses are more effective
  – Compare with (¬w ∨ ¬x ∨ ¬y ∨ ¬z)
• Multiple UIPs proposed in GRASP [MSS96]
  – First UIP learning proposed in Chaff [MMZZM01]
• Not used in recent state-of-the-art CDCL SAT solvers
• Recent results show it can be beneficial on current instances [SSS12]
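First-UIP learning can be sketched as a loop: keep resolving the conflict clause with reasons of current-level literals until exactly one current-level literal (the first UIP) remains. This is an illustrative Python sketch, not the GRASP/Chaff implementation; `level`, `reasons`, and `trail` are assumed data structures, and the encoding assumes every assigned literal is positive (variable assigned True).

```python
def first_uip(conflict, level, reasons, trail):
    """conflict: set of lits; level: var -> decision level; reasons: var ->
    reason clause as a set of lits (None for decisions); trail: variables
    in assignment order.  Returns the first-UIP learned clause."""
    cur = max(level[abs(l)] for l in conflict)   # conflicting decision level
    clause = set(conflict)
    for var in reversed(trail):                  # walk the trail backwards
        at_cur = [l for l in clause if level[abs(l)] == cur]
        if len(at_cur) <= 1:
            break                                # only the UIP is left
        if -var not in clause or reasons[var] is None:
            continue
        # resolve on var with its reason clause
        clause = (clause | set(reasons[var])) - {var, -var}
    return frozenset(clause)

# The slide's example: w=1, x=2, y=3, z=4, a=5, b=6, c=7, all assigned True
level = {1: 1, 2: 2, 3: 3, 4: 4, 5: 4, 6: 4, 7: 4}
reasons = {1: None, 2: None, 3: None, 4: None,   # decisions
           5: {-3, -4, 5},                       # (¬y ∨ ¬z ∨ a)
           6: {-2, -5, 6},                       # (¬x ∨ ¬a ∨ b)
           7: {-1, -5, 7}}                       # (¬w ∨ ¬a ∨ c)
trail = [1, 2, 3, 4, 5, 6, 7]
learned = first_uip({-6, -7}, level, reasons, trail)  # conflict (¬b ∨ ¬c)
print(sorted(learned))  # [-5, -2, -1]  i.e. (¬w ∨ ¬x ∨ ¬a)
```

Stopping at the first UIP (a) gives the shorter clause; continuing the resolution past it would reproduce the longer (¬w ∨ ¬x ∨ ¬y ∨ ¬z).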
Clause Minimization I

Level  Dec.  Unit Prop.
0      ∅
1      x     b
2      y
3      z     a, c, ⊥

Clauses: (¬x ∨ b), (¬x ∨ ¬y ∨ ¬z ∨ a), (¬z ∨ ¬b ∨ c), (¬a ∨ ¬c)
Resolution: (¬a ∨ ¬c) with (¬z ∨ ¬b ∨ c) yields (¬z ∨ ¬b ∨ ¬a);
            (¬z ∨ ¬b ∨ ¬a) with (¬x ∨ ¬y ∨ ¬z ∨ a) yields (¬x ∨ ¬y ∨ ¬z ∨ ¬b)

• Learn clause (¬x ∨ ¬y ∨ ¬z ∨ ¬b)
• Apply self-subsuming resolution with (¬x ∨ b), i.e. local minimization [SB09]
• Learn clause (¬x ∨ ¬y ∨ ¬z)
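Self-subsuming resolution can be sketched as a single pass over the learned clause: a literal can be dropped when resolving with its variable's reason introduces no literal outside the clause. This is an illustrative Python sketch under the same signed-integer encoding assumed above (x=1, y=2, z=3, b=4), not [SB09]'s implementation.

```python
def local_minimize(clause, reasons):
    """clause: set of lits; reasons: var -> reason clause (set of lits).
    One pass of self-subsuming resolution (local minimization)."""
    out = set(clause)
    for lit in list(clause):
        reason = reasons.get(abs(lit))
        if reason is None:                  # decision literal: must keep
            continue
        rest = set(reason) - {-lit}         # resolvent contribution of reason
        if rest <= out - {lit}:             # nothing new: lit is redundant
            out.discard(lit)
    return out

reasons = {4: {-1, 4}}                       # reason of b: (¬x ∨ b)
learned = {-1, -2, -3, -4}                   # (¬x ∨ ¬y ∨ ¬z ∨ ¬b)
print(sorted(local_minimize(learned, reasons)))  # [-3, -2, -1]
```

Resolving (¬x ∨ ¬y ∨ ¬z ∨ ¬b) with (¬x ∨ b) on b yields (¬x ∨ ¬y ∨ ¬z), which subsumes the original clause, so ¬b is dropped.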
Clause Minimization II

Level  Dec.  Unit Prop.
0      ∅
1      w     a, b, c
2      x     d, e, ⊥

• Learn clause (¬w ∨ ¬x ∨ ¬c)
• Cannot apply self-subsuming resolution
  – Resolving with the reason of c yields (¬w ∨ ¬x ∨ ¬a ∨ ¬b)
• Can apply recursive minimization
  – Marked nodes: literals in the learned clause [SB09]
  – Trace back from c until marked nodes or new decision nodes are reached
  – Drop literal c if only marked nodes were visited
• Learn clause (¬w ∨ ¬x)
• Complexity of recursive minimization?
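The trace-back can be sketched as a recursive test: a literal is droppable if every path from it through the implication graph reaches only marked nodes (literals already in the clause), never an unmarked decision. This is an illustrative Python sketch, not [SB09]'s code; it again assumes propagated variables are assigned True, with the numbering w=1, x=2, a=3, b=4, c=5.

```python
def removable(q, clause, reasons, memo):
    """q = ¬v for an assigned variable v; True if q can be dropped."""
    if q in memo:
        return memo[q]
    memo[q] = False                    # default answer; filled in below
    reason = reasons.get(-q)           # reason clause of variable v = -q
    if reason is None:                 # v is a decision: q must stay
        return False
    # every antecedent must be marked (in the clause) or itself removable
    memo[q] = all(l in clause or removable(l, clause, reasons, memo)
                  for l in reason if l != -q)
    return memo[q]

def minimize(clause, reasons):
    memo = {}
    return {q for q in clause if not removable(q, clause, reasons, memo)}

reasons = {3: {-1, 3},                 # reason of a: (¬w ∨ a)
           4: {-3, 4},                 # reason of b: (¬a ∨ b)
           5: {-3, -4, 5}}             # reason of c: (¬a ∨ ¬b ∨ c)
learned = {-1, -2, -5}                 # (¬w ∨ ¬x ∨ ¬c)
print(sorted(minimize(learned, reasons)))  # [-2, -1]  i.e. (¬w ∨ ¬x)
```

Tracing back from c reaches a and b; both trace back to the marked decision w, so ¬c is dropped. With memoization each node is visited once, so the pass is linear in the visited part of the implication graph.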
Outline Basic Definitions DPLL Solvers CDCL Solvers Clause Learning, UIPs & Minimization Search Restarts & Lazy Data Structures What Next in CDCL Solvers? CNF Encodings
Search Restarts I
• Heavy-tail behavior: [GSK98]
  – 10000 runs, branching randomization, on an industrial instance
• Use rapid randomized restarts (search restarts)
Search Restarts II
• Restart the search after a given number of conflicts (the cutoff)
• Increase the cutoff after each restart
  – Guarantees completeness
  – Different policies exist (see refs)
• Works for SAT & UNSAT instances. Why?
• Learned clauses remain effective after restart(s)
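One widely used cutoff policy is the Luby sequence [H07], scaled by a base number of conflicts. A short Python sketch of the sequence (the base value 32 is an illustrative choice, not a value from the slides):

```python
def luby(i):
    """i-th element (i >= 1) of the Luby sequence:
    1, 1, 2, 1, 1, 2, 4, 1, 1, 2, 1, 1, 2, 4, 8, ..."""
    k = 1
    while (1 << k) - 1 < i:               # smallest k with i <= 2^k - 1
        k += 1
    if (1 << k) - 1 == i:
        return 1 << (k - 1)               # i ends a block: value 2^(k-1)
    return luby(i - (1 << (k - 1)) + 1)   # otherwise recurse into the tail

base = 32                                 # conflicts before the first restart
cutoffs = [base * luby(i) for i in range(1, 8)]
print(cutoffs)  # [32, 32, 64, 32, 32, 64, 128]
```

The cutoffs grow on average but return to small values infinitely often, which keeps restarts rapid while the unbounded subsequence guarantees completeness.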
Data Structures Basics
• Each literal l should access the clauses containing l
  – Why? Unit propagation
• A clause with k literals results in k references, from literals to the clause
• The number of clause references equals the number of literals, L
  – Clause learning can generate large clauses
    ◮ Worst-case size: O(n)
  – Worst-case number of literals: O(m n)
  – In practice, unit propagation slows down worse than linearly as clauses are learned!
• For clause learning to be effective, a more efficient representation is required: Watched Literals
  – Watched literals are one example of lazy data structures
    ◮ But there are others
Watched Literals [MMZZM01]
• Important states of a clause
• Associate 2 references (the watched literals) with each clause
• Deciding whether a clause is unit requires traversing all its literals
• References are unchanged when backtracking
Watched Literals, in Practice [ES03,G13]
• In practice, the first two positions of the clause are watched
  – When a watched literal is assigned false, swap in a non-false literal
• May require traversing already-assigned literals, multiple times
• Worst-case time of unit propagation is quadratic in the clause size, and hence in the number of literals
• In practice, no gains observed from considering alternative implementations (see previous slide)
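The swap-based scheme above can be sketched as follows. This is an illustrative Python sketch of MiniSat-style propagation, not the [ES03] code: clauses are mutable lists whose first two positions are the watches, and `watches` maps each watched literal to the indices of clauses watching it (all names are assumptions of this sketch).

```python
from collections import defaultdict

def value(lit, assign):
    """Truth value of a literal under a partial assignment (None = unassigned)."""
    v = assign.get(abs(lit))
    return None if v is None else (v if lit > 0 else not v)

def propagate(clauses, watches, assign, queue):
    """queue holds literals just assigned True.
    Returns a conflicting clause index, or None."""
    while queue:
        lit = queue.pop()                       # lit is True, so -lit is false
        pending, watches[-lit] = watches[-lit], []
        for ci in pending:
            c = clauses[ci]
            if c[0] == -lit:                    # keep the false watch at c[1]
                c[0], c[1] = c[1], c[0]
            if value(c[0], assign) is True:     # clause already satisfied
                watches[-lit].append(ci)
                continue
            for k in range(2, len(c)):          # try to swap in a new watch
                if value(c[k], assign) is not False:
                    c[1], c[k] = c[k], c[1]
                    watches[c[1]].append(ci)
                    break
            else:                               # no replacement watch found
                watches[-lit].append(ci)
                if value(c[0], assign) is False:
                    return ci                   # both watches false: conflict
                assign[abs(c[0])] = c[0] > 0    # clause is unit: assign c[0]
                queue.append(c[0])
    return None

clauses = [[1, 2, 3]]                           # (x ∨ y ∨ z): x=1, y=2, z=3
watches = defaultdict(list, {1: [0], 2: [0]})
assign = {1: False}
propagate(clauses, watches, assign, [-1])       # ¬x: watch moves to z
assign[2] = False
propagate(clauses, watches, assign, [-2])       # ¬y: clause is unit, z forced
print(assign[3])  # True
```

Note that backtracking needs no work here: un-assigning variables leaves the two watch positions valid, which is the lazy-data-structure property of the previous slide.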
Additional Key Techniques
• Lightweight branching [e.g. MMZZM01]
  – Use conflicts to bias which variables to branch on: associate a score with each variable
  – Prefer recent bias by regularly decreasing variable scores
• Clause deletion policies
  – Not practical to keep all learned clauses
  – Delete larger clauses [e.g. MSS96]
  – Delete less-used clauses [e.g. GN02,ES03]
• Proven recent techniques:
  – Phase saving [PD07]
  – Luby restarts [H07]
  – Literal blocks distance [AS09]
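The bump-and-decay scoring of lightweight (VSIDS-style) branching can be sketched in a few lines. This is an illustrative Python sketch; the class name, decay factor, and increment are assumptions, not the Chaff implementation (which, e.g., divides scores periodically rather than multiplying every conflict).

```python
class Scores:
    """Bump-and-decay variable scores for conflict-guided branching."""
    def __init__(self, nvars, decay=0.95):
        self.score = {v: 0.0 for v in range(1, nvars + 1)}
        self.decay = decay

    def bump(self, conflict_vars):       # variables seen in a conflict
        for v in conflict_vars:
            self.score[v] += 1.0

    def decay_all(self):                 # called e.g. once per conflict
        for v in self.score:
            self.score[v] *= self.decay

    def pick(self, unassigned):          # branch on the highest score
        return max(unassigned, key=lambda v: self.score[v])

s = Scores(3)
s.bump([1, 2]); s.decay_all()            # older conflict, decayed once
s.bump([3])                              # fresh conflict
print(s.pick([1, 2, 3]))  # 3: recent activity outweighs decayed scores
```

Because decisions need only a score lookup and updates touch only conflict variables, branching stays cheap relative to propagation, which is the point of "lightweight" branching.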