
Scheduling and SAT
Emmanuel Hebrard, Toulouse

Outline
1 Introduction
2 Scheduling and SAT: Encoding
3 Scheduling and SAT: Heuristics
4 Scheduling and SAT: Hybrids
5 Conclusion

Outline: Introduction
◮ Preamble
◮ Scheduling


Conflict Driven Clause Learning (CDCL)
“Evolved” from DPLL
Turning point: clause learning ([GRASP] then [Chaff])
◮ First SAT-Solver competition in 2002
Dive into the “search tree” (make decisions)
◮ Unit propagate: if ¬a must be true, then a cannot satisfy a clause
◮ a ∨ b ∨ c effectively becomes b ∨ c
Until reaching a conflict (dead-end)
◮ Extract a learned clause
◮ Backjump several levels and unit-propagate the learned clause
Adaptive branching heuristics (weight conflicting literals)
And also: restart, simplify the clause base, forget clauses, etc.
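
For reference, a minimal schematic of this loop in Python (a sketch, not the code of any particular solver; unit_propagate, analyze_conflict, backjump, pick_branching_literal and all_variables_assigned are hypothetical helpers standing for the steps above):

```python
# Schematic CDCL main loop (hypothetical helper functions, illustration only).
def cdcl(clauses):
    trail, level = [], 0                                 # partial assignment, decision level
    while True:
        conflict = unit_propagate(clauses, trail)        # fix literals forced by unit clauses
        if conflict is not None:
            if level == 0:
                return "UNSAT"                           # conflict without decisions: unsatisfiable
            learned, back_level = analyze_conflict(conflict, trail)
            clauses.append(learned)                      # clause learning
            backjump(trail, back_level)                  # undo several decision levels at once
            level = back_level                           # the learned clause is now unit and propagates
        elif all_variables_assigned(clauses, trail):
            return "SAT", trail
        else:
            level += 1
            trail.append(pick_branching_literal(clauses, trail))  # decision (adaptive heuristic)
```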

CDCL: Example
[Figure: a CNF formula of roughly twenty clauses over the atoms a … o. Decisions f, a, b, c, d, e are made one at a time; after each decision, unit propagation fixes further literals (g, h, i, j, k, l, m, n, o), until a clause is falsified and a conflict ⊥ is reached.]

CDCL: Example (conflict analysis)
[Figure: the implication graph around the conflict. The literal j was implied by h, o and n through the clause (¬h ∨ ¬o ∨ j ∨ ¬n) ≡ (h ∧ o ∧ n) → j.]

CDCL: Example (implication graph)
[Figure: the full implication graph. The decisions f, a, b, c, d, e are the roots; the propagated literals g, h, i, j, k, l, m, n, o hang off them, ending in the conflict ⊥. Resolving backwards from the conflict over the reasons of the propagated literals yields a learned clause over g, h, j and n.]

CDCL: Example (learning and backjumping)
[Figure: the learned clause over g, h, j and n is added to the clause base; the solver backjumps, undoing the later decisions, and unit propagation on the learned clause immediately fixes n.]

Outline
1 Introduction
2 Scheduling and SAT: Encoding
◮ Formulation into SAT
◮ Scheduling by encoding into SAT
3 Scheduling and SAT: Heuristics
4 Scheduling and SAT: Hybrids
5 Conclusion

CNF encoding
The way we encode problems into SAT has a huge impact on efficiency
◮ Encoding of Planning problems
◮ Encoding of CSPs (direct, log, AC-encoding)
◮ Encoding of Pseudo-Boolean constraints (adder, sorter)

Direct Encoding
Domain
An atom i_v for each pair (x_i, v ∈ D(x_i))
◮ i_v ⇔ (x_i = v) (for instance: x_i = 2 ↔ 0100)
Must take at least one value: i_1 ∨ i_2 ∨ . . . ∨ i_n
Must take at most one value: ¬i_v ∨ ¬i_w for every pair v ≠ w ∈ D(x_i)
Complexity
O(n²) space: n(n − 1)/2 binary clauses and one n-ary clause.
There are different ways to encode the constraints.
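
As a small illustration (mine, not from the slides), a Python sketch that emits the direct-encoding domain clauses as DIMACS-style integer literals; var(v) is a hypothetical mapping from value v to the index of the atom "x = v":

```python
# Direct-encoding domain clauses for one CSP variable with domain {1..n}.
# var(v) returns the DIMACS index of the atom "x = v"; a negative index is a negated literal.
def direct_domain_clauses(var, n):
    clauses = [[var(v) for v in range(1, n + 1)]]        # at least one value: one n-ary clause
    for v in range(1, n + 1):                            # at most one value:
        for w in range(v + 1, n + 1):                    # n(n-1)/2 binary clauses
            clauses.append([-var(v), -var(w)])
    return clauses
```

With n = 4 this produces one 4-ary clause and six binary clauses, matching the O(n²) count above.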

Constraints: Tuple Encoding
Example of constraint: x_i < x_j
One binary clause forbids each inconsistent pair (x_i = v, x_j = w), i.e., each pair with v ≥ w:

x_j \ x_i        1               2               3               4
1           ¬i_1 ∨ ¬j_1     ¬i_2 ∨ ¬j_1     ¬i_3 ∨ ¬j_1     ¬i_4 ∨ ¬j_1
2                           ¬i_2 ∨ ¬j_2     ¬i_3 ∨ ¬j_2     ¬i_4 ∨ ¬j_2
3                                           ¬i_3 ∨ ¬j_3     ¬i_4 ∨ ¬j_3
4                                                           ¬i_4 ∨ ¬j_4

Costly (in space) and weak (in propagation)
O(n²) binary clauses.
¬i_4 (x_i ≠ 4) and ¬j_1 (x_j ≠ 1) are implied by the constraint, but not derived by unit propagation.
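
A short sketch of the same construction (my own illustration): one binary clause per forbidden pair of x_i < x_j, with hypothetical vi(v)/vj(w) returning the DIMACS indices of the direct-encoding atoms x_i = v and x_j = w:

```python
# Tuple (conflict) encoding of x_i < x_j over domains {1..n}:
# forbid every pair (v, w) with v >= w by one binary clause.
def less_than_tuple_clauses(vi, vj, n):
    return [[-vi(v), -vj(w)]
            for v in range(1, n + 1)
            for w in range(1, n + 1)
            if v >= w]                      # O(n^2) binary clauses
```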

Constraints: AC Encoding [Kasif 90]
Example of constraint: x_i < x_j
One clause per assignment lists its supports:

assignment    clause (¬atom ∨ supports)
x_i = 1       ¬i_1 ∨ j_2 ∨ j_3 ∨ j_4
x_i = 2       ¬i_2 ∨ j_3 ∨ j_4
x_i = 3       ¬i_3 ∨ j_4
x_i = 4       ¬i_4 ∨ ⊥

Same space complexity, better propagation
O(n) n-ary clauses.
¬i_4 (x_i ≠ 4) and ¬j_1 (x_j ≠ 1) are unit clauses.
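
Again a small sketch of my own: for each value v of x_i, the support clause says "if x_i = v then x_j takes one of the values compatible with it"; for x_i < x_j the supports of v are the values w > v.

```python
# Support (AC) encoding of x_i < x_j over domains {1..n}:
# for each value v of x_i, one clause  ¬(x_i = v) ∨ (x_j = v+1) ∨ ... ∨ (x_j = n).
def less_than_support_clauses(vi, vj, n):
    clauses = []
    for v in range(1, n + 1):
        supports = [vj(w) for w in range(v + 1, n + 1)]   # empty for v = n: the clause becomes the unit ¬vi(n)
        clauses.append([-vi(v)] + supports)
    return clauses                                        # O(n) clauses, O(n^2) literals
```

For full arc consistency the symmetric clauses for the values of x_j would be added as well; the slide shows only the x_i side.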

Order Encoding [Crawford & Baker 94]
Domain
An atom i_v for each pair (x_i, v ∈ D(x_i))
◮ i_v ⇔ (x_i ≤ v) (for instance: x_i = 2 ↔ 0111)
Bound propagation:
◮ If x_i ≤ v then x_i ≤ v + 1
◮ ¬i_v ∨ i_{v+1} for every v ∈ D(x_i)
Complexity
O(n) space (n − 1 binary clauses)
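
A sketch of the order-encoding domain clauses (my own illustration; o(v) is a hypothetical mapping from value v to the DIMACS index of the atom x ≤ v):

```python
# Order-encoding domain clauses for x with domain [0..ub]:
# (x <= v) -> (x <= v+1), i.e. the chain of binary clauses  ¬o(v) ∨ o(v+1).
def order_domain_clauses(o, ub):
    return [[-o(v), o(v + 1)] for v in range(ub)]   # ub binary clauses for a domain of size ub+1
```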

Constraints: BC Encoding
Example of constraint: x_i < x_j

relation                   clause
x_i > 0 ⇒ x_j > 1         ⊥ ∨ ¬j_1
x_i > 1 ⇒ x_j > 2         i_1 ∨ ¬j_2
x_i > 2 ⇒ x_j > 3         i_2 ∨ ¬j_3
x_i > 3 ⇒ x_j > 4         i_3 ∨ ⊥

Better complexity and the same propagation on some linear constraints
O(n) space (n binary clauses).
i_3 (x_i ≤ 3) and ¬j_1 (x_j > 1) are unit clauses.
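
A matching sketch (mine) of the bound-consistency clauses for x_i < x_j with order atoms; oi(v)/oj(v) are hypothetical mappings to the indices of the atoms x_i ≤ v and x_j ≤ v, and trivially true/false atoms are simply dropped from the clause:

```python
# BC encoding of x_i < x_j over {1..n} with order atoms (x <= v):
# for each v in 1..n, (x_i > v-1) -> (x_j > v), i.e. the clause  oi(v-1) ∨ ¬oj(v).
def less_than_bc_clauses(oi, oj, n):
    clauses = []
    for v in range(1, n + 1):
        clause = []
        if v - 1 >= 1:              # x_i <= 0 is vacuously false, so that literal disappears
            clause.append(oi(v - 1))
        if v <= n - 1:              # x_j <= n is vacuously true, so ¬oj(n) disappears
            clause.append(-oj(v))
        clauses.append(clause)      # n clauses in total, each binary or unit
    return clauses
```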

Log Encoding [Walsh 00]
Domain
An atom i_k for each k ∈ [1, . . . , ⌊log₂ ub⌋] (assuming D(x_i) = [0, . . . , ub])
◮ the atoms are the bits of the value: Σ_k 2^k · i_k = v ⇔ x_i = v (for instance: x_i = 2 ↔ 01)
For interval domains, no need for extra clauses
Complexity
O(log₂ n) space
Propagation
Encoding constraints is trickier, and less powerful
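
A tiny sketch (mine) of the value/bit correspondence used by the log encoding; b(k) is a hypothetical mapping from bit position k to a DIMACS index:

```python
# Log encoding: the assignment x = v corresponds to fixing each bit atom to the
# k-th bit of v (no extra domain clauses are needed for interval domains).
def log_literals_for_value(b, v, nbits):
    return [b(k) if (v >> k) & 1 else -b(k) for k in range(nbits)]

# Example: value 2 with 2 bits -> [-b(0), b(1)]  (low-order bit first, the "01" on the slide)
```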

Other Encodings
Many more!
Mix of direct and order encoding [lazy-FD, Numberjack]
Mix of AC and log encoding [Gavanelli 2007]
Mix of order and log encoding [Sugar, Tamura et al. 2006]
◮ Log encoding in a base B and order encoding inside a digit
◮ Excellent results on scheduling benchmarks! (with CDCL solvers)

Order Encoding, Now and Then
Progress of SAT solvers
From a few hundred variables in the 90's to millions now
[Crawford & Baker 94]
Instances from Sadeh, with 10 jobs of 5 operations each (45 min cutoff)
Tableau solved 90% of the instances (in about 2 min when it did)
[Tamura, Tanjo & Banbara]
Same instances, used during the CSP Solver Competition
Similar model, hardware of course incomparable, MiniSat
The hardest instance requires at most a few hundred conflicts

Closing the Open Shop
Instances
[Gueret & Prins]: hard for local search, extremely easy for SAT/CP
[Taillard]: large, but relatively easy
[Brucker]: three open instances
Results
All instances solved and proved optimal
◮ The two hardest instances were decomposed into 120 subproblems, and required up to 13h to solve
First approach to close the open shop!

Solving vs. Encoding
[Tamura et al.]'s encoding is better than the order encoding
◮ However, the huge difference with respect to [Crawford & Baker 94] is due to the solver
It is now possible to efficiently solve some scheduling problems simply by formulating them as a CNF formula

Outline
1 Introduction
2 Scheduling and SAT: Encoding
3 Scheduling and SAT: Heuristics
◮ A SAT-like Approach
◮ Comparison with the State of the Art
4 Scheduling and SAT: Hybrids
5 Conclusion

A SAT-like Approach [Grimes & Hebrard 09]
CSP Solver Competition: scheduling benchmarks
◮ Some hard instances
◮ Generic format (XCSP): the notion of resources is lost, no global constraints
◮ Yet many solvers solved them ([Sugar], [Choco], [Mistral])
Experiment with weighted degree [Boussemart et al. 04]
◮ Similar simple model in [Mistral], same observation [Grimes]
◮ Open shop instances closed by [Tamura et al.] can be solved to optimality in a few minutes
Are adaptive heuristics all that we need to solve disjunctive scheduling problems?

Constraint Model
[Figure: a small scheduling instance with twelve tasks t_1 … t_12 (three rows of four), each labelled with its processing time, all to complete before the makespan C_max.]
Model
A variable for the start time of each task: t_i ∈ [0, . . . , C_max].
◮ Precedence constraints: t_i + p_i ≤ t_{i+1}.

Constraint Model
[Figure: three tasks t_2, t_7, t_9 competing for the same resource, with a disjunct Boolean between each pair: b_2,7, b_2,9, b_7,9.]
Model
A variable for the start time of each task: t_i ∈ [0, . . . , C_max].
◮ Precedence constraints: t_i + p_i ≤ t_{i+1}.
A Boolean variable standing for the relative order of each pair of conflicting tasks (disjunct):
◮ Binary disjunctive constraints: b_ij = 0 ⇔ t_i + p_i ≤ t_j and b_ij = 1 ⇔ t_j + p_j ≤ t_i
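
To make the semantics of this model concrete, here is a plain-Python sketch of my own (not tied to any solver API) that checks a candidate schedule against the precedence and disjunctive constraints above:

```python
# Semantics of the disjunctive model: given start times and disjunct Booleans,
# check job precedences and the chosen ordering on each shared resource.
# start, dur: dicts task -> start time / processing time.
def feasible(start, dur, jobs, disjuncts, c_max):
    # jobs: list of task chains, e.g. [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
    for chain in jobs:
        for i, j in zip(chain, chain[1:]):
            if start[i] + dur[i] > start[j]:         # precedence t_i + p_i <= t_{i+1}
                return False
    # disjuncts: {(i, j): b_ij} for each pair of tasks competing for a resource
    for (i, j), b in disjuncts.items():
        if b == 0 and start[i] + dur[i] > start[j]:  # b_ij = 0  =>  i before j
            return False
        if b == 1 and start[j] + dur[j] > start[i]:  # b_ij = 1  =>  j before i
            return False
    return all(start[i] + dur[i] <= c_max for i in start)
```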

Search Strategy
Adaptive heuristic
Branch on Boolean variables only (order tasks on machines)
Minimum domain over weighted degree [Boussemart et al. 04]
Guided search
Follow the branch corresponding to the best solution [Beck 07]
≃ phase-saving heuristic in SAT [Pipatsrisawat & Darwiche 07]
Restarts
Geometric [Walsh 99], nogoods on restarts [Lecoutre et al. 07]
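
A rough sketch (my own, with hypothetical names such as dfs, model.booleans and result fields) of how these three ingredients typically fit together in a restart-based loop over the disjunct Booleans:

```python
# Hypothetical outline of the strategy: weighted-degree branching on the disjunct
# Booleans, solution-guided value choice (phase saving), geometric restarts.
def solve(model, cutoff=100, factor=1.3):
    best = None                                    # best solution found so far
    weight = {b: 1 for b in model.booleans}        # weighted degree: bumped on each failure
    while True:
        result = dfs(model, weight, guide=best, max_fails=int(cutoff))
        if result.proved_optimal:
            return best
        if result.solution is not None:
            best = result.solution                 # later restarts follow this solution's branch
        model.add_nogoods(result.restart_nogoods)  # nogoods recorded on restart
        cutoff *= factor                           # geometric restart schedule
```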
