  1. Encoding Challenges for Hard Problems
     Marijn J.H. Heule
     Starting at [logo omitted] in August
     Matryoshka workshop, June 12, 2019

  2. Automated Reasoning Has Many Applications
     security, planning, formal verification, bioinformatics, scheduling,
     train safety, exploit generation, term rewriting, automated theorem
     proving, termination
     encode → automated reasoner → decode

  3. Breakthrough in SAT Solving in the Last 20 Years
     Satisfiability (SAT) problem: Can a Boolean formula be satisfied?
     mid '90s: formulas solvable with thousands of variables and clauses
     now: formulas solvable with millions of variables and clauses
     Edmund Clarke: "a key technology of the 21st century"
       [Biere, Heule, van Maaren, and Walsh '09]
     Donald Knuth: "evidently a killer app, because it is key to the
       solution of so many other problems" [Knuth '15]

  4. Representations
     - Matrix Multiplication
     - The Collatz Conjecture


  6. The Right Representation is Crucial
     What makes a problem hard?
     New angle: does the representation enable efficient reasoning?
     The famous pigeonhole principle: Can n + 1 pigeons be placed in n
     holes such that no hole contains multiple pigeons?
     - Hard for many automated reasoning approaches
     - Easy for a little kid given the right representation
     [image source: pecanpartnership.co.uk/2016/01/05/beware-pigeon-hole-overcoming-stereotypes-build-collaborative-culture]
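The direct ("naive") CNF encoding of the pigeonhole principle alluded to above can be sketched in a few lines of Python. The variable numbering and helper names below are illustrative choices, not taken from the talk:

```python
from itertools import combinations, product

def pigeonhole_cnf(n):
    """Naive CNF claiming n+1 pigeons fit into n holes (unsatisfiable).

    Variable v(i, j) is true iff pigeon i sits in hole j; variables are
    numbered 1..(n+1)*n and clauses are lists of signed ints (DIMACS style).
    """
    v = lambda i, j: i * n + j + 1
    # every pigeon sits in at least one hole
    clauses = [[v(i, j) for j in range(n)] for i in range(n + 1)]
    # no hole contains two pigeons
    for j in range(n):
        for i1, i2 in combinations(range(n + 1), 2):
            clauses.append([-v(i1, j), -v(i2, j)])
    return clauses

def satisfies(asg, clauses):
    """asg is a tuple of bools, indexed by variable number minus one."""
    return all(any(asg[l - 1] if l > 0 else not asg[-l - 1] for l in c)
               for c in clauses)

# brute-force check for n = 2: no assignment over the 6 variables works
cls = pigeonhole_cnf(2)
assert not any(satisfies(asg, cls) for asg in product([False, True], repeat=6))
```

This pairwise encoding has n + 1 long clauses and n·C(n+1, 2) binary clauses, which is exactly the kind of representation that resolution-based reasoners struggle with.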

  7. Artisan Representations (joint work)
     - Architectural 3D Layout [VSMM '07]
     - Van der Waerden numbers [EJoC '07]
     - Edge-matching Puzzles [LaSh '08]
     - Software Model Synthesis [ICGI '10, ESE '13]
     - Graceful Graphs [AAAI '10]
     - Conway's Game of Life [EJoC '13]
     - Clique-Width [SAT '13, TOCL '15]
     - Connect the Pairs
     - Firewall Verification [SSS '16]
     - Pythagorean Triples [SAT '16, CACM '17]
     - Open Knight Tours
     - Collatz conjecture [Open]
     Collaborators: Henriette Bier, Sicco Verwer, Toby Walsh, Willem van
     der Poel, Donald Knuth, Stefan Szeider, Mohamed Gouda, Victor Marek,
     Moshe Vardi, and Scott Aaronson.



  10. Inprocessing [Järvisalo, Heule, and Biere '12]
      How to fix a poor representation fully automatically?
      Interleave reformulation with CDCL search: Reformulate ⇄ CDCL.
      Example: Bounded Variable Addition [Manthey, Heule, and Biere '12]
      Replace (a ∨ d)(a ∨ e)(b ∨ d)(b ∨ e)(c ∨ d)(c ∨ e)
      by (x ∨ a)(x ∨ b)(x ∨ c)(¬x ∨ d)(¬x ∨ e):
      adds 1 variable, removes 1 clause.
      This technique is crucial for hard bioinformatics problems and turns
      the naive encoding of AtMostOne into the optimal one.
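The replacement above can be checked exhaustively. This small Python sketch (names are illustrative) confirms that the new clause set has exactly the same models over a..e once the auxiliary variable x is projected away:

```python
from itertools import product

# original clauses: the 3 x 2 grid of binary clauses (p or q)
ORIG = [("a", "d"), ("a", "e"), ("b", "d"), ("b", "e"), ("c", "d"), ("c", "e")]

def sat_orig(m):
    return all(m[p] or m[q] for p, q in ORIG)

def sat_new(m, x):
    # (x or a)(x or b)(x or c)(not x or d)(not x or e)
    return (all(x or m[v] for v in "abc")
            and all((not x) or m[v] for v in "de"))

models_orig = set()
models_new = set()
for bits in product([False, True], repeat=5):
    m = dict(zip("abcde", bits))
    if sat_orig(m):
        models_orig.add(bits)
    if sat_new(m, False) or sat_new(m, True):  # exists a value for x
        models_new.add(bits)

assert models_orig == models_new  # identical models over a..e
```

Both formulas say the same thing: either a, b, c are all true or d, e are both true, which is why the auxiliary variable x can factor the 3 x 2 grid into 3 + 2 clauses.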


  12. Matrix Multiplication
      (joint work with Manuel Kauers and Martina Seidl)

  13. Matrix Multiplication: Introduction

      \begin{pmatrix} a_{1,1} & a_{1,2} \\ a_{2,1} & a_{2,2} \end{pmatrix}
      \begin{pmatrix} b_{1,1} & b_{1,2} \\ b_{2,1} & b_{2,2} \end{pmatrix}
      =
      \begin{pmatrix} c_{1,1} & c_{1,2} \\ c_{2,1} & c_{2,2} \end{pmatrix}

      c_{1,1} = a_{1,1} \cdot b_{1,1} + a_{1,2} \cdot b_{2,1}
      c_{1,2} = a_{1,1} \cdot b_{1,2} + a_{1,2} \cdot b_{2,2}
      c_{2,1} = a_{2,1} \cdot b_{1,1} + a_{2,2} \cdot b_{2,1}
      c_{2,2} = a_{2,1} \cdot b_{1,2} + a_{2,2} \cdot b_{2,2}

  14. Matrix Multiplication: Introduction (cont.)
      The same product can be computed as
      c_{1,1} = M_1 + M_4 - M_5 + M_7
      c_{1,2} = M_3 + M_5
      c_{2,1} = M_2 + M_4
      c_{2,2} = M_1 - M_2 + M_3 + M_6

  15. Matrix Multiplication: Introduction (cont.)
      ... where
      M_1 = (a_{1,1} + a_{2,2}) \cdot (b_{1,1} + b_{2,2})
      M_2 = (a_{2,1} + a_{2,2}) \cdot b_{1,1}
      M_3 = a_{1,1} \cdot (b_{1,2} - b_{2,2})
      M_4 = a_{2,2} \cdot (b_{2,1} - b_{1,1})
      M_5 = (a_{1,1} + a_{1,2}) \cdot b_{2,2}
      M_6 = (a_{2,1} - a_{1,1}) \cdot (b_{1,1} + b_{1,2})
      M_7 = (a_{1,2} - a_{2,2}) \cdot (b_{2,1} + b_{2,2})
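A quick sanity check of the seven products, as a Python sketch (function names are illustrative): Strassen's scheme agrees with the schoolbook 2 x 2 product on random integer matrices:

```python
import random

def strassen_2x2(A, B):
    """Multiply two 2x2 matrices using Strassen's 7 products."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

def naive_2x2(A, B):
    """Schoolbook product: 8 multiplications."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

random.seed(0)
for _ in range(1000):
    A = [[random.randint(-9, 9) for _ in range(2)] for _ in range(2)]
    B = [[random.randint(-9, 9) for _ in range(2)] for _ in range(2)]
    assert strassen_2x2(A, B) == naive_2x2(A, B)
```

Note that the scheme never multiplies two a-entries or two b-entries together, which is what makes it applicable entry-wise and block-wise alike.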


  19. Matrix Multiplication: Introduction (cont.)
      - This scheme needs 7 multiplications instead of 8.
      - Recursive application allows multiplying n × n matrices with
        O(n^{log_2 7}) operations in the ground ring.
      - Let ω be the smallest number such that n × n matrices can be
        multiplied using O(n^ω) operations in the ground domain.
      - Then 2 ≤ ω < 3. What is the exact value?
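The recursive application can be sketched as follows, assuming n is a power of two (no padding handled; names are illustrative). The counter shows that an n x n product uses 7^(log2 n) = n^(log2 7) scalar multiplications:

```python
def strassen(A, B, count=None):
    """Recursive Strassen multiplication for n x n matrices, n a power of two.

    `count` (a one-element list) tallies scalar multiplications.
    """
    n = len(A)
    if n == 1:
        if count is not None:
            count[0] += 1
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    def blk(M, r, c):                 # h x h block starting at (r, c)
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y, s=1):               # X + s*Y, entry-wise
        return [[x + s * y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    a11, a12, a21, a22 = blk(A, 0, 0), blk(A, 0, h), blk(A, h, 0), blk(A, h, h)
    b11, b12, b21, b22 = blk(B, 0, 0), blk(B, 0, h), blk(B, h, 0), blk(B, h, h)
    m1 = strassen(add(a11, a22), add(b11, b22), count)
    m2 = strassen(add(a21, a22), b11, count)
    m3 = strassen(a11, add(b12, b22, -1), count)
    m4 = strassen(a22, add(b21, b11, -1), count)
    m5 = strassen(add(a11, a12), b22, count)
    m6 = strassen(add(a21, a11, -1), add(b11, b12), count)
    m7 = strassen(add(a12, a22, -1), add(b21, b22), count)
    c11 = add(add(m1, m4), add(m7, m5, -1))   # m1 + m4 - m5 + m7
    c12 = add(m3, m5)
    c21 = add(m2, m4)
    c22 = add(add(m1, m2, -1), add(m3, m6))   # m1 - m2 + m3 + m6
    return [r1 + r2 for r1, r2 in zip(c11, c12)] + \
           [r1 + r2 for r1, r2 in zip(c21, c22)]

count = [0]
C = strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]], count)
assert C == [[19, 22], [43, 50]] and count[0] == 7
```

A 4 x 4 product makes 7 recursive 2 x 2 calls of 7 multiplications each, i.e. 49 instead of 64, and the gap widens with every doubling of n.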


  22. Efficient Matrix Multiplication: Theory
      - Strassen 1969: ω ≤ log_2 7 ≤ 2.807
      - Pan 1978: ω ≤ 2.796
      - Bini et al. 1979: ω ≤ 2.7799
      - Schönhage 1981: ω ≤ 2.522
      - Romani 1982: ω ≤ 2.517
      - Coppersmith/Winograd 1981: ω ≤ 2.496
      - Strassen 1986: ω ≤ 2.479
      - Coppersmith/Winograd 1990: ω ≤ 2.376
      - Stothers 2010: ω ≤ 2.374
      - Williams 2011: ω ≤ 2.3728642
      - Le Gall 2014: ω ≤ 2.3728639


  27. Efficient Matrix Multiplication: Practice
      - Only Strassen's algorithm beats the classical algorithm for
        reasonable problem sizes.
      - Want: a matrix multiplication algorithm that beats Strassen's
        algorithm for matrices of moderate size.
      - Idea: instead of dividing the matrices into 2 × 2 block matrices,
        divide them into 3 × 3 block matrices.
      - Question: What is the minimal number of multiplications needed to
        multiply two 3 × 3 matrices?
      - Answer: Nobody knows.


  30. The 3 × 3 Case is Still Open
      Question: What is the minimal number of multiplications needed to
      multiply two 3 × 3 matrices?
      - naive algorithm: 27
      - pad with zeros, use Strassen twice, clean up: 25
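The padding step can be illustrated in a few lines of Python (a sketch; it does not reproduce the cleanup that brings the count down to 25): embedding 3 x 3 matrices into zero-padded 4 x 4 matrices, so that Strassen's 2 x 2 block scheme applies, leaves the product intact in the top-left block:

```python
import random

def matmul(A, B):
    """Schoolbook product; n^3 scalar multiplications for n x n inputs."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def pad(M, n):
    """Embed M into the top-left corner of an n x n zero matrix."""
    P = [[0] * n for _ in range(n)]
    for i, row in enumerate(M):
        for j, v in enumerate(row):
            P[i][j] = v
    return P

random.seed(1)
A = [[random.randint(-9, 9) for _ in range(3)] for _ in range(3)]
B = [[random.randint(-9, 9) for _ in range(3)] for _ in range(3)]
C4 = matmul(pad(A, 4), pad(B, 4))   # 4 x 4 product of the padded matrices
C3 = matmul(A, B)                    # direct 3 x 3 product: 27 multiplications
assert all(C4[i][j] == C3[i][j] for i in range(3) for j in range(3))
```

The saving claimed on the slide comes from noticing that many of the recursive products in the padded instance involve only zero entries and can be dropped.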
