  1. Counterexample-guided Cartesian Abstraction Refinement and Saturated Cost Partitioning for Optimal Classical Planning. Jendrik Seipp, February 28, 2018, University of Basel.

  2. Planning: Find a sequence of actions that achieves a goal.

  3. Optimal Classical Planning [figure: transition system of a logistics example task with actions drive, load-in-A, unload-in-A, load-in-B, unload-in-B]

  4. Optimal Classical Planning: Example Abstraction [figure: an abstraction of the example task with the same action labels: drive, load-in-A, unload-in-A, load-in-B, unload-in-B]

  5-6. Abstraction Heuristics • abstraction heuristics never overestimate → admissible • A* + admissible heuristic → optimal plan • higher accuracy → better guidance • how to create abstractions?
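In symbols, "never overestimate" means the estimate is a lower bound on the true goal distance; a short LaTeX restatement (standard notation, not taken verbatim from the slides):

    h(s) \le h^*(s) \quad \text{for all states } s,

where h^*(s) is the cost of an optimal plan from s. A* with such an h returns an optimal plan.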

  7. Counterexample-guided Cartesian Abstraction Refinement

  8. Counterexample-guided Abstraction Refinement (CEGAR) • start with a coarse abstraction • until a concrete solution is found or time runs out: find an abstract solution, check if and why it fails in the real world, refine the abstraction (see the sketch below)
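A minimal sketch of the loop described on this slide, assuming hypothetical components find_abstract_plan, find_flaw, refine and timer (none of these names come from the talk):

    def cegar(task, abstraction, timer):
        # Refine the abstraction until a concrete solution is found or time runs out.
        while not timer.expired():
            plan = find_abstract_plan(abstraction)       # solve the abstract task
            if plan is None:
                return None, abstraction                 # abstract task (and hence task) unsolvable
            flaw = find_flaw(task, abstraction, plan)    # why does the plan fail in the real world?
            if flaw is None:
                return plan, abstraction                 # concrete solution found
            refine(abstraction, flaw)                    # split an abstract state to remove the flaw
        return None, abstraction                         # out of time: use the abstraction as a heuristic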

  9-11. Example Refinement [figures: the initial abstraction is a single abstract state with self-loops for drive, (un)load-in-A and (un)load-in-B; two refinement steps split it until the abstraction from slide 4 is reached]

  12. Classes of Abstractions: Cartesian Abstractions • relation to other classes of abstractions?

  13. Projection (PDB) [figure: a projection of the example task; transitions labeled drive, load-in-A, unload-in-A, unload-in-B, load-in-B]

  14. Cartesian Abstraction [figure: a Cartesian abstraction of the example task; transitions labeled drive, (un)load-in-A, (un)load-in-B]

  15. Merge-and-shrink Abstraction [figure: a merge-and-shrink abstraction of the example task; transitions labeled drive, (un)load-in-A, (un)load-in-B]

  16. Classes of Abstractions • Projections (PDBs): refinement at least doubles the number of states • Cartesian Abstractions: allow efficient and fine-grained refinement • Merge-and-shrink Abstractions: refinement complicated and expensive
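To illustrate what "Cartesian" means here (an illustrative sketch only; the variable names and values are invented, not the thesis' data structures): a Cartesian abstract state assigns each variable a subset of its domain, and a concrete state belongs to it iff every variable's value lies in the corresponding subset.

    # Hypothetical Cartesian abstract state for the logistics example.
    cartesian_state = {"truck": {"A", "B"}, "package": {"A", "in-truck"}}

    def contains(cartesian_state, concrete_state):
        # Membership is checked variable by variable.
        return all(concrete_state[var] in values
                   for var, values in cartesian_state.items())

    print(contains(cartesian_state, {"truck": "A", "package": "in-truck"}))  # True
    print(contains(cartesian_state, {"truck": "A", "package": "B"}))         # False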

  17. Solved Tasks [bar chart: number of solved tasks: CEGAR 706; baselines iPDB 881, M&S 808, PhO-Sys2 737]

  18-19. CEGAR Drawbacks: Diminishing Returns • finding solutions takes longer • heuristic values only increase logarithmically → multiple smaller abstractions

  20-21. Task Decomposition by Goals • build abstraction for each goal fact • problem: tasks with single goal fact

  22-23. Task Decomposition by Landmarks • build abstraction for each fact landmark [figure: the example task with actions drive, load-in-A, unload-in-A, unload-in-B, load-in-B]
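A sketch of the decomposition idea from slides 20-23, assuming hypothetical helpers copy_with_goal and build_abstraction (the latter could be the CEGAR loop sketched above):

    def decompose_by_landmarks(task, fact_landmarks, build_abstraction):
        # Build one abstraction per fact landmark by treating that landmark
        # as the only goal of a copy of the task.
        return [build_abstraction(task.copy_with_goal([fact]))
                for fact in fact_landmarks]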

  24-28. Multiple Heuristics • how to combine multiple heuristics? • maximize over estimates: h1(s2) = 5, h2(s2) = 4, so h(s2) = 5 • only selects best heuristic • does not combine heuristics [figure: two abstractions of a five-state example task with abstract states {s1}, {s2,s3,s4}, {s5} and {s1,s2}, {s3}, {s4,s5}; operator costs 1 and 4]
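In symbols (standard notation, not verbatim from the slides): maximizing over admissible estimates stays admissible, whereas simply adding them can overestimate:

    h^{\max}(s) = \max_i h_i(s) \le h^*(s),
    \qquad \text{but in general } \sum_i h_i(s) \not\le h^*(s).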

  29-31. Multiple Heuristics: Cost Partitioning • split operator costs among heuristics • sum of costs must not exceed original cost • example: with the costs split between the two abstractions, h(s2) = 3 + 3 = 6 [figure: the two abstractions from slide 24 with partitioned operator costs]
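The condition on these slides, written out in LaTeX (standard formulation; cost_i is the cost function given to heuristic h_i):

    \sum_i \mathit{cost}_i(o) \le \mathit{cost}(o) \ \text{for every operator } o
    \quad\Longrightarrow\quad
    \sum_i h_i^{\mathit{cost}_i}(s) \le h^*(s) \ \text{for every state } s,

so the sum of the component estimates, here 3 + 3 = 6 for s2, is again admissible.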

  32. Saturated Cost Partitioning

  33-38. Saturated Cost Partitioning Algorithm • order the heuristics, then for each heuristic h: use the minimum costs that preserve all estimates of h and pass the remaining costs on to the subsequent heuristics • example: the first abstraction keeps its full estimate of 5 for s2 and the second still achieves 3 with the leftover costs, so h(s2) = 5 + 3 = 8 [figure: the two abstractions with their saturated cost functions]
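A compact sketch of the algorithm on these slides. compute_goal_distances and the abstraction interface (transitions, abstract_state) are hypothetical stand-ins; the saturated-cost computation below is one standard way to realize "minimum costs preserving all estimates" when costs must stay non-negative.

    import math

    def saturated_costs(abstraction, distances):
        # Smallest non-negative cost function preserving all goal distances:
        # for each operator, the largest drop in goal distance along any of its
        # abstract transitions (transitions with infinite distances are skipped).
        saturated = {}
        for state, op, succ in abstraction.transitions:
            if math.isinf(distances[state]) or math.isinf(distances[succ]):
                continue
            saturated[op] = max(saturated.get(op, 0), distances[state] - distances[succ])
        return saturated

    def saturated_cost_partitioning(abstractions, costs):
        # For each abstraction in the given order: evaluate it under the remaining
        # costs, keep only its saturated costs, and pass the rest on.
        remaining = dict(costs)
        components = []
        for abstraction in abstractions:
            distances = compute_goal_distances(abstraction, remaining)
            components.append((abstraction, distances))
            for op, cost in saturated_costs(abstraction, distances).items():
                remaining[op] -= cost
        return components

    def scp_estimate(components, state):
        # The SCP heuristic value is the sum of the component estimates.
        return sum(distances[abstraction.abstract_state(state)]
                   for abstraction, distances in components)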

  39-42. Solved Tasks [bar chart: number of solved tasks: CEGAR 706, goals 774, landmarks 785, LMs+goals 798; baselines iPDB 881, M&S 808, PhO-Sys2 737]

  43-45. Order of Heuristics Is Important [figure: the two abstractions with the cost partitionings induced by both orders] • one order yields h^SCP→(s2) = 5 + 3 = 8 • the reverse order yields h^SCP←(s2) = 3 + 4 = 7

  46-47. Finding a Good Order • n heuristics → n! orders → search for a good order: greedy initial order + optimization

  48-49. Greedy Orders • goal: high estimates and low costs → order by heuristic/costs ratio
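One way to realize "order by heuristic/costs ratio", reusing math, compute_goal_distances and saturated_costs from the sketch above; the exact scoring function used in the thesis may differ, and sample_state is an assumption:

    def greedy_order(abstractions, costs, sample_state):
        # Score each abstraction by its estimate for the sample state divided by
        # the total cost it would consume, then order by decreasing score.
        def score(abstraction):
            distances = compute_goal_distances(abstraction, costs)
            estimate = distances[abstraction.abstract_state(sample_state)]
            used = sum(saturated_costs(abstraction, distances).values())
            return estimate / used if used else math.inf
        return sorted(abstractions, key=score, reverse=True)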

  50-51. Solved Tasks [bar chart: number of solved tasks: CEGAR 706, goals 774, landmarks 785, LMs+goals 798, greedy 866; baselines iPDB 881, M&S 808, PhO-Sys2 737]

  52. Optimized Orders • finding an initial order is usually only the first step • hill-climbing search: start with the initial order; until no better successor is found: switch positions of two heuristics and commit to the first improving successor (see the sketch below)
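A sketch of the hill-climbing search on this slide; evaluate is a hypothetical quality function for an order (for instance, the sum of SCP estimates over a set of sample states, which the slide does not specify):

    import itertools

    def hill_climbing(order, evaluate):
        # First-improvement hill climbing over heuristic orders.
        current, value = list(order), evaluate(order)
        improved = True
        while improved:
            improved = False
            for i, j in itertools.combinations(range(len(current)), 2):
                successor = list(current)
                successor[i], successor[j] = successor[j], successor[i]  # swap two heuristics
                new_value = evaluate(successor)
                if new_value > value:        # commit to the first improving successor
                    current, value = successor, new_value
                    improved = True
                    break
        return current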
