
A Portfolio Approach for Enforcing Minimality in a Tree Decomposition - PowerPoint PPT Presentation



  1. A Portfolio Approach for Enforcing Minimality in a Tree Decomposition. Daniel J. Geschwender 1,2, R.J. Woodward 1,2, B.Y. Choueiry 1,2, S.D. Scott 2. 1 Constraint Systems Laboratory, 2 Department of Computer Science and Engineering, University of Nebraska-Lincoln. Acknowledgements • Experiments conducted at UNL’s Holland Computing Center • Geschwender supported by an NSF Graduate Research Fellowship, Grant No. 1041000 • NSF Grants No. RI-111795 and RI-1619344

  2. Daniel Geschwender • 3rd-year PhD student at the University of Nebraska-Lincoln’s Constraint Systems Laboratory • Studying high-level relational consistencies and automated techniques for determining when to apply them • Always ready to play a board game!

  3. Claim • We advocate the use of a cluster-level portfolio for enforcing minimality on the clusters of a tree decomposition during lookahead in a backtrack search for solving CSPs

  4. Outline • Background – Minimality: property and algorithms (AllSol, PerTuple) – Minimality in a tree decomposition • Processing clusters: FilterClusters – GAC interleave – Cluster-level portfolio – Cluster-processing timeout • Training the classifier • Experiments • Conclusion

  5. Background: Minimality • A global consistency property • Every tuple in a relation can be extended to a full solution over the m relations, i.e., it is consistent with the remaining m - 1 relations
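To make the property concrete, here is a minimal Python sketch (not from the slides or the paper) that enforces minimality by brute force on a toy CSP encoded as table constraints (a scope plus its allowed tuples): a tuple is kept only if some full solution extends it. All names and the encoding are illustrative assumptions; the AllSol and PerTuple algorithms on the next slides compute the same filtering far more efficiently.

```python
def solutions(variables, domains, relations, assignment=None):
    """Enumerate all full assignments consistent with every table constraint.
    A relation is a (scope, tuples) pair; scope is a tuple of variable names."""
    assignment = {} if assignment is None else assignment
    if len(assignment) == len(variables):
        yield dict(assignment)
        return
    var = next(v for v in variables if v not in assignment)
    for val in domains[var]:
        assignment[var] = val
        consistent = all(
            any(all(assignment.get(x, t[i]) == t[i] for i, x in enumerate(scope))
                for t in tuples)
            for scope, tuples in relations)
        if consistent:
            yield from solutions(variables, domains, relations, assignment)
        del assignment[var]

def enforce_minimality(variables, domains, relations):
    """Keep only the tuples that appear in at least one full solution."""
    minimal = []
    for scope, tuples in relations:
        kept = [t for t in tuples
                if any(all(sol[x] == t[i] for i, x in enumerate(scope))
                       for sol in solutions(variables, domains, relations))]
        minimal.append((scope, kept))
    return minimal

# Tiny example: with Y forced to 0 by R2, the tuples of R1 using Y = 1 are dropped.
variables = ['X', 'Y']
domains = {'X': [0, 1], 'Y': [0, 1]}
relations = [(('X', 'Y'), [(0, 0), (0, 1), (1, 1)]), (('Y',), [(0,)])]
print(enforce_minimality(variables, domains, relations))
# [(('X', 'Y'), [(0, 0)]), (('Y',), [(0,)])]
```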

  6. Background: AllSol / PerTuple • AllSol [Karakashian, PhD 2013] • One search explores the entire search space • Finds all solutions without storing them; keeps the tuples that appear in at least one solution • Better when there are many ‘almost’ solutions

  7. Background: AllSol / PerTuple • PerTuple [Karakashian, PhD 2013] • For each tuple, finds one solution where it appears • Many searches that stop after the first solution • Better when many solutions are available
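A hedged sketch (not the authors' implementation) contrasting the two strategies just described. It assumes a cluster encoded as table constraints and a caller-supplied `search` generator that enumerates the cluster's solutions, optionally forcing one tuple; that helper and the data layout are illustrative assumptions.

```python
def all_sol(relations, search):
    """AllSol: a single exhaustive search; keep every tuple seen in some solution.
    Pays off when many 'almost' solutions would make per-tuple searches expensive."""
    seen = [set() for _ in relations]
    for solution in search(relations):                  # explores the whole search space once
        for rel_id, (scope, _) in enumerate(relations):
            seen[rel_id].add(tuple(solution[v] for v in scope))
    return [(scope, [t for t in tuples if t in seen[rel_id]])
            for rel_id, (scope, tuples) in enumerate(relations)]

def per_tuple(relations, search):
    """PerTuple: one search per tuple, stopped at the first solution found.
    Pays off when solutions are plentiful and each search succeeds quickly."""
    filtered = []
    for rel_id, (scope, tuples) in enumerate(relations):
        kept = [t for t in tuples
                if next(search(relations, forced=(rel_id, t)), None) is not None]
        filtered.append((scope, kept))
    return filtered
```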

  8. Background: Tree decomposition, minimality • Minimality on clusters [Karakashian+ AAAI 2013] – Build a tree decomposition – Localize minimality to clusters – During search, after a variable instantiation • Enforce minimality on clusters • Propagate following the tree structure • FilterClusters implements three improvements – GAC interleave – Cluster-level portfolio – Cluster-processing timeout

  9. FilterClusters: GAC interleave • It is often beneficial to run a lightweight algorithm (e.g., GAC) prior to running a more costly algorithm • We extend this idea and interleave a global GAC run between the processing of clusters: Minimality, GAC, Minimality

  10. FilterClusters: Cluster-level portfolio • The performance of AllSol and PerTuple varies • Sometimes both algorithms are too costly • Use an algorithm portfolio at the cluster level: each cluster is classified as ‘AllSol’, ‘PerTuple’, or ‘Neither’ – Different algorithms on different clusters – Different algorithms on the same cluster during propagation

  11. FilterClusters: Cluster timeout • Limits the time for processing a single cluster • Allows recovery from a poor classification • When interrupted, the partial results of – PerTuple yield useful filtering – AllSol are useless
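One way to realize such a per-cluster timeout, sketched in Python under the assumption of a POSIX system and a `process_cluster` callable; this is illustrative, not the authors' mechanism.

```python
import signal

class ClusterTimeout(Exception):
    """Raised when processing a single cluster exceeds its time budget."""

def run_with_timeout(process_cluster, cluster, seconds):
    """Run process_cluster(cluster) for at most `seconds` whole wall-clock seconds
    (POSIX SIGALRM). Returns (result, finished)."""
    def on_alarm(signum, frame):
        raise ClusterTimeout()
    previous = signal.signal(signal.SIGALRM, on_alarm)
    signal.alarm(seconds)
    try:
        return process_cluster(cluster), True
    except ClusterTimeout:
        # Interrupted: per the slide, PerTuple's partial filtering is still sound
        # and can be kept, while AllSol's partial bookkeeping must be discarded.
        return cluster, False
    finally:
        signal.alarm(0)
        signal.signal(signal.SIGALRM, previous)
```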

  12. Classifier Training: Data • 9,362 individual clusters taken from 175 benchmarks • For each cluster instance i, collected – The values of 73 classification features – The runtime of AllSol: allSol(i) – The runtime of PerTuple: perTuple(i)

  13. Classifier Training: Labels • If allSol(i) > 10 min and perTuple(i) > 10 min, label the instance ‘Neither’ • Otherwise, if allSol(i) > perTuple(i), label it ‘PerTuple’; else label it ‘AllSol’ • [Scatter plot: runtime of all instances, PerTuple time (msec) vs. AllSol time (msec), colored by label]
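The labeling rule restated as a small Python function for clarity (times in milliseconds, matching the plot's axes); this is a direct transcription of the decision above, not code from the paper.

```python
TEN_MINUTES_MS = 10 * 60 * 1000

def label(all_sol_ms, per_tuple_ms):
    """Label a training cluster instance from its two measured runtimes."""
    if all_sol_ms > TEN_MINUTES_MS and per_tuple_ms > TEN_MINUTES_MS:
        return 'Neither'                # both algorithms are too costly on this cluster
    # Otherwise label the instance with the faster of the two algorithms.
    return 'PerTuple' if all_sol_ms > per_tuple_ms else 'AllSol'
```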

  14. Classifier Training: Weights • Weight of a training instance i: weight(i) = w(allSol(i), perTuple(i)) if label(i) = ‘AllSol’ or ‘PerTuple’, and weight(i) = 20 if label(i) = ‘Neither’ • w(a, p) = |log10(a / p)| · |log10(|a - p| + 0.01)| • Designed to emphasize instances with both a large proportional difference (a / p) and a large absolute difference (|a - p|)
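A small Python transcription of the weighting scheme, with the caveat that the original slide's math does not render cleanly here (in particular, any rounding of the two factors is not legible); treat it as a sketch of the intent rather than the exact formula.

```python
import math

NEITHER_WEIGHT = 20

def weight(all_sol_ms, per_tuple_ms, lbl):
    """Training weight of a cluster instance, following the formula above."""
    if lbl == 'Neither':
        return NEITHER_WEIGHT
    a, p = all_sol_ms, per_tuple_ms
    proportional = abs(math.log10(a / p))            # large when one algorithm is many times faster
    absolute = abs(math.log10(abs(a - p) + 0.01))    # large when the absolute runtime gap is big
    return proportional * absolute
```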

  15. Classifier Training: Features • CSP parameters – #variables, #constraints, #values, #tuples – Constraint arity, constraint tightness – Relational linkage • Graph parameters: on the dual, primal, and incidence graphs – Density – Degree – Eccentricity – Clustering coefficient • Using several descriptive statistics – min, max, mean, coefficient of variation, entropy
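As an illustration of the last bullet, a hedged Python sketch of the descriptive statistics applied to one per-cluster quantity (for example, the degree sequence of the cluster's dual graph). The slides do not specify the exact entropy definition, so the normalization used here is an assumption.

```python
import math

def describe(values):
    """min, max, mean, coefficient of variation, and entropy of one
    non-empty feature distribution."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    total = sum(values)
    probabilities = [v / total for v in values if v > 0] if total else []
    return {
        'min': min(values),
        'max': max(values),
        'mean': mean,
        'coefficient_of_variation': std / mean if mean else 0.0,
        'entropy': -sum(p * math.log2(p) for p in probabilities),
    }
```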

  16. Classifier Training: Decision tree • We built a decision-tree classifier using the J48 algorithm from the Weka machine-learning software • A decision tree was selected for: – Simplicity – Fast evaluation time – Only requiring the collection of a subset of the features
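The slides use Weka's J48 (a C4.5 implementation); purely as a stand-in, here is a minimal scikit-learn sketch of training a weighted decision tree on the same kind of data. The function and variable names are illustrative assumptions, not the authors' setup.

```python
from sklearn.tree import DecisionTreeClassifier

def train_classifier(features, labels, weights):
    """features: one row of 73 values per cluster instance;
    labels: 'AllSol' / 'PerTuple' / 'Neither'; weights: per-instance weights."""
    classifier = DecisionTreeClassifier()
    classifier.fit(features, labels, sample_weight=weights)
    return classifier
```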

  17. Experiments: Setup • Used 1,055 instances from 42 benchmarks • Backtrack search with dynamic dom/deg ordering • Intel Xeon E5-2650 v3 2.30 GHz processors with 12 GB memory • 2-hour total timeout per instance • Compared GAC and six strategies (variants of FilterClusters)

  18. Experiments: Tested strategies

  19. Experiments: Results
  Strategy              GAC     AllSol   PerTuple   AllSol+   PerTuple+   Random   DecTree
  Completed Instances   550     472      567        514       633         643      685
  Average Time (s)      2,471   3,075    2,081      2,789     1,622       1,427    1,121

  20. Conclusions • A cluster-level portfolio, during lookahead – Is not only feasible, but also highly competitive • Enforcing a timeout on consistency algorithms – Prevents getting stuck on one part of the problem – Does not affect soundness • Future work – Dynamically determine the timeout based on the anticipated amount of filtering – Heuristics for ordering the clusters

  21. Thank you. Questions?


  24. Classifier Training: Evaluation • Using 10-fold cross-validation • Using both weighted and unweighted instances

  25. FilterClusters • Enforce GAC globally • Build the cluster List • Repeat until quiescence: – For each cluster C in List: classify C as ‘AllSol’, ‘PerTuple’, or ‘Neither’, then process C within the time limit – Enforce GAC globally – Reverse List
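The same loop rendered as a hedged Python sketch. The GAC, classification, and cluster-processing routines are passed in as callables because the slides do not show their internals; all names here are placeholders rather than the authors' API.

```python
def filter_clusters(clusters, enforce_gac, classify, process_within_limit):
    """Sketch of the FilterClusters loop above. classify(C) returns 'AllSol',
    'PerTuple', or 'Neither'; process_within_limit(C, choice) enforces minimality
    on C under the per-cluster timeout and returns True iff it removed a tuple."""
    enforce_gac()
    order = list(clusters)
    changed = True
    while changed:                          # repeat until quiescence
        changed = False
        for cluster in order:
            choice = classify(cluster)
            if choice != 'Neither' and process_within_limit(cluster, choice):
                changed = True
        enforce_gac()                       # interleaved global GAC run
        order.reverse()                     # sweep the clusters back and forth
    return clusters
```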

  26. Experiments: Tested strategies (2) • [Scatter plot: runtime of all instances, PerTuple time (msec) vs. AllSol time (msec), with the 1-second cutoff per cluster marked]

  27. Experiments: Results (2) • [Plot: instance completions by runtime, 0.01 to 1000 s, for DecTree, Random, PerTuple+, GAC, PerTuple, AllSol+, and AllSol]

  28. Background: Tree decomposition, minimality • Build a tree decomposition • Localize the enforcement of minimality to the clusters • Process clusters in MaxCliques order, back and forth, to quiescence • [Figure: a constraint network and the clusters C1-C10 of its tree decomposition]
