Automatic Solver Configuration and Solver Portfolios, Meinolf Sellmann (PowerPoint PPT Presentation)

  1. Automatic Solver Configuration and Solver Portfolios. Meinolf Sellmann, IBM Research Watson, AI for Optimization

  2. Why Tune Algorithms? [Diagram: the expert develops, pretunes, and documents the algorithm and its parameters; the users then tune it on their own instances.]

  3. Why Tune Algorithms?

  4. Tuning vs Configuration

  5. Why Tune Algorithms?
     • Algorithms have parameters
       – Implicit in the implementation
       – Open to the user
       – Big influence on practical performance (speed, accuracy, robustness, etc.)
     • The practice: manual tuning
       – Takes a lot of time, often not very good
       – Requires the user to learn the meaning of the parameters
     • Objectives
       – Automate tuning
       – Automatic algorithm customization
       – Aid developers in algorithm configuration
       – Enable fair comparison of algorithms

  6. Why Bundle Algorithms?

  7. Why Bundle Algorithms?

  8. Content
     • Instance-Oblivious Tuning
       – Overview of Approaches
       – Parameters: Variable Tree Representation
       – GGA: Gender-Based Genetic Algorithm
       – GGA: Numerical Results
     • Algorithm Portfolios
       – Overview of Approaches
       – SATzilla
       – CP-Hydra
       – 3S
     • Instance-Specific Tuning
       – Overview of Approaches
       – ISAC: Feature-Based Parameter Selection
       – ISAC: Numerical Results

  9. Instance-Oblivious Tuning
     • One parameter set fits all
     • Most common way of tuning
     • Customization only by parameterization + manual tuning

  10. Overview of Methods
      • Popular tuning methods
        – Enumerate all configurations
        – Test specific configurations (based on some understanding of the parameters)
        – Hand tuning (usually by limited local search)
        – Automated tuning

  11. Overview of Methods
      • Continuous parameters
        – Mesh-Adaptive Direct Search, MADS [Audet et al., '06]
        – Population-based, e.g. CMA-ES [Hansen et al., '95]
      • Categorical parameters
        – Hill climbing, Composer [Gratch et al., '92]
        – Beam search, MULTI-TAC [Minton, '93]
        – Racing algorithms, F-Race [Birattari et al., '02]
        – CALIBRA [Adenso-Diaz & Laguna, '06]
        – Iterated Local Search, ParamILS [Hutter et al., '07]
      • Model-based parameter optimization
        – Sequential Parameter Optimization (SPO) [Bartz-Beielstein et al., '05]
        – Extensions of SPO [Hutter et al., '09]
      • Non-model-based configuration for general parameters
        – Gender-based genetic algorithm (GGA) [Ansotegui et al., '09]

  12. Covariance Matrix Adaptation Evolution Strategy (CMA-ES)
      • General optimizer for highly non-linear continuous optimization problems
      • Black-box optimization (derivatives not available)
      • The typical difference quotients are not useful
      • Discontinuities
      • Noise and outliers
      • Many local optima
      • In summary: black-box optimization in a rough or rugged landscape

  13. Covariance Matrix Adaptation Evolution Strategy (CMA-ES)
      • Repeat
        – Sample m times around the "point of interest" μ according to N(μ, Σ).
        – Determine the best sampling point and set μ to it.
        – Adapt Σ.
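
Below is a minimal Python sketch of this sample/select/adapt loop, assuming NumPy. The objective f, the sample count m, and the crude rank-one covariance update are illustrative stand-ins, not the actual CMA-ES update rules (which adapt a separate step size and recombine several weighted samples).

```python
import numpy as np

def cma_es_sketch(f, mu, sigma, m=20, iterations=200, rng=None):
    """Toy sample/select/adapt loop in the spirit of CMA-ES.

    f     : black-box objective to minimize
    mu    : point of interest (mean of the search distribution)
    sigma : covariance matrix of the search distribution
    m     : number of samples drawn around mu per iteration
    """
    if rng is None:
        rng = np.random.default_rng(0)
    for _ in range(iterations):
        # Sample m times around the point of interest according to N(mu, sigma).
        samples = rng.multivariate_normal(mu, sigma, size=m)
        # Determine the best sampling point and move the point of interest to it.
        best = min(samples, key=f)
        step = best - mu
        mu = best
        # Adapt the covariance: stretch it toward the direction of the successful
        # step (a crude stand-in for the real covariance and step-size adaptation).
        sigma = 0.9 * sigma + 0.1 * np.outer(step, step)
    return mu

# Example: a rugged 2D objective (shifted quadratic plus a sine term).
if __name__ == "__main__":
    f = lambda x: float(np.sum((x - np.array([3.0, -1.0])) ** 2) + np.sin(10 * x).sum())
    print(cma_es_sketch(f, mu=np.zeros(2), sigma=np.eye(2)))
```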

  14.–17. Covariance Matrix Adaptation Evolution Strategy (illustration slides)

  18. Multi-TAC
      • Selector for heuristics in backtrack search
      • Beam-search approach (sketched below)
      • Repeat
        – Add a single heuristic to each current search method
        – Evaluate all resulting search methods
        – Keep the best m methods
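
A schematic Python sketch of this beam-search loop. The encoding of a search method as a tuple of heuristic choices mirrors the (o, o, o)-style tuples on the following slides, but is simplified to one heuristic per slot (in the slides, a slot can also receive an ordered list of tie-breaking heuristics, e.g. (1, o, o) → (12, o, o)); evaluate is a hypothetical callback that scores a (partial) method by running it on training instances.

```python
def multi_tac_beam_search(num_slots, heuristics, evaluate, beam_width):
    """Beam search over heuristic assignments, Multi-TAC style.

    num_slots  : number of decision points that need a heuristic
    heuristics : candidate heuristics for each slot
    evaluate   : hypothetical scoring function for a (partial) search method,
                 higher is better; stands in for running it on training instances
    beam_width : the 'm' from the slide: number of methods kept per level
    """
    beam = [(None,) * num_slots]                  # the empty method (o, o, ..., o)
    while any(None in method for method in beam):
        # Add a single heuristic to each current search method (one per open slot).
        candidates = set()
        for method in beam:
            for i, slot in enumerate(method):
                if slot is None:
                    for h in heuristics:
                        candidates.add(method[:i] + (h,) + method[i + 1:])
        # Evaluate all resulting search methods and keep the best beam_width of them.
        beam = sorted(candidates, key=evaluate, reverse=True)[:beam_width]
    return beam
```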

  19.–21. Multi-TAC: beam-search example. Starting from the empty method (o, o, o), each level adds one heuristic to one slot of every surviving method, either filling an open slot or appending a tie-breaker behind an already chosen heuristic (e.g. (1, o, o) → (12, o, o)); only the best methods are kept and expanded further.

  22. F-Race
      • How to determine whether one metaheuristic works better than another?
      • Repeat
        – Pick a new instance
        – Run and rank all algorithms still in the race
        – Remove inferior algorithms

  23. F-Race: example rank table (one row of ranks per instance for the algorithms still in the race)

  24. F-Race: Friedman test (used to decide whether the rank differences between the surviving algorithms are statistically significant)
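
A simplified Python sketch of the race loop from slides 22–24, assuming SciPy for the Friedman test. run is a hypothetical callback returning the cost of a configuration on an instance, and the elimination step uses a crude mean-rank threshold in place of the pairwise post-hoc tests used by the real F-Race.

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

def f_race_sketch(candidates, instances, run, min_instances=5, alpha=0.05):
    """Race candidate configurations on a stream of instances.

    candidates : configurations still in the race
    instances  : iterable of training instances
    run        : hypothetical run(candidate, instance) -> cost (lower is better)
    """
    survivors = list(candidates)
    costs = []                                    # one row of costs per seen instance
    for seen, instance in enumerate(instances, start=1):
        # Pick a new instance, then run and rank all algorithms still in the race.
        costs.append([run(c, instance) for c in survivors])
        if seen < min_instances or len(survivors) < 3:
            continue                              # Friedman test needs >= 3 groups
        data = np.array(costs)                    # shape: (instances, survivors)
        _, p = friedmanchisquare(*data.T)
        if p < alpha:
            # Remove inferior algorithms: here simply those whose mean rank is far
            # from the best mean rank (the real F-Race uses pairwise post-hoc tests).
            ranks = np.apply_along_axis(rankdata, 1, data)
            mean_rank = ranks.mean(axis=0)
            keep = mean_rank <= mean_rank.min() + 1.0   # illustrative threshold
            survivors = [c for c, kept in zip(survivors, keep) if kept]
            costs = [list(np.asarray(row)[keep]) for row in costs]
    return survivors
```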

  25. GGA
      • General-purpose tuner
      • Handles various types of parameters
      • Provides high-quality configurations
        – Robustly
        – With reasonable computational effort
      • Exploits
        – Optimization technology
        – Parallelism

  26. Content
      • Instance-Oblivious Tuning
        – Overview of Approaches
        – Parameters: Variable Tree Representation
        – GGA: Gender-Based Genetic Algorithm
        – GGA: Numerical Results
      • Algorithm Portfolios
        – Overview of Approaches
        – SATzilla
        – CP-Hydra
        – 3S
      • Instance-Specific Tuning
        – Overview of Approaches
        – ISAC: Feature-Based Parameter Selection
        – ISAC: Numerical Results

  27. Variable Trees: diagram of a parameter tree with categorical, ordinal, and numerical parameters; an and-node ("&") expresses independence between its subtrees.

  28. Variable Trees
      • Parameter structure represented by an And-Or structure
      • Represents parameter (in)dependence
      • Example: f(x) = Σ_{i ∈ [0,n]} (x_{3i} · x_{3i+1} · x_{3i+2} − q_i)²
        The root is an and-node with children x_0, …, x_{3i}, …, x_{3n}; each x_{3i} has the children x_{3i+1} and x_{3i+2}, since the three variables of a summand depend on each other while the summands are independent.
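
A minimal Python sketch of how the And-Or parameter tree for this example might be represented; the node classes and field names are illustrative assumptions, not GGA's actual data structures.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Param:
    """A parameter node; its children are parameters that depend on it."""
    name: str
    kind: str                                     # "categorical", "ordinal", or "numerical"
    children: List["Node"] = field(default_factory=list)

@dataclass
class And:
    """An and-node: its child subtrees are mutually independent."""
    children: List["Node"] = field(default_factory=list)

Node = Union[Param, And]

def example_tree(n: int) -> And:
    """Tree for f(x) = sum_{i in [0,n]} (x_{3i} * x_{3i+1} * x_{3i+2} - q_i)^2:
    the three variables of one summand depend on each other, the summands do not."""
    return And(children=[
        Param(f"x_{3 * i}", "numerical", children=[
            Param(f"x_{3 * i + 1}", "numerical"),
            Param(f"x_{3 * i + 2}", "numerical"),
        ])
        for i in range(n + 1)
    ])
```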

  29. Generic Crossover Operator: diagram of the crossover between a competitive (C) and a non-competitive (N) parent; the offspring tree is built by labeling each node of the variable tree with the parent (C or N) from which its parameter value is taken.

  30. Genetic Algorithm
      • Computational limitations
        – Low number of individuals
        – Low number of generations
      • What did nature do when going from … to …? (images omitted)

  31. Gender-Based Genetic Algorithm
      • Optimization problems
        – Low number of generations → aggressive optimization
        – Low number of individuals → emphasis on diversity
      • How can genders help?
        – Split the population into two genders: competitive (C) and non-competitive (N)
        – Save 50% of the evaluations
        – Racing: winners determine the evaluation time!
        – Can afford aggressive selection pressure on C
        – Individuals in N provide the needed diversity

  32. Gender-Based Genetic Algorithm: diagram of one generation: the competitive (C) individuals race in a tournament, the winners are crossed with non-competitive (N) mates and mutated, and aging and death remove old individuals.

  33. Population Control
      • All members have an age
      • Only 2/A of the N population mates
      • 1/A of each population dies of old age at age A
      [Diagram: age buckets 1…A for both genders; a share of each age group mates and produces children; the oldest group, 1/A of the population, dies of old age.]
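
A toy Python sketch of the GGA generation loop described on slides 31–33. race, crossover, and mutate are hypothetical callbacks, and the gender assignment and mating fraction are simplified stand-ins for GGA's actual population control.

```python
import random

def gga_sketch(initial_settings, race, crossover, mutate, generations=40, max_age=3):
    """Toy gender-based GA loop.

    initial_settings : list of parameter settings (e.g. leaf values of a variable tree)
    race             : race(group) -> winner; only competitive individuals are ever run,
                       and the race winners determine the evaluation (cutoff) time
    crossover/mutate : hypothetical operators on parameter settings
    max_age          : the 'A' from the population-control slide
    """
    population = [{"genes": g, "gender": "C" if i % 2 == 0 else "N", "age": 0}
                  for i, g in enumerate(initial_settings)]
    winner = None
    for _ in range(generations):
        C = [p for p in population if p["gender"] == "C"]
        N = [p for p in population if p["gender"] == "N"]
        # Tournament/race within C only: N is never evaluated, saving ~50% of evaluations.
        winner = race(C)
        # The winner mates with a fraction (roughly 2/A) of the non-competitive population;
        # N contributes the diversity that the aggressive selection on C removes.
        mates = random.sample(N, min(len(N), max(1, 2 * len(N) // max_age)))
        children = [{"genes": mutate(crossover(winner["genes"], m["genes"])),
                     "gender": "C" if j % 2 == 0 else "N", "age": 0}
                    for j, m in enumerate(mates)]
        # Aging and death: individuals that reach age A die of old age.
        for p in population:
            p["age"] += 1
        population = [p for p in population if p["age"] < max_age] + children
    return winner["genes"]
```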

  34. Content
      • Instance-Oblivious Tuning
        – Overview of Approaches
        – Parameters: Variable Tree Representation
        – GGA: Gender-Based Genetic Algorithm
        – GGA: Numerical Results
      • Algorithm Portfolios
        – Overview of Approaches
        – SATzilla
        – CP-Hydra
        – 3S
      • Instance-Specific Tuning
        – Overview of Approaches
        – ISAC: Feature-Based Parameter Selection
        – ISAC: Numerical Results

  35. Results
      • Test against a standard GA
        – 3 functions with various dependencies
        – Tested with various population sizes and numbers of generations

  36. Results

  37. Results: plot of competitive and non-competitive best fitness and average fitness.

  38. Results
      • Standard GA vs. GGA tuning the SAT solver SAPS
      • 40 generations with 30 members
      • Cutoff of 10 seconds
      • The GA wastes lots of time on bad solutions

  39. Results: plots for SAPS (in ms) and SAT4J (in s).
