

  1. AI-Augmented Algorithms: How I Learned to Stop Worrying and Love Choice. Lars Kotthoff, University of Wyoming, larsko@uwyo.edu. Warsaw, 17 April 2019

  2. Outline ▷ Big Picture ▷ Motivation ▷ Choosing Algorithms ▷ Tuning Algorithms ▷ Applications ▷ Outlook and Resources

  3. Big Picture ▷ advance the state of the art through meta-algorithmic techniques ▷ rather than inventing new things, use existing things more intelligently – automatically ▷ invent new things through combinations of existing things

  4. Motivation – What Difference Does It Make?

  5. Prominent Application Fréchette, Alexandre, Neil Newman, Kevin Leyton-Brown. “Solving the Station Packing Problem.” In Association for the Advancement of Artificial Intelligence (AAAI), 2016.

  6. Performance Differences Hurley, Barry, Lars Kotthoff, Yuri Malitsky, and Barry O’Sullivan. “Proteus: A Hierarchical Portfolio of Solvers and Transformations.” In CPAIOR, 2014. [scatter plot: runtime of Virtual Best SAT vs. Virtual Best CSP, log-log axes from 0.1 to 1000]

  7. Performance Improvements Hutter, Frank, Domagoj Babic, Holger H. Hoos, and Alan J. Hu. “Boosting Verification by Automatic Tuning of Decision Procedures.” In FMCAD ’07: Proceedings of the Formal Methods in Computer Aided Design, 27–34. Washington, DC, USA: IEEE Computer Society, 2007. [scatter plot: SPEAR optimized for SWV (s) vs. SPEAR original default (s), log-log axes from 10⁻² to 10⁴]

  8. Common Theme Performance models of black-box processes ▷ also called surrogate models ▷ substitute expensive underlying process with cheap approximate model ▷ build approximate model using machine learning techniques based on results of evaluations of the underlying process ▷ no knowledge of what the underlying process is required (but can be helpful) ▷ may facilitate better understanding of the underlying process through interrogation of the model
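A minimal sketch of this idea in Python (names and the toy black box are illustrative, not from the talk): fit a cheap regression model to a handful of expensive evaluations, then interrogate the model instead of the process.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def expensive_run(x):
    # stand-in for the expensive underlying process (e.g. a solver run)
    return (x[0] - 0.3) ** 2 + 0.1 * np.sin(10 * x[1])

# evaluate the underlying process at a few sampled points
X = np.random.rand(50, 2)
y = np.array([expensive_run(x) for x in X])

# build the approximate (surrogate) model from the observed evaluations
surrogate = RandomForestRegressor(n_estimators=100).fit(X, y)

# interrogate the cheap surrogate instead of the expensive process
candidates = np.random.rand(1000, 2)
best = candidates[np.argmin(surrogate.predict(candidates))]
print("predicted best input:", best)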

  9. Choosing Algorithms

  10. Algorithm Selection Given a problem, choose the best algorithm to solve it. Rice, John R. “The Algorithm Selection Problem.” Advances in Computers 15 (1976): 65–118.

  11. Algorithm Selection [diagram: training instances (Instance 1–3) and a portfolio (Algorithm 1–3) feed feature extraction and a performance model; for new instances, feature extraction feeds algorithm selection, e.g. Instance 4: Algorithm 2, Instance 5: Algorithm 3, Instance 6: Algorithm 3]
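The workflow in the diagram can be sketched in a few lines of Python; everything below (feature counts, random data) is illustrative, and real systems use problem-specific feature extractors and measured runtimes.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_train, n_features, n_algorithms = 200, 5, 3

features = rng.random((n_train, n_features))    # extracted instance features
runtimes = rng.random((n_train, n_algorithms))  # performance of each portfolio member
labels = runtimes.argmin(axis=1)                # best algorithm per training instance

# performance model: predict the best algorithm from instance features
selector = RandomForestClassifier().fit(features, labels)

new_instance = rng.random((1, n_features))      # features of an unseen instance
print("chosen algorithm:", selector.predict(new_instance)[0])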

  12. Algorithm Portfolios ▷ instead of a single algorithm, use several complementary algorithms ▷ idea from Economics – minimise risk by spreading it out across several securities ▷ same for computational problems – minimise risk of algorithm performing poorly ▷ in practice often constructed from competition winners or other algorithms known to have good performance Huberman, Bernardo A., Rajan M. Lukose, and Tad Hogg. “An Economics Approach to Hard Computational Problems.” Science 275, no. 5296 (1997): 51–54. doi:10.1126/science.275.5296.51.

  13. Algorithms “algorithm” used in a very loose sense ▷ algorithms ▷ heuristics ▷ machine learning models ▷ software systems ▷ machines ▷ …

  14. Parallel Portfolios Why not simply run all algorithms in parallel? ▷ not enough resources may be available/waste of resources ▷ algorithms may be parallelized themselves ▷ memory/cache contention

  15. Building an Algorithm Selection System ▷ requires algorithms with complementary performance ▷ most approaches rely on machine learning ▷ train with representative data, i.e. performance of all algorithms in portfolio on a number of instances ▷ evaluate performance on separate set of instances ▷ potentially large amount of prep work
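For the evaluation step, selectors are commonly compared against the virtual best (the oracle from slide 6) and the best single algorithm; a sketch with illustrative data, assuming a runtime matrix over a held-out instance set:

import numpy as np

rng = np.random.default_rng(1)
test_runtimes = rng.random((100, 3))          # held-out instances x algorithms
selected = rng.integers(0, 3, size=100)       # the selector's choice per instance

virtual_best = test_runtimes.min(axis=1).mean()    # oracle: always the best choice
single_best = test_runtimes.mean(axis=0).min()     # best fixed algorithm overall
selector_cost = test_runtimes[np.arange(100), selected].mean()

print(f"virtual best {virtual_best:.3f}, selector {selector_cost:.3f}, "
      f"single best {single_best:.3f}")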

  16. Key Components of an Algorithm Selection System ▷ feature extraction ▷ performance model ▷ prediction-based selector/scheduler; optional: ▷ presolver ▷ secondary/hierarchical models and predictors (e.g. for feature extraction time)

  17. Types of Performance Models [diagram: regression models predict performance per algorithm (e.g. A1: 1.2, A2: 4.5, A3: 3.9); a classification model predicts the best algorithm directly; pairwise classification models (A1 vs. A2, A1 vs. A3, …) vote, e.g. A3: 2 votes, A1: 1 vote, A2: 0 votes; pairwise regression models predict performance differences, e.g. A1 − A2, A1 − A3]
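As an example of the pairwise scheme, here is a hedged sketch (illustrative data again): one classifier per pair of algorithms predicts the winner of that pair, and the algorithm with the most votes is selected.

from itertools import combinations
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
features = rng.random((200, 5))
runtimes = rng.random((200, 3))

# one classifier per pair: does algorithm j beat algorithm i?
pairwise = {}
for i, j in combinations(range(3), 2):
    beats = (runtimes[:, j] < runtimes[:, i]).astype(int)
    pairwise[(i, j)] = RandomForestClassifier().fit(features, beats)

def select(x):
    votes = np.zeros(3)
    for (i, j), model in pairwise.items():
        winner = j if model.predict(x.reshape(1, -1))[0] == 1 else i
        votes[winner] += 1
    return int(votes.argmax())      # algorithm with the most votes

print("selected algorithm:", select(rng.random(5)))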

  18. Tuning Algorithms

  19. Algorithm Configuration Given a (set of) problem(s), find the best parameter configuration.

  20. Parameters? ▷ anything you can change that makes sense to change ▷ e.g. search heuristic, optimization level, computational resolution ▷ not random seed, whether to enable debugging, etc. ▷ some will affect performance, others will have no effect at all

  21. Automated Algorithm Configuration ▷ no background knowledge on parameters or algorithm – black-box process ▷ as little manual intervention as possible ▷ failures are handled appropriately ▷ resources are not wasted ▷ can run unattended on large-scale compute infrastructure

  22. Algorithm Configuration Frank Hutter and Marius Lindauer, “Algorithm Configuration: A Hands on Tutorial”, AAAI 2016

  23. General Approach ▷ evaluate algorithm as black-box function ▷ observe effect of parameters without knowing the inner workings, build surrogate model based on this data ▷ decide where to evaluate next, based on surrogate model ▷ repeat
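A compact sketch of this loop, with the target algorithm stubbed out as a toy function; note that picking the surrogate's predicted minimum, as done here, is purely exploitative, and practical methods balance exploration and exploitation (see the expected-improvement sketch after slide 27).

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def run_algorithm(config):
    # black box: run the target algorithm, return the measured cost
    return (config - 0.7) ** 2 + 0.01 * np.random.randn()

configs = list(np.random.rand(5))                 # initial evaluations
costs = [run_algorithm(c) for c in configs]

for _ in range(20):
    # build surrogate model from the data observed so far
    model = RandomForestRegressor().fit(np.array(configs).reshape(-1, 1), costs)
    # decide where to evaluate next, based on the surrogate
    candidates = np.random.rand(500)
    next_config = candidates[model.predict(candidates.reshape(-1, 1)).argmin()]
    configs.append(next_config)
    costs.append(run_algorithm(next_config))      # repeat

print("best configuration found:", configs[int(np.argmin(costs))])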

  24. When are we done? ▷ most approaches incomplete, i.e. do not exhaustively explore parameter space ▷ cannot prove optimality, not guaranteed to find optimal solution (with finite time) ▷ performance highly dependent on configuration space → How do we know when to stop?

  25. Time Budget How much time/how many function evaluations? ▷ too much → wasted resources ▷ too little → suboptimal result ▷ use statistical tests ▷ evaluate on parts of the instance set ▷ for runtime: adaptive capping (sketched below) ▷ in general: whatever resources you can reasonably invest
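A simplified sketch of adaptive capping (the helper run_with_cutoff is an assumed stand-in, not a real API): a challenger configuration only gets as much total runtime as the incumbent needed, so hopeless challengers are cut off early.

def race_challenger(run_with_cutoff, challenger, instances, incumbent_total):
    # run the challenger on each instance, capping at the remaining budget
    total = 0.0
    for instance in instances:
        remaining = incumbent_total - total
        if remaining <= 0:
            return None    # capped: challenger cannot beat the incumbent
        total += run_with_cutoff(challenger, instance, cutoff=remaining)
    return total           # challenger's total, to compare with incumbent_total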

  26. Grid and Random Search ▷ evaluate certain points in parameter space Bergstra, James, and Yoshua Bengio. “Random Search for Hyper-Parameter Optimization.” J. Mach. Learn. Res. 13, no. 1 (February 2012): 281–305.
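A minimal random-search sketch; the parameter names, ranges, and the toy evaluate function are made up for illustration.

import random

def evaluate(config):
    # stand-in for running the target algorithm with this configuration
    return config["restart_factor"] + (config["heuristic"] != "dom/wdeg")

def sample_config():
    return {
        "heuristic": random.choice(["dom", "dom/wdeg", "impact"]),
        "restart_factor": random.uniform(1.1, 2.0),
    }

best_config, best_cost = None, float("inf")
for _ in range(100):                 # fixed evaluation budget
    config = sample_config()
    cost = evaluate(config)
    if cost < best_cost:
        best_config, best_cost = config, cost

print("best found:", best_config, round(best_cost, 3))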

  27. Model-Based Search ▷ evaluate small number of configurations ▷ build model of parameter-performance surface based on the results ▷ use model to predict where to evaluate next ▷ repeat ▷ allows targeted exploration of new configurations ▷ can take instance features into account like algorithm selection Hutter, Frank, Holger H. Hoos, and Kevin Leyton-Brown. “Sequential Model-Based Optimization for General Algorithm Configuration.” In LION 5, 507–23, 2011.
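One step of this loop with a Gaussian-process surrogate and the expected-improvement criterion, matching the y/yhat/ei panels on the following slides; a sketch under those assumptions, not the talk's actual implementation.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def f(x):                                    # black-box function to minimise
    return np.sin(3 * x) + x ** 2

X = np.random.uniform(-1, 1, (5, 1))         # small number of configurations
y = f(X).ravel()

# model of the parameter-performance surface
gp = GaussianProcessRegressor().fit(X, y)
grid = np.linspace(-1, 1, 200).reshape(-1, 1)
mu, sigma = gp.predict(grid, return_std=True)

# expected improvement over the best evaluation so far (yhat = mu, ei below)
z = (y.min() - mu) / np.maximum(sigma, 1e-12)
ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)

print("next configuration to evaluate:", grid[ei.argmax()][0])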

  28.–37. Model-Based Search Example [ten plots, one per iteration (Iter = 1–10): evaluated points y of type init/prop/seq in the top panel; surrogate prediction yhat and expected improvement ei over x ∈ [−1, 1] in the bottom panel; the reported Gap moves from 1.9909e−01 at Iter = 1 to 2.0000e−01 by Iter = 7–10]

  38. Selected Applications
