Applications of Machine Learning in Engineering (and Parameter Tuning)
Lars Kotthoff, University of Wyoming, larsko@uwyo.edu
RMACC, 23 May 2019
slides available at https://www.cs.uwyo.edu/~larsko/slides/rmacc19.pdf
Optimizing Graphene Oxide Reduction
▷ reduce graphene oxide to graphene through laser irradiation
▷ allows creating electrically conductive lines in insulating material
▷ laser parameters need to be tuned carefully to achieve good results
From Graphite/Coal to Carbon Electronics
Overview of the Process
Evaluation of Irradiated Material
ML-Optimized Laser Parameters
[plot: Ratio vs. Iteration over ~50 iterations of the optimization]
ML-Optimized Laser Parameters
[plot: predicted vs. actual Ratio over the first 8 iterations; micrographs at 50 µm scale, after 1st prediction and during training]
▷ predictions work even with small training dataset (19 points)
▷ AI model achieved I_G/I_D ratio (>6) after 1st prediction
Explored Parameter Space
[plot: Ratio achieved by each of the ~48 numbered configurations across the parameter space]
Design of New Materials
▷ optimize parameters of pattern generator for energy absorption of material
Big Picture
▷ advance the state of the art through meta-algorithmic techniques
▷ rather than inventing new things, use existing things more intelligently – automatically
▷ invent new things through combinations of existing things
What to Tune – Parameters
▷ anything you can change that makes sense to change
▷ e.g. heuristic, power of a laser, kernel for a machine learning method
▷ not random seed, whether to enable debugging, etc.
▷ some will affect performance, others will have no effect at all
Automated Parameter Tuning
Frank Hutter and Marius Lindauer, “Algorithm Configuration: A Hands-on Tutorial”, AAAI 2016
General Approach
▷ evaluate algorithm as black-box function
▷ observe effect of parameters without knowing the inner workings
▷ decide where to evaluate next
▷ balance diversification/exploration and intensification/exploitation
▷ repeat until satisfied
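To make the loop concrete, here is a minimal Python sketch; the evaluate function, the parameter space, and the 0.5 exploration probability are made up for illustration, and in practice the black box would be the laser experiment or algorithm run.

```python
import random

# toy stand-in for the black box: in practice this would run the experiment
# or algorithm with the given parameters and return a performance score
def evaluate(config):
    return -(config["power"] - 5) ** 2

# illustrative one-parameter space (names and values are made up)
space = {"power": list(range(1, 11))}

history = []                                    # (configuration, score) pairs observed so far
for i in range(30):                             # repeat until satisfied (here: fixed budget)
    if history and random.random() < 0.5:       # intensification: perturb the best configuration so far
        best = max(history, key=lambda h: h[1])[0]
        config = {"power": min(10, max(1, best["power"] + random.choice([-1, 1])))}
    else:                                       # diversification: try a completely random configuration
        config = {k: random.choice(v) for k, v in space.items()}
    history.append((config, evaluate(config)))  # observe the effect of the parameters, black-box style

best_config, best_score = max(history, key=lambda h: h[1])
print(best_config, best_score)
```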
When are we done?
▷ most approaches incomplete
▷ cannot prove optimality, not guaranteed to find optimal solution (with finite time)
▷ performance highly dependent on configuration space
⇒ How do we know when to stop?
Resource Budget – How much time/how many function evaluations?
▷ too much → wasted resources
▷ too little → suboptimal result
▷ use statistical tests
▷ evaluate on parts of the instance set
▷ for runtime: adaptive capping
Grid and Random Search
▷ evaluate certain points in parameter space
Bergstra, James, and Yoshua Bengio. “Random Search for Hyper-Parameter Optimization.” J. Mach. Learn. Res. 13, no. 1 (February 2012): 281–305.
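As a concrete scikit-learn example (dataset and parameter ranges chosen arbitrarily for illustration): GridSearchCV evaluates every point on a fixed grid, while RandomizedSearchCV samples the same space and, as Bergstra and Bengio argue, often finds good values with fewer evaluations.

```python
from scipy.stats import loguniform
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# grid search: evaluate every point on a fixed grid of parameter values
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": [1e-3, 1e-2, 1e-1]}, cv=3)
grid.fit(X, y)

# random search: sample 9 points from the same space (log-uniform distributions)
rand = RandomizedSearchCV(SVC(), {"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-4, 1e0)},
                          n_iter=9, cv=3, random_state=0)
rand.fit(X, y)

print(grid.best_params_, grid.best_score_)
print(rand.best_params_, rand.best_score_)
```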
Local Search
▷ start with random configuration
▷ change a single parameter (local search step)
▷ if better, keep the change, else revert
▷ repeat, stop when resources exhausted or desired solution quality achieved
▷ restart occasionally with new random configurations
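A minimal sketch of this procedure; the objective and the two-parameter space below are toy stand-ins chosen for illustration, not the actual tuning problem.

```python
import random

# toy stand-in for the black-box objective; higher is better
def evaluate(config):
    return -(config["a"] - 3) ** 2 - (config["b"] - 7) ** 2

values = {"a": list(range(10)), "b": list(range(10))}  # illustrative discrete parameter space

def random_config():
    return {k: random.choice(v) for k, v in values.items()}

best, best_score = None, float("-inf")
for restart in range(5):                        # restart occasionally with new random configurations
    current = random_config()                   # start with a random configuration
    score = evaluate(current)
    for step in range(100):                     # stop when the step budget is exhausted
        neighbour = dict(current)
        p = random.choice(list(values))         # change a single parameter (local search step)
        neighbour[p] = random.choice(values[p])
        new_score = evaluate(neighbour)
        if new_score > score:                   # if better, keep the change, else revert
            current, score = neighbour, new_score
    if score > best_score:
        best, best_score = current, score

print(best, best_score)
```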
Local Search Example (graphics by Holger Hoos)
[figure sequence: Initialisation → Local Search → Perturbation → Local Search → Selection (using Acceptance Criterion)]
Surrogate-Model-Based Search
▷ evaluate small number of initial (random) configurations
▷ build surrogate model of parameter-performance surface based on this
▷ use model to predict where to evaluate next
▷ repeat, stop when resources exhausted or desired solution quality achieved
▷ allows targeted exploration of promising configurations
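A compact sketch of the idea: a toy 1-D objective stands in for the expensive black box, a random forest serves as the surrogate, and the next point is simply the candidate with the best predicted value (real tools such as SMAC or mlrMBO use an acquisition function like expected improvement instead).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# toy 1-D black-box objective to be minimized (stand-in for an expensive experiment)
def f(x):
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)

# evaluate a small number of initial random configurations
X = rng.uniform(-1, 1, size=(5, 1))
y = np.array([f(x[0]) for x in X])

for it in range(20):
    # build a surrogate model of the parameter-performance surface
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    # use the model to decide where to evaluate next:
    # here, the random candidate with the lowest predicted value
    candidates = rng.uniform(-1, 1, size=(200, 1))
    x_next = candidates[np.argmin(model.predict(candidates))]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))

print("best x:", X[np.argmin(y)][0], "best y:", y.min())
```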
Surrogate-Model-Based Search Example
[plots: iterations 1–7 of model-based optimization on a 1-D function; each panel shows the surrogate prediction (yhat) and expected improvement (ei); the gap to the optimum shrinks from 1.5281e−01 to 2.1938e−06]
Bischl, Bernd, Jakob Richter, Jakob Bossek, Daniel Horn, Janek Thomas, and Michel Lang. “mlrMBO: A Modular Framework for Model-Based Optimization of Expensive Black-Box Functions,” March 9, 2017. http://arxiv.org/abs/1703.03373.
Tools and Resources
SMAC http://www.cs.ubc.ca/labs/beta/Projects/SMAC/
Spearmint https://github.com/HIPS/Spearmint
TPE https://jaberg.github.io/hyperopt/
iRace http://iridia.ulb.ac.be/irace/
mlrMBO https://github.com/mlr-org/mlrMBO
TPOT https://github.com/EpistasisLab/tpot
COSEAL group for COnfiguration and SElection of ALgorithms: https://www.coseal.net/
Further reading: Jones, Donald R., Matthias Schonlau, and William J. Welch. “Efficient Global Optimization of Expensive Black-Box Functions.” J. of Global Optimization 13, no. 4 (December 1998): 455–92.
https://www.automl.org/book/
I’m hiring! Several positions available.
Exercises
▷ (install python and/or scikit-learn)
▷ download https://www.cs.uwyo.edu/~larsko/mbo/mbo.py and run it
▷ try a different data set
▷ try tuning different/more parameters
▷ …