Accelerating Model Development by Reducing Operational Barriers - PowerPoint PPT Presentation


  1. Accelerating Model Development by Reducing Operational Barriers Patrick Hayes, Cofounder & CTO, SigOpt Talk ID: S9556

  2. Accelerate and amplify the impact of modelers everywhere

  3. (Image-only slide.)

  4. SigOpt automates experimentation and optimization. (Diagram: the model development stack. Data: preparation, labeling, pre-processing, pipeline development, feature engineering, feature stores. Model: experimentation, training, evaluation; notebook and model framework; deploying; managing. Deployment: transformation, validation, serving, monitoring, inference, online testing, management. SigOpt's layer, Experimentation and Model Optimization: insights, tracking, model search, resource scheduler, collaboration, hyperparameter tuning. Hardware environment: on-premise, hybrid, or multi-cloud.)

  5. Model Tuning spans deep learning architecture search, training and tuning, and hyperparameter search. Hyperparameter optimization methods: grid search, random search, evolutionary algorithms, Bayesian optimization.
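The methods on slide 5 trade off coverage and cost differently. As a minimal illustration (not SigOpt's engine, which uses Bayesian optimization), here is a pure-Python sketch of random search over a two-parameter space; the parameter names and the objective function are hypothetical stand-ins for "train the model, return validation loss":

```python
import random

def objective(learning_rate, dropout):
    # Hypothetical stand-in for a real training-and-validation run.
    return (learning_rate - 0.3) ** 2 + (dropout - 0.7) ** 2

def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        # Each trial draws a fresh value for every parameter.
        params = {"learning_rate": rng.uniform(0.0, 1.0),
                  "dropout": rng.uniform(0.0, 1.0)}
        loss = objective(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best_params, best_loss = random_search(n_trials=100)
```

Unlike grid search, each of the 100 trials probes a distinct value of every parameter, which matters when only a few parameters drive the metric; Bayesian optimization goes further by using past observations to choose the next trial.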

  6. How it works: Seamlessly tune any model; SigOpt never accesses your data or models. (Diagram: an ML, DL, or simulation model and training or testing data feed model evaluation or a backtest.)

  7. How it Works: Seamless implementation for any stack: (1) Install SigOpt, (2) Create experiment, (3) Parameterize model, (4) Run optimization loop, (5) Analyze experiments. Companion notebook: https://bit.ly/sigopt-notebook (slides 8–12 step through this same build).
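The five steps above can be sketched with the SigOpt Python client (`pip install sigopt`). The parameter names, bounds, experiment name, and the `train_and_evaluate` placeholder below are illustrative assumptions, not from the talk; the client token comes from your SigOpt account:

```python
# Step 3: parameterize the model (illustrative parameter choices).
PARAMETERS = [
    dict(name="learning_rate", type="double", bounds=dict(min=1e-5, max=1e-1)),
    dict(name="batch_size", type="int", bounds=dict(min=16, max=256)),
]

def train_and_evaluate(learning_rate, batch_size):
    # Placeholder: train your model with these hyperparameters and
    # return the metric to maximize (e.g. validation accuracy).
    raise NotImplementedError

def run_experiment(client_token, budget=60):
    from sigopt import Connection  # Step 1: pip install sigopt

    conn = Connection(client_token=client_token)
    experiment = conn.experiments().create(  # Step 2: create experiment
        name="Example tuning experiment",
        parameters=PARAMETERS,
        observation_budget=budget,
    )
    for _ in range(budget):  # Step 4: run the optimization loop
        suggestion = conn.experiments(experiment.id).suggestions().create()
        value = train_and_evaluate(
            suggestion.assignments["learning_rate"],
            suggestion.assignments["batch_size"],
        )
        conn.experiments(experiment.id).observations().create(
            suggestion=suggestion.id, value=value,
        )
    # Step 5: analyze experiments on the SigOpt web dashboard, or fetch
    # the best observed assignments programmatically.
    return conn.experiments(experiment.id).best_assignments().fetch()
```

SigOpt only ever sees parameter assignments and metric values through this loop, consistent with the "never accesses your data or models" claim on slide 6.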

  13. Benefits: Better, Cheaper, Faster Model Development. 90% cost savings (maximize utilization of compute); 10x faster time to tune (less expert time per model); better performance (no free lunch, but optimize any model). References: https://aws.amazon.com/blogs/machine-learning/fast-cnn-tuning-with-aws-gpu-instances-and-sigopt/, https://devblogs.nvidia.com/sigopt-deep-learning-hyperparameter-optimization/, https://arxiv.org/pdf/1603.09441.pdf

  14. Overview of Features Behind SigOpt.
Experiment Insights (key: only HPO solution with this capability): advanced experiment visualizations; parameter importance analysis; intuitive web dashboards; cross-team permissions and collaboration; organizational experiment analysis; reproducibility.
Optimization Engine: continuous, categorical, or integer parameters; up to 10k observations and 100 parameters; conditional parameters; constraints and failure regions; multitask optimization and high parallelism; multimetric optimization.
Enterprise Platform: REST API; libraries for Python, Java, R, and MATLAB; black-box interface; infrastructure agnostic; model agnostic; doesn't touch data.

  15. Applied deep learning introduces unique challenges

  16. Challenges (more at sigopt.com/blog): failed observations, constraints, uncertainty, competing objectives, lengthy training cycles, cluster orchestration.

  17. How do you more efficiently tune models that take days (or weeks) to train?

  18. AlexNet to AlphaGo Zero: 300,000x Increase in Compute. (Chart: training compute in petaflop/s-days, log scale from below 0.00001 to above 10,000, vs. year, 2012–2019; milestones include AlexNet, Dropout, DQN, Visualizing and Understanding Conv Nets, GoogLeNet, Seq2Seq, VGG, DeepSpeech2, ResNets, Xception, TI7 Dota 1v1, Neural Machine Translation, Neural Architecture Search, AlphaGo Zero, and AlphaZero.)

  19. Speech Recognition, Deep Reinforcement Learning, Computer Vision.

  20. Training ResNet-50 on ImageNet takes 10 hours. Tuning 12 parameters requires at least 120 distinct models. That equals 1,200 hours, or 50 days, of training time.

  21. Running optimization tasks in parallel is critical to tuning expensive deep learning models
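The principle on slide 21 can be sketched with standard-library concurrency: evaluate several suggested configurations at once, so wall-clock time scales with the number of workers rather than the number of trials. Everything here is illustrative (the configurations, the toy metric, and the worker count are assumptions); in SigOpt's API the same effect comes from each worker opening its own suggestion concurrently:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate(config):
    # Placeholder for a lengthy training run; returns (config, metric).
    metric = -(config["lr"] - 0.01) ** 2
    return config, metric

configs = [{"lr": lr} for lr in (0.001, 0.005, 0.01, 0.05, 0.1)]

# With 5 workers, five training runs take roughly the wall-clock time of one.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(evaluate, configs))

best_config, best_metric = max(results, key=lambda r: r[1])
```

Threads suffice when each evaluation is really an external GPU or cluster job that the worker only launches and waits on; CPU-bound evaluations would need processes instead.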

  22. Complexity of Deep Learning DevOps. Basic case: training one model, no optimization. Advanced case: multiple users, concurrent optimization experiments, concurrent model configuration evaluations, multiple GPUs per model.

  23. Cluster Orchestration: (1) spin up and share training clusters, (2) schedule optimization experiments, (3) integrate with the optimization API, (4) monitor experiment and infrastructure.

  24. Problems: Infrastructure, scheduling, dependencies, code, monitoring Solution: SigOpt Orchestrate is a CLI for managing training infrastructure and running optimization experiments

  25. How it Works

  26. Seamless Integration into Your Model Code

  27. Easily Define Optimization Experiments

  28. Easily Kick Off Optimization Experiment Jobs

  29. Check the Status of Active and Completed Experiments

  30. View Experiment Logs Across Multiple Workers

  31. Track Metadata and Monitor Your Results

  32. Automated Cluster Management

  33. Training ResNet-50 on ImageNet takes 10 hours. Tuning 12 parameters requires at least 120 distinct models. That equals 1,200 hours, or 50 days, of training time. While training on 20 machines, wall-clock time drops from 50 days to 2.5 days.
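The arithmetic on slides 20 and 33 is easy to reproduce:

```python
hours_per_model = 10       # one ResNet-50 ImageNet training run
models = 120               # at least 120 distinct models for 12 parameters

total_hours = hours_per_model * models   # 1,200 hours of training
sequential_days = total_hours / 24       # 50 days on one machine
parallel_days = total_hours / 20 / 24    # 2.5 days spread over 20 machines
```

The 20x machine count translates directly into a 20x wall-clock reduction only because the tuning loop can keep all 20 machines busy with concurrent evaluations, which is the point of the preceding slides on parallelism.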

  34. Challenges (sigopt.com/blog): Failed Observations, Constraints, Uncertainty, Competing Objectives, Lengthy Training Cycles, Cluster Orchestration.

  35. Try SigOpt Orchestrate: https://sigopt.com/orchestrate. Free access for academics and nonprofits: https://sigopt.com/edu. Solution-oriented program for the enterprise: https://sigopt.com/pricing. Leading applied optimization research: https://sigopt.com/research. ... and we're hiring! https://sigopt.com/careers. Thank you! Email patrick@sigopt.com with additional questions.
