Deep Learning Made Easy with GraphLab Create
Piotr Teterwak, Dato Team

Who I am
Piotr Teterwak, Software Engineer
GraphLab Create
• A platform for building predictive applications, fast
• Data engineering on Big Data
• Interactive visualization
• Fast machine learning toolkits
• Easy deployment
• Python frontend, C++ backend
The Dato Team

Making Deep Learning Easy

Deep Learning

Deep Learning Made Easy
• Intuitive API
• Transfer learning
• Integration with other tools in GraphLab Create

What is Deep Learning?
Machine Learning
• Algorithms that learn from data without being explicitly programmed.
• One example is image classification, i.e. assigning an image to one of a fixed number of categories.
Machine Learning
[Image of a cat → label "cat"]

Deep Learning
[Image of a cat → f1(x) → f2(x) → f3(x) → label "cat"]

Deep Neural Networks
[Network diagram with outputs P(cat|x) and P(dog|x)]
Image: http://deeplearning.stanford.edu/wiki/images/4/40/Network3322.png
Deep Neural Networks
• Can model any function given enough hidden units.
• This is tremendously powerful: with enough units, a neural network can in principle be trained on arbitrarily difficult problems.
• But they are also very difficult to train: many parameters mean a lot of memory and computation time.
Neural Nets and GPUs
• Many operations in neural net training can happen in parallel.
• Training reduces to matrix operations, many of which parallelize easily on a GPU.
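To make this concrete, here is a minimal plain-NumPy sketch (illustrative only, not GraphLab Create code; the layer sizes are arbitrary placeholders) of a single fully connected layer's forward pass. The whole step is one matrix multiply plus a bias, which is exactly the kind of work a GPU parallelizes well.

```python
import numpy as np

# Placeholder sizes: a batch of 64 inputs, 1024 input features, 256 hidden units.
batch_size, n_inputs, n_hidden = 64, 1024, 256
X = np.random.randn(batch_size, n_inputs)   # a batch of input vectors
W = np.random.randn(n_inputs, n_hidden)     # the layer's weight matrix
b = np.zeros(n_hidden)                      # the layer's biases

# One layer's forward pass: ReLU(XW + b). On a GPU the matrix multiply
# runs across thousands of cores at once.
hidden = np.maximum(0, X.dot(W) + b)
```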
Convolutional Neural Nets
• Strategic removal of edges: each hidden-layer unit keeps connections only to a local region of the input layer.
Convolutional Neural Nets
[Convolution schematic: a small filter sliding over the input]
Image: http://ufldl.stanford.edu/wiki/images/6/6c/Convolution_schematic.gif
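A minimal NumPy sketch of the operation in the schematic above, assuming a single channel, no padding, and stride 1 (the sizes are arbitrary placeholders): slide a small filter over the image and take a dot product at each location.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid convolution of a 2-D image with a 2-D kernel (no padding, stride 1)."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Dot product of the filter with one local patch of the image.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

feature_map = conv2d(np.random.randn(28, 28), np.random.randn(5, 5))  # shape (24, 24)
```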
Pooling layer
Figure: Ranzato, LSVR tutorial @ CVPR 2014, www.cs.toronto.edu/~ranzato

Pooling layer
[Pooling schematic: downsampling each feature-map region]
Image: http://ufldl.stanford.edu/wiki/images/6/6c/Pooling_schematic.gif
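A minimal NumPy sketch of the max-pooling step shown above, assuming non-overlapping 2x2 regions on a single feature map:

```python
import numpy as np

def max_pool(feature_map, size=2):
    """Downsample by taking the maximum over non-overlapping size x size regions."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size                  # drop any ragged border
    blocks = feature_map[:h, :w].reshape(h // size, size, w // size, size)
    return blocks.max(axis=(1, 3))

pooled = max_pool(np.random.randn(24, 24))             # shape (12, 12)
```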
Overall architecture
Figure: A. Krizhevsky, I. Sutskever and G. E. Hinton. "ImageNet Classification with Deep Convolutional Neural Networks". NIPS (2012)
Hierarchical Representation
Hands - Face - Ground
Figure: Y. Bengio (2009)
Deep learning features
Input → Learned hierarchy → Output
Figure: Lee et al., 'Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations', ICML 2009
Where can we use Deep Learning?

Image tagging

A quick demo!
Deep learning workflow
Labelled data → 80% train set / 20% test set → create model → train → validate? → probably not good enough → adjust hyper-parameters → repeat
• Notice the cycle: you can only break out of it with intuition, time, and a lot of frustration.
• But when you do, magic happens!
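A minimal sketch of that loop with GraphLab Create (API names as in the 1.x documentation; the image path, the label-from-folder rule, and the hyper-parameter values are placeholder assumptions, not values from the talk):

```python
import graphlab as gl

# Placeholder path; assumes each image sits in a folder named after its label.
data = gl.image_analysis.load_images('path/to/labelled_images')
data['label'] = data['path'].apply(lambda p: p.split('/')[-2])

# 80% train set, 20% test set.
train, test = data.random_split(0.8)

# Train a deep network from scratch; max_iterations is a placeholder you
# would adjust on each trip around the cycle.
model = gl.neuralnet_classifier.create(train, target='label', max_iterations=10)

# Validate. Probably not good enough yet: adjust hyper-parameters and repeat.
print(model.evaluate(test))
```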
Simplifying Deep Learning with Deep Features and Transfer Learning
Transfer learning
• Train a model on one task, use it for another task.
• Examples:
  • Learn to walk, then use that knowledge to learn to run.
  • Train an image tagger to recognize cars, then use that knowledge to recognize trucks.
Deep learning features
Input → Learned hierarchy → Output
Figure: Lee et al., 'Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations', ICML 2009

Feature extraction
Input → Learned hierarchy: the mid-level features are probably useful for other tasks which require detection of facial anatomy.
Figure: Lee et al., 'Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations', ICML 2009

Feature extraction
Extract the activations from some deep layer of the neural network.
Image: http://deeplearning.stanford.edu/wiki/images/4/40/Network3322.png
Transfer learning using deep features
Labelled data → extract features using a neural net trained on a different task → 80% train set / 20% test set → create simpler model → validate? → probably works → deploy → $$$
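A minimal sketch of this pipeline with GraphLab Create (1.x API names; the paths and the label-from-folder rule are placeholder assumptions, and the pretrained network stands in for the ImageNet model described on the next slide):

```python
import graphlab as gl

# Placeholder path; assumes each image sits in a folder named after its label.
data = gl.image_analysis.load_images('path/to/my_images')
data['label'] = data['path'].apply(lambda p: p.split('/')[-2])

# Load a network trained on a *different* task and use it only as a feature extractor.
pretrained = gl.load_model('path/to/imagenet_model')          # placeholder location
data['deep_features'] = pretrained.extract_features(data)

# Train a much simpler model on the extracted features.
train, test = data.random_split(0.8)
simple_model = gl.logistic_classifier.create(train, target='label',
                                             features=['deep_features'])
print(simple_model.evaluate(test))                            # probably works
```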
Using an ImageNet-trained network as an extractor for general features
• Uses the classic AlexNet architecture pioneered by Alex Krizhevsky et al. in "ImageNet Classification with Deep Convolutional Neural Networks".
• It turns out that a neural network trained on ~1 million images from about 1,000 classes makes a surprisingly general feature extractor.
• First illustrated by Donahue et al. in "DeCAF: A Deep Convolutional Activation Feature for Generic Visual Recognition".
Demo

Caltech-101

Caltech-101
Extract features here
Deep Features and Logistic Regression

What else can we do with Deep Features?

Finding similar images
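One way to do this, sketched with GraphLab Create's nearest-neighbors toolkit (1.x API names; assumes a 'deep_features' column computed as in the transfer-learning sketch above):

```python
import graphlab as gl

# Each image becomes a point in deep-feature space; similar images are nearby points.
nn_model = gl.nearest_neighbors.create(data, features=['deep_features'])

# The five images most similar to the first image in the dataset.
similar = nn_model.query(data[0:1], k=5)
```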
Clustering images
[Figure: a set of images grouped into clusters]
Figure: Goldberger et al.
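Similarly, a hedged sketch of clustering images by running k-means on their deep features (1.x API names; num_clusters and the 'cluster_id' field name are assumptions based on the documentation):

```python
import graphlab as gl

# Group the images into 10 clusters (placeholder count) in deep-feature space.
kmeans_model = gl.kmeans.create(data, features=['deep_features'], num_clusters=10)

# Per-row cluster assignments (field name as documented for the 1.x KmeansModel).
assignments = kmeans_model['cluster_id']
```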
How general are these Deep Features?

Deep Features are Generalizable
Thank you!
• To learn more: http://www.dato.com/learn
• Play with the demos:
  • Pathways: https://pathways-demo.herokuapp.com/
  • Phototag: http://phototag.herokuapp.com/
• Contact: piotr@dato.com
• We are hiring! jobs@dato.com
• Thank you to Nvidia for designing and providing the hardware that made this possible.
• Please complete the Presenter Evaluation. Your feedback is important!