Lecture 10: Recurrent Neural Networks
Fei-Fei Li & Andrej Karpathy & Justin Johnson
8 Feb 2016
Administrative:
- Midterm this Wednesday! woohoo!
- A3 will be out ~Wednesday
http://mtyka.github.io/deepdream/2016/02/05/bilateral-class-vis.html
Recurrent Networks offer a lot of flexibility (in contrast to vanilla Neural Networks, which map one fixed-size input to one fixed-size output):
e.g. Image Captioning: image -> sequence of words
e.g. Sentiment Classification: sequence of words -> sentiment
e.g. Machine Translation: seq of words -> seq of words
e.g. Video classification on frame level
Sequential Processing of fixed inputs: Multiple Object Recognition with Visual Attention, Ba et al.
Sequential Processing of fixed outputs: DRAW: A Recurrent Neural Network For Image Generation, Gregor et al.
Recurrent Neural Network [diagram: an RNN block with input x and output y]
We usually want to predict a vector y at some time steps.
Recurrent Neural Network: we can process a sequence of vectors x by applying a recurrence formula at every time step: h_t = f_W(h_{t-1}, x_t), where h_t is the new state, h_{t-1} the old state, x_t the input vector at some time step, and f_W some function with parameters W.
Notice: the same function and the same set of parameters are used at every time step.
(Vanilla) Recurrent Neural Network: the state consists of a single "hidden" vector h.
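For concreteness, here is a minimal NumPy sketch of a single vanilla RNN time step. It assumes the usual parameterization h_t = tanh(W_hh h_{t-1} + W_xh x_t) with output y_t = W_hy h_t; the matrix names and sizes are illustrative choices, not code taken from the slides.

```python
import numpy as np

class VanillaRNNStep:
    """Minimal sketch of a vanilla RNN cell (illustrative, not the lecture's exact code)."""
    def __init__(self, input_size, hidden_size, output_size):
        # small random initialization of the three weight matrices
        self.Wxh = np.random.randn(hidden_size, input_size) * 0.01   # input -> hidden
        self.Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
        self.Why = np.random.randn(output_size, hidden_size) * 0.01  # hidden -> output

    def step(self, x, h_prev):
        # new state: h_t = tanh(Whh @ h_{t-1} + Wxh @ x_t)
        h = np.tanh(self.Whh @ h_prev + self.Wxh @ x)
        # output: y_t = Why @ h_t
        y = self.Why @ h
        return h, y
```

The same step function, with the same weights, is applied at every time step, which is exactly the weight sharing pointed out above.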
Character-level language model example. Vocabulary: [h, e, l, o]. Example training sequence: "hello".
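As a worked sketch of this example (illustrative code reusing the VanillaRNNStep class above, not copied from the slides): each character of the vocabulary is one-hot encoded, the characters of "hello" are fed in one at a time, and the output at each step is read as unnormalized scores for the next character.

```python
import numpy as np

vocab = ['h', 'e', 'l', 'o']                       # 4-character vocabulary from the slide
char_to_ix = {c: i for i, c in enumerate(vocab)}

def one_hot(c):
    """Encode a character as a one-hot column vector over the vocabulary."""
    v = np.zeros((len(vocab), 1))
    v[char_to_ix[c]] = 1.0
    return v

seq = "hello"
inputs  = [one_hot(c) for c in seq[:-1]]           # h, e, l, l
targets = [char_to_ix[c] for c in seq[1:]]         # e, l, l, o (the correct next characters)

rnn = VanillaRNNStep(input_size=4, hidden_size=100, output_size=4)
h = np.zeros((100, 1))                             # initial hidden state
for x, t in zip(inputs, targets):
    h, y = rnn.step(x, h)
    p = np.exp(y) / np.sum(np.exp(y))              # softmax over next-character scores
    # training would adjust the weights to push p[t], the probability of the correct next char, up
```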
min-char-rnn.py gist: 112 lines of Python (https://gist.github.com/karpathy/d4dee566867f8291f086)
min-char-rnn.py gist: Data I/O
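The data I/O portion of the gist does roughly the following (a paraphrased sketch; see the gist URL above for the exact code, and substitute whatever plain-text file you want to train on for input.txt):

```python
# read a plain-text training file and build character <-> index mappings
data = open('input.txt', 'r').read()     # the training corpus, as one long string
chars = list(set(data))                  # unique characters form the vocabulary
data_size, vocab_size = len(data), len(chars)
print('data has %d characters, %d unique.' % (data_size, vocab_size))
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}
```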
min-char-rnn.py gist: Initializations (recall the recurrence h_t = f_W(h_{t-1}, x_t))
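The initialization section defines the hyperparameters and the model parameters, roughly as follows (a sketch in the spirit of the gist; the specific sizes and learning rate are illustrative defaults):

```python
import numpy as np

# hyperparameters
hidden_size = 100     # size of the hidden state vector
seq_length = 25       # number of time steps to unroll the RNN for during training
learning_rate = 1e-1

# model parameters: small random weights, zero biases
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01   # input  -> hidden
Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden -> output (char scores)
bh = np.zeros((hidden_size, 1))                         # hidden bias
by = np.zeros((vocab_size, 1))                          # output bias
```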
min-char-rnn.py gist: Main loop
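In outline, the main loop repeatedly grabs the next seq_length characters of data, computes the loss and parameter gradients for that chunk with lossFun (sketched after the loss-function slides below), and applies an Adagrad update; every so often the gist also samples text from the model to monitor progress. A condensed, paraphrased sketch:

```python
n, p = 0, 0                                       # iteration counter, data pointer
mWxh, mWhh, mWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
mbh, mby = np.zeros_like(bh), np.zeros_like(by)   # Adagrad memory (sum of squared grads)
smooth_loss = -np.log(1.0 / vocab_size) * seq_length  # loss at iteration 0

while True:
    # reset the hidden state and data pointer when we run off the end of the data
    if p + seq_length + 1 >= len(data) or n == 0:
        hprev = np.zeros((hidden_size, 1))
        p = 0
    inputs  = [char_to_ix[ch] for ch in data[p:p + seq_length]]
    targets = [char_to_ix[ch] for ch in data[p + 1:p + seq_length + 1]]

    # (the gist also samples a couple hundred characters from the model every few
    #  hundred iterations here, printing them to inspect what it has learned so far)

    # forward/backward the chunk through the network and fetch the gradients
    loss, dWxh, dWhh, dWhy, dbh, dby, hprev = lossFun(inputs, targets, hprev)
    smooth_loss = smooth_loss * 0.999 + loss * 0.001

    # Adagrad parameter update
    for param, dparam, mem in zip([Wxh, Whh, Why, bh, by],
                                  [dWxh, dWhh, dWhy, dbh, dby],
                                  [mWxh, mWhh, mWhy, mbh, mby]):
        mem += dparam * dparam
        param += -learning_rate * dparam / np.sqrt(mem + 1e-8)

    p += seq_length                               # move the data pointer
    n += 1
```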
min-char-rnn.py gist: Loss function
- forward pass (compute loss)
- backward pass (compute parameter gradients)
min-char-rnn.py gist: Softmax classifier
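At every time step the loss function places a softmax classifier over the vocabulary: the output scores are turned into a probability distribution over the next character, and the loss is the negative log-probability of the correct one. A self-contained sketch of just that piece (an illustrative helper, not a function from the gist):

```python
import numpy as np

def softmax_xent(scores, target_ix):
    """Softmax + cross-entropy at a single time step.

    scores:    (vocab_size, 1) unnormalized log-probabilities for the next character
    target_ix: index of the correct next character
    Returns (loss, dscores), where dscores is the gradient of the loss w.r.t. scores.
    """
    p = np.exp(scores - np.max(scores))   # subtract max for numerical stability
    p /= np.sum(p)                        # probabilities over the vocabulary
    loss = -np.log(p[target_ix, 0])       # negative log-likelihood of the correct char
    dscores = np.copy(p)
    dscores[target_ix] -= 1               # softmax gradient: p_k - 1[k == target]
    return loss, dscores
```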
min-char-rnn.py gist (recall the recurrence h_t = f_W(h_{t-1}, x_t))
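Putting the pieces together, the gist's lossFun unrolls the recurrence over the chunk in the forward pass, accumulating the softmax cross-entropy loss, and then backpropagates through time in the backward pass, clipping the gradients at the end. A condensed, paraphrased sketch (variable names follow the gist; treat it as a sketch rather than the exact code):

```python
def lossFun(inputs, targets, hprev):
    """inputs, targets: lists of character indices; hprev: (hidden_size, 1) initial state.
    Returns the loss, gradients on all parameters, and the final hidden state."""
    xs, hs, ys, ps = {}, {}, {}, {}
    hs[-1] = np.copy(hprev)
    loss = 0
    # forward pass: unroll the recurrence over the whole chunk
    for t in range(len(inputs)):
        xs[t] = np.zeros((vocab_size, 1))
        xs[t][inputs[t]] = 1                                   # one-hot input character
        hs[t] = np.tanh(np.dot(Wxh, xs[t]) + np.dot(Whh, hs[t - 1]) + bh)
        ys[t] = np.dot(Why, hs[t]) + by                        # next-character scores
        ps[t] = np.exp(ys[t]) / np.sum(np.exp(ys[t]))          # softmax probabilities
        loss += -np.log(ps[t][targets[t], 0])                  # cross-entropy loss
    # backward pass: backpropagation through time
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby = np.zeros_like(bh), np.zeros_like(by)
    dhnext = np.zeros_like(hs[0])
    for t in reversed(range(len(inputs))):
        dy = np.copy(ps[t])
        dy[targets[t]] -= 1                                    # gradient through softmax
        dWhy += np.dot(dy, hs[t].T)
        dby += dy
        dh = np.dot(Why.T, dy) + dhnext                        # backprop into hidden state
        dhraw = (1 - hs[t] * hs[t]) * dh                       # through the tanh nonlinearity
        dbh += dhraw
        dWxh += np.dot(dhraw, xs[t].T)
        dWhh += np.dot(dhraw, hs[t - 1].T)
        dhnext = np.dot(Whh.T, dhraw)
    for dparam in [dWxh, dWhh, dWhy, dbh, dby]:
        np.clip(dparam, -5, 5, out=dparam)                     # clip to mitigate exploding gradients
    return loss, dWxh, dWhh, dWhy, dbh, dby, hs[len(inputs) - 1]
```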
Samples at different stages of training: at first, then after training more, more, and more, the generated text steadily improves.
Open source textbook on algebraic geometry (LaTeX source)
Generated C code
Searching for interpretable cells [Visualizing and Understanding Recurrent Networks, Andrej Karpathy*, Justin Johnson*, Li Fei-Fei]
Searching for interpretable cells: quote detection cell
Searching for interpretable cells: line length tracking cell
Searching for interpretable cells: if statement cell