
Lecture 9: Recurrent Neural Networks. "I'm glad that I'm Turing Complete now." Xinyu Zhou, Megvii (Face++) Researcher, zxy@megvii.com, Nov 2017. Raise your hand and ask whenever you have questions. We have a lot to cover and DON'T


  1–5. Attention and Augmented Recurrent Neural Networks: figures from https://distill.pub/2016/augmented-rnns

  6. Image Attention: Image Captioning ● Xu, Kelvin, et al. "Show, attend and tell: Neural image caption generation with visual attention." International Conference on Machine Learning. 2015.

  7. Image Attention: Image Captioning

  8. Image Attention: Image Captioning
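
The attention used in these captioning slides can be summarised in a few lines: at each decoding step the RNN scores every CNN feature location, turns the scores into a softmax distribution, and reads out a weighted sum as the context vector. The sketch below is illustrative, not the lecture's code; the projections W_f, W_h and v are assumed learned parameters:

    import numpy as np

    def soft_attention(features, hidden, W_f, W_h, v):
        # features: (L, D) CNN feature vectors, one per image location
        # hidden:   (H,) decoder RNN state
        # W_f: (K, D), W_h: (K, H), v: (K,) learned parameters (illustrative names)
        scores = np.tanh(features @ W_f.T + hidden @ W_h.T) @ v  # (L,) alignment scores
        alpha = np.exp(scores - scores.max())
        alpha = alpha / alpha.sum()                               # attention weights, sum to 1
        context = alpha @ features                                # (D,) expected feature vector
        return context, alpha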

  9. Text Recognition ● Implicit language model

  10. Text Recognition ● Implicit language model

  11. Soft Attention RNN for OCR ● (architecture figure: CNN column features, FC, and a soft-attention decoder predicting the text 金口香牛肉面, trained with two losses, Loss 1 and Loss 2)

  12. RNN with External Memory

  13. Copy a sequence ● Input ● Output

  14. Copy a sequence ● Input ● Solution in Python ● Output

  15. Copy a sequence ● Input ● Solution in Python ● Output ● Can a neural network learn this program purely from data?
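
The "solution in Python" on the slide is of course trivial; the question is whether a network can induce the same program from input/output pairs alone. A minimal sketch of the task (the data generator below is a plausible setup, not the lecture's exact code):

    import numpy as np

    def copy_sequence(seq):
        # The program to be learned: output equals input.
        return list(seq)

    def make_copy_example(seq_len=8, width=8):
        # Hypothetical training data for the copy task: random binary
        # vectors in, the same vectors expected out.
        x = np.random.randint(0, 2, size=(seq_len, width))
        return x, x.copy()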

  16. Traditional Machine Learning ● √ Elementary Operations ● √* Logic flow control ○ Decision tree ● × External Memory ○ As opposed to internal memory (hidden states) Graves, Alex, Greg Wayne, and Ivo Danihelka. "Neural turing machines." arXiv preprint arXiv:1410.5401 (2014).

  17. Traditional Machine Learning ● √ Elementary Operations ● √* Logic flow control ● × External Memory Graves, Alex, Greg Wayne, and Ivo Danihelka. "Neural turing machines." arXiv preprint arXiv:1410.5401 (2014).

  18. Neural Turing Machines (NTM) ● NTM is a neural network with a working memory ● It reads and writes multiple times at each step ● Fully differentiable and can be trained end-to-end An NTM “Cell” Graves, Alex, Greg Wayne, and Ivo Danihelka. "Neural turing machines." arXiv preprint arXiv:1410.5401 (2014).

  19. Neural Turing Machines (NTM) ● Memory ○ An n × m matrix: n locations, each an m-dimensional vector http://llcao.net/cu-deeplearning15/presentation/NeuralTuringMachines.pdf

  20. Neural Turing Machines (NTM) ● Read ● Hard indexing ⇒ Soft Indexing ○ A distribution over indices ○ “Attention”

  21. Neural Turing Machines (NTM) ● Read Memory Locations ● Hard indexing ⇒ Soft Indexing ○ A distribution over indices ○ “Attention”

  22. Neural Turing Machines (NTM) ● Read Memory Locations ● Hard indexing ⇒ Soft Indexing ○ A distribution over indices ○ “Attention”
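
With a soft index, reading is just an expectation over memory rows. A minimal numpy sketch, assuming an N × M memory matrix and weights w that sum to 1:

    import numpy as np

    def ntm_read(memory, w):
        # memory: (N, M), w: (N,) soft address ("attention" over locations)
        # r_t = sum_i w_t(i) * M_t(i)
        return w @ memory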

  23. Neural Turing Machines (NTM) ● Write ○ Write = erase + add erase add

  24. Neural Turing Machines (NTM) ● Write ○ Write = erase + add erase add
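
The write applies the same soft address twice, once to erase and once to add. A sketch under the same assumptions as the read above (erase vector with entries in [0, 1], add vector unconstrained):

    import numpy as np

    def ntm_write(memory, w, erase, add):
        # M~_t(i) = M_{t-1}(i) * (1 - w_t(i) * e_t)
        memory = memory * (1.0 - np.outer(w, erase))
        # M_t(i) = M~_t(i) + w_t(i) * a_t
        return memory + np.outer(w, add)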

  25. Neural Turing Machines (NTM) ● Addressing

  26. Neural Turing Machines (NTM) ● Addressing ● 1. Focusing by Content ● Cosine Similarity

  27. Neural Turing Machines (NTM) ● Addressing ● 1. Focusing by Content ● Cosine Similarity
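
Content-based focusing compares a key emitted by the controller against every memory row with cosine similarity, then takes a softmax sharpened by a scalar key strength beta. A sketch:

    import numpy as np

    def content_addressing(memory, key, beta):
        # memory: (N, M), key: (M,), beta: key strength (>= 0)
        sim = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
        e = np.exp(beta * sim - np.max(beta * sim))
        return e / e.sum()   # w^c_t: a distribution over the N locations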

  28. Neural Turing Machines (NTM) ● 1. Focusing by Content ● 2. Interpolate with previous step

  29. Neural Turing Machines (NTM) ● 1. Focusing by Content ● 2. Interpolate with previous step
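
The interpolation step is a single gate g in [0, 1] that blends the content-based weights with the weights from the previous time step, letting a head skip the content lookup and keep its old focus:

    def interpolate(w_content, w_prev, g):
        # w^g_t = g_t * w^c_t + (1 - g_t) * w_{t-1}
        return g * w_content + (1.0 - g) * w_prev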

  30. Neural Turing Machines (NTM) ● 1. Focusing by Content ● 2. Interpolate with previous step ● 3. Convolutional Shift

  31. Neural Turing Machines (NTM) ● 1. Focusing by Content ● 2. Interpolate with previous step ● 3. Convolutional Shift

  32. Neural Turing Machines (NTM) ● 1. Focusing by Content ● 2. Interpolate with previous step ● 3. Convolutional Shift
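
The convolutional shift circularly convolves the gated weights with a short shift distribution (e.g. over shifts -1, 0, +1), moving the focus by location rather than by content. A sketch, assuming the shift distribution s has odd length and is centred on shift 0:

    import numpy as np

    def convolutional_shift(w, s):
        # w~_t(i) = sum_j w^g_t(j) * s_t(i - j), indices taken modulo N
        offsets = range(-(len(s) // 2), len(s) // 2 + 1)
        shifted = np.zeros_like(w, dtype=float)
        for k, s_k in zip(offsets, s):
            shifted += s_k * np.roll(w, k)
        return shifted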

  33. Neural Turing Machines (NTM) ● 1. Focusing by Content ● 2. Interpolate with previous step ● 3. Convolutional Shift ● 4. Sharpening
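
Sharpening raises the shifted weights to a power gamma >= 1 and renormalises, undoing the blur introduced by the shift. Chaining the four steps gives the whole addressing mechanism (continuing the sketches above; not the lecture's exact code):

    def sharpen(w, gamma):
        # w_t(i) = w~_t(i)^gamma / sum_j w~_t(j)^gamma
        w = w ** gamma
        return w / w.sum()

    def address(memory, key, beta, g, s, gamma, w_prev):
        w = content_addressing(memory, key, beta)  # 1. focus by content
        w = interpolate(w, w_prev, g)              # 2. interpolate with previous step
        w = convolutional_shift(w, s)              # 3. convolutional shift
        return sharpen(w, gamma)                   # 4. sharpen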
