

  1. Pattern Recognition Part 10: (Artificial) Neural Networks. Gerhard Schmidt, Christian-Albrechts-Universität zu Kiel, Faculty of Engineering, Institute of Electrical and Information Engineering, Digital Signal Processing and System Theory

  2. Neural Networks • Contents ❑ Motivation and literature ❑ Structure of a (basic) neural network ❑ Applications of neural networks ❑ Types of neural networks ❑ Basic training of neural networks ❑ Example applications Slide 2 Digital Signal Processing and System Theory | Pattern Recognition | Neural Networks

  3. Neural Networks • Contents ❑ Motivation and literature (neural networks, deep learning, literature) ❑ Structure of a (basic) neural network ❑ Applications of neural networks ❑ Types of neural networks ❑ Basic training of neural networks ❑ Example applications

  4. Neural Networks • Motivation and Literature. Neural networks: ❑ Neural networks are a very popular machine learning technique. ❑ They simulate the mechanisms of learning in biological systems such as the human brain. ❑ The human brain / the nervous system contains cells called neurons. The neurons are connected via axons and dendrites. During learning, the connections between the neurons are changed. ❑ Within this lecture we will talk about artificial neural networks that mimic the processes in the human brain. The adjective “artificial” will be omitted for reasons of brevity. Source: https://pixabay.com/de/nervenzelle-neuron-gehirn-neuronen-2213009/, downloaded with permission 03.01.2019.

  5. Neural Networks • Motivation and Literature. Deep learning: ❑ The advantage of neural structures is their ability to be adapted to several types of problems by changing their size and internal structure. ❑ A few years ago so-called deep approaches appeared. This was one of the main factors for the success of neural networks. ❑ “Deep” means here, on the one hand, to have several/many hidden layers; on the other hand, it means that specific training procedures are used. ❑ Compared to conventional (shallow) structures, deep approaches are especially suited if a large amount of training data is available. [Figure: accuracy over available data size; deep learning overtakes conventional approaches as the data size grows.]

  6. Neural Networks • Motivation and Literature. Literature: ❑ C. C. Aggarwal: Neural Networks and Deep Learning, Springer, 2018 ❑ A. Géron: Machine Learning mit Scikit-Learn & Tensorflow, O’Reilly, 2018 (in German and English) ❑ I. Goodfellow, Y. Bengio, A. Courville: Deep Learning, Mitp, 2018 (in German and English)

  7. Neural Networks • Contents ❑ Motivation ❑ Structure of a (basic) neural network ❑ Applications of neural networks ❑ Types of neural networks ❑ Basic training of neural networks ❑ Example applications

  8. Neural Networks • Structure of a Neural Network – Basics. Basic structure during runtime and training: [Diagram: at runtime, a database with input features feeds the neural network. During training, a database with input and output features feeds the network, a distance or error computation compares the network output with the desired output, and the training algorithm adjusts the network.]
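The runtime/training split in the diagram can be sketched in a few lines. This is an illustrative toy (a single linear neuron trained by gradient descent on a squared-error distance), not the lecture's actual code; all names and the learning rate are made up for the example.

```python
# Minimal sketch of the diagram's two paths: a runtime forward pass and a
# training step with distance (error) computation plus a weight update.

def forward(w, b, x):
    """Runtime path: network output for the input feature x."""
    return w * x + b

def train_step(w, b, x, target, lr=0.1):
    """Training path: compare output with the desired output, then update."""
    error = forward(w, b, x) - target   # distance / error computation
    w -= lr * error * x                 # gradient of 0.5 * error**2 w.r.t. w
    b -= lr * error                     # gradient of 0.5 * error**2 w.r.t. b
    return w, b

# "Database" with input and output features (samples of y = 2x + 1).
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]
w, b = 0.0, 0.0
for _ in range(500):
    for x, y in data:
        w, b = train_step(w, b, x, y)
```

After training, only `forward` is needed at runtime, which is exactly the asymmetry the block diagram expresses.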

  9. Neural Networks • Structure of a Neural Network – Basics. Network structure: [Diagram: the neural network block from the previous slide, expanded into an input layer, two hidden layers, and an output layer.]

  10. Neural Networks • Structure of a Neural Network – Basics. Input layer: ❑ Sometimes only a “pass through” layer. ❑ Sometimes also a mean compensation and a normalization are performed. Afterwards, all individually normalized inputs are combined to a vector.
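The slide's equations are not reproduced in this transcript; a per-feature z-score normalization is one common reading of "mean compensation and normalization", sketched here under that assumption (function and variable names are made up):

```python
# Hedged sketch of the input layer: subtract the per-feature mean
# (mean compensation) and divide by the per-feature standard deviation
# (normalization), then combine the normalized inputs into vectors.

import math

def normalize_inputs(samples):
    """Return the z-score-normalized version of a list of feature vectors."""
    n = len(samples)
    dim = len(samples[0])
    means = [sum(s[i] for s in samples) / n for i in range(dim)]
    stds = [math.sqrt(sum((s[i] - means[i]) ** 2 for s in samples) / n)
            for i in range(dim)]
    return [[(s[i] - means[i]) / stds[i] for i in range(dim)] for s in samples]

features = [[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]]
normalized = normalize_inputs(features)
```

Each normalized feature then has zero mean and unit variance over the database, which keeps the later weighted sums in a comparable numeric range.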

  11. Neural Networks • Structure of a Neural Network – Basics. Hidden layer: ❑ Linear weighting of the inputs with a bias. ❑ Nonlinear activation function. ❑ Combination of all results to a vector.
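The three bullets describe the standard forward pass of a fully connected layer. A minimal sketch, assuming the common form z_j = Σ_i w_ji x_i + b_j followed by y_j = f(z_j) with a sigmoid activation (the slide's own symbols are not in the transcript):

```python
# One hidden layer: linear weighting of the inputs plus bias, a nonlinear
# activation per neuron, and combination of the results into a vector.

import math

def hidden_layer(weights, biases, x):
    """weights[j] holds neuron j's input weights, biases[j] its bias."""
    y = []
    for w_row, b in zip(weights, biases):
        z = sum(w * xi for w, xi in zip(w_row, x)) + b   # linear part
        y.append(1.0 / (1.0 + math.exp(-z)))             # sigmoid activation
    return y

y = hidden_layer([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0], [2.0, 2.0])
```

Stacking several such layers, each consuming the previous layer's output vector, yields the multi-layer structure shown on the preceding slide.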

  12. Neural Networks • Structure of a Neural Network – Basics. Activation functions – part 1: ❑ The sum of the weighted inputs plus the bias will be abbreviated with a single symbol in the following. ❑ Several activation functions exist, such as the identity function, the sign function, or the sigmoid function. [Figure: plots of the identity, sign, and sigmoid functions together with their differentiations.]
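Since the plots were lost in the transcript, here are the three activations with the derivatives those "Differentiation" panels showed (the sign function's derivative is taken as zero everywhere away from its jump at z = 0):

```python
# Activation functions, part 1: identity, sign, and sigmoid, with derivatives.

import math

def identity(z):
    return z

def identity_prime(z):
    return 1.0

def sign(z):
    return (z > 0) - (z < 0)        # -1, 0, or +1

def sign_prime(z):
    return 0.0                      # zero almost everywhere

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)            # the well-known sigmoid derivative
```

The sigmoid's derivative being expressible through its own output is one reason it was historically popular for gradient-based training.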

  13. Neural Networks • Structure of a Neural Network – Basics. Activation functions – part 2: ❑ Further activation functions: the tanh function, the rectified linear function (or unit, ReLU), and the “hard tanh” function. [Figure: plots of the tanh, rectified linear, and “hard tanh” functions together with their differentiations.]
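The second set of activations from the lost plots, again with the derivatives shown in the differentiation panels ("hard tanh" is taken in its usual form, clipping z to [-1, 1]):

```python
# Activation functions, part 2: tanh, rectified linear (ReLU), and hard tanh.

import math

def tanh(z):
    return math.tanh(z)

def tanh_prime(z):
    return 1.0 - math.tanh(z) ** 2

def relu(z):
    return max(0.0, z)

def relu_prime(z):
    return 1.0 if z > 0 else 0.0

def hard_tanh(z):
    return max(-1.0, min(1.0, z))   # clip z to the interval [-1, 1]

def hard_tanh_prime(z):
    return 1.0 if -1.0 < z < 1.0 else 0.0
```

ReLU and hard tanh are piecewise linear, which makes them cheap to evaluate and differentiate; this is part of why ReLU dominates in deep networks.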

  14. Neural Networks • Structure of a Neural Network – Basics. Output layer: ❑ Sometimes only a “pass through” layer. ❑ Sometimes also a limitation (to a minimum and a maximum) and a normalization are performed. The limited and normalized outputs are combined to a vector.
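The slide's equations are missing from the transcript, so this is only one plausible reading of "limitation and normalization": clip each output to a [minimum, maximum] range, then scale the clipped vector to unit sum (all names and the chosen normalization are assumptions):

```python
# Hedged sketch of the output layer: clip every output to [y_min, y_max]
# (limitation), then scale the vector to unit sum (one possible normalization).

def output_layer(values, y_min=0.0, y_max=1.0):
    limited = [max(y_min, min(y_max, v)) for v in values]
    total = sum(limited)
    return [v / total for v in limited] if total > 0 else limited

out = output_layer([0.2, 1.5, -0.3])   # limited to [0.2, 1.0, 0.0] first
```

A unit-sum output is convenient when the outputs are interpreted as class scores; other normalizations (e.g. a softmax) serve the same purpose.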

  15. Neural Networks • Structure of a Neural Network – Basics. Layer sizes: ❑ The input and the output layer size is usually given by the application: the input layer size is equal to the feature vector size, and the output layer size is determined by the amount of output features. Sometimes more outputs than required are computed in order to modify the cost function. ❑ The entire size of the network (sum of all layer sizes) should be adjusted to the size of the available data. ❑ In some applications so-called bottleneck layers are helpful. [Diagrams: a fully connected network and a variant with a bottleneck layer, each with input layer, hidden layers, and output layer.]
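To make "entire size of the network" concrete, one can count the trainable parameters of a fully connected network; each layer contributes one weight per input-output pair plus one bias per output. The layer sizes below are made-up examples, not from the lecture:

```python
# Illustrative parameter count for a fully connected network.

def num_parameters(layer_sizes):
    """Total weights and biases, given layer sizes from input to output."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# A bottleneck network: the 4-unit layer forces a compact representation.
count = num_parameters([16, 32, 4, 32, 16])
```

Matching such a count against the number of available training samples is one rough way to follow the slide's advice of adjusting the network size to the data size.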

  16. Neural Networks • Contents ❑ Motivation ❑ Structure of a (basic) neural network ❑ Applications of neural networks (real-time video object recognition, improving image resolution, automatic image colorization) ❑ Types of neural networks ❑ Basic training of neural networks ❑ Example applications

  17. Neural Networks • Applications of Neural Networks – Sources. Tesla: ❑ https://cleantechnica.com/2018/06/11/tesla-director-of-ai-discusses-programming-a-neural-net-for-autopilot-video/ ❑ https://vimeo.com/272696002?cjevent=c27333cefa3511e883d900650a18050f Pixel Recursive Super Resolution: ❑ R. Dahl, M. Norouzi and J. Shlens: Pixel Recursive Super Resolution, 2017 IEEE International Conference on Computer Vision (ICCV), Venice, pp. 5449-5458, 2017. Image colorization: ❑ http://iizuka.cs.tsukuba.ac.jp/projects/colorization/data/colorization_sig2016.pdf

  18. Neural Networks • Applications of Neural Networks – Real-time Video Object Recognition. Video object recognition for Tesla cars: ❑ Tesla uses cameras, radar and ultrasonic sensors to detect objects in the surrounding area. However, they rely mostly on computer vision via cameras. ❑ Their current system uses (mostly) a so-called convolutional network (details later on) for object recognition. Newer approaches use “CodeGen”, where also the structure (not only the weights) of the network is adapted during training. ❑ The main system for autonomous driving is a deep neural network. The following video is a full self-driving demo by Tesla, using the legend shown on the slide:

  19. Neural Networks • Applications of Neural Networks – Real-time Video Object Recognition. [Video: Tesla full self-driving demo.]
