
Neural Networks - Deep Learning: Artificial Intelligence @ Allegheny

Neural Networks - Deep Learning. Artificial Intelligence @ Allegheny College. Janyl Jumadinova, March 4-6, 2020. Credit: Google Workshop.


  1. Neural Networks - Deep Learning. Artificial Intelligence @ Allegheny College. Janyl Jumadinova, March 4-6, 2020. Credit: Google Workshop.

  2. Neural Networks

  3. Neural Networks

  4. Neural Networks: Neural computing requires a number of neurons to be connected together into a neural network. Neurons are arranged in layers.

  5. Neural Networks: Two main hyperparameters control the architecture (topology) of the network: 1) the number of layers, and 2) the number of nodes in each hidden layer.
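The two hyperparameters above fix the network's parameter count. A minimal sketch (the layer sizes are illustrative, not from the slides):

```python
# Sketch: how the number of layers and the nodes per hidden layer
# determine a fully connected network's parameter count.

def count_parameters(layer_sizes):
    """Total weights + biases for a fully connected network.

    layer_sizes: e.g. [4, 8, 8, 3] means 4 inputs, two hidden
    layers of 8 nodes each, and 3 outputs.
    """
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weight matrix + bias vector
    return total

print(count_parameters([4, 8, 8, 3]))  # 4*8+8 + 8*8+8 + 8*3+3 = 139
```

Adding a layer or widening a hidden layer changes only this list, which is why the two numbers are treated as the main architectural knobs.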

  6. Activation Functions: The activation function is generally non-linear. Linear activation functions are limited because the output is simply proportional to the input.

  7. Activation Functions
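A quick numerical sketch of common non-linear activations (the choice of sigmoid, tanh, and ReLU here is illustrative):

```python
import numpy as np

# Common non-linear activations. A linear "activation" would make a
# stack of layers collapse into a single linear map, which is the
# limitation noted above.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))   # squashes values into (0, 1)
print(np.tanh(x))   # squashes values into (-1, 1)
print(relu(x))      # [0. 0. 2.]
```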

  8. Network structures: Two phases in each training iteration: 1) calculating the predicted output y, known as feed-forward; 2) updating the weights and biases, known as backpropagation.
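The two phases can be sketched for a single sigmoid neuron with squared-error loss (all numbers, the learning rate, and the loss choice are illustrative assumptions, not from the slides):

```python
import numpy as np

# One training iteration: feed-forward, then a gradient update.
rng = np.random.default_rng(0)
x = rng.normal(size=3)        # input vector
t = 1.0                       # target output
w = np.zeros(3)               # weights (initialized to zero here)
b = 0.0                       # bias
lr = 0.1                      # learning rate (illustrative)

# Phase 1: feed-forward -- compute predicted output y
y = 1.0 / (1.0 + np.exp(-(w @ x + b)))

# Phase 2: backpropagation -- gradient of E = 0.5*(y - t)^2
# through the sigmoid, then update weights and bias.
grad = (y - t) * y * (1.0 - y)
w -= lr * grad * x
b -= lr * grad
print(y, w, b)
```

With zero weights the first prediction is exactly 0.5; the update then nudges w and b toward the target.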

  9. Feed-forward example. Feed-forward networks: single-layer perceptrons; multi-layer perceptrons.


  11. Single-layer Perceptrons: Output units all operate separately (no shared weights). Adjusting the weights moves the location, orientation, and steepness of the cliff.
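"No shared weights" means each output unit owns one row of the weight matrix, so the outputs are computed independently. A small sketch with made-up numbers:

```python
import numpy as np

# Each output unit has its own weight row; changing one row affects
# only that unit's output. Sizes and values are illustrative.

x = np.array([1.0, 2.0, 3.0])
W = np.array([[0.1, 0.0, 0.0],    # weights of output unit 0
              [0.0, 0.5, 0.0]])   # weights of output unit 1

outputs = W @ x   # each entry depends only on its own weight row
print(outputs)    # [0.1 1. ]
```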

  12. Multi-layer Perceptrons: Layers are usually fully connected. The number of hidden units is typically chosen by hand.

  13. Neural Networks

  14. Neural Networks

  15. Neural Networks: A fully connected NN layer

  16. Implementation as Matrix Multiplication
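A fully connected layer over a whole batch is a single matrix multiplication plus a bias, followed by the activation. A minimal sketch (shapes and the ReLU choice are illustrative):

```python
import numpy as np

# Dense layer as matrix multiplication: H = g(X @ W + b).
rng = np.random.default_rng(1)
X = rng.normal(size=(32, 4))     # batch of 32 inputs, 4 features each
W = rng.normal(size=(4, 8))      # 4 inputs -> 8 hidden units
b = np.zeros(8)                  # one bias per hidden unit

H = np.maximum(0.0, X @ W + b)   # ReLU(XW + b)
print(H.shape)                   # (32, 8)
```

This is why GPUs help: the whole layer reduces to one large, highly parallel matmul.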

  17. Non-Linear Data Distributions

  18. (image-only slide)

  19. Deep Learning: Most current machine learning works well because of human-designed representations and input features. Machine learning then becomes just optimizing weights to best make a final prediction.

  20. Deep Learning: Deep learning algorithms attempt to learn multiple levels of representation, of increasing complexity/abstraction.

  21. Deep Learning: Each neuron implements a relatively simple mathematical function: y = g(w · x + b)

  22. Deep Learning: The composition of 10^6 to 10^9 such functions is powerful.
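The point about composition can be sketched directly: apply y = g(W · x + b) repeatedly, feeding each layer's output into the next (random weights, tanh, and three layers are illustrative assumptions):

```python
import numpy as np

# Composing simple per-layer functions: each layer is just
# tanh(W @ x + b), and depth comes from repeating it.

def neuron_layer(x, W, b):
    return np.tanh(W @ x + b)

rng = np.random.default_rng(2)
x = rng.normal(size=4)
for _ in range(3):               # compose three layers of 4 neurons
    W = rng.normal(size=(4, 4))
    b = rng.normal(size=4)
    x = neuron_layer(x, W, b)
print(x)                         # final 4-dimensional representation
```

A real deep network composes the same simple building block millions of times, with the weights learned rather than random.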

  23. Deep Learning Book (http://www.deeplearningbook.org/), Chapter 5: "A core idea in deep learning is that we assume that the data was generated by the composition of factors or features, potentially at multiple levels in a hierarchy."

  24. Results get better (to a degree) with: more data, bigger models, more computation.

  25. Better algorithms, new insights, and improved methods help, too!

  26. TensorFlow

  27. Adoption of Deep Learning Tools on GitHub

  28. TensorFlow: Epoch: one training iteration, i.e., one full pass through the dataset. Batch: a portion of the dataset, i.e., the number of samples processed in one step after the dataset has been divided. Regularization: a set of techniques that helps learning models generalize rather than overfit (http://www.godeep.ml/regularization-using-tensorflow/).
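The epoch/batch relationship is simple arithmetic; a sketch with illustrative numbers:

```python
import math

# With 1,000 samples and a batch size of 32, one epoch consists of
# ceil(1000 / 32) = 32 batches (the last one is smaller).
n_samples, batch_size = 1000, 32

batches_per_epoch = math.ceil(n_samples / batch_size)
last_batch = n_samples - (batches_per_epoch - 1) * batch_size

print(batches_per_epoch)  # 32
print(last_batch)         # 8
```

Each batch triggers one weight update, so one epoch here means 32 updates.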

  29. TensorFlow: Operates over tensors: n-dimensional arrays.

  30. TensorFlow: Operates over tensors (n-dimensional arrays), using a flow graph: a dataflow computation framework.


  32. TensorFlow: 5.7 ← a scalar (a single number: float, etc.)

  33. TensorFlow

  34. TensorFlow

  35. TensorFlow: Tensors have a shape that is described with a vector.

  36. TensorFlow: [1000, 256, 256, 3]: 1000 images; each image has 256 rows; each row has 256 pixels; each pixel has 3 values (RGB).
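The same shape convention can be tried with numpy arrays (a scaled-down batch of 10 images instead of 1000, to keep the allocation small):

```python
import numpy as np

# A rank-4 tensor of RGB images, matching the slide's shape layout:
# [batch, rows, pixels per row, channels].
images = np.zeros((10, 256, 256, 3), dtype=np.uint8)

print(images.shape)      # (10, 256, 256, 3)
print(images.ndim)       # 4 -- the tensor's rank
print(images[0, 5, 7])   # the 3 RGB values of one pixel -> shape (3,)
```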

  37. TensorFlow: Computation is a dataflow graph

  38. TensorFlow: Computation is a dataflow graph... with tensors

  39. TensorFlow: Computation is a dataflow graph... with state

  40. Core TensorFlow data structures and concepts. Graph: a TensorFlow computation, represented as a dataflow graph: a collection of ops that may be executed together as a group.

  41. Operation: a graph node that performs computation on tensors.

  42. Tensor: a handle to one of the outputs of an Operation; provides a means of computing the value in a TensorFlow Session.

  43. TensorFlow Constants

  44. Placeholders: must be fed with data on execution.

  45. Variables: a modifiable tensor that lives in TensorFlow's graph of interacting operations.

  46. Session: encapsulates the environment in which Operation objects are executed and Tensor objects are evaluated.
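To make these four concepts concrete without requiring TensorFlow itself, here is a deliberately tiny toy emulation in plain Python. The class names mirror the slides' terms, but the implementation is an illustrative sketch, not TensorFlow's API:

```python
# Toy dataflow graph: constants, placeholders fed at execution time,
# mutable variables, and a session that evaluates graph nodes.

class Constant:
    def __init__(self, value):
        self.value = value
    def eval(self, feed):
        return self.value

class Placeholder:
    def eval(self, feed):
        return feed[self]          # must be fed with data on execution

class Variable:
    def __init__(self, value):
        self.value = value         # modifiable state living in the graph
    def eval(self, feed):
        return self.value

class Add:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def eval(self, feed):
        return self.a.eval(feed) + self.b.eval(feed)

class Mul:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def eval(self, feed):
        return self.a.eval(feed) * self.b.eval(feed)

class Session:
    def run(self, node, feed_dict=None):
        return node.eval(feed_dict or {})

# Build the graph y = w * x + c, then evaluate it in a session.
x = Placeholder()
w = Variable(2.0)
c = Constant(1.0)
y = Add(Mul(w, x), c)

sess = Session()
print(sess.run(y, feed_dict={x: 3.0}))  # 2.0 * 3.0 + 1.0 = 7.0
```

Note how building the graph and running it are separate steps, and how re-running after mutating the Variable's value changes the result without rebuilding the graph, which is the stateful-graph idea from the slides.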

  47. TensorFlow

  48. TensorFlow: https://playground.tensorflow.org
