Machine Learning II
Techie Pizza #44267, Project Lesson 5
Michael Lyle


  1. Machine Learning II Techie Pizza #44267 Project Lesson 5 Michael Lyle

  2. “Don’t use a five-dollar word when a fifty-cent word will do.” - Mark Twain

  3. “Don’t use a five-dollar word when a fifty-cent word will do.” - Mark Twain (But scientists like using five-dollar words; sorry about repeating them in this lesson!)

  4. Dense Neural Network

  5. Dense Neural Network Every neuron is connected to every neuron in the previous layer. That is a lot of connections, and each connection has its own "weight" to learn. This makes training slow and risks overfitting.

  6. Time Series Data ● Measurements from an accelerometer arrive as time-series data

     Time (ms)   Acceleration (m/s/s)
     0            0.37
     10          -0.12
     20          -0.30
     30           8.15
     40          -3.17
     50           0.50
     60          -0.15
     70           0.78

  7. Graphing Time Series Data [Line graph of the table above: Acceleration (m/s/s) vs. Time (ms)]

  8. Graphing Time Series Data [Same graph, annotated: the spike at 30 ms is labeled "A bump (something useful?)"; the small wiggles elsewhere are labeled "Noise?"]

  9. Time Series Data ● If we record 10 seconds of data, with 100 measurements per second, that's 1,000 measurements; each one is an input ● If we have a big dense layer using this data, that is 1,000,000 weights (1,000 neurons, each connected to 1,000 inputs) ● Small computers, like those in current scooters, can handle neural networks with about 25,000 weights
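The weight counts above can be checked with a tiny sketch (plain Python; the layer sizes and the 25,000-weight budget are the numbers from the slide):

```python
def dense_weights(n_inputs, n_neurons):
    # In a dense layer, every neuron connects to every input,
    # and each connection has its own weight to learn.
    return n_inputs * n_neurons

n_inputs = 10 * 100                 # 10 s of data at 100 measurements/s
weights = dense_weights(n_inputs, 1000)
print(weights)                      # 1000000
print(weights > 25_000)             # True: far over a scooter-sized budget
```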

  10. 5th Grade Math → 6th Grade Math → Pre-Algebra → Algebra → Geometry → Algebra II → Trigonometry → Pre-Calculus → Calculus → Linear Algebra → Differential Equations → Multivariate/Vector Calculus → Real & Complex Analysis → Group Theory → ...

  11. Convolutions ● Convolutions are usually studied during a Differential Equations class, but we can get the "gist" now! ● Convolutions are a way of filtering data, to smooth it out or to exaggerate features ● We make a recipe for the transformation we want, called a convolution kernel ● Then we follow the recipe for each entry in our data table ● Kernels can be any size, but for these examples the size is 3

  12. Our Data [Line graph of the accelerometer data: Acceleration (m/s/s) vs. Time (ms)]

  13. Convolutions - Smooth: Take the average of each measurement, the measurement before, and the measurement after. Kernel: [1/3 1/3 1/3]

     Time (ms)   Acceleration   Smoothed
     0            0.37
     10          -0.12          -0.02
     20          -0.30           2.57
     30           8.15           1.56
     40          -3.17           1.83
     50           0.50          -0.94
     60          -0.15           0.38
     70           0.78

  14. Convolutions - Smooth (step 1): Place the kernel over the first three measurements: (1/3)(0.37) + (1/3)(-0.12) + (1/3)(-0.30) ≈ -0.02, which becomes the smoothed value at 10 ms.

  15. Convolutions - Smooth (step 2): Slide the kernel down one row: (1/3)(-0.12) + (1/3)(-0.30) + (1/3)(8.15) ≈ 2.57, the smoothed value at 20 ms. Keep sliding until the kernel runs out of data.
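The sliding recipe above can be sketched in a few lines of plain Python (the data and kernel are the ones from the slides; the two endpoints are skipped because the kernel has no full window there, matching the blank rows in the table):

```python
acceleration = [0.37, -0.12, -0.30, 8.15, -3.17, 0.50, -0.15, 0.78]
kernel = [1/3, 1/3, 1/3]   # "average of each measurement and its two neighbors"

def convolve(data, kernel):
    # Slide the kernel along the data; each output value is the
    # weighted sum of one window. Endpoints have no full window,
    # so the output is two entries shorter than the input.
    half = len(kernel) // 2
    return [sum(w * x for w, x in zip(kernel, data[i - half:i + half + 1]))
            for i in range(half, len(data) - half)]

smoothed = convolve(acceleration, kernel)
# close to the Smoothed column in the table (the slide shows two decimals)
```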

  16. Our Data, Smoothed [Line graph of the smoothed data: Acceleration (m/s/s) vs. Time (ms)]

  17. Convolutions - Exaggerate: Take each measurement times 3, minus the measurement before and minus the one after. Kernel: [-1 3 -1]

     Time (ms)   Acceleration   Exaggerated
     0            0.37
     10          -0.12           -0.43
     20          -0.30           -8.93
     30           8.15           27.92
     40          -3.17          -18.16
     50           0.50            4.82
     60          -0.15           -1.73
     70           0.78

  18. Convolutions - Exaggerate (step 1): Place the kernel over the first three measurements: (-1)(0.37) + (3)(-0.12) + (-1)(-0.30) = -0.43, the exaggerated value at 10 ms.

  19. Convolutions - Exaggerate (step 2): Slide the kernel down one row: (-1)(-0.12) + (3)(-0.30) + (-1)(8.15) = -8.93, the exaggerated value at 20 ms. Keep sliding until the kernel runs out of data.
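The same sliding-window sketch works for this kernel too; only the recipe changes (again plain Python, with the data from the slides):

```python
acceleration = [0.37, -0.12, -0.30, 8.15, -3.17, 0.50, -0.15, 0.78]
kernel = [-1, 3, -1]   # "times 3, minus the neighbor on each side"

def convolve(data, kernel):
    # Weighted sum of each full three-measurement window.
    half = len(kernel) // 2
    return [sum(w * x for w, x in zip(kernel, data[i - half:i + half + 1]))
            for i in range(half, len(data) - half)]

exaggerated = [round(v, 2) for v in convolve(acceleration, kernel)]
# exaggerated == [-0.43, -8.93, 27.92, -18.16, 4.82, -1.73], matching the table
```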

  20. Our Data, Exaggerated [Line graph of the exaggerated data: Acceleration (m/s/s) vs. Time (ms)]

  21. Training an artificial neural network Remember this slide? 1) Start with example data and a set of "correct answers." 2) Adjust how strong the connections are to make the neural network produce output closer to what we want. ("Training") 3) Repeat. A lot. 4) For some problems, we may get a result that's as good as a human, or even better!

  22. Convolutional Neural Network ● A convolutional layer is a neural network layer that performs convolutions ● We don't need to know the exact convolution we want: training will find it for us – This means we don't need to take Differential Equations first! – Also, the computer can usually find better convolutions than people can ● Hopefully it simplifies the data in ways that make life easier for the later layers
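The "training will find the kernel for us" idea can be demonstrated with a toy example (not from the deck; a plain-Python gradient-descent sketch). We make up input data, compute "correct answers" using the exaggerate kernel [-1, 3, -1] from earlier slides, start from an all-zero kernel, and let repeated small adjustments recover it:

```python
import random

random.seed(0)
x = [random.uniform(-1, 1) for _ in range(300)]   # made-up input signal
true_kernel = [-1.0, 3.0, -1.0]                   # the kernel training should find

def convolve(data, kernel):
    return [sum(w * v for w, v in zip(kernel, data[i:i + len(kernel)]))
            for i in range(len(data) - len(kernel) + 1)]

targets = convolve(x, true_kernel)                # the "correct answers"

kernel = [0.0, 0.0, 0.0]                          # start knowing nothing
lr = 0.1
for step in range(500):
    preds = convolve(x, kernel)
    errors = [p - t for p, t in zip(preds, targets)]
    # Nudge each of the 3 weights downhill on the mean squared error.
    for j in range(3):
        grad = sum(e * x[i + j] for i, e in enumerate(errors)) / len(errors)
        kernel[j] -= lr * grad

print([round(w, 2) for w in kernel])              # close to [-1.0, 3.0, -1.0]
```

This is the whole trick: the layer only has 3 weights, and "training" is just steps 1-3 from the previous slide applied to those 3 numbers.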

  23. Convolutional Neural Network [Two stacked graphs: the raw Acceleration signal vs. Time (ms) on top, and the network's Detection output (0-100) vs. Time (ms) below]

  24. Convolutional Neural Network [The same two graphs, with the convolutional layer between them annotated "3 weights!"]

  25. Summary ● Time series data measures how values from a sensor change over time. ● Convolutional neural networks are good at matching patterns in time-series data. ● Convolutional layers are much more efficient (fast to train, fast to “run”) than dense layers, but are limited to spotting local patterns.
