Lecture 8 Recap


  1. Lecture 8 Recap (Prof. Leal-Taixé and Prof. Niessner)

  2. What do we know so far? Width and depth of networks

  3. What do we know so far? Activation functions (non-linearities): Sigmoid: σ(x) = 1/(1 + e^(−x)); tanh: tanh(x); ReLU: max(0, x); Leaky ReLU: max(0.1x, x)
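The four non-linearities on this slide can be sketched in plain Python (scalar versions; the 0.1 slope of the Leaky ReLU follows the slide):

```python
import math

def sigmoid(x):
    # Squashes any input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes any input into (-1, 1).
    return math.tanh(x)

def relu(x):
    # Zero for negative inputs, identity otherwise.
    return max(0.0, x)

def leaky_relu(x, alpha=0.1):
    # Keeps a small slope (0.1 on the slide) for negative inputs.
    return max(alpha * x, x)

for f in (sigmoid, tanh, relu, leaky_relu):
    print(f.__name__, [round(f(x), 3) for x in (-2.0, 0.0, 2.0)])
```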

  4. What do we know so far? Backpropagation: worked example on a computational graph, with forward values (e.g. 1.37, 0.73) and backpropagated gradients (e.g. 0.20, −0.20, −0.39, −0.59) computed node by node (figure)
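A minimal sketch of this worked example. The inputs w0 = 2, x0 = −1, w1 = −3, x1 = −2, w2 = −3 are reconstructed from the values visible on the slide, so treat them as an assumption:

```python
import math

# f(w, x) = 1 / (1 + exp(-(w0*x0 + w1*x1 + w2)))
w0, x0, w1, x1, w2 = 2.0, -1.0, -3.0, -2.0, -3.0

# Forward pass
z = w0 * x0 + w1 * x1 + w2      # = 1.0
f = 1.0 / (1.0 + math.exp(-z))  # sigmoid(1.0) ≈ 0.73

# Backward pass: the sigmoid has the convenient local gradient f * (1 - f)
df_dz = f * (1.0 - f)           # ≈ 0.20
grads = {
    "w0": df_dz * x0,   # ≈ -0.20
    "x0": df_dz * w0,   # ≈  0.39
    "w1": df_dz * x1,   # ≈ -0.39
    "x1": df_dz * w1,   # ≈ -0.59
    "w2": df_dz * 1.0,  # ≈  0.20
}
print(round(f, 2), {k: round(v, 2) for k, v in grads.items()})
```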

  5. What do we know so far? SGD variations (momentum, etc.); batch normalization (N = mini-batch size, D = number of features)
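A minimal batch-normalization forward pass over an N×D mini-batch might look as follows. This is a sketch: it normalizes each feature over the batch and applies scale/shift, omitting the running statistics used at test time; `batchnorm_forward`, `gamma`, and `beta` are illustrative names:

```python
def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each of the D features over the N examples of a mini-batch."""
    n = len(x)        # N = mini-batch size
    d = len(x[0])     # D = number of features
    out = [[0.0] * d for _ in range(n)]
    for j in range(d):
        col = [row[j] for row in x]
        mean = sum(col) / n
        var = sum((v - mean) ** 2 for v in col) / n
        for i in range(n):
            # Normalize to zero mean / unit variance, then scale and shift.
            x_hat = (x[i][j] - mean) / (var + eps) ** 0.5
            out[i][j] = gamma[j] * x_hat + beta[j]
    return out

batch = [[1.0, 10.0], [3.0, 30.0], [5.0, 50.0]]   # N=3, D=2
normed = batchnorm_forward(batch, gamma=[1.0, 1.0], beta=[0.0, 0.0])
```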

  6. Why not only more layers? • We cannot make networks arbitrarily complex – Why not just go deeper and get better? – No structure!! – It’s just brute force! – Optimization becomes hard – Performance plateaus / drops!

  7. Dealing with images

  8. Using CNNs in Computer Vision (Credit: Li/Karpathy/Johnson)

  9. FC layers on images • How to process a tiny image with FC layers: 5 weights (5×5×3 image, layer of 3 neurons)

  10. FC layers on images • How to process a tiny image with FC layers: 25 weights for the whole 5×5 image

  11. FC layers on images • How to process a tiny image with FC layers: 75 weights for the whole 5×5 image on the three channels

  12. FC layers on images • How to process a tiny image with FC layers: 75 weights per neuron

  13. FC layers on images • How to process a normal image with FC layers (1000×1000×3 image, layer of 1000 neurons)

  14. FC layers on images • How to process a normal image with FC layers: 3 billion weights (1000×1000×3 inputs × 1000 neurons) – IMPRACTICAL
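The weight counts on these slides can be checked directly (`fc_weights` is an illustrative helper, not from the lecture):

```python
def fc_weights(height, width, channels, neurons):
    # In a fully-connected layer, every neuron connects to every input value.
    return height * width * channels * neurons

tiny = fc_weights(5, 5, 3, 1)             # 75 weights per neuron
normal = fc_weights(1000, 1000, 3, 1000)  # 3 billion weights in total
print(tiny, normal)
```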

  15. An alternative to Fully-Connected • We want to restrict the degrees of freedom – FC is somewhat brute force – We want a layer with structure – Weight sharing → using the same weights for different parts of the image

  16. Convolutions

  17. What are Convolutions? (f ∗ g)(t) = ∫_{−∞}^{+∞} f(τ) g(t − τ) dτ, with f = red, g = blue, f ∗ g = green. Examples: convolution of two box functions; convolution of two Gaussians. A convolution is the application of a filter to a function; the ‘smaller’ one is typically called the filter kernel

  18. What are Convolutions? Discrete case: box filter. Signal f = [4, 3, 2, −5, 3, 5, 2, 5, 5, 6], kernel g = [1/3, 1/3, 1/3]. ‘Slide’ the filter kernel from left to right; at each position, compute a single value in the output data

  19. What are Convolutions? Discrete case: box filter. Output so far: [3], since 4 · 1/3 + 3 · 1/3 + 2 · 1/3 = 3

  20. What are Convolutions? Discrete case: box filter. Output so far: [3, 0], since 3 · 1/3 + 2 · 1/3 + (−5) · 1/3 = 0

  21. What are Convolutions? Discrete case: box filter. Output so far: [3, 0, 0], since 2 · 1/3 + (−5) · 1/3 + 3 · 1/3 = 0

  22. What are Convolutions? Discrete case: box filter. Output so far: [3, 0, 0, 1], since (−5) · 1/3 + 3 · 1/3 + 5 · 1/3 = 1

  23. What are Convolutions? Discrete case: box filter. Output so far: [3, 0, 0, 1, 10/3], since 3 · 1/3 + 5 · 1/3 + 2 · 1/3 = 10/3

  24. What are Convolutions? Discrete case: box filter. Output so far: [3, 0, 0, 1, 10/3, 4], since 5 · 1/3 + 2 · 1/3 + 5 · 1/3 = 4

  25. What are Convolutions? Discrete case: box filter. Output so far: [3, 0, 0, 1, 10/3, 4, 4], since 2 · 1/3 + 5 · 1/3 + 5 · 1/3 = 4

  26. What are Convolutions? Discrete case: box filter. Output so far: [3, 0, 0, 1, 10/3, 4, 4, 16/3], since 5 · 1/3 + 5 · 1/3 + 6 · 1/3 = 16/3

  27. What are Convolutions? Discrete case: box filter. Output: [??, 3, 0, 0, 1, 10/3, 4, 4, 16/3, ??] – what to do at boundaries?

  28. What are Convolutions? What to do at boundaries? Two options: 1) Shrink: keep only the fully covered positions, output [3, 0, 0, 1, 10/3, 4, 4, 16/3]; 2) Pad, often with ‘0’, output [7/3, 3, 0, 0, 1, 10/3, 4, 4, 16/3, 11/3]
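The box-filter walkthrough and both boundary strategies can be reproduced with a short sketch (exact fractions keep 10/3 and 16/3 clean; strictly the code slides the kernel without flipping it, which for a symmetric kernel is identical to convolution):

```python
from fractions import Fraction

def conv1d(signal, kernel, pad=0):
    """Slide the kernel over the signal; pad=0 shrinks, pad>0 zero-pads."""
    k = len(kernel)
    padded = [Fraction(0)] * pad + [Fraction(s) for s in signal] + [Fraction(0)] * pad
    return [sum(padded[i + j] * kernel[j] for j in range(k))
            for i in range(len(padded) - k + 1)]

f = [4, 3, 2, -5, 3, 5, 2, 5, 5, 6]
g = [Fraction(1, 3)] * 3  # box filter

valid = conv1d(f, g)          # shrink: [3, 0, 0, 1, 10/3, 4, 4, 16/3]
padded = conv1d(f, g, pad=1)  # pad with 0: adds 7/3 and 11/3 at the ends
print(valid, padded)
```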

  29. Convolutions on Images. Image 5×5, rows: [−5, 3, 2, −5, 3], [4, 3, 2, 1, −3], [1, 0, 3, 3, 5], [−2, 0, 1, 4, 4], [5, 6, 7, 9, −1]. Kernel 3×3, rows: [0, −1, 0], [−1, 5, −1], [0, −1, 0]. Output 3×3, first entry: 5 · 3 + (−1) · 3 + (−1) · 2 + (−1) · 0 + (−1) · 4 = 15 − 9 = 6

  30. Convolutions on Images. Output so far: [6, 1]; 5 · 2 + (−1) · 2 + (−1) · 1 + (−1) · 3 + (−1) · 3 = 10 − 9 = 1

  31. Convolutions on Images. Output so far: [6, 1, 8]; 5 · 1 + (−1) · (−5) + (−1) · (−3) + (−1) · 3 + (−1) · 2 = 13 − 5 = 8

  32. Convolutions on Images. Output so far: [6, 1, 8; −7]; 5 · 0 + (−1) · 3 + (−1) · 0 + (−1) · 1 + (−1) · 3 = 0 − 7 = −7

  33. Convolutions on Images. Output so far: [6, 1, 8; −7, 9]; 5 · 3 + (−1) · 2 + (−1) · 3 + (−1) · 1 + (−1) · 0 = 15 − 6 = 9

  34. Convolutions on Images. Output so far: [6, 1, 8; −7, 9, 2]; 5 · 3 + (−1) · 1 + (−1) · 5 + (−1) · 4 + (−1) · 3 = 15 − 13 = 2

  35. Convolutions on Images. Output so far: [6, 1, 8; −7, 9, 2; −5]; 5 · 0 + (−1) · 0 + (−1) · 1 + (−1) · 6 + (−1) · (−2) = −5

  36. Convolutions on Images. Output so far: [6, 1, 8; −7, 9, 2; −5, −9]; 5 · 1 + (−1) · 3 + (−1) · 4 + (−1) · 7 + (−1) · 0 = 5 − 14 = −9

  37. Convolutions on Images. Final output: [6, 1, 8; −7, 9, 2; −5, −9, 3]; 5 · 4 + (−1) · 3 + (−1) · 4 + (−1) · 9 + (−1) · 1 = 20 − 17 = 3
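The whole 5×5 example can be verified with a small ‘valid’ 2-D sketch. The image and kernel below are reconstructed from the per-step sums on the slides; as before, the code skips the kernel flip, which changes nothing here because the cross-shaped kernel is symmetric:

```python
# 5x5 image and 3x3 sharpening-style kernel from the slides (reconstructed).
image = [
    [-5, 3, 2, -5,  3],
    [ 4, 3, 2,  1, -3],
    [ 1, 0, 3,  3,  5],
    [-2, 0, 1,  4,  4],
    [ 5, 6, 7,  9, -1],
]
kernel = [
    [ 0, -1,  0],
    [-1,  5, -1],
    [ 0, -1,  0],
]

def conv2d_valid(img, ker):
    """2-D filtering without padding: the output shrinks by kernel_size - 1."""
    kh, kw = len(ker), len(ker[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(ker[u][v] * img[i + u][j + v]
                 for u in range(kh) for v in range(kw))
             for j in range(ow)]
            for i in range(oh)]

print(conv2d_valid(image, kernel))
# [[6, 1, 8], [-7, 9, 2], [-5, -9, 3]]
```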
