
What Is There to Be Skeptical About? About the Fantastic Success of - PowerPoint PPT Presentation



  1. What Is There to Be Skeptical About? About the Fantastic Success of Deep Learning & The Imperative of Incorporating Priors Into Learning Oliver Brock Robotics and Biology Laboratory

  2. The Fantastic Success of Deep Learning ► Lots of blowout successes in CV and beyond ► End-to-end ► Pixels to Torque ► By using a general function, a gradient, and lots of data

  3. DNNs: Science? Engineering? Protoscience? Art?

  4. Hennig Brand (1630–1692; depicted 1669). "The Alchymist, in Search of the Philosopher's Stone, Discovers Phosphorus, and prays for the successful Conclusion of his operation, as was the custom of the Ancient Chymical Astrologers" (1771, 1795), by Joseph Wright of Derby (1734–1797)

  5. NFL (No Free Lunch) ► David Wolpert, William Macready, and many others…

  6. Bremermann’s Limit: 1.36 × 10^50 bits/(s·kg)

  7. 10^75 ops/s

  8. 10^103 ops/s

  9. 10^200 ≫ 10^103    600 ≪ 300,000,000
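The deck's compute figures can be sanity-checked with a short calculation. The sketch below assumes (this pairing is not stated on the slides) that the 10^75 ops/s figure applies Bremermann's limit to the mass of the Earth and the 10^103 ops/s figure to the mass of the observable universe; the mass values are standard rough estimates, not taken from the deck.

```python
import math

BREMERMANN = 1.36e50   # Bremermann's limit, bits per second per kilogram
EARTH_MASS = 5.97e24   # kg (standard estimate, assumed pairing)
UNIVERSE_MASS = 1e53   # kg (rough estimate for the observable universe)

# Maximum computation rate of an ideal computer of a given mass.
earth_ops = BREMERMANN * EARTH_MASS        # ~8.1e74, i.e. ~10^75
universe_ops = BREMERMANN * UNIVERSE_MASS  # ~1.4e103, i.e. ~10^103

print(round(math.log10(earth_ops)))     # 75
print(round(math.log10(universe_ops)))  # 103
```

Even under this absurdly generous physical bound, a problem space of 10^200 hypotheses dwarfs what any physically realizable computer could enumerate, which is the point of the comparison on the slide.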

  10. The Imperative of Incorporating Priors Into Learning ► Assumption: NFL captures a property of the problem space ► Hypothesis 1: The problems we want to solve are actually really simple and the NFL does not apply ► Hypothesis 2: Neural nets capture exactly the right prior for the type of problems we want to solve ► Hypothesis 3: Hypotheses 1 & 2 together ► Hypothesis 4: We must incorporate task-specific priors

  11. I am the deep learning hammer!

  12. Towards Combined Tools

  13. Merging Two Ends of the Spectrum Rico Jonschkowski

  14. Incorporate Priors Into Learning ► Representation Learning with Newton’s Laws (Rico Jonschkowski) [Rico Jonschkowski and Oliver Brock. Learning State Representations with Robotic Priors. Autonomous Robots 39(3):407–428, Springer US, 2015.] ► Learning with Side Information: Direct Pattern, Multi-Task Pattern (Rico Jonschkowski, Sebastian Höfer) [Rico Jonschkowski, Sebastian Höfer, and Oliver Brock. Patterns for Learning with Side Information. arXiv:1511.06429, 2016.]

  15. Learning With Side Information: A Simple Example f: 3 → 14 f: 5 → 30 f: 2 → 9 What is f? Rico Jonschkowski, Sebastian Höfer, and Oliver Brock. Patterns for Learning with Side Information. arXiv:1511.06429. 2016.

  16. Learning With Side Information: A Simple Example f: 3 → 9 → 14 f: 5 → 25 → 30 f: 2 → 4 → 9 What is f? Rico Jonschkowski, Sebastian Höfer, and Oliver Brock. Patterns for Learning with Side Information. arXiv:1511.06429. 2016.
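The puzzle on these two slides resolves to f(x) = x² + 5, and the side information (the middle column) reveals the intermediate quantity x². A minimal sketch (function names are illustrative, not from the paper) of how that side information decomposes one hard learning problem into two simple ones:

```python
# Without side information, the learner must recover f(x) = x^2 + 5
# directly from three input/output pairs. The side information exposes
# the intermediate value x^2, splitting f into two trivial sub-mappings.

def g(x):
    """First stage, supervised by the side information: x -> x^2."""
    return x * x

def h(y):
    """Second stage, from intermediate to output: y -> y + 5."""
    return y + 5

def f(x):
    """The composed function the learner ultimately needs."""
    return h(g(x))

# The three examples from the slide, with their intermediate values.
for x, mid, target in [(3, 9, 14), (5, 25, 30), (2, 4, 9)]:
    assert g(x) == mid
    assert f(x) == target
print("f(x) = x^2 + 5 fits all examples")
```

This is the "direct pattern" idea in miniature: supervising an intermediate representation constrains the hypothesis space far more than the input/output pairs alone.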

  17. Case Study: Recursive State Estimation ► Toy problem: 1D hallway navigation with identical doors ► Problem and illustrations adapted from [Thrun, Burgard, Fox: Probabilistic Robotics. MIT Press, 2006.]

  18. Neural Histogram Filter

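For contrast with the learned Neural Histogram Filter, here is a hand-coded classical histogram (Bayes) filter for the 1D hallway toy problem, following the standard algorithm from Thrun, Burgard, and Fox. Grid size, door positions, and noise levels are made-up illustration values; the NHF's contribution is that the motion and measurement models hard-coded below are instead learned from data, with the filter's structure acting as the prior.

```python
import numpy as np

N_CELLS = 20
DOORS = {3, 7, 12}   # hypothetical door positions in the hallway

# Uniform prior: the robot initially has no idea where it is.
belief = np.full(N_CELLS, 1.0 / N_CELLS)

def motion_update(belief, move=1, noise=0.1):
    """Shift belief by the commanded motion, blurring for odometry noise.
    The hallway wraps around (np.roll) to keep the toy example simple."""
    shifted = np.roll(belief, move)
    blurred = ((1 - 2 * noise) * shifted
               + noise * np.roll(shifted, 1)
               + noise * np.roll(shifted, -1))
    return blurred / blurred.sum()

def measurement_update(belief, saw_door, p_hit=0.8):
    """Reweight belief by the likelihood of the door observation."""
    likelihood = np.array([
        p_hit if ((i in DOORS) == saw_door) else 1 - p_hit
        for i in range(N_CELLS)
    ])
    posterior = belief * likelihood
    return posterior / posterior.sum()

# One filter step: move right one cell, then observe a door.
belief = motion_update(belief)
belief = measurement_update(belief, saw_door=True)
print(belief.argmax())   # belief now concentrates on the door cells
```

Seeing a door cannot disambiguate between identical doors in one step; only the recursion, alternating motion and measurement updates over time, collapses the multimodal belief, which is exactly what the animation on the preceding slides shows.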

  26. Learned Models (from 1600 training steps) Measurement Model Motion Model Motion Update Measurement Update


  29. NHF predictions for 32 test steps (from 1600 training steps)



  32. Qualitative Comparison (models from 1600 training steps) NHF RNN 2x RNN LSTM 2x LSTM

  33. First Results

  34. Conclusion ► I am the Gaussian process saw! ► I am the SVM screwdriver! ► NFL ⇒ We are all important!
