Indoor Localization Robot Based on Computer Vision - PowerPoint PPT Presentation



  1. Indoor Localization Robot Based on Computer Vision, Ubiquitous Computing Course 2015, Soliman Nasser

  2. Outline ● Why Localization? ● Why Computer Vision? ● GPS ● Motion Capture System ● Triangulation ● PnP ● 3D Cameras ● SLAM ● Summary

  3. Why Localization? 1) History...

  4. Why Localization? 2) Robot Navigation and Mapping

  5. Why Localization? 3) Guiding (museums...) 4) Airports, malls, supermarkets, … 5) More and more...

  6. Why Computer Vision? In previous lectures, we saw a lot of techniques for Indoor Localization (which are not CV). Question: Problem solved?

  7. Why Computer Vision? In previous lectures, we saw a lot of techniques for Indoor Localization (which are not CV). Question: Problem solved? Answer:

  8. Why Computer Vision ? Microsoft Indoor Localization Competition - IPSN 2015

  9. Why Computer Vision? Microsoft Indoor Localization Competition - IPSN 2015. The problem? Accuracy.

  10. Why Not GPS? * Works great outdoors. Example: DJI Phantom hovering outdoors on a windy day. /home/soliman/UbiquitousComputing/phantom-outdoor.mp4

  11. Why Not GPS? ** Doesn't work indoors: there is no GPS signal. Example: DJI Phantom indoors. /home/soliman/UbiquitousComputing/phantom-indoor.mov

  12. Motion Capture System Industry manufacturers: Vicon, OptiTrack, … Accuracy: millimeters. Cost: very, very, very high. IR spectrum → indoor only. Usually used in research labs.

  13. Motion Capture System Very high-FPS localization: 100 – 1000 FPS, 6DOF.

  14. Motion Capture System

  15. Motion Capture System Example: an autonomous micro-quadcopter flying indoors (in the lab) at 100 FPS – why do we need such a high FPS? /home/soliman/UbiquitousComputing/ladybird-neimanem.mp4 /home/soliman/UbiquitousComputing/ladybird-mute.mov /home/soliman/UbiquitousComputing/VCQ.mov

  16. Triangulation Triangulation is the process of determining 3D world coordinates for an object given 2D views from multiple cameras.
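
To make this concrete, here is a minimal sketch of two-view triangulation with OpenCV; the intrinsics, camera poses, and matching pixel coordinates are made up for illustration and are not taken from the presentation.

      # Two-view triangulation sketch (illustrative values only).
      import numpy as np
      import cv2

      # Shared intrinsics K (assumed) and two camera poses: camera 1 at the origin,
      # camera 2 shifted 0.5 m along x, i.e. a stereo-like setup.
      K = np.array([[700.0, 0.0, 320.0],
                    [0.0, 700.0, 240.0],
                    [0.0,   0.0,   1.0]])
      P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
      P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

      # The same world point observed in both images (2xN pixel arrays).
      pts1 = np.array([[400.0], [260.0]])
      pts2 = np.array([[330.0], [260.0]])

      # Returns 4xN homogeneous coordinates; divide by w to get the 3D point.
      X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
      print("3D point:", (X_h[:3] / X_h[3]).ravel())   # ~ (0.57, 0.14, 5.0)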

  17. Triangulation

  18. Perspective-n-Point The aim of the Perspective-n-Point (PnP) problem is to determine the position and orientation of a camera given its intrinsic parameters and a set of n correspondences between 3D points and their 2D image projections.
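
For reference, the pinhole projection model that PnP inverts can be written as follows, in standard notation (not taken from the slides): K is the known intrinsic matrix and (R, t) is the unknown 6DOF camera pose.

      % Projection of the i-th 3D point onto its 2D image point (u_i, v_i).
      % PnP recovers R and t from n such correspondences, with K known.
      s_i \begin{pmatrix} u_i \\ v_i \\ 1 \end{pmatrix}
        = K \,[\, R \mid t \,]
          \begin{pmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{pmatrix},
      \qquad i = 1, \dots, n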

  19. Perspective-n-Point

  20. Perspective-n-Point How can we estimate 6DOF from a chessboard? 1) Chessboard corner detection (2D points) 2) Given 3D points – constant 3) Finding correspondences between 2D and 3D 4) Solve PnP
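
A minimal sketch of steps 1)–4) with OpenCV is shown below; the board size, square size, intrinsics, and input image name are assumptions for illustration, not values from the presentation.

      # Chessboard-based PnP sketch (board and calibration values assumed).
      import numpy as np
      import cv2

      pattern_size = (9, 6)   # inner corners of the chessboard (assumed)
      square_size = 0.025     # 25 mm squares (assumed)

      # 2) The 3D board points are constant: a planar grid in the board's own frame.
      obj_pts = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
      obj_pts[:, :2] = np.indices(pattern_size).T.reshape(-1, 2) * square_size

      K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])  # assumed intrinsics
      dist = np.zeros(5)                                                         # assumed no distortion

      # 1) + 3) Detect the 2D corners; OpenCV returns them in the same order as
      # obj_pts, which gives the 2D-3D correspondences directly.
      img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical camera frame
      found, corners = cv2.findChessboardCorners(img, pattern_size)
      if found:
          # 4) Solve PnP: rvec/tvec is the 6DOF pose of the board in the camera frame.
          ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, dist)
          R, _ = cv2.Rodrigues(rvec)
          print("R:\n", R, "\nt:", tvec.ravel())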

  21. Perspective-n-Point Example: *AR Drone (as a camera) *Chessboard (known 3D points) A Parrot AR Drone flying autonomously and following the chessboard (about 10 FPS "only"). /home/soliman/UbiquitousComputing/ARDrone-PnP.mp4

  22. 3D Cameras 3D Cameras: Unlike normal 2D cameras, 3D cameras output an RGB-D matrix. RGB-D: besides a normal RGB output, depth info is also available.

  23. 3D Cameras Microsoft Kinect 360: patterns are projected (IR light at different frequencies), and triangulation with the IR camera is applied to estimate depth. Why not use 2 cameras? Works indoors only – why?
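
As a small illustration of what the depth channel provides, the sketch below back-projects a depth image into 3D points using the pinhole model; the intrinsic values are rough Kinect-like numbers assumed for illustration.

      # Back-project a depth image (meters) to 3D points (assumed intrinsics).
      import numpy as np

      fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5   # assumed depth-camera intrinsics

      def depth_to_points(depth):
          """Turn an HxW depth image into an HxWx3 array of camera-frame 3D points."""
          h, w = depth.shape
          u, v = np.meshgrid(np.arange(w), np.arange(h))
          x = (u - cx) * depth / fx
          y = (v - cy) * depth / fy
          return np.dstack([x, y, depth])

      # Fake 4x4 depth image, 2 m everywhere, just to show the call.
      print(depth_to_points(np.full((4, 4), 2.0))[0, 0])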

  24. RGBDSLAM RGBDSLAM = RGB-D + SLAM. SLAM: Simultaneous Localization And Mapping.
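
To hint at the "localization" half of SLAM, the sketch below chains per-frame relative motion estimates (such as an RGB-D front end would produce by registering consecutive frames) into a global pose; the relative transforms here are made up, and a real system would also build a map and close loops.

      # Chain relative frame-to-frame motions into a global pose (illustrative only).
      import numpy as np

      def make_T(R, t):
          """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
          T = np.eye(4)
          T[:3, :3] = R
          T[:3, 3] = t
          return T

      pose = np.eye(4)   # the robot starts at the world origin

      # Pretend the front end reported "move 0.1 m forward" three times in a row.
      step = make_T(np.eye(3), np.array([0.0, 0.0, 0.1]))
      for _ in range(3):
          pose = pose @ step        # compose the relative motion onto the global pose

      print("Estimated position:", pose[:3, 3])   # -> [0. 0. 0.3]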

  25. SLAM using Kinect TurtleBot – a robot with a Kinect. Localization and mapping using the Kinect. /home/soliman/UbiquitousComputing/turtlebot.mp4

  26. Summary
      |                  | Triangulation | PnP        | 3D Cameras | GPS  | Other sensors      |
      | Accuracy         | Very High     | Med        | Very High  | High | Low (limited dist) |
      | Cost             | $$$$          | $$         | $$         | $    | $                  |
      | Time complexity  | Low           | Low+       | Med        | Med  | Low                |
      | Indoor/Outdoor   | Usually In    | Usually In | In         | Out  | In/Out             |
      | Usage complexity | High          | Med        | Med        | Low  | Depends            |

  27. END Thank you for listening :)
