Computer Vision for Mobile Robots in GPS-Denied Areas
Michael Berli, 28 April 2015
Supervisor: Tobias Nägeli
Robots can work in places we as humans can't reach, and they can do jobs we are unable or unwilling to do. [1,2]
Autonomous mobile robots
§ How do we make robots navigate autonomously?
§ Robots should be able to explore an unknown environment and navigate within it without active human control
Autonomous mobile robots
§ Using computer vision for autonomous navigation: Mapless, Map-Based, Map-Building
Robots [3,4,5]
Focus in this talk
§ Type of robot: autonomous ground vehicles
§ Environment: indoor environments (rooms, tunnels, warehouses)
§ Sensors: cameras, wheel sensors
Robot scenarios: Industrial automation [6]
Robot scenarios: Inspection & discovery [7]
Robot scenarios: Space operations [8]
The three navigation classes: Mapless, Map-Based, Map-Building
Mapless Navigation: walk through Paris without colliding [10]
Collision Avoidance
Optical Flow
§ Describes the motion of patterns across successive images: a point at (x, y) in the frame at time t moves to (x+dx, y+dy) in the frame at t+1, giving a flow vector (u, v)
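A minimal sketch of computing dense optical flow between two frames, assuming OpenCV is available; the file names and the Farneback parameters are illustrative placeholders, not from the talk:

```python
import cv2
import numpy as np

prev = cv2.imread("frame_t.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Farneback dense flow: returns an (H, W, 2) array of (u, v) vectors,
# i.e. the estimated (dx, dy) for every pixel between the two frames.
flow = cv2.calcOpticalFlowFarneback(
    prev, curr, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

magnitude, angle = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print("mean flow magnitude:", magnitude.mean())
```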
Optical Flow [11]
(Figure: example flow field between the frames at t0 and t1) [11]
Optical Flow
§ Gives an understanding of depth in images
§ Yields the time-to-contact between a camera and an object [11]
Optical Flow: Time-To-Contact
FOE (Focus of Expansion): the image point from which the flow vectors radiate; it is the point at which the camera points.
Optical Flow: Time-To-Contact
(Figure: the flow field is split into left, central, and right regions around the FOE, giving the estimates TTC_l, TTC_c, and TTC_r)
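A hedged sketch of estimating time-to-contact from the flow field, assuming the FOE is already known and `flow` is the (H, W, 2) array from the snippet above; splitting the result into left, central, and right regions mirrors the slide:

```python
import numpy as np

def time_to_contact(flow, foe, eps=1e-6):
    """TTC (in frames) per pixel: distance from the FOE / radial expansion rate."""
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Vector from the FOE to each pixel, and its length r.
    rx, ry = xs - foe[0], ys - foe[1]
    r = np.hypot(rx, ry) + eps
    # Radial component of the flow (expansion rate away from the FOE).
    radial = (flow[..., 0] * rx + flow[..., 1] * ry) / r
    return r / np.clip(radial, eps, None)

# Example use: steer away from the region with the smallest (most urgent) TTC.
# ttc = time_to_contact(flow, foe=(320, 240))
# ttc_l, ttc_c, ttc_r = np.hsplit(ttc, 3)
```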
Obstacle Avoidance FSM (finite state machine) [23]
Inspired by biology
(Figures: steering away from the maximum of optical flow)
Optical Flow: Further applications
§ Aids for the visually impaired
§ Image stabilization
§ Video compression (MPEG)
Drawbacks
§ Hard if there is no texture
§ Dynamic scenes are problematic
The three navigation classes: Mapless, Map-Based, Map-Building
Map-Based Navigation: use a map of Paris to navigate to the Champs-Élysées [12]
Map-Based Navigation: Robot Scenario [13]
Map-Based Navigation: Map Representation
Topological Map: a graph-based representation of features and their relations, often associated with actions.
+ simple and compact
- no absolute distances
- obstacle avoidance still needed
Metric Map: a two-dimensional space in which objects and paths are placed.
+ very precise
- hard to obtain and to maintain
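As a concrete illustration of a topological map (the node names and door actions are invented for the example), the graph can be stored as an adjacency list and searched for a path:

```python
from collections import deque

# Nodes are places; edge values annotate the action that traverses the edge.
topo_map = {
    "room_A": {"corridor": "exit door"},
    "corridor": {"room_A": "first door left", "room_B": "second door right"},
    "room_B": {"corridor": "exit door"},
}

def plan(start, goal):
    """Breadth-first search over the graph; returns the node sequence."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in topo_map[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

print(plan("room_A", "room_B"))  # ['room_A', 'corridor', 'room_B']
```

Note that the path carries no absolute distances, which is exactly the trade-off listed above.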
Map-Based Navigation Example
1. Build a topological map of the floor
2. Use the topological map to navigate
Feature Extraction
Feature: an element that can easily be re-observed and distinguished from the environment
§ Features should be
§ easily re-observable and distinguishable
§ plentiful in the environment
§ stationary
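For illustration, one common way to extract such features is ORB in OpenCV; this is an assumption for the example, the talk does not prescribe a specific detector:

```python
import cv2

img = cv2.imread("room.png", cv2.IMREAD_GRAYSCALE)  # placeholder image path
orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(img, None)
# Each keypoint is a distinctive, re-observable image location; the
# descriptors let us recognise the same feature again in later frames.
print(len(keypoints), "features, descriptor matrix:", descriptors.shape)
```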
Room Identification: each room yields a characteristic feature signature (signature of room F) [14]
Topological Map [14]
Room Searching: signature matching [P]
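A minimal sketch of signature matching, assuming each room's signature is summarised as a fixed-length vector; the representation and the numbers are invented for the example:

```python
import numpy as np

def match_room(observed, room_signatures):
    """Return the room whose stored signature is closest to the observation."""
    best, best_dist = None, np.inf
    for room, sig in room_signatures.items():
        dist = np.linalg.norm(observed - sig)
        if dist < best_dist:
            best, best_dist = room, dist
    return best, best_dist

rooms = {"F": np.array([0.8, 0.1, 0.3]), "G": np.array([0.2, 0.7, 0.5])}
print(match_room(np.array([0.75, 0.15, 0.35]), rooms))  # ('F', ...)
```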
Drawbacks and Extensions
§ Learning and maintenance are expensive (what happens if a cupboard is removed?)
§ Use scanner tags or artificial beacons instead?
The three navigation classes: Mapless, Map-Based, Map-Building [15]
Map-Building Navigation: leave your hotel in Paris, explore the environment, and return to the hotel afterwards [16]
Map-Building Navigation
§ Goal: in an unknown environment, the robot can build a map and localize itself within that map
§ Two application categories
§ Structure from Motion (offline)
§ Simultaneous Localization and Mapping (SLAM): real-time!
Structure from Motion (Offline)
Pipeline: the robot moves around and captures video frames, then frame-to-frame feature detection, then 3D map and trajectory reconstruction.
Pros
§ Well studied
§ Very accurate and robust solutions
Cons
§ Offline approach
§ A changing environment requires a new learning phase
Simultaneous Localisation and Mapping (SLAM)
§ Build a map using dead reckoning and camera readings
§ We focus on EKF-SLAM (Extended Kalman Filter)
A map built with SLAM [15]
Dead Reckoning
§ Motion estimation with data from odometry and heading sensors
(Figure: from the starting position, each prediction step grows the position uncertainty)
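A minimal dead-reckoning sketch for a differential-drive robot; the midpoint-heading motion model and the step values are illustrative assumptions:

```python
import math

def dead_reckon(pose, d_s, d_theta):
    """Integrate one odometry step (distance d_s, heading change d_theta)."""
    x, y, theta = pose
    theta_mid = theta + d_theta / 2.0      # average heading over the step
    return (x + d_s * math.cos(theta_mid),
            y + d_s * math.sin(theta_mid),
            theta + d_theta)

pose = (0.0, 0.0, 0.0)
for step in [(1.0, 0.0), (1.0, math.pi / 2), (1.0, 0.0)]:
    pose = dead_reckon(pose, *step)
print(pose)
# Without external corrections, odometry errors accumulate step by step,
# which is exactly the growing uncertainty shown on the slide.
```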
Six steps of map-building (1/2) [17]
Six steps of map-building (2/2) [17]
EKF-SLAM: The system
The system is represented by
- the system state vector
- the system covariance matrix
EKF-SLAM: The state vector
The state stacks the robot pose with the estimated feature positions:

$$ x = \begin{pmatrix} \hat{x}_v \\ \hat{y}_1 \\ \hat{y}_2 \\ \hat{y}_3 \\ \vdots \end{pmatrix}, \qquad \hat{x}_v = \begin{pmatrix} x_r \\ y_r \\ \theta_r \end{pmatrix}, \qquad \hat{y}_i = \begin{pmatrix} x_i \\ y_i \end{pmatrix} $$
EKF-SLAM: The covariance matrix
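For reference, the covariance matrix in EKF-SLAM couples the robot pose with every mapped feature; a sketch of its standard block structure, consistent with the state vector above:

$$ P = \begin{pmatrix} P_{xx} & P_{xy_1} & P_{xy_2} & \cdots \\ P_{y_1 x} & P_{y_1 y_1} & P_{y_1 y_2} & \cdots \\ P_{y_2 x} & P_{y_2 y_1} & P_{y_2 y_2} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} $$

The off-diagonal blocks are what make SLAM work: observing one feature improves the estimate of the robot and of all correlated features.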
SLAM Process (loop)
Robot moved, then PREDICTION of robot position, then PREDICTION of observed features.
In parallel: camera image, then feature extraction.
Match predicted and observed features, then EKF fusion, then ESTIMATION of updated robot position.
Motion model
§ Estimate the robot's new position after a movement:

$$ x_v = f_v(\hat{x}_v, u) $$

where $\hat{x}_v$ is the old estimated robot position and $u$ is the odometry input.
Measurement model
§ Based on the predicted robot position and the map, use a measurement model to predict which features should be in view now
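As a concrete instance (an illustrative assumption; the slides do not specify the model), a planar range-bearing measurement for feature $i$ predicts:

$$ h_i(\hat{x}) = \begin{pmatrix} \sqrt{(x_i - x_r)^2 + (y_i - y_r)^2} \\ \operatorname{atan2}(y_i - y_r,\; x_i - x_r) - \theta_r \end{pmatrix} $$

Features whose predicted bearing falls outside the camera's field of view are not expected to be observed.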
Data matching
§ Match predicted and observed features (prediction vs. camera observation)
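The slides do not name a matching criterion; a standard choice is a Mahalanobis gate on the innovation $\nu = z - h(\hat{x})$:

$$ \nu^{\top} S^{-1} \nu < \gamma, \qquad S = H P H^{\top} + R, $$

accepting a pairing only if the covariance-weighted distance between prediction and observation is below a chi-square threshold $\gamma$.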
EKF Fusion
Residual: the difference between the camera observation and the prediction
EKF Fusion
EKF Update
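For reference, the textbook EKF fusion and update steps that these slides illustrate (standard form, not copied from the deck):

$$ \nu = z - h(\hat{x}), \qquad S = H P H^{\top} + R, \qquad K = P H^{\top} S^{-1}, $$
$$ \hat{x} \leftarrow \hat{x} + K\nu, \qquad P \leftarrow (I - K H) P. $$

The Kalman gain $K$ weights the residual $\nu$ by how confident the filter is in its prediction versus the measurement.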
SLAM: Research topics
§ Robustness in changing environments
§ Multi-robot mapping
Motion estimation of agile cameras
§ Real-Time SLAM with a Single Camera (Andrew J. Davison, University of Oxford, 2003)
§ Parallel Tracking and Mapping for Small AR Workspaces (Georg Klein and David Murray, University of Oxford, 2007) [18]
Motion estimation of agile cameras
§ No odometry data; fast and unpredictable movements
§ Use a constant-velocity model instead of odometry:

$$ x_v = \begin{pmatrix} x & y & z & v_x & v_y & v_z & \alpha & \beta & \delta \end{pmatrix}^{\top} $$

with position $(x, y, z)$, velocity $(v_x, v_y, v_z)$, and orientation $(\alpha, \beta, \delta)$.
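A minimal sketch of the corresponding prediction step under the constant-velocity assumption (Davison's full model also carries angular velocity, omitted here to match the slide's nine-dimensional state):

$$ \begin{pmatrix} p_{k+1} \\ v_{k+1} \\ o_{k+1} \end{pmatrix} = \begin{pmatrix} p_k + (v_k + n_v)\,\Delta t \\ v_k + n_v \\ o_k + n_o \end{pmatrix}, $$

where $p$ is the position, $v$ the velocity, $o$ the orientation, and $n_v, n_o$ are zero-mean process noise terms that let the filter absorb unmodelled accelerations.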
(Figure) [19]
Tracking and Mapping for AR Workspaces [20]
(Figure) [21]
What we have seen
§ What autonomous mobile robots are used for
§ How today's mobile robots navigate autonomously: mapless, map-based, map-building
§ The potential and the challenges of SLAM
References
Papers
1. Bonin-Font, Francisco, Alberto Ortiz, and Gabriel Oliver. "Visual navigation for mobile robots: A survey." Journal of Intelligent and Robotic Systems 53.3 (2008): 263-296.
2. Davison, Andrew J. "Real-time simultaneous localisation and mapping with a single camera." Proceedings of the 9th IEEE International Conference on Computer Vision, 2003.
3. Klein, Georg, and David Murray. "Parallel tracking and mapping for small AR workspaces." Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR), 2007.
4. Davison, Andrew J. "Sequential localisation and map-building for real-time computer vision and robotics." Robotics and Autonomous Systems 36 (2001): 171-183.
5. Guzel, Mehmed Serdar, and Robert Bicker. "Optical flow based system design for mobile robots." Robotics, Automation and Mechatronics, 2010.
6. Mata, M., J. M. Armingol, A. de la Escalera, and M. A. Salichs. "Using learned visual landmarks for intelligent topological navigation of mobile robots." 2003.