

  1. Depth Camera for Mobile Devices Instructor - Simon Lucey 16-423 - Designing Computer Vision Apps

  2. Today • Stereo Cameras • Structured Light Cameras • Time of Flight (ToF) Camera

  3. Inferring 3D Points • Given we have prior knowledge of: • the intrinsic parameters $\{\Lambda_j\}_{j=1}^{J}$, • the extrinsic parameters $\{\Omega_j, \tau_j\}_{j=1}^{J}$, • the corresponding points $\{\mathbf{x}_j\}_{j=1}^{J}$. • The question is how to estimate the 3D point $\mathbf{w}$.

  4. Inferring 3D Points $\hat{\mathbf{w}} = \arg\min_{\mathbf{w}} \sum_{j=1}^{J} \eta\{\mathbf{x}_j - \mathrm{pinhole}[\mathbf{w}, \Lambda_j, \Omega_j, \tau_j]\}$, e.g. $\eta\{\mathbf{x}\} = \|\mathbf{x}\|_2^2$
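
Since the slides give the objective but no implementation, here is a minimal numpy sketch of the pinhole projection and the reprojection-error objective above. The function names (pinhole, reprojection_cost) and array conventions are illustrative assumptions, not course code.

```python
import numpy as np

def pinhole(w, Lam, Omega, tau):
    # Project 3D point w (3,) using intrinsics Lam (3x3),
    # rotation Omega (3x3), and translation tau (3,).
    p = Lam @ (Omega @ w + tau)   # homogeneous image coordinates
    return p[:2] / p[2]           # perspective division

def reprojection_cost(w, points, cameras):
    # eta{x} = ||x||_2^2, summed over all J views as on the slide.
    return sum(np.sum((x - pinhole(w, Lam, Omega, tau)) ** 2)
               for x, (Lam, Omega, tau) in zip(points, cameras))
```

Because of the perspective division, this cost is non-linear in w and would need an iterative solver; the next slides show how homogeneous coordinates sidestep that.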

  5. Inferring 3D Points • Optimization problem is inherently non-linear due to the pinhole camera function. • Can be made linear using homogeneous coordinates.

  6. Inferring 3D Points • Write out the j-th pinhole camera model in homogeneous coordinates. • Pre-multiply with the inverse of the intrinsics matrix.

  7. Inferring 3D Points • The last equation gives the projective scale. • Substituting back into the other two equations and re-arranging gives the following system of equations.

  8. Inferring 3D Points • The last equation gives the projective scale. • Substituting back into the other two equations and re-arranging gives the following system of equations (see the sketch below). What is the minimum number of cameras (J)?
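
Following the derivation on slides 6-8, here is a hedged numpy sketch of the resulting linear triangulation (a DLT-style solve); it assumes row conventions for Ω and that the third component of Λ⁻¹[x, y, 1]ᵀ equals 1.

```python
import numpy as np

def triangulate_dlt(xs, Lams, Omegas, taus):
    # Linearly triangulate one 3D point w from J views.
    A, b = [], []
    for x, Lam, Omega, tau in zip(xs, Lams, Omegas, taus):
        # Pre-multiply by the inverse intrinsics -> normalized coordinates.
        u, v, _ = np.linalg.solve(Lam, np.array([x[0], x[1], 1.0]))
        r1, r2, r3 = Omega            # rows of the rotation matrix
        t1, t2, t3 = tau
        # The last equation gives the projective scale (r3.w + t3);
        # substituting it back yields two linear equations per camera.
        A.append(u * r3 - r1); b.append(t1 - u * t3)
        A.append(v * r3 - r2); b.append(t2 - v * t3)
    w, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return w
```

This also answers the slide's question: each camera contributes two equations and w has three unknowns, so the minimum is J = 2.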

  9. Stereo Camera

  10. Stereo Camera

  11. Stereo Camera 6.35 cm

  12. Stereo Camera 6.35 cm Which is better: a wide or a narrow baseline?
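
For a rectified stereo pair, depth and disparity are related by Z = f·B/d (a standard result, not written out on the slides). The sketch below uses the slide's 6.35 cm figure as the baseline B and an assumed focal length to show the trade-off behind the question above.

```python
# Illustrative values: f_px is assumed; 6.35 cm is the slide's figure,
# taken here to be the baseline.
f_px = 700.0    # focal length in pixels (assumed)
B_m = 0.0635    # baseline in metres

def depth_from_disparity(d_px):
    # Rectified stereo: Z = f * B / d.
    return f_px * B_m / d_px

# A wider baseline gives larger disparities for the same depth (better
# depth resolution) but makes correspondence matching harder and
# reduces the overlapping field of view.
for d in (1.0, 2.0, 10.0):
    print(f"disparity {d:4.1f} px -> depth {depth_from_disparity(d):.2f} m")
```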

  13. Stereo Camera

  14. Stereo Camera

  15. Examples in Mobile “Amazon Fire Phone”

  16. Examples in Mobile “Amazon Fire Phone” Why 4 cameras?

  17. Limitations - Texture • The approach only works if an image patch has texture! $A(\Delta\mathbf{x}) = \sum_{\mathbf{x}_k \in \mathcal{N}(\mathbf{x})} \| I(\mathbf{x}_k) - I(\mathbf{x}_k + \Delta\mathbf{x}) \|^2$ [Figure: example image patches and their autocorrelation surfaces $A(\Delta\mathbf{x})$.]

  18. Limitations - Texture • The approach only works if an image patch has texture! $A(\Delta\mathbf{x}) = \sum_{\mathbf{x}_k \in \mathcal{N}(\mathbf{x})} \| I(\mathbf{x}_k) - I(\mathbf{x}_k + \Delta\mathbf{x}) \|^2$ [Figure: example image patches and their autocorrelation surfaces $A(\Delta\mathbf{x})$.]
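
A minimal numpy sketch of the autocorrelation surface A(Δx) defined above; the patch half-width, shift range, and the requirement that x lie away from the image border are illustrative assumptions.

```python
import numpy as np

def autocorrelation_surface(I, x, half=12, max_shift=5):
    # A(dx) = sum_{x_k in N(x)} ||I(x_k) - I(x_k + dx)||^2.
    # A flat surface means the patch is textureless, so stereo
    # matching along the epipolar line is ambiguous.
    r, c = x  # x must be at least half + max_shift from the border
    patch = I[r - half:r + half + 1, c - half:c + half + 1].astype(float)
    A = np.zeros((2 * max_shift + 1, 2 * max_shift + 1))
    for dr in range(-max_shift, max_shift + 1):
        for dc in range(-max_shift, max_shift + 1):
            shifted = I[r - half + dr:r + half + 1 + dr,
                        c - half + dc:c + half + 1 + dc].astype(float)
            A[dr + max_shift, dc + max_shift] = np.sum((patch - shifted) ** 2)
    return A
```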

  19. Today • Stereo Cameras • Structured Light Cameras • Time of Flight (ToF) Camera

  20. Projector vs. Camera

  21. Projector vs. Camera “Camera”

  22. Projector vs. Camera

  23. Projector vs. Camera “Projector”

  24. Depth from Structured Light

  25. Depth from Structured Light How can we get away with one camera?

  26. Depth from Structured Light

  27. Depth from Structured Light
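
The answer to the question on slide 25 is that the projector acts as an "inverse camera": the emitted pattern is known, so its coordinates come for free and one real camera suffices. Under that reading, depth follows the same relation as rectified stereo; the sketch below is illustrative, with assumed values for f_px and B_m.

```python
# Structured light as stereo with a projector in place of one camera.
f_px = 600.0   # focal length in pixels (assumed)
B_m = 0.075    # projector-camera baseline in metres (assumed)

def structured_light_depth(x_cam, x_proj):
    # x_cam: pixel column where a pattern stripe is observed;
    # x_proj: known column the projector emitted it from.
    return f_px * B_m / (x_cam - x_proj)
```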

  28. PrimeSense - Kinect 1.0 Camera (PrimeSensor) What does the pattern look like? • First region: gives a high-accuracy depth surface for near objects (approx. 0.8–1.2 m). • Second region: gives a medium-accuracy depth surface (approx. 1.2–2.0 m). • Third region: gives a low-accuracy depth surface for far objects (approx. 2.0–3.5 m).

  29. Examples in Mobile

  30. ItSeez - App

  31. ItSeez - App

  32. Limitations - Range

  33. Limitations - Defocus [Figure: (a) Scene, (b) Disparity Map]

  34. Limitations - Ambient Light • A sunny day on Earth can reach an irradiance of up to 1120 W m⁻². • A tabletop projector emits on average 10 W of light. [Figure: spectral irradiance (in W m⁻² nm⁻¹) vs. wavelength (in nm), showing extraterrestrial radiation and direct + circumsolar irradiance.]
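
A back-of-the-envelope comparison of the two numbers on the slide, under the assumption (mine, not the slide's) that the projector spreads its 10 W over roughly 1 m² of scene:

```python
sun_irradiance = 1120.0            # W / m^2, peak sunny day (slide)
proj_power = 10.0                  # W, tabletop projector (slide)
proj_area = 1.0                    # m^2, assumed illuminated area
proj_irradiance = proj_power / proj_area

# ~112x: sunlight can swamp the projected pattern, which is why
# structured-light sensors struggle outdoors.
print(f"sun / projector: {sun_irradiance / proj_irradiance:.0f}x")
```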

  35. Today • Stereo Cameras • Structured Light Cameras • Time of Flight (ToF) Camera

  36. Time of Flight Cameras • Light travels at an approximately constant speed, c = 3×10⁸ m s⁻¹. • By measuring the time it takes light to travel a distance, one can infer that distance. • ToF cameras can be categorized into two types: 1. Direct ToF: switch a laser on and off rapidly. 2. Indirect ToF: send out modulated light, then measure the phase difference to infer depth.
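
A minimal sketch of the direct-ToF relation implied by the slide: light makes a round trip, so the one-way distance is d = c·t/2. The example round-trip time is illustrative.

```python
C = 3.0e8  # speed of light, m/s

def direct_tof_distance(round_trip_s):
    # Direct ToF: half the round-trip time times the speed of light.
    return C * round_trip_s / 2.0

# 1 m of range corresponds to a ~6.7 ns round trip -- hence the need
# for very short pulses and high-accuracy timing.
print(direct_tof_distance(6.67e-9))  # ~1.0 m
```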

  37. Direct - TOF • Light Detection And Ranging (LiDAR) is probably the best-known example in computer vision and robotics. • High-energy light pulses limit the influence of background illumination. • However, it is difficult to generate short light pulses with fast rise and fall times. • High-accuracy time measurement is required. • Prone to motion blur. • Expensive.

  38. Direct - TOF • Light Detection And Ranging (LiDAR) is probably the best-known example in computer vision and robotics. • High-energy light pulses limit the influence of background illumination. • However, it is difficult to generate short light pulses with fast rise and fall times. • High-accuracy time measurement is required. • Prone to motion blur. • Expensive.

  39. Direct TOF - Zebedee CSIRO

  40. Direct TOF - Zebedee CSIRO

  41. Indirect - TOF • Continuous light waves instead of short light pulses. • Modulation in terms of the frequency of sinusoidal waves. • The detected wave has a shifted phase after reflection. • The phase shift is proportional to the distance from the reflecting surface. [Figure: an emitter sends a continuous wave (20 MHz) toward a 3D surface; a phase meter measures the phase shift between the emitted and detected waves.]
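
The standard phase-to-distance relation for indirect ToF is d = c·Δφ/(4πf); the relation is textbook material rather than written on the slide, and 20 MHz is the modulation frequency shown in the figure.

```python
import math

C = 3.0e8      # speed of light, m/s
F_MOD = 20e6   # modulation frequency from the slide, Hz

def indirect_tof_distance(phase_shift_rad):
    # d = c * phi / (4 * pi * f); the factor 4*pi accounts for the
    # round trip and the radians-to-phase conversion.
    return C * phase_shift_rad / (4.0 * math.pi * F_MOD)

# Phase wraps at 2*pi, so range is unambiguous only up to
# c / (2 * f) = 7.5 m at 20 MHz.
print(indirect_tof_distance(math.pi))  # 3.75 m
```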

  42. Indirect - TOF

  43. Indirect - TOF

  44. Examples - Mobile

  45. REAL3™ Image Sensor

  46. The Future

  47. The Future
