  1. Okuli: Extending Mobile Interaction Through Near-Field Visible Light Sensing. Chi Zhang, Joshua Tabor, Jialiang Zhang and Xinyu Zhang, Department of Electrical and Computer Engineering, University of Wisconsin-Madison

  2. Touch is a dominant mode of mobile interaction, but on-screen touch input is not always effective!

  3. The screen is multiplexed between display and input: this wastes precious display area, and the on-screen keyboard is hard to use.

  4. The input area depends on device size, which makes on-screen input infeasible on wearable devices.

  5. Lack of physical interaction means no accurate feedback, while a separate input device means extra burden.

  6. These problems can be solved by separating display and input, using passive wireless sensing.

  7. Bridging VLC and touch sensing. Previous solutions: an array of LED/PD pairs (energy hungry, cumbersome); computer vision (heavy computation, obtrusive camera); machine learning (excessive run-time training).

  8. Use PD/LED pairs in a different way. The visible light channel carries no phase information, but its amplitude is fine-grained and deterministic; exploiting it requires a fine-grained model to achieve localization.

  9. Use PD/LED pairs in a different way. Unlike the simple “finger blocking beam” model, a fine-grained propagation model can enable lightweight localization. With such a model and two channels, we can locate the user's finger; this is how Okuli works.

  10. Okuli: overview. A mobile device (e.g., a smartphone) hosts an LED flanked by a left PD and a right PD; the user's finger moves within a workspace in front of the device. (Figure: system layout with LED, left/right PDs, finger, and workspace.)

  11. Okuli: light grooming. For 2D localization we want to limit light to the 2D surface, hence light grooming, which eliminates interference from outside the surface. (Figure: hand and finger on the surface within the PD's FoV.)

  12. Okuli: light grooming. This can be done with tiny lenses attached to the PDs / LED.

  13. Okuli: light grooming. For prototyping we use a 3D-printed shroud. (Figure: left sensor, LED, right sensor.)

  14. Okuli: light grooming. (Figure: horizontal and vertical beam profiles before and after grooming.)

  15. Okuli: channel model. The received signal is affected by multiple factors; factory calibration measures the invariant part, namely the angular responses of the LED and the PD. (Figure: LED and PD angular responses toward the finger.)

  16. Okuli: channel model. The received signal is affected by multiple factors; the model calculates the variant part, namely the propagation loss on both legs and the finger reflectivity. (Figure: LED-to-finger and finger-to-PD propagation loss, plus finger reflectivity.)

  17. Okuli: channel model. Path loss is not simple, since the geometry is not actually only 2D: further away, more of the finger's area is visible, and the model needs to compensate. (Figure: PD FoV over the surface.)

  18. Okuli: channel model. Finger reflectivity can be hard to characterize; we abstract it by the interacting ratio of the beam (the fraction of the beam that hits the finger, versus the non-interacting part), with the overall reflectivity corrected by calibration. (Figure: incident and reflected light at the finger. A sketch of the full model follows this slide.)
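
A minimal Python sketch of the channel model idea from slides 15-18. Everything here is an assumption for illustration, not the paper's exact formulation: it assumes cosine-lobe angular responses for the LED and PD, inverse-square propagation loss on both the LED-to-finger and finger-to-PD legs, and a single per-channel calibration constant `k_cal` that folds in finger reflectivity, LED power, PD gain, and the area-visibility correction. The geometry (LED at the origin facing into the workspace) is also hypothetical.

```python
import math

# Hypothetical device geometry (centimeters): LED at the origin, facing +y into the workspace.
LED_POS = (0.0, 0.0)
LED_DIR = (0.0, 1.0)

def _angle_from_axis(src, axis, dst):
    """Angle between the facing axis at `src` and the direction from `src` to `dst`."""
    vx, vy = dst[0] - src[0], dst[1] - src[1]
    d = math.hypot(vx, vy)
    a = math.hypot(axis[0], axis[1])
    cos_a = (vx * axis[0] + vy * axis[1]) / (d * a)
    return math.acos(max(-1.0, min(1.0, cos_a)))

def expected_rss(finger, pd_pos, pd_dir, k_cal=1.0, m=1):
    """Model-predicted RSS for a finger at `finger`, seen by a PD at `pd_pos`.

    Assumed form: RSS = K * cos^m(theta_led) * cos^m(theta_pd) / (d_led^2 * d_pd^2),
    where K is a per-channel calibration constant with reflectivity and the
    area-visibility compensation folded in (an assumption, not the paper's exact model).
    """
    d_led = math.hypot(finger[0] - LED_POS[0], finger[1] - LED_POS[1])
    d_pd = math.hypot(finger[0] - pd_pos[0], finger[1] - pd_pos[1])
    theta_led = _angle_from_axis(LED_POS, LED_DIR, finger)
    theta_pd = _angle_from_axis(pd_pos, pd_dir, finger)
    return (k_cal
            * max(0.0, math.cos(theta_led)) ** m
            * max(0.0, math.cos(theta_pd)) ** m
            / (d_led ** 2 * d_pd ** 2))
```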

  19. Okuli: interference canceling. Surrounding light sources can be much stronger than the desired RSS, but they are not “coherent” with our light emission, so we modulate our own emission with OOK; this also helps save energy (a minimal sketch of the differencing appears below).
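
A minimal Python sketch of the OOK-based ambient light rejection idea from slide 19. The sampling hooks `read_pd()` and `set_led()` and the averaging over a fixed number of on/off pairs are assumptions for illustration; the point is only that ambient light, which does not follow our on/off pattern, cancels in the on-minus-off difference.

```python
def demodulate_ook(read_pd, set_led, n_pairs=32):
    """Estimate the LED-induced RSS by differencing LED-on and LED-off samples.

    `read_pd()` returns one PD sample and `set_led(state)` switches the LED;
    both are hypothetical hooks into the sensing hardware. Ambient light
    contributes roughly equally to the on and off samples, so it cancels in
    the difference, leaving only our own modulated emission.
    """
    acc = 0.0
    for _ in range(n_pairs):
        set_led(True)
        on = read_pd()
        set_led(False)
        off = read_pd()
        acc += on - off
    return acc / n_pairs
```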

  20. Okuli: interference canceling. Background reflection cannot be removed by modulation, but it is usually slow-changing and not very strong. Spatial solution: a narrow vertical FoV. Temporal solution: dynamic estimation and removal, which identifies and tracks the background and also detects clicks (see the sketch below).
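
A minimal Python sketch of the temporal background removal from slide 20. The exponential moving average, the threshold, and the way finger reflection is separated from background are assumptions for illustration; the paper's actual estimator and click detector may differ.

```python
class BackgroundCanceller:
    """Track a slow-changing background RSS and subtract it from new samples.

    Samples close to the tracked background update the estimate slowly; a
    sample well above it is treated as finger reflection and reported with
    the background removed (a sudden sustained jump could likewise be flagged
    as a click). `alpha` and `threshold` are hypothetical tuning constants.
    """

    def __init__(self, alpha=0.02, threshold=0.05):
        self.alpha = alpha
        self.threshold = threshold
        self.background = None

    def cancel(self, rss):
        if self.background is None:
            self.background = rss
        if rss - self.background < self.threshold:
            # Looks like background: let the estimate track it slowly.
            self.background += self.alpha * (rss - self.background)
            return 0.0
        # Looks like finger reflection: report the background-free component.
        return rss - self.background
```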

  21. Okuli: interference canceling. (Figures: RSS with and without cancellation at two locations, under ambient light conditions of dark room, fluorescent light, diffusive sunlight, and direct sunlight; and under background conditions of no background, white paper, static background, and dynamic background.) Cancellation is effective in most cases.

  22. Okuli: localization. For each candidate point, the model produces an expected RSS; measured samples are compared against these values, and the location with minimum RSS error is selected (see the sketch below).
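
A minimal Python sketch of the localization step on slide 22, reusing the hypothetical `expected_rss()` model sketched earlier: precompute the expected left and right RSS over a grid of candidate points, then pick the point whose predicted pair is closest to the measured pair. The grid spacing, PD placement, and squared-error metric are assumptions for illustration.

```python
# Hypothetical PD placement (cm) and facing direction, matching the model sketch above.
PD_LEFT = ((-1.0, 0.0), (0.3, 1.0))
PD_RIGHT = ((1.0, 0.0), (-0.3, 1.0))

def build_grid(x_max=8.0, y_max=8.0, step=0.25):
    """Precompute (point, expected left RSS, expected right RSS) over the workspace."""
    grid = []
    x = step
    while x <= x_max:
        y = step
        while y <= y_max:
            p = (x - x_max / 2.0, y)   # center the grid on the LED axis
            grid.append((p, expected_rss(p, *PD_LEFT), expected_rss(p, *PD_RIGHT)))
            y += step
        x += step
    return grid

def locate(measured_left, measured_right, grid):
    """Return the candidate point whose modeled RSS pair best matches the measurement."""
    best, best_err = None, float("inf")
    for point, left, right in grid:
        err = (left - measured_left) ** 2 + (right - measured_right) ** 2
        if err < best_err:
            best, best_err = point, err
    return best
```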

  23. Prototyping Okuli. A 3D-printed shroud controls the FoV; an Arduino drives the LED and samples the PDs; Bluetooth connects Okuli to mobile devices; the mobile device runs the algorithm (a host-side sketch follows).
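
A minimal Python sketch of the host side of the prototype on slide 23, using pyserial over a Bluetooth serial port. The port name and the line format (two comma-separated, already-demodulated RSS values per line) are assumptions about the Arduino firmware, not something specified in the slides; the sketch simply chains the earlier cancellation and localization sketches.

```python
import serial  # pyserial

def run_host(port="/dev/rfcomm0", baud=115200):
    """Read left/right RSS pairs streamed by the Arduino and localize the finger.

    Hypothetically assumes the firmware prints "left,right\n" per sample;
    background cancellation and grid-search localization then run on the host,
    as in the sketches above.
    """
    grid = build_grid()
    cancel_l, cancel_r = BackgroundCanceller(), BackgroundCanceller()
    with serial.Serial(port, baud, timeout=1) as link:
        while True:
            line = link.readline().decode(errors="ignore").strip()
            if not line:
                continue
            try:
                left, right = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed lines
            point = locate(cancel_l.cancel(left), cancel_r.cancel(right), grid)
            print("finger at", point)
```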

  24. Performance. (Figures: CDFs of localization error up to 3 cm, across different surfaces, black paper, white paper, and glass, and across time over 10 days.) Okuli is consistent across different surfaces and over time.

  25. Performance. (Figure: keypad accuracy for 7 users on a 20-key keypad, ranging from 87.5% to 94.1%.) Okuli is consistent across different users.

  26. Performance. (Figures: handwriting recognition accuracy for Okuli versus a capacitive touchscreen across 3 users, roughly 90% to 95% for both; and a sample trackpad trace over an 8 cm by 8 cm workspace.) Okuli's performance is comparable with capacitive touch screens.

  27. Performance. Most energy is consumed by light emission, which can be duty-cycled to reduce it; processing costs very little, giving a smooth UI and a good user experience. (Figure: power consumption in mW, broken into LED, CPU, and ADC, versus duty cycles from 0.1 to 0.5.)

  28. Conclusion ● A fine-grained light propagation model can enable accurate near-field visible light localization ● Multiple types of interference exist in the visible light channel, and they can be effectively canceled ● The visible light channel allows us to achieve centimeter-grade passive localization with a compact system

  29. Thank you!
