Environment Awareness for Low-Vision Patients
IntelliSight Team
Xinyuan Zhang: IMU Software Integration
Rafael Carranza: Camera Software Integration
Vanessa Mejia: PCB and Mount Design
Overview
When we are out in the world, we understand our surroundings using both global and local context.
● Global Context → Where we are in the world
● Local Context → What objects are in our surroundings
[Figure: Global Context vs. Local Context]
The Problem
● According to the World Health Organization, there are 285 million low-vision individuals in the world.
● They rely on their senses and on the people around them to understand their local context.
● Although technology has come a long way, it is still unable to help them understand their surroundings quickly and easily.
The Solution
IntelliSight solves this problem by developing a pair of smart sunglasses that uses:
● Visual information from a camera
● Orientation information from an IMU
● Location information from GPS
Hardware
Block Diagram
Microcontroller ESP32: ● Interfaces our camera and IMU sensor ● Data → Android phone via Bluetooth ● Onboard USB-to-Serial converter ● Operating voltage: 3.7 V
Data Collection
IMU: BNO055
● Collects orientation data
● Captures gestures
● Operating voltage: 3.3 V
● Connects to the ESP32 via I2C
Camera: ArduCam Mini 2MP
● Takes pictures of the user’s surrounding environment
● Operating voltage: 5 V
● Connects to the ESP32 via SPI
Power Supply
LiPo Battery:
● Output voltage: 3.7 V
● Powers the PCB
PowerBoost 500C:
● Takes 3.7 V as input and outputs 5 V
● Powers the camera
Printed Circuit Board
PCB Schematic
PCB Layout
Final PCB
Software
Software
Camera Mode: object capture → object detection → text-to-speech
IMU Mode: gesture detection → building detection → text-to-speech
Camera Mode
Camera Mode
● Captures pictures of objects in the user’s surrounding environment
● The phone application identifies the objects in the pictures using TensorFlow Lite
● Relays the information using text-to-speech
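A minimal Python sketch of the last two steps: turning detection results into one sentence for text-to-speech. The `describe_objects` helper, the example labels, and the 0.5 confidence cutoff are illustrative assumptions, not the app’s actual code.

```python
# Illustrative sketch: convert (label, confidence) detection results
# into a single sentence to hand to the text-to-speech engine.
# The 0.5 confidence threshold is an assumption for illustration.

def describe_objects(detections, min_confidence=0.5):
    """Build a spoken description from (label, confidence) pairs."""
    labels = [label for label, conf in detections if conf >= min_confidence]
    if not labels:
        return "No objects detected"
    return "I see " + ", ".join(labels) + " around you"

print(describe_objects([("chair", 0.91), ("person", 0.74), ("cat", 0.32)]))
```

Filtering by confidence before speaking keeps the audio output short, which matters when results are relayed continuously.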
IMU Mode
IMU Mode
Detects gesture → Collects bearing data → Transmits bearing to app via Bluetooth → Scans along the bearing to detect landmark
IMU Mode: Nodding
A nod is detected when the pitch difference exceeds a threshold within a time window.
Axes: x = roll, y = pitch, z = yaw
[Figure: pitch vs. time, with difference and time thresholds]
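The pitch-difference / time-threshold idea on this slide can be sketched in a few lines of Python. The 15° and 1 s values are illustrative assumptions, not the team’s tuned parameters.

```python
def detect_nod(samples, pitch_threshold=15.0, time_window=1.0):
    """Flag a nod from IMU pitch samples.

    samples: list of (timestamp_s, pitch_deg), in time order.
    A nod is flagged when the pitch changes by at least
    pitch_threshold degrees within time_window seconds.
    Threshold values here are illustrative assumptions.
    """
    for i, (t0, p0) in enumerate(samples):
        for t1, p1 in samples[i + 1:]:
            if t1 - t0 > time_window:
                break  # later samples are outside this window
            if abs(p1 - p0) >= pitch_threshold:
                return True
    return False
```

A fast pitch swing (a nod) trips the threshold, while slow head movement over several seconds does not.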
IMU Mode: Gesture Detection
● Azimuth: select NDOF mode; readings in the range [0, 360) degrees
IMU Mode: Building Detection
Starting from the user’s GPS location, search for a building 2 m along the bearing.
[Figure: bearing from the user’s GPS location; building search at 2 m]
IMU Mode: Building Detection
If the building search at 2 m fails, step further along the bearing and run the next building search at 6 m.
[Figure: building search failed at 2 m; next building search at 6 m]
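The stepped search needs the coordinates of points 2 m and 6 m from the user along the bearing. A Python sketch using an equirectangular approximation (adequate over a few metres; the function name and example coordinates are assumptions):

```python
import math

def point_along_bearing(lat, lon, bearing_deg, distance_m):
    """Approximate lat/lon of the point distance_m metres from
    (lat, lon) along bearing_deg (0 = north, 90 = east).

    Uses an equirectangular approximation, which is fine for the
    few-metre steps used in the building search.
    """
    R = 6371000.0  # mean Earth radius in metres
    b = math.radians(bearing_deg)
    dlat = distance_m * math.cos(b) / R
    dlon = distance_m * math.sin(b) / (R * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

# Candidate search points at 2 m, then 6 m, along the user's bearing
# (coordinates are hypothetical):
for d in (2.0, 6.0):
    print(point_along_bearing(34.4140, -119.8489, 90.0, d))
```

Each candidate point is then passed to the building lookup; if the 2 m query fails, the 6 m point is tried next.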
IMU Mode: Value Returned
Return the final result in voice:
Distance | Output
< 30 m   | “X is in front of you”
30 - 80 m | “X is close to you”
> 80 m   | “No building nearby”
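The distance-to-phrase table maps directly to a small lookup; a minimal sketch (function name assumed):

```python
def voice_output(name, distance_m):
    """Map building distance to the spoken phrase from the table."""
    if distance_m < 30:
        return f"{name} is in front of you"
    if distance_m <= 80:
        return f"{name} is close to you"
    return "No building nearby"
```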
IMU Mode: Further Development
● Higher accuracy in determining the landmark
○ Improved compass accuracy
○ Better state estimation
Final Product
Final Prototype
A special thanks to: Yogananda Isukapalli Aditya Wadaskar Kyle Douglas
Questions?