

  1. PROJECT ARTEMIS VISUAL NAVIGATION FOR FLYING ROBOTS Mohammed Kabir

  2. PROJECT ARTEMIS STATE OF THE INDUSTRY • Highly dependent on GPS in assisted modes. • Requires sufficient piloting skill in non-GPS-assisted modes. • Risk of ‘flyaways’ due to poor GPS reception. • Immediate need for robust, GPS-agnostic navigation methods. • No environmental awareness. • Not truly autonomous.

  3. PROJECT ARTEMIS CHALLENGES • Multicopters are highly dynamic systems. • They are inherently unstable and require active control strategies for stable flight. • System dynamics are coupled and fast. • Only limited onboard computing and sensing hardware can be carried. • QoS of wireless datalinks to the MAV cannot be relied on in all environments.

  4. PROJECT ARTEMIS THE NAVIGATION PROBLEM • Perception • Localisation • Planning • Control • Operator interface

  5. PROJECT ARTEMIS LOCALISATION • We use a SLAM (Simultaneous Localisation and Mapping) technique on our robot. • Visual SLAM is globally consistent, accurate to the centimetre level (unlike GPS), and works both indoors and outdoors. • Tight fusion with time-synchronised inertial measurements greatly increases robustness and accuracy, as the sketch below illustrates.
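An illustrative sketch of why the inertial fusion helps (this is not the Artemis estimator; the rates, noise and bias values are invented): high-rate accelerometer integration drifts because of bias and noise, while slower, drift-free visual position fixes repeatedly pull the estimate back.

```python
# Toy fusion loop (not the Artemis estimator): dead-reckon a biased, noisy
# accelerometer at 200 Hz and correct the drifting position with 10 Hz
# "visual" fixes. The real estimator also corrects velocity and IMU biases.
import numpy as np

rng = np.random.default_rng(0)
dt, steps = 0.005, 2000                      # 200 Hz IMU, 10 s of motion
true_pos = true_vel = 0.0
est_pos = est_vel = 0.0

for k in range(steps):
    true_acc = np.sin(0.5 * k * dt)          # smooth reference motion
    true_vel += true_acc * dt
    true_pos += true_vel * dt

    acc_meas = true_acc + 0.05 + rng.normal(0.0, 0.02)   # bias + noise
    est_vel += acc_meas * dt                 # inertial dead-reckoning drifts
    est_pos += est_vel * dt

    if k % 20 == 0:                          # 10 Hz visual position fix
        vis_pos = true_pos + rng.normal(0.0, 0.02)       # cm-level accuracy
        est_pos += 0.5 * (vis_pos - est_pos) # simple complementary correction

print(f"position error with visual correction: {abs(est_pos - true_pos):.3f} m")
```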

  6. PROJECT ARTEMIS VISUAL-INERTIAL LOCALISATION

  7. PROJECT ARTEMIS PERCEPTION • Multiple cameras provide exteroceptive information about the environment. • All cameras and the IMU (Inertial Measurement Unit) are time-synchronised with respect to each other. • Forward stereo cameras are used to compute depth images in real time. • The depth images are used to build a 3D map of the environment incrementally, as sketched below.
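A minimal sketch of the incremental-mapping step (hypothetical, not the Artemis mapping code): each valid pixel of a depth image is back-projected through a pinhole camera model and binned into a voxel set. The intrinsics, voxel size and the axis-aligned camera pose are placeholder assumptions.

```python
# Fold one depth image into an incrementally built occupancy set of voxels.
import numpy as np

FX = FY = 300.0          # assumed focal lengths in pixels
CX, CY = 160.0, 120.0    # assumed principal point (320x240 depth image)
VOXEL = 0.1              # 10 cm voxels

def integrate_depth(depth, cam_pos, occupied):
    """Back-project a depth image taken at cam_pos and add hit voxels."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = np.isfinite(depth) & (depth > 0.2) & (depth < 10.0)
    z = depth[valid]
    x = (u[valid] - CX) * z / FX
    y = (v[valid] - CY) * z / FY
    points = np.stack([x, y, z], axis=1) + cam_pos   # camera assumed axis-aligned
    for key in map(tuple, np.floor(points / VOXEL).astype(int)):
        occupied.add(key)                            # mark voxel as occupied

occupied = set()
fake_depth = np.full((240, 320), 2.0)                # flat wall 2 m ahead
integrate_depth(fake_depth, cam_pos=np.zeros(3), occupied=occupied)
print(f"map now holds {len(occupied)} occupied voxels")
```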

  8. PROJECT ARTEMIS AUTONOMOUS EXPLORATION AND MAPPING

  9. PROJECT ARTEMIS SENSING SUITE (Credit: Stephan Weiss, PhD thesis, 2012) • IMU: high framerate (captures fast dynamics), drifts fast (temporal). • Cameras, laser rangers: medium framerate, medium (spatial) drift; used for drift correction. • GPS, compass: low framerate, no drift; used for drift correction. • The ideal sensor would combine a high framerate with no drift.

  10. PROJECT ARTEMIS STATE ESTIMATION • The system is designed to navigate using all available sensors in the environment: GPS, vision and lidar. • Sensor availability is not guaranteed, so a modular sensor fusion approach is used, based on a hybrid Kalman filter with fault detection (sketched below). • Even if a particular subset or module fails, overall system performance is not compromised.
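A minimal sketch of the modular-fusion idea with fault detection (not the actual hybrid filter): each sensor module's measurement is gated on its normalised innovation, so a glitching module is rejected while the remaining sensors keep updating the state. The 1-D altitude state and the noise values are invented for illustration.

```python
# Toy 1-state Kalman filter over altitude, fed by a "GPS" and a "vision" module.
class FusedAltitude:
    def __init__(self):
        self.x, self.P = 0.0, 100.0          # start with a very uncertain state
        self.q = 0.01                        # process noise per prediction step

    def predict(self):
        self.P += self.q                     # constant-altitude motion model

    def update(self, z, r, gate=9.0):
        """Fuse measurement z with variance r; reject it if it fails the gate."""
        s = self.P + r                       # innovation variance
        nis = (z - self.x) ** 2 / s          # normalised innovation squared
        if nis > gate:                       # ~3-sigma gate -> treat as faulty
            return False                     # rejected; other modules carry on
        k = self.P / s                       # Kalman gain
        self.x += k * (z - self.x)
        self.P *= 1.0 - k
        return True

f = FusedAltitude()
for z_gps, z_vis in [(10.2, 10.0), (55.0, 10.1), (9.9, 10.05)]:
    f.predict()
    ok_gps = f.update(z_gps, r=1.0)          # 55.0 simulates a GPS glitch
    ok_vis = f.update(z_vis, r=0.05)
    print(f"alt={f.x:5.2f}  gps_used={ok_gps}  vision_used={ok_vis}")
```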

  11. PROJECT ARTEMIS ROBUST MULTI-SENSOR FUSION

  12. PROJECT ARTEMIS PLANNING AND CONTROL • The global volumetric map is used to continuously compute a collision-free trajectory for the vehicle. • Assisted modes: the planner only intervenes if the operator’s high-level position commands would lead to a collision (see the sketch below). • Autonomous modes: the planner computes optimal trajectories to completely explore the environment.
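A minimal sketch of assisted-mode intervention (hypothetical, not the Artemis planner): the operator's commanded position is checked against the volumetric map, and the setpoint is clamped just short of the first occupied voxel along the way. The map contents, voxel size and step length are placeholder values.

```python
# Clamp the operator's commanded motion at the first occupied voxel.
import numpy as np

VOXEL = 0.1
occupied = {(round(2.0 / VOXEL), 0, 0)}      # one obstacle voxel 2 m ahead

def is_free(p):
    """True if the voxel containing point p is not marked occupied."""
    return tuple(np.floor(p / VOXEL).astype(int)) not in occupied

def assisted_setpoint(pos, cmd, step=0.05):
    """Walk toward the commanded position, stopping before any occupied voxel."""
    direction = cmd - pos
    dist = np.linalg.norm(direction)
    if dist < 1e-6:
        return pos
    direction /= dist
    safe = pos.copy()
    for s in np.arange(step, dist + step, step):
        p = pos + direction * min(s, dist)
        if not is_free(p):
            return safe                      # intervene: clamp short of obstacle
        safe = p
    return cmd                               # whole segment free: pass through

pos = np.zeros(3)
print(assisted_setpoint(pos, np.array([3.0, 0.0, 0.0])))   # clamped near the wall
print(assisted_setpoint(pos, np.array([0.0, 1.0, 0.0])))   # unobstructed command
```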

  13. PROJECT ARTEMIS OPERATOR INTERFACE • We use a single laptop and a tablet for our ground control system. • A long-range Ubiquiti modem is used as the primary air-to-ground datalink. • The laptop runs SLAM visualisation and the tablet runs live FPV (first-person view). The operator can use this high-definition feed to fly.

  14. PROJECT ARTEMIS VEHICLE PLATFORM • PX4 Autopilot • Intel i7 computer • Rocket M5 (wireless link) • Propulsion system • Stereo cameras (time-synchronised)

  15. PROJECT ARTEMIS NAVIGATION PIPELINE

  16. PROJECT ARTEMIS VEHICLE PLATFORM • The current-generation developmental prototype was designed over multiple iterations, building on top of our previous vision-based MAVs. • Intel Core i7 onboard computer running Ubuntu 14.04 Server. • Pixhawk autopilot running the PX4 flight stack. • Ubiquiti Rocket M5 long-range wireless datalink. • Forward-facing stereo cameras, a downward-facing optical-flow camera and a separate monocular camera. • All low-level sensors such as GPS and compass, as well as the actuator controllers (ESCs), are interfaced via the CAN bus.

  17. PROJECT ARTEMIS SOFTWARE FRAMEWORK • The software architecture follows a high-level / low-level separation model for maximum reliability. The flight core is isolated from application-level processing to keep core vehicle operation stable regardless of the high-level system state. • Low-level tasks critical to flight control, such as motor actuation and attitude estimation, run on the PX4 middleware on the NuttX RTOS. • High-level tasks such as computer vision run on the onboard Linux computer, on top of the ROS (Robot Operating System) middleware, as in the sketch below.
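A minimal sketch of the high-level side of that split, written as a ROS 1 node in Python. The slides name ROS and the PX4 flight stack but not the bridge between them, so the MAVROS setpoint topic used here is an assumption; everything flight-critical stays on the autopilot.

```python
# Hypothetical high-level node: publish position setpoints toward the flight
# core. Attitude control, motor actuation and failsafes remain on PX4/NuttX.
import rospy
from geometry_msgs.msg import PoseStamped

def main():
    rospy.init_node("artemis_highlevel_demo")
    pub = rospy.Publisher("/mavros/setpoint_position/local",
                          PoseStamped, queue_size=1)
    rate = rospy.Rate(20)                    # stream setpoints at 20 Hz
    sp = PoseStamped()
    sp.pose.position.z = 1.5                 # hold 1.5 m altitude over origin
    while not rospy.is_shutdown():
        sp.header.stamp = rospy.Time.now()
        sp.header.frame_id = "map"
        pub.publish(sp)
        rate.sleep()

if __name__ == "__main__":
    main()
```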

  18. PROJECT ARTEMIS THANK YOU! Visit www.uasys.io/research for more! Our open-source software stack is available at www.github.com/ProjectArtemis
