  1. Autonomous Robotic Projects at Cyber Physical Systems Group Oliver Höftberger, Vienna University of Technology (Austria) 04/12/2013

  2. Outline • Autonomous Systems • Robotic Equipment • Projects and Problems to solve • Outlook

  3. Autonomous Systems • Autonomous systems perform actions towards a goal with a high degree of autonomy, i.e. without human interaction. • The system needs the ability to • Gain information about the environment • Plan actions to reach the goal • Move and interact with the environment • Collaborate with other systems

  4. Robotic Equipment – Robots • 3 x MobileRobots Pioneer 3-AT • External Features: • SICK LMS 100 Laser Scanner • 0.5 – 20 m operating range • 270° field of view • Canon VC-C50i PTZ Analog Camera • UHF RFID Reader • Cyton Gamma 300 Manipulator Arm • 300 g payload • 53.4 cm total reach • Sonar Distance Sensors • Bumper Switches

  5. Robotic Equipment – Embedded Computers • Mamba Dual-Core, 2.26 GHz, 2 GB RAM, 60 GB SSD Drive • CARMA GPU Development Kit • NVIDIA Tegra 3 ARM Cortex-A9 Quad-Core, 2 GB RAM • NVIDIA Quadro 1000M with 96 CUDA Cores, 2 GB RAM • 120 GB SSD Drive • WiFi and Ethernet Interfaces • Ubuntu Linux Operating System

  6. Robotic Equipment – Sensors • Proprietary Sensor Platform • Raspberry Pi Model B, 700 MHz, 512 MB RAM • Sensors: • 3 x 3D Acceleration Sensors • 3D Gyroscopes • Digital Compass • Temperature Sensor • Pressure Sensor

  7. Robotic Equipment – Quadcopters • 2 x AscTec Pelican Drones • Linux Operating System 1) 1.6 GHz Intel Atom Processor Board, Laser Scanner with 0.06 – 4 m range 2) 2.1 GHz Intel Core i7 Quad-Core Board, CMOS Camera • 3 x Parrot AR.Drone 2.0 • Front (720p) and Floor (QVGA) Cameras • Sonar Distance Sensors • Controllable via Smartphone App

  8. Robot Operating System (ROS) 1/2 • Software framework for robots providing OS-like functionality on a heterogeneous computer cluster • Developed in 2007 at the Stanford Artificial Intelligence Laboratory • Now further developed by Willow Garage • Seamless distribution of nodes • Linux, Windows, and Mac OS X support • Implemented in C++ and Python, but other languages are supported • Many ROS packages available (e.g., perception, planning, control)

  9. Robot Operating System (ROS) 2/2 • Service-oriented architecture • Publish-subscribe communication pattern • Node creation and destruction at runtime • Module-based development
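
As a rough illustration of the publish-subscribe pattern listed above, the following minimal rospy node publishes and subscribes on the same topic. This is only a sketch: the topic name, message type, and 1 Hz rate are illustrative assumptions, not details from the slides.

#!/usr/bin/env python
# Minimal ROS publish-subscribe sketch (rospy). The topic name
# "/demo_topic" and the 1 Hz rate are illustrative assumptions.
import rospy
from std_msgs.msg import String

def callback(msg):
    # Subscriber side: react to data published on the topic.
    rospy.loginfo("received: %s", msg.data)

def main():
    rospy.init_node("demo_node")
    pub = rospy.Publisher("/demo_topic", String, queue_size=10)
    rospy.Subscriber("/demo_topic", String, callback)
    rate = rospy.Rate(1)  # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data="hello from demo_node"))
        rate.sleep()

if __name__ == "__main__":
    main()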

  10. Mapping, Localization and Planning • Mapping: creation of a map of an unknown environment • Localization: determination of the robot's location within a given map • Simultaneous Localization and Mapping (SLAM) • Planning: organizing a sequence of actions to reach a goal
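
To make the planning step concrete, here is a minimal sketch of path planning as a search over a 2-D occupancy grid using breadth-first search. The grid, start, and goal are made-up example values; the planners actually used by the group are not described in the slides.

from collections import deque

def plan_path(grid, start, goal):
    # Breadth-first search on a 2-D occupancy grid (0 = free, 1 = obstacle).
    # Returns a list of cells from start to goal, or None if unreachable.
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

# Toy example: route around a small obstacle.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 1, 0]]
print(plan_path(grid, (0, 0), (2, 2)))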

  11. Probabilistic Information in Maps • Types of maps: • Static maps (e.g., street map) • Dynamic maps (e.g., weather map) • Probabilistic maps • Regions marked as possible obstacles (e.g., doors, objects, persons, …) • Improved localization and action planning
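
One common way to store such probabilistic obstacle information is an occupancy grid updated in log-odds form, sketched below. The sensor-model values are illustrative assumptions; the slides do not specify the representation the group uses.

import numpy as np

# Log-odds occupancy grid: each cell stores log(p / (1 - p)).
# The sensor model (0.7 for a hit, 0.4 for a miss) is an assumed example.
L_HIT = np.log(0.7 / 0.3)    # evidence that a cell is occupied
L_MISS = np.log(0.4 / 0.6)   # evidence that a cell is free

grid = np.zeros((100, 100))  # log-odds 0 corresponds to probability 0.5 (unknown)

def update_cell(grid, r, c, hit):
    # Accumulate evidence for one cell from one measurement.
    grid[r, c] += L_HIT if hit else L_MISS

def probability(grid):
    # Convert log-odds back to occupancy probabilities.
    return 1.0 - 1.0 / (1.0 + np.exp(grid))

# A door that is sometimes open and sometimes closed stays near 0.5,
# marking the region as a possible obstacle rather than a certain one.
for observed_closed in (True, False, True, False):
    update_cell(grid, 10, 20, observed_closed)
print(probability(grid)[10, 20])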

  12. Mapping Dynamic Areas 1/2

  13. Mapping Dynamic Areas 2/2

  14. Localization with Particle Filter • Particle: a possible location of the robot

  15. Localization with Particle Filter
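
The two slides above illustrate particle-filter localization; a minimal 1-D sketch of the idea follows. Each particle is one hypothesized robot position, weighted by how well a simulated range measurement fits, then resampled. All numbers (corridor length, noise levels) are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Each particle is one hypothesized robot position in a 1-D corridor.
N = 500
particles = rng.uniform(0.0, 20.0, N)   # corridor length of 20 m is assumed
weights = np.ones(N) / N

true_position = 7.0        # unknown to the filter, used only to simulate data
motion = 1.0               # commanded forward motion per step (m)
sensor_sigma = 0.5         # assumed measurement noise (m)

for _ in range(10):
    # 1. Motion update: move every particle and add process noise.
    particles += motion + rng.normal(0.0, 0.2, N)
    true_position += motion

    # 2. Measurement update: weight particles by the likelihood of a
    #    simulated range reading to a wall at x = 20 m.
    z = (20.0 - true_position) + rng.normal(0.0, sensor_sigma)
    expected = 20.0 - particles
    weights = np.exp(-0.5 * ((z - expected) / sensor_sigma) ** 2)
    weights /= weights.sum()

    # 3. Resampling: keep likely particles, drop unlikely ones.
    particles = particles[rng.choice(N, N, p=weights)]

print("estimate:", particles.mean(), "truth:", true_position)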

  16. Vision-based Sensors • Determine • Motion of the vehicle • Rotation of the vehicle • Optical flow or FFT-based method • Adaptation to ground surface quality and driving situation • GPU implementation
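
One common FFT-based method for estimating image translation between two camera frames is phase correlation; the NumPy sketch below illustrates the idea on synthetic data. It is only an illustration of the principle, and the GPU implementation mentioned on the slide is not reproduced.

import numpy as np

def phase_correlation(curr, prev):
    # Estimate how far `curr` is translated relative to `prev`
    # using phase correlation (an FFT-based method).
    cross = np.fft.fft2(curr) * np.conj(np.fft.fft2(prev))
    cross /= np.abs(cross) + 1e-12           # keep only phase information
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peak indices beyond half the image size wrap to negative shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Toy example: shift a random texture by (3, 5) pixels and recover the motion.
rng = np.random.default_rng(1)
frame0 = rng.random((64, 64))
frame1 = np.roll(frame0, shift=(3, 5), axis=(0, 1))
print(phase_correlation(frame1, frame0))   # expected output: (3, 5)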

  17. Sensor Fusion "The integration of information from multiple sources to produce specific and comprehensive unified data about an entity." [Hal97] • Increase accuracy of sensor measurements • Generic Sensor Fusion and Filtering Framework implemented as ROS packages • Voting • Averaging • Kalman Filters • …
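
As a rough illustration of two of the listed fusion strategies (averaging over redundant sensors and a scalar Kalman filter), here is a small Python sketch. It is not the group's ROS framework; all numeric values are assumed example data.

import numpy as np

def fuse_by_average(readings):
    # Averaging: combine redundant readings of the same quantity.
    return float(np.mean(readings))

class ScalarKalman:
    # Minimal 1-D Kalman filter for a slowly changing quantity,
    # e.g. a distance estimate fused over time.
    def __init__(self, x0, p0, process_var, meas_var):
        self.x, self.p = x0, p0
        self.q, self.r = process_var, meas_var

    def update(self, z):
        self.p += self.q                   # predict: state assumed constant
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

# Example: average three redundant distance sensors, then filter over time.
print(fuse_by_average([2.02, 1.97, 2.05]))
kf = ScalarKalman(x0=2.0, p0=1.0, process_var=0.01, meas_var=0.25)
for z in [2.1, 1.9, 2.05, 2.0]:
    print(kf.update(z))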

  18. Dynamic Reconfiguration • System Ontology • Machine-readable model of a system • Interdependencies between system properties • Substitution of Failed Services • Increased system dependability • Automatic exploitation of redundancy • Automatic Sensor Fusion and Filtering
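
A highly simplified sketch of the service-substitution idea follows: an ontology-like table records which services can deliver a given property, so that a failed service can be replaced automatically by a redundant one. The property and service names are invented for illustration and do not come from the slides.

# Sketch of ontology-driven substitution of failed services.
# Property and service names are illustrative assumptions.
SERVICE_ONTOLOGY = {
    # property -> services able to provide it, in order of preference
    "distance_front": ["laser_scanner", "sonar_front"],
    "heading": ["digital_compass", "gyro_integration"],
}

failed_services = {"laser_scanner"}   # e.g. reported by a health monitor

def resolve(prop):
    # Return a working service for the requested property,
    # automatically exploiting the redundancy listed in the ontology.
    for service in SERVICE_ONTOLOGY.get(prop, []):
        if service not in failed_services:
            return service
    raise RuntimeError("no working service provides '%s'" % prop)

print(resolve("distance_front"))   # falls back to "sonar_front"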

  19. Communication between Autonomous Systems • Car2Car, Car2Infrastructure, ... • E.g., used to optimize road traffic • Automatic data exchange upon system encounter • Avoidance of data overflow • Validity of data • Temporal validity • Data-dependent conditions Source: http://antyweb.pl/samochody-beda-rozmawiac-miedzy-soba-nadchodzaca-nowosc-od-mercedesa/
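
A small sketch of the temporal-validity idea: data exchanged between systems carries a timestamp and a validity window, and stale items are discarded before use. The field names and the window length are assumptions for illustration.

import time

def is_valid(item, now=None):
    # An item may only be used while its validity window has not expired.
    now = time.time() if now is None else now
    return now - item["timestamp"] <= item["valid_for"]

# Example item as it might be received from another vehicle (assumed format).
received = {"topic": "traffic_ahead", "payload": "congestion at km 12",
            "timestamp": time.time() - 1.0, "valid_for": 2.0}

if is_valid(received):
    print("use:", received["payload"])
else:
    print("discard stale data")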

  20. Communication Framework

  21. Autonomous Collaboration (Outlook) • Collaborative actions to reach a common goal • Interaction of robots with different capabilities (e.g., rovers, drones) • Example scenarios: 1. One rover uses its camera to detect an object; a second rover uses its robot arm to pick up the object 2. A drone inspects the terrain to guide a rover through it

  22. Questions? ... thank you! Oliver Höftberger – oliver.hoeftberger@tuwien.ac.at
