  1. CS 309: Autonomous Intelligent Robotics FRI I Lecture 17: Starting the Robot Coordinate Frames TF2 Forming Project Groups Instructor: Justin Hart http://justinhart.net/teaching/2019_spring_cs309/

  2. Pick a BWIBot V2 V4

  3. V2 Startup ● Find the back of the robot – Yes, the screen faces the BACK

  4. V2 Startup ● The switch should be in the “charge” position when you find the robot.

  5. V2 Startup ● Set the switch to the neutral position

  6. V2 Startup ● Unplug the charging cable ● Set the switch to the “battery” position

  7. V2 Startup ● Go to the front of the robot

  8. V2 Startup ● Press the green button. – It should light up.

  9. V2 Startup ● Press the yellow button. – Yellow button and blue indicator light should light.

  10. V2 Startup ● Hit the power button on the laptop

  11. V4 Startup ● The V4 is different – This is actually the front of the robot.

  12. V4 Startup ● Disconnect the power supply. – First, undo the screws on the connector

  13. V4 Startup ● Disconnect the power supply. – Then disconnect the plug.

  14. V4 Startup ● Locate the Emergency Stop and Power Button – On the top at the front of the unit on the V4

  15. V4 Startup ● The Emergency Stop may be depressed – If it is, gently twist it to the right and it will lift.

  16. V4 Startup ● Press the power button.

  17. V4 Startup ● It will illuminate. – This indicates that the robot is now on.

  18. V4 Startup ● Reach back and to the right inside the chassis. – You can feel the power button; press it to turn the computer on.

  19. V4 Startup ● Note that the V4 has a wireless keyboard – If the batteries are dead, obtain assistance

  20. System Startup ● From here, both systems work the same

  21. System Startup ● Select Ubuntu on this screen using the arrow keys and hit “Enter,” or wait for it to boot automatically

  22. System Startup ● Log in as FRI – Using the FRI password. A mentor can help you.

  23. Doing your homework ● For your homework, you will do this on a real robot. ● These screenshots come from doing this in simulation. – Installing the simulators is easy if you want to try. – Installation instructions for BWI here: ● https://github.com/utexas-bwi/bwi – Tutorials on running the robot & simulators here: ● https://github.com/utexas-bwi/documentation/wiki/Software

  24. System Startup ● Open several terminals ● Run cd catkin_ws; source devel/setup.bash in each.

  25. System Startup ● roscore

  26. System Startup ● Startup your robot – roslaunch bwi_launch segbot_v2.launch (or v4)

  27. Pick your floor ● A screen will pop up asking what floor you are on. ● The AI floor is the 3rd floor, and you are almost certainly there at this step of the process.

  28. Next, you want to move the robot ● The next step is to move the robot into the hallway, where you want to start. ● I generally do this before localizing the robot, just to get it out of people’s way.

  29. System Startup ● rosrun segbot_bringup teleop_twist_keyboard

  30. System Startup ● Slow the robot down before moving: hit ‘z’.

  31. Next, you want to move the robot ● The robot will move more quickly than you expect. ● Press the ‘k’ key to stop its motion.

  32. System Startup ● Localize the robot. ● Hit “2D Pose Estimate” ● Click on where the front of the robot is on the map in rviz (the screen is on the BACK), and drag the cursor FORWARD. – Note that the robot graphic may not be there, because the real robot does not know where it is yet!! ● A green arrow will appear showing what you believe the robot’s pose to be.

  33. System Startup ● The system uses a probabilistic method to find the robot’s pose, so your localization just aids in this process. ● Instruct the robot to move a bit by using “2D Nav Goal” ● The robot’s localization will improve as it moves. ● 2D Nav Goal works the same as “2D Pose Estimate”
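
rviz’s “2D Nav Goal” button just sends a goal to the move_base action server, and a node can do the same thing from code. Below is a minimal sketch; the goal coordinates are invented, and it assumes move_base is running under its default name.

    // Send one navigation goal to move_base (ROS 1, C++).
    #include <ros/ros.h>
    #include <actionlib/client/simple_action_client.h>
    #include <move_base_msgs/MoveBaseAction.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "nav_goal_demo");

      // "move_base" is the default action server name; true = spin its own thread.
      actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> client("move_base", true);
      client.waitForServer();

      move_base_msgs::MoveBaseGoal goal;
      goal.target_pose.header.frame_id = "map";   // goal is expressed in the map frame
      goal.target_pose.header.stamp = ros::Time::now();
      goal.target_pose.pose.position.x = 1.0;     // hypothetical goal: 1 m along map x
      goal.target_pose.pose.orientation.w = 1.0;  // identity orientation

      client.sendGoal(goal);
      client.waitForResult();                     // blocks until the robot arrives or fails
      return 0;
    }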

  34. System Startup ● rosrun bwi_tasks visit_door_list

  35. System Startup ● The robot will now start driving around visiting the doors in the hallway.

  36. System Shutdown ● ctrl-c in the terminal to kill visit_door_list ● Use teleop_twist_keyboard to drive back into the lab ● Do the start-up steps in reverse to power the robot off ● Plug it back into the charger, and set the switch back to “charge”

  37. TF Classes & Functions ● tf::TransformListener listener – A special class that listens on behalf of the TF library so it can compute transforms between frames. ● tf::StampedTransform transform; – A spatial transformation ● listener.lookupTransform("odom", "base_link", ros::Time::now(), transform); – Looks up the transformation from “base_link” into “odom”

  38. One more… ● listener.waitForTransform("odom", "base_link", ros::Time(0), ros::Duration(4)); – Wait for the transform to become available – TF may not have received enough data to compute this transform yet; in that event the lookup will throw an exception. See the sketch below.
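
Putting slides 37 and 38 together, a minimal listener node might look like the sketch below (node name and loop rate are arbitrary choices here). Note that ros::Time(0) requests the latest available transform, which is usually what you want; asking for ros::Time::now() often fails because data for that instant has not arrived yet.

    // Minimal TF listener sketch: where is base_link relative to odom?
    #include <ros/ros.h>
    #include <tf/transform_listener.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "tf_listener_demo");
      ros::NodeHandle nh;

      tf::TransformListener listener;
      ros::Rate rate(10.0);  // 10 Hz polling loop
      while (nh.ok()) {
        tf::StampedTransform transform;
        try {
          // Block briefly until the transform is available, then look it up.
          listener.waitForTransform("odom", "base_link", ros::Time(0), ros::Duration(4.0));
          listener.lookupTransform("odom", "base_link", ros::Time(0), transform);
          ROS_INFO("base_link in odom: x=%.2f y=%.2f",
                   transform.getOrigin().x(), transform.getOrigin().y());
        } catch (tf::TransformException& ex) {
          ROS_WARN("%s", ex.what());  // e.g. the transform never became available
        }
        rate.sleep();
      }
      return 0;
    }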

  39. You can also send frames ● tf::TransformBroadcaster br ● br.sendTransform( tf::StampedTransform( transform, ros::Time::now(), fromFrame, toFrame)); ● StampedTransform – simply adds a timestamp to the transform ● ros::Time::now() – makes that timestamp now ● fromFrame, toFrame – The names of the frames: the parent (from) frame first, then the child (to) frame. A full broadcaster sketch follows below.
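
A complete broadcaster is equally short. In the sketch below, the frame names "base_link" and "camera_link" and the offset are invented for illustration; a real broadcaster would publish wherever your sensor or part actually sits.

    // Minimal TF broadcaster sketch: publish a fixed child frame.
    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "tf_broadcaster_demo");
      ros::NodeHandle nh;

      tf::TransformBroadcaster br;
      ros::Rate rate(10.0);  // republish at 10 Hz so the frame stays fresh
      while (nh.ok()) {
        tf::Transform transform;
        transform.setOrigin(tf::Vector3(0.1, 0.0, 0.5));    // made-up offset, in meters
        transform.setRotation(tf::Quaternion(0, 0, 0, 1));  // identity rotation
        // Parent ("from") frame first, then the new child ("to") frame.
        br.sendTransform(tf::StampedTransform(
            transform, ros::Time::now(), "base_link", "camera_link"));
        rate.sleep();
      }
      return 0;
    }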

  40. Poses & Transformation ● Pose – The position and orientation of an object in space – Generally expressed as a position and a rotation into the correct orientation ● Transformation – The relationship between two poses – Generally expressed as a translation (a position) and a rotation ● ROS has types for both! – But they contain basically the same data

  41. Poses & Transformation ● geometry_msgs::Pose pose – pose.position ● pose.position.x (y, z) – pose.orientation ● pose.orientation.x (y, z, w) – Orientation is expressed as a quaternion ● tf::Transform transform – getOrigin() ● getOrigin().x() (y(), z()) – tf::Vector3 origin(x,y,z); setOrigin(origin) – getRotation() – tf::Quaternion q(x,y,z,w); setRotation(q) – The tf::Quaternion class also handles it ● We have software that supports – Euler angles – Axis & angle
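
Since the two types hold essentially the same data, tf ships converters between them (tf/transform_datatypes.h). A small sketch, with arbitrary values:

    // Converting between geometry_msgs::Pose and tf::Transform.
    #include <tf/transform_datatypes.h>
    #include <geometry_msgs/Pose.h>

    void convert_demo() {
      geometry_msgs::Pose pose;
      pose.position.x = 1.0;
      pose.orientation.w = 1.0;              // identity quaternion

      tf::Transform transform;
      tf::poseMsgToTF(pose, transform);      // Pose -> Transform
      double x = transform.getOrigin().x();  // 1.0

      tf::Quaternion q;
      q.setRPY(0.0, 0.0, 1.57);              // Euler angles (roll, pitch, yaw) -> quaternion
      transform.setRotation(q);

      tf::poseTFToMsg(transform, pose);      // Transform -> Pose
    }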

  42. Example: In simulation ● roscore ● roslaunch bwi_launch simulation_v2.launch ● If we select “Global Options” on the left, we can change “Fixed Frame” to base_footprint, which makes rviz render the view relative to the robot

  43. TF view_frames ● rosrun tf view_frames – This will generate a PDF of the current TF tree

  44. A few of these frames ● BWI uses a special map service to tell the robot which floor it is on ● “map” is our “global frame,” sometimes called the “inertial frame” – It is the top level; all frames are relative to it ● “3rdFloor_map” and “level_mux_map” are from the BWI map service. – You can safely ignore them ● “odom” – Short for “odometry.” – When we track the robot’s motion, it is relative to “odom.”

  45. Where is the robot? ● odom – The robot’s motion is tracked relative to odom – odom is generally in a fixed position ● base_footprint – Where the robot is – Remember, base_footprint is at (0,0,0) in base_footprint’s frame – So we compare to odom to know where the robot is

  46. Example ● We can watch this from the command line ● rosrun tf tf_echo /base_footprint /odom ● rosrun segbot_bringup teleop_twist_keyboard ● If we watch tf_echo, we can see the translation changing. – This is how our system knows where the robot is!

  47. More frames ● The robot has MANY frames ● base_link – The physical base of the robot’s mechanics ● Rather than base_footprint, where it is on the floor ● Decoupling the two simplifies modeling ● chasis_base_plate_link – Where parts are mounted on the BWIBot ● laser_obstacle – Where the laser sees the nearest obstacle ● caster_wheel_link, base_link_left_wheel, base_link_right_wheel – The robot’s wheels

  48. AR Tags ● Let’s take a look at tracking an object and finding its coordinate frame. ● Alvar – An open source library for tracking AR Tags
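
In ROS, Alvar is wrapped by the ar_track_alvar package, which publishes the pose of each detected tag. A minimal subscriber sketch follows; it assumes the tracker is already running and publishing on its default ar_pose_marker topic (an assumption if your launch file remaps it).

    // Print the pose of every AR tag Alvar currently sees.
    #include <ros/ros.h>
    #include <ar_track_alvar_msgs/AlvarMarkers.h>

    void markerCallback(const ar_track_alvar_msgs::AlvarMarkers::ConstPtr& msg) {
      for (const auto& marker : msg->markers) {
        const geometry_msgs::Point& p = marker.pose.pose.position;
        ROS_INFO("tag %u at x=%.2f y=%.2f z=%.2f", marker.id, p.x, p.y, p.z);
      }
    }

    int main(int argc, char** argv) {
      ros::init(argc, argv, "ar_tag_listener");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("ar_pose_marker", 1, markerCallback);
      ros::spin();
      return 0;
    }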

  49. Virtual Reality ● The simulation of a 3D world that can be experienced first-hand ● 3D – Video games – Virtual worlds – Headsets

  50. Augmented Reality ● Adds 3D content to images and perception of the real world ● The photo on the right merges real content and virtual content in a real-estate application

  51. Augmented Reality Toolkit ● What is an AR Tag? ● Augmented Reality Tag – The image on the right comes from a demonstration of the Augmented Reality Toolkit – Provides a coordinate frame to render 3D content onto – We will use a similar package called Alvar

  52. How does this work? ● Recall our transforms – Translation – Rotation ● Augmented Reality uses projective transformations – Homographies – Projections
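
One common formulation: for a planar tag, the camera projects tag-plane points to pixels through a 3×3 homography, from which the tag’s pose can be recovered. In LaTeX notation (K is the camera intrinsic matrix, r_1 and r_2 the first two columns of the tag’s rotation, t its translation):

    s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
      = H \begin{pmatrix} x \\ y \\ 1 \end{pmatrix},
    \qquad
    H = K \begin{pmatrix} r_1 & r_2 & t \end{pmatrix}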

  53. Project Group Formation ● This group will – Do homework 5 together – Do homework 6 together – Do the final project together ● Take 15 minutes to choose a team – Email me your teammates’ names, EIDs, and email addresses (yours included), one email per team
