Final Presentation, Fall 2016
Team 1717 - Trinity Firefighting Robot
Bobby Barrett (Computer Engineering)
Kevin Burke (Electrical Engineering)
Connor McCullough (Electrical Engineering)
Zach Rattet (Electrical Engineering)
Overview of the Robot
● Rules & Regulations
● Frame & Wheels
● Power
● Motors
● Flame Detection
● Fire Extinguisher
● Navigation Sensors
● Navigation Approach
● Microcontroller(s)
● Budget
● Timeline
Trinity International Firefighting Robot Competition
● When? → April 1st & 2nd, 2017
● Where? → Trinity College (Hartford, CT)
● What? → Build an autonomous robot capable of navigating a maze and extinguishing a fire, represented by a candle
Competition Rules
● The Robot:
○ Once turned on, the robot must be self-controlled without any human interaction.
○ The robot may bump into or touch the walls as it travels, but it cannot mark, damage, or move the walls in doing so.
○ The movement of the robot must not damage the floor of the arena.
○ The robot cannot leave or drop any items in the arena as it travels.
○ The robot must fit inside a box with a 31 cm × 31 cm base and a height of 27 cm.
○ The robot may not separate into multiple parts.
○ There is no weight restriction.
○ The robot must have a carry handle.
○ The robot must have an arrow indicating front.
Competition Rules
● Sound Activation:
○ All robots must be sound activated for the competition.
○ The microphone must:
■ Be located on the top surface of the robot and be accessible from above.
■ Be less than 2 cm below any other mechanical part.
■ Have a blue background.
■ Have the abbreviation 'MIC' printed in a contrasting color adjacent to the microphone.
○ The microphone will be positioned 25 mm away from the sound starting device.
● Safety:
○ Any contest official may stop the robot (by pulling its kill-power plug, via remote, button, battery removal, etc.) if, in their opinion, the robot is performing or is about to perform any action that could be dangerous or hazardous to people, facilities, or other equipment.
Competition Rules
● Flame Detection and Extinguishing:
○ Each robot must have a flame-detect LED. This LED must turn on when the robot detects a flame and must be visible from all directions.
○ There are no sensor restrictions for the robot.
○ Four methods of extinguishing are permitted:
■ (1) Air-based extinguisher
■ (2) Carbon dioxide extinguisher
■ (3) Water mist or spray
■ (4) Mechanical (e.g., wet sponge)
Competition Rules
● Levels of Competition:
○ LEVEL 1 → Basic Configuration
○ LEVEL 2 → 4 Possible Configurations
Basic Design Concept
Sound Activation
● Tone Decoder and Detector: LM567
● Programmed via Arduino
● Built-in bandpass filter
● Frequency detection for robot start: 3.8 kHz
● Sends a start signal when the specific frequency is detected
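As a rough illustration of the start-signal logic, the sketch below simply polls the LM567's output from the Arduino. The wiring (decoder output on pin 2 with a pull-up, pulled LOW while the 3.8 kHz tone is present) and the pin numbers are assumptions for illustration, not the team's actual circuit.

```cpp
// Minimal sketch (assumed wiring): the LM567's open-collector output is tied
// to Arduino pin 2 with a pull-up and goes LOW while the 3.8 kHz tone is present.
const int TONE_DECODER_PIN = 2;   // hypothetical pin assignment
const int START_LED_PIN    = 13;

void setup() {
  pinMode(TONE_DECODER_PIN, INPUT_PULLUP);
  pinMode(START_LED_PIN, OUTPUT);
}

void loop() {
  // Wait for the LM567 to assert its output (active LOW) before starting the run.
  if (digitalRead(TONE_DECODER_PIN) == LOW) {
    digitalWrite(START_LED_PIN, HIGH);  // indicate "start" received
    // startRun();  // hand off to the navigation routine
  }
}
```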
Frame and Wheels
● Last Year's Frame
○ Fits competition design constraints
○ Modular design
○ Lightweight
● Wheels
○ 60 mm × 8 mm
■ Hard plastic rim with rubber tire
○ 90 mm × 8 mm
Power
● Nickel-Metal Hydride (NiMH)
○ No memory effect
○ Cost effective (i.e., size-to-capacity ratio)
○ Larger than LiPo
● Lithium-Ion Polymer (LiPo)
○ LiPo: 545 g, 138 mm vs. NiMH: 1091 g, 305 mm
○ Better energy density
○ Higher discharge current
[Figure: Volumetric and specific energy densities for lead acid, NiCd, NiMH, and LiPo]
● Selection: LiPo
○ 14.8 V, 5300 mAh, 35C (5.3 Ah × 35C ≈ 185.5 A maximum discharge)
Motors
● Stepper Motor
○ Low efficiency
○ No feedback
● Servo Motor
○ Feedback
○ Simple to control
○ Accelerates quickly
● Brushless DC Motor with Encoders
○ Feedback from encoders
○ Efficient
○ Two motors can be synchronized (see the sketch after this list)
● Selection: DC Motor with Encoders
○ 12 V nominal voltage, 0.7 A stall current, 159 RPM
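To illustrate how encoder feedback could keep the two drive motors in step, here is a minimal Arduino-style sketch. The pin assignments, base duty cycle, and proportional gain are assumptions for illustration only, not the team's tuning.

```cpp
// Hedged sketch: synchronizing two encoder-equipped DC motors with a simple
// proportional correction on the tick-count difference.
const int LEFT_PWM = 5, RIGHT_PWM = 6;   // assumed PWM outputs to the motor driver
const int LEFT_ENC = 2, RIGHT_ENC = 3;   // assumed encoder channels (interrupt pins)
volatile long leftTicks = 0, rightTicks = 0;

void leftIsr()  { leftTicks++; }
void rightIsr() { rightTicks++; }

void setup() {
  pinMode(LEFT_PWM, OUTPUT);
  pinMode(RIGHT_PWM, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(LEFT_ENC), leftIsr, RISING);
  attachInterrupt(digitalPinToInterrupt(RIGHT_ENC), rightIsr, RISING);
}

void loop() {
  // Base duty cycle for both wheels; trim the right wheel so tick counts match.
  int base = 150;
  long error = leftTicks - rightTicks;             // >0 means the left wheel is ahead
  int correction = constrain(error * 2, -50, 50);  // proportional gain of 2 (assumed)
  analogWrite(LEFT_PWM, base);
  analogWrite(RIGHT_PWM, constrain(base + correction, 0, 255));
  delay(20);
}
```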
Fire Extinguishing
● CO2 extinguishing system
○ Most effective gas-extinguishing method
○ Receives extra points in the competition
○ Precise extinguishing of the candle is possible using the heat map from the flame sensor
[Image: Innovations Hammerhead]
Flame Detection
● RoBoard RM-G212 16×4 Thermal Array Sensor
● 64-pixel infrared array
● Produces a map of heat values
● Temperature range: -20°C to 300°C
● 0.02°C uncertainty
● Supply voltage: 3 V
● Field of view: 60° horizontal, 16.4° vertical
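As a sketch of how the 16×4 heat map could be used to locate the candle, the function below finds the hottest pixel in a frame and converts its column into a horizontal bearing using the 60° field of view. The temperature threshold and the way the frame is acquired (the device-specific I2C readout is omitted) are assumptions, not the team's implementation.

```cpp
// Hedged sketch: given a 16x4 frame of temperatures from the thermal array,
// find the hottest pixel and convert its column into a horizontal bearing.
const int COLS = 16, ROWS = 4;
const float H_FOV_DEG = 60.0;
const float FLAME_THRESHOLD_C = 100.0;  // assumed threshold for "flame present"

// Returns true if a flame is detected; bearingDeg is negative left of center.
bool findFlame(const float frame[ROWS][COLS], float &bearingDeg) {
  int hotRow = 0, hotCol = 0;
  float hottest = frame[0][0];
  for (int r = 0; r < ROWS; r++)
    for (int c = 0; c < COLS; c++)
      if (frame[r][c] > hottest) { hottest = frame[r][c]; hotRow = r; hotCol = c; }

  if (hottest < FLAME_THRESHOLD_C) return false;  // could also drive the flame-detect LED
  // Map column index (0..15) onto -30..+30 degrees across the 60 degree FOV.
  bearingDeg = ((hotCol + 0.5) / COLS - 0.5) * H_FOV_DEG;
  return true;
}
```

The same bearing could then be used to aim the CO2 nozzle at the candle, per the extinguishing slide.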
Flame Sensor Data Collection
● Flame sensor placed two feet away from a lit candle
● Data averaged over five samples and exported to Excel for a visual heat-map representation
Computation and Data Processing
● Arduino MEGA 2560
○ 16 MHz clock
○ 54 I/O pins
○ Directly controls motors and servos through PWM
● Raspberry Pi 3
○ 1.2 GHz clock
○ 40 GPIO pins
○ Interprets ultrasonic sensor data and calculates motor commands
○ Directly controls overall robot operation, including microphone startup, flame extinguishing, and interpretation of flame sensor data (I2C)
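The slides do not specify how the two boards exchange data. One plausible arrangement (assumed here, not confirmed by the source) is a USB serial link in which the Pi sends left/right PWM values and the Mega applies them to the motor driver:

```cpp
// Hedged sketch of an assumed Pi-to-Mega serial protocol: the Pi sends lines
// such as "L150 R140\n", and the Mega applies them as PWM duty cycles.
const int LEFT_PWM = 5, RIGHT_PWM = 6;  // assumed motor-driver pins

void setup() {
  Serial.begin(115200);
  pinMode(LEFT_PWM, OUTPUT);
  pinMode(RIGHT_PWM, OUTPUT);
}

void loop() {
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');   // e.g. "L150 R140"
    int l = line.indexOf('L'), r = line.indexOf('R');
    if (l >= 0 && r > l) {
      int left  = constrain(line.substring(l + 1, r).toInt(), 0, 255);
      int right = constrain(line.substring(r + 1).toInt(), 0, 255);
      analogWrite(LEFT_PWM, left);
      analogWrite(RIGHT_PWM, right);
    }
  }
}
```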
Navigation (Sensors)
● Infrared
○ Low cost, low resolution
● Laser
○ Expensive, long processing time
● Bumpers
○ Involves bumping into walls (not useful)
● Ultrasonic
○ Can easily interface multiple sensors (5 or 6)
○ Easy to implement and read data
○ Low cost
Devantech SRF05 Ultrasonic Range Finder
● Range: 1 cm to 4 m
● Resolution: 3 cm - 4 cm
● Field of view: 55°
● Dimensions: 43 mm × 20 mm × 17 mm
● 5 V input
● Capable of using a single pin for both trigger and echo
● 5- or 6-sensor configuration
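A minimal sketch of the single-pin trigger/echo mode mentioned above: the Arduino drives a 10 µs trigger pulse, then switches the same pin to input and times the echo. The pin number and timeout are assumptions for illustration.

```cpp
// Hedged sketch: SRF05 in single-pin mode (trigger and echo share one pin).
const int SRF05_PIN = 7;  // assumed pin

long readDistanceCm() {
  // Send a 10 us trigger pulse.
  pinMode(SRF05_PIN, OUTPUT);
  digitalWrite(SRF05_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(SRF05_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(SRF05_PIN, LOW);

  // Switch the same pin to input and time the echo pulse (~30 ms timeout, about 5 m).
  pinMode(SRF05_PIN, INPUT);
  unsigned long duration = pulseIn(SRF05_PIN, HIGH, 30000UL);

  // Speed of sound: roughly 58 us of round trip per cm.
  return duration / 58;
}

void setup() { Serial.begin(9600); }

void loop() {
  Serial.println(readDistanceCm());
  delay(50);  // the SRF05 needs ~50 ms between pings
}
```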
SRF05 Ultrasonic Test Data
● At distances of ⅛ inch (0.3175 cm) or closer, the sensor readings began to jump around and displayed the maximum range.
Navigation (Algorithm)
● May know the start point, may not know the goal location (candle)
○ An arbitrary start is optional for Level 1 and Level 2
● Hard coding
○ Wall-following (see the sketch after this list)
○ Turn left/right when there is an obstacle
● Simultaneous Localization and Mapping (SLAM)
○ Most applicable approach to unknown environments
○ Kalman Filters
■ Long processing time
■ The robot can easily confuse landmarks in the maze
● Particle Filters (Monte Carlo Localization)
○ No mapping; needs a global map
○ Good in nonlinear environments
○ Can only determine robot position; requires path planning
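As a sketch of the hard-coded wall-following option listed above (not the team's chosen approach), the helper below holds a target distance to the right-hand wall using two ultrasonic readings. The thresholds, gain, and drive values are assumptions for illustration.

```cpp
// Hedged sketch: right-hand wall following from two range readings.
#include <algorithm>

struct DriveCommand { int leftPwm; int rightPwm; };

DriveCommand followRightWall(float frontCm, float rightCm) {
  const float TARGET_RIGHT_CM = 10.0f;   // desired gap to the wall (assumed)
  const float FRONT_STOP_CM   = 15.0f;   // obstacle threshold (assumed)

  if (frontCm < FRONT_STOP_CM)
    return { -120, 120 };                // wall ahead: pivot left in place

  float err = rightCm - TARGET_RIGHT_CM; // >0 means drifting away from the wall
  int correction = std::max(-60, std::min(60, static_cast<int>(err * 5)));
  return { 150 + correction, 150 - correction };  // steer back toward the wall
}
```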
Monte Carlo Localization (MCL)
● Goal: for the robot to determine its location within its environment
○ Helps the robot return to its starting position (extra points)
● The algorithm starts with a uniform random distribution of particles
● Generates random guesses of where the robot will be next (guesses = particles)
● Sensor feedback helps discard particles inconsistent with the robot's observations
● Generates more particles for observations that are consistent
● Each particle is assigned a weighted value
● Weighted samples are grouped into a new distribution
● The algorithm runs recursively (a minimal sketch follows this list)
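Below is a minimal sketch of one MCL update step, following the motion → measurement → resample cycle described above. It is not the team's implementation: the noise levels, the Gaussian sensor model, and the placeholder map lookup are all assumptions.

```cpp
// Hedged sketch of one MCL update. predictedRange() is a placeholder: a real
// version would ray-cast the arena map from the particle's pose.
#include <vector>
#include <random>
#include <cmath>

struct Particle { double x, y, theta, weight; };

double predictedRange(const Particle& p) {
  (void)p;
  return 100.0;  // placeholder map lookup (cm), keeps the sketch compilable
}

std::vector<Particle> mclStep(std::vector<Particle> particles,
                              double dForward, double dTheta,  // odometry since last step
                              double measuredRange,            // ultrasonic reading (cm)
                              double sensorSigma = 4.0) {      // ~3-4 cm resolution
  static std::mt19937 rng{42};
  std::normal_distribution<double> noise(0.0, 1.0);

  // 1. Motion update: move each particle by the odometry estimate plus noise.
  for (auto& p : particles) {
    p.theta += dTheta + 0.02 * noise(rng);
    double step = dForward + noise(rng);
    p.x += step * std::cos(p.theta);
    p.y += step * std::sin(p.theta);
  }

  // 2. Measurement update: weight particles by how well they explain the reading.
  double total = 0.0;
  for (auto& p : particles) {
    double err = measuredRange - predictedRange(p);
    p.weight = std::exp(-(err * err) / (2.0 * sensorSigma * sensorSigma));
    total += p.weight;
  }
  if (total <= 0.0) return particles;  // degenerate case: keep the old set

  // 3. Resample: draw a new set, favoring high-weight (consistent) particles.
  std::vector<Particle> resampled;
  std::uniform_real_distribution<double> pick(0.0, total);
  for (std::size_t i = 0; i < particles.size(); ++i) {
    double r = pick(rng), acc = 0.0;
    for (const auto& p : particles) {
      acc += p.weight;
      if (acc >= r) { resampled.push_back(p); break; }
    }
  }
  return resampled;  // call again on the next odometry/sensor pair (recursive update)
}
```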
MCL Approach for Level Two
● Hard code all four possible arena configurations into one big map
● We can pick the starting location
● We are able to narrow down where the obstacles may be
● Determine which section of the hard-coded map the robot is in
● Localization
System Block Diagram
Timeline Fall 2016
Order List (Received 11/21/2016)