Towards the autonomous navigation of intelligent robots for risky interventions
Daniela Doroftei
Royal Military School, Av. de la Renaissance 30, B-1000 Brussels, Belgium
Daniela.Doroftei@rma.ac.be
Contribution to ViewFinder – WP2
Goal and Problem Formulation • The goal of this research project is to prepare the RobuDem robot for an outdoor crisis management task.
Goal and Problem Formulation
• To achieve this goal, the robot must be able to:
  • Be tele-operated by a remote user
  • Ensure its own safety by avoiding obstacles detected by its sensors (sonar, stereo, …)
  • Navigate autonomously in an unknown environment by mapping the surroundings
  • Detect chemical contamination
  • Navigate to pre-defined goal positions
  • Execute complex tasks like searching for human victims
• Here we will focus on the design of the control architecture for such a robot.
Robot Description: RobuDem
• RobuDem platform:
  • Outdoor all-terrain robot
  • 4 driving / steering wheels allowing for different drive modes
  • Supports heavy loads (up to 300 kg)
Robot Description: Sensors
• Ultrasound sensors for obstacle detection
• Joystick for remote control
• Chemical sensor with integrated temperature sensor
Robot Description: Sensors • Differential GPS (and wheel encoders) for positioning
Robot Description: Sensors • Quad-cam vision system consisting of a Bumblebee stereo vision system and 2 digital cameras.
Color Scheme
• Sensors: green rounded rectangle
• Processors: violet rectangle (Positioning & Mapping), gold rectangle (Visual Processors), blue rectangle (Behavior Processors)
• Actuators: black oval (Steer Robudem)
Tele-operation
• Transmission of joystick commands to the robot
• Transmission of filtered commands to the robot
• [Diagram: Joystick → Filter Commands → Steer Robudem]
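The slides do not detail what the "Filter Commands" stage does. As an illustration only, here is a minimal Python sketch of one plausible filter, clamping the commanded speeds and rate-limiting changes between control cycles; the class, limits and units are assumptions, not RobuDem specifications.

```python
# Hypothetical sketch of a "Filter Commands" stage: clamp the requested
# speeds to safe limits and rate-limit changes between control cycles.
# All limit values are assumptions, not taken from the RobuDem specs.
from dataclasses import dataclass

@dataclass
class VelocityCommand:
    linear: float   # m/s
    angular: float  # rad/s

class CommandFilter:
    def __init__(self, max_lin=1.0, max_ang=0.8, max_step=0.2):
        self.max_lin, self.max_ang, self.max_step = max_lin, max_ang, max_step
        self.prev = VelocityCommand(0.0, 0.0)

    def _clamp(self, value, limit):
        return max(-limit, min(limit, value))

    def filter(self, cmd: VelocityCommand) -> VelocityCommand:
        # Clamp to absolute speed limits
        lin = self._clamp(cmd.linear, self.max_lin)
        ang = self._clamp(cmd.angular, self.max_ang)
        # Rate-limit the change with respect to the previous control cycle
        lin = self.prev.linear + self._clamp(lin - self.prev.linear, self.max_step)
        ang = self.prev.angular + self._clamp(ang - self.prev.angular, self.max_step)
        self.prev = VelocityCommand(lin, ang)
        return self.prev

# Example: a sudden full-speed joystick command is smoothed over several cycles.
f = CommandFilter()
print(f.filter(VelocityCommand(1.5, 0.0)))   # VelocityCommand(linear=0.2, angular=0.0)
```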
Semi-autonomous control
• In semi-autonomous control, there is no longer a direct link between joystick and robot actuators
• Instead, there is a Behavior-Based Controller managing the commands sent to the robot
• The "Obey Joystick" behavior is only one of the multiple behaviors contributing to the global robot control strategy
• [Diagram: Joystick → Obey Joystick → Behavior Controller → Filter Commands → Steer Robudem]
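The behavior-based controller itself is not shown in code on the slides. The following is a minimal, hypothetical Python sketch of a behavior interface, with "Obey Joystick" as one implementation; all class and method names are illustrative.

```python
# Hypothetical sketch of a behavior interface for a behavior-based controller.
# Each behavior proposes a steering command plus a weight expressing how
# strongly it currently wants to act; names are illustrative only.
from abc import ABC, abstractmethod

class Behavior(ABC):
    @abstractmethod
    def propose(self, sensors: dict) -> tuple[tuple[float, float], float]:
        """Return ((linear, angular) command, weight in [0, 1])."""

class ObeyJoystick(Behavior):
    def propose(self, sensors):
        lin, ang = sensors.get("joystick", (0.0, 0.0))
        # Fully active whenever the operator moves the joystick.
        weight = 1.0 if (lin, ang) != (0.0, 0.0) else 0.0
        return (lin, ang), weight

print(ObeyJoystick().propose({"joystick": (0.5, 0.1)}))  # ((0.5, 0.1), 1.0)
```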
An example: Tele-operation + obstacle avoidance using sonar
• The joystick returns the user commands, telling the robot where to go.
• The "ObeyJoystick" behavior calculates the best action to perform in order to follow the user's commands.
• The on-board sonars deliver information about obstacles in the robot's path.
• The "AvoidObstaclesUsingSonar" behavior calculates the best action to perform in order not to bump into the detected obstacles.
• These commands are fused by a behavior-based controller ("Fuse Behaviors"), taking both behaviors into account to come to a globally optimal and consistent command to be sent to the robot.
• These commands are filtered ("Filter Commands") and sent to the robot ("Steer Robudem").
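The slide does not specify how "Fuse Behaviors" combines the two proposals. One common scheme is a weighted average of the behaviors' commands; the sketch below assumes that scheme and is not necessarily the fusion rule used on RobuDem.

```python
# Hypothetical "Fuse Behaviors" sketch: weighted average of the commands
# proposed by the active behaviors (e.g. ObeyJoystick and
# AvoidObstaclesUsingSonar). The fusion rule is an assumption.
def fuse_behaviors(proposals):
    """proposals: list of ((linear, angular), weight) tuples."""
    total = sum(w for _, w in proposals)
    if total == 0.0:
        return (0.0, 0.0)          # no behavior wants to act: stand still
    lin = sum(cmd[0] * w for cmd, w in proposals) / total
    ang = sum(cmd[1] * w for cmd, w in proposals) / total
    return (lin, ang)

# The joystick asks for full speed ahead, the sonar behavior wants to veer away.
fused = fuse_behaviors([((1.0, 0.0), 1.0),     # ObeyJoystick
                        ((0.2, 0.6), 0.8)])    # AvoidObstaclesUsingSonar
print(fused)  # roughly (0.64, 0.27)
```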
The ViewFinder case: What behaviors do we need?
• General requirements:
  • The robot needs accurate positioning
  • The robot needs to build up a model (map) of its environment to reason with this data
• Robot Safety:
  • The robot needs to avoid areas with excessive heat
  • The robot needs to detect chemicals and avoid areas which are too contaminated
  • The robot needs to detect obstacles using its sonar sensors and its stereo vision system and avoid these obstacles
  • The robot needs to avoid all previously detected obstacles stored in the environmental model (map)
The ViewFinder case: What behaviors do we need?
• Robot Goals:
  • The robot must be tele-operable
  • The robot needs to detect chemicals and find the source of contamination
  • The robot should maximize its knowledge of the environment it is put in
  • The robot needs to search for human victims on the disaster site
  • The robot needs to be able to execute a user-defined trajectory, given as a set of waypoints
  • In the event of a loss of network connection, the robot should be able to return to the base station
Requirement 1: Accurate positioning and environment modeling
• Odometry: data from the rotation of the wheels, measured by wheel encoders, is used to estimate the position
• GPS: a real-time differential GPS system provides absolute positioning data by acquiring signals from at least 4 space-based satellites
• Position Estimation: these 2 positioning estimates are fused a first time to come to a more accurate and robust position estimate
• GIS Map: if available, GIS (Geographic Information System) data is used to initialize the maps
• Camera / Framegrabber: the environment is observed by a camera, which detects and tracks features in the environment to estimate the robot motion and to update the map
• Visual SLAM: a Visual Simultaneous Localization and Mapping module analyses all this input data; it builds up an environmental model (map) and places the robot accurately on this map, delivering a 6-dimensional position estimate
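The slide states that the odometry and GPS estimates are "fused a first time" but does not give the filter. As a purely illustrative stand-in for such a filter (a real system would typically use a Kalman-style estimator), here is a minimal inverse-variance weighted fusion of two 2D position estimates; the variance values in the example are assumptions.

```python
# Hypothetical sketch of the first fusion step: combine an odometry position
# and a differential-GPS position, weighting each by the inverse of its
# (assumed) variance.
def fuse_positions(odo_xy, odo_var, gps_xy, gps_var):
    w_odo = 1.0 / odo_var
    w_gps = 1.0 / gps_var
    x = (odo_xy[0] * w_odo + gps_xy[0] * w_gps) / (w_odo + w_gps)
    y = (odo_xy[1] * w_odo + gps_xy[1] * w_gps) / (w_odo + w_gps)
    fused_var = 1.0 / (w_odo + w_gps)   # fused estimate is more certain than either input
    return (x, y), fused_var

# Odometry has drifted (large variance); DGPS is accurate to a few centimetres.
print(fuse_positions((10.4, 5.1), 1.0, (10.0, 5.0), 0.05))
```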
Requirement 2: Avoiding hot zones and contaminated areas
• A Visual SLAM module delivers the robot position and a map
• A chemical sensor instantaneously measures contaminant concentrations
• A temperature sensor measures the temperature
• This data is used to build up 2 maps containing the chemical and heat distribution (Local ChemicalMap, Local HeatMap)
• Using these maps, 2 behaviors, "AvoidChemicals" and "AvoidHotZones", try to steer the robot away from danger zones (sketched below)
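As an illustration of how a local chemical map could drive the "AvoidChemicals" behavior, here is a minimal sketch using a coarse grid; the grid resolution, threshold and steering rule are assumptions and not taken from the ViewFinder design.

```python
# Hypothetical sketch of a local chemical map and an AvoidChemicals behavior.
# The map stores the highest concentration seen in each 1 m grid cell; the
# behavior proposes turning away from the most contaminated neighbouring cell.
import math

class LocalChemicalMap:
    def __init__(self, cell_size=1.0):
        self.cell_size = cell_size
        self.cells = {}                      # (ix, iy) -> concentration

    def update(self, x, y, concentration):
        key = (int(x // self.cell_size), int(y // self.cell_size))
        self.cells[key] = max(self.cells.get(key, 0.0), concentration)

    def concentration_at(self, x, y):
        key = (int(x // self.cell_size), int(y // self.cell_size))
        return self.cells.get(key, 0.0)

def avoid_chemicals(cmap, x, y, threshold=0.5):
    """Return ((linear, angular), weight): steer away from the worst neighbour cell."""
    worst_value, worst_dir = 0.0, None
    for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
        c = cmap.concentration_at(x + dx, y + dy)
        if c > worst_value:
            worst_value, worst_dir = c, (dx, dy)
    if worst_value < threshold or worst_dir is None:
        return (0.0, 0.0), 0.0               # nothing dangerous nearby: stay passive
    # Turn towards the direction opposite to the contaminated cell.
    angular = math.atan2(-worst_dir[1], -worst_dir[0])
    weight = min(1.0, worst_value)           # higher concentration, stronger reaction
    return (0.2, angular), weight

cmap = LocalChemicalMap()
cmap.update(3.0, 2.0, 0.9)                   # contamination measured east of the robot
print(avoid_chemicals(cmap, 2.0, 2.0))       # proposes turning away, weight 0.9
```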
The ViewFinder case: What behaviors do we need?
• The robot is controlled using a behavior-based controller which sends steering commands.
• To generate these commands, the controller aims to fuse a number of objectives / tasks / requirements:
• General requirements:
  • The robot needs accurate positioning
  • The robot needs to build up a model (map) of its environment to reason with this data
• [Diagram: Odometry, GPS, GISMap and Camera/Framegrabber feed Position Estimation and Visual SLAM; all behavior outputs pass through Fuse Behaviors → Filter Commands → Steer Robudem]
• Robot Safety:
  • The robot needs to avoid areas with excessive heat
  • The robot needs to detect chemicals and avoid areas which are too contaminated
• [Diagram: the Chemical Sensor and Temperature sensor feed a Local ChemicalMap and a Local HeatMap, which drive the Avoid Chemicals and Avoid HotZones behaviors]
• Robot Safety:
  • The robot needs to avoid all previously detected obstacles stored in the environmental model (map)
• [Diagram: an AvoidObstacles UsingSLAM behavior is added, driven by the Visual SLAM map]
• Robot Safety:
  • The robot needs to detect obstacles using its sonar sensors and avoid these obstacles
• [Diagram: the Sonar sensor and an AvoidObstacles UsingSonar behavior are added]
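A minimal sketch of what an "AvoidObstaclesUsingSonar" behavior could look like, assuming each sonar reports a range and a fixed bearing; the safety distance and the repulsive steering rule are illustrative assumptions.

```python
# Hypothetical sketch of an AvoidObstaclesUsingSonar behavior: sum a small
# repulsive turn for every sonar that reports an obstacle closer than a
# safety distance. Sensor layout and thresholds are assumptions.
import math

def avoid_obstacles_using_sonar(readings, safe_dist=1.5):
    """readings: list of (range_m, bearing_rad) pairs, one per sonar."""
    turn, danger = 0.0, 0.0
    for rng, bearing in readings:
        if rng < safe_dist:
            closeness = (safe_dist - rng) / safe_dist        # 0 (far) .. 1 (touching)
            turn -= math.copysign(closeness, bearing)        # steer away from that side
            danger = max(danger, closeness)
    if danger == 0.0:
        return (0.0, 0.0), 0.0                # no nearby obstacle: behavior stays passive
    linear = 0.5 * (1.0 - danger)             # slow down as obstacles get closer
    return (linear, turn), danger

# One sonar on the right (-0.5 rad) sees an obstacle at 0.6 m: turn left and slow down.
print(avoid_obstacles_using_sonar([(3.0, 0.0), (0.6, -0.5), (2.5, 0.5)]))
```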
• Robot Safety:
  • The robot needs to detect obstacles using its stereo vision system and avoid these obstacles
• [Diagram: a Stereo Framegrabber and an AvoidObstacles UsingStereo behavior are added]
• Robot Goals:
  • The robot must be tele-operable
• [Diagram: the Joystick and an Obey Joystick behavior are added]
• Robot Goals:
  • The robot needs to detect chemicals and find the source of contamination (sketched below)
• [Diagram: a GoTo Chemicals behavior is added, driven by the Local ChemicalMap]
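The slides do not describe how "GoTo Chemicals" locates the source. One plausible, purely illustrative strategy is to climb the concentration gradient stored in the local chemical map; the sketch below assumes that strategy.

```python
# Hypothetical sketch of a GoToChemicals behavior: drive towards the
# neighbouring map cell with the highest measured concentration, i.e. a
# simple gradient-climbing strategy (an assumption, not the ViewFinder method).
import math

def goto_chemicals(concentration_at, x, y, detect_threshold=0.1):
    """concentration_at(x, y) -> float; returns ((linear, angular), weight)."""
    here = concentration_at(x, y)
    best_gain, best_dir = 0.0, None
    for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
        gain = concentration_at(x + dx, y + dy) - here
        if gain > best_gain:
            best_gain, best_dir = gain, (dx, dy)
    if here < detect_threshold and best_dir is None:
        return (0.0, 0.0), 0.0                 # nothing detected: behavior inactive
    if best_dir is None:
        return (0.0, 0.0), 1.0                 # local maximum: probable source, stop here
    heading = math.atan2(best_dir[1], best_dir[0])
    return (0.3, heading), min(1.0, here + best_gain)

def field(x, y):
    return max(0.0, 1.0 - 0.1 * abs(10 - x))   # toy field increasing towards x = 10

print(goto_chemicals(field, 4.0, 2.0))         # heads east (heading 0.0)
```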
• Robot Goals:
  • The robot should maximize its knowledge of the environment it is put in (sketched below)
• [Diagram: a Maximize TerrainKnowledge behavior is added, driven by the Visual SLAM map]
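How "Maximize TerrainKnowledge" chooses where to go is not detailed. A common family of solutions is frontier-style exploration; the sketch below assumes such a strategy and scores a few candidate headings by how many unknown map cells they lead towards. The candidate headings, lookahead and scoring are all assumptions.

```python
# Hypothetical sketch of a MaximizeTerrainKnowledge behavior: among a few
# candidate headings, prefer the one whose direction contains the most
# unknown cells in the current map (an assumed, frontier-like strategy).
import math

def maximize_terrain_knowledge(is_known, x, y, lookahead=5):
    """is_known(x, y) -> bool; returns ((linear, angular), weight)."""
    best_score, best_heading = -1, 0.0
    for heading in [0.0, math.pi / 2, math.pi, -math.pi / 2]:
        unknown = sum(
            not is_known(x + step * math.cos(heading), y + step * math.sin(heading))
            for step in range(1, lookahead + 1))
        if unknown > best_score:
            best_score, best_heading = unknown, heading
    if best_score == 0:
        return (0.0, 0.0), 0.0                 # everything nearby is mapped already
    return (0.4, best_heading), best_score / lookahead

# Toy map: only the region x < 3 has been mapped so far, so explore eastwards.
print(maximize_terrain_knowledge(lambda px, py: px < 3, 0.0, 0.0))
```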
• Robot Goals:
  • The robot needs to search for human victims
  • Humans are searched for in each of the 4 camera images
  • A "Search Humans" behavior directs the robot in the persons' direction (sketched below)
• [Diagram: 4 Person Detector modules, one per camera stream, feed the Search Humans behavior]
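A minimal sketch of how a "Search Humans" behavior could turn the outputs of the four person detectors into a steering proposal; the detection format (bearing plus confidence) and the thresholds are assumptions.

```python
# Hypothetical sketch of a SearchHumans behavior: each of the 4 person
# detectors reports the bearing of its detection plus a confidence; the
# behavior steers towards the most confident detection.
def search_humans(detections, min_confidence=0.5):
    """detections: list of (bearing_rad, confidence) from the 4 Person Detectors."""
    if not detections:
        return (0.0, 0.0), 0.0
    bearing, confidence = max(detections, key=lambda d: d[1])
    if confidence < min_confidence:
        return (0.0, 0.0), 0.0                  # no convincing detection yet
    # Drive slowly towards the detected person; weight grows with confidence.
    return (0.3, bearing), confidence

# Front camera (bearing 0.0) and right camera (-1.57) both report a person.
print(search_humans([(0.0, 0.4), (-1.57, 0.9)]))   # ((0.3, -1.57), 0.9)
```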
• Robot Goals:
  • The robot must be able to execute a user-defined trajectory
  • The user defines goals (Goal Assigner)
  • These goals are decomposed into a set of waypoints by a Global PathPlanner using the SLAM map and the robot position
  • A "GoToGoals" behavior directs the robot in the direction of the waypoints (sketched below)
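A minimal sketch of a "GoToGoals" behavior that follows the waypoint list produced by the Global PathPlanner; the waypoint-reached tolerance and the speeds are illustrative assumptions.

```python
# Hypothetical sketch of a GoToGoals behavior: head towards the next
# waypoint in the list produced by the Global PathPlanner and pop waypoints
# as they are reached. Tolerances and speeds are illustrative only.
import math

class GoToGoals:
    def __init__(self, waypoints, reached_tol=0.5):
        self.waypoints = list(waypoints)       # [(x, y), ...] from the path planner
        self.reached_tol = reached_tol

    def propose(self, robot_x, robot_y, robot_heading):
        while self.waypoints:
            wx, wy = self.waypoints[0]
            if math.hypot(wx - robot_x, wy - robot_y) < self.reached_tol:
                self.waypoints.pop(0)          # waypoint reached, move to the next one
                continue
            desired = math.atan2(wy - robot_y, wx - robot_x)
            turn = math.atan2(math.sin(desired - robot_heading),
                              math.cos(desired - robot_heading))  # wrap to [-pi, pi]
            return (0.5, turn), 1.0
        return (0.0, 0.0), 0.0                 # all waypoints done: behavior inactive

goals = GoToGoals([(2.0, 0.0), (2.0, 3.0)])
print(goals.propose(0.0, 0.0, 0.0))            # head straight for the first waypoint
```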
• Robot Goals:
  • In the event of a loss of network connection, the robot should be able to return to the base station
  • A "ConnectionChecker" constantly monitors the network connection
  • In the event of a network failure, a "ReturnToBase" behavior uses the map and the robot and base station positions to guide the robot back to the base (sketched below)
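A minimal sketch of how a connection checker could trigger the "ReturnToBase" behavior; the heartbeat interface, timeout and speeds are hypothetical.

```python
# Hypothetical sketch: a ConnectionChecker that declares the network lost
# after a timeout, and a ReturnToBase behavior that only becomes active in
# that case, steering towards the known base-station position.
import math, time

class ConnectionChecker:
    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):                        # call whenever a packet arrives
        self.last_heartbeat = time.monotonic()

    def connected(self):
        return time.monotonic() - self.last_heartbeat < self.timeout_s

def return_to_base(checker, robot_xy, base_xy):
    if checker.connected():
        return (0.0, 0.0), 0.0                  # link is fine: behavior stays inactive
    heading = math.atan2(base_xy[1] - robot_xy[1], base_xy[0] - robot_xy[0])
    return (0.5, heading), 1.0                  # link lost: head home at full weight

checker = ConnectionChecker(timeout_s=0.0)      # force an immediate "lost link" for the demo
print(return_to_base(checker, (10.0, 4.0), (0.0, 0.0)))
```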
One last requirement:
• A human operator must be able to intervene in this robot control process
• The human operator judges which tasks / objectives have greater priority and should be executed first
• This is achieved by semantically defining a set of tasks which are translated to different activity levels for the different behaviors (sketched below)
• [Diagram: a Task Assigner module sets the activity levels of all behaviors]
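A minimal sketch of how semantically defined tasks could be translated into activity levels for the individual behaviors; the task names follow the slides, but the numeric levels and the scaling rule are pure assumptions.

```python
# Hypothetical sketch of the Task Assigner: each semantic task selected by the
# human operator maps to a set of activity levels (0 = off, 1 = fully active)
# that scale the weights of the corresponding behaviors. The numbers are
# illustrative assumptions, not values from the ViewFinder project.
TASK_ACTIVITY_LEVELS = {
    "tele-operation": {"ObeyJoystick": 1.0, "AvoidObstaclesUsingSonar": 0.8,
                       "AvoidObstaclesUsingStereo": 0.8, "GoToGoals": 0.0},
    "explore":        {"MaximizeTerrainKnowledge": 1.0, "AvoidObstaclesUsingSonar": 1.0,
                       "AvoidHotZones": 1.0, "ObeyJoystick": 0.2},
    "find_source":    {"GoToChemicals": 1.0, "AvoidHotZones": 1.0,
                       "AvoidObstaclesUsingSonar": 1.0, "ObeyJoystick": 0.2},
    "search_victims": {"SearchHumans": 1.0, "AvoidObstaclesUsingStereo": 1.0,
                       "GoToGoals": 0.5, "ObeyJoystick": 0.2},
}

def apply_task(task, proposals):
    """Scale each behavior's weight by the activity level of the selected task."""
    levels = TASK_ACTIVITY_LEVELS[task]
    return {name: (cmd, weight * levels.get(name, 0.0))
            for name, (cmd, weight) in proposals.items()}

proposals = {"ObeyJoystick": ((0.5, 0.0), 1.0), "SearchHumans": ((0.3, -1.0), 0.9)}
print(apply_task("search_victims", proposals))
# ObeyJoystick weight drops to 0.2, SearchHumans keeps weight 0.9
```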
The Final Control Architecture:
• [Diagram: the complete behavior-based architecture. The sensors (Odometry, GPS, GISMap, Camera Framegrabbers, Stereo Framegrabber, Sonar, Chemical Sensor, Temperature, Joystick) feed the processing modules (Position Estimation, Visual SLAM, Local ChemicalMap, Local HeatMap, Person Detectors, Goal Assigner, Global PathPlanner, Connection Checker, Task Assigner), which drive the behaviors (AvoidObstacles UsingSLAM, AvoidObstacles UsingSonar, AvoidObstacles UsingStereo, Avoid Chemicals, Avoid HotZones, Obey Joystick, Search Humans, GoTo Chemicals, GoTo Goals, Maximize TerrainKnowledge, Return ToBase); their outputs are combined by Fuse Behaviors, passed through Filter Commands and sent to Steer Robudem]
Conclusions
• Using this modular behavior-based control framework, the robot can:
  • Be tele-operated by a remote user
  • Ensure its own safety by avoiding obstacles detected by its sensors (sonar, stereo, …)
  • Navigate autonomously in an unknown environment by mapping the surroundings
  • Detect chemical contamination
  • Navigate to pre-defined goal positions
  • Execute complex tasks like searching for human victims