Development and Implementation of a High-Level Command System and Compact User Interface for Nonholonomic Robots Hani M. Sallum Master's Thesis Defense May 4, 2005
Outline • Overview and Goals • Development • Control System • Data Analysis and Mapping • Graphical User Interface • Results
Overview This work details the design and development of a goal-based user interface for unmanned ground vehicles (UGVs) that is maximally simple to operate, yet conveys ample information (data and commands) between the operator and the UGV.
Other UGV Issues • Multi-person crews • Proprietary Operator Control Units (OCUs) for each UGV Is there a way to let local users control UGVs without the operational overhead currently required?
Why UA/GVs? • Over the last two decades there has been a dramatic increase in the complexity and availability of manufactured electronics. • As a result, the capital cost of robotic systems in general has decreased, making them more feasible to implement. Example: NASA's Mars Pathfinder/Sojourner system was built largely out of commercially available, off-the-shelf (OTS) parts (sensors, motors, radios, etc.)1. • Additionally, the capacity and functionality of devices such as PDAs and cellular phones have increased as well. 1. NASA, Mech. Eng. Magazine, Kodak
Motivation Q: Do custom OCUs need to be developed when commercial technology is evolving so rapidly? Considering the ubiquity of PDAs, smartphones, etc., is it possible to develop a method of using these devices as a form of common OCU?
Goals • Develop a control system for a UGV that automates low-level control tasks • Develop a method of rendering sensor data into maps of the UGV's environment • Design a GUI that runs on a commercially available PDA
Hardware: Robot iRobot B21R Mobile Research Robot (nonholonomic), equipped with a camera, sonar, and a laser rangefinder
Definition of Nonholonomic Unable to move independently in all possible degrees of freedom. Example: Cars have 3 degrees of freedom (x, y, θ), but cannot move in x or θ alone (the rolling constraint is written out below).
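As a concrete illustration (standard wheeled-robot kinematics, not taken from the slides), the no-side-slip rolling constraint couples the translational rates to the heading:

```latex
% No-side-slip constraint for a car-like robot at pose (x, y, \theta):
% the body-frame lateral velocity must vanish.
\dot{x}\sin\theta - \dot{y}\cos\theta = 0
% Equivalently: \dot{x} = v\cos\theta,\; \dot{y} = v\sin\theta,\; \dot{\theta} = \omega,
% i.e. two inputs (v, \omega) steering a 3-dimensional configuration space.
```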
Hardware: PDA Hewlett-Packard iPAQ • 240×320 touch-sensitive color screen • 802.11/Bluetooth antenna • Windows CE operating system
Navigation Control System Two aspects of the navigation process: • Target Approach • Obstacle Avoidance Multimodal controller: separate control laws depending on the desired operation of the robot
Proven Method • Schema architecture [Chang et al.] • Discrete shifts between control modes • Straightforward to implement • Drawback: “chattering” between modes
Proposed Method • Fuzzy control [Wang, Tanaka, Griffin] • Gradual shifts between control modes • More complicated controller, but a smoother trajectory through state-space
Fuzzy Controller [Block diagram] Sensor data feeds the Target Approach and Obstacle Avoidance modes, each producing a {K, ω} pair; fuzzy blending combines them into the fuzzy control signals {v, ω}.
Target Approach Control of Turning Velocity • Final orientation unconstrained • Implement a proportional controller driving the robot heading to a setpoint equal to the current bearing of the target (i.e., θDEV → 0) • Produce ωAPP • Saturate the controller at the maximum allowable turning speed • Use a high proportional gain to approximate an arc-line path
Target Approach Control of Forward Velocity • Final position close to target • Implement a proportional controller to scale the forward velocity based on the robot's distance to the target coordinates (i.e., DTAR → 0) • Produce KAPP • Saturate the controller at KAPP = 1 (scale to the maximum allowable forward speed) • A minimal controller sketch follows
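A minimal sketch of the two target-approach controllers in Python; the gains, limits, and names (target_approach, k_turn, w_max) are illustrative assumptions, not values from the thesis:

```python
def target_approach(theta_dev, d_tar, k_turn=2.0, k_fwd=0.5, w_max=1.0):
    """Proportional target-approach controllers.

    theta_dev: bearing of the target relative to the robot heading [rad]
    d_tar:     distance from the robot to the target [m]
    Returns (w_app, k_app): saturated turning command and forward scale factor.
    """
    # Turning: drive theta_dev -> 0; saturate at the max turning speed.
    # A high gain turns the robot toward the target quickly,
    # approximating an arc-line path.
    w_app = max(-w_max, min(w_max, k_turn * theta_dev))

    # Forward: scale velocity with distance to target (D_TAR -> 0),
    # saturated at 1 so v never exceeds the maximum forward speed.
    k_app = min(1.0, k_fwd * d_tar)
    return w_app, k_app
```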
Obstacle Avoidance Control of Turning Velocity • Implement a proportional controller driving the robot heading to a setpoint 90° away from the nearest obstacle (i.e., θOBS → ±90°) • Produce ωAVOID
Obstacle Avoidance Control of Forward Velocity • Implement a proportional controller to reduce (scale down) the forward velocity when nearing an obstacle • Produce KAVOID
Obstacle Avoidance Forward Control • The inner threshold is elliptical rather than circular, to keep the robot from becoming stuck against obstacles (one plausible form is sketched below)
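The slide's threshold expression is not reproduced in this text; one plausible form for an elliptical inner threshold, assuming semi-axes a along the direction of travel and b laterally, is:

```latex
% Hypothetical elliptical inner threshold: an obstacle at body-frame
% bearing \phi triggers forward-velocity reduction when its range r satisfies
r < T(\phi), \qquad
T(\phi) = \frac{ab}{\sqrt{\,b^{2}\cos^{2}\phi + a^{2}\sin^{2}\phi\,}}
% a: semi-axis along the direction of travel, b: lateral semi-axis (a > b).
```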
Final Control Law Turning Control: • Blend the target approach and obstacle avoidance control signals using a weighted sum: ωFUZZY = WAPP ωAPP + WAVOID ωAVOID • Determine the weights using membership functions based on the robot's distance to the nearest obstacle
Final Control Law Forward Control: • Blend the target approach and obstacle avoidance control signals by multiplying the maximum forward velocity by the scaling factors produced by each control mode: vFUZZY = KAPP KAVOID vMAX (a blending sketch follows)
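A minimal sketch of the blending step in Python, assuming a simple linear membership function of obstacle distance; the names fuzzy_blend, d_near, d_far and all numeric values are illustrative assumptions, not the thesis's membership functions:

```python
def fuzzy_blend(w_app, w_avoid, k_app, k_avoid, d_obs,
                v_max=0.8, d_near=0.5, d_far=2.0):
    """Blend target-approach and obstacle-avoidance commands.

    d_obs: distance to the nearest obstacle [m].
    Returns (v, w): forward and turning velocity commands.
    """
    # Membership of "obstacle is near": ramps linearly from 1 at d_near
    # to 0 at d_far (a simple trapezoidal membership function).
    m_near = min(1.0, max(0.0, (d_far - d_obs) / (d_far - d_near)))
    w_avoid_weight = m_near
    w_app_weight = 1.0 - m_near

    # Turning: weighted sum of the two mode outputs (omega_FUZZY).
    w = w_app_weight * w_app + w_avoid_weight * w_avoid

    # Forward: v_FUZZY = K_APP * K_AVOID * v_MAX.
    v = k_app * k_avoid * v_max
    return v, w
```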
Outline • Overview and Goals • Development • Control System • Data Analysis and Mapping • Graphical User Interface • Results
Data Analysis and Mapping • Render data from the laser rangefinder into significant features of the environment: “Fiducial Points” e.g. corners, ends of walls, etc. • Use these fiducial points to generate primitive geometries (line segments) which represent the robot’s environment.
Finding Fiducial Points: The Distillation Process RAW DATA → Object Detection → Segment Detection → Line Fitting → Finding Intersections → Categorizing Points → FIDUCIAL POINTS
Why Find Fiducial Points? [Figure: raw laser rangefinder data]
Object Detection Range vs. Bearing (used by Crowley [1985])
Segment Detection Recursive line splitting: method used by Crowley [1985], B.K. Ghosh et al. [2000] (a sketch of the algorithm follows)
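A compact sketch of recursive line splitting, assuming scan points arrive as (x, y) tuples in bearing order; this version uses a fixed threshold, whereas the thesis proposes the distance-dependent threshold function on the next slide:

```python
import math

def split_segments(points, threshold=0.05):
    """Recursively split an ordered run of scan points into line segments.

    points: list of (x, y) tuples in scan order.
    Returns a list of (start_index, end_index) pairs, one per segment.
    """
    def max_deviation(pts):
        # Perpendicular distance of each interior point from the chord
        # joining the first and last points; return (max distance, index).
        (x0, y0), (x1, y1) = pts[0], pts[-1]
        chord = math.hypot(x1 - x0, y1 - y0) or 1e-12
        best_d, best_i = 0.0, 0
        for i, (x, y) in enumerate(pts[1:-1], start=1):
            d = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / chord
            if d > best_d:
                best_d, best_i = d, i
        return best_d, best_i

    def recurse(lo, hi, out):
        if hi - lo < 2:
            out.append((lo, hi))
            return
        d, i = max_deviation(points[lo:hi + 1])
        if d > threshold:
            # Split at the point of maximum deviation and recurse.
            recurse(lo, lo + i, out)
            recurse(lo + i, hi, out)
        else:
            out.append((lo, hi))

    segments = []
    recurse(0, len(points) - 1, segments)
    return segments
```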
Segment Detection Proposed threshold function • CREL: relative threshold • CABS: absolute maximum threshold
Line Fitting Use perpendicular-offset least-squares line fitting (closed-form sketch below)
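Perpendicular-offset (total) least-squares fitting has a closed-form solution in terms of the points' second moments; a minimal sketch, with the line returned in normal form:

```python
import math

def fit_line_perpendicular(points):
    """Fit a line minimizing perpendicular (not vertical) offsets.

    points: list of (x, y) tuples.
    Returns (angle, distance): the line x*cos(angle) + y*sin(angle) = distance,
    where angle is the direction of the line normal.
    """
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    # Second moments about the centroid.
    sxx = sum((x - cx) ** 2 for x, _ in points)
    syy = sum((y - cy) ** 2 for _, y in points)
    sxy = sum((x - cx) * (y - cy) for x, y in points)
    # Classic closed form for the normal angle of the total-least-squares line:
    # angle = 0.5 * atan2(-2*Sxy, Syy - Sxx).
    angle = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)
    distance = cx * math.cos(angle) + cy * math.sin(angle)
    return angle, distance
```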
Categorization • Each fiducial point is either interior to an object, or at the end of an object. • Fiducial points at the ends of objects are either occluded or unoccluded.
Distillation: Finding Fiducial Points [Figure: distillation example]
Mapping • Fiducial points provide a clear interpretation of what is currently visible to the robot • They provide a way to add qualitative information about previously observed data (from the global map) to local maps
Creating a Global Map Global Map: Occupancy Evidence Grid [Martin, Moravec, 1996] based on laser rangefinder data collection (a minimal update sketch follows)
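A minimal log-odds sketch of an occupancy evidence grid update; the cell size, evidence increments, and class name are assumptions for illustration, not parameters from the thesis:

```python
import math

class OccupancyGrid:
    """Minimal log-odds occupancy evidence grid."""

    def __init__(self, width, height, cell_size=0.1):
        self.cell_size = cell_size
        self.log_odds = [[0.0] * width for _ in range(height)]

    def update_cell(self, row, col, hit, l_occ=0.85, l_free=-0.4):
        # Accumulate evidence: positive log-odds for a laser return in
        # the cell, negative for cells the beam passed through.
        self.log_odds[row][col] += l_occ if hit else l_free

    def probability(self, row, col):
        # Convert log-odds back to an occupancy probability in [0, 1].
        return 1.0 - 1.0 / (1.0 + math.exp(self.log_odds[row][col]))
```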
Local Mapping Map Image: • Sample a section of the global map for qualitative a priori information about the local area • Overlay map primitives
Local Mapping Example: a local map with and without a priori information.
Vision Mapping Vision Map: • Transform map primitives to the perspective frame and overlay them on a camera image of the local area (projection sketch below)
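One standard way to carry map primitives into the camera's perspective frame is a pinhole projection of ground-plane points; this sketch, with assumed focal length, image center, and camera mounting, is illustrative rather than the thesis's actual calibration:

```python
import math

def project_point(px, py, cam_x, cam_y, cam_theta, cam_height,
                  focal_px=500.0, cx=120.0, cy=160.0):
    """Project a ground-plane map point into a pinhole camera image.

    (px, py): map point on the floor; camera at (cam_x, cam_y), heading
    cam_theta, mounted cam_height above the floor, optical axis horizontal.
    Returns (u, v) pixel coordinates, or None if the point is behind the camera.
    """
    # Transform into the camera frame: forward = +z, right = +x, down = +y.
    dx, dy = px - cam_x, py - cam_y
    z = dx * math.cos(cam_theta) + dy * math.sin(cam_theta)  # depth ahead
    x = dx * math.sin(cam_theta) - dy * math.cos(cam_theta)  # rightward offset
    if z <= 0:
        return None
    u = cx + focal_px * x / z
    v = cy + focal_px * cam_height / z  # floor points appear below the horizon
    return u, v
```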
Vision Mapping Find common geometries for defining vertical and horizontal sight lines.
GUI • Serve web content from the robot to the iPAQ • Use image-based linking (an HTML standard) to make map images interactive on the iPAQ • Use the web content to call CGI scripts onboard the robot, which run its navigation programs (see the sketch below)
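A minimal sketch of that pattern: a server-side image map sends the tapped pixel to a CGI script, which converts it into a navigation target. The script name goto.cgi, the map scale and origin, and send_navigation_goal are hypothetical placeholders:

```python
#!/usr/bin/env python
# goto.cgi (hypothetical name): runs on the robot's onboard web server.
# With a server-side image map,
#   <a href="/cgi-bin/goto.cgi"><img src="map.png" ismap></a>
# the browser appends the clicked pixel to the request URL as "?x,y".
import os

MAP_SCALE = 0.05         # assumed meters per pixel
MAP_ORIGIN = (6.0, 8.0)  # assumed map-frame position of pixel (0, 0), meters

def send_navigation_goal(x, y):
    # Placeholder: hand the target to the navigation process
    # (e.g. via a socket or command queue) in the real system.
    pass

# QUERY_STRING arrives as "col,row" for an ISMAP click.
query = os.environ.get("QUERY_STRING", "0,0")
px, py = (int(v) for v in query.split(","))

# Convert the tapped pixel into map coordinates (image y grows downward).
target_x = MAP_ORIGIN[0] + px * MAP_SCALE
target_y = MAP_ORIGIN[1] - py * MAP_SCALE
send_navigation_goal(target_x, target_y)

# Reply with a page the PDA browser can render.
print("Content-Type: text/html")
print()
print(f"<html><body>Heading to ({target_x:.1f}, {target_y:.1f}) m. "
      f"<a href='/map.html'>Back to map</a></body></html>")
```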
GUI Main Map/Command Screen
GUI Close-Range Map/Command Screen Long-Range Map/Command Screen
GUI Vision Map/Command Screen Rotation Map/Command Screen
Results • GUI: Main Map/Command Screen
Results • GUI: Rotation Map/Command Screen
Results • GUI: Vision/Command Screen