Sensor-Based Mapping and Sensor Fusion By Rashmi Patel
Overview • Bayes Rule • Occupancy Grids [Alberto Elfes, 1987] • Vector Field Histograms [J. Borenstein, 1991] • Sensor Fusion [David Conner, 2000]
Bayes Rule • Posterior (conditional) probability: the probability assigned to an event given some evidence • Conditional probability example, flipping fair coins: • P(H) = 0.5, P(H | H) = 1 • P(HH) = 0.25, P(HH | first flip H) = 0.5
Bayes Rule continued • Bayes' rule: P(A|B) = P(A) P(B|A) / P(B) • The useful thing about Bayes' rule is that it lets you "turn around" conditional probabilities • Example: P(Cancer) = 0.1, P(Smoker) = 0.5, P(S|C) = 0.8. Then P(C|S) = P(S|C) P(C) / P(S) = 0.8 × 0.1 / 0.5 = 0.16
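The slide's cancer/smoker example can be checked with a one-line function (a sketch; the probability values are the ones given on the slide):

```python
def bayes(p_a, p_b_given_a, p_b):
    """Posterior P(A|B) from prior P(A), likelihood P(B|A), and evidence P(B)."""
    return p_a * p_b_given_a / p_b

# Slide example: P(Cancer) = 0.1, P(Smoker) = 0.5, P(S|C) = 0.8
p_c_given_s = bayes(0.1, 0.8, 0.5)   # ≈ 0.16
```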
Occupancy Grids [Elfes] • In the mid 80's Elfes started implementing cheap ultrasonic transducers on an autonomous robot • Because of the intrinsic limitations of any sonar, it is important to compose a coherent world model using information gained from multiple readings
Occupancy Grids Defined • The grid stores the probability that cell Ci = cell(x,y) is occupied: O(Ci) = P[s(Ci) = OCC] • Phases of creating a grid: • Collect readings, generating O(Ci) • Update the occupancy grid, creating a map • Match and combine maps from multiple locations
Occupancy Grids Sonar Pattern • 24 transducers in a ring, spaced 15 degrees apart • The sonars can detect from 0.9 to 35 ft • Accuracy is 0.1 ft • Main sensitivity is within a 30° cone • −3 dB sensitivity (half response) at 15° from the middle [Figure: beam pattern]
Occupancy Grids Sonar Model • Probability profile: a Gaussian p.d.f. is typically used, though the model can vary p(r | z, θ) = 1/(2π σr σθ) · exp[ −(r − z)²/(2σr²) − θ²/(2σθ²) ] • where r is the sensor reading, z is the actual distance, θ is the angle off the acoustic axis, and σr, σθ are the range and angular standard deviations [Figure: range-measurement profile, with a "somewhere occupied" band near the reading and a "probably empty" region between Rmin and the reading]
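The Gaussian sensor model above can be sketched directly; the standard deviations here (0.1 ft range accuracy, a half-beam of 7.5°) are illustrative assumptions drawn from the sonar specs on the previous slide, not values from Elfes's paper:

```python
import math

def sonar_model(r, z, theta, sigma_r=0.1, sigma_theta=math.radians(7.5)):
    """2-D Gaussian sensor model: likelihood of reading r at angle theta
    (radians) off the acoustic axis, given the true range z."""
    return (1.0 / (2 * math.pi * sigma_r * sigma_theta)
            * math.exp(-((r - z) ** 2) / (2 * sigma_r ** 2)
                       - (theta ** 2) / (2 * sigma_theta ** 2)))
```

The likelihood peaks when the reading matches the true range on the beam axis and falls off with range error and off-axis angle.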
Occupancy Grids Notation • Definitions: • Ci is a cell in the occupancy grid • s(Ci) is the state of cell Ci (i.e. the value of that cell) • OCC means OCCUPIED, whose value is 1
Occupancy Grids Bayes Rule • Applying Bayes' rule to a single cell Ci with sensor reading r: P[s(Ci) = OCC | r] = P[r | s(Ci) = OCC] · P[s(Ci) = OCC] / p(r) • where p(r) = Σ p[r | s(Ci)] · P[s(Ci)], summed over the states of the cells that intersect the sensor model • Then apply this to all the cells, creating a local map for each sensor
Occupancy Grids Bayes Rule Implemented P[s(Ci) = OCC | r] = P[r | s(Ci) = OCC] · P[s(Ci) = OCC] / Σs(Ci) p[r | s(Ci)] · P[s(Ci)]  (likelihood × prior, then normalize) • P[s(Ci) = OCC | r] is the probability that the cell is occupied given sensor reading r • P[r | s(Ci) = OCC] is the probability of sensor reading r given the state of cell Ci (found using the sensor model) • P[s(Ci) = OCC] is the prior probability that the value of cell Ci is 1, i.e. s(Ci) = OCC (taken from the occupancy grid)
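The per-cell update, with normalization over the cell's two states, can be sketched as a small function (a sketch, not Elfes's implementation; the likelihood values in the example are illustrative):

```python
def update_cell(prior_occ, lik_occ, lik_emp):
    """Bayes update for one cell.

    prior_occ: P[s(Ci) = OCC], from the occupancy grid
    lik_occ:   p[r | s(Ci) = OCC], from the sensor model
    lik_emp:   p[r | s(Ci) = EMPTY], from the sensor model
    """
    prior_emp = 1.0 - prior_occ
    evidence = lik_occ * prior_occ + lik_emp * prior_emp  # p(r), normalizer
    return lik_occ * prior_occ / evidence

# Uninformed prior 0.5; reading 4x more likely if the cell is occupied
posterior = update_cell(0.5, 0.8, 0.2)   # ≈ 0.8
```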
Occupancy Grids Implementation Ex • Let the red oval be the "somewhere occupied" region, the yellow blocks the cells in the sonar sector, and the black lines the boundaries of that sector • p(r) is the sum over all of those yellow cells, using the sonar model to compute each cell's contribution
Occupancy Grids Multiple Sonars Combining readings from multiple sonars: • The grid is updated sequentially for t sensor readings {r}t = {r1, …, rt} • To update for a new sensor reading rt+1: P[s(Ci) = OCC | rt+1] = P[rt+1 | s(Ci) = OCC] · P[s(Ci) = OCC | {r}t] / Σs(Ci) p[rt+1 | s(Ci)] · P[s(Ci) | {r}t]
Occupancy Grids Equations P[s(Ci) = OCC | rt+1] = P[rt+1 | s(Ci) = OCC] · P[s(Ci) = OCC | {r}t] / Σs(Ci) p[rt+1 | s(Ci)] · P[s(Ci) | {r}t] • P[s(Ci) = OCC | rt+1] is the probability that the cell is occupied given sensor reading rt+1 • P[rt+1 | s(Ci) = OCC] is the probability of reading rt+1 given the state of cell Ci (found using the sensor model) • P[s(Ci) = OCC | {r}t] is the probability that the value of cell Ci is 1, given all previous readings (taken from the occupancy grid)
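The sequential update is just the single-cell Bayes rule applied repeatedly, with each posterior becoming the next prior. A minimal sketch (the (lik_occ, lik_emp) pairs stand in for sensor-model evaluations and are assumed inputs):

```python
def fuse(prior, readings):
    """Fold a sequence of sonar readings into one cell's occupancy estimate.

    Each reading is a (lik_occ, lik_emp) pair: p[r | OCC] and p[r | EMPTY]
    from the sensor model for that reading.
    """
    p = prior
    for lik_occ, lik_emp in readings:
        # posterior becomes the prior for the next reading
        p = lik_occ * p / (lik_occ * p + lik_emp * (1.0 - p))
    return p
```

Repeated consistent evidence drives the estimate toward certainty, which is why multiple noisy sonars yield a usable map.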
Occupancy Grids Multiple Maps Matching multiple maps • Each new map must be integrated with the existing maps built from past sensor readings • The maps are integrated by finding the rotation and translation that give the best correlation in the overlapping areas
Occupancy Grids Matching Maps Ex Example 1: a simple translation of maps • With the center of the robot at (2,2), Map 2 is translated into Map 1's frame and the two are combined into a new map • After translating, the overlapping cells are merged with P(cell3) = P(cell1) + P(cell2) − P(cell1)·P(cell2)
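The cell-wise combination rule can be sketched with NumPy (a sketch; the two arrays are assumed to be already registered, i.e. translated into a common frame):

```python
import numpy as np

def combine_maps(m1, m2):
    """Combine two registered occupancy-probability maps cell by cell:
    P3 = P1 + P2 - P1*P2, the probability that either map marks the
    cell occupied (treating the maps as independent evidence)."""
    return m1 + m2 - m1 * m2
```

Note that combining a cell with a zero-probability cell leaves it unchanged, and two moderately confident cells reinforce each other.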
Occupancy Grids vs. Certainty Grids • Occupancy grids and certainty grids use essentially the same method to: • Collect readings to generate the probability occupied (and, for certainty grids, the probability empty) • Create a grid from different sonars • Match maps to register readings from other locations • The difference is that occupancy grids use conditional probability to determine the probability occupied, while certainty grids use simpler mathematical models
Occupancy Grids vs. Certainty Grids • Both use a p.d.f. for the sonar model • The major difference is in finding the probability that a cell is occupied: • First Pempty is computed for a cell • Then Poccupied is computed using Pocc = 1 − Pemp • Then Pocc is normalized over the sonar beam and combined with the cell's value from other sonar readings
Vector Field Histograms [Borenstein] • The VFH allows fast, continuous control of a mobile vehicle • Tested on CARMEL using 24 ultrasonic sensors placed in a ring around the robot • Scan times range from 100 to 500 ms depending on the level of safety wanted
Vector Field Histograms Notation • The VFH uses a two-dimensional Cartesian histogram grid similar to certainty grids [Elfes] • Definitions: • CVmax = 15, CVmin = 0 (certainty-value bounds) • d is the distance returned by the sonar • Increment value is 3, decrement value is 1 • VCP is the vehicle center point • Obstacle vector: a vector pointing from a cell to the VCP
Vector Field Histograms Histogram Grid • The histogram grid is incremented differently from the certainty grid • The only cell incremented is the cell at distance d lying on the acoustic axis of the sonar • Similarly, only the cells on the acoustic axis that are closer than distance d are decremented
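This update rule can be sketched as follows; the grid layout (row-major, one unit per cell), the robot pose arguments, and the clamping to the CVmax/CVmin bounds from the notation slide are illustrative assumptions, not Borenstein's exact implementation:

```python
import math

def update_histogram_grid(grid, x0, y0, heading, d,
                          inc=3, dec=1, cv_max=15, cv_min=0):
    """Walk cell by cell along the sonar's acoustic axis from (x0, y0).
    Increment only the cell at range d; decrement every closer axis cell.
    heading is in radians; cells are 1 unit square."""
    steps = int(d)
    for k in range(1, steps + 1):
        x = int(round(x0 + k * math.cos(heading)))
        y = int(round(y0 + k * math.sin(heading)))
        if not (0 <= y < len(grid) and 0 <= x < len(grid[0])):
            continue
        if k == steps:   # the cell at the measured range: likely occupied
            grid[y][x] = min(cv_max, grid[y][x] + inc)
        else:            # cells the beam passed through: likely empty
            grid[y][x] = max(cv_min, grid[y][x] - dec)
```

The asymmetric increment/decrement (+3 / -1) makes the grid quick to mark obstacles but slow to erase them.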
Vector Field Histograms Polar Histogram • Next the 2-D histogram grid is converted into a 1-D structure called the polar histogram • The polar histogram, H, has n angular sectors, each of width α (so n = 360°/α)
Vector Field Histograms H Mapping • In order to generate H, we must map every cell in the histogram grid into H
Vector Field Histograms Discrete H Grid • Now that the obstacle vectors for every cell have been computed, we must find the magnitude of each sector in H
Vector Field Histograms Threshold • Once the polar obstacle densities have been computed, H can be thresholded to determine where objects are located so that they can be avoided • The choice of this threshold is important: choose too high a threshold and you may come too close to an object; too low and you may lose some valid paths
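The mapping, magnitude, and threshold steps above can be sketched together. The per-cell magnitude used here, m = c² · (a − b·d), is a VFH-style form, but the constants a, b, and the threshold tau are illustrative assumptions, not Borenstein's published values:

```python
import math

def polar_histogram(grid, vcp, n_sectors=72, a=10.0, b=1.0, tau=5.0):
    """Map each nonzero cell of a 2-D histogram grid into a 1-D polar
    histogram H around the vehicle center point (VCP), then threshold."""
    h = [0.0] * n_sectors
    cx, cy = vcp
    sector_width = 2 * math.pi / n_sectors
    for y, row in enumerate(grid):
        for x, c in enumerate(row):
            if c == 0 or (x, y) == (cx, cy):
                continue
            d = math.hypot(x - cx, y - cy)           # distance to VCP
            beta = math.atan2(y - cy, x - cx) % (2 * math.pi)  # direction
            k = int(beta / sector_width) % n_sectors
            h[k] += c ** 2 * max(a - b * d, 0.0)     # obstacle-vector magnitude
    blocked = [m > tau for m in h]                   # thresholded sectors
    return h, blocked
```

Sectors marked blocked are excluded when choosing a steering direction; the rest are candidate paths.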
Sensor Fusion [D. Conner] David C. Conner, PhD student. Presentation on his thesis and the following paper: "Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle"
Sensor Fusion Navigator • The 2 front wheels are driven; the third, rear wheel is a caster • 2 separate computer systems: a PC handles sensor fusion and a PLC handles motor control • 180-degree laser rangefinder with 0.5° resolution • 2 color CCD cameras [Figure: Navigator, a 3-wheeled differentially driven vehicle]
Cameras and Frame Grabbers • Because the cameras are not parallel to the ground, each image must be transformed to correctly represent the ground plane • The correction is done using the Intel Image Processing Library (IPL)
Cameras and Frame Grabbers • Since there are two cameras, the two images must be combined • The images are transformed into vehicle coordinates and combined using IPL functions
Image Conversion Once a picture is captured: • It is converted to gray scale • It is blurred using a Gaussian convolution mask [Figure: example images]
Image Conversion continued • The image is then thresholded to limit the amount of data in it • The threshold value is chosen to be above the norm of the intensities in the gray-scale histogram • The resulting image is then pixelated for storage in a grid
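The grayscale/blur/threshold/pixelate pipeline can be approximated in NumPy (a sketch of the described steps, not Conner's IPL implementation; the sigma, block size, and mean-based threshold are illustrative assumptions):

```python
import numpy as np

def preprocess(img, sigma=1.0, thresh=None, block=4):
    """Grayscale -> separable Gaussian blur -> threshold -> pixelate
    into a coarse grid of block x block cell averages."""
    gray = img.mean(axis=2) if img.ndim == 3 else img.astype(float)
    # build a 1-D Gaussian kernel and apply it along both axes
    r = int(3 * sigma)
    xs = np.arange(-r, r + 1)
    k = np.exp(-xs ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    blur = np.apply_along_axis(lambda m: np.convolve(m, k, 'same'), 0, gray)
    blur = np.apply_along_axis(lambda m: np.convolve(m, k, 'same'), 1, blur)
    if thresh is None:
        thresh = blur.mean()           # assumed: threshold above the norm
    binary = (blur > thresh).astype(float)
    # pixelate: average each block x block patch into one grid cell
    h, w = binary.shape
    cells = binary[:h - h % block, :w - w % block]
    return cells.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
```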
Laser Range Finder • The SICK LMS-200 laser rangefinder returns 361 data points over a 180-degree arc, at 0.5-degree resolution • Values beyond a certain range are ignored
Vector Field Histograms • VFHs are convenient because they let us easily combine the camera data and the laser-rangefinder data to determine the most accessible regions • Several types of polar obstacle density (POD) functions can be used (linear, quadratic, exponential) • POD = KC(a − b·d)
POD Values - Laser Rangefinder • The POD values for the laser are determined by: • Using the linear function shown above to transform the laser data into POD values • Then, for every two-degree arc, taking the maximum of the POD values in that arc as the final value
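These two steps, the linear POD function POD = KC(a − b·d) followed by a 2-degree max-pooling, can be sketched as follows; the constants a, b, and KC are illustrative assumptions, as are the 0.5° input spacing and the clamping of negative PODs to zero:

```python
def laser_pods(ranges, a=10.0, b=1.0, kc=1.0, step_deg=0.5, sector_deg=2.0):
    """Convert laser range readings (one per 0.5-degree step) into
    per-sector POD values: linear POD, then max over each 2-degree arc."""
    per_sector = int(sector_deg / step_deg)          # readings per arc
    pods = [max(kc * (a - b * d), 0.0) for d in ranges]
    return [max(pods[i:i + per_sector])
            for i in range(0, len(pods), per_sector)]
```

Taking the max within each arc keeps the sector conservative: one close return is enough to flag the whole arc.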
POD Values - Images • POD values for the image are pre-calculated and stored in a grid at startup • The pre-calculated values are multiplied by the pixelated image, also stored in a grid (the overlapping cells are multiplied) • For every 2-degree arc, the cell with the highest POD value is chosen as the value of that arc
Combining VFHs • The two VFHs are then combined by taking the max POD for each sector • The max POD is chosen because it represents the closest object
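The per-sector max combination described above is a one-liner (assuming both histograms use the same sector layout):

```python
def combine_vfh(pod_laser, pod_image):
    """Fuse two polar histograms sector by sector, keeping the larger
    POD, i.e. whichever sensor reports the closer obstacle."""
    return [max(a, b) for a, b in zip(pod_laser, pod_image)]
```

This makes the fused histogram conservative: a sector is open only if both sensors agree it is open.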
Bibliography • Elfes, A. "Occupancy Grids: A Stochastic Spatial Representation for Active Robot Perception", July 1990 • Elfes, A. "Sonar-Based Real-World Mapping and Navigation", June 1987 • Borenstein, J. "The Vector Field Histogram - Fast Obstacle Avoidance for Mobile Robots", June 1991 • Conner, D. "Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle"