Explore methods for combining observation data from multiple robots to enhance accuracy and reduce uncertainty in object position estimation. Learn about sensor fusion, Gaussian representations, matrix multiplication, and calibration techniques. Discover how collaborative localization can improve efficiency and robustness in various applications.
Collaborative Observation and Localization: Stuff to know for the final project
Motivation • More complete world view • Reduce uncertainty • Increase accuracy • More robust • Amortize the cost of expensive sensors
Collaborative Observation • Distributed Sensor Fusion for Object Position Estimation by Multi-Robot Systems, by Ashley W. Stroupe, Martin C. Martin, and Tucker Balch
Goal • Define a method for combining the data of two observers into a single, more accurate estimate • Observations are instantaneous, taken from different vantage points • Be efficient enough to work in a fast-paced environment
Related Work • Little prior work on merging concurrent data • Some approaches fill in unsensed areas • Others merge sequential observations for Kalman filter tracking and occupancy grid updates • Usually based on discretizing the world • Use historical data
Representation • 2D Gaussian • Mean = center location of the object • Standard deviation = uncertainty
Combining • If the observations are assumed independent, the distributions can simply be multiplied • How to multiply?
Combining • A Gaussian can be represented in matrix (covariance) form • Use matrix multiplication
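For reference (my notation, not the slide's): the product of two Gaussian estimates N(μ1, Σ1) and N(μ2, Σ2) is itself Gaussian, with the standard merged parameters

$$ \Sigma = \left(\Sigma_1^{-1} + \Sigma_2^{-1}\right)^{-1}, \qquad \mu = \Sigma \left(\Sigma_1^{-1}\mu_1 + \Sigma_2^{-1}\mu_2\right) $$

so the merged covariance is always at least as tight as either input.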
Problem • The Gaussians are easily defined relative to each robot • To combine them, they must be in a global coordinate system • How to convert to canonical form?
Algorithm • Convert to local matrix form • Rotate to canonical form • Multiply with the other observation • Extract the angle • Rotate back to local form • Extract the mean center and noise (sketched in code below)
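A minimal Python sketch of this pipeline (my own illustration using numpy, not the paper's code; the function names and the product-of-Gaussians identity above are assumptions):

import numpy as np

def covariance_from_axes(sigma_major, sigma_minor, angle):
    """Build a shared-frame 2x2 covariance from canonical std devs and an orientation."""
    R = np.array([[np.cos(angle), -np.sin(angle)],
                  [np.sin(angle),  np.cos(angle)]])
    C_canonical = np.diag([sigma_major**2, sigma_minor**2])  # canonical (diagonal) form
    return R @ C_canonical @ R.T                             # rotate into the global frame

def canonical_form(C):
    """Recover major/minor std devs and orientation (the 'extract angle' step)."""
    eigvals, eigvecs = np.linalg.eigh(C)                     # eigenvalues in ascending order
    angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])         # direction of the major axis
    return np.sqrt(eigvals[1]), np.sqrt(eigvals[0]), angle

def merge_observations(mu1, C1, mu2, C2):
    """Fuse two independent Gaussian observations of the same object."""
    K = C1 @ np.linalg.inv(C1 + C2)
    return mu1 + K @ (mu2 - mu1), C1 - K @ C1                # merged mean and covariance

# Two robots observe the same object from different vantage points:
mu_a, C_a = np.array([2.0, 1.0]), covariance_from_axes(0.30, 0.10, np.pi / 4)
mu_b, C_b = np.array([2.1, 0.9]), covariance_from_axes(0.25, 0.08, -np.pi / 6)
mu, C = merge_observations(mu_a, C_a, mu_b, C_b)
print(mu, canonical_form(C))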
Basic Assumptions • Sensor error is independent and normally distributed • Robots are assumed perfectly localized • Coordinate frames are coincident
Experiment • Three robots • 1.5 meters from the global origin • Placed on the global axes
Experimental Assumptions • Observations are assumed simultaneous • Objects are unique • Objects lie on the ground plane
Experimental Hardware • Cye robots • Trailer and drive sections • Onboard motor processor • Image processing on a Pentium 266 • Communication by wireless Ethernet
Error Calibration • Camera calibration • Fish-eye camera aberration • Experiments with known locations • Results used to find the Gaussian parameters
Results • Merged results are not more accurate at every single point • But they are more consistent
Test Application • Location and retrieval of unseen objects • One robot can see the target but cannot reach it • The other robot cannot initially see the target • They must communicate to retrieve the target • The robots retrieve the target. There is much rejoicing.
Test Application • Blind robot tracking • Three robots track a ball • When one is blinded by a box, it is still able to track the ball • Audiences everywhere are dumbfounded
Test Application • Robot soccer • RoboCup soccer • Ignore widely conflicting observations • Ignore localization uncertainty • Result: • Used as a starting point when the ball's location is lost • Robots are able to find the ball faster
Conclusion • Use Gaussian distributions, based on Bayes' rule and Kalman filter theory, to fuse observations • Applicable to any sensor • Need to resolve coordinate frame differences • Need to resolve uncertain localization • Need to remove the ground-plane limitation
Collaborative Multi-Robot Localization, by Dieter Fox, Wolfram Burgard, Hannes Kruppa, and Sebastian Thrun
Goal • Be able to do global localization • Have more accurate localization • Amortize the cost of sensors • More efficient localization • Operates in a known environment
Related Work • Most prior work looks at single-robot localization • Methods that only adjust for odometry errors require an initial position • Others use Kalman filters with the second moment of the density, or Markov localization • Lock-step multi-robot localization • Little raw sensor information used
Method • Markov localization • Detection models that capture noise and uncertainty • Statistical sampling • Density trees
Some Notation • For N robots, dn (1 ≤ n ≤ N) is a sequence of three types of data: • Odometry measurements – an • Environment measurements – on • Detections – rn
Markov Localization • Each robot maintains a belief distribution over its position • Uses dead reckoning and environment sensing; detections are not considered initially • The belief at time 0 is initialized • Bel_n^t(L) is the belief of robot n about its location at time t
Math Derivations • How to calculate belief given environment data
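The slide's equation did not survive; the standard Markov localization sensor update it describes (reconstructed from Fox et al., with α a normalizing constant, notation approximate) is

$$ Bel_n(L = l) \;\leftarrow\; \alpha \, P(o_n \mid l) \, Bel_n(L = l) $$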
More Math Derivations • How to update belief given odometry info.
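Again reconstructing the standard form: the odometry (motion) update integrates the motion model over the previous belief,

$$ Bel_n^{t}(L = l) \;\leftarrow\; \int P(l \mid l', a_n) \, Bel_n^{t-1}(L = l') \, dl' $$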
More Math • To add in the detections of other robots • Assume P(L1, …, LN | d) = P(L1 | d) · … · P(LN | d) • Not completely accurate, but it works in practice and is much easier
More Math Derivations • How to calculate the effect of another robot's belief about my location (see the reconstructed update below)
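A reconstruction of the detection update from the paper (notation approximate): when robot m detects robot n with relative measurement r_m, robot n folds m's belief through the detection model,

$$ Bel_n(L_n = l) \;\leftarrow\; Bel_n(L_n = l) \int P(L_n = l \mid L_m = l', r_m) \, Bel_m(L_m = l') \, dl' $$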
Limitations of the Math • Does not use negative sightings (failing to detect a robot is also evidence) • Repeatedly using another robot's belief would count the same evidence multiple times • So detections are only reported after significant movement since the last detection
How to Represent • Robot positions are continuous-valued • How to model a belief over all these positions? • Monte Carlo Localization • Sample-based representation • Sampling/importance resampling to propagate beliefs
Monte Carlo Localization • A set of K samples S • S = {si | i = 1..K} • si = <li, pi> • li = <x, y, θ> • pi = numerical weighting factor • The sum of all pi in S is 1 (initialization sketched below)
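A sketch of initializing such a sample set in Python (my illustration; uniform initialization for global localization is an assumption consistent with the slides):

import math, random

def init_samples(K, x_max, y_max):
    """K uniformly weighted, uniformly placed pose samples s_i = <(x, y, theta), p_i>."""
    return [((random.uniform(0, x_max),          # x
              random.uniform(0, y_max),          # y
              random.uniform(0, 2 * math.pi)),   # heading angle theta
             1.0 / K)                            # weight p_i; weights sum to 1
            for _ in range(K)]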
Monte Carlo Localization • When the robot moves: • Generate K new samples • Draw a sample s' randomly from S according to the weights p • Generate the new pose l from l' using the motion model P(l | l', a) • Set p = 1/K
Monte Carlo Localization • For each new sample s: • ps = P(os | ls) • All p are then normalized so that they sum to 1 • A small number of uniformly random samples is added to overcome local minima (see the sketch below)
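A minimal sketch of one full MCL update cycle (my own illustration; motion_model, sensor_likelihood, and rand_pose are hypothetical robot-specific callables, not from the paper):

import random

def mcl_step(samples, action, observation, motion_model, sensor_likelihood, rand_pose, eps=0.01):
    """One predict/correct cycle of MCL over a list of (pose, weight) samples."""
    K = len(samples)
    poses = [l for l, p in samples]
    weights = [p for l, p in samples]
    # Prediction: resample by weight, then draw each new pose from P(l | l', a)
    new_poses = [motion_model(l, action)
                 for l in random.choices(poses, weights=weights, k=K)]
    # Inject a few uniformly random poses to escape local minima
    for i in range(int(eps * K)):
        new_poses[i] = rand_pose()
    # Correction: weight each pose by the observation likelihood P(o | l)
    w = [sensor_likelihood(observation, l) for l in new_poses]
    total = sum(w) or 1.0                          # guard against an all-zero weighting
    return [(l, wi / total) for l, wi in zip(new_poses, w)]  # weights again sum to 1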
Monte Carlo Localization • Trade-off in sample size between computational efficiency and accuracy • Focuses resources on high-likelihood regions
Multi-Robot MCL • Both robots keep sample sets • How to relate these two sample sets when combining detection beliefs? • Density trees
Density Tree • Adaptively subdivides space into rectangular regions • Start with one giant rectangle • If a rectangle fails the stopping condition, it is split along its longer side • Continue recursively
Density Tree • The density of each rectangle is defined as • density = (sum of all p in the rectangle) / (volume of the rectangle) • This density is used to get the belief update rule (a sketch follows)
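A compact sketch of such a tree in Python (my construction; the stopping condition of a minimum sample count and the depth limit are assumptions, since the slide does not specify them). In the multi-robot update, the detecting robot's belief Bel_m in the detection equation above is represented by this tree's density estimate:

class DensityTree:
    """Adaptive rectangular subdivision; each node's density = weight mass / volume."""
    def __init__(self, samples, lo, hi, min_samples=10, depth=0):
        # samples: list of (point, weight); lo/hi: opposite corners of this rectangle
        self.lo, self.hi, self.children = lo, hi, None
        volume = 1.0
        for a, b in zip(lo, hi):
            volume *= (b - a)
        self.density = sum(w for _, w in samples) / volume
        if len(samples) > min_samples and depth < 20:   # stopping condition (assumed)
            d = max(range(len(lo)), key=lambda i: hi[i] - lo[i])  # longer side
            mid = (lo[d] + hi[d]) / 2.0                 # split along that side
            hi_left, lo_right = list(hi), list(lo)
            hi_left[d], lo_right[d] = mid, mid
            left = [(pt, w) for pt, w in samples if pt[d] < mid]
            right = [(pt, w) for pt, w in samples if pt[d] >= mid]
            self.children = (DensityTree(left, lo, tuple(hi_left), min_samples, depth + 1),
                             DensityTree(right, tuple(lo_right), hi, min_samples, depth + 1))

    def query(self, pt):
        """Estimated density at a point: descend to the leaf rectangle containing it."""
        if self.children is not None:
            for child in self.children:
                if all(a <= x <= b for x, a, b in zip(pt, child.lo, child.hi)):
                    return child.query(pt)
        return self.density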
Detection Algorithm • Use the camera and vision processing to detect the presence of a robot • Scan the laser range data to find a convex structure • Use the laser reading at the center of the convex shape to find distance and angle
Detection Model • Detection Function
Learn Detection Error • Ran a training period to learn P(Lm | Ln, rn) • Each time a robot was checked for, the reported location was recorded and compared with the actual location • The actual location was determined by MCL • The data was used to compute a Gaussian • 6.7% chance of a missed detection • 3.5% chance of an incorrect detection