Robust Beacon Localization from Range-Only Data Edwin Olson (eolson) John Leonard (jleonard) Seth Teller (teller) (@csail.mit.edu) MIT Computer Science and Artificial Intelligence Laboratory
Outline • Our goal: • Navigate with LBL beacons, without knowing the beacon locations • Filtering range data without a prior • Outlier rejection with very noisy data • SLAM with estimated beacon locations • Optimal exploration
Problem Statement • Simultaneous Localization and Mapping (SLAM) • Range-only measurements • Features only partially observable • Use vehicle’s dead reckoning to bootstrap solution • Applications • Covert mine sweeping (beacons not calibrated) • Detecting movement of a “stationary” beacon • SLAM with uncalibrated sensor networks.
Basic Idea • Record range measurements while traveling a relatively short distance. • Initialize the feature in the Kalman filter based on triangulation. • Continue updating both the robot state and the beacon position with the EKF… but…
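As a concrete illustration of the update step, here is a minimal range-only EKF measurement update in Python. The state layout (robot x, y, heading, then beacon x, y), the function name, and the index arguments are all hypothetical; this is a generic sketch, not the filter used in this work.

```python
import numpy as np

def ekf_range_update(x, P, z, R, robot_idx=(0, 1), beacon_idx=(3, 4)):
    """One EKF update for a single range measurement z.
    Hypothetical state layout: x = [robot_x, robot_y, heading, beacon_x, beacon_y],
    with covariance P and scalar range-noise variance R."""
    pr = x[list(robot_idx)]                 # robot position
    pb = x[list(beacon_idx)]                # current beacon estimate
    d = pb - pr
    r_pred = np.linalg.norm(d)              # predicted range h(x)
    H = np.zeros((1, len(x)))               # Jacobian of h with respect to the state
    H[0, robot_idx[0]], H[0, robot_idx[1]] = -d[0] / r_pred, -d[1] / r_pred
    H[0, beacon_idx[0]], H[0, beacon_idx[1]] = d[0] / r_pred, d[1] / r_pred
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K * (z - r_pred)).ravel()      # state correction
    P = (np.eye(len(x)) - K @ H) @ P        # covariance update
    return x, P
```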
Feature Initialization • This is the hard step. • Noise is a major issue • No prior with which to do outlier detection! • The noise is not well behaved…
Noise is not Gaussian • There is an easy solution (least squares) if the range error is Gaussian. • It's not: these extreme outliers will cause trouble in any linear filter. (Figure: distribution of LBL error relative to true range, best Gaussian fit in red; GOATS'02 data)
Noise is not independent or stationary (Figure: nasty, consistent-looking outliers; there is no signal at all in this region, yet there is dependent noise.)
Median Windows (baseline algorithm) • Method: • Compute distribution of data z(t) around time t • Outlier if z(t)<lowPercentile or z(t)>highPercentile • Pros • Simple, Fast • Cons • Can’t distinguish stationary garbage from a real signal • Three sensitive parameters to tune • Cannot take advantage of multiple observations from different AUVs.
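A minimal sketch of this baseline in Python; the window size and percentiles are the three sensitive, data-dependent parameters referred to above (the N=21, 20%/80% values match the median-window result shown later):

```python
import numpy as np

def percentile_window_filter(z, half_window=10, lo=20, hi=80):
    """Baseline outlier test: flag z[t] as an outlier if it falls outside the
    [lo, hi] percentile band of its local window (window of N = 2*half_window+1)."""
    keep = np.zeros(len(z), dtype=bool)
    for t in range(len(z)):
        w = z[max(0, t - half_window): t + half_window + 1]   # local window around t
        lo_val, hi_val = np.percentile(w, [lo, hi])
        keep[t] = lo_val <= z[t] <= hi_val                     # True = accepted as inlier
    return keep
```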
Median Windows (Figure: the median window misclassifies inliers) • Hard to tune! • Data dependent • Inevitably throws away good data in order to avoid outliers
Improving Outlier Rejection • Add geometrical constraints • Require measurements to intersect • With AUVs, we don't get much data • Extract everything we can from what we have • We can afford to do more processing; we are not CPU limited.
Measurement Consistency (Figure: an inconsistent pair vs. a consistent pair with two possible solutions) • Consider pair-wise measurement consistency • Imposes a geometrical constraint on accepted points • How do we turn pair-wise constraints into a global classifier?
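One plausible form of the pair-wise test: two range circles, centered at the dead-reckoned positions where the measurements were taken, can only both be correct if the circles are able to intersect. The function name and the slack term (to absorb range noise and accumulated DR drift) are illustrative assumptions:

```python
import numpy as np

def pairwise_consistent(p1, r1, p2, r2, slack=1.0):
    """Two range measurements taken at dead-reckoned positions p1, p2 are
    geometrically consistent if circles of radii r1, r2 about those positions
    can intersect, allowing 'slack' meters for noise and DR error."""
    d = np.linalg.norm(np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float))
    return (abs(r1 - r2) - slack) <= d <= (r1 + r2 + slack)
```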
Spectral Clustering Formulation • Consider a Markov process • Every measurement is a single state • Define a transition matrix P • Consistent states have high-probability transitions • Find the steady-state probability vector S • (what state will we be in as t→∞?) t=0: S, t=1: PS, t=2: P²S, …, t=n: PⁿS • The best S is the eigenvector of P with the largest eigenvalue • (components along smaller eigenvalues shrink as t→∞)
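The steady-state argument can be illustrated with a few steps of power iteration: repeatedly applying P to a uniform start vector and normalizing leaves only the dominant eigenvector. This sketch assumes the transition matrix P has already been built from the pair-wise consistency test:

```python
import numpy as np

def steady_state(P, iters=50):
    """Power iteration: apply P repeatedly to a uniform start vector.
    Components along smaller-eigenvalue directions decay relative to the
    dominant one, so the iterate converges to the steady-state score vector S."""
    S = np.ones(P.shape[0]) / P.shape[0]
    for _ in range(iters):
        S = P @ S
        S /= np.linalg.norm(S)   # renormalize so the iterate neither blows up nor vanishes
    return S
```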
Spectral Clustering • Use the singular value decomposition (SVD): UΣVᵀ = P • The first column of U is the solution to PS = λS with maximum λ. • Cluster based on thresholding U(:,1) by mean(U(:,1)).
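A sketch of the whole classifier under the assumptions above. The transition matrix is built here from a simple 0/1 pair-wise test (the hypothetical pairwise_consistent from the earlier sketch), which is a simplification of the high/low-probability transitions described on the previous slide:

```python
import numpy as np

def spectral_inliers(positions, ranges, slack=1.0):
    """Build a pairwise-consistency matrix P, take its SVD, and threshold the
    first column of U by its mean, as on this slide.  Returns a boolean inlier mask."""
    n = len(ranges)
    P = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # 0/1 consistency; uses pairwise_consistent() from the earlier sketch
            P[i, j] = 1.0 if pairwise_consistent(positions[i], ranges[i],
                                                 positions[j], ranges[j], slack) else 0.0
    U, s, Vt = np.linalg.svd(P)
    score = np.abs(U[:, 0])          # first left singular vector (abs removes sign ambiguity)
    return score > score.mean()      # inliers are the measurements above the mean score
```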
Computation in blocks • Compute the SVD for small sets of measurements • Manages computational cost: O(n³) • Avoids errors in the transition matrix by bounding accumulated DR error • Becomes effective at N≈10 for typical LBL data • Performance very good at N≈25
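Block processing is then just a matter of running the classifier over consecutive chunks of measurements; the function and parameter names are illustrative, and the helper is the spectral_inliers sketch from above:

```python
def classify_in_blocks(positions, ranges, block=25):
    """Run the spectral classifier on consecutive blocks of ~25 measurements:
    this keeps the O(n^3) SVD cheap and limits how much DR error can accumulate
    within any one block."""
    keep = []
    for i in range(0, len(ranges), block):
        keep.extend(spectral_inliers(positions[i:i + block], ranges[i:i + block]))
    return keep
```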
Spectral Clustering • Each circle is a range measurement centered about the AUV's dead-reckoned position • Blue circles are "inliers" • Black circles are "outliers" • The green triangle represents the actual LBL position (Figure: spectral clustering of 25 measurements; GOATS'02 data)
Spectral Clustering Result (Figures: spectral clustering with block size 25 vs. median window with N=21, 20%/80%)
Multiple vehicles • If vehicle positions are known in the same coordinate frame, just add the data and use the same algorithm. • No need to do outlier rejection independently on each AUV. • (More on this for AUV2004)
Effect of outlier rejection • PDF after outlier rejection… • Can we restore our Gaussian assumptions? • Maybe not quite • But we're much better off! (Figure: distribution of LBL error relative to true range, outliers rejected via spectral clustering, best Gaussian fit in red; GOATS'02 data)
Solution Estimation • Given “clean” data, estimate a beacon location • Or determine that it’s still ambiguous • K-means clustering of range intersections • Typically K=2 • We get a measure of cluster variance (confidence) • Least-squares solution within selected cluster
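A rough sketch of this pipeline, assuming the pairwise circle-intersection points have already been computed: k-means with k=2 separates the two mirror-image hypotheses, the tighter cluster is taken as the confident one, and a small Gauss-Newton fit refines the beacon estimate against the raw ranges. Function names and iteration counts are illustrative, not the authors' exact estimator:

```python
import numpy as np

def estimate_beacon(intersections, positions, ranges, iters=20):
    """k-means (k=2) over circle-intersection points, keep the lower-variance
    cluster, then refine the beacon estimate with a least-squares range fit."""
    pts = np.asarray(intersections, dtype=float)
    # Seed k-means with the two mutually most distant intersection points.
    d2 = np.linalg.norm(pts[:, None] - pts[None], axis=2)
    i, j = np.unravel_index(d2.argmax(), d2.shape)
    centers = pts[[i, j]]
    for _ in range(iters):                                   # plain k-means, k=2
        labels = np.argmin(np.linalg.norm(pts[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([pts[labels == k].mean(axis=0) for k in (0, 1)])
    variances = [pts[labels == k].var() for k in (0, 1)]     # cluster variance = confidence
    b = centers[int(np.argmin(variances))]                   # tighter cluster wins
    for _ in range(iters):                                   # Gauss-Newton range fit
        d = b - np.asarray(positions, dtype=float)           # vectors from each fix to beacon
        r = np.linalg.norm(d, axis=1)                        # predicted ranges
        J = d / r[:, None]                                   # Jacobian of range w.r.t. beacon
        b = b + np.linalg.lstsq(J, np.asarray(ranges) - r, rcond=None)[0]
    return b, min(variances)
```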
Solution Estimation • Put each intersection into a 2-dimensional accumulator • Extract peaks • We get multiple solutions and the number of votes for each • Initialize feature at mean of points in bucket
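The accumulator variant might look like the following, with the grid cell size as an assumed tuning value: each peak's vote count measures its support, and the candidate location is the mean of the points that voted for that cell, as on the slide:

```python
import numpy as np

def accumulator_peaks(intersections, cell=5.0, top=2):
    """Vote each intersection point into a coarse 2-D grid and return the
    highest-scoring cells as candidate beacon locations with their vote counts."""
    pts = np.asarray(intersections, dtype=float)
    ij = np.floor(pts / cell).astype(int)                    # grid cell index for each point
    cells, counts = np.unique(ij, axis=0, return_counts=True)
    order = np.argsort(-counts)[:top]                        # 'top' most-voted cells
    peaks = [pts[np.all(ij == cells[k], axis=1)].mean(axis=0) for k in order]
    return peaks, counts[order]                              # candidate locations + votes
```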
SLAM • Path with no priors (this work) • Note accuracy up to a global translation/rotation • Error accumulated while "locking" • Dead-reckoned path in red • EKF path with prior beacon locations in magenta
Optimal Exploration • Robot at x; the beacon is at either A or B. • Disambiguate by maximizing the difference in range depending on the actual location • i.e., maximize Δr = | ‖x−A‖ − ‖x−B‖ | • What should the robot do now? (Figures: one path leads to two possible solutions; the other path leads to only one plausible solution)
Optimal Exploration: Solution • The gradient is easily computed • The absolute value is handled by setting A to be the closer of A and B. (Figure: optimal robot motions given possible beacon locations at (−1,0) and (1,0); arrow size indicates the magnitude of Δr per distance traveled.)
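With A relabeled as the closer hypothesis, the objective reduces to ‖x−B‖ − ‖x−A‖, whose gradient is simply the difference of the two unit bearing vectors from the hypotheses to the robot. A small sketch (the function name is illustrative):

```python
import numpy as np

def exploration_gradient(x, A, B):
    """Direction of motion that most increases |‖x−A‖ − ‖x−B‖|.
    Relabel so A is the closer hypothesis (removing the absolute value);
    the gradient is then the difference of the two unit bearing vectors."""
    x, A, B = (np.asarray(v, dtype=float) for v in (x, A, B))
    if np.linalg.norm(x - A) > np.linalg.norm(x - B):
        A, B = B, A                              # make A the closer hypothesis
    uA = (x - A) / np.linalg.norm(x - A)         # unit vector from A toward the robot
    uB = (x - B) / np.linalg.norm(x - B)         # unit vector from B toward the robot
    return uB - uA                               # gradient of (‖x−B‖ − ‖x−A‖) w.r.t. x
```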
Future Work • Guess beacon locations earlier and use a particle filter to track the multiple hypotheses • Incorporate the optimal exploration algorithm into an experiment.
Questions/Comments • How can I make this better/more compelling for the conference?