FastSLAM Algorithm
As interpreted by Aaron Lee
Figure 13.2
• For each particle:
  • “Retrieval”/“Prediction”: move the particle
  • “Measurement update”: for each feature seen by the particle, update its mean and covariance
  • “Importance weight”: calculate how likely this particle is
• Resample: duplicate and delete particles based on goodness
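To make the shape of that loop concrete, here is a minimal sketch in Python. The Particle and Feature containers and the helper names (sample_motion, update_map, resample) are my own placeholders rather than the book’s notation; the helpers are passed in as arguments, with update_map and resample sketched further down, while sample_motion stands in for whatever motion model is used.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    # One landmark estimate, kept separately inside every particle.
    mean: tuple        # estimated landmark position (x, y)
    cov: float         # isotropic positional variance (a real map keeps a full 2x2 matrix)
    seen: int = 1      # how many times the feature was actually observed
    missed: int = 0    # how many times it should have been observed but wasn't

@dataclass
class Particle:
    pose: tuple                                   # robot pose hypothesis (x, y, theta)
    weight: float = 1.0                           # importance weight
    features: list = field(default_factory=list)  # this particle's own landmark map

def fastslam_step(particles, control, measurement, sample_motion, update_map, resample):
    """One outer iteration of the loop above: predict, update the map, weight, resample."""
    temporary = []                                      # "temporary storage place"
    for p in particles:                                 # retrieval
        p.pose = sample_motion(p.pose, control)         # prediction: move the particle
        p.weight = update_map(p, measurement)           # measurement update + importance weight
        temporary.append(p)
    return resample(temporary, len(particles))          # duplicate/delete particles by weight
```

Each particle carries its own map, which is what lets FastSLAM update landmarks with small per-feature filters instead of one large joint filter.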
Stepping through the measurement update for one particle:
• Get the next particle
• Move the particle
• Loop through all features in the particle and determine how well each of them corresponds with the measurement
• Set a default importance in case this is a new feature
• Pick the feature that corresponds best to the measurement (a sketch of this association step follows)
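A minimal sketch of that association step, reusing the Feature container from above. It assumes a deliberately simplified measurement model in which the sensor reports a landmark’s world (x, y) position with isotropic Gaussian noise; MEAS_VAR and P_NEW are hypothetical tuning values, not anything from the original pseudocode.

```python
import math

MEAS_VAR = 0.25   # assumed measurement noise variance (hypothetical tuning value)
P_NEW = 1e-4      # default importance assigned to the "this is a new feature" hypothesis

def gaussian_likelihood(z, mean, var):
    # Likelihood of observing z = (x, y) given a landmark believed to be at `mean`.
    dx, dy = z[0] - mean[0], z[1] - mean[1]
    return math.exp(-(dx * dx + dy * dy) / (2.0 * var)) / (2.0 * math.pi * var)

def associate(particle, z):
    """Return (best_index, best_likelihood); index == len(features) means "new feature"."""
    # Score every known feature against the measurement.  A full implementation would
    # also fold the feature's own covariance into the innovation covariance here.
    scores = [gaussian_likelihood(z, f.mean, MEAS_VAR) for f in particle.features]
    scores.append(P_NEW)                      # default importance if this is a new feature
    best = max(range(len(scores)), key=scores.__getitem__)
    return best, scores[best]
```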
• If it is a new feature, increase the feature count by 1
• If it is a new feature, set its mean to what the measurement said, and remember that we have seen this feature once
• If we have seen it before, update its mean and covariance; we have now seen this feature once more (sketched below)
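Continuing the same simplified sketch (and reusing Feature, associate, and MEAS_VAR from above): a new feature is initialised straight from the measurement, while a known feature gets a scalar Kalman-style mean/covariance update standing in for the full per-feature EKF update of the real algorithm. The measurement likelihood becomes the particle’s importance weight.

```python
def update_map(particle, z):
    """Fold one measurement z = (x, y) into the particle's map; return its new weight."""
    idx, likelihood = associate(particle, z)
    if idx == len(particle.features):
        # New feature: the map grows by one, its mean is set to what the
        # measurement said, and we remember that we have seen it once.
        particle.features.append(Feature(mean=tuple(z), cov=MEAS_VAR, seen=1, missed=0))
    else:
        f = particle.features[idx]
        k = f.cov / (f.cov + MEAS_VAR)             # Kalman gain: how much to trust the measurement
        f.mean = (f.mean[0] + k * (z[0] - f.mean[0]),
                  f.mean[1] + k * (z[1] - f.mean[1]))
        f.cov = (1.0 - k) * f.cov                  # uncertainty shrinks after the update
        f.seen += 1                                # we have seen this feature once more
    return likelihood                              # this particle's importance weight
```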
• If we didn’t see a feature, don’t change its properties
• However, if we should have seen it, remember that we should have and didn’t
• If we see a feature less than 50% of the time, throw it away (a sketch of this bookkeeping follows)
• We are done with this particle, so move it to the temporary storage place
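A sketch of that bookkeeping, again reusing the containers above. SENSOR_RANGE and the “kept only while seen at least 50% of the time it was expected” rule are my own stand-ins for the slide’s “should have seen it” test.

```python
import math

SENSOR_RANGE = 10.0   # hypothetical perceptual range of the sensor

def should_have_seen(pose, feature):
    # A feature that lies within sensor range "should have been seen" this step.
    dx, dy = feature.mean[0] - pose[0], feature.mean[1] - pose[1]
    return math.hypot(dx, dy) <= SENSOR_RANGE

def prune_features(particle, matched_index):
    """Update seen/missed counters of unmatched features and drop unreliable ones."""
    kept = []
    for i, f in enumerate(particle.features):
        if i != matched_index and should_have_seen(particle.pose, f):
            f.missed += 1                 # should have seen it, and didn't
        if f.seen >= f.missed:            # kept while seen at least half the time it was expected
            kept.append(f)                # (unmatched, out-of-range features are left untouched)
        # else: seen less than 50% of the time, so the feature is thrown away
    particle.features = kept
```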
• Empty out the main particle storage place
• Now we resample: select M particles to copy from the temporary storage place back into the main storage. The probability of any given particle being selected is based on its importance factor, w = target / proposal (sketched below).
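Finally, a sketch of the resampling step under the same assumptions: M particles are drawn with replacement from the temporary storage, in proportion to their importance weights, and deep-copied into the fresh main storage so that duplicated maps stay independent. Resetting the weights afterwards is a common convention rather than something the slide specifies.

```python
import copy
import random

def resample(temporary, M):
    """Draw M particles from the temporary storage, proportional to w = target / proposal."""
    weights = [p.weight for p in temporary]
    if sum(weights) == 0:                        # degenerate case: keep the particles as they are
        return temporary[:M]
    chosen = random.choices(temporary, weights=weights, k=M)
    main = [copy.deepcopy(p) for p in chosen]    # the main storage place is rebuilt from scratch
    for p in main:
        p.weight = 1.0                           # weights are reset once they have been used
    return main
```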