ASEN 5070: Statistical Orbit Determination I Fall 2013 Professor Brandon A. Jones Professor George H. Born Lecture 19: Minimum Variance Estimation and Introduction to Sequential Processing
Announcements • Exam 1 • Plan to return and review on Monday, Oct. 21 • Homework 6 Posted
Problem Statement • With the least squares solution, we minimized the square of the residuals • Instead, what if we want the estimate that gives us the highest confidence in the solution: • What is the linear, unbiased, minimum variance estimate of the state x?
Problem Statement • What is the linear, unbiased, minimum variance estimate of the state x? • This encompasses three elements • Linear • Unbiased, and • Minimum Variance • We consider each of these to formulate a solution
Linear Estimator • To be linear, the estimated state is a linear combination of the observations: • What is the matrix M? • This mysterious M matrix gives us the solution to the minimum variance estimator
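The slide's equation was an image; the standard linear form (a reconstruction following the notation of the course text, where y is the observation deviation vector and ε the observation noise) is:

```latex
\hat{\mathbf{x}} = M\,\mathbf{y}, \qquad
\mathbf{y} = H\,\mathbf{x} + \boldsymbol{\epsilon}, \qquad
E[\boldsymbol{\epsilon}] = \mathbf{0}, \qquad
E[\boldsymbol{\epsilon}\,\boldsymbol{\epsilon}^T] = R
```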
Unbiased Estimator • To be unbiased, the expected value of the estimate must equal the true state • This yields a constraint on the solution!
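In the standard derivation (a sketch under the linear model x̂ = My with y = Hx + ε), taking the expectation gives the constraint referenced on the slide:

```latex
E[\hat{\mathbf{x}}] = M\,E[\mathbf{y}] = M H\,\mathbf{x} = \mathbf{x}
\quad \forall\,\mathbf{x}
\quad\Longrightarrow\quad M H = I
```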
Minimum Variance Estimator • Must satisfy previous requirements:
What does it mean to have a minimum P ? • Put into the context of scalars:
Statement of Optimization Problem • We seek to minimize: • Subject to the equality constraint: • Using the method of Lagrange Multipliers, we seek to minimize: Term added to keep Q symmetric
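The slide's equations were images; in the standard formulation (a reconstruction, consistent with the symmetric-term note above) we minimize the estimate covariance P = M R Mᵀ subject to the unbiasedness constraint, using the augmented function Q:

```latex
\min_{M}\; P = M R M^T \quad \text{subject to} \quad M H = I
```

```latex
Q = M R M^T + \Lambda\,(I - M H)^T + (I - M H)\,\Lambda^T
```

The two Lagrange-multiplier terms are transposes of each other, which is what keeps Q symmetric.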
Solution Derivation • Using calculus of variations, we need the first variation to vanish to achieve a minimum:
Solution Derivation • In order for the above to be satisfied: • We will focus on the first
Solution Derivation • We now have two constraints, which will give us a solution:
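Combining the stationarity condition from the first variation with the unbiasedness constraint gives the solution (standard result, reconstructed to replace the missing slide equations):

```latex
M R = \Lambda H^T \;\Rightarrow\; M = \Lambda H^T R^{-1}, \qquad
M H = I \;\Rightarrow\; \Lambda = \left(H^T R^{-1} H\right)^{-1}
```

```latex
M = \left(H^T R^{-1} H\right)^{-1} H^T R^{-1}, \qquad
P = M R M^T = \left(H^T R^{-1} H\right)^{-1}
```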
What about P non-negative definite? • Showed that P satisfies the constraints, but do we have a “minimum”? • Must show that any other solution satisfying the constraints yields a covariance no smaller than P • See book, pp. 186–187 for proof
Minimum Variance Estimator • Turns out, we get the weighted, linear least squares! • Hence, the linear least squares gives us the minimum variance solution • Of course, this is predicated on all of our statistical/linearization assumptions
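A minimal numerical sketch (illustrative data, not from the lecture) of the equivalence: the minimum variance estimate is the weighted, linear least squares solution x̂ = (HᵀR⁻¹H)⁻¹HᵀR⁻¹y.

```python
import numpy as np

rng = np.random.default_rng(0)

x_true = np.array([1.0, -2.0])           # true state deviation (illustrative)
H = rng.normal(size=(5, 2))              # observation-state mapping matrix
R = np.diag([0.1, 0.2, 0.1, 0.3, 0.2])   # observation noise covariance
eps = rng.multivariate_normal(np.zeros(5), R)
y = H @ x_true + eps                     # observation deviations

W = np.linalg.inv(R)                     # weight matrix = R^-1
P = np.linalg.inv(H.T @ W @ H)           # estimate covariance (H^T R^-1 H)^-1
x_hat = P @ H.T @ W @ y                  # minimum variance / WLS estimate
print(x_hat)
```

The same x̂ comes out of any weighted least squares solver; the minimum variance derivation simply identifies the optimal weight matrix as R⁻¹.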
Minimum Variance w/ a priori • Falls from similar derivations previously discussed:
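The missing equations are, in the standard form with a priori state x̄ and covariance P̄ (a reconstruction, not the original slide graphics):

```latex
\hat{\mathbf{x}} = \left(H^T R^{-1} H + \bar{P}^{-1}\right)^{-1}
\left(H^T R^{-1}\mathbf{y} + \bar{P}^{-1}\bar{\mathbf{x}}\right), \qquad
P = \left(H^T R^{-1} H + \bar{P}^{-1}\right)^{-1}
```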
Batch vs. Sequential Processing • Batch – process all observations in a single run of the filter • Sequential – process each observation individually (usually as they become available over time)
Mapping of Filter State and Uncertainty • Recall how to map the state deviation and covariance matrix (previous lecture) • Can we leverage this information to sequentially process measurements in the minimum variance / least squares algorithm?
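The mapping recalled here is (assuming the previous lecture's notation, with Φ the state transition matrix):

```latex
\bar{\mathbf{x}}_k = \Phi(t_k, t_{k-1})\,\hat{\mathbf{x}}_{k-1}, \qquad
\bar{P}_k = \Phi(t_k, t_{k-1})\,P_{k-1}\,\Phi^T(t_k, t_{k-1})
```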
Minimum Variance as a Sequential Processor • Given from a previous filter run: • We have a new observation and mapping matrix: • We can update the solution via:
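A sketch (assumed example, not the lecture's code) of the idea at a fixed epoch: accumulating each observation's information one at a time in the normal-equation (information) form reproduces the batch minimum variance solution.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 2, 6
H = rng.normal(size=(m, n))              # stacked mapping matrix
R = 0.1 * np.eye(m)                      # diagonal observation covariance
y = H @ np.array([0.5, -1.0]) + rng.multivariate_normal(np.zeros(m), R)

# Batch solution: (H^T R^-1 H) x = H^T R^-1 y
Lam_batch = H.T @ np.linalg.inv(R) @ H
x_batch = np.linalg.solve(Lam_batch, H.T @ np.linalg.inv(R) @ y)

# Sequential: fold in one observation at a time
Lam = np.zeros((n, n))                   # information matrix (inverse covariance)
N = np.zeros(n)                          # normal-equation right-hand side
for i in range(m):
    Hi = H[i:i+1, :]                     # 1 x n mapping row for observation i
    ri = R[i, i]                         # its noise variance
    Lam += Hi.T @ Hi / ri                # measurement update of the information
    N += Hi[0] * y[i] / ri
x_seq = np.linalg.solve(Lam, N)
print(np.allclose(x_seq, x_batch))       # → True
```

In practice an a priori (x̄, P̄) seeds Lam and N, so the running estimate is defined even before enough observations have accumulated.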
Sequential Estimator Updates • Two principal phases in any sequential estimator • Time Update • Map previous state deviation and covariance matrix to the current time of interest • Measurement Update • Update the state deviation and covariance matrix given the new observations at the time of interest • Jargon can change with communities • Forecast and analysis • Prediction and fusion • others…
Notes on the Sequential Minimum Variance/Least Squares • No assumptions on the number of observations at tk • Wait, but what if we have fewer observations than unknowns at tk? • Do we have an underdetermined system? • No – the a priori information keeps the normal equations invertible, so the update remains well posed
Sequential Minimum Variance Measurement Update • The a priori may be based on independent analysis or a previous estimation • Independent analysis could be a product of: • Expected launch vehicle performance • Previous analysis of system (a priori gravity field) • Initial orbit determination solution
Sequential Minimum Variance Measurement Update • We still have to invert an n × n matrix • Can be computationally expensive for large n • Gravity field estimation: ~n² + 2n − 3 coefficients for a degree-n field! • May become sensitive to numerical issues
Sequential Minimum Variance Measurement Update • Is there a better sequential processing algorithm? • YES! – The equations above may be manipulated to yield the Kalman filter
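The manipulation alluded to applies the matrix inversion lemma to the measurement update, trading the n × n inversion for one the size of the observation set (standard result, sketched here ahead of the next lecture):

```latex
K_k = \bar{P}_k H_k^T \left(H_k \bar{P}_k H_k^T + R_k\right)^{-1}, \qquad
\hat{\mathbf{x}}_k = \bar{\mathbf{x}}_k + K_k\left(\mathbf{y}_k - H_k \bar{\mathbf{x}}_k\right), \qquad
P_k = \left(I - K_k H_k\right)\bar{P}_k
```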