ASEN 5070: Statistical Orbit Determination I Fall 2013 Professor Brandon A. Jones Professor George H. Born Lecture 35: Uncertainty Quantification and Smoothing
Announcements • Homework 11 due on Friday, Dec. 6 • Lecture quiz due by 5pm on Wednesday after Thanksgiving
The Probability Ellipsoid • Views of the error ellipsoid [figures not reproduced]: the view(-37.5°, 0) orientation and the standard MATLAB view
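The probability (error) ellipsoid of a Gaussian state error is defined by the eigenstructure of the covariance matrix: the eigenvectors give the principal axes, and the square roots of the eigenvalues give the 1-sigma semi-axis lengths. A minimal sketch in NumPy (the covariance values are illustrative, not from the lecture):

```python
import numpy as np

def ellipsoid_axes(P):
    """Semi-axis lengths and principal directions of the 1-sigma ellipsoid.

    Eigenvectors of the symmetric covariance P give the axis directions;
    sqrt of the eigenvalues gives the semi-axis lengths.
    """
    vals, vecs = np.linalg.eigh(P)        # eigh: P is symmetric
    order = np.argsort(vals)[::-1]        # largest axis first
    return np.sqrt(vals[order]), vecs[:, order]

# Illustrative position covariance with x-y correlation
P = np.array([[4.0, 1.5, 0.0],
              [1.5, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
semi_axes, directions = ellipsoid_axes(P)
```

Plotting the ellipsoid then amounts to scaling a unit sphere by `semi_axes` and rotating it by `directions` (e.g., MATLAB's `view(-37.5, 0)` vs. the default view in the slide's figures).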
Hot Topic of Research • Problem first identified in 1996: • Junkins, et al., "Non-Gaussian Error Propagation in Orbital Mechanics", Journal of Astronautical Sciences, V. 44, N. 4, 1996, pp. 541-563 • Early studies were motivated by the need to perform regular collision risk assessment for the ISS • Multiple methods exist for nonlinear propagation: • Monte Carlo • State transition tensors (STT) • Gaussian Mixtures • Polynomial Chaos / Separation of Variables
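Of these methods, Monte Carlo is conceptually the simplest: sample the a priori Gaussian, push each sample through the nonlinear dynamics, and examine the resulting cloud. A toy sketch of why the result is non-Gaussian — a polar-style mapping stands in for orbit propagation here; the map and numbers are illustrative, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear "propagation": maps a Gaussian cloud in (r, theta) into a
# curved "banana" in Cartesian coordinates, the hallmark of non-Gaussian
# error growth in orbital mechanics.
def propagate(x):
    r, theta = x
    return np.array([r * np.cos(theta), r * np.sin(theta)])

mean0 = np.array([10.0, 0.0])
P0 = np.diag([0.01, 0.04])

samples = rng.multivariate_normal(mean0, P0, size=5000)
propagated = np.array([propagate(s) for s in samples])

# Monte Carlo mean of the propagated cloud...
mc_mean = propagated.mean(axis=0)
# ...vs. the linearized (Gaussian) prediction, which just maps the mean
lin_mean = propagate(mean0)
```

The curvature of the mapping pulls the true (sample) mean away from the linearized prediction, which is exactly the effect Junkins et al. documented for orbital error propagation.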
State Transition Tensors • The STM is a 2nd-order tensor generated via the first derivative of the force model • Accuracy is improved by keeping higher-order terms of the Taylor expansion • An STT maintains these higher-order derivatives for mapping the a priori p.d.f.
Example STT Propagation Fujimoto, et al., 2011
Gaussian Mixtures Horwood, et al., JGCD, Nov.-Dec., 2011
Gaussian Mixtures • Approximate the p.d.f. as a weighted sum of Gaussians: p(x) = Σᵢ wᵢ N(x; μᵢ, Pᵢ) • Under what constraints is this a probability density function?
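Recall that a weighted sum of Gaussians is a valid p.d.f. when the weights are nonnegative and sum to one. A small 1-D sketch that enforces the constraint and checks that the mixture integrates to one (components and weights are illustrative):

```python
import numpy as np

def gauss(x, mu, sigma):
    """1-D Gaussian density N(x; mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def mixture(x, weights, mus, sigmas):
    """Gaussian mixture density; a valid p.d.f. iff weights >= 0 and sum to 1."""
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0.0) and np.isclose(weights.sum(), 1.0)
    return sum(w * gauss(x, m, s) for w, m, s in zip(weights, mus, sigmas))

# Illustrative 3-component mixture on a fine grid
x = np.linspace(-10.0, 10.0, 20001)
p = mixture(x, [0.5, 0.3, 0.2], [-2.0, 0.0, 3.0], [1.0, 0.5, 1.5])
integral = np.sum(p) * (x[1] - x[0])   # crude quadrature; should be ~1
```

Each component is a Gaussian, so the mixture's normalization follows directly from the weights summing to one, which answers the slide's question.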
Polynomial Chaos (PC) • Based on Wiener's Homogeneous Chaos (1938) • Generates an approximate solution to a stochastic ODE as an expansion in orthogonal polynomials of the random inputs: x(t, ξ) ≈ Σᵢ cᵢ(t) Ψᵢ(ξ) • More commonly used in structures, CFD, applied physics, and other fields • We are applying it to orbital mechanics
Example PCE Result for a Molniya Orbit • Use a polynomial surrogate to approximate the p.d.f. • PC requires ~100-200 ODE evaluations • Monte Carlo requires more than 100,000 evaluations Image: Jones, et al., 2013
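The cost comparison on the slide can be illustrated in one dimension: fit a Hermite (probabilists') polynomial expansion of a nonlinear function of a standard-normal input from ~100 regression points, then evaluate the cheap surrogate instead of the expensive model. A toy sketch — the function, degree, and sample counts are illustrative stand-ins, not the Molniya problem:

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(1)

# Stand-in for an expensive ODE solution: nonlinear function of a
# standard-normal random input xi
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Fit a degree-6 Hermite_e expansion from ~100 model evaluations
# (mirroring the ~100-200 ODE runs PC needs on the slide)
xi_train = rng.standard_normal(100)
coeffs = He.hermefit(xi_train, model(xi_train), deg=6)

# The surrogate is then evaluated 100,000 times at negligible cost,
# replacing a full Monte Carlo campaign on the true model
xi_mc = rng.standard_normal(100_000)
surrogate_mean = He.hermeval(xi_mc, coeffs).mean()
true_mean = model(xi_mc).mean()
```

The surrogate mean tracks the true Monte Carlo mean closely, while only 100 "expensive" model runs were needed to build it.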
Forthcoming Use of PC in Spacecraft Operations • Part of NASA/GSFC-based navigation team for the Magnetospheric Multi-Scale (MMS) mission • Leveraging CU-developed methods and applications of uncertainty quantification • Applying polynomial chaos (PC) to the estimation of collision probabilities • Includes post-maneuver uncertainty quantification
Uncertainty Quantification (UQ) • For more information on general UQ: • ASEN 6519 – Uncertainty Quantification • Spring 2014 • E-mail instructor (Alireza.Doostan@colorado.edu) about pre-requisites • More details on use in astrodynamics: • ASEN 6519 – Orbital Debris • Fall 2014 (planned)
Homework 11 • Leverage code from HW10 • New data set generated with a different force model • Otherwise, same format, data noise, etc. • Process observations in your existing filter • Do not add J3 to your filter model! • Observe the effects of such unmodeled force errors on the OD solution • Add process noise to improve state estimation accuracy
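In the state-noise-compensation (SNC) formulation, the process noise enters only through one extra term in the covariance time update, P̄ₖ₊₁ = Φ Pₖ Φᵀ + Γ Q Γᵀ. A minimal sketch for a 1-D position/velocity state (the time step and noise strength are illustrative, not the homework values):

```python
import numpy as np

dt = 10.0          # seconds between observations (illustrative)
sigma_u = 1e-6     # SNC acceleration-noise strength (illustrative)

# 1-D position/velocity dynamics: x_{k+1} = Phi x_k + Gamma u_k
Phi = np.array([[1.0, dt],
                [0.0, 1.0]])
Gamma = np.array([[dt**2 / 2.0],   # acceleration noise maps into position...
                  [dt]])           # ...and velocity over the interval
Q = np.array([[sigma_u**2]])

def time_update(P):
    """Covariance time update with SNC process noise."""
    return Phi @ P @ Phi.T + Gamma @ Q @ Gamma.T

P0 = np.diag([1.0, 0.01])
P_bar = time_update(P0)
P_bar_noQ = Phi @ P0 @ Phi.T       # same update without process noise
```

The Γ Q Γᵀ term inflates the predicted covariance, which keeps the filter from "going to sleep" and ignoring new observations when the force model is in error.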
Motivation • The batch processor provides an estimate based on the full span of data • Without process noise, the batch and sequential processors produce equivalent estimates at the end of the data span; when we include process noise, we lose this equivalence • Is there some way to update the estimated state using information gained from future observations?
Smoothing • Smoothing is a method by which a state estimate (and optionally, the covariance) may be constructed using observations before and after the epoch. • Step 1. Process all observations using a CKF with process noise (SNC, DMC, etc.). • Step 2. Start with the last observation processed and smooth back through the observations.
Notation • As presented in the book, the most common source of confusion for the smoothing algorithm is the notation: in x̂ₖˡ, the subscript k gives the time of the current estimate (the value, vector, or matrix at tₖ), and the superscript l indicates it is based on observations up to and including tₗ
Smoothing visualization • Process observations forward in time [figure] • If you were to process them backward in time (given everything needed to do that) [figure]
Smoothing visualization • Smoothing does not literally combine the forward and backward solutions, but thinking of it that way helps conceptualize what smoothing does • Smoothing results in a much more consistent solution over time, and it yields an optimal estimate using all observations
Smoothing • Caveats: • If you use process noise or some other way to inflate the covariance, the optimal estimate at any time effectively pays attention only to nearby observations • While this is good, it also means smoothing doesn't always have a big effect • Smoothing shouldn't remove the white noise found on the signals • It's not a "cleaning" function; it's a "use all the data for your estimate" function
Smoothing of State Estimate • First, we use x̂ₖˡ = x̂ₖ + Sₖ(x̂ₖ₊₁ˡ − x̄ₖ₊₁), with Sₖ = Pₖ Φᵀ(tₖ₊₁, tₖ) P̄ₖ₊₁⁻¹ • If Q = 0, then P̄ₖ₊₁ = Φ Pₖ Φᵀ and Sₖ = Φ⁻¹(tₖ₊₁, tₖ), so x̂ₖˡ = Φ⁻¹(tₖ₊₁, tₖ) x̂ₖ₊₁ˡ (smoothing just maps the estimate back through the STM)
Smoothing of State Estimate • Hence, in the CKF, we store at each observation time: the filtered estimate x̂ₖ and covariance Pₖ, the STM Φ(tₖ₊₁, tₖ), and the time-updated quantities x̄ₖ₊₁ and P̄ₖ₊₁
Smoothing of Covariance • Optionally, we may smooth the state error covariance matrix: Pₖˡ = Pₖ + Sₖ(Pₖ₊₁ˡ − P̄ₖ₊₁)Sₖᵀ
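The backward pass over stored CKF quantities can be sketched in a few lines (this is the standard sequential-smoother recursion; the two-step demo data below are fabricated purely to exercise the code):

```python
import numpy as np

def smooth(x_hat, P, x_bar, P_bar, Phis):
    """Backward smoothing pass over quantities stored in the forward CKF run.

    For k = 0..N:
      x_hat[k], P[k]          : measurement-updated estimate/covariance at t_k
      x_bar[k+1], P_bar[k+1]  : time-updated (a priori) quantities at t_{k+1}
      Phis[k]                 : STM Phi(t_{k+1}, t_k)
    Returns smoothed estimates/covariances based on all observations.
    """
    N = len(x_hat) - 1
    xs, Ps = [None] * (N + 1), [None] * (N + 1)
    xs[N], Ps[N] = x_hat[N], P[N]     # last filtered estimate = last smoothed
    for k in range(N - 1, -1, -1):
        S = P[k] @ Phis[k].T @ np.linalg.inv(P_bar[k + 1])
        xs[k] = x_hat[k] + S @ (xs[k + 1] - x_bar[k + 1])
        Ps[k] = P[k] + S @ (Ps[k + 1] - P_bar[k + 1]) @ S.T
    return xs, Ps

# Two-epoch demo with Q = 0, so smoothing should reduce to mapping the
# final estimate back through the inverse STM.
Phi = np.array([[1.0, 1.0], [0.0, 1.0]])
x0 = np.array([1.0, 0.5])
P0 = np.diag([2.0, 1.0])
x_bar1 = Phi @ x0
P_bar1 = Phi @ P0 @ Phi.T                 # no process noise
x1 = x_bar1 + np.array([0.3, -0.1])       # stand-in measurement update
P1 = 0.5 * P_bar1

xs, Ps = smooth([x0, x1], [P0, P1], [None, x_bar1], [None, P_bar1], [Phi])
```

With Q = 0 the gain collapses to S = Φ⁻¹, so the smoothed state at t₀ is exactly Φ⁻¹ x̂₁, matching the Q = 0 special case noted on the earlier slide.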