ASEN 5070 Statistical Orbit Determination I Fall 2012 Professor George H. Born Professor Jeffrey S. Parker Lecture 9: Least Squares Estimate
Announcements • Homework 2 Graded • Comments included on D2L • Any questions, talk with us this week • Homework 3 Graded by Thursday EOD • Homework 4 due Thursday • Solutions online now. • The solutions show a lot of information – be sure to demonstrate that you know the answers and are not just copying the solutions! • Homework 5 out today
Homework 5 • Problem 1: • Simulate the Stat OD process on a 2-dimensional, point-mass orbit. • Problem 2: • Simple Batch Process exercise. • Problem 3: • Process noisy observations of a spring-mass system.
Topics Today • Review of Stat OD Concepts • Least Squares Estimator • Quiz: Observability • ~10% of the class got 4/4 • Majority of the class got 2/4 • We’ll go over the topic more closely toward the end of the lecture!
Observation Residuals • We have noisy observations of certain aspects of the system. • We need some way to relate each observation to the trajectory that we’re estimating. X* Observed Range Computed Range True Range = ??? ε = O-C = “Residual”
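The O−C bookkeeping on this slide can be sketched numerically. A minimal Python/NumPy illustration (the course itself uses MATLAB; all positions and the noise value below are hypothetical numbers, not from the lecture):

```python
import numpy as np

# All numbers are hypothetical, for illustration only (2-D geometry, km).
station = np.array([6378.0, 0.0])      # tracking station position
r_true  = np.array([7000.0, 1000.0])   # true spacecraft position (unknown in practice)
r_ref   = np.array([6995.0, 1005.0])   # reference trajectory position X*

observed_range = np.linalg.norm(r_true - station) + 0.1   # O: true range + noise
computed_range = np.linalg.norm(r_ref - station)          # C: from the reference X*

residual = observed_range - computed_range                # epsilon = O - C
```

Note that the residual mixes two effects we cannot separate observation-by-observation: measurement noise and the error in the reference trajectory.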
Fitting the Data • How do we best fit the data? Residuals = ε = O−C • (Figure: four candidate fits to the same residuals, labeled "No", "No", "Not bad", "Good")
Fitting the Data • Lp norm estimators • p = 1: weights all residuals evenly, but the cost function is not differentiable at zero • p = 2: smooth, but heavily weights outliers (large residuals dominate the cost) • Interior options (1 < p < 2): optimal for certain scenarios that involve outliers.
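The outlier behavior above is easy to demonstrate. A small Python sketch (the residual values are hypothetical) comparing the L1 and L2 costs when one residual is an outlier:

```python
import numpy as np

# Hypothetical residuals; the last entry is an outlier.
eps = np.array([0.1, -0.2, 0.15, 5.0])

def lp_cost(residuals, p):
    """Sum of |residual|^p -- the L_p cost (norm raised to the p-th power)."""
    return np.sum(np.abs(residuals) ** p)

l1 = lp_cost(eps, 1)   # each residual contributes in proportion to its magnitude
l2 = lp_cost(eps, 2)   # squaring makes the single outlier dominate the total
```

With these numbers the outlier supplies about 92% of the L1 cost but over 99% of the L2 cost, which is why a least-squares fit is pulled toward outliers.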
Fitting the Data • A good solution, and one that is easy to implement, is the least-squares (p = 2) solution
State Deviation and Linearization • Linearization • Introduce the state deviation vector • If the reference/nominal trajectory is close to the truth trajectory, then a linear approximation is reasonable.
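In the standard Stat OD notation (a reconstruction of the usual relations, since the slide's equations are not transcribed; X is the true state, X* the reference trajectory):

```latex
\delta\mathbf{x}(t) = \mathbf{X}(t) - \mathbf{X}^{*}(t), \qquad
\dot{\delta\mathbf{x}}(t) \approx A(t)\,\delta\mathbf{x}(t)
```

The linear approximation holds to first order, i.e., only while δx remains small compared to the scale of the nonlinearities in the dynamics.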
State Deviation and Linearization • Goal of the Stat OD process: • The best-fit trajectory is represented by This is what we want
State Deviation Mapping • How do we map the state deviation vector from one time to another? • The state transition matrix. • It tracks how a perturbation grows/shrinks/changes over time along the trajectory.
State Transition Matrix • The state transition matrix maps a deviation in the state from one epoch to another. • It is constructed via numerical integration, in parallel with the trajectory itself.
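The standard STM relations (reconstructed in the usual notation, since the slide's equations are not transcribed) are:

```latex
\delta\mathbf{x}(t_i) = \Phi(t_i, t_0)\,\delta\mathbf{x}(t_0), \qquad
\dot{\Phi}(t, t_0) = A(t)\,\Phi(t, t_0), \qquad
\Phi(t_0, t_0) = I
```

The identity initial condition is what makes Φ a map of deviations: at the initial epoch a deviation maps onto itself.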
The A Matrix • The “A” Matrix:
Process • Step 1: Define your state and your dynamics.
Process • Step 2: Integrate your state and STM. Use the “reshape” command: Phi0 = eye(8); X0 = [ X; reshape(Phi0, 8*8, 1) ]
Process • Step 2: Integrate your state and STM. • Did we do too much work? Yes, yes, we did. • Don’t have to integrate these parameters! • Might not have to integrate these either!
Process • Step 2: Integrate your state and STM. Phi0 = eye(6,8); X0 = [ X; reshape(Phi0, 6*8, 1) ]
Process • Step 2: Integrate your state and STM. • This is important for your final project. • Rather than integrating 19x18 = 342 equations, integrate only the 10x6 = 60 equations that change over time.
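The Step 2 bookkeeping can be sketched in Python/NumPy (the course uses MATLAB; the value of μ, the initial state, and the 2-D point-mass dynamics below are hypothetical stand-ins in the spirit of Homework 5 Problem 1). The STM is flattened into the integration vector and reshaped inside the derivative function, mirroring the MATLAB reshape trick:

```python
import numpy as np

MU = 398600.4415  # km^3/s^2; Earth GM, an assumed value for illustration

def deriv(t, z):
    """Derivative of [X; vec(Phi)] for 2-D point-mass (two-body) motion."""
    x, y, vx, vy = z[:4]
    Phi = z[4:].reshape(4, 4)             # unflatten the STM
    r = np.hypot(x, y)
    ax, ay = -MU * x / r**3, -MU * y / r**3
    # A(t) = dF/dX evaluated along the current trajectory
    G = 3.0 * MU / r**5 * np.outer([x, y], [x, y]) - MU / r**3 * np.eye(2)
    A = np.zeros((4, 4))
    A[0, 2] = A[1, 3] = 1.0               # d(position)/dt = velocity
    A[2:, :2] = G                         # d(acceleration)/d(position)
    dPhi = A @ Phi                        # Phi_dot = A(t) Phi
    return np.concatenate([[vx, vy, ax, ay], dPhi.ravel()])

def rk4(f, z, t0, tf, n_steps):
    """Fixed-step 4th-order Runge-Kutta propagation."""
    dt = (tf - t0) / n_steps
    t = t0
    for _ in range(n_steps):
        k1 = f(t, z)
        k2 = f(t + dt / 2, z + dt / 2 * k1)
        k3 = f(t + dt / 2, z + dt / 2 * k2)
        k4 = f(t + dt, z + dt * k3)
        z = z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return z

X0 = np.array([7000.0, 0.0, 0.0, 7.5])           # hypothetical state, km and km/s
z0 = np.concatenate([X0, np.eye(4).ravel()])     # Phi(t0, t0) = I
zf = rk4(deriv, z0, 0.0, 600.0, 600)
Xf, Phif = zf[:4], zf[4:].reshape(4, 4)
```

A quick sanity check of the STM is to propagate a slightly perturbed initial state and confirm the final deviation matches Φ(tf, t0) δx0 to first order.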
Process • Step 3: Process observations. • Step 4: Determine a new best estimate of the state. • Step 5: Iterate/repeat as needed.
Mapping an observation • How do we map an observation to the trajectory?
Measurement Mapping Matrix • The Mapping Matrix
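In the usual notation (a reconstruction, since the slide's equations are not transcribed), with observations Y = G(X, t) + ε, the observation deviation at time t_i is mapped through the measurement partials evaluated on the reference:

```latex
\mathbf{y}_i = \mathbf{Y}_i - G(\mathbf{X}^{*}_i, t_i)
            = \tilde{H}_i\,\delta\mathbf{x}_i + \boldsymbol{\varepsilon}_i,
\qquad
\tilde{H}_i = \left[\frac{\partial G(\mathbf{X}, t_i)}{\partial \mathbf{X}}\right]^{*}
```

So the same O−C residual from before is, to first order, a linear function of the state deviation plus noise.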
Measuring the residuals • Our state and measurement mappings. ε = O-C = “Residual” X*
Relating the observations to an epoch • If the state has n unknown parameters and there are l observations at different times • Then there are a total of n×l unknowns (an n-dimensional state at each observation time). • If we can instead relate each state to an epoch, then we can reduce the problem to only n unknown state parameters. If the errors are equal to zero, then we can deterministically solve this system using n linearly independent observations.
Relating the observations to an epoch If the errors are not zero, then we now have n unknown state parameters and l unknown errors. If we have multiple observation types (say p types), then we have m = p×l observations and a total of m unknown observation errors. Total unknowns: m+n Total equations: m Least squares criterion provides a best estimate solution in this scenario.
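Combining the measurement mapping with the STM ties every observation to the single epoch state (a reconstruction of the standard relations in the usual notation):

```latex
\mathbf{y}_i = \tilde{H}_i\,\Phi(t_i, t_0)\,\delta\mathbf{x}_0 + \boldsymbol{\varepsilon}_i
             = H_i\,\delta\mathbf{x}_0 + \boldsymbol{\varepsilon}_i,
\qquad i = 1, \dots, l
```

Stacking all m observation equations gives y = H δx₀ + ε, with y and ε of dimension m and H an m×n matrix: m equations in the m+n unknowns counted above.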
How do we solve the problem? • Least squares estimate
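The least-squares derivation (reconstructed in the standard notation, since the slide's equations are not transcribed) minimizes the sum of squared residuals:

```latex
J(\delta\mathbf{x}) = \tfrac{1}{2}\,\boldsymbol{\varepsilon}^{T}\boldsymbol{\varepsilon}
 = \tfrac{1}{2}\,(\mathbf{y} - H\,\delta\mathbf{x})^{T}(\mathbf{y} - H\,\delta\mathbf{x})
```

Setting the gradient with respect to δx to zero yields the normal equations and the best estimate:

```latex
(H^{T}H)\,\delta\hat{\mathbf{x}} = H^{T}\mathbf{y}
\;\;\Longrightarrow\;\;
\delta\hat{\mathbf{x}} = (H^{T}H)^{-1}H^{T}\mathbf{y}
```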
Best estimate • The state deviation vector that minimizes the least-squares cost function: • Some observations of this solution: • What is the k? • Are all data points equally weighted? • Does this solution consider correlated data or statistical information in the observations? • What happens if we acquire more data?
Best estimate • The state deviation vector that minimizes the least-squares cost function: • Additional details: • HᵀH is called the normal matrix • If H is full rank, then HᵀH will be positive definite. • If it is not full rank, then we don’t have a least squares estimate! • The best estimate of the observation errors: ε̂ = y − H δx̂
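These properties are easy to verify numerically. A minimal Python/NumPy sketch of the normal-equations solution (the linear model, the 2-parameter state, and the noise level are hypothetical, for illustration only):

```python
import numpy as np

# Hypothetical linear problem y = H x + eps, for illustration only.
rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0])
H = rng.standard_normal((10, 2))            # 10 observations of a 2-parameter state
y = H @ x_true + 0.01 * rng.standard_normal(10)

N = H.T @ H                                 # the normal matrix: symmetric, and
                                            # positive definite when H is full rank
x_hat = np.linalg.solve(N, H.T @ y)         # normal equations: (H^T H) x_hat = H^T y
eps_hat = y - H @ x_hat                     # best estimate of the observation errors
```

A characteristic of the least-squares solution is that the estimated errors are orthogonal to the columns of H, i.e., Hᵀ ε̂ = 0.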
Example 4.2.1 • Or in component form: • Expressed in first-order form:
Example 4.2.1: Observability • Notice that all variables end up in either the A or H̃ matrix, but not necessarily both.
Example 4.2.1: Integration Efficiency • Which equations of motion do we need to integrate?
Example 2 • The previous example (4.2.1 from the book) left off at the definitions of the matrices. • The next example takes us through the least squares estimate.
Least Squares Example Example of least squares Let (Note that this is a linear system) Assume we wish to estimate
Least Squares Example Note that HᵀH is always a symmetric matrix
Least Squares Example Assume: Then