Target Precision Determination and Integrated Navigation
By Professors Dominick Andrisani and James Bethel, and Ph.D. students Aaron Braun, Ade Mulyana and Takayuki Hoshizaki
Purdue University, West Lafayette, IN 47907-1282
andrisan@ecn.purdue.edu 755-494-5135
bethel@ecn.purdue.edu 755-494-6719
NIMA Meetings, December 11-12, 2001
http://bridge.ecn.purdue.edu/~uav
Purposes of this talk
• To provide an overview of the results of the Purdue Motion Imagery Group, which is studying precision location of ground targets from a UAV. (This work started in January 2001.)
• To suggest that integrated navigators have to be re-optimized with regard to allowable errors in aircraft position and orientation for the problem of locating ground targets.
• To build a case for a new class of aircraft navigators that use imagery to improve aircraft navigation accuracy.
• To build a case for a new class of target locators that integrate aircraft navigation and target imagery to improve the accuracy of both aircraft location and target location.
Note to the Audience
• This summary talk will be short on mathematics, procedural details, and numerical results.
• This summary talk will be big on the ideas and concepts that we have identified as important in improving the accuracy of target location from a UAV.
• Our final report will be available through contract sponsor Dave Rogers in early April.
• Papers documenting the details of our work and work in progress can be found at http://bridge.ecn.purdue.edu/~uav (this site is password protected; contact Dave for the password).
Objectives of the Purdue Motion Imagery Group
• To study the location of both target and aircraft using motion imagery with multiple ray intersections, inertial sensors, and GPS, and to do this with as few simplifying assumptions as possible.
• To determine which sources of error contribute most to errors in locating a ground target.
• To determine an error budget that will guarantee a CEP90 of 10 feet. (A sketch of one way to evaluate CEP90 follows below.)
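The CEP90 figure of merit can be read as the radius of the circle containing 90% of horizontal target-location errors. The sketch below is a minimal illustration of that definition only; the function name and the Gaussian error samples are ours, not the group's code or error model.

```python
# Minimal CEP90 illustration: radius containing 90% of horizontal location errors.
import numpy as np

def cep90(east_err_ft, north_err_ft):
    """90th percentile of the radial (horizontal) target-location error."""
    radial = np.hypot(np.asarray(east_err_ft), np.asarray(north_err_ft))
    return np.percentile(radial, 90.0)

# Example with assumed zero-mean Gaussian errors (4 ft east, 3 ft north sigmas).
rng = np.random.default_rng(0)
east  = rng.normal(0.0, 4.0, 100_000)
north = rng.normal(0.0, 3.0, 100_000)
print(f"CEP90 = {cep90(east, north):.1f} ft")   # roughly 7.5 ft for these sigmas
```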
Our overall target location problem
[Block diagram: trajectory, turbulence-model, and time inputs drive the aircraft motion; GPS (satellite constellation, processing mode, antenna number and location) and INS outputs (position, attitude, rates), each with errors, feed a filter that produces the aircraft position and attitude estimate with its uncertainty; this is transformed to sensor position, attitude, and uncertainty; synthetic image generation, sensor parameters, and the target tracking / image acquisition system feed a multi-image intersection with a site model, yielding target coordinates with uncertainty (CE90) and a graphic animation.]
Covariance Analysis
[Same block diagram as the previous slide, annotated: given the covariance of zero-mean errors at each error source, find the target position covariance (CEP90) using linear methods. Problem: errors are not always zero mean.]
Covariance Analysis
See References by Aaron Braun:
• Rigorous sensor modeling is important in determining target location.
• Aircraft orientation accuracy and aircraft position accuracy are both important to target location accuracy.
• The relative importance of the various error sources to the CEP90 is being determined.
See References by Professor James Bethel:
• Sometimes errors are not zero mean but biased. The best example of this is GPS positioning: quoted GPS accuracy reflects the sum of the bias and random components. Biased aircraft positioning will lead to biased target positioning, and covariance analysis will not show this fact. (A brief illustration follows below.)
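One way to state the caveat about biases (the notation here is ours, not from the slides): if the target-location error e is a fixed bias b plus a zero-mean random part w with covariance Σ, then

\[
\mathbf{e} = \mathbf{b} + \mathbf{w}, \qquad \mathbf{w} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}), \qquad
E\!\left[\lVert \mathbf{e} \rVert^{2}\right] = \lVert \mathbf{b} \rVert^{2} + \operatorname{tr}(\boldsymbol{\Sigma}),
\]

so covariance analysis reports only the Σ term; a GPS bias shifts the whole error distribution, and the resulting target solution, without enlarging the computed CEP90.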
Today's Integrated Inertial Navigator (Inertial + GPS)
[Same block diagram, with the GPS + INS + filter portion highlighted: GPS and INS measurements are combined in a filter to produce the aircraft position and attitude estimate and its uncertainty, which is then used for target location.]
Status of our work on an Integrated Inertial Navigator
See References by Mulyana and Hoshizaki.
• Ade Mulyana and Taka Hoshizaki have completed the development of an integrated navigator of this form.
• Results show that improving the GPS subsystem produces a significant improvement in aircraft position accuracy.
• Results also show that improving the inertial navigation subsystem produces a significant improvement in aircraft orientation accuracy.
• Since both aircraft position and orientation are important in targeting, careful re-optimization of the INS and GPS systems is required for the ground targeting scenario. Our error budget will help in this re-optimization.
Local Frame Position Errors: (true) – (estimated)
[Plots of dx, dy, dz (m) versus time, 0-400 s. Covariance and nominal trajectory data from 200-300 s are passed to the imagery analysis.]
• GPS performance directly affects position errors.
Local Frame Euler Angle Errors: (true) – (estimated)
[Plots of roll, pitch, and yaw angle errors (rad) versus time, 0-400 s.]
• INS accuracy helps orientation accuracy.
Proposed Imaging Navigator (Inertial + GPS + Imagery)
[Same block diagram, now with the imagery path fed back into the filter: GPS, INS, and image measurements are processed simultaneously for improved accuracy in aircraft positioning. The imaged targets may include one or more known control points; known control points improve aircraft accuracy.]
Status of our work on the Imaging Navigator
• A fully integrated nonlinear Imaging Navigator will be developed under a subsequent contract.
• Preliminary analysis by Andrisani using greatly simplified models and linear methods is encouraging.
• Flying over known control points improves aircraft position accuracy. This is a standard INS update technique.
• Flying over stationary objects on the ground should minimize the effects of velocity biases and rate gyro biases in the inertial navigator. This should improve aircraft position and orientation accuracy.
Proposed Integrated Target Locator (Inertial + GPS + Imagery)
[Same block diagram, with GPS, INS, and image measurements processed simultaneously for improved accuracy in target positioning. The imaged targets may include one or more known control points; known control points improve target accuracy.]
Status of our work on the Integrated Target Locator
• A nonlinear Integrated Target Locator will be developed under a subsequent contract.
• Preliminary analysis by Andrisani using greatly simplified models and linear methods is encouraging.
• Flying over known control points improves target position accuracy.
• Flying over stationary objects on the ground should minimize the effects of velocity biases and rate gyro biases in the inertial navigator. This should improve target position accuracy.
Simplified Integrated Target Locator
Hypothesis: Given a combined estimator of aircraft position and target position that images both an unknown target and a known control point, if the control point enters the field of view of the imaging system, the accuracy of the simultaneous estimate of aircraft position and unknown target position will be significantly improved.
Technical Approach
• Use a low-order simulation of a simplified linear aircraft model.
• Use a simple linear estimator to gain insight into the problem with a minimum of complexity.
• A control point of known location enters the field of view of the image processor only from 80 to 100 seconds.
Linear Simulation: Fly-over trajectory
[Geometry sketch: the aircraft flies from its initial position at t = 0 s to its final position at t = 200 s at a nominal speed of 100 ft/sec, covering 20,000 ft along a ground axis from -10,000 ft to +10,000 ft, with data every 0.1 s, i.e., every 10 ft. Measurements: aircraft position X_aircraft (ft); image coordinate x (microns) on a focal plane with f = 150 mm, camera always looking down; range R (ft). The unknown target is always visible; the control point of known location is visible only from t = 80 to 100 s. A hedged code sketch of this geometry follows.]
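The sketch below reproduces this fly-over geometry under stated assumptions. The 100 ft/sec speed, 0.1 s data interval, 150 mm focal length, downward-looking camera, and 80-100 s control-point window come from the slide; the altitude and the two ground coordinates are placeholders chosen only so the code runs.

```python
# Hedged sketch of the fly-over measurement geometry (assumed altitude and
# object locations; they are not given on the slide).
import numpy as np

SPEED_FT_S = 100.0        # nominal ground speed (slide)
DT_S       = 0.1          # measurement interval (slide)
F_MM       = 150.0        # focal length (slide)
ALT_FT     = 10_000.0     # assumed altitude
X_TARGET   = 5_000.0      # assumed unknown-target ground coordinate (ft)
X_CONTROL  = -1_000.0     # assumed control-point ground coordinate (ft)

t    = np.arange(0.0, 200.0 + DT_S, DT_S)
x_ac = -10_000.0 + SPEED_FT_S * t                 # aircraft ground track (ft)

def image_coord_microns(x_obj, x_aircraft):
    """Downward-looking pinhole camera: focal-plane x coordinate in microns."""
    return 1e3 * F_MM * (x_obj - x_aircraft) / ALT_FT

x_img_target  = image_coord_microns(X_TARGET, x_ac)         # target always visible
in_window     = (t >= 80.0) & (t <= 100.0)                   # control point visible 80-100 s
x_img_control = np.where(in_window, image_coord_microns(X_CONTROL, x_ac), np.nan)
range_meas_ft = np.hypot(X_TARGET - x_ac, ALT_FT)            # range measurement R (ft)
```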
Nominal Measurement Noise in the Simulation
σ(aircraft position) = 1 ft
σ(image coordinate) = 7.5 microns
σ(range) = 1 ft
State Space Model
Linear state equation: x(j+1) = Φ(j, j-1) x(j) + v(j) + w(j), where Φ is the state transition matrix
Nonlinear measurement equation: z(j) = h(x(j)) + u(j)
x(0) = x0 (Gaussian initial condition)
where
v(j) is a known input,
w(j) is Gaussian white process noise,
u(j) is Gaussian white measurement noise.
The Kalman Filter State Estimator
1. Initialize the state estimate and covariance.
2. Predict one step ahead.
3. Measurement update.
(A minimal sketch of these steps is given below.)
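A minimal sketch of these three steps for the model on the previous slide (linear dynamics, nonlinear measurement), i.e., an extended Kalman filter; Phi, Q, R, h, and H_jac are placeholders, not the actual simulation matrices.

```python
# Minimal extended Kalman filter step for x(j+1) = Phi x(j) + v(j) + w(j),
# z(j) = h(x(j)) + u(j).  Initialization: x_hat = E[x(0)], P = Cov[x(0)].
import numpy as np

def ekf_step(x_hat, P, z, v, Phi, Q, R, h, H_jac):
    # Predict one step ahead.
    x_pred = Phi @ x_hat + v
    P_pred = Phi @ P @ Phi.T + Q
    # Measurement update: linearize the measurement about the prediction.
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    residual = z - h(x_pred)              # the residuals plotted on later slides
    x_new = x_pred + K @ residual
    P_new = (np.eye(len(x_hat)) - K @ H) @ P_pred
    return x_new, P_new, residual
```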
Residuals of the Kalman Filter, σ(aircraft) = 100 ft
[Plots of the aircraft position residual (ft) and the Target 1 and Target 2 position errors (ft); "no measurement here" annotations mark the intervals where the control point is not observed.]
Estimated State – Actual State, σ(aircraft) = 100 ft
[Plots of aircraft, Target 1, and Target 2 position errors (ft); annotations mark the major impact of the control point.]
Expanded time scale: Estimated State – Actual State
[Aircraft, Target 1, and Target 2 position errors (ft) on an expanded time scale; the major impact of the control point is again marked.]
Estimated State – Actual State, σ(aircraft) = 10 ft
[Plots of aircraft, Target 1, and Target 2 position errors (ft).]
Expanded time scale, σ(aircraft) = 10 ft
[Aircraft, Target 1, and Target 2 position errors (ft); the major impact of the control point is marked.]
Estimated State – Actual State, σ(aircraft) = 1 ft
[Plots of aircraft, Target 1, and Target 2 position errors (ft). The control point has little impact on the aircraft position error and no impact on the target position errors.]
Expanded time scale, σ(aircraft) = 1 ft
[Aircraft, Target 1, and Target 2 position errors (ft); the control point has little impact here.]
Estimated State – Actual State, σ(range) = 10 ft
[Plots of aircraft, Target 1, and Target 2 position errors (ft); the control point has no impact here.]
Expanded time scale, σ(range) = 10 ft
[Aircraft, Target 1, and Target 2 position errors (ft); the control point has no impact here.]
Two Useful Scenarios
1. "Imaging Navigator" with camera #1 on target #1, plus INS and GPS: aircraft and target #1 data yield an improved aircraft position; an image-based target locator using camera #2 on target #2 then yields an improved target position from target #2 data.
2. "Integrated Target Locator" using one camera to simultaneously or sequentially track two targets, plus INS and GPS: aircraft, target #1, and target #2 data together yield an improved target position.
Conclusions
1. Both aircraft position accuracy and orientation accuracy strongly affect the accuracy of target location.
2. Accuracy specifications for position and orientation in integrated inertial navigators should be re-optimized for the problem of achieving the desired accuracy in target location. Our error budget for achieving a 10 ft CEP90 should help in this re-optimization.
3. Regarding our proposed "Integrated Target Locator": when the measurement noise on aircraft position is large (σ(aircraft) >> 1 ft), the sighting of a known control point significantly improves the aircraft position accuracy AND the unknown target position accuracy. This suggests that flying over control points is tactically useful!
4. The dramatic improvement in aircraft position estimation suggests that a new type of navigator, the "Imaging Navigator," should be developed. This navigator would integrate INS, GPS, and an image processor looking at known or unknown objects on the ground. One or two cameras might be used.
References
Presented at the Motion Imagery Geolocation Workshop, SAIC Signal Hill Complex, October 31, 2001:
1. Dominick Andrisani, "Simultaneous Estimation of Aircraft and Target Position With a Control Point."
2. Ade Mulyana and Takayuki Hoshizaki, "Simulation of Tightly Coupled INS/GPS Navigator."
3. James Bethel, "Error Propagation in Photogrammetric Geopositioning."
4. Aaron Braun, "Estimation Models and Precision of Target Determination."
Presented at the Motion Imagery Geopositioning Review and Workshop, Purdue University, July 24-25, 2001:
1. Dominick Andrisani, "Simultaneous Estimation of Aircraft and Target Position."
2. Jim Bethel, "Motion Imagery Modeling Study Overview."
3. Jim Bethel, "Data Hiding in Imagery."
4. Aaron Braun, "Estimation and Target Accuracy."
5. Takayuki Hoshizaki and Dominick Andrisani, "Aircraft Simulation Study Including Inertial Navigation System (INS) Model with Errors."
6. Ade Mulyana, "Platform Position Accuracy from GPS."
Related Literature
1. B.H. Hafskjold, B. Jalving, P.E. Hagen, and K. Gade, "Integrated Camera-Based Navigation," Journal of Navigation, Vol. 53, No. 2, pp. 237-245.
2. Daniel J. Biezad, Integrated Navigation and Guidance Systems, AIAA Education Series, 1999.
3. D.H. Titterton and J.L. Weston, Strapdown Inertial Navigation Technology, Peter Peregrinus Ltd., 1997.
4. A. Lawrence, Modern Inertial Technology, Springer, 1998.
5. B. Stieler and H. Winter, Gyroscopic Instruments and Their Application to Flight Testing, AGARDograph No. 160, Vol. 15, 1982.
6. A.K. Brown, "High Accuracy Targeting Using a GPS-Aided Inertial Measurement Unit," ION 54th Annual Meeting, Denver, CO, June 1998.
Structure of Simulation: Tightly Coupled INS/GPS
[Block diagram: the UAV's IMU output feeds the INS navigation solution (position, velocity, orientation, and covariance); the Kalman filter compares this solution with the GPS receiver measurements and feeds corrections back to the navigation solution and bias corrections back to the IMU.]
Simplified IMU Model
(sensor output) = (sensor input) + bias + white noise
where the bias is a first-order Markov process with time constant tc = 60 s for all channels, applied to both the accelerometer outputs and the rate gyro outputs. (An illustrative sketch follows.)
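An illustrative sketch of this sensor model for one channel; the sample interval and noise levels are placeholders, and only the bias-plus-white-noise structure and the 60 s time constant come from the slide.

```python
# One IMU channel: output = input + bias + white noise, with the bias a
# first-order Markov process (60 s time constant); noise levels illustrative.
import numpy as np

def simulate_imu_channel(true_input, dt=0.01, tau_c=60.0,
                         sigma_bias=1e-3, sigma_white=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    n = len(true_input)
    phi = np.exp(-dt / tau_c)                  # discrete Markov propagation factor
    q = sigma_bias * np.sqrt(1.0 - phi**2)     # keeps the bias variance stationary
    bias = np.empty(n)
    bias[0] = rng.normal(0.0, sigma_bias)
    for k in range(1, n):
        bias[k] = phi * bias[k - 1] + rng.normal(0.0, q)
    return np.asarray(true_input) + bias + rng.normal(0.0, sigma_white, n)
```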
GPS Receiver Model
Pseudorange: ρ = |r_sat - r_plat| + b_c + ν_ρ
Pseudorange rate: ρ' = (r_sat - r_plat)·(v_sat - v_plat) / |r_sat - r_plat| + d_c + ν_ρ'
where
r_sat is the satellite position,
r_plat is the platform position,
b_c is the pseudorange-equivalent clock bias (random walk),
d_c is the pseudorange-rate-equivalent clock drift (random walk),
ν_ρ and ν_ρ' are normally distributed random numbers.
(A short sketch follows.)
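A short sketch of generating one satellite's measurements from this model. The default sigmas are the Receiver 1 values from a later slide; the clock bias and drift would be propagated elsewhere as random walks, and the function name and signature are ours.

```python
# One satellite's pseudorange (m) and pseudorange rate (m/s) per the model above.
import numpy as np

def pseudorange_measurements(r_sat, v_sat, r_plat, v_plat,
                             clock_bias, clock_drift,
                             sigma_rho=6.6, sigma_rho_dot=0.05,
                             rng=np.random.default_rng(0)):
    los  = np.asarray(r_sat) - np.asarray(r_plat)        # line-of-sight vector
    dist = np.linalg.norm(los)
    rho = dist + clock_bias + rng.normal(0.0, sigma_rho)
    rho_dot = (los @ (np.asarray(v_sat) - np.asarray(v_plat))) / dist \
              + clock_drift + rng.normal(0.0, sigma_rho_dot)
    return rho, rho_dot
```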
Kalman Filter: Error Dynamics
The Kalman filter has 17 error states:
• orientation angle errors (3)
• velocity errors (3)
• position errors (3)
• gyro biases (3)
• accelerometer biases (3)
• clock bias and clock drift (2)
(A sketch of this layout follows.)
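The 17 states listed above, laid out as index ranges for reference; the ordering is an assumption for illustration, not necessarily the simulation's ordering.

```python
# Assumed layout of the 17-element error state (ordering is illustrative).
ERROR_STATE_SLICES = {
    "orientation_angle_errors": slice(0, 3),
    "velocity_errors":          slice(3, 6),
    "position_errors":          slice(6, 9),
    "gyro_biases":              slice(9, 12),
    "accelerometer_biases":     slice(12, 15),
    "clock_bias":               slice(15, 16),
    "clock_drift":              slice(16, 17),
}
assert sum(s.stop - s.start for s in ERROR_STATE_SLICES.values()) == 17
```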
Kalman Filter: Output Equation
[Defines the measurement, the random noise, and the output equation relating the GPS measurements to the 17 error states. A hedged sketch of the usual form follows.]
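In a tightly coupled INS/GPS filter the measurement and output equation usually take the form below; this is offered as a hedged sketch in our notation, not a reproduction of the slide's equations.

\[
\mathbf{z} =
\begin{bmatrix}
\rho_{\mathrm{INS}} - \rho_{\mathrm{GPS}} \\
\dot{\rho}_{\mathrm{INS}} - \dot{\rho}_{\mathrm{GPS}}
\end{bmatrix},
\qquad
\mathbf{z} = H\,\delta\mathbf{x} + \boldsymbol{\nu},
\]

where δx is the 17-element error state from the previous slide and ν is the pseudorange and pseudorange-rate measurement noise.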
Initial Errors
[Table of initial error conditions and the corresponding initial covariance values.]
Error Source Specifications: INS
[Table of accelerometer and rate gyro error sources (bias and white noise, √PSD; gyro noise in deg/hr/√Hz) for the LN-100G and the LN-200 IMU.]
• Two levels of INS (a good unit and a worse one) are used for the simulation.
Error Source Specifications: GPS
• Two levels of GPS receiver are used for the simulation: Receiver 1 (good) and Receiver 2 (worse).

Error source                      Receiver 1 (good)   Receiver 2 (worse)   Units
Pseudorange                       6.6                 33.3                 m
Pseudorange rate                  0.05                0.5                  m/s
Clock bias white noise (PSD)      0.009               0.009
Clock drift white noise (PSD)     0.0355              0.0355
Local Frame: x, y, z
The local frame is defined relative to ECEF coordinates by
x = Z_ecef, y = -Y_ecef, z = X_ecef - 6378137 m.
[Diagram: the nominal trajectory shown in the local frame alongside the ECEF axes X_ecef, Y_ecef, Z_ecef. A small helper implementing this mapping follows.]
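A small helper implementing the mapping given above; inputs and outputs are in metres, and 6378137 m is the WGS-84 equatorial radius.

```python
# ECEF -> local simulation frame, per the definition on this slide.
import numpy as np

def ecef_to_local(x_ecef, y_ecef, z_ecef):
    """x = Z_ecef, y = -Y_ecef, z = X_ecef - 6378137 m."""
    return np.array([z_ecef, -y_ecef, x_ecef - 6_378_137.0])

# A point on the equator at the prime meridian maps to the local origin.
print(ecef_to_local(6_378_137.0, 0.0, 0.0))   # [ 0. -0.  0.]
```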
Local Frame Velocity Errors: (true) – (estimated)
[Plots of velocity errors versus time, 0-400 s.]
• GPS performance directly affects velocity errors.