Performance of POSIT for real-time UAV pose estimation
Chayatat Ratanasawanya
March 16, 2011
Overview
• Thesis problem
• The UAV
• Pose estimation by POSIT
• Previous work
• Development of POSIT-based real-time pose estimation algorithm
• Experimental results
• Questions
Thesis problem statement
• Develop a flexible human/machine control system to hover a UAV carrying a video camera beside an object of interest, such as a window, for surveillance purposes.
• Method: human control via joystick; machine control via visual servoing.
• Application: allows police to survey a room from outside a building.
The UAV
• Q-ball: 6-DOF quadrotor helicopter
• Came with SIMULINK-based real-time controllers
• Control block diagram (not reproduced): desired X, Y, Z, and yaw setpoints are fed to the controller, which drives the helicopter; feedback comes from the IMU (roll, pitch), a magnetometer, a sonar, and the Optitrack camera system, which together provide the X, Y, Z, and yaw measurements in the world frame.
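As a rough summary of the signal flow above, here is a minimal sketch of the controller's setpoint and feedback interfaces; this is an illustration only, and the field names are not the actual Simulink signal names:

```python
from dataclasses import dataclass

@dataclass
class QballSetpoint:
    """Desired inputs sent to the Q-ball's real-time controller."""
    x: float    # desired X position in the world frame [m]
    y: float    # desired Y position in the world frame [m]
    z: float    # desired Z position in the world frame [m]
    yaw: float  # desired heading [rad]

@dataclass
class QballFeedback:
    """Measurements fed back to the controller by the sensor suite."""
    roll: float   # attitude from the IMU [rad]
    pitch: float  # attitude from the IMU [rad]
    yaw: float    # heading (magnetometer / Optitrack) [rad]
    x: float      # world-frame position (Optitrack / sonar) [m]
    y: float      # world-frame position (Optitrack / sonar) [m]
    z: float      # world-frame position (Optitrack / sonar) [m]
```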
POSIT algorithm
• Developers: Daniel DeMenthon & Philip David
• The algorithm determines the pose of an object relative to the camera from a set of 2D image points
• Inputs: image coordinates of at least 4 non-coplanar feature points, the 3D object-frame coordinates of the same points, and the camera intrinsic parameters (focal length f, principal point cc)
• Outputs: rotation matrix and translation vector of the object w.r.t. the camera
• Reference: http://www.cfar.umd.edu/~daniel/classicPosit.m
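For reference, here is a minimal NumPy sketch of the classic POSIT iteration, following the structure of the classicPosit.m reference above. The function name, iteration count, and convergence tolerance are illustrative, and the image points are assumed to already be expressed relative to the principal point:

```python
import numpy as np

def classic_posit(object_points, image_points, focal_length, n_iter=20):
    """Classic POSIT (DeMenthon & David) pose estimation sketch.

    object_points : (N, 3) feature coordinates in the object frame;
                    the first point is the reference point M0.
    image_points  : (N, 2) pixel coordinates relative to the image centre
                    (principal point already subtracted).
    focal_length  : focal length in pixels.
    Returns (R, t): rotation and translation of the object w.r.t. the camera.
    """
    object_points = np.asarray(object_points, dtype=float)
    image_points = np.asarray(image_points, dtype=float)

    # Object vectors relative to the reference point and their pseudo-inverse.
    A = object_points[1:] - object_points[0]       # (N-1, 3)
    B = np.linalg.pinv(A)                          # (3, N-1)

    x, y = image_points[:, 0], image_points[:, 1]
    eps = np.zeros(len(object_points) - 1)         # perspective corrections

    for _ in range(n_iter):
        # Scaled-orthographic (POS) image coordinates of the feature points.
        xp = x[1:] * (1.0 + eps) - x[0]
        yp = y[1:] * (1.0 + eps) - y[0]

        # Solve for the scaled rotation rows.
        I = B @ xp
        J = B @ yp
        s = (np.linalg.norm(I) + np.linalg.norm(J)) / 2.0   # scale = f / Z0

        i = I / np.linalg.norm(I)
        j = J / np.linalg.norm(J)
        k = np.cross(i, j)

        # Update the perspective correction terms and test for convergence.
        Z0 = focal_length / s
        eps_new = (A @ k) / Z0
        if np.allclose(eps_new, eps, atol=1e-8):
            eps = eps_new
            break
        eps = eps_new

    R = np.vstack((i, j, k))
    t = np.array([x[0] / s, y[0] / s, focal_length / s])
    return R, t
```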
Previous work
• Cardboard box target (figure with its object-frame axes not reproduced)
• Still images of the target were taken from various locations in the lab
• Feature points were identified manually
• Object pose was estimated offline
• The target was self-occluded
• Not a real-time process
Current work
• An image-based control algorithm is being developed; it must be a real-time process
• The UAV pose must be estimated in real time
• The target must not be self-occluded
• Image source: live video
• Image processing has to be fast
• Feature points must be identified automatically
Feature points extraction
• Pipeline stages on each camera frame: detect the LED, detect the window, detect its corners, and discard unwanted detected feature points (a sketch of one possible implementation follows).
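A minimal OpenCV sketch of one way such a pipeline could look; the brightest-spot LED assumption, the quadrilateral-window assumption, and all thresholds are illustrative, not the actual implementation:

```python
import cv2
import numpy as np

def extract_feature_points(frame_bgr):
    """Illustrative pipeline: find a bright LED blob and the four corners
    of a window-like quadrilateral in one video frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # LED detection: assume the LED is the brightest spot in the image.
    _, _, _, led_xy = cv2.minMaxLoc(cv2.GaussianBlur(gray, (5, 5), 0))
    led = np.array(led_xy, dtype=float)

    # Window detection: threshold, find contours, keep the largest
    # four-sided polygon as the window outline.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    corners = None
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:
            corners = approx.reshape(4, 2).astype(float)
            break
    if corners is None:
        return None  # window not found in this frame

    # Discard unwanted detections: e.g. reject the frame if the LED
    # candidate coincides with one of the window corners.
    if np.min(np.linalg.norm(corners - led, axis=1)) < 5.0:
        return None

    # The 5 feature points: the LED plus the 4 window corners (pixels).
    return np.vstack([led, corners])
```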
Feature points undistortion?
• Fast image processing: no unnecessary calculations
• Evaluate the pose estimated by POSIT from distorted vs. undistorted feature-point locations
• Processing blocks (block diagram not reproduced): live video from the camera, feature points extraction, a points location filter, undistortion of the points via a look-up table built from the camera-calibration distortion coefficients, and POSIT & inverse kinematics run on both the distorted and undistorted points; the resulting 6-DOF UAV pose estimates are compared against each other, the Optitrack 6-DOF pose, and the IMU roll/pitch (a sketch of the undistortion step follows).
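A minimal sketch of the point-undistortion step. The original design uses a precomputed look-up table for speed; as a stand-in, this sketch calls OpenCV's cv2.undistortPoints directly, with the camera matrix and distortion coefficients shown here as illustrative placeholders for values from a prior calibration:

```python
import cv2
import numpy as np

# Illustrative placeholders for the intrinsics and distortion coefficients
# obtained from an offline camera calibration.
camera_matrix = np.array([[700.0,   0.0, 320.0],
                          [  0.0, 700.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.10, 0.0, 0.0, 0.0])   # k1, k2, p1, p2, k3

def undistort_feature_points(points_px):
    """Map distorted pixel coordinates to undistorted pixel coordinates.

    points_px : (N, 2) array of feature point locations in the raw image.
    Returns an (N, 2) array re-projected with the same camera matrix, so the
    result can be fed to POSIT exactly like the raw (distorted) points.
    """
    pts = np.asarray(points_px, dtype=np.float32).reshape(-1, 1, 2)
    # Passing P=camera_matrix makes undistortPoints return pixel coordinates
    # instead of normalized image coordinates.
    undistorted = cv2.undistortPoints(pts, camera_matrix, dist_coeffs,
                                      P=camera_matrix)
    return undistorted.reshape(-1, 2)
```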
Experimental setup
• The Q-ball was placed at 20 random locations in the lab, with a different pose at each location.
• At each location, a live video stream was acquired and the UAV pose was estimated with POSIT in real time.
• 150 6-DOF pose estimates, together with the corresponding Optitrack and IMU readings, were recorded at each location.
• The Optitrack readings are used as the reference.
Results (charts not reproduced here): error and standard-deviation plots for the X, Y, Z, roll, pitch, and yaw estimates.
Mean and SD of error over all 3000 measurements (excludes #3 & #15; table not reproduced here).
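For reference, a minimal sketch of how such error statistics could be computed from the logged data; the array names are illustrative:

```python
import numpy as np

def pose_error_stats(posit_poses, optitrack_poses):
    """Mean and standard deviation of the POSIT estimation error,
    using the Optitrack readings as the reference.

    Both inputs are (N, 6) arrays of [X, Y, Z, roll, pitch, yaw] samples.
    Returns (mean_error, std_error), each of shape (6,).
    (Angle wrap-around is ignored in this sketch.)
    """
    error = np.asarray(posit_poses) - np.asarray(optitrack_poses)
    return error.mean(axis=0), error.std(axis=0)
```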
Conclusion
• The POSIT algorithm is a viable alternative for real-time UAV pose estimation
• The target consists of a white LED and a window
• 5 non-coplanar feature points: the LED and the 4 window corners
• Pose estimation using undistorted feature points is more accurate than using distorted points, with a significant improvement along the Z-direction
• Image information may be mapped to positional control inputs via the POSIT algorithm
Summary
• Thesis problem & the UAV
• Previous work on POSIT and its drawbacks
• POSIT-based real-time pose estimation algorithm:
  • Feature points extraction from live video
  • Undistortion of feature point image coordinates
  • Feature points location filtering
  • Real-time operation
• Comparison between the pose estimated by POSIT, the pose from Optitrack, and the 2 attitude angles from the IMU