This research focuses on real-time person tracking to augment observed scenes with 3D information, enabling command/control visualization of spatial data. It explores sensor fusion strategies, algorithms, and technical approaches built on data-driven models and extended Kalman filters, emphasizing data fusion and autocalibration in gyro/vision systems to achieve precise target localization and stable orientation. Strategies such as Large-Motion Estimation and Recursive Rotation Factorization improve motion-estimate accuracy, enabling effective tracking in dynamic environments.
Tracking for Scene Augmentation & Visualization
Ulrich Neumann
Computer Science Department, Integrated Media Systems Center
University of Southern California
July 2000
Research Goals
• Basic science and engineering needed for wide-area, unencumbered, real-time person tracking

Why person tracking?
• Position/orientation of people in the field – smart sensors
• Augment the observed scene with 3D information gleaned from distributed sources
• Enable a shared command/control visualization of distributed spatial information/sensing sources

Spatial relationships are critical in sensing, display, and data fusion
Person/Head Tracking vs. Object or Vehicle Tracking
• Tracking objects from fixed sensors – vehicles
  • Objects emit signatures (sound, light, force) that are detected (illumination is possible)
• Person/head tracking uses body-worn moving sensors
  • Passive sensing of environmental signatures/measures that are difficult to model, sense, or measure
• Vehicle tracking – ground, air, water
  • Similar sensors and fusion ideas (e.g., EKF)
  • Inverse component motion rates: translation vs. rotation
  • Lack of velocity measures (e.g., wheel rotations, flow)
• Man-portable systems impose severe weight/power constraints
Current Outdoor Person Tracking
• S. Feiner @ Columbia – MARS project
• L. Rosenblum @ ONR/NRL – GPS/compass (gyro w/USC)
• E. Foxlin @ Intersense Corp. – compass/gyro IS-300
• R. Azuma @ HRL – compass/gyro (vision w/USC)
• U. Neumann, S. You @ USC – gyro/vision, panoramic
• Land Warrior – Army/Point Research Corp. – GPS/compass/accelerometer (body-only)
Technical Approach/Strategy
• Estimate real-time 6DOF tracking by fusing multiple sensor data streams, each possessing variable uncertainty
• Fusion strategies – EKF, data-driven models
• GPS for 3DOF position (intermittent)
• Inertial (MEMS) gyros and accelerometers
• Vision – planar and panoramic projections
• Compass, pedometer, laser range finder, human-aided, …
Proposed Research and Development
• Sensor fusion algorithms
  • 3DOF orientation from gyro/compass/vision
  • 3DOF position from GPS/accel/vision/LRF
• Real-time portable prototype
• Outdoor performance/annotation tests/metrics
• Command/control visualization testbed
  • Oriented images
  • Precise target localization (man-sighted, UAVs)
Data Fusion Methods
• Extended Kalman Filters
• Explicit closed-loop models
• Fuzzy models
• Implicit data-driven models of noise, sensor bias, and sensor correlations that are hard to model explicitly
  • E.g., device-to-device variations, application "usage" variations (crawling vs. running), user-to-user variations
• Hybrid combinations of the above
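The slides list EKFs among the fusion methods for streams of variable uncertainty. A minimal scalar Kalman measurement update (an illustrative sketch, not the deck's actual filter; the gyro/compass variances below are made-up numbers) shows how two noisy readings are weighted by their uncertainties:

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update with an identity measurement
    model: fuse state estimate x (variance P) with measurement z (variance R)."""
    K = P / (P + R)          # Kalman gain: how much to trust the measurement
    x_new = x + K * (z - x)  # innovation-weighted correction
    P_new = (1.0 - K) * P    # fused variance shrinks below both inputs
    return x_new, P_new

# Hypothetical example: fuse a drifting gyro-integrated heading (large
# variance) with a jittery but unbiased compass reading (small variance).
x, P = kalman_update(10.0, 4.0, 12.0, 1.0)
print(x, P)  # → 11.6 0.8 (estimate leans toward the lower-variance compass)
```

The same gain-weighted blending generalizes to the vector/matrix EKF case used for full 6DOF pose.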
Explicit Closed-Loop Models for Inertial/Vision Data Fusion
[Block diagram: 3-axis gyro and 3-axis accelerometer streams are integrated (dt) and fused with 2D vision tracking through Kalman filters (KF), producing 5D motion, 3D autocalibration, and 6D pose outputs. Signal labels:]
• Grot – 3D rotation rate; Gor – 3D orientation (3-axis gyro, with calibration Gcal)
• Avel – 3D velocity; Apos – 3D position (3-axis accelerometer, via LPF and integration)
• Vdir – 2D direction of motion; Vrot – 3D rotation rate; Vpos – 3D position; Vor – 3D orientation (2D vision tracking)
• Outputs: 5D motion (Vdir, Vrot), 3D autocalibration, 6D pose (Vpos, Vor)
USC Research Status
• Autocalibration
  • Calibrate 3D positions of features (sparse modeling)
  • Extend tracking and stabilize pose
• Panoramic imaging
  • Track from reference images
  • Visualize/monitor 360° scene
• Gyro/vision fusion
  • Stable orientation (3DOF)
  • Track through high-speed motions and blur
Autocalibration
• Estimate camera pose: P = K(Fc, S, Fn)
  • Features (Fc); gyro, accelerometer & other sensors (s): S = ∫ s(t) dt
• Detect & calibrate new features: Fn = K(P, fn)
• Convergence in an iterative Extended Kalman Filter framework supports autocalibration and multiple-sensor fusion
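Autocalibration folds newly detected features into the pose filter so they can be refined alongside the camera state. A hedged sketch of the state-augmentation step (assuming a plain EKF state vector and NumPy; the function name and the zero initial cross-covariance are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def augment_state(x, P, f_new, P_f):
    """Append a newly detected feature's 3D position to an EKF state.
    x: current state vector; P: its covariance; f_new: initial 3D estimate
    of the feature; P_f: its 3x3 initial covariance (large = uncertain).
    Cross-covariances start at zero and fill in as subsequent updates
    correlate the feature with camera pose."""
    n = x.size
    x_aug = np.concatenate([x, f_new])
    P_aug = np.zeros((n + 3, n + 3))
    P_aug[:n, :n] = P        # keep existing state uncertainty
    P_aug[n:, n:] = P_f      # new feature starts highly uncertain
    return x_aug, P_aug

# Hypothetical 6DOF pose state gains one feature:
x_aug, P_aug = augment_state(np.zeros(6), np.eye(6),
                             np.array([1.0, 2.0, 3.0]), np.eye(3) * 100.0)
```

As the filter iterates, the feature's variance shrinks, which is the "convergence" the slide refers to.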
Motion Estimation/Tracking with Panoramic Images
• Panoramic images are more robust to partial occlusions than planar images
• Adapt iterative EKF for 5DOF motion estimates
  • Motion direction and rotation between images
• Good results for small motions [RT]
  • Similar accuracy to the popular 8-point method
  • Least-squares solution with more points
• EKF framework has advantages
  • Sensor fusion framework
  • No minimum number of features
  • Flexibility in rejecting uncorrelated noise
Large Motion Estimates
• Large R and T cause motion estimate errors
• Large R or T is desirable for high SNR
• Errors arise in separating R and T
• Recursive Rotation Factorization (RRF)
  • Builds on the iEKF framework for small motions
  • Exploits the property that feature motions are identical for a given R motion
  • Estimate R: I2 = [T][R] I1
  • Factor R out of the image: [R⁻¹] I2 = [T] I1
  • Estimate T: I2′ = [T] I1
  • Iterate until R and T converge
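The estimate-R / factor-R / estimate-T alternation above can be sketched in a toy setting. The code below registers 2D point sets related by a rotation and translation, not panoramic 5DOF motion, so it only illustrates the alternating factorization structure (the function name, Procrustes-style rotation fit, and convergence criterion are assumptions, not the paper's algorithm):

```python
import numpy as np

def rrf_estimate(p1, p2, iters=20):
    """Alternately estimate the rotation angle theta and translation t
    relating two 2D point sets, p2 = R @ p1 + t. Each pass factors the
    current translation out, fits R on the centered sets, then re-fits t
    with R removed - mimicking RRF's R/T separation."""
    theta, t = 0.0, np.zeros(2)
    for _ in range(iters):
        # Estimate R with the current t factored out (2D Procrustes fit)
        q = p2 - t
        c1, c2 = p1.mean(0), q.mean(0)
        H = (p1 - c1).T @ (q - c2)
        theta = np.arctan2(H[0, 1] - H[1, 0], H[0, 0] + H[1, 1])
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        # Factor R out of the second point set, then estimate t directly
        t = (p2 - p1 @ R.T).mean(0)
    return theta, t
```

With noise-free correspondences this converges quickly; the real RRF embeds the same alternation inside the iEKF with panoramic image features.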
RRF Large-Motion Estimation
[Figure: RRF motion estimation at various noise levels (1 m displacement and 10–90 degree rotation about the up axis). Left and right charts show translation and rotation error, respectively. Noise levels are 0.3, 1.5, and 3.0 degrees, top to bottom.]
Panoramic 6DOF Tracking
• 6DOF tracking of a moving camera (red) is obtained, without requiring any calibrated features in the scene, from multiple 5DOF motion estimates relative to reference images (blue)
6DOF Tracking Simulation
[Figure: RRF motion estimation is computed and integrated over a sequence of images. The left graph shows the absolute angular error in translation direction; the right graph shows the absolute rotation error. The lower figure shows the simulated motion path of the camera: points A and B are two reference positions, and the camera starts at A and moves along the path.]
Panoramic Images/Video for Visualization
• Panoramic image from conventional video
  • Video processed in the field to produce a panoramic image
  • Highly compressed scene capture – no redundancy
• Panoramic video camera (3200×480 @ 24 fps)
  • Transmit a 360° field of view from a remote site
  • Video motion detection in all directions
  • Real-time view for desktop or HMD
Gyro/Vision Orientation Tracking
• Gyro – angular rate sensors
  • drift and bias (1 kHz)
• Video – angular rate sensors
  • tracking loss/drift (30 Hz)
• Compass – orientation sensor
  • jitter and magnetic field distortion (100 Hz)
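The complementary error characteristics above (fast but drifting rates vs. slower absolute references) are the motivation for fusion. A generic complementary-filter blend (an illustrative sketch, not necessarily the fusion used in this work; the alpha value is an assumption) captures the tradeoff for a single heading angle:

```python
def complementary_fuse(theta_prev, gyro_rate, dt, vision_theta=None, alpha=0.98):
    """Blend gyro integration (smooth, drifting, ~1 kHz) with an absolute
    vision/compass heading (drift-free but jittery, ~30 Hz). alpha near 1
    trusts the gyro short-term; the (1 - alpha) share of the absolute
    reading slowly pulls out accumulated drift."""
    theta = theta_prev + gyro_rate * dt  # dead-reckon from the rate sensor
    if vision_theta is not None:         # correction when a frame arrives
        theta = alpha * theta + (1 - alpha) * vision_theta
    return theta

# Gyro-only step, then a step where a vision heading is available:
th = complementary_fuse(0.0, 1.0, 0.001)                      # → 0.001 rad
th = complementary_fuse(th, 1.0, 0.001, vision_theta=0.0)     # drift pulled back
```

An EKF achieves the same effect with time-varying, uncertainty-derived gains instead of a fixed alpha.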
Orientation Tracking Test
• Predict 2D feature motion from gyro data
• Refine gyro data via vision feature tracking
• Stabilizes gyro drift and makes 2D tracking more robust
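Predicting 2D feature motion from gyro data can be sketched with the rotational component of pinhole optical flow (one common sign convention; the focal length, principal-point-centered coordinates, and small-rotation approximation are assumptions, not the deck's stated model):

```python
import numpy as np

def predict_feature_shift(uv, omega, dt, f):
    """Predict where an image feature at pixel (u, v) (relative to the
    principal point) moves during dt, given gyro angular rates
    omega = (wx, wy, wz) in rad/s and focal length f in pixels.
    Uses the rotation-only optical-flow field; translation-induced
    flow is ignored, which is reasonable for fast head rotations."""
    u, v = uv
    wx, wy, wz = omega
    du = (u * v / f) * wx - (f + u**2 / f) * wy + v * wz
    dv = (f + v**2 / f) * wx - (u * v / f) * wy - u * wz
    return np.array([u + du * dt, v + dv * dt])

# A feature at (100, 0) under pure roll (wz = 0.1 rad/s) for 1 s:
print(predict_feature_shift((100.0, 0.0), (0.0, 0.0, 0.1), 1.0, 500.0))
```

The predicted position seeds the vision tracker's search window, which then refines the gyro estimate, closing the loop the slide describes.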
Gyro/Vision Fusion Examples
[Timing diagram: along a millisecond time axis (0, 1, 2, 3, …, 32, 33, 34 ms), gyro measurements are sampled and integrated every millisecond, serving both prediction (gyro integration/prediction) and the sample & EKF update (gyro integration/measurement), while the video tracking process spans each ~33 ms frame interval.]
Cooperation within MURI Team
• Algorithms for sensor fusion and uncertainty management
• Portable prototype and testbed for visualization demonstration and outdoor tracking tests
• Shared visualizations of spatial annotations and panoramic imagery
• Tracking and modeling are strongly related
  • Scene modeling aids tracking and vice versa