
Tracking for Scene Augmentation & Visualization

This research focuses on real-time person tracking to augment observed scenes with 3D information and to enable shared command/control visualization of spatial data. It explores sensor fusion strategies, algorithms, and technical approaches built on data-driven models and extended Kalman filters, emphasizing methods for data fusion and autocalibration in gyro/vision systems that achieve precise target localization and stable orientation. Strategies such as large-motion estimation and Recursive Rotation Factorization improve the accuracy of motion estimates, enabling effective tracking in dynamic environments.


Presentation Transcript


  1. Tracking for Scene Augmentation & Visualization
  Ulrich Neumann
  Computer Science Department, Integrated Media Systems Center
  University of Southern California
  July 2000

  2. Research Goals
  • Basic science and engineering needed for wide-area, unencumbered, real-time person tracking
  Why person tracking?
  • Position/orientation of people in the field – smart sensors
  • Augment the observed scene with 3D information gleaned from distributed sources
  • Enable a shared command/control visualization of distributed spatial information/sensing sources
  Spatial relationships are critical in sensing, display, and data fusion.

  3. Person/Head Tracking vs. Object or Vehicle Tracking
  • Tracking objects from fixed sensors – vehicles
    • Objects emit signatures such as sound, light, or force that are detected (illumination is possible)
  • Person/head tracking uses body-worn moving sensors
    • Passive sensing of environmental signatures/measures that are difficult to model, sense, or measure
  • Vehicle tracking – ground, air, water
    • Similar sensors and fusion ideas (e.g., EKF)
    • Inverse component motion rates (translation vs. rotation)
    • Lack of velocity measures (e.g., wheel rotations, flow)
  • Man-portable systems impose severe weight/power constraints

  4. Current Outdoor Person Tracking
  • S. Feiner @ Columbia – MARS project
  • L. Rosenblum @ ONR/NRL – GPS/compass (gyro w/USC)
  • E. Foxlin @ Intersense Corp. – compass/gyro IS-300
  • R. Azuma @ HRL – compass/gyro (vision w/USC)
  • U. Neumann, S. You @ USC – gyro/vision, panoramic
  • Land Warrior – Army/Point Research Corp. – GPS/compass/accelerometer (body-only)

  5. Technical Approach/Strategy
  • Estimate real-time 6DOF tracking by fusing multiple sensor data streams, each possessing variable uncertainty
  • Fusion strategies – EKF, data-driven models
  • GPS for 3DOF position (intermittent)
  • Inertial (MEMS) gyros and accelerometers
  • Vision – planar and panoramic projections
  • Compass, pedometer, laser range finder, human-aided, …
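The fusion pattern on this slide can be illustrated with a minimal one-axis sketch: a Kalman filter dead-reckons position from accelerometer samples at a high rate and corrects with GPS fixes whenever they arrive. All rates, noise magnitudes, and names below are assumed for illustration, not the project's actual parameters.

```python
import numpy as np

# 1D constant-velocity Kalman filter: the accelerometer drives the
# high-rate prediction and intermittent GPS fixes correct the drift.
dt = 0.01                               # assumed 100 Hz accelerometer
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [pos, vel]
B = np.array([[0.5 * dt**2], [dt]])     # maps acceleration into the state
Q = 0.05 * (B @ B.T)                    # process noise from accel noise
H = np.array([[1.0, 0.0]])              # GPS observes position only
R = np.array([[4.0]])                   # GPS variance, ~(2 m)^2 assumed

x = np.zeros((2, 1))                    # state estimate [position, velocity]
P = np.eye(2)                           # state covariance

def predict(accel):
    """Dead-reckon one accelerometer sample (runs at the sensor rate)."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def gps_update(z):
    """Correct with a GPS position fix whenever one arrives."""
    global x, P
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

for _ in range(100):                    # 1 s of dead-reckoning...
    predict(accel=0.1)
gps_update(np.array([[1.0]]))           # ...then one GPS fix at 1 m
print(x.ravel())
```

The same predict/update split generalizes to the 6DOF case: each sensor stream contributes measurements at its own rate and uncertainty.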

  6. Proposed Research and Development
  • Sensor fusion algorithms
    • 3DOF orientation from gyro/compass/vision
    • 3DOF position from GPS/accel/vision/LRF
  • Real-time portable prototype
  • Outdoor performance/annotation tests/metrics
  • Command/control visualization testbed
    • Oriented images
  • Precise target localization (man-sighted, UAVs)

  7. Data Fusion Methods
  • Extended Kalman Filters
  • Explicit closed-loop models
  • Fuzzy models
  • Implicit data-driven models of noise, sensor bias, and sensor correlations that are hard to model explicitly
    • E.g., device-to-device variations, application “usage” variations (crawling vs. running), user-to-user variations
  • Hybrid combinations of the above
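One concrete reading of "implicit data-driven models" is estimating a sensor's bias and noise statistics directly from recorded data rather than hand-tuning them. A hypothetical sketch for a stationary gyro log follows; the log, rates, and function name are all assumptions.

```python
import numpy as np

def characterize_stationary_gyro(samples):
    """Estimate per-axis bias and noise covariance from a stationary
    3-axis gyro log (N x 3 array, rad/s); results like these could
    replace hand-tuned bias/noise terms in a fusion filter."""
    bias = samples.mean(axis=0)            # constant-bias estimate
    cov = np.cov(samples, rowvar=False)    # measurement noise covariance
    return bias, cov

# Hypothetical 10 s stationary capture at 1 kHz: white noise plus bias.
rng = np.random.default_rng(0)
log = 0.002 * rng.standard_normal((10_000, 3)) + [0.01, -0.02, 0.005]
bias, cov = characterize_stationary_gyro(log)
print("estimated bias (rad/s):", bias)
```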

  8. Explicit Closed-Loop Models for Inertial/Vision Data Fusion
  [Block diagram: a 3-axis gyro, a 3-axis accelerometer, and 2D vision tracking feed Kalman filters (KF), integrators (∫dt), and a low-pass filter (LPF); a gyro-calibration term (Gcal) closes the loop, and an autocalibration block fuses the 5D vision motion (Vdir, Vrot) with the inertial estimates into a 6D pose.]
  Signals: Grot – 3D rotation rate; Gor – 3D orientation; Avel – 3D velocity; Apos – 3D position; Vdir – 2D direction of motion; Vrot – 3D rotation rate; Vpos – 3D position; Vor – 3D orientation.
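The Grot → Gor path in the diagram is a rate-to-orientation integration. A minimal quaternion version is sketched below; the small-angle step and the [w, x, y, z] convention are assumptions, and a real pipeline would apply the Gcal calibration to the raw rates first.

```python
import numpy as np

def integrate_gyro(q, grot, dt):
    """One step of the Grot -> Gor path: integrate a 3-axis angular
    rate (rad/s) into an orientation quaternion [w, x, y, z] using a
    small-angle axis-angle increment."""
    angle = np.linalg.norm(grot) * dt
    if angle < 1e-12:
        return q                       # no measurable rotation this step
    axis = grot / np.linalg.norm(grot)
    half = 0.5 * angle
    dq = np.concatenate(([np.cos(half)], np.sin(half) * axis))
    # Hamilton product q * dq (body-frame rate increment).
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    out = np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])
    return out / np.linalg.norm(out)   # renormalize against drift

q = np.array([1.0, 0.0, 0.0, 0.0])                           # identity
q = integrate_gyro(q, np.array([0.0, 0.0, 1.0]), dt=0.001)   # 1 kHz sample
```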

  9. USC Research Status
  • Autocalibration
    • Calibrate 3D positions of features (sparse modeling)
    • Extend tracking and stabilize pose
  • Panoramic imaging
    • Track from reference images
    • Visualize/monitor 360° scene
  • Gyro/vision fusion
    • Stable orientation (3DOF)
    • Track through high-speed motions and blur

  10. Autocalibration
  • Estimate camera pose: P = K(Fc, S, Fn), from features (Fc) and gyro, accelerometer & other sensors (s), with S = ∫ s(t) dt
  • Detect & calibrate new features: Fn = K(P, fn)
  • Convergence in an iterative Extended Kalman Filter framework supports autocalibration and multiple-sensor fusion
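Structurally, Fn = K(P, fn) means newly detected features enter the filter's state. A sketch of that augmentation step is below, with assumed shapes and initial uncertainty; the dynamics and measurement models that drive convergence are omitted.

```python
import numpy as np

class AutocalEKF:
    """Skeleton of an EKF whose state holds the 6DOF camera pose plus
    the 3D positions of calibrated features; new features are appended
    on the fly (autocalibration).  Models are omitted from this sketch."""

    def __init__(self):
        self.x = np.zeros(6)        # pose: 3 position + 3 orientation params
        self.P = np.eye(6) * 0.1    # pose covariance

    def add_feature(self, p0, sigma0=1.0):
        """Augment the state with a new feature (Fn = K(P, fn)): its
        triangulated position p0 enters with large uncertainty and is
        refined by subsequent EKF updates until it converges."""
        self.x = np.concatenate([self.x, p0])
        n = self.P.shape[0]
        P_new = np.zeros((n + 3, n + 3))
        P_new[:n, :n] = self.P
        P_new[n:, n:] = np.eye(3) * sigma0**2
        self.P = P_new

ekf = AutocalEKF()
ekf.add_feature(np.array([2.0, 0.5, 10.0]))   # hypothetical new feature
print(ekf.x.shape, ekf.P.shape)               # (9,) and (9, 9)
```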

  11. Autocalibration Demonstration

  12. Motion Estimation/Tracking with Panoramic Images
  • Panoramic images are more robust to partial occlusions than planar images
  • Adapt iterative EKF for 5DOF motion estimates
    • Motion direction and rotation between images
  • Good results for small motions [TR]
    • Similar accuracy to the popular 8-point method
    • Least-squares solution with more points
  • EKF framework has advantages
    • Sensor fusion framework
    • No minimum number of features
    • Flexibility in rejecting uncorrelated noise
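For contrast with the EKF route, the slide's reference point, the classic 8-point method, reduces to a least-squares null-space problem. A generic sketch over unit bearing vectors (a formulation that suits panoramic cameras, since bearings need no image plane) follows; the names are illustrative.

```python
import numpy as np

def essential_from_bearings(b1, b2):
    """Linear (8-point style) least-squares estimate of the essential
    matrix from N >= 8 unit bearing-vector correspondences (N x 3).
    Each pair contributes one constraint b2^T E b1 = 0."""
    A = np.einsum('ni,nj->nij', b2, b1).reshape(len(b1), 9)
    _, _, Vt = np.linalg.svd(A)                # null vector = flattened E
    E = Vt[-1].reshape(3, 3)
    U, _, Vt = np.linalg.svd(E)                # project onto the rank-2
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt   # essential manifold

# The 5DOF motion (translation direction + rotation) can then be
# recovered from E by the standard SVD-based decomposition.
```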

  13. Large Motion Estimates
  • Large R and T cause motion estimate errors
    • Large R or T are both desirable for high SNR
    • Errors arise in separating R and T
  • Recursive Rotation Factorization (RRF)
    • Builds on the iEKF framework for small motions
    • Takes advantage of the property that feature motions are identical for a given rotation R
    • Estimate R: I2 = [TR] I1
    • Factor R from the image: I2 = [TR][R⁻¹] I1 → I2 = [T] I1
    • Estimate T: I2 = [T] I1
    • Iterate until R and T converge
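The iteration can be written down schematically. In this sketch, estimate_R, estimate_T, and rotate_image are hypothetical stand-ins for the iEKF small-motion estimators and the panoramic warp, not the actual interfaces.

```python
import numpy as np

def rrf_motion(I1, I2, estimate_R, estimate_T, rotate_image,
               iters=10, tol=1e-4):
    """Recursive Rotation Factorization: alternately estimate the
    rotation, warp it out of the first image so only translation
    remains, then estimate the translation, iterating to convergence."""
    R_total = np.eye(3)
    T = np.zeros(3)
    warped = I1
    for _ in range(iters):
        R = estimate_R(warped, I2)         # I2 = [T R] warped
        warped = rotate_image(warped, R)   # factor R out: I2 = [T] warped
        R_total = R @ R_total
        T = estimate_T(warped, I2)         # pure-translation estimate
        if np.linalg.norm(R - np.eye(3)) < tol:
            break                          # incremental R has converged
    return R_total, T
```

Factoring R first exploits the depth-independence of rotational image motion, leaving a cleaner pure-translation problem for T.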

  14. RRF Large-Motion Estimation
  [Charts: RRF motion estimation with various noise levels (1 m displacement and 10–90 degree rotation about the up axis). Left and right charts show translation and rotation error, respectively. Noise levels are 0.3, 1.5, and 3.0 degrees, top to bottom.]

  15. Panoramic 6DOF Tracking • 6DOF tracking of a moving camera (red) is obtained (without requiring any calibrated features in the scene) from multiple 5DOF-motion estimates relative to reference images (blue)

  16. 6DOF Tracking Simulation
  [Figure: RRF motion estimation is computed and integrated over a sequence of images. The left graph shows the absolute angular error in translation direction; the right graph shows the absolute rotation error. The lower figure shows the simulated motion path of the camera: points A and B are two reference positions, and the camera starts at A and moves along the path.]

  17. Panoramic Images/Video for Visualization
  • Panoramic image from conventional video
    • Video processed in the field to produce a panoramic image
    • Highly compressed scene capture – no redundancy
  • Panoramic video camera (3200×480 @ 24 fps)
    • Transmit a 360° field of view from a remote site
    • Video motion detection in all directions
    • Real-time view for desktop or HMD

  18. Panorama from Video

  19. Gyro/Vision Orientation Tracking
  • Gyro – angular rate sensors
    • Drift and bias (1 kHz)
  • Video – angular rate sensors
    • Tracking loss/drift (30 Hz)
  • Compass – orientation sensor
    • Jitter and magnetic field distortion (100 Hz)

  20. Orientation Tracking Test
  • Predict 2D feature motion from gyro data
  • Refine gyro data by vision feature tracking
  • Stabilizes gyro drift and makes 2D tracking more robust
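A minimal sketch of the prediction step: back-project the feature to a ray, rotate it by the gyro's inter-frame rotation, and reproject. The pinhole model and names are assumptions; the key property is that rotation-induced image motion is depth-independent, so no 3D structure is required.

```python
import numpy as np

def predict_feature_2d(uv, K, R_gyro):
    """Predict a feature's new pixel location from the gyro's
    inter-frame rotation alone (no depth needed for pure rotation).
    uv: pixel (u, v); K: 3x3 camera intrinsics.  Sketch only."""
    ray = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])  # back-project
    p = K @ (R_gyro @ ray)                                  # rotate, reproject
    return p[:2] / p[2]

# The vision tracker then searches only a small window around this
# prediction, which is what keeps 2D tracking robust under fast motion.
```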

  21. Orientation-Tracking Demonstration

  22. Gyro/Vision Fusion Examples
  [Timing diagram: gyro measurement, integration/prediction, and integration/measurement events occur on a 1 ms grid (0, 1, 2, 3 … 32, 33, 34 ms); the video tracking process and the EKF sample span the interval between video frames.]
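Read as pseudocode, the timing pattern is a simple multi-rate loop: integrate the gyro every millisecond and apply a vision/EKF correction at each ~33 ms frame. The sketch below assumes those rates, reuses the integrate_gyro sketch from the slide-8 section, and uses stubs in place of the real sensor and filter components.

```python
import numpy as np

def read_gyro():
    """Stub for a 1 kHz angular-rate sample (rad/s)."""
    return np.array([0.0, 0.0, 0.01])

def vision_correction(q):
    """Stub for the ~30 Hz EKF update from a tracked video frame."""
    return q

q = np.array([1.0, 0.0, 0.0, 0.0])       # orientation quaternion [w, x, y, z]
for t_ms in range(1000):                  # one simulated second
    q = integrate_gyro(q, read_gyro(), dt=0.001)   # 1 kHz gyro step
    if t_ms % 33 == 0:                    # ~30 Hz video frame arrives
        q = vision_correction(q)          # vision corrects gyro drift
```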

  23. Cooperation within the MURI Team
  • Algorithms for sensor fusion and uncertainty management
  • Portable prototype and testbed for visualization demonstrations and outdoor tracking tests
  • Shared visualizations of spatial annotations and panoramic imagery
  • Tracking and modeling are strongly related – scene modeling aids tracking, and vice versa
