Next Generation 4-D Distributed Modeling and Visualization of Battlefield
Avideh Zakhor, UC Berkeley
June 2002
Participants
• Avideh Zakhor (UC Berkeley)
• Bill Ribarsky (Georgia Tech)
• Ulrich Neumann (USC)
• Pramod Varshney (Syracuse)
• Suresh Lodha (UC Santa Cruz)
Battlefield Visualization
• Goal: a detailed, timely, and accurate picture of the modern battlefield
• Many sources of information to build this "picture":
  • Archival data, road maps, GIS layers, and databases: static
  • Sensor information from mobile agents at different times and locations: dynamic
  • Multiple modalities: fusion
• How do we make sense of all of this without information overload?
Major Challenges: Data
• Disparate and sometimes conflicting sources
• Large volumes
• Inherently uncertain, so the resulting models are also uncertain
• Must be visualized on mobile devices with limited capability
• Time-varying, time-dependent, and dynamic
• Ultimate goal: make decisions and take actions
Mobile AR Visualization
[Block diagram: sensor inputs (laser, lidar, radar, camera, GPS, maps, gyroscope) feed 3D model construction with texture; textured models populate the visualization database, which drives mobiles with augmented reality sensors; observations from the mobiles flow back for model update and fusion/decision making; uncertainty is tracked at both the modeling and visualization stages.]
Research Agenda
• Model construction and update
• Sensor tracking and registration
• Real-time visualization and multimodal interaction
• Uncertainty processing and visualization
• Fusion is used in all of the above
Visualization Pentagon
[Diagram: five interconnected components — 4D modeling/update, tracking/registration, visualization database, information fusion, and uncertainty processing/visualization.]
Model Construction for Visualization
• Develop a framework for 3D model construction of urban areas that is:
  • Easy, fast, accurate, and automatic
  • Compact to represent
  • Easy to render
• Strategy:
  • Fuse multiple data sources: intensity, range, heading, speedometer, panoramic cameras
  • Incorporate a priori models, e.g. digital road maps
  • Registration, tracking, and calibration
3D Modeling
• Close-range modeling: ground-based vehicle with multiple sensors
• Far-range modeling: aerial/satellite imagery and airborne lidar data
• Fusion of close-range and far-range information at multiple levels: data and models
Combining Aerial and Ground-Based Models
• Airborne modeling: laser scans and images from a plane yield a 3D model of terrain and building tops
• Ground-based modeling: laser scans and images from the acquisition vehicle yield a highly detailed model of street scenery and building façades
• Fusion of the two produces a complete 3D city model
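A minimal sketch of the fusion step, assuming both data sets are available as point clouds and the ground-based scan's pose in the geo-referenced airborne frame is already known (e.g., from DGPS/inertial estimates); the function and variable names are illustrative, not the project's actual pipeline.

```python
import numpy as np

def merge_ground_into_airborne(airborne_pts, ground_pts, R, t):
    """Transform ground-based facade points into the geo-referenced
    airborne frame and concatenate the two clouds.

    airborne_pts : (N, 3) points from the airborne model
    ground_pts   : (M, 3) points from the ground-based facade model
    R, t         : 3x3 rotation and 3-vector translation registering the
                   ground-based frame to the airborne frame
    """
    ground_in_air = ground_pts @ R.T + t          # rigid-body transform
    return np.vstack([airborne_pts, ground_in_air])

# Toy usage with an identity registration.
air = np.random.rand(1000, 3) * 100.0
gnd = np.random.rand(5000, 3) * 10.0
merged = merge_ground_into_airborne(air, gnd, np.eye(3), np.zeros(3))
```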
Ground-Level Data Acquisition
• Laser range scanners, digital road maps, aerial photos
• USC setup: hybrid DGPS, inertial sensors, cameras
• UCB setup: 2D laser scanners (horizontal and vertical), intensity camera
Processing Ground-Based Laser Data
• Histograms
• Segmentation
• Layer separation
• Interpolation
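To illustrate the histogram/layer-separation step, here is a minimal sketch of splitting laser returns into a dominant building-facade layer and closer foreground clutter using a depth histogram; the peak threshold and tolerance are assumptions, not the exact algorithm in the project's papers.

```python
import numpy as np

def split_facade_foreground(depths, bin_width=1.0):
    """Illustrative layer separation: build a histogram of scan depths,
    take the farthest strong peak as the building-facade layer, and
    label closer returns as foreground (trees, cars, pedestrians)."""
    depths = np.asarray(depths, dtype=float)
    bins = np.arange(0.0, depths.max() + bin_width, bin_width)
    hist, edges = np.histogram(depths, bins=bins)
    strong = np.nonzero(hist >= 0.05 * hist.max())[0]   # assumed peak threshold
    facade_bin = strong[-1]                              # farthest strong peak
    facade_depth = 0.5 * (edges[facade_bin] + edges[facade_bin + 1])
    is_facade = np.abs(depths - facade_depth) < 2.0 * bin_width
    return is_facade, facade_depth

# Toy scan: dense facade returns near 15 m plus some foreground clutter near 5 m.
scan = np.r_[np.random.normal(15, 0.3, 200), np.random.normal(5, 0.5, 40)]
mask, facade_depth = split_facade_foreground(scan)
```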
Resulting Models from Hole Filling
[Images: façade model before hole filling vs. after hole filling.]
Fusing the Airborne Model with the Ground-Based Model
[Images: airborne point cloud, ground-based façade, and the merged airborne/façade model.]
6-DOF Pose Estimation for Texture Mapping
[Images: texture-mapped model using only a 3-DOF pose vs. using the full 6-DOF pose.]
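A minimal sketch of recovering a full 6-DOF camera pose from correspondences between 3D model points and their pixel locations, here using OpenCV's generic PnP solver; the correspondences, intrinsics, and the use of OpenCV are illustrative assumptions, not the project's actual method.

```python
import numpy as np
import cv2

# Assumed correspondences: 3D model points (meters) and their pixel locations.
object_pts = np.array([[0, 0, 0], [4, 0, 0], [4, 3, 0], [0, 3, 0],
                       [0, 0, 2], [4, 0, 2]], dtype=np.float64)
image_pts = np.array([[320, 400], [600, 410], [605, 180], [318, 175],
                      [322, 300], [598, 305]], dtype=np.float64)

K = np.array([[800, 0, 320],      # assumed camera intrinsics
              [0, 800, 240],
              [0, 0, 1]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)        # 3x3 rotation; (R, tvec) is the 6-DOF pose

# Reproject a model vertex to obtain its texture coordinate in the image.
p_cam = R @ object_pts[0] + tvec.ravel()
uv = (K @ p_cam)[:2] / (K @ p_cam)[2]
```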
Static Texture Mapping
• Copy the texture of all triangles into a single "collage" (texture atlas) image
• Typical texture reduction: a factor of 8 to 12
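A minimal sketch of building such a collage: per-triangle texture patches (represented here by their bounding rectangles) are packed into one atlas image with a simple row/"shelf" packer; the patch sizes and packing strategy are illustrative, not the project's method.

```python
import numpy as np

def pack_patches(patch_sizes, atlas_width=1024):
    """Place rectangular texture patches into one atlas image using a
    simple shelf packer; returns (x, y) offsets and the atlas height."""
    x = y = shelf_h = 0
    offsets = []
    for w, h in patch_sizes:
        if x + w > atlas_width:              # start a new shelf
            x, y, shelf_h = 0, y + shelf_h, 0
        offsets.append((x, y))
        x += w
        shelf_h = max(shelf_h, h)
    return offsets, y + shelf_h

# Toy usage: bounding-box sizes of per-triangle texture patches.
sizes = [(np.random.randint(8, 64), np.random.randint(8, 64)) for _ in range(200)]
offsets, height = pack_patches(sizes)
atlas = np.zeros((height, 1024, 3), dtype=np.uint8)   # copy each patch to its offset
```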
Dynamic Texture Projection on LiDAR Data
[Diagram: sensor image planes and view frustums projecting imagery onto the 3D model.]
• Enables real-time, multi-source data fusion
• Requires an accurate 3D model, a sensor model, and texture/model registration
• Tracking and registration algorithms
[Image: aerial view of projected image texture (campus of Purdue University).]
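A minimal sketch of the projection itself: given a sensor pose and intrinsics, model vertices are projected into the sensor image to obtain texture coordinates, with vertices outside the view frustum masked out (occlusion/depth testing is omitted); names and conventions are assumptions for illustration.

```python
import numpy as np

def project_texture(vertices, K, R, t, width, height):
    """Project model vertices into a sensor image to obtain per-vertex
    texture coordinates; vertices outside the view frustum are masked.
    (R, t) maps world coordinates into the sensor frame, K is the
    intrinsic matrix."""
    cam = vertices @ R.T + t                      # world -> sensor frame
    in_front = cam[:, 2] > 0.0
    pix = cam @ K.T
    pix = pix[:, :2] / np.where(in_front, cam[:, 2], 1.0)[:, None]
    inside = in_front & (pix[:, 0] >= 0) & (pix[:, 0] < width) \
                      & (pix[:, 1] >= 0) & (pix[:, 1] < height)
    uv = np.where(inside[:, None], pix / [width, height], np.nan)  # normalized tex coords
    return uv, inside
```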
Hierarchical, Multiresolution Methods for Interactive Visualization of Extended, Detailed Urban Scenes
[Diagram: a data-adapted global quadtree (a city-organized forest of quadtrees) whose depth adapts to the data down to the level of a "block"; each block carries an LOD hierarchy of façades (Façade 1 … Façade N) and objects (Object 1 … Object M).]
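A minimal sketch of such a quadtree tile hierarchy with view-dependent level-of-detail selection: tiles near the viewer are refined, distant tiles are drawn coarsely. The split threshold and structure are illustrative assumptions, not the VGIS implementation.

```python
from dataclasses import dataclass, field
from typing import List
import math

@dataclass
class QuadNode:
    """One tile of a data-adapted quadtree over an urban region."""
    x: float
    y: float
    size: float
    level: int
    children: List["QuadNode"] = field(default_factory=list)

    def subdivide(self, max_level: int):
        if self.level >= max_level:
            return
        half = self.size / 2.0
        for dx in (0.0, half):
            for dy in (0.0, half):
                child = QuadNode(self.x + dx, self.y + dy, half, self.level + 1)
                child.subdivide(max_level)
                self.children.append(child)

    def select_lod(self, viewer_xy, threshold=2.0, out=None):
        """Collect the tiles to render: recurse while a tile is close to
        the viewer relative to its size, otherwise draw it at this level."""
        if out is None:
            out = []
        cx, cy = self.x + self.size / 2.0, self.y + self.size / 2.0
        dist = math.hypot(cx - viewer_xy[0], cy - viewer_xy[1])
        if self.children and dist < threshold * self.size:
            for c in self.children:
                c.select_lod(viewer_xy, threshold, out)
        else:
            out.append(self)
        return out

root = QuadNode(0.0, 0.0, 4096.0, 0)
root.subdivide(max_level=4)
tiles = root.select_lod(viewer_xy=(100.0, 200.0))
```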
Multimodal Interface to Augmented Reality Systems
[Images: the gesture pendant (worn on the chest) with infrared lights and a camera with an infrared filter; the speech-and-gesture multimodal interface test setup; a demonstration of the gesture pendant recognizing hand gestures; the multimodal interface in action.]
Visualization of Uncertain Particle Movement
• Uncertainty in initial position, direction, and speed
• Uncertainty modeled by a Gaussian distribution
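A minimal sketch of propagating this kind of uncertainty with Monte Carlo particles: initial position, heading, and speed are sampled from Gaussian distributions and pushed forward under a constant-velocity model; all numeric parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000                                     # number of Monte Carlo particles

# Gaussian uncertainty in initial position (m), heading (rad), and speed (m/s).
pos = rng.normal([0.0, 0.0], 2.0, size=(n, 2))
heading = rng.normal(np.deg2rad(45.0), np.deg2rad(5.0), size=n)
speed = rng.normal(10.0, 1.0, size=n)

dt, steps = 1.0, 30
for _ in range(steps):                       # constant-velocity propagation
    pos[:, 0] += speed * np.cos(heading) * dt
    pos[:, 1] += speed * np.sin(heading) * dt

mean, cov = pos.mean(axis=0), np.cov(pos.T)  # spread to visualize as an uncertainty ellipse
```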
Modeling and Visualization of Uncertainty
• Spatio-temporal GPS uncertainty models based on:
  • Number of accessible/used satellites
  • SNR (signal-to-noise ratio)
  • DOP (dilution of precision)
• Real-time visualization of GPS-tracked objects and their associated uncertainty within VGIS
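For the DOP ingredient, here is a minimal sketch of the standard dilution-of-precision computation from satellite line-of-sight geometry; the satellite and receiver positions are illustrative, and this is only one input to the fuller uncertainty model described above.

```python
import numpy as np

def dilution_of_precision(sat_pos, rx_pos):
    """Standard DOP computation: build the geometry matrix of unit
    line-of-sight vectors plus a clock-bias column, then take square
    roots of traces of (G^T G)^-1 sub-blocks."""
    los = sat_pos - rx_pos
    unit = los / np.linalg.norm(los, axis=1, keepdims=True)
    G = np.hstack([unit, np.ones((len(sat_pos), 1))])   # clock-bias column
    Q = np.linalg.inv(G.T @ G)
    gdop = np.sqrt(np.trace(Q))
    pdop = np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2])
    return gdop, pdop

# Illustrative satellite positions (meters) and receiver at the origin.
sats = np.array([[15e6, 10e6, 20e6], [-12e6, 14e6, 18e6],
                 [10e6, -16e6, 19e6], [-8e6, -11e6, 22e6], [1e6, 2e6, 26e6]])
gdop, pdop = dilution_of_precision(sats, np.zeros(3))
```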
Low-Uncertainty, Line-Preserving Compression
[Images: original terrain, unconstrained compression, and coastline-preserving compression.]
Hierarchical Line Simplification
[Image: intersection-preserving simplification of a line network.]
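A minimal sketch in the spirit of these two slides: a Douglas-Peucker polyline simplification that treats marked vertices (e.g., line intersections or coastline anchors) as unremovable. This illustrates constrained, feature-preserving simplification, not the papers' exact hierarchical algorithm.

```python
import numpy as np

def simplify(points, keep, tol):
    """Douglas-Peucker simplification of a polyline, except that vertices
    flagged in `keep` (e.g., intersections) are never removed."""
    points = np.asarray(points, dtype=float)

    def rec(i, j):
        if j <= i + 1:
            return []
        a, b = points[i], points[j]
        ab = b - a
        seg = points[i + 1:j] - a
        # Perpendicular distance of interior vertices to chord a-b.
        d = np.abs(ab[0] * seg[:, 1] - ab[1] * seg[:, 0]) / (np.linalg.norm(ab) + 1e-12)
        forced = [k for k in range(i + 1, j) if keep[k]]
        if not forced and d.max() < tol:
            return []                      # chord is close enough: drop interior vertices
        split = forced[0] if forced else (i + 1 + int(np.argmax(d)))
        return rec(i, split) + [split] + rec(split, j)

    idx = [0] + rec(0, len(points) - 1) + [len(points) - 1]
    return points[idx]

line = np.c_[np.linspace(0, 10, 50), np.sin(np.linspace(0, 10, 50))]
keep = [False] * 50
keep[25] = True                            # e.g., a road intersection to preserve
simplified = simplify(line, keep, tol=0.1)
```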
Bayesian Networks with Temporal Updates
[Diagram: information flow through the network.]
Objective: incorporate the time dependence of observations and evidence in Bayesian inference networks.
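To make the idea concrete, here is a minimal sketch of one simple way to age evidence before a Bayesian update: the likelihood of an old observation is exponentially discounted toward an uninformative one, so stale evidence pulls the posterior less. The decay model and numbers are illustrative assumptions, not the mechanism in the cited papers.

```python
import numpy as np

def aged_posterior(prior, likelihood, age, half_life=30.0):
    """Bayesian update with an aged observation: blend the likelihood
    toward a flat (uninformative) one as the observation gets older."""
    w = 0.5 ** (age / half_life)                   # 1 = fresh, -> 0 = stale
    flat = np.ones_like(likelihood) / len(likelihood)
    aged_like = w * likelihood + (1.0 - w) * flat
    post = prior * aged_like
    return post / post.sum()

prior = np.array([0.6, 0.3, 0.1])                  # P(target class)
like = np.array([0.1, 0.7, 0.2])                   # P(observation | class)
print(aged_posterior(prior, like, age=5.0))        # recent observation: strong effect
print(aged_posterior(prior, like, age=120.0))      # old observation: posterior ~ prior
```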
Temporal Fusion in Multi-Sensor Target Tracking Systems
• In a multi-sensor tracking system, sensors can be either synchronous or asynchronous (temporally staggered)
• T: sampling interval of the synchronous sensors
• T1: time difference between sensor 1 and sensor 2 in the asynchronous case; T2 is the remainder of the interval back to sensor 1's next sample, so T = T1 + T2
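A minimal sketch of fusing temporally staggered measurements with a constant-velocity Kalman filter: the filter predicts forward by the actual elapsed time to each sensor's measurement before updating. The 1-D motion model, noise levels, and stagger values are illustrative assumptions.

```python
import numpy as np

def kf_async(measurements, q=0.5, r=4.0):
    """Constant-velocity Kalman filter over 1-D position measurements
    arriving at arbitrary (staggered) times.  `measurements` is a list
    of (time, sensor_id, z) sorted by time."""
    x = np.zeros(2)                      # state: [position, velocity]
    P = np.eye(2) * 100.0
    H = np.array([[1.0, 0.0]])
    t_prev = measurements[0][0]
    for t, _sensor, z in measurements:
        dt = t - t_prev
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
        x, P = F @ x, F @ P @ F.T + Q                    # predict to time t
        S = H @ P @ H.T + r
        K = P @ H.T / S                                  # Kalman gain
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        t_prev = t
    return x, P

# Two sensors staggered by T1 within a common interval T = T1 + T2.
T, T1 = 1.0, 0.3
meas = [(k * T + (0.0 if s == 1 else T1), s,
         5.0 * (k * T + (0.0 if s == 1 else T1)) + np.random.normal(0, 2))
        for k in range(20) for s in (1, 2)]
x, P = kf_async(sorted(meas))
```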
Transitions (1)
• Government:
  • Interactions with AFRL, ONR, NASA, NIMA
  • Presentations to President Bush and Gov. Ridge
  • Presentations to program directors at STRICOM
• Industry:
  • Raytheon, Lockheed Martin, Boeing, Sarnoff
  • HJW, Sick, Bosch, Astech, Airborne 1
  • Sensis, Andro Computing Solutions
  • Olympus
  • Rhythm and Hues Studio
Publications (1)
• C. Früh and A. Zakhor, "3D model generation for cities using aerial photographs and ground level laser scans," Computer Vision and Pattern Recognition, Hawaii, USA, 2001, vol. 2, pp. II-31-8.
• H. Foroosh, "A closed-form solution for optical flow by imposing temporal constraints," Proceedings of the 2001 International Conference on Image Processing, vol. 3, pp. 656-9.
• C. Früh and A. Zakhor, "Data processing algorithms for generating textured 3D building façade meshes from laser scans and camera images," accepted to 3D Data Processing, Visualization and Transmission, Padua, Italy, 2002.
• John Flynn, "Motion from Structure: Robust Multi-Image, Multi-Object Pose Estimation," Master's thesis, U.C. Berkeley, Spring 2002.
• S. You and U. Neumann, "Fusion of Vision and Gyro Tracking for Robust Augmented Reality Registration," IEEE VR 2001, pp. 71-78, March 2001.
• B. Jiang and U. Neumann, "Extendible Tracking by Line Auto-Calibration," submitted to ISAR 2001.
• J. W. Lee, "Large Motion Estimation for Omnidirectional Vision," PhD thesis, University of Southern California, 2002.
Publications (2)
• J. W. Lee, B. Jiang, S. You, and U. Neumann, "Tracking with Vision for Outdoor Augmented Reality Systems," submitted to IEEE Journal of Computer Graphics and Applications, special edition on tracking technologies, 2002.
• William Ribarsky, "Towards the Visual Earth," Workshop on Intersection of Geospatial and Information Technology, National Research Council, October 2001.
• William Ribarsky, Christopher Shaw, Zachary Wartell, and Nickolas Faust, "Building the Visual Earth," to be published, SPIE 16th International Conference on Aerospace/Defense Sensing, Simulation, and Controls, 2002.
• David Krum, William Ribarsky, Chris Shaw, Larry Hodges, and Nickolas Faust, "Situational Visualization," ACM VRST 2001, pp. 143-150, 2001.
• David Krum, Olugbenga Omoteso, William Ribarsky, Thad Starner, and Larry Hodges, "Speech and Gesture Multimodal Control of a Whole Earth 3D Virtual Environment," to be published, Eurographics-IEEE Visualization Symposium 2002. Winner of the SAIC Best Student Paper award.
• William Ribarsky, Tony Wasilewski, and Nickolas Faust, "From Urban Terrain Models to Visible Cities," to be published, IEEE CG&A.
• David Krum, Olugbenga Omoteso, William Ribarsky, Thad Starner, and Larry Hodges, "Evaluation of a Multimodal Interface for 3D Terrain Visualization," submitted to IEEE Visualization 2002.
Publications (3)
• Justin Jang, William Ribarsky, Chris Shaw, and Nickolas Faust, "View-Dependent Multiresolution Splatting of Non-Uniform Data," Eurographics-IEEE Visualization Symposium 2002, pp. 125-132.
• C. K. Mohan, K. G. Mehrotra, and P. K. Varshney, "Temporal Update Mechanisms for Decision Making with Aging Observations in Probabilistic Networks," Proc. AAAI Fall Symposium, Cape Cod, MA, Nov. 2001.
• R. Niu, P. K. Varshney, K. G. Mehrotra, and C. K. Mohan, "Temporal Fusion in Multi-Sensor Target Tracking Systems," to appear in Proceedings of the Fifth International Conference on Information Fusion, Annapolis, Maryland, July 2002.
• Q. Cheng, P. K. Varshney, K. G. Mehrotra, and C. K. Mohan, "Optimal Bandwidth Assignment for Distributed Sequential Detection," to appear in Proceedings of the Fifth International Conference on Information Fusion, Annapolis, Maryland, July 2002.
• Suresh Lodha, Amin P. Charaniya, Nikolai M. Faaland, and Srikumar Ramalingam, "Visualization of Spatio-Temporal GPS Uncertainty within a GIS Environment," to appear in the Proceedings of the SPIE Conference on Aerospace/Defense Sensing, Simulation, and Controls, April 2002.
• Suresh K. Lodha, Nikolai M. Faaland, Amin P. Charaniya, Pramod Varshney, Kishan Mehrotra, and Chilukuri Mohan, "Uncertainty Visualization of Probabilistic Particle Movement," to appear in the Proceedings of the IASTED Conference on Computer Graphics and Imaging, August 2002.
Publications (4)
• Suresh K. Lodha, Amin P. Charaniya, and Nikolai M. Faaland, "Visualization of GPS Uncertainty in a GIS Environment," Technical Report UCSC-CRP-02-22, University of California, Santa Cruz, April 2002, pp. 1-100.
• Suresh K. Lodha, Nikolai M. Faaland, Grant Wong, Amin Charaniya, Srikumar Ramalingam, and Arthur Keller, "Consistent Visualization and Querying of Geospatial Databases by a Location-Aware Mobile Agent," in preparation, to be submitted to the ACM GIS Conference, November 2002.
• Suresh K. Lodha, Nikolai M. Faaland, and Jose Renteria, "Hierarchical Topology Preserving Simplification of Vector Fields using Bintrees and Triangular Quadtrees," submitted for publication to IEEE Transactions on Visualization and Computer Graphics.
• Lilly Spirkovska and Suresh K. Lodha, "AWE: Aviation Weather Data Visualization Environment," Computers and Graphics, vol. 26, no. 1, February 2002, pp. 169-191.
• Suresh K. Lodha, Krishna M. Roskin, and Jose Renteria, "Hierarchical Topology Preserving Compression of 2D Terrains," submitted for publication to Computer Graphics Forum.
Publications (5)
• Suresh K. Lodha and Arvind Verma, "Spatio-Temporal Visualization of Urban Crimes on a GIS Grid," Proceedings of the ACM GIS Conference, November 2000, ACM Press, pp. 174-179.
• Christopher Campbell, Michael Shafae, Suresh K. Lodha, and D. Massaro, "Multimodal Representations for the Exploration of Multidimensional Fuzzy Data," submitted for publication to Behavior Research, Instruments, and Computers.
• Suresh K. Lodha, Jose Renteria, and Krishna M. Roskin, "Topology Preserving Compression of 2D Vector Fields," Proceedings of IEEE Visualization 2000, October 2000, pp. 343-350.
Outline of Talks
• A. Zakhor, U.C. Berkeley, "Overview"
• C. Früh, U.C. Berkeley, "Fast 3D model construction of urban environments"
• U. Neumann, USC, "Tracking and Data Fusion for 4D Visualization"
• Bill Ribarsky, Georgia Tech, "4D Modeling and Mobile Visualization"
• Lunch
• Pramod Varshney, Syracuse, "Temporal Uncertainty Computation, Fusion, and Visualization in Multisensor Environments"
• S. Lodha, U.C. Santa Cruz, "Uncertainty Quantification and Visualization: Mobile Targets within Geo-Spatially Registered Terrains"
• Discussion, feedback from Government