The 3DDI project, a collaboration between MURI, UC Berkeley, and MIT, aims to develop a pipeline for 3D capture, modeling, simulation, and rendering, with applications in tele-surgery, training, collaboration, and more.
3DDI Visualization: MURI, UC Berkeley and MIT
3DDI: Overview • Project pipeline: 3D capture → modeling, simulation → rendering → 3D display • Applications: tele-surgery, training, collaboration
3DDI: Goals • Direct interaction: no gloves or glasses. • Animated content: interaction in real time. • Content is real-world: 3D models from live capture and modeling. [Diagram: laser scanner, virtual object, 3D display]
Task: 3D capture using range scanner • To build a solid-state, high-accuracy electronic range-finding scanner. • The system should serve as a replacement for mechanical scanners and motion-capture devices and be usable indoors and outdoors. • Desired performance: • Outdoors, sub-meter accuracy at 100s of meters, scans in less than a second. • Indoors, millimeter accuracy at several meters, scans at 20-60 frames/sec.
Subtasks • Fabrication, testing, and improvement of high-power, flip-bonded VCSEL arrays. • Integration of scanner components; design of custom elements (modulator, amplifier, and power supplies). • Purchase and integration of coupling optics. • Illumination demo with VCSEL source, photomultiplier, and CCD. • Code for image-sequence processing and calibration. • Static scan example, integration with dynamic authoring tool (Steve Chenney).
3D Imaging System, U.C.B. [System diagram: VCSEL array, CCD, IR light, Fuji lens, imaging optics, MCP, power supply, HF signal; portable platform]
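The slides name the hardware but not the ranging principle. Assuming an amplitude-modulated time-of-flight design (suggested, though not stated, by the modulator and HF signal components), depth would follow from the phase shift of the returned modulation; the sketch below uses hypothetical parameter names.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def depth_from_phase(phase_rad, mod_freq_hz):
    """Depth for an AM time-of-flight rangefinder (illustrative, hypothetical parameters).

    phase_rad   : measured phase shift of the returned modulation, in radians
    mod_freq_hz : modulation frequency of the illumination source

    The unambiguous range is c / (2 * f); larger distances alias.
    """
    return C * phase_rad / (4.0 * np.pi * mod_freq_hz)

# Example: a 10 MHz modulation gives a ~15 m unambiguous range;
# a phase shift of pi/2 corresponds to ~3.75 m.
print(depth_from_phase(np.pi / 2, 10e6))
```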
First Scanned Image • Bottle image: depth range ~1.2 m, accuracy ~0.3 cm
Task: Model Capture Using Pose Cameras • Urban geometry • Textures/BRDFs for reillumination … Develop effective sensors and automated and semi-automated software tools for rapid environment capture • Synergistic efforts at UCB: 3D scanner, illumination capture … How can we import 3D scene data quickly and automatically? … Starting point for visualization, design, simulation, teaching.
Goals of integrated effort • Acquire geo-referenced digital imagery of MIT campus from ground, air • Extract building exteriors from imagery, using fully automatic techniques • Model building interiors semi-automatically from existing 2D building floorplans • Attach dense interior phototextures to geometry, semi-automatically • Integrate photometrics, interaction, and dynamic simulation from UCB
Acquisition of geo-referenced imagery • Argus platform performs sensor fusion of imagery and navigation information
Geo-referencing of multiple nodes • Currently a semi-automated process requiring less than one person-second per image
Texture extraction • Estimation based on weighted medians
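The slide gives only the estimator's name; below is a minimal weighted-median sketch, assuming per-sample weights that encode confidence (for example, down-weighting oblique or occluded views). The weighting here is an illustration, not the project's actual scheme.

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: smallest value whose cumulative weight reaches half the total."""
    order = np.argsort(values)
    v, w = np.asarray(values)[order], np.asarray(weights)[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

# Texel intensity from several observations of the same surface point.
samples = [0.82, 0.80, 0.31, 0.79]   # 0.31 is an outlier, e.g. an occluded view
weights = [1.0, 0.9, 0.2, 0.8]       # lower confidence for the outlier view
print(weighted_median(samples, weights))  # robustly close to ~0.8
```

The weighted median keeps a single observed value per texel, so occasional occlusions or specular highlights do not bleed into the extracted texture the way a weighted mean would allow.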
Progress year 3 • Acquire hemispherical interior imagery • Merge ground, aerial geo-ref’d imagery • Extension to temporal modeling • Continuous site modeling of changing site • Test: Building 20 demolition, construction
Task: Capturing Geometry and Reflectance from Photographs • Input from Cameras, Pose Cameras, Laser Scanners • Output to Conventional and 3D Displays
Progress year 1 • Extend Façade to Parametrized Curved Objects • Visibility Processing and Real-time Rendering • Campanile Movie • High Dynamic Range Photography
Research Highlights Year 1 • Façade: accelerated using α-blending. • Façade: extended to circularly symmetric objects. • The Campanile Movie was shown at SIGGRAPH '97.
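The α-blending acceleration is compressed into one bullet; the sketch below illustrates view-dependent texture weights in the general image-based rendering style, assuming each source photograph is weighted by how closely its viewing direction matches the novel view. This is an illustrative scheme, not necessarily Façade's exact weighting.

```python
import numpy as np

def blend_weights(novel_dir, camera_dirs):
    """Alpha weights for view-dependent texture mapping (illustrative scheme).

    novel_dir   : unit vector from the surface point toward the novel viewpoint
    camera_dirs : unit vectors from the surface point toward each source camera
    Returns weights summing to 1 that favor cameras aligned with the novel view.
    """
    cos_angles = np.clip(np.asarray(camera_dirs) @ np.asarray(novel_dir), 0.0, 1.0)
    if cos_angles.sum() == 0.0:
        return np.full(len(camera_dirs), 1.0 / len(camera_dirs))
    return cos_angles / cos_angles.sum()

# Two source photos: one nearly aligned with the novel view, one more oblique.
print(blend_weights(np.array([0.0, 0.0, 1.0]),
                    [np.array([0.1, 0.0, 0.995]), np.array([0.7, 0.0, 0.714])]))
```

Because the weights are plain alpha values, the blend can run in graphics hardware, which is where the speedup comes from.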
Progress year 2 • Photometric Properties of Architectural Scenes • Capturing and Using Complex Natural Illumination • Video Motion Capture
Research Highlights Year 2 • Calculation of radiance with known (outdoor) illumination • Re-rendering under novel lighting
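As a minimal sketch of the re-rendering step, assuming Lambertian surfaces (the slide does not state the reflectance model): divide observed radiance by the known outdoor irradiance to estimate albedo, then multiply by the novel irradiance.

```python
import numpy as np

def rerender_lambertian(observed_radiance, known_irradiance, novel_irradiance, eps=1e-6):
    """Re-render a Lambertian scene under new lighting (simplified, per-pixel).

    observed_radiance : image captured under the known illumination
    known_irradiance  : irradiance at each pixel under that illumination
    novel_irradiance  : irradiance at each pixel under the new illumination
    """
    albedo = observed_radiance / np.maximum(known_irradiance, eps)  # reflectance estimate
    return albedo * novel_irradiance

# Toy 2x2 example: halving the irradiance halves the re-rendered radiance.
obs = np.array([[0.4, 0.2], [0.1, 0.3]])
irr = np.ones((2, 2))
print(rerender_lambertian(obs, irr, 0.5 * irr))
```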
Research Highlights Year 2 • Rendering synthetic objects into real scenes using HDR photography. [Image: real + synthetic objects]
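A minimal sketch of assembling a radiance map from bracketed exposures, assuming a linear camera response; the published HDR technique also recovers the response curve, which is omitted here, and the images and times below are hypothetical.

```python
import numpy as np

def hdr_radiance(exposures, times, lo=0.05, hi=0.95):
    """Merge bracketed exposures into a relative radiance map (linear response assumed).

    exposures : list of images with pixel values in [0, 1]
    times     : exposure time of each image, in seconds
    Pixels near 0 or 1 are down-weighted as under- or over-exposed.
    """
    num = np.zeros_like(exposures[0], dtype=float)
    den = np.zeros_like(exposures[0], dtype=float)
    for img, t in zip(exposures, times):
        w = np.where((img > lo) & (img < hi), 1.0 - np.abs(2.0 * img - 1.0), 1e-4)
        num += w * img / t        # each exposure's radiance estimate: value / time
        den += w
    return num / den

# Two toy exposures of the same scene at 1/60 s and 1/15 s.
e1 = np.array([[0.10, 0.50], [0.90, 0.99]])
e2 = np.array([[0.40, 0.99], [0.99, 0.99]])
print(hdr_radiance([e1, e2], [1/60, 1/15]))
```

The resulting radiance map can then drive the illumination of synthetic objects composited into the photograph.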
Research Highlights Year 2 • Acquisition of motion data from video using kinematic models
Progress year 3 • Reflectance Recovery from MIT Pose Camera Data • Inverse Global Illumination
A Synthetic Sunrise Sequence [Rendered frames at 5:00am, 5:30am, 6:00am, 6:30am, 7:00am, 8:00am, 9:00am, and 10:00am: one day at the end of March]
Inverse Global Illumination Algorithm Developed • Inputs: radiance maps, light sources, geometry • Output: reflectance properties
Progress Year 4 • Input: multiple range scans of a scene; multiple photographs of the same scene • Output: geometric meshes of each object in the scene; registered texture maps for the objects
Overview [Pipeline diagram] Range images → registration → point cloud → segmentation → point groups → reconstruction → meshes → simplified meshes; radiance images → pose estimation → calibrated images → texture map synthesis → texture maps → textured objects
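The registration stage is only named in the pipeline; the sketch below shows one rigid-alignment step in the ICP style, assuming point correspondences are already known (a real system iterates correspondence search and alignment).

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t so that R @ src_i + t ≈ dst_i (least squares).

    src, dst : (N, 3) arrays of corresponding 3D points.
    Uses the standard SVD (Kabsch) solution, without scale.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                        # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Sanity check: recover a known 90-degree rotation about z plus a translation.
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
dst = src @ Rz.T + np.array([2., 0., -1.])
R, t = rigid_align(src, dst)
print(np.allclose(R, Rz), np.allclose(t, [2., 0., -1.]))  # True True
```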
Camera Pose Results • Accuracy: consistently within 2 pixels • Correctness: correct pose for 58 out of 62 images
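The 2-pixel figure is a reprojection accuracy; below is a minimal sketch of how such an error could be measured, assuming a pinhole model with a known 3x4 projection matrix. The matrix and points are hypothetical.

```python
import numpy as np

def reprojection_error(P, points_3d, points_2d):
    """Mean reprojection error in pixels for a 3x4 camera projection matrix P.

    points_3d : (N, 3) world points
    points_2d : (N, 2) measured image points
    """
    X = np.hstack([points_3d, np.ones((len(points_3d), 1))])  # homogeneous coordinates
    proj = (P @ X.T).T
    proj = proj[:, :2] / proj[:, 2:3]                          # perspective divide
    return np.linalg.norm(proj - points_2d, axis=1).mean()

# Toy check: a camera along +z with focal length 500 px, principal point (320, 240).
P = np.array([[500., 0., 320., 0.],
              [0., 500., 240., 0.],
              [0., 0., 1., 0.]])
pts3 = np.array([[0., 0., 2.], [0.4, -0.2, 2.]])
pts2 = np.array([[320., 240.], [420., 190.]])
print(reprojection_error(P, pts3, pts2))  # ~0 for consistent data
```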
Image-based Modeling and Rendering • 3rd generation: vary spatial configurations in addition to viewpoint and lighting [Images: novel viewpoint; novel viewpoint and configuration]
Task: Authoring Huge, Dynamic Visual Simulations • Efficiency: too much time is spent computing needless dynamic state, and dynamic authoring is not integrated with geometric design. • Control: physics doesn't do what an author wants. • Success is measured through speedups and the control of example scenarios.
How it relates to MURI (3D capture; modeling, simulation; rendering; 3D display) • Take models from measured data, e.g. architecture. • Author scenarios and simulate the dynamics, e.g. a traffic accident. • Provide dynamic models for efficient rendering. • Integration example: simulating with a scanned bottle.
Year 1: Culling with consistency • Exploit viewer uncertainty to achieve efficient dynamics culling • Significant speedups demonstrated: • Around 5x for test environments. • Arbitrary depending on the world. • Tools released for VRML authoring. • Papers in I3D, VRML98 and CGA.
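As an illustration of dynamics culling (not the published consistency algorithm), the sketch below advances a body's state only while it could be visible to the viewer and catches it up from a closed-form solution otherwise; the class, names, and ballistic model are hypothetical.

```python
class CulledBody:
    """Toy falling body whose dynamic state is only updated when it might be seen."""
    def __init__(self, x, y):
        self.x, self.y, self.vy = x, y, 0.0
        self.last_update = 0.0

    def catch_up(self, t, g=-9.8):
        """Advance closed-form ballistic state from last_update to time t."""
        dt = t - self.last_update
        self.y += self.vy * dt + 0.5 * g * dt * dt
        self.vy += g * dt
        self.last_update = t

def step_world(bodies, t, viewer_x, view_radius):
    """Only bodies within the viewer's potentially-visible radius are advanced now;
    the rest keep a stale state and are caught up when they come into view."""
    for b in bodies:
        if abs(b.x - viewer_x) <= view_radius:
            b.catch_up(t)

bodies = [CulledBody(x=0.0, y=100.0), CulledBody(x=500.0, y=100.0)]
for t in (1.0, 2.0, 3.0):
    step_world(bodies, t, viewer_x=0.0, view_radius=50.0)
print(bodies[0].y, bodies[1].y)   # near body simulated; far body untouched (still 100.0)
```

The speedup comes from never spending simulation time on state the viewer could not have observed.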
Year 2 and 3: Directing Scenarios • Use physical sources of randomness (e.g. rough surfaces, variable initial conditions) to direct physical simulations • Year 2: Directing a single body • Year 3: Directing multiple interacting bodies • Along the way: Fast multi-body simulation techniques
Integration Example: Details • Captured data and 3D rendering must be linked by an authoring phase. • Extract radius information from the 3D bottle scan, plus an estimate of variance. • Simulate using MCMC to achieve a goal: balls are deflected by bottles to land in the right place. • Render on the autostereoscopic display.
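The MCMC step is condensed to one bullet; below is a minimal Metropolis-style sketch over a toy 1-D simulation, where the sampled quantity is a launch angle and the goal is a landing position. The simulation, score function, and parameters are hypothetical stand-ins for the bottle-and-balls scenario.

```python
import math
import random

def simulate(angle, speed=10.0, g=9.8):
    """Toy projectile: horizontal landing distance for a given launch angle (radians)."""
    return speed * speed * math.sin(2.0 * angle) / g

def score(angle, target):
    """Higher is better: closeness of the landing point to the target."""
    return math.exp(-abs(simulate(angle) - target))

def direct_simulation(target, iters=5000, step=0.05, seed=0):
    """Metropolis sampling over initial conditions so the simulation hits the goal."""
    rng = random.Random(seed)
    angle = 0.4
    best = (score(angle, target), angle)
    for _ in range(iters):
        proposal = angle + rng.gauss(0.0, step)
        accept = score(proposal, target) / score(angle, target)   # acceptance ratio
        if rng.random() < accept:
            angle = proposal
            if score(angle, target) > best[0]:
                best = (score(angle, target), angle)
    return best[1]

angle = direct_simulation(target=8.0)
print(angle, simulate(angle))   # landing distance should be close to 8.0
```

Because the sampling perturbs physically plausible quantities (here the initial condition), the directed result still looks like an unremarkable physical outcome.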
Task: Integration of Modeling and Simulation • Incorporate data from multiple sources: • Geodetic capture (MIT); floorplan extrusion, instancing (UCB) • Geometry compilation for responsiveness: • Scalable, persistent proximity/visibility database (UCB, MIT) • Natural, extensible constraint-based interaction • Object associations framework (UCB) • Physically-based kinematics: • Fire simulation (UCB; shown in '98) • Impulse-response simulation (UCB)
Several generations of system components: • 1990-93: WalkThrough system (UCB) • Rapid visualization of complex models • 1993-94: Radiosity integration (Princeton) • Diffuse illumination throughout model • 1994-95: Object associations (UCB) • Natural object instancing & placement • 1994-97: FireWalk, Impulse (UCB) • Physically-based fire, kinematic simulations • 1996-99: Façade, Skymaps (UCB) • High-fidelity photo-assisted modeling • 1996-99: City Scanning (MIT) • Acquisition of extended urban models
Dataset Integration: Geo-referencing • Argus data is geodetically registered
Dataset Integration: Exterior structure • Exteriors in UCB FireWalk framework
Integration of UCB object associations • Infrastructure supports editing at any scale
Exterior to interior transition • Seamless transition to Tech Square interior
Transition: building approach • A gravity association keeps the viewer on the local ground surface
Visibility modifications: exterior, interior • Cell-portal visibility applies throughout
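A minimal sketch of cell-portal traversal, assuming a simple cell adjacency structure rather than the actual WalkThrough data structures; a full implementation would also narrow the view frustum through each portal before recursing.

```python
def visible_cells(start_cell, portals, portal_visible, frustum):
    """Collect cells reachable from start_cell through a chain of visible portals.

    portals        : dict mapping cell id -> list of (portal, neighbor_cell_id)
    portal_visible : predicate(portal, frustum) -> bool
    """
    seen = {start_cell}
    stack = [(start_cell, frustum)]
    while stack:
        cell, view = stack.pop()
        for portal, neighbor in portals.get(cell, []):
            if neighbor not in seen and portal_visible(portal, view):
                seen.add(neighbor)
                stack.append((neighbor, view))   # simplification: frustum not narrowed
    return seen

# Toy layout: lobby <-> hall <-> office; a closed door hides the office.
portals = {"lobby": [("door_a", "hall")],
           "hall": [("door_a", "lobby"), ("door_b", "office")]}
door_open = {"door_a": True, "door_b": False}
print(visible_cells("lobby", portals, lambda p, f: door_open[p], frustum=None))
# {'lobby', 'hall'} -- the office is culled because door_b is closed
```

The same traversal applies indoors and out, which is why cell-portal visibility can span the exterior-to-interior transition described above.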
Door passages using object associations • Opening doors to allow passage
Integration of UCB floorsketch, firewalk • Tech Square interiors modeled by procedural floorplan extrusion, furniture instancing
Integration of UCB Impulse-Response • Automated generation of RBL objects • Requires specification as union of convex parts • Initial integration: population, visualization
Extension to Impulse: sleeping objects • Added “sleep state” for objects coming to rest
Extension to Impulse: interaction • Added interactive application of forces
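A minimal sketch of the two Impulse extensions, assuming (hypothetically) that a body sleeps after its kinetic energy stays below a threshold for a fixed number of frames and is woken by any applied impulse, such as an interactive force; the class and thresholds are illustrative, not Impulse's actual interface.

```python
class RigidBodyState:
    """Bookkeeping for putting a resting body to sleep and waking it on interaction."""
    def __init__(self, mass, sleep_energy=1e-3, sleep_frames=30):
        self.mass = mass
        self.velocity = [0.0, 0.0, 0.0]
        self.sleep_energy = sleep_energy
        self.sleep_frames = sleep_frames
        self.low_energy_frames = 0
        self.asleep = False

    def kinetic_energy(self):
        return 0.5 * self.mass * sum(v * v for v in self.velocity)

    def end_of_frame(self):
        """Call once per simulation step; bodies below the energy threshold long enough sleep."""
        if self.asleep:
            return
        if self.kinetic_energy() < self.sleep_energy:
            self.low_energy_frames += 1
            if self.low_energy_frames >= self.sleep_frames:
                self.asleep = True
                self.velocity = [0.0, 0.0, 0.0]
        else:
            self.low_energy_frames = 0

    def apply_impulse(self, impulse):
        """An impulse (e.g. a collision or an interactively applied force) wakes the body."""
        self.asleep = False
        self.low_energy_frames = 0
        for i in range(3):
            self.velocity[i] += impulse[i] / self.mass
```

Sleeping bodies are simply skipped by the integrator until something wakes them, which keeps large populated environments responsive.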