
3DDI: 3D Capture, Modeling, Simulation, and Rendering for Visualization Applications

The 3DDI project, a MURI collaboration between UC Berkeley and MIT, aims to develop a pipeline for 3D capture, modeling, simulation, and rendering, with applications in tele-surgery, training, collaboration, and more.


Presentation Transcript


  1. 3DDI Visualization MURI UC Berkeley and MIT

  2. 3DDI: Overview • Project pipeline: 3D capture → modeling, simulation → rendering → 3D display • Applications: tele-surgery, training, collaboration

  3. 3DDI: Goals • Direct interaction: no gloves or glasses. • Animated content: interaction in real time. • Real-world content: 3D models from live capture and modeling. (Diagram: laser scanner → virtual object → 3D display)

  4. Task: 3D capture using range scanner • To build a solid-state, high-accuracy electronic range-finding scanner. • The system should serve as a replacement for mechanical scanners and motion-capture devices and be usable indoors and outdoors. • Desired performance: • Outdoors, sub-meter accuracy at 100s of meters, scans in less than a second. • Indoors, millimeter accuracy at several meters, scans at 20-60 frames/sec.
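The phase-based principle behind such an electronic range finder can be sketched in a few lines. This is a minimal illustration, not the UCB design; the modulation frequency and function name are assumptions.

```python
import math

C = 3.0e8  # speed of light, m/s

def depth_from_phase(phase_shift_rad, mod_freq_hz):
    """Depth from the phase shift of amplitude-modulated light.

    The round trip delays the return by phase = 2*pi*f*(2d/c),
    so d = c * phase / (4*pi*f). The measurement is unambiguous
    out to half a modulation wavelength, c / (2f).
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A 100 MHz modulation gives a 1.5 m unambiguous range; a
# quarter-cycle phase shift corresponds to 0.375 m of depth.
d = depth_from_phase(math.pi / 2, 100e6)
```

Higher modulation frequencies improve depth resolution at the cost of a shorter unambiguous range, which matches the slide's split between outdoor (long-range, coarser) and indoor (short-range, millimeter) targets.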

  5. Subtasks • Fabrication, testing and improvement of high-power, flip-bonded VCSEL arrays. • Integration of scanner components; design of custom elements (modulator, amplifier and power supplies). • Purchase and integration of coupling optics. • Illumination demo with VCSEL source, photomultiplier and CCD. • Code for image-sequence processing and calibration. • Static scan example; integration with dynamic authoring tool (Steve Chenney).

  6. 3D Imaging System, U.C.B. (Diagram: VCSEL array, CCD, IR light, Fuji lens, imaging optics, MCP, power supply, HF signal; a portable platform!)

  7. First Scanned Image • Bottle image: depth range ~1.2 m; accuracy ~0.3 cm.

  8. Task: Model Capture Using Pose Cameras • Urban geometry • Textures/BRDFs for re-illumination … Develop effective sensors, automated and semi-automated software tools for rapid environment capture • Synergistic efforts at UCB: 3D scanner; illumination capture … How can we import 3D scene data quickly and automatically? … Starting point for visualization, design, simulation, teaching.

  9. Goals of integrated effort • Acquire geo-referenced digital imagery of MIT campus from ground, air • Extract building exteriors from imagery, using fully automatic techniques • Model building interiors semi-automatically from existing 2D building floorplans • Attach dense interior phototextures to geometry, semi-automatically • Integrate photometrics, interaction, and dynamic simulation from UCB

  10. Acquisition of geo-referenced imagery • Argus platform performs sensor fusion of imagery, navigation information

  11. Geo-referencing of multiple nodes • Currently a semi-automated process requiring less than one person-second per image

  12. Texture extraction • Estimation based on weighted medians
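The weighted-median estimator named on the slide can be sketched as follows. The per-view confidence weights here are hypothetical; the slide does not specify the actual weighting used in the MIT pipeline.

```python
def weighted_median(samples, weights):
    """Weighted median: the smallest sample at which cumulative
    weight reaches half the total. Robust to outlier observations
    such as occluders or specular highlights in individual views,
    which would pull a weighted mean off the true texel color."""
    pairs = sorted(zip(samples, weights))
    half = sum(weights) / 2.0
    acc = 0.0
    for value, w in pairs:
        acc += w
        if acc >= half:
            return value
    return pairs[-1][0]

# One texel observed in five images; the low-confidence outlier 250
# (e.g., a passing occluder) does not disturb the estimate.
est = weighted_median([120, 118, 250, 121, 119],
                      [1.0, 0.9, 0.2, 1.0, 0.8])
```

Running the estimator independently per texel yields a facade texture that suppresses transient occluders without any explicit occlusion reasoning.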

  13. Progress year 3 • Acquire hemispherical interior imagery • Merge ground, aerial geo-referenced imagery • Extension to temporal modeling: continuous modeling of a changing site • Test case: Building 20 demolition, construction

  14. Task: Capturing Geometry and Reflectance from Photographs • Input from Cameras, Pose Cameras, Laser Scanners • Output to Conventional and 3D Displays

  15. Progress year 1 • Extend Façade to Parametrized Curved Objects • Visibility Processing and Real-time Rendering • Campanile Movie • High Dynamic Range Photography

  16. Research Highlights Year 1 • Façade: Accelerated using α-blending. • Façade: extended to circularly symmetric objects. Campanile Movie shown at SIGGRAPH’97

  17. Progress year 2 • Photometric Properties of Architectural Scenes • Capturing and Using Complex Natural Illumination • Video Motion Capture

  18. Research Highlights Year 2 • Calculation of radiance with known (outdoor) illumination: • Re-rendering under novel lighting:

  19. Research Highlights Year 2 • Rendering synthetic objects into real scenes using HDR photography. Real+Synthetic objects:
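One core step of the HDR-photography approach, fusing differently exposed observations of the same scene point into a radiance estimate, can be sketched as below. For brevity this assumes a linear camera response; the full technique also recovers the camera's nonlinear response curve from the exposure stack.

```python
def hdr_radiance(pixels, exposure_times):
    """Combine multiple exposures of one pixel into a radiance
    estimate. Each 8-bit observation is divided by its exposure
    time; a hat-shaped weight trusts mid-range values and
    discounts near-saturated or near-black ones."""
    num = den = 0.0
    for z, t in zip(pixels, exposure_times):
        w = min(z, 255 - z)          # hat weighting over [0, 255]
        num += w * (z / t)
        den += w
    return num / den if den > 0 else 0.0

# The same scene point at three shutter speeds; a well-calibrated
# stack yields a consistent radiance estimate across exposures.
E = hdr_radiance([50, 100, 200], [1 / 16, 1 / 8, 1 / 4])
```

The resulting radiance map can then light synthetic objects with the real scene's illumination, which is what makes the composited "real + synthetic" images on the slide match photometrically.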

  20. Research Highlights Year 2 • Acquisition of motion data from video using kinematic models:

  21. Progress year 3 • Reflectance Recovery from MIT Pose Camera Data • Inverse Global Illumination

  22. A Synthetic Sunrise Sequence • One day at the end of March: 5:00am, 5:30am, 6:00am, 6:30am, 7:00am, 8:00am, 9:00am, 10:00am

  23. Inverse Global Illumination Algorithm Developed • Inputs: radiance maps, light sources, geometry • Output: reflectance properties

  24. Real vs. Synthetic for Original Lighting

  25. Real vs. Synthetic for Novel Lighting

  26. Progress Year 4 • Input: multiple range scans of a scene; multiple photographs of the same scene • Output: geometric meshes of each object in the scene; registered texture maps for objects

  27. Overview • Pipeline stages: registration, segmentation, reconstruction, pose estimation, texture-map synthesis • Data products: range images, point cloud, point groups, meshes, simplified meshes, radiance images, calibrated images, texture maps, objects

  28. Segmentation Results

  29. Camera Pose Results • Accuracy: consistently within 2 pixels • Correctness: correct pose for 58 out of 62 images

  30. Texture-Mapping and Object Manipulation

  31. Image-based Modeling and Rendering • 3rd generation: vary spatial configurations in addition to viewpoint and lighting (Images: novel viewpoint; novel viewpoint & configuration)

  32. Texture-Mapping and Object Manipulation

  33. Task: Authoring Huge, Dynamic Visual Simulations • Efficiency • Too much time is spent computing needless dynamic state, and dynamic authoring is not integrated with geometric design. • Control • Physics doesn’t do what an author wants • Success is measured through speedups and the control of example scenarios.

  34. How it relates to MURI • Take models from measured data, e.g., architecture. • Author scenarios and simulate the dynamics, e.g., a traffic accident. • Provide dynamic models for efficient rendering. • Integration example: simulating with a scanned bottle. (Pipeline: 3D capture → modeling, simulation → rendering → 3D display)

  35. Year 1: Culling with consistency • Exploit viewer uncertainty to achieve efficient dynamics culling • Significant speedups demonstrated: • Around 5x for test environments. • Arbitrary depending on the world. • Tools released for VRML authoring. • Papers in I3D, VRML98 and CGA.
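The culling idea, advancing only what the viewer can see and reconstructing the rest on demand, can be illustrated with a toy free-flight example. Class and function names are hypothetical; the published work additionally bounds what the viewer could have observed, so that the catch-up state is consistent with any past observation.

```python
class Ball:
    """Toy object with trivial linear dynamics."""
    def __init__(self, x, vx):
        self.x, self.vx = x, vx

def step_world(objects, visible, dt):
    # Advance only visible objects; invisible ones are left stale,
    # saving their per-frame dynamics cost entirely.
    for i, obj in enumerate(objects):
        if visible(i):
            obj.x += obj.vx * dt

def catch_up(obj, elapsed):
    # When a stale object becomes visible again, jump it to a state
    # consistent with the unobserved interval (exact for free motion;
    # for complex dynamics, any observation-consistent sample works).
    obj.x += obj.vx * elapsed

balls = [Ball(0.0, 1.0), Ball(0.0, 1.0)]
step_world(balls, lambda i: i == 0, 0.1)   # only ball 0 is on-screen
catch_up(balls[1], 0.1)                    # ball 1 reappears later
```

Both balls end at the same position, so the viewer cannot tell that the second one was never simulated; the speedup scales with the fraction of the world that is out of view.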

  36. Year 2 and 3: Directing Scenarios • Use physical sources of randomness (e.g., rough surfaces, variable initial conditions) to direct physical simulations • Year 2: directing a single body • Year 3: directing multiple interacting bodies • Along the way: fast multi-body simulation techniques

  37. Integration Example: Details • Captured data and 3D rendering must be linked by an authoring phase. • Extract radius information from the 3D bottle scan, plus an estimate of variance. • Simulate using MCMC to achieve a goal: balls are deflected by bottles to land in the right place. • Render on autostereoscopic display.
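The MCMC step can be sketched as a Metropolis walk over initial conditions, scored by how close the simulated outcome lands to the author's goal. The one-line physics stand-in and the Gaussian score are hypothetical simplifications of the actual multi-body simulation.

```python
import math
import random

random.seed(0)  # deterministic demo

def simulate(v0):
    # Stand-in for the physics: initial speed -> landing position.
    # The real system runs a full rigid-body simulation whose
    # randomness comes from rough surfaces and initial conditions.
    return 0.5 * v0 * v0

def score(landing, target=2.0):
    # How well the simulated outcome meets the goal (peak = 1.0).
    return math.exp(-(landing - target) ** 2)

def direct_scenario(steps=2000, sigma=0.1):
    # Metropolis walk over the initial condition: propose a small
    # perturbation, accept with probability min(1, s'/s), and keep
    # the best-scoring sample seen as the directed scenario.
    v = best_v = 1.0
    s = best_s = score(simulate(v))
    for _ in range(steps):
        v_new = v + random.gauss(0.0, sigma)
        s_new = score(simulate(v_new))
        if s_new >= s or random.random() < s_new / s:
            v, s = v_new, s_new
            if s > best_s:
                best_v, best_s = v, s
    return best_v

v_star = direct_scenario()   # an initial condition whose outcome
                             # lands near the goal position of 2.0
```

Because the search perturbs only physically plausible sources of randomness, the directed run remains an ordinary simulation; it just happens to be one whose balls land where the author wants.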

  38. Task: Integration of Modeling and Simulation • Incorporate data from multiple sources: • Geodetic capture (MIT); floorplan extrusion, instancing (UCB) • Geometry compilation for responsiveness: • Scalable, persistent proximity/visibility database (UCB, MIT) • Natural, extensible constraint-based interaction • Object associations framework (UCB) • Physically-based kinematics: • Fire simulation (UCB; shown in ‘98) • Impulse-response simulation (UCB)

  39. Several generations of system components: • 1990-93: WalkThrough system (UCB) • Rapid visualization of complex models • 1993-94: Radiosity integration (Princeton) • Diffuse illumination throughout model • 1994-95: Object associations (UCB) • Natural object instancing & placement • 1994-97: FireWalk, Impulse (UCB) • Physically-based fire, kinematic simulations • 1996-99: Façade, Skymaps (UCB) • High-fidelity photo-assisted modeling • 1996-99: City Scanning (MIT) • Acquisition of extended urban models

  40. Dataset Integration: Geo-referencing • Argus data is geodetically registered

  41. Dataset Integration: Exterior structure • Exteriors in UCB FireWalk framework

  42. Integration of UCB object associations • Infrastructure supports editing at any scale

  43. Exterior to interior transition • Seamless transition to Tech Square interior

  44. Transition: building approach • Gravity association keeps us to local ground

  45. Visibility modifications: exterior, interior • Cell-portal visibility applies throughout
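The cell-portal traversal that "applies throughout" can be sketched as a walk over the cell adjacency graph. Room names and the open/closed test are illustrative; the real visibility test clips the view frustum against each portal polygon rather than treating portals as simply open or closed.

```python
def visible_cells(adjacency, start, portal_open):
    """Cell-portal culling: starting from the viewer's cell, cross a
    portal only when it is potentially visible, collecting the set
    of cells whose contents must be rendered. The same traversal
    serves exterior city cells and interior rooms."""
    seen, stack = {start}, [start]
    while stack:
        cell = stack.pop()
        for nbr, portal in adjacency.get(cell, []):
            if portal_open(portal) and nbr not in seen:
                seen.add(nbr)
                stack.append(nbr)
    return seen

# Hypothetical three-room floor: door 'a' is open, door 'b' closed,
# so the office behind 'b' is culled from the lobby's view.
adj = {"lobby": [("hall", "a")], "hall": [("office", "b")]}
cells = visible_cells(adj, "lobby", lambda p: p == "a")
```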

  46. Door passages using object associations • Opening doors to allow passage

  47. Integration of UCB FloorSketch, FireWalk • Tech Square interiors modeled by procedural floorplan extrusion, furniture instancing

  48. Integration of UCB Impulse-Response • Automated generation of RBL objects • Requires specification as union of convex parts • Initial integration: population, visualization

  49. Extension to Impulse: sleeping objects • Added “sleep state” for objects coming to rest
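The sleep-state extension can be sketched as follows. Thresholds and names are assumptions; the idea is simply that bodies coming to rest stop being integrated until a collision or applied force wakes them, which is where the savings come from.

```python
SLEEP_SPEED = 1e-3   # speed below which a body may sleep (assumed)
SLEEP_FRAMES = 10    # consecutive slow frames before sleeping (assumed)

class Body:
    def __init__(self):
        self.speed = 0.0
        self.slow_frames = 0
        self.asleep = False

    def maybe_sleep(self):
        # Count consecutive near-zero-velocity frames; after enough
        # of them, flag the body asleep so the simulator skips it.
        if self.asleep:
            return
        if self.speed < SLEEP_SPEED:
            self.slow_frames += 1
            if self.slow_frames >= SLEEP_FRAMES:
                self.asleep = True
        else:
            self.slow_frames = 0

    def wake(self, impulse_speed):
        # A collision or applied force re-activates the body.
        self.asleep = False
        self.slow_frames = 0
        self.speed = impulse_speed

b = Body()
for _ in range(10):
    b.maybe_sleep()        # ten quiet frames put the body to sleep
slept = b.asleep
b.wake(1.0)                # an impulse wakes it again
```

The frame-count hysteresis prevents bodies that are momentarily slow (e.g., at the apex of a bounce) from sleeping prematurely.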

  50. Extension to Impulse: interaction • Added interactive application of forces
