Unstructured Lumigraph Rendering
Chris Buehler, Michael Bosse, Leonard McMillan (MIT-LCS); Steven J. Gortler (Harvard University); Michael F. Cohen (Microsoft Research)
The Image-Based Rendering Problem • Synthesize novel views from reference images • Static scenes, fixed lighting • Flexible geometry and camera configurations
The ULR Algorithm • Designed to work over a range of image and geometry configurations • Designed to satisfy desirable properties
[Chart: LF, ULR, and VDTM positioned on axes of # of Images vs. Geometric Fidelity; ULR covers the range between LF and VDTM]
“Light Field Rendering,” SIGGRAPH ’96 • Desired color interpolated from “nearest cameras”
[Figure: two-plane (s, u) parameterization; the desired camera’s ray meets the planes at s0 and u0]
Desired Property #1: Epipole consistency
[Figure: “Light Field Rendering,” SIGGRAPH ’96 — desired camera ray in the two-plane (s, u) parameterization]
Potential Artifact • “The Lumigraph,” SIGGRAPH ’96
[Figure: desired camera and “The Scene” — a potential artifact of interpolating on the u plane without scene geometry]
Desired Property #2: Use of geometric proxy • “The Lumigraph,” SIGGRAPH ’96
[Figure: desired camera rays intersected with a geometric proxy of “The Scene”]
Desired Property #3: Unstructured input images • “The Lumigraph,” SIGGRAPH ’96 • Rebinning — note: all images are resampled
[Figure: unstructured input cameras rebinned onto the two-plane parameterization]
Desired Property #4: Real-time implementation • “The Lumigraph,” SIGGRAPH ’96
[Figure: desired camera and “The Scene”]
View-Dependent Texture Mapping, SIGGRAPH ’96, EGRW ’98
[Figure: reference cameras whose views of the point on “The Scene” are occluded or out of view]
Desired Property #5: Continuous reconstruction • View-Dependent Texture Mapping, SIGGRAPH ’96, EGRW ’98
[Figure: desired camera and “The Scene”]
Desired Property #6: Angles measured w.r.t. proxy • View-Dependent Texture Mapping, SIGGRAPH ’96, EGRW ’98
[Figure: angles θ1, θ2, θ3 between the desired ray and the reference rays, measured at the proxy surface]
Desired Property #7: Resolution sensitivity
[Figure: desired camera and “The Scene” — reference cameras at different distances observe the surface at different resolutions]
Previous Work • Light fields and Lumigraphs • Levoy and Hanrahan, Gortler et al., Isaksen et al. • View-dependent Texture Mapping • Debevec et al., Wood et al. • Plenoptic Modeling w/Hand-held Cameras • Heigl et al. • Many others…
Unstructured Lumigraph Rendering • Epipole consistency • Use of geometric proxy • Unstructured input • Real-time implementation • Continuous reconstruction • Angles measured w.r.t. proxy • Resolution sensitivity
Blending Fields
color_desired = Σ_i w_i · color_i
More generally: color_desired = Σ_i w(c_i) · color_i, where the weights w(c_i) vary across the desired image, defining a camera blending field
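To make the blending-field idea concrete, here is a minimal NumPy sketch (not code from the paper; array shapes and names are illustrative): given reference images already reprojected into the desired view and a per-pixel weight map for each camera, the desired image is the per-pixel weighted sum.

```python
import numpy as np

def blend(images, weights):
    """Per-pixel weighted blend of reprojected reference images.

    images:  (N, H, W, 3) reference images, each already reprojected
             into the desired view (e.g., via the geometric proxy).
    weights: (N, H, W) camera blending field, assumed to sum to 1
             over the N cameras at every pixel (partition of unity).
    """
    # color_desired = sum_i w(c_i) * color_i, evaluated at every pixel
    return np.einsum('nhw,nhwc->hwc', weights, images)
```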
Unstructured Lumigraph Rendering • Explicitly construct blending field • Computed using penalties • Sample and interpolate over desired image • Render with hardware • Projective texture mapping and alpha blending
Angle Penalty
[Figure: cameras C1…C6 and C_desired viewing a point on the geometric proxy, with angles θ1…θ6 between each reference ray and the desired ray]
penalty_ang(C_i) = θ_i
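A minimal sketch of this penalty, assuming the 3D intersection point p of the desired ray with the proxy is already known (function and argument names are illustrative):

```python
import numpy as np

def penalty_ang(c_i, c_desired, p):
    """Angle (radians) at proxy point p between the ray to reference
    camera center c_i and the ray to the desired camera center."""
    u = np.asarray(c_i) - p
    v = np.asarray(c_desired) - p
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))  # penalty_ang = theta_i
```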
Resolution Penalty
[Figure: C_desired closer to the geometric proxy than C_i; penalty_res grows with the extra distance]
penalty_res(C_i) = max(0, dist(C_i) − dist(C_desired))
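The matching sketch, taking dist(·) as the Euclidean distance from a camera center to the proxy point (an assumption consistent with the figure):

```python
import numpy as np

def penalty_res(c_i, c_desired, p):
    """Penalize reference cameras farther from the proxy point than the
    desired camera: their images undersample the surface for this view."""
    d_i = np.linalg.norm(np.asarray(c_i) - p)
    d_des = np.linalg.norm(np.asarray(c_desired) - p)
    return max(0.0, d_i - d_des)  # max(0, dist(C_i) - dist(C_desired))
```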
Field-of-View Penalty
[Plot: penalty_FOV as a function of angle, rising as the ray approaches the edge of the camera’s field of view]
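The slide gives only a plot, so the exact functional form is not reproduced here; the sketch below is one plausible ramp (entirely illustrative) that is zero well inside the field of view and grows steeply near the boundary:

```python
def penalty_fov(angle, half_fov, margin=0.05, big=1e6):
    """Illustrative field-of-view penalty (not the paper's exact form).

    angle:    angle (radians) between the desired ray and the reference
              camera's optical axis.
    half_fov: half of the camera's field of view (radians).
    margin:   width of the transition band near the boundary (assumed).
    """
    if angle <= half_fov - margin:
        return 0.0                      # comfortably inside the FOV
    if angle >= half_fov:
        return big                      # ray not seen by this camera
    return big * (angle - (half_fov - margin)) / margin  # ramp up
```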
Total Penalty
penalty(C_i) = α·penalty_ang(i) + β·penalty_res(i) + γ·penalty_FOV(i)
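Combining the three terms, reusing the penalty sketches above (α, β, γ are user-chosen weights; the defaults here are placeholders):

```python
def total_penalty(c_i, c_desired, p, angle, half_fov,
                  alpha=1.0, beta=1.0, gamma=1.0):
    """penalty(C_i) = alpha*ang + beta*res + gamma*fov, per the slide."""
    return (alpha * penalty_ang(c_i, c_desired, p)
            + beta * penalty_res(c_i, c_desired, p)
            + gamma * penalty_fov(angle, half_fov))
```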
K-Nearest Continuous Blending • Only use the cameras with the K smallest penalties • C0 continuity: a camera’s contribution drops to zero as it leaves the K-nearest set • w̃(C_i) = 1 − penalty(C_i)/penalty(C_{K+1st closest}) • Partition of unity: normalize • w(C_i) = w̃(C_i)/Σ_j w̃(C_j)
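A sketch of the weight computation at a single sample location, given all cameras’ penalties (K-nearest selection, continuous falloff, then normalization):

```python
import numpy as np

def blending_weights(penalties, k):
    """K-nearest continuous blending weights from per-camera penalties.

    Cameras outside the K smallest penalties get weight 0; weights fall
    continuously to 0 at the (K+1)-st smallest penalty, and are then
    normalized to sum to 1 (partition of unity)."""
    penalties = np.asarray(penalties, dtype=float)
    order = np.argsort(penalties)
    nearest = order[:k]
    thresh = penalties[order[k]]        # (K+1)-st smallest penalty
    w = np.zeros_like(penalties)
    if thresh > 0:
        w[nearest] = 1.0 - penalties[nearest] / thresh
    else:
        w[nearest] = 1.0                # all K nearest have zero penalty
    total = w.sum()
    return w / total if total > 0 else w
```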
Sampling Blending Fields
[Comparison: blending field sampled at epipoles plus a regular grid vs. at epipoles only]
Hardware-Assisted Algorithm
Sample blending field:
  select blending-field sample locations
  for each sample location j do
    for each camera i do
      compute penalty(i) at sample location j
    end for
    find the K smallest penalties
    compute blending weights at sample location j
  end for
  triangulate the sample locations
Render with graphics hardware:
  clear the frame buffer
  for each camera i do
    set the current texture and projection matrix
    copy blending weights into the vertices’ alpha channel
    draw the triangles with non-zero alphas
  end for
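A CPU-side sketch of the sampling pass (the rendering pass is hardware rasterization with projective texturing and alpha blending). SciPy’s Delaunay stands in for the triangulation step, blending_weights and total_penalty refer to the sketches above, and everything else is illustrative:

```python
import numpy as np
from scipy.spatial import Delaunay

def sample_blending_field(sample_pts, cameras, penalty_fn, k):
    """CPU pass: evaluate K-nearest blending weights at each sample
    location in the desired image, then triangulate the samples.

    sample_pts: (M, 2) image-plane sample locations (e.g., projected
                camera centers plus a regular grid, as in the slides).
    cameras:    per-camera data consumed by penalty_fn.
    penalty_fn: penalty_fn(camera, pt) -> float (e.g., total_penalty
                with the camera-specific arguments filled in).
    """
    weights = np.zeros((len(sample_pts), len(cameras)))
    for j, pt in enumerate(sample_pts):
        penalties = [penalty_fn(cam, pt) for cam in cameras]
        weights[j] = blending_weights(penalties, k)
    tri = Delaunay(sample_pts)
    # weights go into the vertices' alpha channels; tri.simplices are
    # the triangles drawn once per camera with alpha blending enabled.
    return weights, tri.simplices
```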
Blending over one triangle
[Comparison: epipole-and-grid sampling vs. epipole-only sampling]
Future Work • Optimal sampling of the camera blending field • More complete treatment of resolution effects in IBR • View-dependent geometry proxies • Investigation of geometry vs. images tradeoff
Conclusions • Unstructured Lumigraph Rendering • unifies view-dependent texture mapping and lumigraph rendering methods • allows real-time rendering from unorganized collections of images • achieves this via a sampled camera blending field
Acknowledgements • Thanks to the members of the MIT Computer Graphics Group and the Microsoft Research Graphics and Computer Vision Groups • DARPA ITO Grant F30602-971-0283 • NSF CAREER Awards 9875859 & 9703399 • Microsoft Research Graduate Fellowship Program • Donations from Intel Corporation, Nvidia, and Microsoft Corporation