
Advances in Real-Time Rendering in 3D Graphics and Games New Orleans, LA (August 2009)

Presentation Transcript


  1. Regressions in RT Rendering: LittleBigPlanet Post Mortem
  Alex Evans & Anton Kirczenow, Media Molecule
  Advances in Real-Time Rendering in 3D Graphics and Games, New Orleans, LA (August 2009)

  2. Constraints & Mash-ups

  3. LBP’s Constraints (back in 2006)
  • Blank slate
    • no code, no tech, no existing franchise
  • New, unknown & strange platform
    • but a single platform!
  • User generated content was to be key
    • no pre-computation
    • the ‘worst’ case is the typical case: untrained ‘artists’, aka users
  • Distinctive art style based on a ‘miniature world’
    • style binding content: allow LBP’s style to shine through any UGC
  • My technical prejudices
  • 1.5 programmers & 36 months

  4. Prejudices to force decisions
  • We had little time to mess about, so the overriding goal was always: whatever is easiest to code & plays to our strengths
  • no middleware
    • (home-made turtles all the way down, please)
  • one code path for all rendering of all objects
    • e.g. the same path for solid, transparent, FX...
  • all lighting per-pixel
    • no per-vertex or (shudder) per-object ‘optimizations’
    • playing to the strength of the team: I hate vertices, and I don’t have time to write those secondary paths anyway
  • minimum number of ‘knobs’ for the artists (but not too few)
    • relatively non-technical art team, busy with other tasks

  5. Prejudices into Features
  • lots of dynamic geometry & cloth
    • (we can’t pre-compute anything anyway...)
  • motion blur, DOF, flexible color correction
    • used in unusual contexts
  • unbounded numbers of dynamic lights
  • emphasis on many layers of texture to achieve surface look
  • refractive glass
  • correct DOF / motion blur behind transparent objects
  • ability to arbitrarily cover the entire world in ‘stickers’ with no loss in frame rate
  • fluid & cloth to drive all FX
    • we almost avoided writing a particle system. almost.

  6. Unusual vertex pipe
  • Avoid vertex shader slowness on RSX by using the SPUs instead
  • Add some sexy features to take advantage of the SPUs:
    • Cloth
    • Soft bodies
    • Patch tessellation
    • Point deformers

  7. Anton shows all

  8. Vertex pipeline (flow diagram)
  • stages include: input verts, input bones, morph targets, rigid skinning, find cluster matrix, Verlet integrate, iterate spring/collision constraints, collision with world, stitch seams, compute N & T

  9. Vertex motion
  • the SPU’s job is to create a world-space ‘polygon soup’ per frame
  • P & N are triple-buffered, allowing simple Verlet physics
    • Pnew = 2P - Pold - drag*(P - Pold) + A
    • drag is set by the collision routine
  • meshes are automatically turned into springs across grids of quads, with artist weighting in vertex colors
    • http://www.gamasutra.com/features/20000327/lander_02.htm
  • vertices are lerp’ed back to the skinned pose, using artist-defined weights
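The Verlet step above can be sketched in a few lines. This is a minimal illustration, not the game's code; the names P, Pold, drag and A follow the slide, and A is assumed to be the acceleration already scaled by the timestep squared:

```python
# One damped Verlet step per particle coordinate, as on the slide:
# Pnew = 2P - Pold - drag*(P - Pold) + A
def verlet_step(p, p_old, accel, drag):
    """p/p_old: current and previous positions; accel: A (pre-scaled
    by dt^2); drag: damping factor set by the collision routine."""
    return 2.0 * p - p_old - drag * (p - p_old) + accel
```

With drag = 0 and no acceleration the particle simply continues at constant velocity, which is the classic Verlet behavior.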

  10. Collision Detection
  • the SPU does collision against:
    • extruded 2D convex shapes (the world)
    • swept ellipsoids (the character)
  • each vertex is transformed into the old & new ellipsoid positions
  • raycast: sphere vs ellipse, done as a raycast in a stationary ‘ellipse space’
  • (figure: Pold and Pnew transformed into ellipse space, then raycast against the unit sphere)
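The 'ellipse space' trick can be sketched as follows: scale space so the ellipsoid becomes a unit sphere, then do an ordinary segment-vs-sphere raycast between the vertex's old and new positions. An axis-aligned ellipsoid and all function names are illustrative assumptions:

```python
import math

def ray_hits_unit_sphere(p0, p1):
    """Return t in [0,1] where segment p0->p1 first hits the unit sphere, or None."""
    d = [b - a for a, b in zip(p0, p1)]
    a = sum(x * x for x in d)
    b = 2.0 * sum(p * x for p, x in zip(p0, d))
    c = sum(p * p for p in p0) - 1.0
    disc = b * b - 4.0 * a * c
    if a == 0.0 or disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if 0.0 <= t <= 1.0 else None

def sweep_vs_ellipsoid(p_old, p_new, center, radii):
    # transform both endpoints into 'ellipse space' (where the ellipsoid is a unit sphere)
    e0 = [(p - c) / r for p, c, r in zip(p_old, center, radii)]
    e1 = [(p - c) / r for p, c, r in zip(p_new, center, radii)]
    return ray_hits_unit_sphere(e0, e1)
```

The payoff is that the per-vertex test stays a cheap sphere raycast no matter how the character's ellipsoid is stretched.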

  11. Clusters
  • a lovely mesh-less algorithm: least-squares fit a new rigid matrix to a deformed cloud of vertices
    • remove the center of mass: that gives the translational component
    • sum the dyadic products of (P - COM) x (Prest - COMrest)
    • multiply by the inverse of a pre-computed matrix from the rest pose
    • orthogonalize the resulting best-fit matrix to give an affine transform; optionally preserve volume by normalizing
    • blend with the original rigid body matrices to make things as squishy/rigid as you want
  • more detail: Meshless Deformations Based on Shape Matching, Müller et al., SIGGRAPH 2005
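As a toy illustration of the same least-squares fit, here is a 2D analogue using complex numbers (a hypothetical simplification, not the 3D version the slide describes, which needs the 3x3 dyadic sum plus orthogonalization): in 2D the best-fit rotation of a COM-removed point cloud falls out of a single complex sum.

```python
import cmath

def best_fit_rotation_2d(rest, deformed):
    """Least-squares rigid fit of 'deformed' onto 'rest' (points as complex).
    Returns (rotation angle, rest COM, deformed COM)."""
    n = len(rest)
    com_q = sum(rest) / n        # rest-pose center of mass
    com_p = sum(deformed) / n    # deformed center of mass
    # A = sum p_i * conj(q_i); the optimal rotation is A / |A|,
    # i.e. its phase is the best-fit angle
    a = sum((p - com_p) * (q - com_q).conjugate()
            for p, q in zip(deformed, rest))
    return cmath.phase(a), com_q, com_p
```

Blending the fitted rigid transform back with the fully deformed positions is what gives the squishy/rigid dial mentioned above.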

  12. Flexibility of the vertex pipeline
  • we ended up using that pipeline not just for meshes, decorations and cloth
  • but also for SFX like confetti, dust, embers, ...
    • allowing us to almost entirely avoid an explicit ‘particle system’
  • with one exception: jet-pack smoke was a simple point-sprite-based backend on the integrated/collided vertices
    • rendered at 1/4 screen resolution for fill-rate reasons
    • each point sprite has a 2D gradient across it, which gives a remarkable sense of ‘3D-ness’
    • every pixel samples the sun’s shadowmap to give volumetric shadows in the smoke

  13. On to pixels: graph based?

  14. Fixed everything ain’t so bad?
  • the graph editor was never used by artists
    • they were happy compositing LOTS of texture layers at different scales
    • and left me to hard-code a single ‘uber-shader’ that handled everything else
  • that BRDF is entirely ‘cooked up’, based on live shader editing and my eyeballs. approximately:
    • 2 strong rim lights, with color set by an artist-controlled hemisphere, based on (V.N)^2 with bias & scale
    • the 2 rim lights are lerp’ed between based on the sunlight shadow
    • diffuse: a mixture of N.L and (N.L)^2
    • ambient: a simple 2-color hemisphere
    • specular: simply (R.L)^22
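A hedged sketch of that cooked-up BRDF, to make the shape concrete. Only the general form comes from the slide (rim ~ (V.N)^2 with bias and scale, diffuse as a mix of N.L and (N.L)^2, specular as a 22nd power of the reflection dot product); every constant and parameter name here is an illustrative guess, and only one of the two rim lights is shown:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def saturate(x):
    return max(0.0, min(1.0, x))

def lbp_brdf(n, l, v, shadow, diffuse_mix=0.5, rim_bias=0.0, rim_scale=1.0):
    """n, l, v: unit normal, light and view vectors; shadow: sun shadow term."""
    nl = saturate(dot(n, l))
    # reflect L about N: R = 2(N.L)N - L
    r = [2.0 * dot(n, l) * nn - ll for nn, ll in zip(n, l)]
    diffuse = (1.0 - diffuse_mix) * nl + diffuse_mix * nl * nl
    rim = saturate(dot(v, n)) ** 2 * rim_scale + rim_bias
    rim *= shadow  # the slide lerps between two rim lights by the shadow term
    specular = saturate(dot(r, v)) ** 22
    return diffuse, rim, specular
```

The point of the high exponent and the eyeballed mix is exactly what the slide says: a look tuned live in the editor, not a physically derived model.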

  15. Shadowing
  • one sun light & arbitrary numbers of ‘sprite lights’
  • the world has shallow depth, so we got away with very simple shadowmapping
    • initially TSM, e.g. http://www.comp.nus.edu.sg/~tants/tsm/tsm.pdf
    • finally, just a single orthogonal 768x768 shadowmap!
    • with a very tightly tuned frustum of shadow casters
  • soft shadows via variance shadow maps + AO contact shadows
    • just the vanilla form of VSM, after Donnelly et al.
    • lots of light leaking
    • used FP16 render targets with two copies of X, X^2 in R/G and B/A
    • the second copy scaled by 32, recombined in the shader -> less boiling

  16. Ambient occlusion for contacts
  • the character’s feet were small & felt floaty...
  • so we added an AO ‘light’ that analytically computed the occlusion of the character, modeled as 5 spheres (2 feet, 2 legs, 1 groin)
  • see Inigo Quilez’ awesome site for the derivation:
    • http://iquilezles.org/www/articles/sphereao/sphereao.htm
  • AO = (N.D) * (r^2 / D.D)
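The sphere AO term above is cheap enough to write out directly. A minimal sketch, assuming N is the receiver's unit normal, D the offset from the receiver point to the sphere center (N.D taken with D normalized), and r the sphere radius:

```python
def sphere_ao(n, d, radius):
    """Occlusion of a point with normal n by a sphere of radius 'radius'
    whose center is offset d from the point: (N . normalize(D)) * r^2/|D|^2."""
    dd = sum(x * x for x in d)                        # |D|^2
    inv_len = dd ** -0.5
    nd = sum(a * b * inv_len for a, b in zip(n, d))   # N . normalize(D)
    return max(0.0, nd) * radius * radius / dd
```

Summing this over the five character spheres gives the analytic contact shadow the slide describes.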

  17. Spritelights take 1
  • as presented at SIGGRAPH 06, I started by slicing the view frustum and rendering light volumes directly into those slices (as a volume texture)
  • a sort of ‘irradiance volume’, but with no directional component (‘0th order SH’)
  • use the gradient of the irradiance in the direction of the normal to approximate N.L-type lighting
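The gradient trick can be sketched with a finite difference: with only a scalar irradiance field E, an N.L-style term is approximated by how E changes along the surface normal. Here `volume` is any callable standing in for the 3D texture lookup, and the epsilon is an illustrative choice:

```python
def directional_irradiance(volume, p, n, eps=0.1):
    """Approximate an N.L-style term from a scalar irradiance volume by
    finite-differencing it along the surface normal n at point p."""
    e0 = volume(*p)
    e1 = volume(*(pi + eps * ni for pi, ni in zip(p, n)))
    return max(0.0, (e1 - e0) / eps)  # brighter where E grows along n
```

Surfaces facing toward a bright region of the volume see E increase along their normal and get lit; surfaces facing away get zero, which is roughly the clamped N.L behavior the slide wants.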

  18. Sprite Lights Take 1

  19. Sprite Lights Take 1

  20. Sprite lights take 2: Defer?
  • lay down a Z/N pre-pass at 2X MSAA
  • ...then light as if it were grey plastic (specular -> alpha)
  • ...and do sprite lights in the traditional ‘deferred’ way
  • BUT! no G-buffer, no material properties, no MRTs: just Z, N in

  21. Sprite lights take 2: Defer?
  • then re-render the scene ‘forwards’, sampling ‘L’ into the BRDF

  22. Sprite lights take 2: god rays
  • now we’re doing sprite lights properly, we might as well add some features: god rays!

  23. God rays implementation
  • a completely 2D screenspace effect
  • each light volume is rendered to an off-screen surface, one channel per light (in batches of 4)
  • a pixel shader integrates light scattering along the eye ray to the first solid object (z-buffer distance)
  • a 3-pass ‘smear shader’ then smears dark regions with MIN blending
    • each pass doubles the distance of the smear
  • similar to masa’s ‘rthdribl’ lens-flare streak shaders
    • http://www.daionet.gr.jp/~masa/rthdribl/

  24. God rays implementation (cont.)
  • the smear loop, per pixel:
    • C = sample(u,v)
    • for (i=0; i<5; i++)
    •     u = (u-cu)*k + cu;  v = (v-cv)*k + cv;
    •     C = min(C, sample(u,v) + i/5.f);
  • (where k<1 sets the smear radius, and cu,cv is the screenspace position of the light source)
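The loop above translates directly into a runnable sketch, shown here in 1D for clarity: five taps stepping toward the light source at cu, MIN-blended, with an i/5 ramp that fades the smear out with distance. `sample` is just a Python function standing in for a texture fetch:

```python
def smear(sample, u, cu, k=0.5):
    """One pass of the MIN-blend smear from the slide, along one axis.
    sample: fetch function; u: current coordinate; cu: light position;
    k < 1 sets the smear radius."""
    c = sample(u)
    for i in range(5):
        u = (u - cu) * k + cu              # step toward the light source
        c = min(c, sample(u) + i / 5.0)    # MIN blend, fading with distance
    return c
```

Because each application halves the remaining distance to the light, stacking three such passes (with the per-pass doubling the slide mentions) smears darkness a long way for very few taps.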

  25. 2 layer transparency
  • deferred shading with alpha. what?

  26. 2 layer transparency
  • exploiting the fact that we have 2X MSAA hardware
    • we can ‘cast’ the ‘L’ buffer to be 2560x720 non-MSAA
    • deferred lighting is done at full 2560x720 resolution
    • this allows us two independent Z values per final pixel
  • on PS3 you have to resolve your 2560x720 into 1280x720 by hand. turn this into a virtue:
    • trade transparency for MSAA in different regions of the screen
  • the custom resolve shader is quite tricky: it uses the alpha channel to control the blend between the two samples, compositing either as ‘A over B’ or ‘(A+B)/2’
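The per-pixel choice in that custom resolve can be sketched as follows. This is an illustrative reconstruction, not the shipped shader: a control value t (which the slide derives from the alpha channel) blends between treating the two MSAA samples as transparency layers ('A over B') and as plain MSAA ('(A+B)/2'):

```python
def resolve_pixel(color_a, alpha_a, color_b, t):
    """t = 0 -> MSAA average of the two samples; t = 1 -> composite A over B."""
    over = alpha_a * color_a + (1.0 - alpha_a) * color_b  # 'A over B'
    msaa = 0.5 * (color_a + color_b)                      # '(A+B)/2'
    return (1.0 - t) * msaa + t * over
```

This is what lets the same 2560-wide buffer act as antialiasing on opaque pixels and as a second transparency layer everywhere glass or water is on screen.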

  27. 2 layer transparency & DOF & ...

  28. 2 is not a big number
  • this leads to odd effects when you get more than 2 layers:
  • users are actively exploiting this to great effect!

  29. 2 layers: Refraction
  • the glass shader only WRITES to even pixels in X (the ‘alpha layer’)
  • it only READS from odd pixels in X (the ‘solid layer’), sampling from its own screenspace position P, perturbed by the glass surface normal
    • k is the ‘index of refraction’
    • P.xy += k*N.xy
    • P.x -= frac(P.x)+0.5; <--- MAGIC!
    • Cout = tex2D(screen, P)
  • the code above relies on UVs measured in pixels, not 0-1

  30. Careful MB/DOF calculation
  • motion blur & DOF were vital to the miniature look
    • we were willing to spend lots of GPU cycles on this
    • however, it still runs at half res:
  • PASS 0: downsample 2x in Y, 4x in X
    • blend the 2 layers correctly according to the degree to which each sample needs blurring. alpha channel = coarse z-buffer

  31. Careful MB/DOF calculation
  • PASSES 1 & 2: 7-tap ‘classic’ DOF with a variable circle of confusion, checking the alpha channel for z conflicts
    • samples are distorted along the direction of screenspace motion
    • motion blur comes ‘for free’, using the same sample points as DOF (stationary pixel vs fast-moving pixel)
  • final MSAA resolve:
    • reads the 1/2-res blurred image and composites it over the full-res screen
    • however, it does this BEFORE blending the two alpha layers, and mixes the blurred image into each layer only according to that layer’s need
    • this conditional blend happens at full res, so there’s no need for a bilateral filter et al.

  32. First stab at motion blur
  • we have a screenspace motion vector (x,y) for each pixel
  • however, the boundaries of objects don’t look nice:
  • initially, we tried to render geometry extruded along the direction of motion; however, fast-rotating objects broke this:

  33. Improving motion blur
  • how to improve the ‘leading’ and ‘trailing’ edges?
  • insight: use 2 frames of info: this frame (1) and the old frame (0)
    • blur C1 with V0, and C0 with V1 (twice as many taps)
    • use Z-in-alpha to help composite the samples together
  • (diagram, frame 0 vs frame 1: some regions get the frame-0 background blurred by frame 1’s velocity, while other regions get the frame-1 background with frame-0 velocity)

  34. Improving motion blur: results

  35. Fluids: how?
  • Stam stable fluids, a 2D water-grid PDE, or SPH?
  • Anton will take them all, thanks very much!
    • Stam-style stable fluids can be seen all over the place for smoke, fire & dissolve FX
    • a water grid is used for ‘sea level’
    • rigid body coupling to the grid generates...
    • ...splash particles, combining SPH water-volume-preserving forces with the existing vertex pipeline (Verlet, collisions etc.)

  36. Water grid
  • shallow water simulation: 256x128 on the SPU
  • the fixed grid only contains the visible play area, and is post-perspective distorted to give even tessellation
    • the method is as in Claes Johanson’s MSc thesis, “Real-Time Water Rendering: Introducing the Projected Grid Concept”, but tweaked to keep the grid evenly spaced in X when the camera is very near the surface and looking forward
  • V advects quantities with bicubic filtering
    • linear isn’t good enough: it leads to a ‘viscous’ look
    • the method is similar to Chapter 11 of ‘Fluid Simulation for Computer Graphics’ by Robert Bridson, 2008
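The advection step being described is the standard semi-Lagrangian scheme (Stam/Bridson): each grid cell traces backwards along the velocity field and samples the previous field at the departure point. A 1D sketch with linear interpolation for brevity; note the slide's point is precisely that the game needed bicubic interpolation here to avoid a viscous look, so the filter below is the simplification:

```python
def advect(field, velocity, dt):
    """Semi-Lagrangian advection of 'field' by 'velocity' over one step dt,
    on a 1D grid, with linear interpolation at the departure point."""
    n = len(field)
    out = [0.0] * n
    for i in range(n):
        # trace backwards along the velocity to the departure point
        x = i - velocity[i] * dt
        x = max(0.0, min(n - 1.000001, x))   # clamp inside the grid
        i0 = int(x)
        f = x - i0
        out[i] = (1.0 - f) * field[i0] + f * field[i0 + 1]
    return out
```

Because the scheme only ever interpolates existing values, it is unconditionally stable, which is why it suits a fixed SPU budget; the price is the numerical smoothing that the bicubic filter is there to fight.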

  37. Water rendering
  • the water surface normals are generated by bicubic filtering a texture of the height field; linear isn’t good enough:
  • refraction is just a cheesy resampling of the screen buffer, with the UVs perturbed by the surface normal
    • with an additional z check to ensure the refracted sample is actually below the surface:

  38. Water rendering

  39. Water grid / rigid body coupling
  • for rigid body coupling:
    • find rigid bodies intersecting the water surface, take their AABBs and compute the change in “submerged-ness” in the water
    • this is diffused and added to the water depth to create wakes and ripples
  • the method is as per [Thürey et al., “Real-Time Breaking Waves for Shallow Water Simulations”, 2007]
    • but with Stam-style stable diffusion, so it doesn’t blow up so much
  • we also add some of the velocities of the immersed objects to the water grid

  40. Grid -> SPH: Splashes!
  • we generate splashes wherever the water height changes rapidly
  • emit particles where both of these hold:
    • change in water height > threshold
    • gradient(water height) dot water velocity > 0
  • splash particles (and bubbles) also add depth to the water grid when they collide with it, causing secondary ripples
  • splash particles live in an SPH simulation
    • as per Müller et al., “Particle-based Fluid Simulation for Interactive Applications”, Eurographics 2003
    • with a novel collision detection scheme that is more SPU-friendly
    • more details in the course notes!
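The two emission conditions above make a tiny predicate. The threshold value and the tuple representation are illustrative; the conditions themselves come straight from the slide:

```python
def should_emit_splash(dh_dt, grad_h, velocity, threshold=0.1):
    """Emit a splash particle at a cell where the water height changes
    rapidly AND the flow is moving up the height gradient."""
    moving_up_slope = sum(g * v for g, v in zip(grad_h, velocity)) > 0.0
    return abs(dh_dt) > threshold and moving_up_slope
```

The second condition is what restricts emission to the advancing face of a wave, where water piles up, rather than the trailing side.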

  41. Splat-based splash rendering
  • 2-pass rendering:
  • 1st pass sorts coarse quads front-to-back, with UNDER compositing of N and alpha
    • this ensures isolated particles look sharp, while still blending adjacent particles smoothly
    • each particle is elongated along its direction of motion
    • particles near the water surface stretch to simulate joining
  • 2nd pass outputs color
    • uses the stencil buffer to avoid overdraw
    • uses a similar shader to the water surface, for consistency

  42. Underwater & caustics!
  • (figure: incoming light on a regular grid; a refracted quad with a small area compared to the grid = bright, a large area compared to the grid = dark)
  • inspired by Pixar’s ‘Finding Nemo’ DVD extras & Advanced Animation and Rendering Techniques by Alan Watt & Mark Watt, 1992, Ch 10.1
  • intersect a refracted grid of sun rays with a fixed ground plane beneath the water surface

  43. Underwater & caustics!
  • inspired by Pixar’s ‘Finding Nemo’ DVD extras & Advanced Animation and Rendering Techniques by Alan Watt & Mark Watt, 1992, Ch 10.1
  • simply intersect a refracted grid of sun rays with a fixed ground plane beneath the water surface
  • on the SPU, for each vertex, compute the change in area of the surrounding refracted quads vs the un-refracted quads
  • output as a Gouraud-shaded mesh, rendered by the GPU
  • blend as MAX rather than +, to keep the dynamic range down
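A 2D cross-section sketch of the idea: refract a regular fan of vertical sun rays at the water surface, intersect them with a flat ground plane, and use the ratio of un-refracted to refracted spacing as brightness (rays bunching up = bright, as in the slide's figure). The Snell refraction, depth and index of refraction here are illustrative assumptions:

```python
import math

def refracted_hit(x, slope, depth, eta=1.0 / 1.33):
    """Ground-plane x-coordinate hit by a vertical ray refracting at a
    surface point (x, slope) and travelling 'depth' downward."""
    theta_i = math.atan(slope)                 # incidence angle vs surface normal
    theta_t = math.asin(eta * math.sin(theta_i))
    return x + depth * math.tan(theta_i - theta_t)

def caustic_intensity(x0, x1, slope0, slope1, depth):
    """Brightness from the area (here: length) ratio of a surface segment
    [x0, x1] vs its refracted footprint on the ground plane."""
    hit0 = refracted_hit(x0, slope0, depth)
    hit1 = refracted_hit(x1, slope1, depth)
    rest_area = abs(x1 - x0)
    refr_area = max(abs(hit1 - hit0), 1e-6)
    return rest_area / refr_area               # > 1 where rays converge
```

A flat surface gives intensity 1 everywhere; where wave slopes focus the rays, the refracted footprint shrinks and the intensity spikes, which is the caustic.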

  44. Refraction
  • a 1024x512 caustic map renders in only 0.6ms!

  45. Underwater
  • when rendering the main scene, project the caustic texture through the world

  46. Stable fluids
  • 3 layers of 256x128 2D fluid, used for smoke & dissolve FX
    • borrowing tricks from Mark Harris’ GPU implementation (GPU Gems)
  • the GPU re-samples the screenspace z & v buffers to initialize boundary conditions
  • the fluid is only run on 3 world-aligned slices, which scroll with the camera
  • the GPU copies from the backbuffer into the fluid buffer to make objects ‘dissolve’
  • the SPU outputs a list of point sprites on a grid, to avoid wasting fill rate where there is no fluid

  47. Breathe....
  • ...and be thankful that we didn’t go through all the other fun stuff, like virtual texturing for stickering, or procedural mesh generation on the SPU...

  48. Remind me what that was about?
  • the key here is that many disparate techniques can be blended to create something unique-looking
  • it’s valid to treat dev time as a primary constraint
    • for LBP, we were able to trade accuracy & generality for ‘good enough’ approximations...
    • the devil (or rather, the joy) is in the details!
  • the SPU freed us from worrying about casting every problem as a VS/PS stream-processing ‘DX9’ combination
    • which ironically unlocked a lot of fun pre-GPU literature
  • avoiding special-case code paths kept the complexity down and the code coverage good...
    • ...making it cheap to revisit major decisions, like ‘forward’ vs ‘deferred’, and re-implement features several times

  49. Conclusion
  • in other words,
  • have fun mixing up and blending as many ideas as you think will help you achieve the visual look you’re after
  • (or something)

  50. Thank you!
