Lighting & Visual Effects CSE 191A: Seminar on Video Game Programming Lecture 8: Lighting & Visual Effects UCSD, Spring, 2003 Instructor: Steve Rotenberg
Normals
• Normals are usually specified per vertex (rather than per polygon)
• Normals are usually set up in an interactive modeling program and read into the real-time renderer as-is
• A 'smooth' vertex normal is usually computed (offline) as the average of the normals of the triangles sharing that vertex
• Consider that a cube has 8 vertices and 6 distinct face normals, but requires 24 unique vertex/normal pairs
• Each vertex/normal pair should be lit independently
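As an offline sketch, the averaging step might look like the following (assuming a simple indexed-triangle mesh; the `Vec3` type and function names are illustrative, not from the lecture):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal vector type for illustration.
struct Vec3 { float x, y, z; };

Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Offline pass: accumulate each triangle's face normal into its three
// vertices, then normalize the sums to get smooth vertex normals.
std::vector<Vec3> SmoothNormals(const std::vector<Vec3>& verts,
                                const std::vector<int>& tris) {
    std::vector<Vec3> normals(verts.size(), {0, 0, 0});
    for (std::size_t i = 0; i + 2 < tris.size(); i += 3) {
        Vec3 a = verts[tris[i]], b = verts[tris[i + 1]], c = verts[tris[i + 2]];
        Vec3 n = cross(b - a, c - a);   // area-weighted face normal
        normals[tris[i]]     = normals[tris[i]]     + n;
        normals[tris[i + 1]] = normals[tris[i + 1]] + n;
        normals[tris[i + 2]] = normals[tris[i + 2]] + n;
    }
    for (Vec3& n : normals) n = normalize(n);
    return normals;
}
```

Note that this averaging deliberately smooths across shared vertices; for hard edges like the cube example, vertices must be duplicated so each face gets its own vertex/normal pairs.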
Light-Surface Interaction
• L: Light
• N: Normal
• R: Reflection
• H: Halfway
• V: Viewer
• T: Transmission
Diffuse Lighting
• An ideal diffuse surface reflects light uniformly in all directions
• Colors are RGB triples: C = [r, g, b]
• C_final = C_light * C_diffuse * (N·L)
Light Types
• Ambient: uniform light from all directions
• Directional: light from a single direction (usually approximates a distant light source such as the sun)
• Point: light emitted from a point source (like a light bulb). Point lights should obey the inverse square law: I = I0 / distance²
• Spot: light emitted from a point source, but restricted to a cone aimed in a particular direction
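The inverse-square falloff and the spot-light cone test can be sketched as follows (function and parameter names are illustrative, not from the lecture):

```cpp
#include <cassert>
#include <cmath>

// Inverse-square point-light attenuation: I = I0 / distance^2.
float PointLightIntensity(float i0, float distance) {
    return i0 / (distance * distance);
}

// A spot light adds a cone test: the surface point is lit only when the
// angle between the spot axis and the direction to the point is within
// the cone half-angle. cosToPoint is the dot product of the normalized
// spot axis with the normalized direction from the light to the point.
bool InsideSpotCone(float cosToPoint, float coneHalfAngleRadians) {
    return cosToPoint >= std::cos(coneHalfAngleRadians);
}
```

In practice, games often clamp or fake the inverse-square curve (e.g. linear falloff with a cutoff radius) because true inverse-square attenuation never reaches zero.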
Specular Lighting
• An ideal specular surface reflects incident light rays in one direction (a perfect mirror)
• A less ideal specular surface may scatter rays in a cone around the mirror direction
• Specular surfaces can be approximated with the (old-fashioned) Blinn or Phong models:
  H = normalize(L + V)  (halfway vector)
  C_final = C_light * C_specular * (N·H)^shine
• Computing specular lighting at the vertices doesn't always look very good, because the lighting can vary a large amount over a small distance
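The diffuse and Blinn specular terms can be sketched as below (a minimal illustration, assuming L, N, and V are unit vectors pointing away from the surface; the clamps to zero are a standard practical detail not spelled out on the slides):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Diffuse term: N·L, clamped so back-facing light contributes nothing.
float DiffuseTerm(Vec3 n, Vec3 l) {
    return std::max(dot(n, l), 0.0f);
}

// Blinn specular term: (N·H)^shine with H = normalize(L + V).
float BlinnSpecularTerm(Vec3 n, Vec3 l, Vec3 v, float shine) {
    Vec3 h = normalize({l.x + v.x, l.y + v.y, l.z + v.z});
    return std::pow(std::max(dot(n, h), 0.0f), shine);
}
```

A higher `shine` exponent tightens the highlight, which is exactly why per-vertex evaluation breaks down: the highlight can fall entirely between vertices.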
Environment Mapping
• Environment mapping is a technique that uses a texture map to fake the appearance of a shiny surface
• The 'environment map' itself is a 360-degree view of the world as seen from the object's position
• There are a variety of actual mapping techniques (polar, spherical, cube, dual paraboloid…)
• The view vector is reflected about the normal and then converted to a texture coordinate
• Polar environment map:
  Vector reflection: R = -V + 2*N*(V·N)
  Tx = (atan2f(R.x, R.z) + PI) / (2*PI)
  Ty = (R.y + 1)/2  -or-  (asin(R.y) + PI/2)/PI
• Sphere map:
  N' = N·Mview
  Tx = (N'.x + 1)/2
  Ty = (N'.y + 1)/2
Environment Mapping
• Environment maps can be rendered on the fly or can be precomputed
• Environment maps can be blurred to simulate 'glossy' reflections
BRDFs & Global Illumination
• Bidirectional Reflectance Distribution Function
• BRDF = ρ(θi, φi, θr, φr, λ)
• Real materials reflect light in complex ways
• Real light bounces around in complex ways
• There is some modern research on implementing accurate BRDFs and global illumination in real time, but these techniques are still a little out of reach for mainstream gaming
Practical Real Time Lighting
• Precompute static lighting whenever possible
• Turn point lights into directional lights (if possible)
• Ignore darker lights (not necessarily distant ones)
• Use environment mapping for specular lighting (rather than per-vertex specular)
• Use projected textures for spot lights and other custom projection shapes (rather than per-vertex lighting)
Shadows
• Drop shadows
• Polygonal projection
• Texture projection
• Stencil
Precomputed Lighting
• Ideal diffuse light is view independent, so it can easily be precomputed and stored
• Possible effects include:
  • Diffuse lighting
  • Complex light types (point, spot, area…)
  • Shadows (and soft shadows)
  • Diffuse inter-reflection
• Techniques for precomputing global illumination:
  • Photon mapping
  • Monte Carlo path tracing
  • Radiosity
• Dynamic light can be layered on top of precomputed light
Alpha
• 'Alpha' is a generic name for an extra parameter that can be treated as a fourth color component (i.e., RGBA: red, green, blue, alpha)
• Often, alpha is used to represent opacity in a 0…1 range (opacity = 1 - transparency)
• Alpha can be specified per vertex and can also be specified per texel in a texture map
• The alpha blending function can be controlled through graphics API calls
• Different hardware systems tend to have radically different alpha blending capabilities
Transparency
• Usually, for transparency to work, you must render polygons sorted from distant to near
• When a partially transparent pixel is rendered, the incoming color (source color) is blended with the existing pixel color (destination color):
  C_final = α_src * C_src + (1 - α_src) * C_dest
Additive Blending
• Incoming color is simply added to the existing color in the framebuffer
• Useful for lighting effects such as glows, lens flares, lightning bolts, plasma beams, etc.
  C_final = α_src * C_src + C_dest   or even:   C_final = C_src + C_dest
Color Modulation
• Useful for colored lighting (either precomputed or dynamic projected lights)
  C_final = C_src * C_dest   or sometimes:   C_final = α_src * C_dest
Source & Destination Factors
• One traditional method of specifying alpha blending is the use of source and destination factors:
  C_final = F_src * C_src + F_dest * C_dest
• F_src and F_dest can be: 0, 1, α_src, α_dest, (1 - α_src), (1 - α_dest), C_src, C_dest, or others
• This leads to a large number of possible blending functions, only a small number of which are generally useful
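The factor model subsumes the earlier blend modes. A single-channel sketch (illustrative function names, not a real graphics API):

```cpp
#include <cassert>
#include <cmath>

// Generic factor-based blending for one color channel:
//   C_final = F_src * C_src + F_dest * C_dest
float Blend(float fSrc, float cSrc, float fDst, float cDst) {
    return fSrc * cSrc + fDst * cDst;
}

// Transparency ("over"): F_src = a_src, F_dest = 1 - a_src
float BlendTransparent(float aSrc, float cSrc, float cDst) {
    return Blend(aSrc, cSrc, 1.0f - aSrc, cDst);
}

// Additive: F_src = a_src, F_dest = 1
float BlendAdditive(float aSrc, float cSrc, float cDst) {
    return Blend(aSrc, cSrc, 1.0f, cDst);
}

// Modulation: F_src = 0, F_dest = C_src, giving C_final = C_src * C_dest
float BlendModulate(float cSrc, float cDst) {
    return Blend(0.0f, cSrc, cSrc, cDst);
}
```

This mirrors how fixed-function hardware exposes blending: one equation, with the factors selected from a small enumerated set.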
Multipass Rendering
• In multipass rendering, a polygon is rendered several times (passes) to combine various effects
• Useful for several lighting and material effects
• Example:
  1. Precomputed diffuse light (overwrite)
  2. Projected shadow textured light (add)
  3. Diffuse material texture (multiply)
  4. Specular map (overwrite into alpha)
  5. Environment map (C_final = α_dest * C_src + C_dest)
Multistage Rendering
• Same idea as multipass rendering, except the individual passes and blending are done internally, and the final color is written into the framebuffer only once
• Because the combination isn't necessarily linear, you can potentially do more complex effects
• The number of stages may be limited (4 on the Xbox)
• You can still combine multipass rendering with multistage rendering to get more passes
Vertex & Pixel Shaders
• Shaders are microprograms that can run per-vertex or per-pixel
• Different hardware supports radically different capabilities
• As graphics chips become more complex, pixel and vertex programs become more general purpose
• Stream architecture
Particles
• Useful for tons of visual effects:
  • Fire
  • Smoke
  • Water
  • Dirt
  • Debris, explosions
  • Trash, leaves blowing around
• Usually, particles have only a position and no orientation info
• Particles are usually rendered as sprites/quads with alpha effects
Particles

```cpp
class Particle {
    Vector3 Position;
    Vector3 Velocity;
    Vector3 Force;
    Vector4 Color;
    float Mass;
    float Radius;
    int TexFrame;
};

class ParticleSystem {
    int ActiveParticles;
    int MaxParticles;
    Particle *Particles;
    float CreationRate;
    Particle Mean;
    Particle Variance;
};
```
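For illustration, one plausible per-frame update for such a system is a forward-Euler integration step over the particle array (a sketch with simplified field names, not the lecture's actual code):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Simplified particle: position, velocity, accumulated force, mass.
struct P {
    float px, py, pz;
    float vx, vy, vz;
    float fx, fy, fz;
    float mass;
};

// One simulation step: a = F/m, integrate velocity then position,
// and clear the accumulated forces for the next frame.
void Update(std::vector<P>& particles, float dt) {
    for (P& p : particles) {
        float ax = p.fx / p.mass, ay = p.fy / p.mass, az = p.fz / p.mass;
        p.vx += ax * dt;  p.vy += ay * dt;  p.vz += az * dt;
        p.px += p.vx * dt;  p.py += p.vy * dt;  p.pz += p.vz * dt;
        p.fx = p.fy = p.fz = 0.0f;
    }
}
```

The slide's `Mean`/`Variance` pair suggests new particles are spawned at `CreationRate` with each field randomized around the mean, which keeps a whole effect tunable from a handful of parameters.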
Fog
• Fog (or depth cueing) is an important visual feature that provides a perception of depth
• Different hardware supports different fog features
• Linear, exponential, and exp2 falloff
• Depth based vs. distance based
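The three falloff modes can be sketched as fog-factor functions, following the common convention that the factor f is the fraction of surface color remaining (1 = no fog), with the final color f * C_surface + (1 - f) * C_fog (parameter names are illustrative):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Linear fog: full color at fogStart, fully fogged at fogEnd.
float FogLinear(float dist, float fogStart, float fogEnd) {
    float f = (fogEnd - dist) / (fogEnd - fogStart);
    return std::min(std::max(f, 0.0f), 1.0f);
}

// Exponential fog: f = e^(-density * dist)
float FogExp(float dist, float density) {
    return std::exp(-density * dist);
}

// Exp2 fog: f = e^(-(density * dist)^2), a steeper far-distance falloff.
float FogExp2(float dist, float density) {
    float d = density * dist;
    return std::exp(-d * d);
}
```

The depth-based vs. distance-based distinction is about what `dist` is: the depth value along the view axis (cheap) or the true eye-to-point distance (more correct at screen edges).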
Billboards
• Complex geometric objects can be approximated with simple 'cards' or 'billboards' (trees are a common example)
Texture Movies
• Useful for fire, water surfaces, clouds, misc.
• Cheap and powerful effect, but may require a lot of texture memory
• Streaming texture movies
Lens Effects
• Glows, blooms, stars
• Halos
• Lens flare
• Internal reflections, scattering
Off-Screen Rendering
• Shadow maps
• Environment maps (cube map…)
• Imposters
• Full-screen effects:
  • 2D distortion: ripple, heat wave, shockwave
  • Color: night vision, visual adaptation…
  • 'Predator' effect, etc.
• Supersampling
• Focus (depth of field)
• Motion blur
• Issues:
  • Video memory
  • State changes
  • Pixel fill
Preview of Next Week • Networking with guest speaker Mark Rotenberg, Technical Director for Midnight Club 2