Modern Real-Time Rendering The Z-buffer algorithm and geometric primitives
Vector Graphics • Drawing with lines and curves only – no surfaces • Used today in PostScript, text, and some output devices
Raster Graphics • Creating images for a rectangular array of pixels -- virtually all modern displays • Usually RGB colorspace used: say 8 bits for each of red, green, blue • Hardware-accelerated graphics emphasized populating the raster with sensible values as quickly as possible
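The "8 bits for each of red, green, blue" idea can be sketched in a few lines of Python (an illustrative sketch, not XNA code; the function names are made up for the example):

```python
def pack_rgb(r, g, b):
    """Pack three 8-bit channels (0-255) into a single 24-bit pixel value."""
    return (r << 16) | (g << 8) | b

def unpack_rgb(pixel):
    """Recover the (r, g, b) channels from a packed 24-bit pixel value."""
    return (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF
```

A raster is then just a rectangular array of such packed values, which is what the hardware fills in as quickly as possible.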
The Synthetic World • Rendering: converting a description of a scene into an image of the scene • Typically, scene descriptions are geometry • Explicit geometry: list of points (vertices) about which some information is known
Geometric Primitives • Different ways of assembling vertices • isolated vertices (points) • sequences of points (lines, piecewise linear curves) • triangles • collections of triangles
Assembling Triangles • Triangle fan • Triangle Strip • fewer than 3 vertices per triangle • saves memory, bus usage
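The memory saving from strips and fans comes from vertex sharing: n vertices yield n − 2 triangles instead of needing 3(n − 2) vertices. A minimal Python sketch of how the two assemblies expand into triangles (illustrative only; function names are invented):

```python
def strip_triangles(verts):
    """Triangle strip: each new vertex forms a triangle with the previous two."""
    tris = []
    for i in range(len(verts) - 2):
        a, b, c = verts[i], verts[i + 1], verts[i + 2]
        # flip the winding on odd triangles to keep a consistent orientation
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris

def fan_triangles(verts):
    """Triangle fan: every triangle shares the first vertex."""
    return [(verts[0], verts[i], verts[i + 1]) for i in range(1, len(verts) - 1)]
```

So a 5-vertex strip produces 3 triangles, i.e. well under 3 vertices sent per triangle.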
Triangle Meshes • Hardware support for real-time rendering • Rasterization • which pixels are needed to show the object? • Visibility • which object can be seen?
Z-buffer: Basic Idea • Project objects onto screen [diagram: eye position and screen plane]
Z-buffer: Basic Idea • Project every vertex onto screen • Pixels receive appropriate colors [diagram: eye position and screen plane]
"King Vertex" • The vertex is the fundamental primitive in modern real-time rendering • All information stored in vertex • position • color • texture coordinates • surface normal (direction perpendicular to surface) • possibly other attributes, in custom vertex format
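As a sketch of "all information stored in the vertex", here is a hypothetical vertex record in Python (field names are illustrative, loosely mirroring XNA's VertexPositionNormalTexture-style formats):

```python
from dataclasses import dataclass

@dataclass
class Vertex:
    """All per-vertex data travels together, as in a custom vertex format."""
    position: tuple                        # (x, y, z)
    color: tuple = (1.0, 1.0, 1.0, 1.0)    # RGBA
    uv: tuple = (0.0, 0.0)                 # texture coordinates
    normal: tuple = (0.0, 0.0, 1.0)        # direction perpendicular to surface
```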
Hidden Surface Removal • Fundamental rendering problem: • Have a collection of geometry • Need to know what is visible (closest to the eye) at a given point on the screen • Don’t draw things that are behind other things • Historically, devolved to sorting
How to Draw • "Painter's Algorithm" • Sort your objects in order of decreasing distance to the eye • Paint the most distant ones first, the closest ones last • Paint over the images of the distant objects with the closer objects that are in front of them
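The painter's algorithm above amounts to a sort plus an in-order overwrite. A minimal Python sketch (objects reduced to a name and an eye distance; "painting" is just appending, since later entries cover earlier ones):

```python
def painters_algorithm(objects):
    """objects: list of (name, distance_from_eye). Returns draw order, far first."""
    ordered = sorted(objects, key=lambda o: o[1], reverse=True)  # decreasing distance
    framebuffer = []
    for name, _ in ordered:
        framebuffer.append(name)   # stand-in for painting over prior content
    return framebuffer             # later (closer) objects end up "on top"
```

The sort is exactly the burden the Z-buffer removes on the next slide.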
The Z-buffer • Sorting is an enormous burden • The Z-buffer uses dedicated memory to free us from that problem (mostly) • Depth buffer: stores z value at every pixel • Depth test: only draw a fragment if it is closer than the last drawn fragment • Now, objects can be drawn in any order
Z-buffer: Basic Idea • With each pixel, store a depth (Z) value • Specialized buffer available • Initialize z-buffer values to infinity • For each fragment: draw it iff it has a lower value than the previous value (hence is closer) • Update the depth buffer • Brute-force solution to visibility
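The whole procedure fits in a few lines of Python (an illustrative sketch, not XNA code; the fragment format (x, y, z, color) is assumed for the example):

```python
import math

def render_fragments(fragments, width, height):
    """Brute-force z-buffer: keep the nearest fragment at each pixel."""
    depth = [[math.inf] * width for _ in range(height)]  # initialize to infinity
    color = [[None] * width for _ in range(height)]
    for x, y, z, c in fragments:          # fragments may arrive in any order
        if z < depth[y][x]:               # depth test: closer than what's stored?
            depth[y][x] = z               # update the depth buffer
            color[y][x] = c               # update the color buffer
    return color
```

Note that no sorting occurs; the depth test alone guarantees the nearest fragment wins regardless of draw order.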
Complications • Lack of resolution in the depth buffer results in "z-fighting" between close values • Transparent objects need to be drawn last, and multiple transparent objects still demand sorting • In practice, scenes usually contain few transparent objects • Often we don't care if transparent objects behind other transparent objects are shown accurately
Vertex Shading • Lighting calculations done on each vertex to determine color • A custom vertex shader executes, or a standard one: BasicEffect in XNA • Historically, final colors computed by the vertex shader, then interpolated across pixels • "Three-term lighting model" • Gouraud shading
Transformations • Critical task of vertex shader: compute final position of every vertex • Each vertex in the geometry receives appropriate transformation • Same transformation on each vertex • Modeling transform: Moving, orienting, and scaling the objects to create the scene • Viewing transform: Change of coordinate systems from whatever world coordinates into canonical coordinates
Interpolation • Values from vertices interpolated to find values of fragment • Color interpolated (RGBA) • Texture coordinates interpolated • Texture lookup produces per-pixel color • Custom pixel shader executes at this step, potentially taking additional values from vertices and computing final color
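Interpolation across a triangle is typically expressed with barycentric weights: each fragment gets a weighted blend of the three vertices' values. A small Python sketch (illustrative; the real pipeline also does perspective correction, omitted here):

```python
def interpolate(bary, v0, v1, v2):
    """Blend per-vertex values (color, texture coordinates, ...) with
    barycentric weights (w0, w1, w2), which sum to 1 inside the triangle."""
    w0, w1, w2 = bary
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(v0, v1, v2))
```

At a vertex one weight is 1 and the others 0, so the fragment takes that vertex's value exactly; at the centroid all three values average.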
Rasterization • Primitives converted into pixels in the raster (grid) • Fragment values combined into pixel values (might have multiple fragments per pixel) • Depth test applied here
Real-time rendering • Virtually all modern real-time computer graphics done with z-buffering • Hardware executes operations in parallel to accelerate image synthesis • Strict limitations on what can be done • “why do all video games look the same?” • Changing because of access to shaders
But what does it all mean? • Now we will look at how to make use of some of this information in practice • Future lectures: • Writing custom vertex and pixel shaders • Applying and combining transformations • Using transformations to control the camera • moving the camera around in a scene, like in an FPS • For now: • putting geometry in the world • rendering with a static camera and the BasicEffect shader
The Z-buffer and XNA • XNA runs the Z-buffer algorithm for you • enable depth testing, and nearer objects will be drawn in front of farther objects • Fixed bit depth of the Z-buffer is an issue • floating-point Z means less resolution at larger distances • Reminder: the pixel shader can modify depth values • can obtain interesting special effects by adjusting depth
Abstracted Clouds • Fein and McGuire, NPAR 2006 • Partial silhouettes by adjusting depth values
Setting up Geometry • To render geometry, execute the following steps: • create your vertices and set their properties • create a vertex declaration for the graphics device • create and configure an Effect • establish your camera parameters • in Draw, use the Effect to draw your vertices
Vertex Types • Various built-in vertex types provided • different combinations of what information is stored • position • color • texture coordinates • surface normal • VertexPositionColor • VertexPositionColorTexture • VertexPositionNormalTexture (**) • VertexPositionTexture
Vertex Array • Probably you will want to make an array containing your vertex data VertexPositionColor[] mydata = new VertexPositionColor[6]; ... mydata[0] = new VertexPositionColor( new Vector3(1, 3, -1), Color.Aquamarine);
Vertex Declaration • The Graphics Device has to be informed what kind of data it will receive • Done through a VertexDeclaration object vd = new VertexDeclaration(graphics.GraphicsDevice, VertexPositionColor.VertexElements); ... graphics.GraphicsDevice.VertexDeclaration = vd;
the BasicEffect • Built-in vertex shader • The BasicEffect can do lighting • cleverly designed with a 3-light rig • key light: main light (often overhead) • fill light (somewhat dimmer, reduces shadows) • back light (behind object, illuminates silhouettes) • Or, you can disable lighting and just use the raw color • "the BasicEffect is not so basic" • Critical job of any vertex shader: computing the final position of each vertex
Final Position Calculation • Calculation of screen position from vertex position done with matrix multiplication • we'll look at this in some detail in later lectures • Done with three matrices: • world matrix: computes true world coordinates • just set to identity for now • view matrix: transforms world to "canonical" coordinates relative to camera • projection matrix: transforms 3D canonical coordinates to 2D screen coordinates
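The three-matrix chain can be sketched in plain Python (illustrative only; view and projection are left as identity, and the helper names are invented). Note the convention difference: with column vectors the product reads P·V·W·p, while XNA's row-vector convention writes it p · World · View · Projection:

```python
def mat_mul(a, b):
    """4x4 matrix product (row-major lists of lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(m, v):
    """Apply a 4x4 matrix to a column vector (x, y, z, w)."""
    return tuple(sum(m[i][j] * v[j] for j in range(4)) for i in range(4))

def identity():
    return [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

def translation(tx, ty, tz):
    m = identity()
    m[0][3], m[1][3], m[2][3] = tx, ty, tz
    return m

world = translation(1.0, 0.0, 0.0)     # modeling transform: move the object
view, projection = identity(), identity()  # trivial camera for the sketch
wvp = mat_mul(projection, mat_mul(view, world))  # composed once per object
```

Composing the matrices once and applying the product to every vertex is exactly the "same transformation on each vertex" point above.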
Creating the View • Viewing transformation matrix Matrix view; ... Matrix.CreateLookAt(eyepos, lookat, up, out view); • eyepos: position of the camera • lookat: position looked at • up: "up" direction • view: output – the view matrix
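What CreateLookAt builds can be sketched in Python (a standard right-handed look-at construction, offered as an illustration of the idea rather than XNA's exact internals):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def create_look_at(eye, target, up):
    """Right-handed look-at matrix in the spirit of Matrix.CreateLookAt."""
    f = normalize([t - e for t, e in zip(target, eye)])  # camera forward
    r = normalize(cross(f, up))                          # camera right
    u = cross(r, f)                                      # recomputed "true" up
    # Rows rotate world axes onto camera axes; last column moves the eye to the origin.
    return [[r[0], r[1], r[2], -dot(r, eye)],
            [u[0], u[1], u[2], -dot(u, eye)],
            [-f[0], -f[1], -f[2], dot(f, eye)],
            [0.0, 0.0, 0.0, 1.0]]

def apply(m, v):
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]
```

In the result, the eye lands at the origin and the look-at point lies straight ahead along the negative z axis.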
Creating the Projection • Projection transformation matrix Matrix projection; ... Matrix.CreatePerspectiveFieldOfView( fov, aspect, near, far, out projection); • fov: vertical field of view (radians) • aspect: aspect ratio • near: near clipping plane • far: far clipping plane • projection: output – the projection matrix
Projection Information • Need to define a frustum (truncated pyramid) • Different ways of describing • always need near & far distances • In any API: function to get projection matrix given frustum description
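Given the frustum description (field of view, aspect ratio, near and far distances), the projection matrix can be built directly. A Python sketch of the standard right-handed, depth-to-[0,1] construction used by Direct3D-family APIs such as XNA (illustrative, not XNA's exact source):

```python
import math

def create_perspective_fov(fov_y, aspect, near, far):
    """Right-handed perspective matrix mapping view-space depth to [0, 1]."""
    f = 1.0 / math.tan(fov_y / 2.0)   # cotangent of half the vertical FOV
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, far / (near - far), near * far / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def project(m, v):
    """Apply the matrix and divide by w: normalized device coordinates."""
    x, y, z, w = (sum(m[i][j] * v[j] for j in range(4)) for i in range(4))
    return x / w, y / w, z / w
```

Points on the near plane map to depth 0 and points on the far plane to depth 1, which is exactly the range the depth buffer stores.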
Matrices to BasicEffect effect = new BasicEffect(graphics.GraphicsDevice, null); ... effect.World = world; effect.View = view; effect.Projection = projection; • note: the modeling transform property is World – BasicEffect has no Model property
Using BasicEffect • An Effect contains one or more Techniques • A Technique contains one or more Passes effect.Begin(); foreach (EffectPass pass in effect.CurrentTechnique.Passes) { pass.Begin(); ... // drawing geometry here pass.End(); } effect.End();
Rendering your Geometry • Various ways to specify • Arguably simplest: graphics.GraphicsDevice.DrawUserPrimitives( PrimitiveType.TriangleStrip, mydata, start, numprimitives); • mydata: vertex array • start: first element • numprimitives: number of primitives
Recap • Z-buffer: algorithm for real-time rendering • vertices projected onto screen • vertices contain data: position, color, ... • intermediate fragments interpolated • depth test used to render fragments in front • A lot of setup needed in XNA to render • Vertex data and VertexDeclaration • "Effects" to transform and light vertices • transforms achieved through matrices • Drawing syntax
Looking Forward • Custom shaders • Texture for added visual complexity • Closer look at transforms • mathematics of transforms • homogeneous coordinates • composite transforms • modeling transforms, camera control