Graphics Pipeline & Shaders SET09115 Intro to Graphics Programming
Breakdown • Background • Working with Data • Graphics Pipeline • Pipeline Stages • Programmable Shaders
Recommended Reading • “Real-Time Rendering”, Third Edition, Akenine-Möller et al. • Chapters 1 - 3
Review • So far you have been mainly working with basic geometry • Creating triangles from three points • Creating quads from four points • Using these shapes to build more complex models • Geometry can have other values attached • Colour • Texture coordinates • etc.
What is a Pipeline? • A pipeline is just a series of processes that occur in order • Read File → Process Data → Print Results • A pipeline can have any number of processes within it • The point is that data from one process is fed into the next process • Think of an assembly line
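As an illustration only (plain C; the stage functions are hypothetical), the read/process/print pipeline above could be sketched as:

#include <stdio.h>

/* One placeholder processing stage. */
static int process_data(int value) { return value * 2; }

int main(void) {
    int value;
    /* Stream: Read -> Process -> Print, one item at a time. */
    while (scanf("%d", &value) == 1) {
        printf("%d\n", process_data(value));
    }
    return 0;
}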
Parallelizing a Pipeline • Pipelines are very easy to parallelize • Pipelining • Each process works independently • Whenever one process has data ready for the next, it forwards it on • The sending process can then continue processing new data • Internal parallelism • Each process can itself work on different parts of the incoming data • Allows the process to speed up based on the number of internal workers
Throughput of Data • Data throughput in a pipeline is fast in comparison to traditional processing • Data parallelism • Streaming • Overall throughput is determined by the slowest process • This bottleneck limits the pipeline no matter how fast the other processes are
Questions? Pipelines Graphics Pipeline
Vertex Data • Data sent to the graphics card is called vertex data • Position of vertex • Colour of vertex • Texture coordinates • Normals • This is what you use to define your objects

// Vertex 1 (in immediate mode, attributes such as colour
// must be set before glVertex issues the vertex)
glColor3f(1.0f, 0.0f, 0.0f);
glVertex3f(1.0f, 0.0f, 0.0f);
// Vertex 2
glColor3f(1.0f, 0.0f, 1.0f);
glVertex3f(1.0f, 1.0f, 0.0f);
// etc.
Vertex Buffers • To stream vertex information efficiently, we use vertex buffers • Arrays of vertices • Buffers exist for vertices, normals, colours, etc.

GLfloat vertices[] = {
    1.0f, 0.0f, 0.0f,
    1.0f, 1.0f, 0.0f,
    0.0f, 1.0f, 0.0f
};
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawArrays(GL_TRIANGLES, 0, 3);
Index Data • Working with buffers allows us to exploit index data to further improve efficiency • Indices let us reuse vertex data from a buffer at different points
Index Buffers • Index data is also stored in a buffer • The buffer contains index numbers relating to positions of vertices in the vertex buffer • Rendering reuses vertices from the buffer

// Create the eight vertices
GLfloat vertices[] = { .. };
glVertexPointer(3, GL_FLOAT, 0, vertices);
// Create index buffer
GLubyte indices[] = {
    0, 3, 1,
    1, 3, 2,
    // etc.
};
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_BYTE, indices);
Questions? Data for the Pipeline Using Data on the Pipeline
History • First work on 3D rendering occurred in the 1960s • Most common 3D rendering techniques evolved in the 1960s-1970s • 1980s saw the introduction of graphics hardware • Commodore Amiga first commercial computer to have a GPU • 1990s saw the introduction of graphics cards and graphics APIs • OpenGL, DirectX
Fixed-Function Pipeline • Initially graphics cards were not programmable • Fixed-function pipeline • Programmers could turn states on and off, and provide some values to these states • This is what you have done so far with OpenGL • The Nintendo Wii still has a fixed-function pipeline • Although this advanced rendering, it did not provide flexibility
Programmable Pipeline • To add flexibility, GPUs were given programmable stages • Shaders • First shaders used in the 1980s • Pixar’s RenderMan • In 2001, nVidia introduced the first GPU with programmable shaders • Vertex shading • Today, nearly all GPUs support shading • Vertex, Geometry, Pixel / Fragment
Application → Geometry → Rasterizer • The three main stages of the graphics pipeline are: • Application • OpenGL, DirectX stage • Application stage feeds geometry data (vertex data) to the geometry stage • Geometry • GPU • Works on the vertices and polygons towards screen mapping • Rasterizer • GPU • Sets pixel colours for the screen
Questions? Graphics Pipelines Evolution General Stages
Application Stage • Complete control from the programmer • Use graphics API to send data to the GPU • This is where we have been working up until now • May do culling to improve performance
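As a minimal sketch of application-stage culling (all names and the distance test here are hypothetical, not part of any graphics API):

#include <math.h>

typedef struct { float x, y, z, radius; } BoundingSphere;

/* Hypothetical application-stage cull: skip objects whose bounding
   sphere lies entirely beyond a chosen draw distance. */
static int should_draw(BoundingSphere s, float cam_x, float cam_y,
                       float cam_z, float far_dist) {
    float dx = s.x - cam_x, dy = s.y - cam_y, dz = s.z - cam_z;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz);
    return dist - s.radius <= far_dist;  /* draw if any part is in range */
}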
Geometry Stage • The GPU transforms the data received from the application stage into screen-mapped data • Has five internal stages • Model & View Transform • Vertex Shading • (the geometry shader will be discussed in a later lecture) • Projection • Clipping • Screen Mapping
Model and View Transform • Model initially exists within its own model coordinate space • Allows movement, rotation, etc. • Model transformation can be done on the GPU • More on this throughout the module • View transformation involves moving the model so it exists in the camera’s coordinate space • Camera at the origin • Model moved to suit
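In the fixed-function OpenGL you have used so far, the model and view transforms are set through the modelview matrix stack; a minimal sketch with placeholder values:

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
/* View transform: gluLookAt effectively moves the world so the
   camera sits at the origin looking down the -z axis. */
gluLookAt(0.0, 0.0, 5.0,   /* eye position */
          0.0, 0.0, 0.0,   /* point looked at */
          0.0, 1.0, 0.0);  /* up vector */
/* Model transform: move the model from its own coordinate space
   into the scene. */
glTranslatef(1.0f, 0.0f, 0.0f);
glRotatef(45.0f, 0.0f, 1.0f, 0.0f);
/* ... draw the model here ... */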
Vertex Shading • First programmable stage • Vertex data is used to perform per-vertex lighting calculations • Because lighting is only computed at the vertices, you can often still see the underlying triangles
Projection • 3D data is now projected using the projection matrix • The view volume (frustum) is transformed into a unit cube • (-1, -1, -1) to (1, 1, 1) • Two common projection methods • Orthographic (essentially flat) • Parallel lines remain parallel • Perspective • Distance from the camera has an effect
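In fixed-function OpenGL both projections are set on the projection matrix; a minimal sketch (the clip-plane values are placeholders):

glMatrixMode(GL_PROJECTION);
glLoadIdentity();

/* Orthographic: parallel lines stay parallel. */
glOrtho(-1.0, 1.0, -1.0, 1.0, 0.1, 100.0);

/* Or perspective: distant objects appear smaller.
   60-degree vertical field of view, 4:3 aspect ratio. */
/* gluPerspective(60.0, 4.0 / 3.0, 0.1, 100.0); */

glMatrixMode(GL_MODELVIEW);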
Clipping • We only ever try to draw what we can see • Culling has already removed a lot for us • After the model, view and projection transforms, only some primitives are still visible • The rest are removed • Some primitives may be only partially visible • These are clipped • New vertices are introduced to support the clipping
Screen Mapping • The final process in the geometry stage • Only visible primitives need to be mapped to the screen • This is relatively simple – the transformed (x, y) coordinates are used as screen coordinates
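The mapping is just a scale and offset from the unit cube's (-1, 1) range into window pixels; a minimal sketch in C (the helper function is hypothetical; in OpenGL this stage is configured through glViewport):

/* Map normalised device coordinates (-1..1) to window pixels. */
static void ndc_to_screen(float ndc_x, float ndc_y,
                          int width, int height,
                          float *sx, float *sy) {
    *sx = (ndc_x + 1.0f) * 0.5f * (float)width;
    *sy = (ndc_y + 1.0f) * 0.5f * (float)height;
}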
Rasterizer Stage • This is where we start assigning colours to pixels on screen • Four stages • Triangle setup • Triangle traversal • Pixel Shading • Merging
Triangle Setup and Traversal • These stages are fixed within the graphics pipeline • Goal is to determine which triangles cover which pixels • Including the depth of the triangle pixel • This data is used later in the merging stage
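The GPU does this with fixed hardware, but the idea can be sketched with edge functions: a pixel is covered if it lies on the inside of all three triangle edges (illustrative C only, not what the hardware literally runs):

/* Signed area test: > 0 if point (px, py) is to the left of edge a->b. */
static float edge(float ax, float ay, float bx, float by,
                  float px, float py) {
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

/* Traversal: a pixel centre is covered if it is inside all three
   edges (assumes counter-clockwise winding). */
static int covers(float x0, float y0, float x1, float y1,
                  float x2, float y2, float px, float py) {
    return edge(x0, y0, x1, y1, px, py) >= 0.0f &&
           edge(x1, y1, x2, y2, px, py) >= 0.0f &&
           edge(x2, y2, x0, y0, px, py) >= 0.0f;
}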
Pixel Shading • Programmable stage allowing manipulation of pixel colours • Lighting here provides a smoother effect • Texturing also done here
Merging • Determines the final colour of a pixel • Data from pixel shader, colour buffers, depth, etc. are used to determine final colour of a screen pixel • Other operations can be performed here based on pipeline state
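In OpenGL the merge behaviour is controlled through pipeline state rather than shader code; for example:

/* Depth test: keep a fragment only if it is nearer than what is
   already in the depth buffer. */
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LESS);

/* Blending: combine the fragment colour with the colour already in
   the framebuffer (standard alpha blending). */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);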
High Level Stages • For our purposes, we can think of the graphics pipeline as two stages • Programmable Shader Stage • Vertex shader to perform some lighting calculations • Geometry shader • Transforms vertices
High Level Stages • Merging / Raster Operations Stage • Colours individual pixels • Pixel shader can be used in here • Per-pixel lighting • Texturing etc. • Understanding the two-stage view is important, but be aware of the internal operations
Questions? Application Stage Geometry Stage Rasterization Stage
Programmable Shaders • As mentioned, three stages in the graphics pipeline are programmable • Vertex shader • Manipulates individual vertices • Geometry shader • Works with primitives (triangles, lines, etc.) • Pixel shader • Works with pixels
Vertex Shader • Lets us work with the incoming vertex data • Position • Colour • Normals • Can calculate values related to these vertices • Light colour • World position
Geometry Shader • Lets us work with individual pieces of geometry • Lines • Triangles • Quads • Can introduce new geometry at this stage
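A minimal pass-through geometry shader sketch in GLSL (geometry shaders require GLSL 1.50+, so the syntax is newer than the shaders shown later in these slides):

#version 150

layout(triangles) in;                         // input primitive type
layout(triangle_strip, max_vertices = 3) out; // output primitive type

void main() {
    // Pass each incoming triangle through unchanged; extra vertices
    // could be emitted here to create new geometry.
    for (int i = 0; i < 3; i++) {
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}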
Stream Output • It is possible to take the output of the geometry shader at this point if required • No rendering to the screen • This is done when working with geometry data you wish to manipulate and then use in another rendering pass • Particle effects (explosions) performed here
Pixel Shader • Lets us determine colours of individual pixels • Per-pixel lighting • Texturing • Normal (bump) mapping • etc.
Render to Texture • Also possible to output at this stage to a texture instead of the screen • Useful for post-processing techniques • Has numerous uses • Motion blur, depth of field, shadowing, etc. • Benie and Artur’s lighting effect last year used this technique
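A minimal render-to-texture sketch using OpenGL framebuffer objects (texture creation and error checking omitted; tex is assumed to be an existing, sized 2D texture):

GLuint fbo;
GLuint tex; /* assumed: an existing, sized 2D texture */

/* Attach the texture as the render target instead of the screen. */
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);

/* ... render the scene: output now goes into tex ... */

/* Return to the default framebuffer (the screen); tex can now be
   used as an input texture for a post-processing pass. */
glBindFramebuffer(GL_FRAMEBUFFER, 0);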
Example Shader • Vertex Shader

varying vec3 normal, lightDir;

void main() {
    lightDir = normalize(vec3(gl_LightSource[0].position));
    normal = normalize(gl_NormalMatrix * gl_Normal);
    gl_Position = ftransform();
}

• Pixel Shader

varying vec3 normal, lightDir;

void main() {
    float intensity;
    vec3 n;
    vec4 color;

    n = normalize(normal);
    intensity = max(dot(lightDir, n), 0.0);
    if (intensity > 0.98)
        color = vec4(0.8, 0.8, 0.8, 1.0);
    else if (intensity > 0.5)
        color = vec4(0.4, 0.4, 0.8, 1.0);
    // etc.
    gl_FragColor = color;
}
Effect Files • Shader programs are stored in effect files • DirectX uses a single file approach • OpenGL allows shaders to be split between files then linked together • OpenGL and DirectX compile these files at runtime • So, do it during the loading process • Can set values using the API calls
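In OpenGL, the runtime compile-and-link step looks roughly like this (file loading and error checking omitted; the uniform name is a placeholder):

const GLchar *vertex_source;   /* assumed: loaded from the shader file */
const GLchar *fragment_source; /* assumed: loaded from the shader file */

/* Compile each shader stage from its source string. */
GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vertex_source, NULL);
glCompileShader(vs);

GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fragment_source, NULL);
glCompileShader(fs);

/* Link the separate shaders into one program object. */
GLuint program = glCreateProgram();
glAttachShader(program, vs);
glAttachShader(program, fs);
glLinkProgram(program);

/* Activate the program and set a value from the application. */
glUseProgram(program);
glUniform1f(glGetUniformLocation(program, "intensity"), 1.0f);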
Languages • A number of shader languages exist • High Level Shader Language (HLSL) • DirectX language • GL Shader Language (GLSL) • OpenGL language • C for graphics (Cg) • nVidia language • Supported via libraries in DirectX and OpenGL • Can be slower because it integrates through an extra library layer
Summary • We’ve taken a brief trip down the graphics pipeline • Application → Geometry → Rasterization • We’ve also looked at the programmable stages • Vertex, Geometry, Pixel • We will be using these ideas as we go forward through the module