GLSL. OpenGL Programming and Reference Guides, other sources; ppt from Angel, AW, van Dam, etc. CSCI 6360/4360
Review and Introduction … Last time: Implementation – CG Algorithms • First part of course dealt mainly with geometric processing • Below focuses on role of transformations • Last week, looked at other elements of viewing pipeline, and algorithms: • Clipping was one thing we looked at • Eliminating objects (and parts of objects) that lie outside view volume – and so not visible in image • Rasterization • Produces fragments (pixels) from remaining objects • Hidden surface removal (visible surface determination) • Determines which object fragments are visible, and so are put in frame buffer
Review & Intro.Implementation -- Algorithms • Next steps in viewing pipeline: • Clipping • Eliminating objects (and parts of objects) that lie outside view volume • Rasterization • Produces fragments (pixels) from remaining objects • Hidden surface removal (or, visible surface determination) • Determines which object fragments are visible, and, so, put in frame buffer • Show objects (surfaces, pixels) not blocked by objects closer to camera • Recall, … and next week … • Now, it’s next week! • Quick, re-orientation
Tasks to Render a Geometric Entity (1) Review and Angel Explication • Angel introduces more general terms and ideas than just those of the OpenGL pipeline … • Recall chapter title “From Vertices to Fragments” … and even pixels • From definition in user program to (possible) display on output device • Modeling, geometry processing, rasterization, fragment processing • Modeling • Performed by application program, e.g., create sphere polygons (vertices) • Angel example of spheres and creating data structure for OpenGL use • Product is vertices (and their connections) • Application might even reduce “load”, e.g., by generating no back-facing polygons
Tasks to Render a Geometric Entity (2) Review and Angel Explication • Geometry Processing • Works with vertices • Determines which geometric objects appear on display • 1. Perform clipping to view volume • Changes object coordinates to eye coordinates • Transforms vertices to normalized view volume using projection transformation • 2. Primitive assembly • Clipping an object (and its surfaces) can result in new surfaces (e.g., shorter line, polygon of different shape) • Working with these “new” elements to “re-form” (clipped) objects is primitive assembly • Necessary for, e.g., shading • 3. Assignment of color to vertex • Modeling and geometry processing together called “front-end processing” • All involve 3-d calculations and require floating-point arithmetic
Tasks to Render a Geometric Entity (3) Review and Angel Explication • Rasterization • Only x, y values needed for (2-d) frame buffer • … as the frame buffer is what is displayed • Rasterization, or scan conversion, determines which fragments are displayed (put in frame buffer) • For polygons, rasterization determines which pixels lie inside the 2-d polygon determined by the projected vertices • Colors • Most simply, fragment (pixel) colors are determined by interpolation of vertex shades & put in frame buffer • Color can also be determined during fragment processing (more later) • Output of rasterizer is in units of the display (window coordinates)
Tasks to Render a Geometric Entity (4) Review and Angel Explication • Fragment Processing – will consider more about this tonight • (last time) Hidden surface removal performed fragment by fragment using depth information • Colors • OpenGL can merge color (and lighting) results of rasterization stage with geometric pipeline • E.g., shaded, texture-mapped polygon • Lighting/shading values of vertex merged with texture map • For translucence, must allow light to pass through fragment • Blending of colors uses combination of fragment colors with colors already in frame buffer • e.g., multiple translucent objects • Anti-aliasing also dealt with at this stage
Architecture Views: Geometry Path & Pixel Path • [Diagram: the OpenGL architecture, showing the geometry path and the pixel path]
Shaders • “Early” OpenGL had fixed software (algorithms) for performing processing of vertices and fragments • A succession of changes to the OpenGL standard has introduced the ability to let the programmer specify the software (algorithms) for that processing • Called “shaders” • Now, providing shaders is “required” • [Diagram: pipeline with vertex shader and fragment shader stages]
History of OpenGL … It Matters or, how Moore’s law changes things … quickly • In the Beginning … (Angel) • OpenGL 1.0 was released on July 1st, 1994 • Its pipeline was entirely fixed-function • Only operations available were those fixed by the implementation of OpenGL • The pipeline evolved, but the fixed-function model remained through OpenGL versions 1.1 to 2.0 (Sept. 2004) • [Diagram, fixed-function pipeline: Vertex Data → Vertex Transform and Lighting → Primitive Setup and Rasterization → Fragment Coloring and Texturing → Blending; Pixel Data and Texture Store feed the fragment stages]
Start of the Programmable Pipeline OpenGL 2.0, 2004, (optional) programmable shaders • OpenGL 2.0 (2004) added programmable shaders • Vertex shading augmented the fixed-function transform and lighting stage • Fragment shading augmented the fragment coloring stage • However, the fixed-function pipeline was still available • [Diagram: same fixed-function pipeline, with the vertex and fragment stages now optionally programmable]
An Evolutionary Change OpenGL 3.1, 2009, deprecation • OpenGL 3.0 introduced the deprecation model • Method used to remove features from OpenGL • Pipeline remained the same until OpenGL 3.1 (March 24th, 2009) • Introduced a change in how OpenGL contexts are used
Exclusively Programmable Pipeline OpenGL 3.1, 2009, required shaders • OpenGL 3.1 removed the fixed-function pipeline • Programs were required to use only shaders • Additionally, almost all data is GPU-resident • All vertex data sent using buffer objects • [Diagram: Vertex Data → Vertex Shader → Primitive Setup and Rasterization → Fragment Shader → Blending; Pixel Data and Texture Store]
More Programmability OpenGL 3.2, 2009, geometry shaders • OpenGL 3.2 (released August 3rd, 2009) added an additional shading stage – geometry shaders • [Diagram: Vertex Data → Vertex Shader → Geometry Shader → Primitive Setup and Rasterization → Fragment Shader → Blending; Pixel Data and Texture Store]
More Evolution – Context Profiles • OpenGL 3.2 also introduced context profiles • Profiles control which features are exposed • Currently two types of profiles: core and compatibility
The Latest Pipeline OpenGL 4.1, 2010, tessellation shaders • OpenGL 4.1 (released July 25th, 2010) included additional shading stages – • Tessellation-control and tessellation-evaluation shaders • Latest version is 4.3 • [Diagram: Vertex Data → Vertex Shader → Tessellation Control Shader → Tessellation Evaluation Shader → Geometry Shader → Primitive Setup and Rasterization → Fragment Shader → Blending; Pixel Data and Texture Store]
OpenGL ES and WebGL • OpenGL ES 2.0 • Designed for embedded and hand-held devices such as cell phones • Based on desktop OpenGL 2.0 • Shader based • WebGL • JavaScript implementation of ES 2.0 • Runs on most recent browsers, e.g., Firefox, Chrome, … but not all
OpenGL and GLSL • Shader-based OpenGL follows not a state machine model, but a data flow model • Most state variables, attributes and related pre-3.1 OpenGL functions have been deprecated • Vertex and fragment algorithms are executed in shaders • Application transfers data to GPU, then GPU executes shaders • More detail later • GLSL – “new” language used to program shaders • OpenGL Shading Language • C-like, with • Matrix and vector types (2, 3, 4 dimensional) • Overloaded operators • C++-like constructors • Similar to Nvidia’s Cg and Microsoft’s HLSL • Code sent to shaders as source code • New OpenGL functions to compile, link and get information to shaders
Recall, A (realllly) Simple Program (3rd week) • Simple is good … • In fact, only simple because OGL sets defaults for all • If nothing were defaulted (or, say, everything set to 0’s), it could be 50-100 lines of code • End of semester exercise • Generate a square on a solid background:
Recall, simple.c (3rd week)

#include <GL/glut.h>            // uses lots of defaults

void mydisplay(void);           // the callback, defined below

int main(int argc, char** argv)
{
    glutInit(&argc, argv);      // initialize GLUT
    glutCreateWindow("simple");
    glutDisplayFunc(mydisplay); // callback function
    glutMainLoop();
    return 0;
}

// changes ahead: glVertex is immediate mode – will instead draw GPU-resident data
void mydisplay()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POLYGON);
        glVertex2f(-0.5, -0.5);
        glVertex2f(-0.5,  0.5);
        glVertex2f( 0.5,  0.5);
        glVertex2f( 0.5, -0.5);
    glEnd();
    glFlush();
}

• Just draws it:
Using GLSL and OpenGL 3.1 • Again, “new style” OpenGL (3.1 or later) • Most old OpenGL functions deprecated • Cannot depend on state variable default values that no longer exist • Viewing, colors, window parameters, … • Still a similar structure of functions • main(): specifies callback functions, opens window(s), enters event loop • init(): sets the state variables for viewing, attributes, etc. • Callbacks • Display function • Input and window functions • But now, initShader(): read, compile and link shaders • initShader() utilities usually hide details of setting up shaders • Recall, survey of OpenGL supported by your computers; Angel’s initialization requires 3.2 • Key issue is that one must form a data array (a “vertex buffer object”) to send to the GPU and then render it • A minimal sketch of this structure follows
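• As a rough sketch of the “new style” program structure (the name initShader() and the shader file names are illustrative, not a fixed API; Angel’s InitShader utility is along these lines):

#include <GL/glew.h>
#include <GL/glut.h>

GLuint program;   // shader program object

// hypothetical utility: reads, compiles, and links the two shaders
GLuint initShader(const char* vfile, const char* ffile);

void init(void)
{
    glClearColor(1.0, 1.0, 1.0, 1.0);   // state that used to be defaulted
    program = initShader("vshader.glsl", "fshader.glsl");
    glUseProgram(program);
    // ... create vertex array object and buffer object, load vertex data ...
}

void mydisplay(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glDrawArrays(GL_TRIANGLES, 0, 3);   // render the GPU-resident data
    glFlush();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("simple, new style");
    glewInit();                         // load modern GL entry points
    init();
    glutDisplayFunc(mydisplay);
    glutMainLoop();
    return 0;
}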
Graphics Modes and VBOs (vertex buffer objects) • Immediate Mode Graphics – “old” OpenGL • Geometry specified by vertices • Each time a vertex is specified in application, its location is sent to the GPU • Old style uses glVertex • Creates bottleneck between CPU and GPU • Removed from OpenGL 3.1 • Retained Mode Graphics – a middle step • Put all vertex and attribute data in array • Send array to GPU to be rendered immediately • Better, but have to send array over each time we need another render of it • Better still to send array over once and store on GPU for multiple renderings – “new” OpenGL • “Vertex buffer objects” • [Diagram: CPU application sending vertices to GPU one at a time (immediate), the whole array per render (retained), and once into GPU-resident storage (VBO)]
Display Callback • Once data available on GPU, can initiate rendering with a simple callback • Arrays are buffer objects that contain vertex arrays

void mydisplay()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glFlush();
}

• Vs. the old immediate-mode version:

void mydisplay()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POLYGON);
        glVertex2f(-0.5, -0.5);
        glVertex2f(-0.5,  0.5);
        glVertex2f( 0.5,  0.5);
        glVertex2f( 0.5, -0.5);
    glEnd();
    glFlush();
}

• [Diagram: GPU-resident data vs. per-vertex CPU-to-GPU traffic]
Vertex Arrays New Programming Approach • Vertices can have many attributes • Position, color, texture coordinates, application data • A vertex array holds these data • Using types in vec.h, e.g.,

point2 vertices[3] = {point2(0.0, 0.0),
                      point2(0.0, 1.0),
                      point2(1.0, 1.0)};

• Vertex array object • Bundles all vertex data (positions, colors, …) • Get a name for the object, then bind it:

GLuint abuffer;
glGenVertexArrays(1, &abuffer);
glBindVertexArray(abuffer);

• At this point have a current vertex array but no contents • Use of glBindVertexArray allows switching between vertex array objects
Buffer Objects • So, buffer objects allow transfer of large amounts of data to GPU • Need to create, bind and identify data – the buffer • Data in current vertex array is sent to GPU

GLuint buffer;
glGenBuffers(1, &buffer);
glBindBuffer(GL_ARRAY_BUFFER, buffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(points), points, GL_STATIC_DRAW);

• One step remains: connecting the buffer to the shader’s input variable – see the sketch below
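• A minimal sketch of that remaining step, assuming the vertex shader declares in vec4 vPosition and that program holds the linked shader program (as on the earlier slide):

GLuint loc = glGetAttribLocation(program, "vPosition"); // index of shader input
glEnableVertexAttribArray(loc);                         // turn the attribute on
glVertexAttribPointer(loc, 2, GL_FLOAT, GL_FALSE, 0,    // 2 floats per vertex,
                      (void*) 0);                       // from start of buffer

• This association is recorded in the currently bound vertex array object, so it need be done only once, at initialization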
Initialization Different with Shaders • Vertex array objects and buffer objects can be set up in init() • Just done once at start-up of program • Also set clear color and other OpenGL parameters • Also set up shaders as part of initialization • Read, compile, link • Again, utilities are commonly used
Programming Shaders • First programmable shaders were programmed in an assembly-like manner • OpenGL extensions added for vertex and fragment shaders • Cg (C for graphics): a C-like language for programming shaders • Works with both OpenGL and DirectX • Its interface to OpenGL is complex • OpenGL Shading Language (GLSL)
GLSL • OpenGL Shading Language • Part of OpenGL 2.0 and up • High-level C-like language • New data types • Matrices, vectors, samplers • “Built-in variables” – or, keywords, who would have known • gl_Position: output position from vertex shader • gl_FragColor: output color from fragment shader • Note – below is a (complete) vertex shader! A bit more later

in vec4 vPosition;   // input from application; must link to variable in application

void main(void)
{
    gl_Position = vPosition;   // built-in variable
}
Execution Model for Shaders Detail of First Diagram • [Diagram: application program issues glDrawArrays; vertex data flow to the Vertex Shader, vertices to Primitive Assembly, the Rasterizer emits fragments to the Fragment Shader, and fragment colors go to the Frame Buffer]
GLSL Data Types and Pointers A new, somewhat primitive, language with familiar constructs • C types: int, float, bool • Vectors: • float: vec2, vec3, vec4 • Also int (ivec) and boolean (bvec) versions • Matrices: mat2, mat3, mat4 • Stored by columns • Referenced as m[column][row] – the first index selects a column • C++ style constructors • vec3 a = vec3(1.0, 2.0, 3.0) • vec2 b = vec2(a) • Pointers • There are no pointers in GLSL! • Can use C structs, which can be copied back from functions • Because matrices and vectors are basic types, they can be passed into and returned from GLSL functions, e.g. mat3 func(mat3 a) – see the sketch below
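• A small GLSL sketch of these constructs (illustrative only):

// vectors and constructors
vec3 a = vec3(1.0, 2.0, 3.0);   // 3-component float vector
vec2 b = vec2(a);               // drops a.z: b == vec2(1.0, 2.0)
ivec2 n = ivec2(1, 2);          // integer vector

// matrices are built and indexed by column
mat3 m = mat3(1.0);             // 3x3 identity
vec3 col0 = m[0];               // first column
float f = m[0][1];              // column 0, row 1

// vectors/matrices are basic types, so functions can take and return them
mat3 func(mat3 a)
{
    return 2.0 * a;             // scalar-matrix product
}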
Example Shaders Recall gl_Position and gl_FragColor keywords • Vertex shader

in  vec4 vPosition;
out vec4 color_out;
const vec4 red = vec4(1.0, 0.0, 0.0, 1.0);

void main(void)
{
    gl_Position = vPosition;
    color_out = red;
}

• Fragment shader

// “pass through” fragment shader
in vec4 color_out;

void main(void)
{
    gl_FragColor = color_out;
}
More GLSL • Passing values • Call by value-return • Variables are copied in • Returned values are copied back • Three possibilities: in, out, inout • (The older attribute and varying qualifiers are deprecated) • See the sketch after this list • Operators and functions • Standard C functions • Trigonometric, arithmetic • normalize, reflect, length • Overloading of vector and matrix types

mat4 a;
vec4 b, c, d;
c = b*a; // b treated as a row vector; result stored as a 1d array
d = a*b; // b treated as a column vector; result stored as a 1d array
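• A sketch of the three parameter qualifiers in a GLSL function (names are illustrative):

// value-return semantics: 'in' copies in, 'out' copies back, 'inout' does both
void offsetAndScale(in vec3 p, inout vec3 accum, out float len)
{
    accum += p;        // caller's variable updated on return
    len = length(p);   // written result copied back to caller
}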
Swizzling and Selection • Can refer to array elements by element using [] or selection (.) operator with • x, y, z, w • r, g, b, a • s, t, p, q • E.g., a[2], a.z, a.b, a.p are the same • Swizzling operator lets us manipulate components • vec4 a; • a.yz = vec2(1.0, 2.0);
Swizzling – From Wikipedia • Swizzling means rearranging the elements of a vector • For example, if A = {1,2,3,4}, where the components are x, y, z, and w respectively, you could compute B = A.wwxy, and B would equal {4,4,1,2} • In terms of linear algebra, this is equivalent to multiplying by a matrix of zeros and ones such that each row has exactly one one. If

A = \begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix},

then swizzling as above looks like

B = A.wwxy = \begin{pmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix} = \begin{pmatrix} 4 \\ 4 \\ 1 \\ 2 \end{pmatrix}
Getting Shaders into OpenGL • Shaders need to be compiled and linked to form an executable shader program • OpenGL provides the compiler and linker • A program must contain vertex and fragment shaders • Other shaders are optional • Typically use a utility • Recall Angel’s utility for text requires OpenGL 3.2 • The steps (those marked * are repeated for each type of shader in the shader program); a sketch of such a utility follows the list:

1. Create program: glCreateProgram()
2. Create shader: glCreateShader() *
3. Load shader source: glShaderSource() *
4. Compile shader: glCompileShader() *
5. Attach shader to program: glAttachShader() *
6. Link program: glLinkProgram()
7. Use program: glUseProgram()
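• A minimal sketch of such a utility in C, assuming the shader sources are already in memory as strings (error reporting abbreviated; a file-based wrapper like the initShader() of the earlier slide would read the files and then call this):

#include <GL/glew.h>
#include <stdio.h>

// compile one shader of the given type (GL_VERTEX_SHADER or GL_FRAGMENT_SHADER)
static GLuint compile(GLenum type, const char* src)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, NULL);
    glCompileShader(shader);

    GLint ok;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) fprintf(stderr, "shader compile failed\n");
    return shader;
}

// build a program object from vertex and fragment shader source strings
GLuint initShaderFromSource(const char* vsrc, const char* fsrc)
{
    GLuint program = glCreateProgram();
    glAttachShader(program, compile(GL_VERTEX_SHADER, vsrc));
    glAttachShader(program, compile(GL_FRAGMENT_SHADER, fsrc));
    glLinkProgram(program);
    glUseProgram(program);
    return program;
}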
A Phong Vertex Shader • Recall basic OpenGL shading model discussed two weeks ago: the modified Phong (Blinn-Phong) model, with ambient, diffuse, and specular terms • Can write and incorporate a vertex shader that does the same thing • Though will not consider attenuation • First, values for the variables below are passed in from the application

// Variables to specify vertex positions, normal, (surface) color
// as well as values for the several parameters of the shading model
in vec4 vPosition;
in vec3 vNormal;
out vec4 color;
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform mat4 ModelView;
uniform mat4 Projection;
uniform vec4 LightPosition;
uniform float Shininess;
A Phong Vertex Shader Geometry processing • Shader does all the geometry processing, so will need to perform, e.g., the multiplication that transforms the vertex position (in model/world coordinates) to position in eye coordinates

void main()
{
    // Transform vertex position into eye coordinates
    vec3 pos = (ModelView * vPosition).xyz;

    // Used in computing light value (went through quickly in class)
    vec3 L = normalize(LightPosition.xyz - pos);  // vec from vertex to light
    vec3 E = normalize(-pos);                     // “eye” vector
    vec3 H = normalize(L + E);                    // “half-way” vector

    // Transform vertex normal into eye coordinates
    vec3 N = normalize(ModelView * vec4(vNormal, 0.0)).xyz;
A Phong Vertex Shader Incorporate ambient, diffuse, and specular components • Computing the shading value from the model is (relatively) straightforward • Again, not considering attenuation

    // Compute terms in the illumination equation
    vec4 ambient = AmbientProduct;                  // computed in application
    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * DiffuseProduct;
    float Ks = pow(max(dot(N, H), 0.0), Shininess);
    vec4 specular = Ks * SpecularProduct;
    if (dot(L, N) < 0.0)                            // correct, if light behind
        specular = vec4(0.0, 0.0, 0.0, 1.0);        // recall, back face cull

    gl_Position = Projection * ModelView * vPosition;

    color = ambient + diffuse + specular;           // finally, sum components
    color.a = 1.0;                                  // and make opaque
} // end
Phong Fragment Shader • Can also perform per-fragment lighting calculations with a fragment shader – see the sketch below • [Angel example images: per-vertex lighting vs. per-fragment lighting]
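• A sketch of the per-fragment version, moving the lighting computation of the previous slides into the fragment shader (a plausible arrangement, not necessarily Angel’s exact code); the vertex shader now just hands the eye-space vectors to the rasterizer for interpolation:

// vertex shader: transform, and pass position/normal-derived vectors through
in vec4 vPosition;
in vec3 vNormal;
out vec3 fN;   // normal, interpolated per fragment
out vec3 fE;   // eye vector
out vec3 fL;   // light vector
uniform mat4 ModelView, Projection;
uniform vec4 LightPosition;

void main()
{
    vec3 pos = (ModelView * vPosition).xyz;
    fN = (ModelView * vec4(vNormal, 0.0)).xyz;
    fE = -pos;
    fL = LightPosition.xyz - pos;
    gl_Position = Projection * ModelView * vPosition;
}

// fragment shader: per-fragment Blinn-Phong, as on the vertex-shader slides
in vec3 fN, fE, fL;
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform float Shininess;

void main()
{
    vec3 N = normalize(fN);   // re-normalize after interpolation
    vec3 E = normalize(fE);
    vec3 L = normalize(fL);
    vec3 H = normalize(L + E);

    vec4 ambient = AmbientProduct;
    vec4 diffuse = max(dot(L, N), 0.0) * DiffuseProduct;
    vec4 specular = pow(max(dot(N, H), 0.0), Shininess) * SpecularProduct;
    if (dot(L, N) < 0.0) specular = vec4(0.0, 0.0, 0.0, 1.0);

    gl_FragColor = vec4((ambient + diffuse + specular).rgb, 1.0);
}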
More Fragment Shader Applications • Environment maps • Texture generation • Fog • Antialiasing • Scissoring • Alpha test • Blending • Dithering • Logical operations • Masking
End