Week 8 - Friday CS361
Last time • What did we talk about last time? • Radiometry • Photometry • Colorimetry • Lighting with shader code • Ambient • Directional (diffuse and specular) • Point
BRDF theory • The bidirectional reflectance distribution function describes the ratio of outgoing radiance to incoming irradiance • This function changes based on: • Wavelength • Angle of light to surface • Angle of viewer from surface • For point or directional lights, we do not need differentials and can write the BRDF: f(l, v) = Lo(v) / EL(l)
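As a sketch of this definition (in Python rather than shader code, with invented function names), a Lambertian BRDF is just a constant, and the point/directional-light form multiplies it by the irradiance and the clamped cosine of the incidence angle:

```python
import math

def lambertian_brdf(albedo):
    # The simplest BRDF: a constant, independent of light and view angles.
    return albedo / math.pi

def outgoing_radiance(brdf_value, irradiance, cos_theta):
    # Point/directional form: Lo(v) = f(l, v) * EL * cos(theta),
    # with the cosine clamped so back-facing light contributes nothing.
    return brdf_value * irradiance * max(cos_theta, 0.0)

# Outgoing radiance for a 90%-reflective surface lit head-on
L = outgoing_radiance(lambertian_brdf(0.9), 5.0, 1.0)
```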
How is this different? • We've been talking about lighting models • Lambertian, specular, etc. • A BRDF is an attempt to model the physics slightly better • A big difference is that different wavelengths are absorbed and reflected differently by different materials • Rendering models in real time with (more) accurate BRDFs is still an open research problem
Spheres with different BRDFs • They also have global lighting (shadows and reflections) • Taken from www.kevinbeason.com
Revenge of the BRDF • The BRDF is supposed to account for all the light interactions we discussed in Chapter 5 (reflection and refraction) • We can see the similarity to the lighting equation from Chapter 5, now with a BRDF: Lo(v) = Σk f(lk, v) ⊗ ELk cos θik, summing over the light sources
When a BRDF isn't enough… • If the subsurface scattering effects are great, the size of the pixel may matter • Then, a bidirectional surface scattering reflectance distribution function (BSSRDF) is needed • Or if the surface characteristics change in different areas, you need a spatially varying BRDF • And so on…
Constraints on BRDFs • Helmholtz reciprocity: • f(l,v) = f(v,l) • Conservation of energy: • Outgoing energy cannot be greater than incoming energy • The simplest BRDF is Lambertian shading • We assume that energy is scattered equally in all directions • Integrating over the hemisphere gives a factor of π • Dividing by π gives us exactly what we saw before: f(l, v) = cdiff / π
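Both constraints can be checked numerically for the Lambertian case. The sketch below (Python, invented names) integrates f · cos θ over the hemisphere; the result is the albedo itself, which is why the division by π makes the BRDF energy-conserving:

```python
import math

def lambertian(albedo, l, v):
    # Constant in l and v, so f(l, v) == f(v, l): reciprocity holds trivially.
    return albedo / math.pi

def hemisphere_reflectance(brdf_value, n=1000):
    # Numerically integrate f * cos(theta) over the hemisphere:
    # 2*pi * integral from 0 to pi/2 of f * cos(theta) * sin(theta) d(theta)
    dtheta = (math.pi / 2) / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * dtheta  # midpoint rule
        total += brdf_value * math.cos(theta) * math.sin(theta) * dtheta
    return 2 * math.pi * total

rho = hemisphere_reflectance(lambertian(1.0, None, None))
```

rho comes out ≈ 1, so a pure white Lambertian surface reflects exactly the energy it receives; any albedo above 1 would violate conservation of energy.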
Texture mapping in a shader • We'll start with our specular shader for directional light and add textures to it
Texture • The texture (stolen, of course, from RB Whitaker's excellent site http://rbwhitaker.wikidot.com/) for the helicopter is quite simple:
Specular declarations • Here are the declarations we used for the specular shader float4x4 World; float4x4 View; float4x4 Projection; float4x4 WorldInverseTranspose; float3 CameraPosition; float4 AmbientColor = float4(1, 1, 1, 1); float AmbientIntensity = 0.1; float3 LightDirection = float3(1, 0, 0); float4 DiffuseColor = float4(1, 1, 1, 1); float DiffuseIntensity = 1.0; float Shininess = 200; float4 SpecularColor = float4(1, 1, 1, 1); float SpecularIntensity = 1;
Texturing additions • We add a texture variable called ModelTexture • We also add a sampler2D structure that specifies how to filter the texture texture ModelTexture; sampler2D textureSampler = sampler_state { Texture = (ModelTexture); MagFilter = Linear; MinFilter = Linear; AddressU = Clamp; AddressV = Clamp; };
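What the sampler state means can be sketched in plain Python (invented function names; the real filtering happens in hardware): Linear filtering blends the four nearest texels, and Clamp addressing snaps out-of-range coordinates to the edge:

```python
def sample_clamp_linear(texture, u, v):
    # texture: 2D list of grayscale texels, indexed texture[row][col];
    # (u, v) are texture coordinates in [0, 1].
    h, w = len(texture), len(texture[0])
    # AddressU/AddressV = Clamp: coordinates outside [0, 1] snap to the edge.
    x = min(max(u, 0.0), 1.0) * (w - 1)
    y = min(max(v, 0.0), 1.0) * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # MagFilter/MinFilter = Linear: blend the four nearest texels.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bot = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [0.0, 1.0]]  # black-to-white gradient texture
```

Sampling halfway between the two columns returns the blended value 0.5, and sampling at u = 2.0 clamps to the right edge.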
Texturing structures • We add a texture coordinate to the input and the output of the vertex shader struct VertexShaderInput { float4 Position : POSITION0; float4 Normal : NORMAL0; float2 TextureCoordinate : TEXCOORD0; }; struct VertexShaderOutput { float4 Position : POSITION0; float4 Color : COLOR0; float3 Normal : TEXCOORD0; float2 TextureCoordinate : TEXCOORD1; };
Texturing vertex shader • Almost nothing changes here except that we copy the input texture coordinate into the output VertexShaderOutput VertexShaderFunction(VertexShaderInput input) { VertexShaderOutput output; float4 worldPosition = mul(input.Position, World); float4 viewPosition = mul(worldPosition, View); output.Position = mul(viewPosition, Projection); float4 normal = normalize(mul(input.Normal, WorldInverseTranspose)); float lightIntensity = dot(normal, LightDirection); output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity); output.Normal = normal; output.TextureCoordinate = input.TextureCoordinate; return output; }
Texturing pixel shader • We have to pull the color from the texture and set its alpha to 1 • Then scale the components of the color by the texture color float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0 { float3 light = normalize(LightDirection); float3 normal = normalize(input.Normal); float3 r = normalize(2 * dot(light, normal) * normal - light); float3 v = normalize(mul(normalize(CameraPosition), World)); float dotProduct = dot(r, v); float4 specular = SpecularIntensity * SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color); float4 textureColor = tex2D(textureSampler, input.TextureCoordinate); textureColor.a = 1; return saturate(textureColor * (input.Color + AmbientColor * AmbientIntensity) + specular); }
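The final combine step in that pixel shader can be mirrored in a few lines of Python (invented helper names, one grayscale-per-channel pixel rather than a real render target): the texture color scales the diffuse-plus-ambient term, and the specular term is added on top before clamping:

```python
def saturate(x):
    # Mirror of HLSL saturate(): clamp to [0, 1].
    return min(max(x, 0.0), 1.0)

def shade_pixel(texture_rgb, diffuse, ambient_rgb, ambient_intensity, specular_rgb):
    # Mirrors the shader's return statement:
    # saturate(textureColor * (input.Color + AmbientColor * AmbientIntensity) + specular)
    return tuple(saturate(t * (diffuse + a * ambient_intensity) + s)
                 for t, a, s in zip(texture_rgb, ambient_rgb, specular_rgb))

# A colored texel, moderate diffuse lighting, white ambient, small specular term
color = shade_pixel((0.5, 0.2, 0.8), 0.6, (1.0, 1.0, 1.0), 0.1, (0.05, 0.05, 0.05))
```

Note that the specular term is added after the texture multiply, so highlights keep the light's color instead of being tinted by the texture.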
Updates to XNA • To use a texture, we naturally have to load a texture • As with other inputs to shaders, we have to set the texture parameter texture = Content.Load<Texture2D>("HelicopterTexture"); effect.Parameters["ModelTexture"].SetValue(texture);
Bump mapping in shaders • It's easiest to do bump mapping in XNA using a normal map • Of course, a normal map is hard to create by hand • What's more common is to create a height map and then use a tool for creating a normal map from it • xNormal is a free utility to do this • http://www.xnormal.net/Downloads.aspx
Height map to normal map • The conversion from a grayscale height map to a normal map looks like this
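The conversion a tool like xNormal performs can be sketched in Python (invented function name, grayscale height values in [0, 1]): take central differences to estimate the slope, tilt the normal against it, and normalize:

```python
import math

def height_to_normal(height, x, y, strength=1.0):
    # Central differences approximate the slope of the height field; the
    # normal tilts against the slope: (-dh/dx, -dh/dy, 1), then normalize.
    h, w = len(height), len(height[0])
    dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
    dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
    nx, ny, nz = -strength * dx, -strength * dy, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    # To store this in a normal-map texture, each component would then be
    # remapped from [-1, 1] to [0, 1] (which is why flat areas look light blue).
    return (nx / length, ny / length, nz / length)
```

A flat height map yields the straight-up normal (0, 0, 1); a ramp tilts the normal away from the rising direction.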
Bump map additions • We need another texture to hold the normal map and a parameter to decide how bumpy the surface is texture ModelTexture; sampler2D textureSampler = sampler_state { Texture = (ModelTexture); MagFilter = Linear; MinFilter = Linear; AddressU = Clamp; AddressV = Clamp; }; float BumpConstant = 1; texture NormalMap; sampler2D bumpSampler = sampler_state { Texture = (NormalMap); MinFilter = Linear; MagFilter = Linear; AddressU = Wrap; AddressV = Wrap; };
How does bump mapping work? • We have a normal to a surface, but there are also tangent directions • We call these the tangent and the binormal • Apparently serious mathematicians think it should be called the bitangent • The binormal is tangent to the surface and orthogonal to the other tangent • We distort the normal with weighted sums of the tangent and binormal (stored in our normal map)
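The distortion step can be sketched in Python (invented names; the normal-map sample is assumed already remapped from [0, 1] storage to signed offsets): the first two channels weight the tangent and binormal, the sum is added to the normal, and the result is renormalized:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def perturb_normal(n, t, b, bump, bump_constant=1.0):
    # Distort the normal with a weighted sum of tangent and binormal,
    # using the first two channels of the (remapped) normal-map sample.
    return normalize(tuple(
        ni + bump_constant * (bump[0] * ti + bump[1] * bi)
        for ni, ti, bi in zip(n, t, b)))

n = (0.0, 0.0, 1.0)  # surface normal
t = (1.0, 0.0, 0.0)  # tangent
b = (0.0, 1.0, 0.0)  # binormal (bitangent)
tilted = perturb_normal(n, t, b, (0.5, 0.0))  # tilt toward the tangent
```

A zero bump sample leaves the normal unchanged, and raising BumpConstant exaggerates the tilt, which is what makes the surface look bumpier.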
Bump map structures • The input has a tangent and a binormal added struct VertexShaderInput { float4 Position : POSITION0; float3 Normal : NORMAL0; float3 Tangent : TANGENT0; float3 Binormal : BINORMAL0; float2 TextureCoordinate : TEXCOORD0; };
Bump map structures continued • The output has a tangent and a binormal added struct VertexShaderOutput { float4 Position : POSITION0; float2 TextureCoordinate : TEXCOORD0; float3 Normal : TEXCOORD1; float3 Tangent : TEXCOORD2; float3 Binormal : TEXCOORD3; };
Bump map vertex shader • Here we transform the tangent and binormal and copy them into the output VertexShaderOutput VertexShaderFunction(VertexShaderInput input) { VertexShaderOutput output; float4 worldPosition = mul(input.Position, World); float4 viewPosition = mul(worldPosition, View); output.Position = mul(viewPosition, Projection); output.Normal = normalize(mul(input.Normal, WorldInverseTranspose)); output.Tangent = normalize(mul(input.Tangent, WorldInverseTranspose)); output.Binormal = normalize(mul(input.Binormal, WorldInverseTranspose)); output.TextureCoordinate = input.TextureCoordinate; return output; }
Bump map pixel shader float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0 { float3 bump = BumpConstant * (tex2D(bumpSampler, input.TextureCoordinate).rgb - float3(0.5, 0.5, 0.5)); float3 bumpNormal = input.Normal + (bump.x * input.Tangent + bump.y * input.Binormal); bumpNormal = normalize(bumpNormal); float diffuseIntensity = dot(normalize(LightDirection), bumpNormal); if(diffuseIntensity < 0) diffuseIntensity = 0; float3 light = normalize(LightDirection); float3 r = normalize(2 * dot(light, bumpNormal) * bumpNormal - light); float3 v = normalize(mul(normalize(CameraPosition), World)); float dotProduct = dot(r, v); float4 specular = SpecularIntensity * SpecularColor * max(pow(dotProduct, Shininess), 0) * diffuseIntensity; float4 textureColor = tex2D(textureSampler, input.TextureCoordinate); textureColor.a = 1; return saturate(textureColor * (diffuseIntensity + AmbientColor * AmbientIntensity) + specular); }
Updates to XNA • To use the normal map, we have to load the texture • As with the texture, we have to set the parameter • We also have to go into the properties of the model and tell it to generate tangent frames in the Content Processor • If you make your own shapes, you'll have to make your own tangents and binormals normalMap = Content.Load<Texture2D>("HelicopterNormalMap"); effect.Parameters["NormalMap"].SetValue(normalMap);
Next time… • Choosing BRDFs • Implementing BRDFs • Image-based approaches to sampling BRDFs
Reminders • Finish reading Chapter 7 • Finish Assignment 3 • Due tonight by midnight