Week 9 - Monday CS361
Last time • What did we talk about last time? • BRDFs • Texture mapping and bump mapping in shaders
Fresnel reflectance • Fresnel reflectance is an ideal mathematical description of how perfectly smooth materials reflect light • The angle of reflection is the same as the angle of incidence: θr = θi • The transmitted (visible) radiance Lt depends on the Fresnel reflectance RF(θi) and on the angle of refraction of light into the material: the fraction 1 − RF(θi) of the incoming light is transmitted rather than reflected
Snell's Law • The angle of refraction θt into the material is related to the angle of incidence θi and the refractive indexes of the material below the interface (n2) and above the interface (n1): n1 sin θi = n2 sin θt • We can combine this identity with the previous slide's equation, expressing everything in terms of θi: sin θt = (n1 / n2) sin θi • A quick numerical sketch follows
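Here is a small numerical check of Snell's law as a Python sketch; the function name and the air/glass indices of roughly 1.0 and 1.5 are my own illustrative choices, not values from the slides:

```python
import math

def refraction_angle(theta_i_deg, n1=1.0, n2=1.5):
    """Snell's law: n1 * sin(theta_i) = n2 * sin(theta_t); returns theta_t in degrees."""
    sin_t = (n1 / n2) * math.sin(math.radians(theta_i_deg))
    if abs(sin_t) > 1.0:
        return None  # no transmitted ray: total internal reflection
    return math.degrees(math.asin(sin_t))

print(refraction_angle(45.0))                  # air -> glass: bends toward the normal (~28 degrees)
print(refraction_angle(45.0, n1=1.5, n2=1.0))  # glass -> air at 45 degrees: None (total internal reflection)
```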
External reflection • Reflectance is obviously dependent on angle • Perpendicular incidence (0°) gives essentially the specular color of the material • Reflectance increases at higher angles of incidence • The function RF(θi) also depends on the material (and on the light color)
Approximating reflection • Because the true Fresnel equations are non-linear, Schlick gives an approximation that works for most substances: RF(θi) ≈ RF(0°) + (1 − RF(0°))(1 − cos θi)^5 • We can use a table of RF(0°) values for common materials • A small sketch of this approximation follows
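A minimal Python sketch of the Schlick approximation; the RF(0°) value of 0.04 is a typical ballpark figure for a glass-like dielectric, not a number taken from the course's table:

```python
def schlick_fresnel(cos_theta_i, rf0):
    """Schlick's approximation: RF(theta_i) ~ RF(0) + (1 - RF(0)) * (1 - cos theta_i)^5."""
    return rf0 + (1.0 - rf0) * (1.0 - cos_theta_i) ** 5

print(schlick_fresnel(1.0, 0.04))  # head-on: just the RF(0) specular value
print(schlick_fresnel(0.1, 0.04))  # near-grazing: reflectance climbs toward 1 (~0.61)
```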
Internal reflection • External reflection needs to be modeled more often than internal reflection • Modeling internal reflection is the same except that light traveling from the optically denser medium can undergo total internal reflection beyond a critical angle
Diffuse light • Usually is not as complex as specular light • We can measure a value ρ that gives the ratio between light escaping a surface relative to light entering a surface • ρ is called the scattering albedo • Because of conservation of energy, the more light that is reflected through Fresnel reflection, the less there is left to be reflected diffusely • Thus, a simple approximation for diffuse light is to scale the diffuse color by 1 − RF(θi) (a sketch follows)
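A small sketch of that energy trade-off, reusing the Schlick approximation from the previous slide; scaling the diffuse color by 1 − RF(θi) is one simple way to keep the total reflected energy plausible, and the function name and albedo values are illustrative:

```python
def diffuse_color(albedo, cos_theta_i, rf0=0.04):
    """Scale the diffuse albedo down by whatever fraction was reflected specularly."""
    rf = rf0 + (1.0 - rf0) * (1.0 - cos_theta_i) ** 5  # Schlick again
    return [(1.0 - rf) * c for c in albedo]

print(diffuse_color([0.5, 0.2, 0.2], cos_theta_i=1.0))  # head-on: nearly the full albedo
print(diffuse_color([0.5, 0.2, 0.2], cos_theta_i=0.1))  # grazing: much less is left for diffuse
```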
Microgeometry • The cause of many lighting effects is microgeometry • The smoother the surface, the tighter (and brighter) the reflections are
Weird effects • Glancing angles can minimize the impact of surface roughness, making even rough surfaces reflective at very high angles • Most surfaces are isotropic (their roughness has no preferred direction) • Anisotropic surfaces like brushed metal have directional blurring
Where do BRDFs come from? • The book gives a number of BRDF equations • It is also possible to sample materials (from every angle, at every color of light) to measure a BRDF of your own • Once you've got such a model, how do you implement it?
Implementation • The shader will use the following equation, summed over the light sources: Lo(v) = Σk f(lk, v) ⊗ ELk cos θik • The cosine term is found with the dot product n · lk (clamped to zero) • Most BRDFs contain a 1/π term • Many systems pre-divide EL by π • Make sure you don't double divide (or double multiply) • If some value is computed repeatedly, consider putting it in a texture for lookup • Mipmapping may not work for non-linear BRDFs • A CPU-side sketch of the summation appears below
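A CPU-side Python sketch of that summation, assuming a plain Lambertian BRDF (albedo / π) and an ad hoc light structure; this is not XNA or HLSL code, just an illustration of where the cosine clamp and the 1/π belong:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, view, lights, albedo):
    """Sum BRDF * EL * cos(theta_i) over the lights, per color channel.

    The BRDF here is plain Lambertian diffuse (albedo / pi), so the view
    vector goes unused; a specular BRDF would need it.
    """
    out = [0.0, 0.0, 0.0]
    for light in lights:
        cos_theta = max(0.0, dot(normal, light["direction"]))  # clamp back-facing lights
        for c in range(3):
            brdf = albedo[c] / math.pi  # the 1/pi lives here; don't also fold it into EL
            out[c] += brdf * light["EL"][c] * cos_theta
    return out

# One white light shining straight down onto an upward-facing surface:
lights = [{"direction": [0.0, 1.0, 0.0], "EL": [math.pi, math.pi, math.pi]}]
print(shade([0.0, 1.0, 0.0], [0.0, 0.0, 1.0], lights, [0.5, 0.5, 0.5]))  # ~[0.5, 0.5, 0.5]
```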
Optimizations • It may be expensive to compute the shading based on all the light sources • Also, XNA and other APIs (and various graphics cards) limit the number of light sources • Some lights must be averaged into each other for performance reasons (one possible merging scheme is sketched below)
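One way that averaging might look, sketched in Python for directional lights; the weighting scheme is an assumption for illustration, not something prescribed by XNA or the book:

```python
def merge_lights(lights):
    """Collapse several directional lights into one approximate light.

    The merged EL is the sum (so total incoming energy is roughly preserved),
    and the merged direction is an intensity-weighted average, renormalized.
    """
    total = [0.0, 0.0, 0.0]
    direction = [0.0, 0.0, 0.0]
    for light in lights:
        weight = sum(light["EL"])  # crude scalar brightness of this light
        for c in range(3):
            total[c] += light["EL"][c]
            direction[c] += weight * light["direction"][c]
    length = sum(d * d for d in direction) ** 0.5 or 1.0  # avoid dividing by zero
    return {"direction": [d / length for d in direction], "EL": total}

dim_lights = [{"direction": [0.0, 1.0, 0.0], "EL": [0.2, 0.2, 0.2]},
              {"direction": [1.0, 0.0, 0.0], "EL": [0.1, 0.1, 0.1]}]
print(merge_lights(dim_lights))  # one light, pointed mostly along the brighter original
```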
Deferred shading • Shading is usually done at the same time as z-buffer testing • With deferred shading, it's possible to do all the z-buffer testing first and then go back and shade only those fragments that contribute to the final scene (a rough two-pass sketch follows)
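A rough Python sketch of the two-pass idea, reusing the shade function sketched earlier; the fragment and G-buffer structures are invented for illustration, and a real deferred renderer keeps the G-buffer in GPU render targets:

```python
def geometry_pass(fragments, width, height):
    """Pass 1: z-buffer testing only; remember the surface data of the nearest fragment."""
    depth = [[float("inf")] * width for _ in range(height)]
    gbuffer = [[None] * width for _ in range(height)]
    for frag in fragments:  # each frag: {"x", "y", "z", "normal", "albedo"}
        x, y = frag["x"], frag["y"]
        if frag["z"] < depth[y][x]:
            depth[y][x] = frag["z"]
            gbuffer[y][x] = frag
    return gbuffer

def shading_pass(gbuffer, lights, shade):
    """Pass 2: shade each surviving pixel exactly once, using the stored surface data."""
    return [[shade(f["normal"], None, lights, f["albedo"]) if f else [0.0, 0.0, 0.0]
             for f in row]
            for row in gbuffer]
```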
Image based rendering • A great deal of graphics research deals with rendering real scenes • Don't cameras do that? • Sure, but these graphics guys couldn't publish papers if the stuff wasn't hard for some reason: • Reconstructing novel viewpoints • Walkthroughs with user controlled paths • Introducing synthetic objects into real scenes • Re-lighting real scenes with new light sources • I would be remiss if I didn't mention these topics even though they usually have nothing to do with video games and often cannot be rendered in real time
Plenoptic function • A central idea in image-based rendering is the plenoptic function, sometimes called a light field • The basic plenoptic function is P(x, y, z, θ, φ), and its result is a color • In other words, it tells you the color you would see if you were at the point (x, y, z) and looked in the direction given by the angles θ and φ • There are also more complicated plenoptic functions that take into account time, wavelength, and more • A toy lookup-table illustration follows
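A toy illustration of the five-parameter plenoptic function as a lookup table, assuming colors have somehow been sampled onto a coarse grid; the class name and discretization are purely illustrative:

```python
import math

class SampledPlenoptic:
    """Nearest-neighbor lookup into a coarsely sampled plenoptic function.

    samples maps (ix, iy, iz, itheta, iphi) grid indices to an (r, g, b) color.
    """
    def __init__(self, samples, position_step=1.0, angle_step=math.pi / 8):
        self.samples = samples
        self.position_step = position_step
        self.angle_step = angle_step

    def color(self, x, y, z, theta, phi):
        key = (round(x / self.position_step), round(y / self.position_step),
               round(z / self.position_step),
               round(theta / self.angle_step), round(phi / self.angle_step))
        return self.samples.get(key, (0, 0, 0))  # black where nothing was sampled

field = SampledPlenoptic({(0, 0, 0, 0, 0): (255, 128, 0)})
print(field.color(0.2, -0.1, 0.3, 0.1, -0.05))  # snaps to the stored nearby sample
```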
Sea of Images • Although the research is old now, Daniel Aliaga et al. produced an impressive system for recreating real scenes in real time, with the user controlling the path he or she takes • A robot records thousands and thousands of omnidirectional images, along with its location when each one is taken • Then images are merged together to create a novel view for the current location and orientation
Sea of Images issues • Rendering the images in real time isn't hard • Knowing the robot's position for all images is surprisingly difficult • Storing and loading the next images that will be needed in reconstruction is a huge caching and compression problem • Getting the robot to walk around and scan a scene automatically ended up being too hard • Some of these ideas were used for Google Street View, which is neither real time nor able to show arbitrary locations
Image based lighting • Synthetic objects can be rendered using a BRDF based on measurements of real-world materials • Alternatively, we could sample a real-world object from many different directions and get enough information to re-light it • You can also capture lighting from the real world using a mirrored ball • Then you can re-light: • A real image with a different set of real lights • Synthetic objects with realistic real light
Next time… • Area lighting • Environment mapping
Reminders • Start working on Project 3 • Due April 5 • Keep reading Chapter 8