Photo-realistic Rendering and Global Illumination in Computer Graphics, Spring 2010: Stochastic Radiosity. K. H. Ko, Department of Mechatronics, Gwangju Institute of Science and Technology
Hybrid Algorithms • Ray tracing and radiosity are the two most popular global illumination algorithms. • A ray-tracing algorithm computes radiance values for every pixel in the final image by generating paths between the pixel and the light sources. • A radiosity algorithm computes a radiance value for every mesh element in the scene, after which this solution is displayed using any method that can project polygons to the screen. • Hybrid algorithms try to combine the best of both worlds.
Final Gathering • Once a radiosity solution has been computed, an image of the scene is generated, often using Gouraud shading to interpolate between radiance values at the vertices of the mesh and thus obtain a smoothly shaded image. • This technique can miss significant shading features. • It is often difficult to generate accurate shadows. • Shadows may creep under surfaces. • Mach band effects may occur. • Other secondary illumination effects containing features with a frequency higher than the mesh can represent are also possible. • One way of solving this is to treat the radiosity solution as a coarse precomputed approximation of the light distribution in the scene.
Final Gathering • During a second phase, when the image is actually generated, a more accurate per-pixel illumination value is computed, based on the ray-tracing algorithm. • The ray-tracing set-up computes, for each pixel, the radiance of the surface point visible through that pixel.
Final Gathering • L(p->eye) equals L(x->Θ) with x being the visible point in the scene and Θ the direction from x towards the eye. • Suppose we have a precomputed radiance solution in a diffuse scene, given by L(y) for every surface point y. • We can then acquire the value of L(x->Θ) by writing the rendering equation, approximating the radiance distribution in the kernel of the transport equation by L(y).
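In the notation of these slides (assuming y = r(x,Ψ) denotes the surface point seen from x in direction Ψ, and L(y) the precomputed radiosity solution), this gather equation can be sketched as

$$ L(x \to \Theta) \;\approx\; L_e(x \to \Theta) + \int_{\Omega_x} f_r(x, \Theta \leftrightarrow \Psi)\, L\big(r(x,\Psi)\big)\, \cos(N_x, \Psi)\, d\omega_\Psi $$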
Final Gathering • This integral can now be evaluated using Monte Carlo integration. • The main difference from the stochastic ray-tracing algorithm is that there is no recursive evaluation of the radiance distribution, since it is substituted by the precomputed radiosity solution. • Thus one gains the accuracy of a per-pixel method while exploiting a fast precomputed finite-element solution.
Final Gathering • Various sampling strategies can be used to evaluate either of the two integrals. • In a diffuse scene, with a constant radiance value Lj for each surface element j, the equation can also be rewritten as a sum over surface elements, as sketched below.
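Assuming a diffuse BRDF f_r(x) = ρ(x)/π and writing F(x, S_j) for the point-to-patch form factor between x and surface element j (notation assumed here), the rewritten equation can be sketched as

$$ L(x \to \Theta) \;\approx\; L_e(x \to \Theta) + \rho(x) \sum_j L_j\, F(x, S_j), \qquad F(x, S_j) = \int_{S_j} \frac{\cos(N_x, \Theta_{xy})\,\cos(N_y, \Theta_{yx})}{\pi\, r_{xy}^2}\, V(x,y)\, dA_y $$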
Final Gathering • Simple Hemisphere Sampling • The most straightforward approach is to sample random directions over the hemisphere and evaluate L at the nearest intersection point. • It is very similar to simple stochastic ray tracing and will result in a lot of noise in the final image. • Light sources are easily missed when just randomly sampling the hemisphere. • Splitting the integral into a direct and an indirect term is a good approach for increasing the accuracy.
Final Gathering • Importance Sampling • Construct a probability density function that matches the kernel of the integral as closely as possible. • Since we have a precomputed solution, it can be used to sample surface elements and directions to bright areas in the scene. • Depending on the radiosity algorithm used, the following data may be available to construct a PDF: • Average radiance value for each surface element. • The form factors between surface elements.
Final Gathering • Importance Sampling • An importance sampling procedure can be constructed by first selecting a surface element, and then sampling a surface point within that surface element. • The probability of picking surface element j should be proportional to its contribution to the gather integral at x, with surface element i denoting the element containing point x.
Final Gathering • Importance Sampling • An importance sampling procedure can be constructed by first selecting a surface element, and then sampling a surface point within that surface element. • Thus, each surface element j is assigned a selection probability, sketched below.
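With F_ij denoting the patch-to-patch form factor between element i (containing x) and element j, available from the radiosity pass, a plausible sketch of this selection probability is

$$ q_j \;=\; \frac{L_j\, F_{ij}}{\sum_k L_k\, F_{ik}} \;\propto\; L_j\, F_{ij} $$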
Final Gathering • Importance Sampling • The second step then involves evaluating the remaining integral over the surface element j selected in Step 1, sketched below.
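In the notation used above, the quantity to estimate for the selected element j is its term in the gather sum (a sketch, assuming the point-to-patch form factor F(x, S_j) introduced earlier):

$$ \rho(x)\, L_j\, F(x, S_j) \;=\; \rho(x)\, L_j \int_{S_j} \frac{\cos(N_x, \Theta_{xy})\,\cos(N_y, \Theta_{yx})}{\pi\, r_{xy}^2}\, V(x,y)\, dA_y $$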
Final Gathering • Importance Sampling • Several methods for evaluating this integral are possible. • Choosing a sample point y with uniform probability 1/Aj on surface element j. The total estimator is then given below.
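Dividing the sampled integrand by the element-selection probability q_j and the uniform area pdf 1/A_j gives, as a sketch in the assumed notation,

$$ \langle L(x \to \Theta) \rangle \;=\; \frac{\rho(x)\, L_j\, A_j}{q_j}\; \frac{\cos(N_x, \Theta_{xy})\,\cos(N_y, \Theta_{yx})}{\pi\, r_{xy}^2}\; V(x,y) $$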
Final Gathering • Importance Sampling • Several methods for evaluating this integral are possible. • An algorithm is available to sample a random direction with uniform probability 1/Ωj on a spherical triangle Ωj. This sampling procedure can be used to sample a surface point y by first selecting a direction Θx ∈ Ωj; y is the point on surface element j along Θx. The total estimator is then given below.
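For uniform sampling of directions over the solid angle Ω_j subtended by element j, a sketch of the estimator is (the visibility term is 1 only when the sampled direction actually reaches element j first):

$$ \langle L(x \to \Theta) \rangle \;=\; \frac{\rho(x)\, L_j\, \Omega_j}{\pi\, q_j}\; \cos(N_x, \Theta_x)\; V(x, y) $$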
Final Gathering • Importance Sampling • Several methods for evaluating this integral are possible. • The cosine factor cos(Nx,Θx) can be taken into account as well by using rejection sampling. • A direction is sampled in a bounding region on the hemisphere. • The bounding region needs to be chosen such that sampling according to a cosine distribution is possible. • If the sampled direction falls outside Ωj, the estimator evaluates to 0. • Alternatively, one can keep generating samples until one is accepted.
Final Gathering • Importance Sampling • Several methods for evaluating this integral are possible. • When surface element j is fully visible from point x, the point-to-surface form factor can be computed analytically, and thus no Monte Carlo sampling is needed.
Multipass Methods • A multipass method uses various algorithms (finite-element-based, image-based) and combines them into a single image-generation algorithm. • Care has to be taken that light transport components are not counted twice since this would introduce errors in the image. • At the same time, all possible light transport modes need to be covered by at least one pass. • A good multipass algorithm tries to exploit the various advantages of the different individual passes.
Multipass Methods • Regular Expressions • Regular expressions are often used to express which light transport modes are covered by which pass. • Notations • L: One of the light sources in the scene. • D: A diffuse reflection component of the BRDF. • G: A semidiffuse or glossy reflection component of the BRDF. • S: A perfect specular component of the BRDF. • E: The eye or virtual camera. • LD+E. • A light transport path between a light source and the camera, only reflecting at diffuse surfaces • D+ indicates the path bounces off of at least one diffuse surface.
Multipass Methods • Regular Expressions • LDSE • A diffuse surface, reflected in a visible specular material, would be described by the path of type LDSE. • L(D|G|S)*E • All possible paths in the scene. • * indicates zero or more reflections. • Algorithms can now be characterized by describing what light transport paths they cover. • Radiosity algorithms cover all paths of type LD*E, or all diffuse bounces. • A classic ray-tracing algorithm, stopping the recursion of reflected rays at nonspecular surfaces, covers all paths of type LD0…1(G|S)E, with D0…1 indicating 0 to 1 reflections at a diffuse surface.
Multipass Methods • Construction of a Multipass Algorithm • A multipass algorithm usually starts with one or more object-space methods, which store a partial approximation of the light transport in the scene. • A radiosity method might only store the diffuse light interactions and might ignore all other types of light transport. • The image-space algorithms compute radiance values per pixel, but they rely on the partially computed and stored light transport approximations of the previous passes. • To access these stored solutions, they need a read-out strategy. • This read-out strategy might itself include some computations or interpolations. • It is determined by the nature of the stored partial solution. • It also determines the nature of the paths that are covered by the image-space pass.
Multipass Methods • Construction of a Multipass Algorithm • Some typical read-out strategies include: • Direct visualization of the stored solution • For each pixel, the stored light transport solution is accessed directly and the resulting value attributed to the pixel. • Radiosity solutions are often displayed this way. • Final gathering • The final gathering method reconstructs the incoming radiance values over the hemisphere for each point visible through the pixel. • These radiance values are read from the stored radiance solution.
Multipass Methods • Construction of a Multipass Algorithm • Some typical read-out strategies include: • Recursive stochastic ray tracing • A recursive ray-tracing algorithm is used as a read-out strategy, but rays are only reflected at surfaces using the reflection components that are not covered by the object-space pass. • Ex: If the first pass stores a radiosity solution, covering all paths of type LD*, then the recursive ray-tracing pass only reflects rays at G or S surfaces. • At each D surface, the stored value in the precomputed solution is read out and incorporated in the estimator at that reflection point. • Thus, the covered paths are of type LD*(G|S)*E.
Multipass Methods • Most multipass strategies make sure that the light transport paths covered in the different passes do not overlap. • Otherwise, some light transport might be counted twice and the resulting image will look too bright in some parts of the scene. • Every pass of the multipass algorithm covers distinct, separate types of light transport.
Multipass Methods • An alternative approach is to have some overlap between the different passes, but weigh them appropriately. • The problem is now to find the right weighting heuristics such that the strengths of each individual pass are used in the optimal way. • A very good strategy assigns weights to the different types of paths in each pass based on the respective probability density functions for generating these paths. • Thus, caustic effects might predominantly use their results from a bidirectional ray-tracing pass, while direct illumination effects might originate mostly from a ray-tracing or radiosity pass.
Multipass Methods • A total of three passes are used. • First, a radiosity solution is computed, which is subsequently enhanced by a stochastic ray tracer.
Multipass Methods • A total of three passes are used. • A third pass involves a bidirectional ray tracer, which generates paths of the same type but with different probabilities due to the nature of the sampling process.
Bidirectional Tracing • Ray tracing traces paths through the scene starting at a surface point visible through a pixel; these paths eventually end at the light sources. • Light tracing, another path-tracing algorithm, does the opposite: paths start at the light sources and end up in the relevant pixels. • Bidirectional ray tracing combines both approaches in a single algorithm and can be viewed as a two-pass algorithm in which both passes are tightly intertwined. • Bidirectional ray tracing generates paths starting at the light sources and at the surface point simultaneously and connects both paths in the middle to find a contribution to the light transport between the light source and the point for which a radiance value needs to be computed.
Bidirectional Tracing • Bidirectional tracing combines the specific advantages of ray tracing as well as light tracing. • Bidirectional path tracing is one of the few algorithms that start from the formulation of the global reflection distribution function (GRDF). • The flux Φ(S) is given below.
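Schematically, and with the exact measure conventions depending on how the GRDF G_r is defined, the flux couples self-emitted radiance L_e to self-emitted importance W_e:

$$ \Phi(S) \;=\; \int_A \int_{\Omega} \int_A \int_{\Omega} L_e(x \to \Theta)\; G_r(x \to \Theta \leftrightarrow y \gets \Psi)\; W_e(y \gets \Psi)\; d\omega_\Psi\, dA_y\, d\omega_\Theta\, dA_x $$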
Bidirectional Tracing • The core idea of the algorithm is that two different path generators are available when computing a Monte Carlo estimate for the flux through a certain pixel: • An eye path is traced starting at a sampled surface point y0 visible through the pixel. • By generating a path of length k, the path consists of a series of surface points y0, y1, …, yk. • The length of the path is controlled by Russian roulette. • The probability of generating this path is the product of the individual PDF values of generating each successive point along the path. • Similarly, a light path of length l is generated starting at the light source. This path, x0, x1, …, xl, also has its own probability density.
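As a sketch, writing p(· | ·) for the conditional pdf of generating the next path vertex, the two path pdfs are products of the individual sampling decisions:

$$ p_E(\bar{y}) = p(y_0) \prod_{i=1}^{k} p(y_i \mid y_{i-1}), \qquad p_L(\bar{x}) = p(x_0) \prod_{i=1}^{l} p(x_i \mid x_{i-1}) $$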
Bidirectional Tracing • By connecting the endpoint yk of the eye path with the endpoint xl of the light path, a total path of length k+l+1 between the importance source S and the light sources is obtained. • The probability density function for this path is the product of the individual PDFs of the light and eye paths. • An estimator for Φ(S) using this single path is given below.
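A rough sketch of this estimator, with C_E and C_L denoting the BRDF and geometry factors accumulated along the eye and light subpaths, and G and V the geometry and visibility terms of the connecting segment (these symbols are assumptions):

$$ \langle \Phi(S) \rangle \;=\; \frac{C_E(\bar{y})\; f_r(y_k)\; G(y_k, x_l)\, V(y_k, x_l)\; f_r(x_l)\; C_L(\bar{x})}{p_E(\bar{y})\; p_L(\bar{x})} $$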
Bidirectional Tracing • Stochastic ray tracing and light tracing are special cases of bidirectional ray tracing. • When tracing a shadow ray in stochastic ray tracing, we actually generate a light path of length 0, which is connected to an eye path. • Example • A path of length 3 could be generated by a light path of length 2 and an eye path of length 0. • Or by a light path of length 1 and an eye path of length 1 • Or by a light path of length 0 and an eye path of length 2.
Bidirectional Tracing • Some light distribution effects are better generated using either light paths or eye paths. • When rendering a specular reflection that is visible in the image, it is better to generate those specular bounces in the eye path. • The specular reflections in caustics are better generated in the light path.
Photon Mapping • Photon mapping is a practical two-pass algorithm that traces illumination paths both from the lights and from the viewpoint. • However, unlike bidirectional path tracing, this approach caches and reuses illumination values in a scene for efficiency. • In the first pass, “photons” are traced from the light sources into the scene. • These photons, which carry flux information, are cached in a data structure, called the photon map. • In the second pass, an image is rendered using the information stored in the photon map.
Photon Mapping • Photon mapping decouples photon storage from surface parameterization. • This representation enables it to handle arbitrary geometry, including procedural geometry, thus increasing the practical utility of the algorithm. • It is also not prone to meshing artifacts. • By tracing or storing only particular types of photons, it is possible to build specialized photon maps tailored to a particular effect. • The caustic map, for example, is designed to capture photons that interact with one or more specular surfaces before reaching a diffuse surface. • By explicitly capturing caustic paths in a caustic map, the photon mapping technique can find caustics efficiently.
Photon Mapping • Photon mapping is a biased technique. • The bias is the potentially nonzero difference between the expected value of the estimator and the actual value of the integral being computed. • Since photon maps are typically not used directly, but rather to compute indirect illumination, increasing the number of photons eliminates most artifacts.
Photon Mapping • Tracing Photons: Pass 1 • The use of compact, point-based “photons” to propagate flux through the scene is key in making photon mapping efficient. • Photons are traced from the light sources and propagated through the scene just as rays are in ray tracing. • They are reflected, transmitted, or absorbed. • Russian roulette and the standard Monte Carlo sampling techniques are used to propagate photons. • When the photons hit nonspecular surfaces, they are stored in a global data structure called the photon map. • To facilitate efficient searches for photons, a balanced kd-tree is used to implement this data structure.
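As a sketch of the Russian roulette step (assuming ρ_surf denotes the surface reflectance and ρ, e.g., its average or maximum component):

$$ P[\text{photon is reflected}] = \rho, \qquad \Phi_{\text{refl}} = \Phi_{\text{inc}}\, \frac{\rho_{\text{surf}}}{\rho} $$

so that photon powers stay roughly constant while the expected propagated flux remains correct.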
Photon Mapping • Tracing Photons: Pass 1 • Photon mapping can be efficient for computing caustics. • The reflected radiance at each point in the scene can be computed from the photon map. • The photon map represents incoming flux at each point in the scene. Therefore, the photon density at a point estimates the irradiance at that point. • The reflected radiance at a point can then be computed by multiplying the irradiance by the surface BRDF.
Photon Mapping • Tracing Photons: Pass 1 • To compute the photon density at a point, the n closest photons to that point are found in the photon map. • This search is done efficiently using the balanced kd-tree storing the photons. • The photon density is then computed by adding the flux of these n photons and dividing by the projected area of the sphere containing them. • Thus, the reflected radiance at the point x in the direction ω is given below.
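In standard photon-mapping notation (ΔΦ_p the flux carried by photon p, ω_p its incoming direction, and r the radius of the smallest sphere around x containing the n photons), this radiance estimate is

$$ L_r(x, \omega) \;\approx\; \sum_{p=1}^{n} f_r(x, \omega_p, \omega)\; \frac{\Delta\Phi_p(x, \omega_p)}{\pi r^2} $$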
Photon Mapping • Computing Images: Pass 2 • The simplest use of the photon map would be to display the reflected radiance values computed in Pass 1 for each visible point in an image. • However, unless the number of photons used is extremely large, this display approach can cause significant blurring of radiance, resulting in poor image quality. • Instead, photon maps are more effective when integrated with a ray tracer that computes direct illumination and queries the photon map only after one diffuse or glossy bounce from the viewpoint has been traced through the scene.
Photon Mapping • Computing Images: Pass 2 • The final rendering of images could be done as follows: • Rays are traced through each pixel to find the closest visible surface. • The radiance for a visible point is split into direct illumination, specular or glossy illumination, illumination due to caustics, and the remaining indirect illumination. • Direct illumination for visible surfaces is computed using regular Monte Carlo sampling. • Specular reflections and transmissions are ray traced. • Caustics are computed using the caustic photon map. Since caustics occur only in a few parts of the scene, they are computed at a higher resolution to permit direct high-quality display.
Photon Mapping • Computing Images: Pass 2 • The final rendering of images could be done as follows: • The remaining indirect illumination is computed by sampling the hemisphere; the global photon map is used to compute radiance at the surfaces that are not directly visible. This extra level of indirection decreases visual artifacts.