
Shading, Surfaces & Textures



  1. Shading, Surfaces & Textures

  2. Outline • Polygon shading models • Constant, interpolated, polygon mesh, Lambert, Gouraud, Phong • Lighting models • Lambert, Phong, Blinn • Surface properties • Surface mapping • Texture • Bump • Displacement • …

  3. Lighting algorithms • Most algorithms use the polygon or surface normal N (the vector perpendicular to the polygon) • The amount of light reflected depends on the angle between the incident light direction L and the normal, e.g. for diffuse reflection (see the sketch below)
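A minimal sketch of the diffuse case, using NumPy for the vector arithmetic (the function name and the kd / light values are illustrative, not from the slides):

    import numpy as np

    def diffuse_intensity(normal, to_light, light_intensity=1.0, kd=0.8):
        # Lambertian diffuse term: proportional to cos(angle) = N . L,
        # clamped to zero when the surface faces away from the light.
        n = normal / np.linalg.norm(normal)
        l = to_light / np.linalg.norm(to_light)
        return kd * light_intensity * max(0.0, float(np.dot(n, l)))

    # Light arriving at 45 degrees to the surface normal.
    print(diffuse_intensity(np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 1.0, 1.0])))   # ~0.57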

  4. Constant shading • Also known as Lambert, faceted and flat shading • Calculates a single intensity value for each polygon • Valid if following assumptions are true: • Light source is at infinity (i.e. angle between light and surface is constant) • Viewer is at infinity (i.e. angle between viewer and surface is constant) • Polygon is not an approximation of a curved surface

  5. Interpolated shading • Shading is linearly interpolated across polygon • Faster than calculating each pixel, better than flat shading • Still assumes that polygon represents the true surface

  6. Polygon mesh shading • Many sets of polygons actually are approximations to curved surfaces • Shading the polygons to realistically simulate the surface is therefore important • Several methods exist to achieve this, the most important of which are Gouraud and Phong shading.

  7. Gouraud shading (Henri Gouraud, PhD 1971) • This is a form of intensity interpolation shading • Starts from knowing the normal of the surface at each vertex of the polygon • These may be stored with the mesh or may be averaged from the adjacent polygons • Next step is to calculate the vertex intensities • This will use the vertex normal and the illumination algorithm

  8. Gouraud shading (2) • The polygon is then shaded by: • Interpolating the intensities along each edge • Interpolating across the polygon between the edges along each scan line • (Figure: vertex intensities I1, I2, I3 and the interpolated intensity IP)

  9. Gouraud algorithm • Compute SA, SB, SC for triangle ABC • Si = shade of point i • For a scan line XY, compute SX and SY by interpolating along the edges, e.g. with tAB = |AX| / |AB|: • SX = (1 - tAB) * SA + tAB * SB • Compute SP by interpolating between SX and SY along the scan line • (Figure: scan line XY crossing triangle ABC, with P between X and Y)
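A minimal sketch of the interpolation step, assuming the edge and scan-line fractions (tAB etc.) have already been computed from the pixel position; all names are illustrative:

    def lerp(s0, s1, t):
        # Linear interpolation between two shade values, t in [0, 1].
        return (1.0 - t) * s0 + t * s1

    def gouraud_shade(sa, sb, sc, t_ab, t_ac, t_xy):
        # Shade at X on edge AB and at Y on edge AC, then at P on scan line XY.
        sx = lerp(sa, sb, t_ab)    # t_ab = |AX| / |AB|
        sy = lerp(sa, sc, t_ac)    # t_ac = |AY| / |AC|
        return lerp(sx, sy, t_xy)  # t_xy = |XP| / |XY|

    # Bright vertex A, darker B and C, sampled half-way along a scan line.
    print(gouraud_shade(1.0, 0.2, 0.4, 0.5, 0.5, 0.5))   # 0.65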

  10. Gouraud example

  11. Phong shading (Bui Tuong Phong, PhD 1975) • This is a form of normal-vector interpolation shading • Starts from the normal vector at each vertex again, but interpolates the vector rather than the intensity

  12. Phong shading (2) • In general, this yields much more realistic results, especially for specular surfaces • Gouraud shading may miss specular highlights • It is several orders of magnitude slower than Gouraud
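A minimal sketch of the difference from Gouraud: the normal, not the intensity, is interpolated and renormalised before the lighting model is evaluated at each pixel (only the diffuse term is shown; names are illustrative):

    import numpy as np

    def phong_interpolated_shade(n0, n1, t, to_light, kd=0.8):
        # Interpolate the vertex normals, renormalise, then evaluate
        # the lighting model per pixel.
        n = (1.0 - t) * n0 + t * n1
        n = n / np.linalg.norm(n)
        l = to_light / np.linalg.norm(to_light)
        return kd * max(0.0, float(np.dot(n, l)))

    # Halfway between two vertex normals tilted away from each other.
    print(phong_interpolated_shade(np.array([0.0, 0.5, 1.0]),
                                   np.array([0.0, -0.5, 1.0]),
                                   0.5,
                                   np.array([0.0, 0.0, 1.0])))   # 0.8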

  13. Problems with interpolated shading • Polygonal silhouette • Perspective distortion • Orientation dependence • Shared vertices • Unrepresentative vertex normals

  14. Polygonal silhouette • Approximation is polygonal, so no matter how good the shading algorithm the silhouette will be faceted • Can be improved by increasing the polygon count at expense of processing time

  15. Perspective distortion • Because the interpolation is based on the scan lines, the number of interpolation increments per polygon depends on the angle of the polygon to the viewer • Steep angle = few increments • Shallow angle = more increments

  16. Orientation dependence • Because interpolation occurs between vertices and along scan lines, result depends on orientation of polygon relative to scan lines

  17. Shared vertices • Problems occur when adjacent polygons do not share vertices • This can lead to discontinuities in shading

  18. Unrepresentative vertex normals • Vertices may not adequately represent the surface geometry • Usually can be solved by further subdividing the surface

  19. Surfaces • Definition of visual characteristics of a surface • Often defined by a ‘shader’ or ‘material’ • Specifies colour, how shiny it is, etc.

  20. Surface parameters • Colour is usually defined by RGB values or via a GUI • Ambient is a non-physical parameter that defines how ‘ambient’ light is treated • Diffuse parameter specifies how much light in total is evenly reflected • Specularity defines how shiny a surface is (a high value will give highlights like a billiard ball) • May also have controls for highlight size and colour
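As an illustration only (these are not Maya's actual attribute names), a material could be represented as a small record of such parameters:

    from dataclasses import dataclass

    @dataclass
    class Material:
        # Illustrative surface parameters; real packages expose many more.
        colour: tuple = (0.8, 0.6, 0.4)   # base RGB colour
        ambient: float = 0.2              # weight given to 'ambient' light
        diffuse: float = 0.8              # fraction of light reflected evenly
        specularity: float = 0.5          # strength of the highlight
        highlight_size: float = 20.0      # exponent controlling highlight size

    print(Material())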

  21. Advanced surface parameters • Reflectivity defines how much of the surrounding environment is reflected by the surface (like a mirror) • Transparency defines how much of the background is visible through the surface • Translucency defines how much light is transmitted through the surface • Refractivity defines how much light is bent as it enters and leaves the material

  22. Surfaces in Maya • Basic materials: • Lambert • Phong • Phong E • Blinn • Special materials: • Anisotropic, Layered Shader, Ocean Shader, Ramp Shader, Shading Map, Surface Shader, Use Background

  23. Lambert • This is the simplest material • It creates a matt surface with diffuse and ambient components but no highlights • The shading interpolates between adjacent polygon normals so the surface appears smooth • It uses Lambert’s cosine law

  24. Lambert’s Cosine Law • “The reflected luminous intensity in any direction from a perfectly diffusing surface varies as the cosine of the angle between the direction of incident light and the normal vector of the surface” Johann Lambert, 1728-1777.

  25. Lambert’s Cosine Law • Ideal diffuse surfaces obey this cosine law • Often called Lambertian surfaces • Id = kd · Iincident · cos θ = kd · Iincident (N·L), where kd is the diffuse reflectance of the material • kd is wavelength dependent, so it is usually specified as a colour

  26. Maya Lambert properties

  27. Phong (Bui Tuong Phong, 1975) • This lighting model includes specularity • It takes into account that the amount of light you see depends on the viewer’s angle to the surface as well as the light’s angle • His original formula for the specular term: W(i) · (cos s)^n • s is the angle between the view and specular reflection directions • “W(i) is a function which gives the ratio of the specular reflected light and the incident light as a function of the incident angle i.” • Ranges from 10 to 80 percent • “n is a power which models the specular reflected light for each material.” • Ranges from 1 to 10

  28. L N R V  Phong lighting model • More recent formulations are slightly different • Replace W(i) with a constant ks, independent of the incident direction • Is= ks Iincidentcosn = ks Iincident (V·R)n. • V is the view direction. • R is the specular reflection direction

  29. Variation of n

  30. Maya Phong properties

  31. Maya’s ‘Phong E’ • Uses a simplified model, faster to render than pure Phong

  32. Maya’s Blinn (Jim Blinn, 1977) • Based on Blinn-Phong shading (an adaptation of Phong) • Offers great control over specularity but in general will not look as ‘shiny’ as a Phong material
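Maya's Blinn material has its own controls, but the underlying Blinn-Phong idea can be sketched as follows: the reflection vector is replaced by the half-vector between the light and view directions (a sketch of the general technique, not Maya's implementation):

    import numpy as np

    def blinn_phong_specular(normal, to_light, to_viewer, ks=0.5, n=30, light=1.0):
        # Blinn-Phong uses the half-vector H = normalise(L + V);
        # the specular term is then (N . H)^n.
        N = normal / np.linalg.norm(normal)
        L = to_light / np.linalg.norm(to_light)
        V = to_viewer / np.linalg.norm(to_viewer)
        H = (L + V) / np.linalg.norm(L + V)
        return ks * light * max(0.0, float(np.dot(N, H))) ** n

    print(blinn_phong_specular(np.array([0.0, 0.0, 1.0]),
                               np.array([0.0, 1.0, 1.0]),
                               np.array([0.0, -0.8, 1.0])))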

  33. Maya Blinn properties II

  34. 2D texture mapping • A ‘cheap’ way of enhancing the surface definition • So far, surfaces have been defined with a plain colour • In real life, many surfaces are multi-coloured • E.g. wood looks ‘wooden’ because of the many different colour variations in the grain

  35. 2D texture mapping (2) • 2D texture mapping involves applying a 2D image to the surface of a 3D object • In this case the term ‘texture’ applies to the image rather than the more traditional ‘feel’ of the surface • Textures may be ‘real’ (e.g. scanned), manually generated (e.g. with a paint program) or procedurally generated (e.g. by a computer program)

  36. 2D texture mapping (3) • The process of applying the texture is the ‘mapping’ • The two mapping processes: • Projecting the image onto the surface • Stretching the image to fit on the surface

  37. Projection mapping • In projection mapping the image is projected through space • Wherever it ‘hits’ the surface the surface becomes the colour of the texture • The purest form of this is planar projection

  38. Cylindrical projection • Image is bent into a cylinder before projection takes place • Object is then placed ‘in’ the cylinder

  39. Spherical projection • Image bent onto an imaginary sphere before projection • Object placed ‘in’ sphere
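A minimal sketch of how cylindrical and spherical projections can assign texture coordinates to a surface point, assuming the projection axis is z and the object is centred at the origin (conventions vary between packages):

    import math

    def cylindrical_uv(x, y, z, height):
        # Wrap the image around an imaginary cylinder along the z axis:
        # u from the angle around the axis, v from the height.
        u = (math.atan2(y, x) + math.pi) / (2.0 * math.pi)
        v = z / height
        return u, v

    def spherical_uv(x, y, z):
        # Wrap the image onto an imaginary sphere centred at the origin.
        r = math.sqrt(x * x + y * y + z * z)
        u = (math.atan2(y, x) + math.pi) / (2.0 * math.pi)
        v = math.acos(z / r) / math.pi
        return u, v

    print(cylindrical_uv(1.0, 1.0, 0.5, 2.0))
    print(spherical_uv(1.0, 1.0, 0.5))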

  40. Limitations • Unless mapping onto a perfectly flat surface, cylinder or sphere, all projection methods will have a tendency to ‘streak’ • Parameterised texture mapping offers a solution to this problem

  41. Parameterised (UV) mapping • Imagine printing the texture image onto thin transparent film • Then stretch the film over the surface • This is the principle of parameterised texture mapping

  42. Parameterised mapping (2) • The 2D texture image is (obviously) an array of rectangular pixels • These can be referenced by Cartesian coordinates, usually with the lower left pixel being (0,0) (and for a 512 x 512 image the top right being (511,511)) • Each pixel rectangle is then mapped to a corresponding area of the 3D surface

  43. Parameterised mapping (3) • You therefore have to divide the surface into the same number of steps • The surface patch is defined to use the coordinate system U and V to specify locations on it • The area defined by (0,0) on the surface has the colour applied to it from the image pixel at (0,0)
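A minimal sketch of the lookup from (u, v) to a pixel colour, assuming the image is stored as rows from top to bottom and sampled to the nearest pixel (no filtering):

    def sample_texture(image, u, v):
        # Map (u, v) in [0, 1] x [0, 1] to a pixel of a width x height image,
        # with (0, 0) at the lower-left corner as described above.
        height = len(image)
        width = len(image[0])
        col = min(int(u * width), width - 1)
        row = min(int(v * height), height - 1)
        return image[height - 1 - row][col]   # rows stored top-to-bottom

    # 2 x 2 image: black bottom row, white top row.
    image = [[(255, 255, 255), (255, 255, 255)],
             [(0, 0, 0), (0, 0, 0)]]
    print(sample_texture(image, 0.25, 0.75))   # upper-left quadrant -> white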

  44. Procedural textures • Textures can be generated by procedures rather than using predefined images • This is good for regular patterns (e.g. brickwork) or (semi-)random patterns (e.g. noise)
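A minimal example of a procedural texture: a checkerboard whose colour is computed directly from the (u, v) coordinates rather than read from a stored image:

    def checker(u, v, squares=8):
        # Alternate between white and black squares across the UV range.
        if (int(u * squares) + int(v * squares)) % 2 == 0:
            return (255, 255, 255)
        return (0, 0, 0)

    print(checker(0.10, 0.10))   # white square
    print(checker(0.10, 0.20))   # black square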
