KIPA Game Engine Seminars Day 4 Jonathan Blow Seoul, Korea November 29, 2002
High-Level Networking (continued) • Review of yesterday…
Deciding What To Transmit • Limited bandwidth into which all entity updates must fit • Apportion it out into slices somehow? • Can we do this without adding latency? A hard problem! • Need to accept that the client won’t be perfectly updated about everything; it is always a little bit wrong • Approach network communication as an error-minimization problem
Deciding What To Transmit • Need a metric for the amount of error between an object’s state on the client and on the server • Position, orientation, state variables • Probably attenuated by distance to deal with viewpoint issues! • What about a rocket with a contrail attached? • Record the world state that you sent to the client, and diff it against the current state • A lot of memory! • A lot of CPU!
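A minimal sketch of such a metric (my own illustration; the names, weights, and falloff are assumptions, not from the seminar):

```cpp
// Hypothetical per-entity error metric: compare the server's current state
// against the state last sent to this client, attenuated by distance to the
// client's viewpoint. Larger result == more urgent to send an update.
#include <cmath>

struct EntityState {
    float position[3];
    float orientation[4];   // quaternion
    int   discrete_state;   // e.g. animation / mode flags
};

static float distance_squared(const float *a, const float *b) {
    float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return dx*dx + dy*dy + dz*dz;
}

float update_priority(const EntityState &current, const EntityState &last_sent,
                      const float *viewer_position) {
    float pos_error = std::sqrt(distance_squared(current.position, last_sent.position));

    // Crude orientation error: 1 - |dot| of the two quaternions.
    float dot = 0.0f;
    for (int i = 0; i < 4; i++) dot += current.orientation[i] * last_sent.orientation[i];
    float orient_error = 1.0f - std::fabs(dot);

    float state_error = (current.discrete_state != last_sent.discrete_state) ? 1.0f : 0.0f;

    // The weights and the distance falloff are tuning knobs, not prescribed values.
    float error = 1.0f * pos_error + 2.0f * orient_error + 5.0f * state_error;
    float dist  = std::sqrt(distance_squared(current.position, viewer_position));
    return error / (1.0f + dist);   // attenuate by distance to the viewpoint
}
```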
There is a lot of coherence between clients’ error functions • We ought to be able to exploit that • Example of objects that move a lot • They will have high error in all world views • Similarly for objects that move slowly • How do we detect “a lot of motion”? • Should not use distance traveled per frame • (example on whiteboard)
Detecting “a lot of motion” • Idea: Neighborhood bounding box • Too quantized; how do we decide when to move the center? • Idea: Bounding sphere with moving center • How do we compute this without holding big arrays of data? • Also, a sphere is isotropic, and we want anisotropy (next slide) • (we are picky because we pay for bandwidth!)
Why we want anisotropy • For distant objects, we care most about motion parallel to the view plane • Motion orthogonal to that plane only produces small changes in perspective size • (graph of 1/z on whiteboard)
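In symbols (my paraphrase of the 1/z graph, with a focal length $f$ assumed): under perspective projection,

$$x' = \frac{f\,x}{z}, \qquad \frac{\partial x'}{\partial x} = \frac{f}{z}, \qquad \frac{\partial x'}{\partial z} = -\frac{f\,x}{z^{2}},$$

so for a distant object a lateral step shows up on screen at order $1/z$, while the same step in depth only shows up at order $1/z^{2}$.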
Variance of a vector • Also: “Covariance” of the vector components • “Variance/Covariance Matrix” of components • (demo) • Can be filtered, like scalars, to approximate where something has been over different periods of time
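A minimal sketch of such a filter (my own code, not the seminar's demo): keep exponentially filtered estimates of the mean and of the outer-product matrix, and recover the covariance from them.

```cpp
// Exponentially filtered mean and variance/covariance matrix of a 3D
// position, updated once per frame.
struct MotionStats {
    float mean[3]      = {0, 0, 0};
    float second[3][3] = {};        // filtered E[x x^T]
    float k            = 0.05f;     // filter constant; smaller = longer memory

    void update(const float pos[3]) {
        for (int i = 0; i < 3; i++) {
            mean[i] += k * (pos[i] - mean[i]);
            for (int j = 0; j < 3; j++)
                second[i][j] += k * (pos[i] * pos[j] - second[i][j]);
        }
    }

    // Covariance recentered about the filtered mean: C = E[x x^T] - m m^T.
    void covariance(float out[3][3]) const {
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                out[i][j] = second[i][j] - mean[i] * mean[j];
    }
};
```

The filter constant k trades responsiveness against memory of older motion; running several instances with different k values approximates where the object has been over different periods of time, as the slide suggests.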
Summary of Variance Methods • Characterized by ellipsoid • Find ellipsoid by eigenvalues/eigenvectors of outer product matrix • These variances can be treated intuitively like mass (tensor of inertia, in physics)
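In symbols (my notation): a symmetric covariance matrix $C$ factors as

$$C = R\,\operatorname{diag}(\lambda_1,\lambda_2,\lambda_3)\,R^{\mathsf T},$$

and the corresponding ellipsoid has its axes along the columns of $R$ (the eigenvectors) with semi-axis lengths $\sqrt{\lambda_i}$.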
Derivation of Variance Recentering • Allows us to filter the variance of the vector and re-express it relative to a filtered position, in order to visualize it • (derivation on whiteboard)
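The derivation is presumably the standard recentering identity (written here in my notation). With filter weights $w_i$ that sum to 1, let $m = \sum_i w_i x_i$ and $S = \sum_i w_i x_i x_i^{\mathsf T}$. Expanding the centered outer products gives

$$\sum_i w_i (x_i - m)(x_i - m)^{\mathsf T} = S - m\,m^{\mathsf T} - m\,m^{\mathsf T} + m\,m^{\mathsf T} = S - m\,m^{\mathsf T},$$

so we only need to maintain $S$ and $m$ as cheap running filters and can recenter the variance about the filtered position whenever we want to visualize it.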
Code Inspection • Covariance2, OpenGL demo app • Covariance3 • Discuss finding eigenvectors of a 2x2 versus a 3x3 matrix
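For reference, the 2x2 symmetric case has a closed form (a sketch in my own code; the 3x3 case needs the cubic characteristic polynomial or an iterative method such as Jacobi rotations):

```cpp
// Eigen-decomposition of a symmetric 2x2 matrix [[a, b], [b, c]].
#include <cmath>

void eigen_symmetric_2x2(float a, float b, float c,
                         float lambda[2], float axis0[2], float axis1[2]) {
    float mean = 0.5f * (a + c);
    float diff = 0.5f * (a - c);
    float r = std::sqrt(diff * diff + b * b);
    lambda[0] = mean + r;     // larger eigenvalue
    lambda[1] = mean - r;

    float theta = 0.5f * std::atan2(2.0f * b, a - c);  // rotation that diagonalizes
    axis0[0] = std::cos(theta);  axis0[1] = std::sin(theta);   // eigenvector for lambda[0]
    axis1[0] = -axis0[1];        axis1[1] = axis0[0];          // eigenvector for lambda[1]
}
```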
Do we need the eigenvectors for networking? • Perhaps not! • First, discussion of how we would use the eigenvectors • But instead of back-projecting, can we forward-query? • A simple query is very cheap • (example on whiteboard)
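One way to read “forward-query” (my interpretation): never extract the eigenvectors at all, and instead evaluate the quadratic form $d^{\mathsf T} C d$, the variance of the motion along a chosen direction $d$, for the few directions we actually care about (say, directions parallel to a client's view plane). That is a handful of multiplies per query:

```cpp
// Variance of motion along direction d (assumed unit length): d^T C d.
// No eigen-solve needed; C is the filtered covariance matrix.
float variance_along(const float C[3][3], const float d[3]) {
    float r = 0.0f;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            r += d[i] * C[i][j] * d[j];
    return r;
}
```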
For the global sort, it’s even easier • The product of the eigenvalues is the determinant of the matrix • Volume of ellipsoid! • The sum of eigenvalues is the trace of the matrix • Useful in approximating the eccentricity of the ellipsoid; ratio of volume to the ideal volume of a sphere with radius (1/3) tr M
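Spelled out (my notation; strictly the ellipsoid's semi-axes are $\sqrt{\lambda_i}$, so “volume” here is up to that caveat):

$$\det M = \lambda_1\lambda_2\lambda_3, \qquad \operatorname{tr} M = \lambda_1+\lambda_2+\lambda_3,$$

and by the AM-GM inequality

$$\frac{\det M}{\bigl(\tfrac{1}{3}\operatorname{tr} M\bigr)^{3}} = \frac{\lambda_1\lambda_2\lambda_3}{\bigl(\tfrac{\lambda_1+\lambda_2+\lambda_3}{3}\bigr)^{3}} \le 1,$$

with equality exactly when all eigenvalues are equal. So this ratio measures how far the ellipsoid is from a sphere using only the determinant and trace, with no eigen-solve.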
Shaders • Some people have been led by marketing hype to think shaders are new… • They have been around for a long time in software rendering • In hardware, the fixed-function pipeline provided shading functionality.
Interesting idea: Deferred Shading • Only write iterated vertex parameters into the frame buffer • Perform complicated shading operations in a post-pass • If multipass rendering, vertex shader will run less often • But technique is of limited use? • http://www.delphi3d.net/articles/viewarticle.php?article=deferred.htm
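The idea in miniature (a CPU-side caricature of my own, not real GPU code and not taken from the article above): pass 1 rasterizes geometry and stores cheap per-pixel attributes; pass 2 runs the expensive lighting once per covered pixel, independent of how much geometry was drawn.

```cpp
#include <vector>

struct GBufferTexel {
    float normal[3];
    float albedo[3];
    float depth;
    bool  covered;
};

struct Light { float direction[3]; float color[3]; };

// Pass 1 (geometry pass) would fill `gbuffer` with interpolated vertex
// parameters while rasterizing; omitted here.

// Pass 2 (shading pass): one lighting evaluation per covered pixel.
void shading_pass(const std::vector<GBufferTexel> &gbuffer,
                  const std::vector<Light> &lights,
                  std::vector<float> &out_rgb) {
    out_rgb.assign(gbuffer.size() * 3, 0.0f);
    for (size_t p = 0; p < gbuffer.size(); p++) {
        const GBufferTexel &t = gbuffer[p];
        if (!t.covered) continue;
        for (size_t l = 0; l < lights.size(); l++) {
            float ndotl = 0.0f;
            for (int i = 0; i < 3; i++) ndotl += t.normal[i] * lights[l].direction[i];
            if (ndotl < 0.0f) ndotl = 0.0f;
            for (int i = 0; i < 3; i++)
                out_rgb[p * 3 + i] += t.albedo[i] * lights[l].color[i] * ndotl;
        }
    }
}
```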
Early Game Shader Language: Quake3 shaders.txt • Goal: to abstract away the number of texture stages in a graphics card’s pipeline • Earlier cards had 1, 2, or 3 stages • Also: enable level designers to create shaders by hand
Normalization Cube Map • Promoted by Mark Kilgard of Nvidia • An interesting idea for early shaders, but outdated now? • With more shader instructions we can actually run a fast normalization function • Does not require texture memory or a texture input slot! • Cube maps are still useful for parameterizing arbitrary functions over the sphere
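A sketch of how such a cube map is built (my own illustration; face-orientation conventions differ between APIs and are glossed over here):

```cpp
// Build one face of a normalization cube map. Each texel stores the
// normalized direction vector that points at it, packed into RGB bytes.
#include <cmath>
#include <vector>

std::vector<unsigned char> build_positive_x_face(int size) {
    std::vector<unsigned char> rgb(size * size * 3);
    for (int t = 0; t < size; t++) {
        for (int s = 0; s < size; s++) {
            // Map texel center to [-1, 1] on the face.
            float sc = 2.0f * (s + 0.5f) / size - 1.0f;
            float tc = 2.0f * (t + 0.5f) / size - 1.0f;

            // +X face: x is the major axis (ignoring API-specific sign flips).
            float x = 1.0f, y = -tc, z = -sc;
            float len = std::sqrt(x*x + y*y + z*z);
            x /= len; y /= len; z /= len;

            // Pack [-1, 1] into [0, 255].
            int i = (t * size + s) * 3;
            rgb[i + 0] = (unsigned char)(255.0f * (0.5f * x + 0.5f));
            rgb[i + 1] = (unsigned char)(255.0f * (0.5f * y + 0.5f));
            rgb[i + 2] = (unsigned char)(255.0f * (0.5f * z + 0.5f));
        }
    }
    return rgb;
}
```

At lookup time, using the unnormalized to-light or to-viewer vector as the cube map coordinate returns its normalized value, which is what made the trick attractive before cheap per-pixel normalization instructions existed.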
Early DirectX vertex / pixel shaders • (Versions 1.0, 1.1) • Did not do much you couldn’t already do in the fixed-function pipeline • But an important step toward the paradigm of programmability
OpenGL vs. DirectX: Extensions vs. Control • OpenGL provides extensions • DirectX is about Microsoft creating a “standard”