By: Michael Smith Sandstorm: A Dynamic, Multi-contextual, GPU-based Particle System that Uses Vector Fields for Particle Propagation
Overview • Introduction • Background • Idea • Software Engineering • Prototype • Results • Conclusions and Future Work
Introduction • The use of Virtual Reality (VR) to visualize scientific phenomena is quite common. • VR allows scientists to immerse themselves in the phenomena they are studying.
Introduction • Phenomena such as dust clouds or smoke need a particle system to visualize such fuzzy objects. • Vector fields can be used to 'guide' particles according to real scientific data (see the sketch below). • This is not a new idea: see Vector Fields by Hilton and Egbert, c. 1994.
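A minimal sketch of what 'guiding' a particle by a vector field can mean, assuming simple forward-Euler advection (the slides do not state which integration scheme Sandstorm actually uses): the particle samples the field at its current position and steps along it.

```latex
% Forward-Euler advection of a particle through a vector field V (an assumed
% scheme, shown only to illustrate guiding particles by data).
\[
  \mathbf{x}_{t+\Delta t} = \mathbf{x}_t + \mathbf{V}(\mathbf{x}_t)\,\Delta t
\]
```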
Introduction • VR applications and simulations require a multi-context environment. • A main context controls and updates multiple rendering contexts. • This multi-contextual environment can cause problems with particle systems.
Introduction • GPU offloading techniques have been shown to allow applications and simulations to offload work to the graphics hardware. • This allows non-traditional graphics calculations to be accelerated. • GPU offloading can be used to accelerate particle calculations.
Introduction • Sandstorm • Dynamic • Multi-contextual • GPU-based • Particle System • Using Vector Fields for Particle Propagation
Background • The Helicopter and Dust Simulation (Heli-Dust) is a scientific simulation of the effect of a helicopter's downdraft on the surrounding desert terrain. • It is written using the Dust Framework, a framework that allows the developer to set up a scene using an XML file.
Background • Early prototypes of Heli-Dust implemented a very simple particle system. • This particle system had no way to guide particles according to observed scientific data.
Background • Virtual Reality is a technology that allows a user to interact with a computer-simulated environment, be it a real or an imagined one. • It immerses the user in that environment.
Background • A depth cue is an indicator from which a human can perceive information about depth. • Depth cues come in many shapes and sizes: • Monoscopic • Stereoscopic • Motion
Background • Monoscopic depth cues • Information from only a single eye or image is available. • This information can include: • Position • Size • Brightness
Background • Stereoscopic depth cues: • Information from two eyes. • This information is derived from the parallax between the different images received by each eye. • Parallax is the apparent displacement of objects viewed from different locations (a rough formula follows).
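As a rough quantitative illustration (not taken from the slides), the on-screen parallax p produced for an object at distance d, viewed by two eyes separated by e at distance D from the screen plane, follows from similar triangles:

```latex
% Screen parallax for eye separation e, screen distance D, object distance d.
% p = 0 at the screen plane (d = D), p approaches e for distant objects, and
% p < 0 for objects in front of the screen (negative parallax).
\[
  p = e \, \frac{d - D}{d}
\]
```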
Background • Motion depth cue • Motion parallax • The changing relative position between the head and the object being observed. • Objects in the distance move less than objects closer to the viewer.
Background • Stereoscopic displays 'trick' the user's eyes into perceiving depth where none exists. • They come in all shapes and sizes.
Background • Multiple Contexts • A main context controls multiple rendering contexts. • Because of these multiple contexts, a Virtual Reality application developer needs to make sure that all context-sensitive information and algorithms are safe across multiple contexts (see the sketch below).
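A minimal sketch of one common way to keep context-sensitive data safe (an illustrative pattern with hypothetical names, not the mechanism of any particular toolkit): each rendering context gets its own copy of GPU-side state, looked up by a context identifier, so no context ever touches another context's objects.

```cpp
#include <unordered_map>

// Hypothetical per-context resource table: each rendering context owns its own
// GPU-side objects (here just a buffer handle), created lazily the first time
// that context renders. If contexts render from separate threads, access to
// the map itself would also need synchronization.
struct ContextResources {
    unsigned int particleVbo = 0;   // buffer object valid only in one context
    bool initialized = false;
};

class ParticleRenderer {
public:
    // Called from a context's render pass with that context's unique id.
    ContextResources& resourcesFor(int contextId) {
        ContextResources& res = perContext_[contextId];
        if (!res.initialized) {
            // Create the context-local GL objects here (e.g. glGenBuffers).
            res.initialized = true;
        }
        return res;
    }

private:
    std::unordered_map<int, ContextResources> perContext_;
};
```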
Background • There are many Virtual Reality toolkits and libraries. • Such toolkits and libraries handle things such as: • Generating Stereoscopic Images • Setting up the VR environment • And some handle distribution methods.
Background • The Virtual Reality User Interface, or VRUI, is a virtual reality development toolkit. • It was developed by Oliver Kreylos at UC Davis. • VRUI's main mission is to shield the developer from the particular configuration of a VR system.
Background • VRUI tries to accomplish this mission by abstracting three main areas: • Display abstraction • Distribution abstraction • Input abstraction • Another feature of VRUI is its built-in menu system.
Background • FreeVR is developed and maintained by William Sherman. • It is an open-source virtual reality interface/integration library. • FreeVR was designed to work on a diverse range of input and output hardware. • FreeVR is currently designed to work on shared-memory systems.
Background • In 1983, William T. Reeves wrote Particle Systems – A Technique for Modeling a Class of Fuzzy Objects. • This paper introduced the particle system, a modeling method that models an object as a cloud of primitive particles that define its volume.
Background • Reeves categorizes the objects modeled by particle systems as “fuzzy” objects: they do not have smooth, well-defined, shiny surfaces. • Instead, their surfaces are irregular, complex, and ill-defined. • This particle system was used to create the Genesis Effect for the movie Star Trek II: The Wrath of Khan.
Background • In his paper, Reeves described a particle system with five steps: • Particle Generation • Particle Attribute Assignment • Particle Dynamics • Particle Extinction • Particle Rendering
Background • Particle Generation • First, the number of particles to be generated in the time interval is calculated (see the formula below). • Then the particles are generated.
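Reeves controls this count stochastically; one form given in the paper uses a per-frame mean and variance (the screen-size-dependent variant and clamping are omitted here):

```latex
% Number of particles generated at frame f: a mean plus a uniformly
% distributed perturbation, with Rand() drawn from [-1, +1].
\[
  \mathrm{NParts}_f = \mathrm{MeanParts}_f + \mathrm{Rand}() \cdot \mathrm{VarParts}_f
\]
```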
Background • Particle Attribute Assignment: whenever a particle is created, the particle system must determine values for the following attributes: • Initial position and velocity • Initial size, color, and transparency • Initial shape and lifetime • The initial position of the particles is determined by a generation shape.
Background • Particle Dynamics: once all the particles have been created and assigned initial attributes, their positions and/or velocities are updated. • Particle Extinction: once a particle has lived past its predetermined lifetime, measured in frames, it dies.
Background • Particle Rendering: once the position and appearance of the particles have been determined, the particles are rendered. • Two assumptions were made: • Particles do not intersect with other surface-based objects. • Particles are treated as point light sources.
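To make the five steps concrete, here is a minimal CPU-side sketch in C++ (illustrative only: the emitter shape, attribute values, and update rule are assumptions, not Reeves' exact formulation):

```cpp
#include <cstdlib>
#include <vector>

// One primitive particle carrying the attributes Reeves lists.
struct Particle {
    float pos[3], vel[3];
    float size, color[3], transparency;
    int   lifetime;                       // remaining lifetime, in frames
};

static float frand(float lo, float hi) {  // uniform random value in [lo, hi]
    return lo + (hi - lo) * (std::rand() / float(RAND_MAX));
}

void stepParticleSystem(std::vector<Particle>& particles, float dt) {
    // 1. Particle generation: choose how many particles to emit this frame.
    const int meanParts = 100, varParts = 20;
    const int newParts = meanParts + int(frand(-1.0f, 1.0f) * varParts);

    // 2. Attribute assignment: spawn inside a simple box-shaped emitter.
    for (int i = 0; i < newParts; ++i) {
        Particle p;
        for (int k = 0; k < 3; ++k) {
            p.pos[k] = frand(-1.0f, 1.0f);
            p.vel[k] = frand(-0.1f, 0.1f);
        }
        p.size = 1.0f;
        p.color[0] = p.color[1] = p.color[2] = 1.0f;
        p.transparency = 0.5f;
        p.lifetime = 120;
        particles.push_back(p);
    }

    // 3. Particle dynamics: update positions from velocities.
    for (Particle& p : particles)
        for (int k = 0; k < 3; ++k)
            p.pos[k] += p.vel[k] * dt;

    // 4. Particle extinction: remove particles that have outlived their lifetime.
    for (std::size_t i = 0; i < particles.size(); ) {
        if (--particles[i].lifetime <= 0) {
            particles[i] = particles.back();
            particles.pop_back();
        } else {
            ++i;
        }
    }

    // 5. Particle rendering would happen here, e.g. one point sprite per particle.
}
```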
Background • In recent years, graphics vendors have replaced areas of fixed functionality with areas of programmability. • Two such areas are the Vertex and Fragment Processors.
Background • The Vertex Processor is a programmable unit that operates on incoming vertex values. • Some duties of the vertex processor are: • Vertex transformation • Normal transformation and normalization • Texture coordinate generation and transformation.
Background • The Fragment Processor is a programmable unit that operates on incoming fragment values. • Some duties of the fragment processor are: • Operations on interpolated values. • Texture access. • Texture application. • Fog
Background • While a program (a shader) is running on one of these processors, the corresponding fixed functionality is disabled. • Several programming languages were created to aid in the development of shaders; one such language is the OpenGL Shading Language (GLSL).
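As a small illustration of what a shader looks like and how it is handed to the driver (a minimal sketch assuming an already-created OpenGL context and an extension loader such as GLEW):

```cpp
#include <GL/glew.h>

// A trivial GLSL vertex shader: it performs the vertex transformation that the
// fixed-function pipeline would otherwise do, and passes the color through.
static const char* kVertexSrc = R"glsl(
#version 120
void main() {
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_FrontColor = gl_Color;
}
)glsl";

// Compile the shader and return its handle, or 0 on failure.
GLuint compileVertexShader() {
    GLuint shader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(shader, 1, &kVertexSrc, nullptr);  // hand the GLSL source to the driver
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);    // check for compile errors
    return ok ? shader : 0;
}
```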
Background • Vertex and fragment shaders can't create vertices; they only work on data passed to them. • Geometry shaders can create any number of vertices. • This can allow shaders to create geometry without having to be told to by the CPU (see the sketch below).
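A tiny example of a geometry shader creating geometry on its own (a sketch assuming GLSL 1.50 / OpenGL 3.2, stored as a C++ string ready to be compiled as in the previous snippet): it receives one point from the vertex shader and emits two, one of which the CPU never submitted.

```cpp
// Geometry shader that turns each incoming point into two output points.
static const char* kSplitPointGeometrySrc = R"glsl(
#version 150
layout(points) in;
layout(points, max_vertices = 2) out;

void main() {
    // Emit the original point ...
    gl_Position = gl_in[0].gl_Position;
    EmitVertex();
    // ... and a second point, offset to the right, created entirely on the GPU.
    gl_Position = gl_in[0].gl_Position + vec4(0.1, 0.0, 0.0, 0.0);
    EmitVertex();
    EndPrimitive();
}
)glsl";
```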
Background • Transform Feedback allows a shader to specify its output buffer. • The target output buffer can be the input buffer of another shader. • This allows developers to create multi-pass shaders that do not relay information back to the CPU between passes (see the sketch below).
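A minimal sketch of the OpenGL calls involved (assumptions: an OpenGL 3.0+ context, an update program whose outputs were registered with glTransformFeedbackVaryings before linking, and two vertex buffers used as source and destination; attribute setup and buffer creation are omitted):

```cpp
#include <GL/glew.h>

// One GPU-only update pass: read particle vertices from srcVbo, run them
// through the update program, and capture its outputs into dstVbo via
// transform feedback. No particle data travels back to the CPU.
void updateParticlesOnGpu(GLuint updateProgram, GLuint srcVbo, GLuint dstVbo,
                          GLsizei particleCount) {
    glUseProgram(updateProgram);

    glBindBuffer(GL_ARRAY_BUFFER, srcVbo);                       // input attributes
    glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, dstVbo);   // capture target

    glEnable(GL_RASTERIZER_DISCARD);          // we only want the captured vertices
    glBeginTransformFeedback(GL_POINTS);
    glDrawArrays(GL_POINTS, 0, particleCount);                   // one vertex per particle
    glEndTransformFeedback();
    glDisable(GL_RASTERIZER_DISCARD);
}
```

On the next pass the two buffers are swapped, so the captured output becomes the next pass's input without any CPU round trip.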
Background • ParticleGS is a geometry-shader-based particle system that does the following: • Stores particle information in Vertex Buffer Objects (VBOs). • Uses a geometry shader to create particles and store them as vertex information in VBOs. • Uses Transform Feedback to send particle data between shaders. • Uses a geometry shader to create billboards and point sprites to render the particles.
Background • In the days before shaders, the GPU was used just for rendering. • With the advent of shaders, GPUs can now be used to aid scientific computation. • One can 'trick' the GPU into thinking that it is working on rendering information.
Background • UberFlow is a system for real-time animation and rendering of large particle sets using GPU computation. • The Million Particle System is a GPU-based particle system that can render a large set of particles.
Background • Both particle systems do the following: • Store particle information in textures (see the sketch below). • Use a series of vertex and fragment shaders to update the particle information. • Use the CPU to create and send rendering information. • Use a series of vertex and fragment shaders to render the information sent from the CPU.
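A minimal sketch of the texture-based storage these systems rely on (assumptions: an OpenGL context with floating-point texture support, and one RGBA texel per particle holding x, y, z plus a spare channel such as lifetime):

```cpp
#include <GL/glew.h>
#include <vector>

// Allocate a width x height floating-point texture in which each texel stores
// one particle's position (xyz) and one extra value (w). Rendering a
// full-screen quad into this texture with a fragment shader then updates every
// particle in parallel.
GLuint createParticlePositionTexture(int width, int height) {
    std::vector<float> initial(static_cast<std::size_t>(width) * height * 4, 0.0f);

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0,
                 GL_RGBA, GL_FLOAT, initial.data());
    return tex;
}
```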
Idea • Sandstorm • Dynamic • Multi-contextual • GPU-based • Particle System • That uses Vector Fields for Particle Propagation.
Idea • Dynamic: Sandstorm should have the ability to change certain attributes on the fly (see the sketch below): • Rate of emission • Size of particles • Lifetime of particles
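A minimal sketch of what 'dynamic' could look like in code (the names and structure are hypothetical, not Sandstorm's actual interface): the tunable attributes live in a plain parameter struct that the application can change between frames, and the next update pass simply reads the new values.

```cpp
// Hypothetical runtime-tunable emitter parameters.
struct EmitterParams {
    float emissionRate = 1000.0f;  // particles emitted per second
    float particleSize = 0.05f;    // rendered size of each particle
    float particleLife = 4.0f;     // lifetime in seconds
};

// Example: halve the emission rate on the fly, e.g. from a menu callback.
void onHalveEmission(EmitterParams& params) {
    params.emissionRate *= 0.5f;
}
```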
Idea • Multi-contextual: as previously stated, 3D VR environments use multiple contexts. • Thus Sandstorm must be designed to handle these multiple contexts, in particular: • Random number generation • Between-screen consistency (see the sketch below)
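A minimal sketch of one way to keep random emission consistent between screens (an illustrative approach with hypothetical names, not necessarily how Sandstorm solves it): every rendering context seeds its generator from the same shared frame number, so all contexts produce an identical stream of 'random' particles for that frame.

```cpp
#include <random>

// Build a generator deterministically from the shared frame counter, so every
// rendering context draws the same emission positions for the same frame.
std::mt19937 makeFrameRng(unsigned long frameNumber) {
    return std::mt19937(static_cast<std::mt19937::result_type>(frameNumber));
}

void emitParticlesForFrame(unsigned long frameNumber) {
    std::mt19937 rng = makeFrameRng(frameNumber);
    std::uniform_real_distribution<float> unit(-1.0f, 1.0f);

    // Identical on every screen, because the seed is identical.
    float x = unit(rng), y = unit(rng), z = unit(rng);
    (void)x; (void)y; (void)z;  // ... would be used to spawn a particle here
}
```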