Interactive 3D Graphics and Virtual Reality
Faculty of Computing, Engineering and Technology, Staffordshire University
I3DG&VR CE00539-m
Bob Hobbs
Outline
• Module Details
• What is 3D programming?
• Typical Processing Steps
• Modelling and Rendering
• Applications
• Summary
Module Details
Teaching Team
• Bob Hobbs r.g.hobbs@staffs.ac.uk
• Dr. Len Noriega l.a.noriega@staffs.ac.uk
• Semester 2, 15 CATS credits
• 4 hours per week
  • 2 hours lecture: Tues 2pm (C321) & Tues 4pm (BLUE)
  • 2 hours practical: Monday 1pm-3pm (K106)
Module Details
Course Handbook & Lecture Notes
• http://www.soc.staffs.ac.uk/rgh1
Assignment Details
• 50% assignment work
• 50% two-hour exam
Program of Study
• Week 01 RGH Introduction, OpenGL and general 3D concepts
• Week 02 RGH Virtual Reality Concepts, Semantic cues and HCI
• Week 03 LAN Lighting, Shading and Texturing
• Week 04 LAN Physics and Collision Detection
• Week 05 LAN Matrix Operations for Mechanics
• Week 06 LAN Cognitive Agents and AI concepts
• Week 07 RGH Interaction Metaphors, Immersion and Presence
• Week 08 RGH Human movement, Bio-Mechanics and Kinematics
• Week 09 LAN Quaternion Operations for Mechanics
• Week 10 RGH Quaternion Operations for Mechanics II
• Week 11 BOTH Assignment surgery
• Week 12 BOTH Assessment demo in exam week
Hierarchy of Models
• Behaviour
• Bio-Mechanics
• Physics
• Geometry
How does this work?
Simulation Loop
• read input sensors
• update objects
• render scene in display
• Uses traditional 3D graphics methods to render or 'draw' the scene
Simulation Loop
• Read sensors
• Check any defined actions
• Update objects with sensor input
• Objects perform tasks
• Step along any defined paths
• Render universe
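A minimal code sketch of this loop in C, assuming the application supplies its own read_sensors(), update_objects() and render_scene() routines (these names are illustrative, not part of the module code):

    #include <stdbool.h>

    static bool running = true;  /* cleared by a quit action picked up in read_sensors() */

    static void read_sensors(void)   { /* poll input devices (keyboard, tracker, ...)   */ }
    static void update_objects(void) { /* apply actions, step objects along their paths */ }
    static void render_scene(void)   { /* draw the current state of the universe        */ }

    void simulation_loop(void)
    {
        while (running) {
            read_sensors();     /* 1. read input sensors                   */
            update_objects();   /* 2. update objects with the sensor input */
            render_scene();     /* 3. render the scene in the display      */
        }
    }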
Introducing OpenGL
Graphics basics:
• Transform geometry (object → world, world → eye)
• Apply perspective projection (eye → screen)
• Clip to the view frustum
• Perform visible-surface processing (Z-buffer)
• Calculate surface lighting, etc.
• Implementing all this is a lot of work (surprise)
• OpenGL provides a standard implementation
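As a hedged sketch (not from the slides), the fixed-function OpenGL calls covering most of these steps might look as follows, assuming a GL context has already been created (e.g., by the gfx.c helpers used in the practicals):

    #include <GL/gl.h>
    #include <GL/glu.h>

    void setup_view(int width, int height)
    {
        /* eye -> screen: perspective projection onto the view plane */
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        gluPerspective(60.0, (double)width / (double)height, 0.1, 100.0);

        /* world -> eye: place the virtual camera */
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        gluLookAt(0.0, 0.0, 5.0,    /* eye position  */
                  0.0, 0.0, 0.0,    /* look-at point */
                  0.0, 1.0, 0.0);   /* up vector     */

        /* visible-surface processing via the Z-buffer; clipping to the
           view frustum happens automatically during rasterisation */
        glEnable(GL_DEPTH_TEST);
    }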
OpenGL Design Goals
SGI's design goals for OpenGL:
• Hardware independence without sacrificing performance
• Natural, concise API with some built-in extensibility
OpenGL has become a standard because:
• It doesn't try to do too much
  • Only renders the image; doesn't manage windows, etc.
  • No high-level animation, modelling, sound (!), etc.
• It does enough
  • Useful rendering effects + high performance
• It is promoted by SGI (& Microsoft, half-heartedly)
OpenGL: Conventions
• Functions in OpenGL start with gl
• Functions starting with glu are utility functions (e.g., gluLookAt())
• Functions starting with glX are for interfacing with the X Window System (e.g., used in gfx.c)
• Function names indicate argument type/number
  • Functions ending with f take floats
  • Functions ending with i take ints, functions ending with v take an array, with b take bytes, etc.
  • Ex: glColor3f() takes 3 floats, but glColor4fv() takes an array of 4 floats
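A small illustration of these naming conventions (the colour values here are arbitrary):

    #include <GL/gl.h>

    void colour_examples(void)
    {
        GLfloat rgba[4] = {0.0f, 1.0f, 0.0f, 0.5f};

        glColor3f(1.0f, 0.0f, 0.0f);   /* '3f'  : three floats, opaque red      */
        glColor4fv(rgba);              /* '4fv' : array (vector) of four floats */
        glColor3ub(0, 255, 0);         /* '3ub' : three unsigned bytes          */
    }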
OpenGL: Specifying Geometry
• Geometry in OpenGL consists of a list of vertices between calls to glBegin() and glEnd()
• A simple example: telling GL to render a triangle

    glBegin(GL_POLYGON);
        glVertex3f(x1, y1, z1);
        glVertex3f(x2, y2, z2);
        glVertex3f(x3, y3, z3);
    glEnd();

• Usage: glBegin(geomtype) where geomtype is:
  • points, lines, polygons, triangles, quadrilaterals, etc. (GL_POINTS, GL_LINES, GL_POLYGON, GL_TRIANGLES, GL_QUADS, ...)
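As a concrete usage example (not from the slides), a quadrilateral drawn with per-vertex colours using another of these geometry types:

    #include <GL/gl.h>

    void draw_quad(void)
    {
        glBegin(GL_QUADS);
            glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
            glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
            glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.5f,  0.5f, 0.0f);
            glColor3f(1.0f, 1.0f, 1.0f); glVertex3f(-0.5f,  0.5f, 0.0f);
        glEnd();
    }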
What is 3D rendering?
• Generally deals with the graphical display of 3D objects as seen by a viewer
• The synthetic image will vary according to: viewing direction, viewer position, illumination, object properties, ...
(Diagram: an object, a viewer, and the projection onto a 2D surface)
What is 3D Computer Graphics?
• 3D graphics: generation of a graphical display (rendering) of 3D object(s) from a specification (model(s))
(Diagram: modelling produces a specification; rendering turns it into a graphical display)
Typical Processing Steps
(Diagram: a wireframe polygonal model (vertices, facets) passes through transformation, hidden surface removal and shading, with the light source and viewpoint as inputs, to produce a solid object)
Typical Processing Steps
• Modelling: numerical description of scene objects, illumination, and viewer
• Rendering: operations that produce a view of the scene projected onto a view surface
(Diagram: object model(s), illumination model, and viewing/projection specification feed the graphics engine, which produces the graphical display)
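A minimal sketch, in fixed-function OpenGL, of how the illumination part of such a model might be specified before rendering (the values are placeholders, not taken from the slides); the viewing and projection specification is set up as shown earlier:

    #include <GL/gl.h>

    void setup_illumination(void)
    {
        GLfloat light_pos[] = {2.0f, 4.0f, 3.0f, 1.0f};  /* positional light    */
        GLfloat white[]     = {1.0f, 1.0f, 1.0f, 1.0f};
        GLfloat surface[]   = {0.8f, 0.6f, 0.5f, 1.0f};  /* diffuse reflectance */

        glLightfv(GL_LIGHT0, GL_POSITION, light_pos);
        glLightfv(GL_LIGHT0, GL_DIFFUSE,  white);
        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);

        /* surface properties of the object model, used by the shading step */
        glMaterialfv(GL_FRONT, GL_AMBIENT_AND_DIFFUSE, surface);
    }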
Modelling: Human Head Model (1438 facets)
Modelling: Human Head Model (7258 facets)
Modelling: Teacher and Board Model (2074 facets)
Rendering: Shaded Human Head (1438 facets)
Rendering: Shaded Teacher and Board
Scene Graphs
• In VR programming the central data structure is the scene graph, a special tree structure designed to store information about a scene
• Typical elements include
  • geometries
  • positional information
  • lights
  • fog
Simple scene graph
(Diagram: a Root node with Fog, Light, Group, Geom and Xform nodes beneath it)
Scene Graph Nodes
• Content Nodes: contain the basic elements of a scene
  • geometry
  • light
  • position
  • fog
• Group Nodes: no content
  • link the hierarchy
  • allow grouping of nodes sharing a common state
(Diagram: a parent node with two children)
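A minimal sketch of such a node in C; the type names and fields are illustrative, not any particular scene graph library's API:

    #include <stdlib.h>

    #define MAX_CHILDREN 8

    typedef enum { NODE_GROUP, NODE_XFORM, NODE_GEOM, NODE_LIGHT, NODE_FOG } NodeType;

    typedef struct Node {
        NodeType      type;
        const char   *name;
        struct Node  *children[MAX_CHILDREN];  /* group nodes link the hierarchy  */
        int           num_children;
        float         matrix[16];              /* used by transform (Xform) nodes */
        void         *content;                 /* geometry, light or fog data     */
    } Node;

    Node *node_new(NodeType type, const char *name)
    {
        Node *n = calloc(1, sizeof(Node));
        n->type = type;
        n->name = name;
        for (int i = 0; i < 16; i++)           /* start with an identity matrix   */
            n->matrix[i] = (i % 5 == 0) ? 1.0f : 0.0f;
        return n;
    }

    void node_add_child(Node *parent, Node *child)
    {
        if (parent->num_children < MAX_CHILDREN)
            parent->children[parent->num_children++] = child;
    }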
Example Hierarchy
Root
  Light
  Group "Dog"
    Xform T1
      Geom Dog
  Group "Lampost"
    Xform T2
      Geom Lampost
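Using the sketch Node type above, this hierarchy could be built and rendered with a depth-first traversal that pushes and pops transforms; draw_geometry() stands in for a per-node draw routine and T1/T2 would be filled with real transform matrices:

    #include <GL/gl.h>

    void draw_geometry(Node *n)
    {
        (void)n;  /* would issue glBegin()/glEnd() calls for n->content */
    }

    void render(Node *n)
    {
        glPushMatrix();
        if (n->type == NODE_XFORM)
            glMultMatrixf(n->matrix);   /* apply this node's transform        */
        if (n->type == NODE_GEOM)
            draw_geometry(n);           /* draw content nodes                 */
        for (int i = 0; i < n->num_children; i++)
            render(n->children[i]);     /* recurse into the hierarchy         */
        glPopMatrix();
    }

    void build_and_render_example(void)
    {
        Node *root    = node_new(NODE_GROUP, "Root");
        Node *dog     = node_new(NODE_GROUP, "Dog");
        Node *lampost = node_new(NODE_GROUP, "Lampost");
        Node *t1      = node_new(NODE_XFORM, "T1");
        Node *t2      = node_new(NODE_XFORM, "T2");

        node_add_child(root, node_new(NODE_LIGHT, "Light"));
        node_add_child(root, dog);
        node_add_child(root, lampost);
        node_add_child(dog,     t1);
        node_add_child(t1,      node_new(NODE_GEOM, "Dog"));
        node_add_child(lampost, t2);
        node_add_child(t2,      node_new(NODE_GEOM, "Lampost"));

        render(root);
    }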
Applications
VR programming is used in many applications, e.g.
• Entertainment (computer games, 'movie' special effects, ...)
• Human computer interaction (GUI, ...)
• Science, education, medicine (visualisation, ...)
• Business (marketing, ...)
• Art
Summary
• A simulation consists of a series of scenes
• Objects are defined as scenes in a scene graph, where a scene may be one object or a related collection of objects
• Each iteration of the simulation loop determines actions, translations (along paths) and other inputs which affect properties of the objects
• NOT animation!!!
• The world is redrawn using the rendering process