Advanced Game Design
Prof. Roger Crawfis
Computer Science & Engineering
The Ohio State University
Course Overview
• Project-based / Team-based
• Little lecturing
• Focus on programming for games
• Systems integration – graphics, sound, AI, networking, user-interfaces, physics, scripting
• Utilize higher-level toolkits, allowing for more advanced progress while still developing programming skills.
Course Structure
• I will lecture for about the first week and a half.
• Student game project groups will provide several presentations on their game ideas and progress.
• Student technology teams will provide an intermediate and an advanced lecture on their findings and analysis about their area.
Project Goals
• Large-scale software development
• Team-based (synergistic development)
• Toolkit-based (fast-start development)
• Learn and utilize the many non-graphical elements needed for games.
• Leverage and extend your software engineering, graphics, AI, …, expertise.
Elements
• Gaming Engine
  • Responsible for providing primitives
  • Hardware abstraction
  • Handles different areas of the game
    • Physics, AI, etc.
• Game
  • Defined by a genre
  • Defines the gameplay
Requirements of a gaming engine
• Stunning Visuals
• Immersive sound stage
• Varied input/output devices
• Scalability
• Simulation
• Animation
• Networking
Requirements of a game
• Scripting
• Artificial Intelligence
• Supporting Tools
  • Optimizing game content
  • Developing game content
  • Extending game content
  • Debugging / Tuning of game performance
Stunning Visuals
• Adding realism
  • Smarter Models
  • Use hardware
• Bump-mapping
• Dynamic water or other liquids
• Rich textures (Billboards, gloss-maps, light-maps, etc.)
• Shadows
• Particle systems
Immersive sound stage
• Multi-track sound support
• Positional sound effects (3D immersion)
• Dynamic sounds / movement (Doppler effects)
Input devices
• Commonly available devices are
  • Keyboard, mouse, gamepads and joysticks
• Force feedback (haptic) devices are gaining popularity
  • Steering wheels
  • Joysticks
  • Motion tracking
• Output devices
  • Multiple monitors
  • Head mounted displays
Scalability
• Multiple hardware capabilities
• Multi-resolution models
• Multi-user support
• LOD (level of detail)
  • Multiple model definitions
  • Multi-res models
  • Subdivision surfaces
Simulation
• Virtual worlds are good to look at, but immersion breaks when real-world physics is not applied
• So we need
  • Collision detection
  • Collision response
Animation
• Linear transformations
• Modeled animations
• Articulated motion
• Lip syncing
• Facial Expressions
• Blending animations
Networking
• Multi-player support essential
• Common problems
  • Latency
  • Synchronization
  • Scalability
  • Consistent game state
  • Security
Scripting
• Hard-coding all game logic in compiled code is tedious
• Support for scripting is essential for RAD (rapid application development)
• Scripting has added a whole new fun factor for many games (see the sketch below).
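As a taste of why scripting speeds development, here is a minimal sketch of embedding Lua from C++; the engine and bindings used in actual projects will differ, and the damage formula is invented purely for illustration:

    #include <lua.hpp>

    int main() {
        lua_State* L = luaL_newstate();   // create a Lua interpreter
        luaL_openlibs(L);                 // expose Lua's standard libraries
        // Tuning this logic needs no engine recompile - just edit the script:
        luaL_dostring(L, "damage = 10 * 1.5 print('damage is', damage)");
        lua_close(L);
        return 0;
    }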
Artificial Intelligence
• Games need specialized AI
  • Strategy
  • Path finding
  • Modeling behavior
  • Learning
• Non-perfect!
• Fast!
Tools
• Creating varied content
  • Models, video, images, sound
• Integrating content
  • Common file format support
• Supporting existing popular tools via plug-ins
  • 3DS Max, Lightwave, Maya, etc.
  • Adobe Premiere, Adobe Photoshop
Interactive Programs
• Games are interactive systems - they must respond to the user
• How?
Interactive Program Structure
• Event driven programming
  • Everything happens in response to an event
• Events come from two sources:
  • The user
  • The system
• Events are also called messages
  • An event causes a message to be sent…
[Diagram: Initialize → User Does Something or Timer Goes Off → System Updates → back to waiting]
User Events
• The OS manages user input
  • Interrupts at the hardware level …
  • Get converted into events in queues at the windowing level …
  • Are made available to your program
• It is generally up to the application to make use of the event stream
• Windowing systems may abstract the events for you
System Events
• Windowing systems provide timer events
  • The application requests an event at a future time
  • The system will provide an event sometime around the requested time. Semantics vary (a sketch of requesting one follows):
    • Guaranteed to come before the requested time
    • As soon as possible after
    • Almost never right on (real-time OS?)
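A sketch of requesting timer events, assuming SDL as the windowing layer (SDL is not part of this course; its semantics match the "as soon as possible after" case):

    #include <SDL.h>

    // The callback runs on SDL's timer thread; pushing an event hands the work
    // back to the main loop. Returning the interval re-arms the timer.
    static Uint32 tick(Uint32 interval, void* /*param*/) {
        SDL_Event e{};
        e.type = SDL_USEREVENT;
        SDL_PushEvent(&e);
        return interval;   // fires "sometime around" interval ms later, never exactly
    }

    int main(int, char**) {
        SDL_Init(SDL_INIT_TIMER | SDL_INIT_EVENTS);
        SDL_AddTimer(16, tick, nullptr);   // request an event roughly every 16 ms
        // ... an event loop would consume the SDL_USEREVENTs here ...
        SDL_Quit();
        return 0;
    }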
Polling for Events

    while ( true )
        if ( e = checkEvent() )
            switch ( e.type )
                …
        do more work

• Most windowing systems provide a non-blocking event function (a concrete sketch follows)
  • Does not wait for an event, just returns NULL if one is not ready
• What type of games might use this structure?
• Why wouldn’t you always use it?
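The same structure as a hedged, concrete sketch, again assuming SDL; updateSimulation and renderFrame are hypothetical stand-ins for the game's work:

    #include <SDL.h>

    // Hypothetical stand-ins for the rest of the loop body.
    static void updateSimulation() {}
    static void renderFrame() {}

    int main(int, char**) {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window* win = SDL_CreateWindow("poll demo", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
        bool running = true;
        while (running) {
            SDL_Event e;
            // Non-blocking: returns 1 while events are pending, 0 once the queue is empty.
            while (SDL_PollEvent(&e)) {
                if (e.type == SDL_QUIT)
                    running = false;
                // ... interpret other event types here ...
            }
            updateSimulation();   // "do more work": runs every iteration, event or not
            renderFrame();
        }
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }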
Waiting for Events

    e = nextEvent();
    switch ( e.type )
        …

• Most windowing systems provide a blocking event function (sketch below)
  • Waits (blocks) until an event is available
• Usually used with timer events. Why?
• On what systems is this better than the previous method?
• What types of games is it useful for?
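A blocking sketch, replacing the inner event pump of the polling example above (SDL again, purely illustrative); the loop sleeps inside SDL_WaitEvent, which is why timer events are needed to keep animation going:

    SDL_Event e;
    // Blocks until an event is available; the process uses no CPU while idle.
    while (running && SDL_WaitEvent(&e)) {
        switch (e.type) {
            case SDL_USEREVENT: /* timer fired: step animation, redraw */ break;
            case SDL_QUIT:      running = false;                          break;
            default:            break;  // interpret input events here
        }
    }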
The Callback Abstraction
• A common event abstraction is the callback mechanism
  • Applications register functions they wish to have called in response to particular events
  • A translation table says which callbacks go with which events
• Generally found in GUI (graphical user interface) toolkits
  • “When the button is pressed, invoke the callback”
• Many systems mix methods, or have a catch-all callback for unclaimed events
• Why are callbacks good? Why are they bad? (A sketch follows.)
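A minimal sketch of the callback abstraction in plain C++; Dispatcher, Event, and the event names are invented for illustration, not any real toolkit's API:

    #include <functional>
    #include <iostream>
    #include <string>
    #include <unordered_map>

    struct Event { std::string type; int x = 0, y = 0; };

    class Dispatcher {
        // Translation table: which callback goes with which event type.
        std::unordered_map<std::string, std::function<void(const Event&)>> table_;
        std::function<void(const Event&)> catchAll_;   // for unclaimed events
    public:
        void registerCallback(const std::string& type,
                              std::function<void(const Event&)> cb) {
            table_[type] = std::move(cb);
        }
        void setCatchAll(std::function<void(const Event&)> cb) {
            catchAll_ = std::move(cb);
        }
        void dispatch(const Event& e) {
            auto it = table_.find(e.type);
            if (it != table_.end()) it->second(e);   // registered handler
            else if (catchAll_)     catchAll_(e);    // nobody claimed this event
        }
    };

    int main() {
        Dispatcher d;
        d.registerCallback("buttonPressed",
            [](const Event&) { std::cout << "button callback invoked\n"; });
        d.setCatchAll(
            [](const Event& e) { std::cout << "unhandled: " << e.type << "\n"; });
        d.dispatch({"buttonPressed"});
        d.dispatch({"mouseMoved", 10, 20});
        return 0;
    }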
Upon Receiving an Event …
• Event responses fall into two classes:
  • Task events: the event sparks a specific task or results in some change of state within the current mode
    • e.g. Load, Save, Pick up a weapon, turn on the lights, …
    • Call a function to do the job
  • Mode switches: the event causes the game to shift to some other mode of operation
    • e.g. Start game, quit, go to menu, …
    • Switch event loops, because events now have different meanings
• Software structure reflects this - the menu system is separate from the run-time game system, for example (see the sketch below)
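A hedged sketch of the mode-switch structure (all names hypothetical): each mode runs its own event loop and returns the next mode, which is what keeps the menu system separate from the run-time game system:

    enum class Mode { Menu, Playing, Quit };

    // Each mode owns its own event loop and its own interpretation of events.
    Mode runMenu() { /* menu event loop; "start game" was chosen */ return Mode::Playing; }
    Mode runGame() { /* in-game event loop; the player quit */      return Mode::Quit;    }

    int main() {
        Mode mode = Mode::Menu;
        while (mode != Mode::Quit) {
            switch (mode) {
                case Mode::Menu:    mode = runMenu(); break;  // menu loop
                case Mode::Playing: mode = runGame(); break;  // game loop
                default:            break;
            }
        }
        return 0;
    }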
Real-Time Loop
• At the core of interactive games is a real-time loop (sketched concretely below):

    while ( true )
        process events
        update animation / scene
        render

• What else might you need to do?
• The number of times this loop executes per second is the frame rate
  • # frames per second (fps)
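A minimal version of that loop in plain C++ with a frame-rate counter; processEvents, updateScene, and render are hypothetical stand-ins:

    #include <chrono>
    #include <cstdio>

    static void processEvents() {}
    static void updateScene(double /*dt*/) {}
    static void render() {}

    int main() {
        using clock = std::chrono::steady_clock;
        auto prev = clock::now();
        auto secondStart = prev;
        int frames = 0;
        for (;;) {
            auto now = clock::now();
            double dt = std::chrono::duration<double>(now - prev).count();
            prev = now;
            processEvents();
            updateScene(dt);   // advance animation by elapsed time, not per frame
            render();
            ++frames;
            if (now - secondStart >= std::chrono::seconds(1)) {
                std::printf("%d fps\n", frames);   // loop executions per second
                frames = 0;
                secondStart = now;
            }
        }
    }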
Lag
• Lag is the time between when a user does something and when they see the result - also called latency
• Too much lag and causality is distorted
• With tight visual/motion coupling, too much lag makes people motion sick
  • Big problem with head-mounted displays for virtual reality
• Too much lag makes it hard to target objects (and track them, and do all sorts of other perceptual tasks)
• High variance in lag also makes interaction difficult
  • Users can adjust to constant lag, but not variable lag
• From a psychological perspective, lag is the important variable
Computing Lag

[Timing diagram: frames repeat as process input → update state → render; an event arrives partway through a frame, and the lag runs from the event time to the end of the frame that renders its result]

• Lag is NOT the time it takes to compute 1 frame!
• What is the formula for maximum lag as a function of frame rate, fr? (A sketch of the reasoning follows.)
• What is the formula for average lag?
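One common way to reason about it (a sketch, assuming events arrive uniformly at random within a frame of length 1/fr, are read at the next process-input step, and appear on screen when that frame finishes rendering): an event that just misses the input step waits almost a full frame to be read, then a full frame to reach the screen, while on average the wait to be read is half a frame. That gives

    lag_max ≈ 2/fr        lag_avg ≈ 1/(2·fr) + 1/fr = 3/(2·fr)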
Frame Rate Questions
• What is an acceptable frame rate for twitch games? Why?
• What is the maximum useful frame rate? Why?
• What is the frame rate for NTSC television?
• What is the minimum frame rate required for a sense of presence? How do we know?
• How can we manipulate the frame rate?
Frame Rate Answers (I)
• Twitch games demand at least 30 fps, but the higher the better (lower lag)
  • Users see enemy’s motions sooner
  • Higher frame rates make targeting easier
• The maximum useful frame rate is the monitor refresh rate
  • Time taken for the monitor to draw one screen
• Synchronization issues
  • Buffer swap in graphics is timed with the vertical sweep, so the ideal frame rate is the monitor refresh rate
  • Can turn off synchronization, but you get nasty artifacts on screen
Frame Rate Answers (II)
• NTSC television draws all the odd lines of the screen, then all the even ones (interlace format)
  • A full screen takes 1/30th of a second
  • Use 60fps to improve visuals, but only half of each frame actually gets drawn by the screen
  • Do consoles only render 1/2 screen each time?
• It was once argued that 10fps was required for a sense of presence (being there)
  • Head mounted displays require 20fps or higher to avoid illness
• Many factors influence the sense of presence
• Perceptual studies indicate what frame rates are acceptable
Reducing Lag
• Faster algorithms and hardware are the obvious answer
• Designers choose a frame rate and put as much into the game as they can without going below the threshold
  • Part of the design documents presented to the publisher
  • The threshold assumes the fastest hardware and all game features turned on
  • Options are given to players to reduce game features and improve their frame rate
• There is a resource budget: how much of the loop is dedicated to each aspect of the game (graphics, AI, sound, …)
• Some other techniques allow for more features and less lag
Decoupling Computation
• It is most important to minimize lag between the user actions and their direct consequences
  • So the input/rendering loop must have low latency
• Lag between actions and other consequences may be less severe
  • Time between input and the reaction of an enemy can be greater
  • Time to switch animations can be greater
• Technique: update different parts of the game at different rates, which requires decoupling them (sketched below)
  • For example, run graphics at 60fps, AI at 10fps
  • Done in the Unreal engine, for instance
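A sketch of one way to decouple update rates with a fixed-timestep accumulator; the function names are hypothetical, rendering runs every pass, and AI runs only when its 100 ms budget has elapsed:

    #include <chrono>

    static void processEvents() {}
    static void updateAI() {}
    static void renderFrame() {}

    int main() {
        using clock = std::chrono::steady_clock;
        const double aiStep = 1.0 / 10.0;   // AI at 10 Hz
        double aiAccumulator = 0.0;
        auto prev = clock::now();
        for (;;) {
            auto now = clock::now();
            aiAccumulator += std::chrono::duration<double>(now - prev).count();
            prev = now;
            processEvents();                  // every pass: low input latency
            while (aiAccumulator >= aiStep) { // AI only when its timestep elapses
                updateAI();
                aiAccumulator -= aiStep;
            }
            renderFrame();                    // graphics as fast as the loop allows
        }
    }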
Animation and Sound
• Animation and sound decisions need not change at high frequency, but their state must be updated at high frequency
  • For example, switching from walk to run can happen at low frequency, but the joint angles for walking must be updated every frame
• One solution is to package multiple frames of animation and submit them all at once to the renderer
  • A good idea anyway: it makes animation independent of frame rate
• Sound is offloaded to the sound card
Overview of Ogre3D
• Not a full-blown game engine
• Open-source
• Strong user community
• Decent software engineering
• Cool logo
Features
• Graphics API independent 3D implementation
• Platform independence
• Material & Shader support
  • Well known texture formats: png, jpeg, tga, bmp, dds, dxt
  • Mesh support: Milkshape3D, 3D Studio Max, Maya, Blender
• Scene features
  • BSP, Octree plugins, hierarchical scene graph
• Special effects
  • Particle systems, skyboxes, billboarding, HUD, cube mapping, bump mapping, post-processing effects
• Easy integration with physics libraries
  • ODE, Tokamak, Newton, OPCODE
• Open source!
Startup Sequence
• ExampleApplication
  • Go()
  • Setup()
    • Configure()
    • setupResources()
    • chooseSceneManager()
    • createCamera()
    • createViewport()
    • createResourceListener()
    • loadResources()
    • createScene()
  • frameStarted/Ended()
  • createFrameListener()
  • destroyScene()
Basic Scene
• Entity, SceneNode
• Camera, lights, shadows
• BSP map
• Integrated ODE physics
• Frame listeners
CEGUI
• Window, panel, scrollbar, listbox, button, static text
• Media/gui/ogregui.layout

    CEGUI::Window* sheet = CEGUI::WindowManager::getSingleton().loadWindowLayout(
        (CEGUI::utf8*)"ogregui.layout");
    mGUISystem->setGUISheet(sheet);
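Widgets from the loaded sheet are then typically wired to handlers through CEGUI's event subscription; a sketch against this era of the CEGUI API, where the window name "QuitButton" and the MyApp handler are assumptions:

    // Inside a member function of a hypothetical MyApp class:
    CEGUI::Window* quitBtn =
        CEGUI::WindowManager::getSingleton().getWindow("QuitButton"); // name from the .layout
    quitBtn->subscribeEvent(
        CEGUI::PushButton::EventClicked,                              // fire on mouse click
        CEGUI::Event::Subscriber(&MyApp::onQuitClicked, this));
    // Handler signature: bool MyApp::onQuitClicked(const CEGUI::EventArgs&);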
Animation
• Node animation (camera, light sources)
• Skeletal Animation

    AnimationState *mAnimationState;
    mAnimationState = ent->getAnimationState("Idle");
    mAnimationState->setLoop(true);
    mAnimationState->setEnabled(true);

    mAnimationState->addTime(evt.timeSinceLastFrame);
    mNode->rotate(quat);
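The addTime call normally lives in a FrameListener so the animation advances by real elapsed time each frame; a sketch under that assumption (listener name invented):

    // Assumes the Ogre headers are already included, as in the course framework.
    class IdleAnimListener : public Ogre::FrameListener {
        Ogre::AnimationState* mAnimState;   // the "Idle" state fetched above
    public:
        explicit IdleAnimListener(Ogre::AnimationState* s) : mAnimState(s) {}
        bool frameStarted(const Ogre::FrameEvent& evt) override {
            // Advance by real elapsed time so playback is frame-rate independent.
            mAnimState->addTime(evt.timeSinceLastFrame);
            return true;   // returning false stops the render loop
        }
    };
    // Registered via: mRoot->addFrameListener(new IdleAnimListener(mAnimationState));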
Animation
• Crowd (instancing vs single entity)

    InstancedGeometry* batch =
        new InstancedGeometry(mCamera->getSceneManager(), "robots");
    batch->addEntity(ent, Vector3::ZERO);
    batch->build();

• Facial animation

    VertexPoseKeyFrame* manualKeyFrame;
    manualKeyFrame->addPoseReference();
    manualKeyFrame->updatePoseReference(ushort poseIndex, Real influence);
Picking

    CEGUI::Point mousePos = CEGUI::MouseCursor::getSingleton().getPosition();
    Ray mouseRay = mCamera->getCameraToViewportRay(
        mousePos.d_x / float(arg.state.width),
        mousePos.d_y / float(arg.state.height));
    mRaySceneQuery->setRay(mouseRay);
    mRaySceneQuery->setSortByDistance(false);
    RaySceneQueryResult &result = mRaySceneQuery->execute();
    RaySceneQueryResult::iterator mouseRayItr;
    Vector3 nodePos;
    for (mouseRayItr = result.begin(); mouseRayItr != result.end(); mouseRayItr++)
    {
        if (mouseRayItr->worldFragment)
        {
            nodePos = mouseRayItr->worldFragment->singleIntersection;
            break;
        } // if
    }
Particle Effects

    mSceneMgr->getRootSceneNode()->createChildSceneNode()->attachObject(
        mSceneMgr->createParticleSystem("sil", "Examples/sil"));

    Examples/sil
    {
        material Examples/Flare2
        particle_width 75
        particle_height 100
        cull_each false
        quota 1000
        billboard_type oriented_self

        // Area emitter
        emitter Point
        {
            angle 30
            emission_rate 75
            time_to_live 2
            direction 0 1 0
            velocity_min 250
            velocity_max 300
            colour_range_start 0 0 0
            colour_range_end 1 1 1
        }

        // Gravity
        affector LinearForce
        {
            force_vector 0 -100 0
            force_application add
        }

        // Fader
        affector ColourFader
        {
            red -0.5
            green -0.5
            blue -0.5
        }
    }
Fire and Smoke

    affector Rotator
    {
        rotation_range_start 0
        rotation_range_end 360
        rotation_speed_range_start -60
        rotation_speed_range_end 200
    }
Cel Shading

    vertex_program Ogre/CelShadingVP cg
    {
        source Example_CelShading.cg
        entry_point main_vp
        …
        default_params
        {
            …
        }
    }

    material Examples/CelShading
    {
        …
        vertex_program_ref Ogre/CelShadingVP {}
        fragment_program_ref Ogre/CelShadingFP {}
    }
Cube Mapping
• With Perlin noise to distort vertices

    void morningcubemap_fp(
        float3 uv : TEXCOORD0,
        out float4 colour : COLOR,
        uniform samplerCUBE tex : register(s0))
    {
        colour = texCUBE(tex, uv);
        // blow out the light a bit
        colour *= 1.7;
    }