Artificial Interactions And Collisions In A Virtual Environment Bringing it, virtually.
Outline • Introduction • Purpose and Motivation • Background • Related work • Project Details • Design Decisions • Demonstration • Implementation Details • Conclusion • Limitations • Future work • Q + A
Purpose And Motivation • Provide a framework for further development of the Lumiere Ghosting project • Easily modifiable • Artistically adaptable • Allow for interactions between AI and human users as an interface in the CompuObscura
Background On the Lumiere Ghosting Project • Project started in 2002 by Prof. Gillette of the English department; currently maintained and led by Prof. Gillette and Prof. Fowler of the architecture department • Designed to serve as curriculum for interactive media and information design courses • New Media I: Narratives & Semiotics • New Media II: Technologies & Construction • New Media Projects: Synthesis and Performance • The project encourages students to apply what they learn and engage in the development of the CompuObscura, “an interactive new media theatre”
Related Work • Generic bots found in WoW, Counter-Strike, etc. • AI exists to oppose the user • Second Life: human interaction in a virtual environment • No motion tracking, i.e., the system reacts to user input rather than user movement • No generic bot behavior system • Search assistants in text editors • Rudimentary algorithms suggest words as they’re typed; our project aims at longer-term searching and goal achievement • Brainworks AI • A complete rewrite of the Quake 3 AI
Project Requirements • The environment should support multiple avatars. • The user should be able to control the avatar’s movements. • The user should be able to see or detect other avatars in the vicinity of their avatar. • The system will implement a set of modularized behaviors and emotions that will be associated with the avatars. • The system will handle the collision-based transfer of avatar emotions and behaviors. • The artificially intelligent behaviors of the avatar will be affected by the modularized behaviors and emotions being exchanged. • The user must be able to determine the changes that happened to the avatar after a collision. • After a collision, the avatar should be able to continue to move freely in the environment. • If multiple avatars collide, all will be affected. • The avatars will be designed with the capability of performing simple missions autonomously.
Design Decisions • Goal: System capable of simulating first-person interactions • Bots should have “personalities” • Bots should be able to exchange/mix-and-match these “personalities” • Previous implementations allowed for addition of body parts • Mixing through collision
Design Decisions • A system to base our work on • Previous systems used proprietary software • Open source? • A game engine was the obvious choice • Pre-existing code for bots • Physics • User-controlled characters • Commonly available
Design Decisions • Game Engines • Source • ioquake3 • We chose ioquake3 • Free • Open source • Runs on almost any modern system • Built-in inventory system • Had all the features we needed, minus custom AI • Which is what we wanted to do anyway :)
Background On ioquake3 • Server maintains state internally • Clients send it updates on what the user does (picks up items, etc.) • Server determines how clients interact • Server controls bots • Server can be modified for our own ends • Clients can remain the same • Easy upgrades to the system
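For context, a minimal sketch of the split this slide describes: all bot decisions run inside the server's per-frame game logic, so stock clients keep working unchanged. The names here (ServerGameFrame, UpdateBot, bot_slot_t) are illustrative, not the actual ioquake3 identifiers.

```c
#define MAX_BOT_CLIENTS 16

typedef struct {
    int client_num;   /* server client slot occupied by this bot */
    int active;
} bot_slot_t;

static bot_slot_t bot_slots[MAX_BOT_CLIENTS];

/* Decide and issue one bot's movement/actions for this frame. */
static void UpdateBot(bot_slot_t *bot, int server_time_ms)
{
    (void)bot;
    (void)server_time_ms;
    /* read world state, pick a goal, emit commands here */
}

/* Called once per server frame; clients only send user input and
 * render the snapshots the server sends back, so they need no changes. */
void ServerGameFrame(int server_time_ms)
{
    for (int i = 0; i < MAX_BOT_CLIENTS; i++) {
        if (bot_slots[i].active) {
            UpdateBot(&bot_slots[i], server_time_ms);
        }
    }
}
```

Keeping everything on the server is what makes "easy upgrades" possible: behavior changes ship with the server mod, and any unmodified client can connect.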
Implementation Details • Bots have innate and acquired behavior • ioquake3 has bot characteristics • aggression, scaredness, ... • We modify these to happiness, sadness, excitedness • ioquake3 has inventory control • rocket launcher, BFG10K • We have shown it is possible to integrate inventory with behaviors • The combination gives us innate and acquired behavior • When bots collide, they trade ‘weapons’
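A minimal sketch of the innate-plus-acquired idea, assuming hypothetical names (bot_personality_t, TradeOnCollision) and an assumed blend rule for the innate traits; the project itself rides on ioquake3's existing characteristic and inventory structures rather than a standalone struct like this.

```c
#include <stddef.h>

#define MAX_TRAITS 3   /* happiness, sadness, excitedness */

typedef struct {
    float traits[MAX_TRAITS];   /* innate characteristics (0.0 .. 1.0) */
    int   items[MAX_TRAITS];    /* acquired behaviors, carried like
                                   inventory items: 1 = held            */
} bot_personality_t;

/* On collision, the bots swap their acquired, inventory-style behaviors
 * and (as an assumed rule) blend their innate traits toward each other. */
static void TradeOnCollision(bot_personality_t *a, bot_personality_t *b)
{
    for (size_t i = 0; i < MAX_TRAITS; i++) {
        int tmp = a->items[i];          /* trade 'weapons' */
        a->items[i] = b->items[i];
        b->items[i] = tmp;

        float mid = 0.5f * (a->traits[i] + b->traits[i]);
        a->traits[i] = mid;             /* blend innate traits */
        b->traits[i] = mid;
    }
}
```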
Implementation Details • The server keeps a “bot state” for each bot • The bot state contains an ‘AINode’ • Points to the appropriate AI function for each state • AINode_Afraid • AINode_Violent • … • Every 100 milliseconds, the AINode function is called • Every 2 seconds, the AINode variable is updated • Each bot begins in a default state based on its characteristic values
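A sketch of that AINode dispatch in plain C. The names (AINode_Afraid, AINode_Violent, BotThink) and the field layout are hypothetical; ioquake3's bot code keeps a comparable per-bot function pointer, but the wiring below is illustrative only.

```c
typedef struct bot_state_s bot_state_t;
typedef void (*ainode_fn)(bot_state_t *bs);

struct bot_state_s {
    ainode_fn ainode;        /* current behavior state                  */
    int last_think_ms;       /* last time the node function ran         */
    int last_reeval_ms;      /* last time the node itself was re-chosen */
};

static void AINode_Afraid(bot_state_t *bs)  { (void)bs; /* flee/hide logic */ }
static void AINode_Violent(bot_state_t *bs) { (void)bs; /* pursue logic    */ }

/* Called every server frame: run the current node every 100 ms and
 * re-select which node is current every 2 s. */
static void BotThink(bot_state_t *bs, int now_ms)
{
    if (now_ms - bs->last_reeval_ms >= 2000) {
        bs->last_reeval_ms = now_ms;
        /* the real choice weighs characteristic values (next slide);
         * here we simply alternate for illustration */
        bs->ainode = (bs->ainode == AINode_Violent) ? AINode_Afraid
                                                    : AINode_Violent;
    }
    if (now_ms - bs->last_think_ms >= 100) {
        bs->last_think_ms = now_ms;
        bs->ainode(bs);
    }
}
```

The initial value of the ainode pointer corresponds to the default state the slide mentions, chosen from the bot's characteristic values when it spawns.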
Implementation Details • Collisions transfer characteristics between the bots • The next state is then calculated by comparing the values of each characteristic • Example: • Violent = 0.8, Passive = 0.2 • There would be about an 80% chance that the next AINode state would be set to AINode_Violent.
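A sketch of that weighted choice, using the hypothetical characteristic names from the example; the normalization and rand()-based draw are assumptions, not the project's exact formula.

```c
#include <stdlib.h>

typedef enum { STATE_VIOLENT, STATE_PASSIVE } next_state_t;

/* With violent = 0.8 and passive = 0.2, this returns STATE_VIOLENT
 * roughly 80% of the time (call srand() once at startup). */
static next_state_t PickNextState(float violent, float passive)
{
    float r = (float)rand() / (float)RAND_MAX * (violent + passive);
    return (r < violent) ? STATE_VIOLENT : STATE_PASSIVE;
}
```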
Limitations • Almost none • The open-source project allows for arbitrary modifications • Some problems may be non-trivial • Lack of documentation • We started from a blank slate, reading the code • Lack of time • Building a complete product in the time we had • Two behaviors remain unimplemented
Future Work • We developed the framework • Future teams can put the pieces together • Need a graphics team to “renovate” the environment • Give human interaction more meaning • Integration with the motion-tracking component of the CompuObscura system • Player control interaction • Full character motion in-game • Integration with the inventory control system • The functional prototype shows the viability of this system
Questions and Answers Authors: Daniel Nelson, Daniel Medina, Israel Urquiza, Alexander Sideropoulos CPE480 – Fall 2008 Computer Science Department, Cal Poly State University, San Luis Obispo