Mastergoal Machine Learning Environment
Phase III Presentation
Alejandro Alliana
CIS895 MSE Project – KSU
MMLE Project Overview
• Provides an environment to create, repeat, and save experiments for building strategies for playing Mastergoal using ML techniques.
• Divided into 4 sub-projects:
• Mastergoal (MG)
• Mastergoal AI (MGAI)
• Mastergoal Machine Learning (MMLE)
• User Interface (UI)
Phase III Artifacts
• User Documentation
• Component Design
• Source Code (and executable)
• Assessment Evaluation
• Project Evaluation
• References
User Documentation
• MMLE UI user manual provided:
• Installation
• Common use cases
• Data field types and formats
• Examples
• MG, MGAI, and MMLE libraries:
• API reference generated by Doxygen (a sample comment block appears below).
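To illustrate the Doxygen comment style used for the library APIs, here is a hypothetical excerpt; the class and member names are examples, not the project's actual API:

// Hypothetical MG library header, documented for Doxygen.
struct Move { int from, to; };  // a move encoded as from/to squares

/// \brief A Mastergoal board position.
class Board {
public:
    /// \brief Applies a move to the current position.
    /// \param move The move to apply.
    /// \return true if the move was legal and was applied.
    bool applyMove(const Move& move);

    /// \brief Tests whether the game has ended (e.g., a goal was scored).
    /// \return true if the position is terminal.
    bool isGameOver() const;
};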
Component Design
• Project divided into 4 main sub-projects.
• UI uses MMLE, MGAI, and MG as libraries.
Component Design – Design Patterns Used
• Factory Method: Board, Search Algorithm.
• Prototype: Strategy.
• Singleton: all factories, Terms, Fitness Functions, Selection Criteria, Termination Criteria.
• Template Method: Terms, Strategy.
• Strategy: Search Algorithms – Agents, Strategy – Search Algorithms.
• Observer: GameSubject – GameListener, TrainSubject – TrainListener.
• Proxy: TrainBridge, UIAgentProxy.
A sketch of how two of these patterns combine appears below.
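This is a minimal sketch under hypothetical class names and signatures (the deck names the patterns, not the exact interfaces): a Factory Method wrapped in a Singleton builds SearchAlgorithm objects, and an Agent holds one as its interchangeable Strategy.

#include <memory>
#include <stdexcept>
#include <string>
#include <utility>

struct GameState {};  // opaque game position (placeholder)

// Strategy interface: agents delegate move selection to a search algorithm.
class SearchAlgorithm {
public:
    virtual ~SearchAlgorithm() {}
    virtual int bestMove(const GameState& s) = 0;
};

class AlphaBeta : public SearchAlgorithm {
public:
    int bestMove(const GameState&) override { return 0; }  // stub search
};

// Factory Method wrapped in a Singleton, matching the deck's note
// that all factories are singletons.
class SearchAlgorithmFactory {
public:
    static SearchAlgorithmFactory& instance() {
        static SearchAlgorithmFactory f;  // one shared instance
        return f;
    }
    std::unique_ptr<SearchAlgorithm> create(const std::string& name) {
        if (name == "alphabeta")
            return std::unique_ptr<SearchAlgorithm>(new AlphaBeta);
        throw std::invalid_argument("unknown algorithm: " + name);
    }
private:
    SearchAlgorithmFactory() {}
};

// Strategy pattern: the Agent's search behavior is swappable at runtime.
class Agent {
public:
    explicit Agent(std::unique_ptr<SearchAlgorithm> algo) : algo_(std::move(algo)) {}
    int chooseMove(const GameState& s) { return algo_->bestMove(s); }
private:
    std::unique_ptr<SearchAlgorithm> algo_;
};

int main() {
    Agent agent(SearchAlgorithmFactory::instance().create("alphabeta"));
    GameState s;
    return agent.chooseMove(s);
}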
Component Design
• Deployment diagrams show packages and files.
Component Design
• Deployment diagrams
• Class diagrams
• Sequence diagrams
• Object diagrams
• Short description of classes and a link to the online API documentation
Source Code
• Kept in an SVN repository (7 projects):
• 4 sub-projects
• 3 test sub-projects
• Metrics were taken weekly and will be discussed later in the presentation.
Installer and Executable
• Installer created with NSIS (Nullsoft Scriptable Install System).
• UI created with the Windows Forms GUI API available in the .NET Framework.
• All other sub-projects are coded in (unmanaged) C++ and are available as libraries.
Assessment Evaluation
• I used the CPPUnit framework to perform unit testing on the projects:
• MastergoalTest
• MastergoalAiTest
• Mmle-test
• Assertions were used to test pre- and post-conditions (a sample fixture appears below).
• I used the Visual Leak Detector system to detect memory leaks.
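A minimal sketch of a CPPUnit fixture in the style described above; BoardTest and the Board stub are hypothetical stand-ins for the real classes in the test projects:

#include <cppunit/extensions/HelperMacros.h>
#include <cppunit/extensions/TestFactoryRegistry.h>
#include <cppunit/ui/text/TestRunner.h>

// Minimal stand-in for the real MG Board class.
class Board {
public:
    Board() : pieces_(10) {}
    int pieceCount() const { return pieces_; }
    bool isConsistent() const { return pieces_ >= 0; }
private:
    int pieces_;
};

class BoardTest : public CppUnit::TestFixture {
    CPPUNIT_TEST_SUITE(BoardTest);
    CPPUNIT_TEST(testInitialSetup);
    CPPUNIT_TEST_SUITE_END();
public:
    void setUp() { board_ = new Board(); }
    void tearDown() { delete board_; }

    void testInitialSetup() {
        // Assertions check post-conditions of construction.
        CPPUNIT_ASSERT_EQUAL(10, board_->pieceCount());
        CPPUNIT_ASSERT(board_->isConsistent());
    }
private:
    Board* board_;
};
CPPUNIT_TEST_SUITE_REGISTRATION(BoardTest);

int main() {
    CppUnit::TextUi::TestRunner runner;
    runner.addTest(CppUnit::TestFactoryRegistry::getRegistry().makeTest());
    return runner.run() ? 0 : 1;  // non-zero exit code on failure
}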
Assessment Evaluation
• Test Plan
• All tests passed*
• CPPUnit
• Regression bugs
• Coding of test cases
• Documenting and debugging test cases
• Memory leak bugs (see the VLD illustration below)
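Visual Leak Detector needs no harness of its own: in a debug build, including its header in any one translation unit instruments the whole process. A minimal illustration with a deliberate leak:

// Debug builds only: vld.h hooks the CRT heap for the entire process.
#include <vld.h>

int main() {
    int* leaked = new int[16];  // deliberately never freed
    (void)leaked;
    return 0;  // on exit, VLD reports the leak with a full call stack
               // in the Visual Studio debugger output window
}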
Size of the Test Projects
• Overall, the three test projects total 1,125 lines of code.
Project Evaluation
• Metrics
• 533 hours (13.3 weeks, or 3 months, over a period of 10 months) and 11 KLOC.
• Estimations
• FP: time 10.79 months, 2.79 KLOC.
• COCOMO: time 9.24 months, 7.5 KLOC.
• COCOMO II: time 9.54 months, 7.5 KLOC.
Project Evaluation – FP
• Actual
• 11 KLOC and 3 months.
• Function Point estimates
• Size 2.79 KLOC, time 10.79 months.
• Causes of the gap (see the conversion formula below):
• Lack of experience using FP.
• Some of the user interfaces were more complex than previously thought.
• No .NET conversion rates.
• A big part of the project is the user interface, which contains automatically generated code.
• Algorithms are not well represented by FP.
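For context, a function-point count is backfired into code size with a per-language conversion factor, which is where the missing .NET rates hurt. The factor below is a commonly cited ballpark for C++, not a figure from this project:

size (LOC) ≈ FP × C_lang, with C_lang ≈ 53 LOC per function point for C++

With no published factor for the largely auto-generated .NET UI code, the backfired 2.79 KLOC estimate predictably undershot the 11 KLOC actual.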
Project Evaluation – COCOMO
• Actual
• 11 KLOC and 3 months.
• COCOMO estimates
• Size input was arbitrary, based on experience.
• Size 7.5 KLOC, time 9.25 months.
• Inexperience in C++/.NET.
• Conversion rates of the languages.
• COCOMO II estimates (basic equations sketched below)
• Application Composition model: 5.57 person-months (object points / productivity).
• Post-Architecture model: 9.54 months (7.5 KLOC).
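For reference, the basic COCOMO effort and schedule equations for a semi-detached project are shown below; the deck's schedule figures presumably also fold in intermediate-model cost drivers, which are not listed here:

E = 3.0 × (KLOC)^1.12 person-months;  T = 2.5 × E^0.35 months
E = 3.0 × 7.5^1.12 ≈ 28.7 PM;  T = 2.5 × 28.7^0.35 ≈ 8.1 months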
Project Evaluation – Lessons Learned
• Implementation:
• C++ language, memory management, implementation of design patterns.
• Tools and libraries (NSIS, CPPUnit, VLD, Doxygen).
• Design:
• Design patterns.
Project Evaluation – Lessons Learned
• Experience with various estimation models.
• Measurement
• Tools (CCCC, Process Dashboard).
• Testing
• CPPUnit framework.
• VLD.
• Process
• Iterative process.
• Artifacts.
Project Evaluation – Future Work
• Improve performance of the search algorithm and add new algorithms.
• Add more functionality to the game-playing library and UI.
• Add more selection mechanisms to the GA experiments.
• Add more learning algorithms.
• Distribute computation to speed up training.
• Refactor some classes.
• Add test classes for each feature.
Tools Used
• MS Visual Studio 2005
• BoUml
• Rational Software Architect
• NSIS (Nullsoft Scriptable Install System)
• CCCC (C and C++ Code Counter)
• TinyXML
• Visual Leak Detector
• Doxygen
• Process Dashboard
• TortoiseSVN
References
• Gamma, Erich; Helm, Richard; Johnson, Ralph; Vlissides, John (1995). Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley. ISBN 0-201-63361-2.
• Mitchell, Tom (1997). Machine Learning. McGraw Hill. ISBN 0-07-042807-7.
• BoUML: http://bouml.free.fr/
• Rational Software Architect: http://www-306.ibm.com/software/awdtools/architect/swarchitect/
• TinyXML: http://www.grinninglizard.com/tinyxml/ (project page: http://sourceforge.net/projects/tinyxml/)
• NSIS (Nullsoft Scriptable Install System): http://nsis.sourceforge.net/Main_Page
• CCCC (C and C++ Code Counter): http://sourceforge.net/projects/cccc
• CPPUnit: http://cppunit.sourceforge.net/doc/lastest/cppunit_cookbook.html
• Visual Leak Detector: http://dmoulding.googlepages.com/vld
• Doxygen: http://www.stack.nl/~dimitri/doxygen/
• Process Dashboard: http://processdash.sourceforge.net/
• TortoiseSVN: http://tortoisesvn.tigris.org/