
Intelligent Training System: Behavior Analysis & Synthesis

This intelligent training system utilizes behavior analysis, review, and behavior synthesis to provide effective pre-, in-, and post-evaluation for training purposes. It addresses the need for accurate performance assessment, offers intelligent inquiries, and supports remediation and exploration of alternative courses of action.


Presentation Transcript


  1. Intelligent Training System for Pre/In/Post Evaluation Using Behavior Analysis, Review, and Behavior Synthesis Amela Sadagic, PhD

  2. Agenda • The Team • MOUT Training Situation • Our Approaches • Our Solution: Concept • Need for S&T • Main Technical Objectives & Deliverables • Q & A

  3. Performer Lab Team • NPS - MOVES and CS Department: • Dr. Rudy Darken, PI • Dr. Amela Sadagic, Deputy PI • Dr. Chris Darken • Dr. Neil Rowe • Dr. Mathias Kolsch • Delta3D team

  4. Performer Lab Team • Research: • 3D visual simulations, Game-based simulations, • Computer-generated autonomy and computational cognition, • Behavior Analysis and Applied AI, • Human factors, Training systems and training methodologies, • Computer vision. • Laboratories involved: • Simulation and Training Laboratory, • Human Factors, Training Systems and Combat Modeling Lab, • Motion Capture Studio.

  5. Collaborating Research Teams • Sarnoff Corporation • Dr. Rakesh Kumar • Dr. Hui Cheng • Dr. Harpreet Sawhney • University of North Carolina at Chapel Hill • Dr. Henry Fuchs • Dr. Greg Welch • Dr. Marc Pollefeys • Dr. Anselmo Lastra

  6. Sarnoff Corporation • Projects/products: • VideoFlashlight • Acadia • TerraSight • VideoQuest • JAM • VideoDetective • Research: • Computer vision • Image processing • Cognitive science & AI • Display technologies • Laboratories involved: • Visualization Lab

  7. UNC Chapel Hill • Past projects/products: • PixelPlanes and PixelFlow • Office of the Future • Effective Virtual Environments: Walkthrough, Virtual Pit, Passive haptics • HiBall Tracking System • 3D from video • Multi-projector displays • Research: • Computer vision • Display technologies and tele-immersion • Computer graphics and VR • 3D simulations • Laboratories involved: • Graphics, Vision and Image Lab • Microelectronic Systems Lab

  8. MOUT Training Situation • The shortcomings: • Before: Lack of exploratory training opportunities for larger units and their familiarization with the complex spectrum of training events to be encountered in Mojave Viper (training before the courses in physical training facilities): mission planning and mission rehearsal. • During: Lack of optimal support for human operation and supervision of training in the physical facility: help resolve ambiguities; accurately capture, measure, analyze, detect, record, bookmark, and visualize instances of all important events and performances of the warfighters. • After: Lack of intelligent after-action review with behavior analysis and smart inquiries about the unit’s performance. Lack of opportunities for remediation and exploration of alternative courses of action (free-play). Lack of system support for analysis of historical data to indicate a need for possible doctrinal and instructional changes.

  9. Our Approaches • Affect a large number of participants (ideally everyone): incorporate elements that ensure easier and faster large-scale adoption of this technology innovation. • Provide solutions that empower both instructors and trainees. • Affect the largest portion of the unit’s training cycle: the solution should address training needs before the exercise, operational needs during the exercise, and operational and training needs after the exercise. • Provide a ‘complete package’ solution: technology (systems, tools, algorithms) and training methodologies for how to use that technology most effectively. • Affect different layers of the organizational structure: ensure paths for organization-wide benefits and provide useful insights for the Marine Corps-wide organization.

  10. Our Approaches (2) • Solutions are transparent to participants when their safety is of concern: no changes to the way they conduct training, no interference with the training event. • Offered solutions constitute supplementary training interventions. • Solutions are applicable to a variety of military training situations, and segments are applicable to non-military situations. • Use open-source software solutions: Delta3D, MySQL. • No proprietary solutions, no license issues: open the path for easy future upgrades (by anyone, whenever needed).

  11. Concept

  12. Concept

  13. Concept: Pre-experience and Re-experience across the before, during, and after phases (diagram)

  14. Concept: Pre-experience and Re-experience (diagram, continued)

  15. Concept: Pre-experience and Re-experience (diagram, continued)

  16. Need for S&T • Provide effective behavior analysis and behavior synthesis from the set of accurate 3D data acquired during the unit’s performance in the real training environment. • Design a multi-sensor data capture system capable of recording and deriving dynamic 3D participant models, and dynamic multi-dimensional participant pose tracks that can be viewed from any perspective. • Design automatic semantic parsing and analysis of the dynamic models and tracks, and identify individual and team performance trends. • Design approaches and a system that provide replay, intelligent inquiries, and free-play (alternative courses of action) features, to serve USMC training needs before and after training in the physical MOUT environment. • Design novel training approaches to be used with the system: maximize training potential for the masses of intended users.

  17. Technical Objective • Provide a state-of-the-art, multipurpose system and set of training approaches to support a wide range of training and operational needs in preparing for and conducting training in MOUT facilities, and in remediation of training after the courses in physical environments. • Provide a system that represents a synergy of computer vision, computer graphics (including virtual reality), cognitive science, and artificial intelligence technologies at a level that has not been attempted before in the military domain, and not at the scale and complexity planned in this project. • “Roll up” the training-planning-rehearsal-execution-review cycle so that it is viewed as one (albeit phased) event rather than five.

  18. Multi-sensor Solution • Different types of sensors: • Optical sensors (multiple PTZ cameras): 3D real-time data acquisition + 2D images/video -> 3D Marine position + Marine pose • Position sensors - GPS (IGRES): 3D Marine position • Inertial sensors: head and weapon orientation • Size of training environment: 1 x 1 km
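
To make the sensor mix concrete, the sketch below shows one possible per-Marine, time-stamped observation record that combines the three sensor families listed above; the field names, types, and units are illustrative assumptions, not the project's actual data model.

```python
# Minimal sketch of a fused observation record (hypothetical fields and units).
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MarineObservation:
    marine_id: str
    timestamp: float                                         # seconds since exercise start
    gps_position: Tuple[float, float, float]                 # local ENU, meters (~2-6 m error)
    video_position: Optional[Tuple[float, float, float]]     # vision-derived, sub-meter when a camera has the Marine in view
    pose_label: Optional[str]                                # e.g. "standing", "crouched", "prone"
    head_orientation: Optional[Tuple[float, float, float]]   # inertial yaw/pitch/roll, degrees
    weapon_orientation: Optional[Tuple[float, float, float]] # inertial yaw/pitch/roll, degrees
```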

  19. 3D Marine Position • Capabilities: Provide as accurate as possible 3D information about each Marine inside the training environment. • Concept: Derive the best 3D position using 3D real-time vision-based data acquisition and associated 2D video techniques. • Automated PTZ movement, video-GPS track fusion, and hand-off to keep Marine targets in view. • Improve position estimation accuracy from the ~2-6 m offered by GPS alone to sub-meter for behavior analysis. • Improve accuracy by fusing GPS- and video-based position estimation using video registration to a 3D scene model and reference imagery. • For training video, Marines’ positions will be estimated using a human model to account for the height of the Marine and the change of perspective w.r.t. the gimbaled camera.
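
The GPS/video fusion described above can be illustrated with a simple inverse-variance weighting of the two position estimates; this is a stand-in for the project's actual track-fusion filter, which the slides do not specify, and the sigma values are assumptions.

```python
import numpy as np

def fuse_positions(gps_xyz, gps_sigma, video_xyz, video_sigma):
    """Inverse-variance weighted fusion of two independent position estimates.

    gps_xyz, video_xyz : 3-vectors in a common local frame (meters)
    gps_sigma, video_sigma : 1-sigma uncertainties (meters), e.g. ~3.0 for GPS
    alone and ~0.5 for video registered to the 3D scene model.
    """
    w_gps = 1.0 / gps_sigma**2
    w_vid = 1.0 / video_sigma**2
    fused = (w_gps * np.asarray(gps_xyz) + w_vid * np.asarray(video_xyz)) / (w_gps + w_vid)
    fused_sigma = (w_gps + w_vid) ** -0.5
    return fused, fused_sigma

# Example: a ~3 m GPS fix combined with a ~0.5 m video estimate.
pos, sigma = fuse_positions([102.0, 45.0, 1.7], 3.0, [100.6, 44.2, 1.7], 0.5)
```

With these assumed uncertainties the fused estimate has roughly 0.49 m uncertainty, i.e. it is dominated by the video measurement, consistent with the sub-meter goal stated on the slide.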

  20. Marine Pose Estimation • Capabilities: New methods for estimating participant pose and shape parameters, including the dynamic parameters of an articulated 3D human body model, and 3D models of their dynamic shape and appearance. Dynamic articulated body posture will enable new automated behavioral analysis (squatting, kneeling, prone, etc.). Different degrees of Marine-specific appearance and shape information will offer new visualization capabilities. • Concept: Multi-scale/resolution model-based 3D participant model reconstruction for vision-based participant tracking and modeling related to exercise capture and control. (Yan & Pollefeys)
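
As a toy illustration of how an articulated body model enables automated posture labeling (squatting, kneeling, prone), the sketch below classifies posture from tracked head height alone; the thresholds are invented for illustration and are not the project's calibrated values.

```python
def classify_posture(head_height, standing_height):
    """Toy posture classifier using only head height above local ground.

    head_height, standing_height in meters; the band thresholds are
    illustrative. Full joint data would separate kneeling from squatting.
    """
    ratio = head_height / standing_height
    if ratio < 0.35:
        return "prone"
    if ratio < 0.8:
        return "crouched"   # kneeling or squatting
    return "standing"

# Example: a 1.8 m Marine tracked at three different head heights.
print(classify_posture(0.5, 1.8))    # prone
print(classify_posture(1.2, 1.8))    # crouched
print(classify_posture(1.75, 1.8))   # standing
```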

  21. Behavior Analysis • Capabilities: Automatically detect and recognize Marine behavior in a training exercise and evaluate the Marine’s performance. • Concept: Combine position-, pose-, and motion-based time series and classify them; develop a Marine training ontology. • Human tracking data cannot be usefully partitioned at predefined training “events” - unambiguous and well-defined events occur rarely (each unit has a different plan of attack). • More interesting behaviors require correlating longer periods to find trends over time. • AAR greatly benefits from knowing when such higher-level behaviors occur. • Establishing such behaviors benefits from tracking the consistency of acceleration and velocity vectors, as well as line-of-sight analysis on the terrain, and the correlation between fire and movement (maneuver). • Solutions to be validated by the SMEs.
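
One of the cues listed above, line-of-sight analysis on the terrain, can be sketched as a ray test over an elevation grid; the grid spacing, eye height, and sampling density below are assumptions made for the example.

```python
import numpy as np

def has_line_of_sight(terrain, a, b, eye_height=1.6, samples=200):
    """Check line of sight between two positions over a terrain height grid.

    terrain : 2D numpy array of ground elevations (meters), 1 m grid spacing
    a, b    : (row, col) grid positions of observer and target
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    start_z = terrain[int(a[0]), int(a[1])] + eye_height
    end_z = terrain[int(b[0]), int(b[1])] + eye_height
    for t in np.linspace(0.0, 1.0, samples)[1:-1]:
        r, c = a + t * (b - a)
        ground = terrain[int(round(float(r))), int(round(float(c)))]
        ray_z = start_z + t * (end_z - start_z)
        if ground > ray_z:          # terrain blocks the ray
            return False
    return True

# Example on synthetic terrain: a 5 m berm between the two positions blocks LOS.
flat = np.zeros((100, 100))
ridge = flat.copy()
ridge[:, 50] = 5.0
print(has_line_of_sight(flat, (10, 10), (10, 90)))   # True
print(has_line_of_sight(ridge, (10, 10), (10, 90)))  # False
```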

  22. Behavior Synthesis: Free Play • Capabilities: Ability of the Marine to directly control the actions of all entities of interest. Go beyond a passive playback system and review of past training events - allow Marines to jump into free-play mode and explore alternative courses of action not represented in the exercise database. • Concept: AI control of all other entities, responsive to and consistent with the Marine's control actions. • Behavioral models based on naturalistic decision-making theory (mental simulation). • High-fidelity, image-based perceptual models (detections, missed detections, false positive detections). • Environment understanding based on automated environment exploration. • Solutions to be validated by the SMEs.
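
The image-based perceptual model idea (detections, missed detections, false positives) can be sketched as a range-dependent detection probability per synthetic entity; the linear fall-off and the rates used here are purely illustrative.

```python
import math, random

def perceive(agent_pos, true_targets, max_range=300.0,
             p_false_positive=0.02, rng=random):
    """Toy perceptual model for a synthetic entity.

    Each real target is detected with a probability that falls off with range
    (so distant targets are sometimes missed), and a small chance of a false
    positive contact is added per call.
    """
    detections = []
    for target in true_targets:
        range_m = math.dist(agent_pos, target)
        p_detect = max(0.0, 1.0 - range_m / max_range)   # linear fall-off with range
        if rng.random() < p_detect:
            detections.append(target)                    # true detection
        # otherwise: missed detection
    if rng.random() < p_false_positive:
        # false positive: a phantom contact at a random nearby location
        detections.append((agent_pos[0] + rng.uniform(-50, 50),
                           agent_pos[1] + rng.uniform(-50, 50)))
    return detections

# Example: two targets at 50 m and 400 m; the distant one is rarely "seen".
print(perceive((0.0, 0.0), [(50.0, 0.0), (400.0, 0.0)]))
```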

  23. Advanced Display Platforms • Capabilities: Display platforms for individual and group use before, during, and after exercises. • Concept: Technologies and methods for deployable wide-area display systems based on ad hoc collections of new “intelligent projection units” (IPU) that automatically & continuously correct for image distortions, photometric blending, and mechanical perturbations. Develop IPU-based approaches to dynamic projection on non-planar surfaces for real-virtual sand tables. Develop pose-aware handheld display devices for situational awareness in situ during an exercise. • Conventional projector-based systems typically support one-shot image-based calibration. IPUs will combine projection engines, image and other sensors, and new algorithms for continuous and automatic geometric and photometric calibration.
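
The geometric half of an IPU's calibration can be thought of as repeatedly estimating a projector-to-surface homography from camera observations; the minimal one-shot Direct Linear Transform sketch below illustrates that step (the correspondence points and example mapping are invented, and a real IPU would add continuous photometric and mechanical correction on top of it).

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Direct Linear Transform estimate of a 3x3 homography.

    src_pts, dst_pts: >= 4 corresponding (x, y) points, e.g. projector pixel
    coordinates and their observed positions on the display surface.
    """
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)          # null-space vector = homography entries

def warp_point(H, x, y):
    """Apply the homography to one point (used to pre-warp projector output)."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Example: recover a known shift-and-scale mapping from 4 correspondences.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 20), (12, 20), (12, 22), (10, 22)]
H = estimate_homography(src, dst)
print(warp_point(H, 0.5, 0.5))   # approximately (11.0, 21.0)
```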

  24. Advanced Display Platforms: Intelligent Projection Units, Virtual Sand-Table, Pose-aware Handheld Devices

  25. 3D Visualization System • Capabilities: • Provide augmented situational awareness and analysis of units’ past performance. Enable not only easier operation of the physical training facility but also training before and after the training in that environment: mission planning, mission rehearsal, behavior (performance) analysis, and free-play. • New: behavior analysis with smart queries of unit (past) performance in the physical environment (including a playback feature) and exploration of alternative courses of action. Use open-source software solutions and enable easier future upgrades. • Concept: Integration of the results of computer vision, computer graphics (VR), cognitive science, and AI in a single 3D visualization system.
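
To make the "smart queries" of past unit performance concrete, the sketch below runs one such query against a hypothetical behavior-events table; the project names MySQL as its database, but SQLite is used here only to keep the example self-contained, and the schema, behavior labels, and data are invented.

```python
import sqlite3

# Hypothetical schema for behavior events produced by the analysis stage.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE behavior_events (
    unit_id     TEXT,
    marine_id   TEXT,
    behavior    TEXT,     -- e.g. 'crossed_open_ground', 'stacked_on_door'
    start_time  REAL,     -- seconds since exercise start
    end_time    REAL
);
INSERT INTO behavior_events VALUES
    ('1st_squad', 'M03', 'crossed_open_ground', 120.0, 155.0),
    ('1st_squad', 'M05', 'crossed_open_ground', 130.0, 136.0),
    ('1st_squad', 'M03', 'stacked_on_door',     200.0, 215.0);
""")

# "Smart query": which Marines were exposed in open ground for more than 10 s?
rows = conn.execute("""
    SELECT marine_id, behavior, end_time - start_time AS duration
    FROM behavior_events
    WHERE behavior = 'crossed_open_ground' AND end_time - start_time > 10
    ORDER BY duration DESC
""").fetchall()
print(rows)   # [('M03', 'crossed_open_ground', 35.0)]
```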

  26. Learning & Training Methodologies • Capabilities: A novel set of learning and training methodologies to maximize training results in preparation for training in the physical MOUT facility (mission planning and mission rehearsal). • Concept: (1) Explore a combination of new and old (traditional) instructional approaches and systems, (2) Inject motivation tools and aspects into instructional design, (3) Incorporate parameters that support fast, large-scale adoption of our solutions + how to train the trainers.

  27. Program Plan FY07: Start preparation for technical tasks, prepare setup and algorithms for Ground Truth segment. FY08: Form a thorough understanding of the training environment, training situations, and training and operational needs in MOUT. Collect baseline data. Start developing algorithms for 3D data acquisition, behavior analysis, and alternative-courses-of-action capabilities. Define system requirements and system and database architecture. Develop projector-based display system. FY09: Develop behavior analysis algorithms. Complete database development. Complete development of library of scenarios. Develop first system prototypes (Intelligent Training System for Pre/In/Post Evaluation Using Behavior Analysis, Review, and Behavior Synthesis). Complete Marine posture estimation. Develop a prototype of the sand-table platform. Start user studies and SME system validation. FY10: Complete performance evaluation algorithms. Refine and complete all system solutions and prototypes. Develop handheld-based platform. Finish user studies and SME system validation. Finish field system integration and demonstration.

  28. Special thanks to: • Sponsor: Office of Naval Research, Capable Manpower Future Naval Capability • Transition customers: USMC: TECOM, PMTRASYS • Collaborating partners: TTECG and Simulation Center, 29 Palms

  29. Q & A Contact: asadagic@nps.edu / ph: 831.656.3819 www.movesinstitute.org/~amela/
