
What’s Next for Digital Human Models?


Presentation Transcript


1. What’s Next for Digital Human Models?
   • Norman I. Badler
   • Center for Human Modeling and Simulation
   • University of Pennsylvania
   • Philadelphia, PA 19104-6389
   • http://www.cis.upenn.edu/~badler

2. Outline
   • Applications and Influences
   • Towards Smarter Models
   • Action Representation
   • Examples
   • Near Futures

3. Outline
   • Applications and Influences
   • Towards Smarter Models
   • Action Representation
   • Examples
   • Near Futures

4. Applications for Digital Human Models
   • Engineering Ergonomics.
   • Design and Maintenance Assessment.
   • Games / Special Effects.
   • Military Simulations.
   • Job Education / Training.
   • Medical Simulations.

5. Human Model “Dimensions”
   • Appearance: shape, size
   • Function: physical capabilities
   • Time: to move onscreen
   • Autonomy: actions, decisions, reactions
   • Individuality: generic or specific people

6. Comparative Models

   Application   Appear.   Function   Time   Autonomy   Individ.
   Cartoons      high      low        high   low        high
   Sp. Effects   high      low        high   low        med
   Medical       high      high       med    med        med
   Games         high      low        low    med/high   med
   Ergonomics    med       high       med    med        low
   Military      med       med        low    med/high   low
   Education     med       low        low    med/high   med
   Training      med       low/med    low    high       med

7. Outline
   • Applications and Influences
   • Towards Smarter Models
   • Action Representation
   • Examples
   • Near Futures

8. Towards Smarter Agents
   • Actions to Execute:
     • Action Representation - What it can do.
   • Behavior Model:
     • The agent’s decision-making, “thought,” and reaction processes - What it should do or wants to do.
   • Inputs to Effect Behavior:
     • Incoming knowledge about the outside world - What it needs to know.

9. “Classic” AI: Agent Action Cycle
   • Sense, Control, Act cycle connecting the agent model to the world model.
   • Inputs from the world model: messages, sensors, situation.
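To make the sense-control-act cycle above concrete, here is a minimal Python sketch of a few ticks of the loop. The WorldModel and AgentModel classes and their methods are illustrative names for this sketch only, not part of any system described in the talk.

```python
# Minimal sketch of the "classic" agent action cycle: sense -> control -> act.
# Class and method names are illustrative assumptions, not an existing API.

class WorldModel:
    """Holds the state an agent can observe: messages, sensor data, situation."""
    def __init__(self):
        self.messages = []
        self.situation = {}

    def sense(self):
        # Package whatever the agent is allowed to perceive this cycle.
        return {"messages": list(self.messages), "situation": dict(self.situation)}

    def apply(self, action):
        # Actions change the world; here we just record the last one taken.
        self.situation["last_action"] = action


class AgentModel:
    """Decides what to do from the sensed state."""
    def control(self, percept):
        if percept["messages"]:
            return ("respond_to", percept["messages"][-1])
        return ("idle", None)


def run_cycle(world, agent, steps=3):
    # Classic loop: sense -> control -> act, repeated each simulation tick.
    for _ in range(steps):
        percept = world.sense()
        action = agent.control(percept)
        world.apply(action)
```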

10. Smart Agent Requirements
   • Actions to execute:
     • Action Representation - What it can do.
   • Behavior Model:
     • The agent’s decision-making, “thought,” and reaction processes.
   • Inputs to Effect Behavior:
     • Incoming knowledge about the outside world.

11. 4 Levels of Action Representation
   • 0: Basic Motion Generators
   • 1: Parallel Transition Networks
   • 2: Parameterized Actions
   • 3: Natural Language Instructions

12. Level 0: Basic Human Movement Capabilities
   • Gesture / Reach / Grasp.
   • Walk / Turn / Climb.
   • Posture Transitions (Sit / Stand).
   • Visual Attention / Search.
   • Pull / Lift / Carry.
   • Motion playback (captured or scripted).
   • ‘Noise’ or secondary movements.

13. Synthesized Motions -- Leverage Economy of Expression
   • Few parameters controlling many:
     • Inverse kinematics for arms, legs, spine.
     • Paths or footsteps driving locomotion.
     • Balance constraint on whole body.
     • Dynamics control from forces and torques.
     • Facial expressions.
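As one illustration of "few parameters controlling many," here is a textbook two-link planar inverse kinematics solution: a single goal point determines both joint angles. This is a generic sketch, not the solver used in Jack or any other system mentioned in the talk.

```python
import math

# Textbook two-link planar inverse kinematics: one goal point (x, y)
# determines both joint angles of a simple arm.

def two_link_ik(x, y, upper_len, lower_len):
    """Return (shoulder, elbow) angles in radians that place the wrist at (x, y)."""
    reach_sq = x * x + y * y
    # Law of cosines for the elbow; clamp to handle unreachable targets gracefully.
    cos_elbow = (reach_sq - upper_len**2 - lower_len**2) / (2 * upper_len * lower_len)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))
    elbow = math.acos(cos_elbow)
    k1 = upper_len + lower_len * math.cos(elbow)
    k2 = lower_len * math.sin(elbow)
    shoulder = math.atan2(y, x) - math.atan2(k2, k1)
    return shoulder, elbow
```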

14. Smart Agent Requirements
   • Actions to execute:
     • Action Representation.
   • Behavior Model:
     • The agent’s decision-making, “thought,” and reaction processes - What it should do or wants to do.
   • Inputs to Effect Behavior:
     • Incoming knowledge about the outside world.

15. Level 1: Parallel Transition Networks (PaT-Nets)
   • A Virtual Parallel Execution Engine for agent actions (a.k.a. Finite State Machines).
   • A common software paradigm for human model control.
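Since PaT-Nets are essentially finite state machines run in parallel, the core idea can be sketched in a few lines of Python. The Net class, the shared blackboard dictionary, and the example transition below are illustrative assumptions, not the actual PaT-Net API.

```python
# Sketch of the PaT-Net idea: several finite state machines ("nets")
# advanced in virtual parallel, one transition test per net per tick.

class Net:
    def __init__(self, name, start, transitions):
        self.name = name
        self.state = start
        # transitions: {state: [(condition_fn, action_fn, next_state), ...]}
        self.transitions = transitions

    def step(self, blackboard):
        for condition, action, next_state in self.transitions.get(self.state, []):
            if condition(blackboard):
                action(blackboard)
                self.state = next_state
                break


def run_parallel(nets, blackboard, ticks):
    # "Virtual parallel execution": every net gets one step per tick.
    for _ in range(ticks):
        for net in nets:
            net.step(blackboard)


# Example: a net that waits for a "greet" flag, then logs a greeting once.
speaker = Net(
    "speaker", "waiting",
    {"waiting": [(lambda bb: bb.get("greet"),
                  lambda bb: bb.setdefault("log", []).append("hello"),
                  "done")]},
)
run_parallel([speaker], {"greet": True}, ticks=1)
```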

16. PaT-Net Applications
   • Conversational agents. (SIGGRAPH ’94)
   • Hide and seek. (VRAIS ’96)
   • MediSim: Physiological models. (Presence ’96)
   • Jack Presenter. (AAAI-97 Workshop / IEEE CG&A)
   • Delsarte Presenter. (Pacific Graphics ’98)
   • JackMOO. (WebSim ’98, VR ’99)
   • AVA (Attention). (Autonomous Agents ’99)

17. What’s Missing?
   • PaT-Nets are effective but hand-coded.
   • Connect language and animation through an intermediate level:

18. Level 2: Parameterized Action Representation (PAR)
   • Derived from BOTH Natural Language analyses and animation requirements:
     • Agent, Objects, Sub-Actions.
     • Preparatory Specifications, Postconditions.
     • Applicability and Termination (Success and Failure) Conditions.
     • Purpose (Achieve, Generate, Enable).
     • Path, Duration, Motion, Force.
     • Agent Manner.
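The PAR fields listed above map naturally onto a record-like data structure. The sketch below simply mirrors the slide's field names in a Python dataclass; it is not the actual PAR or Actionary schema.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Illustrative container for the PAR fields named on the slide.

@dataclass
class PAR:
    agent: str
    objects: List[str] = field(default_factory=list)
    sub_actions: List["PAR"] = field(default_factory=list)
    preparatory_specs: List[str] = field(default_factory=list)
    postconditions: List[str] = field(default_factory=list)
    applicability_condition: Optional[Callable[[], bool]] = None
    termination_condition: Optional[Callable[[], bool]] = None
    purpose: str = "achieve"          # achieve | generate | enable
    path: Optional[str] = None
    duration: Optional[float] = None  # seconds
    motion: Optional[str] = None
    force: Optional[float] = None     # newtons
    manner: Optional[str] = None      # e.g. "carefully", "quickly"
```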

19. Level 3: Natural Language Instructions
   • Instructions let people communicate with digital human models.
   • Instructions say what to do.
   • Instructions depend on underlying action skills.
   • Instructions build agent behaviors: future actions or standing orders.

20. (image slide)

21. Smart Agent Requirements
   • Actions to execute:
     • Action Representation.
   • Behavior Model:
     • The agent’s decision-making, “thought,” and reaction processes.
   • Inputs to Effect Behavior:
     • Incoming knowledge about the outside world - What it needs to know.

22. Input Sensing
   • Message passing.
     • Explicit transfer or direct knowledge of state information between agents.
   • Artificial perception.
     • Visual / auditory / haptic [collision detection] sensing to attend to and observe local context.
   • Situation awareness.
     • Recognizing complex relationships.
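The first two input channels can be contrasted in a small Python sketch: message passing hands state directly to another agent, while artificial perception only reports what falls within a sensing range. The Agent class and the simple range test are illustrative assumptions for this sketch.

```python
import math

# Contrast of two input channels: direct message passing vs. a
# range-limited perception query over a simple world description.

class Agent:
    def __init__(self, name, position):
        self.name = name
        self.position = position   # (x, y) tuple
        self.inbox = []

    def send(self, other, state):
        # Message passing: explicit transfer of state between agents.
        other.inbox.append((self.name, state))

    def perceive(self, world_objects, max_range=5.0):
        # Artificial perception: only objects within sensing range are observed.
        seen = []
        for obj_name, obj_pos in world_objects.items():
            if math.dist(self.position, obj_pos) <= max_range:
                seen.append(obj_name)
        return seen
```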

23. Outline
   • Applications and Influences
   • Towards Smarter Models
   • Action Representation
   • Examples
   • Near Futures

24. Example 1: The Virtual Reality Checkpoint Trainer
   • Joint ONR project between UPenn, UHouston, and EAI.
   • Multi-agent and/or avatar situation.
   • Process simulator for traffic.
   • Autonomous agents.
   • Real-time behaviors and reactions.
   • Natural Language input for “standing orders”.
   • (Next step: live trainees in VR.)

25. Virtual Environment

26. The Checkpoint Scene

27. The Checkpoint Video

28. Example 2: Maintenance Instruction Validation
   • Project sponsored by the USAF.
   • Simulate the execution of a maintenance instruction.
   • The simulation generates the behaviors of a virtual technician and the maintained system, as well as their interaction.
   • Pull human factors analysis closer to the design cycle.

29. Multi-Domain Simulation
   • The simulation is generated from four separate simulators that work together:
     • Scene Management System (CAD / display)
     • Intelligent Agent (task model)
     • Semi-Qualitative Simulator (equipment model)
     • Human Form Model
   • Each has a database: CAD and feature data, CAE or functional models, human form anthropometry, etc.
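One plausible way to coordinate four cooperating simulators is a shared state that each one reads and updates every tick. The sketch below assumes that simple tick-and-exchange protocol purely for illustration; it is not the project's actual architecture or interface.

```python
# Illustrative coordination of four domain simulators over shared state.
# The Simulator class and the tick protocol are assumptions for this sketch.

class Simulator:
    def __init__(self, name):
        self.name = name

    def step(self, shared_state):
        # Each domain simulator reads the shared state, advances its own
        # model one tick, and writes back the fields it owns.
        key = self.name + "_ticks"
        shared_state[key] = shared_state.get(key, 0) + 1


def run_multi_domain(steps):
    simulators = [
        Simulator("scene_management"),   # CAD / display
        Simulator("intelligent_agent"),  # task model
        Simulator("equipment_model"),    # semi-qualitative simulator
        Simulator("human_form"),         # anthropometric body model
    ]
    shared_state = {}
    for _ in range(steps):
        for sim in simulators:
            sim.step(shared_state)
    return shared_state
```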

30. Initial State
   • The aircraft is jacked above the ground.
   • The technician works inside the front landing gear wheel well to access the front uplock hook (FUH).

31. Front Uplock Hook (FUH) Removal
   • Removing the FUH requires disconnecting its hydraulic lines and removing its bolts from the bulkhead.
   • However, the bolts should not be removed from the hook itself, since they hold spring-loaded parts together.
   • Simulated Demo

32. Demonstration Summary
   • The validation tool uses structural and behavioral models of the equipment to simulate normal and hazardous conditions.
   • The Product Data Model (PDM) should be accessible to the human model.
   • The virtual technician should plan and execute complex disassembly tasks, including fine motor skills, constrained assembly planning, and end-effector and tool accessibility.

33. Example 3: Towards Interactively Building Parameterized Actions
   • Automatically abstract semantically significant points in an agent’s movements into spatial and visual constraints, which are then used to construct a PAR.
   • Populate the Actionary from real examples.
   • Re-target PARs to agents of different sizes while maintaining the spatial and visual constraints and a similar motion style.

34. “Drink from a mug”

35. Outline
   • Applications and Influences
   • Towards Smarter Models
   • Action Representation
   • Examples
   • Near Futures

36. Near Futures (1)
   • Instruction execution must be context-, perception-, and agent-sensitive.
   • Language interfaces (through PAR) expand usability and agent building.
   • Motion capture can be used to “learn” new parameterized actions.
   • Expand Actionary application independence.

37. Near Futures (2)
   • The games and training communities will build models with excellent appearance and capabilities.
   • Smarter agents based on cognitive models and knowledge databases.
   • Better integration of human performance models with real-time bodies.
   • Better integration with PDM and CAE models.

38. Near Futures (3)
   • Encourage development of standardized Application Programming Interfaces (APIs) for digital human models.
   • Expect multiple levels of detail to smoothly transition between views of a simulation.
   • Improved user-friendliness (non-programmer access) of human modeling systems.

39. Acknowledgments
   • Colleagues: Martha Palmer, Aravind Joshi, Jan Allbeck, Aaron Bloomfield, MeeRan Byun, Diane Chi, Sonu Chopra, Monica Costa, Rama Bindiganavale, Charles Erignac, Ambarish Goswami, Karin Kipper, Seung-Joo Lee, Sooha Lee, Jianping Shi, Hogeun Shin, William Schuler, and Liwei Zhao.
   • Sponsors: NSF, USAF, ONR, DARPA, NASA, ARO.
   • THANK YOU!
