3D Display of Virtual Humans
Kristina Khuu & Luana Sanchez, MOVES Institute, Naval Postgraduate School, Monterey, CA 93943
Dr. Amela Sadagic, Research Associate Professor, Virtual Humans Lab

Abstract
3D virtual humans are digital models of human beings. These models serve as an important resource for research studies, learning and training systems, video games, and similar applications. In situations where the quality of interaction with a virtual human is as good as (or close to) the quality of interaction with a real human, 3D virtual humans can be used instead. They offer the flexibility a given system may need: they are available any time we 'ask' for them, and they are an inexpensive alternative to hiring real humans. This project focused on revising a set of 3D virtual characters planned for use in research studies at the MOVES Institute. The effort included three distinct sets of activities: (1) identifying flaws in visual appearance (3D geometry and textures), (2) testing the functionalities of the Face Controller application, and (3) building a comprehensive set of animation scenarios. The goal was to look for skin discoloration, judge the believability of emotions in facial expressions, find errors in the programming, and create a set of behaviors to be used in user studies. Flaws found in the program were analyzed and logged in a report; each error is presented with pictures and a short description (see the illustrative sketch following Figure 2 below). This resource will later be used by the programmers, who will then know what needs to be fixed.

Results
• Six 3D models of virtual humans were examined (three male and three female models with different shades of skin color). The flaws in appearance that detracted from the believability of the human models were recorded and commented on. Figure 2 shows a typical example of a texture flaw found on a male model.
Number of revisions: AA Male: 2, AA Female: 7, AS Male: 5, AS Female: 6, C Male: 9, C Female: 5
• All Face Controller pose events and animation events were tested and logged. Anything that looked unnatural or did not function was declared an error. Figure 3 illustrates a rendering error that appeared on the head of a male character.
Number of revisions: AA Male: 38, AA Female: 40, AS Male: 41, AS Female: 35, C Male: 37, C Female: 34
• Multiple scenarios were created by manipulating the faces and stored for later use in user studies. One hundred five scenarios were created using expressions of varying complexity, ranging from very simple, short animations to more complex, longer ones. Figure 4 and Figure 5 show two examples of animated scenarios created in our project.

Conclusions
This project focused on testing the models of 3D virtual humans with the Face Controller application. The logs of all identified errors will be used to fix and adjust both the 3D models and the Face Controller application. The updated versions will be used for a series of user studies planned by researchers at the MOVES Institute. The studies are designed to help the researchers improve their understanding of how real humans react when presented with situations in which they need to collaborate with virtual humans, and of which characteristics of virtual humans need to be improved so that human participants judge the interaction as realistic and effective.

1. Revisions of Appearance
Figure 2. Example of discoloration on the side of the head.
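The poster does not specify the format of the error report itself. As a purely illustrative sketch, the snippet below shows one plausible way a log entry of the kind described above (model, category, short description, screenshot) could be recorded; the names Flaw and FlawReport are assumptions made for this example and are not part of the Face Controller application.

```python
# Illustrative sketch only: a minimal structure for logging flaws found during
# model review. Class and field names are assumptions, not part of Face Controller.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Flaw:
    model: str        # e.g. "AA Male", "C Female"
    category: str     # "texture", "geometry", or "animation"
    description: str  # short note on what looks wrong
    screenshot: str   # path to the picture attached to the report


@dataclass
class FlawReport:
    entries: List[Flaw] = field(default_factory=list)

    def add(self, flaw: Flaw) -> None:
        self.entries.append(flaw)

    def count_by_model(self, model: str) -> int:
        # Tally of revisions requested for one model, as in the Results lists.
        return sum(1 for f in self.entries if f.model == model)


# Example entry matching the kind of texture flaw shown in Figure 2
# (the screenshot path is hypothetical).
report = FlawReport()
report.add(Flaw("C Male", "texture",
                "Discoloration on the side of the head",
                "figures/c_male_head_discoloration.png"))
print(report.count_by_model("C Male"))  # -> 1
```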
2. Functionalities of the Face Controller Application

Literature cited
• "3D Display and Capture of Humans for Live Virtual Training" project
• Sadagic, A. (2011), "Tangible Virtual Humans: Meet Your New Role Players", MOVES Research and Education Summit 2011

Figure 3. The figure at left illustrates a command and visual error: polygons protrude out of the character's neck.

3. Animated Scenarios

Acknowledgments
A special thanks to Dr. Amela Sadagic and Professor Mathias Kolsch for being kind and nurturing mentors. Thanks also to Alison Kerr of the Naval Postgraduate School Cebrowski Institute; Joe Welch, Computer Science, Hartnell College; and Andy Newton, Director of the Science Math Institute at Hartnell College. Lastly, thanks to everyone at the MOVES Institute for contributing to our success. The internship was funded by the College to University Success Program.

Figure 1. Example of a virtual reality head (left) and a true 3D virtual reality head with an image projected on it (right).

Figure 4. The images above are screenshots of one of the animated scenarios and the tracks that control it.

Materials and methods
• Face Controller application
• Alienware laptop and desktop
• Microsoft Word 2010
• Paint

For further information
For more information contact Dr. Amela Sadagic, asadagic@nps.edu; Kristina Khuu, kgkhuu@gmail.com; or Luana Sanchez, sanchez_luana@yahoo.com.

Figure 5. Example of a full-body animated scenario and the tracks that control it.
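Figures 4 and 5 show scenarios driven by tracks of pose and animation events. The Face Controller's internal scenario format is not documented on this poster, so the following is only a hypothetical sketch, under the assumption that a scenario can be modeled as a named collection of tracks whose time-stamped events key facial or body expressions; none of the names below come from the application itself.

```python
# Hypothetical sketch of a track-based animation scenario, in the spirit of the
# "tracks" shown in Figures 4 and 5. All names here are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Event:
    time: float          # seconds from scenario start
    name: str            # pose or animation event, e.g. "smile", "blink"
    weight: float = 1.0  # how strongly the pose is applied (0..1)


@dataclass
class Track:
    target: str                              # e.g. "face", "body"
    events: List[Event] = field(default_factory=list)


@dataclass
class Scenario:
    name: str
    tracks: List[Track] = field(default_factory=list)

    def duration(self) -> float:
        # A scenario lasts until its latest event.
        return max((e.time for t in self.tracks for e in t.events), default=0.0)


# A very simple scenario of the kind built in this project: the character
# starts neutral, smiles, then blinks.
greet = Scenario("greeting", [
    Track("face", [Event(0.0, "neutral"), Event(0.5, "smile"), Event(2.0, "blink")]),
])
print(greet.duration())  # -> 2.0
```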