Implementation of Gesture Recognition in the Immersive Visualization Environment
By Danny Catacora
Under the guidance of: Judith Terrill and Terence Griffin
National Institute of Standards and Technology
Problem Statement: To add a set of gestures that are intuitive to users, with actions that can be easily reconfigured and applied to different visualizations.
Outline:
• Introduction to the IVE in the RAVE
• Importance of the needed gestures in the IVE
• Procedure: designing a gesture, recording data, and coding it
• Created Wii-like gestures that are easily defined and recognized
• Future work includes more complicated gestures
Introduction: IVE?
• Immersive Visualization Environment
• Produces visualizations from large amounts of computed data
• A visualization technology that allows human interaction
(Image courtesy of NIST)
Introduction: RAVE?
• Reconfigurable Automatic Virtual Environment
• Occupies an entire room
• Consists of a wand, a headset, and 3 screens
(Image courtesy of NIST)
Importance of Problem:
• The IVE has flexible and intuitive navigation methods, but lacks pointer gestures
• Newer interactions are needed
• Applicable and basic gestures should be investigated
(Image courtesy of NIST)
Research & Design: Game Technology: • The Wii : • Uses a remote controller to recognize gestures • Xbox Kinect : • Removes the controller, recognizes human movement and actions Improving 3D Gesture Recognition with Spatially Convenient Input Devices
Procedure: Steps for each gesture
• Gesture definition:
  - Specify what happens when the gesture is performed
  - Define what the gesture program will recognize and look for
• Data collection:
  - Figure out how each variable responds to the gesture
  - Determine correlations between gestures and changes in position, speed, and acceleration
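One way to keep the actions reconfigurable, as the problem statement asks, is to pair each gesture with a user-supplied callback. A minimal C++ sketch, assuming a hypothetical WandSample record and GestureDef type (neither name is taken from the project code):

```cpp
#include <functional>
#include <string>

// Hypothetical wand sample: the six tracked values plus a timestamp.
struct WandSample {
    double x, y, z;          // position
    double yaw, pitch, roll; // orientation
    double t;                // time in seconds
};

// Hypothetical gesture definition: a predicate describing what to
// look for, and a reconfigurable action run when the gesture is seen.
struct GestureDef {
    std::string name;
    std::function<bool(const WandSample&, const WandSample&)> matches;
    std::function<void()> action; // user-definable per visualization
};
```

Separating the predicate from the action means the same recognizer can drive different responses in different visualizations.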
Procedure: Steps for each gesture
• Program implementation of gesture:
  - Write code that will recognize the gesture accurately
  - Use algorithms and data analysis to write the code in C++
• Gesture testing:
  - Test the written code for the gesture in both environments
  - See whether calibration for the gesture works
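To illustrate the kind of recognizer this step might produce, here is an illustrative C++ sketch reusing the hypothetical WandSample record above; the thresholds are invented for the example and are not the project's values:

```cpp
#include <cmath>
#include <deque>

// Illustrative recognizer: report a rightward "swipe" when the wand's
// x position changes quickly while y and z stay nearly fixed.
bool isSwipeRight(const std::deque<WandSample>& window) {
    if (window.size() < 2) return false;
    const WandSample& a = window.front();
    const WandSample& b = window.back();
    const double dt = b.t - a.t;
    if (dt <= 0.0 || dt > 0.5) return false;   // motion must be quick
    const double dx = b.x - a.x;                // displacement along x
    const double dy = std::fabs(b.y - a.y);
    const double dz = std::fabs(b.z - a.z);
    return dx > 0.3 && dy < 0.1 && dz < 0.1;   // mostly along +x
}
```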
Data Interpretation:
• Had the program record the movement of the wand
• Wand data could be analyzed for specific changes in the x, y, z, yaw, pitch, and roll values
• Code was written to catch the patterns of gestures
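As a sketch of how such a recorded trace could be analyzed, the following hypothetical C++ helper estimates the wand's speed between consecutive samples (again reusing the WandSample record from the earlier sketch), so speed peaks can be lined up against the motion of a gesture:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Estimate speed between consecutive samples of a recorded wand trace.
std::vector<double> sampleSpeeds(const std::vector<WandSample>& trace) {
    std::vector<double> out;
    for (std::size_t i = 1; i < trace.size(); ++i) {
        const double dt = trace[i].t - trace[i - 1].t;
        if (dt <= 0.0) { out.push_back(0.0); continue; }
        const double dx = trace[i].x - trace[i - 1].x;
        const double dy = trace[i].y - trace[i - 1].y;
        const double dz = trace[i].z - trace[i - 1].z;
        out.push_back(std::sqrt(dx * dx + dy * dy + dz * dz) / dt);
    }
    return out;
}
```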
Results:
• Resulting algorithms and code that recognize gestures
• Further complex gestures were later defined from these basic gestures
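A possible way to tie the pieces together, shown only as a usage sketch of the hypothetical GestureDef table above, is to poll every definition against the latest window of wand samples and fire its action on a match:

```cpp
#include <deque>
#include <vector>

// Check each gesture definition against the current sample window;
// the wiring and names here are hypothetical, not the project's API.
void pollGestures(const std::vector<GestureDef>& table,
                  const std::deque<WandSample>& window) {
    if (window.size() < 2) return;
    for (const GestureDef& g : table) {
        if (g.matches(window.front(), window.back())) {
            g.action(); // reconfigurable response, e.g. move a 3D arrow
        }
    }
}
```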
Results:
• Displaying gesture recognition in the RAVE
• Display evolved from terminal output, to pop-ups, to a 3D arrow
Conclusion:
• In the end, able to successfully identify a total of 15 gestures
• Actions are completely user-definable
• Gestures will help current interaction within the IVE
• Set the basis for more complex gestures in the future
Future Research:
• Eventually remove the wires, glasses, or the remote
• Build on this basis with more complex gestures
• Establish more user-friendly interaction between human and machine in the RAVE
Acknowledgments:
• Judith Terrill - NIST
• Terence Griffin - NIST
• John Hagedorn - NIST
• Steven Satterfield - NIST
• Elizabeth Duval - Montgomery Blair High School