Gestural communications with accelerometer-based input devices and multi-modal displays

Paul D. Varcholik
ACTIVE Laboratory, Institute for Simulation and Training
University of Central Florida
pvarchol@ist.ucf.edu

James L. Merlo
LTC, US Army
US Military Academy, West Point
james.merlo@usma.edu
Problem Statement
• Reliable communication among military personnel is critical
• Hand gestures are used when vocal communication is inadequate
• Line-of-sight problems can make visual signals unreliable
Our Research
• Purpose: to determine the usefulness of a computer-mediated gesture recognition system for non-visual communication
• Scope: provide a proof of concept that lays the groundwork for future research
• Research questions:
  • Have we developed a recognition system capable of accurately converting and transmitting a visual communication mode into a non-visual form?
  • Do computer-mediated gestures provide a viable form of non-visual communication?
Input Device
• Nintendo Wiimote
  • 3-axis accelerometer
  • Wireless (Bluetooth)
  • Inexpensive, commercial off-the-shelf (COTS)
  • 100 Hz sampling rate (a data-handling sketch follows)
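For concreteness, here is a minimal sketch of how the 100 Hz accelerometer stream might be represented and buffered on the receiving side. The type and callback names are illustrative assumptions, not the project's actual code.

```python
from collections import deque
from dataclasses import dataclass

SAMPLE_RATE_HZ = 100  # the Wiimote reports accelerometer data at ~100 Hz

@dataclass
class AccelSample:
    t: float  # timestamp, seconds
    x: float  # acceleration per axis, in g
    y: float
    z: float

# Keep a two-second rolling window -- enough to hold one gesture.
window = deque(maxlen=2 * SAMPLE_RATE_HZ)

def on_report(t: float, x: float, y: float, z: float) -> None:
    """Callback invoked for each Bluetooth report from the Wiimote driver (hypothetical)."""
    window.append(AccelSample(t, x, y, z))
```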
Output Devices
• Auditory (headphones)
• Tactile display (belt)
  • Wireless (Bluetooth)
  • 1.2 lbs (without battery)
  • Elastic belt
  • 8 tactors at 45-degree increments (mapping sketched below)
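With eight tactors at 45-degree increments, mapping a direction to the nearest tactor is a simple rounding operation. A minimal sketch, assuming tactor 0 sits at the wearer's front and indices increase clockwise:

```python
NUM_TACTORS = 8                           # tactors spaced at 45-degree increments
TACTOR_SPACING_DEG = 360 / NUM_TACTORS

def tactor_for_bearing(bearing_deg: float) -> int:
    """Map a direction (degrees clockwise from front) to the nearest tactor index."""
    return round(bearing_deg / TACTOR_SPACING_DEG) % NUM_TACTORS

# Usage: a cue at 90 degrees drives the tactor on the wearer's right side.
assert tactor_for_bearing(90) == 2
assert tactor_for_bearing(350) == 0  # wraps back around to the front tactor
```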
Tactile Patterns
• Tactons emulating standard Army hand signals (FM 21-60); an illustrative playback sketch follows
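A tacton can be modeled as a timed sequence of tactor activations. The pattern and driver interface below are hypothetical, not an actual FM 21-60 encoding or the belt's real API:

```python
import time

# A tacton as a sequence of (active_tactor_indices, duration_seconds) steps.
# This three-pulse, all-tactor pattern is purely illustrative.
HALT_TACTON = [(set(range(8)), 0.25), (set(), 0.15)] * 3

def play_tacton(steps, set_tactors):
    """Drive the belt through each timed step of a tacton.

    `set_tactors` stands in for whatever call the belt driver exposes
    to energize a set of tactor indices (hypothetical here).
    """
    for active, duration in steps:
        set_tactors(active)
        time.sleep(duration)
    set_tactors(set())  # leave the belt off when the pattern ends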
Gesture Recognition
• Machine learning algorithms (3 implemented for evaluation)
  • Linear classifier
  • AdaBoost
  • Artificial neural network (evolved with NEAT)
• 29 features
  • Based on Rubine's (1991) work on 2D symbol recognition
  • Example features (illustrated in the sketch below):
    • Bounding Volume Length
    • Min, Max, Median, Mean (X, Y, Z)
    • Starting Angle, Total Angle Traversed, Total Gesture Distance
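As an illustration of the feature-extraction step, the sketch below computes a few Rubine-style features over a gesture trace. It is a small, assumed subset; the actual system uses 29 features whose exact definitions are not reproduced here.

```python
import math
import statistics

def extract_features(samples):
    """Compute a few Rubine-style features from a gesture trace.

    `samples` is a list of (x, y, z) accelerometer readings; this is an
    illustrative subset, not the project's actual 29-feature set.
    """
    xs, ys, zs = zip(*samples)

    features = {}
    # Per-axis summary statistics
    for axis, vals in (("x", xs), ("y", ys), ("z", zs)):
        features["min_" + axis] = min(vals)
        features["max_" + axis] = max(vals)
        features["mean_" + axis] = statistics.mean(vals)
        features["median_" + axis] = statistics.median(vals)

    # Diagonal length of the axis-aligned bounding volume of the trace
    features["bounding_volume_length"] = math.dist(
        (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))
    )

    # Total path length traversed through acceleration space
    features["total_distance"] = sum(
        math.dist(a, b) for a, b in zip(samples, samples[1:])
    )
    return features
```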
Training & Visualization UI
• Arbitrary gesture sets
• Left-hand, right-hand, and two-handed gestures
• 3D animated soldier
• Text label display
• Sound display
• Tactile display
• Wiimote visualization
• Data serialization (see the sketch below)
• UI independent of the recognition API
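A minimal sketch of what serializing labeled training gestures could look like; the JSON layout is an assumption, not the tool's actual format:

```python
import json

def save_gestures(path, labeled_gestures):
    """Write labeled gesture traces to disk for later training.

    `labeled_gestures` maps a gesture label (e.g. "halt") to a list of
    recorded traces, each a list of (x, y, z) samples.
    """
    with open(path, "w") as f:
        json.dump(labeled_gestures, f)

def load_gestures(path):
    """Read the traces back, restoring each sample as a tuple."""
    with open(path) as f:
        return {
            label: [[tuple(s) for s in trace] for trace in traces]
            for label, traces in json.load(f).items()
        }
```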
Experiments & Results
• Several experiments run to date, with different algorithms and gesture sets
• Accuracy > 94%
• Classification time < 10 ms per gesture
• Linear classifier: best overall performer when training time and classification performance are considered together (scoring step sketched below)
• AdaBoost: highest accuracy, but slower to train than the linear classifier
• ANN with NEAT: worst performer; requires more training data
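For reference, the scoring step of a Rubine-style linear classifier is one dot product per class. The sketch below assumes per-class weights have already been learned; the training procedure is omitted:

```python
def classify(feature_vector, class_weights):
    """Return the label whose linear score (bias + w . f) is highest.

    `class_weights` maps each gesture label to a (bias, weights) pair.
    """
    def score(bias, weights):
        return bias + sum(w * f for w, f in zip(weights, feature_vector))

    return max(class_weights, key=lambda label: score(*class_weights[label]))

# Usage with hypothetical two-feature weights:
labels = {"halt": (0.1, [0.5, -0.2]), "rally": (0.0, [0.3, 0.4])}
print(classify([1.0, 2.0], labels))  # -> "rally"
```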
Discussion
• Proof of concept established: the system can accurately convert and transmit a visual communication mode into a non-visual form
• The Wiimote is a convenient and inexpensive device for experimentation; the technology transfers to more robust hardware (e.g., an instrumented glove)
• The Wiimote produces some ambiguous data (e.g., static poses); additional sensors (e.g., gyroscopes) are required for greater accuracy
• Experiments indicate a promising form of communication; more experiments are needed
Future Work
• Determine the maximum number of gestures that can be accurately recognized
• Gesture rejection
• Dynamic mapping between gesture, sound, and tactile sequence
• Scenario development for realistic experimentation (establishing context)
• Transmitting signal data via RF (currently sent to a local device or via UDP/IP; see the sketch below)
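For the last item, sending a recognized signal to another device over UDP/IP takes only the standard socket API; the address and message encoding below are assumptions:

```python
import socket

RECEIVER = ("192.168.1.50", 9000)  # hypothetical teammate device

def send_signal(label: str) -> None:
    """Transmit a recognized gesture label to a teammate over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(label.encode("utf-8"), RECEIVER)

send_signal("halt")
```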
USMA Collaboration
• CDT Robert Darket and CDT Zachary Schaeffer (principal investigators)
• Application: training
  • Collect exemplar gestures from subject-matter experts (SMEs)
  • Validate less-experienced soldiers' gestures against the exemplars (see the sketch below)
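One simple way to validate a trainee's gesture against SME exemplars is to compare feature vectors. The Euclidean distance metric and threshold below are illustrative assumptions, not the investigators' actual method:

```python
import math

def matches_exemplars(trainee_features, exemplar_features, threshold=1.0):
    """Accept the trainee's gesture if it is close to any SME exemplar.

    All arguments are feature vectors (lists of floats); the distance
    metric and threshold are illustrative choices.
    """
    return any(
        math.dist(trainee_features, exemplar) <= threshold
        for exemplar in exemplar_features
    )
```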
Questions?

Gestural communications with accelerometer-based input devices and multi-modal displays

Paul D. Varcholik
ACTIVE Laboratory, Institute for Simulation and Training
University of Central Florida
pvarchol@ist.ucf.edu

James L. Merlo
LTC, US Army
US Military Academy, West Point
james.merlo@usma.edu