
ASLLENGE

He Zhou, Hui Zheng, William Mai, Xiang Guo. Faculty Advisor: Prof. Patrick A. Kelly



Presentation Transcript


  1. ASLLENGE

  He Zhou, Hui Zheng, William Mai, Xiang Guo. Faculty Advisor: Prof. Patrick A. Kelly

  Abstract: Human gesture recognition can be extremely challenging, since gestures provide no auditory information to the learner. The human brain distinguishes words by hearing sounds of different intensity, pitch, and tone; gestures instead rely on streams of visual information that are much more physically demanding to produce. Gestures let people express themselves through distinctive facial, eye, head, and body movements. The overall goal of ASLLENGE is to design and implement a modern tool that can be easily used to record and compare human gestures to enhance communication. The main components of the system are a Microsoft Kinect-based arm-motion tracking device and a finger-tracking module.

  Hardware: User Interface, Flex Sensor, BlueSMiRF Silver, Arduino Mega 2560. (Figure: circuitry between the flex-sensor glove and the Arduino board.)

  Cost for the Project: the cost table gives the detailed cost of the project; if it were to be recreated, the total cost would be $431.

  Department of Electrical and Computer Engineering, ECE 415/ECE 416 – Senior Design Project 2012, College of Engineering, University of Massachusetts Amherst, SDP 12
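The flex-sensor glove produces analog voltages that the Arduino Mega 2560 digitizes with its 10-bit ADC. A minimal host-side sketch of that conversion step, assuming hypothetical calibration readings (`flat_raw`, `bent_raw`) that are not from the project itself:

```python
def adc_to_bend_angle(raw, flat_raw=300, bent_raw=700, max_angle=90.0):
    """Map a 10-bit ADC reading from a flex sensor to an approximate
    bend angle in degrees.

    flat_raw / bent_raw are hypothetical calibration readings for a
    flat and a fully bent finger; real values depend on the specific
    sensor and voltage-divider circuit.
    """
    # Clamp to the calibrated range, then interpolate linearly.
    raw = max(flat_raw, min(bent_raw, raw))
    return (raw - flat_raw) / (bent_raw - flat_raw) * max_angle

print(adc_to_bend_angle(300))  # flat finger -> 0.0
print(adc_to_bend_angle(500))  # halfway    -> 45.0
print(adc_to_bend_angle(700))  # fully bent -> 90.0
```

On the device itself the raw value would come from `analogRead()` on the Arduino; the linear map is only a first approximation, since flex-sensor resistance is not perfectly linear in bend angle.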

  2. Detailed Flow Chart

  Motion recognition rules:
  1: The starting/ending of a gesture is when both hands are at the hip.
  2: When using one hand, the other hand must be resting on the hip.
  3: The start and stop are controlled by the controller.

  Data flow: The user does a gesture. Data from the Kinect camera is collected through the 3D depth sensors and RGB camera, and the Kinect skeleton is generated by the Kinect SDK. A flex sensor is inserted into each glove and collects an analog signal while the fingers bend; the Arduino Mega 2560 works as an A/D converter, and the BlueSMiRF Silver transmits the data to a PC wirelessly. Data from the finger tracker is split into left and right. Finally, the system checks a threshold: the motion is either within the threshold or exceeds it.

  Two simple motions:
  Motion 1: (1) Raise the left hand in a frontal motion above your head. (2) Lower the left hand in a frontal motion to the stop position. (3) Raise the right hand in a frontal motion above your head. (4) Lower the right hand in a frontal motion to the stop position.
  Motion 3: (1) Raise both hands in a frontal motion above your head at the same time. (2) Lower both hands in a frontal motion to the stop position at the same time.
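The flow above (gesture boundaries at the hip, then a threshold check on the tracked motion) could be sketched roughly as follows. The joint names, coordinate values, and threshold are illustrative assumptions, not the project's actual Kinect SDK code:

```python
def hands_at_hip(skeleton, tol=0.10):
    """Rule 1: a gesture starts/ends when both hands rest at the hips.

    `skeleton` is a hypothetical dict of joint heights (meters); real
    code would read joint positions from the Kinect SDK skeleton.
    """
    return (abs(skeleton["hand_left_y"] - skeleton["hip_y"]) < tol and
            abs(skeleton["hand_right_y"] - skeleton["hip_y"]) < tol)

def check_threshold(observed, template, threshold=0.25):
    """Compare an observed hand trajectory against a stored template
    (e.g. Motion 1 or Motion 3) using mean absolute deviation."""
    error = sum(abs(o - t) for o, t in zip(observed, template)) / len(template)
    return "within threshold" if error <= threshold else "exceeds threshold"

# Hypothetical skeleton frame: both hands near hip level -> gesture boundary.
frame = {"hand_left_y": 0.92, "hand_right_y": 0.88, "hip_y": 0.90}
print(hands_at_hip(frame))  # True

# Observed trajectory close to the stored template -> recognized.
print(check_threshold([0.9, 1.4, 1.8, 1.4, 0.9],
                      [0.9, 1.5, 1.8, 1.5, 0.9]))  # within threshold
```

A simple per-sample deviation check like this is only one way to implement the "Check Threshold" box; the slide does not specify the actual comparison metric used.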
