
MULTI-LINGUAL AND DEVICELESS COMPUTER ACCESS FOR DISABLED USERS

This research aims to provide multi-lingual and deviceless computer access for disabled users using hand gestures. By mapping gestures to characters, it offers a simple and cost-effective way for individuals with disabilities to communicate and control computers. The study explores a novel approach that captures user gestures, generates codes, and executes operations based on the gestures. By eliminating the need for glove-based sensing technologies, it enhances natural human-computer interaction for disabled users. This innovative system involves image capturing, pre-processing, edge detection, tracking, code generation, and action execution stages. The methodology focuses on mapping hand gestures to language characters to facilitate communication and computer access for disabled individuals.


Presentation Transcript


  1. MULTI-LINGUAL AND DEVICELESS COMPUTER ACCESS FOR DISABLED USERS C. Premnath and J. Ravikumar S.S.N. College of Engineering, Tamil Nadu

  2. Abstract • The hand is one of the most effective interaction tools for HCI. • Currently, the only technology that satisfies the advanced requirements is glove-based sensing. • We present a new tool for gesture analysis, based on a simple gesture-to-symbol/character mapping especially suited to disabled users.

  3. AIM: To use the user's hand gestures to help them control/access the computer easily. OBJECTIVES: • To provide a simple and cheap system of communication for people with single or multiple disabilities • To overcome language problems in communication/computer access.

  4. INTRODUCTION • Difficulties and impairments reduce computer use. • Direct use of the hand as an input device is an attractive method for providing natural human-computer interaction (HCI).

  5. Currently, the only technology that satisfies the advanced requirements of hand-based input for HCI is glove-based sensing. • It hinders the ease and naturalness with which the user can interact with the computer-controlled environment. • It requires long calibration and setup procedures.

  6. Glove-Based Approach • The basic operation is to sense the gesture by electric/magnetic contact or by monitoring threshold values in chemical/electrode-based sensors. • The sensors are connected to a control unit to identify the gesture. • Drawbacks: cost, dexterity and flexibility.

  7. It hinders the ease and naturalness with which the user can interact with the computer-controlled environment. • It requires long calibration and setup procedures.

  8. Computer vision has the potential to provide much more natural, non-contact solutions.

  9. Gesture recognition methods • Model-based approach • Appearance-based approach

  10. Model-Based Approach • Model based approaches estimate the position of a hand by projecting a 3-D hand model to image space and comparing it with image features. • The steps involved are: • Extracting a set of features from the input images • Projecting the model on the scene (or back-projecting the image features in 3D) • Establishing a correspondence between groups of model and image features

  11. Appearance-Based Approach • Appearance-based approaches estimate hand postures directly from the images after learning the mapping from image feature space to hand configuration space. • The feature vectors obtained are compared against user templates to determine the user whose hand photograph was taken.

  12. • These approaches require considerable research on mapping and other relevant work. • They actually allow us to create simple and cost-effective systems.

  13. Systems providing computer access for people with disabilities • JavaSpeak • Parses the program and "speaks" the program's structure to a blind user. • ViaVoice, which has a published API, is used as the speech reader.

  14. Emacspeak • Provides functions geared directly to programming. • Suitable only for someone familiar with a UNIX environment.

  15. METHODOLOGY • A novel approach that maps the character set of the language to the possible set of hand gestures and executes the actions mapped to the particular gesture. • Capture the user's gesture • Manipulate it and create a 5-digit code • Execute the required system operation • User-friendliness – providing audio requests.

  16. The phases involved • IMAGE CAPTURING • PRE-PROCESSING • EDGE DETECTION • EDGE TRACKING • CODE GENERATION • ACTION EXECUTION

  17. Image Capturing • Setup and capture

  18. Pre-processing • Synthetic image • An arithmetic operation is performed with the different channels

  19. a) Sample input image b) Synthetic image

  20. Edge Detection • Need for edge detection • Edges • Edge detection methods • Fourier domain • Spatial domain

  21. GradientMagnitude operation • Spatial Domain Method • Performs convolution operations on the source image using kernels.
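The convolution-based GradientMagnitude operation can be sketched as follows: convolve the image with a horizontal and a vertical kernel and combine the two responses into one edge strength per pixel. The 2-D int array stands in for the synthetic image, and the choice of Sobel kernels is an assumption; the slide only says the operation uses kernels.

```java
// Sketch of spatial-domain gradient-magnitude edge detection,
// assuming 3x3 Sobel kernels for the horizontal and vertical gradients.
public class GradientMagnitude {
    static final int[][] KX = {{-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1}}; // horizontal
    static final int[][] KY = {{-1, -2, -1}, {0, 0, 0}, {1, 2, 1}}; // vertical

    // Gradient magnitude at interior pixel (x, y): sqrt(gx^2 + gy^2).
    static double magnitudeAt(int[][] img, int x, int y) {
        double gx = 0, gy = 0;
        for (int i = -1; i <= 1; i++) {
            for (int j = -1; j <= 1; j++) {
                gx += KX[i + 1][j + 1] * img[y + i][x + j];
                gy += KY[i + 1][j + 1] * img[y + i][x + j];
            }
        }
        return Math.sqrt(gx * gx + gy * gy);
    }

    public static void main(String[] args) {
        // Vertical step edge: left half dark, right half bright.
        int[][] img = {
            {0, 0, 255, 255},
            {0, 0, 255, 255},
            {0, 0, 255, 255},
            {0, 0, 255, 255},
        };
        System.out.println(magnitudeAt(img, 1, 1)); // strong response on the edge
    }
}
```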

  22. Sample Output a) Synthetic image b) Edge detection output

  23. Edge Tracking • Find critical points.

  24. Let us see in detail how we trace the fingertip shown below. In-depth fingertip image

  25. Tracing the finger valley shown below is done in the exact reverse manner as discussed for the fingertip. In-depth finger valley image

  26. Output after edge tracking a) Critical points marked with red dots. b) Finger length using the Pythagorean theorem.
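The finger-length step on this slide can be sketched as the straight-line distance between a fingertip critical point and the adjacent finger-valley critical point: the two coordinate offsets form the legs of a right triangle, and the finger length is the hypotenuse. The `Point` record and the coordinate values are illustrative, not taken from the paper.

```java
// Minimal sketch of the Pythagorean finger-length computation, assuming
// pixel coordinates for the fingertip and valley critical points.
public class FingerLength {
    record Point(double x, double y) {}

    // Euclidean distance between the fingertip and the finger valley.
    static double fingerLength(Point tip, Point valley) {
        double dx = tip.x() - valley.x();
        double dy = tip.y() - valley.y();
        return Math.sqrt(dx * dx + dy * dy); // hypotenuse of the x/y offsets
    }

    public static void main(String[] args) {
        Point tip = new Point(40, 10);    // fingertip pixel coordinates (example)
        Point valley = new Point(37, 14); // neighbouring valley point (example)
        System.out.println(fingerLength(tip, valley)); // prints 5.0
    }
}
```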

  27. CODE GENERATION • Using phalanx information

  28. Information about the phalanxes of the right hand

  29. Values to be assigned • 1 if the finger is open. • 0 if the finger is half-closed, i.e., only the proximal phalanx is visible. • We already have the full finger-length information of the user. • During code generation: • 1 is assigned when the obtained finger length approximately matches the stored value. • 0 is assigned when the obtained finger length is half that of the corresponding one in the database.

  30. 5 fingers - 2 values each • Overall 32 (2×2×2×2×2) action gestures.
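The code-generation rule from the last two slides can be sketched as follows: each finger contributes a 1 when its measured length is close to the stored full length and a 0 when it is roughly half (only the proximal phalanx visible), giving a 5-digit binary code with 2^5 = 32 possible gestures. The 25% tolerance and the stored lengths below are assumptions for illustration, not values from the paper.

```java
// Sketch of 5-digit gesture-code generation from per-finger lengths.
public class CodeGenerator {
    // Returns 1 for an open finger, 0 for a half-closed one.
    // The 25% matching tolerance is an assumed parameter.
    static int fingerBit(double measured, double storedFull) {
        double tolerance = 0.25 * storedFull;
        if (Math.abs(measured - storedFull) <= tolerance) return 1;
        return 0; // treated as half-closed (about storedFull / 2)
    }

    // Builds the 5-digit code, one bit per finger, e.g. "10110".
    static String gestureCode(double[] measured, double[] storedFull) {
        StringBuilder code = new StringBuilder();
        for (int i = 0; i < 5; i++) {
            code.append(fingerBit(measured[i], storedFull[i]));
        }
        return code.toString();
    }

    public static void main(String[] args) {
        double[] stored = {60, 80, 90, 85, 65};   // calibrated full lengths (example)
        double[] measured = {58, 41, 88, 84, 30}; // lengths from the current frame
        System.out.println(gestureCode(measured, stored)); // prints "10110"
        // 5 fingers x 2 states each = 2^5 = 32 distinct action gestures.
    }
}
```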

  31. Multilingualism • Map the gestures, currently associated only with English characters, to the characters in other languages by analyzing the phonetics and their translation to English.

  32. For example, the corresponding • Tamil characters, • Hindi characters, • Telugu characters, • Malayalam characters, • can all be mapped to the English letter 'A'.
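The phonetic mapping above can be sketched as a lookup table from an English letter to its phonetic equivalents per language. The specific characters below (the 'a' vowels of each script) and the language set are my assumed examples; the slide does not list the actual glyphs.

```java
// Illustrative sketch: English letter -> phonetically equivalent
// characters in other languages, with fallback to the English letter.
import java.util.Map;

public class MultilingualMap {
    // Assumed example entries; the real table would cover the full alphabet.
    static final Map<Character, Map<String, String>> EQUIVALENTS = Map.of(
        'A', Map.of("Tamil", "அ", "Hindi", "अ", "Telugu", "అ", "Malayalam", "അ")
    );

    static String resolve(char english, String language) {
        return EQUIVALENTS.getOrDefault(english, Map.of())
                          .getOrDefault(language, String.valueOf(english));
    }

    public static void main(String[] args) {
        System.out.println(resolve('A', "Hindi"));  // prints the Hindi equivalent
        System.out.println(resolve('A', "French")); // falls back to "A"
    }
}
```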

  33. • In European languages the alphabet is almost similar. • Voice-engine support is important. • Latin and non-Latin languages, where we have no space between words (Hindi and Arabic), are supported by tailoring the run-time speech engine FreeTTS.

  34. ACTION EXECUTION • The tree panel • Acquires the path information of a file/folder whenever that particular file/folder is selected by the user's input. • The filename is passed to the speech synthesizer unit and verified by the user.

  35. The JMF player controls the browsing work. • For example, if the character 'A' is passed to the file manager, it passes the next file/folder name starting with the letter 'A' to the JMF player. • File operations depend on • the type of the file selected (media/text) and the user's input gesture • Pass the file to the JMF player unit • Execute the appropriate operations
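The browsing behaviour described above can be sketched as a lookup that advances to the next entry whose name starts with the gesture's letter. The in-memory file list, the method name, and the wrap-around policy are assumptions; in the real system the result would go to the JMF player and the speech synthesizer.

```java
// Sketch of letter-driven file browsing: find the next file after the
// current position whose name starts with the given character.
import java.util.List;

public class FileBrowser {
    // Returns the next matching file after `fromIndex` (wrapping around),
    // or null when no file starts with that letter.
    static String nextFileStartingWith(List<String> files, int fromIndex, char letter) {
        int n = files.size();
        for (int step = 1; step <= n; step++) {
            String name = files.get((fromIndex + step) % n);
            if (Character.toUpperCase(name.charAt(0)) == Character.toUpperCase(letter)) {
                return name;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<String> files = List.of("album.mp3", "budget.txt", "anthem.wav", "notes.txt");
        // Gesture mapped to 'A': skip budget.txt, land on anthem.wav.
        System.out.println(nextFileStartingWith(files, 0, 'A')); // prints anthem.wav
    }
}
```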

  36. Features • Low cost and user-friendliness. • Flexibility to change the gesture mapping based on the user's comfort. • Ambidexterity.

  37. Limitations • Gesture mapping for languages with large character sets like Chinese and Japanese. • Voice support from the speech engine

  38. Conclusion • A novel approach for providing computer access to disabled users through a multilingual method. • Overcomes problems related to the user's age and physical measures. • Supports illiterate users.

  39. FUTURE WORK • Use both hands as input, aided with touch-pad technology, for computer access. • 1024 (2^10) values – taking 2 values for each of the ten fingers. • Assume the ten-bit code 10000 10010 is associated with the word "Pause"; then the system would type the word "Pause" if the environment is a text editor and PAUSE the current music track if the environment is a music player.
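The slide's "Pause" example can be sketched as a two-step dispatch: a 10-bit code resolves to a word, and the word's effect then depends on the active environment. The code value, the environment names, and the return strings are assumptions built from the slide's example, not the authors' implementation.

```java
// Sketch of context-dependent dispatch for the proposed 10-bit codes:
// the same gesture types text in an editor but pauses a music player.
import java.util.Map;

public class TwoHandDispatch {
    // Assumed example mapping (the slide writes the code as "10000 10010").
    static final Map<String, String> CODE_TO_WORD = Map.of("1000010010", "Pause");

    static String execute(String tenBitCode, String environment) {
        String word = CODE_TO_WORD.get(tenBitCode);
        if (word == null) return "unmapped gesture";
        // Same gesture, different action depending on the environment.
        if (environment.equals("TEXT_EDITOR")) return "typed: " + word;
        if (environment.equals("MUSIC_PLAYER")) return "command: " + word.toUpperCase();
        return "no handler for " + environment;
    }

    public static void main(String[] args) {
        System.out.println(execute("1000010010", "TEXT_EDITOR"));  // typed: Pause
        System.out.println(execute("1000010010", "MUSIC_PLAYER")); // command: PAUSE
    }
}
```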

  40. • Map the gestures to system commands. • Extend to other applications currently inaccessible to disabled users.

  41. The project has proposed an ambidextrous system where computer access is all within your 5 fingers, and the proposed enhancement has the potential to bring the world into your hands.

  42. Thank you for listening patiently to our work. QUESTIONS? ravikumar.jayaraman@gmail.com, prem86ssn@gmail.com
