GESTURE CONTROLLED ROBOTIC ARM PRESENTED BY: PUSHKAR SHUKLA HITESH TRIPATHI SHAILESH BISHT PRIYANSHU RAJ
1. INTRODUCTION • Gesture is a form of communication that involves the movement of body parts. • Gesture-controlled robotics refers to using body movements as input signals to control a robot's motion. • Features are attached to the body parts; these features may themselves be sensors, or they may be objects that can be identified by external sensors such as a camera.
2. ROBOTIC ARM • A robotic arm is a robotic manipulator, usually programmable, with functions similar to those of a human arm. • The total degrees of freedom of a robotic arm are generally comparable to those of a human arm. • Like a human arm, the robotic arm has a shoulder, an elbow, a wrist and a hand. FIGURE 2
3. GUROO • GUROO is a gesture-controlled, programmable robotic arm with 5 degrees of freedom. • It was developed with the aim of simulating the human arm as closely as possible using a variety of mechanical components. • In the initial phase of the project the arm had six degrees of freedom, with motion restricted to one joint at a time. FIGURE 3
4. SENSOR SETUP • A setup consisting of IR sensors, accelerometers and a potentiometer is worn on the human arm to sense the gesture movements. • The infrared transmitter-receiver pair is worn on the fingers and controls the opening and closing of the end effector. • Two accelerometers sense the movement of the forearm and the wrist. • A potentiometer tracks the elbow movement.
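The slides contain no code, so the following is only a minimal Arduino-style sketch of how this sensor setup could be mapped onto the arm's servos; the pin assignments, servo channels and angle ranges are illustrative assumptions, not the actual GUROO wiring.

// Minimal sketch of the sensor setup described above (assumed wiring).
#include <Servo.h>

const int IR_RECEIVER_PIN = 2;    // IR receiver worn on the fingers (digital input)
const int ELBOW_POT_PIN   = A0;   // potentiometer tracking the elbow
const int FOREARM_ACC_PIN = A1;   // one axis of the forearm accelerometer
const int WRIST_ACC_PIN   = A2;   // one axis of the wrist accelerometer

Servo gripper, elbow, forearm, wrist;

void setup() {
  pinMode(IR_RECEIVER_PIN, INPUT);
  gripper.attach(3);
  elbow.attach(5);
  forearm.attach(6);
  wrist.attach(9);
}

void loop() {
  // IR pair: beam broken -> close the end effector, otherwise open it.
  gripper.write(digitalRead(IR_RECEIVER_PIN) == LOW ? 120 : 30);

  // Map the raw 10-bit sensor readings onto servo angles (0-180 degrees).
  elbow.write(map(analogRead(ELBOW_POT_PIN), 0, 1023, 0, 180));
  forearm.write(map(analogRead(FOREARM_ACC_PIN), 0, 1023, 0, 180));
  wrist.write(map(analogRead(WRIST_ACC_PIN), 0, 1023, 0, 180));

  delay(20);
}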
5. ALGORITHMS USED • 1) LINEAR PREDICTION: The sensor and motor values of every joint are stored for certain predefined positions of each joint movement. The position of the robotic arm is then predicted linearly between these stored positions using the formulae given in the figures. FIGURE 4 FIGURE 5
FIGURE 6 FIGURE 7
2) CORRECTION: The predicted value is then compared to the present value and the difference between the two values is noted down. M2 - predefined value of the servo motor for the next known position. M1 - predefined value of the servo motor for the previous known position. S1 - predefined sensor value for the previous known position. S2 - predefined sensor value for the next known position.
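Since the prediction formulae themselves appear only in Figures 4-7 and are not reproduced in this transcript, the sketch below assumes plain linear interpolation between the two nearest stored positions, i.e. M = M1 + (S - S1) * (M2 - M1) / (S2 - S1), using the variable names defined above; the calibration numbers in the example are hypothetical.

#include <cstdio>

// Predict the servo command for the current sensor reading s by linear
// interpolation between the two nearest predefined positions:
// (S1, M1) = previous known position, (S2, M2) = next known position.
float predictServo(float s, float S1, float S2, float M1, float M2) {
  return M1 + (s - S1) * (M2 - M1) / (S2 - S1);
}

int main() {
  // Hypothetical calibration points for one joint (sensor value -> servo value).
  const float S1 = 300.0f, M1 = 40.0f;   // previous known position
  const float S2 = 700.0f, M2 = 130.0f;  // next known position

  float s = 520.0f;                                  // current sensor reading
  float predicted = predictServo(s, S1, S2, M1, M2); // = 89.5

  // Correction step: compare the prediction with the value the motor
  // actually reports and note the difference.
  float present = 86.0f;                             // actual motor position
  float correction = predicted - present;            // = 3.5

  std::printf("predicted = %.1f, present = %.1f, correction = %.1f\n",
              predicted, present, correction);
  return 0;
}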
6. APPLICATIONS • 1. Industrial Applications • Such arms may prove handy in sectors where the precision has to be adjusted from time to time. • Such arms make the controller's job easier and can be operated at higher speeds than the traditional robotic arms used in industry. • A combination of a traditional and a gesture-controlled robotic arm may prove very handy, giving the arm both flexibility and accuracy. • They can dispose of radioactive waste or other hazardous chemicals that are dangerous for human beings. • They can be used in mines and in space, where human intervention is not possible.
2. Defense • Can be used for bomb disposal, offering as much accuracy as a human arm while keeping humans out of danger. • 3. Medical Uses • Can be used by doctors to perform surgical operations at distant locations. • Such a technology can prove to be a helping hand for physically disabled or very old people.
9. CONCLUSION • In this paper an algorithm is proposed to control a gesture-based robotic arm. • The position of each motor is predicted from the sensory input, and the position is later corrected by comparing it to the actual position of the motor. • This algorithm helps reduce the effect of vibrations that may occur in a human arm, and hence it can find great use in the area of medical surgery.