GESTURE BASED COMPUTING
Team Members:
• D. Sai Goud - 08211A0569
• Satya Swarup Sahoo - 08211A0574
• G. Shivadeep - 08211A0576
• Rohit Reddy - 07211A0561
Project Guide: Mr. B. RAVINDER, Assistant Professor (CSE)
Batch No.: CSB33
Agenda
• Abstract
• Kinect
• Working with Kinect
• iRobot Create
• Working with Create
• Integrating Kinect and Create
• Use Case Diagram
• Class Diagram
• Implementation
• Progress
• Sample Code
• Conclusion
Abstract
Technology has advanced at a very high rate in recent years and has made human lives easy and comfortable. In the field of electronics and computers, people have developed a strong affinity for comfortable devices that involve very little interaction from the user. One such technology is Gesture Based Computing, where users have minimal physical involvement. This kind of computing uses sensors that can sense the user and detect the gestures the user makes. Hence, a good 3D sensor, the Kinect, is used for this purpose. It is a 3D sensor that can capture the user's movements and make them computable using its software development kit. It also has an impact in the field of robotics, where a user can control and operate a robot with his or her gestures.
Kinect:
• Kinect is the product of Microsoft's Project Natal and was launched in North America on November 4th, 2010.
• It is a well-equipped 3D sensing device, with an RGB camera, a depth sensor, and a multi-array microphone.
Kinect:
Working with Kinect
• Uses Microsoft Visual Studio 2010 and the Kinect SDK.
• There is no simulation environment when working with the Kinect; development is done directly against the hardware.
• Supports many different languages such as VB.NET, C++, and C#.NET.
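As a concrete illustration of this workflow, the sketch below finds a connected sensor, enables skeleton tracking, and checks one joint-based gesture using the Kinect for Windows SDK v1 managed API (Microsoft.Kinect). The "right hand above head" check and the class name are illustrative and are not taken from the slides.

```csharp
// Minimal sketch: find a connected Kinect, enable skeleton tracking and test a
// simple "right hand above head" gesture (Kinect for Windows SDK v1 managed API).
using System;
using System.Linq;
using Microsoft.Kinect;

class GestureTracker
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) return;                 // no sensor plugged in

        sensor.SkeletonStream.Enable();             // only skeleton data is needed for gestures
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
        sensor.Start();

        Console.ReadLine();                         // keep the console application alive
        sensor.Stop();
    }

    static void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);

            foreach (Skeleton s in skeletons)
            {
                if (s.TrackingState != SkeletonTrackingState.Tracked) continue;
                // Illustrative gesture: right hand raised above the head.
                if (s.Joints[JointType.HandRight].Position.Y >
                    s.Joints[JointType.Head].Position.Y)
                {
                    Console.WriteLine("Right hand raised");
                }
            }
        }
    }
}
```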
iRobot Create:
Working with iRobot Create:
• Uses Microsoft Robotics Developer Studio.
• Both real-time and simulation environments are available when working with the iRobot Create.
• Supports many different languages such as VB.NET, C++, and C#.NET.
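The project drives the Create through Microsoft Robotics Developer Studio services; purely as a lower-level illustration, the sketch below talks directly to the Create's Open Interface over a serial link, which is what a Bluetooth serial adapter ultimately exposes to the PC. The port name, speeds, and helper names are assumptions.

```csharp
// Illustrative sketch of the iRobot Create Open Interface over a serial link
// (a Bluetooth adapter appears to the PC as a virtual COM port).
using System.IO.Ports;

class CreateDriver
{
    const byte OpStart = 128, OpFull = 132, OpDrive = 137;    // Open Interface opcodes
    public const short Straight = unchecked((short)0x8000);   // special "drive straight" radius
    SerialPort port;

    public void Connect(string portName)                      // e.g. "COM3" (placeholder)
    {
        port = new SerialPort(portName, 57600);                // Create's default baud rate
        port.Open();
        port.Write(new[] { OpStart, OpFull }, 0, 2);           // start the OI, take full control
    }

    // velocity in mm/s (-500..500), radius in mm (-2000..2000, or Straight)
    public void Drive(short velocity, short radius)
    {
        byte[] cmd =
        {
            OpDrive,
            (byte)(velocity >> 8), (byte)velocity,
            (byte)(radius >> 8),   (byte)radius
        };
        port.Write(cmd, 0, cmd.Length);
    }

    public void Stop()
    {
        Drive(0, Straight);
    }
}
```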
Integrating Kinect and Create:
The user's gestures are scanned by the Kinect, and the corresponding action to be performed by the iRobot Create is sent to it via Bluetooth.
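Building on the two sketches above, a minimal glue routine might translate a recognised gesture into a drive command sent over the Bluetooth serial link. The gesture names and speeds here are assumptions, not the project's actual mapping.

```csharp
// Illustrative glue between the two sketches above: a recognised gesture is
// translated into a Create drive command sent over the Bluetooth serial link.
static class GestureToDrive
{
    public static void OnGesture(string gesture, CreateDriver robot)
    {
        switch (gesture)
        {
            case "RightHandRaised": robot.Drive(200, CreateDriver.Straight);  break; // forward
            case "LeftHandRaised":  robot.Drive(-200, CreateDriver.Straight); break; // reverse
            case "HandsDown":       robot.Stop();                             break;
        }
    }
}
```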
Use Case Diagram:
Actors: User, Kinect, Robot.
• Make gestures (User)
• Capture gestures from the user (Kinect)
• Convert the gesture to a functionality (Kinect)
• Getting connected through services (Robot)
• Perform the operation (Robot)
Class Diagram:
• Kinect: skeleton; joints; identifyGestures(); compute()
• User: makeGestures(); performAction()
• Robot: _mainPort; getConnection(); performAction()
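A C# skeleton of the three classes follows; the grouping of members is an assumption inferred from the flattened slide text, and the bodies are placeholders rather than the project's actual implementation.

```csharp
// Skeleton of the three classes from the diagram; the member grouping is an
// assumption based on the flattened slide text, and the bodies are placeholders.
using Microsoft.Kinect;

class Kinect
{
    public Skeleton skeleton;                  // tracked skeleton from the sensor
    public JointCollection joints;             // joint positions of that skeleton

    public string identifyGestures() { /* compare joint positions */ return "None"; }
    public void compute() { /* map the identified gesture to a command */ }
}

class User
{
    public void makeGestures() { /* the user moves in front of the sensor */ }
    public void performAction() { /* e.g. a key press is simulated on the PC */ }
}

class Robot
{
    private object _mainPort;                  // MRDS service port in the project
    public void getConnection() { /* open the Bluetooth link to the Create */ }
    public void performAction() { /* execute the drive command */ }
}
```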
Implementation Details:
Progress:
• We have successfully mapped several gestures to keyboard functionality, through which we can reproduce various keys of the keyboard (such as BACKSPACE, the ARROW KEYS, TAB, etc.).
• We have been successful in controlling a robot through our system, and we are now integrating gesture based computing with it so that the robot can be controlled through our gestures.
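As an illustration of the keyboard mapping, recognised gestures can be turned into key presses with System.Windows.Forms.SendKeys. The gesture names and the gesture-to-key mapping below are assumptions; the slides only state that BACKSPACE, TAB, and the arrow keys are supported.

```csharp
// Sketch of mapping recognised gestures to key presses with SendKeys.
// The gesture names and the gesture-to-key mapping are assumptions.
using System.Windows.Forms;

static class GestureKeyMap
{
    public static void Send(string gesture)
    {
        switch (gesture)
        {
            case "SwipeLeft":   SendKeys.SendWait("{LEFT}");      break;
            case "SwipeRight":  SendKeys.SendWait("{RIGHT}");     break;
            case "SwipeUp":     SendKeys.SendWait("{UP}");        break;
            case "SwipeDown":   SendKeys.SendWait("{DOWN}");      break;
            case "PushForward": SendKeys.SendWait("{TAB}");       break;
            case "PullBack":    SendKeys.SendWait("{BACKSPACE}"); break;
        }
    }
}
```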
Conclusion
A good part of our project is that it can be integrated with many user applications and reduce physical involvement to a large extent. Currently we are integrating it with the iRobot Create robot using Microsoft Robotics Developer Studio. The outcome of the project will provide the user with the power of controlling the robot using his or her gestures.