Human-in-the-Loop Control of an Assistive Robot Arm Katherine Tsui and Holly Yanco University of Massachusetts, Lowell
Challenge Using a standard controller, put the ball in the cup. 1. Turn the cup over. 2. Pick up the ball. 3. Put the ball in the cup. Average time of execution: ~3-5 minutes, even by middle school children with video game experience! Although this is a simplified example, similar tasks may occur repeatedly throughout a handicapped person’s daily life.
Motivation • Why? • Unintuitive controllers • Operator sensory overload • For severely handicapped people, activities of daily life are difficult enough to perform. • While an assistive robotic device like ours allows for limited independence, it can be frustrating and tiresome to operate. • Let’s abstract this away…
Hardware • Manus Assistive Robotic Manipulator (ARM) by Exact Dynamics • 6 Degrees of Freedom (DoF) • plus 2 DoF gripper end-effector • Joint encoders • Cameras • Shoulder view • Gripper view
Standard Control Movement in the “out of the box” configuration is done through menus accessed from single-switch, keypad, or joystick input.
Standard Control: Using the Joint Menu Direct Joint Mode + Direct control of individual joints - Cannot move multiple joints simultaneously - Not how humans do it… we don’t think in terms like move shoulder up, rotate wrist, extend forearm, etc.
Standard Control: Using the Cartesian Menu Direct Cartesian Mode + Gripper moves linearly in 3D + Joints can move simultaneously in space and time - Still not how humans do it… we don’t think in terms of moving left, right, up, down, etc.
Alternative Control • Transparent Mode • ARM has Controller Area Network (CAN) communication with PC • ARM transmits status packages at 20ms intervals. • t=20ms: message 0x350 gives ARM status and position • t=40ms: message 0x360 gives gripper position • t=60ms: message 0x37F asks for return package • t=80ms: message 0x350… • Every 60ms, when message 0x37F is sent, movement information can be returned as ARM input.
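To make the cycle concrete, here is a minimal sketch of a PC-side listener for Transparent Mode, written with the python-can library. The 20 ms cadence and the message IDs 0x350, 0x360, and 0x37F come from the slide above; the payload layout and the command ID used for the reply are hypothetical placeholders, not the actual Manus protocol.

```python
# Sketch of the Transparent Mode cycle. The status cadence and the IDs
# 0x350/0x360/0x37F are from the slide; COMMAND_ID and the 8-byte payload
# format are assumptions for illustration only.
import can

STATUS_ID  = 0x350  # ARM status and position
GRIPPER_ID = 0x360  # gripper position
REQUEST_ID = 0x37F  # ARM asks the PC for a return package
COMMAND_ID = 0x300  # hypothetical ID for our movement reply

def control_cycle(bus: can.BusABC, next_motion) -> None:
    """Listen for the ARM's 20 ms status messages; when the ARM requests
    input (0x37F), send back the next movement increment."""
    arm_state, gripper_state = None, None
    while True:
        msg = bus.recv(timeout=0.1)          # block up to 100 ms
        if msg is None:
            continue
        if msg.arbitration_id == STATUS_ID:
            arm_state = msg.data             # raw status/position bytes
        elif msg.arbitration_id == GRIPPER_ID:
            gripper_state = msg.data
        elif msg.arbitration_id == REQUEST_ID:
            # Every 60 ms the ARM expects a return package; reply with the
            # controller's next motion command (8-byte payload assumed).
            payload = next_motion(arm_state, gripper_state)
            bus.send(can.Message(arbitration_id=COMMAND_ID,
                                 data=payload, is_extended_id=False))

if __name__ == "__main__":
    bus = can.interface.Bus(channel="can0", interface="socketcan")
    control_cycle(bus, lambda arm, grip: bytes(8))  # "no motion" frames
```

The key point of the cycle is that the ARM, not the PC, paces the exchange: movement input is only accepted when the 0x37F request arrives.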
How should the ARM move? Like humans do! Think: I want the cell phone. Actions: See, reach, grasp. However, the intended users may not be capable of these actions, so we simplify.
Selection Process Given what the user sees directly ahead of them, and assuming the desired, unobstructed object is within reach… zoom in on the cell phone!
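As an illustration of this selection step, the sketch below maps a user's click in the camera view to a target pixel and a zoomed confirmation view. The use of OpenCV, the function select_target, and the crop/zoom parameters are assumptions for illustration, not the interface actually used.

```python
# Hedged sketch of the selection step: the user clicks the object in the
# shoulder-camera view, and the interface zooms in so the choice can be
# confirmed. The crop size and zoom factor are illustrative.
import cv2

def select_target(frame, click_xy, zoom=4, roi=60):
    """Return the clicked pixel and a zoomed crop around it."""
    x, y = click_xy
    h, w = frame.shape[:2]
    x0, y0 = max(x - roi, 0), max(y - roi, 0)
    x1, y1 = min(x + roi, w), min(y + roi, h)
    crop = frame[y0:y1, x0:x1]
    zoomed = cv2.resize(crop, None, fx=zoom, fy=zoom,
                        interpolation=cv2.INTER_NEAREST)
    return (x, y), zoomed
```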
Movement • From the user selection, we know the (x, y) position to which the ARM should move. • How do we move there? • By using Phission and joint encoder feedback to determine movement length, speed, and direction: • Phission: We’ve trained on the color we want to track. While the center of the blob is not near the desired (x, y), move toward it (sketched below). • Feedback: Monitor ARM status and position.
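The following is a minimal sketch of that visual-servoing loop under stated assumptions: BlobTracker stands in for the Phission color tracker and arm for the ARM's Cartesian interface; both APIs, along with the gain and tolerance values, are hypothetical placeholders rather than the actual implementation.

```python
# Sketch of the blob-centering loop described above. `tracker` stands in for
# the Phission color tracker and `arm` for the ARM's Cartesian interface;
# both interfaces are hypothetical, and the constants are illustrative.

TOLERANCE_PX = 10      # "near enough" to the desired pixel position
GAIN = 0.5             # proportional gain: pixels of error -> mm of motion

def move_to_selection(tracker, arm, target_xy):
    """Drive the gripper until the tracked blob centers on the user's
    selected (x, y) pixel, using vision for direction and the joint
    encoders (via the ARM's status feedback) to confirm the motion."""
    tx, ty = target_xy
    while True:
        bx, by = tracker.blob_center()         # tracked color blob (pixels)
        ex, ey = tx - bx, ty - by              # pixel error
        if abs(ex) <= TOLERANCE_PX and abs(ey) <= TOLERANCE_PX:
            arm.stop()
            return
        # Command a small Cartesian step proportional to the pixel error,
        # then read the encoders back to verify the ARM actually moved.
        arm.move_relative(dx=GAIN * ex, dy=GAIN * ey)
        arm.wait_until_settled()               # uses status/position feedback
```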
Drop for Z Depth information is deduced through simulated stereo vision. Two images are taken sequentially as the gripper moves along the y-axis; the baseline B is known. The disparity between the images yields the depth Z, and the ARM moves “close” to the desired object. Z = B·f / (x_L - x_R)
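For concreteness, here is a small worked example of the disparity formula above. The function name and the numeric values are illustrative only; B is the known distance the gripper travels between the two snapshots and f is the camera focal length in pixels.

```python
# Worked example of Z = B*f / (x_L - x_R) from the "Drop for Z" step.
# The 50 mm sweep, 600 px focal length, and 40 px blob shift are made-up
# values for illustration.

def depth_from_disparity(baseline_mm: float, focal_px: float,
                         x_left: float, x_right: float) -> float:
    """Depth of the tracked object from two views taken a baseline apart."""
    disparity = x_left - x_right          # pixel shift of the blob center
    if disparity == 0:
        raise ValueError("zero disparity: object too far to estimate depth")
    return baseline_mm * focal_px / disparity

# 50 mm sweep, 600 px focal length, 40 px shift -> 750 mm estimated depth
print(depth_from_disparity(50.0, 600.0, x_left=320.0, x_right=280.0))
```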
Future Work • Distance sensing: laser • Non-rigid stereo vision