Visual Servoing and Target Tracking CH24 in Robotics Handbook Presented by Wen Li, Ph.D. student, Texas A&M University
Outline • Visual Servo Control • Image based visual servo • Position based visual servo • Hybrid visual servo and other issues • Target Tracking
Visual Servo Control • Vision-based robot control task: • USE computer vision data • to CONTROL the motion of a robot
Visual Servo Control • Camera configuration: • Eye-in-hand • Fixed in the workspace
Visual Servo Control (Block diagram: servoing architecture, with image data extraction feeding the control law.)
Visual Servo Control • Basic Components • Image features • Error function • Velocity controller • Interaction matrix
Visual Servo Control • Basic Components • Image features • Error function • Velocity controller • Interaction matrix s(m(t), a); a is a set of parameters that represent potential additional knowledge about the system (e.g., camera intrinsic parameters); m(t) is a set of image measurements (e.g., image coordinates of interest points). s* contains the desired values of the features.
Visual Servo Control • Basic Components • Image features • Error function • Velocity controller • Interaction matrix e(t) = s(m(t), a) - s*. The aim of the control scheme is to minimize the error e(t). At the desired pose, e(t) = 0.
Visual Servo Control • Basic Components • Image features • Error function • Velocity controller • Interaction matrix The control law: vc = -λ L̂e⁺ e, where vc is the spatial velocity of the camera, taken as the input to the robot controller, and λ is a positive gain. Problem: what is the form of Ls?
Visual Servo Control • Basic Components • Image features • Error function • Velocity controller • Interaction matrix Ls is the interaction matrix, which describes the relationship between the time variation of s and the camera velocity vc: ṡ = Ls vc. Since e = s - s*, Le = Ls, and L̂e⁺ is an approximation of the pseudo-inverse of Le. Problem: how to estimate L̂e⁺ -- this depends on the design of s (a sketch of the resulting controller follows below).
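As a concrete reading of the two slides above, here is a minimal sketch of the generic velocity controller, assuming NumPy and an already-estimated interaction matrix; the gain value and function name are illustrative, not from the chapter.

```python
import numpy as np

def velocity_command(s, s_star, L_hat, lam=0.5):
    """Generic visual servo law: v_c = -lambda * pinv(L_hat) @ e.

    s, s_star : current and desired feature vectors
    L_hat     : estimate of the interaction matrix (k x 6)
    Returns the camera spatial velocity (vx, vy, vz, wx, wy, wz).
    """
    e = s - s_star                           # e(t) = s(m(t), a) - s*
    return -lam * np.linalg.pinv(L_hat) @ e  # drives e exponentially toward zero
```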
Visual Servo Control • Categories: • Image based control • Position based control
Outline • Visual Servo Control • Image based servo control • Position based servo control • Hybrid visual servo and other issues • Target Tracking
Image Based Visual Servo (IBVS) (Block diagram: in IBVS, the features s(m(t), a) and the interaction matrix Ls are defined directly in the image space.)
Image Based Visual Servo (IBVS) • Image features s(m(t), a) • Traditionally, s is defined by the image-plane coordinates of a set of points: s = x = (x, y) (see the sketch below)
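A minimal sketch of how an image measurement m (pixel coordinates) and the intrinsic parameters a combine into the normalized feature x = (x, y); the intrinsic parameter names (fu, fv, cu, cv) are generic placeholders, not notation from the chapter.

```python
import numpy as np

def pixel_to_feature(u, v, fu, fv, cu, cv):
    """Convert a pixel measurement m = (u, v) into the normalized image-plane
    feature x = (x, y) using camera intrinsics a = (fu, fv, cu, cv)."""
    return np.array([(u - cu) / fu, (v - cv) / fv])
```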
Image Based Visual Servo (IBVS) • Interaction Matrix For a normalized image point x = (x, y) at depth Z, Lx = [ -1/Z, 0, x/Z, xy, -(1+x²), y ; 0, -1/Z, y/Z, 1+y², -xy, -x ]. The value Z is the depth of the point relative to the camera frame. Therefore, any control scheme that uses this form of the interaction matrix must estimate or approximate the value of Z. When Z is not known, Lx cannot be used directly; an approximation must be used. To control six degrees of freedom, at least three points are necessary (see the sketch below). There exist some configurations for which Lx is singular.
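A sketch of the IBVS computation for a set of image points, assuming normalized coordinates and roughly known (or approximated) depths Z; three or more points are stacked so the 2k x 6 matrix can be pseudo-inverted. The gain is illustrative.

```python
import numpy as np

def interaction_matrix_point(x, y, Z):
    """Standard 2x6 interaction matrix L_x for a normalized image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,        -(1.0 + x ** 2),  y],
        [0.0,      -1.0 / Z, y / Z, 1.0 + y ** 2, -x * y,          -x],
    ])

def ibvs_velocity(points, points_star, depths, lam=0.5):
    """Stack at least three points (6 DOF) and apply v_c = -lambda * pinv(L) @ e."""
    e = (np.asarray(points) - np.asarray(points_star)).ravel()
    L = np.vstack([interaction_matrix_point(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    return -lam * np.linalg.pinv(L) @ e
```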
Image Based Visual Servo (IBVS) • Effects of different estimates of Ls
Image Based Visual Servo (IBVS) • Advantages: • The positioning accuracy of the system is less sensitive to camera calibration errors • Computational advantage • Disadvantages: • Presence of singularity • Servoing in 2-D
Outline • Visual Servo Control • Image based servo control • Position based servo control • Hybrid visual servo and other issues • Target Tracking
Position Based Visual Servo (PBVS) • extract the image features -> • compute the current camera pose with respect to a reference frame attached to the object -> • compare it with the desired camera pose with respect to that reference frame (Figure: current and desired camera poses relative to the object.)
Position Based Visual Servo (PBVS) • Consider three coordinate frames: • The current camera frame • The desired camera frame • A reference frame attached to the object • c t_o gives the coordinates of the origin of the object frame relative to the current camera frame • c* t_o gives the coordinates of the origin of the object frame relative to the desired camera frame • R = c*R_c, the rotation matrix that gives the orientation of the current camera frame relative to the desired frame (a sketch of how this relative pose can be computed follows below)
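A small sketch of how the relative pose between the current and desired camera frames can be obtained from pose-estimation results, assuming 4x4 homogeneous transforms; the variable names are placeholders, not notation from the chapter.

```python
import numpy as np

def inv_T(T):
    """Inverse of a 4x4 homogeneous transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def relative_pose(c_T_o, cstar_T_o):
    """c_T_o    : object pose in the current camera frame (from pose estimation)
    cstar_T_o: object pose in the desired camera frame (known a priori)
    Returns (c*R_c, c*t_c): orientation and position of the current camera
    frame expressed in the desired camera frame."""
    cstar_T_c = cstar_T_o @ inv_T(c_T_o)
    return cstar_T_c[:3, :3], cstar_T_c[:3, 3]
```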
Position Based Visual Servo (PBVS) (Figure: current camera pose, desired camera pose, and the object frame o.)
Position Based Visual Servo (PBVS) • Define s = (t, θu) • t is a translation vector; θu is the angle/axis parameterization of the rotation • 1) t is defined relative to the object frame: s = (c t_o, θu), s* = (c* t_o, 0), e = (c t_o - c* t_o, θu)
Position Based Visual Servo (PBVS) • Define s = (t, θu) • t is a translation vector; θu is the angle/axis parameterization of the rotation • 2) t is defined relative to the desired camera frame: s = (c*t_c, θu), s* = 0, so e = s (see the sketch below)
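A sketch of the control law for this second choice of s, assuming the relative pose from the earlier sketch and SciPy for the angle/axis extraction; the gain is illustrative. Because e = s here, the translational and rotational parts decouple, which is why this design is usually credited with a straight-line camera trajectory in Cartesian space.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pbvs_velocity(R_cstar_c, t_cstar_c, lam=0.5):
    """PBVS law for s = (c*t_c, theta*u) with s* = 0 (so e = s).

    R_cstar_c : rotation c*R_c of the current camera frame w.r.t. the desired one
    t_cstar_c : translation c*t_c of the current camera frame in the desired frame
    """
    theta_u = Rotation.from_matrix(R_cstar_c).as_rotvec()  # angle/axis vector theta*u
    v = -lam * R_cstar_c.T @ t_cstar_c                     # translational velocity (current frame)
    w = -lam * theta_u                                     # rotational velocity
    return np.concatenate([v, w])
```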
Position Based Visual Servo (PBVS) Effects of different designs
Position Based Visual Servo (PBVS) • Advantages: • Possible to describe tasks in terms of a Cartesian pose, as is common in robotics • Disadvantages: • Sensitive to calibration errors • Depends on having an accurate model of the target object – a form of calibration • Servoing in 3-D
Outline • Visual Servo Control • Image based servo control • Position based servo control • Hybrid servo and other issues • Target Tracking
Hybrid servo and other extensions • Hybrid VS – combining 2-D and 3-D features • 2.5-D visual servo – augments the 2-D image features with the depth of the point and the rotation θu • With this choice of s: • The camera trajectory is a straight line • The image trajectory of the center of gravity of the object is also a straight line
Hybrid servo and other issues • Stereo vision system in IBVS • Because of the epipolar constraint, this approach actually requires 3-D parameters in s; thus, strictly speaking, it is a position-based approach
Outline • Visual Servo Control • Image based servo control • Position based servo control • Hybrid visual servo and other issues • Target Tracking
Target Tracking • Moving target => time-varying reference s*(t) • The error dynamics gain a term due to the (generally unknown) target motion: ė = Le vc + ∂e/∂t • The control law adds a feedforward term: vc = -λ L̂e⁺ e - L̂e⁺ ∂ê/∂t • Estimate ∂e/∂t, improving the estimate with a Kalman filter or more elaborate filtering methods (see the sketch below)
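A rough sketch of the tracking loop with the feedforward term, assuming NumPy; the finite-difference plus exponential-smoothing estimate of ∂e/∂t stands in for the Kalman filter mentioned on the slide, and all gains are illustrative.

```python
import numpy as np

class TrackingServo:
    """Visual servo loop with a feedforward estimate of de/dt for a moving target."""

    def __init__(self, lam=0.5, alpha=0.3, dt=1.0 / 30.0):
        self.lam, self.alpha, self.dt = lam, alpha, dt
        self.e_prev = None
        self.vc_prev = None
        self.de_dt_hat = None

    def step(self, e, L_hat):
        """e: current error vector (NumPy array); L_hat: interaction matrix estimate (k x 6)."""
        L_pinv = np.linalg.pinv(L_hat)
        if self.e_prev is not None:
            # Part of e_dot caused by camera motion is L_hat @ vc; the residual
            # approximates the target-motion term de/dt, smoothed exponentially.
            e_dot = (e - self.e_prev) / self.dt
            residual = e_dot - L_hat @ self.vc_prev
            self.de_dt_hat = (residual if self.de_dt_hat is None
                              else (1 - self.alpha) * self.de_dt_hat + self.alpha * residual)
        feedforward = L_pinv @ self.de_dt_hat if self.de_dt_hat is not None else 0.0
        vc = -self.lam * L_pinv @ e - feedforward   # vc = -lambda*L+ e - L+ de/dt_hat
        self.e_prev, self.vc_prev = e.copy(), vc
        return vc
```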
End Thanks!