Vision-Based Motion Control of Robots
Azad Shademan, Guest Lecturer
CMPUT 412 – Experimental Robotics
Computing Science, University of Alberta, Edmonton, Alberta, Canada
Vision-Based Control
[Figure: stereo pair (Left Image, Right Image) showing the current and desired image positions of features A and B]
Vision-Based Control
• Feedback from a visual sensor (camera) is used to control a robot.
• Also called "visual servoing."
• Why is it difficult? Images are 2D data, while the robot workspace is 3D geometry.
Where is the camera located?
• Eye-in-hand: the camera is mounted on the robot's end-effector.
• Eye-to-hand: the camera observes the robot from outside, e.g., hand/eye coordination.
Visual Servo Control Law
• Position-based: robust, real-time pose estimation combined with the robot's world-space (Cartesian) controller.
• Image-based: desired image features as seen from the camera; the control law is defined entirely in terms of image features.
Position-Based
[Block diagram: the controller is driven by the error between the desired pose and the estimated pose]
Image-Based
[Block diagram: the controller is driven by the error between the desired image feature and the extracted image feature]
Visual-Motor Equation
The image features x = [x1 x2 x3 x4] are a function of the joint angles q = [q1 … q6]: x = F(q). The Jacobian of this equation is important for motion control.
Visual-Motor Jacobian
The visual-motor Jacobian maps joint-space velocity to image-space velocity: ẋ = J(q) q̇, with J(q) = ∂F/∂q.
Image-Based Control Law
• Measure the error in image space.
• Calculate or estimate the inverse Jacobian.
• Update the joint values.
Image-Based Control Law
[Block diagram: the image-space error between the desired image feature and the extracted image feature drives the controller]
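To make these steps concrete, here is a minimal sketch of one image-based control update, assuming a simple proportional law and the pseudo-inverse of the estimated visual-motor Jacobian; the gain value is an illustrative assumption, not part of the lecture material.

```python
import numpy as np

def ibvs_step(x_desired, x_current, J_hat, gain=0.5):
    """One image-based visual servoing update: the image-space error, mapped
    through the pseudo-inverse of the estimated visual-motor Jacobian, gives
    a proportional joint-space command. Minimal sketch."""
    error = x_desired - x_current        # step 1: error measured in image space
    J_pinv = np.linalg.pinv(J_hat)       # step 2: (pseudo-)inverse of the Jacobian
    dq = gain * (J_pinv @ error)         # step 3: joint update
    return dq
```

In practice this update would run at the camera frame rate, with `J_hat` refreshed by one of the estimators discussed on the following slides.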
Jacobian Calculation
• Known model (calibrated): an analytic form is available.
• Unknown model (uncalibrated): the Jacobian must be estimated.
Image Jacobian (Calibrated)
• The analytic form relates camera velocity to image-feature velocity and depends on depth estimates.
• The camera/robot transform is required.
• No flexibility: the controller is tied to the calibration.
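For reference, this is the standard analytic image Jacobian (interaction matrix) for a single point feature in normalized image coordinates; the slide itself does not spell out the formula, so treat this as a textbook sketch rather than the lecture's exact expression. Note the explicit dependence on the depth Z.

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """2x6 interaction matrix for a normalized image point (x, y) at depth Z,
    mapping camera velocity (vx, vy, vz, wx, wy, wz) to the image-plane
    velocity of the point. Textbook form; sketch only."""
    return np.array([
        [-1.0 / Z, 0.0,       x / Z, x * y,        -(1.0 + x * x),  y],
        [0.0,      -1.0 / Z,  y / Z, 1.0 + y * y,  -x * y,         -x],
    ])
```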
Image Jacobian (Uncalibrated)
• A popular local estimator: the recursive secant method (Broyden update).
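A minimal sketch of the rank-1 Broyden (secant) update mentioned above, written for a Jacobian estimate J that maps joint increments dq to image-feature increments dx; the guard against tiny motions is an implementation assumption.

```python
import numpy as np

def broyden_update(J, dq, dx):
    """Rank-1 secant (Broyden) update of the visual-motor Jacobian estimate.
    J: (m, n) current estimate, dq: joint increment (n,), dx: observed
    image-feature increment (m,). Minimal sketch."""
    dq = dq.reshape(-1, 1)
    dx = dx.reshape(-1, 1)
    denom = float(dq.T @ dq)
    if denom < 1e-12:                 # skip the update for negligible motion
        return J
    return J + ((dx - J @ dq) @ dq.T) / denom
```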
Calibrated vs. Uncalibrated
• Calibrated: the model is derived analytically; global asymptotic stability and optimal planning are possible, but a lot of prior knowledge about the model is required.
• Uncalibrated (traditionally): relaxed model assumptions, but only local methods, no global planning, and it is difficult to show that the asymptotic-stability condition is ensured; the problem of traditional methods is their locality.
• Research result: global model estimation enables optimal trajectory planning and a global stability guarantee.
Synopsis of Global Visual Servoing
• Model estimation (uncalibrated): visual-motor kinematics model.
• Global model: extending linear estimation (the visual-motor Jacobian) to nonlinear estimation.
• Our contributions:
  • K-NN regression-based estimation
  • Locally least squares estimation
Local vs. Global
• Local methods: the key idea is to use only the previous estimate to update the Jacobian.
  • RLS with forgetting factor: Hosoda and Asada '94
  • Rank-1 Broyden update: Jägersand et al. '97
  • Exploratory motion: Sutanto et al. '98
  • Quasi-Newton Jacobian estimation of a moving object: Piepmeier et al. '04
• Global methods: the key idea is to use all of the interaction history to estimate the Jacobian. This enables globally stable controller design and optimal path planning, which local methods do not.
K-NN Regression-Based Method
[Figure: the visual-motor function at a query point in joint space (q1, q2) is estimated from its k nearest neighbors, here 3-NN]
Locally Least Squares Method
[Figure: a local least-squares fit over the stored (X, q) samples returned by KNN(q) around the query joint configuration]
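Since these two slides are figures only, here is a hedged sketch of a locally-least-squares Jacobian estimate over the k nearest neighbors in joint space; the data layout (arrays `Q` and `X` of visited joint configurations and image features) and the choice of the nearest sample as the reference point are assumptions, not necessarily the lecture's exact formulation.

```python
import numpy as np

def lls_jacobian(q_query, Q, X, k=8):
    """Locally-least-squares estimate of the visual-motor Jacobian at q_query.
    Q: (N, n) visited joint configurations, X: (N, m) corresponding image
    features. Uses the k nearest neighbours of q_query and solves
    min_J sum_i ||(x_i - x_0) - J (q_i - q_0)||^2. Minimal sketch."""
    dist = np.linalg.norm(Q - q_query, axis=1)
    idx = np.argsort(dist)[:k + 1]       # nearest sample acts as (q_0, x_0)
    q0, x0 = Q[idx[0]], X[idx[0]]
    dQ = Q[idx[1:]] - q0                 # (k, n) joint displacements
    dX = X[idx[1:]] - x0                 # (k, m) feature displacements
    # Least-squares solve of dQ @ J.T ~= dX
    J_T, *_ = np.linalg.lstsq(dQ, dX, rcond=None)
    return J_T.T                         # (m, n) Jacobian estimate
```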
Experimental Setup
• Puma 560
• Eye-to-hand configuration
• Stereo vision
• Features: projection of the end-effector position onto the two image planes (4-dimensional)
• 3 DOF used for control
Measuring the Estimation Error
Global Estimation Error
Noise on Estimation Quality
With increasing noise level, the error decreases. [Plots: KNN, LLS]
Effect of Number of Neighbors
Conclusions
• Presented two global methods to learn the visual-motor function.
• LLS (global) works better than KNN (global) and local updates.
• KNN suffers from bias in the local estimates.
• Noise helps system identification.
Eye-in-Hand Simulator
[Simulator screenshots, repeated across several slides]
Mean-Squared-Error
Task Errors
Questions?
Position-Based
• Robust and real-time relative pose estimation.
• An Extended Kalman Filter (EKF) is used to solve the nonlinear relative pose equations.
• Cons:
  • The EKF is not an optimal estimator.
  • Performance and convergence of the pose estimates are highly sensitive to the EKF parameters.
Overview of PBVS
• What kind of nonlinearity? 2D-3D nonlinear point correspondences (IEKF).
• T. Lefebvre et al., "Kalman Filters for Nonlinear Systems: A Comparison of Performance," Intl. J. of Control, vol. 77, no. 7, pp. 639-653, May 2004.
EKF Pose Estimation
• State variable: the relative pose, including yaw, pitch, and roll.
• Process noise and measurement noise enter the model.
• The measurement equation is nonlinear and must be linearized.
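As background for this slide, here is a compact sketch of the generic EKF predict/update cycle with a linearized measurement model; the function handles `f`, `h` and their Jacobians `F`, `H` are placeholders standing in for the pose-dynamics and camera-projection models, not the lecture's exact equations.

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One EKF iteration. x, P: state (e.g., pose: translation + yaw/pitch/roll)
    and covariance; z: measurement (image points); f, h: process and
    measurement models with Jacobians F, H; Q, R: noise covariances. Sketch."""
    # Predict
    x_pred = f(x)
    F_k = F(x)
    P_pred = F_k @ P @ F_k.T + Q
    # Update: the nonlinear measurement equation h is linearized via H
    H_k = H(x_pred)
    y = z - h(x_pred)                        # innovation
    S = H_k @ P_pred @ H_k.T + R             # innovation covariance
    K = P_pred @ H_k.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x_new)) - K @ H_k) @ P_pred
    return x_new, P_new
```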
Visual Servoing Based on the Estimated Global Model
Control Based on Local Models
Estimation for Local Methods
• In practice: rank-1 Broyden estimation, RLS with a forgetting factor, etc.
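As an illustration of the RLS option, here is a hedged sketch of a recursive-least-squares Jacobian update with a forgetting factor, in the spirit of Hosoda and Asada '94; the matrix form and the default forgetting factor are assumptions rather than the lecture's exact formulation.

```python
import numpy as np

def rls_jacobian_update(J, P, dq, dx, lam=0.95):
    """Recursive least-squares update of the visual-motor Jacobian with
    forgetting factor lam in (0, 1]. J: (m, n) estimate, P: (n, n)
    inverse-correlation matrix, dq: joint increment (n,), dx: image-feature
    increment (m,). Minimal sketch."""
    dq = dq.reshape(-1, 1)
    dx = dx.reshape(-1, 1)
    denom = lam + float(dq.T @ P @ dq)
    K = (P @ dq) / denom                 # gain shared by every row of J
    J_new = J + (dx - J @ dq) @ K.T      # rank-1 correction toward the new data
    P_new = (P - K @ (dq.T @ P)) / lam   # covariance update with forgetting
    return J_new, P_new
```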