Explore the concepts of linearly independent vectors, eigenvectors, span, similarity transformations, and least-squares solutions. Learn about rank and nullspace, as well as the properties of the Singular Value Decomposition (SVD). Discover how to solve problems using the pseudoinverse and how to enforce orthonormality constraints on rotation matrices. Dive into nonlinear parameter estimation techniques such as the Newton and Levenberg-Marquardt iterations.
Linearly independent vectors • Vectors v_1, …, v_n are linearly independent if c_1 v_1 + … + c_n v_n = 0 only when c_1 = … = c_n = 0 • span(V): the span of a set of vectors V = {v_1, …, v_n} is the set of all linear combinations of the v_i, i.e. span(V) = {c_1 v_1 + … + c_n v_n : c_i scalars}
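A minimal NumPy check of these definitions (the example vectors are illustrative, not from the slides): a set of column vectors is linearly independent exactly when the matrix they form has full column rank.

```python
import numpy as np

# Stack the vectors v_i as columns; they are linearly independent
# iff the matrix has full column rank (example vectors are illustrative).
V = np.column_stack([[1, 0, 0], [0, 1, 0], [1, 1, 0]])
rank = np.linalg.matrix_rank(V)
print(rank)                # 2: the third column lies in the span of the first two
print(rank == V.shape[1])  # False -> the vectors are linearly dependent
```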
The eigenvalues of A are the roots of the characteristic equation det(A - λI) = 0 • Diagonal form of the matrix: A = S Λ S^-1 with Λ = diag(λ_1, …, λ_n) • The eigenvectors of A are the columns of S
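A short NumPy sketch of this diagonalization (the matrix A is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues are the roots of det(A - lambda*I) = 0;
# np.linalg.eig returns them together with the eigenvectors (columns of S).
lam, S = np.linalg.eig(A)
Lam = np.diag(lam)                                   # diagonal form Λ

print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))    # True: A = S Λ S^-1
```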
Similarity transform: if B = M^-1 A M, then A and B have the same eigenvalues • The eigenvector x of A corresponds to the eigenvector M^-1 x of B
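A quick numerical confirmation, with an arbitrary invertible M chosen purely for illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
M = np.array([[1.0, 2.0],
              [0.0, 1.0]])          # any invertible matrix

B = np.linalg.inv(M) @ A @ M        # similarity transform B = M^-1 A M

# A and B have the same eigenvalues.
print(np.allclose(np.sort(np.linalg.eigvals(A)),
                  np.sort(np.linalg.eigvals(B))))    # True

# An eigenvector x of A maps to the eigenvector M^-1 x of B.
lam, X = np.linalg.eig(A)
y = np.linalg.inv(M) @ X[:, 0]
print(np.allclose(B @ y, lam[0] * y))                # True
```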
Least Squares • More equations than unknowns: Ax = b with A of size m×n, m > n • Look for the solution which minimizes ||Ax - b||^2 = (Ax - b)^T (Ax - b) • Solve d/dx ||Ax - b||^2 = 0 • Same as the solution to the normal equations A^T A x = A^T b • LS solution: x = (A^T A)^-1 A^T b
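A small sketch comparing the normal-equations solution with NumPy's least-squares solver (the data are made up for illustration):

```python
import numpy as np

# Overdetermined system: 4 equations, 2 unknowns (illustrative data).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

# Normal-equations solution x = (A^T A)^-1 A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Library solver minimizing ||Ax - b||^2 gives the same result.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_normal, x_lstsq))    # True
```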
Properties of SVD • A = U Σ V^T • σ_i^2 are the eigenvalues of A^T A • The columns of U (u_1, u_2, u_3) are eigenvectors of A A^T • The columns of V (v_1, v_2, v_3) are eigenvectors of A^T A
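These properties can be verified numerically; a sketch for a random 3×3 matrix:

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)                     # A = U Σ V^T

# σ_i^2 are the eigenvalues of A^T A.
print(np.allclose(np.sort(s**2),
                  np.sort(np.linalg.eigvalsh(A.T @ A))))     # True

# Columns of U are eigenvectors of A A^T; columns of V are eigenvectors of A^T A.
print(np.allclose((A @ A.T) @ U, U * s**2))          # True
print(np.allclose((A.T @ A) @ Vt.T, Vt.T * s**2))    # True
```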
Solving Ax = b with the pseudoinverse: x = A^+ b, with the pseudoinverse of A equal to A^+ = V Σ^+ U^T, where Σ^+ contains 1/σ_i for all nonzero singular values and zero otherwise
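A sketch of building the pseudoinverse from the SVD and checking it against np.linalg.pinv (the matrix and tolerance are illustrative):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1e-12],      # nearly rank-deficient second column
              [0.0, 0.0]])
b = np.array([1.0, 2.0, 3.0])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Invert only the "nonzero" singular values; set the rest to zero.
tol = 1e-10
s_inv = np.where(s > tol, 1.0 / s, 0.0)

A_pinv = Vt.T @ np.diag(s_inv) @ U.T     # A^+ = V Σ^+ U^T
x = A_pinv @ b                           # least-squares solution of Ax = b

print(np.allclose(A_pinv, np.linalg.pinv(A, rcond=tol)))   # True
```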
Enforce orthonormality constraints on an estimated rotation matrix R’
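The slide states the goal but not the formula; a common way to enforce the constraint (an assumption here, not confirmed by the slide) is to take the SVD of R' and replace its singular values with 1, giving R = U V^T, with a sign correction so that det(R) = +1:

```python
import numpy as np

def closest_rotation(R_est):
    """Project an estimated 3x3 matrix onto the nearest rotation matrix
    (in the Frobenius norm) by forcing all singular values to 1."""
    U, _, Vt = np.linalg.svd(R_est)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])   # keep det(R) = +1
    return U @ D @ Vt

# A slightly non-orthonormal estimate R' (illustrative values).
R_est = np.array([[ 0.98, -0.15, 0.02],
                  [ 0.16,  0.99, 0.01],
                  [-0.02,  0.03, 1.01]])
R = closest_rotation(R_est)
print(np.allclose(R.T @ R, np.eye(3)))      # True: R is orthonormal
print(np.isclose(np.linalg.det(R), 1.0))    # True: R is a proper rotation
```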
Newton iteration • Measurement model: x = f(p), where f(·) is nonlinear, p is the parameter vector, and x is the measurement vector
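A minimal sketch of the damped update (J^T J + λI) Δ = J^T (x - f(p)), assuming a simple exponential model and a hand-coded Jacobian purely for illustration; λ = 0 gives the plain (Gauss-)Newton step, λ > 0 gives the Levenberg-Marquardt step:

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])           # sample points (illustrative)

def f(p):                                     # hypothetical nonlinear model f(p)
    return p[0] * np.exp(p[1] * t)

def jacobian(p):                              # J[i, j] = d f_i / d p_j
    return np.column_stack([np.exp(p[1] * t),
                            p[0] * t * np.exp(p[1] * t)])

x = f(np.array([2.0, -0.5]))                  # synthetic measurements
p = np.array([1.0, 0.0])                      # initial parameter guess
lam = 1e-3                                    # damping: 0 -> Gauss-Newton step

for _ in range(20):
    r = x - f(p)                              # residual between measurement and model
    J = jacobian(p)
    # Damped normal equations: (J^T J + λ I) Δ = J^T r
    delta = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), J.T @ r)
    p = p + delta

print(p)    # ≈ [2.0, -0.5], the parameters used to generate the measurements
```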