Linear Algebra Review CS479/679 Pattern Recognition, Dr. George Bebis
n-dimensional Vector • An n-dimensional vector v is denoted as a column vector with components x1, x2, . . . , xn. • Its transpose vT is the corresponding row vector: vT = (x1, x2, . . . , xn).
Inner (or dot) product • Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn), their dot product is defined as the scalar: v · w = vTw = x1y1 + x2y2 + . . . + xnyn
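A quick numeric illustration (a minimal NumPy sketch, not part of the original slides; the example vectors are my own):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, -1.0, 2.0])

# Inner (dot) product: sum of element-wise products, yielding a scalar.
dot1 = np.dot(v, w)   # 1*4 + 2*(-1) + 3*2 = 8.0
dot2 = v @ w          # equivalent, using the @ operator
print(dot1, dot2)     # 8.0 8.0
```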
Orthogonal / Orthonormal vectors • A set of vectors x1, x2, . . . , xn is orthogonal if xiTxj = 0 for every pair i ≠ j. • A set of vectors x1, x2, . . . , xn is orthonormal if, in addition, every vector has unit length: xiTxj = 0 for i ≠ j and xiTxi = 1.
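A small sketch of how one might check these conditions numerically (the vectors below are chosen only for illustration):

```python
import numpy as np

# Columns of X are the vectors x1, x2.
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

G = X.T @ X   # Gram matrix: entry (i, j) is xi . xj

# Orthogonal: all off-diagonal entries are 0; orthonormal: G is the identity.
is_orthogonal  = np.allclose(G - np.diag(np.diag(G)), 0.0)
is_orthonormal = np.allclose(G, np.eye(X.shape[1]))
print(is_orthogonal, is_orthonormal)   # True True
```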
Linear combinations • A vector v is a linear combination of the vectors v1, ..., vk if v = c1v1 + c2v2 + . . . + ckvk, where c1, ..., ck are scalars. • Example: any vector in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1).
Space spanning • A set of vectors S = (v1, v2, . . . , vk) spans a space W if every vector in W can be written as a linear combination of the vectors in S. • Example: the vectors i, j, and k span R3.
Linear dependence • A set of vectors v1, ..., vk is linearly dependent if at least one of them is a linear combination of the others: vj = c1v1 + . . . + cj-1vj-1 + cj+1vj+1 + . . . + ckvk (i.e., vj does not appear on the right side).
Linear independence • A set of vectors v1, ..., vk is linearly independent if the only solution of c1v1 + c2v2 + . . . + ckvk = 0 is c1 = c2 = . . . = ck = 0. • Example: the unit vectors i, j, and k are linearly independent, since c1i + c2j + c3k = (c1, c2, c3) = 0 forces c1 = c2 = c3 = 0.
Vector basis • A set of vectors (v1, ..., vk) is said to be a basis for a vector space W if (1) (v1, ..., vk) are linearly independent and (2) (v1, ..., vk) span W. • Standard bases: R2: (1, 0), (0, 1); R3: (1, 0, 0), (0, 1, 0), (0, 0, 1); Rn: e1, e2, . . . , en, where ei has a 1 in position i and 0 elsewhere.
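One practical way to test independence (and hence whether k vectors form a basis of Rk) is via the rank of the matrix whose columns are the vectors; a hedged NumPy sketch with made-up vectors:

```python
import numpy as np

# Stack the candidate basis vectors as columns (example vectors are my own).
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# The vectors are linearly independent iff the rank equals their number;
# three independent vectors in R^3 also span R^3, hence form a basis.
independent = np.linalg.matrix_rank(V) == V.shape[1]
print(independent)   # True
```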
Matrix Operations • Matrix addition/subtraction: the matrices must be of the same size. • Matrix multiplication: an m x n matrix times a q x p matrix gives an m x p matrix; the product is defined only when n = q.
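A minimal NumPy sketch of the shape rules (the specific sizes are only an illustration):

```python
import numpy as np

A = np.ones((2, 3))   # m x n  (2 x 3)
B = np.ones((3, 4))   # q x p  (3 x 4); multiplication requires n == q

C = A + A             # addition: same shape required
D = A @ B             # product has shape m x p
print(C.shape, D.shape)   # (2, 3) (2, 4)
```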
Symmetric Matrices • A matrix A is symmetric if it equals its transpose: A = AT, i.e., aij = aji for all i, j. • Example: any matrix whose entries are mirrored across the main diagonal.
Determinants • 2 x 2: det(A) = a11a22 - a12a21 • 3 x 3: det(A) = a11(a22a33 - a23a32) - a12(a21a33 - a23a31) + a13(a21a32 - a22a31) • n x n: computed recursively by cofactor expansion along any row or column.
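A short check of the 2 x 2 formula against NumPy (a sketch with an arbitrary example matrix, not from the slides):

```python
import numpy as np

A2 = np.array([[1.0, 2.0],
               [3.0, 4.0]])
# 2 x 2 determinant: a11*a22 - a12*a21
print(np.linalg.det(A2))     # -2.0 (approximately)
print(1.0*4.0 - 2.0*3.0)     # -2.0, hand computation agrees

A3 = np.random.rand(3, 3)
print(np.linalg.det(A3))     # 3 x 3 (or n x n) handled the same way
```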
Matrix Inverse • The inverse A-1 of a matrix A has the property: AA-1 = A-1A = I. • A-1 exists only if det(A) ≠ 0. • Terminology: • Singular matrix: A-1 does not exist. • Ill-conditioned matrix: A is close to being singular.
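A hedged NumPy sketch of the invertibility condition (the matrices below are my own examples):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

if abs(np.linalg.det(A)) > 1e-12:               # inverse exists only if det(A) != 0
    A_inv = np.linalg.inv(A)
    print(np.allclose(A @ A_inv, np.eye(2)))    # True: A A^-1 = I

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])                      # det = 0 -> singular, no inverse
print(np.linalg.det(S))                         # ~0.0
```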
Matrix Inverse (cont’d) • Properties of the inverse: (A-1)-1 = A, (AB)-1 = B-1A-1, (AT)-1 = (A-1)T, det(A-1) = 1 / det(A).
Matrix trace • The trace tr(A) is the sum of the diagonal elements of A. • Properties: tr(A + B) = tr(A) + tr(B), tr(cA) = c tr(A), tr(AB) = tr(BA), tr(A) = tr(AT).
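A quick numerical verification of these trace identities (a NumPy sketch using random matrices, added for illustration):

```python
import numpy as np

A = np.random.rand(3, 3)
B = np.random.rand(3, 3)

# tr(A) is the sum of the diagonal elements.
print(np.isclose(np.trace(A), A.diagonal().sum()))              # True
# tr(A + B) = tr(A) + tr(B) and tr(AB) = tr(BA).
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))   # True
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))             # True
```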
Rank of matrix • The rank of A is equal to the dimension of the largest square sub-matrix of A that has a non-zero determinant. • Example: a matrix whose largest sub-matrix with non-zero determinant is 3 x 3 has rank 3.
Rank of matrix (cont’d) • Alternative definition: the maximum number of linearly independent columns (or rows) of A. • Example: if one column of a 4 x 4 matrix is a linear combination of the others, the columns are linearly dependent and the rank is not 4.
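A sketch of the rank computation in NumPy (the matrix is my own example, not the one from the slide):

```python
import numpy as np

# Third column = first + second, so the columns are linearly dependent.
A = np.array([[1.0, 0.0, 1.0, 2.0],
              [0.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0, 3.0]])

print(np.linalg.matrix_rank(A))   # 3, not 4
```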
Eigenvalues and Eigenvectors • The vector v is an eigenvector of matrix A and λ is an eigenvalue of A if: Av = λv (assuming non-zero v). • Interpretation: the linear transformation implied by A cannot change the direction of the eigenvectors v, only their magnitude.
Computing λ and v • To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial: det(A - λI) = 0. • For each eigenvalue λ, the corresponding eigenvectors v are the non-zero solutions of (A - λI)v = 0. • Example: see the numeric sketch below.
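A minimal NumPy sketch (the 2 x 2 matrix is my own example; NumPy solves the eigenproblem directly rather than by expanding the polynomial by hand):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues are the roots of det(A - lambda*I) = 0.
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)   # eigenvalues 3 and 1 (order may vary)

# Each column of eigvecs satisfies A v = lambda v.
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True True
```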
Properties • Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n). • Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv for any non-zero scalar k). • Suppose λ1, λ2, ..., λn are the eigenvalues of A; then det(A) = λ1λ2 · · · λn and tr(A) = λ1 + λ2 + . . . + λn.
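A quick check of the determinant/trace identities (a NumPy sketch on a random matrix, added for illustration):

```python
import numpy as np

A = np.random.rand(4, 4)
lam = np.linalg.eigvals(A)

# det(A) equals the product of the eigenvalues, tr(A) equals their sum
# (imaginary parts of complex-conjugate pairs cancel, so take the real part).
print(np.isclose(np.linalg.det(A), np.prod(lam).real))   # True
print(np.isclose(np.trace(A), np.sum(lam).real))          # True
```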
Properties (cont’d) • If A has n distinct eigenvalues λ1, λ2, ..., λn, then the corresponding eigenvectors v1, v2, . . . , vn form a basis: (1) they are linearly independent and (2) they span Rn.
Matrix diagonalization • Given A, find P such that P-1AP is diagonal (i.e., P diagonalizes A). • Take P = [v1 v2 . . . vn], where v1, v2, . . . , vn are the eigenvectors of A; then P-1AP = D = diag(λ1, λ2, . . . , λn).
Matrix diagonalization (cont’d) • Example: see the numeric sketch below.
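A worked numeric example as a NumPy sketch (the matrix is my own choice, not the one from the original slide):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, P = np.linalg.eig(A)      # columns of P are the eigenvectors v1, v2

D = np.linalg.inv(P) @ A @ P   # P^-1 A P is diagonal
print(np.round(D, 6))          # diagonal entries are the eigenvalues 5 and 2
print(lam)                     # same eigenvalues (order may vary)
```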
Matrix decomposition • Let us assume that A is diagonalizable; then A can be decomposed as A = PDP-1, where D = diag(λ1, λ2, . . . , λn) and the columns of P are the eigenvectors of A.
Decomposition of symmetric matrices • The eigenvalues of symmetric matrices are all real. • The eigenvectors corresponding to distinct eigenvalues are orthogonal, and P can be chosen orthonormal, so that P-1 = PT and A = PDPT.
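A closing NumPy sketch of the symmetric case (the symmetric matrix below is my own example; numpy.linalg.eigh is the routine intended for symmetric/Hermitian matrices):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])          # symmetric: A = A^T

lam, P = np.linalg.eigh(A)               # real eigenvalues, orthonormal eigenvectors
D = np.diag(lam)

print(np.allclose(P.T @ P, np.eye(3)))   # True: P^-1 = P^T
print(np.allclose(A, P @ D @ P.T))       # True: A = P D P^T
```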