Lecture 12: Inner Product Space & Linear Transformation
Last Time
- Length and Dot Product in Rn
- Inner Product Spaces
- Orthonormal Bases: Gram-Schmidt Process
Elementary Linear Algebra, R. Larsen et al. (5th Edition)
TKUEE 翁慶昌 - NTUEE SCC_12_2007
Lecture 12: Inner Product Spaces & L.T.
Today
• Mathematical Models and Least Squares Analysis
• Inner Product Space Applications
• Introduction to Linear Transformations
Reading Assignment: Secs 5.4, 5.5, 6.1, 6.2
Next Time
• The Kernel and Range of a Linear Transformation
• Matrices for Linear Transformations
• Transition Matrix and Similarity
Reading Assignment: Secs 6.2-6.4
What Have You Actually Learned about Inner Product Spaces So Far?
Today
• Orthonormal Bases: Gram-Schmidt Process (Cont.)
• Mathematical Models and Least Squares Analysis
• Inner Product Space Applications
• Introduction to Linear Transformations
Gram-Schmidt orthonormalization process:
If B = {v1, v2, …, vn} is a basis for an inner product space V, let
w1 = v1
wi = vi − (<vi, w1>/<w1, w1>) w1 − … − (<vi, wi−1>/<wi−1, wi−1>) wi−1, for i = 2, …, n.
Then B' = {w1, w2, …, wn} is an orthogonal basis for V, and
B'' = {w1/||w1||, w2/||w2||, …, wn/||wn||} is an orthonormal basis for V.
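A minimal numerical sketch of the process above, assuming the standard Euclidean dot product on Rn (the function name gram_schmidt and the sample basis are illustrative, not from the slide):

```python
import numpy as np

def gram_schmidt(basis):
    """Orthonormalize a list of linearly independent vectors.

    Implements wi = vi - sum_j (<vi, wj>/<wj, wj>) wj,
    then normalizes each wi, as in the process stated above.
    """
    orthogonal = []
    for v in map(np.asarray, basis):
        w = v.astype(float)
        for u in orthogonal:
            w = w - (np.dot(v, u) / np.dot(u, u)) * u
        orthogonal.append(w)
    # Normalize to turn the orthogonal basis into an orthonormal one.
    return [w / np.linalg.norm(w) for w in orthogonal]

# Example: orthonormalize a basis of R^2.
u1, u2 = gram_schmidt([[1, 1], [0, 1]])
print(u1, u2, np.dot(u1, u2))  # the dot product is ~0
```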
Ex 7: (Applying the Gram-Schmidt orthonormalization process)
Apply the Gram-Schmidt process to the given basis.
Sol: Orthogonalize the basis vectors one at a time to obtain an orthogonal basis, then divide each vector by its length to obtain an orthonormal basis.
Ex 10: (Alternative form of the Gram-Schmidt orthonormalization process)
Find an orthonormal basis for the solution space of the given homogeneous system of linear equations.
Sol: Solve the system by Gauss-Jordan elimination to obtain one basis for the solution space; applying the Gram-Schmidt process to that basis gives an orthogonal basis, and normalizing gives an orthonormal basis.
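The slide's coefficient matrix did not survive extraction, so the sketch below runs the same computation on a made-up homogeneous system; scipy.linalg.null_space returns an orthonormal basis for the solution space directly:

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical coefficient matrix of a homogeneous system Ax = 0
# (a stand-in for the system on the original slide).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 2.0]])

N = null_space(A)   # columns form an orthonormal basis of the solution space
print(N.shape)                           # (4, 2): two basis vectors in R^4
print(np.allclose(A @ N, 0))             # every column solves Ax = 0
print(np.allclose(N.T @ N, np.eye(2)))   # the columns are orthonormal
```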
Keywords in Section 5.3:
• orthogonal set: 正交集合
• orthonormal set: 單範正交集合
• orthogonal basis: 正交基底
• orthonormal basis: 單範正交基底
• linearly independent: 線性獨立
• Gram-Schmidt process: Gram-Schmidt過程
Today
• Orthonormal Bases: Gram-Schmidt Process (Cont.)
• Mathematical Models and Least Squares Analysis
• Inner Product Space Applications
• Introduction to Linear Transformations
5.4 Mathematical Models and Least Squares Analysis
• Orthogonal complement of W:
Let W be a subspace of an inner product space V.
(a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W.
(b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, denoted W⊥ (read “W perp”).
• Notes: W ∩ W⊥ = {0}, since a vector orthogonal to itself must be the zero vector.
• Ex: In R3, if W is the xy-plane, then W⊥ is the z-axis.
Thm 5.13: (Properties of orthogonal subspaces)
Let W be a subspace of Rn. Then the following properties are true.
(1) dim(W) + dim(W⊥) = n
(2) Rn = W ⊕ W⊥
(3) (W⊥)⊥ = W
• Direct sum:
Let W1 and W2 be two subspaces of Rn. If each vector x in Rn can be uniquely written as a sum of a vector w1 from W1 and a vector w2 from W2, x = w1 + w2, then Rn is the direct sum of W1 and W2, and you can write Rn = W1 ⊕ W2.
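A sketch of these properties in coordinates, using made-up spanning vectors for W: W⊥ is the null space of the matrix whose rows span W, and the dimensions add up to n as Thm 5.13 states.

```python
import numpy as np
from scipy.linalg import null_space, orth

# Hypothetical spanning vectors for a subspace W of R^4 (rows of S).
S = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])

W = orth(S.T)            # orthonormal basis for W (as columns)
W_perp = null_space(S)   # basis for W-perp: vectors orthogonal to every row of S

n = S.shape[1]
print(W.shape[1] + W_perp.shape[1] == n)   # dim(W) + dim(W-perp) = n
print(np.allclose(W.T @ W_perp, 0))        # every basis vector of W is orthogonal to W-perp
```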
Thm 5.14: (Projection onto a subspace)
If {u1, u2, …, ut} is an orthonormal basis for the subspace S of V, and v is a vector in V, then
proj_S v = <v, u1>u1 + <v, u2>u2 + … + <v, ut>ut.
Ex 5: (Projection onto a subspace)
Find the projection of the vector v onto the subspace W.
Sol: Start from an orthogonal basis for W, normalize it to obtain an orthonormal basis for W, and then apply Thm 5.14.
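The slide's specific v and W were lost, so here is Thm 5.14 on illustrative data: orthonormalize a spanning set of W, then sum the terms <v, ui>ui.

```python
import numpy as np
from scipy.linalg import orth

# Illustrative data (not the slide's): project v onto W = span of the columns of S.
S = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
v = np.array([1.0, 2.0, 3.0])

U = orth(S)   # orthonormal basis u1, ..., ut for W (columns of U)
proj = sum(np.dot(v, U[:, i]) * U[:, i] for i in range(U.shape[1]))  # Thm 5.14
print(proj)
print(np.allclose(U.T @ (v - proj), 0))   # v - proj is orthogonal to W
```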
Thm 5.15: (Orthogonal projection and distance)
Let W be a subspace of an inner product space V, and let v be a vector in V. Then for all u in W with u ≠ proj_W v,
||v − proj_W v|| < ||v − u||
(proj_W v is the best approximation to v from W).
Pf: Write v − u = (v − proj_W v) + (proj_W v − u). The first term is orthogonal to W and the second lies in W, so by the Pythagorean theorem
||v − u||² = ||v − proj_W v||² + ||proj_W v − u||² > ||v − proj_W v||².
Notes:
(1) Among all the scalar multiples of a vector u, the orthogonal projection of v onto u is the one closest to v.
(2) Among all the vectors in the subspace W, the vector proj_W v is the closest vector to v.
Thm 5.16: (Fundamental subspaces of a matrix)
If A is an m×n matrix with column space R(A), row space R(Aᵀ), and null spaces N(A) and N(Aᵀ), then
(1) (R(A))⊥ = N(Aᵀ)
(2) (N(Aᵀ))⊥ = R(A)
(3) (R(Aᵀ))⊥ = N(A)
(4) (N(A))⊥ = R(Aᵀ)
Ex 6: (Fundamental subspaces)
Find the four fundamental subspaces of the given matrix.
Sol: Reduce A to reduced row-echelon form. The nonzero rows of the result span the row space R(Aᵀ), the pivot columns of A span the column space R(A), and solving Ax = 0 and Aᵀy = 0 gives bases for N(A) and N(Aᵀ).
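A sketch computing orthonormal bases for all four fundamental subspaces of an illustrative matrix (the slide's matrix is not recoverable), checking the complement pairs of Thm 5.16:

```python
import numpy as np
from scipy.linalg import orth, null_space

# Hypothetical rank-2 matrix standing in for the one on the original slide.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

col_A   = orth(A)           # R(A):   column space of A, a subspace of R^m
null_At = null_space(A.T)   # N(A^T): its orthogonal complement in R^m
row_A   = orth(A.T)         # R(A^T): row space of A, a subspace of R^n
null_A  = null_space(A)     # N(A):   its orthogonal complement in R^n

print(np.allclose(col_A.T @ null_At, 0))   # R(A) is orthogonal to N(A^T)
print(np.allclose(row_A.T @ null_A, 0))    # R(A^T) is orthogonal to N(A)
```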
Ex 3: Let W be a subspace of R4 spanned by the given vectors.
(a) Find a basis for W.
(b) Find a basis for the orthogonal complement of W.
Sol: Form the matrix A whose rows are the spanning vectors and reduce it to reduced row-echelon form.
• Notes: the nonzero rows of the reduced matrix form a basis for W, and the solution space of Ax = 0 gives a basis for W⊥.
Least squares problem: Ax = b (a system of m linear equations in n unknowns)
(1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x.
(2) When the system is inconsistent, how do we find the “best possible” solution, that is, the value of x for which the difference between Ax and b is smallest?
Least squares solution:
Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in Rn that minimizes ||Ax − b|| with respect to the Euclidean inner product on Rm. Such a vector is called a least squares solution of Ax = b.
• Note: The problem of finding the least squares solution of Ax = b is equivalent to the problem of finding an exact solution of the associated normal system AᵀAx = Aᵀb.
• Thm: For any linear system Ax = b, the associated normal system AᵀAx = Aᵀb is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A, and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.
• Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution. This solution is given by
x = (AᵀA)⁻¹Aᵀb.
Moreover, if W is the column space of A, then the orthogonal projection of b on W is
proj_W b = Ax = A(AᵀA)⁻¹Aᵀb.
Ex 7: (Solving the normal equations)
Find the least squares solution of the given system Ax = b, and find the orthogonal projection of b on the column space of A.
Sol: Form the associated normal system AᵀAx = Aᵀb and solve it.
Its solution x is the least squares solution of Ax = b, and Ax is the orthogonal projection of b on the column space of A.
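Ex 7's system was also lost in extraction, so the sketch below applies the same procedure to a made-up inconsistent system: solve the normal system AᵀAx = Aᵀb, then form the projection Ax.

```python
import numpy as np

# Hypothetical inconsistent system (a stand-in for Ex 7's data).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.0, 1.0, 3.0])

# Solve the associated normal system  A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
proj_b = A @ x_hat   # orthogonal projection of b on the column space of A

print(x_hat)
print(np.allclose(A.T @ (b - proj_b), 0))   # residual is orthogonal to col(A)
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # matches library least squares
```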
Keywords in Section 5.4:
• orthogonal to W: 正交於W
• orthogonal complement: 正交補集
• direct sum: 直和
• projection onto a subspace: 在子空間的投影
• fundamental subspaces: 基本子空間
• least squares problem: 最小平方問題
• normal equations: 正規方程式
6.1 Introduction to Linear Transformations
• Function T that maps a vector space V into a vector space W: T: V → W
V: the domain of T
W: the codomain of T
• Image of v under T:
If v is in V and w is in W such that T(v) = w, then w is called the image of v under T.
• The range of T: the set of all images of vectors in V.
• The preimage of w: the set of all v in V such that T(v) = w.
Ex 1: (A function from R2 into R2)
(a) Find the image of v = (−1, 2).
(b) Find the preimage of w = (−1, 11).
Sol: Apply the given rule for T to v to find its image; to find the preimage, set T(v1, v2) = w and solve the resulting system for v1 and v2. Thus {(3, 4)} is the preimage of w = (−1, 11).
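The rule for T did not survive extraction; a rule consistent with the stated preimage is T(v1, v2) = (v1 − v2, v1 + 2v2), since T(3, 4) = (−1, 11) checks out. Taking that rule as an assumption:

```python
import numpy as np

def T(v1, v2):
    # Assumed rule, consistent with the stated preimage: T(3, 4) = (-1, 11).
    return (v1 - v2, v1 + 2 * v2)

print(T(-1, 2))   # image of v = (-1, 2) under the assumed rule: (-3, 3)
print(T(3, 4))    # (-1, 11), so (3, 4) is a preimage of w

# Preimage of w by solving the 2x2 system  v1 - v2 = -1,  v1 + 2 v2 = 11.
M = np.array([[1.0, -1.0], [1.0, 2.0]])
print(np.linalg.solve(M, np.array([-1.0, 11.0])))   # [3. 4.]
```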
• Linear transformation:
Let V and W be vector spaces. The function T: V → W is called a linear transformation of V into W if the following two properties hold for all u, v in V and any scalar c.
(1) T(u + v) = T(u) + T(v) (addition in V on the left, addition in W on the right)
(2) T(cu) = cT(u) (scalar multiplication in V on the left, scalar multiplication in W on the right)
• Notes:
(1) A linear transformation is said to be operation preserving.
(2) A linear transformation from a vector space into itself is called a linear operator.
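A small numeric sketch of the two conditions (the helper preserves_operations and the sample maps are illustrative): it confirms a matrix-style rule is operation preserving while f(x) = x + 1 is not.

```python
import numpy as np

def preserves_operations(T, u, v, c):
    """Check the two conditions above at sample inputs u, v and scalar c."""
    adds = np.allclose(T(u + v), T(u) + T(v))    # addition in V -> addition in W
    scales = np.allclose(T(c * u), c * T(u))     # scalar mult in V -> scalar mult in W
    return adds and scales

u, v, c = np.array([1.0, 2.0]), np.array([3.0, -1.0]), 5.0
print(preserves_operations(lambda x: np.array([x[0] - x[1], x[0] + 2 * x[1]]), u, v, c))  # True: linear
print(preserves_operations(lambda x: x + 1.0, u, v, c))  # False: f(x) = x + 1 is not linear
```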
Ex 2: (Verifying a linear transformation T from R2 into R2)
Pf: Show that T(u + v) = T(u) + T(v) and T(cu) = cT(u) for all u, v in R2 and any scalar c.
Notes: Two uses of the term “linear”.
(1) f(x) = ax + b is called a linear function because its graph is a line.
(2) But f(x) = x + 1 is not a linear transformation from the vector space R into R, because it preserves neither vector addition nor scalar multiplication: f(x1 + x2) = x1 + x2 + 1, while f(x1) + f(x2) = x1 + x2 + 2.
• Zero transformation: T: V → W, T(v) = 0 for all v in V.
• Identity transformation: T: V → V, T(v) = v for all v in V.
• Thm 6.1: (Properties of linear transformations)
If T: V → W is a linear transformation, then
(1) T(0) = 0
(2) T(−v) = −T(v)
(3) T(u − v) = T(u) − T(v)
(4) If v = c1v1 + c2v2 + … + cnvn, then T(v) = c1T(v1) + c2T(v2) + … + cnT(vn).
Ex 4: (Linear transformations and bases)
Let T: R3 → R3 be a linear transformation whose values on the standard basis vectors (1,0,0), (0,1,0), (0,0,1) are given. Find T(2, 3, −2).
Sol: Write (2, 3, −2) = 2(1,0,0) + 3(0,1,0) − 2(0,0,1); since T is a L.T.,
T(2, 3, −2) = 2T(1,0,0) + 3T(0,1,0) − 2T(0,0,1).
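The three given images did not survive extraction; one standard set of values for this example is T(1,0,0) = (2,−1,4), T(0,1,0) = (1,5,−2), T(0,0,1) = (0,3,1). Taking those as an assumption, the computation above gives:

```python
import numpy as np

# Assumed images of the standard basis vectors (not recoverable from the slide).
T_e1 = np.array([2.0, -1.0, 4.0])
T_e2 = np.array([1.0, 5.0, -2.0])
T_e3 = np.array([0.0, 3.0, 1.0])

# (2, 3, -2) = 2 e1 + 3 e2 - 2 e3, so by linearity:
result = 2 * T_e1 + 3 * T_e2 - 2 * T_e3
print(result)   # [7. 7. 0.]
```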
Ex 5: (A linear transformation defined by a matrix)
The function T is defined as T(v) = Av for a given matrix A. Show that T is a linear transformation.
Sol: T(u + v) = A(u + v) = Au + Av = T(u) + T(v) (vector addition)
T(cu) = A(cu) = c(Au) = cT(u) (scalar multiplication)
Thm 6.2: (The linear transformation given by a matrix)
Let A be an m×n matrix. The function T defined by T(v) = Av is a linear transformation from Rn into Rm.
• Note: vectors in Rn are treated as n×1 column matrices, so that Av is an m×1 column matrix in Rm.
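A sketch of Thm 6.2 with an illustrative 3×2 matrix: T(v) = Av maps R2 into R3 and preserves both operations.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # an m x n = 3 x 2 matrix

T = lambda v: A @ v          # T: R^2 -> R^3, T(v) = Av

u, v, c = np.array([1.0, 2.0]), np.array([-3.0, 4.0]), 2.5
print(T(u).shape)                           # (3,): the image lives in R^3
print(np.allclose(T(u + v), T(u) + T(v)))   # True: preserves addition
print(np.allclose(T(c * u), c * T(u)))      # True: preserves scalar multiplication
```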
Ex 7: (Rotation in the plane)
Show that the L.T. T: R2 → R2 given by the matrix
A = [cos θ  −sin θ; sin θ  cos θ]
has the property that it rotates every vector in R2 counterclockwise about the origin through the angle θ.
Sol: Write v = (x, y) = (r cos α, r sin α) in polar coordinates, where
r: the length of v
α: the angle from the positive x-axis counterclockwise to the vector v.
Then T(v) = Av = (r cos(α + θ), r sin(α + θ)), where
r: the length of T(v)
α + θ: the angle from the positive x-axis counterclockwise to the vector T(v).
Thus T(v) is the vector that results from rotating the vector v counterclockwise through the angle θ.
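A quick numeric check of Ex 7, rotating (1, 0) through θ = π/2:

```python
import numpy as np

def rotation(theta):
    """Rotation matrix from Ex 7."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
Tv = rotation(np.pi / 2) @ v   # rotate 90 degrees counterclockwise
print(np.round(Tv, 10))        # [0. 1.]
print(np.isclose(np.linalg.norm(Tv), np.linalg.norm(v)))   # length is preserved
```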
Ex 8: (A projection in R3)
The linear transformation T: R3 → R3 given by
T(x, y, z) = (x, y, 0)
is called a projection in R3: it maps every vector in R3 to its orthogonal projection onto the xy-plane.
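And Ex 8 as a matrix map, assuming the standard rule T(x, y, z) = (x, y, 0) stated above:

```python
import numpy as np

# Projection onto the xy-plane: T(x, y, z) = (x, y, 0).
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

v = np.array([3.0, -2.0, 5.0])
print(P @ v)                  # [ 3. -2.  0.]
print(np.allclose(P @ P, P))  # projections are idempotent: P^2 = P
```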