Chapter 5 Inner Product Spaces
5.1 Length and Dot Product in Rn
• Notes: The length of a vector v = (v1, v2, ..., vn) in Rn is ||v|| = sqrt(v1^2 + v2^2 + ... + vn^2). The length of a vector is also called its norm.
• Notes: A vector of length 1 (||v|| = 1) is called a unit vector.
• Notes: The unit vector in the direction of v is u = v/||v||. The process of finding this vector is called normalizing the vector v (see the sketch below).
• A standard unit vector in Rn: e1 = (1, 0, ..., 0), e2 = (0, 1, ..., 0), ..., en = (0, 0, ..., 1).
• Ex: the standard unit vectors in R2: (1, 0), (0, 1); the standard unit vectors in R3: (1, 0, 0), (0, 1, 0), (0, 0, 1).
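The following is a minimal sketch (not part of the original slides) of computing a vector's norm and normalizing it; the vector v and the use of NumPy are illustrative assumptions.

```python
# Sketch: computing the norm ||v|| and the unit vector v/||v||.
import numpy as np

v = np.array([3.0, 4.0])
norm_v = np.linalg.norm(v)        # ||v|| = sqrt(3^2 + 4^2) = 5
u = v / norm_v                    # unit vector in the direction of v
print(norm_v)                     # 5.0
print(u)                          # [0.6 0.8]
print(np.linalg.norm(u))          # 1.0, so u is a unit vector
```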
• Notes: (Properties of distance) The distance between u and v is d(u, v) = ||u - v||, and
(1) d(u, v) >= 0
(2) d(u, v) = 0 if and only if u = v
(3) d(u, v) = d(v, u)
Euclidean n-space: Rn was defined to be the set of all ordered n-tuples of real numbers. When Rn is combined with the standard operations of vector addition, scalar multiplication, vector length, and the dot product, the resulting vector space is called Euclidean n-space.
Dot product and matrix multiplication: The dot product of u = (u1, u2, ..., un) and v = (v1, v2, ..., vn) is u · v = u1v1 + u2v2 + ... + unvn. If a vector in Rn is represented as an n×1 column matrix, then u · v = u^T v (a 1×1 matrix identified with a scalar), as in the sketch below.
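A minimal sketch (not from the slides) showing that the dot product agrees with the matrix product u^T v when the vectors are stored as column matrices; the example vectors are illustrative.

```python
# Sketch: the dot product u . v equals the matrix product u^T v
# when u and v are n x 1 column matrices.
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])    # 3x1 column matrix
v = np.array([[4.0], [5.0], [6.0]])    # 3x1 column matrix

dot_as_matrix = (u.T @ v).item()       # u^T v gives a 1x1 matrix
dot_direct = np.dot(u.ravel(), v.ravel())
print(dot_as_matrix, dot_direct)       # 32.0 32.0
```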
Note: The angle θ between two nonzero vectors u and v in Rn is given by cos θ = (u · v)/(||u|| ||v||), 0 <= θ <= π. The angle between the zero vector and another vector is not defined, because ||0|| = 0 makes the quotient undefined.
Note: Two vectors u and v are orthogonal if u · v = 0. The vector 0 is said to be orthogonal to every vector.
Note: (Triangle inequality) ||u + v|| <= ||u|| + ||v||. Equality occurs in the triangle inequality if and only if the vectors u and v have the same direction, that is, one is a nonnegative scalar multiple of the other (see the check below).
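A small numerical check (not from the slides) of the triangle inequality and of the equality case for vectors with the same direction; the vectors are illustrative.

```python
# Sketch: checking ||u + v|| <= ||u|| + ||v||, with equality when the
# vectors point in the same direction.
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
print(np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v))  # True

w = 2.5 * u   # same direction as u
print(np.isclose(np.linalg.norm(u + w),
                 np.linalg.norm(u) + np.linalg.norm(w)))               # True (equality)
```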
5.2 Inner Product Spaces
• Definition: An inner product on a vector space V assigns a real number <u, v> to each pair of vectors u, v in V such that
(1) <u, v> = <v, u>
(2) <u, v + w> = <u, v> + <u, w>
(3) c<u, v> = <cu, v>
(4) <v, v> >= 0, and <v, v> = 0 if and only if v = 0
• Note: The dot product on Rn is an inner product, called the Euclidean inner product.
Note: A vector space V with an inner product is called an inner product space.
Vector space: (V, vector addition, scalar multiplication)
Inner product space: (V, vector addition, scalar multiplication, <·,·>)
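A sketch (not from the slides) of an inner product on R2 other than the dot product, the weighted inner product <u, v> = 2 u1 v1 + u2 v2, with a numerical check of the axioms; the choice of weights and the test vectors are assumptions made for illustration.

```python
# Sketch: a weighted inner product on R^2 and numerical checks of the four axioms.
import numpy as np

def inner(u, v):
    return 2.0 * u[0] * v[0] + u[1] * v[1]

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
w = np.array([0.5, 4.0])
c = 3.0

print(np.isclose(inner(u, v), inner(v, u)))                    # (1) symmetry
print(np.isclose(inner(u, v + w), inner(u, v) + inner(u, w)))  # (2) additivity
print(np.isclose(c * inner(u, v), inner(c * u, v)))            # (3) homogeneity
print(inner(u, u) >= 0)                                        # (4) positivity
```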
Properties of norm (where ||u|| = sqrt(<u, u>)):
(1) ||u|| >= 0
(2) ||u|| = 0 if and only if u = 0
(3) ||cu|| = |c| ||u||
Properties of distance (where d(u, v) = ||u - v||):
(1) d(u, v) >= 0
(2) d(u, v) = 0 if and only if u = v
(3) d(u, v) = d(v, u)
Orthogonal projection: For vectors u and v (v ≠ 0) in an inner product space, proj_v u = (<u, v>/<v, v>) v.
Note: If v is a unit vector, then <v, v> = 1 and the formula for the orthogonal projection of u onto v takes the simpler form proj_v u = <u, v> v. A numerical sketch follows.
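A minimal sketch (not from the slides) of the orthogonal projection of u onto v using the Euclidean inner product; the example vectors are illustrative.

```python
# Sketch: proj_v u = (<u, v>/<v, v>) v, and the check that u - proj_v u
# is orthogonal to v.
import numpy as np

def proj(u, v):
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([1.0, 2.0])
v = np.array([3.0, 0.0])
p = proj(u, v)
print(p)                                   # [1. 0.]
print(np.isclose(np.dot(u - p, v), 0.0))   # True
```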
5.3 Orthonormal Bases: Gram-Schmidt Process
• Orthogonal set: a set of vectors in which every pair of distinct vectors is orthogonal. Orthonormal set: an orthogonal set in which each vector is a unit vector.
• Note: If S is a basis that is also an orthogonal (orthonormal) set, then S is called an orthogonal basis (an orthonormal basis). A sketch of the Gram-Schmidt process follows.
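A minimal sketch (not from the slides) of the Gram-Schmidt process applied to linearly independent vectors in R3, producing an orthonormal basis of their span; the input basis is an illustrative assumption.

```python
# Sketch: Gram-Schmidt — subtract projections onto the vectors already built,
# then normalize.
import numpy as np

def gram_schmidt(vectors):
    ortho = []
    for v in vectors:
        w = v.astype(float)
        for q in ortho:
            w -= np.dot(v, q) * q            # remove the component along q
        ortho.append(w / np.linalg.norm(w))  # normalize to a unit vector
    return ortho

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 2.0, 0.0]),
         np.array([0.0, 1.0, 2.0])]
q1, q2, q3 = gram_schmidt(basis)
print(np.round([np.dot(q1, q2), np.dot(q1, q3), np.dot(q2, q3)], 10))  # all ~0
print(np.round([np.linalg.norm(q) for q in (q1, q2, q3)], 10))         # all 1
```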
• Orthogonal complement of W: Let W be a subspace of an inner product space V.
(a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W.
(b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, denoted W⊥ (read "W perp").
• Notes: W⊥ is a subspace of V, and W ∩ W⊥ = {0}.
Notes:
(1) Among all the scalar multiples of a vector u, the orthogonal projection proj_u v is the one that is closest to v.
(2) Among all the vectors in the subspace W, the orthogonal projection proj_W v is the closest vector to v (see the sketch below).
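A small sketch (not from the slides) of projecting a vector onto a subspace W spanned by an orthonormal set, proj_W v = <v, u1> u1 + <v, u2> u2; the choice of W (the xy-plane in R3) and of v are illustrative assumptions.

```python
# Sketch: orthogonal projection of v onto W = span{u1, u2}, where {u1, u2}
# is orthonormal; proj_W v is the vector in W closest to v.
import numpy as np

u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
v = np.array([2.0, -3.0, 5.0])

proj_w = np.dot(v, u1) * u1 + np.dot(v, u2) * u2
print(proj_w)                        # [ 2. -3.  0.]
print(np.linalg.norm(v - proj_w))    # 5.0, the distance from v to W
```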
The four fundamental subspaces of the matrix A:
• N(A): nullspace of A
• N(A^T): nullspace of A^T
• R(A): column space of A
• R(A^T): column space of A^T
Least squares problem: Ax = b (a system of linear equations)
(1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x.
(2) When the system is inconsistent, we want to find the "best possible" solution of the system, that is, the value of x for which the difference between Ax and b is as small as possible.
• Least squares solution: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in Rn that minimizes ||Ax - b|| with respect to the Euclidean inner product on Rn. Such a vector is called a least squares solution of Ax = b.
Note: The problem of finding the least squares solution of Ax = b is equivalent to the problem of finding an exact solution of the associated normal system A^TAx = A^Tb.
• Thm: For any linear system Ax = b, the associated normal system A^TAx = A^Tb is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.
Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution, given by x = (A^TA)^(-1) A^T b. Moreover, if W is the column space of A, then the orthogonal projection of b on W is proj_W b = Ax = A (A^TA)^(-1) A^T b. A numerical sketch follows.
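A minimal sketch (not from the slides) of solving an inconsistent system in the least squares sense via the normal equations; the matrix A and vector b are illustrative assumptions.

```python
# Sketch: least squares solution of Ax = b from the normal equations
# A^T A x = A^T b, and the projection of b onto the column space of A.
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])          # 3x2 with linearly independent columns
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # unique least squares solution
proj_b = A @ x_hat                           # orthogonal projection of b onto R(A)
print(x_hat)
print(proj_b)
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # agrees with lstsq
```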
Note: C[a, b] is the inner product space of all continuous functions on [a, b], with the inner product <f, g> given by the integral of f(x)g(x) over [a, b].
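A short sketch (not from the slides) evaluating this function inner product symbolically for one illustrative pair of functions, f(x) = x and g(x) = x^2 on [0, 1].

```python
# Sketch: <f, g> = integral of f(x) g(x) dx over [a, b], computed with SymPy.
import sympy as sp

x = sp.symbols('x')
f = x
g = x**2
inner_fg = sp.integrate(f * g, (x, 0, 1))   # integral of x^3 from 0 to 1
print(inner_fg)                             # 1/4
```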