
5.1 Length and Dot Product in Rn



  1. 5.1 Length and Dot Product in Rn Chapter 5 Inner Product Spaces • Notes: The length of a vector v = (v1, v2, …, vn) in Rn is ||v|| = sqrt(v1^2 + v2^2 + … + vn^2); the length of a vector is also called its norm. • Notes: A vector v with ||v|| = 1 is called a unit vector.

  2. Notes: The unit vector in the direction of a nonzero vector v is u = v / ||v||; the process of finding it is called normalizing the vector v. • The standard unit vectors in Rn: e1 = (1, 0, …, 0), e2 = (0, 1, …, 0), …, en = (0, 0, …, 1). • Ex: the standard unit vectors in R2: (1, 0), (0, 1); the standard unit vectors in R3: (1, 0, 0), (0, 1, 0), (0, 0, 1).
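A minimal normalization sketch in NumPy (the vector below is an illustrative example, not from the slides):

```python
import numpy as np

def normalize(v):
    """Return the unit vector v / ||v|| in the direction of a nonzero vector v."""
    length = np.linalg.norm(v)          # Euclidean length ||v||
    if length == 0:
        raise ValueError("cannot normalize the zero vector")
    return v / length

v = np.array([3.0, 4.0])
u = normalize(v)                        # [0.6, 0.8]; np.linalg.norm(u) == 1
```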

  3. Notes: (Properties of distance) With d(u, v) = ||u - v||: (1) d(u, v) >= 0; (2) d(u, v) = 0 if and only if u = v; (3) d(u, v) = d(v, u).

  4. Euclidean n-space: Rn was defined to be the set of all ordered n-tuples of real numbers. When Rn is combined with the standard operations of vector addition, scalar multiplication, vector length, and the dot product, the resulting vector space is called Euclidean n-space.

  5. Dot product and matrix multiplication: if a vector in Rn is represented as an n×1 column matrix, then the dot product can be written as a matrix product, u · v = u^T v (a 1×1 matrix identified with a scalar).
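A quick NumPy check of this identity, with arbitrary example vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

dot = np.dot(u, v)                                   # u · v = 32.0
as_matrices = u.reshape(1, -1) @ v.reshape(-1, 1)    # u^T v, a 1x1 matrix
assert np.isclose(dot, as_matrices[0, 0])
```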

  6. The angle θ between two nonzero vectors u and v in Rn satisfies cos θ = (u · v) / (||u|| ||v||), 0 <= θ <= π. Note: The angle between the zero vector and another vector is not defined.

  7. Two vectors u and v in Rn are orthogonal if u · v = 0. Note: The vector 0 is said to be orthogonal to every vector.

  8. Triangle inequality: ||u + v|| <= ||u|| + ||v||. Note: Equality occurs in the triangle inequality if and only if the vectors u and v have the same direction.

  9. 5.2 Inner Product Spaces • Note:

  10. Note: A vector space V together with an inner product <u, v> is called an inner product space. Vector space: (V, vector addition, scalar multiplication). Inner product space: (V, vector addition, scalar multiplication, inner product <u, v>).


  12. Properties of norm: (1) ||u|| >= 0; (2) ||u|| = 0 if and only if u = 0; (3) ||c u|| = |c| ||u||.

  13. Properties of distance: With d(u, v) = ||u - v||: (1) d(u, v) >= 0; (2) d(u, v) = 0 if and only if u = v; (3) d(u, v) = d(v, u).

  14. Note: If v is a unit vector, then <v, v> = ||v||^2 = 1, and the formula for the orthogonal projection of u onto v, proj_v u = (<u, v> / <v, v>) v, takes the following simpler form: proj_v u = <u, v> v.
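A short NumPy sketch of the projection formula, using the Euclidean inner product and arbitrary example vectors:

```python
import numpy as np

def proj(u, v):
    """Orthogonal projection of u onto a nonzero vector v: (<u, v> / <v, v>) v."""
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([1.0, 2.0])
v = np.array([3.0, 0.0])
p = proj(u, v)                           # [1.0, 0.0]
assert np.isclose(np.dot(u - p, v), 0)   # the residual u - p is orthogonal to v
# If v is a unit vector, np.dot(v, v) == 1 and the formula reduces to np.dot(u, v) * v.
```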

  15. 5.3 Orthonormal Bases: Gram-Schmidt Process • Note: If S is an orthogonal set and also a basis, it is called an orthogonal basis; if S is an orthonormal set and also a basis, it is called an orthonormal basis.
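A compact sketch of the Gram-Schmidt process in NumPy (classical variant, assuming the input vectors are linearly independent; an illustration, not the slides' own code):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn linearly independent vectors into an orthonormal list."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - np.dot(v, q) * q          # remove the component along q
        basis.append(w / np.linalg.norm(w))   # normalize what remains
    return basis

q1, q2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
assert np.isclose(np.dot(q1, q2), 0.0) and np.isclose(np.linalg.norm(q1), 1.0)
```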

  16. 5.4 Mathematical Models and Least Squares Analysis

  17. W⊥ (read “W perp”) • Orthogonal complement of W: Let W be a subspace of an inner product space V. (a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W. (b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, denoted W⊥. • Notes:


  19. Notes: (1) Among all the scalar multiples of a vector u, the orthogonal projection proj_u v of v onto u is the one that is closest to v. (2) Among all the vectors in the subspace W, the vector proj_W v is the closest vector to v.

  20. The four fundamental subspaces of the matrix A: • N(A): nullspace of A • N(A^T): nullspace of A^T • R(A): column space of A • R(A^T): column space of A^T
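One way to compute orthonormal bases for all four subspaces numerically is through the SVD; a NumPy sketch with an arbitrary rank-1 example matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])             # 3x2 example with rank 1
U, s, Vt = np.linalg.svd(A)            # A = U diag(s) Vt
r = int(np.sum(s > 1e-12))             # numerical rank

col_space  = U[:, :r]                  # orthonormal basis of R(A)
left_null  = U[:, r:]                  # orthonormal basis of N(A^T)
row_space  = Vt[:r, :].T               # orthonormal basis of R(A^T)
null_space = Vt[r:, :].T               # orthonormal basis of N(A)
```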

  21. Least squares problem: Ax = b (a system of linear equations). (1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x. (2) When the system is inconsistent, we look for the “best possible” solution of the system, that is, the value of x for which the difference between Ax and b is as small as possible. • Least squares solution: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in Rn that minimizes ||Ax - b|| with respect to the Euclidean inner product on Rn. Such a vector is called a least squares solution of Ax = b.

  22. A^T A x = A^T b (the normal equations of the least squares problem Ax = b)

  23. Note: The problem of finding the least squares solution of Ax = b is equivalent to the problem of finding an exact solution of the associated normal system A^T A x = A^T b. • Thm: For any linear system Ax = b, the associated normal system A^T A x = A^T b is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A, and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.

  24. Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution. This solution is given by x = (A^T A)^(-1) A^T b. Moreover, if W is the column space of A, then the orthogonal projection of b on W is proj_W b = Ax = A (A^T A)^(-1) A^T b.
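A short NumPy sketch that solves the normal equations for a small overdetermined example (for large or ill-conditioned problems, np.linalg.lstsq or a QR factorization is numerically preferable):

```python
import numpy as np

# Overdetermined example: three equations, two unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Least squares solution from the normal equations A^T A x = A^T b.
x = np.linalg.solve(A.T @ A, A.T @ b)   # [5.0, -3.0]

proj_b = A @ x                          # orthogonal projection of b onto R(A)
assert np.allclose(A.T @ (b - proj_b), 0.0)   # residual is orthogonal to R(A)
```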

  25. 5.5 Applications of Inner Product Spaces

  26. Note: C[a, b] is the inner product space of all continuous functions on [a, b], with the inner product <f, g> = ∫[a, b] f(x) g(x) dx.
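A numerical sketch of this inner product on C[a, b] using SciPy's quadrature (the functions and interval are illustrative choices):

```python
import numpy as np
from scipy.integrate import quad

def inner(f, g, a, b):
    """<f, g> = integral over [a, b] of f(x) g(x) dx."""
    value, _ = quad(lambda x: f(x) * g(x), a, b)
    return value

# On C[0, 2*pi], sin and cos are orthogonal with respect to this inner product.
print(inner(np.sin, np.cos, 0.0, 2.0 * np.pi))   # approximately 0
```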
