
NICE LOOKING MATRICES


Presentation Transcript


1. NICE LOOKING MATRICES. By now it should be clear that matrices are the backbone (computationally, at least) of the attempt to solve linear systems, or, even more precisely, of the attempt to decide which one of the three possibilities outlined previously obtains. Recall them: a linear system may
• have no solutions at all,
• have exactly one solution, or
• have infinitely many solutions.

2. (By the way, the textbook says that if possibility 2 or 3 obtains the system is said to be consistent, and if possibility 1 obtains the system is called (duh!) inconsistent.) Let’s see how far our three simple elementary row operations can take us. What I will do is use a program I wrote some time ago and show you (here) screenshots of the run, but in real life I will show you the run. I will in fact reproduce the algorithm shown on pp. 15-17 of the textbook.

3. First, however, we need to learn the technical (in context) meaning of a few words. We consider a matrix (on the slide, each bullet stands for an entry). We define, for any row, its leading term to mean the leftmost non-zero entry in that row.
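For concreteness, here is a minimal sketch in Python (mine, not the instructor's program, which appears only in the screenshots) of how one might locate the leading term of a row; the function name leading_term is just an illustrative choice.

```python
# Sketch: the leading term of a row is its leftmost non-zero entry.
# A row with no leading term (all zeros) is a zero row.

def leading_term(row, tol=1e-12):
    """Return (column index, value) of the leftmost non-zero entry, or None for a zero row."""
    for j, entry in enumerate(row):
        if abs(entry) > tol:
            return j, entry
    return None

print(leading_term([0, 5, -2]))   # (1, 5): the leading term sits in column 1
print(leading_term([0, 0, 0]))    # None: a zero row has no leading term
```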

4. So, the leading term of the 3rd row of the matrix is [the entry shown on the slide], while the 2nd row has leading term [shown on the slide]. Challenge: what does it mean to say a row has no leading term? Right, the row consists entirely of zeros; we call such an anomaly a zero row. (They do happen.) One more technical word.

5. A matrix M is said to be “right-on” (no need to throw a highly intimidating word at you; first we understand the concept, then the Sunday word) if it obeys the following conditions:
1. Every non-zero row is above every zero row.
2. The leading term of every row R is strictly to the right of the leading term of any row above R.
Your textbook has a third condition, namely
3. All entries in a column below a leading term are zeros.
For extra credit (1 point out of 100 at the end of the semester) give me an argument that shows that

6. conditions 1 and 2 imply 3. Here are two matrices [shown on the slide]; one is “right-on”, the other is not. Which is which? Why? Blue or green?
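As an aside, here is a minimal sketch (again mine, not the slides' program) of how conditions 1 and 2 could be checked mechanically; the helper name leading_index and the test matrices are made up for illustration.

```python
# Sketch: check the two "right-on" (row echelon) conditions for a matrix
# given as a list of rows of numbers.

def leading_index(row, tol=1e-12):
    """Column index of the leftmost non-zero entry, or None for a zero row."""
    for j, entry in enumerate(row):
        if abs(entry) > tol:
            return j
    return None

def is_right_on(matrix):
    """Condition 1: every non-zero row is above every zero row.
    Condition 2: leading terms move strictly to the right as we go down."""
    seen_zero_row = False
    previous_lead = -1
    for row in matrix:
        lead = leading_index(row)
        if lead is None:
            seen_zero_row = True
        elif seen_zero_row or lead <= previous_lead:
            return False
        else:
            previous_lead = lead
    return True

print(is_right_on([[1, 2, 3], [0, 4, 5], [0, 0, 0]]))  # True
print(is_right_on([[0, 1, 2], [3, 0, 0]]))             # False
```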

7. One last word! A matrix M is said to be “really right-on” if it is right-on and also
• Every entry in the same column as, and above, a leading term is zero.
Both matrices shown are right-on, but only one is really right-on; which one? Why? Red or green?
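This extra condition can also be checked mechanically. Here is a minimal self-contained sketch of that one condition (combine it with the is_right_on sketch above to test for the full property); the function name and test matrices are made up.

```python
# Sketch: every entry above a leading term, in that term's column, must be zero.

def is_really_right_on_extra(matrix, tol=1e-12):
    for i, row in enumerate(matrix):
        # leading term of this row, if any
        lead = next((j for j, x in enumerate(row) if abs(x) > tol), None)
        if lead is None:
            continue
        # every entry above position (i, lead) must be zero
        if any(abs(matrix[k][lead]) > tol for k in range(i)):
            return False
    return True

print(is_really_right_on_extra([[1, 0, 2], [0, 1, 3]]))  # True
print(is_really_right_on_extra([[1, 2, 0], [0, 1, 3]]))  # False: the 2 sits above a leading term
```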

8. For another extra credit point, rephrase conditions 1, 2, 3 for an alien up in space who (that? do aliens have gender?) knows numbers but has no idea what you mean by “above” or “strictly to the right”. Extra credit due Monday, 1/23. Time to translate common English into impressive English.
Common              Impressive
right-on            in (row) echelon form
really right-on     in reduced (row) echelon form
If the augmented matrix of a linear system is in reduced echelon form, then the solutions are easy to read off (we’ll do many examples). The beauty of our row operations is in the following

9. Theorem. Let M denote the augmented matrix of a linear system. Then
A. Elementary row operations do not change the solution set of the system.
B. An appropriate sequence of elementary row operations produces a matrix that is in reduced echelon form.
(Note that B. says you have solved the system!) We will provide a “hands-on” proof of the theorem by providing the steps needed to achieve B. for any augmented matrix.
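For reference, here is a minimal sketch (mine, not the slides' program) of the three elementary row operations themselves; the function names and the example matrix are made up for illustration.

```python
# Sketch of the three elementary row operations on a matrix stored as a list
# of rows.  Each operation modifies the matrix in place.

def swap_rows(M, i, j):
    """Interchange rows i and j."""
    M[i], M[j] = M[j], M[i]

def scale_row(M, i, c):
    """Multiply row i by a non-zero constant c."""
    assert c != 0, "scaling by zero is not an elementary row operation"
    M[i] = [c * x for x in M[i]]

def replace_row(M, i, j, c):
    """Replace row i with (c * row j) + row i."""
    M[i] = [c * xj + xi for xi, xj in zip(M[i], M[j])]

# Example (made-up matrix): the kind of step used on slide 11,
# replace Row 2 with (-1) * Row 1 + Row 2.
A = [[1, 2, 3],
     [1, 5, 7]]
replace_row(A, 1, 0, -1)
print(A)   # [[1, 2, 3], [0, 3, 4]]
```

Each of these operations is reversible (swap again, scale by 1/c, replace with -c), which is why none of them can change the solution set of the system (part A.).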

10. We need one last word (for today, promise!). A position in a matrix is called a pivot if it holds a leading term in the reduced echelon form of that matrix. Returning to the theorem, the proof of A. is trivial; we did it when we defined each elementary row operation. To prove B. we take an augmented matrix, such as the one exhibited in the textbook on p. 15, and follow the steps as shown in the textbook. The program I am using will be available to you online soon. Here we go.

11. Start: [matrix shown on the slide]. Next: now use (1,1) as pivot and replace Row 2 with (-1)×Row 1 + Row 2. You will get

12. the following display: [matrix shown on the slide]. Next use (2,2) as a pivot and (aiming for reduced echelon form) get [the matrix shown on the slide].

13. Writing things as a linear system we get [the equations shown on the slide], which tells us that all the solutions are given by [the formulas shown on the slide]; the variables indicated there are called free variables and can have any value.
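Since the actual matrices live only in the slide images, here is a minimal sketch on a made-up augmented matrix (not the textbook's) of the whole process slides 11-13 walk through: row reduce to reduced echelon form, read off the pivot columns, and treat the remaining variables as free.

```python
from fractions import Fraction

def rref(M):
    """Return (R, pivot_cols): the reduced (row) echelon form of M and its pivot columns."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivot_cols, r = [], 0
    for c in range(cols):
        # find a row at or below r with a non-zero entry in column c
        pivot = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]              # swap it up
        A[r] = [x / A[r][c] for x in A[r]]           # scale so the leading term is 1
        for i in range(rows):                        # clear the rest of column c
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivot_cols.append(c)
        r += 1
        if r == rows:
            break
    return A, pivot_cols

# Made-up augmented matrix [A | b]: two equations in three unknowns.
augmented = [[1, 2, 1, 4],
             [2, 4, 3, 9]]
R, pivots = rref(augmented)
# Entries happen to be whole numbers in this example.
print([[int(x) for x in row] for row in R])   # [[1, 2, 0, 3], [0, 0, 1, 1]]
print(pivots)                                 # [0, 2]
```

In this made-up example the pivot columns are 0 and 2, so x1 and x3 are determined while x2 is free: the solutions read off as x1 = 3 - 2*x2, x3 = 1, with x2 taking any value.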

14. Study the previous example carefully; it contains all the aspects of solving linear systems but one: what happens if the last column in the augmented matrix ends up with a non-zero leading term? This means that you have a zero row of coefficients set equal to a non-zero number, like [the row shown on the slide]. Conclusion? Right, an inconsistent system. Your book summarizes all this beautifully as Theorem 2, p. 21.
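A tiny sketch of that situation, with a made-up row: a row of the augmented matrix whose coefficient part is all zeros but whose last entry is not represents the impossible equation 0 = c, so the system is inconsistent.

```python
# Sketch: detect a row of an augmented matrix that reads "0 = non-zero".

def row_signals_inconsistency(row, tol=1e-12):
    *coefficients, rhs = row
    return all(abs(c) <= tol for c in coefficients) and abs(rhs) > tol

print(row_signals_inconsistency([0, 0, 0, 5]))   # True: the row says 0 = 5
print(row_signals_inconsistency([0, 0, 2, 5]))   # False: this row is a genuine equation
```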

15. We will do one more example from the book: Exercise 11, p. 22.
