ChE 250 Numeric Methods Lecture #13, Chapra, Chapter 11: Gauss-Seidel 20070214
Special Matrices
• There are several special matrices that are useful in engineering applications because they admit simplified solutions requiring less calculation and memory
• A banded matrix has nonzero elements only in a diagonal strip; the rest of the upper and lower triangular regions are zero
• By storing the bands as vectors we can save memory
• A symmetric matrix has elements aij = aji
• For symmetric matrices, the Cholesky algorithm quickly produces the corresponding decomposition (a symmetric counterpart of LU)
Banded Matrix
• A tridiagonal matrix has three diagonals, and the rest of the matrix values equal zero
• This structure is typical of continuity equations
• Formulate the three diagonals as the vectors e, f, and g
• Vectors e and g each have one less value than f (the main diagonal); see the storage sketch below
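As a quick illustration of the vector storage, the Scilab sketch below pulls the three diagonals of an arbitrary 4×4 tridiagonal matrix (made up for this example, not taken from the text) into the vectors e, f, and g:

    // Arbitrary 4x4 tridiagonal matrix, used only to illustrate the storage scheme
    A = [ 2 -1  0  0
         -1  2 -1  0
          0 -1  2 -1
          0  0 -1  2];

    e = diag(A, -1)   // sub-diagonal   (n-1 values)
    f = diag(A,  0)   // main diagonal  (n values)
    g = diag(A,  1)   // super-diagonal (n-1 values)
    // Only 3n-2 numbers are stored instead of n^2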
Banded Matrix
• The decomposition is easily computed
• First calculate the new e's and f's (the decomposition step)
• Then the r vector (equivalent to d) is updated by forward substitution using the new e's
• Finally x is found by back substitution using the new f's
• So we have performed a simplified version of LU decomposition and substitution; a sketch of this tridiagonal solver follows
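A minimal Scilab sketch of the tridiagonal solver is below. It is not the textbook's listing: the function name thomas and the in-place overwriting of e, f, and r are illustrative choices, and e and g are assumed to hold n−1 values each, as described above.

    // Tridiagonal solver sketch: e = sub-diagonal (n-1 values),
    // f = main diagonal (n values), g = super-diagonal (n-1 values),
    // r = right-hand side vector. Local copies of e, f, r are overwritten.
    function x = thomas(e, f, g, r)
        n = length(f)
        // Decomposition: compute the multipliers (new e's) and new f's
        for k = 2:n
            e(k-1) = e(k-1) / f(k-1)
            f(k)   = f(k) - e(k-1) * g(k-1)
        end
        // Forward substitution on the right-hand side
        for k = 2:n
            r(k) = r(k) - e(k-1) * r(k-1)
        end
        // Back substitution for the unknowns
        x = zeros(n, 1)
        x(n) = r(n) / f(n)
        for k = n-1:-1:1
            x(k) = (r(k) - g(k) * x(k+1)) / f(k)
        end
    endfunction

With the e, f, g from the storage example above, x = thomas(e, f, g, [1;1;1;1]) would solve A*x = [1;1;1;1] without ever forming the full matrix.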
Banded Matrix
• Example 11.1
• Questions?
Cholesky
• For a symmetric matrix, the Cholesky decomposition produces a triangular factor that, when multiplied by its own transpose, reproduces the coefficient matrix
• In Scilab, the upper (transposed) factor is returned, so be careful which factor you are working with; see the check below
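A quick Scilab check of this convention; the matrix here is an arbitrary symmetric positive-definite example, not one from the text:

    // chol() in Scilab returns the upper triangular factor U with U'*U = A
    A = [4 2 2; 2 5 3; 2 3 6];    // arbitrary symmetric positive-definite matrix
    U = chol(A)                   // upper triangular (the "transpose" noted above)
    L = U'                        // lower triangular factor, so that A = L*L'
    disp(U' * U - A)              // should display a matrix of (near-)zeros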
Gauss-Seidel
• The Gauss-Seidel method extends fixed-point iteration to systems of linear equations in a systematic way
• Each equation is solved for the unknown on its diagonal, so unknown i is solved from equation i on row i
• Then iterate through all the x's, reusing the newest values, until they converge; a sketch of the iteration appears below
• Example 11.3
• Questions?
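A minimal Gauss-Seidel sketch in Scilab is below. The function name, the starting guess of zeros, the tolerance test, and the iteration limit are all illustrative choices, and convergence is assumed (for example, when A is diagonally dominant).

    // Gauss-Seidel iteration sketch for A*x = b
    function x = gauss_seidel(A, b, tol, maxit)
        n = size(A, 1)
        x = zeros(n, 1)                // starting guess
        for it = 1:maxit
            xold = x
            for i = 1:n
                // sum of all terms in row i except the diagonal one,
                // using the newest x values already computed this sweep
                s = A(i, :) * x - A(i, i) * x(i)
                x(i) = (b(i) - s) / A(i, i)
            end
            if max(abs(x - xold)) < tol then
                break                  // converged
            end
        end
    endfunction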
Gauss-Seidel with Relaxation
• Introduce a weighting factor λ into the iteration
• Calculate xi+1 as before, then apply the weighting: xi = λ·xi,new + (1−λ)·xi,old
• The effect of λ:
   • λ = 0 keeps the old value
   • λ = 1 uses the new value (plain Gauss-Seidel)
   • 0 < λ < 1 gives underrelaxation
   • 1 < λ < 2 gives overrelaxation
• The choice of λ depends on the system and on experience
• Relaxation may be built into system software, so you must understand how changes to the system will affect software performance; the weighted update is shown in the snippet below
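In code, relaxation only changes the update line inside the inner loop of the Gauss-Seidel sketch above (lambda would be passed in as an extra argument):

    // Plain update:   x(i) = (b(i) - s) / A(i, i)
    // Relaxed update: blend the newly computed value with the previous one
    xnew = (b(i) - s) / A(i, i)
    x(i) = lambda * xnew + (1 - lambda) * x(i)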
Part 3 summary
• Linear Algebraic Equations
   • Gauss elimination, with pivoting and scaling
   • LU decomposition and inversion
   • Gauss-Seidel, with relaxation
• Most important
   • Matlab/Scilab matrix arithmetic and notation
   • Excel Goal Seek and Solver with n equations
• Table 11.1 shows useful Matlab functions
• Table PT3.3
   • Methods and algorithms
   • Potential problems listed
Case Studies
• Problem 12.11: peristaltic pump, 7 equations and 7 unknowns
• Questions?
Preparation for 16 Feb
• Reading
   • Part 4: introduction to optimization
   • Chapter 13: One-Dimensional Unconstrained Optimization
• Homework set 5 due 21 Feb
   • Chapter 11: 11.7, 11.14, 11.15
   • Chapter 12: 12.2, 12.9
   • Chapter 13: 13.7, 13.10, 13.17, 13.18