Methods for ill-posed & nonlinear systems • Ill-posed linear systems • Equations form: a_{i1} x_1 + a_{i2} x_2 + … + a_{in} x_n = b_i, i = 1, …, m • Matrix form: A x = b, with A an m×n matrix and m > n • In general there is no solution in the classical sense!
Least-square problem • Def: The least-square problem is to find an n-vector x minimizing ||A x − b||_2 • Two conditions • m ≥ n, i.e. overdetermined system • A has full column rank, i.e. rank(A) = n • Thm: The least-square problem always has a solution. The solution is unique iff A is of full column rank. • Proof: See details in class (or as an exercise)
Least-square problem • Thm: (Normal equation) Let x* be a solution of the least-square problem. Then the residual vector r = b − A x* satisfies A^T r = 0, i.e. x* satisfies the normal equation A^T A x = A^T b • Proof: See details in class (or as an exercise) • Numerical methods for the least-square problem • Normal equation method -- when n is small • QR method • SVD method -- most popular!
Normal equation method • Idea: Solve the normal equations A^T A x = A^T b • Methods (m >> n): • Cholesky factorization • CG method & PCG method, … • Drawbacks: • The condition number is squared: cond(A^T A) = cond(A)^2 • Sensitive to round-off errors
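A minimal sketch of the normal-equation approach in NumPy (not code from the slides); the test matrix A and vector b are made-up illustrative data, and the generic np.linalg.solve calls stand in for dedicated triangular solvers.

```python
import numpy as np

def lstsq_normal_equations(A, b):
    """Solve min ||Ax - b||_2 via the normal equations A^T A x = A^T b,
    using a Cholesky factorization of A^T A (A assumed to have full column rank)."""
    AtA = A.T @ A
    Atb = A.T @ b
    L = np.linalg.cholesky(AtA)      # AtA = L L^T, L lower triangular
    y = np.linalg.solve(L, Atb)      # forward substitution step
    return np.linalg.solve(L.T, y)   # back substitution step

# tiny overdetermined example (m = 4, n = 2), purely illustrative
A = np.array([[1., 1.], [1., 2.], [1., 3.], [1., 4.]])
b = np.array([2., 3., 5., 6.])
print(lstsq_normal_equations(A, b))
```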
QR method • Decompose A = QR, where Q is m×n with orthonormal columns and R is n×n upper triangular • Denote c = Q^T b • Solution: x = R^{-1} c = R^{-1} Q^T b • Proof: See details in class (or as an exercise)
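A corresponding sketch of the QR approach, again assuming A has full column rank; it uses NumPy's reduced (thin) QR factorization.

```python
import numpy as np

def lstsq_qr(A, b):
    """Solve min ||Ax - b||_2 via the thin QR factorization A = QR:
    x = R^{-1} Q^T b (A assumed to have full column rank)."""
    Q, R = np.linalg.qr(A, mode='reduced')   # Q: m x n, R: n x n upper triangular
    return np.linalg.solve(R, Q.T @ b)
```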
SVD method • The solution of the LS problem is x = A^+ b, where A^+ is the pseudo-inverse of A • Since A has full column rank, it has the (thin) SVD A = U Σ V^T, with U m×n having orthonormal columns, Σ n×n diagonal with positive singular values, and V n×n orthogonal • Solution: x = V Σ^{-1} U^T b
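A sketch of the SVD approach; the rcond truncation threshold for small singular values is an illustrative regularization choice, not something specified on the slides.

```python
import numpy as np

def lstsq_svd(A, b, rcond=1e-12):
    """Solve min ||Ax - b||_2 via the thin SVD A = U S V^T: x = V S^{-1} U^T b.
    Singular values below rcond * s_max are truncated (simple regularization)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rcond * s[0]          # s is sorted in decreasing order
    s_inv = np.zeros_like(s)
    s_inv[keep] = 1.0 / s[keep]
    return Vt.T @ (s_inv * (U.T @ b))
```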
Nonlinear systems • Nonlinear systems • Equations form: f_i(x_1, …, x_n) = 0, i = 1, …, n • Vector form: F(x) = 0, with F: R^n → R^n • An example
Nonlinear systems • Solutions • Existence & uniqueness • Minimization problem • Local minimizer vs global minimizer • Numerical methods: • Picard iteration, i.e. the fixed-point method • Newton's & quasi-Newton methods -- most popular • The secant method • Other methods --- read yourself! • The Fibonacci search method -- based on Fibonacci numbers • The golden section search method -- based on the golden ratio (uses 0.618 ≈ (√5 − 1)/2)
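As a small illustration of the first method in the list, here is a sketch of Picard (fixed-point) iteration in 1D; the test map cos(x) is an illustrative example, not taken from the slides.

```python
import math

def fixed_point(g, x0, tol=1e-10, max_iter=100):
    """Picard (fixed-point) iteration x_{k+1} = g(x_k) for solving x = g(x).
    Converges if g is a contraction in a neighborhood of the fixed point."""
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# example: x = cos(x) has a fixed point near 0.739
print(fixed_point(math.cos, 1.0))
```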
Convergence rate • Def: A sequence of vectors {x_k} converges locally to x* with order r if ||x_{k+1} − x*|| ≤ C ||x_k − x*||^r for all k large enough, with constants C > 0 and r ≥ 1 • Comments • Any norm can be used • The constant C may depend on the norm used • The order of convergence r does not depend on the norm! • Convergence rates: • Linear convergence: r = 1 & 0 < C < 1 • Superlinear convergence: r > 1 • Quadratic convergence: r = 2
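One way to check the order numerically is to compare logarithms of successive error ratios; the error values below come from a simple Newton run for x^2 − 2 = 0 starting at x0 = 2 and are only illustrative.

```python
import math

def convergence_order(errors):
    """Estimate the order r from successive errors e_k = ||x_k - x*||,
    using r ≈ log(e_{k+1}/e_k) / log(e_k/e_{k-1})."""
    return [math.log(errors[k + 1] / errors[k]) / math.log(errors[k] / errors[k - 1])
            for k in range(1, len(errors) - 1)]

# errors of Newton's method for f(x) = x^2 - 2, x0 = 2 (illustrative run)
errs = [0.585786, 0.0857864, 0.00245310, 2.1239e-06, 1.5949e-12]
print(convergence_order(errs))   # estimates approach 2 (quadratic convergence)
```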
Newton’s method for 1D • In 1D, i.e. n = 1 • Nonlinear equation: f(x) = 0 • Minimization problem: min_x g(x) • Assume f(x) is smooth, roots exist & each root is simple • Newton’s method: x_{k+1} = x_k − f(x_k)/f'(x_k) • If f(x) is a linear function, it is solved exactly! • If f(x) is a nonlinear function, approximate it locally by a linear function (its tangent line)
Local convergence rate • Thm: Suppose f is twice continuously differentiable and x* is a simple root, i.e. f(x*) = 0 & f'(x*) ≠ 0. Then Newton’s method converges locally quadratically, i.e. |x_{k+1} − x*| ≤ C |x_k − x*|^2 • Proof: See details in class (or as an exercise) • An example:
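A minimal sketch of Newton's method in 1D; f(x) = x^2 − 2 is an illustrative test problem, not necessarily the example referred to on the slide.

```python
def newton_1d(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method for f(x) = 0: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / df(x)
    return x

# example: root of f(x) = x^2 - 2 near x0 = 2
print(newton_1d(lambda x: x**2 - 2, lambda x: 2 * x, 2.0))   # ~1.41421356
```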
Minimization view of Newton’s method • If g(x) is a quadratic polynomial, we can find the minimizer exactly! • If g(x) is a general nonlinear function, we approximate g(x) locally by a quadratic polynomial and minimize that model, which gives x_{k+1} = x_k − g'(x_k)/g''(x_k)
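The same idea written for the minimization form: minimize the local quadratic model at each step. The test function g below is hypothetical; its minimizer is x = 2.

```python
def newton_minimize_1d(dg, d2g, x0, tol=1e-12, max_iter=50):
    """Newton's method for min g(x): minimize the local quadratic model,
    giving the update x_{k+1} = x_k - g'(x_k)/g''(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = dg(x) / d2g(x)
        x -= step
        if abs(step) < tol:
            return x
    return x

# example: g(x) = (x - 3)^4 + x^2, a hypothetical test function (minimizer x = 2)
print(newton_minimize_1d(lambda x: 4 * (x - 3)**3 + 2 * x,
                         lambda x: 12 * (x - 3)**2 + 2, 0.0))
```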
Newton’s method for n dimensions • The problem: min_{x in R^n} g(x) • Define the gradient ∇g(x) and the Hessian matrix H(x) = ∇²g(x) • Taylor expansion gives a quadratic minimization problem: g(x_k + p) ≈ g(x_k) + ∇g(x_k)^T p + (1/2) p^T H(x_k) p • Existence & uniqueness of the minimizer of this model iff the Hessian matrix is positive definite (PD) (vs positive semi-definite (PSD))
Newton’s method for n dimensions • Solution of the quadratic minimization problem: p_k = −H(x_k)^{-1} ∇g(x_k), then x_{k+1} = x_k + p_k • Local convergence rate -- locally quadratic • For a nonlinear system F(x) = 0: x_{k+1} = x_k − J_F(x_k)^{-1} F(x_k), where J_F is the Jacobian matrix • Computational cost: very expensive to compute the derivatives when n >> 1!
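A sketch of Newton's method for a system F(x) = 0, assuming the Jacobian J is available analytically; the 2×2 test system below is a made-up example.

```python
import numpy as np

def newton_nd(F, J, x0, tol=1e-12, max_iter=50):
    """Newton's method for F(x) = 0 in n dimensions:
    solve J(x_k) p_k = -F(x_k), then set x_{k+1} = x_k + p_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x
        p = np.linalg.solve(J(x), -Fx)
        x = x + p
    return x

# illustrative system: x^2 + y^2 = 4, x*y = 1
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4, v[0] * v[1] - 1])
J = lambda v: np.array([[2 * v[0], 2 * v[1]], [v[1], v[0]]])
print(newton_nd(F, J, [2.0, 0.5]))
```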
The secant (quasi-Newton) method • Newton’s method in 1D: x_{k+1} = x_k − f(x_k)/f'(x_k) • In many cases the derivative is not available or is expensive to calculate. Approximate it by the secant slope f'(x_k) ≈ (f(x_k) − f(x_{k−1})) / (x_k − x_{k−1}) • The secant (or quasi-Newton) method: x_{k+1} = x_k − f(x_k) (x_k − x_{k−1}) / (f(x_k) − f(x_{k−1})) • Convergence rate: superlinear, of order (1 + √5)/2 ≈ 1.618 -- Exercise • Extension to higher dimensions: Exercise
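A sketch of the secant method; the starting points and the test function are illustrative.

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Secant method: replace f'(x_k) in Newton's method by the slope of
    the secant line through (x_{k-1}, f(x_{k-1})) and (x_k, f(x_k))."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1

# example: root of f(x) = x^2 - 2 starting from x0 = 1, x1 = 2
print(secant(lambda x: x**2 - 2, 1.0, 2.0))
```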
Comments on Newton’s method • Some drawbacks of Newton’s method • Need to compute the derivative of the function f, OR the first and second derivatives of the function g! This can be remedied by the secant (or quasi-Newton) method. • The functions f & g must be smooth! In many cases they are not smooth; if they are only semi-smooth, regularization techniques are needed. • When n >> 1, we need to solve a linear system at every step! This is very time consuming; it can be remedied by search directions & line searches.
Search directions & line searches • When G is a quadratic form -- coming from solving linear equations • Steepest descent method • Conjugate gradient (CG) method • When G is a general nonlinear function -- coming from solving nonlinear equations • Nonlinear steepest descent method • Nonlinear CG method • The problem: min_x G(x) • Find a sequence by iteration: x_{k+1} = x_k + α_k p_k • Search directions p_k • Line searches for the step length α_k
Search directions & line searches • In Newton’s method, we choose p_k = −H(x_k)^{-1} ∇G(x_k) and α_k = 1 • We have flexibility in choosing the search directions & line searches. In general, they need to satisfy: • Descent directions satisfying p_k^T ∇G(x_k) < 0 • Line search: the one-dimensional optimization problem α_k = argmin_{α > 0} G(x_k + α p_k)
Search directions & line searches • Best local search direction, i.e. choose p_k so that the local decrease of G is as large as possible • Steepest descent direction: p_k = −∇G(x_k)
General steepest descent method • The algorithm: given x_0, repeat p_k = −∇G(x_k), choose α_k by a line search, set x_{k+1} = x_k + α_k p_k • Comments • If G is a quadratic form, it reduces to the steepest descent method for linear systems • If the Hessian H is positive definite (PD) or positive semi-definite (PSD), convergence results are available • For general cases, more research is needed!
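A sketch of general steepest descent with a backtracking (Armijo) line search; the Armijo parameters (c = 1e-4, rho = 0.5) and the quadratic test function are illustrative choices, not prescribed by the slides.

```python
import numpy as np

def steepest_descent(G, grad_G, x0, tol=1e-8, max_iter=500):
    """General steepest descent: p_k = -grad G(x_k), with a backtracking
    (Armijo) line search for the step length alpha_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_G(x)
        if np.linalg.norm(g) < tol:
            return x
        p = -g
        alpha, c, rho = 1.0, 1e-4, 0.5
        # shrink alpha until the sufficient-decrease (Armijo) condition holds
        while G(x + alpha * p) > G(x) + c * alpha * (g @ p):
            alpha *= rho
        x = x + alpha * p
    return x

# illustrative test function: G(x, y) = (x - 1)^2 + 10 (y + 2)^2
G = lambda v: (v[0] - 1)**2 + 10 * (v[1] + 2)**2
grad_G = lambda v: np.array([2 * (v[0] - 1), 20 * (v[1] + 2)])
print(steepest_descent(G, grad_G, [0.0, 0.0]))   # approaches (1, -2)
```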
Nonlinear CG method • Choose the search directions p_k as conjugate vectors (this can be done in different ways!) • Algorithm: p_0 = −∇G(x_0); after each line search step, set p_{k+1} = −∇G(x_{k+1}) + β_{k+1} p_k, where β_{k+1} is given e.g. by the Fletcher-Reeves or Polak-Ribière formula
Nonlinear CG method • Some comments • If G is a quadratic form, it reduces to the (linear) conjugate gradient (CG) method • In the computation, we only need the gradient of G; there is no need to form the Hessian matrix or compute its inverse! • In computation, the one-dimensional minimization problem need not be solved very accurately in each step! • If the Hessian H is positive definite (PD) or positive semi-definite (PSD), convergence results are available • For general cases, more research is needed!
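A sketch of nonlinear CG with the Fletcher-Reeves beta and the same backtracking line search as above; the restart safeguard (fall back to the steepest descent direction when p_k is not a descent direction) is a common practical choice, not something stated on the slides.

```python
import numpy as np

def nonlinear_cg(G, grad_G, x0, tol=1e-8, max_iter=200):
    """Nonlinear conjugate gradient (Fletcher-Reeves) with a backtracking
    line search; only gradients of G are needed, no Hessian."""
    x = np.asarray(x0, dtype=float)
    g = grad_G(x)
    p = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x
        if g @ p >= 0:           # safeguard: restart with steepest descent
            p = -g
        alpha, c, rho = 1.0, 1e-4, 0.5
        while G(x + alpha * p) > G(x) + c * alpha * (g @ p):
            alpha *= rho
        x_new = x + alpha * p
        g_new = grad_G(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves formula
        p = -g_new + beta * p
        x, g = x_new, g_new
    return x

# same illustrative quadratic test function as in the steepest descent sketch
G = lambda v: (v[0] - 1)**2 + 10 * (v[1] + 2)**2
grad_G = lambda v: np.array([2 * (v[0] - 1), 20 * (v[1] + 2)])
print(nonlinear_cg(G, grad_G, np.zeros(2)))   # approaches (1, -2)
```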