2.3 Conjugate Gradient Methods: an iterative method, but if rounding errors in the computation are ignored, the CG algorithm converges to the exact solution of the system in a finite number of steps.
Outline • Background • Steepest Descent • Conjugate Gradient
1 Background • The min(max) problem: find $\min_x f(x)$ (or $\max_x f(x)$) for $f:\mathbb{R}^n \to \mathbb{R}$. • But we learned in calculus how to solve that kind of question!
“real world” problem • Connectivity shapes (Isenburg, Gumhold, Gotsman) • What do we get from the connectivity C alone, without the geometry?
Motivation: “real world” problem • First we introduce error functionals and then try to minimize them.
Motivation: “real world” problem • Then we minimize the resulting functional. • This is a high-dimensional, non-linear problem. • The conjugate gradient method, built on the ideas we will see here, is perhaps the most popular optimization technique for such problems.
Directional Derivatives: first, the one-dimensional derivative: $f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}$
Directional Derivatives: in a general (unit) direction $v$: $D_v f(x) = \lim_{h \to 0} \frac{f(x + h v) - f(x)}{h}$
The Gradient: definition in $\mathbb{R}^n$: $\nabla f(x) = \left( \frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n} \right)$; in the plane: $\nabla f(x, y) = \left( \frac{\partial f}{\partial x}, \frac{\partial f}{\partial y} \right)$. For a unit direction $v$, $D_v f(x) = \nabla f(x) \cdot v$.
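To make these definitions concrete, here is a small Python sketch (not from the slides; the function $f(x, y) = x^2 + 3xy$ and the chosen point and direction are assumed examples) comparing a forward-difference directional derivative with the dot product $\nabla f \cdot v$.

```python
# Sketch: numerical vs. analytic directional derivative for f(x, y) = x**2 + 3*x*y.
import numpy as np

def f(p):
    x, y = p
    return x**2 + 3.0 * x * y

def grad_f(p):
    # Analytic gradient: (df/dx, df/dy) = (2x + 3y, 3x)
    x, y = p
    return np.array([2.0 * x + 3.0 * y, 3.0 * x])

p = np.array([1.0, 2.0])           # assumed example point
v = np.array([3.0, 4.0])
v = v / np.linalg.norm(v)          # unit direction

h = 1e-6
numeric = (f(p + h * v) - f(p)) / h   # forward-difference directional derivative
analytic = grad_f(p) @ v              # D_v f(p) = grad f(p) . v

print(numeric, analytic)              # the two values should agree closely
```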
Basic idea • Modern optimization methods • A method to solve quadratic function minimization: minimize $f(x) = \frac{1}{2} x^T A x - b^T x$ (A is symmetric and positive definite)
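As a quick illustration of why this quadratic form matters, the following Python sketch (with an assumed 2x2 SPD matrix and right-hand side) checks that the gradient of $f(x) = \frac{1}{2} x^T A x - b^T x$ is $Ax - b$, so minimizing $f$ is equivalent to solving $Ax = b$.

```python
# Sketch: the minimizer of the quadratic f coincides with the solution of A x = b.
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # assumed symmetric positive definite example
b = np.array([1.0, 2.0])

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b          # gradient of the quadratic

x_star = np.linalg.solve(A, b)      # solution of A x = b
print(grad(x_star))                 # ~ [0, 0]: the gradient vanishes at x_star
print(f(x_star) <= f(x_star + np.array([0.1, -0.1])))  # True: nearby points are worse
```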
2 Steepest Descent • (1) Concept: the correction direction at a point is taken to be the negative gradient direction at that point, i.e. the direction of steepest descent, hence the name of the method. • (2) Iteration formula: choose an arbitrary initial vector $x_0$; then for $k = 0, 1, 2, \dots$, $r_k = b - A x_k$, $\alpha_k = \frac{r_k^T r_k}{r_k^T A r_k}$, $x_{k+1} = x_k + \alpha_k r_k$ (see the sketch below).
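A minimal Python sketch of the steepest-descent iteration just described, assuming exact line search for the SPD quadratic; the matrix, right-hand side, and tolerance are assumed example values.

```python
# Steepest descent for A x = b with A symmetric positive definite:
#   r_k = b - A x_k,  alpha_k = (r_k^T r_k) / (r_k^T A r_k),  x_{k+1} = x_k + alpha_k r_k
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=10_000):
    x = x0.astype(float)
    for k in range(max_iter):
        r = b - A @ x                    # residual = negative gradient of f at x
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))  # optimal step along r (exact line search)
        x = x + alpha * r
    return x, k

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # assumed SPD example
b = np.array([1.0, 2.0])
x, iters = steepest_descent(A, b, np.zeros(2))
print(x, iters)                          # x is close to np.linalg.solve(A, b)
```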
3 Conjugate Gradient • Modern optimization methods: “conjugate direction” methods. • A method to solve quadratic function minimization: minimize $f(x) = \frac{1}{2} x^T A x - b^T x$ (A is symmetric and positive definite)
Conjugate Gradient • Originally aimed at solving linear systems $Ax = b$. • Later extended to general functions, under the rationale that a quadratic approximation to a function near its minimum is quite accurate.
Conjugate Gradient • The basic idea: decompose the n-dimensional quadratic problem into n one-dimensional problems. • This is done by exploring the function along “conjugate directions”. • Definition (A-conjugate vectors): nonzero vectors $d_1, \dots, d_n$ are A-conjugate if $d_i^T A d_j = 0$ for all $i \neq j$.
Conjugate Gradient • If there is an A-conjugate basis $d_1, \dots, d_n$, then the problem splits into n one-dimensional problems (each a simple “smiling” quadratic). • The global minimizer is calculated sequentially starting from $x_0$: $x_{k+1} = x_k + \alpha_k d_k$ with $\alpha_k = \frac{d_k^T (b - A x_k)}{d_k^T A d_k}$ (see the sketch below).
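The sketch referenced above: a minimal Python implementation of the (unpreconditioned) conjugate gradient method for $Ax = b$ with SPD $A$; the example data is assumed, and the final line checks that the generated search directions are indeed A-conjugate.

```python
# Conjugate gradient for A x = b (A symmetric positive definite); in exact
# arithmetic it terminates in at most n steps.
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    x = x0.astype(float)
    r = b - A @ x
    d = r.copy()
    directions = []
    for _ in range(len(b)):              # at most n steps in exact arithmetic
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)       # exact 1-D minimization along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        directions.append(d.copy())
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r) # standard CG update for the next direction
        d = r_new + beta * d
        r = r_new
    return x, directions

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # assumed SPD example
b = np.array([1.0, 2.0])
x, dirs = conjugate_gradient(A, b, np.zeros(2))
print(x)                                 # matches np.linalg.solve(A, b)
print(dirs[0] @ A @ dirs[1])             # ~ 0: the directions are A-conjugate
```

Compared with steepest descent, which can zigzag for many iterations, the conjugate directions guarantee convergence in at most n steps (ignoring rounding errors), which is exactly the property stated in Section 2.3 above.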
Figure: Gradient (steepest descent) vs. Conjugate Gradient (comparison of the two methods' search paths).