Tutorial 4

Contents:
• Least squares solution for overcomplete linear systems
• … via normal equations
• … via A = QR factorization
• … via SVD decomposition
• SVD - Singular Value Decomposition, A = UΣV^T
Normal Equations

Consider the overdetermined system Ax = b (more equations than unknowns). It can be the result of physical measurements, which usually incorporate some errors. Since we cannot solve it exactly, we would like to minimize the error:

r = b - Ax
||r||^2 = r^T r = (b - Ax)^T (b - Ax) = b^T b - 2x^T A^T b + x^T A^T A x
d(||r||^2)/dx = 0 - zero derivative is a (necessary) minimum condition
-2A^T b + 2A^T A x = 0
A^T A x = A^T b – Normal Equations
Normal Equations 2

A^T A x = A^T b – Normal Equations

This is a well-defined n×n linear system; solving it gives the least squares solution x.
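A minimal NumPy sketch of this approach (the matrix A and vector b below are made-up illustrative data, and the cross-check against np.linalg.lstsq is my addition, not part of the tutorial):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # overdetermined: m = 6 equations, n = 3 unknowns
b = rng.standard_normal(6)

# Solve the normal equations A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)

# Cross-check against NumPy's built-in least squares solver
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))      # True
```

Forming A^T A squares the condition number of A, which is one reason the QR and SVD approaches below are usually preferred numerically.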
Least squares via A = QR decomposition

A(m,n) = Q(m,n) R(n,n), where Q has orthonormal columns, therefore Q^T Q = I.

QRx = b
R(n,n) x = Q^T(n,m) b(m,1) – a well-defined linear system
x = R^(-1) Q^T b

Q is found by Gram-Schmidt orthogonalization of the columns of A (a sketch follows below). How do we find R?
QR = A
Q^T Q R = Q^T A, but Q^T Q = I, therefore:
R = Q^T A
R is upper triangular, since in the orthogonalization procedure only a_1, …, a_k (without a_{k+1}, …) are used to produce q_k.
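To make the construction concrete, here is a short sketch of classical Gram-Schmidt orthogonalization (the function name gram_schmidt and the test matrix are illustrative additions, not from the tutorial; it assumes the columns of A are linearly independent):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A; returns Q and R = Q^T A with A = QR."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for k in range(n):
        q = A[:, k].astype(float).copy()
        # only a_1, ..., a_k are involved in producing q_k, hence R is upper triangular
        for j in range(k):
            q -= (Q[:, j] @ A[:, k]) * Q[:, j]
        Q[:, k] = q / np.linalg.norm(q)
    return Q, Q.T @ A

A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
Q, R = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))   # Q^T Q = I
print(np.allclose(np.triu(R), R))        # R is upper triangular
print(np.allclose(Q @ R, A))             # A = QR
```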
Least squares via A = QR decomposition 2

Let us check the correctness:
QRx = b
Rx = Q^T b
x = R^(-1) Q^T b
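In NumPy the same steps look like this (illustrative data; np.linalg.qr returns the thin factorization, so Q^T Q = I and R is square upper triangular):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

Q, R = np.linalg.qr(A)            # thin QR: Q is 6x3, R is 3x3 upper triangular
x = np.linalg.solve(R, Q.T @ b)   # solve R x = Q^T b (back-substitution in principle)

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))      # True
```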
Least squares via SVD

Ax = b; A = UΣV^T – singular value decomposition of A:
UΣV^T x = b
x = V Σ^(-1) U^T b
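A corresponding sketch with the thin SVD (illustrative data; for a full-column-rank A the thin Σ is square and invertible, so x = V Σ^(-1) U^T b is the least squares solution):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

U, s, Vt = np.linalg.svd(A, full_matrices=False)  # thin SVD: A = U diag(s) V^T
x = Vt.T @ ((U.T @ b) / s)                        # x = V Sigma^-1 U^T b

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))                      # True
```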
Singular Value Decomposition 1

The SVD is based on the fact that for any A there are orthonormal bases v_1, …, v_r for the row space and u_1, …, u_r for the column space, such that Av_i = σ_i u_i, with σ_i > 0.

Thus, any matrix can be represented as A = UΣV^T, where U and V are orthogonal and Σ is diagonal.
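The defining relation Av_i = σ_i u_i can be checked directly on any matrix (random illustrative data below, using NumPy's SVD routine):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T
for i in range(len(s)):
    # each right singular vector is mapped onto the matching left singular vector
    print(np.allclose(A @ V[:, i], s[i] * U[:, i]))   # True
```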
Singular Value Decomposition 2

First we find the matrix V:
A^T A = (UΣV^T)^T (UΣV^T) = V Σ^T U^T U Σ V^T = V Σ^T Σ V^T

This is an ordinary (eigenvector) factorization of a symmetric matrix, therefore V is built of eigenvectors of A^T A; the eigenvectors of A^T A are the columns of V, i.e. the rows of V^T. In the same way one can prove that U is built from eigenvectors of AA^T. However, an easier way to find U and Σ is to use the equations Av_i = σ_i u_i, with σ_i = √λ_i.
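A sketch of exactly this construction: V from the eigenvectors of A^T A, then σ_i = √λ_i and u_i = Av_i / σ_i (illustrative full-column-rank matrix; in practice a dedicated SVD routine is used instead):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))

lam, V = np.linalg.eigh(A.T @ A)     # eigen-decomposition of the symmetric matrix A^T A
lam, V = lam[::-1], V[:, ::-1]       # reorder so that sigma_1 >= sigma_2 >= ...

sigma = np.sqrt(lam)                 # singular values sigma_i = sqrt(lambda_i)
U = (A @ V) / sigma                  # columns u_i = A v_i / sigma_i

print(np.allclose(U @ np.diag(sigma) @ V.T, A))   # A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(3)))            # U has orthonormal columns
```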
SVD Example

Let us find the SVD for the matrix A. In order to find V, we calculate the eigenvectors of A^T A:

(5 - λ)^2 - 9 = 0
λ^2 - 10λ + 16 = 0
λ_1 = 8, λ_2 = 2
SVD Example 2

Now we obtain U and Σ from Av_i = σ_i u_i, with σ_1 = √8 = 2√2 and σ_2 = √2, so that A = UΣV^T.
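The example matrix itself appears only in the original slide image. As an illustration, a matrix consistent with the characteristic polynomial above is A = [[2, 2], [-1, 1]], since its A^T A = [[5, 3], [3, 5]]; this assumed matrix is not taken from the slide. A quick numerical check:

```python
import numpy as np

# Assumed example matrix: A^T A = [[5, 3], [3, 5]], whose eigenvalues
# are 8 and 2, matching (5 - lambda)^2 - 9 = 0 above.
A = np.array([[2.0, 2.0], [-1.0, 1.0]])

print(np.linalg.eigvalsh(A.T @ A))           # [2. 8.]
U, s, Vt = np.linalg.svd(A)
print(s)                                     # [2.828..., 1.414...] = [sqrt(8), sqrt(2)]
print(np.allclose(U @ np.diag(s) @ Vt, A))   # True
```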
Appendix: derivative of x^T A^T A x
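The derivation behind this appendix (carried as an image in the original slide) can be sketched with the standard matrix-calculus identity for a quadratic form, using the fact that M = A^T A is symmetric:

```latex
\frac{\partial}{\partial x}\bigl(x^{T} M x\bigr) = (M + M^{T})\,x
\quad\Longrightarrow\quad
\frac{\partial}{\partial x}\bigl(x^{T} A^{T} A\, x\bigr) = 2\,A^{T} A\, x .
```

Combined with the linear term, d(-2 x^T A^T b)/dx = -2 A^T b, this gives the condition -2A^T b + 2A^T A x = 0 used in the Normal Equations slide.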