Ill-Posedness and Regularization of Linear Operators (1 lecture)
• Singular value decomposition (SVD) in finite-dimensional spaces
• Least-squares solution; Moore-Penrose pseudo-inverse
• Geometry of a linear inverse
• Ill-posed and ill-conditioned problems
• Tikhonov regularization; Truncated SVD
• SVD of compact operators
Basics of linear operators in function spaces (Appendix B of [RB1])
• H1 and H2 are Hilbert spaces (finite- or infinite-dimensional).
• An operator A from H1 to H2 is a mapping that assigns to each f in its domain D(A) ⊆ H1 an element Af in its range R(A) ⊆ H2.
• A is defined everywhere if D(A) = H1; A is an operator on H if H1 = H2 = H.
• Linear operators: A(αf + βh) = αAf + βAh. We write Af instead of A(f).
• Null space of a linear operator: N(A) = {f ∈ H1 : Af = 0} (it is a subspace).
Basics of linear operators in function spaces (Appendix B of [RB1])
• A is bounded if there exists a constant M such that ‖Af‖ ≤ M‖f‖ for all f ∈ H1.
• Norm of A: ‖A‖ = sup_{‖f‖=1} ‖Af‖.
• A linear operator A is continuous iff it is bounded.
• Adjoint operator: A* is the unique operator such that ⟨Af, h⟩ = ⟨f, A*h⟩ for all f ∈ H1, h ∈ H2.
• Note: the adjoint operator depends on the inner product. Example: if A is an n × m real matrix and Rⁿ, Rᵐ carry the standard Euclidean inner products, then A* = Aᵀ is an m × n matrix.
Eigenvalues and eigenvectors of symmetric matrices
• Rⁿ is equipped with the standard Euclidean inner product ⟨f, h⟩ = fᵀh.
• A is symmetric (self-adjoint): A = Aᵀ.
• Eigen-equation: A u_i = λ_i u_i, i = 1, …, n.
• The eigenvalues λ_i are real; the eigenvectors u_i may always be chosen to form an orthonormal basis.
• Let U = [u_1, …, u_n] and Λ = diag(λ_1, …, λ_n) (U is a unitary matrix: UᵀU = UUᵀ = I).
• Eigen-equation in terms of U: AU = UΛ, i.e., A = UΛUᵀ.
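These properties can be checked numerically; a minimal MATLAB sketch (the 5×5 matrix is an arbitrary example, not from the lecture):

```matlab
% Arbitrary symmetric (self-adjoint) test matrix: symmetrize a random matrix.
B = randn(5);
A = (B + B')/2;

[U, Lambda] = eig(A);                  % eigen-equation: A*U = U*Lambda
disp(max(abs(imag(diag(Lambda)))))     % eigenvalues are real (zero imaginary part)
disp(norm(U'*U - eye(5)))              % eigenvectors form an orthonormal basis
disp(norm(A - U*Lambda*U'))            % spectral representation A = U*Lambda*U'
```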
Spectral representation of a symmetric matrix A
• Action of a real symmetric matrix on an input vector f: Af = Σ_i λ_i ⟨f, u_i⟩ u_i.
• ⟨f, u_i⟩ projects the input vector along u_i.
• The output is synthesized by the linear combination of the u_i, weighted by λ_i ⟨f, u_i⟩.
Functions of a symmetric matrix
• From A = UΛUᵀ and the following property of a unitary matrix, UᵀU = UUᵀ = I, it follows that:
1. Aᵏ = UΛᵏUᵀ.
2. If h(A) is a power series, h(A) = U h(Λ) Uᵀ.
3. If A is non-singular, A⁻¹ = UΛ⁻¹Uᵀ.
4. If λ_i ≥ 0 for all i (A is positive semi-definite, PSD), we can define A^{1/2} = UΛ^{1/2}Uᵀ.
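A short sketch of properties 3 and 4, computing A⁻¹ and A^{1/2} through the eigendecomposition (the PSD matrix CᵀC is a hypothetical example):

```matlab
% Hypothetical PSD matrix: C'*C is always symmetric positive semi-definite.
C = randn(4);
A = C'*C;

[U, Lambda] = eig(A);
Asqrt = U * sqrt(Lambda) * U';            % A^(1/2) = U*Lambda^(1/2)*U'  (property 4)
disp(norm(Asqrt*Asqrt - A))               % check: A^(1/2)*A^(1/2) = A

Ainv = U * diag(1./diag(Lambda)) * U';    % A^(-1) = U*Lambda^(-1)*U'  (property 3)
disp(norm(Ainv*A - eye(4)))               % check: A^(-1)*A = I
```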
Singular value decomposition of a real (complex) rectangular matrix
• Rⁿ is equipped with the standard Euclidean inner product.
• If n = m but A is not self-adjoint (A ≠ Aᵀ), the eigendecomposition does not have the nice properties of self-adjoint matrices. The cyclic (circulant) matrices are an exception.
• If n ≠ m, the eigenvalue problem is meaningless.
• The singular value decomposition provides a generalization of the self-adjoint spectral decomposition.
Singular value decomposition
• Rⁿ and Rᵐ are equipped with the standard Euclidean inner products.
• A = U Σ Vᵀ, with U and V isometric (orthonormal columns: UᵀU = I, VᵀV = I).
• Columns of U: left singular vectors u_i. Columns of V: right singular vectors v_i.
• Singular values: σ_1 ≥ σ_2 ≥ … ≥ σ_r > 0, where r = rank(A).
Singular value decomposition: consequences
• Matrix norms: ‖A‖_2 = σ_1 and ‖A‖_F = (Σ_i σ_i²)^{1/2}.
• Range and null-space of A: R(A) = span{u_1, …, u_r}, N(A) = span{v_{r+1}, …, v_n}.
• Range and null-space of Aᵀ: R(Aᵀ) = span{v_1, …, v_r}, N(Aᵀ) = span{u_{r+1}, …, u_m}.
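The decomposition and its consequences can be verified numerically; a minimal sketch with an arbitrary 5×3 matrix:

```matlab
A = randn(5, 3);                          % rectangular matrix, m = 5, n = 3
[U, S, V] = svd(A);                       % A = U*S*V'
disp(norm(A - U*S*V'))                    % reconstruction error (round-off level)

s = diag(S);                              % singular values, nonincreasing
r = sum(s > max(size(A))*eps(s(1)));      % numerical rank
disp(norm(A, 2) - s(1))                   % spectral norm = largest singular value
disp(norm(A, 'fro') - norm(s))            % Frobenius norm = sqrt(sum of sigma_i^2)
% U(:,1:r) spans the range of A; V(:,r+1:end) spans the null space of A.
```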
Singular value decomposition
• Action of the matrix A on an input vector f: Af = Σ_i σ_i ⟨f, v_i⟩ u_i.
• ⟨f, v_i⟩ projects the input vector along v_i.
• The output is synthesized by the linear combination of the u_i, weighted by σ_i ⟨f, v_i⟩.
Inversion methods:
a) A is invertible: f = A⁻¹g.
b) A is not invertible: define a generalized solution through the least-squares approach (next slides).
Least-squares approach
• Minimize the data misfit: min_f ‖g − Af‖².
• Decompose g into orthogonal components: the component of g in R(A) and the component in R(A)⊥; only the component in R(A) can be matched by Af.
Generalized inverse
• f† = A†g = Σ_{i=1}^{r} (1/σ_i) ⟨g, u_i⟩ v_i is the least-squares solution of minimum norm, or the generalized solution.
Moore-Penrose pseudo-inverse and minimum-norm solution
• Moore-Penrose pseudo-inverse (full column rank, r = n ≤ m): A† = (AᵀA)⁻¹Aᵀ.
• Minimum-norm solution (full row rank, r = m < n): A† = Aᵀ(AAᵀ)⁻¹.
Moore-Penrose pseudo-inverse: a variational point of view
• Minimization of the observed data misfit: min_f ‖g − Af‖².
• Normal equations: AᵀA f = Aᵀg.
• If AᵀA is invertible, f† = (AᵀA)⁻¹Aᵀg.
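A hedged sketch comparing the normal-equations solution with MATLAB's pinv on an overdetermined, full-column-rank system (dimensions and noise level are illustrative):

```matlab
% Overdetermined system g = A*f + noise, full column rank (r = n <= m).
m = 50; n = 10;
A = randn(m, n);
f_true = randn(n, 1);
g = A*f_true + 0.01*randn(m, 1);

f_ne   = (A'*A) \ (A'*g);     % normal equations: (A'A) f = A'g
f_pinv = pinv(A) * g;         % Moore-Penrose pseudo-inverse
disp(norm(f_ne - f_pinv))     % the two least-squares solutions coincide
```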
Effect of noise
• With noisy data g = Af + n, ‖n‖ ≤ ε, the set of admissible solutions is {f : ‖Af − g‖ ≤ ε}.
• The boundary of this set is an ellipsoid (an ellipse in two dimensions) centered at f† with principal axes aligned with the right singular vectors v_k.
• The length of the k-th principal semi-axis is ε/σ_k: small singular values yield long axes, i.e., large uncertainty.
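The instability caused by small singular values can be seen directly; a sketch with a hypothetical operator whose singular values decay from 1 to 10⁻⁸:

```matlab
% Ill-conditioned operator with prescribed, rapidly decaying singular values.
n = 20; rng(0);
[Q1, ~] = qr(randn(n));  [Q2, ~] = qr(randn(n));
sigma = 10.^(-linspace(0, 8, n)');          % singular values from 1 down to 1e-8
A = Q1 * diag(sigma) * Q2';

f = randn(n, 1);
g_clean = A*f;
g_noisy = g_clean + 1e-6*randn(n, 1);       % tiny data perturbation

disp(norm(pinv(A)*g_clean - f))             % noise-free data: f recovered accurately
disp(norm(pinv(A)*g_noisy - f))             % noisy data: error amplified by 1/sigma_k
```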
Classification of linear operators
• If n ≠ m, or if r < min(n, m), A is not invertible and the problem is ill-posed.
• In any case, "small" singular values are sources of instabilities.
• Often, the smaller the singular values, the more oscillating the corresponding singular vectors (high frequencies).
• Regularization: shrink/threshold the large values of 1/σ_i, i.e., multiply 1/σ_i by a regularizer (filter) function w_λ(σ) such that w_λ(σ)/σ remains bounded as σ → 0.
Regularization
• Regularization by shrinking/thresholding the spectrum of A: f_λ = Σ_i w_λ(σ_i) (1/σ_i) ⟨g, u_i⟩ v_i, such that 1) w_λ(σ) ≈ 1 for large σ and 2) w_λ(σ)/σ → 0 as σ → 0; the larger singular values are retained.
• Truncated SVD (TSVD): w_λ(σ) = 1 if σ ≥ λ, and 0 otherwise.
• Tikhonov (Wiener) regularization: w_λ(σ) = σ²/(σ² + λ).
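Both filters can be applied directly through the SVD; a minimal sketch reusing the ill-conditioned test problem from the previous sketch (the threshold and λ are illustrative choices):

```matlab
% Ill-conditioned test problem (same construction as in the noise sketch).
n = 20; rng(0);
[Q1, ~] = qr(randn(n));  [Q2, ~] = qr(randn(n));
A = Q1 * diag(10.^(-linspace(0, 8, n)')) * Q2';
f = randn(n, 1);
g = A*f + 1e-6*randn(n, 1);

[U, S, V] = svd(A);  s = diag(S);  c = U'*g;     % coefficients <g, u_i>

tau = 1e-3;                                      % illustrative TSVD threshold
f_tsvd = V * (double(s >= tau) .* (c ./ s));     % w(sigma) = 1 if sigma >= tau, else 0

lambda = 1e-6;                                   % illustrative Tikhonov parameter
f_tik = V * ((s.^2 ./ (s.^2 + lambda)) .* (c ./ s));   % w(sigma) = sigma^2/(sigma^2 + lambda)

disp([norm(pinv(A)*g - f), norm(f_tsvd - f), norm(f_tik - f)])   % regularization reduces the error
```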
Tikhonov regularization: variational formulation
• [Figure: filter factors w_λ(σ) for TSVD and Tikhonov regularization by shrinking/thresholding the spectrum of A.]
• Let us write the singular value decomposition of A as A = U Σ Vᵀ, with U and V unitary.
Tikhonov regularization: variational formulation (cont.)
• Thus, the Tikhonov regularized solution is given by f_λ = (AᵀA + λI)⁻¹Aᵀg,
• which is the solution of the variational problem min_f ‖g − Af‖² + λ‖f‖².
• Tikhonov regularization applies in any Hilbert spaces (see Appendix E of [RB1]).
• Family of quadratic regularizers: min_f ‖g − Af‖² + λ‖Df‖², for a linear operator D. Does the SVD play a role here?
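The closed form and the SVD view can be checked against each other; a minimal sketch with arbitrary dimensions and an illustrative λ:

```matlab
% Tikhonov solution from the regularized normal equations (A'A + lambda*I) f = A'g.
m = 40; n = 30; lambda = 0.1;                 % illustrative sizes and parameter
A = randn(m, n);  g = randn(m, 1);

f_tik = (A'*A + lambda*eye(n)) \ (A'*g);

% The same solution via the SVD filter factors sigma/(sigma^2 + lambda).
[U, S, V] = svd(A, 'econ');  s = diag(S);
f_svd = V * ((s ./ (s.^2 + lambda)) .* (U'*g));
disp(norm(f_tik - f_svd))                     % the two expressions coincide
```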
Medium/large systems
• For medium/large systems, the SVD is impracticable.
• The optimization problem min_f ‖g − Af‖² + λ‖Df‖², with the Euler-Lagrange equation (AᵀA + λDᵀD) f = Aᵀg, is solved by resorting to iterative methods that depend only on the operators A, Aᵀ (and D, Dᵀ).
• Example: Landweber iterations, f^(k+1) = f^(k) + τ Aᵀ(g − A f^(k)), which use only products by A and Aᵀ.
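A minimal Landweber sketch (step size and iteration count are illustrative; the step must satisfy 0 < τ < 2/σ₁², and early stopping plays the role of regularization):

```matlab
% Landweber iteration f_{k+1} = f_k + tau*A'*(g - A*f_k): only products by A and A' are needed.
m = 200; n = 100;
A = randn(m, n);  f_true = randn(n, 1);  g = A*f_true + 0.01*randn(m, 1);

tau = 1/norm(A)^2;                 % illustrative step size, below 2/sigma_1^2
f = zeros(n, 1);
for k = 1:500
    f = f + tau * (A' * (g - A*f));
end
disp(norm(A*f - g)/norm(g))        % relative data misfit after 500 iterations
```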
Singular value decomposition in infinite-dimensional spaces
A singular system for a compact linear operator A : H1 → H2 (H1, H2 are Hilbert spaces) is a countable set of triples {(v_j, u_j, σ_j)} with the following properties:
1. The right singular vectors v_j form an orthonormal basis for N(A)⊥.
2. The left singular vectors u_j form an orthonormal basis for the closure of R(A).
3. The singular values σ_j are positive real numbers and are in nonincreasing order, σ_1 ≥ σ_2 ≥ … > 0.
4. For each j, A v_j = σ_j u_j and A* u_j = σ_j v_j.
5. If R(A) is infinite dimensional, lim_{j→∞} σ_j = 0.
6. A has the representation A f = Σ_j σ_j ⟨f, v_j⟩ u_j.
Examples of compact operators
1. Any linear operator A for which R(A) is finite dimensional is compact.
2. The diagonal operator on ℓ²: (Af)_j = σ_j f_j, with σ_j → 0.
3. The Fredholm first-kind integral operator on L²(Ω) (the space of real-valued square-integrable functions on Ω - a Hilbert space): (Af)(x) = ∫_Ω k(x, y) f(y) dy, with a square-integrable kernel k.
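To illustrate item 3, a hedged sketch that discretizes a first-kind Fredholm operator with a hypothetical Gaussian kernel and plots its rapidly decaying singular values, the source of the ill-posedness discussed next:

```matlab
% Discretize (Af)(x) = integral of k(x,y) f(y) dy on [0,1], Gaussian kernel of (assumed) width 0.03.
n = 100;
x = linspace(0, 1, n)';
h = x(2) - x(1);                                 % quadrature weight
K = exp(-(x - x').^2 / (2*0.03^2)) * h;          % kernel matrix

s = svd(K);
semilogy(s, '.-'); grid on
xlabel('index j'); ylabel('\sigma_j')
title('Singular values of a discretized first-kind Fredholm operator')
```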
Compact linear operators in infinite-dimensional spaces are ill-posed
Let A : H1 → H2 be a compact linear operator, where H1 and H2 are infinite-dimensional Hilbert spaces.
1. If R(A) is infinite dimensional, then the operator equation Af = g is ill-posed in the sense that the solution is not stable (σ_j → 0).
2. If R(A) is finite dimensional, then the solution is not unique (N(A) is nontrivial).
Summary: SVD/least-squares based solutions
• Least-squares approach: min_f ‖g − Af‖².
• Minimum-norm solution: f† = A†g = Σ_{i=1}^{r} (1/σ_i) ⟨g, u_i⟩ v_i.
Summary: Regularized solutions
• Truncated SVD (TSVD): f_λ = Σ_{σ_i ≥ λ} (1/σ_i) ⟨g, u_i⟩ v_i.
• Tikhonov (Wiener) regularization: f_λ = (AᵀA + λI)⁻¹Aᵀg, which is the solution of the variational problem min_f ‖g − Af‖² + λ‖f‖², with regularizer (penalizing function) ‖f‖².
Summary: Medium/large systems with quadratic regularization
• For medium/large systems, the SVD is impracticable (periodic convolution operators are an important exception).
• The optimization problem min_f ‖g − Af‖² + λ‖Df‖² (quadratic regularizer), with the Euler-Lagrange equation (AᵀA + λDᵀD) f = Aᵀg, is solved by resorting to iterative methods that depend only on the operators A, Aᵀ (and D, Dᵀ).
• Example: Landweber iterations.
Summary: Non-quadratic regularization
• Example: a discontinuity-preserving regularizer, which penalizes oscillatory solutions while preserving sharp transitions (discontinuities).
Example: deconvolution of a step
• [Figure: the convolution matrix A and the reconstruction results.]
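The listed script step_deconvolution.m is the reference for this example; here is a rough standalone sketch of the same idea (blur kernel length, noise level, and truncation index are assumptions):

```matlab
% Deconvolution of a step: blur with a moving-average convolution matrix, invert with TSVD.
n = 128; k = 9;                                   % assumed 9-tap moving-average blur
f = [zeros(n/2, 1); ones(n/2, 1)];                % step signal
col = zeros(n, 1); col(1:k) = 1/k;
A = toeplitz(col, [col(1), zeros(1, n-1)]);       % lower-triangular banded convolution matrix
g = A*f + 1e-3*randn(n, 1);                       % blurred, noisy observation

[U, S, V] = svd(A);  s = diag(S);  c = U'*g;
r = 40;                                           % illustrative truncation index
f_tsvd = V(:, 1:r) * (c(1:r) ./ s(1:r));          % TSVD reconstruction

plot([f, A\g, f_tsvd])
legend('original step', 'naive inverse', 'TSVD (r = 40)')
```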
Example: Sparse reconstruction (ℓ1 norm)
• [Figures: original data f and observed data g.]
Example: Sparse reconstruction
• [Figures: ℓ1-regularized reconstruction vs. the pseudo-inverse solution.]
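The listed script l2_l1sparse_regression.m is the reference here; a hedged sketch of one standard solver for the ℓ2-ℓ1 problem min_f ½‖g − Af‖² + λ‖f‖₁, the iterative shrinkage/thresholding algorithm (a majorization-minimization method in the spirit of [PO1], [PO3]); sizes, sparsity, and λ are illustrative:

```matlab
% ISTA for min_f 0.5*||g - A*f||^2 + lambda*||f||_1 (sparse reconstruction).
m = 80; n = 200; k = 10;                          % underdetermined system, k-sparse truth
A = randn(m, n)/sqrt(m);
f_true = zeros(n, 1);  f_true(randperm(n, k)) = randn(k, 1);
g = A*f_true + 0.01*randn(m, 1);

lambda = 0.05;                                    % illustrative regularization parameter
t = 1/norm(A)^2;                                  % step size 1/L (L = Lipschitz constant of the gradient)
soft = @(x, T) sign(x) .* max(abs(x) - T, 0);     % soft threshold: proximal map of the l1 norm

f = zeros(n, 1);
for it = 1:500
    f = soft(f + t*A'*(g - A*f), t*lambda);       % gradient step on the misfit, then shrinkage
end
disp([norm(pinv(A)*g - f_true), norm(f - f_true)])  % minimum-norm (pseudo-inverse) vs l1 solution
```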
Bibliography
• [Ch9; RB1], [Ch2, Ch3; L1]
Important topics
• Majorization Minimization [PO1], [PO3]
• Compressed Sensing [PCS1]
Matlab scripts
• TSVD_regularization_1D.m
• TSVD_Error_1D.m
• step_deconvolution.m
• l2_l1sparse_regression.m