Chebyshev Estimator Presented by: Orr Srour
References • Yonina C. Eldar, Amir Beck and Marc Teboulle, "A Minimax Chebyshev Estimator for Bounded Error Estimation" (2007), to appear in IEEE Trans. Signal Processing. • Amir Beck and Yonina C. Eldar, "Regularization in Regression with Bounded Noise: A Chebyshev Center Approach," SIAM J. Matrix Anal. Appl. 29(2), 606-625 (2007). • Jacob (Slava) Chernoi and Yonina C. Eldar, "Extending the Chebyshev Center Estimation Technique," TBA.
Chebyshev Center - Agenda • Introduction • CC - Basic Formulation • CC - Geometric Interpretation • CC - So why not..? • Relaxed Chebyshev Center (RCC) • Formulation of the problem • Relation with the original CC • Feasibility of the original CC • Feasibility of the RCC • CLS as a CC relaxation • CLS vs. RCC • Constraints formulation • Extended Chebyshev Center
Notations • y – boldface lowercase = vector • yi – the i'th component of the vector y • A – boldface uppercase = matrix • x̂ – hat = the estimate of the vector x • A ≻ B, A ⪰ B – A − B is PD, PSD (respectively)
The Problem • Estimate the deterministic parameter vector x from the observations y = Ax + w, with: • A – n x m model matrix • w – perturbation (noise) vector
LS Solution • When nothing else is known, a common approach is to find the vector x̂ that minimizes the data error ‖y − Ax‖² • Known as "least squares", this solution can be written explicitly: x̂_LS = (AᵀA)⁻¹Aᵀy (assuming A has a full column rank)
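A minimal numerical sketch of the two forms above (not from the slides; the toy A, y, and noise level are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))            # toy n x m model matrix
x_true = rng.standard_normal(5)
y = A @ x_true + 0.05 * rng.standard_normal(20)

# Explicit solution (A^T A)^{-1} A^T y, valid when A has full column rank.
x_ls = np.linalg.solve(A.T @ A, A.T @ y)

# Equivalent but numerically safer: an orthogonal-factorization solver.
x_ls2, *_ = np.linalg.lstsq(A, y, rcond=None)
```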
But… • In practical situations A is often ill-conditioned, so the LS estimate can perform poorly
Regularized LS • Assume we have some simple prior information regarding the parameter vector x • Then we can use the regularized least squares (RLS) estimate: x̂_RLS = argmin_x { ‖y − Ax‖² + λ‖x‖² }
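A sketch of the standard Tikhonov form of RLS (the slide's exact regularizer is not shown, so the plain λ‖x‖² penalty is an assumption here):

```python
import numpy as np

def rls(A, y, lam=0.1):
    """Tikhonov-regularized LS: argmin ||y - Ax||^2 + lam * ||x||^2.

    Closed form (A^T A + lam * I)^{-1} A^T y; lam > 0 also cures
    ill-conditioning, since A^T A + lam * I is always invertible.
    """
    m = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)
```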
But… • But what if we have some prior information regarding the noise vector w as well…? • What if we have some more complicated information regarding the parameter vector x?
Assumptions • From now on we assume that the noise is norm-bounded: ‖w‖² ≤ ρ • And that x lies in a set defined by: C = { x : fᵢ(x) = xᵀQᵢx + 2gᵢᵀx + dᵢ ≤ 0, 1 ≤ i ≤ k }, with Qᵢ ⪰ 0 (hence C is the intersection of k ellipsoids)
Assumptions • The feasible parameter set of x is then given by: Q = { x : ‖y − Ax‖² ≤ ρ, fᵢ(x) ≤ 0, 1 ≤ i ≤ k } • (hence Q is compact) • Q is assumed to have a non-empty interior
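A small helper showing how Q can be represented and tested in code (a sketch with my own `ellipsoids` convention, a list of (Qi, gi, di) triples matching fᵢ above):

```python
import numpy as np

def in_Q(x, A, y, rho, ellipsoids):
    """Test x in Q: the noise bound plus every ellipsoid constraint.

    Each entry of `ellipsoids` is (Qi, gi, di), encoding
    f_i(x) = x^T Qi x + 2 gi^T x + di <= 0.
    """
    if np.sum((y - A @ x) ** 2) > rho:
        return False
    return all(x @ Qi @ x + 2 * gi @ x + di <= 0
               for Qi, gi, di in ellipsoids)
```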
Constrained Least Squares (CLS) • Given the prior knowledge x ∈ C, a popular estimation strategy is: x̂_CLS = argmin_{x∈C} ‖y − Ax‖² • Minimization of the data error over C • But: the noise constraint ‖w‖² ≤ ρ is unused… • More importantly, a small data error doesn't necessarily lead to a small estimation error ‖x̂ − x‖²
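A CLS sketch using CVXPY (not from the slides; it reuses the (Qi, gi, di) encoding of C and assumes each Qi is PSD, which `quad_form` requires):

```python
import cvxpy as cp

def cls_estimate(A, y, ellipsoids):
    """CLS: minimize the data error ||y - Ax||^2 over x in C only;
    note that the noise bound rho plays no role here."""
    x = cp.Variable(A.shape[1])
    cons = [cp.quad_form(x, Qi) + 2 * gi @ x + di <= 0
            for Qi, gi, di in ellipsoids]
    cp.Problem(cp.Minimize(cp.sum_squares(y - A @ x)), cons).solve()
    return x.value
```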
Chebyshev Center • The goal: an estimator with a small estimation error • Suggested method: minimize the worst-case error over all feasible vectors: x̂_CC = argmin_x̂ max_{x∈Q} ‖x̂ − x‖²
Chebyshev Center – Geometric Interpretation • Alternative representation: min_{x̂, r} r subject to ‖x̂ − x‖² ≤ r for all x ∈ Q -> find the smallest ball (hence its center x̂ and its radius r) which encloses the set Q.
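The enclosing-ball view suggests a brute-force approximation: sample points from Q and compute the smallest ball containing the samples (a sketch only; the true CC needs the whole set, not finitely many samples):

```python
import cvxpy as cp
import numpy as np

def enclosing_ball(points):
    """Smallest ball containing finitely many points: (center, radius).
    With points sampled densely from Q, the center approximates the CC."""
    m = points.shape[1]
    c = cp.Variable(m)                       # ball center (the estimate)
    r = cp.Variable()                        # ball radius
    cons = [cp.norm(c - p) <= r for p in points]
    cp.Problem(cp.Minimize(r), cons).solve()
    return c.value, r.value

# Example: three corners of a triangle in R^2.
c, r = enclosing_ball(np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.0]]))
```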
Chebyshev Center • This problem is more commonly known as finding “Chebyshev’s Center”. Pafnuty Lvovich Chebyshev 16.5.1821 – 08.12.1894
Chebyshev Center – The Problem • The inner maximization is non-convex (maximization of a convex quadratic) • Computing the CC is therefore a hard optimization problem in general • It can be solved efficiently over the complex domain for an intersection of 2 ellipsoids
Relaxed Chebyshev Center (RCC) • Let us consider the inner maximization first: max_{x∈Q} ‖x̂ − x‖² = max_{x∈Q} { ‖x̂‖² − 2x̂ᵀx + xᵀx } and: Q = { x : xᵀQᵢx + 2gᵢᵀx + dᵢ ≤ 0, 0 ≤ i ≤ k }, with the noise constraint folded in as the i = 0 ellipsoid: Q₀ = AᵀA, g₀ = −Aᵀy, d₀ = ‖y‖² − ρ
Relaxed Chebyshev Center (RCC) • Denoting Δ = xxᵀ, we can write the optimization problem as: max_{(Δ,x)∈G} { ‖x̂‖² − 2x̂ᵀx + Tr(Δ) } – concave (linear) in (Δ, x) • with: G = { (Δ, x) : Tr(QᵢΔ) + 2gᵢᵀx + dᵢ ≤ 0, Δ = xxᵀ } – not convex
Relaxed Chebyshev Center (RCC) • Let us replace G with: T = { (Δ, x) : Tr(QᵢΔ) + 2gᵢᵀx + dᵢ ≤ 0, Δ ⪰ xxᵀ } – convex • And write the RCC as the solution of: min_x̂ max_{(Δ,x)∈T} { ‖x̂‖² − 2x̂ᵀx + Tr(Δ) } – convex in x̂
Relaxed Chebyshev Center (RCC) • T is bounded • The objective is concave (linear) in (Δ, x) • The objective is convex in x̂ • Hence we can swap the order: min-max to max-min
Relaxed Chebyshev Center (RCC) • The inner minimization is a simple quadratic problem, resulting in x̂ = x • Thus the RCC problem can be written as: max_{(Δ,x)∈T} { Tr(Δ) − ‖x‖² } • Note: this is a convex optimization problem.
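The final RCC problem becomes an SDP once Δ ⪰ xxᵀ is written as a Schur-complement LMI. A CVXPY sketch (my implementation, using the (Qi, gi, di) list convention above with the noise bound included as Q₀ = AᵀA, g₀ = −Aᵀy, d₀ = ‖y‖² − ρ):

```python
import cvxpy as cp

def rcc_estimate(ellipsoids, m):
    """Solve max tr(D) - ||x||^2 over T and return the RCC estimate x.

    Z = [[D, x], [x^T, 1]] >> 0 is equivalent to D >= x x^T
    (Schur complement), so T becomes an LMI-constrained convex set.
    """
    Z = cp.Variable((m + 1, m + 1), symmetric=True)
    D, x = Z[:m, :m], Z[:m, m]
    cons = [Z >> 0, Z[m, m] == 1]
    cons += [cp.trace(Qi @ D) + 2 * gi @ x + di <= 0
             for Qi, gi, di in ellipsoids]
    cp.Problem(cp.Maximize(cp.trace(D) - cp.sum_squares(x)), cons).solve()
    return x.value
```

Any SDP-capable solver bundled with CVXPY (e.g., SCS) handles this problem.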
RCC as an upper bound for CC • The RCC is not generally equal to the CC (except for k = 1 over the complex domain) • Since G ⊆ T, we have: max_{(Δ,x)∈G} { … } ≤ max_{(Δ,x)∈T} { … } • Hence the RCC provides an upper bound on the optimal minimax value.
RCC Solution • Theorem: • The RCC estimator is given by: x̂_RCC = −( Σᵢ₌₀ᵏ αᵢQᵢ )⁻¹ ( Σᵢ₌₀ᵏ αᵢgᵢ )
RCC Solution • where (α₀, …, α_k) is the optimal solution of: min_α { g(α)ᵀ Q(α)⁻¹ g(α) − Σᵢ αᵢdᵢ } subject to: Q(α) ⪰ I, αᵢ ≥ 0, with Q(α) = Σᵢ αᵢQᵢ and g(α) = Σᵢ αᵢgᵢ
RCC Solution – as SDP • Or as a semidefinite program (SDP): min_{α, t} t s.t.: [ Q(α) g(α) ; g(α)ᵀ t + Σᵢ αᵢdᵢ ] ⪰ 0, Q(α) ⪰ I, αᵢ ≥ 0
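A CVXPY sketch of this dual SDP (my reconstruction of the formulas above; auxiliary symmetric variables pin the LMI blocks, which keeps CVXPY's symmetry requirements satisfied):

```python
import cvxpy as cp
import numpy as np

def rcc_dual(ellipsoids, m):
    """Solve the dual SDP and return x = -Q(alpha)^{-1} g(alpha)."""
    k = len(ellipsoids)
    alpha = cp.Variable(k, nonneg=True)
    t = cp.Variable()
    Qa = sum(alpha[i] * Qi for i, (Qi, _, _) in enumerate(ellipsoids))
    ga = sum(alpha[i] * gi for i, (_, gi, _) in enumerate(ellipsoids))
    da = sum(alpha[i] * di for i, (_, _, di) in enumerate(ellipsoids))
    M = cp.Variable((m + 1, m + 1), symmetric=True)  # [[Q(a), g(a)], [g(a)^T, t + d(a)]]
    R = cp.Variable((m, m), symmetric=True)          # pinned to Q(a) - I
    cons = [M >> 0, M[:m, :m] == Qa, M[:m, m] == ga, M[m, m] == t + da,
            R >> 0, R == Qa - np.eye(m)]
    cp.Problem(cp.Minimize(t), cons).solve()
    Qv = sum(a * Qi for a, (Qi, _, _) in zip(alpha.value, ellipsoids))
    gv = sum(a * gi for a, (_, gi, _) in zip(alpha.value, ellipsoids))
    return -np.linalg.solve(Qv, gv)
```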
Feasibility of the CC • Proposition: x̂_CC is feasible (and unique). • Proof: • Let us write the opt. problem as: min_x̂ f(x̂), with: f(x̂) = max_{x∈Q} ‖x̂ − x‖² • 1. f is convex in x̂ (a maximum of convex functions) • 2. f is strictly convex • 3. hence the problem has a UNIQUE solution
Feasibility of the CC • Let us assume that x̂ is infeasible (x̂ ∉ Q), and denote by x_p its projection onto Q. • By the projection theorem: (x − x_p)ᵀ(x̂ − x_p) ≤ 0 for every x ∈ Q, and therefore: ‖x − x̂‖² ≥ ‖x − x_p‖² + ‖x_p − x̂‖²
Feasibility of the CC • So, for every x ∈ Q: ‖x − x̂‖² ≥ ‖x − x_p‖² + ‖x_p − x̂‖², with ‖x_p − x̂‖ > 0 • Which, using the compactness of Q, implies: max_{x∈Q} ‖x − x_p‖² < max_{x∈Q} ‖x − x̂‖² • But this contradicts the optimality of x̂. Hence: x̂_CC is unique and feasible.
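The chain of inequalities behind the contradiction, written out (x_p denotes the projection of the infeasible x̂ onto the closed convex set Q):

```latex
\begin{align*}
&(x - x_p)^T(\hat{x} - x_p) \le 0 \quad \forall x \in Q
  && \text{(projection theorem)} \\
&\|x - \hat{x}\|^2
  = \|x - x_p\|^2 - 2(x - x_p)^T(\hat{x} - x_p) + \|x_p - \hat{x}\|^2
  \ge \|x - x_p\|^2 + \|x_p - \hat{x}\|^2 \\
&\Longrightarrow\;
  \max_{x \in Q}\|x - x_p\|^2
  \le \max_{x \in Q}\|x - \hat{x}\|^2 - \|x_p - \hat{x}\|^2
  < \max_{x \in Q}\|x - \hat{x}\|^2 .
\end{align*}
```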
Feasibility of the RCC • Proposition: x̂_RCC is feasible. • Proof: • Uniqueness follows from the approach used earlier. • Let us prove feasibility by showing that any solution of the RCC is also feasible for the original CC, i.e., lies in Q.
Feasibility of the RCC • Let (Δ̂, x̂) be a solution of the RCC problem. Then: Tr(QᵢΔ̂) + 2gᵢᵀx̂ + dᵢ ≤ 0 • Since: Δ̂ ⪰ x̂x̂ᵀ and Qᵢ ⪰ 0, so that Tr(Qᵢ(Δ̂ − x̂x̂ᵀ)) ≥ 0 • We get: x̂ᵀQᵢx̂ + 2gᵢᵀx̂ + dᵢ ≤ 0 for every i, i.e., x̂ ∈ Q
CLS as CC relaxation • We now show that CLS is also a (looser) relaxation of the Chebyshev center. • Reminder: x̂_CLS = argmin_{x∈C} ‖y − Ax‖²
CLS as CC relaxation • Note that ‖y − Ax‖² ≤ ρ is equivalent to: Tr(AᵀAΔ) − 2yᵀAx + ‖y‖² − ρ ≤ 0 with Δ = xxᵀ • Define the following CC relaxation: V = { (Δ, x) : Tr(AᵀAΔ) − 2yᵀAx + ‖y‖² − ρ ≤ 0 (relaxed), fᵢ(x) ≤ 0, 1 ≤ i ≤ k (unharmed), Δ ⪰ xxᵀ }
CLS as CC relaxation • Theorem: The CLS estimate is the same as the relaxed CC over V (call it the CCV). • Proof: • Let (Δ̂, x̂) be the CCV solution, and assume by contradiction that x̂ ≠ x̂_CLS. • Like the RCC, the CCV is a strictly convex problem, so its solution is unique.
CLS as CC relaxation • Define: Δ̃ = x̂_CLS x̂_CLSᵀ + ( (ρ − ‖y − A x̂_CLS‖²) / λ_min(AᵀA) ) uuᵀ, where u is a unit-norm eigenvector of AᵀA corresponding to λ_min • It is easy to show that (Δ̃, x̂_CLS) ∈ V (hence it is a valid point for the CCV)
CLS as CC relaxation • Denote by h(Δ, x) = Tr(Δ) − ‖x‖² the objective of the CCV. • By definition: h(Δ̃, x̂_CLS) − h(Δ̂, x̂) ≥ ( ‖y − Ax̂‖² − ‖y − A x̂_CLS‖² ) / λ_min(AᵀA) > 0 • contradicting the optimality of (Δ̂, x̂). Hence x̂ = x̂_CLS.
CLS vs. RCC • Now, as in the proof of the feasibility of the RCC, we know that every (Δ, x) ∈ T satisfies fᵢ(x) ≤ 0 • And so: T ⊆ V • Which means that the CLS estimate is the solution of a looser relaxation than that of the RCC.
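Putting the three feasible sets side by side (G the exact lifting, T the RCC set, V the CLS set) summarizes the whole comparison:

```latex
\begin{equation*}
G \subseteq T \subseteq V
\quad\Longrightarrow\quad
\underbrace{\max_{(\Delta,x) \in G}}_{\text{CC}}
\le
\underbrace{\max_{(\Delta,x) \in T}}_{\text{RCC}}
\le
\underbrace{\max_{(\Delta,x) \in V}}_{\text{CCV (CLS)}}
\bigl\{\, \|\hat{x}\|^2 - 2\hat{x}^T x + \operatorname{tr}(\Delta) \,\bigr\},
\end{equation*}
```

so the worst-case values are nested the same way: the tighter the relaxed set, the smaller the resulting minimax value.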
Modeling Constraints • The RCC optimization method is based upon a relaxation of the set Q • Different characterizations of Q may lead to different relaxed sets • Indeed, the RCC depends on the specific chosen form of Q (unlike the CC and the CLS)
Linear Box Constraints • Suppose we want to append box constraints on x: lᵢ ≤ xᵢ ≤ uᵢ, i = 1, …, m • These can also be written as: (xᵢ − lᵢ)(xᵢ − uᵢ) ≤ 0, i = 1, …, m • Which of the two is preferable…?
Linear Box Constraints • Define: • T₁ – the relaxed set obtained by keeping the linear form lᵢ ≤ xᵢ ≤ uᵢ as-is • T₂ – the relaxed set obtained from the quadratic form, which becomes linear in (Δ, x): Δᵢᵢ − (lᵢ + uᵢ)xᵢ + lᵢuᵢ ≤ 0
Linear Box Constraints • Suppose (Δ, x) ∈ T₂; then: Δᵢᵢ − (lᵢ + uᵢ)xᵢ + lᵢuᵢ ≤ 0 • Since Δ ⪰ xxᵀ, it follows that Δᵢᵢ ≥ xᵢ², and hence: xᵢ² − (lᵢ + uᵢ)xᵢ + lᵢuᵢ ≤ 0 • Which can be written as: (xᵢ − lᵢ)(xᵢ − uᵢ) ≤ 0
Linear Box Constraints • Hence: T₂ ⊆ T₁ • T₁ is a looser relaxation -> T₂ is preferable.
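A toy CVXPY check of this ordering on the box [-1, 1]² (my construction: a common ball constraint ‖x‖² ≤ 4, lifted to Tr(Δ) ≤ 4, keeps both relaxations bounded). The optimal value under T₂ comes out strictly smaller, i.e., tighter:

```python
import cvxpy as cp

def relaxed_value(quadratic_box):
    """Value of max tr(D) - ||x||^2 over the relaxation of
    { ||x||^2 <= 4 } intersected with the box [-1, 1]^2,
    with the box encoded linearly (T1) or via lifted quadratics (T2)."""
    Z = cp.Variable((3, 3), symmetric=True)
    D, x = Z[:2, :2], Z[:2, 2]
    cons = [Z >> 0, Z[2, 2] == 1, cp.trace(D) <= 4]  # lifted ball constraint
    for i in range(2):
        if quadratic_box:
            cons.append(D[i, i] <= 1)        # lifted (x_i + 1)(x_i - 1) <= 0
        else:
            cons += [x[i] >= -1, x[i] <= 1]  # plain linear box
    return cp.Problem(cp.Maximize(cp.trace(D) - cp.sum_squares(x)), cons).solve()

print(relaxed_value(False), relaxed_value(True))  # T1 gives 4.0, T2 gives 2.0
```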
Linear Box Constraints • An example in R²: • The constraints have been chosen as the intersection of: • a randomly generated ellipsoid • the box [-1, 1] x [-1, 1]
An example – Image Deblurring • x is the vectorized form (length 256) of a 16 x 16 image • A is a 256 x 256 matrix representing atmospheric turbulence blur (half-bandwidth 4, std 0.8) • w is a WGN vector with std 0.05 • The observations are y = Ax + w • We want x back…
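A sketch of how such a test problem can be built (my construction, modeled on the classic separable Gaussian "atmospheric turbulence" blur; the authors' exact matrix may differ):

```python
import numpy as np

n, hbw, std = 16, 4, 0.8                # image side, half-bandwidth, PSF std

# Banded symmetric Toeplitz 1-D blur; A = kron(T, T) blurs rows and columns.
r = np.zeros(n)
r[:hbw + 1] = np.exp(-np.arange(hbw + 1) ** 2 / (2 * std ** 2))
T = np.array([[r[abs(i - j)] for j in range(n)] for i in range(n)])
A = np.kron(T, T)                       # 256 x 256 model matrix

rng = np.random.default_rng(0)
x = rng.random(n * n)                   # stand-in for the vectorized image
y = A @ x + 0.05 * rng.standard_normal(n * n)
```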
An example – Image Deblurring • The estimators compared on the blurred, noisy image: • LS: x̂ = (AᵀA)⁻¹Aᵀy • RLS: x̂ = (AᵀA + λI)⁻¹Aᵀy, with a suitable λ > 0 • CLS: min_{x∈C} ‖y − Ax‖² • RCC: the relaxed Chebyshev center over Q