Using Bayes' Rule to Formulate Your Problem
Example: • A : event that the person tested has cancer • B : event that the test is positive • We are interested in P(A = has cancer | B = positive). • Suppose the test is 95% accurate, i.e., P(B | A) = 0.95 and P(¬B | ¬A) = 0.95. • Suppose we also know the prior P(A), the base rate of cancer in the population. • According to Bayes' Rule: P(A | B) = P(B | A) P(A) / P(B), where P(B) = P(B | A) P(A) + P(B | ¬A) P(¬A).
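A quick numeric check of the example above. The test accuracy is taken from the slide; the base rate P(A) = 0.008 is an assumed, illustrative value, since the slide's actual prior was not preserved:

```python
# Bayes' rule for the cancer-test example.
# 0.95 accuracy is from the slide; the 0.008 prior is an assumption.
p_b_given_a = 0.95        # P(positive | cancer)
p_notb_given_nota = 0.95  # P(negative | no cancer)
p_a = 0.008               # P(cancer): assumed base rate

p_b_given_nota = 1 - p_notb_given_nota                 # false-positive rate
p_b = p_b_given_a * p_a + p_b_given_nota * (1 - p_a)   # total probability of a positive test
p_a_given_b = p_b_given_a * p_a / p_b                  # Bayes' rule

print(round(p_a_given_b, 4))
```

Even with a 95%-accurate test, the posterior stays small because the prior is small: most positives come from the large healthy population.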
Advantages of the Bayesian Model • Allows us to learn and/or model causal relationships between hypothesis model parameters and observations. • Can incorporate prior knowledge through the prior probability. • Can handle incomplete or noisy observations by incorporating proper priors into the model. • The Bayesian method is a probabilistic method that finds the most likely solution for decision boundaries or model parameters.
Bayesian inference and the MAP problem • If we are interested in finding a hypothesis model (or its parameters θ), we can maximize the posterior or joint probability given the current observations D: θ* = argmax_θ P(θ | D) = argmax_θ P(D | θ) P(θ). • If we do not have any prior probability (i.e., the prior is uniform), • the MAP problem reduces to the Maximum Likelihood (ML) problem: θ* = argmax_θ P(D | θ).
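To make the MAP/ML relationship concrete, here is a minimal sketch for estimating a coin's heads probability; the coin counts and the Beta prior are illustrative assumptions, not from the slides. With a uniform prior, the MAP estimate coincides with the ML estimate:

```python
# Sketch: MAP vs. ML for a coin's heads probability with a Beta(a, b) prior.
# The counts (7 heads, 3 tails) and prior parameters are assumed examples.
heads, tails = 7, 3
n = heads + tails

# ML: maximize the likelihood alone -> sample frequency.
theta_ml = heads / n

# MAP: maximize likelihood * Beta(a, b) prior (closed form for this model).
def theta_map(a, b):
    return (heads + a - 1) / (n + a + b - 2)

print(theta_ml)         # 0.7
print(theta_map(1, 1))  # Beta(1, 1) is uniform: MAP equals ML
print(theta_map(5, 5))  # an informative prior pulls the estimate toward 0.5
```

The Beta(1, 1) case demonstrates the slide's point: with no (i.e., uniform) prior, MAP and ML give the same answer.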
Bayesian method in Computer Vision (Image Denoising Example) • Given a noisy image I_N, • we want to estimate a clean image I, • assuming each pixel is corrupted by noise that is independent and identically distributed (i.i.d.) • and that the noise follows a Gaussian distribution. • Based on the i.i.d. Gaussian assumption, we can define the likelihood probability: P(I_N | I) ∝ ∏_p exp( −(I_N(p) − I(p))² / (2σ²) ).
Bayesian method in Computer Vision (Image Denoising Example) • Without a prior, the trivial solution to the image denoising problem is I = I_N. • A common prior for the image denoising problem is a neighborhood smoothness prior, e.g., P(I) ∝ ∏_{(p,q) neighbors} exp( −(I(p) − I(q))² / (2σ_s²) ). • We use a Gaussian distribution again to model the prior probability distribution. • This is a standard Markov Random Field (MRF) approach.
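The MAP estimate under a Gaussian likelihood and a quadratic smoothness prior can be found by gradient descent on the negative log-posterior. A minimal sketch, where the test image, noise level, smoothness weight `lam`, and step size are all illustrative assumptions:

```python
import numpy as np

# Minimal MAP-denoising sketch: minimize
#   E(I) = sum (I - I_N)^2 / (2 sigma^2) + (lam / 2) * sum |grad I|^2
# by gradient descent. All numbers here are assumed, illustrative values.
rng = np.random.default_rng(0)
clean = np.zeros((16, 16))
clean[4:12, 4:12] = 1.0
noisy = clean + 0.3 * rng.standard_normal(clean.shape)

def denoise(I_N, sigma=0.3, lam=2.0, steps=300, lr=0.02):
    I = I_N.copy()
    for _ in range(steps):
        # Data (likelihood) term gradient: (I - I_N) / sigma^2.
        g = (I - I_N) / sigma**2
        # Smoothness (prior) term gradient: -lam * discrete Laplacian of I.
        lap = (np.roll(I, 1, 0) + np.roll(I, -1, 0)
               + np.roll(I, 1, 1) + np.roll(I, -1, 1) - 4 * I)
        g -= lam * lap
        I -= lr * g
    return I

denoised = denoise(noisy)
# The MAP estimate should be closer to the clean image than the noisy input.
print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

The quadratic smoothness term used here is the simplest MRF prior; real denoising systems typically use edge-preserving (non-quadratic) potentials instead.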
Bayesian method in Computer Vision (Image Denoising Example) • Results comparison (figure): input, Gaussian filter, median filter, MRF result, ground truth.
Optimization Methods • Linear Regression • Alternating Optimization (EM Algorithm) • Belief Propagation
Linear Regression • Used when the model is linear in its parameters. • Basic assumptions: • estimation errors follow a Gaussian distribution; • observations are independent and identically distributed. • The globally optimal solution corresponds to the least-squares estimate, argmin_w ||Xw − y||². • A closed-form solution is obtained by solving the system via SVD (the pseudo-inverse).
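A minimal sketch of the least-squares solve, on synthetic, assumed data; `np.linalg.lstsq` computes the SVD-based solution the slide refers to:

```python
import numpy as np

# Least-squares fit of y = w0 + w1 * x via SVD (np.linalg.lstsq).
# The data and true weights below are synthetic, illustrative values.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, 50)])  # design matrix
true_w = np.array([0.5, 2.0])
y = X @ true_w + 0.01 * rng.standard_normal(50)  # small Gaussian errors

# lstsq returns the minimum-norm least-squares solution via SVD.
w, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 2))  # close to [0.5, 2.0]
```

Because the noise is Gaussian and i.i.d., this least-squares solution is also the ML estimate, which is exactly why the slide lists those assumptions.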
Alternating Optimization • Used when model parameters are interdependent. • Divide the parameters into disjoint subsets: {A0,…,AN} = {A0,…,Ai} ∪ {Ai+1,…,AN}. • Optimize {A0,…,Ai} and {Ai+1,…,AN} alternately and iteratively. • Convergence to the global optimum is generally not guaranteed; • solutions converge to a local maximum/minimum only.
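A tiny sketch of the alternating scheme on a toy objective chosen for illustration (not from the slides): each parameter subset has a closed-form update when the other is held fixed.

```python
# Alternating optimization on the coupled quadratic
#   f(a, b) = (a + 2b - 5)^2 + (2a + b - 4)^2,
# an assumed toy example whose joint optimum is (a, b) = (1, 2).
a, b = 0.0, 0.0
for _ in range(50):
    a = (13 - 4 * b) / 5   # argmin over a with b held fixed
    b = (14 - 4 * a) / 5   # argmin over b with a held fixed
print(round(a, 4), round(b, 4))  # converges to (1.0, 2.0)
```

On this convex toy problem AO reaches the global optimum; on the non-convex objectives AO is typically used for, only a local optimum is guaranteed, as the slide notes.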
AO Image Denoise Example • We also want to estimate the standard deviation σ of the Gaussian noise. • The model parameters become {I, σ}; the two disjoint subsets of parameters are {I} and {σ}. • The update rules: • solve for I given σ via the MRF model; • solve for σ given I in closed form (the ML estimate), σ² = (1/|P|) Σ_p (I(p) − I_N(p))².
AO Image Denoise Example • Effect of σ in image denoising (figure): optimal σ, large σ, small σ.
Expectation Maximization • The EM algorithm is a special case of AO consisting of an E-step and an M-step. • It solves for maximum-likelihood parameters. • Convergence (to a local maximum of the likelihood) is guaranteed. • Basic assumption: • observations are i.i.d.
EM-GMM Estimation Example • Observations: an incomplete data set (the component assignments are the 'missing' data). • Parameters: the probability distribution over the missing assignments and the Gaussian mixture parameters (weights, means, variances). • E-step: compute the responsibilities r_ik = P(component k | x_i) under the current parameters. • M-step: re-estimate the weights, means, and variances from the responsibilities.
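The E-step/M-step loop can be sketched for a 1-D two-component mixture; the data and the initialization below are illustrative assumptions:

```python
import numpy as np

# EM for a 1-D two-component Gaussian mixture. The samples (drawn around
# -2 and 3) and the initial parameters are assumed, illustrative values.
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(3, 1.0, 300)])

w = np.array([0.5, 0.5])    # mixture weights
mu = np.array([-1.0, 1.0])  # component means
var = np.array([1.0, 1.0])  # component variances

for _ in range(50):
    # E-step: responsibilities r[i, k] = P(component k | x_i).
    p = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = p / p.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, variances from responsibilities.
    nk = r.sum(axis=0)
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print(np.round(mu, 1))  # means close to the generating values -2 and 3
```

Each iteration provably does not decrease the data likelihood, which is the convergence guarantee mentioned on the previous slide.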
Belief Propagation • Solves discrete labeling problems on a pairwise Markov Random Field (MRF). • Basic formulation: minimize an energy of the form E(x) = Σ_p φ_p(x_p) + Σ_{(p,q)} ψ_{pq}(x_p, x_q), where φ are unary (data) terms and ψ are pairwise (smoothness) terms.
Belief Propagation • An iterative inference algorithm that propagates messages through the network. • Standard solver implementations are available.
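A minimal sketch of message passing: min-sum BP on a small chain MRF, where it is exact. The costs below (random unaries, a Potts pairwise term) are illustrative assumptions:

```python
import numpy as np

# Min-sum belief propagation on a 4-node chain MRF with 3 labels.
# unary[i, l] is the data cost of label l at node i; the pairwise term is
# a Potts model (0 if neighbors agree, 0.5 otherwise). Costs are assumed.
rng = np.random.default_rng(4)
n_nodes, n_labels = 4, 3
unary = rng.uniform(0, 1, (n_nodes, n_labels))
pairwise = 0.5 * (1 - np.eye(n_labels))

# Forward and backward message passes (exact on a chain/tree).
fwd = np.zeros((n_nodes, n_labels))  # message into node i from the left
bwd = np.zeros((n_nodes, n_labels))  # message into node i from the right
for i in range(1, n_nodes):
    fwd[i] = np.min(unary[i - 1] + fwd[i - 1] + pairwise.T, axis=1)
for i in range(n_nodes - 2, -1, -1):
    bwd[i] = np.min(unary[i + 1] + bwd[i + 1] + pairwise, axis=1)

# Beliefs are min-marginals; their per-node argmin gives the MAP labeling.
beliefs = unary + fwd + bwd
labels = beliefs.argmin(axis=1)
print(labels)
```

On graphs with loops (e.g., an image grid), the same updates are iterated until the messages stabilize ("loopy" BP), which gives approximate rather than exact inference.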
Bayesian Model Summary • The definitions of the likelihood and prior probabilities are unique to each problem. • To apply the Bayesian model in your research, you need to: • Identify the problem and define your goal. • Understand what observations you can obtain from the problem and what assumptions you can make about it. • Define variables. • Write down the Bayesian equation based on the causal relationships between variables. • Define the likelihood probabilities from observations and the prior probabilities from assumptions. • Understand the nature of your objective function and choose a proper method to solve the problem. • Evaluate the results. If the results differ from your expectations, you may need to reformulate the problem or double-check the implementation details.
Tips for Bayes': make your life easier • Express everything you know as probabilities. • Use Gaussians everywhere, perhaps a mixture of them. • Learn from examples when you have them. • Hack a noise model when you don't know one. • Leave terms constant when desperate. • Use Ax = b when the equations are quadratic. • Use alternating optimization when the parameters are mixed. • Use an MRF when the problem looks like "segmentation". • Work in the log domain, where everything is additive. • Then find the maximum.