Mean field approximation for CRF inference
CRF Inference Problem • CRF over variables: $X = \{X_1, \dots, X_N\}$, each $X_i$ taking a value in a label set $\mathcal{L}$ • CRF distribution: $P(X) = \frac{1}{Z} \prod_c \phi_c(X_c) = \frac{1}{Z} \exp\{-E(X)\}$, with partition function $Z = \sum_X \prod_c \phi_c(X_c)$ • MAP inference: $X^* = \arg\max_X P(X)$, the jointly most probable labeling • MPM (maximum posterior marginals) inference: $x_i^* = \arg\max_{x_i} P(X_i = x_i)$, maximizing each single-variable marginal (a brute-force toy example follows)
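A minimal brute-force sketch (not from the slides) contrasting the two inference problems on a toy 3-variable chain CRF with binary labels; the potentials and sizes are made up for illustration:

import itertools
import numpy as np

unary = np.array([[0.0, 1.0],     # unary energies (-log potentials), shape (N, L)
                  [1.0, 0.0],
                  [0.0, 0.2]])
pairwise = np.array([[0.0, 1.0],  # Potts-like pairwise energies, shape (L, L)
                     [1.0, 0.0]])

def energy(x):
    # E(x) = sum of unary terms plus pairwise terms along the chain
    e = sum(unary[i, xi] for i, xi in enumerate(x))
    e += sum(pairwise[x[i], x[i + 1]] for i in range(len(x) - 1))
    return e

states = list(itertools.product([0, 1], repeat=3))
probs = np.array([np.exp(-energy(x)) for x in states])
probs /= probs.sum()                          # exact P(X) via the partition function

map_state = states[int(np.argmax(probs))]     # MAP: jointly most probable labeling

marginals = np.zeros((3, 2))                  # MPM: maximize each marginal P(X_i)
for p, x in zip(probs, states):
    for i, xi in enumerate(x):
        marginals[i, xi] += p
mpm_state = tuple(int(np.argmax(marginals[i])) for i in range(3))

print("MAP:", map_state, "MPM:", mpm_state)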
Other notation • Unnormalized distribution: $\tilde{P}(X) = \prod_c \phi_c(X_c)$, so that $P(X) = \tilde{P}(X)/Z$ • Variational distribution: $Q(X)$, the tractable approximation to $P(X)$ • Expectation: $E_Q[f] = \sum_X Q(X) f(X)$ • Entropy: $H(Q) = -\sum_X Q(X) \ln Q(X)$
Variational Inference • Inference as optimization: choose the $Q$ that minimizes the KL-divergence $KL(Q \,\|\, P) = \sum_X Q(X) \ln \frac{Q(X)}{P(X)}$ • General objective function: maximizing the energy functional $F[\tilde{P}, Q] = E_Q[\ln \tilde{P}(X)] + H(Q)$ is equivalent to minimizing $KL(Q \,\|\, P)$ (derivation below)
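The equivalence claimed on this slide follows from one line of algebra; a standard derivation, reconstructed rather than copied from the slides:

KL(Q \,\|\, P) = \sum_X Q(X) \ln \frac{Q(X)}{P(X)}
             = \sum_X Q(X) \ln Q(X) - \sum_X Q(X) \ln \frac{\tilde{P}(X)}{Z}
             = -H(Q) - E_Q[\ln \tilde{P}(X)] + \ln Z
             = \ln Z - F[\tilde{P}, Q]

Since $\ln Z$ does not depend on $Q$, minimizing the KL-divergence over $Q$ is exactly maximizing $F[\tilde{P}, Q]$.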
Mean field approximation • Variational distribution restricted to a product of independent marginals: $Q(X) = \prod_i Q_i(X_i)$ • Expectations factorize: $E_Q[\ln \phi_c(X_c)] = \sum_{x_c} \big(\prod_{i \in c} Q_i(x_i)\big) \ln \phi_c(x_c)$ • Entropy decomposes over variables: $H(Q) = \sum_i H(Q_i)$
Mean field objective • Objective: maximize $F[\tilde{P}, Q] = \sum_c E_Q[\ln \phi_c(X_c)] + \sum_i H(Q_i)$ subject to each $Q_i$ being a valid distribution ($Q_i(x_i) \ge 0$, $\sum_{x_i} Q_i(x_i) = 1$); the pairwise special case is spelled out below
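As a concrete instance (the standard pairwise special case, not copied from the slides), for a CRF with unary energies $\psi_i$ and pairwise energies $\psi_{ij}$ the mean field objective reads

F[\tilde{P}, Q] = -\sum_i \sum_{x_i} Q_i(x_i)\, \psi_i(x_i)
                - \sum_{(i,j)} \sum_{x_i, x_j} Q_i(x_i)\, Q_j(x_j)\, \psi_{ij}(x_i, x_j)
                - \sum_i \sum_{x_i} Q_i(x_i) \ln Q_i(x_i),

i.e. the negative expected energy under the factored $Q$ plus its entropy.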
Local optimality conditions • Lagrangian: add a multiplier $\lambda_i$ for each normalization constraint, $\mathcal{L} = F[\tilde{P}, Q] + \sum_i \lambda_i \big(\sum_{x_i} Q_i(x_i) - 1\big)$ • Setting the derivatives with respect to each $Q_i(x_i)$ to 0 gives the condition for local optimality: $Q_i(x_i) \propto \exp\big\{\sum_{c \ni i} E_Q[\ln \phi_c(X_c) \mid X_i = x_i]\big\}$ (derivation sketch below)
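Sketch of the derivative step (standard mean field algebra, reconstructed rather than copied from the slides):

\frac{\partial \mathcal{L}}{\partial Q_i(x_i)}
  = \sum_{c \ni i} E_Q[\ln \phi_c(X_c) \mid X_i = x_i] - \ln Q_i(x_i) - 1 + \lambda_i = 0

Solving gives $Q_i(x_i) = e^{\lambda_i - 1} \exp\big\{\sum_{c \ni i} E_Q[\ln \phi_c(X_c) \mid X_i = x_i]\big\}$, with $\lambda_i$ fixed by normalization; this is exactly the update applied in the coordinate ascent algorithm on the next slide.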
Coordinate ascent • Sequential coordinate ascent: 1. initialize all $Q_i$ to the uniform distribution; 2. for i = 1...N, update the vector $Q_i$ using the expectations over all cliques involving $X_i$, while keeping every other $Q_j$, j != i, fixed; 3. repeat until convergence • Parallel updates algorithm: as above, but perform the updates in step 2 for all $Q_i$ simultaneously, generating a sequence of full estimates $Q^1, Q^2, \dots$ (a Python sketch of both schedules follows)
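A minimal Python sketch of these updates for a pairwise CRF, matching the fixed-point condition above; the graph representation, array shapes, and shared pairwise table are illustrative choices, not code from the slides:

import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def mean_field(unary, edges, pairwise, n_iters=20, parallel=False):
    # unary:    (N, L) array of unary energies (-log potentials)
    # edges:    list of (i, j) index pairs
    # pairwise: (L, L) array of pairwise energies shared by all edges
    # returns:  (N, L) array of approximate marginals Q
    N, L = unary.shape
    neighbours = {i: [] for i in range(N)}
    for i, j in edges:
        neighbours[i].append(j)
        neighbours[j].append(i)

    Q = np.full((N, L), 1.0 / L)              # step 1: uniform initialization
    for _ in range(n_iters):
        # the parallel schedule reads a frozen copy of Q; the sequential one reads
        # Q in place, so later variables already see the updates made this sweep
        source = Q.copy() if parallel else Q
        for i in range(N):                    # step 2: update each Q_i
            expected_energy = unary[i].copy()
            for j in neighbours[i]:
                # sum_{x_j} pairwise(x_i, x_j) * Q_j(x_j)
                expected_energy += pairwise @ source[j]
            Q[i] = softmax(-expected_energy)  # Q_i(x_i) ∝ exp{-E_Q[energy | x_i]}
    return Q

For example, mean_field(unary, [(0, 1), (1, 2)], pairwise) runs the sequential schedule on the 3-node chain from the toy example above; passing parallel=True produces the $Q^1, Q^2, \dots$ sequence of the parallel scheme.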
Comparison with belief propagation • Objective: belief propagation can also be viewed as optimizing an energy functional, but over cluster and sepset beliefs rather than a fully factored $Q$ • Factored energy functional: $\tilde{F}[\tilde{P}, Q] = \sum_i E_{\beta_i}[\ln \psi_i] + \sum_i H(\beta_i) - \sum_{(i,j)} H(\mu_{i,j})$, with cluster beliefs $\beta_i$ and sepset beliefs $\mu_{i,j}$ • Local polytope: the beliefs are only constrained to be non-negative, normalized, and locally consistent ($\sum_{C_i \setminus S_{i,j}} \beta_i = \mu_{i,j}$), not to be marginals of one joint distribution
Comparison with belief propagation • Message updates: $\delta_{i \to j}(S_{i,j}) \propto \sum_{C_i \setminus S_{i,j}} \psi_i \prod_{k \in N_i \setminus \{j\}} \delta_{k \to i}$ • Extracting beliefs (after convergence): $\beta_i(C_i) \propto \psi_i \prod_{k \in N_i} \delta_{k \to i}$, with sepset beliefs $\mu_{i,j} = \sum_{C_i \setminus S_{i,j}} \beta_i$ (a pairwise sum-product sketch follows)
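For comparison, a small sum-product sketch on a pairwise model with the same toy conventions (energies, one shared symmetric pairwise table); this mirrors standard loopy BP and is not code from the slides:

import numpy as np

def loopy_bp(unary, edges, pairwise, n_iters=20):
    N, L = unary.shape
    phi_u = np.exp(-unary)                  # node potentials
    phi_p = np.exp(-pairwise)               # shared (symmetric) edge potential table
    directed = [(i, j) for i, j in edges] + [(j, i) for i, j in edges]
    msgs = {e: np.ones(L) / L for e in directed}

    for _ in range(n_iters):
        new_msgs = {}
        for (i, j) in directed:
            # product of the node potential and all incoming messages except j's
            incoming = phi_u[i].copy()
            for (k, t) in directed:
                if t == i and k != j:
                    incoming *= msgs[(k, i)]
            m = phi_p.T @ incoming          # sum over x_i of phi_p[x_i, x_j] * incoming[x_i]
            new_msgs[(i, j)] = m / m.sum()
        msgs = new_msgs

    beliefs = phi_u.copy()                  # extract node beliefs after convergence
    for (k, t) in directed:
        beliefs[t] *= msgs[(k, t)]
    return beliefs / beliefs.sum(axis=1, keepdims=True)

Unlike mean field, the quantities passed around are messages over shared variables (sepsets) rather than updates to independent marginals, which reflects the difference between the two objectives on this slide.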
Comparison with belief propagation • On the Bethe cluster graph the factored energy functional reduces to the (negative) Bethe free energy for pairwise graphs: $\tilde{F}[\tilde{P}, Q] = \sum_{(i,j)} E_{\beta_{ij}}[\ln \psi_{ij}] + \sum_i E_{\mu_i}[\ln \psi_i] + \sum_{(i,j)} H(\beta_{ij}) - \sum_i (d_i - 1) H(\mu_i)$, where $d_i$ is the number of factors containing $X_i$ • Bethe cluster graphs — General: one large cluster per factor plus one singleton cluster per variable, each factor cluster connected to the singletons in its scope; Pairwise: the factor clusters are just the edge potentials over pairs $\{X_i, X_j\}$
Mean field updates • Updates in the fully connected (dense) CRF of Krähenbühl & Koltun (NIPS '11): $Q_i(x_i = l) \propto \exp\big\{-\psi_u(x_i) - \sum_{l'} \mu(l, l') \sum_m w^{(m)} \sum_{j \ne i} k^{(m)}(f_i, f_j)\, Q_j(l')\big\}$ • Evaluate using filtering: the inner sums $\sum_{j \ne i} k^{(m)}(f_i, f_j)\, Q_j(l')$ are Gaussian convolutions in feature space, so each parallel update can be computed with efficient high-dimensional (permutohedral lattice) filtering (a simplified sketch follows)
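A deliberately simplified sketch of the dense-CRF parallel update that replaces the paper's permutohedral bilateral filtering with a plain spatial Gaussian blur (scipy.ndimage.gaussian_filter); the single smoothness kernel, shapes, and parameter names are illustrative assumptions:

import numpy as np
from scipy.ndimage import gaussian_filter

def dense_crf_mean_field(unary, n_iters=5, sigma=3.0, w=1.0, compat=None):
    # unary: (H, W, L) array of unary energies psi_u; returns Q of the same shape
    H, W, L = unary.shape
    if compat is None:
        compat = 1.0 - np.eye(L)                  # Potts label compatibility mu(l, l')
    Q = np.exp(-unary)
    Q /= Q.sum(axis=2, keepdims=True)             # initialize from the unaries

    for _ in range(n_iters):
        # message passing: per-label Gaussian filtering of Q, minus Q_i itself (j != i)
        filtered = np.stack(
            [gaussian_filter(Q[:, :, l], sigma=sigma) - Q[:, :, l] for l in range(L)],
            axis=2)
        pairwise = w * (filtered @ compat.T)      # compatibility transform over labels
        Q = np.exp(-unary - pairwise)             # local update and exponentiation
        Q /= Q.sum(axis=2, keepdims=True)         # normalize per pixel
    return Q

The point of the filtering trick is visible here: each iteration costs a handful of blurs, independent of the quadratic number of pairwise connections.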
Higher-order potentials • Pattern-based potentials: a clique potential that assigns low energy to a small set of preferred labelings (patterns) of the clique and a constant, higher energy to all other labelings • P^n-Potts potentials: the special case whose patterns are the all-equal labelings, $\psi_c(x_c) = \gamma_k$ if $x_i = k$ for every $i \in c$, and $\gamma_{\max}$ otherwise (see the sketch below)
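A tiny illustration of a P^n-Potts clique energy (function and parameter names are made up): the clique pays a low, label-dependent energy only when every variable agrees, and a constant penalty otherwise:

import numpy as np

def pn_potts_energy(clique_labels, gamma, gamma_max):
    # clique_labels: 1-D int array of labels in the clique
    # gamma:         per-label energies gamma_k; gamma_max: penalty for mixed cliques
    labels = np.asarray(clique_labels)
    if np.all(labels == labels[0]):
        return gamma[labels[0]]
    return gamma_max

# e.g. pn_potts_energy([2, 2, 2], gamma=np.zeros(5), gamma_max=10.0) -> 0.0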
Higher-order potentials • Co-occurrence potentials: a global term $C(L(X))$ that penalizes unlikely combinations of labels appearing together in a single labeling • L(X) = set of labels present in X • {Y_1,...,Y_L} = set of binary latent variables, one per label, with $Y_l = 1$ indicating that label $l$ occurs somewhere in $X$ (a small example follows)
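A minimal illustration of how the latent presence indicators can be used (the pairwise cost matrix is made up): derive $Y_l$ from a labeling X and charge a cost for every pair of labels that co-occur:

import numpy as np

def cooccurrence_energy(X, pair_cost):
    # X: int array of labels; pair_cost: (L, L) symmetric matrix of co-occurrence costs
    L = pair_cost.shape[0]
    Y = np.zeros(L, dtype=bool)
    Y[np.unique(X)] = True                   # Y_l = 1 iff label l appears in X
    present = np.flatnonzero(Y)
    return sum(pair_cost[a, b]
               for i, a in enumerate(present) for b in present[i + 1:])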