Mean-Field Theory and Its Applications in Computer Vision: 5. Global Co-occurrence Terms
Global Co-occurrence Terms • Encourages global consistency and co-occurrence of objects (figure: segmentations without vs. with co-occurrence)
Global Co-occurrence Terms • Defined on subsets of labels • Associates a cost with each possible subset
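To make the subset-to-cost association concrete, it can be thought of as a lookup table; a minimal sketch, where the label names and cost values are purely hypothetical and not taken from the paper:

```python
# Hypothetical label subsets and their co-occurrence costs (illustrative
# values only, not the paper's learned statistics)
COOC_COST = {
    frozenset(): 0.0,
    frozenset({"sky"}): 0.2,
    frozenset({"sky", "road"}): 0.4,
    frozenset({"sky", "road", "car"}): 0.7,
    frozenset({"sky", "road", "cow"}): 3.0,  # unlikely combination: higher cost
}

def cost_of(labels_present):
    """Cost of the set of labels appearing together in one image."""
    return COOC_COST[frozenset(labels_present)]
```

An explicit table is exponential in the number of labels (2^L subsets), which is why the next slides move to a compact second-order representation.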
Properties of cost function • Non-decreasing (figure: example per-subset costs)
Properties of cost function We represent our cost as second order cost function defined on binary vector:
Complexity • Complexity: O(NL²) • Two relaxed approximations of this form • Complexity: O(NL + L²)
Our model • Represent the second-order cost by binary latent variables • Unary cost per latent variable (diagram: label-level variable nodes, each taking value 0/1)
Our model • Represent the second-order cost by binary latent variables • Pairwise cost between latent variables
Global Co-occurrence Cost • Two approximations to include it in the fully connected CRF
Global Co-occurrence Terms • First model
Global Co-occurrence Terms • Model
Global Co-occurrence Terms • Constraints (let's take one set of connections): If the latent variable is off, no image variable takes that label. If the latent variable is on, at least one image variable takes that label.
Global Co-occurrence Terms • Pay a cost K for violating the first constraint
Global Co-occurrence Terms • Pay a cost K for violating the second constraint
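The two penalties together can be sketched as a function of a pixel labelling x and the binary latent indicators y; charging K once per violating pixel for the first constraint is an assumption made here for illustration:

```python
import numpy as np

def violation_cost(x, y, K):
    """Penalty for violating the two latent-variable constraints.

    x: (N,) integer pixel labels
    y: (L,) binary latent indicators, y[l] = 1 iff label l is "on"
    K: violation penalty (sketch: per-pixel charging is an assumption)
    """
    L = len(y)
    counts = np.bincount(x, minlength=L)  # how many pixels take each label
    cost = 0.0
    for l in range(L):
        if y[l] == 0:
            cost += K * counts[l]   # pixels using a switched-off label
        elif counts[l] == 0:
            cost += K               # switched-on label used by no pixel
    return cost
```

A consistent labelling (every used label switched on, every switched-on label used) incurs zero penalty.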
Global Co-occurrence Terms • Cost for first model:
Global Co-occurrence Terms • Second model • Each latent node is connected to the variable nodes
Global Co-occurrence Terms • Constraints (let's take one set of connections): If the latent variable is off, no image variable takes that label. If the latent variable is on, at least one image variable takes that label.
Global Co-occurrence Terms • Pay a cost K for violating the constraint
Global Co-occurrence Terms • Cost for second model:
Global Co-occurrence Terms • Expectation evaluation for variable Y_l • Case 1: Y_l takes label 0
Global Co-occurrence Terms • Expectation evaluation for variable Y_l • Case 2: Y_l takes label 1
Global Co-occurrence Terms • Expectation evaluation for variable Y_l
Global Co-occurrence Terms • Latent variable updates:
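A sketch of the resulting mean-field update for the latent variables, with the expected costs written out from the two cases above; the exact expressions in the paper may differ:

```python
import numpy as np

def latent_update(Q, c, K):
    """One mean-field update for the binary latent variables Y_l.

    Q: (N, L) current pixel marginals Q_i(x_i = l)
    c: (L,) cost of switching label l on (assumed name)
    K: violation penalty

    Assumed expected-cost forms, following the two cases above:
      E[cost | Y_l = 0] = K * sum_i Q_i(l)          (pixels violating "off")
      E[cost | Y_l = 1] = c_l + K * prod_i (1 - Q_i(l))  ("on" but unused)
    Each is O(N) per label, so a full sweep over labels is O(NL).
    """
    e0 = K * Q.sum(axis=0)
    e1 = c + K * np.prod(1.0 - Q, axis=0)
    z0, z1 = np.exp(-e0), np.exp(-e1)
    Z = z0 + z1
    return z0 / Z, z1 / Z   # Q(Y_l = 0), Q(Y_l = 1) for every label l
```

Each label's update touches all N pixel marginals once, matching the complexity claim below: the latent variables add only O(NL + L²) work per iteration.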
Global Co-occurrence Terms • Pay a cost K if a variable takes label l and the corresponding latent variable takes label 0
Complexity • Expectation updates for latent variable Y_l • Overall complexity: O(NL + L²), so the latent variables do not increase the original complexity
PascalVOC-10 dataset • Qualitative analysis: we observe an improvement over the competing methods
PascalVOC-10 dataset • We observe an improvement of almost 2.3% • Almost 8-9 times faster than the alpha-expansion-based method
Mean-field vs. Graph-cuts • Measure I/U score on PascalVOC-10 segmentation • Increase the standard deviation for mean-field • Increase the window size for the graph-cuts method • Both achieve similar accuracy
Window sizes • Comparison on matched energy • Impact of adding more complex costs and increasing the window size
PascalVOC-10 dataset • Per-class quantitative results
Mean-field vs. Graph-cuts • Measure I/U score on PascalVOC-10 segmentation • Increase the standard deviation for mean-field • Increase the window size for the graph-cuts method • The time complexity becomes very high, making it infeasible to work with large neighbourhood systems