Improved Initialisation and Gaussian Mixture Pairwise Terms for Dense Random Fields with Mean-field Inference
Vibhav Vineet, Jonathan Warrell, Paul Sturgess, Philip H.S. Torr
http://cms.brookes.ac.uk/research/visiongroup/
Labelling problem
Assign a label to each image pixel. Example tasks: object detection, stereo, object segmentation.
Problem Formulation
Find the labelling that maximises the conditional probability, or equivalently minimises the corresponding energy function.
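For reference, a minimal sketch of the standard dense CRF objective this refers to (notation assumed, following Krähenbühl and Koltun):

```latex
P(\mathbf{x} \mid \mathbf{I}) = \frac{1}{Z(\mathbf{I})} \exp\big(-E(\mathbf{x})\big),
\qquad
E(\mathbf{x}) = \sum_i \psi_u(x_i) + \sum_{i<j} \psi_p(x_i, x_j)
```

Maximising the posterior P is equivalent to minimising the energy E; in a dense CRF the pairwise sum runs over all pixel pairs.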
Problem Formulation
• Grid CRF construction: inference leads to over-smoothing around object boundaries
• Dense CRF construction: inference is able to recover fine boundaries
Inference in Dense CRF
• Every pixel is connected to every other pixel, so inference has very high time complexity
• Graph-cuts based methods are not feasible
Inference in Dense CRF
• Filter-based mean-field inference takes 0.2 secs per image*
• Efficient inference rests on two assumptions:
• Mean-field approximation to the CRF
• Pairwise potentials take the form of Gaussian kernels
*Krähenbühl and Koltun. Efficient Inference in Fully Connected CRFs with Gaussian Edge Potentials, NIPS 2011
Efficient Inference in Dense CRF
• Mean-field methods (Jordan et al., 1999)
• Inference with the true distribution is intractable
• Approximate it with a distribution from a tractable family
Naïve Mean Field
• Mean-field approximation to the CRF
• Assume all variables are independent, so the approximating distribution Q factorises over pixels
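Concretely, the standard mean-field objective (notation assumed): restrict Q to fully factorised distributions and pick the one closest to P in KL divergence:

```latex
Q(\mathbf{x}) = \prod_i Q_i(x_i),
\qquad
Q^* = \arg\min_{Q \in \mathcal{Q}} \mathrm{KL}\big(Q \,\|\, P\big)
```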
Efficient Inference in Dense CRF
• Assume the pairwise weights form a mixture of Gaussian kernels: a spatial kernel and a bilateral kernel
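The two kernels from Krähenbühl and Koltun, with positions p_i, colours I_i, and bandwidths θ_α, θ_β, θ_γ; the first term is the bilateral kernel, the second the spatial kernel:

```latex
k(\mathbf{f}_i, \mathbf{f}_j) =
w^{(1)} \exp\!\left(-\frac{\|p_i - p_j\|^2}{2\theta_\alpha^2}
                    -\frac{\|I_i - I_j\|^2}{2\theta_\beta^2}\right)
+ w^{(2)} \exp\!\left(-\frac{\|p_i - p_j\|^2}{2\theta_\gamma^2}\right)
```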
Marginal update
• The marginal update involves an expectation of the pairwise cost over the distribution Q, given that x_i takes label l
• The expensive message-passing step is solved using a highly efficient permutohedral-lattice-based filtering approach (see the sketch below)
• MPM (maximum posterior marginal) with the approximate distribution gives the final labelling
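A minimal sketch of one filter-based mean-field update. A plain Gaussian blur stands in for the permutohedral-lattice filter (a real implementation filters in the full bilateral feature space); all names here are hypothetical:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def meanfield_update(unary, Q, compat, sigma=3.0):
    """One mean-field iteration for a dense CRF with a single
    spatial Gaussian kernel (illustrative stand-in only).

    unary:  (H, W, L) negative log unary potentials
    Q:      (H, W, L) current marginals
    compat: (L, L) label compatibility (e.g. Potts)
    """
    H, W, L = Q.shape
    # Message passing: convolve each label's marginals with the kernel.
    # The permutohedral lattice performs this step in the bilateral
    # feature space; real implementations also subtract the self-term.
    msg = np.stack([gaussian_filter(Q[..., l], sigma) for l in range(L)], axis=-1)
    # Compatibility transform: mix messages across labels.
    pairwise = msg @ compat.T
    # Local update and normalisation.
    logits = -unary - pairwise
    logits -= logits.max(axis=-1, keepdims=True)
    Q_new = np.exp(logits)
    return Q_new / Q_new.sum(axis=-1, keepdims=True)
```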
Q distribution
Figure: Q distributions for different classes across mean-field iterations (iterations 0, 1, 2, and 10).
Two issues associated with the method • Sensitive to initialisation • Restrictive Gaussian pairwise weights
Our Contributions
Resolve two issues associated with the method:
• Sensitivity to initialisation: propose a SIFT-flow based initialisation method
• Restrictive Gaussian pairwise weights: an expectation maximisation (EM) based strategy to learn a more general Gaussian mixture model
Sensitivity to initialisation
• Experiments on the PascalVOC-10 segmentation dataset show that a good initialisation can lead to a better solution
• We therefore propose a SIFT-flow based initialisation method
SIFT-flow based correspondence
Given a test image, we first retrieve a set of nearest neighbours from the training set using GIST features.
Figure: the test image and the nearest neighbours retrieved from the training set.
SIFT-flow based correspondence
The K nearest neighbours are warped to the test image.
Figure: the warped nearest neighbours and their corresponding flow energies.
SIFT-flow based correspondence
Pick the best nearest neighbour based on the flow value.
Figure: the test image, its nearest neighbour, and the warped image (flow: 13.31).
Label transfer
Warp the ground truth of the best nearest neighbour according to the correspondence, and transfer its labels to the test image using the flow.
Figure: ground truth of the test image, ground truth of the best nearest neighbour, and the ground truth warped according to the flow.
SIFT-flow based initialisation
• Rescore the unary potential using the transferred labels (one possible form is sketched below)
Figure: qualitative improvement in accuracy after using the rescored unary potential (test image, ground truth, outputs without and with rescoring).
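One plausible form of this rescoring, shown only as an illustration (λ and the indicator construction are assumptions, not taken from the paper): reduce the unary cost of each pixel's transferred label l̃_i:

```latex
\tilde{\psi}_u(x_i = l) = \psi_u(x_i = l) - \lambda \,\big[\, l = \tilde{l}_i \,\big]
```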
SIFT-flow based initialisation
• Initialise the mean-field solution from the transferred labels (see the sketch below)
Figure: qualitative improvement in accuracy after initialising the mean-field (test image, ground truth, outputs without and with initialisation).
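A minimal sketch of such an initialisation, assuming a transferred label map; the smoothing weight eps and all names are hypothetical choices, not taken from the paper:

```python
import numpy as np

def init_marginals(transferred_labels, num_labels, eps=0.1):
    """Initialise mean-field marginals Q from a transferred label map.

    transferred_labels: (H, W) integer labels warped from the best
                        nearest neighbour's ground truth
    Returns (H, W, L) marginals: near one-hot, smoothed towards uniform.
    """
    H, W = transferred_labels.shape
    Q = np.full((H, W, num_labels), eps / num_labels)
    onehot = np.eye(num_labels)[transferred_labels]  # (H, W, L)
    Q += (1.0 - eps) * onehot
    return Q / Q.sum(axis=-1, keepdims=True)
```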
Gaussian pairwise weights
The baseline mixture of Gaussians (spatial and bilateral kernels) is restrictive:
• Zero mean
• The same Gaussian mixture model for every label pair
• Arbitrary standard deviation
Our approach
Incorporate a more general Gaussian mixture model.
Gaussian pairwise weights
In our model we:
• Learn an arbitrary mean
• Learn the standard deviation
• Learn the mixing coefficients
• Learn a different Gaussian mixture for each label pair
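In symbols, the resulting kernel for label pair (l, l') might be written as follows (a sketch with assumed notation; the paper defines the exact parameterisation):

```latex
k_{ll'}(\mathbf{f}_i - \mathbf{f}_j) =
\sum_{m=1}^{M} c_m^{(ll')}\,
\mathcal{N}\!\big(\mathbf{f}_i - \mathbf{f}_j \;;\; \mu_m^{(ll')}, \Sigma_m^{(ll')}\big)
```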
Learning mixture model
We propose a piecewise learning framework:
• First learn the parameters of the unary potential
• Learn the label compatibility function
• Set the Gaussian model following Krähenbühl et al.
• Learn the parameters of the Gaussian mixture
• Lambda is set through cross-validation
Our model
• Generative training
• Maximise the joint likelihood of each pair of labels and the corresponding features
• A latent variable indexes the mixture component; M denotes the number of mixture components
Learning mixture model
• Maximise the log-likelihood function
• Expectation maximisation (EM) based method (a sketch follows below)
Figure: our learnt mixture model compared with the zero-mean Gaussian.
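A minimal sketch of this step using off-the-shelf EM (scikit-learn's GaussianMixture), fitting the feature offsets f_i − f_j for a single label pair. The sampling scheme, feature choice, and all names are assumptions, not the paper's implementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def learn_pairwise_gmm(features, labels, l1, l2, n_components=3, n_samples=10000):
    """Fit a Gaussian mixture to feature offsets f_i - f_j for pixel
    pairs whose ground-truth labels are (l1, l2).

    features: (N, D) per-pixel features (e.g. position + colour)
    labels:   (N,) ground-truth labels
    """
    rng = np.random.default_rng(0)
    idx1 = np.flatnonzero(labels == l1)
    idx2 = np.flatnonzero(labels == l2)
    # Subsample pixel pairs; enumerating all pairs is infeasible.
    i = rng.choice(idx1, size=n_samples)
    j = rng.choice(idx2, size=n_samples)
    offsets = features[i] - features[j]
    # EM alternates soft component assignments (E-step) with
    # parameter re-estimation (M-step) until convergence.
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    gmm.fit(offsets)
    return gmm  # means_, covariances_, weights_: mean, std, mixing coefficients
```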
Inference with mixture model
• Involves evaluating M extra Gaussian terms
• Blurring is performed on mean-shifted points (see below)
• Increases the time complexity
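Why mean-shifting suffices (a sketch, notation assumed): a Gaussian with non-zero mean equals a zero-mean Gaussian evaluated on shifted points, so each mixture component can reuse the same zero-mean filter:

```latex
\sum_j \mathcal{N}\big(\mathbf{f}_i - \mathbf{f}_j \,;\, \mu_m, \Sigma_m\big)\, Q_j(l')
= \sum_j \mathcal{N}\big((\mathbf{f}_i - \mu_m) - \mathbf{f}_j \,;\, 0, \Sigma_m\big)\, Q_j(l')
```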
Experiments on PascalVOC-10
Qualitative results of the SIFT-flow method.
Figure: image, warped nearest ground truth, output without SIFT-flow, and output with SIFT-flow.
Experiments on PascalVOC-10
Quantitative results on the PascalVOC-10 segmentation dataset:
• Our model with only unary and pairwise terms achieves better accuracy than other, more complex models
• Generally achieves very high efficiency compared to other methods
Experiments on PascalVOC-10
Qualitative results on the PascalVOC-10 segmentation dataset.
Figure: image, alpha-expansion, Dense CRF, and our output.
• Able to recover missing object parts
Experiments on CamVid
Quantitative results on the CamVid dataset:
• Our model with only unary and pairwise terms achieves better accuracy than other, more complex models
• Generally achieves very high efficiency compared to other methods
Experiments on CamVid
Qualitative results on the CamVid dataset.
Figure: image, alpha-expansion, and our output.
• Able to recover missing object parts
Conclusion
• Filter-based mean-field inference promises high efficiency and accuracy
• We proposed methods to robustify the basic mean-field method:
• A SIFT-flow based method for better initialisation
• An EM-based algorithm for learning a general Gaussian mixture model
• More complex higher-order models can be incorporated into the pairwise model