PDE methods for Image Segmentation and Shape Analysis: From the Brain to the Prostate and Back
Presented by John Melonakos – NAMIC Core 1 Workshop – 30/May/2007
Outline • Bhattacharyya Segmentation • Segmentation Results --------------------------------------- • Shape Analysis • Shape-Driven Segmentation
Contributors • Georgia Tech: Yogesh Rathi, Sam Dambreville, Oleg Michailovich, Jimi Malcolm, Allen Tannenbaum
Publications • S. Dambreville, Y. Rathi, and A. Tannenbaum. A framework for Image Segmentation using Shape Models and Kernel Space Shape Priors. IEEE Transactions on Pattern Analysis and Machine Intelligence, (to appear 2007). • O. Michailovich, Y. Rathi, and A. Tannenbaum. Image Segmentation using Active Contours Driven by Information-Based Criteria. IEEE Transactions on Image Processing, (to appear 2007). • Y. Rathi, O. Michailovich, and A. Tannenbaum. Segmenting Images on the Tensor Manifold. In CVPR, 2007. • E. Pichon, A. Tannenbaum, and R. Kikinis. A statistically based surface evolution method for medical image segmentation: presentation and validation. In International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI), volume 2, pages 711-720, 2003. Note: Best student presentation in image segmentation award. • Y. Rathi, O. Michailovich, J. Malcolm, and A. Tannenbaum. Seeing the Unseen: Segmenting with Distributions. In Intl. Conf. Signal and Image Processing, 2006. • J. Malcolm, Y. Rathi, and A. Tannenbaum. Graph Cut Segmentation with Nonlinear Shape Priors. In Intl. Conf. Signal and Image Processing, 2007.
Segmentation Hierarchy • Threshold-based • Edge-based • Region-based: parametric methods (parameterized representation of the curve/shape) or implicit methods (implicit representation of the curve using level sets)
Geometric Active Contours • The contour evolution is derived from the image using the calculus of variations. [Figure: image driving an evolving contour]
Our Contributions • Segmentation by separating intensity-based probability distributions (not just intensity moments, as in previous work). • A novel formulation of the Bhattacharyya distance in the level set framework, so as to optimally separate the regions inside and outside the evolving contour.
Bhattacharyya Distance • The Bhattacharyya distance gives a measure of similarity between two distributions:
B = ∫_Z √( P_in(z) P_out(z) ) dz
where z ∈ Z is any photometric variable, such as intensity, a color vector, or a tensor. • B can also be thought of as the cosine of the angle between the two vectors √P_in and √P_out.
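As a concrete illustration, here is a minimal Python sketch (not the authors' code) of the measure B above, computed from intensity histograms inside and outside a contour mask; the image, mask, and bin count are placeholder values.

import numpy as np

def bhattacharyya(p_in, p_out):
    # B = sum_z sqrt(P_in(z) * P_out(z)); B = 1 for identical distributions,
    # and smaller values indicate better-separated distributions.
    p_in = p_in / p_in.sum()
    p_out = p_out / p_out.sum()
    return np.sum(np.sqrt(p_in * p_out))

# Placeholder image and contour mask, for illustration only.
image = np.random.rand(128, 128)
inside = np.zeros((128, 128), dtype=bool)
inside[32:96, 32:96] = True

bins = np.linspace(0.0, 1.0, 65)
h_in, _ = np.histogram(image[inside], bins=bins)
h_out, _ = np.histogram(image[~inside], bins=bins)
B = bhattacharyya(h_in.astype(float) + 1e-10, h_out.astype(float) + 1e-10)
print(B)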
Bhattacharyya Distance • Let x ∈ ℝ² specify the coordinates in the image plane and let I : ℝ² → Z be a mapping from the image plane to the space of the photometric variable Z. Then the pdf over an image region Ω is given by:
P(z) = ∫_Ω K( z − I(x) ) dx / ∫_Ω dx
This is the nonparametric density estimate of the pdf of z given the kernel K.
In the Level Set Framework • The pdfs written in terms of the level set function φ (here with φ < 0 inside the contour) are given by:
P_in(z) = ∫ K( z − I(x) ) H( −φ(x) ) dx / ∫ H( −φ(x) ) dx
P_out(z) = ∫ K( z − I(x) ) H( φ(x) ) dx / ∫ H( φ(x) ) dx
where H is the Heaviside function.
The First Variation • For segmentation purposes, we would like to minimize the Bhattacharyya distance B, i.e. to maximally separate the two distributions. This is achieved using the calculus of variations, by taking the first variation of B:
δB = (1/2) ∫_Z [ √( P_out(z) / P_in(z) ) δP_in(z) + √( P_in(z) / P_out(z) ) δP_out(z) ] dz
The First Variation (cont.) • The first variations of P_in and P_out with respect to the level set function are given by:
δP_in(z) = −(1/A_in) ∫ δ( φ(x) ) [ K( z − I(x) ) − P_in(z) ] δφ(x) dx
δP_out(z) = (1/A_out) ∫ δ( φ(x) ) [ K( z − I(x) ) − P_out(z) ] δφ(x) dx
where A_in = ∫ H( −φ(x) ) dx and A_out = ∫ H( φ(x) ) dx are the areas inside and outside the contour, and δ(·) is the Dirac delta.
Resulting PDE • Plugging in all the components, we get the following PDE (partial differential equation), a gradient descent on B that separates the distributions:
∂φ/∂t = δ( φ(x) ) V(x)
V(x) = (B/2) ( 1/A_out − 1/A_in ) + (1/2) ∫_Z K( z − I(x) ) [ (1/A_in) √( P_out(z) / P_in(z) ) − (1/A_out) √( P_in(z) / P_out(z) ) ] dz
Additional Terms • In numerical experiments, an additional regularizing term that penalizes the length of the contour, making it smooth, is added to the resulting PDE. Thus, the final PDE is given by:
∂φ/∂t = δ( φ ) [ V + α κ ]
where κ = div( ∇φ / |∇φ| ) is the curvature of the level sets and α > 0 weights the smoothness term.
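The sketch below illustrates one discrete update step of this flow in Python. It is not the authors' implementation: the pdfs are approximated by histograms, the kernel K degenerates to a histogram-bin lookup, and the time step, bin count, and smoothness weight are illustrative placeholders. Depending on the inside/outside sign convention chosen for φ, the sign of the speed V may need to be flipped.

import numpy as np

def curvature(phi):
    # Curvature of the level sets, div(grad(phi)/|grad(phi)|).
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx**2 + gy**2) + 1e-10
    ny_y, _ = np.gradient(gy / norm)   # d(n_y)/dy
    _, nx_x = np.gradient(gx / norm)   # d(n_x)/dx
    return nx_x + ny_y

def evolve_step(phi, image, bins, dt=0.5, alpha=0.2, eps=1.5):
    # One gradient-descent step on B plus curvature regularization.
    inside = phi < 0                                  # convention: phi < 0 inside
    a_in, a_out = float(inside.sum()), float((~inside).sum())
    p_in, _ = np.histogram(image[inside], bins=bins, density=True)
    p_out, _ = np.histogram(image[~inside], bins=bins, density=True)
    p_in, p_out = p_in + 1e-10, p_out + 1e-10
    dz = bins[1] - bins[0]
    B = np.sum(np.sqrt(p_in * p_out)) * dz            # Bhattacharyya measure B

    # Per-pixel speed V(x): look up the bin containing each pixel's intensity.
    idx = np.clip(np.digitize(image, bins) - 1, 0, len(p_in) - 1)
    L = np.sqrt(p_out / p_in) / a_in - np.sqrt(p_in / p_out) / a_out
    V = 0.5 * B * (1.0 / a_out - 1.0 / a_in) + 0.5 * L[idx]

    # Smoothed Dirac delta confines the update to a band around the zero level set.
    delta = (0.5 / eps) * (1.0 + np.cos(np.pi * np.clip(phi / eps, -1.0, 1.0)))
    return phi + dt * delta * (V + alpha * curvature(phi))

# Toy usage: a circular initial contour on a placeholder image.
image = np.random.rand(128, 128)
phi = np.fromfunction(lambda i, j: np.sqrt((i - 64)**2 + (j - 64)**2) - 30, (128, 128))
for _ in range(100):
    phi = evolve_step(phi, image, bins=np.linspace(0.0, 1.0, 65))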
Outline • Bhattacharyya Segmentation • Segmentation Results --------------------------------------- • Shape Analysis • Shape-Driven Segmentation
The Unseen! • Toy example: the regions inside and outside were obtained by sampling from Rayleigh distributions with the same mean and variance, so methods based only on intensity moments cannot separate them. [Figures: template image and generated image]
The Unseen! [Figures: starting, final, and actual intensity distributions]
Application to Tensors • Intensity alone is not enough to segment several types of images. • Diffusion Tensor MRI images have become common, where at each pixel a tensor is computed from a set of gradient measurements. [Figure: color-coded fractional anisotropy image]
Structure Tensors • The structure tensor reveals many image features such as edges, corners, and texture. • The structure tensor of a scalar-valued image I is given by: J = K * ( ∇I ∇Iᵀ ), where K is a Gaussian kernel. • The color structure tensor is obtained by summing the gradient outer products of the individual channels before smoothing: J = K * ( Σ_i ∇I_i ∇I_iᵀ ).
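A minimal Python sketch of the scalar structure tensor (the Gaussian-smoothed outer product of the image gradient); the two sigma values are illustrative, and the color version would simply sum the per-channel gradient products before smoothing.

import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor(image, sigma_grad=1.0, sigma_smooth=2.0):
    # Gradient of a slightly smoothed image.
    smoothed = gaussian_filter(image, sigma_grad)
    gy, gx = np.gradient(smoothed)
    # Smooth each product of gradient components with a Gaussian kernel K.
    jxx = gaussian_filter(gx * gx, sigma_smooth)
    jxy = gaussian_filter(gx * gy, sigma_smooth)
    jyy = gaussian_filter(gy * gy, sigma_smooth)
    # One 2x2 symmetric positive semi-definite tensor per pixel: shape (H, W, 2, 2).
    return np.stack([np.stack([jxx, jxy], axis=-1),
                     np.stack([jxy, jyy], axis=-1)], axis=-2)

J = structure_tensor(np.random.rand(64, 64))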
The Tensor Manifold • The space of n x n symmetric positive-definite matrices is not a vector space, but forms a manifold (a cone). • Many past methods (Wang-Vemuri, Lenglet et al.) have, however, assumed the tensor space to be Euclidean, and active contour based segmentation was performed under this assumption. [Figure: structure tensor space for a typical image]
Riemannian vs Euclidean Manifold • Euclidean (straight-line) distance: d(A,B) = d(A,C) = d(C,B) = d1 • Riemannian (geodesic) distance along the manifold: dr(A,B) = d(A,C) + d(C,B) = 2·d1 • Thus, under the Euclidean assumption one obtains erroneous estimates of the mean and variance used in many active contour based segmentation algorithms. [Figure: points A, B, C on the tensor manifold]
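To make the distinction concrete, the following sketch compares the Euclidean (Frobenius) distance with the affine-invariant Riemannian (geodesic) distance between two SPD tensors; the example matrices are arbitrary.

import numpy as np
from scipy.linalg import eigh

def euclidean_distance(A, B):
    return np.linalg.norm(A - B, ord='fro')

def riemannian_distance(A, B):
    # d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F = sqrt(sum_i log(lambda_i)^2),
    # where lambda_i are the generalized eigenvalues of (B, A).
    lam = eigh(B, A, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))

A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, 0.1], [0.1, 0.5]])
print(euclidean_distance(A, B), riemannian_distance(A, B))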
Basic Riemannian Geometry • The Log map takes a point Y on the manifold M to a vector Y′ = Log_x(Y) in the tangent space T_xM at x. • For the tensor manifold (a cone), T_xM is the set of all symmetric matrices and forms a vector space. [Figure: manifold M, tangent space T_xM, and the Log map]
Tensor Space • A recent method proposed by Lenglet et al. (2006) incorporates the Riemannian geometry of the tensor space and performs segmentation by assuming a Gaussian distribution for the object and background. • By using the Bhattacharyya distance and taking into account the Riemannian structure of the tensor manifold, we propose to extend the above segmentation technique to arbitrary, non-analytic probability distributions.
Segmentation Framework 1. Compute the mean on the Riemannian manifold. 2. Map all points onto the tangent space T_μM at the mean (a Euclidean space). 3. Compute target points or bins. 4. Perform curve evolution using the PDE described earlier. (Accepted for publication in IEEE CVPR 2007)
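Below is a hedged Python sketch of the first two steps of this framework: an iterative (Karcher) mean of SPD tensors under the affine-invariant metric, and the Log map that sends every tensor into the tangent space at that mean. The iteration count and example tensors are illustrative; the curve evolution then operates on these tangent-space (Euclidean) representations.

import numpy as np
from scipy.linalg import logm, expm, fractional_matrix_power

def log_map(X, Y):
    # Log_X(Y): project the SPD matrix Y onto the tangent space at X.
    Xh = fractional_matrix_power(X, 0.5)
    Xhi = fractional_matrix_power(X, -0.5)
    M = Xhi @ Y @ Xhi
    M = (M + M.T) / 2.0            # symmetrize against round-off
    return Xh @ logm(M) @ Xh

def exp_map(X, V):
    # Exp_X(V): map the tangent vector V at X back onto the manifold.
    Xh = fractional_matrix_power(X, 0.5)
    Xhi = fractional_matrix_power(X, -0.5)
    return Xh @ expm(Xhi @ V @ Xhi) @ Xh

def karcher_mean(tensors, n_iter=20):
    mean = tensors[0]
    for _ in range(n_iter):
        # Average the tangent vectors at the current mean, then step along them.
        V = sum(log_map(mean, T) for T in tensors) / len(tensors)
        mean = exp_map(mean, V)
    return mean

tensors = [np.array([[2.0, 0.2], [0.2, 1.0]]), np.array([[1.5, -0.1], [-0.1, 0.8]])]
mean = karcher_mean(tensors)
tangent_vectors = [log_map(mean, T) for T in tensors]   # inputs to the curve evolution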
Duck Segmentation using Bhattacharyya flow, but using Riemannian metric Segmentation using Bhattacharyya flow, but assuming Euclidean distance between tensors
Tiger Segmentation using Bhattacharyya flow, but using Riemannian metric Segmentation using Bhattacharyya flow, but assuming Euclidean distance between tensors
Butterfly Color structure tensor: Segmentation assuming Riemannian metric Segmentation assuming Euclidean metric
Segmentation Summary • No assumption on the distribution of the object or background. • Computationally very fast: we only need to update the probability distributions, instead of mapping each point in the image from the Riemannian manifold to the tangent space after every iteration to recompute the mean and variance (as a Gaussian assumption would require).
Outline • Bhattacharyya Segmentation • Segmentation Results --------------------------------------- • Shape Analysis • Shape-Driven Segmentation
Contributors • Georgia Tech: Delphine Nain, Xavier LeFaucheur, Yi Gao, Allen Tannenbaum
Publications • D. Nain, S. Haker, A. Tannenbaum. Multiscale 3D shape representation and segmentation using spherical wavelets. IEEE Trans. Medical Imaging, 26 (2007), pp. 598-618. • D. Nain, S. Haker, A. Bobick, and A. Tannenbaum. Shape-Driven 3D Segmentation using Spherical Wavelets. In Proceedings of MICCAI, Copenhagen, 2006. Note: Best Student Paper Award in the category Segmentation and Registration. • D. Nain, S. Haker, A. Bobick, and A. Tannenbaum. Multiscale 3D Shape Analysis using Spherical Wavelets. In Proceedings of MICCAI, Palm Springs, 2005.
Overview • 3D, parametric, data-driven prior [Figures: caudate nucleus dataset; prostate dataset]
Overview of ASM (Active Shape Models) • K shapes in the training set, N landmarks each
Limitations of ASM • The rank of the covariance matrix DDᵀ is at most K−1 • Small training set: only the first K−1 major modes of variation are captured by the shape prior • E.g., reconstruction, given a new shape s [Figures: ground truth vs. shape reconstructed with the ASM shape prior]
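A small numerical check of this rank limitation (toy data, not the caudate or prostate sets): with K training shapes, the centered data matrix D has rank at most K−1, so PCA on DDᵀ can yield at most K−1 modes regardless of the number of landmarks N.

import numpy as np

K, N = 10, 500                          # 10 shapes, 500 3D landmarks each (toy sizes)
shapes = np.random.rand(K, 3 * N)       # placeholder training shapes
D = shapes - shapes.mean(axis=0)        # center the data
print(np.linalg.matrix_rank(D @ D.T))   # at most K - 1 = 9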
Multi-scale prior • Hierarchical decomposition: shape is represented at different scales [Davatzikos03] • Learn variations at each scale
The Algorithm • Step 1: Find Landmarks • [Conformal Mapping] • Step 2: Multi-scale representation • [Spherical Wavelets] • Step 3: Find independent bands of variation • [Spectral Graph Partitioning]
[1] Shape Registration and Re-meshing • Nonlinear area-preserving mapping [Brechbuhler95] • Conformal mapping [Haker04]
[2] Spherical Wavelets • A function decomposed in wavelet space is uniquely described by a weighted sum of scaling functions and wavelet functions that are localized in space and scale. • Spherical scaling and wavelet functions are defined on a multi-resolution grid. [Figures: scaling function at level 0; wavelet functions at levels 1, 2, 3]
[2] Spherical Wavelets • Let x be a signal on the sphere, sampled at the mesh vertices. • Analysis: the wavelet transform maps x to scaling and wavelet coefficients γ. • Synthesis: the inverse transform reconstructs x from γ. • In matrix notation: γ = A x (analysis) and x = S γ (synthesis), with S = A⁻¹.
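The matrix notation can be sketched as follows in Python; note that A here is just a placeholder invertible matrix standing in for the actual lifted spherical wavelet analysis operator, so only the algebra (analysis, synthesis, perfect reconstruction) is illustrated.

import numpy as np

n = 8                                              # number of mesh vertices (toy size)
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n)) + n * np.eye(n)    # placeholder analysis matrix
S = np.linalg.inv(A)                               # synthesis matrix, S = A^{-1}

x = rng.standard_normal(n)                         # signal sampled at the vertices
gamma = A @ x                                      # analysis: scaling + wavelet coefficients
x_rec = S @ gamma                                  # synthesis: perfect reconstruction
assert np.allclose(x, x_rec)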
[2] Shape Representation • After the registration step, all shapes have the required multi-resolution mesh structure. • Given a shape S_k, we form three 1D signals (the x, y, and z coordinates of its vertices). • We take the wavelet transform of each signal and represent the shape by its coefficients: a weighted combination of the lowest-resolution scaling functions and of the wavelet functions up to the j-th resolution. [Figures: original shape and representations at resolutions j = 0, 1, 2, 3]
[2] Compression • At each scale, we can truncate the least significant coefficients based on a spectrum analysis of the population. • This results in local compression at each scale. • Example: compression from 2562 to 649 coefficients with mean error 5×10⁻³.
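A minimal sketch of the truncation step; for simplicity it keeps the largest-magnitude coefficients of a single signal, whereas the criterion described above is based on a spectrum analysis of the training population. The coefficient counts mirror the example figures quoted on the slide.

import numpy as np

def truncate_coefficients(gamma, keep):
    # Zero all but the `keep` largest-magnitude coefficients.
    out = np.zeros_like(gamma)
    idx = np.argsort(np.abs(gamma))[-keep:]
    out[idx] = gamma[idx]
    return out

gamma = np.random.randn(2562)                        # toy coefficient vector
gamma_compressed = truncate_coefficients(gamma, keep=649)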
[3] Scale-Space Prior • Previous approach: [Davatzikos03] • We propose a more principled approach: for each scale, we cluster highly correlated coefficients into a band, with the constraint that coefficients across bands have minimal cross-correlation.
Band Decomposition [Figures: covariance matrix of the level-1 coefficients (indices 1-6) and the same matrix rearranged (2, 3, 6, 1, 4, 5) to expose correlated bands]
Band Decomposition • Spectral graph partitioning technique [Shi00] • Fully connected graph G = (V, E), where the nodes V are the wavelet coefficients of a particular scale • Weight on each edge: w(i, j) is the covariance between coefficients i and j • Stopping criterion: validate whether the subdivided bands correspond to two independent distributions, based on the KL divergence
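A hedged sketch of a single normalized-cut split in the spirit of [Shi00]: nodes are the wavelet coefficients of one scale, edge weights are absolute covariances estimated over the training population, and the sign of the second generalized eigenvector of (D − W, D) defines the two bands. Recursive splitting and the KL-divergence stopping test are omitted, and the toy data are random.

import numpy as np
from scipy.linalg import eigh

def split_band(coeffs):
    # coeffs: (K shapes) x (n coefficients) matrix for one scale.
    W = np.abs(np.cov(coeffs, rowvar=False))      # affinity between coefficients
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    # Generalized eigenproblem (D - W) y = lambda D y; the eigenvector of the
    # second-smallest eigenvalue (Fiedler vector) encodes the bipartition.
    _, vecs = eigh(D - W, D)
    fiedler = vecs[:, 1]
    return fiedler >= 0, fiedler < 0              # boolean masks for the two bands

coeffs = np.random.randn(20, 12)                  # toy population: 20 shapes, 12 coefficients
band_a, band_b = split_band(coeffs)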
Band Decomposition [Figure: color encodes the influence of each coefficient within a band: red (high), blue (none)]
Building the Prior • Assuming K shapes in the training set, for each band we obtain (K−1) eigenvectors • In total we have B(K−1) eigenvectors, where B is the number of bands
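A short sketch of how such a prior could be assembled (illustrative only, with random toy data): PCA is run independently on the coefficients of each band, and each band contributes its mean plus at most K−1 eigenvectors.

import numpy as np

def per_band_prior(coeffs, bands):
    # coeffs: K x n matrix of wavelet coefficients; bands: list of boolean masks.
    prior = []
    for band in bands:
        X = coeffs[:, band]
        Xc = X - X.mean(axis=0)
        # Right singular vectors of the centered data are the eigenvectors of the
        # band covariance; at most K-1 of them have nonzero singular values.
        _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        prior.append((X.mean(axis=0), Vt[s > 1e-10]))
    return prior

K, n = 20, 12
coeffs = np.random.randn(K, n)
bands = [np.arange(n) < 6, np.arange(n) >= 6]     # toy decomposition into two bands
prior = per_band_prior(coeffs, bands)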