Knowledge Based 3D Medical Image Segmentation
Tina Kapur
MIT Artificial Intelligence Laboratory
http://www.ai.mit.edu/~tkapur
tkapur@ai.mit.edu
Outline • Goal of Segmentation • Applications • Why is segmentation difficult? • My method for segmentation of MRI • Future Work
The Goal of Segmentation
Applications of Segmentation • Image Guided Surgery • Surgical Simulation • Neuroscience Studies • Therapy Evaluation
Limitations of Manual Segmentation • slow (up to 60 hours per scan) • variable (up to 15% between experts) [Warfield 95, Kaus 98]
The Automatic Segmentation Challenge An automated segmentation method needs to reconcile • Gray-level appearance of tissue • Characteristics of imaging modality • Geometry of anatomy
How to Segment? i.e. Issues in Segmentation of Anatomy
• Tissue Intensity Models: Parametric [Vannier], Non-Parametric [Gerig], Point Distribution Models [Cootes], Texture [Mumford]
• Imaging Modality Models: MRI inhomogeneity [Wells]
• Anatomy Models (Shape, Geometric/Spatial): PCA [Cootes and Taylor, Gerig, Duncan, Martin], Landmark Based [Evans], Atlas [Warfield]
Typical Pipeline for Segmentation of Brain MRI
• pre-processing (noise removal)
• intensity-based classification (EM Segmentation)
• post-processing (morphology or other)
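A minimal sketch of this three-stage pipeline in Python; the function name, the Gaussian smoothing, the nearest-level classifier, and the binary opening are illustrative assumptions standing in for the actual components, not the method described in this talk.

import numpy as np
from scipy import ndimage

def segment_brain_mri(image, num_classes=4):
    """Illustrative three-stage pipeline: denoise, classify, clean up."""
    # 1. Pre-processing: suppress noise (simple Gaussian smoothing here).
    denoised = ndimage.gaussian_filter(image, sigma=1.0)
    # 2. Intensity-based classification (a stand-in for EM segmentation):
    #    assign each voxel to the nearest of num_classes intensity levels.
    levels = np.linspace(denoised.min(), denoised.max(), num_classes)
    labels = np.argmin(np.abs(denoised[..., None] - levels), axis=-1)
    # 3. Post-processing: morphological opening of each label map to remove
    #    small isolated misclassified voxels.
    cleaned = np.zeros_like(labels)
    for k in range(num_classes):
        cleaned[ndimage.binary_opening(labels == k)] = k
    return cleaned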
Contributions of Thesis • Developed an integrated Bayesian segmentation method for MRI that incorporates de-noising and global geometric knowledge into EM-Segmentation via priors • Applied the integrated Bayesian method to segmentation of Brain and Knee MRI.
Contributions of Thesis • The Priors • de-noising: novel use of a Mean-Field Approximation to a Gibbs random field in conjunction with EM-Segmentation (EM-MF) • geometric: novel statistical description of global spatial relationships between structures, used as a spatially varying prior in EM-Segmentation
Background to My Work • Expectation-Maximization Algorithm • EM-Segmentation
Expectation-Maximization • Relevant Literature: • [Dempster, Laird, Rubin 1977] • [Neal 1998]
Expectation-Maximization (what?) • Search Algorithm • for Parameters of a Model • to Maximize Likelihood of Data • Data: some observed, some unobserved
Expectation-Maximization (how?) • Initial Guess of Model Parameters • Re-estimate Model Parameters: • E Step: compute the PDF for the hidden variables, given the observations and the current model parameters • M Step: compute ML model parameters assuming the PDF for the hidden variables is correct
Expectation-Maximization (how exactly?) • Notation • Observed Variables • Hidden Variables • Model Parameters
Expectation-Maximization (how exactly?) • Initial guess of the model parameters • Successive re-estimation of the parameters via: • E Step • M Step
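A sketch of the standard EM updates, writing Y for the observed data, W for the hidden variables, and \theta for the model parameters (this notation is assumed here and may differ from the symbols used on the original slides):

\[ \theta^{(0)} = \text{initial guess} \]

E Step:
\[ Q(\theta \mid \theta^{(t)}) = E_{W \mid Y,\,\theta^{(t)}}\big[\, \log p(Y, W \mid \theta) \,\big] \]

M Step:
\[ \theta^{(t+1)} = \arg\max_{\theta}\; Q(\theta \mid \theta^{(t)}) \]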
Expectation-Maximization • Summary/Intuition: • If we had complete data, we would maximize the likelihood directly • Since some data are missing, we maximize the expected complete-data log-likelihood instead • Converges to a local maximum of the likelihood
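As a concrete illustration of the general algorithm (not the thesis implementation), a minimal EM loop in Python for a 1-D Gaussian mixture, where the hidden variables are the per-sample class memberships:

import numpy as np

def em_gmm_1d(y, k=2, iters=50):
    """Minimal EM for a 1-D Gaussian mixture (illustrative only)."""
    n = len(y)
    mu = np.random.choice(y, k)      # initial guess of class means
    var = np.full(k, np.var(y))      # initial class variances
    pi = np.full(k, 1.0 / k)         # initial mixing weights
    for _ in range(iters):
        # E step: posterior probability of each class for each sample
        lik = np.exp(-0.5 * (y[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        post = pi * lik
        post /= post.sum(axis=1, keepdims=True)
        # M step: maximum-likelihood parameters given those posteriors
        nk = post.sum(axis=0)
        mu = (post * y[:, None]).sum(axis=0) / nk
        var = (post * (y[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / n
    return mu, var, pi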
EM-Segmentation [Wells 1994] • The observed signal is modeled as the product of the true signal and a corrupting gain field due to the imaging equipment • Expectation-Maximization is used on the log-transformed observations for iterative estimation of • the tissue classification • the corrupting bias field (inhomogeneity correction)
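In symbols (a sketch; the symbols are chosen here for illustration): the observed intensity at voxel s is the true tissue intensity times a smooth gain field, and the log transform turns the gain into an additive bias:

\[ I^{\text{obs}}_s = I^{\text{true}}_s \cdot G_s, \qquad Y_s = \log I^{\text{obs}}_s = \log I^{\text{true}}_s + \beta_s, \quad \beta_s = \log G_s . \]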
EM-Segmentation [Wells 1994]
• E-Step: compute tissue posteriors using the current intensity correction.
• M-Step: estimate the intensity correction using residuals based on the current posteriors.
EM-Segmentation [Wells 1994] • Observed Variables: the log-transformed intensities in the image • Hidden Variables: indicator variables for the tissue classification • Model Parameters: the slowly varying corrupting bias field • (a subscript s refers to the value of a variable at voxel s of the image)
EM-Segmentation [Wells 1994] • Initial guess of the bias field (the model parameters) • Successive estimation via: • E Step • M Step
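A sketch of the two updates under the usual Gaussian tissue-intensity model, in notation assumed here (Y_s the log-intensity and \beta_s the bias at voxel s, W_{sk} the posterior weight of tissue class k, \mu_k and \sigma_k^2 the class mean and variance, p(k) the class prior, and H a low-pass filter standing in for the filter derived from the bias-field prior):

E Step (tissue posteriors given the current bias):
\[ W_{sk} \;\leftarrow\; \frac{p(k)\, \mathcal{N}\!\left(Y_s - \beta_s;\ \mu_k, \sigma_k^2\right)}{\sum_j p(j)\, \mathcal{N}\!\left(Y_s - \beta_s;\ \mu_j, \sigma_j^2\right)} \]

M Step (bias as a smoothed, posterior-weighted residual; simplified form):
\[ \beta \;\leftarrow\; H\big[\bar{R}\big], \qquad \bar{R}_s = \frac{\sum_k W_{sk}\,(Y_s - \mu_k)/\sigma_k^2}{\sum_k W_{sk}/\sigma_k^2} \]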
Situating My Work • The prior in EM-Segmentation is independent and spatially stationary • My contribution is the addition of two priors: • a spatially stationary Gibbs prior to model local interactions between neighbors (thermal noise) • a spatially varying prior to model global relationships between the geometry of structures
The Gibbs Prior • Gibbs Random Field (GRF): a natural way to model piecewise homogeneous phenomena • used in image restoration [Geman & Geman 84] • a probability model on a lattice • partially relaxes the independence assumption to allow interactions between neighbors
EM-MF Segmentation: Gibbs Prior on Classification • We model the tissue classification W as a Gibbs random field
EM-MF Segmentation: Gibbs Prior on Classification • To fully specify the Gibbs model: • define the neighborhood system as a first-order neighborhood system, i.e. the 6 closest voxels • define clique potentials on this neighborhood to specify the Gibbs energy (see the sketch below)
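A sketch of the resulting prior, with pairwise potentials V over the 6-neighbor cliques (the specific form of V is an assumption here, not necessarily the one used in the thesis):

\[ P(W) = \frac{1}{Z} \exp\Big( -\sum_{s} \sum_{r \in N(s)} V(w_s, w_r) \Big), \]

where N(s) is the 6-voxel neighborhood of voxel s and Z is the partition function (normalizing constant).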
EM-MF Segmentation: Gibbs form of Posterior • Gibbs prior and Gaussian Measurement Models lead to Gibbs form for Posterior
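A sketch of that posterior, combining the Gaussian measurement terms with the pairwise prior terms (same assumed notation as the previous sketches; \mu_{w_s} and \sigma_{w_s}^2 are the parameters of the class labeled at voxel s):

\[ P(W \mid Y) \;\propto\; \exp\Big( \sum_s \log \mathcal{N}\!\left(Y_s - \beta_s;\ \mu_{w_s}, \sigma_{w_s}^2\right) \;-\; \sum_s \sum_{r \in N(s)} V(w_s, w_r) \Big) \]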
EM-MF Segmentation
• For the E-Step we need values for the posterior probabilities of the tissue labels at each voxel
• These cannot be computed directly from the Gibbs form (the partition function couples all voxels)
• Note that the required quantities are expected values of the label indicator variables
• These can be approximated with a Mean-Field Approximation to the GRF
Mean-Field Approximation • Deterministic Approximation to a GRF [Parisi 84] • the mean/expected value of a GRF is obtained as a solution to a set of consistency equations • the Update Equation is obtained using the derivative of the partition function with respect to the external field g [Elfadel 93] • Used in image reconstruction [Geiger, Yuille, Girosi 91]
Mean-Field Approximation to Posterior GRF • Intuition (for the update sketched below): • the denominator is a normalizer • the numerator captures: • the effect of the labels at neighboring voxels • the measurement at the voxel itself
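A sketch of a mean-field update consistent with this intuition (the exact potentials and parameterization in the thesis may differ); the mean class probabilities \bar{W}_{sk} are iterated until the consistency equations are approximately satisfied:

\[ \bar{W}_{sk} = \frac{\mathcal{N}\!\left(Y_s - \beta_s;\ \mu_k, \sigma_k^2\right) \exp\Big( -\sum_{r \in N(s)} \sum_{j} V(k, j)\, \bar{W}_{rj} \Big)}{\sum_{m} \mathcal{N}\!\left(Y_s - \beta_s;\ \mu_m, \sigma_m^2\right) \exp\Big( -\sum_{r \in N(s)} \sum_{j} V(m, j)\, \bar{W}_{rj} \Big)} \]

The denominator normalizes over the classes; the numerator combines the measurement at voxel s with the mean labels of its neighbors.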
Summary of EM-MF Segmentation • Modeled the piecewise homogeneity of tissue using a Gibbs prior on the classification • This led to a Gibbs form for the posteriors • The posterior probabilities in the E-Step are approximated by a Mean-Field solution
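A minimal sketch of such an EM-MF loop in Python/NumPy; the function name em_mf_segment, the Potts-style neighbor term (beta_strength), the Gaussian smoothing of the residual, and all parameter choices are illustrative assumptions rather than the thesis implementation:

import numpy as np
from scipy import ndimage

# First-order (6-voxel) neighborhood kernel: the six face neighbors.
KERNEL_6 = np.zeros((3, 3, 3))
KERNEL_6[0, 1, 1] = KERNEL_6[2, 1, 1] = 1
KERNEL_6[1, 0, 1] = KERNEL_6[1, 2, 1] = 1
KERNEL_6[1, 1, 0] = KERNEL_6[1, 1, 2] = 1

def em_mf_segment(Y, mu, var, beta_strength=1.0, iters=10, mf_iters=5):
    """Toy EM-MF loop: mean-field E-step with a Potts-style prior,
    smoothed posterior-weighted-residual M-step for the bias field.
    Y: 3-D array of log-intensities; mu, var: (K,) class means/variances."""
    K = len(mu)
    bias = np.zeros_like(Y)
    W = np.full(Y.shape + (K,), 1.0 / K)   # mean-field class probabilities

    for _ in range(iters):
        # E-step: Gaussian log-likelihoods of the bias-corrected intensities ...
        resid = (Y - bias)[..., None] - mu
        loglik = -0.5 * resid ** 2 / var - 0.5 * np.log(2 * np.pi * var)
        # ... combined with the mean labels of the 6 neighbors (mean-field updates)
        for _ in range(mf_iters):
            neigh = np.stack([ndimage.convolve(W[..., k], KERNEL_6, mode="nearest")
                              for k in range(K)], axis=-1)
            logpost = loglik + beta_strength * neigh   # reward neighbor agreement
            logpost -= logpost.max(axis=-1, keepdims=True)
            W = np.exp(logpost)
            W /= W.sum(axis=-1, keepdims=True)

        # M-step: bias field = low-pass filtered, posterior-weighted residual
        num = (W * ((Y[..., None] - mu) / var)).sum(axis=-1)
        den = (W / var).sum(axis=-1)
        bias = ndimage.gaussian_filter(num / den, sigma=10.0)

    return W, bias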