
Presenter: 張俊偉. IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 16, NO. 7, JULY 2007






Presentation Transcript


  1. Image Segmentation Using Hidden Markov Gauss Mixture Models. Kyungsuk (Peter) Pyun, Member, IEEE, Johan Lim, Chee Sun Won, Member, IEEE, and Robert M. Gray, Fellow, IEEE. Presenter: 張俊偉. IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 16, NO. 7, JULY 2007

  2. Purpose • Train the model with supervised learning so that, given an input image to be segmented, a fast and accurate result can be obtained. • Devise an automatic, context-dependent segmentation algorithm that yields a reasonable classification error, or Bayes risk, between the original and the automatically segmented image.

  3. Classification • Use a Gauss mixture model (GMM). • First, given the mean and the covariance, the Gaussian pdf maximizes the differential entropy and has the largest Shannon distortion-rate function, i.e., the worst high-rate distortion-rate tradeoff. • Second, GMMs lead to a robust quantizer that minimizes quantizer mismatch.
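The GMM-based classification described on this slide can be sketched as follows: each class is modeled by a mixture of Gaussian components, and a feature vector is assigned to the class whose best-fitting component gives the highest log-likelihood. This is a minimal illustration; the `class_models` dictionary structure and function names are hypothetical, not the paper's API.

```python
import numpy as np

def gaussian_loglik(x, mean, cov):
    """Log-likelihood of vector x under one Gaussian component."""
    d = len(mean)
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.inv(cov) @ diff
                   + logdet + d * np.log(2 * np.pi))

def classify(x, class_models):
    """Assign x to the class whose Gauss mixture fits it best.

    class_models: {label: [(mean, cov), ...]} -- one list of
    Gaussian components per class (illustrative structure).
    """
    best, best_ll = None, -np.inf
    for label, components in class_models.items():
        # Score the class by its best-matching component.
        ll = max(gaussian_loglik(x, m, c) for m, c in components)
        if ll > best_ll:
            best, best_ll = label, ll
    return best
```

Scoring by the maximum-likelihood component (rather than the full mixture likelihood) keeps the sketch close to a minimum-distortion codeword match, which is the view the later GMVQ slides take.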

  4. To design a GMM • Employ a clustering approach using the Lloyd algorithm instead of the EM algorithm. • Advantages of the Lloyd algorithm over the EM algorithm: • rapid convergence (typically within 20 iterations in the authors' experiments), and • roughly half the computational complexity of the EM algorithm.

  5. Accurate GMM classification does not necessarily yield clean segmentation, since segmentation additionally requires smooth boundaries between classes and consistent clustering within each class.

  6. Markov random field (MRF)

  7. In GMVQ design by the Lloyd algorithm • an input vector X is mapped to the closest codeword (a mean and covariance pair) • after applying this minimum-distortion mapping to the entire training set, each codeword is replaced by the centroid of the data assigned to it • these two steps are iterated until convergence
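The two-step iteration on this slide can be sketched as a small Lloyd loop over (mean, covariance) codewords. This is a sketch under assumptions: the distortion used here is the negative Gaussian log-likelihood, the initialization and regularization choices are illustrative, and the function name `gmvq_lloyd` is made up, not from the paper.

```python
import numpy as np

def gmvq_lloyd(X, n_codewords=3, n_iter=20, seed=0):
    """Illustrative GMVQ design via the Lloyd algorithm."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize codeword means from random training vectors,
    # covariances from the global covariance (plus regularization).
    means = X[rng.choice(n, n_codewords, replace=False)].copy()
    covs = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * n_codewords)

    assign = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # Step 1: minimum-distortion mapping of every training vector.
        dist = np.empty((n, n_codewords))
        for k in range(n_codewords):
            diff = X - means[k]
            inv = np.linalg.inv(covs[k])
            _, logdet = np.linalg.slogdet(covs[k])
            dist[:, k] = 0.5 * (
                np.einsum('ij,jk,ik->i', diff, inv, diff) + logdet)
        assign = dist.argmin(axis=1)

        # Step 2: replace each codeword by the centroid (mean and
        # covariance) of the data assigned to it.
        for k in range(n_codewords):
            members = X[assign == k]
            if len(members) > 1:
                means[k] = members.mean(axis=0)
                covs[k] = np.cov(members.T) + 1e-6 * np.eye(d)
    return means, covs, assign
```

As the previous slide notes, this loop typically converges within a couple of dozen iterations, and each pass costs roughly half of an EM pass because no posterior responsibilities are computed.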

  8. BP model

  9. Segmentation With HMGMM • Use a maximum a posteriori (MAP) estimate of the true segmentation, and compute an approximate maximum-likelihood estimate of the hyperparameter of the Gibbs prior.
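One common way to approximate such a MAP segmentation under a Gibbs prior is iterated conditional modes (ICM): starting from the per-pixel maximum-likelihood labels, each pixel is repeatedly reassigned to the class that maximizes its likelihood plus a smoothness reward for agreeing with its 4-neighbors. The sketch below is a generic ICM illustration with a Potts-style prior and hyperparameter `beta`, not the paper's exact estimator.

```python
import numpy as np

def icm_segment(loglik, beta=1.0, n_iter=5):
    """Generic ICM approximation to MAP segmentation.

    loglik: (H, W, K) per-pixel class log-likelihoods, e.g. from a
    Gauss mixture classifier. beta: Gibbs prior hyperparameter
    rewarding agreement with the 4-neighborhood (illustrative).
    """
    H, W, K = loglik.shape
    labels = loglik.argmax(axis=2)  # start from ML classification
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                # Likelihood term plus smoothness reward per class.
                score = loglik[i, j].copy()
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        score[labels[ni, nj]] += beta
                labels[i, j] = score.argmax()
    return labels
```

With `beta = 0` this reduces to the raw GMM classification of slide 3; increasing `beta` smooths class boundaries, which is exactly the gap between classification and segmentation that slide 5 points out.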
