Model-Based Clustering by Probabilistic Self-Organizing Maps
Presenter: Chien-Hsing Chen
Authors: Shih-Sian Cheng, Hsin-Chia Fu, Hsin-Min Wang
IEEE Transactions on Neural Networks (TNN), 2009
Outline • Motivation • Objective • Method • Experiments • Conclusion • Comment
Motivation • Develop a mixture-model clustering approach • EM, CEM, and DAEM are combined with PbSOM • Background knowledge • competition and cooperation in SOM • EM (E-step, M-step) vs. K-means • likelihood • multivariate Gaussian distribution • when is K-means equivalent to SOM?
Objective • Introduce three EM-based learning approaches (EM, CEM, DAEM) and the PbSOM model • Combine the three approaches with PbSOM
EM • Expect that each xi is close to (well explained by) some component k • e.g., p(x=2; θk=1) is large, while p(x=48; θk=1) is small
(figure: 1-D data points 1, 2, 5, 6, 8, 9, 48 softly assigned to two components)
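The E-step intuition above can be sketched in code. This is a minimal 1-D, two-component illustration with hypothetical parameters (means, variances, and weights are my assumptions, not values from the paper); it shows that x=2 is almost entirely explained by the small-valued component while x=48 is not.

```python
import math

def gauss_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def e_step(data, mus, variances, weights):
    """E-step: posterior responsibility f(k | x_i; theta) for each point."""
    resp = []
    for x in data:
        joint = [w * gauss_pdf(x, m, v) for w, m, v in zip(weights, mus, variances)]
        total = sum(joint)
        resp.append([j / total for j in joint])
    return resp

# Hypothetical setup echoing the slide: component k=1 near the small
# values, component k=2 near 48.
data = [1, 2, 5, 6, 8, 9, 48]
resp = e_step(data, mus=[5.0, 48.0], variances=[9.0, 9.0], weights=[0.5, 0.5])
# resp[1] is the posterior for x=2; resp[6] is the posterior for x=48.
```

The M-step (not shown) would then re-estimate each component's mean, variance, and weight from these responsibilities.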
CEM • Hard-assigns each xi to exactly one component • Expect that each component k yields a good-quality partition of the data
(figure: the same 1-D data points, hard-partitioned into two clusters)
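CEM's classification step can be sketched as follows. For equal-weight, equal-variance Gaussians, the maximum-posterior hard assignment reduces to nearest-mean assignment, which is why CEM behaves like K-means; the data and means here are hypothetical.

```python
def c_step(data, mus):
    """Classification step: for equal-weight, equal-variance Gaussians,
    argmax_k f(k | x) is simply the nearest mean (as in K-means)."""
    return [min(range(len(mus)), key=lambda k: (x - mus[k]) ** 2) for x in data]

def m_step(data, labels, n_components):
    """M-step: re-estimate each mean from its hard-assigned points only."""
    return [
        sum(x for x, l in zip(data, labels) if l == k)
        / max(1, sum(1 for l in labels if l == k))
        for k in range(n_components)
    ]

data = [1, 2, 3, 5, 6, 8, 9, 48]
labels = c_step(data, mus=[5.0, 48.0])
new_mus = m_step(data, labels, 2)
```

One C-step/M-step pair like this is iterated to convergence, exactly mirroring a K-means pass.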
DAEM • Do not trust the posterior f(k|xi; θ) too much early in training (e.g., at t=1); trust it more as training proceeds (e.g., at t=10) • To reduce initialization bias and avoid poor local optima, the annealing parameter β starts below 1 and gradually increases to 1
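The annealing idea can be sketched with a tempered posterior. This is an illustrative 1-D version with hypothetical means and variance: with β small, responsibilities are flattened (the model is not trusted much); as β approaches 1, the ordinary EM posterior is recovered.

```python
import math

def tempered_resp(x, mus, var, beta):
    """DAEM-style annealed posterior: the likelihood term is raised to the
    power beta before normalizing. beta < 1 flattens the responsibilities;
    beta = 1 gives the ordinary EM posterior (equal weights and variances
    assumed for simplicity)."""
    scores = [math.exp(-beta * (x - m) ** 2 / (2 * var)) for m in mus]
    total = sum(scores)
    return [s / total for s in scores]

flat = tempered_resp(2.0, [5.0, 48.0], 9.0, beta=0.01)   # early training
sharp = tempered_resp(2.0, [5.0, 48.0], 9.0, beta=1.0)   # late training
```

A training loop would sweep β over a schedule such as 0.01 → 0.1 → 0.5 → 1.0, running EM to convergence at each temperature.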
EM-based approaches • EM • CEM • DAEM
Overall PbSOM
Principal concept of PbSOM • When selecting the winning neuron for xi, PbSOM considers neighborhood information (e.g., the distances ||xi − n1||, ||xi − n8||, ||xi − n9|| jointly); in contrast, conventional SOM does not
PbSOM (Probabilistic SOM) • Conventional SOM winner selection: k = argmin_k ||xi − nk|| • PbSOM instead maximizes an energy (neighborhood-weighted log-likelihood) function
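The contrast between the two winner-selection rules can be sketched as below. This is a hypothetical 1-D example (map positions, means, variance, and σ are my assumptions): the PbSOM-style winner maximizes a neighborhood-weighted log-likelihood, so a unit whose map neighbors also fit the point can win even when it is not the single nearest unit.

```python
import math

def neighborhood(k, l, sigma):
    """Gaussian neighborhood h_kl between units k and l on a 1-D map."""
    return math.exp(-(k - l) ** 2 / (2 * sigma ** 2))

def som_winner(x, means):
    """Conventional SOM winner: nearest neuron only."""
    return min(range(len(means)), key=lambda k: (x - means[k]) ** 2)

def pbsom_winner(x, means, var, sigma):
    """PbSOM-style winner: maximize sum_l h_kl * log f(x; theta_l)
    (Gaussian log-likelihood terms, constants dropped)."""
    def score(k):
        return sum(
            neighborhood(k, l, sigma) * (-(x - m) ** 2 / (2 * var))
            for l, m in enumerate(means)
        )
    return max(range(len(means)), key=score)

# Unit 0 is nearest to x, but its map neighbor (unit 1, mean 50) fits x
# terribly, so the neighborhood-aware rule prefers unit 3 instead.
means = [0.0, 50.0, 2.0, 1.5]
x = 0.5
```

Here `som_winner` picks unit 0, while `pbsom_winner` picks unit 3, illustrating how neighborhood information changes the competition.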
Multivariate Gaussian distribution • The samples assigned to neuron l, e.g., x1, x5, x7, x8, are modeled as i.i.d. ~ N(μl, Σl)
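Under the i.i.d. assumption, the joint log-likelihood of a neuron's samples is a sum of per-sample Gaussian log-densities. A minimal sketch, assuming a diagonal covariance and hypothetical 2-D samples:

```python
import math

def log_gauss_diag(x, mu, var):
    """Log-density of a multivariate Gaussian with diagonal covariance
    (var is the vector of per-dimension variances)."""
    d = len(x)
    return -0.5 * (
        d * math.log(2 * math.pi)
        + sum(math.log(v) for v in var)
        + sum((xi - mi) ** 2 / v for xi, mi, v in zip(x, mu, var))
    )

def log_likelihood(samples, mu, var):
    """i.i.d. assumption: the joint log-likelihood is a sum over samples."""
    return sum(log_gauss_diag(x, mu, var) for x in samples)

# Hypothetical samples assigned to one neuron, modeled as N(mu_l, Sigma_l).
samples = [[0.1, 0.2], [-0.3, 0.1], [0.0, -0.2], [0.2, 0.0]]
ll = log_likelihood(samples, mu=[0.0, 0.0], var=[1.0, 1.0])
```

Maximizing this sum over μl and Σl gives the usual sample mean and (diagonal) sample covariance as the M-step estimates.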
EM-based approaches combined with PbSOM • EM → SOEM • CEM → SOCEM • DAEM → SODAEM • Each incorporates the neighborhood function h into its updates
SOCEM (PbSOM + CEM) • Conventional SOM update involves ||xi − nl|| and ||nk − nl||; the batch update averages the xi weighted by the neighborhood term in ||nk − nl|| • Similar to a batch K-means algorithm that additionally considers the neighborhood function h
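The batch SOCEM-style update can be sketched as a neighborhood-weighted K-means mean update. This is a hypothetical 1-D example (data, labels, map layout, and σ values are my assumptions); as σ shrinks toward 0 the update reduces to the plain batch K-means mean.

```python
import math

def h(k, l, sigma):
    """Gaussian neighborhood on a 1-D map."""
    return math.exp(-(k - l) ** 2 / (2 * sigma ** 2))

def socem_batch_update(data, labels, n_units, sigma):
    """Batch update sketch: mean_k = sum_i h(k, c_i) x_i / sum_i h(k, c_i),
    where c_i is the hard-assigned (winning) unit of x_i.
    With sigma -> 0 this is exactly the batch K-means update."""
    new_means = []
    for k in range(n_units):
        weights = [h(k, c, sigma) for c in labels]
        new_means.append(sum(w * x for w, x in zip(weights, data)) / sum(weights))
    return new_means

data = [1.0, 2.0, 9.0, 10.0]
labels = [0, 0, 1, 1]
tiny = socem_batch_update(data, labels, 2, sigma=1e-6)   # ~ K-means means
smooth = socem_batch_update(data, labels, 2, sigma=2.0)  # neighbors pull means together
```

With a wide neighborhood the two means are pulled toward each other, which is the cooperation effect the deck attributes to SOM.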
SOEM (PbSOM + EM) • Similar to the batch update above, but with soft (EM) assignments in place of hard labels, again considering the neighborhood function h
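The SOEM-style variant can be sketched by replacing the hard label with the soft posterior f(l|xi) before applying the neighborhood weighting. The data, responsibilities, and σ below are hypothetical.

```python
import math

def soem_batch_update(data, resp, sigma):
    """SOEM-style batch update sketch: each point contributes to mean_k with
    weight sum_l h(k, l) * f(l | x_i), i.e., its soft responsibilities
    smoothed by the map neighborhood (1-D map assumed)."""
    n_units = len(resp[0])

    def h(k, l):
        return math.exp(-(k - l) ** 2 / (2 * sigma ** 2))

    new_means = []
    for k in range(n_units):
        w = [sum(h(k, l) * r[l] for l in range(n_units)) for r in resp]
        new_means.append(sum(wi * x for wi, x in zip(w, data)) / sum(w))
    return new_means

data = [1.0, 2.0, 9.0, 10.0]
# Hypothetical soft responsibilities f(l | x_i) for a 2-unit map.
resp = [[0.9, 0.1], [0.9, 0.1], [0.1, 0.9], [0.1, 0.9]]
means = soem_batch_update(data, resp, sigma=0.5)
```

Compared with the hard-assignment version, every point contributes a little to every unit, which tends to make the result less sensitive to the initial partition (the "more global" behavior noted in the experiments).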
SODAEM (PbSOM + DAEM)
Experiment: SOCEM • σ in hkl is gradually reduced from 0.6 to 0.15 (σ = 0.6, 0.45, 0.3, 0.15)
Experiment: SOEM • Compared with the SOCEM result on the previous slide, this result is more global
Experiment: SODAEM • With suitable β schedules, SODAEM behaves almost equivalently to SOEM and SOCEM, respectively • It cannot obtain an ordered map during learning if σ is too small
Experiment: stability without PbSOM
Experiment (1/2): distinguishing Kohonen SOM from SOCEM
Conclusion PbSOM
Comment • Advantage • A mixture-model approach that sounds solid is presented • Drawback • Less novelty • Is it better than conventional SOM? • Application • SOM