A general grid-clustering approach Presenter : Chun-Ping Wu Authors : Shinong Yue, Miaomiao Wei, Jeen-shing Wang, Huaxiang Wang 國立雲林科技大學 National Yunlin University of Science and Technology PRL 2008
Outline • Motivation • Objective • Methodology • Experiments • Conclusion • Comments
Motivation • Hard c-means (HCM) requires the number of clusters to be predetermined. • Its result is strongly influenced by the choice of initial cluster centers. • HCM is often too slow to apply to large datasets.
Objective • Determine an optimal grid size using a designed partitioning index. • Break the curse of dimensionality in high-dimensional datasets.
Methodology • (1) Find the minimal grid GRID that encloses all data objects in M. • (2) Successively bisect GRID. • The jth round of bisecting • Solve for an optimal grid size OPT • Bisecting stops when the number of bisected rounds equals OPT + q, where q is a constant. • (3) Find all core grids. • (4) Merge each group of core grids. • The optimal value of q • (5) Assign all remaining non-core grids to a cluster. • A code sketch of these steps is given below.
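The following is a minimal Python sketch of the pipeline above, not the authors' implementation: the core-grid test (a per-cell point-count threshold `density_threshold`), the face-adjacency rule used for merging, the nearest-core-cell assignment of non-core grids, and the single `rounds` parameter standing in for OPT + q are all assumptions made for illustration; the paper's partitioning index and merging condition are not reproduced here.

```python
import numpy as np
from collections import defaultdict

def grid_cluster(X, rounds, density_threshold=2):
    """X: (n, d) data array; rounds: number of bisection rounds (stands in for OPT + q)."""
    X = np.asarray(X, dtype=float)
    # (1) Minimal grid enclosing all data objects.
    lo, hi = X.min(axis=0), X.max(axis=0)
    # (2) Bisect every dimension `rounds` times -> 2**rounds cells per axis.
    k = 2 ** rounds
    width = (hi - lo) / k
    width[width == 0] = 1.0                      # guard against flat dimensions
    cell_ids = np.minimum(((X - lo) / width).astype(int), k - 1)

    # Group point indices by the cell they fall in (only occupied cells are stored).
    cells = defaultdict(list)
    for idx, cell in enumerate(map(tuple, cell_ids)):
        cells[cell].append(idx)

    # (3) Core grids: occupied cells with at least `density_threshold` points (assumption).
    core = {c for c, members in cells.items() if len(members) >= density_threshold}

    # (4) Merge face-adjacent core grids into clusters (flood fill over neighbours).
    labels, cluster_id = {}, 0
    for cell in core:
        if cell in labels:
            continue
        stack, labels[cell] = [cell], cluster_id
        while stack:
            cur = stack.pop()
            for dim in range(X.shape[1]):
                for step in (-1, 1):
                    nb = cur[:dim] + (cur[dim] + step,) + cur[dim + 1:]
                    if nb in core and nb not in labels:
                        labels[nb] = cluster_id
                        stack.append(nb)
        cluster_id += 1

    # (5) Points in non-core cells join the cluster of the nearest core-cell centre.
    core_list = list(core)
    core_centers = np.array([(np.array(c) + 0.5) * width + lo for c in core_list])
    point_labels = np.full(len(X), -1)
    for cell, members in cells.items():
        if cell in labels:
            point_labels[members] = labels[cell]
        elif core_list:
            center = (np.array(cell) + 0.5) * width + lo
            nearest = np.argmin(np.linalg.norm(core_centers - center, axis=1))
            point_labels[members] = labels[core_list[nearest]]
    return point_labels
```

For example, `grid_cluster(X, rounds=4)` would partition each dimension into 16 cells before finding and merging core grids; the paper instead derives the number of rounds from its partitioning index.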
Experiments • Clustering results of the five artificial datasets.
Experiments • Clustering results of the three benchmark datasets.
Conclusion • The GGCA integrates the advantages of both divisive and agglomerative clustering algorithms. • The GGCA solves two critical problems of conventional grid-clustering algorithms: • Grid size • Merging condition • The GGCA is a completely non-parametric algorithm.
Comments • Advantage • Improves clustering speed on large datasets. • Drawback • The presentation of the paper is not very clear. • Application • Clustering