A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric
Presenter: Fen-Rou, Ciou
Authors: Toh Koon Charlie Neo, Dan Ventura
2012, PRL
Outline • Motivation • Objectives • Methodology • Experiments • Conclusions • Comments
Motivation • The k-nearest neighbor pattern classifier is an effective learning algorithm, but it can result in large model sizes.
Objectives • The paper presents a direct boosting algorithm for the k-NN classifier that creates an ensemble of models with locally modified distance weighting, increasing accuracy while condensing the model size.
Methodology - Framework • [Framework diagram: training points xi with labels v = {+, −}, warped distance Dz to a query x, combined in an AdaBoost-style ensemble]
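The slide only shows the framework diagram, so the following is a minimal Python sketch of the core idea named in the title: each training point xi carries a multiplicative weight on its distance to a query, and boosting locally warps the metric around misclassified points. The specific update rule, the learning rate eta, and the leave-one-out handling are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def warped_knn_predict(X, y, w, x, k=3, exclude=None):
    # Distance to each training point i is scaled by its learned weight
    # w[i]: this per-point scaling is the "local warping" of the metric.
    d = np.linalg.norm(X - x, axis=1) * w
    if exclude is not None:
        d[exclude] = np.inf                    # leave-one-out during training
    nn = np.argsort(d)[:k]
    return 1 if y[nn].sum() >= 0 else -1       # majority vote on {+1, -1} labels

def boosted_knn_fit(X, y, k=3, n_rounds=10, eta=0.1):
    # Hypothetical boosting pass: for each misclassified point, pull
    # same-class neighbors closer (shrink their weight) and push
    # opposite-class neighbors away (grow it); snapshot w every round.
    w = np.ones(len(X))
    ensemble = []
    for _ in range(n_rounds):
        for i in range(len(X)):
            if warped_knn_predict(X, y, w, X[i], k, exclude=i) != y[i]:
                d = np.linalg.norm(X - X[i], axis=1) * w
                d[i] = np.inf
                for j in np.argsort(d)[:k]:
                    w[j] *= (1 - eta) if y[j] == y[i] else (1 + eta)
        ensemble.append(w.copy())
    return ensemble
```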
Methodology • Sensitivity to data order - Randomized data order - Batch update
Methodology • Voting mechanism - Simple voting - Error-weighted voting
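To make the two voting schemes concrete, here is a hedged sketch that reuses warped_knn_predict from above. Simple voting gives each round's model one vote; for error-weighted voting an AdaBoost-style weight alpha_t = 0.5 ln((1 − err_t)/err_t) is assumed, since the slide does not state the paper's exact formula.

```python
def ensemble_predict(X, y, ensemble, x, k=3, errors=None):
    # errors[t] is model t's training error rate; pass None for simple voting.
    total = 0.0
    for t, w in enumerate(ensemble):
        vote = warped_knn_predict(X, y, w, x, k)
        if errors is None:
            total += vote                                  # simple voting
        else:
            err = min(max(errors[t], 1e-9), 1 - 1e-9)      # avoid log(0)
            total += 0.5 * np.log((1 - err) / err) * vote  # error-weighted
    return 1 if total >= 0 else -1
```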
Methodology • Condensing model size - Optimal weights - Averaged weights
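The averaged-weights variant condenses the ensemble into a single model: every round produces one weight vector over the same training set, so averaging them yields one k-NN model that is no larger than plain k-NN at test time. The optimal-weights variant (selecting a single best w) is not sketched, since its selection criterion is not given on the slide.

```python
def condense_average(ensemble):
    # Collapse the ensemble into one weight vector by averaging the
    # per-point distance weights across boosting rounds.
    return np.mean(ensemble, axis=0)

# Usage: classify a query with the condensed single model.
# w_avg = condense_average(ensemble)
# pred = warped_knn_predict(X, y, w_avg, x_query, k=3)
```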
Experiments • Fig 8. Boosted k-NN with randomized data order. • Fig 9. Boosted k-NN with batch update. • Fig 10. Boosted k-NN with error-weighted voting. • Fig 11. Boosted k-NN with optimal weights. • Fig 12. Boosted k-NN with average weights.
Conclusions • Boosted k-NN can improve the generalization accuracy of the k-nearest neighbor algorithm. • The Boosted k-NN algorithm modifies the decision surface, producing a better solution.
Comments • Advantages - The paper describes a rich set of experiments. • Applications - Classification