A fast nearest neighbor classifier based on self-organizing incremental neural network (SOINN) Presenter : Lin, Shu-Han • Authors : Shen Furao, Osamu Hasegawa Neural Networks (NN, 2008)
Outline • Introduction • Motivation • Objective • Methodology • Experiments • Conclusion • Comments
Introduction – self-organizing incremental neural network (SOINN) • Distance: if an input is too far from the existing nodes, insert it as a new node • Node = prototype
Introduction – self-organizing incremental neural network (SOINN) • Link: connect the winner and the second winner; each link carries an age
Introduction – self-organizing incremental neural network (SOINN) • Age: delete links that have grown too old
Introduction – self-organizing incremental neural network (SOINN) • Insert a node where the accumulated error is large • Cancel the insertion if it brings no improvement • Run two times (the original SOINN trains two layers)
Introduction – self-organizing incremental neural network (SOINN) • Delete outliers: nodes without neighbors (low-density assumption) • Run two times (two layers)
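To make the preceding slides concrete, below is a minimal Python sketch of one SOINN-style learning step: the too-far test against per-node similarity thresholds, linking of the two winners, link aging, and adaptation of the winner. It is an illustration under simplifying assumptions (a single learning rate eps, no per-node error counters, no neighbor adaptation), not the paper's exact algorithm.

```python
import numpy as np

def threshold(i, nodes, links):
    """Similarity threshold of node i (as in SOINN): the distance to its
    farthest linked neighbor, or to the nearest other node if it has no
    links yet."""
    d = np.linalg.norm(nodes - nodes[i], axis=1)
    nbrs = [a if b == i else b for a, b in links if i in (a, b)]
    if nbrs:
        return max(d[j] for j in nbrs)
    d[i] = np.inf
    return d.min()

def soinn_step(x, nodes, links, eps=0.05, age_max=50):
    """One learning step. `links` maps an edge (i, j), i < j, to its age.
    `eps` and `age_max` are illustrative defaults, not the paper's values;
    `nodes` is a float array with at least two rows."""
    d = np.linalg.norm(nodes - x, axis=1)
    w1, w2 = np.argsort(d)[:2]                       # winner, second winner
    if d[w1] > threshold(w1, nodes, links) or d[w2] > threshold(w2, nodes, links):
        return np.vstack([nodes, x]), links          # too far from both: new prototype
    e = (min(w1, w2), max(w1, w2))
    links[e] = 0                                     # (re)link the winners, age 0
    for k in list(links):                            # age the winner's other links
        if w1 in k and k != e:
            links[k] += 1
            if links[k] > age_max:                   # too old: drop the link
                del links[k]
    nodes[w1] += eps * (x - nodes[w1])               # pull the winner toward x
    return nodes, links
```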
Motivation • SOINN classifier (their first paper, 2005) • Uses 6 user-determined parameters • Does not address noise • Produces too many prototypes • Unsupervised learning • Their second paper (2007) discusses these weaknesses
Objectives • Propose an improved version of SOINN: ASC (Adjusted SOINN Classifier) • FASTER: fewer prototypes, in both the training phase and the classification phase • CLASSIFIER: the 1-NN (nearest-prototype) rule (see the sketch below) • INCREMENTAL LEARNING • ONE LAYER: easier to understand the settings, fewer parameters • MORE STABLE: with the help of k-means
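As a minimal sketch of the classification phase, assuming prototypes and labels are parallel arrays produced by training:

```python
import numpy as np

def classify(x, prototypes, labels):
    """ASC decision rule as stated on the slide: the label of the single
    nearest prototype (1-NN over the learned prototypes)."""
    d = np.linalg.norm(prototypes - x, axis=1)
    return labels[np.argmin(d)]
```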
Methodology – Adjusted SOINN • Distance: if an input is too far, insert it as a new node • A node is a cluster
Methodology – Adjusted SOINN • Link: each link between nodes carries an age
Methodology – Adjusted SOINN • Adapt the winner and its neighbors toward the input
Methodology – Adjusted SOINN • Age: delete links older than the age threshold ad
Methodology – Adjusted SOINN • Delete outliers: nodes without neighbors (low-density assumption)
Methodology – Adjusted SOINN • λ = the number of inputs between cleanups: the outlier deletion runs every λ iterations (see the training-loop sketch below)
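Building on the soinn_step sketch above, the one-layer training loop might look as follows; lam mirrors the slide's λ, its default value is arbitrary, and class labels (which ASC attaches to the prototypes) are omitted for brevity.

```python
def prune_isolated(nodes, links):
    """Drop nodes with no links (assumed noise) and reindex the survivors."""
    keep = [i for i in range(len(nodes)) if any(i in e for e in links)]
    if len(keep) < 2:                                # keep the sketch well-defined
        return nodes, links
    remap = {old: new for new, old in enumerate(keep)}
    links = {(remap[a], remap[b]): age for (a, b), age in links.items()}
    return nodes[keep], links

def train(X, nodes, links, lam=200):
    """One-layer incremental training: feed inputs one at a time and run
    the outlier deletion every `lam` inputs (the slide's lambda)."""
    for t, x in enumerate(X, 1):
        nodes, links = soinn_step(x, nodes, links)
        if t % lam == 0:
            nodes, links = prune_isolated(nodes, links)
    return nodes, links
```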
Methodology – k-means • With the help of k-means clustering, k = # of neurons • Adjust the resulting prototypes: each node is assumed to sit near a centroid of its class (sketch below)
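A minimal sketch of this adjustment step: plain k-means with k equal to the number of prototypes, initialized at the prototypes themselves. That it is run per class, on that class's training samples, is an assumption; iters is an illustrative default.

```python
import numpy as np

def kmeans_adjust(prototypes, X, iters=10):
    """Nudge each prototype toward the centroid of the samples it wins."""
    prototypes = prototypes.copy()
    for _ in range(iters):
        # assign every sample to its nearest prototype
        d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # move each prototype to the mean of its assigned samples
        for j in range(len(prototypes)):
            mask = assign == j
            if mask.any():
                prototypes[j] = X[mask].mean(axis=0)
    return prototypes
```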
Methodology – noise reduction • With the help of the k-Edited Nearest Neighbors classifier (ENC); the value of k is left open • Delete every node whose label differs from the majority vote of its k neighbors: such nodes are assumed to be generated by noise (sketch below)
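A hedged sketch of the editing step; the function name, the choice k=3, and filtering over the prototypes themselves are assumptions made for illustration.

```python
import numpy as np

def edited_nn_filter(prototypes, labels, k=3):
    """Drop every prototype whose label disagrees with the majority label
    of its k nearest fellow prototypes. `labels` is an integer array."""
    keep = []
    for i in range(len(prototypes)):
        d = np.linalg.norm(prototypes - prototypes[i], axis=1)
        d[i] = np.inf                       # ignore the prototype itself
        nbrs = np.argsort(d)[:k]
        majority = np.bincount(labels[nbrs]).argmax()
        if labels[i] == majority:
            keep.append(i)                  # label agrees: assumed clean
    return prototypes[keep], labels[keep]
```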
Methodology – center cleaning • Delete a neuron if it has never been the nearest neuron to a sample of another class: such neurons are assumed to lie in the central part of their class and not to shape the 1-NN decision boundary (sketch below)
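The slide's rule is terse, so the following is only one plausible reading: a prototype is kept if some training sample of a different class picks it as the nearest among the other classes' prototypes. The function name and the exact test are assumptions, not the paper's procedure.

```python
import numpy as np

def clean_centers(prototypes, proto_labels, X, y):
    """Keep only prototypes that are, for at least one training sample of
    a different class, that sample's nearest foreign prototype."""
    needed = np.zeros(len(prototypes), dtype=bool)
    for x, label in zip(X, y):
        other = np.where(proto_labels != label)[0]   # prototypes of other classes
        if other.size:
            d = np.linalg.norm(prototypes[other] - x, axis=1)
            needed[other[d.argmin()]] = True
    return prototypes[needed], proto_labels[needed]
```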
Experiments: artificial datasets • [Figures: original dataset, prototypes learned by Adjusted SOINN, prototypes learned by ASC] • Error: same • Speed: faster
Experiments: artificial datasets • [Figures: original dataset, prototypes learned by Adjusted SOINN, prototypes learned by ASC] • Error: better • Speed: faster
Experiments: real datasets • [Table: compression ratio (%) and speed-up ratio (%)]
Experiments: comparison with other prototype-based classification methods • Nearest Subclass Classifier (NSC) • k-Means Classifier (KMC) • k-NN Classifier (NNC) • Learning Vector Quantization (LVQ)
Experiments: comparison with other prototype-based classification methods • [Results table comparing ASC with NSC, KMC, NNC, and LVQ]
Conclusions • ASC • Learns the number of nodes needed to determine the decision boundary • Incremental neural network • Robust to noisy training data • Fast classification • Fewer parameters: only 3
Comments • Advantage • Improves on many weaknesses of the earlier classifier • Their previous paper demonstrates exactly the weaknesses they set out to fix • Drawback • No suggestions for how to set the remaining parameters • Application • A path from unsupervised learning to supervised learning