Paper study: A Very Fast Neural Learning for Classification Using Only New Incoming Datum Saichon Jaiyen, Chidchanok Lursinsap, Suphakant Phimoltares IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 21, NO. 3, MARCH 2010
OUTLINE Introduction VEBF Neural Network Example for Training Experimental Results
Introduction • Most current training algorithms require both the new incoming data and the previously trained data together in order to correctly learn the whole data set. • This paper proposes a very fast training algorithm that learns a data set in only one pass. • The proposed neural network consists of three layers, but the structure is flexible and can be adjusted during the training process.
OUTLINE Introduction VEBF Neural Network Example for Training Experimental Results
VEBF Neural Network • VEBF: versatile elliptic basis function • Outline of the learning algorithm • 1. Present a training datum to the VEBF neural network. • 2. Decide whether to create a new neuron: • Create: set its parameters • It can join a current neuron: update that neuron's parameters • 3. Check the merge condition.
VEBF Neural Network • For each vector x = [x1, x2, …, xn]^T in R^n and an orthonormal basis {u1, u2, …, un} for R^n, the coordinates of x are • $x_i = \mathbf{x}^T \mathbf{u}_i$ • The hyperellipsoidal equation, unrotated and centered at the origin, is defined as • $$\sum_{i=1}^{n} \frac{x_i^2}{a_i^2} = 1$$ • where $a_i$ is the width of the ith axis of the hyperellipsoid. • Substituting $x_i = \mathbf{x}^T \mathbf{u}_i$, this can be written as • $$\sum_{i=1}^{n} \frac{(\mathbf{x}^T \mathbf{u}_i)^2}{a_i^2} - 1 = 0$$ • Define a new basis function as • $$\psi(\mathbf{x}) = \sum_{i=1}^{n} \frac{(\mathbf{x}^T \mathbf{u}_i)^2}{a_i^2} - 1$$
VEBF Neural Network If the original axes of the hyperellipsoidal equation are translated from the origin to the center c = [c1, c2, …, cn]^T, the new coordinates of a vector x, denoted x' = [x'_1, x'_2, …, x'_n]^T, with respect to the new axes can be written as $$x_i' = (\mathbf{x} - \mathbf{c})^T \mathbf{u}_i$$ The hyperellipsoidal equation then becomes $$\sum_{i=1}^{n} \frac{\left((\mathbf{x} - \mathbf{c})^T \mathbf{u}_i\right)^2}{a_i^2} - 1 = 0$$
VEBF Neural Network • The VEBF is defined as • $$\psi(\mathbf{x}) = \sum_{i=1}^{n} \frac{\left((\mathbf{x} - \mathbf{c})^T \mathbf{u}_i\right)^2}{a_i^2} - 1$$ • where {u1, u2, …, un} is the orthonormal basis, the constant $a_i$, i = 1, …, n, is the width of the ith axis of the hyperellipsoid, and the center vector c = [c1, c2, …, cn]^T refers to the mean vector.
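As a concrete illustration, the VEBF above can be evaluated in a few lines of NumPy (the function name and the sample center/widths below are illustrative, not from the paper):

```python
import numpy as np

def vebf(x, c, U, a):
    """Versatile elliptic basis function.

    x : (n,) input vector
    c : (n,) center (mean) vector
    U : (n, n) matrix whose columns u_i form an orthonormal basis
    a : (n,) widths a_i of the hyperellipsoid axes
    Returns psi(x) = sum_i ((x - c)^T u_i)^2 / a_i^2 - 1.
    """
    proj = (x - c) @ U          # coordinates of x - c in the rotated frame
    return float(np.sum(proj ** 2 / a ** 2) - 1.0)

# psi < 0 inside the hyperellipsoid, psi = 0 on it, psi > 0 outside.
c = np.zeros(2)
U = np.eye(2)
a = np.array([2.0, 1.0])
print(vebf(np.array([0.0, 0.0]), c, U, a))   # -1.0 (at the center)
print(vebf(np.array([2.0, 0.0]), c, U, a))   # 0.0 (on the boundary)
```

The sign of ψ(x) is what the learning algorithm later uses to decide whether a datum falls inside an existing neuron's ellipsoid.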
VEBF Neural Network • Let X = {(xj, tj) | 1 <= j <= N} be a finite set of N training data, where xj ∈ R^n is a feature vector referred to as a data vector and tj is the class label of the vector xj. • We denote by Ωk the 5-tuple (C(k), S(k), Nk, Ak, dk), where • C(k) = [c1, c2, …, cn]^T is the center of the kth neuron • S(k) is the covariance matrix of the kth neuron • Nk is the total number of data related to the kth neuron • Ak is the width vector of the kth neuron • dk is the class label of the kth neuron
VEBF Neural Network • Let cs be the index of the closest hidden neuron of the same class. • If $\psi_{cs}(\mathbf{x}) > 0$, the datum lies outside that neuron's hyperellipsoid, so a new hidden neuron is allocated and added into the network. • If $\psi_{cs}(\mathbf{x}) \le 0$, the datum joins the closest hidden neuron, whose parameters are then updated.
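This allocate-or-join test can be sketched as follows (the dictionary layout and the sample widths are assumptions for illustration, not the paper's data structures):

```python
import numpy as np

def decide(x, closest):
    """closest: dict with keys 'c' (center), 'U' (orthonormal basis columns),
    and 'a' (axis widths) of the nearest same-class hidden neuron.
    Returns 'allocate' if x lies outside its hyperellipsoid, else 'join'."""
    proj = (np.asarray(x, float) - closest['c']) @ closest['U']
    psi = np.sum(proj ** 2 / closest['a'] ** 2) - 1.0
    return 'allocate' if psi > 0 else 'join'

# Hypothetical neuron centered at (5, 16) with axis-aligned widths of 6.
nb = {'c': np.array([5.0, 16.0]), 'U': np.eye(2), 'a': np.array([6.0, 6.0])}
print(decide([10, 18], nb))   # 'join'     (psi = 29/36 - 1 < 0)
print(decide([20, 16], nb))   # 'allocate' (psi = 225/36 - 1 > 0)
```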
VEBF Neural Network • Mean computation • The recursive relation can be written as follows: • $$\mathbf{c}_{new} = \frac{N_k \mathbf{c}_{old} + \mathbf{x}}{N_k + 1}$$ • where $\mathbf{c}_{new}$ is the new mean vector, $\mathbf{c}_{old}$ is the current mean vector, and $N_k$ is the number of data already associated with the kth neuron.
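The recursive mean can be checked against the batch mean in a few lines (the three sample points are taken from the training example in this deck):

```python
import numpy as np

def update_mean(c_old, N, x):
    # Recursive mean: c_new = (N * c_old + x) / (N + 1),
    # so the full data set never has to be stored.
    return (N * c_old + x) / (N + 1)

xs = np.array([[5.0, 16.0], [10.0, 18.0], [11.0, 16.0]])
c, N = xs[0].copy(), 1
for x in xs[1:]:
    c = update_mean(c, N, x)
    N += 1
print(c)   # equals xs.mean(axis=0), i.e. [26/3, 50/3]
```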
VEBF Neural Network • Covariance matrix computation • The recursive relation can be written as follows: • $$S_{new} = \frac{N_k}{N_k + 1} S_{old} + \frac{N_k}{(N_k + 1)^2} (\mathbf{x} - \mathbf{c}_{old})(\mathbf{x} - \mathbf{c}_{old})^T$$ • where $S_{new}$ is the new covariance matrix, $S_{old}$ is the current covariance matrix, and $\mathbf{c}_{old}$ is the mean vector before the update. • The orthonormal axis vectors are computed as the eigenvectors of the covariance matrix, sorted by eigenvalue.
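Assuming the population (biased) covariance, the recursion can be verified against NumPy's batch covariance; note that S must be updated with the old mean before the mean itself is updated:

```python
import numpy as np

def update_cov(S_old, c_old, N, x):
    # Recursive population covariance using only the new datum:
    # S_new = N/(N+1) * S_old + N/(N+1)^2 * (x - c_old)(x - c_old)^T
    d = (x - c_old).reshape(-1, 1)
    return (N / (N + 1)) * S_old + (N / (N + 1) ** 2) * (d @ d.T)

xs = np.array([[5.0, 16.0], [10.0, 18.0], [11.0, 16.0]])
c, S, N = xs[0].copy(), np.zeros((2, 2)), 1
for x in xs[1:]:
    S = update_cov(S, c, N, x)        # uses the mean BEFORE this datum
    c = (N * c + x) / (N + 1)
    N += 1

# The orthonormal axes are the eigenvectors of S, sorted by eigenvalue.
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]
U = eigvecs[:, order]
print(np.allclose(S, np.cov(xs.T, bias=True)))   # True
```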
VEBF Neural Network - merge • Let Ωx = (C(x), S(x), Nx, Ax, dx) and Ωy = (C(y), S(y), Ny, Ay, dy) be any two hidden neurons x and y of the same class in a VEBF neural network. • Merging criterion: the distance between the two centers, $\| C^{(x)} - C^{(y)} \|$, is compared with a threshold $\delta$. • If $\| C^{(x)} - C^{(y)} \| \le \delta$, these two hidden neurons are merged into one new hidden neuron Ωnew = (C(new), S(new), Nnew, Anew, dnew). • $\delta$ is the threshold.
VEBF Neural Network - merge • The new parameters of this new hidden neuron can be computed as follows: • $$N_{new} = N_x + N_y, \qquad C^{(new)} = \frac{N_x C^{(x)} + N_y C^{(y)}}{N_x + N_y}$$ • $$S^{(new)} = \frac{N_x \left(S^{(x)} + C^{(x)} C^{(x)T}\right) + N_y \left(S^{(y)} + C^{(y)} C^{(y)T}\right)}{N_x + N_y} - C^{(new)} C^{(new)T}$$ • The new widths are obtained from the eigenvalues of $S^{(new)}$, where $\lambda_i$ is the ith eigenvalue of the new covariance matrix.
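Pooling the statistics of two neurons can be sketched as below. This uses the standard formula for combining two population means and covariances; the paper's exact width formula may differ, so the eigenvalue-based widths here are an assumption:

```python
import numpy as np

def merge_neurons(cx, Sx, Nx, cy, Sy, Ny):
    """Merge the sufficient statistics of two same-class hidden neurons.

    Standard pooling of two (population) mean/covariance pairs; the new
    axes and widths come from the eigendecomposition of the merged S.
    """
    N = Nx + Ny
    c = (Nx * cx + Ny * cy) / N
    # E[x x^T] of each group is S + c c^T; average them, then re-center.
    S = (Nx * (Sx + np.outer(cx, cx)) + Ny * (Sy + np.outer(cy, cy))) / N \
        - np.outer(c, c)
    eigvals, eigvecs = np.linalg.eigh(S)
    order = np.argsort(eigvals)[::-1]
    return c, S, N, eigvecs[:, order], eigvals[order]

# Sanity check: merging two groups' statistics matches pooling the raw data.
pa = np.array([[0.0, 0.0], [2.0, 2.0]])
pb = np.array([[4.0, 0.0], [6.0, 2.0]])
c, S, N, U, lam = merge_neurons(pa.mean(0), np.cov(pa.T, bias=True), len(pa),
                                pb.mean(0), np.cov(pb.T, bias=True), len(pb))
print(np.allclose(S, np.cov(np.vstack([pa, pb]).T, bias=True)))   # True
```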
OUTLINE Introduction VEBF Neural Network Example for Training Experimental Results
Example for Training Suppose that X = {((5,16)^T, 0), ((15,6)^T, 1), ((10,18)^T, 0), ((5,6)^T, 1), ((11,16)^T, 0)} is a set of training data in R^2. The training data in class 0 are illustrated by "+" while the training data in class 1 are illustrated by "*".
Example for Training • 1. The training datum ((5,16)^T, 0) is presented to the VEBF neural network. • class 0 • create a new neuron
Example for Training • 2. The training datum ((15,6)^T, 1) is fed to the VEBF neural network. • class 1 • create a new neuron
Example for Training • 3. The training datum ((10,18)^T, 0) is fed to the VEBF neural network. • class 0 • find the closest neuron • check the distance • update the parameters
Example for Training • 4. The training datum ((5,6)^T, 1) is fed to the VEBF neural network. • class 1 • find the closest neuron • check the distance • create a new neuron
Example for Training • 5. The training datum ((11,16)^T, 0) is fed to the VEBF neural network. • class 0 • find the closest neuron • check the distance • update the parameters
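Putting steps 1-5 together, here is a minimal end-to-end sketch of this example. It uses axis-aligned ellipsoids with a fixed, hypothetical initial width A0 = 6; the paper additionally rotates and resizes the axes from the covariance eigenvectors:

```python
import numpy as np

A0 = 6.0  # hypothetical initial axis width; the paper treats this as a parameter

class Neuron:
    def __init__(self, x, label):
        self.c, self.N, self.label = np.asarray(x, float), 1, label
        self.a = np.full(len(self.c), A0)   # widths kept fixed in this sketch

    def psi(self, x):
        # Axis-aligned VEBF (identity basis) for simplicity.
        return np.sum((x - self.c) ** 2 / self.a ** 2) - 1.0

    def update(self, x):
        self.c = (self.N * self.c + x) / (self.N + 1)
        self.N += 1

def train(data):
    neurons = []
    for x, label in data:
        x = np.asarray(x, float)
        same = [nb for nb in neurons if nb.label == label]
        cs = min(same, key=lambda nb: np.linalg.norm(x - nb.c)) if same else None
        if cs is None or cs.psi(x) > 0:
            neurons.append(Neuron(x, label))   # steps 1, 2, 4: allocate
        else:
            cs.update(x)                       # steps 3, 5: join and update
    return neurons

X = [((5, 16), 0), ((15, 6), 1), ((10, 18), 0), ((5, 6), 1), ((11, 16), 0)]
net = train(X)
print(len(net))                    # 3 hidden neurons
print([nb.label for nb in net])    # [0, 1, 1]
```

With these assumed widths the run reproduces the deck's narrative: data 1, 2, and 4 allocate new neurons, while data 3 and 5 join the first class-0 neuron.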
OUTLINE Introduction VEBF Neural Network Example for Training Experimental Results
Experimental Results • In multiclass classification problems, • the results are compared with the conventional RBF neural network with Gaussian RBF and the multilayer perceptron (MLP). • In two-class classification problems, • the results are also compared with the support vector machine (SVM).
Experimental Results The data sets used for training and testing are collected from the University of California at Irvine (UCI) Machine Learning Repository.
Experimental Results Multiclass classification
Experimental Results Two-class classification