Kernel nearest means Usman Roshan
Feature space transformation • Let Φ(x) be a feature space transformation. • For example, if we are in a two-dimensional vector space and x = (x1, x2), then a typical choice is the degree-2 polynomial map Φ(x) = (x1², √2 x1x2, x2²).
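A minimal Python sketch (not from the slides) checking that the degree-2 polynomial map above satisfies Φ(x)·Φ(y) = (x·y)², i.e. a dot product in the feature space can be computed from the dot product in the input space:

```python
import numpy as np

def phi(x):
    # Degree-2 polynomial feature map for a 2-D vector x = (x1, x2)
    # (an illustrative choice of feature space transformation).
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# Both print 16.0: phi(x) . phi(y) equals (x . y)^2, the degree-2 polynomial kernel.
print(np.dot(phi(x), phi(y)))
print(np.dot(x, y) ** 2)
```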
Computing Euclidean distances in a different feature space • The advantage of kernels is that we can compute Euclidean and other distances in different feature spaces without explicitly performing the feature space conversion.
Computing Euclidean distances in a different feature space • First note that the squared Euclidean distance between two vectors can be written as ||x - y||² = x·x + y·y - 2x·y. • In feature space we have ||Φ(x) - Φ(y)||² = K(x,x) + K(y,y) - 2K(x,y), where K is the kernel matrix with entries K(x,y) = Φ(x)·Φ(y).
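The following Python sketch illustrates this identity: the squared feature-space distance is computed purely through kernel evaluations. The RBF kernel here is only an example stand-in; any valid kernel K works the same way.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian (RBF) kernel, used here as an example kernel function K.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_distance_sq(x, y, kernel):
    # Squared Euclidean distance between phi(x) and phi(y),
    # computed as K(x,x) + K(y,y) - 2K(x,y) without forming phi explicitly.
    return kernel(x, x) + kernel(y, y) - 2 * kernel(x, y)

x = np.array([0.0, 1.0])
y = np.array([1.0, 1.5])
print(kernel_distance_sq(x, y, rbf_kernel))
```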
Computing distance to mean in feature space • Recall that the mean of a class (say C1) is given by m = (1/|C1|) Σx∈C1 x. • In feature space the mean Φm would be Φm = (1/|C1|) Σx∈C1 Φ(x).
Computing distance to mean in feature space • The squared distance from a point x to the class mean is ||Φ(x) - Φm||² = K(m,m) + K(x,x) - 2K(m,x). • Replace K(m,m) and K(m,x) with the expansions from the previous slide: K(m,m) = (1/|C1|²) Σxi∈C1 Σxj∈C1 K(xi,xj) and K(m,x) = (1/|C1|) Σxi∈C1 K(xi,x).
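A small Python sketch of these expansions, assuming a precomputed kernel matrix K over the training data; the linear kernel and random data below are purely for illustration:

```python
import numpy as np

def mean_self_similarity(K, idx):
    # K(m,m): (1/|C1|^2) * sum over all pairs xi, xj in the class of K(xi, xj)
    sub = K[np.ix_(idx, idx)]
    return sub.sum() / (len(idx) ** 2)

def mean_point_similarity(K_col, idx):
    # K(m,x): (1/|C1|) * sum over class members xi of K(xi, x),
    # where K_col holds K(xi, x) for every training point xi
    return K_col[idx].sum() / len(idx)

# Toy demo with a linear kernel on random data (illustration only).
X = np.random.randn(6, 2)        # training points
K = X @ X.T                      # training kernel matrix
c1 = [0, 1, 2]                   # hypothetical indices of the points in class C1
x = np.random.randn(2)           # a query point
K_col = X @ x                    # K(xi, x) for every training point xi

# ||Phi(x) - Phi_m||^2 = K(m,m) + K(x,x) - 2 K(m,x)
d_sq = mean_self_similarity(K, c1) + x @ x - 2 * mean_point_similarity(K_col, c1)
print(d_sq)
```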
Kernel nearest means algorithm • Compute the kernel matrix • Let xi (i=0..n-1) be the training datapoints and yi (i=0..n'-1) the test datapoints • For each class mean mj compute K(mj,mj) • For each datapoint yi in the test set do • For each mean mj do • dj = K(mj,mj) + K(yi,yi) - 2K(mj,yi) • Assign yi to the class with the minimum dj
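Below is a minimal Python sketch of the whole algorithm, assuming precomputed kernel matrices; the function name, argument layout, and the linear-kernel toy data are illustrative, not from the slides.

```python
import numpy as np

def kernel_nearest_means(K_train, K_test, k_test_diag, labels):
    # K_train:     n x n kernel matrix over training points
    # K_test:      n x n' kernel matrix between training and test points
    # k_test_diag: length-n' vector of K(yi, yi) for each test point
    # labels:      length-n array of training class labels
    # Returns the predicted class label for each test point.
    classes = np.unique(labels)
    dists = np.empty((len(classes), K_test.shape[1]))

    for c_idx, c in enumerate(classes):
        idx = np.where(labels == c)[0]
        # K(mj,mj): average kernel value over all pairs within the class
        k_mm = K_train[np.ix_(idx, idx)].sum() / (len(idx) ** 2)
        # K(mj,yi): average kernel value between class members and each test point
        k_my = K_test[idx].sum(axis=0) / len(idx)
        # dj = K(mj,mj) + K(yi,yi) - 2 K(mj,yi)
        dists[c_idx] = k_mm + k_test_diag - 2 * k_my

    # Assign each test point to the class with the minimum dj.
    return classes[np.argmin(dists, axis=0)]

# Toy usage with a linear kernel (any precomputed kernel matrix works the same way).
X = np.random.randn(10, 3)
y = np.array([0] * 5 + [1] * 5)
Y = np.random.randn(4, 3)
preds = kernel_nearest_means(X @ X.T, X @ Y.T, np.sum(Y * Y, axis=1), y)
print(preds)
```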