Estimation Techniques CSC3417 Semester 1, 2007
This is a snapshot from the course website. We will investigate! The Resources
Study Book Chapter 5 • Table of Contents • Techniques, algorithms, and ideas
Solve equations to find the coefficients Read sb5, Section 3 • Setting up and solving the equations for the coefficients is remarkably easy. • Let's do it in Scilab. • For simplicity, we will use only radial basis functions – no linear or constant terms at all. • Let's start by plotting r² log r
Here is the code to plot a radial basis function:

```scilab
x = -1:0.05:1;
y = x;
for k = 1:length(x)
  for j = 1:length(y)
    r = sqrt(x(k)^2 + y(j)^2);          // r is the radial distance from (0,0)
    z(k,j) = (r^2) * log(r + 0.0001);   // small offset avoids log(0) at r = 0
  end
end
plot3d1(x, y, z);
```
Generate some data (and plot it):

```scilab
xydata = zeros(20*20, 3);
zclean = zeros(20, 20);
z = zclean;
for x1 = 1:20
  for x2 = 1:20
    xydata(x2+20*(x1-1), 2) = x1;
    xydata(x2+20*(x1-1), 3) = x2;
    xydata(x2+20*(x1-1), 1) = func2(x1, x2);
    zclean(x1, x2) = xydata(x2+20*(x1-1), 1);
  end
end
xx1 = 1:20;
xx2 = 1:20;
plot3d1(xx1, xx2, zclean);
```
Add noise:

```scilab
rand("normal");
for x1 = 1:20
  for x2 = 1:20
    xydata(x2+20*(x1-1), 1) = func2(x1, x2) + 2*rand();
    z(x1, x2) = xydata(x2+20*(x1-1), 1);
  end
end
plot3d1(xx1, xx2, z);
```

func2.sci is a handy function stored separately (see resources).
Generate radial basis fn eqns and solve them. phi.sci handles vector parameters (see resources).

```scilab
phimat = phi(xydata(:,2:3), xydata(:,2:3));
[U,S,V] = svd(phimat);
for k = 1:400
  S(k,k) = 1/(S(k,k) + 2);   // shrink rather than invert
end
A = V*S*U.'*xydata(:,1);
// we solve for the coefficients, A, but smooth
// our estimates by the shrinking
```

phimat*A = xydata(:,1) is the matrix system to solve; A is the solution.
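The "shrink rather than invert" step is worth isolating. A plain pseudo-inverse applies the filter 1/s to each singular value s, which explodes when s is tiny; the ridge-style shrinkage applies 1/(s + 2) instead, which stays bounded. A minimal Python sketch (function names are illustrative, not course code; the constant 2 matches the Scilab loop above):

```python
# Filters applied to a single singular value s.
# Names here are made up for illustration.

def plain_inverse(s):
    """Ordinary pseudo-inverse filter: blows up as s -> 0."""
    return 1.0 / s

def shrunk_inverse(s, lam=2.0):
    """Ridge-style shrinkage: bounded by 1/lam even at s = 0."""
    return 1.0 / (s + lam)

# A dominant singular value is barely changed by the shrinkage ...
big = 100.0
print(plain_inverse(big), shrunk_inverse(big))

# ... but a near-zero one no longer amplifies noise by ~1/s:
tiny = 1e-6
print(plain_inverse(tiny), shrunk_inverse(tiny))
```

This is why the smoothed solution is far less sensitive to noise in the data: the directions of phimat with tiny singular values are damped instead of amplified.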
Evaluate and then plot the solution:

```scilab
for j = 1:20
  for k = 1:20
    B = phi([xx1(j), xx2(k)], xydata(:,2:3));
    soln(j,k) = B*A;
  end
end
plot3d1(xx1, xx2, soln);
```
Plot the estimated signal • Here it is. • Check the quality of the fit numerically:

```scilab
rms(z)               // noisy data
rms(zclean)          // clean signal
rms(z - soln)        // residual against the noisy data
rms(zclean - soln)   // residual against the clean signal
```
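rms here is presumably a small helper rather than a core Scilab builtin. A minimal Python sketch of what such a helper computes:

```python
import math

def rms(values):
    """Root-mean-square of a flat list of numbers."""
    return math.sqrt(sum(v * v for v in values) / len(values))

# If the smoothing works, rms(zclean - soln) should come out well
# below the level of the noise that was added to the data.
```

In the Scilab session the arguments are 20x20 matrices; the idea is identical once they are flattened to a list of entries.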
Algorithmic Complexity • As discussed in the SB, the main issue is the dimensionality. • This is a good reason to avoid using the linear component in an RBF model. • Note: RBF models without the linear component are not so different from nearest-neighbour (NN) models.
NN Implementation • Key issue: how to find the nearest neighbours quickly. • This is solved by storing the dataset in an appropriate data structure. • Decision trees provide such a structure. • Searching the tree prunes most of the dataset, giving a faster method for identifying the nearest neighbours.
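One concrete tree structure for this job is the k-d tree, which alternates coordinate splits so whole regions of the dataset can be pruned during a search. A minimal Python sketch for 2-D points (all names are illustrative, not from the course code):

```python
# Minimal k-d tree: build by alternating x/y median splits,
# then search while pruning branches that cannot beat the best.

def dist2(a, b):
    """Squared Euclidean distance between 2-D points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def build(points, depth=0):
    if not points:
        return None
    axis = depth % 2                      # alternate x / y splits
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2                # median point becomes the node
    return {"point": points[mid], "axis": axis,
            "left":  build(points[:mid], depth + 1),
            "right": build(points[mid + 1:], depth + 1)}

def nearest(node, q, best=None):
    if node is None:
        return best
    p, axis = node["point"], node["axis"]
    if best is None or dist2(q, p) < dist2(q, best):
        best = p
    near, far = ((node["left"], node["right"])
                 if q[axis] < p[axis]
                 else (node["right"], node["left"]))
    best = nearest(near, q, best)
    # Only descend the far side if the splitting plane is closer
    # to q than the best match found so far.
    if (q[axis] - p[axis]) ** 2 < dist2(q, best):
        best = nearest(far, q, best)
    return best
```

A query then visits only the branches whose region could still contain a closer point, which is what makes lookups fast compared with scanning every sample.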
Nearest Neighbour Methods Very easy to implement. • Simplest Method • Choose k, the number of neighbours to use (this is the fitting step) • To predict (plot the fitted surface): • Find the k nearest neighbours and use their "average". • Key advantages: • Inherently non-parametric • Not very sensitive to the number of x coordinates (the input dimension).
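The simplest method above fits in a few lines of Python (knn_predict and the sample data are made up for illustration):

```python
def knn_predict(data, q, k=3):
    """data: list of ((x1, x2), y) samples; q: 2-D query point.
    Predict by averaging the y values of the k nearest samples."""
    by_dist = sorted(data, key=lambda s:
                     (s[0][0] - q[0]) ** 2 + (s[0][1] - q[1]) ** 2)
    return sum(y for _, y in by_dist[:k]) / k

samples = [((0, 0), 1.0), ((0, 1), 2.0), ((1, 0), 3.0), ((5, 5), 100.0)]
# With k = 3, the distant sample at (5, 5) never enters the average.
print(knn_predict(samples, (0.2, 0.2), k=3))
```

Note there are no fitted coefficients at all: choosing k is the whole "fitting" step, which is exactly what makes the method non-parametric.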
Next Week: Neural Networks and Self-organising maps • These are alternative ways to define hypersurfaces. • They rely on the concept of “learning” the response function from a “training set”. • Learning is a natural concept. • But it’s slower than fitting by linear equations.