Neural Networks and SVMs: Image Recognition - Exam Preparation

Learn about SVMs, multilayer feedforward neural networks, perceptrons, layers, backpropagation, and more for image recognition. Discover the power of neural networks in classifying images. MATLAB demonstrations included.


Presentation Transcript


  1. CSSE463: Image Recognition Day 21 • Upcoming schedule: • Exam covers material through SVMs

  2. Multilayer feedforward neural nets
  • Many perceptrons, organized into layers:
  • Input (sensory) layer
  • Hidden layer(s): 2 proven sufficient to model any arbitrary function
  • Output (classification) layer
  • Powerful! Calculates functions of the input and maps them to the output layer.
  • Example: sensory inputs x1, x2, x3 (HSV), a hidden layer of functions, and classification outputs y1, y2, y3 (apple/orange/banana). [Figure: three-layer network diagram]
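
  To make the layer structure concrete, here is a minimal MATLAB sketch (not from the course materials) of one forward pass through a 3-input, 4-hidden, 3-output network; the weights W1, W2, b1, b2 and the input x are made-up illustrative values.

      % One forward pass through a small feedforward net (illustrative values).
      x  = [0.5; 0.2; 0.9];                % sensory layer, e.g., HSV features
      W1 = randn(4, 3);  b1 = randn(4, 1); % input -> hidden weights and biases
      W2 = randn(3, 4);  b2 = randn(3, 1); % hidden -> output weights and biases
      sigmoid = @(z) 1 ./ (1 + exp(-z));   % logistic activation
      h = sigmoid(W1 * x + b1);            % hidden layer: functions of the input
      y = sigmoid(W2 * h + b2);            % output scores, e.g., apple/orange/banana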

  3. XOR example • 2 inputs • 1 hidden layer of 5 neurons • 1 output
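
  A sketch of how this XOR net could be built with the MATLAB Neural Network Toolbox (the function names are the toolbox's, not necessarily the course demo's):

      % Train a 2-input, 5-hidden-neuron, 1-output net on XOR.
      X = [0 0 1 1; 0 1 0 1];          % 2 inputs, 4 training patterns
      T = [0 1 1 0];                   % XOR targets
      net = feedforwardnet(5);         % 1 hidden layer of 5 neurons
      net.divideFcn = 'dividetrain';   % use all 4 patterns for training
      net = train(net, X, T);          % trains by backpropagation
      round(net(X))                    % should reproduce [0 1 1 0]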

  4. Backpropagation algorithm
  • Initialize all weights randomly
  • For each labeled example:
  • a. Calculate the output using the current network (feedforward)
  • b. Update the weights across the network, from output to input, by gradient descent (the generalized delta rule) (feedback)
  • Repeat, iterating until convergence
  • The learning rate epsilon decreases at every iteration
  • MATLAB does this for you: matlabNeuralNetDemo.m
  [Figure: the same three-layer network (x1-x3 to y1-y3), annotated with the feedforward and feedback passes]
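
  For intuition, here is a hand-rolled sketch of that loop on the XOR data above (an assumption of how a minimal implementation might look, not the course demo): one hidden layer, logistic units, squared error, and a shrinking learning rate epsilon.

      % Backpropagation by hand: initialize weights randomly, then alternate
      % feedforward (a) and weight updates (b) until convergence.
      X = [0 0 1 1; 0 1 0 1];  T = [0 1 1 0];
      W1 = randn(5, 2);  b1 = randn(5, 1);
      W2 = randn(1, 5);  b2 = randn(1, 1);
      sigmoid = @(z) 1 ./ (1 + exp(-z));
      epsilon = 0.5;
      for epoch = 1:5000
          for i = 1:size(X, 2)                 % for each labeled example
              x = X(:, i);  t = T(:, i);
              h = sigmoid(W1 * x + b1);        % a. calculate output (feedforward)
              y = sigmoid(W2 * h + b2);
              dy = (y - t) .* y .* (1 - y);    % b. update weights (feedback),
              dh = (W2' * dy) .* h .* (1 - h); %    from output back to input
              W2 = W2 - epsilon * dy * h';  b2 = b2 - epsilon * dy;
              W1 = W1 - epsilon * dh * x';  b1 = b1 - epsilon * dh;
          end
          epsilon = 0.999 * epsilon;           % epsilon decreases at every iteration
      end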

  5. Parameters
  • Most networks are reasonably robust with respect to the learning rate and how the weights are initialized.
  • However, figuring out how to normalize your input and how to determine the architecture of your net is a black art. You might need to experiment.
  • One hint: re-run the network with different initial weights and different architectures, testing performance each time on a validation set, and pick the best (see the sketch below).
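
  A sketch of that hint in MATLAB, assuming the toolbox's feedforwardnet/train and hypothetical pre-split training and validation sets Xtrain/Ttrain and Xval/Tval:

      % Try several architectures and several random initializations;
      % keep whichever net does best on the held-out validation set.
      bestErr = Inf;
      for hidden = [5 10 20]                      % candidate architectures
          for trial = 1:3                         % different initial weights
              net = feedforwardnet(hidden);
              net = train(net, Xtrain, Ttrain);
              err = mean((net(Xval) - Tval).^2);  % validation performance
              if err < bestErr
                  bestErr = err;  bestNet = net;  % pick best
              end
          end
      end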

  6. References
  • This is just the tip of the iceberg! See:
  • Sonka, pp. 404-407.
  • Laurene Fausett. Fundamentals of Neural Networks. Prentice Hall, 1994. (Approachable for the beginner.)
  • C.M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1995. (A technical reference focused on the art of constructing networks: learning rate, number of hidden layers, etc.)
  • The MATLAB neural net help is very good.

  7. SVMs vs. Neural Nets
  • SVM:
  • Training can take a long time with large data sets; consider that you'll want to experiment with parameters.
  • But the classification runtime and space are O(sd), where s is the number of support vectors and d is the dimensionality of the feature vectors.
  • In the worst case, s = the size of the whole training set (like nearest neighbor).
  • But that is no worse than implementing a neural net with s perceptrons in the hidden layer.
  • Empirically shown to have good generalizability even with relatively small training sets and no domain knowledge.
  • Neural networks:
  • You can tune the architecture. Lots of parameters!

  8. How does svmfwd compute y1?
  • y1 is just the weighted sum of the contributions of the individual support vectors, plus a bias; for an RBF kernel, y1 = sum_i svcoeff_i * exp(-||x - sv_i||^2 / (2*s^2)) + bias, where each vector has d dimensions (e.g., 294) and s is the kernel width.
  • This is a much easier computation than training.
  • You could easily implement it on a device without MATLAB (e.g., a smartphone). Check out the sample Java code.
  • numSupVecs, svcoeff (the alphas), and bias are learned during training.
  • Note: looking at which of your training examples are support vectors can be revealing! (Keep this in mind for the sunset detector and the term project.)
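
  A sketch of that weighted sum in MATLAB, assuming an RBF kernel and hypothetical variable shapes: x is a 1-by-d test vector, supVecs is numSupVecs-by-d, svcoeff is the numSupVecs-by-1 vector of alphas, and s and bias are as above.

      % Sum each support vector's kernel response, weighted by its alpha.
      dist2 = sum((supVecs - repmat(x, numSupVecs, 1)).^2, 2);  % squared distances, numSupVecs-by-1
      y1 = svcoeff' * exp(-dist2 / (2 * s^2)) + bias;           % weighted sum plus bias

  Note that the cost is one pass over the support vectors, matching the O(sd) classification runtime from the previous slide.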

  9. CSSE463 Roadmap
