
Advanced Computer Vision



Presentation Transcript


  1. Advanced Computer Vision

     Lecture 05
     Roger S. Gaborski
  2. Artificial Neural Network Classifier
     Multilayer Neural Network
     Backward Error Propagation
     MATLAB: command line, GUI
  3. Procedure
     Collect data
     Create the network
     Configure the network
     Initialize the weights and biases
     Train the network
     Validate the network (post-training analysis)
  4. Collect data
     Feature values for the different classes
     Example problem: classify an image as indoor or outdoor
       Collect a database of labeled indoor and outdoor images
       Use approximately equal numbers of indoor and outdoor images
       Divide the database into training, validation, and testing sets
       Extract (calculate) features
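     A minimal MATLAB sketch of the data division mentioned on slide 4, using the toolbox function dividerand. The variables features (one column of feature values per image) and labels (1 = indoor, 0 = outdoor), as well as the 70/15/15 ratios, are hypothetical and chosen only for illustration:

       % Hypothetical variables: 'features' is nFeatures x nImages,
       % 'labels' is 1 x nImages (1 = indoor, 0 = outdoor)
       nImages = size(features,2);
       [trainInd,valInd,testInd] = dividerand(nImages,0.70,0.15,0.15);
       trainX = features(:,trainInd);   trainT = labels(:,trainInd);
       valX   = features(:,valInd);     valT   = labels(:,valInd);
       testX  = features(:,testInd);    testT  = labels(:,testInd);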
  5. Create the network
     Assume we have n different features
     Two output classes: indoor or outdoor
     Our network has n input neurons and 2 output neurons
     We need to specify the number of hidden layers and how many neurons are in each layer
     For this example, assume n = 10 and we use only one hidden layer
     Number of neurons in the hidden layer = 6
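     A minimal sketch of the 10:6:2 network described on slide 5, built with feedforwardnet. The input and target matrices are hypothetical placeholders whose sizes simply match the 10-feature, 2-class setup:

       net = feedforwardnet(6);        % one hidden layer with 6 neurons
       X = rand(10,100);               % hypothetical data: 10 features x 100 samples
       c = randi([0 1],1,100);         % hypothetical class labels
       T = [c; 1-c];                   % 2 output rows: indoor / outdoor
       net = configure(net,X,T);       % input/output sizes are taken from the data
       view(net)                       % optional: displays the 10:6:2 architecture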
  6. Initialize the weights and biases
     Use random numbers for the weights and biases
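     The random initialization on slide 6 is what configure already performs; continuing from the hypothetical net, X, and T above, a sketch of re-initializing and inspecting the weights and biases:

       net = init(net);      % re-initialize using the network's initialization functions
       net.IW{1,1}           % input-to-hidden weights (6 x 10)
       net.LW{2,1}           % hidden-to-output weights (2 x 6)
       net.b{1}              % hidden layer biases (6 x 1)
       net.b{2}              % output layer biases (2 x 1)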
  7. Train the network
     The MATLAB toolbox has several different training algorithms built in
     See the Neural Network Toolbox Reference for additional information
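     A sketch of training with an explicitly chosen algorithm, continuing from the net, X, and T above; 'trainscg' and the epoch limit are illustrative choices, not recommendations from the slides:

       net.trainFcn = 'trainscg';        % scaled conjugate gradient ('trainlm' is the default)
       net.trainParam.epochs = 200;      % maximum number of epochs (illustrative)
       [net,tr] = train(net,X,T);        % tr is the training record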
  8. Validate the network (post-training analysis)
     After training and validation (the validation set is used during training to determine the final weights), evaluate the network on the test data
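     A sketch of the post-training test on slide 8, assuming the training record tr returned by train above; tr records which samples were held out for testing:

       testX = X(:,tr.testInd);          % samples train() reserved for testing
       testT = T(:,tr.testInd);
       testY = net(testX);               % network outputs on the test set
       perform(net,testT,testY)          % test-set performance (mse by default)
       plotconfusion(testT,testY)        % confusion matrix for the two classes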
  9. Example from the Documentation: house_dataset
     The input matrix houseInputs consists of 506 column vectors of 13 real estate variables for 506 different houses.
     The target matrix houseTargets consists of the corresponding 506 relative valuations.
     >> load house_dataset
     houseInputs    13 x 506
     houseTargets    1 x 506
  10. Create the Network
     net = feedforwardnet;
     General form: feedforwardnet(hiddenSizes,trainFcn)
       Default: one hidden layer with 10 neurons
       Training function (default = 'trainlm')
       13:10:1
     net = configure(net,houseInputs,houseTargets);
       Sets random weights
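     For comparison with the defaults listed on slide 10, a sketch of the two-argument form; the hidden size 20 and the training function 'trainbr' are arbitrary examples, not values used in the lecture:

       net = feedforwardnet(20,'trainbr');              % 20 hidden neurons, Bayesian regularization
       net = configure(net,houseInputs,houseTargets);   % gives a 13:20:1 network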
  11. Training a Network
     >> load house_dataset
     >> net = feedforwardnet;
     >> [net,tr] = train(net,houseInputs,houseTargets);
     Note: 'configure' is included in train
  12. plotperf(tr)
     plotperf(tr) displays the training performance
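     Besides the plot, the training record tr returned by train can be inspected directly; a small sketch using standard training-record fields, assuming the house_dataset run on slide 11:

       plotperf(tr)         % training / validation / test performance vs. epoch
       tr.best_epoch        % epoch with the best validation performance
       tr.best_vperf        % best validation performance
       tr.tperf(end)        % test-set performance at the final epoch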
  13. nnstart
     GUI interface to the toolbox