
Chapter 5 Classification: Alternative Techniques, Part 4: Artificial Neural Networks Based Classifier






Presentation Transcript


1. Chapter 5 Classification: Alternative Techniques, Part 4: Artificial Neural Networks Based Classifier

2. Artificial Neural Networks (ANN) / 1
Output Y is 1 if at least two of the three inputs are equal to 1.
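As a quick check, the target concept on this slide is the boolean majority of three inputs. A minimal sketch enumerating its truth table (plain Python; the function name is mine, not from the slides):

```python
from itertools import product

def majority(x1, x2, x3):
    # Y is 1 if at least two of the three inputs are 1
    return 1 if (x1 + x2 + x3) >= 2 else 0

for x in product([0, 1], repeat=3):
    print(x, "->", majority(*x))
```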

  3. Artificial Neural Networks (ANN) / 2

4. Artificial Neural Networks (ANN) / 3
• The model is an assembly of interconnected nodes and weighted links.
• The output node sums each of its input values according to the weights of its links.
• The output value is compared against some threshold t.
Perceptron model: Y = I(Σj wj Xj − t > 0), or equivalently Y = sign(Σj wj Xj − t).
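A minimal sketch of the perceptron model just described. The specific weights (0.3 per link) and threshold (t = 0.4) are illustrative choices of mine that happen to realize the majority function from slide 2; they are not given in the transcript:

```python
from itertools import product

W = [0.3, 0.3, 0.3]   # one weight per input link (illustrative values)
T = 0.4               # threshold t (illustrative value)

def perceptron(x):
    s = sum(w * xi for w, xi in zip(W, x))   # weighted sum of the inputs
    return 1 if s - T > 0 else 0             # Y = I(sum_j wj Xj - t > 0)

for x in product([0, 1], repeat=3):
    print(x, "->", perceptron(x))            # matches the majority function
```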

5. General Structure of ANN
Training an ANN means learning the weights of the neurons.

6. Algorithm for Learning ANN
• Initialize the weights (w0, w1, …, wk).
• Adjust the weights so that the output of the ANN is consistent with the class labels of the training examples.
• Objective function: E = Σi [yi − f(w, xi)]²
• Find the weights wj that minimize this objective function, e.g., with the backpropagation algorithm.
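A minimal sketch of weight learning as minimization of the squared-error objective above, using plain gradient descent on a single linear unit. The toy data, learning rate, and epoch count are all illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(8, 3)).astype(float)   # toy binary inputs
y = (X.sum(axis=1) >= 2).astype(float)              # majority labels

w = np.zeros(3)
lr = 0.05                                 # learning rate (illustrative)

for epoch in range(500):
    f = X @ w                             # model output f(w, xi) (linear unit)
    e = y - f                             # per-example error
    E = np.sum(e ** 2)                    # objective: E = sum_i [yi - f(w, xi)]^2
    w += lr * (X.T @ e)                   # step along -dE/dw (factor 2 folded into lr)

print("learned weights:", w, "error in final epoch:", E)
```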

  7. Artificial Neural Networks (ANN) / 2

  8. Perceptron

9. Perceptron Learning Rule / 1
• Let D = {(xi, yi) | i = 1, 2, …, N} be the set of training examples.
• Initialize the weights.
• Repeat:
  • For each training example (xi, yi):
    • Compute f(w, xi).
    • For each weight wj, update the weight (formula on the next slide; see the runnable sketch below).
• Until the stopping condition is met.
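A runnable sketch of the loop above. The sign activation, the ±1 label encoding, and the appended bias input (so the threshold is learned as a weight) are conventional choices of mine, not spelled out in the transcript:

```python
import numpy as np
from itertools import product

def sign(v):
    return 1 if v >= 0 else -1

def train_perceptron(D, lam=0.1, max_epochs=100):
    # D is a list of (xi, yi) pairs with yi in {-1, +1}
    w = np.zeros(len(D[0][0]))                 # initialize the weights
    for epoch in range(max_epochs):            # repeat ...
        errors = 0
        for x, y in D:                         # for each training example
            f = sign(w @ x)                    # compute f(w, xi)
            if f != y:                         # error e = y - f is nonzero
                w += lam * (y - f) * x         # update each weight wj
                errors += 1
        if errors == 0:                        # stopping condition met
            break
    return w

# Majority-of-three data; a constant 1 is appended so the threshold t
# is learned as the last weight.
D = [(np.array(x + (1,), dtype=float), 1 if sum(x) >= 2 else -1)
     for x in product([0, 1], repeat=3)]
print(train_perceptron(D))
```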

10. Perceptron Learning Rule / 2
• Weight update formula: wj(k+1) = wj(k) + λ (yi − f(w(k), xi)) xij, where λ is the learning rate.
• Intuition: update the weight based on the error e = y − f(w, x). Since y and f(w, x) take values in {−1, +1}:
  • If y = f(w, x), then e = 0 and no update is needed.
  • If y > f(w, x), then e = 2 and the weight must be increased so that f(w, x) will increase.
  • If y < f(w, x), then e = −2 and the weight must be decreased so that f(w, x) will decrease.

11. Perceptron Learning Rule / 3
• Terminating condition: training stops when either
  • all weight changes Δwij in the previous epoch (i.e., iteration) were so small as to be below some specified threshold, or
  • the percentage of samples misclassified in the previous epoch is below some threshold, or
  • a pre-specified number of epochs has expired.
• In practice, several hundreds of thousands of epochs may be required before the weights converge.

  12. Example of Perceptron Learning

  13. Perceptron Learning

  14. Nonlinearly Separable Data
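The textbook instance of nonlinearly separable data is XOR: no single weight vector and threshold classify all four points correctly, so the perceptron rule never reaches zero errors. A minimal sketch demonstrating this (the epoch cap and learning rate are illustrative):

```python
import numpy as np
from itertools import product

# XOR with a bias input appended; it is not linearly separable.
D = [(np.array(x + (1,), dtype=float), 1 if (x[0] ^ x[1]) else -1)
     for x in product([0, 1], repeat=2)]

w = np.zeros(3)
for epoch in range(1000):
    errors = 0
    for x, y in D:
        f = 1 if w @ x >= 0 else -1
        if f != y:
            w += 0.1 * (y - f) * x
            errors += 1
    if errors == 0:
        break

print("misclassified in final epoch:", errors)   # stays > 0 for XOR
```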

  15. Multilayer Neural Network / 1

  16. Multilayer Neural Network / 2
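One hidden layer is enough to handle the XOR problem above. A sketch of a 2-2-1 sigmoid network whose weights I picked by hand for illustration (h1 acts like OR, h2 like AND, and the output computes h1 AND NOT h2, i.e., XOR); nothing here comes from the slides:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-picked (illustrative) weights: row 1 of W1 with bias -10 builds an
# OR-like unit, row 2 with bias -30 an AND-like unit; the output unit
# computes h1 AND NOT h2, which is XOR.
W1 = np.array([[20.0, 20.0],
               [20.0, 20.0]])
b1 = np.array([-10.0, -30.0])
W2 = np.array([20.0, -20.0])
b2 = -10.0

def forward(x):
    h = sigmoid(W1 @ x + b1)        # hidden-layer activations
    return sigmoid(W2 @ h + b2)     # output unit

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", round(float(forward(np.array(x, dtype=float)))))
```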

  17. Learning Multilayer Neural Network

  18. Gradient Descent for Multilayer NN / 1

  19. Gradient Descent for Multilayer NN / 2
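A minimal backpropagation sketch for a one-hidden-layer sigmoid network trained on XOR with the squared-error objective from slide 6. The architecture (3 hidden units), initialization, learning rate, and epoch count are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])              # XOR targets

W1 = rng.normal(size=(3, 2)); b1 = np.zeros(3)  # hidden layer (3 units)
W2 = rng.normal(size=3);      b2 = 0.0          # output unit
lr = 0.5

for epoch in range(10000):
    # forward pass
    H = sigmoid(X @ W1.T + b1)                  # hidden activations, shape (4, 3)
    out = sigmoid(H @ W2 + b2)                  # outputs, shape (4,)
    # backward pass: gradients of E = sum_i (yi - out_i)^2
    d_out = (out - y) * out * (1 - out)         # error signal at the output
    dW2 = H.T @ d_out;  db2 = d_out.sum()
    d_hid = np.outer(d_out, W2) * H * (1 - H)   # error signal at hidden units
    dW1 = d_hid.T @ X;  db1 = d_hid.sum(axis=0)
    # gradient descent step (factor 2 folded into lr)
    W2 -= lr * dW2;  b2 -= lr * db2
    W1 -= lr * dW1;  b1 -= lr * db1

print(np.round(sigmoid(sigmoid(X @ W1.T + b1) @ W2 + b2), 2))
```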

  20. Design Issues in ANN

  21. Characteristics of ANN
