
Pattern Recognition: Statistical and Neural

This lecture explores the training and analysis of neural networks, including topics such as the perceptron algorithm, local delta training algorithms, and general definitions of neural networks. Examples of neural network structures and methods for analyzing and synthesizing neural networks are also discussed.


Presentation Transcript


  1. Nanjing University of Science & Technology. Pattern Recognition: Statistical and Neural. Lonnie C. Ludeman. Lecture 20, Oct 26, 2005

  2. Lecture 20 Topics: 1. Perceptron Algorithm Revisited; 2. Local Delta Training Algorithm for ANE; 3. General Definition of Neural Networks; 4. Basic Neural Network Structures and Examples; 5. Analysis and Synthesis of Neural Networks

  3. Signum Function Activation Training Algorithm (Perceptron) Review: y = +1 if input vector x is from C1; y = -1 if input vector x is from C2. Weight Update Algorithm: w(p+1) = w(p) + c x(p) if x(p) from C1 is misclassified; w(p+1) = w(p) - c x(p) if x(p) from C2 is misclassified; w(p+1) = w(p) otherwise, where c > 0 is the correction increment.
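A minimal runnable sketch of this fixed-increment signum/perceptron rule. The learning increment c, the toy data, and the augmented-weight convention (bias folded into w via a trailing 1) are illustrative assumptions, not from the slides:

```python
import numpy as np

def perceptron_train(X, labels, c=1.0, max_epochs=100):
    """Fixed-increment perceptron: y = sign(w^T x), labels are +1 (C1) or -1 (C2).
    Rows of X are augmented with a trailing 1 so the bias is part of w."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # augment with bias input
    w = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for x, d in zip(Xa, labels):
            if np.sign(w @ x) != d:             # misclassified sample
                w += c * d * x                  # move hyperplane toward correct side
                errors += 1
        if errors == 0:                         # all training patterns correct
            break
    return w

# Toy linearly separable example (illustrative data)
X = np.array([[2.0, 2.0], [1.5, 2.5], [-1.0, -1.0], [-2.0, -0.5]])
d = np.array([1, 1, -1, -1])
print(perceptron_train(X, d))
```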

  4. Question: How do we train an Artificial Neural Element (ANE) to do classification? Answer: Use the Delta Training Algorithm!

  5. Given an Artificial Neural Element as follows (a single neuron computing y = f(net), with net = w^T x), we wish to find a weight vector w such that the training patterns are correctly classified.

  6. Given: x(p) ∈ { x1, x2, … , xK } with desired outputs d[x(p)] ∈ { d(x1), d(x2), … , d(xK) }. Define a performance measure Ep for sample x(p) and decision d[x(p)] as Ep = ½ ( d[x(p)] - f(net) )^2, where net = w^T(p) x(p).

  7. Derivation of the Delta weight update equation. Use the gradient method to minimize Ep: the new weight w(k+1) in terms of the previous weight w(k) is w(k+1) = w(k) - η ∇w Ep, where the gradient is ∇w Ep = - ( d[x(p)] - f(net) ) f'(net) x(p).
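The chain-rule step behind this gradient, written out as a short derivation consistent with the definitions on the previous slide:

```latex
\begin{aligned}
E_p &= \tfrac{1}{2}\bigl(d[x(p)] - f(\mathrm{net})\bigr)^2,
  \qquad \mathrm{net} = w^T x(p) \\
\nabla_w E_p &= \bigl(d[x(p)] - f(\mathrm{net})\bigr)
  \cdot \bigl(-f'(\mathrm{net})\bigr) \cdot \nabla_w\,\mathrm{net} \\
&= -\bigl(d[x(p)] - f(\mathrm{net})\bigr)\, f'(\mathrm{net})\, x(p)
\end{aligned}
```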

  8. Substituting the gradient vector into the weight update gives the General Local Delta Algorithm, or rewriting gives w(p+1) = w(p) + η { d[x(p)] - f(net) } f'(net) x(p), where net = w^T(p) x(p). (General Local Delta Algorithm Weight Update Equation)
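A minimal sketch of this general local (per-sample) delta update, with the activation f and its derivative passed in as functions. The function names and the learning rate eta are illustrative assumptions:

```python
import numpy as np

def delta_step(w, x, d, f, fprime, eta=0.1):
    """One local delta-rule update: w <- w + eta * (d - f(net)) * f'(net) * x,
    with net = w^T x. Works for any differentiable activation f."""
    net = w @ x
    return w + eta * (d - f(net)) * fprime(net) * x
```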

  9. Sometimes called the Continuous Perceptron Training Algorithm

  10. Case 1: Local Delta Algorithm for Training an ANE with Logistic Activation Function. Given: f(net) = 1 / (1 + e^(-net)). Solution: f'(net) = f(net) ( 1 - f(net) ).

  11. Substituting the derivative gives the Local algorithm for the Logistic Activation function as w(p+1) = w(p) + η { d[x(p)] - f(net) } f(net) ( 1 - f(net) ) x(p). (Local Weight Update Equation for Logistic Activation Function)
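A standalone sketch of one such update with the logistic activation; the learning rate, weights, input, and target below are illustrative values:

```python
import numpy as np

def logistic(net):
    return 1.0 / (1.0 + np.exp(-net))

# One local delta update using f'(net) = f(net)(1 - f(net))
eta = 0.5                                  # illustrative learning rate
w = np.array([0.1, -0.2, 0.05])            # weights (last entry acts as bias)
x = np.array([1.0, 0.5, 1.0])              # augmented input pattern
d = 1.0                                    # desired output for this pattern
net = w @ x
fnet = logistic(net)
w = w + eta * (d - fnet) * fnet * (1.0 - fnet) * x
print(w)
```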

  12. Case 2: Local Delta Algorithm for Training an ANE with Hyperbolic Tangent Activation Function. Given: f(net) = tanh(net). Solution: taking the derivative of the nonlinearity, f'(net) = 1 - f^2(net), and substituting into the general update equation yields w(p+1) = w(p) + η { d[x(p)] - f(net) } ( 1 - f^2(net) ) x(p). (Local Weight Update Equation for Hyperbolic Tangent Activation Function)
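The same update step with the tanh activation, again with illustrative values for the learning rate and data:

```python
import numpy as np

eta = 0.5                                   # illustrative learning rate
w = np.array([0.1, -0.2, 0.05])
x = np.array([1.0, 0.5, 1.0])
d = -1.0                                    # desired output in {-1, +1}
net = w @ x
fnet = np.tanh(net)
w = w + eta * (d - fnet) * (1.0 - fnet**2) * x   # f'(net) = 1 - f^2(net)
print(w)
```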

  13. Scale Factors for Case 2: Tanh Activation Function. SF = ( d[x(p)] - f(net) ) ( 1 - f^2(net) ). For d[x(p)] = +1: SF+1 = ( 1 - f(net) ) ( 1 - f^2(net) ). For d[x(p)] = -1: SF-1 = ( -1 - f(net) ) ( 1 - f^2(net) ).

  14. Scale Factors for Case 2: Tanh Activation Function. [Plot: SF+1 and SF-1 versus f(net), with desired values = +0.9 and -0.9.]
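A short numeric check of these scale factors. The observation that SF vanishes as f(net) approaches the saturation values ±1, which is the usual motivation for using targets of ±0.9 instead of ±1, is an added explanatory note:

```python
import numpy as np

fnet = np.array([-0.99, -0.9, 0.0, 0.9, 0.99])    # sample activation outputs
sf_pos = (1.0 - fnet) * (1.0 - fnet**2)            # SF for d[x(p)] = +1
sf_neg = (-1.0 - fnet) * (1.0 - fnet**2)           # SF for d[x(p)] = -1
for f, sp, sn in zip(fnet, sf_pos, sf_neg):
    print(f"f(net)={f:+.2f}  SF+1={sp:+.4f}  SF-1={sn:+.4f}")
# SF+1 collapses toward 0 as f(net) -> +1, so learning stalls near saturation.
```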

  15. Case 3: Local Delta Algorithm for Training an ANE with Linear Activation Function. Given: f(net) = net. Solution: taking the derivative, f'(net) = 1, and substituting into the general update equation gives w(p+1) = w(p) + η { d[x(p)] - w^T(p) x(p) } x(p). (Local Weight Update Equation for Linear Activation Function, the Widrow-Hoff Training Rule)
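A minimal sketch of this Widrow-Hoff (LMS) rule run over a small data set; the data, learning rate, and epoch count are illustrative assumptions:

```python
import numpy as np

def widrow_hoff(X, d, eta=0.05, epochs=50):
    """LMS training with linear activation: the per-sample update is
    w <- w + eta * (d - w^T x) * x."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # augment with bias input
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        for x, target in zip(Xa, d):
            w += eta * (target - w @ x) * x
    return w

X = np.array([[0.0], [1.0], [2.0], [3.0]])
d = np.array([1.0, 3.0, 5.0, 7.0])              # roughly d = 2x + 1
print(widrow_hoff(X, d))                         # should approach [2, 1]
```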

  16. General Global Delta Algorithm. Define a performance measure ETOT over all samples xk and decisions d[xk] as ETOT = ½ Σk ( d[xk] - f(netk) )^2, where netk = w^T xk. Using the gradient technique gives the Global Delta Algorithm as w(new) = w(old) + η Σk { d[xk] - f(netk) } f'(netk) xk. (Global Weight Update Equation)
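A sketch of one global (batch) update, which accumulates the per-sample corrections over all K samples before changing the weights once; the activation choice and data are illustrative:

```python
import numpy as np

def global_delta_step(w, X, d, f, fprime, eta=0.1):
    """One batch update: sum (d_k - f(net_k)) f'(net_k) x_k over all
    samples, then apply the summed correction to w once."""
    grad = np.zeros_like(w)
    for x, target in zip(X, d):
        net = w @ x
        grad += (target - f(net)) * fprime(net) * x
    return w + eta * grad

# Illustrative usage with the tanh activation of Case 2
X = np.array([[1.0, 0.5, 1.0], [-0.5, 1.0, 1.0]])
d = np.array([1.0, -1.0])
w = np.zeros(3)
w = global_delta_step(w, X, d, np.tanh, lambda n: 1 - np.tanh(n)**2)
print(w)
```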

  17. Definitions: A Neural Network is defined as any connection of Neural Elements. An Artificial Neural Network is defined as any connection of Artificial Neural Elements.

  18. Examples of Artificial Neural Networks. Feedforward Artificial Neural Networks: (a) Two-Layer Neural Network; (b) Special Three-Layer Form: Hyperplane-AND-OR structure; (c) General Three-Layer Feedforward structure and nomenclature. Feedback Artificial Neural Networks: (d) One-Layer Hopfield Net; (e) Two-Layer Feedback.

  19. (a) Example - Two Layer Neural Network Using Signum Nonlinearity

  20. (b) Special Hyperplane-AND-OR structure. [Diagram: input x → Layer 1 (Hyperplanes) → Layer 2 (Logical AND) → Layer 3 (Logical OR) → output y.]

  21. Building Block: Hyperplane. [Diagram: a single ANE with unit-step activation u(·) realizing a hyperplane decision boundary.]

  22. Building Block: AND. [Diagram: an ANE with unit-step activation, unit weights, and bias -(n - ½), so the output is 1 only when all n binary inputs equal 1.]

  23. Building Block: OR. [Diagram: an ANE with unit-step activation, unit weights, and threshold ½, so the output is 1 when any binary input equals 1.]

  24. (b) Example: Hyperplane-AND-OR Structure. [Diagram: Hyperplanes Layer, all f(·) = u(·) (unit step); AND Layer; OR Layer.]
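A minimal sketch of these building blocks composed into the hyperplane-AND-OR structure. The example hyperplanes, which carve out the unit square as the decision region, are illustrative assumptions:

```python
import numpy as np

def u(net):
    """Unit-step activation: 1 where net >= 0, else 0."""
    return (np.asarray(net) >= 0).astype(float)

def hyperplane_layer(x, W, b):
    """Layer 1: each unit fires on one side of its hyperplane w^T x + b = 0."""
    return u(W @ x + b)

def and_unit(z):
    """Layer 2: unit weights, bias -(n - 1/2); fires only if all n inputs are 1."""
    return u(np.sum(z) - (len(z) - 0.5))

def or_unit(z):
    """Layer 3: unit weights, threshold 1/2; fires if any input is 1."""
    return u(np.sum(z) - 0.5)

# Illustrative region: 0 <= x1 <= 1 and 0 <= x2 <= 1, via four hyperplanes
W = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([0.0, 1.0, 0.0, 1.0])
for point in [np.array([0.5, 0.5]), np.array([2.0, 0.5])]:
    z = hyperplane_layer(point, W, b)          # which sides of each hyperplane
    y = or_unit(np.array([and_unit(z)]))       # one AND region fed to the OR
    print(point, "->", y)                      # inside -> 1.0, outside -> 0.0
```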

  25. (c) General Feedforward Structure

  26. (d) Example: Feedback Structure, One Layer

  27. (e) Example: Feedback Structure, Two Layer

  28. Definitions: Analysis of Neural Networks: given a Neural Network, describe the output for all inputs (mathematical or computer generated). Synthesis of Neural Networks: given a list of properties and requirements, build a Neural Network to satisfy the requirements (mathematical or computer generated).

  29. Example: Analyze the following Neural Network. [Diagram: network with weights -1, 0, 1, 1, 1, 0, 0, -1, 1.] Solution: Determine the output y1(2) for all (x1, x2). (Next Lecture)

  30. Example: Synthesize a Neural Network. Given the following decision regions, build a neural network to perform the classification process. Solution: Use the Hyperplane-AND-OR Structure. (Next Lecture)

  31. Summary, Lecture 20: 1. Perceptron Algorithm Revisited; 2. Local Delta Training Algorithms for ANE; 3. General Definition of Neural Networks; 4. Basic Neural Network Structures and Examples; 5. Analysis and Synthesis of Neural Networks

  32. Question: How do we train an Artificial Neural Network to perform the classification task? Answer: There is no simple answer, but we will look at one approach that uses the backpropagation algorithm to do the training. Not today; we will have to wait until Friday.

  33. End of Lecture 20
