
COMP305. Part I.


Presentation Transcript


  1. COMP305. Part I. Artificial neural networks.

  2. Topic 3. Learning Rules of the Artificial Neural Networks.

  3. ANN Learning rules. The McCulloch-Pitts neuron is capable of storing information and performing logical and arithmetical operations on it. The next step is to realise another important function of the brain, which is to acquire new knowledge through experience, i.e. learning.

  4. ANN Learning rules. Learning means changing in response to experience. In a network of MP-neurons, the binary weights of connections and the thresholds are fixed. The only possible change is a change in the pattern of connections, which is technically expensive. Some easily changeable free parameters are needed.

  5. ANN Learning rules. Abstract neuron. • The ideal free parameters to adjust, and so to resolve learning without changing the pattern of connections, are the weights of connections w_ji.

  6. ANN Learning rules. • Definition: an ANN learning rule defines how to adjust the weights of connections to obtain the desired output.

  7. Hebb’s rule (1949). • Hebb conjectured that a particular type of use-dependent modification of the connection strength of synapses might underlie learning in the nervous system.

  8. Hebb’s rule (1949). • Hebb introduced a neurophysiological postulate: “…When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells, such that A’s efficiency, as one of the cells firing B, is increased.”

  9. Hebb’s rule (1949). The simplest formalisation of Hebb’s rule is to increase the weight of a connection at every next instant in the way:

  w_ji(k+1) = w_ji(k) + Δw_ji(k)   (1)

  where

  Δw_ji(k) = C · a_i(k) · X_j(k)   (2)

  10. Hebb’s rule (1949). In equations (1) and (2): w_ji(k) is the weight of the connection at instant k; w_ji(k+1) is the weight of the connection at the following instant k+1; Δw_ji(k) is the increment by which the weight of the connection is enlarged; C is a positive coefficient which determines the learning rate; a_i(k) is the input value from the presynaptic neuron at instant k; X_j(k) is the output of the postsynaptic neuron at the same instant k.

  11. Hebb’s rule (1949). • Thus, the weight of a connection changes at the next instant only if both the preceding input via this connection and the resulting output are simultaneously non-zero.

  12. Hebb’s rule (1949). • Equation (2) emphasises the correlational nature of a Hebbian synapse. It is sometimes referred to as the activity product rule.

  13. Hebb’s rule (1949). • For this reason, Hebb’s rule plays an important role in studies of ANN algorithms much “younger” than the rule itself, such as unsupervised learning and self-organisation, which we shall consider later.
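Equations (1) and (2) can be sketched directly in Python. This is a minimal illustration: the neuron model follows the slides, while the function and variable names are our own.

```python
# Hebb's rule for a single McCulloch-Pitts neuron.
# Names follow the slides: w[i] is the weight of connection i,
# a[i] the presynaptic input, X the postsynaptic output,
# C the learning rate, theta the firing threshold.

def mp_output(w, a, theta):
    """McCulloch-Pitts neuron: fire (1) iff the weighted input
    sum reaches the threshold theta, otherwise output 0."""
    return 1 if sum(wi * ai for wi, ai in zip(w, a)) >= theta else 0

def hebb_update(w, a, X, C=1):
    """Equations (1)-(2): w_ji(k+1) = w_ji(k) + C * a_i(k) * X_j(k).
    A weight grows only when its input and the output are both non-zero."""
    return [wi + C * ai * X for wi, ai in zip(w, a)]
```

For example, with weights w = [1, 1, 1, 1], input a = (1, 1, 0, 0) and threshold θ = 2, the neuron fires (X = 1), and only the two active connections are strengthened, giving w = [2, 2, 1, 1].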

  14.-35. Hebb’s rule in practice. [Slides 14-35 work through an example on a single neuron with four input units (weights w1…w4), threshold θ = 2 and learning rate C = 1. At each instant t = 0, 1, 2, 3 an input pattern is presented, the output X is computed, and the weights are incremented according to equations (1) and (2). The diagrams on these slides did not survive transcription.] And so on…
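A training run in the spirit of this worked example can be simulated in a few lines of Python. The exact input patterns shown on the slides are not recoverable from the transcript, so the initial weights and the input pattern below are illustrative assumptions; only the threshold θ = 2, learning rate C = 1, four inputs and instants t = 0…3 come from the slides.

```python
# A Hebbian training run: one McCulloch-Pitts neuron, four inputs,
# threshold theta = 2, learning rate C = 1.  The initial weights and
# the repeated input pattern are assumptions for illustration.

def mp_output(w, a, theta=2):
    # McCulloch-Pitts neuron: fire (1) iff the weighted sum reaches theta
    return 1 if sum(wi * ai for wi, ai in zip(w, a)) >= theta else 0

def hebb_update(w, a, X, C=1):
    # Equation (2): each weight grows by C * input * output
    return [wi + C * ai * X for wi, ai in zip(w, a)]

w = [1, 1, 1, 1]        # initial weights (assumed)
a = (1, 1, 0, 0)        # repeated input pattern (assumed)
for t in range(4):      # instants t = 0, 1, 2, 3
    X = mp_output(w, a)
    w = hebb_update(w, a, X)
    print(f"t={t}: X={X}, w={w}")
```

Running this, the weights of the two active inputs grow at every instant while the weights on the silent inputs never change, exactly as slide 11 states: a connection is strengthened only when its input and the neuron’s output are both non-zero.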

  36. Next - Perceptron (1958). • Rosenblatt (1958) explicitly considered the problem of pattern recognition, where a “teacher” is essential.
