  1. INFO331 Machine learning. Neural networks. Supervised learning in neural networks. MLP and BP (Textbook: section 2.11, pp.146-155; section 3.7.3, pp.218-221; section 4.2, pp.267-282; catch-up reading: pp.251-266)

  2. Machine learning • Issues in machine learning • Learning from static versus learning from dynamic data • Incremental learning • On-line learning, adaptive learning • Life-long learning • Cognitive learning processes in humans

  3. Inductive learning • learning from examples • Inductive decision trees and the ID3 algorithm • Information gain evaluation
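
A concrete illustration of the information-gain evaluation used by ID3 to choose split attributes: the sketch below computes the entropy reduction obtained by splitting on one attribute. The toy weather-style data are made up for illustration; only the entropy/gain formulas come from the ID3 material.

    # Minimal sketch of ID3's information-gain criterion (toy data assumed).
    from collections import Counter
    from math import log2

    def entropy(labels):
        """Shannon entropy of a list of class labels."""
        n = len(labels)
        return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

    def information_gain(examples, attribute, labels):
        """Entropy reduction from splitting `examples` on `attribute`."""
        total = entropy(labels)
        n = len(examples)
        remainder = 0.0
        for value in set(ex[attribute] for ex in examples):
            subset = [lab for ex, lab in zip(examples, labels) if ex[attribute] == value]
            remainder += len(subset) / n * entropy(subset)
        return total - remainder

    # Hypothetical examples: play or not, given the outlook.
    examples = [{"outlook": "sunny"}, {"outlook": "sunny"},
                {"outlook": "overcast"}, {"outlook": "rain"}]
    labels = ["no", "no", "yes", "yes"]
    print(information_gain(examples, "outlook", labels))  # 1.0: outlook separates the classes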

  4. Other methods of machine learning • Learning by doing • Learning from advice • Learning by analogy • Case-based learning and reasoning • Template-based learning (Kasabov and Clarke) - Iris example

  5. Learning fuzzy rules from data • Cluster-based methods • Fuzzy template-based method (Kasabov, 96), pp.218-219 • Wang’s method (pp.220-221) • Advantages and disadvantages
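
The data-to-rule step of Wang’s method (pp.220-221) can be sketched roughly as follows: each training pair votes for one fuzzy rule, and conflicting rules are resolved by keeping the one with the highest degree. The triangular membership functions, the two-label partition and the sample data below are illustrative assumptions, not the textbook’s exact formulation.

    # Rough sketch of Wang-Mendel style fuzzy rule generation (partition assumed).
    import numpy as np

    def triangular(x, centre, width):
        """Triangular membership function centred at `centre`."""
        return max(0.0, 1.0 - abs(x - centre) / width)

    # Fuzzy partition of the [0, 1] universe into two labels (an assumption).
    labels = {"Small": 0.0, "Large": 1.0}
    width = 1.0

    def best_label(x):
        """Label with the maximum membership degree for value x, plus that degree."""
        name = max(labels, key=lambda L: triangular(x, labels[L], width))
        return name, triangular(x, labels[name], width)

    def learn_rules(data):
        """data: list of (x1, x2, y) pairs -> dict mapping antecedents to a consequent."""
        rules = {}
        for x1, x2, y in data:
            (a1, d1), (a2, d2), (c, dc) = best_label(x1), best_label(x2), best_label(y)
            degree = d1 * d2 * dc          # rule degree = product of memberships
            key = (a1, a2)
            if key not in rules or degree > rules[key][1]:
                rules[key] = (c, degree)   # keep the strongest rule per antecedent
        return rules

    data = [(0.1, 0.2, 0.1), (0.9, 0.8, 0.9), (0.2, 0.9, 0.8)]
    for (a1, a2), (c, deg) in learn_rules(data).items():
        print(f"IF x1 is {a1} AND x2 is {a2} THEN y is {c}  (degree {deg:.2f})")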

  6. Supervised learning in neural networks • Perceptrons • Multilayer perceptrons (MLP) and the backpropagation algorithm • MLPs as universal approximators • Problems and features of the MLP

  7. Supervised learning in neural networks • The learning principle is to provide the input values and the desired output values for each of the training examples. • The neural network changes its connection weights during training. • Calculate the error: • training error - how well a NN has learned the data • test error - how well a trained NN generalises over new input data.
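
A minimal sketch of the two error measures named above: the same error function is evaluated on the training examples (training error) and on held-out examples (test error). The linear stand-in model and the random data are assumptions, used only to show where the two numbers come from.

    # Training error vs. test error on a train/test split (stand-in model assumed).
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(100, 2))
    y = 0.5 * X[:, 0] - 0.2 * X[:, 1]               # target function (assumed)

    X_train, y_train = X[:80], y[:80]               # training examples
    X_test,  y_test  = X[80:], y[80:]               # unseen examples

    w = np.linalg.lstsq(X_train, y_train, rcond=None)[0]   # stand-in for a trained NN

    def mse(Xs, ys):
        return np.mean((Xs @ w - ys) ** 2)

    print("training error:", mse(X_train, y_train))  # how well the data were learned
    print("test error:    ", mse(X_test, y_test))    # how well the model generalises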

  8. Perceptrons • fig. 4.8

  9. Perceptrons • fig. 4.9

  10. Perceptrons • fig. 4.10
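
Slides 8-10 only refer to figures 4.8-4.10; as a stand-in, here is a minimal sketch of the classical perceptron learning rule those figures illustrate, trained on the AND function. The data, learning rate and epoch count are assumptions.

    # Perceptron learning rule: nudge weights by error * input until no errors remain.
    import numpy as np

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # inputs
    t = np.array([0, 0, 0, 1])                        # AND targets
    w = np.zeros(2)
    b = 0.0
    lr = 0.1

    for epoch in range(20):
        for x, target in zip(X, t):
            y = 1 if x @ w + b > 0 else 0             # hard-threshold activation
            w += lr * (target - y) * x                # perceptron weight update
            b += lr * (target - y)

    print(w, b)                                       # a separating line for AND
    print([(1 if x @ w + b > 0 else 0) for x in X])   # [0, 0, 0, 1]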

  11. MLP and the backpropagation algorithm • fig. 4.11

  12. MLP and the backpropagation algorithm • fig. 4.12

  13. MLP and the backpropagation algorithm • fig. 4.13
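
Slides 11-13 refer to figures 4.11-4.13. The sketch below shows the two phases backpropagation alternates in a one-hidden-layer MLP: a forward pass that computes the output, and a backward pass that propagates the output error to update the weights. The XOR data, layer sizes, learning rate and epoch count are illustrative assumptions.

    # Minimal backpropagation for a one-hidden-layer MLP (XOR data assumed).
    import numpy as np

    rng = np.random.default_rng(1)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([[0], [1], [1], [0]], dtype=float)            # XOR targets

    W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)             # input -> hidden
    W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)             # hidden -> output
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    lr = 0.5

    for epoch in range(10000):
        h = sigmoid(X @ W1 + b1)                               # forward pass
        y = sigmoid(h @ W2 + b2)
        delta_out = (y - t) * y * (1 - y)                      # backward pass:
        delta_hid = (delta_out @ W2.T) * h * (1 - h)           # error propagated back
        W2 -= lr * h.T @ delta_out; b2 -= lr * delta_out.sum(axis=0)
        W1 -= lr * X.T @ delta_hid; b1 -= lr * delta_hid.sum(axis=0)

    print(np.round(y.ravel(), 2))                              # should approach [0, 1, 1, 0]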

  14. MLPs as statistical tools • An MLP with one hidden layer can approximate any continuous function to any desired accuracy (Hornik et al., 1989) • MLPs are multivariate non-linear regression models • MLPs can learn conditional probabilities
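
The universal-approximation claim can be seen on a small example: a single-hidden-layer network fitted to a nonlinear 1-D function. scikit-learn's MLPRegressor is used here purely for brevity; the hidden-layer size and other hyperparameters are assumptions, not a prescription.

    # One hidden layer approximating sin(x) (hyperparameters assumed).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-np.pi, np.pi, size=(400, 1))
    y = np.sin(X).ravel()                          # continuous target function

    net = MLPRegressor(hidden_layer_sizes=(30,),   # a single hidden layer
                       activation="tanh", solver="lbfgs",
                       max_iter=5000, random_state=0)
    net.fit(X, y)

    grid = np.linspace(-np.pi, np.pi, 100)[:, None]
    print("max |error| on a grid:",
          np.max(np.abs(net.predict(grid) - np.sin(grid).ravel())))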

  15. Problems and features of the MLP • How to choose the number of hidden nodes • Catastrophic forgetting • Introducing hints in neural networks • Overfitting (overlearning)

  16. Problems and features of the MLP • Catastrophic forgetting • fig. 4.14
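
A tiny numerical illustration of catastrophic forgetting (cf. fig. 4.14): a model trained first on task A and then only on task B drifts towards the task B solution, and its error on task A grows because no task A examples are rehearsed. The one-weight model and the two artificial tasks are assumptions.

    # Sequential training without rehearsal: the earlier task is "forgotten".
    import numpy as np

    task_A = (np.array([1.0, 2.0]), np.array([ 2.0,  4.0]))   # best weight for A: w = 2
    task_B = (np.array([1.0, 2.0]), np.array([-1.0, -2.0]))   # best weight for B: w = -1

    def train(w, X, y, epochs=200, lr=0.05):
        for _ in range(epochs):
            w -= lr * np.mean((w * X - y) * X)                 # gradient of the squared error
        return w

    def error(w, X, y):
        return np.mean((w * X - y) ** 2)

    w = train(0.0, *task_A)
    print("after task A: error on A =", round(error(w, *task_A), 3))
    w = train(w, *task_B)                                      # task A is never rehearsed
    print("after task B: error on B =", round(error(w, *task_B), 3),
          "| error on A =", round(error(w, *task_A), 3))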

  17. Problems and features of the MLP • Introducing hints • fig. 4.15

  18. Problems and features of the MLP • Overfitting • fig. 4.16
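
Overfitting (cf. fig. 4.16) can be made visible with a model-complexity sweep: the training error keeps falling as the model gets more flexible, while the error on held-out data typically starts to rise. Polynomial regression stands in here for an over-trained MLP; the noisy sine data and the degrees tried are assumptions.

    # Train vs. held-out error as model flexibility grows (overfitting demo).
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(0, 1, 30))
    y = np.sin(2 * np.pi * X) + rng.normal(scale=0.2, size=30)   # noisy samples
    X_tr, y_tr, X_te, y_te = X[::2], y[::2], X[1::2], y[1::2]    # train / held-out split

    for degree in (1, 3, 6, 9):
        coeffs = np.polyfit(X_tr, y_tr, degree)                  # flexibility grows with degree
        mse = lambda Xs, ys: np.mean((np.polyval(coeffs, Xs) - ys) ** 2)
        print(f"degree {degree:2d}: train MSE {mse(X_tr, y_tr):.3f}, "
              f"held-out MSE {mse(X_te, y_te):.3f}")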
