INFO331 Machine learning. Neural networks. Supervised learning in neural networks. MLP and BP (Text book: section 2.11, pp.146-155; section 3.7.3, pp.218-221; section 4.2, pp.267-282; catch-up reading: pp.251-266)
Machine learning • Issues in machine learning • Learning from static versus learning from dynamic data • Incremental learning • On-line learning, adaptive learning • Life-long learning • Cognitive learning processes in humans
Inductive learning • learning from examples • Inductive decision trees and the ID3 algorithm • Information gain evaluation
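The information-gain evaluation used by ID3 can be sketched in a few lines of Python. The toy weather data below is hypothetical, not the textbook's example; it is chosen so one attribute perfectly predicts the class and the other carries no information:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, attribute):
    """Reduction in entropy after splitting `examples` on `attribute`.

    `examples` is a list of dicts mapping attribute names to values.
    """
    total = entropy(labels)
    n = len(examples)
    remainder = 0.0
    for value in set(ex[attribute] for ex in examples):
        subset = [lab for ex, lab in zip(examples, labels) if ex[attribute] == value]
        remainder += (len(subset) / n) * entropy(subset)
    return total - remainder

# Toy data: "outlook" perfectly predicts the class, "windy" does not.
examples = [{"outlook": "sunny", "windy": True},
            {"outlook": "sunny", "windy": False},
            {"outlook": "rain",  "windy": True},
            {"outlook": "rain",  "windy": False}]
labels = ["play", "play", "stay", "stay"]
print(information_gain(examples, labels, "outlook"))  # 1.0 bit
print(information_gain(examples, labels, "windy"))    # 0.0 bits
```

ID3 greedily picks the attribute with the highest gain at each node, so here it would split on "outlook" first.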
Other methods of machine learning • Learning by doing • Learning from advice • Learning by analogy • Case-based learning and reasoning • Template-based learning (Kasabov and Clarke) - Iris example
Learning fuzzy rules from data • Cluster-based methods • Fuzzy template-based method (Kasabov, 96), pp.218-219 • Wang's method (pp.220-221) • Advantages and disadvantages
Supervised learning in neural networks • Supervised learning in neural networks • Perceptrons • Multilayer perceptrons (MLP) and the backpropagation algorithm • MLP as universal approximators • Problems and features of the MLP
Supervised learning in neural networks • The learning principle is to provide, for each training example, the input values together with the desired output values. • The neural network changes its connection weights during training. • Two errors are calculated: • training error - how well the NN has learned the training data • test error - how well the trained NN generalises to new input data.
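As a minimal illustration of the two error measures, assuming a hypothetical one-weight model y = w·x in place of a trained NN (the data and the weight value are invented for the example):

```python
def mean_squared_error(desired, actual):
    """Average squared difference between desired and actual outputs."""
    return sum((d - a) ** 2 for d, a in zip(desired, actual)) / len(desired)

# Hypothetical model: y = w * x with trained weight w = 2.1 (true relation y = 2x).
w = 2.1
train_x, train_y = [1, 2, 3], [2, 4, 6]   # examples seen during training
test_x,  test_y  = [4, 5],    [8, 10]     # new input data

train_err = mean_squared_error(train_y, [w * x for x in train_x])
test_err  = mean_squared_error(test_y,  [w * x for x in test_x])
print(train_err)  # ≈ 0.047 - how well the model learned the data
print(test_err)   # ≈ 0.205 - how well it generalises to new inputs
```

The test error is larger here because the small weight error (0.1) is amplified on the larger, unseen inputs; comparing the two errors is the standard way to judge generalisation.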
Perceptrons • fig.4.8
Perceptrons • fig.4.9
Perceptrons • fig.4.10
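A sketch of the classic perceptron learning rule, trained on the AND function; the learning rate and epoch count are arbitrary illustrative choices:

```python
def perceptron_train(data, eta=0.1, epochs=20):
    """Rosenblatt's perceptron learning rule on binary-labelled 2-input data."""
    w = [0.0, 0.0, 0.0]  # [bias, w1, w2]
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else 0
            err = target - out
            # Shift the weights toward the target whenever the output is wrong.
            w[0] += eta * err
            w[1] += eta * err * x1
            w[2] += eta * err * x2
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = perceptron_train(AND)
for (x1, x2), t in AND:
    out = 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else 0
    print((x1, x2), "->", out)  # matches the AND truth table
```

Because AND is linearly separable, the rule is guaranteed to converge; for a non-separable problem such as XOR a single perceptron cannot succeed, which motivates the MLP in the next slides.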
MLP and the backpropagation algorithm • fig.4.11
MLP and the backpropagation algorithm • fig.4.12
MLP and the backpropagation algorithm • fig.4.13
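A minimal backpropagation sketch for a one-hidden-layer MLP trained on XOR; the hidden-layer size, learning rate, random seed and epoch count are arbitrary choices for the illustration, not values from the textbook:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(1)
H = 4            # hidden-layer size (arbitrary choice)
eta = 0.5        # learning rate (arbitrary choice)
# Hidden weights: one [bias, w1, w2] row per hidden node; output weights: [bias, v1..vH].
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]
w_o = [random.uniform(-1, 1) for _ in range(H + 1)]

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x):
    h = [sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) for w in w_h]
    y = sigmoid(w_o[0] + sum(v * hj for v, hj in zip(w_o[1:], h)))
    return h, y

for epoch in range(10000):
    for x, t in XOR:
        h, y = forward(x)
        # Output delta: squared-error gradient through the output sigmoid.
        d_o = (t - y) * y * (1 - y)
        # Hidden deltas: output delta propagated back through the output weights.
        d_h = [d_o * w_o[j + 1] * h[j] * (1 - h[j]) for j in range(H)]
        w_o[0] += eta * d_o
        for j in range(H):
            w_o[j + 1] += eta * d_o * h[j]
            w_h[j][0] += eta * d_h[j]
            w_h[j][1] += eta * d_h[j] * x[0]
            w_h[j][2] += eta * d_h[j] * x[1]

for x, t in XOR:
    print(x, "->", round(forward(x)[1], 2), "target", t)
```

The hidden nodes let the network carve the input space with more than one decision boundary, which is exactly what XOR requires and a single perceptron lacks.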
MLPs as statistical tools • An MLP with one hidden layer can approximate any continuous function to any desired accuracy (Hornik et al., 1989) • MLPs are multivariate non-linear regression models • MLPs can learn conditional probabilities
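The cited universal-approximation result can be stated roughly as follows (a paraphrase, not the theorem's exact wording): for any continuous $f$ on a compact set $K \subset \mathbb{R}^n$ and any $\varepsilon > 0$, there is a one-hidden-layer MLP

$$ g(x) = \sum_{j=1}^{m} v_j \, \sigma(w_j^\top x + b_j) $$

with sigmoidal activation $\sigma$ such that $\sup_{x \in K} |f(x) - g(x)| < \varepsilon$. The theorem says such weights exist; it does not say backpropagation will find them, nor how large $m$ must be.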
Problems and features of the MLP • How to choose the number of hidden nodes • Catastrophic forgetting • Introducing hints in neural networks • Overfitting (overlearning)
Problems and features of the MLP • Catastrophic forgetting • fig. 4.14
Problems and features of the MLP • Introducing hints • fig. 4.15
Problems and features of the MLP • Overfitting • fig. 4.16