Neural networks for data mining
Eric Postma, MICC-IKAT, Universiteit Maastricht
Overview
Introduction: the biology of neural networks
• the biological computer
• brain-inspired models
• basic notions
Interactive neural-network demonstrations
• Perceptron
• Multilayer perceptron
• Kohonen’s self-organising feature map
• Examples of applications
Two types of learning
• Supervised learning: curve fitting, surface fitting, ...
• Unsupervised learning: clustering, visualisation, ...
(a sketch of both settings follows below)
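A minimal Python sketch of the two settings, not taken from the slides; the data, the least-squares line fit, and the naive k-means loop are illustrative assumptions:

```python
import numpy as np

# --- Supervised learning: curve fitting ---
# Labelled pairs (x, y); fit y = a*x + b by least squares.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])     # desired outputs
a, b = np.polyfit(x, y, deg=1)
print(f"fitted line: y = {a:.2f}x + {b:.2f}")

# --- Unsupervised learning: clustering ---
# Unlabelled points; group them into k = 2 clusters (naive k-means).
pts = np.array([0.2, 0.4, 0.3, 5.1, 4.9, 5.3])
centres = np.array([0.0, 1.0])              # initial centre guesses
for _ in range(10):
    labels = np.abs(pts[:, None] - centres).argmin(axis=1)
    centres = np.array([pts[labels == k].mean() for k in range(2)])
print("cluster centres:", centres)
```

The supervised part uses desired outputs y to guide the fit; the unsupervised part finds structure (two cluster centres) without any labels.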
The history of neural networks
• A powerful metaphor
• Several decades of theoretical analysis led to a formalisation in terms of statistics
• the Bayesian framework
• We discuss neural networks from the original, metaphorical perspective
(Artificial) neural networks
The digital computer versus the neural computer
Digital versus biological computers
5 distinguishing properties:
• speed
• robustness
• flexibility
• adaptivity
• context-sensitivity
Speed: The “hundred time steps” argument
“The critical resource that is most obvious is time. Neurons whose basic computational speed is a few milliseconds must be made to account for complex behaviors which are carried out in a few hundred milliseconds (Posner, 1978). This means that entire complex behaviors are carried out in less than a hundred time steps.” Feldman and Ballard (1982)
Graceful Degradation
[Figure: performance plotted against damage]
Adaptivity
• in biological computers, processing implies learning
• in digital computers, processing does not imply learning
Context-sensitivity
• patterns as emergent properties
The neural computer
• Is it possible to develop a model after the natural example?
• Brain-inspired models: models based on a restricted set of structural and functional properties of the (human) brain
Neural activity
[Figure: a neuron mapping input (“in”) to output (“out”)]
Learning: Hebb’s rule
[Figure: neuron 1 connected to neuron 2 through a synapse]
When neuron 1 repeatedly takes part in firing neuron 2, the strength of the synapse between them increases (a code sketch follows below).
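A minimal Python sketch of Hebbian learning, not taken from the slides; the learning rate and the activity sequences are illustrative assumptions:

```python
import numpy as np

# Hebb's rule: the weight of the synapse from neuron 1 to neuron 2
# grows in proportion to the product of their activities:
#   delta_w = eta * x1 * x2
eta = 0.1                                # learning rate (assumed value)
w = 0.0                                  # initial synaptic weight

pre  = np.array([1.0, 0.0, 1.0, 1.0])   # activity of neuron 1
post = np.array([1.0, 1.0, 0.0, 1.0])   # activity of neuron 2

for x1, x2 in zip(pre, post):
    w += eta * x1 * x2                   # strengthen only when both fire
print(f"final weight: {w:.2f}")          # 0.20: two co-active time steps
```

The weight grows only at time steps where both neurons are active together, the essence of “cells that fire together, wire together”.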
Connectivity
An example: the visual system is a feedforward hierarchy of neural modules.
Every module is (to a certain extent) responsible for a specific function.
(Artificial) Neural Networks
• Neurons
  • activity
  • nonlinear input-output function
• Connections
  • weight
• Learning
  • supervised
  • unsupervised
Artificial Neurons
• input (vectors)
• summation (excitation)
• output (activation)
Input-output function
• nonlinear (sigmoid) function:
f(x) = 1 / (1 + e^(-x/a))
[Figure: sigmoid f(e) plotted against excitation e; the parameter a sets the steepness]
(a code sketch follows below)
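A minimal Python sketch of such an artificial neuron, not taken from the slides: weighted summation of the inputs (the excitation) followed by the sigmoid output function, with a controlling the steepness. The example inputs and weights are illustrative assumptions:

```python
import numpy as np

def sigmoid(x, a=1.0):
    """Nonlinear input-output function f(x) = 1 / (1 + e^(-x/a))."""
    return 1.0 / (1.0 + np.exp(-x / a))

def neuron_output(inputs, weights, a=1.0):
    """Summation (excitation) followed by the sigmoid (activation)."""
    excitation = np.dot(weights, inputs)
    return sigmoid(excitation, a)

# example: a neuron with three inputs
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.8, 0.2, 0.5])
print(neuron_output(x, w))   # activation, always in (0, 1)
```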
Artificial Connections (Synapses)
[Figure: neuron A connected to neuron B via weight wAB]
• wAB: the weight of the connection from neuron A to neuron B
Learning in the Perceptron
• Delta learning rule: based on the difference between the desired output t and the actual output o, given input x
• Global error E: a function of the differences between the desired and actual outputs
(a code sketch follows below)
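A minimal Python sketch of perceptron training with the delta rule; the slide states the rule verbally, so the explicit formulas in the comments, the learning rate, and the AND task are standard textbook choices, not taken from the slides:

```python
import numpy as np

# Delta rule (standard form):
#   delta_w = eta * (t - o) * x      per-weight update
#   E = 0.5 * sum((t - o)^2)         global error over all patterns
eta = 0.1                                          # learning rate (assumed)

# training patterns: logical AND (illustrative choice)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0, 0, 0, 1], dtype=float)            # desired outputs t

w = np.zeros(2)
b = 0.0
for epoch in range(20):
    E = 0.0
    for x, t in zip(X, T):
        o = 1.0 if np.dot(w, x) + b > 0 else 0.0   # actual output o
        w += eta * (t - o) * x                     # delta rule update
        b += eta * (t - o)
        E += 0.5 * (t - o) ** 2
print("weights:", w, "bias:", b, "final error:", E)
```

Because AND is linearly separable, the global error reaches zero within a few epochs.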
The history of the Perceptron
• Rosenblatt (1959)
• Minsky & Papert (1969)
• Rumelhart & McClelland (1986)
The multilayer perceptron
[Figure: layered network with an input layer, one or more hidden layers, and an output layer]
Training the MLP
• supervised learning
• each training pattern: input + desired output
• in each epoch: present all patterns
• at each presentation: adapt the weights
• after many epochs: convergence to a local minimum
(a code sketch of this loop follows below)
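A minimal sketch of this training loop for a one-hidden-layer MLP, using error backpropagation as the weight-adaptation step; the XOR task, the network size, and the learning rate are illustrative assumptions, not from the slides:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # desired outputs
W1 = rng.normal(0.0, 1.0, (2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1))   # hidden -> output weights
b2 = np.zeros(1)
eta = 0.5                            # learning rate (assumed)

for epoch in range(5000):            # many epochs ...
    for x, t in zip(X, T):           # ... presenting all patterns in each
        h = sigmoid(x @ W1 + b1)                 # hidden activations
        o = sigmoid(h @ W2 + b2)                 # actual output
        # backpropagate the error and adapt the weights
        d_o = (o - t) * o * (1 - o)
        d_h = (d_o @ W2.T) * h * (1 - h)
        W2 -= eta * np.outer(h, d_o)
        b2 -= eta * d_o
        W1 -= eta * np.outer(x, d_h)
        b1 -= eta * d_h

# after training: outputs should be close to the XOR targets [0, 1, 1, 0],
# though, as the slide notes, training may also end in a poorer local minimum
print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))
```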
Phoneme recognition with an MLP
• input: frequencies
• output: pronunciation