Supervised Hebbian Learning
Hebb's Postulate

"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
D. O. Hebb, 1949

(Figure: axon of cell A synapsing onto cell B.)
Linear Associator

a = W p,    a_i = \sum_{j=1}^{R} w_{ij} p_j

W is the S x R weight matrix mapping an R-dimensional input p to an S-dimensional output a.

Training Set:  {p_1, t_1}, {p_2, t_2}, \ldots, {p_Q, t_Q}
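A minimal sketch of the linear associator in NumPy; the variable names follow the slide's notation, while the dimensions and the example input vector are purely illustrative:

    import numpy as np

    # Linear associator: a = W p
    # W is S x R (S outputs, R inputs); p is an R-dimensional input vector.
    R, S = 4, 2                           # illustrative dimensions
    W = np.zeros((S, R))                  # weights start at zero before training
    p = np.array([0.5, -0.5, 0.5, -0.5])  # example input pattern
    a = W @ p                             # S-dimensional output (all zeros until W is trained)
    print(a)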
Hebb Rule

w_{ij}^{new} = w_{ij}^{old} + \alpha f_i(a_{iq}) g_j(p_{jq})    (postsynaptic signal a_{iq}, presynaptic signal p_{jq})

Simplified Form:  w_{ij}^{new} = w_{ij}^{old} + a_{iq} p_{jq}

Supervised Form:  w_{ij}^{new} = w_{ij}^{old} + t_{iq} p_{jq}    (actual output replaced by the target)

Matrix Form:  W^{new} = W^{old} + t_q p_q^T
Batch Operation (Zero Initial Weights)

W = t_1 p_1^T + t_2 p_2^T + \cdots + t_Q p_Q^T = \sum_{q=1}^{Q} t_q p_q^T

Matrix Form:  W = T P^T,   where   T = [t_1 \; t_2 \; \cdots \; t_Q],   P = [p_1 \; p_2 \; \cdots \; p_Q]
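A hedged sketch of the batch Hebb rule, assuming the prototypes are stacked as columns of P (R x Q) and the targets as columns of T (S x Q); the function name is my own:

    import numpy as np

    def hebb_rule(P, T):
        """Batch Hebb rule with zero initial weights: W = T P^T.
        P: R x Q matrix whose columns are the input prototypes p_q.
        T: S x Q matrix whose columns are the targets t_q."""
        return T @ P.T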
Performance Analysis

a = W p_k = \left( \sum_{q=1}^{Q} t_q p_q^T \right) p_k = \sum_{q=1}^{Q} t_q (p_q^T p_k)

Case I: the input patterns are orthonormal:
  p_q^T p_k = 1 for q = k,  0 for q \neq k
Therefore the network output equals the target:  a = W p_k = t_k

Case II: the input patterns are normalized, but not orthogonal:
  a = W p_k = t_k + \sum_{q \neq k} t_q (p_q^T p_k)
The second term is the error (cross-talk from the other stored patterns).
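A small numerical check of the two cases, using hypothetical patterns (not the slide's own data): orthonormal inputs are recalled exactly, while merely normalized inputs pick up a cross-talk term from the inner products p_q^T p_k.

    import numpy as np

    # Case I: orthonormal prototypes -> exact recall of the targets.
    P1 = np.array([[ 0.5,  0.5],
                   [-0.5,  0.5],
                   [ 0.5, -0.5],
                   [-0.5, -0.5]])        # columns are orthonormal
    T1 = np.array([[-1.0, 1.0]])         # one target value per pattern
    W1 = T1 @ P1.T
    print(W1 @ P1[:, 0])                 # [-1.] = t_1 exactly

    # Case II: normalized but not orthogonal -> cross-talk error.
    P2 = np.array([[1.0, 0.6],
                   [0.0, 0.8],
                   [0.0, 0.0]])          # unit-length columns with p_1^T p_2 = 0.6
    T2 = np.array([[-1.0, 1.0]])
    W2 = T2 @ P2.T
    print(W2 @ P2[:, 0])                 # [-0.4] = t_1 + t_2 (p_2^T p_1) = -1 + 0.6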
Example: Banana and Apple

Normalized prototype patterns (banana and apple); weight matrix from the Hebb rule; tests with the banana and apple prototypes.
Pseudoinverse Rule - (1)

Performance Index:  F(W) = \sum_{q=1}^{Q} \| t_q - W p_q \|^2

Matrix Form:  F(W) = \| T - W P \|^2 = \| E \|^2,   where
  T = [t_1 \; t_2 \; \cdots \; t_Q],   P = [p_1 \; p_2 \; \cdots \; p_Q],   E = T - W P,
  \| E \|^2 = \sum_i \sum_j e_{ij}^2
Pseudoinverse Rule - (2)

Minimize:  F(W) = \| T - W P \|^2

If an inverse exists for P, F(W) can be made zero by choosing  W = T P^{-1}.

When an inverse does not exist (P is generally not square), F(W) can be minimized using the pseudoinverse:

W = T P^+,   where   P^+ = (P^T P)^{-1} P^T
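A sketch of the pseudoinverse rule in NumPy; np.linalg.pinv computes the Moore-Penrose pseudoinverse, which coincides with (P^T P)^{-1} P^T when the columns of P are linearly independent. The function name and the test patterns (reused from the hypothetical Case II data above) are my own:

    import numpy as np

    def pseudoinverse_rule(P, T):
        """Pseudoinverse rule: W = T P^+, minimizing F(W) = ||T - W P||^2."""
        return T @ np.linalg.pinv(P)

    # With the non-orthogonal patterns from the performance analysis sketch,
    # recall becomes exact even though the Hebb rule left an error:
    P = np.array([[1.0, 0.6],
                  [0.0, 0.8],
                  [0.0, 0.0]])
    T = np.array([[-1.0, 1.0]])
    W = pseudoinverse_rule(P, T)
    print(W @ P)                          # [[-1.  1.]] -- matches T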
Relationship to the Hebb Rule

Hebb Rule:           W = T P^T
Pseudoinverse Rule:  W = T P^+,   P^+ = (P^T P)^{-1} P^T

If the prototype patterns are orthonormal, then P^T P = I and
  P^+ = (P^T P)^{-1} P^T = P^T,
so the two rules produce the same weight matrix.
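A quick numerical confirmation of this relationship, using the same hypothetical orthonormal prototype columns as above:

    import numpy as np

    P = np.array([[ 0.5,  0.5],
                  [-0.5,  0.5],
                  [ 0.5, -0.5],
                  [-0.5, -0.5]])                        # orthonormal columns: P^T P = I
    print(np.allclose(np.linalg.pinv(P), P.T))          # True: P^+ = P^T
    T = np.array([[-1.0, 1.0]])
    print(np.allclose(T @ np.linalg.pinv(P), T @ P.T))  # True: same weights from either rule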
Tests: 50% occluded patterns, 67% occluded patterns, noisy patterns (7 pixels).
Variations of Hebbian Learning

Basic Rule:     W^{new} = W^{old} + t_q p_q^T
Learning Rate:  W^{new} = W^{old} + \alpha t_q p_q^T
Smoothing:      W^{new} = W^{old} + \alpha t_q p_q^T - \gamma W^{old} = (1 - \gamma) W^{old} + \alpha t_q p_q^T
Delta Rule:     W^{new} = W^{old} + \alpha (t_q - a_q) p_q^T
Unsupervised:   W^{new} = W^{old} + \alpha a_q p_q^T
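A minimal sketch of these variations as one incremental update function, assuming one (p_q, t_q) pair per call; the function name, the rule labels, and the default parameter values are illustrative choices, not part of the slides:

    import numpy as np

    def hebb_update(W, p, t, alpha=1.0, gamma=0.0, rule="basic"):
        """One presentation of input p with target t under the listed variations.
        rule: 'basic' (with learning rate alpha), 'smoothing' (weight decay gamma),
        'delta' (Widrow-Hoff), or 'unsupervised'."""
        a = W @ p                                    # current network output
        if rule == "basic":                          # W_new = W_old + alpha * t p^T
            return W + alpha * np.outer(t, p)
        if rule == "smoothing":                      # W_new = (1 - gamma) W_old + alpha * t p^T
            return (1.0 - gamma) * W + alpha * np.outer(t, p)
        if rule == "delta":                          # W_new = W_old + alpha * (t - a) p^T
            return W + alpha * np.outer(t - a, p)
        if rule == "unsupervised":                   # W_new = W_old + alpha * a p^T
            return W + alpha * np.outer(a, p)
        raise ValueError(f"unknown rule: {rule}")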