Memory
Hopfield Network • Content addressable • Attractor network
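As a concrete illustration of content-addressable recall, here is a minimal sketch of a binary Hopfield network with Hebbian (outer-product) weights and asynchronous updates; the pattern count, network size and update schedule are illustrative choices, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 3                      # neurons, stored patterns

# Store P random +/-1 patterns with the Hebbian outer-product rule
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)             # no self-connections

def recall(cue, n_sweeps=10):
    """Asynchronous updates: each sweep updates all units in random order."""
    s = cue.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Content addressability: corrupt 20% of one stored pattern, then recall it
cue = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
cue[flip] *= -1
out = recall(cue)
print("overlap with stored pattern:", (out @ patterns[0]) / N)  # close to 1.0
```

The corrupted cue falls inside the basin of attraction of the stored pattern, so the dynamics pull it back to the memory: this is what "content addressable" and "attractor network" mean here.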
Hopfield Network • General Case: • Lyapunov function
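The slide's equations are not reproduced here; the standard energy (Lyapunov) function for a binary Hopfield network with symmetric weights w_ij (w_ii = 0), states s_i in {-1, +1}, and biases h_i (if present) is

\[
E(\mathbf{s}) = -\tfrac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j \;-\; \sum_i h_i s_i .
\]

Under asynchronous updates s_i ← sign(Σ_j w_ij s_j + h_i), E never increases, so the dynamics settle into local minima of E, which are the stored attractors.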
Null Cline Analysis • What are the fixed points? [Circuit diagram: excitatory population E and inhibitory population I, coupled through weights CE and CI]
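One plausible reading of the circuit sketched above (excitatory rate E, inhibitory rate I, weights CE and CI) is a two-population rate model of the form below; the exact gains, time constants, inputs h_E and h_I, and the E-to-I weight w_EI are labels and assumptions introduced only for this sketch, not given in the slides:

\[
\tau_E \frac{dE}{dt} = -E + f\!\left(C_E E - C_I I + h_E\right), \qquad
\tau_I \frac{dI}{dt} = -I + f\!\left(w_{EI} E + h_I\right).
\]

The excitatory null cline is the set of (E, I) where dE/dt = 0, the inhibitory null cline is where dI/dt = 0, and the fixed points are the intersections of the two curves.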
Null Cline Analysis [Sequence of phase-plane figures, I vs. E, marking stable and unstable fixed points, stable branches, and an unstable branch of the excitatory null cline]
Null Cline Analysis [Figure: the inhibitory null cline and the excitatory null cline in the (E, I) plane; the fixed points lie at their intersections]
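A minimal sketch of how the two null clines and their intersections can be computed numerically, using the assumed two-population model above with a logistic gain; all parameter values are illustrative, and however many intersections exist for those values are simply printed out.

```python
import numpy as np

# Assumed two-population rate model (see the sketch above); logistic gain
f = lambda x: 1.0 / (1.0 + np.exp(-x))
logit = lambda y: np.log(y / (1.0 - y))          # inverse of f

CE, CI, wEI = 12.0, 10.0, 8.0                    # illustrative weights
hE, hI = -4.0, -6.0                              # illustrative inputs

E = np.linspace(1e-4, 1 - 1e-4, 2000)

# Excitatory null cline: 0 = -E + f(CE*E - CI*I + hE)  =>  solve for I
I_Enull = (CE * E + hE - logit(E)) / CI
# Inhibitory null cline: 0 = -I + f(wEI*E + hI)
I_Inull = f(wEI * E + hI)

# Fixed points: where the two null clines cross (sign change of the gap)
gap = I_Enull - I_Inull
idx = np.where(np.sign(gap[:-1]) != np.sign(gap[1:]))[0]
for i in idx:
    print(f"fixed point near E = {E[i]:.3f}, I = {I_Inull[i]:.3f}")
```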
Binary Memory [E-I circuit with weights CE and CI; phase-plane figures, I vs. E]
Binary Memory • Storing: decrease inhibition (CI)
Binary Memory • Storing: back to rest
Binary Memory • Reset: increase inhibition
Binary Memory • Reset: back to rest
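A minimal simulation of the store/reset protocol described above, reduced here for brevity to a single excitatory rate with the inhibition treated as an explicit external drive rather than a second dynamical population; the sigmoid gain and all parameter values are assumptions chosen only to make the circuit bistable.

```python
import numpy as np

# Bistable excitatory unit: tau dE/dt = -E + f(E + hE - gI)
f = lambda x: 1.0 / (1.0 + np.exp(-(x - 0.5) / 0.05))   # steep sigmoid
tau, dt, hE = 10.0, 0.5, 0.4                             # ms, ms, tonic drive
gI_rest = 0.4                                            # resting inhibition

T = 400.0                                                # total time (ms)
steps = int(T / dt)
E = 0.0
gI = np.full(steps, gI_rest)
gI[int(100/dt):int(150/dt)] = 0.0     # "store": transiently decrease inhibition
gI[int(250/dt):int(300/dt)] = 0.8     # "reset": transiently increase inhibition

for n in range(steps):
    E += dt / tau * (-E + f(E + hE - gI[n]))
    if n % int(50/dt) == 0:
        print(f"t = {n*dt:5.0f} ms   inhibition = {gI[n]:.1f}   E = {E:.2f}")
# E jumps to ~1 during the store pulse and stays high after inhibition returns
# to rest (the memory), then falls back to ~0 after the reset pulse and stays there.
```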
Networks of Spiking Neurons • Problems with the previous approach: • Spiking neurons have monotonic I-f curves (which saturate, but only at very high firing rates) • How do you store more than one memory? • What is the role of spontaneous activity?
Networks of Spiking Neurons [Figure: firing rate R(Ij) as a function of input current Ij]
Networks of Spiking Neurons • A memory network must be able to store a value in the absence of any input:
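The slide's equation is not reproduced; one standard single-population formulation of this requirement, using the symbols that appear in the neighbouring figures (Ii the recurrent current, R the f-I curve, c the recurrent gain, Iaff the afferent input) and an assumed time constant τ, is

\[
\tau \frac{dI_i}{dt} = -I_i + c\,R(I_i) + I_{\mathrm{aff}},
\]

so a memory state is a self-consistent solution I_i^* = c\,R(I_i^*) that persists after the afferent input Iaff is removed.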
Networks of Spiking Neurons [Figure: recurrent drive cR(Ii) versus Ii, with afferent input Iaff]
Networks of Spiking Neurons • With a non-saturating activation function and no inhibition, the neurons must be spontaneously active for the network to admit a nonzero stable state. [Figure: cR(Ii) versus Ii, stable state at I2*]
Networks of Spiking Neurons • To get several stable fixed points, we need inhibition. [Figure: stable fixed points and an unstable fixed point along Ii, with the memory state at I2*]
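A minimal numerical sketch of this fixed-point structure. The slides do not give the equations, so the form of the inhibition is an assumption: here a non-saturating, expansive f-I curve R(I) = [I]+^2 is combined with divisive inhibition, which yields a resting state at zero, an unstable threshold, and a stable memory state.

```python
import numpy as np

# Assumed ingredients: expansive, non-saturating f-I curve + divisive inhibition
R = lambda I: np.maximum(I, 0.0) ** 2
c, g = 3.0, 1.0                               # recurrent gain, inhibition strength
F = lambda I: c * R(I) / (1.0 + g * R(I))     # effective recurrent drive

# Fixed points of I = F(I): locate sign changes of F(I) - I on a grid
I = np.linspace(0.0, 5.0, 50001)
diff = F(I) - I
idx = np.where(np.sign(diff[:-1]) != np.sign(diff[1:]))[0]

eps = 1e-4
for i in idx:
    I_star = I[i]
    # for tau dI/dt = -I + F(I), a fixed point is stable when F'(I*) < 1
    slope = (F(I_star + eps) - F(I_star - eps)) / (2 * eps)
    kind = "stable" if abs(slope) < 1 else "unstable"
    print(f"fixed point I* = {I_star:.3f}  ({kind})")
# With these values: stable rest at 0, unstable threshold near 0.38,
# and a stable memory state near 2.62.
```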
Networks of Spiking Neurons • Clamping the input: inhibitory Iaff [Figure]
Networks of Spiking Neurons • Clamping the input: excitatory Iaff [Figure: cR(Ii) versus Ii, with I2* and Iaff]
Networks of Spiking Neurons [Figure: firing rate R(Ij) as a function of input current Ij]
Networks of Spiking Neurons • Major Problem: the memory state has a high firing rate and the resting state is at zero. In reality, there is spontaneous activity at 0-10 Hz and the memory state is around 10-20 Hz (not 100 Hz) • Solution: you don’t want to know (but it involves a careful balance of excitation and inhibition)…
Line Attractor Networks • Continuous attractor: line attractor or N-dimensional attractor • Useful for storing analog values • Unfortunately, it’s virtually impossible to get a single neuron to hold a sustained activity level proportional to a stored value
Line Attractor Networks • Storing analog values: difficult with this scheme… [Figure: cR(Ii) versus Ii]
Line Attractor Networks • Implication for transmitting rate and integration… [Figure: cR(Ii) versus Ii]
Line Attractor Networks • Head direction cells [Figure: tuning curves, activity vs. preferred head direction (deg)]
Line Attractor Networks • Attractor network with population code • Translation-invariant weights [Figure: activity vs. preferred head direction (deg)]
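A minimal sketch of such an attractor network with a population code over preferred head directions and translation-invariant (circular) weights. The squaring-plus-divisive-normalization nonlinearity and all parameter values below are assumptions made for illustration, not taken from the slides.

```python
import numpy as np

N = 100
theta = np.linspace(-np.pi, np.pi, N, endpoint=False)   # preferred directions

# Translation-invariant weights: depend only on the difference of preferred directions
sigma_w = 0.3
W = np.exp((np.cos(theta[:, None] - theta[None, :]) - 1.0) / sigma_w**2)

S, mu = 0.1, 0.004                                       # normalization constants

def step(o):
    u = W @ o                                            # recurrent drive
    return u**2 / (S + mu * np.sum(u**2))                # squaring + divisive norm.

# Noisy initial hill of activity centered on 30 degrees
rng = np.random.default_rng(1)
theta0 = np.deg2rad(30.0)
o = 10.0 * np.exp((np.cos(theta - theta0) - 1.0) / sigma_w**2) + rng.normal(0, 1, N)
o = np.maximum(o, 0.0)

for t in range(20):
    o = step(o)
# The network relaxes to a smooth, self-sustained bump that stays where it was cued:
peak = np.rad2deg(theta[np.argmax(o)])
print(f"bump peak at {peak:.1f} deg, amplitude {o.max():.1f}")
```

Because the weights depend only on differences of preferred directions, a bump centred anywhere on the ring is equally stable, which is exactly the continuum of memory states a line (ring) attractor provides.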
Line Attractor Networks • Computing the weights:
Line Attractor Networks • The problem with the previous approach is that the weights tend to oscillate. Instead, we minimize: • The solution is:
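The two missing equations are not reproduced here; one standard formulation consistent with this description (the symbols F, W and λ are my labels, not the slide's) is to collect the desired activity profiles, i.e. the bump centred at each location, as the columns of a matrix F and choose the recurrent weights W by regularized least squares,

\[
\min_W \; \lVert F - W F \rVert^2 + \lambda \lVert W \rVert^2
\quad\Longrightarrow\quad
W = F F^{\top}\left(F F^{\top} + \lambda I\right)^{-1},
\]

so that every shifted bump is approximately a fixed point of the recurrence while the λ‖W‖² term smooths out the oscillatory weights that an unregularized (pseudoinverse) solution tends to produce.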
Line Attractor Networks • Updating the memory: bias in the weights, integration of velocity, etc.
Line Attractor Networks • How do we know that the fixed points are stable? With symmetric weights, the network has a Lyapunov function (Cohen, Grossberg 1982):
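The slide's expression is not shown; assuming firing-rate dynamics of the form τ du_i/dt = −u_i + Σ_j w_ij g(u_j) + h_i, with symmetric weights and a monotonically increasing gain g, one standard Lyapunov function of the Cohen-Grossberg type is

\[
L = \sum_i \int_0^{u_i} s\, g'(s)\, ds \;-\; \tfrac{1}{2}\sum_{i,j} w_{ij}\, g(u_i)\, g(u_j) \;-\; \sum_i h_i\, g(u_i),
\]

for which dL/dt = −τ Σ_i g'(u_i) (du_i/dt)² ≤ 0, so the dynamics can only flow downhill on L and must settle at fixed points; this is how stability is established.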
Line Attractor Networks • Line attractor: the set of stable points forms a line in activity space. • Limitations: Requires symmetric weights… • Neutrally stable along the attractor: unavoidable drift
Memorized Saccades [Figure: fixation point and two flashed targets, T1 and T2]
Memorized Saccades [Figure: retinal target vectors R1 and R2 and saccade vectors S1 and S2] S1 = R1, S2 = R2 - S1
Memorized Saccades [Figure: the double-saccade sequence, S1 to T1 followed by S2 to T2]
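A small worked example of this updating (the numbers are hypothetical): suppose the two targets flash at retinal locations R1 = (10°, 0°) and R2 = (10°, 10°) while the eyes are still at the fixation point. The first saccade is simply S1 = R1 = (10°, 0°). The stored retinal vector R2 is no longer correct once the eyes have moved, so the second saccade must be remapped:

\[
S_2 = R_2 - S_1 = (10^\circ, 10^\circ) - (10^\circ, 0^\circ) = (0^\circ, 10^\circ).
\]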