
Lecture 13: Associative Memory



  1. Lecture 13: Associative Memory
  References:
  D Amit, N Brunel, Cerebral Cortex 7, 237-252 (1997)
  N Brunel, Network 11, 261-280 (2000)
  N Brunel, Cerebral Cortex 13, 1151-1161 (2003)
  J Hertz, in Models of Neural Networks IV (L van Hemmen, J Cowan and E Domany, eds), Springer Verlag, 2002; Sect. 1.4

  6. What is associative memory?
  • “Patterns”: firing activity of specific sets of neurons (Hebb: “assemblies”)
  • “Store” patterns in synaptic strengths
  • Recall: given an input (initial activity pattern) not equal to any stored pattern, the network dynamics should take it to the “nearest” (most similar) stored pattern (categorization, error correction, …)

  11. Implementation in a balanced excitatory-inhibitory network
  Model (Amit & Brunel):
  • p non-overlapping excitatory subpopulations, each of size n = fN (fp < 1)
  • stronger connections within subpopulations (“assemblies”)
  • weakened connections between subpopulations
  • Looking for selective states: higher rates in a single assembly

  19. Model
  Like the Amit-Brunel model (Lecture 9), except for the exc-exc synapses:
  • From within the same assembly: strengthened (“Hebb” rule)
  • From outside the assembly: weakened (“anti-Hebb”)
  • Otherwise: no change
  • To conserve the average strength, the strengthening and weakening factors are tied to each other (see the sketch below).
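The modification factors themselves are not in the transcript. A minimal sketch, assuming the usual Amit-Brunel parametrization with a potentiation factor g_+ > 1 and a depression factor g_- < 1 (g_+ and J_11 are the symbols used later in the lecture; the exact normalization below is an assumption): synapses onto a neuron belonging to an assembly obey

  \[ J \to g_+ J_{11} \quad \text{from within the same assembly}, \qquad J \to g_- J_{11} \quad \text{from outside it}, \]

and conserving the average strength of the synapses onto an assembly neuron ties the two factors together:

  \[ f g_+ + (1-f)\, g_- = 1 \quad\Longrightarrow\quad g_- = \frac{1 - f g_+}{1 - f}. \]

This is what keeps the spontaneous-activity state (all assemblies inactive) the same as in the unstructured network.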

  24. Mean field theory
  Rates:
  • r_act: active assembly
  • r_+: inactive assemblies
  • r_1: rest of the excitatory neurons
  • r_2: inhibitory neurons
  • r_ext: external input neurons
  Input currents: to neurons in the active assembly, to the rest of the assemblies, to the other excitatory neurons, and to the inhibitory neurons (see the sketch below).
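The current equations themselves did not survive in the transcript. A minimal reconstruction, assuming the conserved-average synaptic rule sketched above and writing J_11 for the total mean exc→exc coupling onto a neuron, J_12 for the inh→exc coupling, and μ_ext for the external drive (these names are assumptions, not the lecture's): the mean input to a neuron in the active assembly, and to an excitatory neuron outside all assemblies, would be roughly

  \[ \mu_{act} = \tau_m J_{11}\,\big[\, f g_+ r_{act} + (p-1) f g_- r_+ + (1-pf)\, g_- r_1 \,\big] \;-\; \tau_m J_{12}\, r_2 \;+\; \mu_{ext}, \]
  \[ \mu_{1} = \tau_m J_{11}\,\big[\, f r_{act} + (p-1) f r_+ + (1-pf)\, r_1 \,\big] \;-\; \tau_m J_{12}\, r_2 \;+\; \mu_{ext}, \]

with the analogous expressions for neurons in the inactive assemblies (own assembly weighted by g_+, all other excitatory input by g_-) and for the inhibitory population (baseline couplings J_21, J_22).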

  31. Mean field theory (2)
  Noise variances (white noise approximation), and the rate of an I&F neuron driven by white noise (see the equations below).
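The equations on these slides are missing from the transcript. The standard forms used in the cited papers (Amit & Brunel 1997; Brunel 2000) are, for a neuron receiving uncorrelated Poisson inputs at rates r_b through synapses of efficacy J_b (in mV),

  \[ \mu = \tau_m \sum_b J_b\, r_b, \qquad \sigma^2 = \tau_m \sum_b J_b^2\, r_b, \]

and, for an integrate-and-fire neuron (membrane time constant τ_m, refractory period τ_rp, threshold θ, reset V_r) driven by white noise of mean μ and variance σ²,

  \[ r = \left[\, \tau_{rp} + \tau_m \sqrt{\pi} \int_{(V_r-\mu)/\sigma}^{(\theta-\mu)/\sigma} e^{u^2}\,\big(1+\mathrm{erf}(u)\big)\, du \,\right]^{-1}. \]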

  39. Spontaneous activity
  All assemblies inactive: r_+ = r_1, and the mean-field equations for the assemblies reduce to those for the rest of the excitatory population; similarly for the noise variances, and for the inhibitory population. Solve for the self-consistent rates (see the sketch below).
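A minimal sketch of what the reduced equations should look like, under the reconstruction above: with r_act = r_+ = r_1 and the average synaptic strength conserved, the g_+ / g_- structure drops out of the mean inputs, so the excitatory and inhibitory rates obey the same two self-consistency equations as the unstructured balanced network,

  \[ r_1 = \phi\big(\mu_1(r_1, r_2),\, \sigma_1(r_1, r_2)\big), \qquad r_2 = \phi\big(\mu_2(r_1, r_2),\, \sigma_2(r_1, r_2)\big), \]

with φ the integrate-and-fire rate function quoted above. These two equations are solved numerically for r_1 and r_2.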

  47. Simplified model (Brunel 2000)
  • pf << 1
  • g_+ ~ 1/f >> 1
  • variances: σ_+ = σ_act, σ_1 as in the spontaneous-activity state
  Define L = f J_11 g_+. Then:
  (1) the spontaneous-activity state has r_+ = r_1;
  (2) in a recall state with r_act > r_+, r_1 and r_2 are the same as in the spontaneous-activity state;
  (3) r_act is determined by a single self-consistency condition (see the sketch below).
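The determining condition for r_act is not in the transcript; a plausible reconstruction, consistent with the definition L = f J_11 g_+ and the assumptions above, is that a neuron in the active assembly sees the background input plus a self-excitation term proportional to L:

  \[ r_{act} = \phi\big(\mu_1 + \tau_m L\,(r_{act} - r_1),\; \sigma_1\big), \qquad L = f J_{11}\, g_+, \]

a single fixed-point equation in r_act once the spontaneous-state quantities μ_1, σ_1 and r_1 are known.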

  49. Graphical solution (r → ν)
  (The L in this figure equals our L times τ_m.)
  The one-assembly memory/recall state is stable for large enough L (or g_+); this approximately describes “working memory” in prefrontal cortex.
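To make the graphical construction concrete, here is a small numerical sketch in Python. The parameter values, and the specific form ν = φ(μ_1 + L(ν − r_1)), follow the reconstruction above rather than the lecture's exact figure, so everything here is illustrative. It evaluates the integrate-and-fire rate function against the diagonal and reports the crossings; for large enough L there are three: the spontaneous state, an unstable middle point, and the elevated recall state.

import numpy as np
from scipy.integrate import quad
from scipy.special import erfcx

def lif_rate(mu, sigma, tau_m=0.020, tau_rp=0.002, theta=20.0, v_reset=10.0):
    """Stationary firing rate (Hz) of an integrate-and-fire neuron driven by white
    noise with mean input mu and amplitude sigma (mV): the Ricciardi formula quoted
    above.  erfcx(-u) = exp(u^2) * (1 + erf(u)) keeps the integrand numerically stable."""
    a = (v_reset - mu) / sigma
    b = (theta - mu) / sigma
    integral, _ = quad(lambda u: erfcx(-u), a, b)
    return 1.0 / (tau_rp + tau_m * np.sqrt(np.pi) * integral)

# Illustrative (assumed) numbers, not the lecture's: background mean input and noise
# in mV, and the effective self-coupling L in mV per Hz (the figure's L, i.e. ours * tau_m).
mu1, sigma1 = 15.0, 2.5
L = 0.30

r1 = lif_rate(mu1, sigma1)            # spontaneous excitatory rate
nu = np.linspace(0.05, 200.0, 1000)   # candidate rates for the active assembly (Hz)
phi = np.array([lif_rate(mu1 + L * (n - r1), sigma1) for n in nu])

# Fixed points of nu = phi(mu1 + L*(nu - r1)): sign changes of phi - nu locate the
# crossings with the diagonal.  nu = r1 (the spontaneous state) is always one of them;
# for large enough L an unstable middle crossing and a high recall state appear as well.
idx = np.where(np.diff(np.sign(phi - nu)) != 0)[0]
print("spontaneous rate r1 ~ %.2f Hz" % r1)
print("approximate fixed-point rates (Hz):", np.round(nu[idx], 1))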

  50. Capacity problem
  In this model the memory assemblies were non-overlapping, which limits the number of storable patterns to p < 1/f (since fp < 1). This is unrealistic.
