
Scaling-up Cortical Representations in Fluctuation-Driven Systems






Presentation Transcript


  1. Scaling-up Cortical Representations in Fluctuation-Driven Systems. David W. McLaughlin, Courant Institute & Center for Neural Science, New York University, http://www.cims.nyu.edu/faculty/dmac/. Cold Spring Harbor -- July '04

  2. In collaboration with: David Cai, Louis Tao, Michael Shelley, Aaditya Rangan

  3. Lateral Connections and Orientation -- Tree Shrew. Bosking, Zhang, Schofield & Fitzpatrick, J. Neuroscience, 1997

  4. Coarse-Grained Asymptotic Representations Needed for “Scale-up”

  5. Cortical networks have very noisy dynamics: strong temporal fluctuations, on the synaptic timescale, with fluctuation-driven spiking.

  6. Experimental Observation: Fluctuations in Orientation Tuning (cat data from Ferster's lab). Ref: Anderson, Lampl, Gillespie & Ferster, Science 290, 1968-1972 (2000). [Figure: intracellular voltage traces fluctuating about the spike threshold near -65 mV.]

  7. Fluctuation-driven spiking (very noisy dynamics, on the synaptic time scale). Solid: average (over 72 cycles); dashed: 10 temporal trajectories.

  8. To accurately and efficiently describe these networks requires that fluctuations be retained in a coarse-grained representation.
  • "Pdf" representations $\rho_\lambda(v,g;x,t)$, $\lambda = E, I$, will retain fluctuations,
  • but will not be very efficient numerically.
  • Needed: a reduction of the pdf representation which retains means & variances.
  • PT #1: Kinetic theory provides this representation. Ref: Cai, Tao, Shelley & McLaughlin, PNAS, pp. 7757-7762 (2004)

  9. First, tile the cortical layer with coarse-grained (CG) patches

  10. Kinetic theory begins from pdf representations $\rho_\lambda(v,g;x,t)$, $\lambda = E, I$. • Knight & Sirovich; • Tranchina, Nykamp & Haskell

  11. First, replace the 200 neurons in this CG cell by an effective pdf representation; • then derive kinetic theory from the pdf representation. • For convenience of presentation, I'll sketch the derivation for a single CG cell with 200 excitatory integrate-and-fire neurons. • The results extend to interacting CG cells which include inhibition, as well as "simple" & "complex" cells.

  12. N excitatory neurons (within one CG cell) • Random coupling throughout the CG cell; • AMPA synapses (with time scale $\sigma$):
  $$\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$$
  $$\sigma\,\partial_t g_i = -g_i + f \sum_l \delta(t - t_l) + (S_a/N) \sum_{l,k} \delta(t - t_l^k)$$

  13. N excitatory neurons (within one CG cell) • All-to-all coupling; • AMPA synapses (with time scale $\sigma$):
  $$\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$$
  $$\sigma\,\partial_t g_i = -g_i + f \sum_l \delta(t - t_l) + (S_a/N) \sum_{l,k} \delta(t - t_l^k)$$
  $$\rho(g,v,t) \simeq N^{-1} \sum_{i=1}^{N} \mathrm{E}\big\{\delta[v - v_i(t)]\,\delta[g - g_i(t)]\big\},$$
  with the expectation "E" taken over the Poisson spike trains.
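For orientation, here is a minimal Monte Carlo sketch of this patch model; the parameter values, and the normalized units ($V_R = 0$, firing threshold $V_T = 1$, $V_E = 14/3$), are illustrative assumptions rather than values taken from the talk:

```python
import numpy as np

# Monte Carlo sketch of the N-neuron excitatory I&F patch:
#   tau dv_i/dt   = -(v_i - V_R) - g_i (v_i - V_E)
#   sigma dg_i/dt = -g_i + f * sum_l delta(t - t_l) + (S_a/N) * sum_{l,k} delta(t - t_l^k)
# All parameter values below are illustrative assumptions.

rng = np.random.default_rng(0)

N     = 200        # neurons in one coarse-grained cell
tau   = 0.020      # membrane time constant (s)
sigma = 0.005      # AMPA synaptic time scale (s)
V_R, V_E, V_T = 0.0, 14.0 / 3.0, 1.0
f     = 0.01       # strength of each external (LGN) spike
S_a   = 0.05       # total recurrent cortical strength
nu0   = 30.0       # external Poisson rate per neuron (Hz)

dt, T = 1e-4, 1.0
v = np.zeros(N)
g = np.zeros(N)
spike_count = 0

for step in range(int(T / dt)):
    # Independent Poisson input to each neuron; each external spike,
    # forced through sigma dg/dt = ... + f delta(t), kicks g by f/sigma.
    g += rng.poisson(nu0 * dt, size=N) * (f / sigma)
    g -= dt * g / sigma                        # synaptic relaxation
    v += dt * (-(v - V_R) - g * (v - V_E)) / tau
    fired = v >= V_T
    n_sp = int(fired.sum())
    if n_sp:                                   # spike: reset, kick everyone
        spike_count += n_sp
        v[fired] = V_R
        g += n_sp * S_a / (N * sigma)          # all-to-all cortical kicks

print(f"population firing rate ~ {spike_count / (N * T):.1f} Hz")
```

A direct simulation of this kind is the reference against which the kinetic reduction in the following slides is measured.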

  14. Evolution of the pdf $\rho(g,v,t)$ for the I&F dynamics above, assuming (i) $N \gg 1$, and (ii) that the total input to each neuron is a (modulated) Poisson spike train:
  $$\partial_t \rho = \tau^{-1}\,\partial_v\big\{[(v - V_R) + g(v - V_E)]\,\rho\big\} + \partial_g\big\{(g/\sigma)\,\rho\big\} + \nu_0(t)\,\big[\rho(v,\,g - f/\sigma,\,t) - \rho(v,g,t)\big] + N\,m(t)\,\big[\rho(v,\,g - S_a/(N\sigma),\,t) - \rho(v,g,t)\big],$$
  where $\nu_0(t)$ is the modulated rate of the Poisson spike train from the LGN, and $m(t)$ is the average firing rate of the neurons in the CG cell:
  $$m(t) = \int J_v(v,g;\rho)\big|_{v=1}\,dg, \qquad J_v(v,g;\rho) = -\tau^{-1}\big[(v - V_R) + g(v - V_E)\big]\rho$$
  (voltage normalized so that the firing threshold is $v = 1$).

  15. Kinetic theory begins from moments of $\rho(g,v,t)$:
  • $\rho^{(g)}(g,t) = \int \rho(g,v,t)\,dv$
  • $\rho^{(v)}(v,t) = \int \rho(g,v,t)\,dg$
  • $\mu_1(v,t) = \int g\,\rho(g,t\,|\,v)\,dg$, where $\rho(g,v,t) = \rho(g,t\,|\,v)\,\rho^{(v)}(v,t)$,
  applied to the evolution equation
  $$\partial_t \rho = \tau^{-1}\,\partial_v\big\{[(v - V_R) + g(v - V_E)]\,\rho\big\} + \partial_g\big\{(g/\sigma)\,\rho\big\} + \nu_0(t)\,\big[\rho(v,\,g - f/\sigma,\,t) - \rho(v,g,t)\big] + N\,m(t)\,\big[\rho(v,\,g - S_a/(N\sigma),\,t) - \rho(v,g,t)\big].$$
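For instance (a step not spelled out on the slide), integrating the evolution equation over $g$ yields the first kinetic equation directly, since the $\partial_g$ flux term and both jump terms only move probability in $g$ and integrate to zero:

```latex
\partial_t \rho^{(v)}(v,t)
  = \tau^{-1}\,\partial_v \int \big[(v - V_R) + g\,(v - V_E)\big]\,\rho\,dg
  = \tau^{-1}\,\partial_v \Big\{ \big[(v - V_R) + \mu_1(v)\,(v - V_E)\big]\,\rho^{(v)}(v,t) \Big\},
\qquad \text{using } \int g\,\rho\,dg = \mu_1(v)\,\rho^{(v)}(v,t).
```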

  16. Under the conditions $N \gg 1$, $f \ll 1$, $\nu_0 f = O(1)$, and the closure:
  (i) $\partial_v \Sigma^2(v) = 0$;  (ii) $\Sigma^2(v) = \sigma_g^2$,
  where $\Sigma^2(v) = \mu_2(v) - \big(\mu_1(v)\big)^2$,
  $$\sigma_g^2 = \nu_0(t)\,\frac{f^2}{2\sigma} + m(t)\,\frac{S_a^2}{2N\sigma}, \qquad G(t) = \nu_0(t)\,f + m(t)\,S_a,$$
  one obtains:
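As a consistency check (not on the slide), both $G(t)$ and $\sigma_g^2$ follow from Campbell's theorem for Poisson shot noise passed through the synaptic filter $\sigma\,\dot g = -g + \ldots$:

```latex
% A Poisson train of rate \nu whose spikes each kick g by a, relaxing with
% time constant \sigma (filter h(t) = a e^{-t/\sigma}), has
\langle g \rangle = \nu \int_0^\infty a\,e^{-t/\sigma}\,dt = \nu a \sigma,
\qquad
\operatorname{var}(g) = \nu \int_0^\infty a^2 e^{-2t/\sigma}\,dt
                      = \tfrac{1}{2}\,\nu a^2 \sigma .
% LGN input:      \nu = \nu_0,   a = f/\sigma        => mean \nu_0 f,  var \nu_0 f^2/(2\sigma).
% Cortical input: \nu = N m(t),  a = S_a/(N\sigma)   => mean m S_a,    var m S_a^2/(2N\sigma).
% Summing the two independent contributions reproduces G(t) and \sigma_g^2 above.
```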

  17.
  $$\partial_t \rho^{(v)} = \tau^{-1}\,\partial_v\big[(v - V_R)\,\rho^{(v)} + \mu_1(v)\,(v - V_E)\,\rho^{(v)}\big]$$
  $$\partial_t \mu_1(v) = -\sigma^{-1}\big[\mu_1(v) - G(t)\big] + \tau^{-1}\big[(v - V_R) + \mu_1(v)(v - V_E)\big]\,\partial_v \mu_1(v) + \frac{\sigma_g^2}{\tau\,\rho^{(v)}}\,\partial_v\big[(v - V_E)\,\rho^{(v)}\big],$$
  together with a diffusion equation for $\rho^{(g)}(g,t)$:
  $$\sigma\,\partial_t \rho^{(g)} = \partial_g\big\{[g - G(t)]\,\rho^{(g)}\big\} + \sigma_g^2\,\partial_{gg}\,\rho^{(g)}.$$

  18. Fluctuations in $g$ are Gaussian:
  $$\sigma\,\partial_t \rho^{(g)} = \partial_g\big\{[g - G(t)]\,\rho^{(g)}\big\} + \sigma_g^2\,\partial_{gg}\,\rho^{(g)},$$
  whose steady state is a Gaussian with mean $G(t)$ and variance $\sigma_g^2$.
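A quick numerical check of this Gaussian law (a minimal sketch; the parameter values, and the restriction to a single LGN-driven neuron with no recurrence, are illustrative assumptions):

```python
import numpy as np

# Simulate one neuron's conductance under Poisson shot noise,
#   sigma dg/dt = -g + f * sum_l delta(t - t_l),
# and compare the stationary mean/variance with the diffusion equation's
# prediction: mean G = nu0*f, variance sigma_g^2 = nu0*f^2/(2*sigma).

rng = np.random.default_rng(1)
sigma, f, nu0 = 0.005, 2e-4, 5000.0     # nu0*sigma >> 1: many kicks per decay
dt, n_steps = 1e-5, 500_000

g = nu0 * f                              # start at the predicted mean
samples = np.empty(n_steps)
for k in range(n_steps):
    # Exponential relaxation plus Poisson kicks of size f/sigma.
    g += -g / sigma * dt + rng.poisson(nu0 * dt) * (f / sigma)
    samples[k] = g

print(f"mean(g): simulated {samples.mean():.4f}, predicted {nu0 * f:.4f}")
print(f"var(g):  simulated {samples.var():.6f}, "
      f"predicted {nu0 * f**2 / (2 * sigma):.6f}")
```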

  19. Fluctuation-Driven Dynamics. [Figure: pdf of $v$, kinetic theory vs. I&F simulation (solid) vs. Fokker-Planck; firing-rate curves, with the mean-driven limit (hard thresholding) indicated. Parameters: N = 75, $\sigma$ = 5 ms, S = 0.05, f = 0.01.]

  20. Bistability and Hysteresis. Network of "simple", excitatory-only cells, N = 16. [Figure: firing-rate curves in the mean-driven and fluctuation-driven cases, with relatively strong cortical coupling.]

  21. Bistability and Hysteresis. Network of "simple", excitatory-only cells, N = 16. [Figure: the mean-driven case, with relatively strong cortical coupling.]

  22. Computational Efficiency • For statistical accuracy in these CG patch settings, kinetic theory is 10^3 -- 10^5 times more efficient than I&F.

  23. Average Firing Rates vs. Spike-Time Statistics

  24. Coarse-grained theories involve local averaging in both space and time. • Hence, coarse-grained theories average out detailed spike-timing information. • This is fine for "rate codes", but if spike-timing statistics are to be studied, the coarse-grained approach must be modified.

  25. PT #2: Embedded point neurons will capture these statistical firing properties. [Ref: Cai, Tao & McLaughlin, PNAS (to appear)] • For "scale-up": computational efficiency, while maintaining the statistical firing properties of multiple neurons. • The model is especially relevant for biologically distinguished sparse, strong sub-networks -- perhaps such as long-range connections. • Point neurons are embedded in, and fully interacting with, the coarse-grained kinetic theory; • or, when kinetic theory is accurate by itself, they are embedded as "test neurons" (see the sketch below).
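To make the "test neuron" idea concrete, here is a minimal sketch: a single I&F point neuron whose conductance is not assembled spike-by-spike, but sampled from the Gaussian law the kinetic theory supplies -- an Ornstein-Uhlenbeck process with the KT mean $G$ and variance $\sigma_g^2$. Holding $G$ and $\sigma_g$ constant, and all parameter values, are assumptions for illustration; in the full scheme they would be read off the evolving KT solution:

```python
import numpy as np

# Embedded "test neuron": one I&F point neuron driven by a conductance
# sampled from the kinetic theory's Gaussian law (an OU process with
# mean G and stationary standard deviation sig_g). Values are assumptions.

rng = np.random.default_rng(2)
tau, sigma = 0.020, 0.005                  # membrane, synaptic time scales (s)
V_R, V_E, V_T = 0.0, 14.0 / 3.0, 1.0       # normalized units
G, sig_g = 0.35, 0.30                      # stand-ins for the KT moments

dt, T = 1e-4, 20.0
v, g = V_R, G
isi, t_last = [], 0.0

for step in range(int(T / dt)):
    t = step * dt
    # OU update: relax toward G with time scale sigma, stationary std sig_g.
    g += (G - g) / sigma * dt \
         + sig_g * np.sqrt(2 * dt / sigma) * rng.standard_normal()
    # Clip g at zero in the current (the Gaussian tail can go negative).
    v += dt * (-(v - V_R) - max(g, 0.0) * (v - V_E)) / tau
    if v >= V_T:                           # threshold crossing: spike, reset
        v = V_R
        isi.append(t - t_last)
        t_last = t

isi = np.array(isi)
if isi.size:
    print(f"{isi.size} spikes; mean ISI {1e3 * isi.mean():.1f} ms; "
          f"CV {isi.std() / isi.mean():.2f}")
else:
    print("no spikes in this run")
```

The ISI histogram of such a test neuron is what the next slides compare against a sample neuron from the full I&F network.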

  26. I&F vs. Embedded Network Spike Rasters. (a) I&F network: 50 "simple" cells, 50 "complex" cells; "simple" cells driven at 10 Hz. (b)-(d) Embedded I&F networks: (b) 25 "complex" cells replaced by a single kinetic equation; (c) 25 "simple" cells replaced by a single kinetic equation; (d) 25 "simple" and 25 "complex" cells replaced by kinetic equations. In all panels, cells 1-50 are "simple" and cells 51-100 are "complex". Rasters shown for 5 stimulus periods.

  27. Embedded Network vs. Full I&F Network: raster plots, cross-correlation, and ISI distributions. (Upper panels) KT of a neuronal patch with strongly coupled embedded neurons; (lower panels) full I&F network. Shown is the sub-network, with neurons 1-6 excitatory and neurons 7-8 inhibitory; EPSP time constant 3 ms; IPSP time constant 10 ms.

  28. "Test neuron" within a CG kinetic theory. ISI distributions for two simulations: (left) a test neuron driven by a CG neuronal patch; (right) a sample neuron in the I&F network.

  29. The Importance of Fluctuations. Cycle-averaged firing-rate curves [shown: the excitatory complex population in a 4-population model]: full I&F network (solid); full I&F + KT (dotted); full I&F coupled to full KT but with mean-only coupling (dashed). In both embedded cases (where the I&F units are coupled to KT), half the simple cells are represented by kinetic theory.

  30. Reverse Time Correlations • Correlates spikes against the driving signal • Triggered by the spiking neuron • A frequently used experimental technique to get a handle on one description of the system • $P(\theta, \tau)$ -- the probability of a grating of orientation $\theta$ at a time $\tau$ before a spike -- or an estimate of the system's linear response kernel as a function of $(\theta, \tau)$ (see the sketch below).
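A minimal sketch of the RTC estimator on synthetic placeholder data; the stimulus protocol, rates, preferred orientation, and latency below are assumptions for illustration, not the talk's:

```python
import numpy as np

# Reverse-time-correlation estimate P(theta, tau): given a rapidly flashed
# sequence of grating orientations and a spike train, histogram which
# orientation was on screen a time tau before each spike.

rng = np.random.default_rng(3)

frame_dt = 0.010                           # one orientation per 10 ms frame
n_frames = 60_000
thetas   = np.linspace(0, 180, 19)[:-1]    # 18 orientations (degrees)
stim     = rng.choice(len(thetas), size=n_frames)   # orientation index/frame

# Synthetic spikes (placeholder): fire preferentially ~40 ms after frames
# whose orientation is near 90 degrees (index 9), i.e. a 4-frame latency.
pref, lag = 9, 4
rate = 2.0 + 20.0 * (np.roll(stim, lag) == pref)    # Hz, per frame
spike_frames = np.flatnonzero(rng.random(n_frames) < rate * frame_dt)

# RTC histogram: P[i, j] ~ probability orientation i was shown tau_j = j*dt
# before a spike, normalized separately at each time lag.
n_lags = 12
P = np.zeros((len(thetas), n_lags))
for s in spike_frames:
    for j in range(min(n_lags, s + 1)):
        P[stim[s - j], j] += 1
P /= P.sum(axis=0, keepdims=True)

peak = P.argmax()
print(f"peak of P at theta = {thetas[peak // n_lags]:.0f} deg, "
      f"tau = {(peak % n_lags) * frame_dt:.3f} s")
```

With these placeholder settings, the peak recovers the built-in preference (about 90 degrees at a 40 ms lag), which is the sanity check one wants before applying the estimator to I&F or embedded-network spikes.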

  31. Reverse Correlation. Left: I&F network of 128 "simple" and 128 "complex" cells at a pinwheel center; RTC $P(\theta, \tau)$ for a single simple cell. Below: embedded network of 128 "simple" cells, with the 128 "complex" cells replaced by a single kinetic equation; RTC $P(\theta, \tau)$ for a single simple cell.

  32. Computational Efficiency • For statistical accuracy in these CG patch settings, kinetic theory is 10^3 -- 10^5 times more efficient than I&F; • the efficiency of the embedded sub-network scales as N^2, where N = number of embedded point neurons (e.g., reducing N from 100 to 20 cuts the cost from 10,000 to 400).

  33. Conclusions • Kinetic theory is a numerically efficient, and remarkably accurate, method for "scale-up". Ref: PNAS, pp. 7757-7762 (2004) • Kinetic theory introduces no new free parameters into the model, and has a large dynamic range, from the rapidly firing "mean-driven" regime to the fluctuation-driven regime. • Kinetic theory does not capture detailed "spike-timing" statistics. • Sub-networks of point neurons can be embedded within kinetic theory to capture spike-timing statistics, with a range from test neurons to fully interacting sub-networks. Ref: PNAS, to appear (2004)

  34. Conclusions and Directions
  • Constructing ideal network models to discern and extract possible principles of neuronal computation and function: mathematical methods for analytical understanding; the search for signatures of identified mechanisms.
  • Mean-driven vs. fluctuation-driven kinetic theories: a new closure; fluctuation and correlation effects; excellent agreement with the full numerical simulations.
  • Large-scale numerical simulations of structured networks, constrained by anatomy and other physiological observations, to compare with experiments: structural understanding vs. data modeling; new numerical methods for scale-up -- kinetic theory.

  35. Three Dynamic Regimes of Cortical Amplification: (1) weak cortical amplification -- no bistability/hysteresis; (2) near-critical cortical amplification; (3) strong cortical amplification -- bistability/hysteresis. [Figure: I&F firing-rate curves for regimes (1)-(3); excitatory cells shown.] A possible mechanism for orientation tuning of complex cells: regime 2 for far-field/well-tuned complex cells; regime 1 for near-pinwheel/less-tuned cells (summed effects).

  36. Summary & Conclusion

  37. Summary Points for Coarse-Grained Reductions Needed for Scale-up • Neuronal networks are very noisy, with fluctuation-driven effects. • Temporal scale-separation emerges from network activity. • Local temporal asynchrony is needed for the asymptotic reduction, and it results from synaptic failure. • Cortical maps -- both spatially regular and spatially random -- tile the cortex; asymptotic reductions must handle both. • Embedded neuron representations may be needed to capture spike-timing codes and coincidence detection. • PDF representations may be needed to capture synchronized fluctuations.

  38. Scale-up & Dynamical Issues for Cortical Modeling of V1 • Temporal emergence of visual perception • Role of spatial & temporal feedback -- within and between cortical layers and regions • Synchrony & asynchrony • Presence (or absence) and role of oscillations • Spike-timing vs. firing-rate codes • A very noisy, fluctuation-driven system • Emergence of an activity-dependent separation of time scales -- but often no (or little) temporal scale separation

  39. Under assumptions (1) and (2) (formulas shown on slide), summed intra-cortical low-rate spike events become Poisson.

  40. Closures: (i) $\partial_v \Sigma^2(v) = 0$; (ii) $\Sigma^2(v) = \sigma_g^2$ (as stated on slide 16).
