NENS220 Computational methods in Neuroscience John Huguenard and Terry Sanger
Goals of the course • Overview of computational methods • Mathematical techniques for creating models of neural behavior - the tools of computational methods
Computational Modeling • The ultimate purpose is to relate different levels (scales) of neural behavior • e.g.: how do properties of ion channels determine the spiking behavior in response to synaptic input? • e.g.: what is the relationship between spike activity in a population of M1 neurons and movement of the arm?
Scope of the course • This is essentially an overview of some (but not all) of the general methods • Intended for graduate students in neuroscience • In order to learn how this is done, you will have to practice • Necessarily involves knowledge of statistics, mathematics, and some computer programming (matlab, NEURON)
Background material • Probability theory • Information theory • Matrix algebra • Correlation integrals • Fourier analysis • Matlab programming • Membrane potentials • Cable theory
Background review • We will do much of this as we go. • Additional help in TA sessions • You may need to do extra reading
Two major areas • I: Neurons • How information is processed at the level of synapses, membranes, and dendrites • Relationship between inputs, membrane potentials, and spike generation • II: Spikes: • What information is carried in single spikes, temporal sequences of spikes, and spikes over populations • How learning results in changes in spike patterns
Textbook • Theoretical Neuroscience, Peter Dayan and Larry Abbott (MIT Press: Cambridge MA), 2001. • Available from Amazon.com and the Stanford bookstore, about $45 • Other useful references: • Neural Engineering, Eliasmith and Anderson • Spikes, Bialek • Computational Neuroscience, Churchland and Sejnowski • Handbook of computational neuroscience, Arbib • Foundations of Cellular Neurophysiology, Johnston and Wu • You must have access to a workstation with matlab/NEURON. • Matlab available on cluster computers (firebirds, etc.) • NEURON available for multiple platforms via free download • We can set up accounts on linux machines with NEURON installed.
Class structure • Tuesdays and Thursdays, 3:15-5:00pm. Room H3150. • Tuesdays will be lectures • Lecture will usually follow the text chapters; you may want to read these in advance • A paper will be assigned, to be read before Thursday (first paper assigned next Tuesday) • Thursdays will be discussions of the assigned paper and the lecture, led by the TA. THESE ARE REQUIRED.
Homework assignments • 4-6 problem sets during the quarter. They will be assigned on a Tuesday and due the Thursday of the following week (9 days later) • Will usually require simulation of some component of the paper being discussed • Will require use of matlab/NEURON. You should submit the program output and source code with detailed comments • Should require 1-3 hours, depending on how good you are with matlab/NEURON
Grading • 70% weekly assignments • Based on output plots, code, and comments • 30% Class participation • Based on contributions to discussion groups
Part I Modeling of realistic neurons and networks John Huguenard
Neuroelectronics, Part I John Huguenard
The big picture, a la Terry Sanger • An “external signal” x(t) is something that the experimenter controls (either a sensory stimulus or a learned motor task) • We observe spikes that are the result of a transformation of the external signal • [Block diagram: External World → Sensors → Spike Generator; x(t) in, spikes out]
The big picture, a la John Huguenard • Neurons receive synaptic input • Neurons produce output • The currency of neuronal communication is spikes (action potentials) • Spike generation is in many cases a nonlinear function of synaptic input • [Block diagram: External World → Sensors → Spike Generator; x(t) in, spikes out]
Why it is important to consider neuronal properties • STDP, dendritic back-propagation, dendritic signaling • Resonance • Oscillations • Synchronization • Gain control • Persistent activity • Phase precession • Coincidence detection vs. integration
Pyramidal neurons in layer V (thy1-YFP mouse). Feng et al. (2000) Neuron 28:41. Scale bar: 200 µm
Canonical microcircuits • Recurrent excitatory connections are prominent • Function: amplification of signals for enhanced feature detection • Rodney Douglas & Kevan Martin
Inhibitory cells are sparse (distributed across cortical layers I-VI)
Inhibitory interneuron diversity • Modified from Karube et al. (2004) J Neurosci 24:2853-65
Morphology can influence firing pattern Mainen & Sejnowski, 1996
Electrical properties of neurons • Dominated by membrane capacitance • Neurons are integrators whose time constant is dynamically variable • Spike output depends on voltage-dependent gating of ion channels
Passive properties of neurons • Semipermeable lipid bilayer membrane with high [K+]i maintained by an electrogenic pump (Na+/K+-ATPase) • Equivalent radius ~25 µm, surface area ~8000 µm² = 0.008 mm² = 8e-5 cm²
Electrical capacitance • Ability to store charge • Charge required to create a potential difference between two conductors • A 1 farad capacitor stores 1 coulomb per volt • Hille, 2001
Capacitance of cell membranes • Capacitance is a function of • Surface area (A) • Dielectric constant (ε) • Distance between plates (d) • For membranes, specific capacitance is ~1 µF/cm² and is for the most part invariant • For an 8e-5 cm² (0.008 mm²) cell this gives ~80 pF • Hille, 2001
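A quick matlab check of the numbers above (the 25 µm equivalent radius comes from the passive-properties slide; 1 µF/cm² is the specific capacitance quoted here):

cm = 1e-6;              % specific membrane capacitance, F/cm^2 (1 uF/cm^2)
r = 25e-4;              % equivalent radius, cm (25 um)
A = 4*pi*r^2;           % surface area, cm^2 (~8e-5 cm^2)
Cmem = cm*A;            % total capacitance, F
fprintf('A = %.2g cm^2, C = %.0f pF\n', A, Cmem*1e12)   % ~80 pF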
Resting potential, single permeant ion • Nernst potentials: EK = -75 mV, ENa = +50 mV, ECl = -60 mV, ECa = +100 mV • Nernst equation: ES = (RT/zF) ln([S]o/[S]i)
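The Nernst equation is easy to evaluate in matlab. A minimal sketch at 37 °C, using the K+ concentrations from the next slide:

R = 8.314; T = 310; F = 96485;     % gas constant, temperature (K), Faraday
z = 1;                             % valence of K+
Ko = 3e-3; Ki = 130e-3;            % [K+]o and [K+]i in molar
EK = (R*T/(z*F))*log(Ko/Ki);       % Nernst potential, volts
fprintf('EK = %.0f mV\n', EK*1e3)  % about -100 mV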
Uncompensated charge • [K+]i 130 mM • [K+]o 3 mM • EK ≈ -100 mV • q = CV = 80 pF × 100 mV = 8 pC ≈ 50e6 K+ ions • Total internal K+ ≈ 5e12 ions • Fraction uncompensated ≈ 0.001% • Will vary with surface-to-volume ratio
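These numbers can be reproduced in a few lines of matlab (the 80 pF capacitance and 25 µm radius come from the earlier slides):

C = 80e-12;  V = 0.100;            % capacitance (F) and potential (V)
q = C*V;                           % charge needed: 8 pC
nUncomp = q/1.602e-19;             % ~5e7 uncompensated K+ ions
r = 25e-6;                         % radius, m
vol = (4/3)*pi*r^3*1e3;            % cell volume in litres
nTotal = 130e-3*6.022e23*vol;      % total internal K+ ions, ~5e12
fprintf('fraction = %.4f%%\n', 100*nUncomp/nTotal)   % ~0.001%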
Membrane Resistance • Ion-selective pores • Ohm's law: E = IR • 1 V is the potential difference produced by 1 A passing through 1 ohm • Conductance is the reciprocal of resistance, 1 siemens = 1 ohm⁻¹ • Resistance depends on the length of the conductive path, the cross-sectional area, and the resistivity of the medium • Ion channels have conductances in the 2-250 pS range, but may open only briefly
Input Resistance • “Leak” channels are open at rest and determine the input resistance • i.e. the impedance to extrinsic current injection • Specific input resistance for neurons is in the range of 1 MΩ·mm², i.e. a specific conductance of 1 µS/mm² • 50,000 × 20 pS leak channels per mm² = 1 channel per 20 µm² • Our “typical” cell of 0.008 mm² would have an input resistance of 125 MΩ, or an input conductance of 8 nS (equivalent to ~400 open leak channels).
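A sanity check of those numbers in matlab:

gLeak = 1e-6;              % specific leak conductance, S/mm^2 (1 uS/mm^2)
A = 0.008;                 % surface area, mm^2
Gin = gLeak*A;             % input conductance, S (8 nS)
Rin = 1/Gin;               % input resistance, ohms (125 Mohm)
nChan = Gin/20e-12;        % equivalent number of open 20 pS channels (~400)
fprintf('Rin = %.0f Mohm, ~%.0f channels\n', Rin/1e6, nChan)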
Resting membrane potential, >1 permeant ion • Parallel conductance model: Vm = (gK·EK + gNa·ENa + gCl·ECl)/(gK + gNa + gCl)
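A minimal matlab sketch of the parallel conductance model; the conductance values here are illustrative assumptions, not values from the slides:

EK = -75e-3; ENa = 50e-3; ECl = -60e-3;   % reversal potentials, V
gK = 6e-9; gNa = 2e-9; gCl = 1e-9;        % illustrative conductances, S
Vm = (gK*EK + gNa*ENa + gCl*ECl)/(gK + gNa + gCl);
fprintf('Vm = %.0f mV\n', Vm*1e3)         % conductance-weighted average of the E's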
Ohmic channels • Characterized by an open-channel I/V that is linear • I = (Vrev - Vm)/R • I = (Vrev - Vm) · g • [I/V plot: slope = g, intercept at Vrev]
Non-ohmic channels • Goldman-Hodgkin-Katz (GHK) theory • Ions pass independently • Electrical field within membrane is constant • sometimes known as the constant field equations • GHK current equation (flux in two directions) • GHK Voltage equation
Nonlinear driving force • GHK current equation (shown for [S]i > [S]o) • A better description than ohmic for some channels, e.g. Ca2+, K+
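A sketch of the GHK current equation, I = PzFu·(Si - So·e^(-u))/(1 - e^(-u)) with u = zFV/RT, for a K+-like gradient; the permeability value is an illustrative assumption, and ghk_current is just a local helper name (keep the function at the end of the script, or in its own file):

V = (-100:100)*1e-3;                         % membrane potential, V
IK = ghk_current(V, 1e-6, 1, 130e-6, 3e-6);  % [K]i = 130 mM, [K]o = 3 mM
plot(V*1e3, IK); xlabel('V (mV)'); ylabel('I (A/cm^2)')

function I = ghk_current(V, P, z, Si, So)
  % V in volts, P in cm/s, concentrations in mol/cm^3; returns A/cm^2
  R = 8.314; T = 310; F = 96485;
  u = z*F*V/(R*T);
  I = P*z*F*u .* (Si - So*exp(-u)) ./ (1 - exp(-u));
  I(abs(u) < 1e-6) = P*z*F*(Si - So);        % remove the singularity at V = 0
end

The resulting I/V curve rectifies outwardly when [S]i > [S]o, which is the nonlinear driving force illustrated on the slide.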
Voltage dependent conductances • Channel opening is a function of transmembrane voltage
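One common way to describe this is a Boltzmann (sigmoidal) activation curve; the half-activation voltage, slope factor, and maximal conductance below are illustrative assumptions:

V = -100:1:50;                         % membrane potential, mV
Vhalf = -40; k = 5;                    % half-activation and slope factor, mV
gmax = 10;                             % maximal conductance, nS
g = gmax ./ (1 + exp(-(V - Vhalf)/k)); % fraction of channels open scales gmax
plot(V, g); xlabel('V_m (mV)'); ylabel('g (nS)')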
Latching, up- and down-states • Stable systems have positive slope to I/V curve. E.g., neurons with only leak currents. • Voltage dependent conductances can lead to regions of negative slope conductance with two stable states.
Two types of channels • gK:gNa =3:1, both linear
Two types of channels, one with V dependent conductance • gK:gNa =3:5
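The two preceding slides can be sketched by plotting the steady-state I/V curve of a cell with a linear gK and a voltage-gated gNa (peak ratio 3:5 as above); the Boltzmann parameters for gNa are illustrative assumptions. Zero crossings with positive slope are the two stable states of the latching behavior described earlier:

V = -100:0.1:20;                    % membrane potential, mV
EK = -75; ENa = 50;                 % reversal potentials, mV
gK = 3;                             % nS, linear
gNa = 5 ./ (1 + exp(-(V + 40)/5));  % nS, voltage dependent (illustrative)
I = gK*(V - EK) + gNa.*(V - ENa);   % net steady-state current, pA
plot(V, I); hold on; plot(V, 0*V, 'k--'); hold off
xlabel('V_m (mV)'); ylabel('I (pA)')
% A hyperpolarized "down" state and a depolarized "up" state,
% separated by an unstable crossing in the negative-slope region.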
Signalling • Synaptic input • Transient inputs from other sources, e.g. sensors or lower-level neurons • Spike output • Generation of action potentials, which will then propagate the signal to the neurons at the next level, again via synapses
Chemical synapses • Excitatory (Es > Em) • Inhibitory (Es < Em), including shunting inhibition • Rapid increase in gs followed by exponential decay (τD = 1-100 ms) • Approximated by an alpha function (see the sketch below) • Or by a sum of exponentials (more realistic)
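A minimal matlab sketch of an alpha-function conductance; the peak conductance and time constant are illustrative assumptions:

gmax = 1;                             % peak conductance, nS (illustrative)
tau = 5;                              % time-to-peak, ms (illustrative)
t = 0:0.05:50;                        % time, ms
g = gmax*(t/tau).*exp(1 - t/tau);     % rises rapidly, then decays exponentially
plot(t, g); xlabel('t (ms)'); ylabel('g_{syn} (nS)')   % peak of gmax at t = tau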
Spike generation • Nonlinear, “all or none” response • Based on an avalanche-type reaction • Characterized by • Threshold • High-conductance reset • Refractory period • Can be simulated by an integrate-and-fire synthetic neuron (sketched below)
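A minimal leaky integrate-and-fire sketch in matlab, reusing the 80 pF / 8 nS “typical” cell from the earlier slides; the threshold, reset, and injected current are illustrative assumptions (refractoriness is ignored):

C = 80e-12; gL = 8e-9; EL = -70e-3;     % capacitance, leak conductance, rest
Vth = -50e-3; Vreset = -65e-3;          % spike threshold and reset potential
dt = 1e-4; t = 0:dt:0.5;                % 0.1 ms steps, 500 ms of simulation
Iinj = 200e-12*(t > 0.1 & t < 0.4);     % 200 pA current step
V = EL*ones(size(t));
for k = 1:numel(t)-1
    dV = (-gL*(V(k) - EL) + Iinj(k))/C;          % membrane equation
    V(k+1) = V(k) + dt*dV;                       % forward Euler step
    if V(k+1) >= Vth, V(k+1) = Vreset; end       % threshold crossing -> reset
end
plot(t*1e3, V*1e3); xlabel('t (ms)'); ylabel('V_m (mV)')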