NeuroComputing CENG569, Fall 2012-2013, 1st Lecture • Instructor: Erol Şahin
Overview • The neuron • A model of neural networks • Spikes vs. rates
A model of neural networks • x : activity • τ : time constant • f() : activation function • b : external input • W : synaptic strength
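The slide's equation itself did not survive extraction; assuming the standard continuous-time rate model that these symbols conventionally denote, the dynamics would read:

```latex
\tau \frac{dx_i}{dt} = -x_i + f\!\Big(\sum_j W_{ij}\, x_j + b_i\Big)
```

Here each unit's activity x_i decays with time constant τ and is driven by the activation function f applied to the weighted recurrent input plus the external input b_i.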
Spikes vs. rates • “Quantum” neurodynamics • neural activity is quantized • comes in packets called “spikes” • “Classical” neurodynamics • neural activity is continuously graded • described by rates
A model neuron • Each synapse is a current source controlled by presynaptic spiking. • The dendrite adds the currents of multiple synapses. • The total current drives spiking in the axon.
A synapse as a current source • Each action potential causes a synapse to inject a stereotyped current into the postsynaptic neuron. • Let the “synaptic strength” W denote the total charge injected by the synapse due to a single presynaptic spike. • Decaying exponential model
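With the decaying-exponential model named above, a spike at t = 0 would inject a current of the following form (a reconstruction consistent with W being the total injected charge, not a formula preserved from the slide):

```latex
I(t) = \frac{W}{\tau}\, e^{-t/\tau} \quad (t \ge 0),
\qquad \int_0^\infty I(t)\,dt = W
```

The 1/τ prefactor normalizes the exponential so that the integrated current (the injected charge) equals the synaptic strength W.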
Temporal summation • Approximation: currents from successive spikes add linearly.
Normalized current • Assumption: all synapses have the same time constant τ
Leaky integrator model • If there is a spike, x jumps by 1/τ. • Otherwise, exponential decay with time constant τ. • Normalized current is a low-pass filtered version of the presynaptic spike train.
Firing rate approximation • Normalized current is a low-pass filtered version of the firing rate. • Accurate if the rate is large compared to 1/τ.
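The leaky-integrator filtering on the last two slides can be simulated in a few lines. This is a sketch, not course code: the function name `low_pass_spikes` and the 100 Hz / 50 ms parameter choices are illustrative.

```python
import math

def low_pass_spikes(spike_times, tau, t_end, dt):
    """Leaky integrator: tau * dx/dt = -x + sum_k delta(t - t_k).

    Between spikes, x decays exponentially with time constant tau;
    each spike makes x jump by 1/tau (unit injected charge), so x is
    the normalized current, in units of Hz.
    """
    spikes = sorted(spike_times)
    x, idx, traj = 0.0, 0, []
    for step in range(int(t_end / dt)):
        t = step * dt
        x *= math.exp(-dt / tau)               # exponential decay
        while idx < len(spikes) and spikes[idx] < t + dt:
            x += 1.0 / tau                     # jump on each spike
            idx += 1
        traj.append(x)
    return traj

# Regular 100 Hz spike train, tau = 50 ms (long compared to the 10 ms ISI):
rate, tau, dt = 100.0, 0.05, 1e-4
spike_times = [k / rate for k in range(1, 200)]        # ~2 s of spikes
x = low_pass_spikes(spike_times, tau, t_end=2.0, dt=dt)
avg = sum(x[-100:]) / 100                              # mean over one ISI
print(avg)   # close to the 100 Hz firing rate
```

Because τ spans several inter-spike intervals, the filtered current hovers around the firing rate, illustrating the "accurate if rate >> 1/τ" condition.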
Summation of currents from convergent synapses • Approximation: linear superposition of currents from different synapses onto the same postsynaptic neuron.
Spike generation • Regard the soma as a device that transduces current into rate of action potential firing. • The detailed dynamics of action potential generation (voltage, etc.) is implicit.
Somatic current injection • Simulation of synaptic current
Repetitive spiking • Pyramidal neurons in rat sensorimotor cortex • Schwindt, O’Brien, and Crill. J. Neurophysiol. 77:2484 (1997)
F-I curve Schwindt, O’Brien, and Crill. J. Neurophysiol. 77:2484 (1997)
Biophysical interpretation • f is firing rate as a function of current • x is normalized synaptic current (measured in Hz) • τ is the synaptic time constant • W is the charge per presynaptic spike
(Half-wave) rectification • Threshold for firing. • Rises smoothly from zero. • No saturation. • Popular for modeling the visual system. • “Semilinear”: linear concepts applicable.
Step function • Jumps to nonzero value at threshold. • No increase in rate thereafter. • Computation with binary variables. • Popular with computer scientists.
Sigmoidal function • Compromise • Binary for large inputs. • Linear for small inputs. • “S-shaped” • Logistic function • 0 to 1 • Hyperbolic tangent • -1 to 1
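The three activation-function families above can be written down directly; the helper names below are illustrative, not from the lecture.

```python
import math

def rectify(u):
    """Half-wave rectification: zero below the threshold 0, linear above.
    No saturation ("semilinear")."""
    return max(0.0, u)

def step(u, theta=0.0):
    """Step function: jumps from 0 to 1 at the threshold theta,
    with no further increase thereafter."""
    return 1.0 if u >= theta else 0.0

def logistic(u):
    """Sigmoid ranging from 0 to 1; roughly linear near u = 0,
    binary for large |u|."""
    return 1.0 / (1.0 + math.exp(-u))

# The hyperbolic tangent variant (range -1 to 1) is math.tanh.
print(rectify(-2.0), rectify(3.0))   # 0.0 3.0
print(step(0.5), step(-0.5))         # 1.0 0.0
print(round(logistic(0.0), 2))       # 0.5
```

The logistic and tanh curves are related by tanh(u) = 2·logistic(2u) − 1, which is why both count as "sigmoidal" compromises between the linear and binary extremes.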
Rates vs. spikes • When can the quantization of light be neglected? • “lots of photons” and little synchrony • When can the quantization of neural activity be neglected? • “lots of spikes” and little synchrony
Deficiencies of the rate model • Soma • Spike frequency adaptation • Dendrite • Passive nonlinearities • Active nonlinearities • Short-term synaptic plasticity
Spike frequency adaptation Schwindt, O’Brien, and Crill. J. Neurophysiol. 77:2484 (1997)
Short term depression Tsodyks & Markram, PNAS 94:719 (1997)
Significance of the LT (linear threshold) neuron • Model of the neuron as a decision maker: weigh the evidence, then compare with a threshold. • Used alone: a popular pattern classifier (the perceptron). • Used as a building block for complex networks, e.g., multilayer perceptrons.
Boolean functions • The output variable and N input variables take on binary values • 1 = true • 0 = false • What Boolean functions can be realized by an LT neuron?
Excitatory input • Suppose that w_i = 1 for all i. • Then it doesn’t matter which inputs are active, only how many are active.
Logical AND • “conjunction”
Logical OR • “disjunction”
Selectivity is controlled by threshold • AND • highly selective • high threshold • OR • Indiscriminate • low threshold
Inhibition as a dynamic threshold • Suppose that w_i = ±1 for all i. • Then what matters is the number of active excitatory inputs minus the number of active inhibitory inputs.
Inhibition as negation • Non-monotone conjunction: an inhibitory (negative) weight implements the negation of an input.
Claim: • Conjunctions and disjunctions of N variables or their negations can be realized by an LT neuron.
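The claim can be checked by brute force with a minimal LT neuron; `lt_neuron` is an illustrative helper, not course code.

```python
def lt_neuron(w, x, theta):
    """Linear threshold (LT) neuron: fires (outputs 1) iff the
    weighted input sum reaches the threshold, i.e. w . x >= theta."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

N = 3
inputs = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

# AND (conjunction): all weights 1, high threshold N -> highly selective.
assert all(lt_neuron([1] * N, x, N) == (1 if sum(x) == N else 0)
           for x in inputs)

# OR (disjunction): all weights 1, low threshold 1 -> indiscriminate.
assert all(lt_neuron([1] * N, x, 1) == (1 if sum(x) >= 1 else 0)
           for x in inputs)

# Inhibition as negation: x1 AND x2 AND (NOT x3), via weight -1 on x3
# and threshold = number of positive literals (here 2).
neg_conj = [lt_neuron([1, 1, -1], x, 2) for x in inputs]
print(neg_conj)  # [0, 0, 0, 0, 0, 0, 1, 0] -- fires only for (1, 1, 0)
```

The same recipe generalizes: give each negated variable weight −1, each positive variable weight +1, and set the threshold to the number of positive literals.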
Vector notation • weight vector w = (w_1, w_2, …, w_N) • input vector x = (x_1, x_2, …, x_N) • threshold θ • inner (scalar) product w · x = Σ_i w_i x_i • The neuron fires iff w · x ≥ θ.
Separating hyperplane • w and θ define a hyperplane w · x = θ that divides the input space into half-spaces. • This hyperplane is sometimes called the “decision boundary.”
Multilayer perceptrons (MLPs) • Two layers of LT neurons (three layers if input neurons are counted). • Two layers of synapses. • Feed-forward: no loops.
Two questions about representational power • What can an MLP compute? Any Boolean function. • What can an MLP compute efficiently? Only a vanishingly small fraction of Boolean functions.
Some questions • How large is the set of Boolean functions of N variables? 2^(2^N), doubly exponential in N. • How large is the set of Boolean functions realized by an LT neuron? Only roughly 2^(N^2), a vanishingly small fraction by comparison. • So almost all Boolean functions cannot be computed by a single LT neuron. • But any Boolean function can be computed by a perceptron with two layers of synapses.
Disjunctive normal form (DNF) • Any Boolean function can be written in disjunctive normal form. • Disjunction of conjunctions: an OR of ANDs of variables or their negations.
Any DNF can be written as a 2-layer perceptron • AND of variables or their negations • LT neuron with excitatory and inhibitory synapses. • OR • LT neuron with excitatory synapses and low threshold.
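Following this DNF construction, XOR, which a single LT neuron cannot compute, falls out as a two-layer perceptron. This sketch uses illustrative helper names (`lt_neuron`, `xor_mlp`), not course code.

```python
def lt_neuron(w, x, theta):
    """LT neuron: output 1 iff w . x >= theta."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

def xor_mlp(x1, x2):
    """XOR in DNF: (x1 AND NOT x2) OR (NOT x1 AND x2).

    Hidden layer: one AND neuron per conjunction, with an inhibitory
    weight for each negated variable and threshold equal to the number
    of positive literals.  Output layer: an OR neuron, threshold 1.
    """
    h1 = lt_neuron([1, -1], [x1, x2], 1)   # x1 AND (NOT x2)
    h2 = lt_neuron([-1, 1], [x1, x2], 1)   # (NOT x1) AND x2
    return lt_neuron([1, 1], [h1, h2], 1)  # OR of the conjunctions

print([xor_mlp(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

Any other Boolean function is handled the same way: one hidden AND neuron per term of its DNF, feeding a single OR output neuron, which is exactly the two-layer construction on the slide.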