
NEURONAL DYNAMICS 2: ACTIVATION MODELS


Presentation Transcript


  1. NEURONAL DYNAMICS 2: ACTIVATION MODELS

  2. Chapter 3. Neuronal Dynamics 2: Activation Models 3.1 Neuronal dynamical system Neuronal activations change with time. How they change depends on the dynamical equations, of the general form dx_i/dt = g_i(F_X, F_Y, …) (3-1) and dy_j/dt = h_j(F_X, F_Y, …) (3-2), where F_X and F_Y denote the neuron fields.

  3. 3.1 ADDITIVE NEURONAL DYNAMICS • First-order passive decay model: in the absence of external or neuronal stimuli, the simplest activation dynamics model is dx_i/dt = -x_i (3-3), with solution x_i(t) = x_i(0) e^{-t} (3-4).

  4. 3.1 ADDITIVE NEURONAL DYNAMICS Since x_i(t) = x_i(0) e^{-t} → 0 as t → ∞ for any finite initial condition, the membrane potential decays exponentially quickly to its zero potential.

  5. Passive Membrane Decay • The passive-decay rate A_i > 0 scales the rate at which the activation decays to the membrane's zero resting potential: dx_i/dt = -A_i x_i. • Solution: x_i(t) = x_i(0) e^{-A_i t}. The passive-decay rate A_i measures the cell membrane's resistance or "friction" to current flow.

  6. Property • The larger the passive-decay rate A_i, the faster the decay and the smaller the resistance to current flow.
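
A minimal numerical sketch of the passive-decay model (the decay rates and initial condition below are illustrative values, not from the slides): it integrates dx/dt = -A x with forward Euler and compares the result with the closed-form solution x(0) e^{-A t}, showing that a larger A gives faster decay.

```python
import numpy as np

def passive_decay(x0, A, t_end=5.0, dt=1e-3):
    """Forward-Euler integration of dx/dt = -A * x (passive decay)."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (-A * x)            # dx/dt = -A x
    return x

for A in (0.5, 2.0, 8.0):             # illustrative decay rates (assumed)
    x_num = passive_decay(x0=1.0, A=A)
    x_exact = 1.0 * np.exp(-A * 5.0)  # closed-form solution x(0) e^{-A t}
    print(f"A={A}: numeric={x_num:.6f}  exact={x_exact:.6f}")
```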

  7. Membrane Time Constants • The membrane time constant C_i scales the time variable of the activation dynamical system. • The multiplicative constant model: C_i dx_i/dt = -x_i (3-8).

  8. Solution and property • Solution: x_i(t) = x_i(0) e^{-t/C_i}. • Property: the smaller the capacitance C_i, the faster things change. As the membrane capacitance increases toward positive infinity, membrane fluctuation slows to a stop.

  9. Membrane Resting Potentials • Definition: define the resting potential P_i as the activation value to which the membrane potential equilibrates in the absence of external or neuronal inputs: C_i dx_i/dt = -A_i x_i + P_i (3-11). • Solution: x_i(t) = x_i(0) e^{-A_i t / C_i} + (P_i / A_i)(1 - e^{-A_i t / C_i}) (3-12).

  10. Note The capacitance C_i appears in the exponent of the solution, so it is called the time-scaling capacitance. It does not affect the asymptotic or steady-state solution P_i / A_i, which also does not depend on the finite initial condition.
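
A small sketch (with assumed example values for A_i, C_i, P_i, and the initial conditions) that integrates the resting-potential model C dx/dt = -A x + P and checks that the steady state P/A is reached regardless of the capacitance or the initial condition:

```python
import numpy as np

def resting_potential(x0, A, C, P, t_end=50.0, dt=1e-3):
    """Forward-Euler integration of C * dx/dt = -A*x + P."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (-A * x + P) / C    # dx/dt = (-A x + P) / C
    return x

A, P = 2.0, 4.0                       # assumed decay rate and resting input
for x0 in (-3.0, 0.0, 10.0):          # different finite initial conditions
    for C in (0.5, 1.0, 5.0):         # different time-scaling capacitances
        x_inf = resting_potential(x0, A, C, P)
        print(f"x0={x0:5.1f} C={C:3.1f} -> x(inf)≈{x_inf:.4f} (P/A={P/A})")
```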

  11. Additive External Input • Add input: apply a relatively constant numerical input I_i to a neuron: dx_i/dt = -x_i + I_i (3-13). • Solution: x_i(t) = x_i(0) e^{-t} + I_i (1 - e^{-t}) (3-14).

  12. Meaning of the input • The input I_i can represent the magnitude of directly experienced sensory information or directly applied control information. • The input changes slowly relative to the activation dynamics, so it can be assumed to have a constant value.

  13. 3.2 ADDITIVE NEURONAL FEEDBACK Neurons do not compute alone. Neurons modify their activations with external input and with feedback from one another. This feedback takes the form of path-weighted signals from synaptically connected neurons.

  14. Synaptic Connection Matrices • The n neurons in field F_X synaptically connect to the p neurons in field F_Y. • The axon of the ith neuron in F_X terminates in a synapse m_ij at the jth neuron in F_Y; the efficacy m_ij is constant and can be positive, negative, or zero.

  15. Meaning of the connection matrix • The synaptic matrix or connection matrix M is an n-by-p matrix of real numbers whose entries m_ij are the synaptic efficacies. The ijth synapse is excitatory if m_ij > 0 and inhibitory if m_ij < 0. • The matrix M describes the forward projections from neuron field F_X to neuron field F_Y. • The p-by-n matrix N describes the backward projections from neuron field F_Y to neuron field F_X.

  16. Bidirectional and Unidirectional Connection Topologies • Bidirectional networks: M and N have the same or approximately the same structure, for example N = M^T. • Unidirectional network: a neuron field synaptically intraconnects to itself; M is n-by-n. • BAM: if M is symmetric, M = M^T, the unidirectional network is a BAM.

  17. Augmented field and augmented matrix • Augmented field: F_Z = F_X ∪ F_Y. M connects F_X to F_Y, and N connects F_Y back to F_X; the augmented field F_Z then intraconnects to itself through the square block matrix B = [ 0  M ; N  0 ].

  18. Augmented field and augmented matrix • In the BAM case, when N = M^T, then B = B^T; hence a BAM symmetrizes an arbitrary rectangular matrix M. • In the general case the diagonal blocks hold intrafield connections: P is an n-by-n matrix, Q is a p-by-p matrix, and B = [ P  M ; N  Q ]. The neurons in F_Z are symmetrically intraconnected (B = B^T) if and only if P = P^T, Q = Q^T, and N = M^T.
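
A short sketch (using a small arbitrary matrix M, not one from the slides) that assembles the augmented matrix B from its blocks and confirms that choosing N = M^T makes B symmetric:

```python
import numpy as np

M = np.array([[1.0, -2.0],
              [0.0,  3.0],
              [4.0,  1.0]])            # arbitrary 3-by-2 forward matrix (assumed)
N = M.T                                # BAM choice: backward matrix is M transposed
P = np.zeros((3, 3))                   # no intrafield connections within F_X
Q = np.zeros((2, 2))                   # no intrafield connections within F_Y

# Augmented intraconnection matrix of the field F_Z = F_X ∪ F_Y
B = np.block([[P, M],
              [N, Q]])

print(np.allclose(B, B.T))             # True: a BAM symmetrizes the rectangular M
```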

  19. 3.3 ADDITIVE ACTIVATION MODELS • Define the additive activation model • n+p coupled first-order differential equations define the additive activation model: dx_i/dt = -A_i x_i + Σ_j S_j(y_j) n_ji + I_i (3-15), dy_j/dt = -A_j y_j + Σ_i S_i(x_i) m_ij + J_j (3-16).

  20. Additive activation model: definition • The additive autoassociative model corresponds to a system of n coupled first-order differential equations: dx_i/dt = -A_i x_i + Σ_j S_j(x_j) m_ji + I_i (3-17).

  21. Additive activation model: definition • A special case of the additive autoassociative model is the circuit model C_i dx_i/dt = -x_i / R_i + Σ_j S_j(x_j) m_ji + I_i (3-18) (3-19), where m_ij = 1 / r_ij (3-20) and r_ij measures the cytoplasmic resistance between neurons i and j.

  22. Hopfield circuit and continuous additive bidirectional associative memories • The Hopfield circuit arises from (3-18) if each neuron has a strictly increasing signal function and if the synaptic connection matrix is symmetric, m_ij = m_ji (3-21). • Continuous additive bidirectional associative memories use M forward and M^T backward: dx_i/dt = -x_i + Σ_j S_j(y_j) m_ij + I_i (3-22), dy_j/dt = -y_j + Σ_i S_i(x_i) m_ij + J_j (3-23).
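
A runnable sketch of the continuous additive BAM (3-22)/(3-23), assuming a logistic signal function and a small arbitrary matrix and input vectors (none of these values come from the slides); forward-Euler integration drives both fields to an equilibrium.

```python
import numpy as np

def S(v):
    """Logistic signal function (an assumed, strictly increasing choice)."""
    return 1.0 / (1.0 + np.exp(-v))

# Small arbitrary example (not the matrices from the slides)
M = np.array([[ 1.0, -1.0],
              [ 0.5,  2.0],
              [-1.5,  1.0]])           # forward projections F_X (n=3) -> F_Y (p=2)
I = np.array([0.2, 0.0, -0.1])         # constant inputs to F_X
J = np.array([0.1, -0.2])              # constant inputs to F_Y

x = np.zeros(3)                        # F_X activations
y = np.zeros(2)                        # F_Y activations
dt = 1e-2
for _ in range(5000):                  # forward-Euler integration of (3-22)/(3-23)
    dx = -x + S(y) @ M.T + I           # dx_i/dt = -x_i + sum_j S_j(y_j) m_ij + I_i
    dy = -y + S(x) @ M + J             # dy_j/dt = -y_j + sum_i S_i(x_i) m_ij + J_j
    x += dt * dx
    y += dt * dy

print("equilibrium x:", np.round(x, 3))
print("equilibrium y:", np.round(y, 3))
```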

  23. 3.4 ADDITIVE BIVALENT FEEDBACK Discrete additive activation models correspond to neurons with threshold signal functions. The neurons can assume only two values: ON and OFF. ON represents the signal value +1; OFF represents 0 or -1. Bivalent models can represent asynchronous and stochastic behavior.

  24. Bivalent Additive BAM • BAM: bidirectional associative memory. • Define a discrete additive BAM with threshold signal functions, arbitrary thresholds and inputs, an arbitrary but constant synaptic connection matrix M, and discrete time steps k: x_i(k+1) = Σ_j S_j(y_j(k)) m_ij + I_i (3-24), y_j(k+1) = Σ_i S_i(x_i(k)) m_ij + J_j (3-25).

  25. Bivalent Additive BAM • Threshold binary signal functions: S_i(x_i(k)) = 1 if x_i(k) > U_i, S_i(x_i(k)) = S_i(x_i(k-1)) if x_i(k) = U_i, S_i(x_i(k)) = 0 if x_i(k) < U_i (3-26); similarly S_j(y_j(k)) = 1 if y_j(k) > V_j, S_j(y_j(k)) = S_j(y_j(k-1)) if y_j(k) = V_j, S_j(y_j(k)) = 0 if y_j(k) < V_j (3-27). • The thresholds are arbitrary real values: U_i for the F_X neurons and V_j for the F_Y neurons.
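
A sketch of the binary threshold signal function of (3-26)/(3-27), keeping the previous signal value when the activation equals the threshold; the activations, thresholds, and previous signals below are assumed example values.

```python
import numpy as np

def threshold_signal(x, U, prev):
    """Binary threshold signal: 1 above U, 0 below U, previous value at U."""
    return np.where(x > U, 1, np.where(x < U, 0, prev)).astype(int)

x = np.array([0.7, -0.2, 0.0, 3.1])    # example activations (assumed)
U = np.zeros(4)                        # thresholds (assumed zero here)
prev = np.array([1, 1, 0, 1])          # previous signal values
print(threshold_signal(x, U, prev))    # -> [1 0 0 1]
```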

  26. An example of the BAM model • Example • A 4-by-3 matrix M represents the forward synaptic projections from F_X to F_Y. • A 3-by-4 matrix M^T represents the backward synaptic projections from F_Y to F_X.

  27. An example of the BAM model • Suppose at initial time k all the neurons in F_X are ON, so the signal state vector at time k is S(X_k) = (1 1 1 1). • Suppose the constant inputs I and J are given.

  28. An example of the BAM model • First, at time k+1, through synchronous operation, the signals pass "forward" through the filter M to affect the activations of the F_Y neurons. • The three F_Y neurons compute three dot products, or correlations: the signal state vector S(X_k) multiplies each of the three columns of M.

  29. An example of the BAM model • The result is the new F_Y activation state vector. • Synchronously thresholding it gives the new signal state vector S(Y_{k+1}).

  30. An example of the BAM model • The signal vector passes "backward" through the synaptic filter M^T at time k+2. • Synchronously thresholding the result gives the new signal state vector S(X_{k+2}).

  31. An example of the BAM model • Since the backward pass reproduces the earlier signal state vector, the system has reached a bidirectional equilibrium. • Conclusion: these same two signal state vectors will pass back and forth in bidirectional equilibrium forever, or until new inputs perturb the system out of equilibrium.

  32. An example of the BAM model • Asynchronous state changes may lead to a different bidirectional equilibrium. • Keep the first neuron ON and update only the second and third neurons. At time k, all neurons are ON. • The new signal state vector at time k+1 equals:

  33. An example of the BAM model • The new activation state vector equals: • synchronous thresholding gives: • passing this vector forward to F_Y gives:

  34. An example of the BAM model • Similarly, for any asynchronous state-change policy we apply to the neurons, the system reaches a new equilibrium: the binary pair of signal state vectors represents a fixed point of the system.
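
A runnable sketch of the synchronous update scheme just described. The 4-by-3 matrix below is a stand-in chosen for illustration (the slides' own matrix entries are not reproduced here), the inputs and thresholds are taken as zero, and signals are binary (ON = 1, OFF = 0).

```python
import numpy as np

def threshold(v, prev):
    """Binary threshold signal with zero thresholds; hold previous value at 0."""
    return np.where(v > 0, 1, np.where(v < 0, 0, prev)).astype(int)

# Hypothetical 4-by-3 forward matrix (illustration only, not the slides' matrix)
M = np.array([[ 1, -1,  1],
              [-1,  1,  1],
              [ 1,  1, -1],
              [ 1, -1, -1]])

sx = np.ones(4, dtype=int)             # all F_X neurons start ON
sy = np.zeros(3, dtype=int)            # F_Y signals start OFF

for k in range(6):                     # alternate forward/backward synchronous passes
    sy = threshold(sx @ M, sy)         # forward pass through M: F_Y activations
    sx = threshold(sy @ M.T, sx)       # backward pass through M^T: F_X activations
    print(f"step {k}: S(X)={sx}, S(Y)={sy}")
# After a few passes the pair (S(X), S(Y)) stops changing: a bidirectional fixed point.
```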

  35. Conclusion • Different subset asynchronous state-change policies applied to the same data need not produce the same fixed-point equilibrium, but they tend to produce the same equilibria. • All BAM state changes lead to fixed-point stability.

  36. Bidirectional Stability • Definition: a BAM system is bidirectionally stable if all inputs converge to fixed-point equilibria. • A denotes a binary n-vector in {0, 1}^n; B denotes a binary p-vector in {0, 1}^p.

  37. Bidirectional Stability • Represent a BAM system equilibrating to a bidirectional fixed point as a sequence of alternating forward and backward passes, A → B, A' ← B, A' → B', …, terminating in a pair (A_f, B_f) that maps to itself in both directions.

  38. Lyapunov Functions • A Lyapunov function L maps system state variables to real numbers and decreases with time. In the BAM case, L maps the bivalent product space {0, 1}^n × {0, 1}^p to real numbers. • Suppose L is sufficiently differentiable to apply the chain rule: dL/dt = Σ_i (∂L/∂x_i)(dx_i/dt) (3-28).

  39. Lyapunov Functions • The quadratic choice of L, for a scalar state x: L = x² (3-29). • Suppose the dynamical system describes the passive-decay system dx/dt = -x (3-30). • The solution: x(t) = x(0) e^{-t} (3-31).

  40. Lyapunov Functions • The partial derivative of the quadratic L: ∂L/∂x = 2x (3-32), so along trajectories dL/dt = 2x (dx/dt) = -2x² (3-33) (3-34), or, substituting the explicit solution, dL/dt = -2 x(0)² e^{-2t} (3-35). In either case dL/dt ≤ 0 (3-36). At equilibrium dL/dt = 0; this occurs if and only if all velocities equal zero.
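
A quick numeric check of the argument above (scalar case, taking L = x² with an assumed initial condition and step size): along the passive-decay trajectory, L strictly decreases and its numerical time derivative matches -2x².

```python
import numpy as np

x, dt = 3.0, 1e-3                      # assumed initial condition and step size
L_prev = x**2
for step in range(1, 5001):
    x += dt * (-x)                     # passive decay dx/dt = -x
    L = x**2                           # quadratic Lyapunov function
    dL_numeric = (L - L_prev) / dt     # numerical time derivative of L
    L_prev = L
    if step % 1000 == 0:
        print(f"t={step*dt:.1f}  L={L:.5f}  dL/dt≈{dL_numeric:.5f}  -2x^2={-2*x*x:.5f}")
```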

  41. Conclusion • A dynamical system is stable if some Lyapunov function L decreases along trajectories. • A dynamical system is asymptotically stable if L strictly decreases along trajectories. • Monotonicity of a Lyapunov function provides a sufficient, not necessary, condition for stability and asymptotic stability.

  42. Linear system stability For a symmetric matrix A and a square matrix B, the quadratic form L = x A x^T behaves as a strictly decreasing Lyapunov function for the linear dynamical system dx/dt = x B if and only if the matrix A B^T + B A is negative definite.
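
A sketch of how this condition might be checked numerically, assuming the row-vector convention dx/dt = x B used above and an arbitrary example pair A, B: form the symmetric matrix A B^T + B A and test whether all its eigenvalues are negative.

```python
import numpy as np

def strictly_decreasing_quadratic(A, B):
    """True if L = x A x^T strictly decreases along dx/dt = x B,
    i.e. if A B^T + B A is negative definite (all eigenvalues < 0)."""
    Q = A @ B.T + B @ A                 # symmetric since A = A^T
    return bool(np.all(np.linalg.eigvalsh(Q) < 0))

A = np.eye(2)                           # symmetric positive definite choice (assumed)
B = np.array([[-1.0,  2.0],
              [-2.0, -1.0]])            # arbitrary stable example system
print(strictly_decreasing_quadratic(A, B))   # True for this pair
```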

  43. The relation between convergence rate and eigenvalue sign • A general theorem in dynamical systems theory relates convergence rate and eigenvalue sign: a nonlinear dynamical system converges exponentially quickly if its system Jacobian has eigenvalues with negative real parts; locally, such a nonlinear system behaves linearly. • (The Jacobian matrix collects the partial derivatives ∂f_i/∂x_j.) • A Lyapunov function summarizes total system behavior. • A Lyapunov function often measures the energy of a physical system; a decrease in L represents a decrease in system energy as the dynamical system evolves.
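
A small sketch (with an arbitrary two-dimensional example system, not from the slides) that estimates the Jacobian at an equilibrium by finite differences and inspects the real parts of its eigenvalues:

```python
import numpy as np

def f(x):
    """Example nonlinear system dx/dt = f(x) with an equilibrium at the origin (assumed)."""
    return np.array([-x[0] + 0.5 * np.tanh(x[1]),
                     -x[1] - 0.5 * np.tanh(x[0])])

def jacobian(f, x, eps=1e-6):
    """Finite-difference estimate of the Jacobian matrix df_i/dx_j at x."""
    n = len(x)
    J = np.zeros((n, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return J

J = jacobian(f, np.zeros(2))
print(np.linalg.eigvals(J).real)       # all negative -> locally exponential convergence
```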

  44. Potential energy function represented by quadratic form Consider a system of n variables and its potential-energy function E. Suppose the coordinate x_i measures the displacement from equilibrium of the ith unit. The energy depends only on the coordinates x_1, …, x_n, so E = E(x_1, …, x_n). Since E is a physical quantity, we assume it is sufficiently smooth to permit a multivariable Taylor-series expansion about the origin.

  45. Potential energy function represented by quadratic form The expansion reduces to the quadratic form E(x) ≈ ½ Σ_i Σ_j a_ij x_i x_j = ½ x A x^T, with a_ij = ∂²E/∂x_i ∂x_j evaluated at the origin, where A is symmetric, since ∂²E/∂x_i ∂x_j = ∂²E/∂x_j ∂x_i.

  46. The reason (3-42) follows • First, we defined the origin as an equilibrium of zero potential energy, so E(0) = 0. • Second, the origin is an equilibrium only if all first partial derivatives ∂E/∂x_i equal zero there. • Third, we can neglect higher-order terms for small displacements, since we assume the higher-order products are smaller than the quadratic products.

  47. Conclusion: Bounded decreasing Lyapunov functions provide an intuitive way to describe global "computations" in neural networks and other dynamical systems.

  48. Bivalent BAM theorem • The average signal energy L of the forward pass of the F_X signal state vector S(X) through M, and the backward pass of the F_Y signal state vector S(Y) through M^T: L = -½ [S(X) M S(Y)^T + S(Y) M^T S(X)^T] = -S(X) M S(Y)^T, since S(X) M S(Y)^T = S(Y) M^T S(X)^T.

  49. Lower bound of the Lyapunov function • The signal-energy Lyapunov function is clearly bounded below. • For binary or bipolar signals, the matrix coefficients define the attainable bound: L ≥ -Σ_i Σ_j |m_ij|. • The attainable upper bound is the negative of this expression.
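
A brief sketch that evaluates the signal energy L = -S(X) M S(Y)^T over all binary signal pairs for a small hypothetical matrix (not the slides' matrix) and confirms that the energy never drops below -Σ_i Σ_j |m_ij|:

```python
import numpy as np
from itertools import product

M = np.array([[ 1, -2],
              [ 3,  1],
              [-1,  2]])                       # hypothetical 3-by-2 matrix (assumed)

def energy(sx, sy, M):
    """Average signal energy L = -S(X) M S(Y)^T of a BAM state pair."""
    return -float(sx @ M @ sy)

bound = -np.abs(M).sum()                       # lower bound -sum_i sum_j |m_ij|
energies = [energy(np.array(sx), np.array(sy), M)
            for sx in product([0, 1], repeat=3)
            for sy in product([0, 1], repeat=2)]
print(min(energies), ">=", bound)              # every binary state pair respects the bound
```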

  50. Lyapunov function for the general BAM system • The signal-energy Lyapunov function for the general BAM system takes the form L = -S(X) M S(Y)^T - S(X)(I - U)^T - S(Y)(J - V)^T, where I and J are the constant inputs and U and V the constant vectors of thresholds; the attainable bound of this function is derived analogously.
