1. The Random Neural Network: the model and some of its applications Erol Gelenbe 1,2 and Varol Kaptan 2
1 Denis Gabor Professor, Head of Intelligent Systems and Networks, www.ee.imperial.ac.uk/gelenbe
2 Dept of Electrical and Electronic Engineering, Imperial College, London SW7 2BT
2. Prof. Gelenbe is grateful to his PhD students who have worked or are working with him on random neural networks and/or their applications.
Univ. of Paris: Andreas Stafylopatis, Jean-Michel Fourneau, Volkan Atalay, Myriam Mokhtari, Vassilada Koubi, Ferhan Pekergin, Ali Labed, Christine Hubert
Duke University: Hakan Bakircioglu, Anoop Ghanwani, Yutao Feng, Chris Cramer, Yonghuan Cao, Hossam Abdelbaki, Taskin Kocak
UCF: Rong Wang, Pu Su, Peixiang Liu, Will Washington, Esin Seref, Zhiguang Xu, Kh…
3. Thank you to the agencies and companies who have generously supported RNN work over the last 15 years
France (1989-97): ONERA, CNRS C3, Esprit Projects QMIPS, EPOCH and LYDIA
USA (1993-): ONR, ARO, IBM, Sandoz, US Army Stricom, NAWCTSD, NSF, Schwartz Electro-Optics, Lucent
UK (2003-): EPSRC, MoD, General Dynamics UK Ltd, and EU FP6 grant awards for the next three years, hopefully more to come
4. Outline Biological Inspiration for the RNN
The RNN Model
Some Applications
modeling biological neuronal systems
texture recognition and segmentation
image and video compression
multicast routing
5.-8. Random Spiking Behaviour of Neurons (four figure slides)
9. Queuing Networks: Exploiting the Analogy
10. Queuing Network <-> Random Neural Network
12. Mathematical Model: A neural network with n neurons
Internal State of Neuron i at time t is a non-negative Integer xi(t) ≥ 0
Network State at time t is a Vector
x(t) = (x1(t), …, xi(t), …, xk(t), …, xn(t))
If xi(t) > 0, we say that Neuron i is excited and it may fire at t+ (in which case it will send out a spike)
Also, if xi(t) > 0, Neuron i will fire with probability ri Δt in the interval [t, t+Δt]
If xi(t) = 0, Neuron i cannot fire at t+
When Neuron i fires:
- It sends a spike to some Neuron j, w.p. pij
- Its internal state changes xi(t+) = xi(t) - 1
13. Mathematical Model: A neural network with n neurons
The arriving spike at Neuron j is:
- an Excitatory Spike w.p. pij+
- an Inhibitory Spike w.p. pij-
- pij = pij+ + pij-, with Σj pij < 1 for all i = 1, …, n
From Neuron i to Neuron j:
- Excitatory Weight or Rate is wij+ = ri pij+
- Inhibitory Weight or Rate is wij- = ri pij-
- Total Firing Rate is ri = Σj (wij+ + wij-)
To Neuron i, from Outside the Network:
- External Excitatory Spikes arrive at rate Λi
- External Inhibitory Spikes arrive at rate λi
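Slides 12-13 specify the stochastic dynamics completely, so they can be simulated directly. Below is a minimal discrete-event sketch in Python (my own illustration; the function name and parameters are not from the talk), assuming as on slide 13 that ri = Σj (wij+ + wij-), so every spike fired is routed to some neuron:

    import numpy as np

    def simulate_rnn(W_plus, W_minus, Lambda, lam, T=10.0, seed=0):
        # W_plus[i, j] = w+_ij = ri p+_ij, W_minus[i, j] = w-_ij = ri p-_ij;
        # Lambda[i] / lam[i] are the external excitatory / inhibitory arrival rates.
        rng = np.random.default_rng(seed)
        n = len(Lambda)
        assert Lambda.sum() > 0, "some external excitation must drive the network"
        r = W_plus.sum(axis=1) + W_minus.sum(axis=1)   # ri = sum_j (w+_ij + w-_ij)
        x = np.zeros(n, dtype=int)                     # xi(0) = 0
        t = 0.0
        while t < T:
            fire = np.where(x > 0, r, 0.0)             # only excited neurons may fire
            total = Lambda.sum() + lam.sum() + fire.sum()
            t += rng.exponential(1.0 / total)          # exponential time to next event
            u = rng.uniform(0.0, total)
            if u < Lambda.sum():                       # external excitatory spike
                x[rng.choice(n, p=Lambda / Lambda.sum())] += 1
            elif u < Lambda.sum() + lam.sum():         # external inhibitory spike
                i = rng.choice(n, p=lam / lam.sum())
                x[i] = max(x[i] - 1, 0)
            else:                                      # neuron i fires, loses one unit
                i = rng.choice(n, p=fire / fire.sum())
                x[i] -= 1
                p = np.concatenate([W_plus[i], W_minus[i]]) / r[i]
                k = rng.choice(2 * n, p=p / p.sum())   # pick target and spike sign
                if k < n:
                    x[k] += 1                          # excitatory spike to neuron k
                else:
                    x[k - n] = max(x[k - n] - 1, 0)    # inhibitory spike to neuron k-n
        return x

For example, simulate_rnn(np.roll(np.eye(3), 1, axis=1), np.zeros((3, 3)), np.array([1.0, 0.0, 0.0]), np.zeros(3)) runs a three-neuron excitatory ring driven from neuron 0.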
14. State Equations
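The slide body (the equations themselves) is not captured in this transcript. For reference, the stationary state equations of the model, first derived in Gelenbe (1989) (first entry on slide 41) and written in the notation of slides 12-13, are:

    q_i = \frac{\lambda_i^+}{r_i + \lambda_i^-}, \qquad
    \lambda_i^+ = \Lambda_i + \sum_{j=1}^{n} q_j\, w_{ji}^+, \qquad
    \lambda_i^- = \lambda_i + \sum_{j=1}^{n} q_j\, w_{ji}^-,

where q_i is the stationary probability that Neuron i is excited (xi(t) > 0). When all q_i < 1, the stationary distribution has the product form announced on slide 17:

    \lim_{t \to \infty} P[x(t) = x] = \prod_{i=1}^{n} (1 - q_i)\, q_i^{x_i}.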
17. Random Neural Network Neurons exchange Excitatory and Inhibitory Spikes (Signals)
Inter-neuronal Weights are Replaced by Firing Rates
Neuron Excitation Probabilities obtained from Non-Linear State Equations
Steady-State Probability is Product of Marginal Probabilities
Separability of the Stationary Solution based on Neuron Excitation Probabilities
Existence and Uniqueness of Solutions for Recurrent Network
Learning Algorithms for Recurrent Network are O(n^3)
Multiple Classes (1998) and Multiple Class Learning (2002)
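Concretely, the excitation probabilities q_i in the bullets above solve the non-linear state equations of slide 14, and for a stable network they can be computed by fixed-point iteration. A short Python sketch (illustrative, not code from the talk):

    import numpy as np

    def excitation_probabilities(W_plus, W_minus, Lambda, lam, iters=1000, tol=1e-10):
        # Fixed point of q_i = lambda+_i / (r_i + lambda-_i) (slide 14).
        r = W_plus.sum(axis=1) + W_minus.sum(axis=1)   # r_i = sum_j (w+_ij + w-_ij)
        q = np.zeros(len(Lambda))
        for _ in range(iters):
            lam_plus = Lambda + q @ W_plus             # Lambda_i + sum_j q_j w+_ji
            lam_minus = lam + q @ W_minus              # lambda_i + sum_j q_j w-_ji
            q_new = np.minimum(lam_plus / (r + lam_minus), 1.0)  # q_i is a probability
            if np.max(np.abs(q_new - q)) < tol:
                break
            q = q_new
        return q

With q in hand, the product-form bullet reads P[x] = Π_i (1 - q_i) q_i^{x_i} in steady state.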
18. Some Applications Modeling Cortico-Thalamic Response
Supervised learning in RNN (Gradient Descent)
Texture based Image Segmentation
Image and Video Compression
Multicast Routing
20. Rat Brain Modeling with the Random Neural Network
23. Comparing Measurements and Theory: Calibrated RNN Model and Cortico-Thalamic Oscillations
24. Oscillations Disappear when Signaling Delay in Cortex is Decreased
25. Thalamic Oscillations Disappear when Positive Feedback from Cortex is removed
26. When Feedback in Cortex is Dominantly Negative, Cortico-Thalamic Oscillations Disappear
27. Summary of findings
28. Gradient Computation for the Recurrent RNN is O(n^3): each gradient evaluation reduces to solving an n x n linear system
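The slide body is not in the transcript, but the cubic cost follows from the structure of the learning algorithm (Gelenbe 1993, cited on slide 41): the derivatives of all the q_i with respect to any single weight satisfy a linear system whose n x n matrix is the same for every weight, so one O(n^3) inversion per training pattern is shared by all 2n^2 weights. A schematic Python sketch of that shared step (the per-weight gamma vectors, each only O(n) to form, are omitted; treat the matrix assembly as my reading of the 1993 paper rather than a verified transcription):

    import numpy as np

    def shared_inverse(W_plus, W_minus, lam, q):
        # Omega[i, j] = (w+_ij - w-_ij q_j) / D_j, with D_j = r_j + lambda-_j;
        # every derivative is then dq/dw = gamma_w @ inv(I - Omega).
        r = W_plus.sum(axis=1) + W_minus.sum(axis=1)
        D = r + lam + q @ W_minus                      # denominators D_j
        Omega = (W_plus - W_minus * q[None, :]) / D[None, :]
        return np.linalg.inv(np.eye(len(q)) - Omega)   # the O(n^3) step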
29. Texture Based Object Identification Using the RNN, US Patent, 1999 (E. Gelenbe, Y. Feng)
30. 1) MRI Image Segmentation
31. MRI Image Segmentation
32. Brain Image Segmentation with RNN
33. Extracting Abnormal Objects from MRI Images of the Brain
34. 2) RNN based Adaptive Video Compression: Combining Motion Detection and RNN Still Image Compression
35. Neural Still Image Compression: Find an RNN R that Minimizes ||R(I) - I|| Over a Training Set of Images {I}
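The objective here is a plain reconstruction error, so any encoder/decoder pair trained by gradient descent makes it concrete. The talk trains an RNN; purely as a stand-in, here is a minimal linear autoencoder minimizing ||R(I) - I||^2 (my own illustration, not the patented RNN method):

    import numpy as np

    def train_compressor(images, code_dim=16, lr=0.1, epochs=500, seed=0):
        # images: (num_images, num_pixels) array scaled to [0, 1];
        # R(I) = (I @ E) @ D, so code_dim / num_pixels is the compression ratio.
        rng = np.random.default_rng(seed)
        E = rng.normal(0.0, 0.01, (images.shape[1], code_dim))   # encoder
        D = rng.normal(0.0, 0.01, (code_dim, images.shape[1]))   # decoder
        for _ in range(epochs):
            code = images @ E                 # compressed representation
            err = code @ D - images           # R(I) - I
            scale = 2.0 / err.size            # gradient of the mean squared error
            E -= lr * scale * images.T @ (err @ D.T)
            D -= lr * scale * code.T @ err
        return E, D

Only the code (of dimension code_dim per image) and the shared decoder need to be stored or transmitted.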
36. RNN based Adaptive Video Compression
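Slide 34 names the combination: detect which blocks moved and re-encode only those with the still-image network. A hedged sketch of that control loop (still_codec stands in for the trained RNN compressor; the threshold and block size are arbitrary choices of mine):

    import numpy as np

    def compress_video(frames, still_codec, threshold=0.02, block=8):
        # frames: list of (H, W) arrays in [0, 1], H and W multiples of `block`.
        prev, out = None, []
        for f in frames:
            recon = np.empty_like(f)
            for r0 in range(0, f.shape[0], block):
                for c0 in range(0, f.shape[1], block):
                    tile = f[r0:r0+block, c0:c0+block]
                    ref = None if prev is None else prev[r0:r0+block, c0:c0+block]
                    if ref is None or np.mean(np.abs(tile - ref)) > threshold:
                        recon[r0:r0+block, c0:c0+block] = still_codec(tile)  # moving: re-encode
                    else:
                        recon[r0:r0+block, c0:c0+block] = ref                # static: reuse
            out.append(recon)
            prev = recon
        return out

Only the re-encoded blocks (plus their positions) need to be transmitted, which is where the bit-rate saving of the adaptive scheme comes from.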
40. 3) Analytical Annealing with the RNN: Multicast Routing (Similar Results with the Traveling Salesman Problem)
Finding an optimal many-to-many communications path in a network is equivalent to finding a Minimal Steiner Tree, which is an NP-Complete problem
The best heuristics are the Average Distance Heuristic (ADH) and the Minimal Spanning Tree Heuristic (MSTH) for the network graph
RNN Analytical Annealing improves the number of optimal solutions found by ADH and MSTH by approximately 10%
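For context, the Minimal Spanning Tree Heuristic mentioned above can be sketched in a few lines with the networkx library (my illustration of the classical construction, not the authors' code): build the metric closure over the multicast group, take its minimum spanning tree, and expand each closure edge back into a shortest path.

    import itertools
    import networkx as nx

    def mst_steiner_heuristic(G, terminals):
        # 1) metric closure restricted to the terminal (multicast group) nodes
        closure, paths = nx.Graph(), {}
        for u, v in itertools.combinations(terminals, 2):
            paths[(u, v)] = nx.shortest_path(G, u, v, weight="weight")
            closure.add_edge(u, v, weight=nx.shortest_path_length(G, u, v, weight="weight"))
        # 2) MST of the closure, 3) expand closure edges into real paths
        tree_edges = set()
        for u, v in nx.minimum_spanning_edges(closure, data=False):
            path = paths.get((u, v)) or paths[(v, u)]
            tree_edges.update(zip(path, path[1:]))
        return G.edge_subgraph(tree_edges).copy()

A production version would also prune non-terminal leaves from the result; the RNN analytical annealing of this slide is reported to improve on such heuristic solutions by about 10%.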
41. Selected References
E. Gelenbe, "Random neural networks with negative and positive signals and product form solution," Neural Computation, vol. 1, no. 4, pp. 502-511, 1989.
E. Gelenbe, "Stability of the random neural network model," Neural Computation, vol. 2, no. 2, pp. 239-247, 1990.
E. Gelenbe, A. Stafylopatis, and A. Likas, "Associative memory operation of the random network model," in Proc. Int. Conf. Artificial Neural Networks, Helsinki, pp. 307-312, 1991.
E. Gelenbe, F. Batty, "Minimum cost graph covering with the random neural network," Computer Science and Operations Research, O. Balci (ed.), New York, Pergamon, pp. 139-147, 1992.
E. Gelenbe, "Learning in the recurrent random neural network," Neural Computation, vol. 5, no. 1, pp. 154-164, 1993.
E. Gelenbe, V. Koubi, F. Pekergin, "Dynamical random neural network approach to the traveling salesman problem," Proc. IEEE Symp. Syst., Man, Cybern., pp. 630-635, 1993.
A. Ghanwani, "A qualitative comparison of neural network models applied to the vertex covering problem," Elektrik, vol. 2, no. 1, pp. 11-18, 1994.
E. Gelenbe, C. Cramer, M. Sungur, P. Gelenbe, "Traffic and video quality in adaptive neural compression," Multimedia Systems, vol. 4, pp. 357-369, 1996.
42. Selected References
C. Cramer, E. Gelenbe, H. Bakircioglu, "Low bit rate video compression with neural networks and temporal subsampling," Proceedings of the IEEE, vol. 84, no. 10, pp. 1529-1543, October 1996.
E. Gelenbe, T. Feng, K.R.R. Krishnan, "Neural network methods for volumetric magnetic resonance imaging of the human brain," Proceedings of the IEEE, vol. 84, no. 10, pp. 1488-1496, October 1996.
E. Gelenbe, A. Ghanwani, V. Srinivasan, "Improved neural heuristics for multicast routing," IEEE J. Selected Areas in Communications, vol. 15, no. 2, pp. 147-155, 1997.
E. Gelenbe, Z. H. Mao, and Y. D. Li, "Function approximation with the random neural network," IEEE Trans. Neural Networks, vol. 10, no. 1, January 1999.
E. Gelenbe, J.M. Fourneau, "Random neural networks with multiple classes of signals," Neural Computation, vol. 11, pp. 721-731, 1999.
E. Gelenbe, E. Seref, and Z. Xu, "Simulation with learning agents," Proceedings of the IEEE, vol. 89, no. 2, pp. 148-157, 2001.
E. Gelenbe, K. Hussain, "Learning in the multiple class random neural network," IEEE Trans. Neural Networks, vol. 13, no. 6, pp. 1257-1267, 2002.
E. Gelenbe, R. Lent and A. Nunez, "Self-aware networks and QoS," Proceedings of the IEEE, vol. 92, no. 9, pp. 1479-1490, 2004.