Sparse Coding in Sparse Winner Networks
ISNN 2007: The 4th International Symposium on Neural Networks
Janusz A. Starzyk1, Yinyin Liu1, David Vogel2
1 School of Electrical Engineering & Computer Science, Ohio University, USA
2 Ross University School of Medicine, Commonwealth of Dominica
Outline
[Figure: map of cortical areas (motor, somatosensory, sensory associative, visual associative, visual, primary auditory; Broca's area, Wernicke's area, pars opercularis)]
• Sparse Coding
• Sparse Structure
• Sparse winner network with winner-take-all (WTA) mechanism
• Sparse winner network with oligarchy-take-all (OTA) mechanism
• Experimental results
• Conclusions
Sparse Coding
[Figures: Richard Axel, 1995; Kandel Figs. 30-1 and 23-5; somatosensory map labels: hip, trunk, arm, hand, foot, face, tongue, larynx]
• How do we take in sensory information and make sense of it?
Sparse Coding
• Neurons become active to represent objects and concepts, producing a sparse neural representation, or "sparse coding"
• Metabolic demands of the human sensory system and brain
• Statistical properties of the environment: not every single bit of information matters
[Figure source: http://gandalf.psych.umn.edu/~kersten/kersten-lab/CompNeuro2002/]
• "Grandmother cell" (J.V. Lettvin): in the extreme case, only one neuron at the top level represents and recognizes an object
• A small group of neurons at the top level represents an object (C. Connor, "Friends and grandmothers", Nature, vol. 435, June 2005)
Sparse Structure
• The 10^12 neurons in the human brain are sparsely connected
• On average, each neuron is connected to other neurons through about 10^4 synapses
• A sparse structure enables efficient computation and saves energy and cost
Sparse Coding in Sparse Structure
[Figure: hierarchical network above the sensory input, with connection adaptability increasing toward higher levels]
• Cortical learning: unsupervised learning
• Finding the sensory input activation pathway
• Competition is needed: find neurons with stronger activities and suppress those with weaker activities
• Winner-take-all (WTA): a single winning neuron
• Oligarchy-take-all (OTA): a group of strongly active neurons as winners (a minimal sketch of the two schemes follows this list)
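As a rough illustration of the difference between the two competition schemes, the Python sketch below keeps either the single strongest neuron (WTA) or every neuron above a fraction of the peak activity (OTA). The activation values and the 0.8 threshold are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: WTA keeps the single strongest neuron, while OTA
# keeps the small group of neurons above a fraction of the maximum activity.
# The 0.8 fraction is an assumption for illustration.
import numpy as np

def wta(activations: np.ndarray) -> np.ndarray:
    """Winner-take-all: only the single strongest neuron stays active."""
    winners = np.zeros_like(activations)
    winners[np.argmax(activations)] = activations.max()
    return winners

def ota(activations: np.ndarray, fraction: float = 0.8) -> np.ndarray:
    """Oligarchy-take-all: all neurons above a fraction of the peak stay active."""
    mask = activations >= fraction * activations.max()
    return np.where(mask, activations, 0.0)

acts = np.array([0.1, 0.9, 0.85, 0.3, 0.88])
print(wta(acts))   # one winner
print(ota(acts))   # a small oligarchy of strong neurons
```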
Outline
• Sparse Coding
• Sparse Structure
• Sparse winner network with winner-take-all (WTA) mechanism
• Sparse winner network with oligarchy-take-all (OTA) mechanism
• Experimental results
• Conclusions
Sparse winner network with winner-take-all (WTA)
• Local network model of cognition: R-net
• Primary layer and secondary layer
• Random sparse connections
• For associative memories, not for feature extraction
• Not in a hierarchical structure
[Figure: R-net with a primary layer and a secondary layer]
David Vogel, "A neural network model of memory and higher cognitive functions in the cerebrum"
Sparse winner network with winner-take-all (WTA)
[Figure: input pattern feeding primary level h, secondary level s, and primary level h+1, with the overall number of neurons increasing; the winner emerges at the top]
• Hierarchical learning network:
  • Primary levels and secondary levels
  • Secondary neurons provide "full connectivity" within a sparse structure (see the sketch below)
  • More secondary levels can increase the sparsity
• Finding neuronal representations:
  • Find the global winner, the neuron with the strongest signal strength
  • For a large number of neurons, this is very time-consuming
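To make the "full connectivity through secondary neurons" idea concrete, the sketch below builds random sparse connections from a primary level to a secondary level and on to the next primary level, then counts how many lower-level neurons can influence each top-level neuron. The layer sizes and fan-in are illustrative assumptions, not the paper's parameters.

```python
# Illustrative sketch: sparse random links with a secondary level interposed,
# so that neurons on primary level h+1 are reachable from many neurons on
# primary level h even though each layer is only sparsely connected.
import numpy as np

rng = np.random.default_rng(0)

def sparse_links(n_pre: int, n_post: int, fan_in: int) -> np.ndarray:
    """0/1 connection matrix: each post-synaptic neuron gets `fan_in` random inputs."""
    links = np.zeros((n_pre, n_post), dtype=int)
    for j in range(n_post):
        links[rng.choice(n_pre, size=fan_in, replace=False), j] = 1
    return links

h_to_s = sparse_links(n_pre=64, n_post=256, fan_in=8)    # primary h  -> secondary s
s_to_h1 = sparse_links(n_pre=256, n_post=64, fan_in=8)   # secondary s -> primary h+1

# Two-step reachability: how many level-h neurons can influence each level-(h+1) neuron.
reach = (h_to_s @ s_to_h1) > 0
print(reach.sum(axis=0))  # effective fan-in after passing through the secondary level
```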
Sparse winner network with winner-take-all (WTA)
• Finding the global winner using localized WTA:
  • Data transmission: feed-forward computation
  • Winner tree finding: local competition and feedback
  • Winner selection: feed-forward computation and weight adjustment
[Figure: input pattern passing through levels h, s1, s2, and h+1 up to the global winner]
Sparse winner network with winner-take-all (WTA)
• Data transmission: feed-forward computation
  • Signal calculation
  • Transfer function with an activation threshold (output vs. input), as sketched below
[Figure: input pattern feeding the network; transfer-function plot of output vs. input with an activation threshold]
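The slide does not reproduce the equations, so the following Python sketch only illustrates the kind of feed-forward computation being described: each neuron sums its weighted inputs over the sparse links and passes the sum through a transfer function with an activation threshold. The weights, threshold value, and exact transfer function are assumptions for illustration.

```python
# Illustrative feed-forward pass (weights, threshold, and transfer function are
# assumptions, not the paper's exact formulation).
import numpy as np

def transfer(signal: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Transfer function with an activation threshold: neurons below it stay silent."""
    return np.where(signal >= threshold, signal, 0.0)

def feed_forward(inputs: np.ndarray, weights: np.ndarray, links: np.ndarray) -> np.ndarray:
    """Signal calculation over the sparse links, then thresholded transfer."""
    signal = (inputs[:, None] * weights * links).sum(axis=0)
    return transfer(signal)

inputs = np.array([1.0, 0.0, 1.0, 1.0])
links = np.array([[1, 0], [1, 1], [0, 1], [1, 0]])   # sparse 0/1 connectivity
weights = np.full_like(links, 0.4, dtype=float)      # illustrative weights
print(feed_forward(inputs, weights, links))
```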
Sparse winner network with winner-take-all (WTA)
• Winner tree finding: local competition and feedback
  • Local competition implemented with a current-mode WTA circuit (the signal is carried as a current)
  • Local competitions on the network: the local neighborhood of a lower-level neuron is its set of post-synaptic neurons on the next level; the local competition selects the local winner among them
  • The signal travels only along the winning branch; the losing branches are logically cut off (a rough sketch follows)
[Figure: example neighborhood in which N4 on the lower level projects through branches l1, l2, l3; N4 on level+1 is the winner among post-synaptic neurons 4-8, so the signal on l2 goes to the winner while l1 and l3 are logically cut off; N4 on level+1 has its own set of pre-synaptic neurons on the lower level]
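As a rough sketch of this localized competition (not the paper's circuit), the code below runs, for each lower-level neuron, a local WTA among its post-synaptic neurons and logically cuts off the links to the losers, so that only winning branches remain for the later passes. The connectivity and signal values are illustrative assumptions.

```python
# Illustrative local WTA over a sparse 0/1 link matrix (connectivity and signals
# are assumptions, not taken from the paper).
import numpy as np

def winner_tree_step(links: np.ndarray, post_signals: np.ndarray) -> np.ndarray:
    """For each pre-synaptic neuron, keep only the link to its strongest
    post-synaptic neuron (the local winner); losing branches are logically cut off."""
    kept = np.zeros_like(links)
    for i in range(links.shape[0]):             # one local competition per pre-synaptic neuron
        candidates = np.flatnonzero(links[i])   # its post-synaptic neighborhood
        if candidates.size:
            winner = candidates[np.argmax(post_signals[candidates])]
            kept[i, winner] = 1
    return kept

links = np.array([[1, 1, 0, 1],
                  [0, 1, 1, 0],
                  [1, 0, 1, 1]])
post_signals = np.array([0.2, 0.9, 0.4, 0.7])
print(winner_tree_step(links, post_signals))   # only winning branches survive
```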
Sparse winner network with winner-take-all (WTA)
[Figure: winner tree; legend: input neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron]
• The winner network is found: all the neurons directly or indirectly connected with the global winner neuron form the winner tree
Sparse winner network with winner-take-all (WTA)
• Winner selection: feed-forward computation and weight adjustment
  • Signals are recalculated through the logically connected links
  • Weights are adjusted using the concept of Hebbian learning (a minimal sketch follows)
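The slide names Hebbian learning without giving the exact update rule, so the following is only a generic Hebbian weight adjustment along the connected links; the learning rate and the normalization step are assumptions.

```python
# Generic Hebbian update along the winning links (learning rate and
# normalization are assumptions; the paper's exact rule is not shown here).
import numpy as np

def hebbian_update(weights: np.ndarray, pre: np.ndarray, post: np.ndarray,
                   links: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """Strengthen links whose pre- and post-synaptic neurons are active together."""
    updated = weights + lr * np.outer(pre, post) * links
    # keep each neuron's incoming weights bounded so they cannot grow without limit
    norms = np.linalg.norm(updated, axis=0, keepdims=True)
    return updated / np.maximum(norms, 1e-12)

weights = np.full((3, 2), 0.5)
links = np.array([[1, 0], [1, 1], [0, 1]])
pre = np.array([1.0, 0.8, 0.0])
post = np.array([0.9, 0.0])       # e.g. only the winning neuron is active
print(hebbian_update(weights, pre, post, links))
```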
Sparse winner network with winner-take-all (WTA) Number of global winners found is typically 1 with sufficient input links • 64-256-1028-4096 network • Find 1 global winner with over 8 connections
Outline
• Sparse Coding
• Sparse Structure
• Sparse winner network with winner-take-all (WTA) mechanism
• Sparse winner network with oligarchy-take-all (OTA) mechanism
• Experimental results
• Conclusions
Sparse winner network with oligarchy-take-all (OTA)
[Figure: OTA network; legend: active neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron]
• The signal goes through the network layer by layer
• Local competition (local WTA) is carried out once a layer is reached
• Multiple local winner neurons on each level
• Multiple winner neurons on the top level: oligarchy-take-all
• The oligarchy represents the sensory input
• Provides coding redundancy
• More reliable than WTA (a minimal sketch of this layer-by-layer pass follows)
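To illustrate the layer-by-layer OTA pass described above, the sketch below propagates a signal through two sparse layers and, after each layer, runs a competition that can leave several winners per level. The layer sizes, weights, and the winner-selection rule are assumptions for illustration, not the paper's parameters.

```python
# Illustrative layer-by-layer pass with oligarchy-take-all style competition
# (sizes, weights, and the winner-selection rule are assumptions).
import numpy as np

rng = np.random.default_rng(1)

def ota_layer(inputs: np.ndarray, weights: np.ndarray, keep_fraction: float = 0.8) -> np.ndarray:
    """Feed-forward signal followed by local competition: every neuron whose
    activity exceeds a fraction of the layer's maximum stays active."""
    signal = inputs @ weights
    winners = signal >= keep_fraction * signal.max()
    return np.where(winners, signal, 0.0)

x = rng.random(64)                                             # sensory input
w1 = rng.random((64, 256)) * (rng.random((64, 256)) < 0.1)     # sparse random weights
w2 = rng.random((256, 128)) * (rng.random((256, 128)) < 0.1)

h1 = ota_layer(x, w1)
top = ota_layer(h1, w2)
print(np.count_nonzero(top), "neurons form the oligarchy at the top level")
```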
Outline
• Sparse Coding
• Sparse Structure
• Sparse winner network with winner-take-all (WTA) mechanism
• Sparse winner network with oligarchy-take-all (OTA) mechanism
• Experimental results
• Conclusions
Experimental Results
• WTA scheme in the sparse network
[Figure: original images; input size 8 x 8]
Experimental Results
• OTA scheme in the sparse network, 64-bit input
• On average, 28.3 active neurons represent an object
• The count varies from 26 to 34 neurons
Experimental Results
[Figure: recognition accuracy of WTA and OTA compared with the level of random recognition]
• OTA has better fault tolerance than WTA
Conclusions & Future Work
• Sparse coding is built in sparsely connected networks
• WTA scheme: local competitions accomplish the global competition using primary and secondary layers, allowing efficient hardware implementation
• OTA scheme: local competition reduces neuronal activity
• OTA gives redundant coding: more reliable and robust
• WTA & OTA: learning memory for developing machine intelligence
Future work:
• Introducing temporal sequence learning
• Building a motor pathway on such a learning memory
• Combining with a goal-creation pathway to build an intelligent machine