
because what you lack is infinite recursion






Presentation Transcript


  1. "When I hear the language of men, who prove God, I see there is no difference between human and beast" (Ecclesiastes 3:18–19) — "because what you lack is infinite recursion" (King Solomon) — yet we cannot speak.

  2. A capacity for infinite recursion may have evolved for reasons unrelated to language. Other complex cognitive capacities, seemingly unique to humans, may require infinite recursion. Such capacities are likely grounded in a memory system that allows for fine discriminations. Others (e.g. Ullman) have emphasized the roots of language in semantic and procedural memory.

  3. A simple semantic network (O’Kane & Treves, 1992), updated to remove the ‘memory glass’ problem (Fulvi Mari & Treves, 1998), reduced to a Potts model (Treves, 2004?): cortical modules → Potts units with dilute connectivity; local attractor states → S+1 Potts states (“0” state included); structured long-range connectivity → Potts couplings; sparse global activity patterns → sparse Potts patterns.
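The ingredients this slide maps onto the Potts model (N units with S+1 states including the quiescent “0” state, dilute connectivity with C inputs per unit, sparse patterns with a fraction a of active units) can be set up concretely. A minimal sketch, assuming a random-dilution and random-pattern construction; the function name and defaults are illustrative, not the authors' code:

```python
import numpy as np

def make_potts_network(N=300, S=3, C=12, a=0.25, p=25, seed=0):
    """Sketch of the slide's ingredients: N Potts units with S+1 states
    (0 = quiescent), dilute connectivity (C inputs per unit), and p sparse
    patterns in which a fraction a of units are in a nonzero state."""
    rng = np.random.default_rng(seed)
    # dilute connectivity: each unit receives input from C random other units
    inputs = np.array([
        rng.choice(np.delete(np.arange(N), i), size=C, replace=False)
        for i in range(N)
    ])
    # sparse Potts patterns: value 0 with probability 1 - a,
    # otherwise a uniformly chosen active state in 1..S
    patterns = np.where(rng.random((p, N)) < a,
                        rng.integers(1, S + 1, size=(p, N)), 0)
    return inputs, patterns

inputs, patterns = make_potts_network()
print(inputs.shape, patterns.shape)
```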

  4. Iddo Kanter (1988), Potts-glass models of neural networks, Phys Rev A 37:2739–2742. Potts version of the Hopfield model, with N units, S states, p patterns: p_c ≃ 0.138 N S(S−1)/(2 log₂S), I/p = N log₂S. + dilute connectivity (C connections/unit): p_c ∝ C S(S−1)/(2 log₂S). + sparse coding (fraction 1−a of units in the “0” state): p_c ∝ C S(S−1)/(2a log₂(S/a))?, I/p ≈ N a log₂(S/a)?
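The fully connected scaling on this slide is easy to evaluate numerically. A minimal sketch, assuming the prefactor 0.138 and the S(S−1)/(2 log₂S) dependence as read from the slide; the function name is hypothetical:

```python
from math import log2

def potts_capacity(N, S, prefactor=0.138):
    """Storage capacity of a fully connected Potts-Hopfield network,
    using the slide's scaling p_c ~ 0.138 * N * S*(S-1) / (2*log2(S))."""
    return prefactor * N * S * (S - 1) / (2 * log2(S))

# S = 2 recovers the Hopfield result p_c ~ 0.138 N,
# a consistency check on the reconstructed formula.
print(potts_capacity(1000, 2))
print(potts_capacity(1000, 5))   # capacity grows with S
```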

  5. + continuous (graded-response) Potts units: simulations indicate that storage capacity is not affected; single-unit adaptation can lead to smooth latching behaviour.
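The adaptation mechanism the slide invokes can be illustrated at the single-unit level. A minimal, hypothetical sketch (not the simulations reported in the talk; all parameter values are assumptions): a graded Potts unit responds through a softmax, and a slow threshold that tracks activity fatigues the currently active state, so the unit latches from one state to the next:

```python
import numpy as np

def latching_unit(fields, beta=5.0, adapt_rate=0.05, steps=400, seed=0):
    """One graded Potts unit whose active state fatigues via an adaptive
    threshold, so activity 'latches' between states over time."""
    rng = np.random.default_rng(seed)
    S = len(fields)
    theta = np.zeros(S)                  # adaptive thresholds, one per state
    visited = []                         # sequence of distinct winning states
    for _ in range(steps):
        h = fields + 0.1 * rng.standard_normal(S) - theta
        p = np.exp(beta * h)
        p /= p.sum()                     # graded (softmax) response
        k = int(np.argmax(p))
        theta += adapt_rate * (p - theta)  # fatigue builds on active states
        if not visited or visited[-1] != k:
            visited.append(k)
    return visited

# With equal fields, adaptation plus noise alone drives the transitions.
print(latching_unit(np.ones(4)))
```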

  6. + multi-factor coding model (correlated patterns): p_c ∝ C S²?, p_l ∝ S? Latching, if transition probabilities are structured and not uniform, may be a neural basis for infinite recursion. A percolation transition to infinite recursion?

  7. [Figure: G. Elston et al.]

  8. Computer simulations of Frontal Latching Networks with N = 300 Potts units, a = 0.25 (sparse coding), S = 3, 4, 5, 7, 10 (+1) states, C = 12, 17, 25, 50, 100 connections, p = 25–400 patterns generated by 20 relevant factors. How to quantify retrieval? And latching?
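The slide's closing question — how to quantify retrieval and latching — admits simple operational answers. A hedged sketch of two such measures (the function names, the chance-uncorrected overlap, and the 0.7 threshold are illustrative assumptions, not the measures used in the talk):

```python
import numpy as np

def retrieval_overlap(state, pattern):
    """Fraction of pattern-active units (pattern != 0) that the network
    state reproduces exactly -- a crude stand-in for the Potts overlap m."""
    active = pattern != 0
    if not active.any():
        return 0.0
    return float(np.mean(state[active] == pattern[active]))

def latching_length(trajectory, patterns, threshold=0.7):
    """Count successive distinct patterns retrieved above threshold along
    a simulated trajectory -- one way to quantify latching."""
    visits = []
    for state in trajectory:
        overlaps = [retrieval_overlap(state, xi) for xi in patterns]
        k = int(np.argmax(overlaps))
        if overlaps[k] >= threshold and (not visits or visits[-1] != k):
            visits.append(k)
    return len(visits)

# A trajectory dwelling on pattern 0 and then jumping to pattern 1
# counts as a latching sequence of length 2.
pat0 = np.array([0, 1, 2, 0, 3])
pat1 = np.array([2, 0, 1, 3, 0])
print(latching_length([pat0, pat0, pat1], [pat0, pat1]))
```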

  9. Retrieval and latching appear to coexist only above critical values of both C and S. Is that, for FLNs, a percolation phase transition?
