Sparse Distributed Memory (SDM) By Uma Ramamurthy Cognitive Science Seminar February 5, 2003
Introduction • How to organize a record … it can be retrieved in the right way under the right circumstances? • How to construct, with neuron-like components, a physical memory that enables such storage and retrieval? • “… some links (i.e., associations) are learned, but others are a property of the mathematical space for memory items.”
Theory • Memory items as points of the space {0,1}^n for large n (between 100 and 10,000) • Contents serve as addresses to memory • Random Access • Distributed • Sparse
SDM in detail… • Address space – Boolean space of dimension 1000 – an enormous space of 2^1000 locations • Addresses – bit vectors of length 1000 • Choose a random sample of storage locations, say 2^20, from this address space – the hard locations • Median distance from a random location in the address space to the nearest hard location – 424 bits (98% of the time, between 411 and 430)
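The address space and hard locations above can be sketched in a few lines of NumPy. This is a minimal illustration, not the seminar's implementation: the slides' parameters (dimension 1000, 2^20 hard locations) are scaled down here to n = 256 and 2^10 locations so the demo runs quickly; all names are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 256             # address dimension (the slides use 1000)
NUM_HARD = 2 ** 10  # number of hard locations (the slides use 2**20)

# Hard locations: a random sample of points from {0,1}^N
hard_locations = rng.integers(0, 2, size=(NUM_HARD, N), dtype=np.int8)

def hamming(x, y):
    """Hamming distance: number of bit positions where x and y differ."""
    return int(np.sum(x != y))

# Distance from a random address to its nearest hard location:
# well below the mean distance N/2, because even a sparse sample
# of 2**10 locations pulls the nearest one closer than average.
x = rng.integers(0, 2, size=N, dtype=np.int8)
nearest = min(hamming(x, h) for h in hard_locations)
print(nearest)
```

At the slides' full scale the analogous figure is the quoted median of 424 bits (out of 1000) to the nearest hard location.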
SDM in detail… • Store each datum in many hard locations • Many hard locations participate in retrieval of each datum • Many storage locations participate in a single read or write operation • Each hard location – a bit vector of length 1000 – stores data in 1000 counters • Range of each counter: -40 to 40
SDM in detail… • To access memory item at address ‘x’, locations closest to ‘x’ are accessed • All hard locations within a given distance ‘r’ of ‘x’ will store/provide data for write/read operations • Access Circle – hard locations in a circle of radius ‘r’ with ‘x’ as the center • Hard location ‘y’ is said to be accessible from ‘x’, if ‘y’ is no farther than ‘r’ bits from ‘x’
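The access circle described above can be sketched as follows. Again a scaled-down illustration (n = 256, 2^10 hard locations); the radius R = 120 is an arbitrary choice for this demo, not the radius used at the slides' full scale, and `access_circle` is a name of my own.

```python
import numpy as np

rng = np.random.default_rng(1)
N, NUM_HARD, R = 256, 2 ** 10, 120  # illustrative, scaled-down parameters

hard_locations = rng.integers(0, 2, size=(NUM_HARD, N), dtype=np.int8)

def access_circle(x, hard_locations, r):
    """Indices of hard locations no farther than r bits from x."""
    dists = np.sum(hard_locations != x, axis=1)  # Hamming distance to each
    return np.nonzero(dists <= r)[0]

# The hard locations that would take part in a read/write at x
x = rng.integers(0, 2, size=N, dtype=np.int8)
idx = access_circle(x, hard_locations, R)
print(len(idx))
```

Because distances from a random point concentrate around N/2, the choice of r controls what fraction of all hard locations fall inside any one access circle.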
Writing to SDM • Writing a 1 increments the counter of the bit vector; writing a 0 decrements the counter • To write (0,1,0,0,1,1,1,…) at location ‘x’, 1st counter of ‘x’ is decremented; 2nd counter of ‘x’ incremented; 3rd counter decremented, etc. • Write-operation in SDM – To write at location ‘y’, write to all the hard locations within the access circle of the location ‘y’
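The write operation above can be sketched directly: every hard location in the access circle has its i-th counter incremented where the word's i-th bit is 1 and decremented where it is 0, clipped to the slides' counter range of -40 to 40. Parameters are scaled down for the demo and the function names are my own.

```python
import numpy as np

rng = np.random.default_rng(2)
N, NUM_HARD, R = 256, 2 ** 10, 120  # illustrative, scaled-down parameters
CLIP = 40                           # counter range -40..40, as in the slides

hard_locations = rng.integers(0, 2, size=(NUM_HARD, N), dtype=np.int8)
counters = np.zeros((NUM_HARD, N), dtype=np.int16)

def write(address, word):
    """Write `word` to every hard location within radius R of `address`."""
    dists = np.sum(hard_locations != address, axis=1)
    selected = dists <= R
    # +1 for each 1-bit of the word, -1 for each 0-bit, clipped to range
    delta = np.where(word == 1, 1, -1)
    counters[selected] = np.clip(counters[selected] + delta, -CLIP, CLIP)

# Autoassociative storage: the word serves as its own address
word = rng.integers(0, 2, size=N, dtype=np.int8)
write(word, word)
```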
Reading from SDM • Example: a hard location whose counters read -3 4 7 8 -1 -2 -5 … yields the bit vector 0 1 1 1 0 0 0 … • Contents of a hard location – the multiset of all the words that have ever been written to that location • Reading at a location – by the Majority Rule: bit i of the vector read at location ‘x’ is 1 if the i-th pooled counter at ‘x’ is positive, and 0 otherwise • Data read at location ‘x’ – an archetype of the data written at ‘x’, but possibly not identical to any one of them
Reading from SDM (contd.) • Read-operation in SDM: • To read from location ‘y’, pool the data read from every hard location accessible from ‘y’ – i.e., within the access circle of ‘y’ • To read with a noisy cue or an arbitrary cue: Iterated Reading • read at ‘y’ to get ‘y1’ • read at ‘y1’ to get ‘y2’ • read at ‘y2’ to get ‘y3’ … • if the sequence ‘y1’, ‘y2’, ‘y3’, … converges to y’, then y’ is the result of the iterated reading at location ‘y’
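The majority-rule read and the iterated read can be sketched together. This is a self-contained, scaled-down illustration (n = 256, 2^10 hard locations, radius 120, counter clipping omitted); the write operation is repeated here so the example runs on its own, and the assumption is that flipping 20 of 256 bits leaves the cue inside the critical distance for this demo.

```python
import numpy as np

rng = np.random.default_rng(3)
N, NUM_HARD, R = 256, 2 ** 10, 120  # illustrative, scaled-down parameters

hard_locations = rng.integers(0, 2, size=(NUM_HARD, N), dtype=np.int8)
counters = np.zeros((NUM_HARD, N), dtype=np.int16)

def in_circle(x):
    return np.sum(hard_locations != x, axis=1) <= R

def write(address, word):
    counters[in_circle(address)] += np.where(word == 1, 1, -1)

def read(address):
    """Majority rule: bit i is 1 iff the pooled i-th counters sum > 0."""
    pooled = counters[in_circle(address)].sum(axis=0)
    return (pooled > 0).astype(np.int8)

def iterated_read(cue, max_iters=10):
    """Read at the cue, then read at the result, until it stops changing."""
    y = cue
    for _ in range(max_iters):
        y_next = read(y)
        if np.array_equal(y_next, y):
            break
        y = y_next
    return y

word = rng.integers(0, 2, size=N, dtype=np.int8)
write(word, word)                      # store autoassociatively

noisy = word.copy()
flip = rng.choice(N, size=20, replace=False)
noisy[flip] ^= 1                       # corrupt 20 of 256 bits

recovered = iterated_read(noisy)
print(int(np.sum(recovered != word)))  # remaining distance to the stored word
```

With only one word stored and the cue inside the critical distance, the sequence converges in very few iterations, matching the slides' "fewer than ten" rule of thumb.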
Converging/Diverging Sequences in SDM [Figure: a converging iterated-read sequence y → y1 → y2 → y3 → y4 → y’ starting within the critical distance of the target, alongside a diverging sequence x → x1 → x2 → x3 → x4 → x5 starting beyond it]
Convergence and Divergence • Convergence – successively read words get closer and closer to one another until they are identical • Divergence – successively read words are roughly orthogonal to one another, and to the target • Convergence happens only if the initial address is sufficiently close to the target • Critical Distance – the distance beyond which divergence is more likely than convergence • Convergence/divergence is rapid – fewer than ten iterations, as a rule
Memory Capacity in SDM • Capacity – the size of the data set at which the critical distance drops to zero • Beyond capacity, stored words are no longer retrievable (no convergence) – “full” and “overloaded” memories • In an overloaded memory, words written only once cannot be retrieved • For the address space of 2^1000 with 2^20 hard locations, the memory capacity is about 1/10th the number of hard locations – roughly 100,000 words • A hard location can contain up to 100 words
Learning Sequences in SDM • Learning sequences: • The present situation should be recognized as similar to some situation(s) in the past • The consequences of those past situations can then be retrieved • A sequence is stored as a pointer chain and accessed by repeated reads from the memory – a way to include ‘time’ in the memory trace
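The pointer-chain idea above can be sketched by writing heteroassociatively: each item of a sequence is stored with the previous item as its address, so repeated reads replay the sequence. Same scaled-down, illustrative SDM skeleton as before (n = 256, 2^10 hard locations, radius 120); the three-pattern sequence A → B → C is my own toy example.

```python
import numpy as np

rng = np.random.default_rng(4)
N, NUM_HARD, R = 256, 2 ** 10, 120  # illustrative, scaled-down parameters

hard_locations = rng.integers(0, 2, size=(NUM_HARD, N), dtype=np.int8)
counters = np.zeros((NUM_HARD, N), dtype=np.int16)

def in_circle(x):
    return np.sum(hard_locations != x, axis=1) <= R

def write(address, word):
    counters[in_circle(address)] += np.where(word == 1, 1, -1)

def read(address):
    pooled = counters[in_circle(address)].sum(axis=0)
    return (pooled > 0).astype(np.int8)

# A sequence of three patterns, stored as a pointer chain:
# A's address holds B, B's address holds C
A, B, C = (rng.integers(0, 2, size=N, dtype=np.int8) for _ in range(3))
write(A, B)
write(B, C)

# Replaying the sequence by repeated reads
step1 = read(A)      # should recover B
step2 = read(step1)  # should recover C
print(np.array_equal(step1, B), np.array_equal(step2, C))
```

Each read advances the chain one step, which is how 'time' enters the memory trace: the order of the sequence is encoded in the address-to-content links rather than in any timestamp.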
Interpretations • “knowing that one knows” – fast convergence, fewer than 10 iterations • “tip of the tongue” – being at about the critical distance from the nearest stored item… slow convergence • “momentary feelings of familiarity” – a full or overloaded memory • “rehearsal” – an item is written many times, each time to many locations • “forgetting” – increases with time due to other writes
Associative Memory in IDA [Figure: IDA architecture – a Sparse Distributed Memory over a Boolean space of dimension N (large enough to code features), with bit-vector representations, connected to the Job List, Outgoing Message, Sailor Data, Working Memory, Perception, Behavior Net, Negotiation, Deliberation, and Focus modules]
Work in progress • Only “conscious content” to be written to long-term memories • SDM in ternary space (0, 1, and “don’t care”) • Testing the modified-SDM as Transient Episodic Memory (TEM) • Plans for such a TEM in IDA and attempts to do consolidation from TEM to LTM (Autobiographical Memory)