Memory: What's it good for?
• Solving problems
• Functions need memory (must hold inputs and outputs)
• Functions themselves must be stored somewhere
• Turing machine
  • Strip of paper
  • Rules or procedures
  • State memory
• Neural networks
  • Activation of nodes
  • Connections between nodes
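To make the Turing-machine bullets concrete, here is a minimal Python sketch of the three kinds of memory such a machine needs: a tape (the strip of paper), a rule table (the rules or procedures), and a current state (the state memory). The bit-inverting machine and the rule format are illustrative assumptions, not part of the lecture's examples.

```python
# Hypothetical illustration: a tiny Turing machine that inverts a string of
# bits. Its memory lives in three places: the tape (strip of paper), the
# rule table (rules/procedures), and the current state (state memory).

RULES = {  # (state, symbol) -> (symbol to write, head move, next state)
    ("invert", "0"): ("1", +1, "invert"),
    ("invert", "1"): ("0", +1, "invert"),
    ("invert", " "): (" ", 0, "halt"),
}

def run(tape, state="invert", head=0):
    tape = list(tape) + [" "]            # the strip of paper
    while state != "halt":               # state memory
        write, move, state = RULES[(state, tape[head])]   # rules/procedures
        tape[head] = write
        head += move
    return "".join(tape).strip()

print(run("10110"))   # -> "01001"
```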
Problems that take Memory
• Raven's Progressive Matrices
• Towers of Hanoi
• Spatial relations problems
• Mental arithmetic
• Syllogisms
• Understanding complicated sentences:
  "There is a sewer near our home who makes terrific suits."
  "The horse raced past the barn fell."
  "The cat hiding under the bed yawned."
• There is a correlation between how well people do on different tests.
• There is a negative correlation between test results and age.
• Controlling for memory, there is no correlation between test results and age!
Memory Test
[Figure: serial position curve, recall as a function of position in list, showing a primacy effect and a recency effect]
The Structure of Memory
Working and Long-Term Memory: Double Dissociation
• Lengthening the wait before recall influences the recency effect, not the primacy effect.
• Speeding up presentation influences the primacy effect, not the recency effect.
[Figure: serial position curve (recall vs. position in list) with the primacy and recency effects marked]
The Structure of Memory
Working and Long-Term Memory: Double Dissociation
• Behavioral studies (example study: the free recall task)
• Patients:
  • H.M. (anterograde amnesia) can store things in W.M., but not L.T.M.
  • K.F. can store things in L.T.M., but not W.M.
Computational Models and Memory
• Symbolic models: [Diagram: Declarative Memory and Production Memory connected to Working Memory via retrieval/storage and match/execution; Working Memory connected to the world through perception and action]
• Neural network models: working memory = activation of nodes; long-term memory = connections between nodes
Working Memory: Baddeley's Theory
• Central Executive
• Phonological Loop
• Visuospatial Buffer
Dissociating Visual and Phonological...
• Visual task: Remember the block-letter figure "F", mentally trace it, and say "yes" when you come to a corner at the top or bottom, and "no" at other corners.
  (For the "F", the answers run: yes, yes, no, no, no, no, no, no, yes, yes)
• Verbal task: Remember the sentence "Billy ate the cat for dessert." and say whether each word, left to right, is a noun or not.
  (The answers run: yes, no, no, yes, no, yes)
• Answers are either spoken or pointed to.
• Result:
  • Visual task: verbal response FAST, spatial (pointing) response SLOW
  • Verbal task: verbal response SLOW, spatial (pointing) response FAST
Working Memory
• Central Executive: ?
• Phonological Loop
  • Phonological representation
  • Evidence: phonological confusion, articulatory suppression
  • Limited capacity: 7±2 chunks (grouping by "chunks" helps); about 2 seconds' worth (you remember fewer long words)
  • Structure: a phonological buffer plus rehearsal
• Visuospatial Buffer
  • Visual representation
  • Evidence: uses visual brain areas; mental rotation and scanning distance in memory representations influence response times
• Double dissociation... again... (pp. 240-241)
Chunking in Working Memory
• Memory slots hold "chunks"
• Chunks can be in many different formats
In-Class Demo
• "F BIV IPG NPC BSCIA" vs. "FBI VIP GNP CBS CIA": the same letters, but far easier to remember when grouped into familiar chunks
• "1225ATFFBI31415": easier once you spot the familiar chunks inside it
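A minimal sketch of what the demo is getting at, assuming a small dictionary of familiar chunks and a simple greedy longest-match parse (both are illustrative assumptions, not a model of how people chunk): the same string occupies far fewer memory "slots" once it is parsed into known chunks.

```python
# Illustrative chunking sketch: compare the number of memory slots needed
# for a raw symbol string vs. the same string parsed into familiar chunks.

KNOWN_CHUNKS = {"FBI", "VIP", "GNP", "CBS", "CIA", "ATF", "1225", "31415"}

def chunk(s, known=KNOWN_CHUNKS):
    """Greedily parse s into known chunks, falling back to single symbols."""
    chunks, i = [], 0
    while i < len(s):
        for size in range(min(5, len(s) - i), 0, -1):   # try longest first
            piece = s[i:i + size]
            if size == 1 or piece in known:
                chunks.append(piece)
                i += size
                break
    return chunks

print(chunk("FBIVIPGNPCBSCIA"))   # ['FBI', 'VIP', 'GNP', 'CBS', 'CIA'] -> 5 slots
print(chunk("1225ATFFBI31415"))   # ['1225', 'ATF', 'FBI', '31415'] -> 4 slots
print(len("1225ATFFBI31415"))     # 15 raw symbols without chunking
```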
Evidence for Chunking: S.F.
• Trained to chunk
• Used race times to chunk
• Reached a digit span of about 80
Experimental Evidence for Chunking
• Chase and Simon studied memory for chess
• Start with a "snapshot" of a game in play: a quick glance, then a memory test
• Grand masters scored much higher on the memory test
• But wait: aren't these guys just smart, with big memories?
• With random board positions there was no advantage for the masters
Cell Recording: Neurons in Inferior Temporal Cortex
• After training, a neuron responded only to "red"
• It responded only during the delay period
• It stopped responding when the trial ended
More Evidence of Working Memory Neurons
• Cooling down inferior temporal neurons impairs the monkey's performance on this task
Semantic Memory: Concepts
Geometric Approach: Concepts and items are represented as points in a high-dimensional space. Similarity between items is the inverse of the distance between their points. Categorization is the task of finding which concept point is closest to the point representing the item in question (i.e., "is it a cat?" asks whether the point representing "it" is closer to the "cat" point than to any other point in the space).
[Figure: points for cat, dog, pig, horse, and duck; points closer together = more similar]
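A minimal sketch of the geometric approach, assuming two made-up dimensions and made-up coordinates for the concept points: similarity is computed as the inverse of distance, and categorization picks the nearest concept point.

```python
# Illustrative geometric model: concepts as points, similarity as inverse
# distance, categorization as nearest concept point. Dimensions and
# coordinates are hypothetical.
import math

CONCEPTS = {
    "cat":   (0.2, 0.8),
    "dog":   (0.4, 0.5),
    "horse": (0.9, 0.3),
    "duck":  (0.15, 0.9),
    "pig":   (0.5, 0.4),
}

def similarity(a, b):
    """Similarity as the inverse of Euclidean distance."""
    return 1.0 / (1.0 + math.dist(a, b))

def categorize(item):
    """'Is it a cat?' = is the item's point closest to the 'cat' point?"""
    return max(CONCEPTS, key=lambda name: similarity(item, CONCEPTS[name]))

print(categorize((0.25, 0.75)))   # -> 'cat' (nearest concept point)
```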
Semantic Memory: Concepts
Geometric Axioms:
• Minimality: Similarity between an object and itself is always maximal ( d[A,A] = 0 )
• Symmetry: Similarity between A and B is the same as between B and A ( d[A,B] = d[B,A] )
• Triangle Inequality: If A is similar to B and B is similar to C, then A can't be too dissimilar to C ( d[A,C] ≤ d[A,B] + d[B,C] )
THESE DON'T WORK FOR PEOPLE!
• Familiar things are more similar to themselves than unfamiliar things are: S(apple, apple) > S(pomegranate, pomegranate)
• Unfamiliar things are more similar to familiar things than vice versa: S(pomegranate, apple) > S(apple, pomegranate)
• Things can be similar for different reasons (the Jamaica, Cuba, North Korea example)
Semantic Memory: Concepts
Featural Approach: Concepts and items are represented as lists of features. Similarity between items is given by
S(A,B) = a*f(A&B) - b*f(A~B) - c*f(B~A)
where f(A&B) counts the features A and B share, f(A~B) the features of A that B lacks, and f(B~A) the features of B that A lacks.
So similarity increases as two items have more in common, and decreases as each has its own non-shared features. Notice there can be biases: the coefficients a, b, and c can be weighted differently, so that features in each category can have different effects. This is how the model can account for the violations of the metric axioms...
Semantic Memory: Concepts

Feature          apple   orange   banana   pomegranate
edible             +       +        +          +
has a skin         +       +        +          +
round              +       +                   +
red                +                           +
edible skin        +
edible seeds                        +          +
good for pies      +
good for juice     +       +

Suppose the equation is: S(A,B) = 1*f(A&B) - 1*f(A~B) - 0.5*f(B~A)

S(apple, apple)             = 7 - 0 - 0     = 7
S(pomegranate, pomegranate) = 5 - 0 - 0     = 5     (violation of minimality: self-similarities differ)
S(apple, pomegranate)       = 4 - 3 - 0.5*1 = 0.5
S(pomegranate, apple)       = 4 - 1 - 0.5*3 = 1.5   (violation of symmetry)
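A minimal sketch of the featural (contrast) model using the feature table and weights above (a = 1, b = 1, c = 0.5); only the apple and pomegranate feature sets are needed to reproduce the slide's numbers, including the minimality and symmetry violations.

```python
# Illustrative contrast-model sketch using the slide's feature table and
# weights. Feature sets follow the table above.

FEATURES = {
    "apple":       {"edible", "has a skin", "round", "red",
                    "edible skin", "good for pies", "good for juice"},
    "pomegranate": {"edible", "has a skin", "round", "red", "edible seeds"},
}

def S(A, B, a=1.0, b=1.0, c=0.5):
    fA, fB = FEATURES[A], FEATURES[B]
    return a * len(fA & fB) - b * len(fA - fB) - c * len(fB - fA)

print(S("apple", "apple"))              # 7.0
print(S("pomegranate", "pomegranate"))  # 5.0  (violates minimality: differs from 7.0)
print(S("apple", "pomegranate"))        # 0.5
print(S("pomegranate", "apple"))        # 1.5  (violates symmetry: differs from 0.5)
```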
Semantic Memory: Concepts
How can we implement the featural model in a network?
• Units represent concepts and features, with links connecting related concepts and the features that describe them.
• Assume spreading activation: when one unit is activated, activation automatically spreads to all of the connected units over time.
• Assume the fan effect: the more units activation has to spread across, the weaker it becomes.
• When we compare two things, both units are activated and activation spreads outward from them. Their similarity is inversely proportional to how long it takes for a certain amount of activation from the two sources to overlap.
• The more units are shared, the more activation will overlap; units that are not shared decrease the overlapping activation by spreading it thinner (the fan effect).
[Figure: "apple" and "pomegranate" units linked to feature units: edible, has a skin, round, red, edible skin, edible seeds, good for pies, good for juice]
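A minimal sketch of spreading activation with a fan effect, reusing the apple/pomegranate feature sets; the per-step spreading rate and the overlap threshold are arbitrary illustrative choices. Shared features accumulate overlapping activation, while a larger fan thins what each source sends down any one link, so more-similar pairs reach the overlap threshold sooner.

```python
# Illustrative spreading-activation sketch with a fan effect: similarity is
# read off as how quickly activation from two concept units overlaps at
# their shared feature units.

FEATURES = {
    "apple":       {"edible", "has a skin", "round", "red",
                    "edible skin", "good for pies", "good for juice"},
    "pomegranate": {"edible", "has a skin", "round", "red", "edible seeds"},
}

def time_to_overlap(A, B, threshold=1.0, rate=0.1):
    overlap, t = 0.0, 0
    while overlap < threshold:
        t += 1
        # each step, each source sends `rate` activation down every link,
        # thinned by its fan (the number of features it spreads across)
        for f in FEATURES[A] & FEATURES[B]:
            overlap += rate / len(FEATURES[A]) + rate / len(FEATURES[B])
    return t

# more shared features and smaller fans -> faster overlap -> more similar
print(time_to_overlap("apple", "pomegranate"))
print(time_to_overlap("apple", "apple"))   # fastest: every feature is shared
```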
Semantic Memory: Concepts
This model can also account for categorization and typicality effects:
• Categorization: It takes longer to verify "A dog is an animal" than "A dog is a mammal", because activation has farther to travel.
• Typicality: Weights between units can indicate how typical an instance is of a superordinate category, changing how strongly activation from one is spread to the other.
[Figure: two IS-A hierarchies: animal above mammal (dog, cat) and bird; animal above mammal and bird (robin, penguin)]
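A minimal sketch of the categorization claim, assuming a toy IS-A hierarchy and that verification time grows with the number of links activation must cross: "A dog is an animal" needs two links, "A dog is a mammal" only one.

```python
# Illustrative hierarchy sketch: verification time tracks distance in the
# network. The hierarchy below is a toy example.

IS_A = {             # child -> parent links
    "dog": "mammal", "cat": "mammal",
    "robin": "bird", "penguin": "bird",
    "mammal": "animal", "bird": "animal",
}

def links_to(instance, category):
    """Count IS-A links from instance up to category (None if unreachable)."""
    steps, node = 0, instance
    while node != category:
        if node not in IS_A:
            return None
        node, steps = IS_A[node], steps + 1
    return steps

print(links_to("dog", "mammal"))   # 1 link  -> verified quickly
print(links_to("dog", "animal"))   # 2 links -> takes longer
```

Typicality could be added by weighting each link, so that activation spreads more strongly from "robin" to "bird" than from "penguin" to "bird".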
Semantic Memory: Concepts
What do feature lists leave out?
• Causal relations (e.g., the fact that fertilizer tends to grow plants)
• Relational dependencies (e.g., the fact that only small birds sing)
• In short: feature lists leave out structured information.
• Recall from our discussion of episodic memory that this problem can be solved with schemata: complex structured frameworks.
• Thus, schemata can be used in semantic memory too, to tell us what kinds of items are typically found in offices, what kinds of events typically happen in a restaurant, and so on.
Modeling Schemata?
• Challenge for the future: How do we represent structured relational information in a network?
• Relational information (e.g., "Chris loves Pat") poses a problem in networks with distributed representations, similar to the binding problem: the catastrophic superposition problem.
• Suppose one pattern of activation represents "Chris", another "Pat", another "Harry", another "Sally", and another "loves". Superimposing the patterns for the parts gives one pattern for "Chris loves Pat", one for "Pat loves Chris", and one for "Harry loves Sally". But the superpositions for "Chris loves Pat" and "Pat loves Chris" are identical: there is no way to tell the difference!
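A minimal sketch of the catastrophic superposition problem, assuming arbitrary 8-unit activation patterns: if a proposition is just the sum of the patterns for its parts, "Chris loves Pat" and "Pat loves Chris" come out identical.

```python
# Illustrative superposition sketch: summing the patterns for the parts
# loses the role information, so who-loves-whom cannot be recovered.

CHRIS = [1, 0, 0, 1, 0, 0, 1, 0]
PAT   = [0, 1, 0, 0, 1, 0, 0, 1]
LOVES = [0, 0, 1, 0, 0, 1, 0, 0]

def superpose(*patterns):
    return [sum(units) for units in zip(*patterns)]

chris_loves_pat = superpose(CHRIS, LOVES, PAT)
pat_loves_chris = superpose(PAT, LOVES, CHRIS)
print(chris_loves_pat == pat_loves_chris)   # True: no way to tell who loves whom
```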
Modeling Schemata?
• One answer: temporal synchrony (units that belong together fire at the same time). Suppose one pattern represents "Chris", another "Sally", and another "loves".
• But how do we distinguish between "Chris loves Sally" and "Sally loves Chris"?
Modeling Schemata?
• We need to combine structural and semantic information.
• LISA (Learning and Inference with Schemas and Analogies), Hummel & Holyoak
• Binds semantic information (e.g., "Chris") to roles (e.g., the agent role of "loves")
• Can then make inferences the way we do:
  Chris loves Mary. Chris gives flowers to Mary.
  Bill likes Sally. Bill gives candy to ??? -> Sally
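A minimal sketch in the spirit of role-filler binding (much simpler than LISA itself; the role names and the inference step are illustrative assumptions): once fillers are bound to roles, "Chris loves Pat" and "Pat loves Chris" are no longer confusable, and the bindings can drive the flowers/candy inference from the slide.

```python
# Illustrative role-filler binding sketch (not LISA): a proposition is a
# set of (relation, role) -> filler bindings.

def proposition(pred, agent, patient):
    return {(pred, "agent"): agent, (pred, "patient"): patient}

p1 = proposition("loves", "Chris", "Pat")
p2 = proposition("loves", "Pat", "Chris")
print(p1 == p2)   # False: role bindings keep "who loves whom" distinct

# Analogy from the slide: "Chris loves Mary; Chris gives flowers to Mary.
# Bill likes Sally; Bill gives candy to ???"
target = proposition("likes", "Bill", "Sally")
# carry the agent/patient bindings over to the inferred "gives" relation
inferred = proposition("gives",
                       target[("likes", "agent")],
                       target[("likes", "patient")])
print(inferred[("gives", "patient")])   # -> 'Sally'
```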
Procedural Memory
• Memory for procedures, rather than facts
• Skill memory / skill learning
• Knowing how (rather than "knowing that")
• Muscle memory / motor memory
• Implicit memory / implicit knowledge
All of these are different terms for a third kind of memory. How do we know it's a separate kind of memory?
• You can read all the facts about how to do something and still not be good at it.
• H.M. could not form new semantic or episodic memories, but could still learn new procedural skills.
Stages of Skill Learning
• Declarative stage: You know what to do, like a list of instructions, and rely mostly on declarative memory. You are slow and make a lot of errors.
• Knowledge compilation stage (proceduralization): You have developed some procedures, but action is still divided into small individual steps and is not fluid. You still rely somewhat on declarative memory. You improve in speed and accuracy.
• Procedural stage: Individual procedures and actions become streamlined and fluid; you make fewer and fewer errors and get faster. There is almost no reliance on declarative knowledge. As a result, it is hard to explain what you are doing, and hard to pick up again if you are interrupted (because the movement is fluid).
ACT*
[Diagram: Declarative Memory and Production Memory connected to Working Memory via retrieval/storage and match/execution; Working Memory connected to the world through perception and action. Working memory contents: "I'm driving!", "Stop sign!"]
Declarative memory (facts):
• Pressing the brake slows the car
• The brake is to the left of the gas
• To stop you have to slow down
• You have to stop at a stop sign
• ...
Production memory (rules):
• IF at stop sign THEN subgoal: slow the car
• IF subgoal to slow the car AND foot is over gas THEN move foot left
• IF subgoal to slow the car AND foot is over brake THEN press!!
With practice these productions get composed (CHUNKING!) into a single production:
• IF at stop sign THEN brake!
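A minimal sketch (not ACT* itself) of a match-execute production cycle and of composing two productions into one, which is the chunking step that collapses the stop-sign knowledge into "IF at stop sign THEN brake". The working-memory strings, the two-rule simplification, and the rule format are illustrative assumptions.

```python
# Illustrative production-system sketch: rules match against working memory
# and add their actions; composing two rules that fire in sequence yields a
# single, faster rule (chunking / knowledge compilation).

productions = [
    ({"at stop sign"},          {"subgoal: slow the car"}),
    ({"subgoal: slow the car"}, {"press the brake"}),
]

def fire(working_memory, rules):
    """One match-execute cycle: fire every production whose conditions match."""
    new = set(working_memory)
    for conditions, actions in rules:
        if conditions <= working_memory:   # match against current contents
            new |= actions                 # execute: add actions to WM
    return new

def compose(rule1, rule2):
    """Compile two productions that fire in sequence into one (chunking)."""
    (c1, a1), (c2, a2) = rule1, rule2
    return (c1 | (c2 - a1), a2)            # skip the intermediate subgoal

wm = fire(fire({"at stop sign"}, productions), productions)
print(wm)                                  # brake pressed after two cycles

compiled = compose(productions[0], productions[1])
print(compiled)                            # ({'at stop sign'}, {'press the brake'})
print(fire({"at stop sign"}, [compiled]))  # brake pressed in a single cycle
```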