Outline
• Abstract Neuron
• Link to Vision: The Necker Cube
• Constrained Best Fit in Nature (inanimate vs. animate)
• Computing other relations
Abstract Neuron
A threshold unit: inputs i1 … in arrive over weights w1 … wn, plus a bias input i0 = 1 with weight w0. Output y = 1 if net > 0, 0 otherwise, where net = w0·i0 + w1·i1 + … + wn·in.
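The abstract neuron can be sketched in a few lines of Python (a minimal illustration of the threshold rule above, not any particular published implementation):

```python
def abstract_neuron(inputs, weights, bias_weight):
    """Threshold unit: fires (1) when the weighted input sum exceeds 0.

    net = w0*i0 + w1*i1 + ... + wn*in, with i0 = 1 (the bias input).
    """
    net = bias_weight + sum(w * i for w, i in zip(weights, inputs))
    return 1 if net > 0 else 0

# Example: with these (illustrative) weights the unit fires only when
# both inputs are on, i.e. it computes AND.
weights = [1.0, 1.0]
bias = -1.5  # w0 = -1.5, i0 = 1
print(abstract_neuron([1, 1], weights, bias))  # → 1 (net = 0.5 > 0)
print(abstract_neuron([1, 0], weights, bias))  # → 0 (net = -0.5)
```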
Constrained Best Fit in Nature
[Figure: examples contrasting inanimate and animate forms]
Computing other relations • The 2/3 node is a useful function: it activates all 3 of its connections whenever any 2 of them are active • Such a node is also called a triangle node and will be useful for many representations.
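The 2/3 rule can be expressed directly as a small function (a toy sketch of the behavior just described; the tuple representation of the three connections is an assumption for illustration):

```python
def triangle_node(a, b, c):
    """2/3 (triangle) node: any active pair activates the whole triple."""
    if a + b + c >= 2:
        return (1, 1, 1)   # two active connections recruit the third
    return (a, b, c)       # below threshold: nothing new fires

print(triangle_node(1, 1, 0))  # → (1, 1, 1): two inputs recruit the third
print(triangle_node(1, 0, 0))  # → (1, 0, 0): one input is not enough
```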
Triangle Nodes and McCulloch-Pitts Neurons?
[Diagram: a triangle node binding Relation (A), Object (B), and Value (C)]
“They all rose”
triangle nodes: when two of the abstract neurons fire, the third also fires
a model of spreading activation
Basic Ideas • Parallel activation streams. • Top down and bottom up activation combine to determine the best matching structure. • Triangle nodes bind features of objects to values • Mutual inhibition and competition between structures • Mental connections are active neural connections
Behavioral Experiments • Identity – Mental activity is Structured Neural Activity • Spreading Activation— Psychological model/theory behind priming and interference experiments • Simulation — Necessary for meaningfulness and contextual inference • Parameters — Govern simulation, strict inference, link to language
Bottom-up vs. Top-down Processes • Bottom-up: When processing is driven by the stimulus • Top-down: When knowledge and context are used to assist and drive processing • Interaction: The stimulus is the basis of processing but almost immediately top-down processes are initiated
Stroop Effect • Interference between form and meaning
Name the words: Book Car Table Box Trash Man Bed Corn Sit Paper Coin Glass House Jar Key Rug Cat Doll Letter Baby Tomato Check Phone Soda Dish Lamp Woman
Name the print color of the words: Blue Green Red Yellow Orange Black Red Purple Green Red Blue Yellow Black Red Green White Blue Yellow Red Black Blue White Red Yellow Green Black Purple
Procedure for experiment that demonstrates the word-superiority effect. First the word is presented, then the XXXX’s, then the letters.
Word-Superiority Effect Reicher (1969) • Which condition resulted in faster & more accurate recognition of the letter? • The word condition • Letters are recognized faster when they are part of a word than when they are alone • This rejects the completely bottom-up feature model • Also a challenge for serial processing
Connectionist Model (McClelland & Rumelhart, 1981) • Knowledge is distributed and processing occurs in parallel, with both bottom-up and top-down influences • This model can explain the Word-Superiority Effect because it can account for context effects
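A toy sketch of the interactive-activation idea, in the spirit of McClelland & Rumelhart (1981). The unit names, weights, and update rule here are illustrative assumptions, not the published parameters. Letter units excite word units bottom-up; word units feed activation back top-down, so a degraded letter embedded in a word gets extra support: the word-superiority effect.

```python
letters = {"W": 0.4, "O": 0.4, "R": 0.4, "K": 0.1}  # "K" degraded by the mask
words = {"WORK": 0.0, "WORD": 0.0}

for _ in range(5):
    # bottom-up: each word unit sums the activation of its constituent letters
    for w in words:
        words[w] = 0.5 * sum(letters[ch] for ch in w if ch in letters)
    # top-down: each letter gets feedback from every word containing it
    for ch in letters:
        feedback = sum(a for w, a in words.items() if ch in w)
        letters[ch] = min(1.0, letters[ch] + 0.2 * feedback)

print(letters["K"])  # boosted above its 0.1 stimulus-driven level by feedback
```

After a few cycles, "K" rises well above the activation the stimulus alone gave it, and "WORK" dominates "WORD", which is the context effect the slide refers to.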
Interaction in language processing: Pragmatic constraints on lexical access Jim Magnuson Columbia University
Information integration • A central issue in psycholinguistics and cognitive science: • When/how are such sources integrated? • Two views • Interaction • Use information as soon as it is available • Free flow between levels of representation • Modularity • Protect and optimize levels by encapsulation • Staged serial processing • Reanalyze / appeal to top-down information only when needed
Reaction Times in Milliseconds after “They all rose”
[Figure: naming latencies probed at 0 ms delay and at 200 ms delay]
Example: Modularity and word recognition • Tanenhaus et al. (1979) [also Swinney, 1979] • Given a homophone like rose, and a context biased towards one sense, when is context integrated? • Spoken sentence primes ending in homophones: • They all rose vs. They bought a rose • Secondary task: name a displayed orthographic word • Probe at offset of ambiguous word: priming for both “stood” and “flower” • 200 ms later: only priming for the appropriate sense • Suggests encapsulation followed by rapid integration • But the constraint here is weak -- does this overestimate modularity? • How could we examine strong constraints in natural contexts?
Allopenna, Magnuson & Tanenhaus (1998)
[Diagram: head-mounted eye-tracking setup with an eye camera, a scene camera, and an eye-tracking computer; spoken instruction: ‘Pick up the beaker’]
TRACE predictions Do rhymes compete? • Cohort (Marslen-Wilson): onset similarity is primary because of the incremental nature of speech (serial/staged; Shortlist/Merge) • Cat activates cap, cast, cattle, camera, etc. • Rhymes won’t compete • NAM (Neighborhood Activation Model; Luce): global similarity is primary • Cat activates bat, rat, cot, cast, etc. • Rhymes are among the set of strong competitors • TRACE (McClelland & Elman): global similarity constrained by the incremental nature of speech • Cohorts and rhymes compete, but with different time courses
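The competitor sets the three accounts emphasize can be sketched with simple string heuristics over a toy lexicon (the lexicon and the similarity rules below are illustrative assumptions, not the models' actual phonological representations):

```python
lexicon = ["cat", "cap", "cast", "cattle", "bat", "rat", "mat", "cot", "dog"]

def cohorts(target, lexicon):
    """Cohort-style competitors: words sharing the onset (first two segments)."""
    return [w for w in lexicon if w != target and w[:2] == target[:2]]

def rhymes(target, lexicon):
    """Rhyme competitors: same length, mismatching only in the first segment."""
    return [w for w in lexicon
            if w != target and len(w) == len(target) and w[1:] == target[1:]]

print(cohorts("cat", lexicon))  # → ['cap', 'cast', 'cattle']
print(rhymes("cat", lexicon))   # → ['bat', 'rat', 'mat']
```

On this toy lexicon the two sets are disjoint, which is what makes the eye-movement data diagnostic: Cohort predicts only the first set competes, NAM adds globally similar words like "cot", and TRACE predicts both sets compete but with different time courses.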
Study 1 Conclusions • As predicted by interactive models, cohorts and rhymes are activated, with different time courses • Eye movement paradigm • More sensitive than conventional paradigms • More naturalistic • Simultaneous measures of multiple items • Transparently linkable to computational model • Time locked to speech at a fine grain
Theoretical conclusions • Natural contexts provide strong constraints that are used • When those constraints are extremely predictive, they are integrated as quickly as we can measure • Suggests rapid, continuous interaction among • Linguistic levels • Nonlinguistic context • Even for processes assumed to be low-level and automatic • Constrains processing theories, also has implications for, e.g., learnability
Producing words from pictures or from other words: A comparison of aphasic lexical access from two different input modalities Gary Dell with Myrna Schwartz, Dan Foygel, Nadine Martin, Eleanor Saffran, Deborah Gagnon, Rick Hanley, Janice Kay, Susanne Gahl, Rachel Baron, Stefanie Abel, Walter Huber
Boxes and arrows in the linguistic system
[Diagram: Semantics, Syntax, Lexicon, Output Phonology, and Input Phonology connected by arrows]
Picture Naming Task
[Diagram: a picture drives Semantics, then the Lexicon and Output Phonology; the speaker says “cat”]
A 2-step Interactive Model of Lexical Access in Production
[Network diagram: semantic features at the top; word nodes FOG, DOG, CAT, RAT, MAT in the middle; phoneme nodes at the bottom, grouped as onsets f, r, d, k, m; vowels ae, o; codas t, g]
Step 1 – Lemma Access: activate the semantic features of CAT
Step 1 – Lemma Access: activation spreads through the network
Step 1 – Lemma Access: the most active word from the proper category is selected and linked to a syntactic frame (NP → N)
Step 2 – Phonological Access: a jolt of activation is sent to the selected word
Step 2 – Phonological Access: activation spreads through the network
Step 2 – Phonological Access: the most activated phonemes are selected and linked to a syllable frame (Syl → On Vo Co)
Semantic Error – “dog”: shared features activate semantic neighbors
Formal Error – “mat”: phoneme-word feedback activates formal neighbors
Mixed Error – “rat”: mixed semantic-formal neighbors gain activation from both top-down and bottom-up sources
Errors of Phonological Access – “dat,” “mat”: selection of incorrect phonemes
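The two steps and the error types above can be sketched in a toy simulation. The mini-lexicon, the weights (s, p), and the noise level are illustrative assumptions, not the model's fitted parameters, and letters stand in for onset/vowel/coda phonemes.

```python
import random

LEXICON = {  # word -> (semantic features, segments)
    "cat": ({"feline", "pet", "animal"}, ("c", "a", "t")),
    "dog": ({"canine", "pet", "animal"}, ("d", "o", "g")),
    "rat": ({"rodent", "animal"},        ("r", "a", "t")),
    "mat": ({"flat", "floor"},           ("m", "a", "t")),
    "fog": ({"weather", "grey"},         ("f", "o", "g")),
}
SEGMENTS = "cdrmfaogt"

def name_picture(target, s=0.1, p=0.1, noise=0.05, rng=random):
    t_feats, t_segs = LEXICON[target]
    # Step 1 (lemma access): semantic features support words top-down, and
    # segment-word feedback supports formally similar words bottom-up, so
    # semantic ("dog"), formal ("mat"), and mixed ("rat") errors can occur.
    act = {}
    for word, (feats, segs) in LEXICON.items():
        act[word] = (s * len(t_feats & feats)
                     + p * len(set(t_segs) & set(segs))
                     + rng.gauss(0, noise))
    selected = max(act, key=act.get)
    # Step 2 (phonological access): a jolt goes to the selected word; noisy
    # segment selection can yield nonword errors like "dat".
    out = [seg if rng.random() > noise else rng.choice(SEGMENTS)
           for seg in LEXICON[selected][1]]
    return "".join(out)

random.seed(1)
responses = [name_picture("cat") for _ in range(200)]
print(responses.count("cat") / 200)  # mostly correct, with occasional errors
```

Raising `noise` (or shrinking `s` and `p`, as in the lesioning hypothesis below) shifts the output away from correct responses and toward the error categories.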
A Test of the Model: Picture-naming Errors in Aphasia (“cat”) • 175 pictures of concrete nouns (Philadelphia Naming Test) • 94 patients (Broca, Wernicke, anomic, conduction) • 60 normal controls
Response Categories (target: “cat”)
Correct: CAT • Semantic: DOG • Formal: MAT • Mixed: RAT • Unrelated: LOG • Nonword: DAT
Continuity Thesis: the normal error pattern is 97% correct; the random error pattern is 80% nonwords
Implementing the Continuity Thesis
1. Set up the model lexicon so that when noise is very large, it creates an error pattern similar to the random pattern.
2. Set the processing parameters of the model so that its error pattern matches the normal controls.
[Figure: model error patterns compared with the random pattern and with the normal controls across the six response categories]
Lesioning the model: the semantic-phonological weight hypothesis
[Diagram: semantic-word connection weight s between semantic features and words; phonological-word connection weight p between words and the onset/vowel/coda phonemes]
Proportions by category: Correct / Semantic / Formal / Mixed / Unrelated / Nonword
LH patient:                  .71  .03  .07  .01  .02  .15
LH model (s=.024, p=.018):   .69  .06  .06  .01  .02  .17
IG patient:                  .77  .10  .06  .03  .01  .03
IG model (s=.019, p=.032):   .77  .09  .06  .01  .04  .03
GL patient:                  .29  .04  .22  .03  .10  .32
GL model (s=.010, p=.016):   .31  .10  .15  .01  .13  .30
Representing Model-Patient Deviations: Root Mean Square Deviation (RMSD)
LH: .016   IG: .016   GL: .043
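These RMSD values can be recomputed from the table's proportions. Small discrepancies (e.g. .017 vs. the reported .016 for LH) are expected because the published proportions are rounded to two decimals.

```python
def rmsd(patient, model):
    """Root mean square deviation over the six response-category proportions."""
    n = len(patient)
    return (sum((p - m) ** 2 for p, m in zip(patient, model)) / n) ** 0.5

# Category order: Correct, Semantic, Formal, Mixed, Unrelated, Nonword
fits = {
    "LH": ([.71, .03, .07, .01, .02, .15], [.69, .06, .06, .01, .02, .17]),
    "IG": ([.77, .10, .06, .03, .01, .03], [.77, .09, .06, .01, .04, .03]),
    "GL": ([.29, .04, .22, .03, .10, .32], [.31, .10, .15, .01, .13, .30]),
}
for name, (patient, model) in fits.items():
    print(name, round(rmsd(patient, model), 3))  # close to .016 / .016 / .043
```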
94 new patients (no exclusions): 94.5% of variance accounted for
Conclusions
The logic underlying box-and-arrow models is perfectly compatible with connectionist models. Connectionist principles augment the boxes and arrows with:
• a mechanism for quantifying degree of damage
• mechanisms for error types, and hence an explanation of the error patterns
Implications for recovery and rehabilitation