AND Network

Input   Output
0 0     0
1 0     0
0 1     0
1 1     1
OR Network

Input   Output
0 0     0
1 0     1
0 1     1
1 1     1

NETWORK CONFIGURED BY TLEARN
# weights after 10000 sweeps
# WEIGHTS
# TO NODE 1
-1.9083807468   ## bias to 1
4.3717832565    ## i1 to 1
4.3582129478    ## i2 to 1
0.0000000000
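A quick way to see what these weights encode is to run the forward pass by hand. The sketch below is a minimal illustration, assuming the logistic activation that TLEARN uses and treating outputs above 0.5 as 1; the variable names are my own, not TLEARN's. It checks that the listed weights to node 1 reproduce the OR truth table.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Weights reported above for the OR network (bias, i1, i2 -> node 1)
bias, w_i1, w_i2 = -1.9083807468, 4.3717832565, 4.3582129478

for i1, i2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    activation = logistic(bias + w_i1 * i1 + w_i2 * i2)
    print(i1, i2, round(activation, 3), int(activation > 0.5))

# Expected: 0 0 -> 0, 1 0 -> 1, 0 1 -> 1, 1 1 -> 1 (the OR truth table)
```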
XOR Network

Hidden unit 1 (responds to i1 AND NOT i2):
Input   Output
0 0     0
1 0     1
0 1     0
1 1     0

-3.0456776619   ## bias to 1
5.5165352821    ## i1 to 1
-5.7562727928   ## i2 to 1

Hidden unit 2 (responds to i2 AND NOT i1):
Input   Output
0 0     0
1 0     0
0 1     1
1 1     0

-3.6789164543   ## bias to 2
-6.4448370934   ## i1 to 2
6.4957633018    ## i2 to 2

Hidden units to output:
Input   Output
0 0     0
1 0     1
0 1     1
1 1     1

-4.4429202080   ## bias to output
9.0652370453    ## 1 to output
8.9045801163    ## 2 to output

The mapping from the hidden units to the output is an OR network that never receives a [1 1] input. Together, the two hidden units and the OR mapping to the output compute XOR.
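To see the decomposition work end to end, the sketch below runs the listed weights through a two-layer forward pass and recovers the XOR truth table. It again assumes a logistic activation and a 0.5 threshold; the helper and constant names are illustrative, not taken from TLEARN.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# TLEARN weights for the XOR network: (bias, weight from i1, weight from i2)
HIDDEN1 = (-3.0456776619, 5.5165352821, -5.7562727928)   # i1 AND NOT i2
HIDDEN2 = (-3.6789164543, -6.4448370934, 6.4957633018)   # i2 AND NOT i1
# Output unit: (bias, weight from hidden 1, weight from hidden 2) -- an OR mapping
OUTPUT = (-4.4429202080, 9.0652370453, 8.9045801163)

def unit(weights, a, b):
    bias, w_a, w_b = weights
    return logistic(bias + w_a * a + w_b * b)

for i1, i2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    h1 = unit(HIDDEN1, i1, i2)
    h2 = unit(HIDDEN2, i1, i2)
    out = unit(OUTPUT, h1, h2)
    print(i1, i2, "->", int(out > 0.5))   # prints the XOR truth table
```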
Classic Developmental Story • Initial mastery of regular and irregular past tense forms • Overregularization appears only later (e.g., goed, comed) • ‘U-Shaped’ developmental pattern taken as evidence for learning of a morphological rule:
V + [+past] --> stem + /d/
Rumelhart & McClelland 1986
Model learns to classify regulars and irregulars, based on sound similarity alone.
Shows U-shaped developmental profile.
What is really at stake here? • Abstraction • Operations over variables • Learning based on input
What is not at stake here • Feedback, negative evidence, etc.
Who has the most at stake here? • Those who deny the need for rules/variables in language have the most to lose here • …but if they are successful, they bring with them a simple and attractive learning theory, and mechanisms that can readily be grounded at the neural level • However, if the advocates of rules/variables succeed here or elsewhere, they face the more difficult challenge at the neuroscientific level
Questions about Lab 2b • How did the network perform? • How well did the network generalize to novel stems? • What was the effect of the frequency manipulation? • Does the network need to internalize a Blocking Principle? • Does the network explicitly represent a default form?
Beyond Sound Similarity
Regulars and Associative Memory
1. Are regulars different?
2. Do regulars implicate operations over variables?
Neuropsychological Dissociations
Other Domains of Morphology
Beyond Sound Similarity
Zero-derived denominals are regular
  Soldiers ringed the city / *Soldiers rang the city
  high-sticked, grandstanded, … / *high-stuck, *grandstood, …
Productive in adults & children
Shows sensitivity to morphological structure: [[stemN] øV]-ed
Provides good evidence that sound similarity is not everything
But nothing prevents a model from using a richer similarity metric
  morphological structure (for ringed)
  semantic similarity (for low-lifes)
Beyond Sound Similarity
Regulars and Associative Memory
1. Are regulars different?
2. Do regulars implicate operations over variables?
Neuropsychological Dissociations
Other Domains of Morphology
Regulars & Associative Memory
Regulars are productive, need not be stored
Irregulars are not productive, must be stored
But are regulars immune to effects of associative memory?
  frequency
  over-irregularization
Pinker & Ullman: regulars may be stored, but they can also be generated on the fly
  a ‘race’ can determine which of the two routes wins
  some tasks are more likely to show effects of stored regulars
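One way to picture the dual-route ‘race’ is as two processes run in parallel: a memory lookup for a stored form and a rule that adds the regular suffix to the stem, with a successful lookup blocking the rule. The sketch below is only an illustration of that idea, not Pinker & Ullman's model; the lexicon entries and function names are invented for the example.

```python
# Toy illustration of a dual-route setup: a stored-form lookup competes with
# a general rule (stem + "-ed"). The lexicon contents are invented.
STORED_FORMS = {
    "go": "went", "come": "came", "find": "found",
    "walk": "walked",  # a high-frequency regular may also be stored
}

def memory_route(stem):
    # Returns a stored past-tense form if one is available, else None.
    return STORED_FORMS.get(stem)

def rule_route(stem):
    # Default rule: V + [+past] -> stem + /d/ (spelled "-ed" here).
    return stem + "ed"

def past_tense(stem):
    # A successful lookup blocks the rule; otherwise the rule supplies the default.
    return memory_route(stem) or rule_route(stem)

for stem in ["go", "walk", "jump", "play"]:
    print(stem, "->", past_tense(stem))
# go -> went, walk -> walked (stored); jump -> jumped, play -> played (rule)
```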
Child vs. Adult Impairments
Specific Language Impairment
Early claims that regulars show greater impairment than irregulars are not confirmed
Pinker & Ullman 2002b: ‘The best explanation is that language-impaired people are indeed impaired with rules, […] but can memorize common regular forms.’
Regulars show consistent frequency effects in SLI, not in controls.
‘This suggests that children growing up with a grammatical deficit are better at compensating for it via memorization than are adults who acquired their deficit later in life.’
Beyond Sound Similarity
Regulars and Associative Memory
1. Are regulars different?
2. Do regulars implicate operations over variables?
Neuropsychological Dissociations
Other Domains of Morphology
Neuropsychological Dissociations
Ullman et al. 1997
  Alzheimer’s disease patients: poor memory retrieval
    poor irregulars, good regulars
  Parkinson’s disease patients: impaired motor control, good memory
    good irregulars, poor regulars
  Striking correlation involving laterality of effect
Marslen-Wilson & Tyler 1997
  Normals: past tense primes stem
  2 Broca’s patients: irregulars prime stems; inhibition for regulars
  1 patient with bilateral lesion: regulars prime stems; no priming for irregulars or semantic associates
Morphological Priming
Lexical Decision Task
  CAT, TAC, BIR, LGU, DOG: press ‘Yes’ if this is a word
Priming: facilitation in decision times when a related word precedes the target (relative to an unrelated control), e.g., {dog, rug} - cat
Marslen-Wilson & Tyler 1997
  Regular: {jumped, locked} - jump
  Irregular: {found, shows} - find
  Semantic: {swan, hay} - goose
  Sound: {gravy, sherry} - grave
Neuropsychological Dissociations
Bird et al. 2003: arguments for selective difficulty with regulars are confounded with the phonological complexity of the word-endings
Pinker & Ullman 2002: weight of evidence still supports the dissociation; Bird et al.’s materials contained additional confounds
Brain Imaging Studies
Jaeger et al. 1996, Language: PET study of past tense
  Task: generate past from stem
  Design: blocked conditions
  Result: different areas of activation for regulars and irregulars
  Is this evidence decisive?
    task demands very different
    difference could show up in a network
    doesn’t implicate variables
Münte et al. 1997: ERP study of violations
  Task: sentence reading
  Design: mixed
  Result: regulars ~LAN, irregulars ~N400
  Is this evidence decisive?
    allows possibility of comparison with other violations
Beyond Sound Similarity
Regulars and Associative Memory
1. Are regulars different?
2. Do regulars implicate operations over variables?
Neuropsychological Dissociations
Other Domains of Morphology
Low-Frequency Defaults
German Plurals
  die Straße    die Straßen
  die Frau      die Frauen
  der Apfel     die Äpfel
  die Mutter    die Mütter
  das Auto      die Autos
  der Park      die Parks
                die Schmidts
-s plural: low frequency, used for loan-words, denominals, names, etc.
Response frequency is not the critical factor in a system that focuses on similarity
  distribution in the similarity space is crucial
  similarity space with islands of reliability
  a network can learn the islands, or it can learn to associate a form with the space between the islands
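The contrast between learning the islands and learning the space between them can be pictured with a toy nearest-neighbour setup. The sketch below is purely illustrative: the two-dimensional "phonological" coordinates, class labels, and distance threshold are invented, and real models of the German plural (e.g. Hahn & Nakisa 2000) use far richer representations.

```python
# Toy illustration of a low-frequency default in a similarity-based system.
# Items live in an invented 2-D "phonological" space; -(e)n and umlaut plurals
# form islands, and -s is assigned when an item is far from every island.
ISLANDS = {
    "-en":    [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)],
    "umlaut": [(4.0, 4.0), (4.1, 3.8), (3.9, 4.2)],
}
DEFAULT = "-s"
THRESHOLD = 1.5   # maximum distance at which an island still attracts an item

def distance(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def classify(item):
    # Find the nearest stored exemplar; fall back to the default class
    # if the item lies in the space between the islands.
    best_class, best_dist = None, float("inf")
    for label, exemplars in ISLANDS.items():
        for ex in exemplars:
            d = distance(item, ex)
            if d < best_dist:
                best_class, best_dist = label, d
    return best_class if best_dist <= THRESHOLD else DEFAULT

print(classify((1.1, 1.0)))   # inside the -en island -> "-en"
print(classify((4.0, 4.1)))   # inside the umlaut island -> "umlaut"
print(classify((8.0, 0.5)))   # e.g. a loan-word far from both islands -> "-s"
```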
Arabic Broken Plural • CvCC • nafs nufuus ‘soul’ • qidh qidaah ‘arrow’ • CvvCv(v)C • xaatam xawaatim ‘signet ring’ • jaamuus jawaamiis ‘buffalo’ • Sound Plural • shuway?ir shuway?ir-uun ‘poet (dim.)’ • kaatib kaatib-uun ‘writing (participle)’ • hind hind-aat ‘Hind (fem. name)’ • ramadaan ramadaan-aat ‘Ramadan (month)’
German Plurals (Hahn & Nakisa 2000)
Starting Small Simulation • How well did the network perform? • How did it manage to learn?
Generalization
• Training Items
  Input: 1 0 1 0   Output: 1 0 1 0
  Input: 0 1 0 0   Output: 0 1 0 0
  Input: 1 1 1 0   Output: 1 1 1 0
  Input: 0 0 0 0   Output: 0 0 0 0
• Test Item
  Input: 1 1 1 1   Output: ? ? ? ?
  1 1 1 1 (Humans)
  1 1 1 0 (Network)
• Generalization fails because learning is local
Generalization
• Training Items
  Input: 1 0 1 0   Output: 1 0 1 0
  Input: 0 1 0 0   Output: 0 1 0 0
  Input: 1 1 1 0   Output: 1 1 1 0
  Input: 0 0 0 0   Output: 0 0 0 0
• Test Item
  Input: 1 1 1 1   Output: ? ? ? ?
  1 1 1 1 (Humans)
  1 1 1 1 (Network)
• Generalization succeeds because representations are shared
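The failure case is easy to reproduce. The sketch below is an illustration, not the original lab setup: it trains a single-layer network with independent logistic output units (no shared representation) on the four training items. Because the fourth output unit never sees a target of 1, it learns to answer 0 regardless of its input, so the network typically maps 1 1 1 1 to roughly 1 1 1 0.

```python
import math, random

# Training items for the identity mapping; the fourth bit is never 1.
TRAIN = [([1, 0, 1, 0], [1, 0, 1, 0]), ([0, 1, 0, 0], [0, 1, 0, 0]),
         ([1, 1, 1, 0], [1, 1, 1, 0]), ([0, 0, 0, 0], [0, 0, 0, 0])]
TEST = [1, 1, 1, 1]

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# One weight vector (4 inputs + bias) per output unit: nothing is shared.
weights = [[random.uniform(-0.5, 0.5) for _ in range(5)] for _ in range(4)]

def forward(unit_w, x):
    return logistic(unit_w[4] + sum(w * xi for w, xi in zip(unit_w[:4], x)))

# Plain delta-rule training on the four items.
for _ in range(5000):
    for x, target in TRAIN:
        for unit_w, t in zip(weights, target):
            out = forward(unit_w, x)
            delta = (t - out) * out * (1 - out)
            for i in range(4):
                unit_w[i] += 0.5 * delta * x[i]
            unit_w[4] += 0.5 * delta

print([round(forward(unit_w, TEST)) for unit_w in weights])
# Typically prints [1, 1, 1, 0]: the last unit never saw a 1 target,
# so it cannot generalize to the novel input 1 1 1 1.
```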