The Past Tense: Neural Networks and Non-Symbolic Computation
Abstraction (again!) • Powerful, but costly • How much is needed in human language? • Model system: English past tense
Classic Developmental Story • Initial mastery of regular and irregular past tense forms • Overregularization appears only later (e.g. goed, comed) • 'U-shaped' developmental pattern taken as evidence for learning of a morphological rule: V + [+past] --> stem + /d/
Rumelhart & McClelland 1986 • Model learns to classify regulars and irregulars, based on sound similarity alone • Shows U-shaped developmental profile
What is really at stake here? • Abstraction • Operations over variables • Symbol manipulation • Algebraic computation • Learning based on input • How do learners generalize beyond input? (e.g., y = 2x)
Functions • Input → Output • 4, 4 → 8 • 2, 3 → 5 • 1, 9 → 10 • 6, 7 → 13 • 341, 257 → 598
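A minimal sketch (mine, not from the slides) of what "figuring out the function" can look like for the pairs above: a linear model trained by gradient descent recovers f(a, b) = a + b and then generalizes to the unseen pair (341, 257). The learning rate and epoch count are illustrative assumptions.

```python
# Fit w1*a + w2*b to the addition examples by stochastic gradient descent.
examples = [((4, 4), 8), ((2, 3), 5), ((1, 9), 10), ((6, 7), 13)]

w1, w2 = 0.0, 0.0
for _ in range(2000):
    for (a, b), target in examples:
        err = target - (w1 * a + w2 * b)
        w1 += 0.01 * err * a   # gradient step on squared error
        w2 += 0.01 * err * b

# Generalization to an input the learner never saw:
print(round(w1 * 341 + w2 * 257))  # 598, once the weights converge on 1, 1
```

The data are noiseless and exactly linear, so the weights converge to w1 = w2 = 1; the model has, in effect, learned the addition function from four examples.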
Functions • Input → Output • rock → rock • sing → sing • alqz → alqz • dark → dark • lamb → lamb
Functions • Input → Output • 0, 0 → 0 • 1, 0 → 0 • 0, 1 → 0 • 1, 1 → 1
Functions • Input → Output • look → looked • rake → raked • sing → sang • go → went • want → wanted
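The past-tense table above is the function at issue. A hedged sketch (my own toy, not any published model) of the rule-plus-memory view: irregulars are stored pairs in associative memory; anything else, including a novel stem, goes through the default rule. The spelling adjustment for e-final stems is my simplification.

```python
# Dual-route sketch: stored exceptions first, default rule otherwise.
IRREGULARS = {"sing": "sang", "go": "went", "find": "found"}

def past_tense(stem):
    if stem in IRREGULARS:      # associative memory: stored exception wins
        return IRREGULARS[stem]
    if stem.endswith("e"):      # spelling: rake -> raked, not rakeed
        return stem + "d"
    return stem + "ed"          # default rule: applies to any variable V

for v in ["look", "rake", "sing", "go", "want", "blick"]:
    print(v, "->", past_tense(v))
```

Note that the rule clause never inspects the stem's sounds: it applies to the variable V as such, which is exactly the property at stake in the debate.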
Functions • Input → Output • John left → 1 • Wallace fed Gromit → 1 • Fed Wallace Gromit → 0 • Who do you like Mary and? → 0
Learning Functions • Learners are shown examples of what the function generates and have to figure out what the function is. • Think of language/grammar as a very big function (or set of functions). The learning task is similar: the learner is presented with examples of what the function generates and has to figure out what the system is. • Main question in language acquisition: what does the learner need to know in order to successfully figure out what this function is? • Questions about Neural Networks • How can a network represent a function? • How can the network discover what this function is?
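On the first question, a minimal illustration (my sketch): a single threshold unit with fixed weights represents one of the function tables above, reading the binary table as logical AND. The weights and bias are assumptions chosen by hand, not learned.

```python
# One threshold unit: weighted sum plus bias, passed through a step function.
def unit(inputs, weights, bias):
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s > 0 else 0

# With weights (1, 1) and bias -1.5, the unit fires only when both inputs do.
def AND(a, b):
    return unit((a, b), weights=(1.0, 1.0), bias=-1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b))
```

Representing a function is thus just a matter of setting weights; the harder question, taken up below, is whether weight-setting by gradual learning generalizes the way a rule over variables does.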
What is not at stake here • Feedback, negative evidence, etc.
Who has the most at stake here? • Those who deny the need for rules/variables in language have the most to lose here…if the English past tense is hard, just wait until you get to the rest of natural language! • …but if they are successful, they bring with them a simple and attractive learning theory, and mechanisms that can readily be grounded at the neural level • However, if the advocates of rules/variables succeed here or elsewhere, they face the more difficult challenge at the neuroscientific level
Pinker & Ullman
Beyond Sound Similarity • Regulars and Associative Memory • 1. Are regulars different? 2. Do regulars implicate operations over variables? • Neuropsychological Dissociations • Other Domains of Morphology
Beyond Sound Similarity • Zero-derived denominals are regular: Soldiers ringed the city / *Soldiers rang the city; high-sticked, grandstanded, … / *high-stuck, *grandstood, … • Productive in adults & children • Shows sensitivity to morphological structure: [[stem N] ø V]-ed • Provides good evidence that sound similarity is not everything • But nothing prevents a model from using a richer similarity metric: morphological structure (for ringed), semantic similarity (for low-lifes)
Beyond Sound Similarity • Regulars and Associative Memory • 1. Are regulars different? 2. Do regulars implicate operations over variables? • Neuropsychological Dissociations • Other Domains of Morphology
Two types of arguments • Storage of regulars • Default forms
Regulars & Associative Memory • Regulars are productive, need not be stored • Irregulars are not productive, must be stored • But are regulars immune to effects of associative memory? frequency, over-irregularization • Pinker & Ullman: regulars may be stored, but they can also be generated on the fly • a 'race' can determine which of the two routes wins • some tasks are more likely to show effects of stored regulars
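A toy sketch of the 'race' idea (the latencies, frequencies, and retrieval-time formula are my assumptions, not Pinker & Ullman's): both routes run in parallel, and whichever finishes first determines the output, so frequency effects can surface even for forms the rule could generate.

```python
RULE_TIME = 100.0                            # assumed fixed cost of V + -ed

# hypothetical stored past forms with token frequencies
MEMORY = {"go": ("went", 5000), "slay": ("slew", 2)}

def retrieval_time(freq):
    return 400.0 / (1 + freq / 1000.0)       # assumed: frequent forms retrieve faster

def produce(stem):
    candidates = [(stem + "ed", RULE_TIME)]  # rule route is always available
    if stem in MEMORY:
        form, freq = MEMORY[stem]
        candidates.append((form, retrieval_time(freq)))
    return min(candidates, key=lambda c: c[1])[0]   # faster route wins

print(produce("go"))    # frequent irregular: retrieval wins -> went
print(produce("slay"))  # rare irregular: rule wins -> slayed (over-regularization)
print(produce("jump"))  # nothing stored: rule route -> jumped
```

The same mechanism makes a prediction about errors: when retrieval of a rare irregular is too slow, the rule route wins and produces an over-regularized form.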
[Figure: past-tense frequency effects in English, comparing singular-frequency-matched and base-frequency-matched conditions]
Child vs. Adult Impairments • Specific Language Impairment: early claims that regulars show greater impairment than irregulars are not confirmed • Pinker & Ullman 2002b: 'The best explanation is that language-impaired people are indeed impaired with rules, […] but can memorize common regular forms.' • Regulars show consistent frequency effects in SLI, not in controls • 'This suggests that children growing up with a grammatical deficit are better at compensating for it via memorization than are adults who acquired their deficit later in life.'
Low-Frequency Defaults • German Plurals: die Straße → die Straßen; die Frau → die Frauen; der Apfel → die Äpfel; die Mutter → die Mütter; das Auto → die Autos; der Park → die Parks; die Schmidts • -s plural: low frequency, used for loan-words, denominals, names, etc. • Response frequency is not the critical factor in a system that focuses on similarity • distribution in the similarity space is crucial • similarity space with islands of reliability • network can learn islands, or network can learn to associate a form with the space between the islands
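A toy sketch (classes, items, and threshold are my assumptions) of the 'islands of reliability' point: a similarity-based learner copies the plural class of its nearest stored neighbor, and treats -s as the form associated with the space between the islands, i.e. the fallback when nothing is similar enough. String similarity here is just difflib's ratio, standing in for a real phonological similarity space.

```python
from difflib import SequenceMatcher

# hypothetical stored exemplars: noun -> plural class
KNOWN = {"Frau": "-en", "Straße": "-n", "Apfel": "umlaut", "Auto": "-s"}

def plural_class(noun, threshold=0.5):
    best, score = None, 0.0
    for stored, cls in KNOWN.items():
        s = SequenceMatcher(None, noun.lower(), stored.lower()).ratio()
        if s > score:
            best, score = cls, s
    # default applies exactly where no island of reliability is close:
    return best if score >= threshold else "-s"

print(plural_class("Pflau"))   # near the Frau island -> -en
print(plural_class("Xylq"))    # far from every island -> -s
```

This is why response frequency alone is not the issue: -s wins for loan-words and names not because it is frequent, but because those items fall outside every island.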
Beyond Sound Similarity • Regulars and Associative Memory • 1. Are regulars different? 2. Do regulars implicate operations over variables? • Neuropsychological Dissociations • Other Domains of Morphology
Neuropsychological Dissociations • Ullman et al. 1997 • Alzheimer's disease patients: poor memory retrieval; poor irregulars, good regulars • Parkinson's disease patients: impaired motor control, good memory; good irregulars, poor regulars • Striking correlation involving laterality of effect • Marslen-Wilson & Tyler 1997 • Normals: past tense primes stem • 2 Broca's patients: irregulars prime stems; inhibition for regulars • 1 patient with a bilateral lesion: regulars prime stems; no priming for irregulars or semantic associates
Morphological Priming • Lexical Decision Task: CAT, TAC, BIR, LGU, DOG; press 'Yes' if this is a word • Priming: facilitation in decision times when a related word precedes the target (relative to an unrelated control), e.g., {dog, rug} - cat • Marslen-Wilson & Tyler 1997 • Regular: {jumped, locked} - jump • Irregular: {found, shows} - find • Semantic: {swan, hay} - goose • Sound: {gravy, sherry} - grave
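For concreteness, a sketch (all reaction times invented) of how facilitation is scored in this design: the priming effect for a condition is the mean decision time after the unrelated control prime minus the mean decision time after the related prime.

```python
# hypothetical mean lexical-decision times (ms) on the target, by prime type
rt = {
    ("regular",  "related"): 540, ("regular",  "control"): 580,
    ("semantic", "related"): 560, ("semantic", "control"): 585,
}

def facilitation(condition):
    # positive values mean the related prime sped up the decision
    return rt[(condition, "control")] - rt[(condition, "related")]

print(facilitation("regular"))   # e.g., 'jumped' speeding up 'jump'
print(facilitation("semantic"))  # e.g., 'swan' speeding up 'goose'
```

The theoretical question is then whether regular, irregular, semantic, and sound conditions pattern together or dissociate across patient groups.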
Neuropsychological Dissociations • Bird et al. 2003: complain that arguments for selective difficulty with regulars are confounded with the phonological complexity of the word-endings • Pinker & Ullman 2002: weight of evidence still supports the dissociation; Bird et al.'s materials contained additional confounds
Brain Imaging Studies • Jaeger et al. 1996, Language: PET study of past tense • Task: generate past from stem • Design: blocked conditions • Result: different areas of activation for regulars and irregulars • Is this evidence decisive? task demands very different; difference could show up in a network; doesn't implicate variables • Münte et al. 1997: ERP study of violations • Task: sentence reading • Design: mixed • Result: regulars ~LAN, irregulars ~N400 • Is this evidence decisive? allows possibility of comparison with other violations
[Figure: PET activation for Regular, Irregular, and Nonce past-tense generation (Jaeger et al. 1996)]
Beyond Sound Similarity • Regulars and Associative Memory • 1. Are regulars different? 2. Do regulars implicate operations over variables? • Neuropsychological Dissociations • Other Domains of Morphology
Abstraction • Phonological categories, e.g., /b/ • Treating different sounds as equivalent • Failure to discriminate members of the same category • Treating minimally different words as the same • Efficient memory encoding • Morphological concatenation, e.g., V + ed • Productivity: generalization to novel words, novel sounds • Frequency-insensitivity in memory encoding • Association with other aspects of ‘procedural memory’
Generalization • Training Items • Input: 1 0 1 0 Output: 1 0 1 0 • Input: 0 1 0 0 Output: 0 1 0 0 • Input: 1 1 1 0 Output: 1 1 1 0 • Input: 0 0 0 0 Output: 0 0 0 0 • Test Item • Input: 1 1 1 1 Output: ? ? ? ? • 1 1 1 1 (Humans) • 1 1 1 0 (Network)
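The human/network split above can be reproduced with a deliberately simple model (my sketch: one independent perceptron per output position, not the Rumelhart-McClelland architecture). Output unit 4 never sees a 1 during training, so it learns to stay off, and the trained network maps 1 1 1 1 to 1 1 1 0; a learner with a variable-based rule ("copy each bit") answers 1 1 1 1.

```python
# Four training pairs for the identity function; the last bit is always 0.
train = [
    ([1, 0, 1, 0], [1, 0, 1, 0]),
    ([0, 1, 0, 0], [0, 1, 0, 0]),
    ([1, 1, 1, 0], [1, 1, 1, 0]),
    ([0, 0, 0, 0], [0, 0, 0, 0]),
]

# one threshold unit (weights + bias) per output position
W = [[0.0] * 4 for _ in range(4)]
B = [0.0] * 4

def out(x, j):
    return 1 if sum(W[j][i] * x[i] for i in range(4)) + B[j] > 0 else 0

for _ in range(50):                  # perceptron learning rule
    for x, t in train:
        for j in range(4):
            err = t[j] - out(x, j)
            for i in range(4):
                W[j][i] += 0.1 * err * x[i]
            B[j] += 0.1 * err

print([out([1, 1, 1, 1], j) for j in range(4)])  # -> [1, 1, 1, 0]
```

Each unit generalizes perfectly over the inputs it was trained on; the failure is confined to the unit whose training targets never varied, which is exactly why the test item is diagnostic of learning over variables.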