Parsers and Grammars Colin Phillips
Outline • The Standard History of Psycholinguistics • Parsing and rewrite rules • Initial optimism • Disappointment and the DTC • Emergence of independent psycholinguistics • Reevaluating relations between competence and performance systems
Standard View • Arithmetic problems such as 324 + 697 = ? and 217 × 32 = ? are solved by specialized algorithms • Beneath those algorithms lies something deeper
Standard View • By analogy, understanding and speaking are specialized algorithms: well-adapted to real-time operation, but maybe inaccurate • Beneath them lies grammatical knowledge (competence), a recursive characterization of well-formed expressions for the language: precise, but ill-adapted to real-time operation
Grammatical Knowledge • How is grammatical knowledge accessed in syntactic computation for... (a) grammaticality judgment, (b) understanding, (c) speaking? • Almost no proposals under standard view • This presents a serious obstacle to unification at the level of syntactic computation
Townsend & Bever (2001, ch. 2) • “Linguists made a firm point of insisting that, at most, a grammar was a model of competence - that is, what the speaker knows. This was contrasted with effects of performance, actual systems of language behaviors such as speaking and understanding. Part of the motive for this distinction was the observation that sentences can be intuitively ‘grammatical’ while being difficult to understand, and conversely.”
Townsend & Bever (2001, ch. 2) • “…Despite this distinction the syntactic model had great appeal as a model of the processes we carry out when we talk and listen. It was tempting to postulate that the theory of what we know is a theory of what we do, thus answering two questions simultaneously. 1. What do we know when we know a language? 2. What do we do when we use what we know?”
Townsend & Bever (2001, ch. 2) • “…It was assumed that this knowledge is linked to behavior in such a way that every syntactic operation corresponds to a psychological process. The hypothesis linking language behavior and knowledge was that they are identical.”
Miller (1962)
1. Mary hit Mark. (K, kernel)
2. Mary did not hit Mark. (N)
3. Mark was hit by Mary. (P)
4. Did Mary hit Mark? (Q)
5. Mark was not hit by Mary. (NP)
6. Didn't Mary hit Mark? (NQ)
7. Was Mark hit by Mary? (PQ)
8. Wasn't Mark hit by Mary? (PNQ)
Miller (1962) Transformational Cube
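The cube can be read as follows (my own gloss, not stated on the slide): each of the eight sentence types above is a setting of three binary features, Negative, Passive, and Question, so the types sit at the corners of a cube, and the distance between two types is the number of features on which they differ. A minimal sketch in Python, using the sentences from the previous slide:

```python
# Miller's "transformational cube", read as three binary features:
# (Negative, Passive, Question). Distance on the cube = number of
# transformations separating two sentence types (Hamming distance).

SENTENCES = {
    (0, 0, 0): "Mary hit Mark.",              # K (kernel)
    (1, 0, 0): "Mary did not hit Mark.",      # N
    (0, 1, 0): "Mark was hit by Mary.",       # P
    (0, 0, 1): "Did Mary hit Mark?",          # Q
    (1, 1, 0): "Mark was not hit by Mary.",   # NP
    (1, 0, 1): "Didn't Mary hit Mark?",       # NQ
    (0, 1, 1): "Was Mark hit by Mary?",       # PQ
    (1, 1, 1): "Wasn't Mark hit by Mary?",    # PNQ
}

def cube_distance(a, b):
    """Number of transformations separating two corners of the cube."""
    return sum(x != y for x, y in zip(a, b))

# Miller & McKean (1964): the time to get from one variant to another
# was reported to track this distance.
print(SENTENCES[(1, 1, 0)], "is", cube_distance((0, 0, 0), (1, 1, 0)), "steps from the kernel")
print(SENTENCES[(0, 1, 1)], "is", cube_distance((0, 1, 0), (0, 1, 1)), "step from the passive")
```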
Townsend & Bever (2001, ch. 2) • “The initial results were breathtaking. The amount of time it takes to produce a sentence, given another variant of it, is a function of the distance between them on the sentence cube. (Miller & McKean 1964).”“…It is hard to convey how exciting these developments were. It appeared that there was to be a continuing direct connection between linguistic and psychological research. […] The golden age had arrived.”
Townsend & Bever (2001, ch. 2) • “Alas, it soon became clear that either the linking hypothesis was wrong, or the grammar was wrong, or both.”
Townsend & Bever (2001, ch. 2) • “The moral of this experience is clear. Cognitive science made progress by separating the question of what people understand and say from how they understand and say it. The straightforward attempt to use the grammatical model directly as a processing model failed. The question of what humans know about language is not only distinct from how children learn it, it is distinct from how adults use it.”
A Simple Derivation • Starting axiom: S • Rewrite rules: 1. S → NP VP, 2. VP → V NP, 3. NP → D N, 4. N → Bill, 5. V → hit, 6. D → the, 7. N → ball • Applying the rules top-down expands S step by step into the full tree for “Bill hit the ball”: [S [NP Bill] [VP [V hit] [NP [D the] [N ball]]]]
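A minimal sketch of this derivation as code. The rules are the toy grammar from the slide, but note that the slide's rule list has no rule that lets a bare noun like “Bill” serve as an NP on its own, so the NP → N rule below is my own addition to make the derivation go through:

```python
# Top-down derivation with rewrite rules: repeatedly rewrite the leftmost
# nonterminal. The rule NP -> N is an assumed addition (not on the slide).

RULES = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "NP": [["D", "N"], ["N"]],      # NP -> N: my own assumed extra rule
    "N":  [["Bill"], ["ball"]],
    "V":  [["hit"]],
    "D":  [["the"]],
}
NONTERMINALS = set(RULES)

def leftmost_derivation(choices):
    """Rewrite the leftmost nonterminal at each step, following `choices`
    (a list of indices into the chosen nonterminal's rule list)."""
    string = ["S"]
    steps = [list(string)]
    for choice in choices:
        i = next(k for k, sym in enumerate(string) if sym in NONTERMINALS)
        string[i:i + 1] = RULES[string[i]][choice]
        steps.append(list(string))
    return steps

# Derive "Bill hit the ball": S => NP VP => N VP => Bill VP => ...
for step in leftmost_derivation([0, 1, 0, 0, 0, 0, 0, 1]):
    print(" ".join(step))
```

Running this prints each line of the leftmost derivation, ending with the terminal string “Bill hit the ball”.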
A Simple Derivation (rebuilt incrementally) • The same structure assembled word by word, as the input is encountered left to right: Bill is labeled NP; hit is labeled V; the is labeled D; ball is labeled N; D and N then combine into NP, V and NP into VP, and finally NP and VP into the complete S over “Bill hit the ball”
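The word-by-word build-up on these slides can be sketched as a naive shift-reduce parser: shift the next word onto a stack, then reduce whenever the top of the stack matches the right-hand side of a rule. This is my own illustration, not a parsing model proposed in the talk, and it again assumes the extra NP → N rule:

```python
# Naive shift-reduce sketch of the incremental build-up of "Bill hit the ball".
# NP -> N is an assumed addition; rule order matters for the greedy reductions.

RULES = [
    ("NP", ("D", "N")),
    ("NP", ("N",)),       # assumed extra rule for the bare-noun subject
    ("VP", ("V", "NP")),
    ("S",  ("NP", "VP")),
    ("N",  ("Bill",)),
    ("N",  ("ball",)),
    ("V",  ("hit",)),
    ("D",  ("the",)),
]

def parse(words):
    stack = []
    for word in words:
        stack.append(word)                      # shift the next word
        reduced = True
        while reduced:                          # reduce as long as possible
            reduced = False
            for lhs, rhs in RULES:
                if tuple(stack[-len(rhs):]) == rhs:
                    stack[-len(rhs):] = [lhs]   # replace the RHS with its LHS
                    reduced = True
                    break
        print(word.ljust(5), stack)
    return stack

parse("Bill hit the ball".split())
```

Each printed line corresponds to one frame of the slide: after “Bill” the stack is just [NP], and VP and S only appear once the final word has been read, which is the delay problem raised at the end of this section.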
Transformations • wh-movement: SD: X wh-NP Y (1 2 3) → SC: 2 1 0 3 (the wh-phrase is fronted, leaving a null, 0, in its original position)
Transformations • VP-ellipsis: SD: X VP1 Y VP2 Z (1 2 3 4 5) → SC: 1 2 3 0 5, condition: VP1 = VP2 (the second VP is deleted under identity with the first, leaving a null)
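A small sketch of what it means to apply rules stated this way: the input is factored according to the structural description and the numbered factors are rearranged or deleted, with 0 (here None) marking the null left behind. The factorizations are stipulated by hand and the example strings are mine, not from the slides:

```python
# Transformations as structural description / structural change rules.

def wh_movement(x, wh_np, y):
    """SD: X  wh-NP  Y (1 2 3)   SC: 2 1 0 3 (0 = null left in place of the wh-phrase)."""
    return [wh_np, x, None, y]

def vp_ellipsis(x, vp1, y, vp2, z):
    """SD: X VP1 Y VP2 Z (1 2 3 4 5)   SC: 1 2 3 0 5, condition: VP1 = VP2."""
    assert vp1 == vp2, "ellipsis only applies under identity of the two VPs"
    return [x, vp1, y, None, z]

print(wh_movement("you said Mary hit", "which boy", ""))
# -> ['which boy', 'you said Mary hit', None, '']
print(vp_ellipsis("Mary will", "hit the ball", "and Bill will", "hit the ball", "too"))
# -> ['Mary will', 'hit the ball', 'and Bill will', None, 'too']
```

Recognizing the output of such rules during comprehension is exactly the problem flagged on the next slide: the null is not visible in the surface string.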
Difficulties • How to build structure incrementally in right-branching structures • How to recognize output of transformations that create nulls
Summary • Running the grammar ‘backwards’ is not so straightforward: problems of indeterminacy and incrementality • Disappointment in empirical tests of the Derivational Theory of Complexity (DTC) • Unable to account for processing of local ambiguities
Standard View • Understanding and speaking are specialized algorithms • Beneath them lies grammatical knowledge (competence), a recursive characterization of well-formed expressions for the language
Grammatical Knowledge • How is grammatical knowledge accessed in syntactic computation for... (a) grammaticality judgment, (b) understanding, (c) speaking? • Almost no proposals under standard view • This presents a serious obstacle to unification at the level of syntactic computation
Arguments for Architecture 1. Available grammars don’t make good parsing devices 2. Grammaticality ≠ Parsability 3. Failure of DTC 4. Evidence for parser-specific structure 5. Parsing/production have distinct properties 6. Possibility of independent damage to parsing/production 7. Competence/performance distinction is necessary, right?
Grammar as Parser - Problems • Incremental structure building with PS rules (e.g. S → NP VP) forces either delay or prediction/guessing • Indeterminacy: how to recover nulls created by transformations (see the sketch below)
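As a rough illustration of the indeterminacy point (the sketch mentioned above; the example sentence is mine): running wh-movement ‘backwards’ means the parser must consider every position in the surface string as a candidate site for the null, since nothing in the string marks where the wh-phrase came from.

```python
# Undoing wh-movement without knowing where the null was left: every position
# after the fronted phrase is a candidate gap site the parser must consider.

def candidate_gap_sites(words):
    """Return every way of putting the fronted wh-word back into the string."""
    fronted, rest = words[0], words[1:]
    return [rest[:i] + [fronted] + rest[i:] for i in range(len(rest) + 1)]

for candidate in candidate_gap_sites("who did Mary say Bill hit".split()):
    print(" ".join(candidate))
# prints six candidates (including implausible ones),
# from "who did Mary say Bill hit" up to "did Mary say Bill hit who"
```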