The Role of Background Knowledge in Sentence Processing Raluca Budiu, July 9, 2001 Thesis Committee: John Anderson (Chair), Jaime Carbonell, David Plaut, Lynne Reder (Department of Psychology)
Ambiguity of Language • My mouse behaves erratically lately. -- From an e-mail to CS facilities • Could you pass me the salt? • That's the sun of the egg. -- Child speaking about the yolk of a fried egg
Language and Noise • Communication channels are noisy • People make mistakes • We understand how unfair the death penalty is. -- George W. Bush, speaking of the death tax • Listeners ignore semantic inconsistencies • When an aircraft crashes, where should the survivors be buried?
Insights from this Research • Flexibility: stretching words’ meanings • Reliability: ignoring noise and semantic inconsistencies
The Sentence-Processing Model [diagram: sentence + prior knowledge -> model -> sentence interpretation] Prior-knowledge examples: Noah took two animals of each kind on the ark; Napoleon was defeated at Waterloo in 1815; Plato was Socrates' student
Main Contribution A model of language comprehension that: • Offers a unified explanation of several complex linguistic phenomena • Is incremental (on-line) • Is as fast as humans • Uses prior knowledge and sentence context to understand vague words • Is based on the ACT-R theory (Anderson & Lebiere, 1998)
ACT-R • A cognitive architecture based on production systems • A rigorous framework for building, running and testing computational models • Based on verified assumptions about human cognition (e.g., memory properties, attention) • Produces quantitative predictions about human behavior (e.g., accuracy and latency in a task)
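To make ACT-R's quantitative machinery concrete, here is a minimal sketch of its standard activation and retrieval-latency equations (Anderson & Lebiere, 1998); the data structures and parameter values below are illustrative, not the ones used in the thesis model.

```python
import math

def activation(base_level, source_weights, assoc_strengths):
    """ACT-R activation: A_i = B_i + sum_j W_j * S_ji.

    base_level:      B_i, long-term activation reflecting past use of the chunk
    source_weights:  W_j, attentional weighting of each source of activation
    assoc_strengths: S_ji, associative strength from each source to the chunk
    """
    return base_level + sum(w * s for w, s in zip(source_weights, assoc_strengths))

def retrieval_time(a, latency_factor=1.0):
    """Retrieval latency falls exponentially with activation: T_i = F * e^(-A_i)."""
    return latency_factor * math.exp(-a)

# e.g., a chunk with base level 0.5 receiving activation from two attended sources
print(retrieval_time(activation(0.5, [0.5, 0.5], [1.2, 0.3])))
```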
Research Methodology [flowchart: an experiment is run both on a computational ACT-R model and on human subjects; the model's quantitative predictions and the subjects' quantitative measures are compared; if they do not match, the model is revised and the cycle repeats]
Evaluation of the Model The model • Can comprehend • Literal or metaphoric, distorted or undistorted sentences • Isolated or in-discourse sentences • Can explain patterns of text recall • Compares well with people on psycholinguistic experiments • Is fast, accurate, and scalable
Outline • Introduction • The sentence-processing model • Evaluation • Comprehension of sentences in discourse • Evaluation • Scalability • Future work and conclusions
The Sentence-Processing Model [diagram: input sentence (words + thematic roles) + background knowledge -> model -> sentence interpretation] Background-knowledge examples: Noah took two animals of each kind on the ark; Napoleon was defeated at Waterloo in 1815; Plato was Socrates' student
Propositional Representation [diagram: Noah took the animals on the ark is encoded as a proposition (Ark Prop) with role slots -- verb: take, agent: Noah, patient: animals, place-oblique: ark; each slot is stored as a parent/child/type element, e.g., parent: Ark Prop, child: animals, type: patient]
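As a concrete illustration of this encoding, here is a minimal sketch with one memory element per thematic-role slot; the class and field names are made up for illustration, not the ACT-R chunk types used in the thesis.

```python
from dataclasses import dataclass

@dataclass
class Slot:
    parent: str  # the proposition this slot belongs to, e.g., "Ark-Prop"
    child: str   # the filler, e.g., "animals"
    role: str    # the thematic role, e.g., "patient"

# "Noah took the animals on the ark" as a proposition with four role slots
ark_prop = [
    Slot("Ark-Prop", "take",    "verb"),
    Slot("Ark-Prop", "Noah",    "agent"),
    Slot("Ark-Prop", "animals", "patient"),
    Slot("Ark-Prop", "ark",     "place-oblique"),
]
```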
Associations & Activation [diagram: input words such as Noah and take spread activation through associations to stored propositions (Noah is Lamech's son; Napoleon was defeated at Waterloo; Noah took the animals on the ark) and to related concepts (Patriarch, Moses, Napoleon)]
Search [diagram: after reading Noah, the most active proposition, Noah is Lamech's son, is retrieved as a candidate interpretation]
Match [diagram: the candidate Noah is Lamech's son matches the sentence read so far, Noah]
Match [diagram: the next word, took, fails to match is in Noah is Lamech's son, so the candidate is rejected]
Search [diagram: a new search retrieves Noah took the animals on the ark, now the most active proposition]
Final Interpretation [diagram: Noah took the animals on the ark is accepted as the interpretation of the sentence]
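The search step in the walkthrough above can be sketched schematically as follows: the words read so far spread activation to stored propositions, and the most active proposition is retrieved as the candidate interpretation. The association strengths below are toy values, not the ACT-R quantities used in the thesis.

```python
def candidate_interpretation(words_so_far, propositions, assoc):
    """Return the proposition receiving the most activation from the words read."""
    def total_activation(prop):
        return sum(assoc.get((word, prop), 0.0) for word in words_so_far)
    return max(propositions, key=total_activation)

propositions = ["Noah is Lamech's son", "Noah took the animals on the ark"]
assoc = {
    ("Noah", "Noah is Lamech's son"): 1.0,
    ("Noah", "Noah took the animals on the ark"): 1.0,
    ("took", "Noah took the animals on the ark"): 1.0,
}
# After "Noah" alone either proposition may win the tie; after "Noah took"
# the ark proposition receives more activation and is retrieved.
print(candidate_interpretation(["Noah"], propositions, assoc))
print(candidate_interpretation(["Noah", "took"], propositions, assoc))
```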
Failures of Comprehension [diagram: for Noah offered burnt offerings on the altar, the retrieved candidate is the Lamech proposition (Noah is Lamech's son); the word offered (role: verb) does not match it, a bug is recorded, and no interpretation is found]
Summary of the Model [flowchart: read word -> search -> match? -- yes: read the next word; no: record a bug and continue; at the end of the sentence, integration; if no interpretation results, the sentence ends with a bug]
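Here is a self-contained sketch of the read/search/match/bug loop in the flowchart above. Propositions are simplified to role-to-filler dictionaries and the search step simply takes the highest-scoring proposition; both are assumptions made for illustration, not the thesis's ACT-R productions.

```python
def comprehend(sentence, propositions, score):
    """Return (interpretation, bug) for a sentence of (word, role) pairs."""
    read, bug, candidate = [], False, None
    for word, role in sentence:                    # Read word
        read.append((word, role))
        candidate = max(propositions, key=score)   # Search
        if candidate.get(role) != word:            # Match?
            bug = True                             # mismatch -> record a bug
    # End of sentence: integration succeeds only if the candidate covers
    # everything that was read; otherwise there is no interpretation.
    fits = candidate is not None and all(candidate.get(r) == w for w, r in read)
    return (candidate if fits else None), bug
```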
Answering True/False Queries • False = a bug OR no final interpretation found • True = no bug AND final interpretation found
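The slide's rule for answering queries can be written directly in terms of the loop's two outputs: a probe is judged true only when a final interpretation was found and no bug was recorded along the way.

```python
def answer_query(interpretation, bug):
    """True only if a final interpretation exists and no bug was recorded."""
    return interpretation is not None and not bug
```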
Outline • Introduction • The sentence-processing model • Empirical evaluation • Moses illusion • Metaphor-position effects • Comprehension of sentences in discourse • Evaluation • Scalability • Future work and conclusions
Moses Illusion • How many animals of each kind did Moses take on the ark? • Good vs. bad distortions • How many animals of each kind did Adam take on the ark?
Moses-Illusion Data • Illusion rates for good and bad distortions (Ayers, Reder & Anderson, 1996) • Percent correct distortions in the gist task (Ayers et al., 1996) • Reading times in the literal and gist task (Reder & Kusbit, 1991)
Simulation of Moses Illusion [diagram: for How many animals did Moses take on the ark, the question is matched against the ark proposition (verb: take, agent: Noah, patient: animals, place-oblique: ark); Moses is associated with Noah, so the partial match succeeds and the illusion arises; for the bad distortion Adam, the closest candidate (a Zoo proposition) fails to match, so a bug is recorded and no interpretation is found]
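The mechanism behind the simulated illusion is similarity-based partial matching: a distorted word can still fill a proposition slot if it is similar enough to the original filler. The sketch below is a toy version of that idea; the similarity values and the threshold are invented for illustration only.

```python
similarity = {
    ("Noah", "Noah"): 1.0,
    ("Moses", "Noah"): 0.8,   # good distortion: semantically close to Noah
    ("Adam", "Noah"): 0.3,    # bad distortion: much less similar
}

def slot_matches(word, filler, threshold=0.5):
    """A slot matches if the word is identical or similar enough to the filler."""
    return similarity.get((word, filler), 0.0) >= threshold

print(slot_matches("Moses", "Noah"))  # True  -> the distortion goes unnoticed
print(slot_matches("Adam", "Noah"))   # False -> a mismatch (bug) is detected
```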
Metaphor Comprehension • Effects of position on metaphor understanding (Gerrig & Healy, 1983) • Metaphor-familiarity effects (Budiu & Anderson, 1999) • Understanding metaphoric/literal sentences in context (Budiu & Anderson, 2000)
Metaphor Position [data: comprehension times for the metaphor placed at the beginning vs. the end of the sentence, Gerrig & Healy (1983) materials] Drops of molten silver filled the night sky. -- Humans: 4.21 s, Model: 4.30 s; The night sky was filled with drops of molten silver. -- Humans: 3.53 s, Model: 3.68 s
Outline • Introduction • The sentence-processing model • Evaluation • Comprehension of sentences in discourse • Evaluation • Scalability • Future work and conclusions
Sentences in Discourse Create background knowledge from discourse propositions. King Lear's story: King Lear decided to divide his kingdom; King Lear had three daughters; Goneril and Regan declare their grand love; …; Cordelia refuses to make an insincere speech; Cordelia is disinherited; Cordelia marries the king of France
Novel Sentences Use a partially matching interpretation to relate the sentence to the discourse [diagram: for Cordelia marries the king of France, no stored proposition matches fully; the word married produces a bug, and at the end of the sentence the partially matching discourse proposition (Prop 5, Cordelia is disinherited) is used in bug-based integration]
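A rough sketch of this bug-based integration: when no stored proposition matches fully, relate the novel sentence to the discourse proposition that overlaps with it most. Simple word overlap is used here as a stand-in for the model's similarity-based partial matching.

```python
def relate_to_discourse(sentence_words, discourse_props):
    """Return the discourse proposition that best (if only partially) matches."""
    return max(discourse_props,
               key=lambda prop: len(set(sentence_words) & set(prop)))

discourse = [
    ["King", "Lear", "decided", "to", "divide", "his", "kingdom"],
    ["Cordelia", "is", "disinherited"],
]
# The novel sentence shares "Cordelia" with the disinheritance proposition,
# so that proposition is returned as the one the sentence relates to.
print(relate_to_discourse(["Cordelia", "marries", "the", "king", "of", "France"],
                          discourse))
```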
Outline • Introduction • The sentence-processing model • Evaluation • Comprehension of sentences in discourse • Evaluation • Scalability • Future work and conclusions
Our Experiments [flattened table: prior metaphor-in-discourse experiments and their metaphor vs. literal reading-time results -- Ortony et al., 1978, Inhoff et al., 1984, Shinjo & Myers, 1987, Keysar, 1990: same; Gibbs, 1990, Onishi & Murphy, 1993: slower; the comprehension tested ranges from shallow to deep; our experiments test deep comprehension of novel sentences by answering true/false]
Metaphoric Sentences in Context Context: During history seminars, a massive young man always yawned and never paid any attention to the discussions. He was a very good linebacker who had been all-state in football. The seminar always came after his training sessions, so he was very tired. Test sentences (true/false task or read-as-new task): The bear yawned in class; The bear slept quietly; The athlete yawned in class; The athlete slept quietly
Metaphoric Sentences in Context [diagram: model processing of the test sentences -- for the metaphoric sentences (The bear yawned in class; The bear slept quietly), the word bear produces a bug that is reevaluated through bug-based integration; for the literal sentences, The athlete yawned in class yields an interpretation while The athlete slept quietly yields no interpretation]
Outline • Introduction • The sentence-processing model • Evaluation • Comprehension of sentences in discourse • Evaluation • Scalability • Future work and conclusions
Computational Constraints • Speed • Accuracy • Scalability: word database, sentence database
Scalability Test • 436 noun-verb-noun sentences (Brown corpus via the Penn Treebank project) • 999 distinct words • Each word appears in at most 9 propositions • Associations based on LSA similarity measures (Landauer & Dumais, 1997) • Test for comprehension of a known sentence
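For the scalability test, associations come from LSA similarities rather than hand-set values. Here is a minimal sketch of deriving word-to-proposition association strengths from LSA vectors (Landauer & Dumais, 1997); it assumes a vector is available for every word, and the thresholding and use of the maximum similarity are illustrative choices, not necessarily the thesis's.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two LSA vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def build_associations(lsa_vectors, propositions, threshold=0.2):
    """Map (word, proposition id) pairs to strengths based on LSA similarity."""
    assoc = {}
    for pid, prop_words in propositions.items():
        for word, vec in lsa_vectors.items():
            # strength of association = best similarity to any word in the proposition
            strength = max(cosine(vec, lsa_vectors[w]) for w in prop_words)
            if strength >= threshold:
                assoc[(word, pid)] = strength
    return assoc
```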
Summary • A model of sentence comprehension with a strong associative mechanism that speeds up the search for an interpretation • It offers a unified explanation for a variety of empirical psycholinguistic data • It is scalable • It is implemented in ACT-R
Future Work • Extend the model to other empirical phenomena (e.g., priming, text inference, lexical ambiguity) • Identify the ACT-R assumptions that are fundamental • Eliminate some of the limitations
Conclusions • Context can help the comprehension of metaphoric or semantically flawed sentences • Semantic associations between words are a powerful mechanism that allows fast and flexible comprehension • “Peripheral” language phenomena can shed light on deep cognitive processes
Limitations of the Model • No syntactic processing • Atomic word-phrases (e.g., drops of molten silver) • Rudimentary discourse processing • Cannot account for sentences containing similar words (e.g., George W. Bush is the son of George Bush) • Relationship between discourse and background knowledge • Similarities not derived from human ratings • No thematic-role cues