Announcements • Sample exam questions: this week (Thursday) you will submit your questions to the dropbox • Bring completed homework • For next class: give the sentence-completion survey to friends.
Psy1302 Psychology of Language • Lecture 11 & 12: Sentence Comprehension II
Models of Sentence Processing • Garden-Path Model: autonomous; late closure; minimal attachment • Constraint-Based Model: interactive; lexical biases, referential contexts, and structural biases (cues from multiple sources constrain interpretation)
Traditional Views (contrasting lexical and syntactic ambiguity) • The Constraint-Satisfaction Model SAYS this is not the right characterization! • [Table from the MacDonald, Pearlmutter, & Seidenberg paper]
Experiment to test the 2 models (Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995) • Method: eye-tracking during listening
Setting Up the Experiment: RC • Group 1: Put the frog on the napkin in the box.
Setting Up the Experiment: RC • Group 2: Put the frog that is on the napkin in the box.
Setting Up the Experiment: RC • Which group was garden-pathed? • Group 1: Put the frog on the napkin in the box. • Group 2: Put the frog that is on the napkin in the box.
Setting Up the Experiment: RC • What is a relative clause? A subordinate clause that modifies a noun. • Group 1: Put the frog on the napkin in the box. (REDUCED relative clause, AMBIGUOUS at "on") • Group 2: Put the frog that is on the napkin in the box. (NON-REDUCED relative clause, UNAMBIGUOUS at "on")
Setting Up the Experiment: Garden-Path • Garden-Path Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • The sentence is processed using these 2 simple rules: Late Closure & Minimal Attachment. • Sometimes these simple rules lead one down the incorrect path, and a reanalysis is necessary.
Setting Up the Experiment: Garden-Path • Late Closure: when possible, attach incoming lexical items into the clause or phrase currently being processed (i.e., the lowest possible nonterminal node dominating the last item analyzed). • Minimal Attachment: attach incoming lexical items into the phrase-marker being constructed using the fewest nodes consistent with the well-formedness rules of the language. • [Tree diagram for "put the frog on…": where does the PP headed by "on" attach, the VP or the NP? VP-attachment vs. NP-attachment]
Setting Up the Experiment: Garden-Path • 2 attachments & 2 meanings • VP-attachment: the PP is the destination of "put" ("put the frog on(to) the napkin") • NP-attachment: the PP is a modifier of "frog" ("the frog (that is) on the napkin…") • [Tree diagrams: VP-attachment places the PP directly under the VP (VP → V NP PP); NP-attachment builds the PP into the object NP (NP → NP PP), leaving the destination PP still to come]
Setting Up the Experiment: Garden-Path • Where does "on" attach in "put the frog on…": the VP or the NP? • 1. It CANNOT attach into the NP: that requires building an extra node (NP → NP PP), which violates Minimal Attachment! • 2. Attaching to the VP (VP → V NP PP) violates neither rule. • A decision-procedure sketch follows below.
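To make the two rules concrete, here is a minimal sketch (my own illustration, not part of the lecture) of how Minimal Attachment and Late Closure could be applied as a decision procedure over candidate attachments; the Parse class, node counts, and recency values are invented for the example.

```python
# A minimal sketch of the Garden-Path Model's two rules as a decision
# procedure. All names and numbers here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Parse:
    name: str
    node_count: int          # nodes in the phrase marker (Minimal Attachment)
    attachment_recency: int  # higher = attaches into a more recent phrase (Late Closure)

def choose_parse(candidates: list[Parse]) -> Parse:
    # Minimal Attachment: keep only the parse(s) with the fewest nodes.
    fewest = min(p.node_count for p in candidates)
    minimal = [p for p in candidates if p.node_count == fewest]
    # Late Closure breaks ties: prefer attachment into the phrase
    # currently being processed (the most recent one).
    return max(minimal, key=lambda p: p.attachment_recency)

# "put the frog on...": NP-attachment needs an extra NP node, so
# Minimal Attachment selects VP-attachment, the garden-path parse.
vp = Parse("VP-attachment (destination of 'put')", node_count=7, attachment_recency=1)
np_ = Parse("NP-attachment (modifier of 'frog')", node_count=8, attachment_recency=2)
print(choose_parse([vp, np_]).name)  # VP-attachment (destination of 'put')
```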
Setting Up the Experiment: Garden-Path • Garden-Path Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • Answer: 1. The syntactic processor first VP-attaches "on". 2. When the second preposition "in" (of "in the box") is encountered, the parser does not know how to incorporate the word. 3. Reanalysis is needed due to the incorrect first parse → longer processing time.
Setting Up the Experiment: Garden-Path • Constraint-Satisfaction Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • The Constraint-Satisfaction Model uses information from multiple sources to constrain interpretation; difficulty arises when the lexical and contextual information does not support the initial interpretation, or favors a competing one.
Setting Up the Experiment: Constraint-Satisfaction Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • BIG Q: What kinds of information can be used to constrain interpretation? • Examples: lexical biases, referential context
Setting Up the Experiment: Constraint-Satisfaction Model • Lexical biases: the types of syntactic/semantic environments in which a word appears • Example: "put" almost always appears with a VP-attached PP (destination): "Put the car in the garage" • "choose" rarely does so: "Choose the car in the garage"
Setting Up the Experiment: Constraint-Satisfaction Model • How does the model explain the difficulty of parsing: Put the frog on the napkin in the box. • Lexical biases support VP-attachment: • "put" almost always appears with a VP-attached PP (destination) • "on" is a locative preposition, so "on the napkin" is a location, i.e., compatible with the destination required by "put"
Setting Up the Experiment: Constraint-Satisfaction Model • Referential context • Pick a frog. Which frog did you pick? • Modifiers pick out a member of a set • When there are 2+ referents, modifiers help differentiate the referent in question
Experiment to test the 2 models (Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995) • FINALLY, the experiment!!! • Do we consider the referential context in parsing? • More specifically, WHEN do we consider the referential context in parsing? • http://www.ircs.upenn.edu/Trueswellabs/video.html
Put the frog on the napkin in the box. • Do we consider the referential context in parsing? More specifically, WHEN? • Displays: 1-Referent (1 frog) OR 2-Referents (2 frogs)
Put the frog on the napkin in the box. • Displays: 1-Referent (1 frog) OR 2-Referents (2 frogs) • NAPKIN is a potential destination.
Put the frog on the napkin in the box. • Displays: 1-Referent (1 frog) OR 2-Referents (2 frogs) • BY GARDEN-PATH MODEL: regardless of 1 or 2 referents, during the first pass NAPKIN is considered as the destination.
Put the frog on the napkin in the box. • Displays: 1-Referent (1 frog) OR 2-Referents (2 frogs) • BY CONSTRAINT-SATISFACTION MODEL (which takes referential context into consideration early): • For 1 referent, NAPKIN is considered as the destination • For 2 referents, NAPKIN could potentially be a modifier of FROG, and NOT a destination
Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy (1995) • Method: eye-tracking during listening • AMBIGUOUS SENTENCE HEARD: Put the frog on the napkin… into the box. • UNAMBIGUOUS SENTENCE HEARD: Put the frog that is on the napkin… into the box. • [Each sentence was paired with both 1-referent and 2-referent displays]
[Results figure for Put the frog on the napkin in the box.: looks to the correct vs. incorrect destination, plotted for the reduced relative vs. the unreduced relative ("that is") sentences]
Typical eye movements for the ambiguous sentence • Put the frog on the napkin in the box. • [Scene diagrams for the 1-referent (1 frog) and 2-referent (2 frogs) displays, with fixations numbered 1-4 and fixation sequences labeled A and B, showing the order in which objects are fixated as the sentence unfolds]
Constraint-Satisfaction Model • Highly interactive • Limited parallel processing • If all information sources converge on a single analysis, processing is serial • If they do not, several analyses may be maintained
How are cues combined? (Interactive Activation Unfolding in Time) • An interactive-activation network: cue nodes feed two competing attachment nodes, NP-attached PP vs. VP-attached PP • The cues: verb argument structure (prob. of PP; e.g., put, choose), noun argument structure (prob. of PP; e.g., frog), preposition bias (prob. of NP- vs. VP-attachment; e.g., of, on), and referential context (1 referent or 2 referents) • Selection of VP- vs. NP-attachment for Put the frog on…, as the cues arrive over time: • PUT (V): takes NP and PP arguments → supports VP-attachment • FROG (N): no bias • ON (P): 95% NP-attach, 5% VP-attach → supports NP-attachment • Referential context: a 1-referent display supports VP-attachment (destination reading); a 2-referent display supports NP-attachment (modifier reading) • A cue-combination sketch follows below.
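To give a feel for how such cues might jointly settle the attachment decision, here is a minimal numeric sketch. Only the 95%/5% bias of "on" comes from the slides; the other support values and the combination rule (multiply each attachment's cue support, then normalize) are my assumptions, standing in for the network's continuous activation dynamics.

```python
# A minimal sketch of multiple-constraint combination. The 95%/5% bias of
# "on" follows the slides; all other numbers are illustrative assumptions.

def combine_cues(cues):
    """Each cue is (support_for_VP, support_for_NP); multiply and normalize."""
    vp = np_attach = 1.0
    for support_vp, support_np in cues.values():
        vp *= support_vp
        np_attach *= support_np
    total = vp + np_attach
    return vp / total, np_attach / total

base_cues = {
    "put (verb: wants a PP destination)": (0.90, 0.10),
    "frog (noun: no bias)":               (0.50, 0.50),
    "on (prep: 95% NP-attach)":           (0.05, 0.95),
}

for context, support in [("1-referent", (0.80, 0.20)),
                         ("2-referents", (0.20, 0.80))]:
    cues = dict(base_cues)
    cues["referential context"] = support
    vp, np_attach = combine_cues(cues)
    print(f"{context}: P(VP-attach) = {vp:.2f}, P(NP-attach) = {np_attach:.2f}")
# 1-referent favors VP-attachment (napkin = destination);
# 2-referents favors NP-attachment (napkin = modifier of "frog").
```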
Moving on to Assigned Readings • Garden-Path Model vs. Constraint-Satisfaction Model • Ferreira & Clifton (1986) • Trueswell, Tanenhaus, & Garnsey (1994)
Subtext • These experiments test hypotheses • What was being tested? • What was found? • Multiple experiments • How did each experiment replicate or extend previous findings? • How did each experiment support or refute previous findings?
Outline • Stats Terms Simplified • t-tests • ANOVAs, Main effects and Interactions • Regressions, Correlations • Assigned Papers
T-tests and ANOVAs • t-test: compares 2 means • ANOVA (Analysis of Variance): compares multiple means; yields significance of main effects and interaction effects • A worked sketch follows below.
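For concreteness, here is a small sketch of both tests using SciPy; the group means and sample sizes are invented for the example.

```python
# A quick illustration of a t-test (2 means) vs. a one-way ANOVA (3+ means)
# on made-up rating data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(5.0, 1.0, 30)  # ratings in condition A (invented)
group_b = rng.normal(4.3, 1.0, 30)  # ratings in condition B (invented)
group_c = rng.normal(4.6, 1.0, 30)  # ratings in condition C (invented)

# t-test: compares exactly 2 means
t, p = stats.ttest_ind(group_a, group_b)
print(f"t-test: t = {t:.2f}, p = {p:.4f}")

# one-way ANOVA: compares multiple means at once
f, p = stats.f_oneway(group_a, group_b, group_c)
print(f"ANOVA:  F = {f:.2f}, p = {p:.4f}")
```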
Hypothetical Experiment (example of main effects & interactions) • Dependent measure: number of girlfriends • Independent variables: • Wealth of bachelors according to income (rich, poor) • Looks of the same bachelors according to Oprah (handsome, ugly)
Design: 2 × 2 (wealth × looks) • The number of girlfriends (# of GF) is measured in each of the four cells: rich-handsome, rich-ugly, poor-handsome, poor-ugly
[Four example 2 × 2 plots of #GFs (handsome vs. ugly × rich vs. poor), illustrating different outcome patterns, from main effects only (many vs. few) to an interaction (most girlfriends when rich AND handsome, least when poor AND ugly); a numeric sketch follows below]
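As a numeric illustration, here is a sketch of reading main effects and the interaction off a 2 × 2 table of cell means; the girlfriend counts are invented.

```python
# Reading main effects and an interaction off a 2 x 2 table of cell means.
# The cell means (number of girlfriends) are invented for illustration.
import numpy as np

# rows: rich, poor; columns: handsome, ugly
means = np.array([[9.0, 3.0],   # rich-handsome, rich-ugly
                  [4.0, 2.0]])  # poor-handsome, poor-ugly

main_wealth = means[0].mean() - means[1].mean()       # rich minus poor
main_looks = means[:, 0].mean() - means[:, 1].mean()  # handsome minus ugly
# Interaction: does the effect of looks differ between rich and poor?
interaction = (means[0, 0] - means[0, 1]) - (means[1, 0] - means[1, 1])

print(f"Main effect of wealth: {main_wealth:+.1f} girlfriends")
print(f"Main effect of looks:  {main_looks:+.1f} girlfriends")
print(f"Interaction (looks effect, rich minus poor): {interaction:+.1f}")
```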
Hypothetical Experiment (example of ANOVAs: F1 vs. F2) • Is a female model more attractive in a short or a long skirt? • Model pictured in 10 different short skirts and 10 different long skirts • 30 males rated the model's attractiveness in each skirt (1 = not attractive to 7 = extremely attractive) • F1 tests the effect by subjects (averaging over skirts); F2 tests it by items (averaging over raters) • A sketch follows below.
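Here is a minimal sketch of the F1/F2 distinction on invented data: the same ratings aggregated by subject (F1) or by item (F2) before a paired comparison. With only two conditions, the squared paired t statistic equals the repeated-measures F.

```python
# F1 (by-subjects) vs. F2 (by-items) analyses on invented rating data.
# With 2 conditions, the squared paired t equals the repeated-measures F.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects, n_items = 30, 10          # 30 raters, 10 skirts per length
# ratings[subject, item, condition]; condition 0 = short, 1 = long
ratings = rng.normal(4.0, 1.0, size=(n_subjects, n_items, 2))
ratings[:, :, 0] += 0.5               # assume short skirts rated higher

# F1: average over items, so each SUBJECT contributes one mean per condition
by_subject = ratings.mean(axis=1)
t1, p1 = stats.ttest_rel(by_subject[:, 0], by_subject[:, 1])

# F2: average over subjects, so each ITEM contributes one mean per condition
by_item = ratings.mean(axis=0)
t2, p2 = stats.ttest_rel(by_item[:, 0], by_item[:, 1])

print(f"F1 (by subjects): F = {t1**2:.2f}, p = {p1:.4f}")
print(f"F2 (by items):    F = {t2**2:.2f}, p = {p2:.4f}")
```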