
CS 224U / LINGUIST 288 Natural Language Understanding




Presentation Transcript


  1. CS 224U / LINGUIST 288 Natural Language Understanding • Lecture 9: Syntax-Semantics Interface & Glue Semantics • Feb 7, 2006 • Iddo Lev • CS 224U/LING 188/288 Winter 2006

  2. Meaning Composition • How do the meanings of words combine to create the meanings of larger phrases and sentences?

  3. Meaning Composition • Need to take semantic roles into account • [John]ag admires [Mary]pt ≠ [Mary]ag admires [John]pt • Need to take metaphor/metonymy into account • Acrylic has taken over the art world (= the use of acrylic paint) • The art world has taken over Acrylic (= reserves/supplies of acrylic paint) • But we also need to take sentence meaning into account, to understand the information about the world that is conveyed in propositions:

  4. Local Composition Isn’t Enough… • Need to understand previous propositions. • John is an avid reader. One day on his way to the library he saw a mind readers’ booth and decided to walk over and give them a try. They read him and said that he enjoys reading Stephen King. (= the writings of Stephen King) • John can successfully read people’s minds, and he has many famous clients. One day … reading Stephen King. (= the mind of Stephen King) • Naïve modeling of context as bag-of-words fails.

  5. What is the argument of What? • So we need to combine word meaning to get sentence meaning. • How do you know what is the argument of what? • “Our professional lives are blossoming but our marriage is on the rocks.” • “blossom”[“our professional lives”] ; “is-on-the-rocks”[“our marriage”] • “is-on-the-rocks”[“our professional lives”] ; “blossom”[“our marriage”] • Two very different meanings. • Bag-of-words approach ignores this structure. • Research on metaphor and metonymy tells us what each combination means but not which combination to choose here.

  6. Phrase Structure • Of course, the phrase structure tree gives us part of the answer… [tree diagram: two coordinated S nodes, “Our professional lives are blossoming” and “our marriage is on the rocks”, each with its own VP]

  7. Semantic Structure ≠ Syntactic Structure • But the arguments of an operator are not always dominated by it in syntactic structure: [parse tree for “John did n’t see every show”] • not( every( show, λx.past( see[agent:john, theme:x] ) ) ) • ⇒ simplistic syntax-driven translation won’t work

  8. Semantic Structure ≠ Syntactic Structure • [diagrams: dependency parses vs. semantic dependencies for “John and Mary like each other” and “Bill introduced John and Mary to each other”] • RECIP(john⊕mary, like) • RECIP(john⊕mary, λyλz.introduced(bill,y,z))

  9. “Floating” Operators • Quantifiers • Two sculptures are exhibited in each room. • each(room, …) • John and Mary think they like each other. • RECIP(john⊕mary, λxλy.think(x, like(x,y))) • think(john⊕mary, RECIP(john⊕mary, like)) • Floating ‘each’: • Each of the children drew a picture. • The children each drew a picture. • The children drew a picture each. • Split • All the girls went to the show except Mary. • More students attended the show than teachers.

  10. What is the argument of What? • Two complex issues: • How do we know what words combine with what words in a sentence? • Only partly determined by syntax. • How do we deal with the ambiguity here, without explicitly enumerating all the readings?

  11. What Combines With What? • We’ll motivate the answer by giving a brief survey of previous proposals.

  12. Classic Approach • Just use lambda calculus and function application: • “John arrived”: John ⇒ john, arrived ⇒ λx.arrived(x); applying the VP to the NP gives arrived(john) • “John saw Mary”: saw ⇒ λyλx.saw(x,y), Mary ⇒ mary; the VP “saw Mary” ⇒ λx.saw(x,mary), and the sentence ⇒ saw(john,mary)
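The function-application approach above can be sketched directly with Python functions standing in for lambda terms. This is an illustrative toy, not the lecture’s code: the tuple encoding of formulas and all variable names are assumptions.

```python
# Toy Montague-style composition: word meanings are Python functions,
# logical formulas are nested tuples.

john = "john"
mary = "mary"
arrived = lambda x: ("arrived", x)           # λx.arrived(x)
saw = lambda y: lambda x: ("saw", x, y)      # λyλx.saw(x,y)

# "John arrived": the VP meaning applies to the subject NP meaning
s1 = arrived(john)

# "John saw Mary": V applies to the object, the resulting VP to the subject
vp = saw(mary)                               # λx.saw(x,mary)
s2 = vp(john)

print(s1)  # ('arrived', 'john')
print(s2)  # ('saw', 'john', 'mary')
```

Each syntactic combination corresponds to exactly one function application, which is why this works smoothly for the simple cases on the slide.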

  13. Digression! Examples Too Simple? • For the purpose of analyzing the combination of words into sentences, it does not matter whether we analyze: • “Our professional lives are blossoming and our marriage is on the rocks.” • blossoming(our-professional-lives) ∧ on-the-rocks(our-marriage) • “John is happy and Mary is sad” • happy(john) ∧ sad(mary) • “c is p and d is q” • p(c) ∧ q(d) • That’s why we can allow ourselves (at least at first) to consider seemingly simple examples like “John likes Mary”. Their analysis carries over to more complex sentences.

  14. Classical Approach • Problem 1: quantifiers • “Everyone arrived”: Everyone ⇒ λP.every(person,P)*, arrived ⇒ λx.arrived(x) • Need here to apply the NP on the VP, not the VP on the NP: every(person, arrived) • So make it so for proper names as well (or allow type shifting): John ⇒ λP.P(john), and “John arrived” ⇒ arrived(john) • *Define: every(P,Q) := ∀x. P(x) → Q(x), or: every(P,Q) := P ⊆ Q
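The NP-on-VP direction can be shown in the same toy encoding (illustrative names only): quantified NPs denote functions over VP meanings, and proper names are type-shifted to match.

```python
# Toy sketch of generalized quantifiers: the NP applies to the VP.

def every(restr, scope):
    return ("every", restr, scope)

arrived = lambda x: ("arrived", x)

everyone = lambda P: every("person", P)   # λP.every(person, P)
john = lambda P: P("john")                # type-shifted name: λP.P(john)

r1 = everyone(arrived)   # every(person, λx.arrived(x))
r2 = john(arrived)       # arrived(john)

print(r2)  # ('arrived', 'john')
```

After the shift, both kinds of subject combine with the VP in the same direction, which is exactly the uniformity the slide asks for.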

  15. Classical Approach • Problem 2: verbs with 2 arguments • “John saw everyone”: john : e, saw ⇒ λyλx.saw(x,y) : e→e→t, everyone ⇒ λP.every(person,P) : (e→t)→t • Type mismatch. To resolve it, need to: • Apply ‘saw’ on an “assumed” y • Apply this on an “assumed” x • Bring y back: λy.saw(x,y) • Apply ‘every’: every(person, λy.saw(x,y)) • Bring x back: VP ⇒ λx.every(person, λy.saw(x,y)) • S ⇒ every(person, λy.saw(john,y))

  16. Classical Approach • Problem 3: scope ambiguity • “Someone saw everyone”: someone ⇒ λP.some(person,P), everyone ⇒ λP.every(person,P), saw ⇒ λyλx.saw(x,y) • Two readings: • some(person, λx.every(person, λy.saw(x,y))) • every(person, λy.some(person, λx.saw(x,y))) • The previous solution (VP ⇒ λx.every(person, λy.saw(x,y))) gives only the first scoping.

  17. Classical Approach • possible solution: type shifting • saw ⇒ λQλx.Q(λy.saw(x,y)) : ((e→t)→t)→e→t, or ⇒ λQ1λQ2.Q1(λy.Q2(λx.saw(x,y))) : ((e→t)→t)→((e→t)→t)→t • VP ⇒ λx.every(person, λy.saw(x,y)) or λQ2.every(person, λy.Q2(λx.saw(x,y))) • Gets very complex! • “The meaning of an expression [the VP] is a function of the meaning of its parts” only in an uninteresting way: we leave a big hole here [Q2]

  18. Stores, QLFs, etc. • This led to a solution of writing just the quantifier+restrictor “in place”, and later extracting them • Woods ‘70, Schubert & Pelletier ‘82 • Cooper/Keller store • Quasi-Logical Forms (the Core Language Engine)

  19. Quasi-Logical Forms (Alshawi & Crouch ‘92) • QLF for “John saw Mary”: saw(john,mary) • QLF for “Every man saw some woman”: saw(qterm[every,x,man(x)], qterm[some,y,woman(y)]) • A legal resolution, in terms of “pulling out”: • 1. saw(qterm[every,x,man(x)], qterm[some,y,woman(y)]) ⇒ some(y, woman(y), saw(qterm[every,x,man(x)], y)) ⇒ every(x, man(x), some(y, woman(y), saw(x,y))) • 2. saw(qterm[every,x,man(x)], qterm[some,y,woman(y)]) ⇒ every(x, man(x), saw(x, qterm[some,y,woman(y)])) ⇒ some(y, woman(y), every(x, man(x), saw(x,y)))
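The “pulling out” step can be mimicked with a tiny enumerator: store the qterms away from the body, then discharge them in every possible order. This is a Cooper-storage-flavored sketch under the same toy tuple encoding, not the CLE’s actual resolution algorithm.

```python
from itertools import permutations

# qterms stored away from the body, as in QLF resolution
qterms = [("every", "x", ("man", "x")),
          ("some", "y", ("woman", "y"))]
body = ("saw", "x", "y")

def resolve(order, body):
    """Discharge the stored quantifiers; each wraps outside the previous,
    so the last one in `order` takes widest scope."""
    formula = body
    for q, var, restr in order:
        formula = (q, var, restr, formula)
    return formula

readings = {resolve(order, body) for order in permutations(qterms)}
print(len(readings))  # 2 scopings: every > some, and some > every
```

The two discharge orders reproduce exactly the two resolutions numbered 1 and 2 on the slide.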

  20. Quasi-Logical Forms • Problem: very complicated algorithms to do the extraction, hard to debug and predict results • Hobbs & Shieber ‘87 • Core Language Engine ‘92, chapter 8 • Does not generalize well to other scope-bearing elements • Negation • Modals • ‘each other’ • …

  21. HS, MRS, CLLS, etc. • This led to frameworks with more expressive power • Hole Semantics (Bos, Blackburn) • Minimal Recursion Semantics (Copestake, Flickinger, Pollard, Sag) • Constraint Language for Lambda Structures (Egg, Koller, Niehren) • They show how to embed HS and MRS in CLLS

  22. HS, MRS, CLLS, etc. • Can deal with more things: • “Some representative of every department didn’t give John most samples.” • UR fragments (holes h1–h5, top hole htop): l1: not(h5) • l2: give(x, john, z) • l3: most(z, samp(z), h1) • l4: some(x, rep(x)∧h2, h3) • l5: of(x,y) • l6: every(y, dep(y), h4)
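The hole-semantics idea of plugging labels into holes can be sketched for the simpler “Every man saw some woman”. This is a toy encoding under assumed label/hole names, not the actual HS algorithm: it enumerates all pluggings and keeps those satisfying the dominance constraints (which, in this example, also rule out the cyclic pluggings).

```python
from itertools import permutations

# UR fragments; the strings 'h1', 'h2' inside a fragment are holes.
fragments = {
    "l1": ("every", "x", ("man", "x"), "h1"),
    "l2": ("some", "y", ("woman", "y"), "h2"),
    "l3": ("saw", "x", "y"),
}
holes = ["htop", "h1", "h2"]
constraints = [("l3", "h1"), ("l3", "h2")]  # l3 must end up below h1 and h2

def dominates(plug, hole, label, seen=frozenset()):
    """Does the material plugged into `hole` contain `label` (acyclically)?"""
    if hole in seen:
        return False                      # cyclic plugging: reject
    lab = plug[hole]
    if lab == label:
        return True
    return any(dominates(plug, part, label, seen | {hole})
               for part in fragments[lab] if part in plug)

pluggings = [dict(zip(holes, perm)) for perm in permutations(fragments)]
valid = [p for p in pluggings
         if all(dominates(p, h, l) for l, h in constraints)]

print(len(valid))  # 2 valid pluggings survive: the two scopings
```

Of the six candidate pluggings, only the two that put one quantifier at htop and nest the other above “saw” pass the constraints.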

  23. HS, MRS, CLLS, etc. • Can even express some scoping constraints within the UR • Just add more arrows between nodes • (unavailable in classical approach and QLFs)

  24. HS, MRS, CLLS, etc. • Problem 1: can’t deal with all phenomena. • The main reason: the constraints say how pieces of formulas may combine rather than how semantic entities may combine. • The pieces of formulas do not necessarily correspond to any semantic entity. • E.g. in MRS, unprincipled syntactic manipulation of formulas which does not correspond to any semantic operation (contra the lambda calculus) sometimes yields formulas with unbound variables, which an additional ad-hoc module needs to get rid of.

  25. HS Problem • Example: existing HS grammar treats “saw” in a way that is inconsistent with what’s needed for “each other”: • “Every man saw some woman.” • Existing HS/MRS/CLLS representation: every(x, man(x), some(y, woman(y), saw(x,y))) / some(y, woman(y), every(x, man(x), saw(x,y))) • “John and Mary saw each other.” • We must have access to “saw” as a binary relation: RECIP(john⊕mary, saw) or RECIP(john⊕mary, λxλy.saw(x,y))

  26. HS Problem (more details in section 6.4 of the glue semantics document) • [diagram: existing HS UR for “Every man saw some woman.”] • [diagram: correct UR for “John and Mary saw each other”, but inconsistent with the UR above] • [diagram: a UR similar to the top UR, but incorrect]

  27. HS, MRS, CLLS, etc. • Problem 2: the composition process as presented in the papers gets complicated • The complications of the lambda calculus we saw, plus complications because we are composing meta-structures: • Meta-level URs are farther from surface structure • Need to specify local dominance constraints • Lambda vars: quantifier body, local hole and label • Partial URs are hard to visualize in one’s mind • Partial URs are harder to debug, maintain, and extend

  28. Semantic Composition: HS • Higher Types • Composing partial URs using lambda calculus

  29. The big picture? • So what’s the next step in this series? … • Instead of jumping to invent a new solution, how about stopping for a moment, taking a step back, and reflecting on what’s going on here? • What do we want? • What is the big picture?

  30. Syntax-Semantics Interface • Sentential SSI (“Compositional Semantics”) • Relation between: • the words’ “functor-argument structure” • the sentence’s “functor-argument structure” • the sentence’s syntactic structure • Specifies what applies on what • We want it to be straightforward and elegant

  31. SSI: Syntax-Semantics Interface • [diagram relating a sentence’s syntactic structure, the sentence’s functor-argument structure, and the word functor-argument structure, with items expecting two (e→t) args, two e args, and one e arg]

  32. Glue Semantics • Separates semantic entities from their composition mechanism • Glue Semantics statement: φ : A • φ - meaning expression • denotes a semantic entity (not just a piece of formula) • kept the simplest possible • A - type expression with labels • provides a “handle” on φ • this is what guides the composition • The lexicon provides generic constraints on where to look for a word’s arguments

  33. Basic Meanings

  34. Adding Constraints • “John saw Mary” • Based on the types alone, we can get both: saw(john,mary) and saw(mary,john) • So we add constraints: john : e1 (with e4 = e1), mary : e2 (with e5 = e2), saw : e4 ⊸ e5 ⊸ t3 • Now we can get only the right answer.
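The constraint-driven composition on this slide can be sketched as label lookup (a toy encoding; the labels follow the slide, everything else is illustrative): the verb’s argument slots e4 and e5 are equated with the resources e1 (john) and e2 (mary), so only one application order is licensed.

```python
# Glue-style sketch: meanings are indexed by labeled resources, and
# equations from the syntax say which resource fills which slot.

meanings = {"e1": "john", "e2": "mary"}
equations = {"e4": "e1", "e5": "e2"}       # constraints from the syntax

saw = lambda x: lambda y: ("saw", x, y)    # saw : e4 ⊸ e5 ⊸ t3

def fill(label):
    """Follow the label equations to the resource that fills a slot."""
    return meanings[equations.get(label, label)]

t3 = saw(fill("e4"))(fill("e5"))
print(t3)  # only saw(john, mary) is derivable, never saw(mary, john)
```

Because e4 can only be satisfied by e1, the spurious saw(mary,john) reading is simply not derivable.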

  35. Meaning Composition • As a first step, this frees you from the need to think about the parse tree.

  36. Where do the constraints come from? • [diagrams: a generic lexicon, an example syntactic structure, and an example actual lexicon]

  37. A More Realistic Syntax: LFG • [diagrams: lexicon and phrase structure rules producing a c-structure and an f-structure]

  38. All Together Now: Modularity • [diagram: on the syntax side, the lexicon and phrase structure rules produce a c-structure and f-structure; on the semantic composition side, these yield instantiated glue statements and a glue derivation]

  39. What Has Been Gained • Clean separation of semantic expressions from the composition mechanism • No need for complicated lambda expressions • Or ad-hoc URs (as we’ll see next) • Can be used with different meaning representations (e.g. DRSs) • Modularity • Glue Semantics has been shown to work with different syntactic systems • Naïve PSG, LFG, HPSG, LTAG, Categorial Grammar; in principle, it can work with parse trees from statistical parsers • These systems all use the same GS statements. The only difference is the instructions in the lexicon about the connection to the syntactic structures.

  40. More Semantic Rules • The rules of the lambda calculus (α-conversion, β-reduction, η-reduction) • Some more rules (derivable using assumption introduction):

  41. Semantic Structure ≠ Syntactic Structure • The arguments of an operator are not always dominated by it in syntactic structure: [parse tree for “John did n’t see every show”] • not( every( show, λx.past( see[agent:john, theme:x] ) ) )

  42. Semantic Structure ≠ Syntactic Structure • [[John]2 did n’t see [every show]3]1 • Labels: 1t ↝ see, 2e ↝ see.agent, 3e ↝ see.patient, 4e ↝ 3e.variable, 5e ↝ 3e.result • Premises: john : 2e • see : 2e ⊸ 3e ⊸ 1t • past : 1t ⊸ 1t • λxλy.past(see(x,y)) : 2e ⊸ 3e ⊸ 1t • show : 4e ⊸ 5t • every : (4e ⊸ 5t) ⊸ (3e ⊸ 1t) ⊸ 1t • not : 1t ⊸ 1t • Derivation: λS.every(show,S) : (3e ⊸ 1t) ⊸ 1t • λy.past(see(john,y)) : 3e ⊸ 1t • every(show, λy.past(see(john,y))) : 1t • not(every(show, λy.past(see(john,y)))) : 1t
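The derivation on this slide can be replayed step by step in the same toy tuple encoding (the glue types are tracked only in comments; the encoding itself is an assumption, not the lecture’s implementation).

```python
# Premises, with their glue types in comments (labels follow the slide)
john = "john"                                      # john : 2e
see = lambda x: lambda y: ("past", ("see", x, y))  # λxλy.past(see(x,y)) : 2e ⊸ 3e ⊸ 1t
every_show = lambda S: ("every", "show", S)        # λS.every(show,S) : (3e ⊸ 1t) ⊸ 1t
neg = lambda p: ("not", p)                         # not : 1t ⊸ 1t

# Derivation: assume a 3e resource y, build the scope, then quantify and negate
scope = lambda y: see(john)(y)                     # λy.past(see(john,y)) : 3e ⊸ 1t
reading = neg(every_show(scope))                   # not(every(show, λy.past(see(john,y)))) : 1t
```

Note how “not” and “every” end up outscoping “see” even though neither dominates the verb in the parse tree: the glue types, not the tree, drive the order of application.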

  43. Monadic Quantifiers • Every man saw some woman. • [diagrams: f-structure and instantiated glue statements]

  44. Scope Ambiguity • Two possible derivations: • [two glue derivation diagrams]

  45. Scope Flexibility • In the lexicon, the handle of a quantifier’s 1st argument is constrained to equal that of the N’ in the NP, but the handle of the 2nd argument contains a free handle (Ht)

  46. Scope Flexibility • [[Every mayor of [a big city]3]2 was indicted for accepting bribes]1 • “a big city”: λS.a(λx.big(x)∧city(x), S) : (3e ⊸ Ht) ⊸ Ht • If Ht = 1t then • a(λx.big(x)∧city(x), λx.every(λy.mayor-of(y,x), λy.indic(y))) : 1t • If Ht = 2rt then • every(λy.a(λx.big(x)∧city(x), λx.mayor-of(y,x)), λy.indic(y)) : 1t • (= “every big-city-mayor was indicted”)
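The two instantiations of the free handle Ht can be sketched in the same toy encoding (all predicate names and the flattened “big-city” restrictor are illustrative simplifications):

```python
# "Every mayor of a big city was indicted": the NP "a big city" carries
# a free scope handle; instantiating it differently yields two readings.

a_big_city = lambda S: ("a", "big-city", S)        # λS.a(big∧city, S)
every = lambda restr, scope: ("every", restr, scope)
mayor_of = lambda city: lambda y: ("mayor-of", y, city)
indicted = lambda y: ("indicted", y)

# Ht = 1t: "a big city" takes widest scope
wide = a_big_city(lambda x: every(mayor_of(x), indicted))

# Ht = 2rt: "a big city" scopes inside the quantifier's restrictor
narrow = every(lambda y: a_big_city(lambda x: mayor_of(x)(y)), indicted)

print(wide[0], narrow[0])  # a every
```

The same lexical entry yields both formulas; only the choice of Ht differs, which is the flexibility the slide is after.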

  47. Reciprocal Expressions • [[John and Mary]2 saw [each other]3]1 • RECIP(john⊕mary, saw) • The elements: • john⊕mary : 2e • saw : 2e ⊸ 3e ⊸ 1t • (same as for “every man saw some woman”!!) • RECIP : 2e ⊸ (2e ⊸ 3e ⊸ 1t) ⊸ 1t • The general entry: • RECIP : ae ⊸ (ae ⊸ 3e ⊸ Ht) ⊸ Ht • where ae = “my antecedent”
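The point that RECIP needs “saw” as a binary relation can be illustrated with a strongly simplified RECIP for a two-member antecedent (real reciprocal semantics quantifies over all distinct pairs; this toy just unpacks the pair):

```python
# Toy RECIP: takes a plural antecedent and a binary relation,
# and asserts the relation in both directions.
def RECIP(group, rel):
    a, b = group
    return ("and", rel(a)(b), rel(b)(a))

saw = lambda x: lambda y: ("saw", x, y)   # λxλy.saw(x,y): the binary relation
jm = ("john", "mary")                      # john ⊕ mary

r = RECIP(jm, saw)
print(r)  # ('and', ('saw', 'john', 'mary'), ('saw', 'mary', 'john'))
```

This only works because RECIP receives the relation itself as an argument, which is exactly what the fully-scoped HS representations on slide 25 cannot provide.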

  48. Reciprocal Expressions • [[Bill]2 introduced [John and Mary]3 to [each other]4]1 • RECIP(john⊕mary, λyλz.introduce(bill,y,z)) • The elements: • bill : 2e • john⊕mary : 3e • introduce : 2e ⊸ 3e ⊸ 4e ⊸ 1t • λyλz.introduce(bill,y,z) : 3e ⊸ 4e ⊸ 1t • RECIP : 3e ⊸ (3e ⊸ 4e ⊸ 1t) ⊸ 1t • (Some premises could combine in another order, but in this sentence that would lead to a dead-end.)

  49. Ambiguity • Anaphora ambiguity: • Bill and Sue introduced John and Mary to each other. • as in: Bill introduced [John and Mary]i to [each other]i. • as in: [Bill and Sue]i gave a present to [each other]i. • Scope ambiguity: • [John and Mary think [they like each other]2]1. • RECIP(john⊕mary, λxλy.think(x, like(x,y))) (Ht = 1t) • think(john⊕mary, RECIP(john⊕mary, like)) (Ht = 2t) • How many readings could a computer find? • John and Mary gave presents to each other.

  50. Similar Constructs • different • John and Mary read different books. • [John and Mary]i read different books than [each other]i. • same • John and Mary read the same books. • [John and Mary]i read the same books as [each other]i.
