
CS 544: Lecture 3.4 Interpretation as Abduction and Local Pragmatics


Presentation Transcript


  1. CS 544: Lecture 3.4 Interpretation as Abduction and Local Pragmatics. Jerry R. Hobbs, USC/ISI, Marina del Rey, CA

  2. Logical Form
      The logical form of a sentence (or text) is an existentially quantified conjunction of positive ground literals. (Existential quantification is over a Platonic universe of possible individuals, including eventualities and typical elements.)
      John didn't work again. ==>
      (E j, e1, e2, e3, e4, e5) Rexists(e5) & again'(e5,e3) & not'(e3,e2) & past'(e2,e4) & work'(e4,j) & John'(e1,j)
      or
      again(e3) & not'(e3,e2) & past'(e2,e4) & work'(e4,j) & John(j)
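To make the representation concrete, here is a minimal sketch (ours, not from the slides) of the second, simpler variant of this logical form as a flat list of (predicate, arguments) literals in Python; the predicate and variable names mirror the formula above.

```python
# A minimal sketch of the LF for "John didn't work again" as a flat
# conjunction of positive literals over existentially quantified variables.
# Each literal is a (predicate, arguments) pair; primed predicates take
# their eventuality as the first argument, following the slide's notation.

LF = [
    ("again", ("e3",)),          # the negation eventuality e3 recurs
    ("not'",  ("e3", "e2")),     # e3: the negation of e2
    ("past'", ("e2", "e4")),     # e2: e4 holding in the past
    ("work'", ("e4", "j")),      # e4: j working
    ("John",  ("j",)),           # j is named John
]

# The existentially quantified variables are simply the symbols that occur
# as arguments anywhere in the conjunction.
variables = sorted({arg for _, args in LF for arg in args})
print(variables)   # ['e2', 'e3', 'e4', 'j']
```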

  3. Outline
      Abduction
      Solutions to Local Pragmatics Problems using Abduction
      How Weighted Abduction Works
      Some Systems Using Abduction

  4. Interpreting the Environment: Abduction
      [Diagram: a boat in a tree by the sea. Explain the entities in the environment; explain the relations in the environment. Underlying cause: a storm.]

  5. Interpreting the Environment: Picking the Best Explanation
      [Diagram: candidate explanations for "boat in tree": tree down, chopped down, crane, storm, ...?]

  6. Interpreting the Environment: Picking the Best Explanation
      [Diagram: the same "boat in tree" scene, now in a magazine; the candidate explanations (tree down, chopped down, crane, storm) are joined by an ad agency producing an advertisement.]

  7. Interpreting the Environment
      [Diagram: observable-1, observable-2, observable-3, each linked to underlying causes, which may themselves have deeper underlying causes.]
      In abduction, the best explanation can lie at variable depth.

  8. What is Abduction?
      Deduction: p(a), (A x) p(x) --> q(x)  ==>  q(a)
      Induction: p(a), q(a)  ==>  (A x) p(x) --> q(x)
      Abduction: q(a), (A x) p(x) --> q(x)  ==>  p(a)
      Abduction = Deduction + Assumptions + Cost function on proofs
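As a toy illustration (ours, under the assumption of a single rule "for all x: p(x) --> q(x)"), the patterns differ in which piece is given and which is concluded; abduction runs the rule backwards and marks its conclusion as an assumption to be paid for.

```python
# Toy illustration of deduction vs. abduction over the rule p(x) --> q(x).
# Only the rule and the constant "a" come from the slide; the function
# names are illustrative.

def deduce(fact):
    """Deduction: from p(a) and the rule, soundly conclude q(a)."""
    pred, arg = fact
    return ("q", arg) if pred == "p" else None

def abduce(observation):
    """Abduction: from q(a) and the rule, hypothesize p(a) as an explanation."""
    pred, arg = observation
    return ("p", arg) if pred == "q" else None

print(deduce(("p", "a")))   # ('q', 'a')  -- a sound conclusion
print(abduce(("q", "a")))   # ('p', 'a')  -- an assumption, made at a cost
```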

  9. Interpretation as Abduction
      To interpret a situation: find the best explanation for the observables.
      Abduction: inference to the best explanation.
      1. Represent the observables as propositions.
      2. Prove them, using the axioms in the knowledge base.
      3. Allow assumptions in the proof, at a cost.
      4. Pick the cheapest proof: shortest proof; fewest and most plausible assumptions; greatest redundancy; most salient axioms.

  10. Cognitive Benefit
      Knowledge of the causal and implicational structure of the current situation.
      Ability to manipulate the causal and implicational structure of the situation to achieve goals.

  11. Interpreting Discourse
      An utterance presents "observable" propositions. To interpret an utterance, find the best explanation for the propositional content of the utterance.
      1. Represent the content as propositions (the logical form).
      2. Prove them, using the axioms in the knowledge base.
      3. Allow assumptions in the proof, at a cost.
      4. Pick the cheapest proof: shortest proof; fewest and most plausible assumptions; greatest redundancy; most salient axioms.

  12. Interpretation as Abduction
      1. Represent the content as predications (the logical form).
      2. Prove them, using the axioms in the knowledge base.
      3. Allow assumptions in the proof, at a cost.
      4. Pick the lowest-cost proof.
      [Diagram: Speaker, Hearer, MB (mutual belief), Utt (utterance).]
      A uniform framework for syntax, semantics, and pragmatics.

  13. Factors in Cost
      1. Salience of facts and axioms used in the proof
      2. Size of the proof
      3. Number and plausibility of assumptions
      4. Use of redundant information in proofs

  14. Knowledge Base / Belief System
      Expressed as a large collection of (defeasible) axioms of the form:
      (A x,z) p1(x) & p2(x,z) --> (E y) q1(y,x) & q2(y)
      e.g.,
      jar(y) --> container(y,x) & fluid(x)                 (A jar is a container for fluid)
      car(x) --> engine(y,x)                               (Cars have engines)
      fly'(e1,x,y) --> move-fast'(e,x,y) & imply(e1,e)     (Flying implies moving fast)

  15. Nonmonotonicity or Defeasibility
      bird(x)^w1 & etc1(x)^w2 --> fly(x)
      You can never prove the etc literal, but you can assume it for a cost. This may yield the lowest-cost interpretation.
      mammal(x)^w3 & etc2(x)^w4 <--> elephant(x)           (genus & differentiae <--> species)

  16. Outline
      Abduction
      Solutions to Local Pragmatics Problems using Abduction
      How Weighted Abduction Works
      Some Systems Using Abduction

  17. Example
      The Boston office called.
      Local pragmatics problems illustrated:
      1. Definite reference: what does "the Boston office" refer to?
      2. Interpreting compound nominals: what is the implicit relation between "Boston" and "office"?
      3. Metonymy: coerce from "the Boston office" to someone at the Boston office.

  18. The Example Interpreted
      The Boston office called.
      LF: call'(e,x) & person(x) & rel(x,y) & office(y) & Boston(z) & nn(z,y)
      KB: person(J), work-for(J,O), office(O)
          work-for(x,y) --> rel(x,y)
          in(O,B), Boston(B)
          in(y,z) --> nn(z,y)
      call'(e,x) is the new information; office(y) is the definite reference; person(x) & rel(x,y) handle the metonymy; nn(z,y) is the compound-nominal relation.
      The "local pragmatics" problems are solved as a by-product.
      Syntax : Parse Tree :: Interpretation : Proof Graph
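A hand-worked sketch (ours) of the proof this slide describes, with the bindings x=J, y=O, z=B hard-coded; a real abductive prover would find them by unification.

```python
# Facts and rules from the slide's KB; the bindings x=J, y=O, z=B are the
# ones the abductive proof finds.
KB_FACTS = {("person", "J"), ("work-for", "J", "O"), ("office", "O"),
            ("in", "O", "B"), ("Boston", "B")}

def provable(lit):
    """True if the literal is a KB fact or follows by one of the two rules."""
    if lit in KB_FACTS:
        return True
    pred, *args = lit
    if pred == "rel":                         # work-for(x,y) --> rel(x,y)
        return ("work-for", *args) in KB_FACTS
    if pred == "nn":                          # in(y,z) --> nn(z,y)
        z, y = args
        return ("in", y, z) in KB_FACTS
    return False

# LF literals with the bindings applied: everything but call' is provable,
# so call'(e,J) is assumed -- that is the new information of the utterance.
goals = [("person", "J"), ("rel", "J", "O"), ("office", "O"),
         ("Boston", "B"), ("nn", "B", "O")]
print(all(provable(g) for g in goals))        # True
```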

  19. Definite Reference
      John bought a new car. The engine is already broken.
      LF: . . . & car(c) & . . .    . . . & engine(y,x) & . . .
      KB: car(x) --> engine(y,x)
      Definite reference with implicature:
      John walked into the room. The chandelier shone brightly.
      LF: . . . & room(r) & . . .    . . . & chandelier(y) & . . .
      KB: room(x) --> light(y) & in(y,x)
          light(y) & branching-fixtures(y) --> chandelier(y)

  20. Interpreting Compound Nominals
      "turpentine jar": the adjacency of the two nouns is to be explained.
      LF: turpentine(x) & nn(x,y) & jar(y)
      Proving nn(x,y) via fluid(x) & container(y,x) explains the adjacency: the proof is the explanation of the adjacency.

  21. Lexical Ambiguity
      The plane taxied to the terminal.
      LF: plane(x) & taxi(x,y) & terminal(y)
      KB: airplane(x) --> plane(x)
          move-on-ground(x,y) & airplane(x) --> taxi(x,y)
          airport-terminal(y) --> terminal(y)
          airport(z) --> airplane(x) & airport-terminal(y)
          wood-smoother(x) --> plane(x)
          ride-in-cab(x,y) & person(x) --> taxi(x,y)
          computer-terminal(y) --> terminal(y)

  22. Lexical Ambiguity
      John wanted a loan. He went to the bank.
      LF: . . . & loan(l) & . . .    . . . & bank(y) & . . .
      KB: loan(x) --> financial-institution(y) & issue(y,x)
          financial-institution(y) & etc4(y) --> bank1(y)
          bank1(y) --> bank(y)
          river(z) --> bank2(y) & borders(y,z)
          bank2(y) --> bank(y)

  23. Metonymy as Part of Syntax
      [Proof-graph diagram: Syn("read Shakespeare", e, x, -) is composed from Syn("read", e, x, y1) and Syn("Shakespeare", y1, ...); a metonymy coercion rel'(y2,y1) replaces y1 with y2, giving Syn("read", e, x, y2); the selectional constraint on the right argument, read'(e,x,y2) & text(y2), is satisfied by proving play(y2) & write'(e3,y1,y2) & Shakespeare(y1).]
      Coerce "Shakespeare" into "plays of Shakespeare".

  24. Metonymy as Part of Syntax
      [Proof-graph diagram, second version: one analysis of Syn("read Shakespeare", e, x, -) tries to find an author as object, Syn("read", e, x, y1); the other finds a text as object via the coercion rel'(y2,y1), Syn("read", e, x, y2), with read'(e,x,y2) & text(y2) proved from play(y2) & write'(e3,y1,y2) & Shakespeare(y1).]
      Coerce "Shakespeare" into "plays of Shakespeare".

  25. Metonymy after Syntax
      LF: read'(e,x,y2) & text'(e1,y2) & rel'(e2,y2,y1) & Shakespeare'(e3,y1)
      The selectional constraint text'(e1,y2) and the coercion of the right argument rel'(e2,y2,y1) are proved from play'(e4,y2) & write'(e3,y1,y2) & Shakespeare'(e3,y1).
      Coerce "Shakespeare" into "plays of Shakespeare".

  26. Pragmatic Loosening as Coercion of Eventualities
      [Proof-graph diagram: Syn("flew to USC", e, x, -) is composed from Syn("flew", e, x, y) and Syn("to USC", y, ...), with USC(y) and past(e); the coercion rel(e1,e) relates the asserted eventuality e to the literal flying eventuality e1 in Syn("flew", e1, x, y).]
      fly'(e1,x,y) --> move-fast'(e,x,y) & imply(e1,e)
      A "figurative" predicate is coerced into an inferentially related predicate.

  27. Pragmatic Loosening as Coercion of Eventualities
      LF: past(e) & fly'(e1,x,y) & rel(e1,e) & to'(e2,e1,y) & USC(y)
      The coercion rel(e1,e) is explained by the axiom:
      fly'(e1,x,y) --> move-fast'(e,x,y) & imply(e1,e)
      A "figurative" predicate is coerced into an inferentially related predicate.

  28. Pronoun Resolution
      The plain was reduced by erosion to its present level.
      LF: reduce'(e1,p,l) & plain(p) & erode'(e2,x) & present(e3) & level'(e3,l,y)
      KB:
      To decrease on a vertical scale is to reduce:
          decrease(p,l,s) & vertical(s) & etc1(p,l,s) --> reduce'(e,p,l)
      A flat landform is a plain:
          landform(p) & flat(p) & etc2(p) --> plain(p)
      If a flat thing y is at a point l on a vertical scale, then l is the level of y:
          at'(e,y,l) & on(l,s) & vertical(s) & flat(y) & etc3(e,y,l,s) --> level'(e,l,y)
      One way for a landform to decrease on the altitude scale is to erode:
          decrease'(x,l,s) & landform(x) & altitude(s) & etc4(x,l,s) --> erode'(e,x)
      One kind of vertical scale is the altitude scale:
          vertical(s) & etc5(s) --> altitude(s)

  29. Pronoun Resolution
      The plain was reduced by erosion to its present level.
      KB: decrease(p,l,s) & vertical(s) & etc1(p,l,s) --> reduce'(e,p,l)
          landform(p) & flat(p) & etc2(p) --> plain(p)
          at'(e,y,l) & on(l,s) & vertical(s) & flat(y) & etc3(e,y,l,s) --> level'(e,l,y)
          decrease'(x,l,s) & landform(x) & altitude(s) & etc4(x,l,s) --> erode'(e,x)
          vertical(s) & etc5(s) --> altitude(s)
      LF: reduce'(e1,p,l) & plain(p) & erode'(e2,x) & present(e3) & level'(e3,l,y)
      The proof unifies x = p and y = p; therefore y ("it") = x = p ("the plain").

  30. Schema Recognition and Matching
      A bomb exploded at . . .  The FMLN claimed responsibility for . . .
      Schema axiom:
      bomb-situation(e1,b, . . . , g, e2, . . . ) --> bomb(b) & explode'(e1,b) & . . . & terrorist-group(g) & responsible'(e2,g,e1) & . . .
      Recognizing the schema yields a minimal interpretation.

  31. Outline
      Abduction
      Solutions to Local Pragmatics Problems using Abduction
      How Weighted Abduction Works
      Some Systems Using Abduction

  32. Factors in Most Economical Proof
      Shortest proof
      Fewest and most plausible assumptions
      Most salient axioms
      Greatest redundancy
      Language has a huge amount of implicit redundancy. Recognizing redundancies yields more propositions proved for fewer assumptions.

  33. Weighted Abduction (Stickel, 1988)
      1. Goal expressions are assumable at a cost (depending on the utility of explaining them).
         turpentine(x)^$3 & nn(x,y)^$20 & jar(y)^$10
      2. Assumability costs can be passed back.
         P1^w1 & P2^w2 --> Q
         If Q costs $c, then Pi costs wi * c.
         Informativity vs. reliability trade-off.
      3. Factoring: goal expressions can be unified, taking the minimum cost.
         p(x1) & p(x2) ==> p(x)
         This helps minimize the size of proofs.

  34. Weighted Abduction
      P1^w1 & P2^w2 --> Q
      If w1 + w2 < 1, more specific interpretations are favored.
      If w1 + w2 > 1, less specific interpretations are favored.
      But in P1^.6 & P2^.6 --> Q, if P1 is proved, it is cheaper to assume P2 than Q: P1 provides evidence for Q.

  35. Weighted Abduction
      Factoring can also override less-specific abduction.
      Axioms: P1^.6 & P2^.6 --> Q1,  P2^.6 & P3^.6 --> Q2
      Goals:  Q1^$10 & Q2^$10
      Proof: backchain from Q1 to P1 & P2 and from Q2 to P2 & P3; factoring the shared P2 leaves P1 & P2 & P3.
      Cost of assuming Q1 & Q2 = $20
      Cost of assuming P1 & P2 & P3 = $18
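A small sketch (ours) of the cost bookkeeping on this example, propositional only, using the slide's weights and goal costs; factoring is modeled by paying only the minimum passed-back cost for a duplicated literal.

```python
# Stickel-style cost passing and factoring on the slide's example.
AXIOMS = {            # consequent -> list of (antecedent literal, weight)
    "Q1": [("P1", 0.6), ("P2", 0.6)],
    "Q2": [("P2", 0.6), ("P3", 0.6)],
}
GOALS = {"Q1": 10.0, "Q2": 10.0}

# Option A: assume both goals outright.
cost_assume_goals = sum(GOALS.values())                     # $20

# Option B: backchain on each goal, passing cost c back as w_i * c,
# then factor the duplicated literal P2 (pay for it only once).
passed = {}
for goal, cost in GOALS.items():
    for lit, w in AXIOMS[goal]:
        passed.setdefault(lit, []).append(w * cost)
cost_backchain_and_factor = sum(min(costs) for costs in passed.values())

print(cost_assume_goals, cost_backchain_and_factor)         # 20.0 18.0
```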

  36. Range of Interpretations
      [Diagram: interpretations arranged along an informativity axis and a reliability axis.
       Most reliable: "I went to Dallas."
       Optimum: "I flew to Dallas."
       Most informative: "I flew to Dallas on Southwest."]

  37. The Form of Axioms
      Implicative relation between p and q:
          (A x,y) p(x,y) --> (E z) q(x,z)
      Add eventualities:
          (A x,y,e1) p'(e1,x,y) --> (E z,e2) q'(e2,x,z)
      Make the rule part of explicit knowledge:
          (A x,y,e1) p'(e1,x,y) --> (E z,e2) q'(e2,x,z) & imply(e1,e2)
      Make the rule defeasible:
          (A x,y,e1) p'(e1,x,y)^u & etc1(e1,x,y)^v --> (E z,e2) q'(e2,x,z) & imply(e1,e2)
      Make the rule defeasibly biconditional:
          (A x,y,e1) p'(e1,x,y)^u1 & etc1(e1,x,y)^v1 --> (E z,e2) q'(e2,x,z) & imply(e1,e2)
          (A x,z,e2) q'(e2,x,z)^u2 & etc2(e2,x,z)^v2 --> (E y,e1) p'(e1,x,y) & imprel(e2,e1)
      This is the general form for expressing associations between concepts.

  38. What the Numbers Mean: Probability of Occurrence in Interpretation
      Space of events: occurrences of propositions in best proofs (= correct interpretations) for all texts in a corpus.
      For P1^w1 & P2^w2 --> Q: wi should vary with Pr(Q | Pi).
      For a set of axioms P1^w1 --> Q, P2^w2 --> Q, ..., Pk^wk --> Q: wi should vary inversely with Pr(Pi | Q), anchored at 1 with Pr(¬[P1 & . . . & Pk] | Q).
      Cost on goal expressions: the utility of finding a more specific interpretation.

  39. What the Numbers Mean: Finding Proofs
      Weight 0: P^0 --> Q: the literal is freely assumable; e.g., in P & S^0 --> Q, S is a side-effect.
      Weight 1: P^1 --> Q: no added cost for using the axiom.
      Weight (1+d)/n, with d << 1 and n the number of literals in the antecedent, e.g., P1^.6 & P2^.6 --> Q: a small added cost for using the axiom, which favors not backchaining unless there is a partial proof or redundancy.
      P --> Q (no assumability weight): must prove.

  40. Outline
      Abduction
      Solutions to Local Pragmatics Problems using Abduction
      How Weighted Abduction Works
      Some Systems Using Abduction

  41. AQUAINT-I: Question-Answering from Multiple Sources
      Show me the region 100 km north of the capital of Afghanistan.
      Question decomposition via logical rules:
          What is the capital of Afghanistan?
          What is the lat/long of Kabul?
          What is the lat/long 100 km north?
          Show that lat/long.
      Resources attached to the reasoning process: CIA Fact Book, Alexandria Digital Library Gazetteer, geographical formula, Terravision.
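A sketch (ours) of this decomposition as a pipeline of stubbed resource lookups; the coordinate values and the km-per-degree conversion are approximate illustrations, and only the decomposition structure itself comes from the slide.

```python
# Each sub-question becomes a stub function standing in for a resource.
KM_PER_DEGREE_LAT = 111.0          # rough conversion used for the formula step

def capital_of(country):           # stub for the fact-book lookup
    return {"Afghanistan": "Kabul"}[country]

def lat_long(place):               # stub for the gazetteer lookup (approximate)
    return {"Kabul": (34.5, 69.2)}[place]

def north_of(latlong, km):         # the "geographical formula" step
    lat, lon = latlong
    return (lat + km / KM_PER_DEGREE_LAT, lon)

# "Show me the region 100 km north of the capital of Afghanistan."
target = north_of(lat_long(capital_of("Afghanistan")), 100)
print(target)                      # roughly (35.4, 69.2); the display step is omitted
```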

  42. A Complex Query
      What recent purchases of suspicious equipment has XYZ Corp or its subsidiaries or parent firm made in foreign countries?
      [Diagram: the query decomposed into predicates, each attached to a resource:
       subsidiary(x,y), parent(y,x): subsidiaries list (XYZ: ABC, ...; DEF: ..., XYZ, ...)
       suspicious: illegal, biowarfare; DB of bio-equip
       foreign: not USA
       recent: Ask User
       Purchase: Agent: XYZ, ABC, DEF, ...; Patient: anthrax, ...; Date: since Jun05; Location: --]

  43. Prove Question from Answer
      Q: "How did Adolf Hitler die?"
      QLF: manner(e4) & Adolf(x10) & Hitler(x11) & nn(x12,x10,x11) & die'(e4,x12)    (e4 = e5?)
      "suicide" is a troponym of "kill":
          suicide'(e5,x12) --> kill'(e5,x12,x12) & manner(e5)
      Gloss of "kill":
          kill'(e5,x12,x12) <--> cause'(e5,x12,e4) & die'(e4,x12)
      Gloss of "suicide":
          suicide'(e5,x12) <--> kill'(e5,x12,x12)
      A: "It was Zhukov's soldiers who planted a Soviet flag atop the Reichstag on May 1, 1945, a day after Adolf Hitler committed suicide."
      ALF: it(x14) & be'(e1,x14,x2) & Zhukov(x1) & 's(x2,x1) & soldier(x2) & plant'(e2,x2,x3) & Soviet(x3) & flag(x3) & atop(e2,x4) & Reichstag(x4) & on(e2,x8) & May(x5) & 1(x6) & 1945(x7) & nn(x8,x5,x6,x7) & day(x9) & Adolf(x10) & Hitler(x11) & nn(x12,x10,x11) & commit'(e3,x12,e5) & suicide'(e5,x12)

  44. The Search Space Problem
      120,000 glosses --> 120,000 axioms. Theorem proving would take forever.
      Lexical chains / marker passing: try to find paths between the Answer Logical Form and the Question Logical Form.
      Ignore the arguments; look for links between predicates in XWN; it becomes a graph-traversal problem (e.g., this can confuse "buy" and "sell").
      Observation: all proofs use chains of inference no longer than 4 steps, so carry out this marker passing only 4 levels out.
      Q: "What Spanish explorer discovered the Mississippi River?"
      Candidate A: "Spanish explorer Hernando de Soto reached the Mississippi River in 1536."
      Lexical chain: discover-v#7 --GLOSS--> reach-v#1
      Set-of-support strategy: use only axioms that are on one of these paths. 120,000 axioms ==> several hundred axioms.
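A sketch (ours) of the marker-passing step as a depth-limited breadth-first search: arguments are ignored and we only look for a short chain of gloss links between a question predicate and an answer predicate. The tiny graph below is a toy stand-in for the XWN gloss links; the depth bound of 4 is the one quoted on the slide.

```python
from collections import deque

GLOSS_LINKS = {                      # predicate -> predicates in its gloss (toy data)
    "discover": ["reach", "find"],
    "reach": ["arrive"],
}

def lexical_chain(src, dst, max_depth=4):
    """Breadth-first search for a gloss chain of at most max_depth links."""
    queue = deque([(src, [src])])
    while queue:
        node, path = queue.popleft()
        if node == dst:
            return path
        if len(path) > max_depth:    # stop expanding past the depth bound
            continue
        for nxt in GLOSS_LINKS.get(node, []):
            queue.append((nxt, path + [nxt]))
    return None

print(lexical_chain("discover", "reach"))   # ['discover', 'reach']
```

Axioms whose predicates lie on a discovered chain form the set of support, which is what cuts 120,000 axioms down to several hundred.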

  45. Relaxation (Assumptions)
      Rarely or never can the entire Question Logical Form be proved from the Answer Logical Form, so we have to relax the Question Logical Form.
      "Do tall men succeed?"
      Logical Form: tall'(e1,x1) & x1=x2 & man'(e2,x2) & x2=x3 & succeed'(e3,x3)
      Remove these conjuncts from what has to be proved, one by one, in some order, and try to prove again. E.g., we might find a mention of something tall and a statement that men succeed. One limiting case: we find only a mention of success.
      Penalize the proof for every relaxation, and pick the best proof.
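A sketch (ours) of the relaxation loop: drop conjuncts from the question LF one at a time, paying a penalty each time, until the remainder can be proved. The prover is stubbed out, the penalty value is illustrative, and the equality conjuncts are omitted for brevity.

```python
PENALTY = 5.0   # illustrative per-relaxation penalty, not a value from the slides

def relax_and_prove(conjuncts, prove):
    """Drop conjuncts left to right until the remainder is provable; return it with its cost."""
    remaining = list(conjuncts)
    cost = 0.0
    while remaining and not prove(remaining):
        remaining.pop(0)                 # relax: drop the next conjunct
        cost += PENALTY                  # penalize the proof for the relaxation
    return remaining, cost

# "Do tall men succeed?"  Suppose the answer text only supports the last two conjuncts.
QLF = ["tall'(e1,x1)", "man'(e2,x2)", "succeed'(e3,x3)"]
supported = {"man'(e2,x2)", "succeed'(e3,x3)"}
proved, cost = relax_and_prove(QLF, lambda cs: set(cs) <= supported)
print(proved, cost)    # ["man'(e2,x2)", "succeed'(e3,x3)"] 5.0
```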

  46. Abduction
      Observable: Q.  General principle: P --> Q.  Conclusion, assumption, or explanation: P.
      Inference to the best explanation.
      Abduction: try to prove Q the best you can; make assumptions where you have to.
      In the LCC QA system:
          The question is the observable: Hitler died.
          The XWN glosses and troponyms are the general principles: suicide --> kill --> die.
          The answer is the explanation: Hitler committed suicide.
          Relaxation provides the assumptions you have to make to get the proof to go through.
