
Formal Issues in Natural Language Generation


Presentation Transcript


  1. Formal Issues in Natural Language Generation. Lecture 4: Shieber 1993; van Deemter 2002. Kees van Deemter, Matthew Stone

  2. Semantics Formal semantics concentrates on information content and its representation. To what extent does good NLG depend on the right information? To what extent does good NLG depend on the right representation? Note: this applies to GRE (the generation of referring expressions), but also to NLG more generally.

  3. Information in NLG Logical space: all the ways things could turn out to be

  4. Information in NLG John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. Logical space: all the ways things could turn out to be

  5. A proposition is information: it identifies particular cases as real possibilities.

  6. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. Here is a particular proposition.
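This picture translates directly into code. The following Python sketch (illustrative, not part of the original slides) models the logical space as the set of all subsets of {apple, banana, cake}, and a proposition as the subset of worlds it leaves open:

    from itertools import combinations

    # Logical space: every way things could turn out, i.e. every
    # possible set of items John ate (8 worlds in all).
    ITEMS = ("apple", "banana", "cake")
    LOGICAL_SPACE = {
        frozenset(combo)
        for r in range(len(ITEMS) + 1)
        for combo in combinations(ITEMS, r)
    }
    assert len(LOGICAL_SPACE) == 8

    # A proposition identifies particular cases as real possibilities:
    # "John ate the apple" leaves open just the worlds containing the apple.
    john_ate_apple = {w for w in LOGICAL_SPACE if "apple" in w}
    assert len(john_ate_apple) == 4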

  7. A wrinkle Computer systems get their knowledge of logical space, common ground, etc. from statements in formal logic. Lots of formulas can carry the same information.

  8. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. A∧¬B∧C ∨ A∧¬B∧¬C ∨ ¬A∧B∧C ∨ ¬A∧B∧¬C

  9. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. A∧¬B ∨ ¬A∧B

  10. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. (A ∨ B) ∧ (¬A ∨ ¬B)

  11. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. F ↔ (A ↔ B) (with F = falsum)
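Read this way, slides 8-11 all denote one proposition: John ate exactly one of the apple and the banana, with the cake left open. A brute-force Python check (an illustrative sketch, using the formulas as reconstructed above) confirms that all four formulations carry the same information:

    from itertools import product

    # Each formula as a function of the truth values of A, B, C.
    def dnf_full(A, B, C):   # A∧¬B∧C ∨ A∧¬B∧¬C ∨ ¬A∧B∧C ∨ ¬A∧B∧¬C
        return ((A and not B and C) or (A and not B and not C)
                or (not A and B and C) or (not A and B and not C))

    def dnf_short(A, B, C):  # A∧¬B ∨ ¬A∧B
        return (A and not B) or (not A and B)

    def cnf(A, B, C):        # (A ∨ B) ∧ (¬A ∨ ¬B)
        return (A or B) and (not A or not B)

    def via_iff(A, B, C):    # F ↔ (A ↔ B), with F = falsum
        return False == (A == B)

    # Same truth table = same proposition = same cells of the grid.
    for A, B, C in product((False, True), repeat=3):
        assert (dnf_full(A, B, C) == dnf_short(A, B, C)
                == cnf(A, B, C) == via_iff(A, B, C))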

  12. Shieber 1993 The problem of logical form equivalence is about how you get this representation. In general, an algorithm can choose this representation in one of two ways: in a reasoner that does general, non-grammatical inference, or using at least some grammatical knowledge.

  13. Shieber 1993 If the representation is chosen without access to the grammar (modularly), then the surface realizer has to know which logical formulas mean the same thing. This is intractable: philosophically, because the notion is impossible to pin down, and computationally, because our best attempts are not computable.

  14. What about GRE? Arguably, GRE uses a grammar. • Parameters such as the preference order on properties reflect knowledge of how to communicate effectively. • Decisions about usefulness or completeness of a referring expression reflect beliefs about utterance interpretation. Maybe this is a good idea for NLG generally.

  15. Letting grammar fix representation Choice among alternatives reflects linguistic notions: discourse coherence, information structure, function. The four formulations again: A∧¬B∧C ∨ A∧¬B∧¬C ∨ ¬A∧B∧C ∨ ¬A∧B∧¬C; A∧¬B ∨ ¬A∧B; (A ∨ B) ∧ (¬A ∨ ¬B); F ↔ (A ↔ B)

  16. Now there’s a new question If grammar is responsible for how information is represented, where does the information itself come from? To answer, let’s consider information and communication in more detail.

  17. Information in NLG Logical space: all the ways things could turn out to be

  18. Information in NLG Common ground: the possibilities mutual knowledge still leaves open.

  19. Information in NLG John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. Common ground: the possibilities mutual knowledge still leaves open.

  20. Information in NLG Private knowledge: the things you take as possible.

  21. Information in NLG John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the banana (B). John ate the cake (C). John ate B+C. John ate A, B+C. Private knowledge: the things you take as possible.

  22. Information in NLG Communicative Goal: an important distinction that should go on the common ground.

  23. Information in NLG John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the banana (B). John ate the cake (C). John ate B+C. John ate A, B+C. Communicative Goal: an important distinction that should go on the common ground.

  24. Formal question What information satisfies what communicative goals? Objective: modularity. General reasoning gives communicative goals; grammar determines information. Another meaty issue.

  25. Information in NLG John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the banana (B). John ate the cake (C). John ate B+C. John ate A, B+C. Communicative Goal: an important distinction that should go on the common ground.

  26. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the banana (B). John ate the cake (C). John ate B+C. John ate A, B+C. What John ate was a piece of fruit.

  27. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the banana (B). John ate the cake (C). John ate B+C. John ate A, B+C. John didn’t eat the cake.

  28. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the banana (B). John ate the cake (C). John ate B+C. John ate A, B+C. John ate one thing.

  29. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the cake (C). John ate the banana (B). John ate B+C. John ate A, B+C. John ate at most one thing.

  30. For example John ate nothing. John ate the apple (A). John ate A+C. John ate A+B. John ate the banana (B). John ate the cake (C). John ate B+C. John ate A, B+C. What John ate was the apple.

  31. Formal questions What information satisfies what communicative goals? Let u be the information in the utterance, g the goal information, and c, p the information in the common ground and in private knowledge. u = g? p ⊆ u ⊆ g? c ∩ u = c ∩ g? p ⊆ c ∩ u ⊆ c ∩ g?

  32. Logical form equivalence An inference problem is inevitable: u = g? p ⊆ u ⊆ g? c ∩ u = c ∩ g? p ⊆ c ∩ u ⊆ c ∩ g? But the problems are very different: not always as precise (entailment vs. equivalence), not always as abstract (assumptions, context, etc.). Consequences for philosophical and computational tractability.
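Treating propositions as sets of possible worlds makes the four criteria concrete: a smaller set is stronger information, entailment is the subset relation, and combining information is intersection. A minimal Python sketch under that reading (the function names are my own, not the slides'):

    # u: info in the utterance; g: goal info;
    # c: common ground; p: private knowledge (each a set of worlds).

    def exactly_the_goal(u, g):
        return u == g                      # u = g

    def known_and_sufficient(u, g, p):
        return p <= u <= g                 # p ⊆ u ⊆ g

    def exact_in_context(u, g, c):
        return c & u == c & g              # c ∩ u = c ∩ g

    def sufficient_in_context(u, g, c, p):
        return p <= (c & u) <= (c & g)     # p ⊆ c ∩ u ⊆ c ∩ g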

  33. GRE, again We can use GRE to illustrate, assuming: c = the domain (context set); g = the set of individuals to identify, represented as a set of discourse referents; u = the identifying description, represented as a conjunction of properties. Solution criterion: c ∩ u = c ∩ g.

  34. GRE How does the algorithm choose the representation of u? It finds a canonical representation of u, based on incremental selection of properties. And how do the representation and choice of u relate to the representation and choice of an actual utterance to say? The representation of u works as a sentence plan.
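The incremental selection the slide describes can be sketched as a loop in the style of Dale & Reiter's Incremental Algorithm (the domain and names below are illustrative, not from the slides): keep adding the most preferred property that removes a distractor, until the description's extension in the context equals the goal set, i.e. c ∩ u = c ∩ g.

    def make_description(goal, context, properties, preference_order):
        """Incrementally select properties until the description u
        satisfies c ∩ u = c ∩ g."""
        description = []            # u: a conjunction of properties
        denotation = set(context)   # c ∩ u so far (empty conjunction denotes c)
        for prop in preference_order:
            if denotation == goal:
                break
            narrowed = denotation & properties[prop]
            # Keep a property only if it removes a distractor
            # without excluding any intended referent.
            if goal <= narrowed < denotation:
                description.append(prop)
                denotation = narrowed
        return description if denotation == goal else None

    # Hypothetical domain: pick out d1 among three objects.
    context = {"d1", "d2", "d3"}
    properties = {"dog": {"d1", "d2"}, "cat": {"d3"},
                  "small": {"d1"}, "black": {"d1", "d3"}}
    print(make_description({"d1"}, context, properties,
                           ["dog", "cat", "small", "black"]))
    # -> ['dog', 'small']

The returned list of properties is the canonical representation of u; as the slide notes, it can then serve directly as a sentence plan for realization.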
