Poised-for Learning: Natural Language Generation. Sunny Khemlani, Selmer Bringsjord, and Kostas Arkoudas. Rensselaer AI & Reasoning (RAIR) Laboratory, Department of Cognitive Science, Department of Computer Science, Rensselaer Polytechnic Institute (RPI), Troy, NY 12180, USA. 03.30.05.
NLG and Poised-for Knowledge
Six distinguishing features:
• Mixed representation mode
• Tapestried
• Extreme expressivity
• Mixed inference types
• Deep connection to natural language
• Multi-agent structures
A Deep Connection to Natural Language “In the case of p.f.-knowledge, we know that it can give rise to explanations, presentations, answers, essays, justifications . . . in English. P.f.-knowledge is knowledge that is poised for communication expressed in natural language.”
Testing P.f. Knowledge
How do we know that p.f. knowledge has been obtained by the system?
• Send the system a query
• Analyze the answer:
  • Proof representation (NDL, Athena, etc.)
  • Output (English)
Generating Query Output in English
[Pipeline diagram: query Q → π → rep(J, A) → NLG → output O = (J, A)]
Proof Representation = Natural Deduction Language
• Developed by Kostas Arkoudas
• Fitch-style natural-deduction formulation of FOL
• Includes a proof-checker
• Proof J is entered by an individual (or generated from Q by π)
• NDL checks the proof
• If the proof is sound, NDL generates conclusion A (the discharge of proof J)
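The NDL workflow above can be sketched with a toy proof representation. This is a minimal illustrative sketch, not the real NDL API: the `ProofStep`/`Proof` classes and their fields are assumptions made for exposition, and no actual proof-checking is implemented here.

```python
# Hypothetical sketch of a Fitch-style proof object, loosely modeled on the
# NDL workflow described above. Names and fields are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProofStep:
    rule: str                               # e.g. "assume", "specialize", "modus-ponens"
    formula: str                            # the formula this step establishes
    premises: List[str] = field(default_factory=list)

@dataclass
class Proof:
    steps: List[ProofStep]

    def conclusion(self) -> Optional[str]:
        # The discharged conclusion A is the formula of the final step.
        return self.steps[-1].formula if self.steps else None

proof = Proof([
    ProofStep("assume", "forall x (Q(x) ==> T(x))"),
    ProofStep("specialize", "Q(y) ==> T(y)", ["forall x (Q(x) ==> T(x))"]),
])
print(proof.conclusion())  # Q(y) ==> T(y)
```

In the real system, the checker would verify each step against its inference rule before discharging the conclusion; this sketch only shows the shape of the representation.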
Output in English: Natural Language Generation
• Disclaimer: only the initial stages are complete!
• Proof J is broken apart into its constituent subproofs
• Each subproof type (proof by contradiction, proof by cases, existential elimination, universal introduction, etc.) is verbalized
• The output is composited from the subproof verbalizations and assorted into paragraphs
• Conclusion A (the discharge of proof J) is added to the final composition
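The verbalization step can be sketched as a template lookup per subproof type, with the pieces composited into a paragraph. The template strings and step-type names below are assumptions for illustration, not the RAIR Lab's actual rules.

```python
# Hypothetical sketch: map each subproof type to an English template,
# then composite the verbalizations into one paragraph.
# Templates and type names are illustrative, not the actual NLG rules.
TEMPLATES = {
    "assume":   "Assume that {formula}.",
    "conclude": "Finally, we conclude that {formula}.",
}

def verbalize(step: dict) -> str:
    # Fill the template for this step's type with the step's fields.
    return TEMPLATES[step["type"]].format(**step)

def composite(steps) -> str:
    # Assemble the subproof verbalizations into a single paragraph.
    return " ".join(verbalize(s) for s in steps)

steps = [
    {"type": "assume", "formula": "for all x, Q(x) implies T(x)"},
    {"type": "conclude", "formula": "T(y)"},
]
print(composite(steps))
# Assume that for all x, Q(x) implies T(x). Finally, we conclude that T(y).
```

A fuller version would carry one template per inference rule (proof by contradiction, existential elimination, and so on) and would group the verbalizations into paragraphs by subproof boundary.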
Output in English: Example

Assume that for all x, P(x) and Q(x). Next assume that for all x, if Q(x), then T(x). Pick any arbitrary y. Since P(x) and Q(x) for all x, let y suffice for x. We infer that Q(y). Since if Q(x), then T(x) for all x, let y suffice for x. Since if Q(y), then T(y) and Q(y), it follows by modus-ponens that T(y). Since y was picked arbitrarily, and we've concluded that T(y), it follows that for all y, T(y). Finally, we conclude that if for all x, P(x) and Q(x), and if for all x, if Q(x), then T(x), then for all y, T(y).

The corresponding NDL proof:

    assume (forall x (P(x) & Q(x)))
    assume (forall x (Q(x) ==> T(x)))
    pick-any y
    begin
      specialize (forall x (P(x) & Q(x))) with y;
      Q(y) BY right-and (P(y) & Q(y));
      specialize (forall x (Q(x) ==> T(x))) with y;
      modus-ponens (forall y (Q(y) ==> T(y))), Q(y)
    end
    conclusion (under premises): (forall y T(y))
Formula-level Structure
• Formulas are either left intact or manually manipulated
• Formula structure is in development, to handle formulas automatically
• Propositions and predicates can be checked for contextual validity
• Greater integration with the NLP architecture

New Order Demo
Output in English: Development
• The previous example was very inflexible
• Longer proofs will be tedious to read
• We intend to develop:
  • Post-processing
  • An NLP architecture plus ad hoc proof-conducive rules
  • Model/pictographic verbalization
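One post-processing pass the slide hints at can be sketched as sentence aggregation: collapsing runs of repetitive inference sentences into a single one. The "We infer that ..." pattern and the merged phrasing below are assumptions for illustration, not the authors' planned rules.

```python
# Hypothetical post-processing sketch: merge consecutive "We infer that X."
# sentences into one aggregated sentence. Illustrative only, not the
# actual post-processing planned by the authors.
import re

def aggregate_inferences(text: str) -> str:
    # Match runs of two or more consecutive "We infer that ..." sentences.
    pattern = re.compile(r"(?:We infer that [^.]+\. )+We infer that [^.]+\.")

    def merge(match: re.Match) -> str:
        claims = re.findall(r"We infer that ([^.]+)\.", match.group(0))
        return "We infer that " + ", and that ".join(claims) + "."

    return pattern.sub(merge, text)

print(aggregate_inferences("We infer that Q(y). We infer that T(y)."))
# We infer that Q(y), and that T(y).
```

Real post-processing would need many such passes (referring-expression smoothing, paragraph planning), but the pattern-rewrite shape would be similar.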
Case Study #4, Chart H: Report Generation

Conclusion: Based on evidence gathered by both the FBI and the CIA, we conclude that terrorists operating in New York City have access to the New York Stock Exchange. This is evident from information concerning Hamid Alwan (aka Mark Davis). Because Alwan installs and services vending machines for the NYSE as an employee of Empire State Vending Services, he has access to the NYSE. Since Alwan collaborates with Shahim Albakri, a known terrorist, our conclusion follows.