Artificial Intelligence: Natural Language • A little more on grammars • Semantics • Pragmatics • Generation
More on grammars • Consider the following examples: • “John likes.” NOT OK • “John jumps.” OK • “John jumps in the water.” OK • “The small fluffy cat jumps.” OK • “John like the cat.” NOT OK • “The cats likes John.” NOT OK • “The cat on the table likes John.” OK
Better grammar • Should deal with: • Intransitive/transitive verbs. The former are ones that don’t need a following noun phrase. • Prepositional phrases (e.g., “in the lake”). A preposition followed by a noun phrase. • Series of adjectives. A recursive rule can be used. • Subject-verb agreement. Can add arguments to grammar rules/dictionary entries, as in the rules here and the fuller sketch that follows: • sentence --> np(Num), vp(Num). • np(Num) --> art, noun(Num). • noun(sing) --> [cat].
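A minimal runnable sketch of the agreement idea (assuming SWI-Prolog DCG syntax; the dictionary entries beyond those above are illustrative, not from the lecture):
sentence --> np(Num), vp(Num).
np(Num) --> art, noun(Num).
vp(Num) --> verb(Num).
art --> [the].
noun(sing) --> [cat].
noun(plur) --> [cats].
verb(sing) --> [jumps].
verb(plur) --> [jump].
% ?- phrase(sentence, [the, cat, jumps]).   succeeds
% ?- phrase(sentence, [the, cats, jumps]).  fails: agreement mismatch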
Semantics • Syntax: uses a grammar to structure the sentence. • Semantics: maps this to a structured representation that can be used in inference (often referred to as sentence meaning). • Possible representations: • SQL: map “Find me all the students who are taking AI3” to the relevant SQL query. • Predicate logic: map “John loves anyone who is tall” onto the relevant statement in predicate logic. • Other structured representations (e.g., a “case frame”): action: loves, subject: john, object: mary
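For instance, the “John loves Mary” meaning could be written as a Prolog term in either style (an illustrative sketch; the functor names here are assumptions):
meaning_logic(loves(john, mary)).           % predicate-logic style
meaning_frame(frame(action(loves),          % case-frame style
                    subject(john),
                    object(mary))).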
Semantics • How do we get from the parsed sentence to this kind of representation? • In general rather tricky, but to illustrate the idea we will show how it could be done for “John loves Mary” by adding extra arguments to a Prolog grammar. • We want to map that sentence to • loves(john, mary). • We will cheat by assuming that the functor of a Prolog structured object can be a variable: • Verb(Subject, Object)
Grammar with Semantics
sentence(Verb(Subject, Object)) --> nounPhrase(Subject), verbPhrase(Verb, Object).
nounPhrase(Subject) --> properName(Subject).
verbPhrase(Verb, Object) --> verb(Verb), nounPhrase(Object).
• The general idea is that we can “compose” the sentence meaning by working out the “meaning” of the syntactic constituents and sticking the results together somehow.
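Since real Prolog does not allow a variable functor, a runnable version of the same grammar can build the term with =.. (“univ”) instead (a minimal sketch, assuming SWI-Prolog; the lexicon entries are illustrative):
sentence(Meaning) --> nounPhrase(Subject), verbPhrase(Verb, Object),
    { Meaning =.. [Verb, Subject, Object] }.   % builds e.g. loves(john, mary)
nounPhrase(Subject) --> properName(Subject).
verbPhrase(Verb, Object) --> verb(Verb), nounPhrase(Object).
properName(john) --> [john].
properName(mary) --> [mary].
verb(loves) --> [loves].
% ?- phrase(sentence(M), [john, loves, mary]).  gives M = loves(john, mary)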
Pragmatics • But we can’t get very far without knowing something about the world, and the context in which a sentence is uttered. • Pragmatics deals with this. • Example: determining the referents of pronouns etc. • “John likes that blue car. He buys it.” • We need context to determine what is being referred to by “that blue car”, “he”, and “it”. • Then we can create the meaning: likes(john, car1) and buys(john, car1).
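As a toy sketch of how such context might be represented (all names here are illustrative assumptions, not a standard approach), the discourse context can be kept as a list of recently mentioned entities, with a pronoun resolving to the first compatible one:
% Context is a list of entity(Name, Kind) terms, most recent first.
resolve(he, Context, X) :- member(entity(X, male), Context), !.
resolve(it, Context, X) :- member(entity(X, object), Context), !.
% ?- resolve(he, [entity(car1, object), entity(john, male)], X).  gives X = john
% ?- resolve(it, [entity(car1, object), entity(john, male)], X).  gives X = car1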
Pragmatics • Pragmatics is also about what people DO with language. • Making sense of, and generating, language involves mapping language to goals. • “Do you have the time?” -> the speaker wants to know the time. • “When is the last train to London?” -> the speaker probably wants to go there. • We can apply some of our planning ideas to this problem.
Pragmatics and Plans • As an example of a plan-based approach to language, consider the actions of requesting, informing, and asking. • These are referred to as “speech acts”. • We can describe them as planning operators. • The preconditions and effects refer to the speaker’s and hearer’s beliefs and desires. • We use a notation to describe these: • knows(Agent, Fact) • wants(Agent, State/Action) • e.g., wants(fred, kiss(fred, mary)) • knows(fred, loves(mary, joe))
More speech acts • Sketch of inform and request: • inform(Speaker, Hearer, Fact) pre: knows(Speaker, Fact) wants(Speaker, knows(Hearer, Fact)) add: knows(Hearer, Fact) knows(Speaker, knows(Hearer, Fact)) • How does this oversimplify the “informing” action? • request(Speaker, Hearer, do(Hearer, Action)) pre: wants(Speaker, Action) knows(Speaker, cando(Hearer, Action)) add: wants(Hearer, Action) • (Note: a bit tricky to integrate with ordinary planning rules.) • We talk of people having “communicative goals” (like wanting someone to know something); a possible encoding of these operators is sketched below.
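A minimal sketch of the two operators in a STRIPS-style Prolog encoding (the operator/4 representation — name, preconditions, add list, delete list — is an assumption for illustration, not a standard one):
operator(inform(Speaker, Hearer, Fact),
         [knows(Speaker, Fact), wants(Speaker, knows(Hearer, Fact))],    % preconditions
         [knows(Hearer, Fact), knows(Speaker, knows(Hearer, Fact))],     % add list
         []).                                                            % delete list
operator(request(Speaker, Hearer, do(Hearer, Action)),
         [wants(Speaker, Action), knows(Speaker, cando(Hearer, Action))],
         [wants(Hearer, Action)],
         []).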
Putting it all together • Given sentences like the following, spoken by John about Fred: • “What is the time?” • “He has missed the train.” • We can now: • parse each sentence; • map it to a structured representation that is good for inference; • use context and knowledge of goals/plans to obtain from that: • wants(john, knows(john, time1)) (where time1 is the time at some instant) • believes(john, missed(fred, train2))
Language Generation • Language processing is also about the generation of language: • structured representation --> NL text. • The simplest generation method uses templates, mapping the representation straight to a text template (with variables/slots to fill in): • loves(X, Y) -> X “loves” Y • gives(X, Y, Z) -> X “gives the” Y “to” Z • Mail-merge tools in word processors work similarly, extracting data from a simple database to fill slots.
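In Prolog the template approach can be sketched in a few lines (a minimal illustration; the template/2 and generate/2 names are assumptions):
template(loves(X, Y), [X, loves, Y]).
template(gives(X, Y, Z), [X, gives, the, Y, to, Z]).
generate(Fact, Words) :- template(Fact, Words).
% ?- generate(gives(john, book, mary), W).  gives W = [john, gives, the, book, to, mary]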
Language Generation • But there is much more to language generation in general; templates are very rigid. • Consider “John eats the cheese. John eats the apple. John sneezes. John laughs.” • This reads better as “John eats the cheese and the apple, then sneezes. He then laughs.” • Getting good style involves working out how to map many facts to one sentence, when to use pronouns, and when to use “connectives” like “then”.
Language Generation • Serious language generation involves deciding: • what to say; • how to order and structure it; • how to break it up into sentences; • how to refer to objects (using pronouns, and expressions like “the cat”, etc.); • how to express things in terms of grammatically correct sentences. • Often the starting point is a communicative goal.
Summary • Natural Language Processing includes: • Syntax • Semantics • Pragmatics • And involves: • Generating language • Understanding language