Chapter 6: Representing Knowledge Using Rules
323-670 Artificial Intelligence
Dr. Wiphada Wettayaprasit, Department of Computer Science, Faculty of Science, Prince of Songkla University
Representations
• procedural representation
• declarative representation

Facts and rule:
man(Marcus)
man(Caesar)
person(Cleopatra)
∀x : man(x) → person(x)

Query: ∃y : person(y)
Answers: y = Cleopatra, y = Marcus, y = Caesar
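The same example can be run directly as a logic program. Below is a minimal Prolog sketch of the slide's facts and rule (constant names are lowercased only because Prolog requires it; the query mirrors ∃y : person(y)):

  % Facts and rule from the slide, written as Prolog clauses.
  man(marcus).
  man(caesar).
  person(cleopatra).
  person(X) :- man(X).      % every man is a person

  % Query: is there a y such that person(y)?
  % ?- person(Y).
  % Y = cleopatra ;
  % Y = marcus ;
  % Y = caesar.

The procedural reading fixes the order of the answers: Prolog tries the person(cleopatra) fact before the person(X) :- man(X) rule, so Cleopatra is reported first.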
A Representation in PROLOG (rules and a fact; logic programming):
apartmentpet(X) :- pet(X), small(X).
pet(X) :- cat(X).
pet(X) :- dog(X).
dog(X) :- poodle(X).
small(X) :- poodle(X).
poodle(fluffy).

A Representation in Logic:
∀x : pet(x) ∧ small(x) → apartmentpet(x)
∀x : cat(x) ∨ dog(x) → pet(x)
∀x : poodle(x) → dog(x) ∧ small(x)
poodle(fluffy)

Figure 6.1: A Declarative and a Procedural Representation
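As a usage sketch, loading the PROLOG clauses above and asking whether fluffy is an apartment pet walks the same chain of rules that the logic version licenses (the console interaction shown is hypothetical):

  % ?- apartmentpet(fluffy).
  % pet(fluffy)   holds via pet(X) :- dog(X)  and  dog(X) :- poodle(X)
  % small(fluffy) holds via small(X) :- poodle(X)
  % true.
  %
  % ?- apartmentpet(X).
  % X = fluffy.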
Matching
1. Indexing. Problems:
   1. a large number of rules is needed
   2. the preconditions of a rule are not always obvious
2. Matching with variables. Problem: the many-many match problem, in which many rules are matched against many elements in the state description simultaneously; the grandson rule sketched after this list is a small example:
   son(X,Y) ∧ son(Y,Z) → grandson(X,Z)
   (as a Prolog clause: grandson(X,Z) :- son(X,Y), son(Y,Z).)
3. Complex and approximate matching
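Written the Prolog way round, the grandson rule plus a couple of invented son/2 facts (marcus, publius, and tertius are purely illustrative, and son(X,Y) is read here as "Y is a son of X") show where the many-many match problem comes from: each of the two son conditions can unify with every son fact in working memory, so the number of candidate pairings grows with the square of the number of facts.

  % Hypothetical family facts, read as son(Parent, Child).
  son(marcus, publius).
  son(publius, tertius).

  % Rule: Z is a grandson of X if some Y is a son of X and Z is a son of Y.
  grandson(X, Z) :- son(X, Y), son(Y, Z).

  % ?- grandson(marcus, Who).
  % Who = tertius.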
Indexing
Complex Indexing
Rules whose preconditions only approximately match the current situation → ELIZA

Person: Men are all alike.
ELIZA: In what way?
Person: They're always bugging us about something or other.
ELIZA: Can you think of a specific example?
Person: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here.
Person: He says I'm depressed much of the time.
ELIZA: I am sorry to hear you are depressed.
Person: It's true. I am unhappy.
ELIZA: Do you think coming here will help you not to be unhappy?
Person: I need some help, that much seems certain.
ELIZA: What would it mean to you if you got some help?
Person: Perhaps I could learn to get along with my mother.
ELIZA: Tell me more about your family.

Figure 6.6: A Bit of a Dialogue with ELIZA
Complex Indexing
(X me Y) → (X you Y)
(I remember X) → (Why do you remember X just now?)
(My {family-member} is Y) → (Who else in your family is Y?)
(X {family-member} Y) → (Tell me more about your family)

Figure 6.7: Some ELIZA-like Rules
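Rules like these can be approximated with plain list matching. The sketch below encodes two of them using a hypothetical respond/2 predicate over lists of words; it is a toy illustration of pattern-directed response, not Weizenbaum's actual program.

  % (X me Y) -> (X you Y)
  respond(Input, Reply) :-
      append(X, [me | Y], Input),
      append(X, [you | Y], Reply).

  % (I remember X) -> (Why do you remember X just now?)
  respond(Input, Reply) :-
      append([i, remember], X, Input),
      append([why, do, you, remember], Tail, Reply),
      append(X, [just, now], Tail).

  % ?- respond([i, remember, my, mother], R).
  % R = [why, do, you, remember, my, mother, just, now].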
Conflict Resolution
• Preference based on rules: prefer a specific rule over a generalization of it (the specific rule gets higher priority), e.g. "birds can fly" vs. "penguins cannot fly" (see the sketch after this list).
• Preference based on objects: prefer rules that match the more important objects, as in ELIZA (for example, "I" is semantically significant, while "everybody" is rarely used).
• Preference based on states: rank the resulting states with a heuristic function.
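One way to realize the "specific rule beats general rule" preference in Prolog is clause ordering plus the cut, as in this minimal sketch (tweety and opus are invented example birds):

  % Specific rule first: penguins do not fly.
  flies(X) :- penguin(X), !, fail.
  % General rule: birds fly.
  flies(X) :- bird(X).

  bird(tweety).
  bird(opus).
  penguin(opus).

  % ?- flies(tweety).   ->  true.
  % ?- flies(opus).     ->  false.

The cut commits to the more specific rule once its extra condition holds, which mirrors the specificity ordering described above.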
Control Knowledge
• Knowledge about which paths are most likely to lead to the goal state.
• Knowledge about which rules to apply in a given situation.
• Knowledge about the order in which to pursue subgoals.
• Knowledge about useful sequences of rules to apply.

1. Long-term memory → rules
2. Short-term memory → working memory
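The long-term/short-term memory split is the classic production-system architecture. Below is a minimal forward-chaining sketch under that reading: rule/3 clauses stand in for long-term memory, dynamic fact/1 clauses stand in for working memory, and all rule and predicate names are illustrative.

  :- dynamic fact/1.

  % Long-term memory: rule(Name, Conditions, Conclusion).
  rule(r1, [man(X)], person(X)).

  % Short-term (working) memory: initial facts.
  fact(man(marcus)).

  % Fire any rule whose conditions hold and whose conclusion is new,
  % add the conclusion to working memory, and repeat until quiescence.
  run :-
      rule(_Name, Conds, Concl),
      all_hold(Conds),
      \+ fact(Concl),
      assertz(fact(Concl)),
      run.
  run.

  all_hold([]).
  all_hold([C | Cs]) :- fact(C), all_hold(Cs).

  % ?- run, fact(person(Who)).
  % Who = marcus.

This loop simply fires the first applicable rule it finds; the conflict-resolution preferences from the previous slide would decide which applicable rule to fire when several match at once.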
Control Knowledge

Under conditions A and B,
rules that do {not} mention X {at all | in their left-hand side | in their right-hand side}
will {definitely be useless | probably be useless | ... | probably be especially useful | definitely be especially useful}

Figure 6.8: Syntax for a Control Rule [Davis, 1980]
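Figure 6.8 describes meta-rules: rules whose job is to rank or filter the object-level rules. A hypothetical reading of the template over the rule/3 representation sketched above (the conditions a and b and the symbol x are placeholders, as in the figure):

  % Meta-rule: under conditions a and b, a rule that does not mention x
  % in its left-hand side is probably useless.
  probably_useless(Name) :-
      fact(a), fact(b),
      rule(Name, Conditions, _Conclusion),
      \+ mentions_any(Conditions, x).

  % True if some condition's functor is the given symbol.
  mentions_any(Conditions, Symbol) :-
      member(Cond, Conditions),
      functor(Cond, Symbol, _).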
End of Chapter 6
"One that would have the fruit must climb the tree."