Artificial Intelligence Knowledge Representation Problem 2
Reverse translation
• Translate the following into English.
• ∀x hesitates(x) → lost(x)
  • He who hesitates is lost.
• ¬∃x business(x) ∧ like(x,Showbusiness)
  • There is no business like show business.
• ¬∀x glitters(x) → gold(x)
  • Not everything that glitters is gold.
• ∃x ∀t person(x) ∧ time(t) → canfool(x,t)
  • You can fool some of the people all the time.
Translating English to FOL
Every gardener likes the sun.
  ∀x gardener(x) → likes(x, Sun)
You can fool some of the people all of the time.
  ∃x ∀t person(x) ∧ time(t) → can-fool(x,t)
You can fool all of the people some of the time.
  ∀x ∃t (person(x) → time(t) ∧ can-fool(x,t))
  ∀x (person(x) → ∃t (time(t) ∧ can-fool(x,t)))      (equivalent)
All purple mushrooms are poisonous.
  ∀x (mushroom(x) ∧ purple(x)) → poisonous(x)
No purple mushroom is poisonous.
  ¬∃x purple(x) ∧ mushroom(x) ∧ poisonous(x)
  ∀x (mushroom(x) ∧ purple(x)) → ¬poisonous(x)       (equivalent)
There are exactly two purple mushrooms.
  ∃x ∃y mushroom(x) ∧ purple(x) ∧ mushroom(y) ∧ purple(y) ∧ ¬(x=y) ∧ ∀z (mushroom(z) ∧ purple(z)) → ((x=z) ∨ (y=z))
Clinton is not tall.
  ¬tall(Clinton)
X is above Y iff X is directly on top of Y or there is a pile of one or more other objects directly on top of one another starting with X and ending with Y.
  ∀x ∀y above(x,y) ↔ (on(x,y) ∨ ∃z (on(x,z) ∧ above(z,y)))
Resolution for first-order logic • for all x: (NOT(Knows(John, x)) OR IsMean(x) OR Loves(John, x)) • John loves everything he knows, with the possible exception of mean things • for all y: (Loves(Jane, y) OR Knows(y, Jane)) • Jane loves everything that does not know her • What can we unify? What can we conclude? • Use the substitution: {x/Jane, y/John} • Get: IsMean(Jane) OR Loves(John, Jane) OR Loves(Jane, John) • Complete (i.e., if not satisfiable, will find a proof of this), if we can remove literals that are duplicates after unification • Also need to put everything in canonical form first
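To make the unification and resolution step concrete, here is a minimal Python sketch (illustrative only; the helper names is_var, unify, unify_var, and substitute are our own, and the occurs check is omitted). Atoms are tuples, lowercase strings are variables, and resolving on the Knows literal reproduces the conclusion above.

def is_var(t):
    # Lowercase strings ("x", "y") are variables; capitalized strings are constants.
    return isinstance(t, str) and t[0].islower()

def unify(a, b, subst):
    # Return a substitution extending subst that makes a and b equal, or None.
    if subst is None:
        return None
    if a == b:
        return subst
    if is_var(a):
        return unify_var(a, b, subst)
    if is_var(b):
        return unify_var(b, a, subst)
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for ai, bi in zip(a, b):
            subst = unify(ai, bi, subst)
        return subst
    return None

def unify_var(v, t, subst):
    if v in subst:
        return unify(subst[v], t, subst)
    return {**subst, v: t}

def substitute(atom, subst):
    return tuple(subst.get(t, t) for t in atom)

# Clause 1: ~Knows(John, x) OR IsMean(x) OR Loves(John, x)
# Clause 2:  Loves(Jane, y)  OR Knows(y, Jane)
c1 = [("~", ("Knows", "John", "x")), ("+", ("IsMean", "x")), ("+", ("Loves", "John", "x"))]
c2 = [("+", ("Loves", "Jane", "y")), ("+", ("Knows", "y", "Jane"))]

theta = unify(("Knows", "John", "x"), ("Knows", "y", "Jane"), {})
print(theta)  # {'y': 'John', 'x': 'Jane'}

resolvent = [(sign, substitute(a, theta))
             for sign, a in c1 + c2
             if a not in (("Knows", "John", "x"), ("Knows", "y", "Jane"))]
print(resolvent)
# [('+', ('IsMean', 'Jane')), ('+', ('Loves', 'John', 'Jane')), ('+', ('Loves', 'Jane', 'John'))]

The printed resolvent is exactly IsMean(Jane) OR Loves(John, Jane) OR Loves(Jane, John), as derived above.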
Converting sentences to CNF
1. Eliminate all ↔ connectives: (P ↔ Q) ⇒ ((P → Q) ∧ (Q → P))
2. Eliminate all → connectives: (P → Q) ⇒ (¬P ∨ Q)
3. Reduce the scope of each negation symbol to a single predicate:
   ¬¬P ⇒ P
   ¬(P ∨ Q) ⇒ ¬P ∧ ¬Q
   ¬(P ∧ Q) ⇒ ¬P ∨ ¬Q
   ¬(∀x)P ⇒ (∃x)¬P
   ¬(∃x)P ⇒ (∀x)¬P
4. Standardize variables: rename all variables so that each quantifier has its own unique variable name
Converting sentences to CNF (continued)
5. Eliminate existential quantification by introducing Skolem constants/functions:
   (∃x)P(x) ⇒ P(c), where c is a Skolem constant (a brand-new constant symbol that is not used in any other sentence)
   (∀x)(∃y)P(x,y) ⇒ (∀x)P(x, f(x))
   Since ∃y is within the scope of a universally quantified variable, use a Skolem function f to construct a new value that depends on the universally quantified variable. f must be a brand-new function name not occurring in any other sentence in the KB.
   E.g., (∀x)(∃y)loves(x,y) ⇒ (∀x)loves(x, f(x)). In this case, f(x) specifies the person that x loves.
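As a rough illustration of the purely propositional part of this procedure (steps 1-3), the following Python sketch represents formulas as nested tuples; quantifier handling and Skolemization are omitted, and the operator tags "iff", "imp", "not", "and", "or" are ad hoc choices rather than any standard library.

def eliminate_iff(f):
    # Step 1: (P <-> Q)  becomes  (P -> Q) ^ (Q -> P)
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_iff(a) for a in args]
    if op == "iff":
        p, q = args
        return ("and", ("imp", p, q), ("imp", q, p))
    return (op, *args)

def eliminate_imp(f):
    # Step 2: (P -> Q)  becomes  (~P v Q)
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_imp(a) for a in args]
    if op == "imp":
        p, q = args
        return ("or", ("not", p), q)
    return (op, *args)

def push_not(f):
    # Step 3: move each negation inward until it sits on a single atom.
    if isinstance(f, str):
        return f
    op, *args = f
    if op == "not":
        g = args[0]
        if isinstance(g, str):
            return f                      # negation already on an atom
        gop, *gargs = g
        if gop == "not":                  # ~~P  becomes  P
            return push_not(gargs[0])
        if gop == "and":                  # ~(P ^ Q)  becomes  ~P v ~Q
            return ("or", *[push_not(("not", a)) for a in gargs])
        if gop == "or":                   # ~(P v Q)  becomes  ~P ^ ~Q
            return ("and", *[push_not(("not", a)) for a in gargs])
    return (op, *[push_not(a) for a in args])

f = ("iff", "P", "Q")
print(push_not(eliminate_imp(eliminate_iff(f))))
# ('and', ('or', ('not', 'P'), 'Q'), ('or', ('not', 'Q'), 'P'))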
Modus Ponens - special case of Resolution
• p ⇒ q
• p
• ∴ q
• Sunday ⇒ Dr Yasser is teaching AI
• Sunday
• ∴ Dr Yasser is teaching AI
• Using the tricks:
• p ⇒ q is rewritten as ¬p ∨ q
• resolving ¬p ∨ q with p leaves q, i.e. ∴ q
Sound rules of inference
• Each can be shown to be sound using a truth table

  Rule               Premise          Conclusion
  Modus Ponens       A, A → B         B
  And Introduction   A, B             A ∧ B
  And Elimination    A ∧ B            A
  Double Negation    ¬¬A              A
  Unit Resolution    A ∨ B, ¬B        A
  Resolution         A ∨ B, ¬B ∨ C    A ∨ C
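The truth-table argument in the first bullet is easy to mechanize. The short Python sketch below (the helper name entails and the lambda encoding of premises are our own) enumerates every assignment and confirms that Resolution and Modus Ponens are sound.

from itertools import product

def entails(premises, conclusion, symbols):
    # Sound iff every assignment satisfying all premises also satisfies the conclusion.
    for values in product([False, True], repeat=len(symbols)):
        env = dict(zip(symbols, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False
    return True

# Resolution: from (A v B) and (~B v C), conclude (A v C)
print(entails([lambda e: e["A"] or e["B"], lambda e: (not e["B"]) or e["C"]],
              lambda e: e["A"] or e["C"], ["A", "B", "C"]))   # True

# Modus Ponens: from A and (A -> B), conclude B
print(entails([lambda e: e["A"], lambda e: (not e["A"]) or e["B"]],
              lambda e: e["B"], ["A", "B"]))                  # True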
An example
(∀x)(P(x) → ((∀y)(P(y) → P(f(x,y))) ∧ ¬(∀y)(Q(x,y) → P(y))))
2. Eliminate →
   (∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ ¬(∀y)(¬Q(x,y) ∨ P(y))))
3. Reduce scope of negation
   (∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃y)(Q(x,y) ∧ ¬P(y))))
4. Standardize variables
   (∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃z)(Q(x,z) ∧ ¬P(z))))
5. Eliminate existential quantification
   (∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
6. Drop universal quantification symbols
   (¬P(x) ∨ ((¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
Forward chaining • Proofs start with the given axioms/premises in KB, deriving new sentences until the goal/query sentence is derived • This defines a forward-chaining inference procedure because it moves “forward” from the KB to the goal [eventually]
Forward chaining • Idea: fire any rule whose premises are satisfied in the KB, • add its conclusion to the KB, until query is found
Backward chaining • Proofs start with the goal query, find rules with that conclusion, and then prove each of the antecedents in the implication • Keep going until you reach premises • Avoid loops: check if new sub-goal is already on the goal stack • Avoid repeated work: check if new sub-goal • Has already been proved true • Has already failed
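A minimal backward-chaining sketch over propositional Horn rules is shown below; the rule encoding and the name backward_chain are illustrative, but the goal-stack, already-proved, and already-failed checks correspond directly to the bullets above.

def backward_chain(goal, rules, facts, stack=None, proved=None, failed=None):
    stack  = stack  if stack  is not None else set()
    proved = proved if proved is not None else set()
    failed = failed if failed is not None else set()

    if goal in facts or goal in proved:   # already known or already proved
        return True
    if goal in failed:                    # already failed: do not retry
        return False
    if goal in stack:                     # already on the goal stack: loop
        return False

    stack.add(goal)
    for premises in rules.get(goal, []):  # rules whose conclusion is goal
        if all(backward_chain(p, rules, facts, stack, proved, failed)
               for p in premises):
            stack.remove(goal)
            proved.add(goal)
            return True
    stack.remove(goal)
    failed.add(goal)
    return False

rules = {"Q": [["P"]], "P": [["Q"], ["R"]]}      # Q <- P, P <- Q (a cycle), and P <- R
print(backward_chain("P", rules, facts={"R"}))   # True, despite the P/Q cycle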
Forward chaining example
• KB:
  • allergies(X) → sneeze(X)
  • cat(Y) ∧ allergic-to-cats(X) → allergies(X)
  • cat(Felix)
  • allergic-to-cats(Lise)
• Goal:
  • sneeze(Lise)
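A small forward-chaining sketch for this KB is given below; single uppercase letters act as pattern variables, and the helper names match, all_matches, and forward_chain are chosen for illustration. It derives the goal sneeze(Lise) from the four sentences above.

def is_var(t):
    return isinstance(t, str) and len(t) == 1 and t.isupper()

def match(pattern, fact, bindings):
    # Bind the pattern's variables so that it equals the ground fact, if possible.
    if len(pattern) != len(fact) or pattern[0] != fact[0]:
        return None
    b = dict(bindings)
    for p, f in zip(pattern[1:], fact[1:]):
        if is_var(p):
            if p in b and b[p] != f:
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def all_matches(premises, facts, bindings):
    # Yield every binding that satisfies all premises against the known facts.
    if not premises:
        yield bindings
        return
    first, rest = premises[0], premises[1:]
    for fact in facts:
        b = match(first, fact, bindings)
        if b is not None:
            yield from all_matches(rest, facts, b)

def forward_chain(rules, facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            new = [tuple(b.get(t, t) for t in conclusion)
                   for b in all_matches(premises, list(facts), {})]
            for atom in new:
                if atom not in facts:
                    facts.add(atom)          # fire the rule: add its conclusion
                    changed = True
    return facts

rules = [
    ([("allergies", "X")], ("sneeze", "X")),                          # allergies(X) -> sneeze(X)
    ([("cat", "Y"), ("allergic-to-cats", "X")], ("allergies", "X")),  # cat(Y) ^ allergic-to-cats(X) -> allergies(X)
]
facts = {("cat", "Felix"), ("allergic-to-cats", "Lise")}
print(("sneeze", "Lise") in forward_chain(rules, facts))              # True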
Reduction to propositional inference
Suppose the KB contains just the following:
  ∀x King(x) ∧ Greedy(x) → Evil(x)
  King(Ali)
  Greedy(Ali)
  Brother(Saad, Ali)
Instantiating the universal sentence in all possible ways, we have:
  King(Ali) ∧ Greedy(Ali) → Evil(Ali)
  King(Saad) ∧ Greedy(Saad) → Evil(Saad)
  King(Ali), Greedy(Ali), Brother(Saad, Ali)
• The new KB is propositionalized: proposition symbols are King(Ali), Greedy(Ali), Evil(Ali), King(Saad), etc.
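A toy sketch of this instantiation step, assuming the constants have already been collected and treating sentences as plain strings (fragile, but enough to show the idea):

# Instantiate the universal sentence with every ground constant in the KB.
constants = ["Ali", "Saad"]
universal = "King(x) & Greedy(x) => Evil(x)"

for c in constants:
    print(universal.replace("x", c))
# King(Ali) & Greedy(Ali) => Evil(Ali)
# King(Saad) & Greedy(Saad) => Evil(Saad)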
An example • Sameh is a lawyer. • Lawyers are rich. • Rich people have big houses. • Big houses are a lot of work. • We would like to conclude that Sameh’s house is a lot of work.
Axiomatization 1
1. lawyer(Sameh)
2. ∀x lawyer(x) → rich(x)
3. ∀x rich(x) → ∃y house(x,y)
4. ∀x,y rich(x) ∧ house(x,y) → big(y)
5. ∀x,y ( house(x,y) ∧ big(y) → work(y) )
• 3 and 4 say that rich people do have at least one house and all their houses are big.
• Conclusion we want to show: house(Sameh, S_house) ∧ work(S_house)
• Or, do we want to conclude that Sameh has at least one house that needs a lot of work? I.e.
  ∃y house(Sameh,y) ∧ work(y)
Hassan and the cat • Everyone who loves all animals is loved by someone. • Anyone who kills an animal is loved by no one. • Mustafa loves all animals. • Either Mustafa or Hassan killed the cat, who is named SoSo. • Did Hassan kill the cat?
Practice example: Did Hassan kill the cat?
• Mustafa owns a dog. Every dog owner is an animal lover. No animal lover kills an animal. Either Hassan or Mustafa killed the cat, who is named SoSo. Did Hassan kill the cat?
• These can be represented as follows:
  A. (∃x) Dog(x) ∧ Owns(Mustafa, x)
  B. (∀x) ((∃y) Dog(y) ∧ Owns(x, y)) → AnimalLover(x)
  C. (∀x) AnimalLover(x) → ((∀y) Animal(y) → ¬Kills(x,y))
  D. Kills(Mustafa, SoSo) ∨ Kills(Hassan, SoSo)
  E. Cat(SoSo)
  F. (∀x) Cat(x) → Animal(x)
  G. Kills(Hassan, SoSo)   GOAL
Convert to clause form
  A1. (Dog(D))
  A2. (Owns(Mustafa, D))
  B. (¬Dog(y), ¬Owns(x, y), AnimalLover(x))
  C. (¬AnimalLover(a), ¬Animal(b), ¬Kills(a,b))
  D. (Kills(Mustafa, SoSo), Kills(Hassan, SoSo))
  E. Cat(SoSo)
  F. (¬Cat(z), Animal(z))
• Add the negation of the query:
  ¬G. (¬Kills(Hassan, SoSo))
The resolution refutation proof
R1: ¬G, D, {}                     (Kills(Mustafa, SoSo))
R2: R1, C, {a/Mustafa, b/SoSo}    (¬AnimalLover(Mustafa), ¬Animal(SoSo))
R3: R2, B, {x/Mustafa}            (¬Dog(y), ¬Owns(Mustafa, y), ¬Animal(SoSo))
R4: R3, A1, {y/D}                 (¬Owns(Mustafa, D), ¬Animal(SoSo))
R5: R4, A2, {}                    (¬Animal(SoSo))
R6: R5, F, {z/SoSo}               (¬Cat(SoSo))
R7: R6, E, {}                     FALSE
The proof tree shows the same refutation drawn as a tree: ¬G resolves with D to give R1; R1 with C (substitution {a/Mustafa, b/SoSo}) gives R2; R2 with B ({x/Mustafa}) gives R3; R3 with A1 ({y/D}) gives R4; R4 with A2 gives R5; R5 with F ({z/SoSo}) gives R6; and R6 with E gives R7: FALSE.
Example knowledge base • The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American. • Prove that Col. West is a criminal
Example knowledge base
... it is a crime for an American to sell weapons to hostile nations:
  American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) → Criminal(x)
Nono ... has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
  Owns(Nono, M1) and Missile(M1)
... all of its missiles were sold to it by Colonel West:
  Missile(x) ∧ Owns(Nono,x) → Sells(West, x, Nono)
Missiles are weapons:
  Missile(x) → Weapon(x)
An enemy of America counts as "hostile":
  Enemy(x, America) → Hostile(x)
West, who is American ...
  American(West)
The country Nono, an enemy of America ...
  Enemy(Nono, America)
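Grounding these rules by hand with the constants West, M1, Nono, and America, a few lines of Python forward chaining derive Criminal(West); the encoding below is an illustrative sketch, not a general first-order prover.

# Each rule is (set of ground premises, ground conclusion); facts come from the KB above.
rules = [
    ({"American(West)", "Weapon(M1)", "Sells(West,M1,Nono)", "Hostile(Nono)"},
     "Criminal(West)"),
    ({"Missile(M1)", "Owns(Nono,M1)"}, "Sells(West,M1,Nono)"),
    ({"Missile(M1)"}, "Weapon(M1)"),
    ({"Enemy(Nono,America)"}, "Hostile(Nono)"),
]
facts = {"Owns(Nono,M1)", "Missile(M1)", "American(West)", "Enemy(Nono,America)"}

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)       # fire the rule
            changed = True

print("Criminal(West)" in facts)        # True: West is a criminal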
Rule-Based Systems • Also known as “production systems” or “expert systems” • Rule-based systems are one of the most successful AI paradigms • Used for synthesis (construction) type systems • Also used for analysis (diagnostic or classification) type systems
Rule-Based Reasoning
• The advantages of the rule-based approach:
  • The ability to use knowledge in a natural, modular if-then form
  • Good performance
  • Good explanation
• The disadvantages are:
  • Cannot handle missing information
  • Knowledge tends to be very task dependent
Other Reasoning
• There are other reasoning approaches, such as:
  • Case-Based Reasoning
  • Model-Based Reasoning
  • Hybrid Reasoning
    • Rule-based + case-based
    • Rule-based + model-based
    • Model-based + case-based
Expert System
• Uses domain-specific knowledge to provide expert-quality performance in a problem domain
• It is a practical program that uses heuristic strategies developed by human experts to solve specific classes of problems
Expert System Functionality
• replaces human expert decision making when the expert is not available
• assists the human expert in integrating various decisions
• provides an ES user with
  • an appropriate hypothesis
  • a methodology for knowledge storage and reuse
• expert system – a software system simulating expert-like decision making while keeping knowledge separate from the reasoning mechanism
Expert System architecture (diagram): the user interacts through the user interface (question & answer, natural language, or a graphical interface, plus a knowledge editor); the inference engine draws on general knowledge (the knowledge base) and case-specific data, and an explanation component justifies the system's conclusions.
Expert System Components • Global Database • content of working memory (WM) • Production Rules • knowledge-base for the system • Inference Engine • rule interpreter and control subsystem
Rule-Based System
• knowledge in the form of if condition then effect (production) rules
• reasoning algorithm (recognize-act cycle), sketched in code below:
  (i) FR ← detect(WM)
  (ii) R ← select(FR)
  (iii) WM ← apply(R)
  (iv) goto (i)
• conflicts in FR (several rules fireable at once) are resolved in the select step
• examples – CLIPS (OPS/5), Prolog
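A minimal sketch of this recognize-act cycle; the rules and working-memory contents are invented for illustration, and select is simply "take the first fireable rule", a trivial conflict-resolution strategy.

rules = [
    ({"dark", "switch-works"}, "turn-on-light"),   # condition set -> effect
    ({"turn-on-light"}, "light-on"),
]
WM = {"dark", "switch-works"}                      # working memory

while True:
    FR = [(cond, eff) for cond, eff in rules       # (i) detect: fireable rules
          if cond <= WM and eff not in WM]
    if not FR:
        break
    cond, eff = FR[0]                              # (ii) select: conflict resolution
    WM.add(eff)                                    # (iii) apply, then (iv) repeat
print(WM)   # now also contains 'turn-on-light' and 'light-on'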
Inference Engine
• It applies the knowledge to the solution of the actual problem
• It is an interpreter for the knowledge base
• It performs the recognize-act control cycle
Weaknesses of Expert Systems
• Require a lot of detailed knowledge
• Restricted knowledge domain
• Not all domain knowledge fits the rule format
• Expert consensus must exist
• Knowledge acquisition is time consuming
• Truth maintenance is hard
• Forgetting bad facts is hard
Expert Systems in Practice
• MYCIN
  • example of a medical expert system
  • an old, well-known reference system
  • made extensive use of the Stanford Certainty Algebra
  • problems with legal liability and knowledge acquisition
• Prospector
  • geological expert system
  • knowledge encoded in semantic networks
  • Bayesian model of uncertainty handling
  • saved a great deal of money
Expert Systems in Practice
• XCON/R1
  • classical rule-based system
  • configured DEC computer systems
  • commercial application, widely used, followed by XSEL and XSITE
  • maintenance became unmanageable once the knowledge base exceeded about 1,700 rules
• FelExpert
  • rule-based, Bayesian model
  • taxonomised, used in a number of applications
• ICON
  • configuration expert system
  • uses a proof-planning structure of methods