Resolution Theorem Proving G51IAI Introduction to Artificial Intelligence Dr Matthew Hyde
Outline • Propositional Logic Recap • Conjunctive normal form • The Resolution Algorithm • Strategies to Guide the Search • Conclusion
Propositional Logic • Proposition symbols (“literals”) • A, B, C, D, Student, North • Each can be either True or False • The name is irrelevant • It is just the name you give to the proposition • In your model, if North = True, it could mean that you are facing north, or everyone is facing north, or that it is possible to go north, etc...
Propositional Logic • Logical Connectives form more complex sentences • OR: “V” - True if one of the symbols is true, or both ( Awake V Asleep ) • AND: “Λ” - True if both of the symbols are true ( Awake Λ Listening ) • NOT: “¬” ¬Awake, ¬A, ¬B, ¬Student • IMPLIES: “=>” P => Q - True unless P is true and Q is false ( Awake Λ Listening ) => ¬Fail
Propositional Logic • A “knowledge base” • Simply, a group of logical expressions that we already know to be true. E.g.: North, Awake Λ Listening, Student, Awake V Asleep • The single symbols here (North, Student) are literals
Propositional Logic • The knowledge base evaluates to true, because it is equivalent to putting an “Λ” between each expression: North, Awake Λ Listening, Student, Awake V Asleep • North Λ (Awake Λ Listening) Λ Student Λ (Awake V Asleep)
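As a quick illustration (not part of the original slides), the same knowledge base can be written as a single Python boolean expression, under one hypothetical truth assignment for the symbols:

North, Awake, Asleep, Student, Listening = True, True, False, True, True

# The knowledge base is the AND of every expression in it
knowledge_base = (
    North
    and (Awake and Listening)
    and Student
    and (Awake or Asleep)
)
print(knowledge_base)   # True: every expression holds under this assignment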
Summary so far • Literals can be either TRUE or FALSE • e.g. A, B, ¬C, D, ¬Student, North ... • Knowledge base made up of combinations of literals • Each line of the knowledge base is TRUE • Therefore, ALL of the knowledge base evaluates to TRUE
Inference in Propositional Logic • We can infer new facts from what we already know • “Modus Ponens” Rule: from A => B and A, we can infer B • Also works for longer sentences: from A Λ B Λ C => D and A Λ B Λ C, we can infer D • We know the two premises, so we can infer that the conclusion is true
Example of Inference CompassNorth => FacingNorth Snowing => Cold CompassNorth Awake Cold We can infer that FacingNorth = ????
Example of Inference CompassNorth => FacingNorth Snowing => Cold CompassNorth Awake Cold FacingNorth We can infer that FacingNorth = True
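A minimal sketch of this kind of inference in Python (an illustration, not the lecturer's code): rules are stored as (premises, conclusion) pairs and Modus Ponens is applied repeatedly until nothing new can be inferred.

def forward_chain(rules, facts):
    """Apply Modus Ponens repeatedly: if every premise of a rule is a known
    fact, add its conclusion as a new fact."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [({"CompassNorth"}, "FacingNorth"),   # CompassNorth => FacingNorth
         ({"Snowing"}, "Cold")]               # Snowing => Cold
facts = {"CompassNorth", "Awake", "Cold"}
print(forward_chain(rules, facts))   # the result now also contains 'FacingNorth'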
Resolution in Propositional Logic • Resolution is one method for automated theorem proving • It is important to Artificial Intelligence because it helps logical agents to reason about the world • It helps them to prove new theorems, and therefore helps them to add to their knowledge
Resolution Algorithm • Input: a knowledge base and an expression • It negates the expression, adds that to the knowledge base, and then finds a contradiction if one exists • If it finds a contradiction, then the negated statement is false • Therefore, the original statement must be true
Resolution Algorithm • Small example: Is it sunny? Is sunny = TRUE? Prove sunny • Knowledge base: sunny, daytime, sunny V night
Resolution Algorithm • Small example continued • Knowledge base: sunny, daytime, sunny V night • Negate the theorem: ¬sunny, and add it to the knowledge base • CONTRADICTION: the knowledge base contains both sunny and ¬sunny • So ¬sunny = FALSE, therefore sunny = TRUE
Conjunctive Normal Form • Resolution algorithm needs sentences in CNF • A conjunction of clauses (clauses joined together by “AND”), where each clause is a disjunction of literals • (¬A V B) Λ (B V C) Λ (D V ¬E V F) Λ (G) Λ ... • Inside the brackets, we only have V (OR) and ¬ (NOT) symbols • There are no “implies” (=>) symbols anywhere
Conjunctive Normal Form • (A V B) Λ (B V C) Λ (D V E V F) Λ (G) Λ ... • Each bracketed group is a clause • The whole thing represents the knowledge base, so it evaluates to TRUE
Conjunctive Normal Form • Resolution algorithm ‘resolves’ clauses • In fact, it only applies to clauses • Each pair of clauses that contains complementary literals is resolved • Complementary literals have the property that one negates the other • A, ¬A • Student, ¬Student
Procedure for converting to CNF • (a) To eliminate ↔: (a ↔ b) ≡ (a → b) Λ (b → a) • (b) To eliminate →: (a → b) ≡ ¬a V b • (c) Double negation: ¬(¬a) ≡ a • (d) De Morgan: ¬(a Λ b) ≡ (¬a V ¬b), and ¬(a V b) ≡ (¬a Λ ¬b) • (e) Distributivity of Λ over V: (a Λ (b V c)) ≡ ((a Λ b) V (a Λ c)) • (f) Distributivity of V over Λ: (a V (b Λ c)) ≡ ((a V b) Λ (a V c))
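These rewrite rules can be sanity-checked mechanically; the sketch below assumes the sympy library is available and is an illustration only, not part of the lecture material.

from sympy import symbols
from sympy.logic.boolalg import to_cnf, Implies, Equivalent

a, b, c = symbols("a b c")

print(to_cnf(Implies(a, b)))       # eliminates =>, giving a clause equivalent to ¬a V b
print(to_cnf(Equivalent(a, b)))    # eliminates <=>, giving (a V ¬b) Λ (¬a V b) up to ordering
print(to_cnf(a | (b & c)))         # distributes V over Λ, giving (a V b) Λ (a V c)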
Resolution Rule • Given two clauses: ( A V B ) ( ¬B V C ) • Produce one clause containing all of the literals except the two complementary literals: A V C
Resolution Rule • Given two clauses: ( A V B V C V D ) ( E V F V ¬B V G V H ) • Produce one clause containing all of the literals except the two complementary literals: A V C V D V E V F V G V H
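A minimal Python sketch of this rule (not from the slides): each clause is a frozenset of literal strings, with negation written as a leading "~".

def negate(literal):
    """Return the complementary literal: ~A for A, and A for ~A."""
    return literal[1:] if literal.startswith("~") else "~" + literal

def resolve(clause_a, clause_b):
    """Return every clause obtained by cancelling one pair of complementary literals."""
    resolvents = []
    for lit in clause_a:
        if negate(lit) in clause_b:
            resolvents.append(frozenset((clause_a - {lit}) | (clause_b - {negate(lit)})))
    return resolvents

print(resolve(frozenset({"A", "B"}), frozenset({"~B", "C"})))   # one resolvent: {A, C}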
Resolution Example • Knowledge base: ¬asleep V fail, asleep • Show that the knowledge base entails “fail” • Step 1: negate the theorem
Very Important: • NEGATE THE THEOREM • This means, put a “¬” symbol in front of it • If it already has one, then remove it
Very Important: • A negates to ¬A • ¬B negates to B • Fail negates to ¬Fail • Then add it to the knowledge base and find a contradiction
Resolution Example • Knowledge base: ¬asleep V fail, asleep • Show that the knowledge base entails “fail” • Add the negated theorem ¬fail to the knowledge base • Resolve ¬asleep V fail with ¬fail to get ¬asleep • Resolve ¬asleep with asleep to get the empty clause • Contradiction, so “fail” is entailed
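Putting the pieces together, here is a sketch of the whole refutation loop under the same clause representation (an illustration, not the lecturer's algorithm): negate the query, add it to the clauses, and keep resolving pairs until the empty clause appears.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ca, cb):
    return [frozenset((ca - {l}) | (cb - {negate(l)}))
            for l in ca if negate(l) in cb]

def entails(kb, query):
    """Resolution refutation: True if kb plus the negated query is contradictory."""
    clauses = set(kb) | {frozenset({negate(query)})}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for resolvent in resolve(a, b):
                    if not resolvent:        # empty clause: contradiction found
                        return True
                    new.add(resolvent)
        if new <= clauses:                   # nothing new can be derived: no proof
            return False
        clauses |= new

kb = [frozenset({"~asleep", "fail"}), frozenset({"asleep"})]
print(entails(kb, "fail"))   # True: the knowledge base entails fail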
Example 2 Beep Beep! • Roadrunner and Coyote
Example 2 • Coyote chases Roadrunner • If Roadrunner is smart, Coyote does not catch it • If Coyote chases Roadrunner and does not catch it, then Coyote is annoyed • Roadrunner is smart • Theorem: Coyote is annoyed ???? Beep Beep!
Example 2 • Theorem: Coyote is annoyed • We try to prove that “Coyote is NOT annoyed” is false • We add “Coyote is NOT annoyed” to the knowledge base, and prove false • So, the original theorem must be true Beep Beep!
Example 2 • We are asking: Does the knowledge base entail “Annoyed”?
Example 2 • Convert these into Conjunctive Normal Form: • S => ¬B becomes ¬S V ¬B • C Λ ¬B => A becomes ¬C V B V A
Example 2 • C = Coyote Chases Roadrunner • S = Roadrunner is Smart • B = Coyote Catches Roadrunner • A = Coyote is Annoyed
Example 2 • Resolving the clauses eventually yields both ¬Annoyed and Annoyed: ¬Annoyed Λ Annoyed
Example 2 – Proved in a different way • Resolving in a different order instead yields “Coyote catches Roadrunner AND Coyote does not catch Roadrunner” (B Λ ¬B), which is also a contradiction
Example 2 – conclusion Annoyed Λ ¬Annoyed • This cannot be true, therefore adding “¬Annoyed” causes a contradiction in the knowledge base • Theorem was: “Coyote is annoyed” • We added the opposite and proved FALSE • We proved that Annoyed = TRUE, by proving that ¬Annoyed = FALSE • Therefore Coyote is annoyed • The knowledge base entails “annoyed”
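For comparison, running the entails() sketch from the earlier slide on Example 2's clauses (C, ¬S V ¬B, ¬C V B V A, S) gives the same answer; this reuses the function defined above and the symbol names from this example.

kb2 = [frozenset({"C"}), frozenset({"~S", "~B"}),
       frozenset({"~C", "B", "A"}), frozenset({"S"})]
print(entails(kb2, "A"))   # True: the knowledge base entails Annoyed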
Resolution Problems • Can take a very long time • Depending on the number of clauses in the knowledge base
Example 3 B V A ¬C V A ¬B V A C V ¬D ¬A V ¬B V D • Does this entail A? • In other words: Does all of this mean that A is TRUE? • If we set A to FALSE, and find a contradiction, then A must be TRUE
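Before tracing the resolution steps by hand on the next slides, the same entails() sketch from earlier can serve as a quick check (again reusing the function defined above):

kb3 = [frozenset({"B", "A"}), frozenset({"~C", "A"}), frozenset({"~B", "A"}),
       frozenset({"C", "~D"}), frozenset({"~A", "~B", "D"})]
print(entails(kb3, "A"))   # True: the clauses entail A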
Example 3 • The Knowledge Base: B V A, ¬C V A, ¬B V A, C V ¬D, ¬A V ¬B V D • Add the negated theorem ¬A • Resolve B V A with ¬B V A to get A V A, which is just A • A contradicts ¬A: Contradiction
Example 3
1) B V A
2) ¬C V A
3) ¬B V A
4) C V ¬D
5) ¬A V ¬B V D
6) ¬A (the negated theorem)
7) ¬C From 2,6
8) ¬D From 4,7
9) ¬A V ¬B From 5,8
10) B From 1,6
11) ¬A From 9,10
12) ¬B From 3,11
Contradiction (between 10 and 12)
Example 3 - conclusion • So you can find the answer in 1 step or 6 steps, depending on the order in which you resolve the clauses • The speed of the resolution algorithm depends on the order • But resolution will always find a proof if one exists: you just keep resolving until you find a contradiction, or until no new clauses can be generated (in which case there is no proof)
What you need to know • The resolution algorithm • Uses the principle of proof by contradiction • Why the knowledge base must be in Conjunctive Normal Form • The speed of the algorithm depends on the order in which you resolve the clauses