
CS 416 Artificial Intelligence




  1. CS 416 Artificial Intelligence Lecture 10: Logic, Chapters 7 and 8

  2. Homework Assignment • First homework assignment is out • Due October 20th

  3. Midterm • October 25th • Up through chapter 9 (excluding chapter 5)

  4. Review • Store information in a knowledge base • Backus-Naur Form (BNF) • Use equivalences to isolate what you want • Use inference to relate sentences • Modus Ponens • And-Elimination (both restated below)
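
As a refresher, in the chapter's notation: Modus Ponens says that from a => b and a we may infer b; And-Elimination says that from a ^ b we may infer a (and likewise b). For example, from (WumpusAhead ^ WumpusAlive) => Shoot together with WumpusAhead ^ WumpusAlive, Modus Ponens yields Shoot.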

  5. Constructing a proof • Proving is like searching • Find sequence of logical inference rules that lead to desired result • Note the explosion of propositions • Good proof methods ignore the countless irrelevant propositions • The two inference rules are sound but not complete!

  6. Completeness • Recall that a complete inference algorithm is one that can derive any sentence that is entailed

  7. Resolution • Resolution is a single inference rule • Guarantees the ability to derive any sentence that is entailed • i.e. it is complete • It must be partnered with a complete search algorithm

  8. Resolution • Unit Resolution Inference Rule • From the clause (l1 V … V lk) and the literal m, where m and li are complementary literals, infer the clause with li removed: (l1 V … V li-1 V li+1 V … V lk)

  9. Resolution Inference Rule • Also works with full clauses in place of the single literal m • But make sure each literal appears only once in the resolvent (factoring) • Resolving (A V B) with (A V ~B) gives (A V A); we really just want the resolution to return A (see the sketch below)
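
As a concrete illustration (not from the slides), here is a minimal Python sketch of resolving two clauses, with a clause represented as a frozenset of string literals and "~" marking negation; the set representation provides the once-per-literal factoring for free:

def negate(lit):
    # "~A" <-> "A"
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All clauses obtained by resolving c1 with c2 on one complementary
    pair; sets ensure each literal appears only once in the resolvent."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# Resolving (A V B) with (A V ~B) yields (A V A), which factors to just A:
print(resolve(frozenset({"A", "B"}), frozenset({"A", "~B"})))  # [frozenset({'A'})]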

  10. Resolution and completeness • Any complete search algorithm, applying only the resolution rule, can derive any conclusion entailed by any knowledge base in propositional logic • More specifically, refutation completeness • Able to confirm or refute any sentence • Unable to enumerate all true sentences

  11. What about “and” clauses? • Resolution only applies to “or” clauses (disjunctions) • Every sentence of propositional logic can be transformed into a logically equivalent conjunction of disjunctions of literals • Conjunctive Normal Form (CNF) • A sentence expressed as a conjunction of disjunctions of literals

  12. CNF
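
The conversion procedure shown in this slide's figure can be illustrated with the wumpus-world sentence from the chapter, B1,1 <=> (P1,2 V P2,1):

1. Eliminate <=>: (B1,1 => (P1,2 V P2,1)) ^ ((P1,2 V P2,1) => B1,1)
2. Eliminate =>: (~B1,1 V P1,2 V P2,1) ^ (~(P1,2 V P2,1) V B1,1)
3. Move ~ inward (de Morgan): (~B1,1 V P1,2 V P2,1) ^ ((~P1,2 ^ ~P2,1) V B1,1)
4. Distribute V over ^: (~B1,1 V P1,2 V P2,1) ^ (~P1,2 V B1,1) ^ (~P2,1 V B1,1)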

  13. An algorithm for resolution • We wish to prove that KB entails a • That is, will some sequence of inferences create a new sentence in the knowledge base equal to a? • If some sequence of inferences created ~a, we would know the knowledge base could not also entail a • a ^ ~a is impossible

  14. An algorithm for resolution • To show that KB entails a • Show that (KB ^ ~a) is unsatisfiable • There is no possible world in which KB holds and a is false • Proof by contradiction

  15. An algorithm for resolution • Algorithm • (KB ^ ~a) is put in CNF • Each pair with complementary literals is resolved to produce a new clause, which is added to the KB (if novel) • Cease if there are no new clauses to add (a is not entailed) • Cease if the resolution rule derives the empty clause • If resolution generates a… a ^ ~a = the empty clause (a is entailed)

  16. Example of resolution • Proof that there is no pit in square [1,2], i.e. ~P1,2 • KB ^ P1,2 leads to the empty clause • Therefore ~P1,2 is true

  17. Formal Algorithm
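
The figure for this slide is not in the transcript; the following hedged Python sketch of the resolution procedure it depicts reuses the resolve helper from the earlier snippet, resolving every pair of clauses in KB ^ ~a until either the empty clause appears (a is entailed) or no new clauses can be added:

def pl_resolution(kb_clauses, neg_alpha_clauses):
    # clauses is the CNF of KB ^ ~a, each clause a frozenset of literals
    clauses = set(kb_clauses) | set(neg_alpha_clauses)
    while True:
        new = set()
        pairs = [(ci, cj) for ci in clauses for cj in clauses if ci != cj]
        for ci, cj in pairs:
            for resolvent in resolve(ci, cj):
                if not resolvent:      # empty clause: contradiction, a entailed
                    return True
                new.add(resolvent)
        if new <= clauses:             # nothing new to add: a is not entailed
            return False
        clauses |= new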

  18. Horn Clauses • Horn Clause • A disjunction of literals where at most one is positive • (~a V ~b V ~c V d) is a Horn clause • (~a V b V c V ~d) is not a Horn clause (two positive literals)

  19. Horn Clauses • Why? • Every Horn clause can be written as an implication: • (a conjunction of positive literals) => single positive literal • Inference algorithms can use forward chaining and backward chaining • Entailment computation is linear in the size of the KB

  20. Horn Clauses • Can be written as a special implication • (~a V ~b V c) becomes (a ^ b) => c • (~a V ~b V c) == (~(a ^ b) V c) … de Morgan • (~(a ^ b) V c) == ((a ^ b) => c) … implication elimination

  21. Forward Chaining • Does KB (Horn clauses) entail q (single symbol)? • Keep track of known facts (positive literals) • If the premises of an implication are known • Add the conclusion to the known set of facts • Repeat process until no further inferences or q is added

  22. Forward Chaining
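
Since the figure for this slide is also missing, here is a minimal Python sketch of forward chaining over Horn clauses; the (premises, conclusion) rule representation is an illustrative assumption, not something prescribed by the slides:

from collections import deque

def forward_chaining(rules, facts, q):
    # rules: list of (set_of_premises, conclusion); facts: known positive literals
    count = {i: len(premises) for i, (premises, _) in enumerate(rules)}
    agenda = deque(facts)
    inferred = set()
    while agenda:
        p = agenda.popleft()
        if p == q:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:      # all premises known: add the conclusion
                    agenda.append(conclusion)
    return False

# (L ^ G) => M with known facts L and G derives M:
print(forward_chaining([({"L", "G"}, "M")], {"L", "G"}, "M"))  # True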

  23. Forward Chaining • Properties • Sound • Complete • All entailed atomic sentences will be derived • Data Driven • Start with what we know • Derive new info until we discover what we want

  24. Backward Chaining • Start with what you want to know, a query (q) • Look for implications that conclude q • Look at the premises of those implications • Look for implications that conclude those premises… • Goal-Directed Reasoning • Can be less complex search than forward chaining
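
A matching backward-chaining sketch over the same hypothetical rule representation, working from the query q back through implications that conclude it:

def backward_chaining(rules, facts, q, seen=frozenset()):
    if q in facts:
        return True
    if q in seen:                      # avoid looping on cyclic rules
        return False
    return any(
        all(backward_chaining(rules, facts, p, seen | {q}) for p in premises)
        for premises, conclusion in rules
        if conclusion == q
    )

print(backward_chaining([({"L", "G"}, "M")], {"L", "G"}, "M"))  # True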

  25. Making it fast • Example • Problem to solve is in CNF • Is Marvin a Martian --- is M == 1 (true)? • Marvin is green --- G = 1 • Marvin is little --- L = 1 • (little and green) implies Martian --- (L ^ G) => M == ~(L ^ G) V M == ~L V ~G V M • Proof by contradiction… are there true/false values for G, L, and M that are consistent with the knowledge base and Marvin not being a Martian? • Is G ^ L ^ (~L V ~G V M) ^ ~M satisfiable?

  26. Searching for variable values • Want to find true/false values for G, L, and M satisfying the knowledge base • Randomly consider all true/false assignments to the variables until we exhaust them all or find a match (model checking) • (G, L, M) = (1, 0, 0)… no = (0, 1, 0)… no = (0, 0, 0)… no = (1, 1, 0)… no • Alternatively, show that G ^ L ^ (~L V ~G V M) ^ ~M == 0 for every assignment
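
The brute-force model check from this slide, written out in Python: enumerate all eight assignments and confirm that none satisfies KB ^ ~M, so the KB entails M:

from itertools import product

def sentence(G, L, M):
    # G ^ L ^ (~L V ~G V M) ^ ~M
    return G and L and ((not L) or (not G) or M) and (not M)

print(any(sentence(*bits) for bits in product([False, True], repeat=3)))  # False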

  27. Backtracking Algorithm • Davis-Putnam Algorithm (DPLL) • Search through possible assignments to (G, L, M) via depth-first search: (0, 0, 0) to (0, 0, 1) to (0, 1, 0)… • Each clause of the CNF must be true • Terminate consideration when a clause evaluates to false • Use heuristics to reduce repeated computation of propositions and cull branches from the search tree • Early termination • Pure symbol heuristic • Unit clause heuristic
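
A compact DPLL sketch over the clause representation used above, showing early termination and the unit-clause heuristic (the pure-symbol heuristic is omitted for brevity):

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    simplified = []
    for clause in clauses:
        remaining = set()
        satisfied = False
        for lit in clause:
            var, val = lit.lstrip("~"), not lit.startswith("~")
            if var in assignment:
                if assignment[var] == val:
                    satisfied = True
                    break
            else:
                remaining.add(lit)
        if satisfied:
            continue                    # early termination: clause already true
        if not remaining:
            return None                 # clause falsified under this assignment
        simplified.append(frozenset(remaining))
    if not simplified:
        return assignment               # every clause satisfied: a model
    # Unit-clause heuristic: a one-literal clause forces that symbol's value
    unit = next((c for c in simplified if len(c) == 1), None)
    lit = next(iter(unit)) if unit else next(iter(simplified[0]))
    var = lit.lstrip("~")
    for val in (not lit.startswith("~"), lit.startswith("~")):
        model = dpll(simplified, {**assignment, var: val})
        if model is not None:
            return model
    return None

marvin = [frozenset({"G"}), frozenset({"L"}),
          frozenset({"~L", "~G", "M"}), frozenset({"~M"})]
print(dpll(marvin))  # None: KB ^ ~M is unsatisfiable, so M is entailed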

  28. Searching for variable values • Other ways to find (G, L, M) assignments for G ^ L ^ (~L V ~G V M) ^ ~M • Simulated annealing (WalkSAT) • Start with an initial guess, e.g. (0, 1, 1) • With each iteration, pick an unsatisfied clause and flip one symbol in the clause • Evaluation metric is the number of clauses that evaluate to true • Move “in the direction” of guesses that cause more clauses to be true • Many local minima, so use lots of randomness
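
A minimal WalkSAT-style sketch under the usual formulation: each step picks a random unsatisfied clause and flips one of its symbols, either at random or greedily by the number of satisfied clauses; max_flips bounds the search, as the next slide discusses:

import random

def satisfied(clause, model):
    return any(model[lit.lstrip("~")] != lit.startswith("~") for lit in clause)

def walksat(clauses, p=0.5, max_flips=10000):
    symbols = {lit.lstrip("~") for clause in clauses for lit in clause}
    model = {s: random.choice([True, False]) for s in symbols}
    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c, model)]
        if not unsat:
            return model               # all clauses true: a satisfying model
        clause = random.choice(unsat)
        if random.random() < p:        # random step escapes local minima
            sym = random.choice(sorted(clause)).lstrip("~")
        else:                          # greedy step: maximize satisfied clauses
            def score(s):
                model[s] = not model[s]
                n = sum(satisfied(c, model) for c in clauses)
                model[s] = not model[s]
                return n
            sym = max((lit.lstrip("~") for lit in clause), key=score)
        model[sym] = not model[sym]
    return None                        # give up; sentence may still be satisfiable

# Without ~M, the Marvin clauses are satisfiable (G = L = M = true):
print(walksat([frozenset({"G"}), frozenset({"L"}), frozenset({"~L", "~G", "M"})]))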

  29. WalkSAT termination • How do you know when simulated annealing is done? • There is no way to know with certainty that an answer is not possible • Could have been bad luck • Could be there really is no answer • Establish a max number of iterations and go with the best answer found by that point

  30. So how well do these work? • Think about it this way • 16 of 32 possible assignments are models of (satisfy) this sentence • Therefore, about 2 random guesses should find a solution • WalkSAT and DPLL should work quickly

  31. What about more clauses? • If the number of symbols (variables) stays the same but the number of clauses increases • More ways for an assignment to fail (on any one clause) • More searching through possible assignments is needed • Let’s create a ratio, m/n, of #clauses to #symbols • We expect a large m/n to make solutions slower to find

  32. What about more clauses? • A higher m/n means fewer assignments will work • If fewer assignments work, it is harder for DPLL and WalkSAT

  33. Combining it all • 4x4 Wumpus World • The “physics” of the game • At least one wumpus on the board • At most one wumpus on the board (for any two squares, at least one is wumpus-free) • n(n-1)/2 rules like: ~W1,1 V ~W1,2 • Total of 155 sentences containing 64 distinct symbols
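
As a quick sanity check of the n(n-1)/2 count, the pairwise "at most one wumpus" clauses for the 16 squares of the 4x4 board can be generated mechanically (a sketch, with W r,c naming the wumpus symbol for each square):

from itertools import combinations

squares = [(r, c) for r in range(1, 5) for c in range(1, 5)]
clauses = [f"~W{r1},{c1} V ~W{r2},{c2}"
           for (r1, c1), (r2, c2) in combinations(squares, 2)]
print(len(clauses))  # 16*15/2 = 120 pairwise clauses
print(clauses[0])    # ~W1,1 V ~W1,2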

  34. Wumpus World • Inefficiencies as the world becomes large • The knowledge base must expand if the size of the world expands • It would be preferable to have sentences that apply to all squares • We brought this subject up last week • The next chapter addresses this issue

  35. Chapter 8: First-Order Logic

  36. What do we like about propositional logic? • It is: • Declarative • Relationships between variables are described • A method for propagating relationships • Expressive • Can represent partial information using disjunction • Compositional • If A means foo and B means bar, A ^ B means foo and bar

  37. What don’t we like about propositional logic? • Lacks expressive power to describe the environment concisely • Separate rules for every square/square relationship in Wumpus world

  38. Natural Language • English appears to be expressive • Squares adjacent to pits are breezy • But natural language is a medium of communication, not a knowledge representation • Much of the information and logic conveyed by language is dependent on context • Information exchange is not well defined • Not compositional (combining sentences may mean something different) • It is ambiguous

  39. But we borrow representational ideas from natural language • Natural language syntax • Nouns and noun phrases refer to objects • People, houses, cars • Properties and verbs refer to object relations • Red, round, nearby, eaten • Some relationships are clearly defined functions where there is only one output for a given input • Best friend, first thing, plus • We build first-order logic around objects and relations

  40. Ontology • a “specification of a conceptualization” • A description of the objects and relationships that can exist • Propositional logic had only true/false relationships • First-order logic has many more relationships • The ontological commitment of languages is different • How much can you infer from what you know? • Temporal logic defines additional ontological commitments because of timing constraints
