
Artificial Intelligence

Learn about the Situation Calculus and Event Calculus for representing change over time. Gain an understanding of knowledge representation and probabilistic reasoning in AI.


Presentation Transcript


  1. Artificial Intelligence CS 165A Tuesday, November 20, 2007 • Today: • Knowledge Representation (Ch 10) • Uncertainty (Ch 13)

  2. Notes • HW #4 due by noon tomorrow • Reminder: Final exam December 14, 4-7pm • Review in class on Dec. 6th

  3. Review Situation Calculus – actions, events • “Situation Calculus” is a way of describing change over time in first-order logic • Fluents: Functions or predicates that can vary over time have an extra argument, Si (the situation argument) • Predicate(args, Si) • Location of an agent, aliveness, changing properties, ... • The Result function is used to represent change from one situation to another resulting from an action (or action sequence) • Result(GoForward, Si) = Sj • “Sj is the situation that results from the action GoForward applied to situation Si” • Result() indicates the relationship between situations

  4. Review Situation Calculus Represents the world in different “situations” and the relationship between situations

  5. Review Situation Calculus Represents the world in different “situations” and the relationship between situations

  6. Review Examples • How would you interpret the following sentences in First-Order Logic using situation calculus? • ∀x, s Studying(x, s) ⇒ Failed(x, Result(TakeTest, s)) • ∀x, s TurnedOn(x, s) ∧ LightSwitch(x) ⇒ TurnedOff(x, Result(FlipSwitch, s)) • If you’re studying and then you take the test, you will fail. (or) Studying a subject implies that you will fail the test for that subject. • If you flip the light switch when it is turned on, it will then be turned off.
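
To make the Result function concrete, here is a minimal sketch of this style of bookkeeping (not from the lecture; representing a situation as a frozen set of fluent atoms, and the effect rules for TakeTest and FlipSwitch, are illustrative assumptions):

```python
# Situation-calculus-style bookkeeping: a situation is a frozen set of fluent
# atoms that hold in it, and result(action, s) builds the successor situation.
# The fluents and effect rules below are illustrative assumptions only.

def result(action, situation):
    """Return the situation produced by applying `action` to `situation`."""
    s = set(situation)
    if action == "FlipSwitch":
        if "TurnedOn" in s:
            s.discard("TurnedOn")
            s.add("TurnedOff")
        else:
            s.discard("TurnedOff")
            s.add("TurnedOn")
    elif action == "TakeTest":
        if "Studying" in s:
            s.add("Failed")   # the (tongue-in-cheek) rule from the slide
    return frozenset(s)

s0 = frozenset({"Studying", "TurnedOn"})
s1 = result("TakeTest", s0)
s2 = result("FlipSwitch", s1)
print("Failed" in s1)     # True: Studying in s0 implies Failed in Result(TakeTest, s0)
print("TurnedOff" in s2)  # True: the switch was on, so flipping it turns it off
```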

  7. There are other ways to deal with time • Event calculus • Based on points in time rather than situations • Designed to allow reasoning over periods of time • Can represent actions with duration, overlapping actions, etc. • Generalized events • Parts of a general “space-time chunk” • Processes • Not just discrete events • Intervals • Moments and durations of time • Objects with state fluents • Not just events, but objects can also have time properties

  8. Event calculus relations • Initiates(e, f, t) • Event e at time t causes fluent f to become true • Terminates(e, f, t) • Event e at time t causes fluent f to no longer be true • Happens(e, t) • Event e happens at time t • Clipped(f, t1, t2) • f is terminated by some event sometime between t1 and t2
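
As a rough illustration of how these relations fit together, the sketch below derives Clipped and a HoldsAt query from a handful of Happens/Initiates/Terminates facts. The event and fluent names are invented, and real event calculus is defined axiomatically rather than procedurally:

```python
# Event-calculus-style sketch: Clipped(f, t1, t2) holds if some event between
# t1 and t2 terminates f; HoldsAt(f, t) holds if some earlier event initiated f
# and it has not been clipped since. The facts below are illustrative only.

happens    = {("TurnOn", 1), ("TurnOff", 5)}   # (event, time)
initiates  = {("TurnOn", "LightOn")}           # TurnOn initiates LightOn
terminates = {("TurnOff", "LightOn")}          # TurnOff terminates LightOn

def clipped(f, t1, t2):
    return any((e, f) in terminates and t1 <= t < t2 for (e, t) in happens)

def holds_at(f, t):
    return any((e, f) in initiates and te < t and not clipped(f, te, t)
               for (e, te) in happens)

print(holds_at("LightOn", 3))   # True: initiated at time 1, not yet clipped
print(holds_at("LightOn", 7))   # False: terminated at time 5
```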

  9. Generalized events • An ontology of time that allows for reasoning about various temporal events, subevents, durations, processes, intervals, etc. [Figure: a space-time chunk, e.g. Australia over an interval of time]

  10. Time interval predicates Ex: After(ReignOf(ElizabethII), ReignOf(GeorgeVI)) Overlap(Fifties, ReignOf(Elvis)) Start(Fifties) = Start(AD1950) Meet(Fifties, Sixties)
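
One simple way to ground these predicates is to model an interval as a (start, end) pair, as in the sketch below; the concrete interval values are invented for illustration:

```python
# Interval predicates over (start, end) pairs. The concrete intervals are
# illustrative stand-ins, not historical data.

def after(i, j):    return i[0] >= j[1]          # i begins once j has ended
def overlap(i, j):  return i[0] < j[1] and j[0] < i[1]
def start(i):       return i[0]
def meet(i, j):     return i[1] == j[0]          # i ends exactly when j begins

fifties = (1950, 1960)
sixties = (1960, 1970)
reign_of_elvis = (1954, 1977)

print(overlap(fifties, reign_of_elvis))  # True
print(meet(fifties, sixties))            # True
print(start(fifties) == 1950)            # True
```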

  11. Objects with state fluents President(USA)

  12. Knowledge representation • Chapter 10 covers many topics in knowledge representation, many of which are important to real, sophisticated AI reasoning systems • We’re only scratching the surface of this topic • Best covered in depth in an advanced AI course and in context of particular AI problems • Read through the Internet shopping world example in 10.5 • Now we move on to probabilistic reasoning, a different way of representing and manipulating knowledge • Chapters 13 and 14

  13. Quick Review of Probability From here on we will assume that you know this…

  14. Probability notation and notes • Probabilities of propositions • P(A), P(the sun is shining) • Probabilities of random variables • P(X = x1), P(Y = y1), P(x1 < X < x2) • P(A) usually means P(A = True) (A is a proposition, not a variable) • This is a probability value • Technically, P(A) is a probability function • P(X = x1) • This is a probability value (P(X) is a probability function) • P(X) • This is a probability function or a probability density function • Technically, if X is a variable, we should not write P(X) = 0.5 • But rather P(X = x1) = 0.5

  15. Discrete and continuous probabilities • Discrete: Probability function P(X, Y) is described by an MxN matrix of probabilities • Possible values of each: P(X=x1, Y=y1) = p1 • Σi,j P(X=xi, Y=yj) = 1 • P(X, Y, Z) is an MxNxP matrix • Continuous: Probability density function (pdf) P(X, Y) is described by a 2D function • P(x1 < X < x2, y1 < Y < y2) = p1 • ∫∫ P(X, Y) dX dY = 1
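
A quick numeric sanity check of both normalization conditions (a sketch; the example distributions are made up):

```python
import numpy as np

# Discrete case: a 2x3 joint probability table P(X, Y) must sum to 1.
# The numbers are made up for illustration.
P_xy = np.array([[0.10, 0.20, 0.10],
                 [0.20, 0.25, 0.15]])
print(P_xy.sum())                      # 1.0

# Continuous case: a pdf must integrate to 1; approximate the integral of a
# standard normal density with a Riemann sum on a fine grid.
x = np.linspace(-8.0, 8.0, 100001)
pdf = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print((pdf * (x[1] - x[0])).sum())     # ~1.0
```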

  16. Discrete probability distribution [Figure: bar chart of a discrete distribution p(X) over X = 1 to 12, with probabilities on the order of 0.1 to 0.2]

  17. Continuous probability distribution [Figure: a smooth probability density p(X) over X = 1 to 12, peaking near 0.4]

  18. Continuous probability distribution • P(X=5) = ??? • P(X=5) = 0 • P(X=x1) = 0 • For a continuous distribution, any single point has probability zero; only intervals such as P(x1 < X < x2) have nonzero probability [Figure: the same continuous density p(X)]

  19. Three Axioms of Probability • The probability of every event must be nonnegative • For any event A, P(A) ≥ 0 • Valid propositions have probability 1 • P(True) = 1 • P(A ∨ ¬A) = 1 • For disjoint events A1, A2, … • P(A1 ∨ A2 ∨ …) = P(A1) + P(A2) + … • From these axioms, all other properties of probabilities can be derived. • E.g., derive P(A) + P(¬A) = 1

  20. Some consequences of the axioms • Unsatisfiable propositions have probability 0 • P(False) = 0 • P(A ∧ ¬A) = 0 • For any two events A and B • P(A ∨ B) = P(A) + P(B) – P(A ∧ B) • For the complement Ac of event A • P(Ac) = 1 – P(A) • For any event A • 0 ≤ P(A) ≤ 1 • For independent events A and B • P(A ∧ B) = P(A) P(B)
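
These consequences are easy to verify on a tiny discrete sample space, as in the sketch below (the world probabilities are made up):

```python
# Verify inclusion-exclusion and the complement rule on a tiny sample space.
# The four world probabilities are made up but sum to 1.
worlds = {
    ("A", "B"): 0.15, ("A", "notB"): 0.25,
    ("notA", "B"): 0.35, ("notA", "notB"): 0.25,
}

def prob(pred):
    """Probability of the event defined by predicate `pred` over the worlds."""
    return sum(p for w, p in worlds.items() if pred(w))

p_a = prob(lambda w: w[0] == "A")
p_b = prob(lambda w: w[1] == "B")
p_a_and_b = prob(lambda w: w == ("A", "B"))
p_a_or_b = prob(lambda w: w[0] == "A" or w[1] == "B")

print(abs(p_a_or_b - (p_a + p_b - p_a_and_b)) < 1e-12)          # inclusion-exclusion holds
print(abs((1 - p_a) - prob(lambda w: w[0] == "notA")) < 1e-12)  # complement rule holds
```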

  21. True A  B A B Venn Diagram Visualize: P(True), P(False), P(A), P(B), P(A), P(B), P(A  B), P(A  B), P(A  B), …

  22. Joint Probabilities • A complete probability model is a single joint probability distribution over all propositions/variables in the domain • P(X1, X2, …, Xi, …) • A particular instance of the world has the probability • P(X1=x1 ∧ X2=x2 ∧ … ∧ Xi=xi ∧ …) = p • Rather than stating knowledge as • Raining ⇒ WetGrass • We can state it as • P(Raining, WetGrass) = 0.15 • P(Raining, ¬WetGrass) = 0.01 • P(¬Raining, WetGrass) = 0.04 • P(¬Raining, ¬WetGrass) = 0.8
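
From that joint distribution every other probability in the domain follows by summing and dividing, as sketched below (using the four numbers on the slide, with the negation pattern as reconstructed above):

```python
# Joint distribution over (Raining, WetGrass), keyed by (raining, wet_grass)
# truth values; the four probabilities are the ones on the slide.
joint = {
    (True,  True):  0.15,
    (True,  False): 0.01,
    (False, True):  0.04,
    (False, False): 0.80,
}

p_raining = sum(p for (r, w), p in joint.items() if r)   # marginal P(Raining)
p_wet_given_raining = joint[(True, True)] / p_raining    # conditional P(WetGrass | Raining)

print(p_raining)             # 0.16
print(p_wet_given_raining)   # ~0.9375: if it rains, the grass is almost surely wet
```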

  23. Conditional Probability • P(X | Y) = P(X, Y) / P(Y), or equivalently P(X, Y) = P(X | Y) P(Y) (assumes P(Y) nonzero) • Unconditional, or Prior, Probability • Probabilities associated with a proposition or variable, prior to any evidence • E.g., P(WetGrass), P(Raining) • Conditional, or Posterior, Probability • Probabilities after evidence is gathered • P(A | B) – “The probability of A given that we know B” • After (posterior to) procuring evidence • E.g., P(WetGrass | Raining)

  24. Notes: • Precedence: ‘|’ is lowest • E.g., P(X | Y, Z) means which? • P( (X | Y), Z ) • P(X | (Y, Z) ) • It means P(X | (Y, Z)) • By the Chain Rule: P(X, Y) = P(X | Y) P(Y), and more generally P(X1, X2, …, Xn) = P(X1) P(X2 | X1) P(X3 | X1, X2) … P(Xn | X1, …, Xn-1)

  25. Joint probability distribution P(X,Y):
           x1    x2    x3
      y1   0.2   0.1   0.1
      y2   0.1   0.2   0.3
  From P(X,Y), we can always calculate: P(X), P(Y), P(X|Y), P(Y|X), P(X=x1), P(Y=y2), P(X|Y=y1), P(Y|X=x1), P(X=x1|Y), etc.

  26. Working these out from P(X,Y) above:
      P(X):  x1 = 0.3, x2 = 0.3, x3 = 0.4
      P(Y):  y1 = 0.4, y2 = 0.6
      P(X|Y):
           x1     x2     x3
      y1   0.5    0.25   0.25
      y2   0.167  0.333  0.5
      P(Y|X):
           x1     x2     x3
      y1   0.667  0.333  0.25
      y2   0.333  0.667  0.75
      P(X=x1, Y=y2) = 0.1    P(X=x1) = 0.3    P(Y=y2) = 0.6
      P(X|Y=y1) = (0.5, 0.25, 0.25)    P(X=x1|Y) = (0.5 given y1, 0.167 given y2)
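
The tables above can be reproduced mechanically from the joint matrix; a sketch of that computation (numbers from slide 25):

```python
import numpy as np

# Joint P(X, Y) from slide 25, rows indexed by y and columns by x.
P_xy = np.array([[0.2, 0.1, 0.1],    # y1
                 [0.1, 0.2, 0.3]])   # y2

P_x = P_xy.sum(axis=0)               # marginal over Y -> [0.3, 0.3, 0.4]
P_y = P_xy.sum(axis=1)               # marginal over X -> [0.4, 0.6]

P_x_given_y = P_xy / P_y[:, None]    # each row sums to 1
P_y_given_x = P_xy / P_x[None, :]    # each column sums to 1

print(P_x, P_y)
print(P_x_given_y)   # [[0.5, 0.25, 0.25], [0.167, 0.333, 0.5]]
print(P_y_given_x)   # [[0.667, 0.333, 0.25], [0.333, 0.667, 0.75]]
```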

  27. Probability Distributions
      Expression    Discrete vars   Continuous vars
      P(X)          M vector        Function of one variable
      P(X=x)        Scalar          Scalar*
      P(X,Y)        MxN matrix      Function of two variables
      P(X|Y)        MxN matrix      Function of two variables
      P(X|Y=y)      M vector        Function of one variable
      P(X=x|Y)      N vector        Function of one variable
      P(X=x|Y=y)    Scalar          Scalar*
      * - actually zero. Should be P(x1 < X < x2)

  28. Bayes’ Rule • Since P(X, Y) = P(X | Y) P(Y) and P(X, Y) = P(Y | X) P(X) • Then P(X | Y) = P(Y | X) P(X) / P(Y) (Bayes’ Rule)

  29. Bayes’ Rule • Similarly, P(X) conditioned on two variables: P(X | Y, Z) = P(Y, Z | X) P(X) / P(Y, Z) • Or N variables: P(X | Y1, …, YN) = P(Y1, …, YN | X) P(X) / P(Y1, …, YN)

  30. Bayes’ Rule • P(H | D) = P(D | H) P(H) / P(D) • P(D | H) is the likelihood (causal knowledge), P(H) the prior probability, P(D) the normalizing constant, and P(H | D) the posterior probability (diagnostic knowledge) • This simple equation is very useful in practice • Usually framed in terms of hypotheses (H) and data (D) • Which of the hypotheses is best supported by the data?
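
Since P(D) is the same for every hypothesis, comparing hypotheses only requires the numerator P(D | H) P(H); a small sketch with invented likelihoods and priors:

```python
# Compare hypotheses by their unnormalized posteriors P(D|H) * P(H).
# The likelihoods and priors below are invented for illustration.
hypotheses = {
    "H1": {"prior": 0.7, "likelihood": 0.1},
    "H2": {"prior": 0.3, "likelihood": 0.6},
}
scores = {h: v["likelihood"] * v["prior"] for h, v in hypotheses.items()}
normalizer = sum(scores.values())                 # this is P(D)
posteriors = {h: s / normalizer for h, s in scores.items()}

print(max(posteriors, key=posteriors.get))        # H2 is best supported by the data
print(posteriors)                                 # {'H1': 0.28, 'H2': 0.72}
```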

  31. Bayes’ rule example: Medical diagnosis • Meningitis causes a stiff neck 50% of the time • A patient comes in with a stiff neck – what is the probability that he has meningitis? • Need to know two things: • The prior probability of a patient having meningitis (1/50,000) • The prior probability of a patient having a stiff neck (1/20) • P(M | S) = P(S | M) P(M) / P(S) • P(M | S) = (0.5)(0.00002)/(0.05) = 0.0002
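
The same arithmetic as a short sketch (numbers from the slide; variable names are mine):

```python
# Bayes' rule for the meningitis example on the slide.
p_s_given_m = 0.5        # P(S | M): stiff neck given meningitis
p_m = 1 / 50000          # P(M): prior probability of meningitis
p_s = 1 / 20             # P(S): prior probability of a stiff neck

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)       # 0.0002
```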

  32. Example (cont.) • Suppose that we also know about whiplash • P(W) = 1/1000 • P(S | W) = 0.8 • What is the relative likelihood of whiplash and meningitis? • P(W | S) / P(M | S) • P(W | S) = P(S | W) P(W) / P(S) = (0.8)(0.001)/(0.05) = 0.016 So the relative likelihood of whiplash vs. meningitis is (0.016/0.0002) = 80

  33. A useful Bayes rule example A test for a new, deadly strain of anthrax (that has no symptoms) is known to be 99.9% accurate. Should you get tested? The chances of having this strain are one in a million. What are the random variables? A – you have anthrax (boolean) T – you test positive for anthrax (boolean) Notation: Instead of P(A=True) and P(A=False), we will write P(A) and P(¬A) What do we want to compute? P(A|T) What else do we need to know or assume? Priors: P(A), P(¬A) Given: P(T|A), P(¬T|A), P(T|¬A), P(¬T|¬A)

  34. Example (cont.) We know: Given: P(T|A) = 0.999, P(¬T|A) = 0.001, P(T|¬A) = 0.001, P(¬T|¬A) = 0.999 Prior knowledge: P(A) = 10⁻⁶, P(¬A) = 1 – 10⁻⁶ Want to know P(A|T) P(A|T) = P(T|A) P(A) / P(T) Calculate P(T) by marginalization P(T) = P(T|A) P(A) + P(T|¬A) P(¬A) = (0.999)(10⁻⁶) + (0.001)(1 – 10⁻⁶) ≈ 0.001 So P(A|T) = (0.999)(10⁻⁶) / 0.001 ≈ 0.001 Therefore P(¬A|T) ≈ 0.999 What if you work at a Post Office?
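
The full computation, including the marginalization step for P(T), as a short sketch (numbers from the slide; variable names are mine):

```python
# Bayes' rule with marginalization for the anthrax test example.
p_t_given_a     = 0.999          # P(T | A): true positive rate
p_t_given_not_a = 0.001          # P(T | ¬A): false positive rate
p_a             = 1e-6           # P(A): prior of having anthrax
p_not_a         = 1 - p_a

p_t = p_t_given_a * p_a + p_t_given_not_a * p_not_a   # marginalization over A
p_a_given_t = p_t_given_a * p_a / p_t

print(p_t)            # ~0.001
print(p_a_given_t)    # ~0.001: even after a positive test, anthrax is very unlikely
```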

  35. [Figure: Venn diagram of all people, divided into people with anthrax and people without anthrax; the positive tests include the true positives among people with anthrax and the “bad” positives, the 0.1% of people without anthrax who test positive]
