Lecture 26 of 42
Conditional, Continuous, and Multi-Agent Planning
Discussion: Probability Refresher
Wednesday, 24 October 2007
William H. Hsu
Department of Computing and Information Sciences, KSU
KSOL course page: http://snipurl.com/v9v3
Course web site: http://www.kddresearch.org/Courses/Fall-2007/CIS730
Instructor home page: http://www.cis.ksu.edu/~bhsu
Reading for Next Class: Sections 12.5 – 12.8, Russell & Norvig 2nd edition
Lecture Outline
• Today’s Reading: Sections 12.1 – 12.4, R&N 2e
• Friday’s Reading: Sections 12.5 – 12.8, R&N 2e
• Today: Practical Planning, concluded
  • Conditional Planning
  • Replanning
  • Monitoring and Execution
  • Continual Planning
  • Hierarchical Planning Revisited
  • Examples: Korf
  • Real-World Example
• Friday and Next Week: Reasoning under Uncertainty
  • Basics of reasoning under uncertainty
  • Probability review
  • BNJ interface (http://bnj.sourceforge.net)
Planning and Learning Roadmap
• Bounded Indeterminacy (12.3)
• Four Techniques for Dealing with Nondeterministic Domains (see the sketch after this list)
  • 1. Sensorless / Conformant Planning: “Be Prepared” (12.3)
    • Idea: be able to respond to any situation (universal planning)
    • Coercion
  • 2. Conditional / Contingency Planning: “Plan B” (12.4)
    • Idea: be able to respond to many typical alternative situations
    • Actions for sensing (“reviewing the situation”)
  • 3. Execution Monitoring / Replanning: “Show Must Go On” (12.5)
    • Idea: be able to resume momentarily failed plans
    • Plan revision
  • 4. Continuous Planning: “Always in Motion, The Future Is” (12.6)
    • Lifetime planning (and learning!)
    • Formulate new goals
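As a concrete illustration of techniques 2 and 3, here is a minimal Python sketch (not the textbook's algorithms); the helpers `sense`, `execute`, `make_plan`, and `preconditions_hold` are hypothetical and would be supplied by the caller.

```python
# Minimal sketch of techniques 2 and 3 above. All helpers (sense, execute,
# make_plan, preconditions_hold) are hypothetical and supplied by the caller.

def execute_conditional_plan(plan, sense, execute):
    """Technique 2: a conditional plan is a list whose branch points are
    dicts mapping a sensed observation to the subplan to follow ("Plan B")."""
    for step in plan:
        if isinstance(step, dict):          # branch point: needs a sensing action
            observation = sense()           # "reviewing the situation"
            execute_conditional_plan(step[observation], sense, execute)
        else:
            execute(step)                   # ordinary action

def monitor_and_replan(goal, state, make_plan, preconditions_hold, execute):
    """Technique 3: execution monitoring with replanning. Before each step,
    check that the remaining plan's preconditions still hold; if not, revise
    the plan from the current state."""
    plan = make_plan(state, goal)
    while plan:
        if not preconditions_hold(plan, state):
            plan = make_plan(state, goal)   # plan revision
            continue
        state = execute(plan.pop(0), state)
    return state
```

The conditional plan commits to its contingencies at planning time and only chooses among them at run time via sensing, whereas the monitor-and-replan loop rebuilds the remaining plan whenever execution drifts from what the plan assumed.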
Hierarchical Abstraction Planning: Review
• Need for Abstraction
  • Question: What is wrong with uniform granularity?
  • Answers (among many)
    • Representational problems
    • Inferential problems: inefficient plan synthesis
• Family of Solutions: Abstract Planning
  • But what to abstract in “problem environment”, “representation”?
    • Objects, obstacles (quantification: later)
    • Assumptions (closed world)
    • Other entities
    • Operators
    • Situations
  • Hierarchical abstraction (see the sketch after this list)
• See: Sections 12.2 – 12.3 R&N, pp. 371 – 380
  • Figures 12.1, 12.6 (examples), 12.2 (algorithm), 12.3 – 12.5 (properties)
Adapted from Russell and Norvig
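To make the idea of hierarchical abstraction concrete, the following sketch refines abstract operators top-down into primitive actions; the decomposition table and task names are illustrative assumptions, not taken from R&N.

```python
# Minimal sketch of hierarchical plan refinement (illustrative; the
# decomposition table and task names are assumptions, not from R&N).

DECOMPOSITIONS = {
    "deliver(pkg, dest)": ["pickup(pkg)", "travel(dest)", "putdown(pkg)"],
    "travel(dest)":       ["plan_route(dest)", "drive(dest)"],
}

def refine(task):
    """Expand abstract tasks top-down until only primitive actions remain."""
    if task not in DECOMPOSITIONS:      # primitive action: keep as-is
        return [task]
    plan = []
    for subtask in DECOMPOSITIONS[task]:
        plan.extend(refine(subtask))
    return plan

print(refine("deliver(pkg, dest)"))
# ['pickup(pkg)', 'plan_route(dest)', 'drive(dest)', 'putdown(pkg)']
```

Planning at the abstract level ("deliver") keeps the search space small; detail is introduced only where a decomposition demands it, which is the efficiency argument against uniform granularity.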
Universal Quantifiers in Planning
• Quantification within Operators
  • p. 383 R&N
• Examples (see the sketch after this list)
  • Shakey’s World
  • Blocks World
  • Grocery shopping
  • Others (from projects?)
• Exercise for Next Tuesday: Blocks World
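A small sketch of a universally quantified effect, in the spirit of the grocery-shopping example; the state encoding and the `carry` operator below are assumptions made for illustration, not the book's notation.

```python
# Sketch of a universally quantified operator effect (illustrative encoding):
# carrying a bag to a destination moves every item inside it, i.e. the effect
# includes  forall x . In(x, bag) => At(x, dest).

def carry(state, bag, dest):
    new_state = {f for f in state if not (f[0] == "At" and f[1] == bag)}
    new_state.add(("At", bag, dest))
    for (pred, *args) in state:
        if pred == "In" and args[1] == bag:      # the quantified part of the effect
            item = args[0]
            new_state = {f for f in new_state if not (f[0] == "At" and f[1] == item)}
            new_state.add(("At", item, dest))
    return new_state

state = {("At", "bag", "home"), ("In", "milk", "bag"), ("At", "milk", "home")}
print(sorted(carry(state, "bag", "store")))
# [('At', 'bag', 'store'), ('At', 'milk', 'store'), ('In', 'milk', 'bag')]
```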
Practical Planning
• The Real World
  • What can go wrong with classical planning?
  • What are possible solution approaches?
• Conditional Planning
• Monitoring and Replanning (Next Time)
Adapted from Russell and Norvig
Review: How Things Go Wrong in Planning
Adapted from slides by S. Russell, UC Berkeley
Review: Practical Planning Solutions
Adapted from slides by S. Russell, UC Berkeley
Conditional Planning
Adapted from slides by S. Russell, UC Berkeley
Monitoring and Replanning
Preconditions for Remaining Plan
Adapted from slides by S. Russell, UC Berkeley
Replanning
Adapted from slides by S. Russell, UC Berkeley
Making Decisions under Uncertainty
Adapted from slides by S. Russell, UC Berkeley
Probability: Basic Definitions and Axioms
• Sample Space (Ω): Range of a Random Variable X
• Probability Measure Pr(Ω)
  • Ω denotes a range of “events”; X takes values in Ω
  • Probability Pr, or P, is a measure over the power set 2^Ω
  • In a general sense, Pr(X = x) is a measure of belief in X = x
    • P(X = x) = 0 or P(X = x) = 1: plain (aka categorical) beliefs (can’t be revised)
    • All other beliefs are subject to revision
• Kolmogorov Axioms
  • 1. ∀ x ∈ Ω . 0 ≤ P(X = x) ≤ 1
  • 2. P(Ω) ≡ Σx ∈ Ω P(X = x) = 1
  • 3. For mutually exclusive (disjoint) events X1, X2, …: P(X1 ∨ X2 ∨ …) = Σi P(Xi)
• Joint Probability: P(X1 ∧ X2) ≡ Probability of the Joint Event X1 ∧ X2
• Independence: P(X1 ∧ X2) = P(X1) P(X2) (see the numeric check after this list)
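These definitions can be checked numerically. The sketch below assumes an illustrative uniform joint distribution over two small variables (the values are not from the lecture), and verifies axioms 1 and 2 plus the independence condition.

```python
# Quick numeric check of the definitions above (illustrative values only).
from itertools import product

# A joint distribution P(X1, X2) over a small sample space: uniform, hence independent.
P = {(x1, x2): 0.25 for x1, x2 in product(["a", "b"], [0, 1])}

# Kolmogorov axioms 1 and 2: each probability lies in [0, 1], and the measure sums to 1.
assert all(0.0 <= p <= 1.0 for p in P.values())
assert abs(sum(P.values()) - 1.0) < 1e-9

# Marginals, then independence: P(X1 ^ X2) = P(X1) * P(X2) for every value pair.
P_X1 = {x1: sum(p for (a, _), p in P.items() if a == x1) for x1 in ["a", "b"]}
P_X2 = {x2: sum(p for (_, b), p in P.items() if b == x2) for x2 in [0, 1]}
assert all(abs(P[(x1, x2)] - P_X1[x1] * P_X2[x2]) < 1e-9 for (x1, x2) in P)
```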
Basic Formulas for Probabilities
• Product Rule (Alternative Statement of Bayes’s Theorem)
  • P(A ∧ B) = P(A | B) P(B)
  • Proof: requires axiomatic set theory, as does Bayes’s Theorem
• Sum Rule
  • P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
  • Sketch of proof (immediate from axiomatic set theory)
    • Draw a Venn diagram of two sets denoting events A and B
    • Let A ∨ B denote the event corresponding to A ∪ B …
• Theorem of Total Probability
  • Suppose events A1, A2, …, An are mutually exclusive and exhaustive
    • Mutually exclusive: i ≠ j ⇒ Ai ∧ Aj = ∅
    • Exhaustive: Σi P(Ai) = 1
  • Then P(B) = Σi P(B | Ai) P(Ai)
  • Proof: follows from product rule and 3rd Kolmogorov axiom
(All three formulas are checked numerically in the sketch below.)
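The three formulas can be verified on a small, arbitrary joint distribution; the numbers below are illustrative assumptions, not from the lecture, and any consistent joint over two binary events would do.

```python
# Numeric sanity check of the product rule, sum rule, and total probability
# (illustrative numbers; any consistent joint distribution would do).

# Joint distribution over A in {T, F} and B in {T, F}.
joint = {("T", "T"): 0.20, ("T", "F"): 0.30, ("F", "T"): 0.10, ("F", "F"): 0.40}

P_A = joint[("T", "T")] + joint[("T", "F")]          # P(A)
P_B = joint[("T", "T")] + joint[("F", "T")]          # P(B)
P_A_and_B = joint[("T", "T")]                        # P(A ^ B)

# Product rule: P(A ^ B) = P(A | B) P(B)
P_A_given_B = P_A_and_B / P_B
assert abs(P_A_and_B - P_A_given_B * P_B) < 1e-9

# Sum rule: P(A v B) = P(A) + P(B) - P(A ^ B)
P_A_or_B = 1.0 - joint[("F", "F")]
assert abs(P_A_or_B - (P_A + P_B - P_A_and_B)) < 1e-9

# Total probability, with the mutually exclusive, exhaustive events B and not-B:
P_A_given_notB = joint[("T", "F")] / (1.0 - P_B)
assert abs(P_A - (P_A_given_B * P_B + P_A_given_notB * (1.0 - P_B))) < 1e-9
```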