
Bayesian Networks: Representing Knowledge in an Uncertain Domain

This chapter explores the use of Bayesian Networks for representing knowledge in domains with uncertainty. It covers topics such as random variables, conditional probability tables, network structure, efficient representation of conditional distributions, and exact inference in Bayesian Networks. The chapter also includes examples and discusses the semantics and effectiveness of Bayesian Networks.




  1. Chapter 14 (February 26, 2004)

  2. 14.1 Representing Knowledge in an Uncertain Domain • Bayesian Networks • random variables • directed links (X → Y means X directly influences Y) • conditional probability tables • directed, acyclic graph • Example: Figure 14.1 • Example: Figure 14.2
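Figure 14.2 in the textbook is the well-known burglary-alarm network. Below is a minimal sketch of how such a network's structure and CPTs might be encoded in Python; the dict layout and helper are illustrative, not from the slides, though the CPT values are the textbook's:

```python
# A minimal encoding of the burglary-alarm network (AIMA Figure 14.2).
# Each node lists its parents and a CPT mapping parent-value tuples to
# P(node = True | parents).
network = {
    "Burglary":   {"parents": [], "cpt": {(): 0.001}},
    "Earthquake": {"parents": [], "cpt": {(): 0.002}},
    "Alarm": {
        "parents": ["Burglary", "Earthquake"],
        "cpt": {(True, True): 0.95, (True, False): 0.94,
                (False, True): 0.29, (False, False): 0.001},
    },
    "JohnCalls": {"parents": ["Alarm"],
                  "cpt": {(True,): 0.90, (False,): 0.05}},
    "MaryCalls": {"parents": ["Alarm"],
                  "cpt": {(True,): 0.70, (False,): 0.01}},
}

def prob(var, value, assignment):
    """P(var = value | parent values taken from assignment)."""
    node = network[var]
    key = tuple(assignment[p] for p in node["parents"])
    p_true = node["cpt"][key]
    return p_true if value else 1.0 - p_true
```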

  3. 14.2 The Semantics of Bayesian Networks • Determining the full joint distribution • P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j | a) · P(m | a) · P(a | ¬b ∧ ¬e) · P(¬b) · P(¬e) • P(x1, x2, x3) = P(x3 | x1, x2) · P(x1, x2) • P(x1, x2) = P(x2 | x1) · P(x1)
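As a concrete check, the product can be evaluated with the CPT values from Figure 14.2 (a small sketch; the numbers are the textbook's burglary-network values):

```python
# Chain-rule evaluation of P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) using the
# textbook's CPT values for the burglary network.
p_j_given_a = 0.90             # P(JohnCalls | Alarm)
p_m_given_a = 0.70             # P(MaryCalls | Alarm)
p_a_given_not_b_not_e = 0.001  # P(Alarm | ¬Burglary, ¬Earthquake)
p_not_b = 1.0 - 0.001          # P(¬Burglary)
p_not_e = 1.0 - 0.002          # P(¬Earthquake)

joint = (p_j_given_a * p_m_given_a * p_a_given_not_b_not_e
         * p_not_b * p_not_e)
print(joint)  # ≈ 0.000628
```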

  4. Bayesian Networks can be compact • n Boolean random variables • k = upper bound on the number of incoming arrows per node • full joint needs 2^n probabilities vs. n · 2^k for the network
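To make the gap concrete, here is the count for an illustrative case (n = 30 and k = 5 are chosen for illustration, not taken from the slides):

```python
n, k = 30, 5                 # illustrative sizes
full_joint = 2 ** n          # 1,073,741,824 entries
bayes_net = n * 2 ** k       # 960 entries
print(full_joint, bayes_net)
```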

  5. Network structure depends on order of introduction • Figure 14.3 • Causal orderings (cause → effect) typically yield smaller, more natural networks than diagnostic orderings (effect → cause)

  6. Conditional independence relations in Bayesian Networks • Figure 14.4

  7. 14.3 Efficient Representation of Conditional Distributions • Noisy-OR, p. 501 • Hybrid Bayesian Networks (Figures 14.5-14.7) • discrete → discrete • discrete → continuous • continuous → discrete • continuous → continuous
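A noisy-OR node assumes each parent can independently fail to cause the effect; with inhibition probabilities q_i for each true parent, the effect is absent only if every active cause fails. A minimal sketch (the fever example and its q values follow the textbook's noisy-OR illustration; the function itself is illustrative):

```python
# Noisy-OR: each true parent independently "fails" to cause the
# effect with its inhibition probability q; the effect is absent
# only if every active cause fails.
def noisy_or(q, parent_values):
    """P(effect = True | parents). q[i] is the inhibition
    probability of parent i; parent_values[i] is True/False."""
    p_false = 1.0
    for qi, on in zip(q, parent_values):
        if on:
            p_false *= qi
    return 1.0 - p_false

# Textbook fever example: q = (cold 0.6, flu 0.2, malaria 0.1).
print(noisy_or([0.6, 0.2, 0.1], [True, True, False]))  # 0.88
```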

  8. 14.4 Exact Inference in Bayesian Networks • The section describes techniques for making exact inference more efficient. • Clustering, Figure 14.11 • Goal is to produce a polytree • Often used in commercial Bayesian-network systems • No magic bullet: exact inference is intractable (NP-hard) in the worst case
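For reference, below is a sketch of inference by enumeration, the baseline exact algorithm that clustering and related techniques accelerate. It reuses the `network` dict and `prob` helper from the first sketch above; the variable ordering and query names are illustrative:

```python
# Inference by enumeration over the burglary network, summing out
# hidden variables in topological order.
ORDER = ["Burglary", "Earthquake", "Alarm", "JohnCalls", "MaryCalls"]

def enumerate_all(vars, ev):
    if not vars:
        return 1.0
    first, rest = vars[0], vars[1:]
    if first in ev:                      # evidence: value is fixed
        return prob(first, ev[first], ev) * enumerate_all(rest, ev)
    return sum(prob(first, v, ev)        # hidden: sum over both values
               * enumerate_all(rest, {**ev, first: v})
               for v in (True, False))

def query(var, ev):
    """Return P(var = True | ev) by normalizing over both values."""
    dist = [enumerate_all(ORDER, {**ev, var: v}) for v in (True, False)]
    return dist[0] / sum(dist)

# Textbook query: P(Burglary | JohnCalls, MaryCalls) ≈ 0.284.
print(query("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```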

  9. Midterm Review • Thursday, March 4th • Open book, open notes, etc. • Bring a calculator • Major topics are …

  10. 9: Inference in First-Order Logic • Unification • Forward Chaining • Backward Chaining • Prolog • Resolution Theorem Proving • Resolution Strategies

  11. 10: Knowledge Representation • Ontologies • Situation Calculus • Intervals • Frame Problem • Semantic Networks • Closed World Assumption • Unique Names Assumption

  12. 18: Learning from Observations • Decision Trees • Ensemble Learning / AdaBoost • PAC learning

  13. 19: Knowledge in Learning • Version Space • Explanation Based Learning

  14. 20: Statistical Learning Methods • Maximum-likelihood parameter learning: discrete models • Naive Bayes models • K nearest neighbors • Perceptrons • Backpropagation Neural Networks

  15. 13: Uncertainty • Terminology • Conditional Probability • Axioms of Probability • Inference Using Full Joint Distributions • Independence • Bayes' Rule

  16. 14: Probabilistic Reasoning • Bayesian Networks • Construction • Reasoning With
