
Probabilistic (Bayesian) representations of knowledge have had a major impact on AI






Presentation Transcript


  1. Probabilistic (Bayesian) representations of knowledge have had a major impact on AI
  • contrast with symbolic/logical knowledge bases
  • necessity of handling uncertainty in real-world applications
  • recent advances allow scaling up to larger networks
  • Example applications of Bayesian networks:
    • HCI: inferring intent in conversation/action, plan recognition, intelligent tutoring
    • vision: image interpretation, de-noising
    • control: variables that influence flight
    • medicine
    • economics

  2. Structure and Semantics of BNs
  • draw causal nodes first
  • draw directed edges to effects ("direct causes")
  • each node encodes a conditional probability table (CPT) over its parents
  • far fewer parameters than the full joint PDF
  • absence of a link encodes a conditional-independence assumption

  3. Types of independence
  • a node is independent of its non-descendants given its parents
  • Markov blanket: a node is independent of all other nodes given its parents, children, and children's other parents
  • d-separation: all paths between A and B are "blocked" by the evidence
  • useful for determining whether obtaining knowledge of B would change belief about A

  4. Two-node network A → B
  • child is conditionally dependent on parent: P(B|A)
  • parent is conditionally dependent on child: P(A|B) = P(B|A)P(A)/P(B)
  • what about when one node is not an ancestor of the other? e.g. siblings A and B (sharing a parent C) are only conditionally independent given C
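Inverting the parent–child conditional with Bayes' rule, as in the slide, can be sketched in a few lines. The probability values below are invented purely for illustration:

```python
# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B).
# All numbers here are made up for illustration, not from the slides.
p_a = 0.2                 # prior P(A)
p_b_given_a = 0.9         # likelihood P(B|A)
p_b_given_not_a = 0.1     # P(B|~A)

# Marginal P(B) by the law of total probability
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Posterior: belief in the parent A after observing the child B
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # 0.692
```

Observing the child B raises belief in A from the prior 0.2 to about 0.69 here, because B is much more likely when A holds.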

  5. Network topologies
  • simple trees
  • poly-trees (singly connected: at most one path between any pair of nodes)
  • multiply connected ("cyclic" when viewed with undirected edges): much harder to do computations
  • explaining away: P(sprinkler | wetGrass) = 0.43, but P(sprinkler | wetGrass, rain) = 0.19
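The explaining-away numbers on this slide match the standard AIMA sprinkler network (Cloudy → Sprinkler, Cloudy → Rain, both → WetGrass). Assuming those textbook CPTs, since the slide gives only the posteriors, a brute-force enumeration reproduces them:

```python
from itertools import product

# AIMA sprinkler network; CPT values are the textbook's (an assumption here,
# since the slide states only the resulting posteriors).
def joint(c, s, r, w):
    p_c = 0.5
    p_s1 = 0.1 if c else 0.5              # P(Sprinkler=1 | Cloudy)
    p_r1 = 0.8 if c else 0.2              # P(Rain=1 | Cloudy)
    p_w1 = {(1, 1): 0.99, (1, 0): 0.90,   # P(WetGrass=1 | Sprinkler, Rain)
            (0, 1): 0.90, (0, 0): 0.0}[(s, r)]
    return (p_c * (p_s1 if s else 1 - p_s1)
                * (p_r1 if r else 1 - p_r1)
                * (p_w1 if w else 1 - p_w1))

def posterior_sprinkler(evidence):
    # P(Sprinkler=1 | evidence) by summing the joint over consistent states
    num = den = 0.0
    for c, s, r, w in product([0, 1], repeat=4):
        state = {"c": c, "s": s, "r": r, "w": w}
        if any(state[k] != v for k, v in evidence.items()):
            continue
        p = joint(c, s, r, w)
        den += p
        if s:
            num += p
    return num / den

print(round(posterior_sprinkler({"w": 1}), 2))            # 0.43
print(round(posterior_sprinkler({"w": 1, "r": 1}), 2))    # 0.19
```

Learning that it rained "explains away" the wet grass, so belief in the sprinkler drops from 0.43 to 0.19 even though more evidence was added.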

  6. Compact representations of CPTs
  • Noisy-OR
  • probabilistic version of: cold ∨ flu ∨ malaria → fever
  • only have to represent 3 numbers ("strengths," one per cause) instead of all 2³ = 8 CPT rows
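A minimal sketch of how noisy-OR expands three per-cause strengths into the full eight-row CPT; the strength values below are invented for illustration:

```python
from itertools import product

# Noisy-OR: each active cause independently *fails* to produce the effect
# with probability (1 - strength), so
#   P(fever=1 | causes) = 1 - prod over active causes of (1 - strength_i).
# The three strengths are illustrative values, not from the slides.
strengths = {"cold": 0.4, "flu": 0.8, "malaria": 0.9}

def p_fever(active):
    miss = 1.0
    for cause, q in strengths.items():
        if cause in active:
            miss *= 1.0 - q
    return 1.0 - miss

# Expand the full CPT: 2^3 = 8 rows from only 3 parameters.
for row in product([0, 1], repeat=3):
    active = {c for c, on in zip(strengths, row) if on}
    print(row, round(p_fever(active), 3))
```

With no active causes the noisy-OR gives probability 0 (a "leak" term can be added as a fourth parameter if fever can occur spontaneously).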

  7. Network Engineering for Complex Belief Networks, Mahoney and Laskey

  8. A Bayesian network approach to threat valuation with application to an air defense scenario, Johansson and Falkman

  9. Lumiere – Office Assistant

  10. Inference Tasks
  • posterior: P(Xi | {Zi})
    • Zi: observed variables; unobserved variables Yi are marginalized out
    • prediction vs. diagnosis
    • evidence combination is crucial
    • handling unobserved variables is crucial
  • all marginals: P(Ai) – like priors, but for interior nodes too
  • sub-joint: P(A, B)
  • boolean queries
  • most probable explanation (MPE): argmax{Yi} P(Yi ∪ Zi) – the complete state with highest joint probability
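The MPE task above can be sketched by brute force on a toy chain A → B → C with invented CPTs: fix the evidence, then take the argmax of the joint over the remaining variables (rather than summing them out, as a posterior query would):

```python
from itertools import product

# Toy chain A -> B -> C; all CPT values are invented for illustration.
p_a = {1: 0.3, 0: 0.7}     # P(A)
p_b1 = {1: 0.9, 0: 0.2}    # P(B=1 | A=a)
p_c1 = {1: 0.8, 0: 0.1}    # P(C=1 | B=b)

def joint(a, b, c):
    pb = p_b1[a] if b else 1 - p_b1[a]
    pc = p_c1[b] if c else 1 - p_c1[b]
    return p_a[a] * pb * pc

def mpe(evidence):
    # argmax over unobserved variables of the joint, with evidence clamped
    best, best_p = None, -1.0
    for a, b, c in product([0, 1], repeat=3):
        state = {"a": a, "b": b, "c": c}
        if any(state[k] != v for k, v in evidence.items()):
            continue
        p = joint(a, b, c)
        if p > best_p:
            best, best_p = state, p
    return best, best_p

state, p = mpe({"c": 1})
print(state, round(p, 3))  # {'a': 1, 'b': 1, 'c': 1} 0.216
```

Enumerating all states is exponential in the number of variables; the Enumeration and VariableElimination slides referenced below address doing this more efficiently.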

  11. (see slides 4-10 in http://aima.eecs.berkeley.edu/slides-pdf/chapter14b.pdf for discussion of Enumeration and VariableElimination)

  12. Inference in Bayesian Networks, D’Ambrosio

  13. Belief Propagation (figure from http://www.pr-owl.org/basics/bn.php); see also: Wikipedia, and Ch. 8 of Bishop, Pattern Recognition and Machine Learning
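On a chain A → B → C (CPTs invented for illustration), belief propagation reduces to passing a forward "message" (a distribution over the current variable) through each CPT, instead of enumerating all joint states:

```python
# Belief propagation on a chain A -> B -> C, with invented CPTs.
p_a = [0.7, 0.3]                  # [P(A=0), P(A=1)]
p_b_given_a = [[0.8, 0.2],        # row a: [P(B=0|a), P(B=1|a)]
               [0.1, 0.9]]
p_c_given_b = [[0.9, 0.1],        # row b: [P(C=0|b), P(C=1|b)]
               [0.2, 0.8]]

def forward(message, cpt):
    # New message over the child: sum_parent message[parent] * CPT[parent][child]
    return [sum(message[p] * cpt[p][child] for p in (0, 1))
            for child in (0, 1)]

msg_b = forward(p_a, p_b_given_a)     # marginal P(B)
msg_c = forward(msg_b, p_c_given_b)   # marginal P(C)
print([round(x, 3) for x in msg_c])   # [0.613, 0.387]
```

On trees and poly-trees this message-passing scheme stays exact and linear in the number of nodes; on multiply connected ("cyclic") networks it becomes the approximate loopy belief propagation.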
