
Bayesian Networks


Presentation Transcript


  1. Bayesian Networks

  2. Motivation
  • encode uncertain expert knowledge
  • other approaches: possibility theory, fuzzy arithmetic, …
  • encode causality relationships
  • truly “understand” a domain (AI)
  • given a set of random variables, what is their Joint Probability Distribution (JPD)?
  • Bayesian approach: prior probability + likelihood → posterior probability
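The "prior + likelihood → posterior" step is just Bayes' rule. A minimal sketch with invented numbers (a disease/test example; none of these probabilities come from the slides):

```python
# Bayes' rule: posterior = likelihood * prior / evidence.
# All numbers below are illustrative assumptions.

prior = 0.01          # P(disease), assumed
sensitivity = 0.95    # P(positive | disease), assumed
false_pos = 0.05      # P(positive | no disease), assumed

# evidence P(positive), by the law of total probability
evidence = sensitivity * prior + false_pos * (1 - prior)

# posterior P(disease | positive)
posterior = sensitivity * prior / evidence
print(round(posterior, 3))  # ≈ 0.161
```

Note how a rare prior keeps the posterior low even with a sensitive test; this is the kind of reasoning a Bayesian network automates over many variables.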

  3. Qualitative (topology)
  • discrete random variables: A {sa1, sa2}, B {sb1, sb2}, C {sc1, sc2}
  • “A influences B”; “C is independent of A and B”
  Quantitative (probabilities)
  • “If I observe certain states of A, I can draw consequences regarding B”

  4. [Diagram: example network over A {sa1, sa2}, B {sb1, sb2}, C {sc1, sc2}]
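The three-node example from the slides (A influences B, C independent of both) can be sketched as a joint distribution that factorises along the graph, P(A, B, C) = P(A) · P(B | A) · P(C). All CPT entries below are invented for illustration:

```python
# Toy Bayesian network A -> B, C isolated; probabilities are assumptions.
p_a = {"sa1": 0.6, "sa2": 0.4}                    # prior P(A)
p_b_given_a = {"sa1": {"sb1": 0.9, "sb2": 0.1},   # CPT P(B | A)
               "sa2": {"sb1": 0.2, "sb2": 0.8}}
p_c = {"sc1": 0.5, "sc2": 0.5}                    # prior P(C)

def joint(a, b, c):
    """JPD entry via the factorisation P(A)*P(B|A)*P(C)."""
    return p_a[a] * p_b_given_a[a][b] * p_c[c]

# a valid JPD sums to 1 over all state combinations
total = sum(joint(a, b, c)
            for a in p_a for b in ("sb1", "sb2") for c in p_c)
print(round(total, 10))  # 1.0
```

The payoff of the factorisation: the full JPD over n binary variables has 2^n entries, but the network stores only the small local tables.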

  5. Inference
  • predictive, top-down reasoning: P(symptom | cause)
  • diagnostic, bottom-up reasoning: P(cause | symptom)
  • exact inference: NP-hard
    • Message Passing Algorithm
    • Cycle-Cutset Conditioning
  • approximate inference
    • Monte Carlo sampling (e.g. MCMC)
    • Loopy Belief Propagation
  [Diagram: cause → symptom]
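On the two-node cause → symptom network, both reasoning directions can be carried out exactly by enumeration. A sketch with assumed probabilities:

```python
# Predictive vs. diagnostic inference on cause -> symptom.
# All probabilities are illustrative assumptions.
p_cause = {True: 0.2, False: 0.8}      # P(cause)
p_sym_given = {True: 0.9, False: 0.1}  # P(symptom | cause)

# predictive, top-down: marginalise out the cause to get P(symptom)
p_symptom = sum(p_cause[c] * p_sym_given[c] for c in (True, False))

# diagnostic, bottom-up: Bayes' rule gives P(cause | symptom)
p_cause_given_sym = p_cause[True] * p_sym_given[True] / p_symptom

print(round(p_symptom, 2), round(p_cause_given_sym, 3))  # 0.26 0.692
```

Enumeration like this is exponential in the number of variables, which is why the algorithms listed above (message passing, cutset conditioning, sampling) exist.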

  6. Learning
  training data + expert knowledge → structure & parameters of BN
  • known structure + full observations:
    • Maximum Likelihood Estimation
  • known structure + partial observations:
    • Expectation Maximization (EM): local optimum of the likelihood
    • Markov Chain Monte Carlo (MCMC)
  • unknown structure: NP-hard (many possible graph topologies)
    • K2: find the most probable structures based on some expert knowledge
    • simplifying assumption: independent variables with a common parent node
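For the simplest case above (known structure, full observations), Maximum Likelihood Estimation reduces to counting: each CPT entry is an observed relative frequency. A sketch for a two-node network A → B with an invented data set:

```python
# MLE parameter learning for known structure A -> B, fully observed.
# The observations below are made up for illustration.
from collections import Counter

data = [("sa1", "sb1"), ("sa1", "sb1"), ("sa1", "sb2"),
        ("sa2", "sb2"), ("sa2", "sb2"), ("sa2", "sb1")]

n_a = Counter(a for a, _ in data)   # N(a)
n_ab = Counter(data)                # N(a, b)

# MLE: P(A=a) = N(a)/N   and   P(B=b | A=a) = N(a,b)/N(a)
p_a = {a: n / len(data) for a, n in n_a.items()}
p_b_given_a = {(a, b): n / n_a[a] for (a, b), n in n_ab.items()}

print(p_a["sa1"], round(p_b_given_a[("sa1", "sb1")], 3))
```

With partial observations these counts are unavailable, which is where EM comes in: it fills in expected counts under the current parameters, then re-estimates, converging to a local optimum of the likelihood.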

  7. Applications: Dependability Modelling

  8. Applications: Online Fault Diagnosis http://www.research.ibm.com/people/r/rish/papers/AAAI02symp-probe.pdf

  9. Applications: Transaction Recognition
  • predict transitions based on server-side RPC sequences
  • http://www.google.de/patents/US6925452

  10. A statistical battlefield…
  • BN inference → Bayesian approach
  • BN learning → frequentist approach
  http://oikosjournal.wordpress.com/2011/10/11/frequentist-vs-bayesian-statistics-resources-to-help-you-choose/
