Learn about the structure and semantics of Bayesian Networks, including drawing causal nodes and directed edges, encoding conditional probability tables, and conducting inference. Discover how Bayesian Nets calculate posterior probabilities and handle conditional independence. Gain insights into the advantages of Bayesian Networks over full joint probability tables.
Structure and Semantics of BN
• draw causal nodes first
• draw directed edges to effects ("direct causes")
• links encode conditional probability tables
• absence of a link implies conditional independence given parents
• advantage: fewer parameters than the full joint probability table (2^5 = 32 entries in this case)
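The parameter-count advantage is easy to see in code. Below is a minimal sketch of the five-node burglary network (Burglary, Earthquake, Alarm, JohnCalls, MaryCalls) with each conditional probability table stored as plain Python data; the variable names and the numeric CPT values (the classic textbook figures) are assumptions for illustration, not taken from this slide.

```python
# Minimal sketch of the five-node burglary network.  Each table stores only
# P(X = True | parents); the numeric values are the classic textbook figures
# and are assumed here, not given on this slide.

P_B = 0.001                                   # P(Burglary = True)
P_E = 0.002                                   # P(Earthquake = True)
P_A = {                                       # P(Alarm = True | Burglary, Earthquake)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}
P_J = {True: 0.90, False: 0.05}               # P(JohnCalls = True | Alarm)
P_M = {True: 0.70, False: 0.01}               # P(MaryCalls = True | Alarm)

# Independent parameters stored by the network: 1 + 1 + 4 + 2 + 2 = 10,
# versus 2**5 = 32 entries in the full joint table over five binary variables.
bn_params = 2 + len(P_A) + len(P_J) + len(P_M)
print(bn_params, 2 ** 5)                      # -> 10 32
```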
Inference in Bayesian Nets
• Objective: calculate the posterior probability of a query variable X conditioned on evidence Y, marginalizing over the unobserved variables Z.
• Example: suppose we want the probability that there was a burglary (B) given that John calls (j) and Mary calls (m), but the alarm and the earthquake are unobserved.
• The conditional probability is proportional to the joint probability, and we then marginalize over the unobserved variables e and a:
  P(B | j, m) = α P(B, j, m)
              = α Σe Σa P(B, e, a, j, m)                              (full joint)
              = α Σe Σa P(B) P(e) P(a | B, e) P(j | a) P(m | a)       (rearrange using the network factorization)
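The following sketch carries out exactly this enumeration step, reusing the CPT tables defined in the earlier sketch (so the two blocks together form one runnable script): it sums the full joint over the unobserved variables e and a and then normalizes. With the assumed textbook CPT values it gives P(B | j, m) ≈ 0.284; this is an illustrative implementation under those assumptions, not the slide's own code.

```python
# Inference by enumeration for P(B | j, m): marginalize the full joint over
# the unobserved variables e and a, then normalize.
from itertools import product

def cpt(p_true, value):
    """P(X = value) given P(X = True)."""
    return p_true if value else 1.0 - p_true

def joint(b, e, a, j, m):
    """Full joint P(b, e, a, j, m), factored along the network structure."""
    return (cpt(P_B, b) * cpt(P_E, e) * cpt(P_A[(b, e)], a) *
            cpt(P_J[a], j) * cpt(P_M[a], m))

# Unnormalized P(B, j=True, m=True), summed over e and a.
unnorm = {b: sum(joint(b, e, a, True, True)
                 for e, a in product([True, False], repeat=2))
          for b in (True, False)}
alpha = 1.0 / (unnorm[True] + unnorm[False])
print(alpha * unnorm[True])   # ≈ 0.284 with the assumed textbook CPT values
```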