
Review: Bayesian networks



Presentation Transcript


  1. Review: Bayesian networks • Example: Cloudy, Sprinkler, Rain, Wet Grass
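The slide's network diagram and probability tables are not reproduced in the transcript. The sketch below encodes the four-node network in plain Python with the usual textbook CPT values; those numbers are an assumption, not taken from the slides.

```python
# Sketch of the Cloudy/Sprinkler/Rain/WetGrass network as plain dictionaries.
# The CPT numbers are the usual textbook values and are assumed here, since
# the slide's figure is not reproduced in the transcript.

P_C = 0.5                                   # P(Cloudy = true)
P_S_given_C = {True: 0.10, False: 0.50}     # P(Sprinkler = true | Cloudy)
P_R_given_C = {True: 0.80, False: 0.20}     # P(Rain = true | Cloudy)
P_W_given_SR = {                            # P(WetGrass = true | Sprinkler, Rain)
    (True, True): 0.99,
    (True, False): 0.90,
    (False, True): 0.90,
    (False, False): 0.00,
}

def joint(c, s, r, w):
    """Full joint P(C, S, R, W) as a product of the local CPTs."""
    p = P_C if c else 1 - P_C
    p *= P_S_given_C[c] if s else 1 - P_S_given_C[c]
    p *= P_R_given_C[c] if r else 1 - P_R_given_C[c]
    p *= P_W_given_SR[(s, r)] if w else 1 - P_W_given_SR[(s, r)]
    return p
```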

  2. Review: Probabilistic inference • A general scenario: • Query variables: X • Evidence (observed) variables and their values: E = e • Unobserved variables: Y • Inference problem: answer questions about the query variables given the evidence variables • This can be done using the posterior distribution P(X | E = e) • In turn, the posterior needs to be derived from the full joint P(X, E, Y) • Since Bayesian networks can afford exponential savings in representing joint distributions, can they afford similar savings for inference?
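As a concrete sketch of the scenario above: enumerate all assignments of the full joint, keep the ones consistent with the evidence E = e, sum out the unobserved variables Y, and renormalize. The function is generic over any joint(c, s, r, w) callable, for instance the one from the previous sketch; the variable names are illustrative.

```python
from itertools import product

def posterior(joint, query_var, evidence):
    """P(query_var | evidence), obtained by summing the full joint over the
    unobserved variables and renormalizing.  `evidence` maps names to values."""
    names = ["c", "s", "r", "w"]
    dist = {True: 0.0, False: 0.0}
    for values in product([True, False], repeat=4):
        assignment = dict(zip(names, values))
        # Skip assignments inconsistent with the evidence E = e.
        if any(assignment[v] != val for v, val in evidence.items()):
            continue
        dist[assignment[query_var]] += joint(*values)
    z = dist[True] + dist[False]             # normalizing constant P(E = e)
    return {v: p / z for v, p in dist.items()}

# Example, using the joint() from the sketch above:
# posterior(joint, "r", {"w": True})   ->  P(Rain | WetGrass = true)
```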

  3. Bayesian network inference • In full generality, NP-hard • More precisely, #P-hard: equivalent to counting satisfying assignments • We can reduce satisfiability to Bayesian network inference • Decision problem: is P(Y) > 0? • [Figure: a SAT formula encoded as a Bayesian network with clause nodes C1, C2, C3] (G. Cooper, 1990)
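A toy version of the reduction the slide alludes to (the C1, C2, C3 nodes are the clause nodes of its figure): make each propositional variable an independent, uniform root node, each clause a deterministic OR of its literals, and Y a deterministic AND of the clauses. Then P(Y = true) > 0 exactly when the formula is satisfiable. The three-clause formula below is an arbitrary example, not taken from the slides.

```python
from itertools import product

# Each clause is a list of literals (variable index, sign); sign True means the
# positive literal.  This formula is an arbitrary illustrative example.
clauses = [[(0, True), (1, False)],          # C1 = x0 OR not x1
           [(1, True), (2, True)],           # C2 = x1 OR x2
           [(0, False), (2, False)]]         # C3 = not x0 OR not x2

def prob_Y_true(clauses, n_vars):
    """P(Y = true) when Y is the deterministic AND of deterministic-OR clause
    nodes over independent uniform root variables."""
    total = 0.0
    for assignment in product([True, False], repeat=n_vars):
        satisfied = all(any(assignment[i] == sign for i, sign in clause)
                        for clause in clauses)
        if satisfied:
            total += 0.5 ** n_vars           # each root is uniform, independent
    return total

print(prob_Y_true(clauses, 3) > 0)           # True iff the formula is satisfiable
```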

  4. Bayesian network inference: Big picture • Exact inference is intractable • There exist techniques to speed up computations, but worst-case complexity is still exponential except in some classes of networks • Approximate inference (not covered) • Sampling, variational methods, message passing / belief propagation…

  5. Exact inference example • Query: P(B | j, m) = α Σe Σa P(B) P(e) P(a | B, e) P(j | a) P(m | a) • How to compute this sum efficiently?

  6. Exact inference example
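The sum referred to on these two slides is not reproduced in the transcript. The sketch below computes the query by plain enumeration, assuming the standard burglary/earthquake alarm network; the CPT numbers are the usual textbook values and are an assumption here.

```python
from itertools import product

# Inference by enumeration for P(B | j, m) in the alarm network.
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}    # P(Alarm | B, E)
P_J = {True: 0.90, False: 0.05}                        # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}                        # P(MaryCalls | Alarm)

def unnormalized(b):
    """Sum over e, a of P(b) P(e) P(a | b, e) P(j | a) P(m | a), with j = m = true."""
    total = 0.0
    for e, a in product([True, False], repeat=2):
        p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
        p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
        p *= P_J[a] * P_M[a]
        total += p
    return total

num = unnormalized(True)
den = num + unnormalized(False)
print(num / den)    # P(B = true | j, m), roughly 0.284 with these numbers
```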

  7. Exact inference with variable elimination • Key idea: compute the results of sub-expressions in a bottom-up way and cache them for later use • Form of dynamic programming • Exponential complexity in general • Polynomial time and space complexity for polytrees: networks with at most one undirected path between any two nodes
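A minimal sketch of the idea: represent each CPT as a factor (a variable list plus a table over boolean assignments), and eliminate each hidden variable by multiplying the factors that mention it and summing it out. The intermediate factors are exactly the cached sub-expressions the slide describes; the representation is illustrative, not a full implementation.

```python
from itertools import product

# A factor is (variables, table), where `table` maps a tuple of boolean values
# for `variables` (in order) to a number.

def multiply(f1, f2):
    """Pointwise product of two factors over the union of their variables."""
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    table = {}
    for vals in product([True, False], repeat=len(out_vars)):
        a = dict(zip(out_vars, vals))
        table[vals] = (t1[tuple(a[v] for v in vars1)] *
                       t2[tuple(a[v] for v in vars2)])
    return (out_vars, table)

def sum_out(var, factor):
    """Marginalize `var` out of a factor."""
    vars_, t = factor
    keep = [v for v in vars_ if v != var]
    table = {}
    for vals, p in t.items():
        key = tuple(v for name, v in zip(vars_, vals) if name != var)
        table[key] = table.get(key, 0.0) + p
    return (keep, table)

def eliminate(factors, hidden_order):
    """Eliminate hidden variables one at a time, caching intermediate factors."""
    for var in hidden_order:
        touching = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        combined = touching[0]
        for f in touching[1:]:
            combined = multiply(combined, f)
        factors = rest + [sum_out(var, combined)]
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result

# For the P(B | j, m) query above, each CPT (restricted to j = m = true)
# becomes a factor, and eliminate(factors, ["e", "a"]) leaves a factor over B.
```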

  8. Representing people

  9. Bayesian network inference: Big picture • In general, exact inference is intractable • Efficient inference via dynamic programming is possible for polytrees • In other practical cases, must resort to approximate methods (not covered in this class) • Sampling, variational methods, message passing / belief propagation…
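As a taste of the sampling methods named above, here is a rejection-sampling sketch for the sprinkler network: draw complete samples from the prior, throw away those that contradict the evidence, and estimate the query from the survivors. The CPT numbers are the same assumed textbook values as in the first sketch.

```python
import random

def sample_once(rng):
    """Draw one complete (Cloudy, Sprinkler, Rain, WetGrass) sample from the prior."""
    c = rng.random() < 0.5
    s = rng.random() < (0.10 if c else 0.50)
    r = rng.random() < (0.80 if c else 0.20)
    p_w = {(True, True): 0.99, (True, False): 0.90,
           (False, True): 0.90, (False, False): 0.00}[(s, r)]
    w = rng.random() < p_w
    return c, s, r, w

def rejection_sample(n=100_000, seed=0):
    """Estimate P(Rain = true | WetGrass = true) by rejection sampling."""
    rng = random.Random(seed)
    accepted = rain_count = 0
    for _ in range(n):
        c, s, r, w = sample_once(rng)
        if not w:              # reject samples inconsistent with the evidence
            continue
        accepted += 1
        rain_count += r
    return rain_count / accepted

print(rejection_sample())      # roughly 0.70 with these assumed CPTs
```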

  10. Parameter learning • Inference problem: given values of evidence variables E = e, answer questions about query variables X using the posterior P(X | E = e) • Learning problem: estimate the parameters of the probabilistic model P(X | E) given a training sample {(x1, e1), …, (xn, en)}

  11. Parameter learning • Suppose we know the network structure (but not the parameters), and have a training set of complete observations • [Figure: the network with unknown CPT entries (shown as ?) alongside a training-set table]

  12. Parameter learning • Suppose we know the network structure (but not the parameters), and have a training set of complete observations • P(X | Parents(X)) is given by the observed frequencies of the different values of X for each combination of parent values
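A small sketch of that frequency-counting estimator; the five-row training set below is made up for illustration.

```python
from collections import Counter

# Maximum-likelihood CPT estimation from complete observations, as on the slide:
# P(X = x | Parents(X) = u) is the observed frequency of x among the rows where
# the parents take value u.  Each row is (Cloudy, Sprinkler, Rain, WetGrass).
data = [(True, False, True, True),
        (True, False, True, True),
        (False, True, False, True),
        (False, False, False, False),
        (True, False, False, False)]

def estimate_cpt(data, child_index, parent_indices):
    """Return {parent_values: P(child = True | parent_values)}."""
    counts, trues = Counter(), Counter()
    for row in data:
        u = tuple(row[i] for i in parent_indices)
        counts[u] += 1
        trues[u] += row[child_index]
    return {u: trues[u] / counts[u] for u in counts}

# P(Rain = true | Cloudy):
print(estimate_cpt(data, child_index=2, parent_indices=[0]))
```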

  13. Parameter learning • Incomplete observations • Expectation maximization (EM) algorithm for dealing with missing data • [Figure: a training-set table with missing entries (shown as ?)]
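A minimal EM sketch under a deliberately small assumption: a two-node network C → R in which C is sometimes missing. The E-step replaces each missing C with its posterior probability under the current parameters; the M-step redoes the frequency estimates using those expected counts. The data and starting values are made up.

```python
# Rows are (C, R); None marks a missing observation of C.
data = [(True, True), (None, True), (False, False), (None, True), (True, False)]

p_c = 0.5                                   # current estimate of P(C = true)
p_r = {True: 0.6, False: 0.4}               # current estimate of P(R = true | C)

for _ in range(100):
    n_c = n_r_c = n_r_notc = 0.0
    for c, r in data:
        if c is None:                       # E-step: posterior P(C = true | r)
            like_t = p_r[True] if r else 1 - p_r[True]
            like_f = p_r[False] if r else 1 - p_r[False]
            w = p_c * like_t / (p_c * like_t + (1 - p_c) * like_f)
        else:
            w = 1.0 if c else 0.0
        n_c += w                            # expected count of C = true
        n_r_c += w * r                      # expected count of R = true, C = true
        n_r_notc += (1 - w) * r             # expected count of R = true, C = false
    # M-step: observed-frequency updates using the expected counts.
    p_c = n_c / len(data)
    p_r = {True: n_r_c / n_c, False: n_r_notc / (len(data) - n_c)}

print(p_c, p_r)
```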

  14. Parameter learning • What if the network structure is unknown? • Structure learning algorithms exist, but they are pretty complicated… • [Figure: a training-set table and the nodes C, S, R, W with unknown edge structure]

  15. Summary: Bayesian networks • Structure • Parameters • Inference • Learning
