
CSCE 582: Computation of the Most Probable Explanation in Bayesian Networks using Bucket Elimination

Hareesh Lingareddy, University of South Carolina


Presentation Transcript


  1. CSCE 582: Computation of the Most Probable Explanation in Bayesian Networks using Bucket Elimination - Hareesh Lingareddy, University of South Carolina

  2. Bucket Elimination • Algorithmic framework that generalizes dynamic programming to accommodate algorithms for many complex problem solving and reasoning activities. • Uses “buckets” to mimic the algebraic manipulations involved in each of these problems resulting in an easily expressible algorithmic formulation

  3. Bucket Elimination Algorithm • Partition the functions of the graph into “buckets”, backwards relative to the given node order • In the bucket of variable X, place all functions that mention X but do not mention any variable with a higher index • Process the buckets backwards relative to the node order • Each function computed during elimination is placed in the bucket of the highest-indexed variable in its scope
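The partitioning step above can be sketched in code. This is a minimal illustration of my own (not from the slides): each factor is represented only by its scope, and it lands in the bucket of its highest-ordered variable.

```python
# Toy sketch of bucket partitioning (illustrative names, not the slides' code).
# A factor goes in the bucket of the highest-ordered variable it mentions.

def partition_into_buckets(factor_scopes, ordering):
    """ordering[0] is the lowest variable; buckets are processed highest-first."""
    index = {var: i for i, var in enumerate(ordering)}
    buckets = {var: [] for var in ordering}
    for scope in factor_scopes:
        highest = max(scope, key=lambda v: index[v])
        buckets[highest].append(scope)
    return buckets

# Toy chain A -> B -> C with factors P(A), P(B|A), P(C|B), ordering A < B < C.
buckets = partition_into_buckets([{"A"}, {"A", "B"}, {"B", "C"}], ["A", "B", "C"])
# bucket C holds P(C|B); bucket B holds P(B|A); bucket A holds P(A)
```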

  4. Algorithms using Bucket Elimination • Belief Assessment • Most Probable Explanation (MPE) • Maximum A Posteriori Hypothesis (MAP) • Maximum Expected Utility (MEU)

  5. Belief Assessment • Definition - Given a set of evidence, compute the posterior probability of all the variables • The belief assessment task for X_k = x_k is to find P(x_k | e) = α Σ_{x \ {x_k}} Π_i P(x_i | pa(X_i), e), where α is a normalizing constant • In the Visit to Asia example, the belief assessment problem answers questions like: What is the probability that a person has tuberculosis, given that he/she has dyspnoea and has visited Asia recently?

  6. Belief Assessment Overview • In reverse node ordering: • Create the bucket function by multiplying all functions (given as tables) containing the current node • Perform variable elimination by summation over the current node • Place the newly created function table into the appropriate bucket
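The two operations in the overview (multiply the tables in a bucket, then sum out the bucket's variable) can be sketched on a toy two-variable network. This is my own minimal implementation under assumed numbers, not the slides' code; `multiply` and `sum_out` are illustrative names.

```python
# Minimal sketch of one bucket-elimination step for belief assessment:
# multiply the tables that mention a variable, then sum that variable out.
from itertools import product

def multiply(f, g):
    fv, ft = f                                   # (variables, table)
    gv, gt = g
    vars_ = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for assign in product([0, 1], repeat=len(vars_)):
        env = dict(zip(vars_, assign))
        table[assign] = ft[tuple(env[v] for v in fv)] * gt[tuple(env[v] for v in gv)]
    return vars_, table

def sum_out(f, var):
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for assign, p in ft.items():
        key = tuple(a for v, a in zip(fv, assign) if v != var)
        table[key] = table.get(key, 0.0) + p
    return keep, table

# Toy chain A -> B with P(A=1)=0.3, P(B=1|A=0)=0.2, P(B=1|A=1)=0.9.
P_A = (("A",), {(0,): 0.7, (1,): 0.3})
P_B_given_A = (("A", "B"), {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.9})

# Bucket of A holds both tables; eliminating A leaves a table over B, i.e. P(B).
vars_B, P_B = sum_out(multiply(P_A, P_B_given_A), "A")
# P(B=1) = 0.7*0.2 + 0.3*0.9 = 0.41
```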

  7. Most Probable Explanation (MPE) • Definition - Given evidence, find the maximum-probability assignment to the remaining variables • The MPE task is to find an assignment x^o = (x^o_1, …, x^o_n) such that P(x^o) = max_x Π_i P(x_i | pa(X_i), e)

  8. Differences from Belief Assessment • Replace sums with max • Keep track of the maximizing value at each stage • A “forward step” determines the maximizing assignment tuple

  9. Elimination Algorithm for Most Probable Explanation • In the Visit to Asia network with variables A (visit to Asia), S (smoker), T (tuberculosis), L (lung cancer), B (bronchitis), E (tuberculosis or lung cancer), X (X-ray), D (dyspnoea), finding the MPE means computing MPE = max_{a,s,t,l,b,e,x,d} P(A)·P(S)·P(T|A)·P(L|S)·P(B|S)·P(E|T,L)·P(X|E)·P(D|E,B) • Processing the bucket of X_n computes H_n(u) = max_{x_n} Π_{f in bucket n} f, and places H_n in the bucket of the highest remaining variable in its scope • Backward pass (evidence D = “no”): Bucket A: P(T|A)·P(A) → H_A(T); Bucket X: P(X|E) → H_X(E); Bucket D: P(D|E,B), D = “no” → H_D(E,B); Bucket E: P(E|T,L), H_D(E,B), H_X(E) → H_E(T,L,B); Bucket B: P(B|S), H_E(T,L,B) → H_B(T,L,S); Bucket S: P(L|S)·P(S), H_B(T,L,S) → H_S(T,L); Bucket L: H_S(T,L) → H_L(T); Bucket T: H_A(T), H_L(T) → MPE probability

  10. Elimination Algorithm for Most Probable Explanation - Forward part • Assign the variables in order, reusing the bucket functions recorded in the backward pass: t′ = arg max_T H_A(T)·H_L(T); l′ = arg max_L H_S(t′, L); s′ = arg max_S P(l′|S)·P(S)·H_B(t′, l′, S); b′ = arg max_B P(B|s′)·H_E(t′, l′, B); e′ = arg max_E P(E|t′, l′)·H_D(E, b′)·H_X(E); d′ = “no” (evidence); x′ = arg max_X P(X|e′); a′ = arg max_A P(t′|A)·P(A) • Return: (a′, s′, t′, l′, b′, e′, x′, d′)

  11. MPE Overview • In reverse node ordering: • Create the bucket function by multiplying all functions (given as tables) containing the current node • Perform variable elimination by maximization over the current node (recording the maximizing-state function) • Place the newly created function table into the appropriate bucket • In forward node ordering: • Recover the maximizing assignment using the recorded maximizing-state functions
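The backward (max) and forward (arg max) passes above can be sketched on a toy two-variable chain A → B. This is my own illustration with assumed probabilities, not the slides' code:

```python
# Toy MPE by bucket elimination on A -> B (illustrative numbers).
P_A = {0: 0.7, 1: 0.3}                     # P(A)
P_B_given_A = {(0, 0): 0.8, (0, 1): 0.2,
               (1, 0): 0.1, (1, 1): 0.9}   # P(B|A)

# Backward pass: bucket B holds P(B|A); maximize over B and record
# the maximizing state of B for each value of A.
H_B, best_B = {}, {}
for a in (0, 1):
    vals = {b: P_B_given_A[(a, b)] for b in (0, 1)}
    best_B[a] = max(vals, key=vals.get)
    H_B[a] = vals[best_B[a]]

# Bucket A holds P(A) and the message H_B(A); the last maximization
# yields the MPE probability.
scores = {a: P_A[a] * H_B[a] for a in (0, 1)}
mpe_prob = max(scores.values())

# Forward step: pick the maximizing A, then read off the recorded B.
a_star = max(scores, key=scores.get)
b_star = best_B[a_star]
# mpe_prob = 0.7 * 0.8 = 0.56 at (A, B) = (0, 0)
```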

  12. Maximum A Posteriori Hypothesis (MAP) • Definition - Given evidence, find an assignment to a subset of “hypothesis” variables that maximizes their probability • Given a set of hypothesis variables A = {A_1, …, A_k}, the MAP task is to find an assignment a^o = (a^o_1, …, a^o_k) such that P(a^o | e) = max_a Σ_{x \ a} Π_i P(x_i | pa(X_i), e)
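The contrast with MPE is that non-hypothesis variables are summed out before maximizing. A minimal sketch of my own, reusing the assumed toy chain A → B with hypothesis set {B}:

```python
# Toy MAP query: sum out the non-hypothesis variable A, then maximize over B.
P_A = {0: 0.7, 1: 0.3}                     # P(A)
P_B_given_A = {(0, 0): 0.8, (0, 1): 0.2,
               (1, 0): 0.1, (1, 1): 0.9}   # P(B|A)

# Summation step: marginalize A to get P(B).
P_B = {b: sum(P_A[a] * P_B_given_A[(a, b)] for a in (0, 1)) for b in (0, 1)}

# Maximization step over the hypothesis variable B.
b_map = max(P_B, key=P_B.get)
# P_B == {0: 0.59, 1: 0.41}, so the MAP assignment is B = 0
```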
