Graphical Models - Inference - Variable Elimination
Advanced IWS 06/07
Wolfram Burgard, Luc De Raedt, Kristian Kersting, Bernhard Nebel
Albert-Ludwigs University Freiburg, Germany
Mainly based on F. V. Jensen, "Bayesian Networks and Decision Graphs", Springer-Verlag New York, 2001.
[Figure: the ALARM monitoring network (nodes such as MINVOLSET, KINKEDTUBE, PULMEMBOLUS, INTUBATION, VENTLUNG, CATECHOL, HR, BP)]
Outline • Introduction • Reminder: Probability theory • Basics of Bayesian Networks • Modeling Bayesian networks • Inference • Excursus: Markov Networks • Learning Bayesian networks • Relational Models
Elimination in Chains (chain A → B → C → D → E) • Forward pass • Using the definition of marginal probability, we have P(e) = Σ_a Σ_b Σ_c Σ_d P(a, b, c, d, e)
Elimination in Chains • By the chain decomposition of the joint, we get P(e) = Σ_a Σ_b Σ_c Σ_d P(a) P(b|a) P(c|b) P(d|c) P(e|d)
Elimination in Chains • Rearranging terms, pushing each sum as far inwards as possible: P(e) = Σ_d P(e|d) Σ_c P(d|c) Σ_b P(c|b) Σ_a P(b|a) P(a)
Elimination in Chains • Now we can perform the innermost summation: Σ_a P(b|a) P(a) = f_A(b), giving P(e) = Σ_d P(e|d) Σ_c P(d|c) Σ_b P(c|b) f_A(b) • A has been eliminated
Elimination in Chains • Rearranging and then summing again, we get Σ_b P(c|b) f_A(b) = f_B(c), so P(e) = Σ_d P(e|d) Σ_c P(d|c) f_B(c) • B has been eliminated • Continuing in the same way eliminates C and D and yields the marginal P(e) …
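For a chain, this forward pass is just a sequence of matrix-vector products. Below is a minimal Python/NumPy sketch for a binary chain A → B → C → D → E; the CPT numbers are invented for illustration and are not from the slides.

```python
# A minimal sketch of forward elimination in the chain A -> B -> C -> D -> E.
# All variables are binary; the CPT entries below are invented for illustration.
import numpy as np

p_a = np.array([0.6, 0.4])                    # P(A)
p_b_given_a = np.array([[0.7, 0.3],           # rows: value of A, cols: value of B
                        [0.2, 0.8]])          # P(B | A)
p_c_given_b = np.array([[0.9, 0.1],
                        [0.4, 0.6]])          # P(C | B)
p_d_given_c = np.array([[0.5, 0.5],
                        [0.1, 0.9]])          # P(D | C)
p_e_given_d = np.array([[0.8, 0.2],
                        [0.3, 0.7]])          # P(E | D)

# Innermost sum first: f_A(b) = sum_a P(a) P(b|a), then f_B(c), f_C(d), P(e).
f_b = p_a @ p_b_given_a        # eliminates A
f_c = f_b @ p_c_given_b        # eliminates B
f_d = f_c @ p_d_given_c        # eliminates C
p_e = f_d @ p_e_given_d        # eliminates D, leaving the marginal P(E)

print(p_e, p_e.sum())          # the entries sum to 1
```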
Elimination in Chains with Evidence • How do we handle evidence on E, i.e. a query such as P(a | e)? • Similar, due to Bayes' rule: P(a | e) is proportional to P(a, e) • We write the query in explicit form, P(a, e) = Σ_b Σ_c Σ_d P(a) P(b|a) P(c|b) P(d|c) P(e|d) with E fixed to its observed value, and eliminate as before
Variable Elimination • Write the query in the form P(X1, e) = Σ_{x_k} … Σ_{x_3} Σ_{x_2} Π_i P(x_i | pa_i) • Iteratively: • Move all terms irrelevant to the innermost sum outside of it • Perform the innermost sum, getting a new factor • Insert the new factor into the product • Variables irrelevant to the query can be eliminated from the network beforehand
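The iteration is easiest to see in code. The sketch below is a generic, hedged implementation of variable elimination over table factors; the Factor class and the multiply, sum_out, and eliminate helpers are names chosen here for illustration, not something defined in the slides, and the code favours readability over efficiency.

```python
# Variable elimination over discrete table factors.
# A factor is a (variables, table) pair; table axis i corresponds to vars[i].
from dataclasses import dataclass
import numpy as np

@dataclass
class Factor:
    vars: tuple          # variable names, one per table axis
    table: np.ndarray    # table.shape[i] = number of values of vars[i]

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables."""
    out_vars = tuple(dict.fromkeys(f.vars + g.vars))   # union, order preserved
    def expand(factor):
        # Reorder axes to follow out_vars and insert size-1 axes for missing vars.
        shape = [factor.table.shape[factor.vars.index(v)] if v in factor.vars else 1
                 for v in out_vars]
        perm = [factor.vars.index(v) for v in out_vars if v in factor.vars]
        return factor.table.transpose(perm).reshape(shape)
    return Factor(out_vars, expand(f) * expand(g))

def sum_out(factor, var):
    """Marginalize one variable out of a factor."""
    axis = factor.vars.index(var)
    new_vars = tuple(v for v in factor.vars if v != var)
    return Factor(new_vars, factor.table.sum(axis=axis))

def eliminate(factors, order):
    """Eliminate the variables in `order`; return the product of what remains."""
    factors = list(factors)
    for var in order:
        involved = [f for f in factors if var in f.vars]
        rest = [f for f in factors if var not in f.vars]
        if not involved:
            continue
        prod = involved[0]
        for f in involved[1:]:
            prod = multiply(prod, f)
        factors = rest + [sum_out(prod, var)]   # the new factor re-enters the product
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result
```

Given tables for the eight CPTs of the Asia network below, a call such as eliminate(asia_factors, ["v", "s", "x", "t", "l", "a", "b"]) would reproduce the run shown on the following slides.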
Asia - A More Complex Example • The Asia (chest clinic) network has eight nodes: Visit to Asia (V), Smoking (S), Tuberculosis (T), Lung Cancer (L), Bronchitis (B), Abnormality in Chest (A), X-Ray (X), Dyspnoea (D) • Edges: V → T; S → L; S → B; T, L → A; A → X; A, B → D
We want to compute P(d) • Need to eliminate: v, s, x, t, l, a, b • Initial factors: P(v) P(s) P(t|v) P(l|s) P(b|s) P(a|t,l) P(x|a) P(d|a,b)
We want to compute P(d) • Need to eliminate: v, s, x, t, l, a, b • Current factors: P(v) P(s) P(t|v) P(l|s) P(b|s) P(a|t,l) P(x|a) P(d|a,b) • Eliminate: v • Compute f_v(t) = Σ_v P(v) P(t|v), leaving f_v(t) P(s) P(l|s) P(b|s) P(a|t,l) P(x|a) P(d|a,b) • Here f_v(t) = P(t), but in general the result of eliminating a variable is not necessarily a probability term!
We want to compute P(d) • Need to eliminate: s, x, t, l, a, b • Current factors: f_v(t) P(s) P(l|s) P(b|s) P(a|t,l) P(x|a) P(d|a,b) • Eliminate: s • Compute f_s(b, l) = Σ_s P(s) P(l|s) P(b|s), leaving f_v(t) f_s(b, l) P(a|t,l) P(x|a) P(d|a,b) • Summing on s results in a factor with two arguments, f_s(b, l); in general, the result of elimination may be a function of several variables
We want to compute P(d) • Need to eliminate: x, t, l, a, b • Current factors: f_v(t) f_s(b, l) P(a|t,l) P(x|a) P(d|a,b) • Eliminate: x • Compute f_x(a) = Σ_x P(x|a), leaving f_v(t) f_s(b, l) f_x(a) P(a|t,l) P(d|a,b) • Note that f_x(a) = 1 for all values of a
We want to compute P(d) • Need to eliminate: t, l, a, b • Current factors: f_v(t) f_s(b, l) f_x(a) P(a|t,l) P(d|a,b) • Eliminate: t • Compute f_t(a, l) = Σ_t f_v(t) P(a|t,l), leaving f_s(b, l) f_x(a) f_t(a, l) P(d|a,b)
We want to compute P(d) • Need to eliminate: l, a, b • Current factors: f_s(b, l) f_x(a) f_t(a, l) P(d|a,b) • Eliminate: l • Compute f_l(a, b) = Σ_l f_s(b, l) f_t(a, l), leaving f_l(a, b) f_x(a) P(d|a,b)
We want to compute P(d) • Need to eliminate: a, b • Current factors: f_l(a, b) f_x(a) P(d|a,b) • Eliminate: a, b • Compute f_a(b, d) = Σ_a f_l(a, b) f_x(a) P(d|a,b) and then f_b(d) = Σ_b f_a(b, d), which is the desired marginal P(d)
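The same run can be checked mechanically by tracking only the scopes (argument sets) of the intermediate factors. The sketch below is an illustrative helper, not part of the slides; it uses the factor scopes of the Asia network as listed above and no numerical tables.

```python
# Trace the scopes of the intermediate factors produced by variable elimination
# on the Asia network for the order v, s, x, t, l, a, b (no numbers involved).
asia_scopes = [
    {"v"}, {"s"}, {"t", "v"}, {"l", "s"}, {"b", "s"},
    {"a", "t", "l"}, {"x", "a"}, {"d", "a", "b"},
]

def trace(scopes, order):
    scopes = [set(s) for s in scopes]
    for var in order:
        involved = [s for s in scopes if var in s]
        new_scope = set().union(*involved) - {var}      # arguments of the new factor
        scopes = [s for s in scopes if var not in s] + [new_scope]
        print(f"eliminate {var}: new factor f_{var}({', '.join(sorted(new_scope))})")

trace(asia_scopes, ["v", "s", "x", "t", "l", "a", "b"])
# eliminate v: new factor f_v(t)
# eliminate s: new factor f_s(b, l)
# eliminate x: new factor f_x(a)
# ... and so on down to f_b(d)
```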
Variable Elimination (VE) • We now understand VE as a sequence of rewriting operations • The actual computation is done in the elimination step • Exactly the same procedure applies to Markov networks • The computation depends on the order of elimination
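That dependence on the ordering is easy to demonstrate with the same scope-tracking idea. The sketch below is a toy illustration, assuming all Asia variables are binary, and compares the largest intermediate table created by two different orders (the second order and its cost are just an example, not taken from the slides).

```python
# Compare elimination orders on the Asia network by the size of the largest
# intermediate factor they create (all variables assumed binary).
asia_scopes = [
    {"v"}, {"s"}, {"t", "v"}, {"l", "s"}, {"b", "s"},
    {"a", "t", "l"}, {"x", "a"}, {"d", "a", "b"},
]

def max_factor_size(scopes, order):
    scopes = [set(s) for s in scopes]
    worst = 0
    for var in order:
        involved = [s for s in scopes if var in s]
        merged = set().union(*involved)                  # scope before summing out
        worst = max(worst, 2 ** len(merged))             # table size for binary variables
        scopes = [s for s in scopes if var not in s] + [merged - {var}]
    return worst

print(max_factor_size(asia_scopes, ["v", "s", "x", "t", "l", "a", "b"]))  # 8
print(max_factor_size(asia_scopes, ["a", "v", "s", "x", "t", "l", "b"]))  # 64
```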
Dealing with Evidence • How do we deal with evidence? • Suppose we get evidence V = t, S = f, D = t • We want to compute P(L, V = t, S = f, D = t)
Dealing with Evidence • We start by writing the factors: P(v) P(s) P(t|v) P(l|s) P(b|s) P(a|t,l) P(x|a) P(d|a,b) • Since we know that V = t, we don’t need to eliminate V • Instead, we can replace the factors P(V) and P(T|V) with f_P(V) = P(V = t) and f_P(T|V)(T) = P(T | V = t) • This “selects” the appropriate parts of the original factors given the evidence • Note that f_P(V) is a constant, and thus does not appear in the elimination of the other variables
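In table form, “selecting” the part of a factor consistent with the evidence is just indexing. A minimal NumPy sketch, with invented binary CPT entries (none of these numbers come from the slides):

```python
# Restricting factors to the evidence V = t by slicing out the observed value.
import numpy as np

p_v = np.array([0.01, 0.99])            # P(V); index 1 plays the role of "t"
p_t_given_v = np.array([[0.99, 0.01],   # rows: value of V, cols: value of T
                        [0.95, 0.05]])  # P(T | V); numbers invented

v_obs = 1                                # evidence V = t

f_p_v = p_v[v_obs]                       # f_P(V) = P(V = t), a constant
f_p_t_given_v = p_t_given_v[v_obs, :]    # f_P(T|V)(T) = P(T | V = t), a factor over T

print(f_p_v, f_p_t_given_v)
```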
Given evidence V = t, S = f, D = t, compute P(L, V = t, S = f, D = t) • Dealing with Evidence • Initial factors, after setting the evidence: f_P(v) f_P(s) f_P(t|v)(t) f_P(l|s)(l) f_P(b|s)(b) P(a|t,l) P(x|a) f_P(d|a,b)(a, b) • Eliminating x, we get f_x(a) = Σ_x P(x|a) • Eliminating t, we get f_t(a, l) = Σ_t f_P(t|v)(t) P(a|t,l) • Eliminating a, we get f_a(b, l) = Σ_a f_t(a, l) f_x(a) f_P(d|a,b)(a, b)
Given evidence V = t, S = f, D = t, compute P(L, V = t, S = f, D = t) • Dealing with Evidence • Finally, when eliminating b, we get f_b(l) = Σ_b f_P(b|s)(b) f_a(b, l) • The result is P(L, V = t, S = f, D = t) = f_P(v) f_P(s) f_P(l|s)(l) f_b(l)
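As a worked final step: by Bayes’ rule, the conditional query follows by normalization, P(L | V = t, S = f, D = t) = P(L, V = t, S = f, D = t) / Σ_l P(l, V = t, S = f, D = t), so the factor f_b(l) computed above only needs to be rescaled.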
Outline • Introduction • Reminder: Probability theory • Basics of Bayesian Networks • Modeling Bayesian networks • Inference (VE, Junction Tree) • Excursus: Markov Networks • Learning Bayesian networks • Relational Models