Explore a hybrid algorithm that combines the clique tree structure with non-serial dynamic programming techniques to efficiently compute marginal and joint beliefs in Bayesian networks while controlling complexity. Learn how Symbolic Probabilistic Inference (SPI) keeps the size of the intermediate tables and the time complexity under control, and how the Factor Trees algorithm derives the desired beliefs. Discover how reordering terms and distributing summations can significantly reduce the number of operations needed for belief extraction.
A Hybrid Algorithm to Compute Marginal and Joint Beliefs in Bayesian Networks and Its Complexity
Mark Bloemeke, Artificial Intelligence Laboratory, University of South Carolina
Marco Valtorta, Artificial Intelligence Laboratory, University of South Carolina
Presentation by Sreeja Vallabhan; Instructor: Marco Valtorta
Abstract Methods (algorithms) to update probability in a Bayesian network: • Using a structure (clique tree) and performing local, message-based calculations to extract the belief in each variable. • Using non-serial dynamic programming techniques to extract the belief in some desired group of variables.
Goal Present a hybrid algorithm based on non-serial dynamic programming techniques that retains the ability to retrieve the belief in all single variables.
Symbolic Probabilistic Inference (SPI) Consider a Bayesian network with DAG G = (V, E) and conditional probability tables P(v_i \mid \pi(v_i)), where \pi(v_i) denotes the parents of v_i in G. The total joint probability follows from the chain rule of Bayesian networks:
P(V) = \prod_{v_i \in V} P(v_i \mid \pi(v_i))    (1)
Marginalization retrieves the belief in any subset of variables V' \subseteq V:
P(V') = \sum_{V \setminus V'} P(V)    (2)
SPI is based on these two equations.
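As a concrete illustration of equations (1) and (2), here is a minimal sketch on a hypothetical two-node network A -> B with made-up binary tables (the network and numbers are assumptions, not taken from the slides):

```python
import numpy as np

# Hypothetical tables for a two-node network A -> B (values are made up).
P_A = np.array([0.6, 0.4])                   # P(A)
P_B_given_A = np.array([[0.9, 0.1],          # P(B | A=0)
                        [0.2, 0.8]])         # P(B | A=1)

# Equation (1): the chain rule gives the full joint P(A, B).
joint = P_A[:, None] * P_B_given_A           # joint[a, b] = P(A=a) * P(B=b | A=a)

# Equation (2): marginalize out A to retrieve the belief in B.
P_B = joint.sum(axis=0)
print(P_B)                                    # -> [0.62 0.38]
```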
Symbolic Probabilistic Inference (SPI) In SPI, to maintain control over the size and time complexity of the resulting tables: • Variables are ordered before calculations • Summations are pushed down into products
Symbolic Probabilistic Inference (SPI) Consider the following Bayesian network. From equations (1) and (2), the joint probability of the variables A and C is obtained by multiplying the conditional tables of the network and summing over all remaining variables. Assuming that each variable has two states, computing P(A, C) this way requires a total of 92 significant operations.
Symbolic Probabilistic Inference (SPI) With a single reordering of the terms combined by equation (1), followed by distribution of the summations from (2), the same computation requires only 32 significant operations.
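The network behind the 92-versus-32 count is not reproduced here, but the effect of distributing summations can be sketched on a hypothetical chain A -> B -> C (all names and numbers below are assumptions for illustration):

```python
import numpy as np

# Hypothetical chain A -> B -> C with binary states (values are made up).
P_A = np.array([0.5, 0.5])
P_B_given_A = np.array([[0.7, 0.3], [0.4, 0.6]])    # rows indexed by A
P_C_given_B = np.array([[0.9, 0.1], [0.2, 0.8]])    # rows indexed by B

# Naive order: build the full joint P(A, B, C), then sum out B.
full_joint = (P_A[:, None, None]
              * P_B_given_A[:, :, None]
              * P_C_given_B[None, :, :])
P_AC_naive = full_joint.sum(axis=1)

# Reordered: push the summation over B into the product, so only a
# small (A, C)-sized table is ever formed -- fewer multiplications.
P_AC_fast = P_A[:, None] * (P_B_given_A @ P_C_given_B)

assert np.allclose(P_AC_naive, P_AC_fast)
```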
Factor Trees A two-stage method for deriving the desired joint and single beliefs. • Creation of the factor tree. • A passing algorithm on the factor tree to retrieve the desired joint and single beliefs.
Factor Trees Algorithm • Start by calculating the optimal factoring order for the network, given the target set of variables whose joint is desired. • Construct a binary tree showing the combination of the initial probability tables and the conformal tables. • Label the edges between tables along which variables are marginalized with the variables marginalized before combination. • Add an additional head above the current root: a conformal table labeled with the target set of variables, connected to the root by an edge with an empty label.
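These steps suggest a binary tree of labeled tables; the sketch below is one possible data structure for it, under the assumption that each node stores its table's variables and each edge records the variables summed out before combination (FactorTreeNode and add_head are hypothetical names, not the authors' implementation):

```python
from dataclasses import dataclass, field
from typing import FrozenSet, List

@dataclass
class FactorTreeNode:
    # Variables labeling this (initial or conformal) table.
    variables: FrozenSet[str]
    # Children combined to form this table (binary tree: 0 or 2 children).
    children: List["FactorTreeNode"] = field(default_factory=list)
    # Edge label to the parent: variables marginalized before combination.
    summed_out: FrozenSet[str] = frozenset()

def add_head(root: FactorTreeNode, targets: FrozenSet[str]) -> FactorTreeNode:
    """Final step: place a conformal table for the target set above the
    current root, connected by an edge with an empty label."""
    head = FactorTreeNode(variables=frozenset(targets), children=[root])
    root.summed_out = frozenset()   # empty edge label
    return head
```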