Impact of Structuring on Bayesian Network Learning and Reasoning. Mieczysław A. Kłopotek, Institute of Computer Science, Polish Academy of Sciences, Warsaw, Poland. First Warsaw International Seminar on Soft Computing, Warsaw, September 8th, 2003.
Agenda • Definitions • Approximate Reasoning • Bayesian networks • Reasoning in Bayesian networks • Learning Bayesian networks from data • Structured Bayesian networks (SBN) • Reasoning in SBN • Learning SBN from data • Concluding remarks
Approximate Reasoning • One possible method of expressing uncertainty: the Joint Probability Distribution • Variables: causes, effects, observables • Reasoning: how probable is it that a variable takes a given value if we know the values of some other variables • Given: P(X,Y,...,Z) • Find: P(X=x | T=t,...,W=w) • Difficult if more than 40 variables have to be taken into account: • hard to represent, • hard to reason with, • hard to collect data
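The blow-up described above can be made concrete: answering P(X=x | evidence) straight from a joint distribution means summing over every assignment. A minimal Python sketch with three hypothetical binary variables (all probability values invented for illustration):

```python
import itertools

# Toy joint distribution P(A, B, C) over binary variables,
# stored as a table keyed by value tuples (a, b, c).
vals = [0.10, 0.05, 0.20, 0.05, 0.15, 0.05, 0.30, 0.10]  # sums to 1
joint = {abc: vals[abc[0] * 4 + abc[1] * 2 + abc[2]]
         for abc in itertools.product([0, 1], repeat=3)}

def query(target_index, target_value, evidence):
    """P(X=target_value | evidence) by brute-force marginalization.

    evidence: dict {variable index: observed value}.  The cost is
    exponential in the number of variables, which is why this
    approach breaks down around 40 variables."""
    num = den = 0.0
    for assignment, prob in joint.items():
        if all(assignment[i] == v for i, v in evidence.items()):
            den += prob
            if assignment[target_index] == target_value:
                num += prob
    return num / den

print(query(0, 1, {2: 0}))   # P(A=1 | C=0) ≈ 0.6
```

With n binary variables the table has 2^n entries; the Bayesian networks introduced next avoid storing it explicitly.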
Bayesian Network The method of choice for representing uncertainty in AI, with many efficient reasoning and learning methods. Bayesian networks utilize an explicit representation of structure to: • provide a natural and compact representation of large probability distributions, • allow for efficient methods for answering a wide range of queries.
Bayesian Network • Efficient and effective representation of a probability distribution • Directed acyclic graph • Nodes - random variables of interest • Edges - direct (causal) influence • Nodes are statistically independent of their non-descendants given the state of their parents
A Bayesian network [Figure: network with edges Z→S, Z→Y, Y→X, Y→R, S→R] Pr(r,s,x,z,y) = Pr(z) · Pr(s|z) · Pr(y|z) · Pr(x|y) · Pr(r|y,s)
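The factorization on this slide can be checked numerically: multiplying the local conditional tables reproduces a full joint distribution. A sketch with hypothetical CPT numbers for the same edge structure (Z→S, Z→Y, Y→X, Y→R, S→R):

```python
# Hypothetical CPTs; all variables binary.  Only P(·=1 | parents) is
# stored, the complementary value is 1 minus it.
P_z = {1: 0.4}
P_s_z = {(1, 1): 0.7, (1, 0): 0.2}                # P(S=1 | Z=z)
P_y_z = {(1, 1): 0.5, (1, 0): 0.1}                # P(Y=1 | Z=z)
P_x_y = {(1, 1): 0.9, (1, 0): 0.3}                # P(X=1 | Y=y)
P_r_ys = {(1, 1, 1): 0.8, (1, 1, 0): 0.6,         # P(R=1 | Y=y, S=s)
          (1, 0, 1): 0.4, (1, 0, 0): 0.1}

def p(table, value, *parents):
    p1 = table[(1, *parents)] if parents else table[1]
    return p1 if value == 1 else 1.0 - p1

def joint(r, s, x, z, y):
    # Pr(r,s,x,z,y) = Pr(z) Pr(s|z) Pr(y|z) Pr(x|y) Pr(r|y,s)
    return (p(P_z, z) * p(P_s_z, s, z) * p(P_y_z, y, z)
            * p(P_x_y, x, y) * p(P_r_ys, r, y, s))
```

Only 1 + 2 + 2 + 2 + 4 = 11 numbers are stored instead of the 31 free parameters of an unrestricted joint over five binary variables.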
Applications of Bayesian networks • Genetic optimization algorithms with probabilistic mutation/crossing mechanism • Classification, including text classification • Medical diagnosis (PathFinder, QMR), other decision making tasks under uncertainty • Hardware diagnosis (Microsoft troubleshooter, NASA/Rockwell Vista project) • Information retrieval (Ricoh helpdesk) • Recommender systems • other
Reasoning – the problem with a Bayesian network • Pearl's fusion algorithm was elaborated for tree-like networks only • For other types of networks, transformations to trees are needed: • transformation to a Markov tree (MT) (Shafer/Shenoy, Spiegelhalter/Lauritzen) – NP-hard except for trees and polytrees • cutset reasoning (Pearl) – finding cutsets is difficult, and the reasoning complexity grows exponentially with the required cutset size • evidence absorption reasoning by edge reversal (Shachter) – not always possible in a simple way
Towards MT – moral graph [Figure: graph over nodes T, S, R, Z, Y, X] Parents of a node in the BN are connected; edges are not oriented
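The moralization step described on this slide (connect the parents of every node, then drop edge orientations) can be sketched as:

```python
def moralize(parents):
    """Moral graph of a DAG given as {node: list of parents}:
    each arc becomes an undirected edge and all co-parents of a
    node are married (connected pairwise)."""
    edges = set()
    for child, ps in parents.items():
        for par in ps:
            edges.add(frozenset((par, child)))     # undirected arc
        for i, a in enumerate(ps):                 # marry co-parents
            for b in ps[i + 1:]:
                edges.add(frozenset((a, b)))
    return edges

# The earlier example network: Z→S, Z→Y, Y→X, Y→R, S→R.
dag = {"Z": [], "S": ["Z"], "Y": ["Z"], "X": ["Y"], "R": ["Y", "S"]}
print(frozenset(("Y", "S")) in moralize(dag))   # True: R's parents married
```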
Towards MT – triangulated graph [Figure: triangulated graph over nodes T, S, R, Z, Y, X] All cycles with more than 3 nodes have at least one link (chord) between non-neighboring nodes of the cycle.
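One common way to obtain a triangulated graph is node elimination with fill-in edges; a sketch (the elimination order is a heuristic choice, not prescribed by the slide):

```python
def triangulate(adj, order):
    """Eliminate nodes in the given order; when a node goes, its
    remaining neighbors are pairwise connected.  adj: {node: set of
    neighbors}.  Returns the set of fill-in edges added; a poor
    order can add many of them."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    fill, remaining = set(), set(adj)
    for v in order:
        nbrs = adj[v] & remaining
        for a in nbrs:
            for b in nbrs:
                if a < b and b not in adj[a]:
                    adj[a].add(b)
                    adj[b].add(a)
                    fill.add(frozenset((a, b)))
        remaining.discard(v)
    return fill

# Moral graph of the running example; it is already triangulated,
# so a good elimination order adds no fill-in edges.
moral = {"Z": {"S", "Y"}, "S": {"Z", "Y", "R"},
         "Y": {"Z", "S", "X", "R"}, "X": {"Y"}, "R": {"S", "Y"}}
print(len(triangulate(moral, ["X", "R", "Z", "S", "Y"])))   # 0
```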
Towards MT – Hypertree [Figure: hypergraph over nodes T, S, R, Y, Z, X] Hypertree = acyclic hypergraph
The Markov tree [Figure: Markov tree with hypernodes {Y,S,R}, {Z,T,Y}, {T,Y,S}, {Y,X}] Hypernodes of the hypertree are the nodes of the Markov tree
Junction tree – alternative representation of MT [Figure: hypernodes {Y,S,R}, {Z,T,S}, {Z,Y,S}, {Y,X} joined through separators {Y,S}, {Z,S}, {Y}] Common BN nodes are assigned to the edges joining MT nodes
Efficient reasoning in Markov trees, but .... [Figure: messages (msg) passed along the tree edges] MT node contents projected onto the common variables are passed to the neighbors
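The projection that this message passing relies on is just marginalization of a node's potential onto the shared (separator) variables; a minimal sketch with a hypothetical potential:

```python
from itertools import product

def project(potential, vars_, onto):
    """Marginalize a potential (dict: assignment tuple -> value)
    defined over vars_ onto the subset `onto`.  The result is the
    message an MT node sends to a neighbor sharing those variables."""
    idx = [vars_.index(v) for v in onto]
    msg = {}
    for assignment, val in potential.items():
        key = tuple(assignment[i] for i in idx)
        msg[key] = msg.get(key, 0.0) + val
    return msg

# Hypothetical uniform potential on MT node {Y, S, R}; the message
# to a neighbor sharing {Y, S} is the projection onto {Y, S}.
phi = {a: 0.125 for a in product([0, 1], repeat=3)}
msg = project(phi, ["Y", "S", "R"], ["Y", "S"])
print(msg[(0, 0)])   # 0.25
```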
Triangulability test – triangulation not always possible. All neighbors need to be connected.
Evidence absorption reasoning [Figure: edge reversal, then evidence absorption] Efficient only for a fortunate selection of conditioning variables
Cutset reasoning – fixing values of some nodes creates a (poly)tree [Figure: a node is fixed, hence an edge becomes ignorable]
How to overcome the difficulty when reasoning with BN • Learn directly a triangulated graph or Markov tree from data (Cercone N., Wong S.K.M., Xiang Y.) • Hard and inefficient for long dependence chains; danger of large hypernodes • Learn only tree-structured/polytree-structured BN (e.g. in Goldberg's Bayesian Genetic Algorithms, TAN text classifiers, etc.) • Oversimplification; long dependence chains are lost • Our approach: propose a more general class of Bayesian networks that is still efficient for reasoning
What is a structured Bayesian network • An analogue of well-structured programs • Graphical structure: nested sequences and alternatives • By collapsing sequences and alternatives to single nodes, the whole network can be reduced to one single node • Efficient reasoning possible
Structured Bayesian Network (SBN) – an example [Figure: an SBN; for comparison, a tree-structured BN]
SBN construction steps [Figure: construction steps; a marked edge stands for 0, 1 or 2 arrows]
Reasoning in SBN • Either directly in the structure • Or easily transformable to a Markov tree • Direct reasoning consists of • Forward step (leaf node/root node valuation calculation) • Backward step (intermediate node valuation calculation)
Reasoning in SBN – forward step [Figure: node B computed from parent A via P(B|A), and from parents C, E via P(B|C,E); a marked edge stands for 0, 1 or 2 arrows]
Reasoning in SBN – backward step: local context [Figure, panels (a)–(d): local contexts over nodes A, B, C, D] Joint distribution of A,B known; joint of C,D or of C alone sought
Reasoning in SBN – backward step: local reasoning [Figure: hypernode {A,B,C,D} exchanges messages Msg1(A,B) and Msg2(A,B) with its neighbors over the separator {A,B}; the full product P(A)*P(B|A,D) is not needed]
Towards a Markov tree – an example [Figure: SBN over nodes A, B, C, D, E, F, G, H, I, J, K, L, M, N, O, P, R, S]
Markov tree from SBN [Figure: Markov tree with hypernodes {K,L,R}, {A,B,I}, {L,M,N,R}, {F,G,I}, {B,C,D,I}, {M,N,O,R}, {N,O,R}, {C,D,E,I}, {G,H,I}, {D,E,I}, {I,H,E,R}, {O,P,R}, {E,H,R,J}, {H,R,J}, {R,J,P}, {P,J,S}]
Structured Bayesian network – a Hierarchical (Object-Oriented) Bayesian network [Figure: the example network over nodes A–S]
Learning SBN from Data • Define the DEP() measure as follows: DEP(Y,X) = P(x|y) − P(x|¬y) • Define DEP²(Y,X) = Σ (DEP(Y,X))², summed over the value pairs • Construct a tree according to the Chow/Liu algorithm using DEP²(Y,X), with Y belonging to the tree and X not
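A rough sketch of this construction, assuming binary variables and an aggregate weight of the form Σ (P(x|y) − P(x|¬y))² (the exact subscripted definition of DEP is given in the written presentation; the helper names here are invented):

```python
def dep2(data, y, x):
    """Aggregate squared DEP between binary variables y and x,
    estimated from data given as a list of {variable: value} rows."""
    def cond(xv, yv):
        sel = [row for row in data if row[y] == yv]
        return sum(row[x] == xv for row in sel) / len(sel) if sel else 0.0
    return sum((cond(xv, 1) - cond(xv, 0)) ** 2 for xv in (0, 1))

def chow_liu_tree(data, variables):
    """Greedy (Prim-style) tree construction in the spirit of
    Chow/Liu: repeatedly attach the outside variable X with the
    largest DEP weight to some variable Y already in the tree."""
    in_tree, edges = [variables[0]], []
    outside = set(variables[1:])
    while outside:
        y, x = max(((y, x) for y in in_tree for x in outside),
                   key=lambda pair: dep2(data, pair[0], pair[1]))
        edges.append((y, x))
        in_tree.append(x)
        outside.discard(x)
    return edges

# Hypothetical data: B copies A, C is independent of both.
data = [{"A": a, "B": a, "C": c} for a in (0, 1) for c in (0, 1)]
print(chow_liu_tree(data, ["A", "B", "C"])[0])   # ('A', 'B')
```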
Continued .... • Let us call all the edges obtained by the previous algorithm “free edges”. • During the construction process, the following types of edges may additionally appear: “node X loop unoriented edge”, “node X loop oriented edge”, “node X loop transient edge”. • Do in a loop (until the termination condition below is satisfied): • For each pair of properly connected non-neighboring nodes, identify the unique connecting path between them.
Continued .... • Two nodes are properly connected if the path between them consists either of edges having the status of free edges, or of oriented and unoriented (but not suspended) edges of the same loop, with no pair of oriented or transient oriented edges pointing in different directions and no transient edge pointing to one of the two connected nodes. • Note that in this sense there is at most one path properly connecting two nodes.
Continued .... • Connect by an edge the pair of non-neighboring nodes X, Y that maximizes DEP²(X,Y), taken as the minimum of the unconditional DEP and of the conditional DEP given a direct successor of X on the path to Y. • Identify the loop that has emerged from this operation.
Continued .... • We can have one of the following cases: • (1) the loop consists entirely of free edges, • (2) it contains some unoriented loop edges but no oriented edge, • (3) it contains at least one oriented edge. • Depending on the case, give the proper status to the edges contained in the loop: “node X loop unoriented edge”, “node X loop oriented edge”, “node X loop transient edge”. • (Details in the written presentation.)
Concluding Remarks • new class of Bayesian networks defined • completely new method of reasoning in Bayesian networks outlined • local computation – at most 4 nodes involved • applicable to a more general class of networks than known reasoning methods • the new class of Bayesian networks is easily transformed to Markov trees • the new class of Bayesian networks is a kind of hierarchical or object-oriented Bayesian network • can be learned from data