Lecture 6: Junction Tree Algorithm Machine Learning, CUNY Graduate Center
Today • Graphical Models • Representing conditional dependence graphically • Inference • Junction Tree Algorithm
Undirected Graphical Models • In an undirected graphical model, there is no trigger/response relationship. • Undirected graphs represent slightly different conditional independence relationships than directed graphs. • Conditional independence is determined by graph separation.
Undirected Graphical Models • Different conditional independence relationships can be described with directed and undirected graphs; each formalism can express some independence structures that the other cannot represent.
Probabilities in Undirected Graphs • Clique: a set of nodes such that there is an edge between every pair of nodes in the set. • We will define the joint probability as a relationship between functions defined over cliques in the graphical model.
Probabilities in Undirected Graphs • Potential Functions: positive functions over groups of connected variables (represented by maximal cliques of graphical model nodes). • Maximal clique: a clique A is maximal if it is not a proper subset of any other clique B. • The joint is a normalized product of clique potentials: p(x) = (1/Z) ∏_C ψ_C(x_C), where Z = Σ_x ∏_C ψ_C(x_C) guarantees a sum of 1.
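As a concrete illustration, here is a minimal Python sketch of this definition for a hypothetical chain A–B–C with two maximal cliques; the potential tables psi_AB and psi_BC are made-up numbers, not values from the lecture.

```python
import itertools
import numpy as np

# Hypothetical pairwise model over binary variables A-B-C (a chain):
# maximal cliques {A,B} and {B,C}, one positive potential table each.
psi_AB = np.array([[4.0, 1.0],
                   [1.0, 2.0]])   # psi_AB[a, b]
psi_BC = np.array([[3.0, 1.0],
                   [1.0, 3.0]])   # psi_BC[b, c]

# Unnormalized score of a full assignment: product of clique potentials.
def score(a, b, c):
    return psi_AB[a, b] * psi_BC[b, c]

# The normalizer Z sums the score over every joint assignment,
# guaranteeing the resulting distribution sums to 1.
Z = sum(score(a, b, c) for a, b, c in itertools.product([0, 1], repeat=3))

def p(a, b, c):
    return score(a, b, c) / Z

total = sum(p(a, b, c) for a, b, c in itertools.product([0, 1], repeat=3))
assert abs(total - 1.0) < 1e-12
```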
Logical Inference • In logical inference, nodes are binary and edges represent gates: AND, OR, XOR, NAND, NOR, NOT, etc. • Inference: given observed variables, predict others. • Problems: uncertainty, conflicts, inconsistency.
Probabilistic Inference • Rather than a logic network, use a Bayesian Network: deterministic gates become conditional probability tables, so a NOT gate becomes a probabilistic "NOT-ish" gate. • Probabilistic Inference: given observed variables, calculate marginals over others. • Logic networks are generalized by Bayesian Networks.
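To make the "NOT-ish" idea concrete, here is a small sketch; the 0.9 flip probability and the table names are illustrative assumptions, not values from the lecture.

```python
import numpy as np

# A deterministic NOT gate as a conditional probability table (CPT):
# rows index the input value, columns the output value.
not_gate = np.array([[0.0, 1.0],    # input 0 -> output 1
                     [1.0, 0.0]])   # input 1 -> output 0

# A "NOT-ish" gate softens the determinism: it flips its input
# only 90% of the time (the 0.9 is an illustrative choice).
not_ish = np.array([[0.1, 0.9],
                    [0.9, 0.1]])

# Probabilistic inference: given a distribution over the input,
# the output marginal is a matrix-vector product with the CPT.
p_in = np.array([0.3, 0.7])         # P(input = 0), P(input = 1)
p_out = p_in @ not_ish              # P(output = 0), P(output = 1)
print(p_out)                        # [0.66 0.34]
```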
Inference in Graphical Models • General Problem: given a graphical model, for any subsets of observed variables x_E and queried variables x_F, find p(x_F | x_E). • The direct approach can be quite inefficient if there are many irrelevant variables.
Marginal Computation • Graphical models provide efficient storage by decomposing p(x) into conditional probabilities (and a simple MLE result). • We now look for efficient calculation of marginals, which will lead to efficient inference.
Brute Force Marginal Calculation • First approach: we have the CPTs and the graphical model, so we can compute arbitrary joints. • Assume 6 variables: p(x1) = Σ_{x2} … Σ_{x6} p(x1, …, x6), a sum over every assignment of the other five variables; the cost grows exponentially with the number of variables (see the sketch below).
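A minimal numpy sketch of the brute-force approach, assuming a made-up joint table over 6 binary variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint over 6 binary variables, stored as a full 2^6 table.
joint = rng.random((2,) * 6)
joint /= joint.sum()                      # normalize to a distribution

# Marginal of x1: sum out x2..x6. The cost is O(2^6) here and grows
# exponentially with the number of variables -- hence the need for
# something smarter than brute force.
p_x1 = joint.sum(axis=(1, 2, 3, 4, 5))
print(p_x1)                               # two numbers summing to 1
```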
Computation of Marginals • Pass messages (small tables) around the graph. • The messages are small functions that propagate potentials around an undirected graphical model. • The inference technique is the Junction Tree Algorithm.
Junction Tree Algorithm • Efficient Message Passing for Undirected Graphs. • For Directed Graphs, first convert to undirected. • Goal: Efficient Inference in Graphical Models
Junction Tree Algorithm • Moralization • Introduce Evidence • Triangulate • Construct Junction Tree • Propagate Probabilities
Moralization • Converts a directed graph to an undirected graph. • Moralization “marries” the parents. • Insert an undirected edge between every pair of nodes that have a child in common. • Replace all directed edges with undirected edges.
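A sketch of moralization using networkx; the v-structure example graph is hypothetical:

```python
from itertools import combinations
import networkx as nx

def moralize(dag):
    """Moralize a DAG: marry all parents of each node, then drop directions."""
    moral = dag.to_undirected()
    for node in dag.nodes:
        parents = list(dag.predecessors(node))
        # Insert an undirected edge between every pair of co-parents.
        for u, v in combinations(parents, 2):
            moral.add_edge(u, v)
    return moral

# Hypothetical v-structure A -> C <- B: moralization adds the edge A - B.
dag = nx.DiGraph([("A", "C"), ("B", "C")])
print(sorted(moralize(dag).edges))   # [('A', 'B'), ('A', 'C'), ('B', 'C')]
```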
Introduce Evidence • Given a moral graph, identify the observed variables. • Reduce the probability functions, since we know some variables are fixed. • Only keep probability functions over the remaining nodes.
Slices • Differentiate potential functions from slices. • Potential Functions are related to joint probabilities over groups of nodes, but aren't necessarily correctly normalized, and can even be initialized to conditionals. • A slice of a potential function is a row or column of the underlying table (in the discrete case) or an unnormalized marginal (in the continuous case).
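A small numpy sketch of the distinction, using a made-up potential table over two binary variables:

```python
import numpy as np

# Hypothetical potential over binary variables (A, B): psi[a, b].
psi = np.array([[4.0, 1.0],
                [1.0, 2.0]])

# Introducing the evidence B = 1 keeps only the matching slice of the
# table; the non-matching entries are dropped rather than summed.
psi_given_b1 = psi[:, 1]          # a function of A alone: [1.0, 2.0]

# Contrast with marginalization, which sums over B instead of fixing it.
psi_marg_b = psi.sum(axis=1)      # [5.0, 3.0]
print(psi_given_b1, psi_marg_b)
```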
Separation from Introducing Evidence • Observing nodes separates conditionally independent sets of variables. • Normalization: don't bother computing the normalizer until the end, when we want to determine an individual marginal.
Junction Trees • Construction of junction trees: • Each node represents a clique of variables. • Edges connect cliques. • There is a unique path from each node to the root. • Between each pair of adjacent clique nodes is a separator node. • Separators contain the intersection of the variables in the cliques they connect.
Triangulation • Constructing a junction tree: we need to guarantee that the junction graph, made up of the cliques and separators of an undirected graph, is a tree. • Eliminate any chordless cycles of four or more nodes.
Triangulation • When eliminating cycles there may be many choices about which edge to add. • We want to keep the largest clique size small, since that means small potential functions. • Finding the triangulation that minimizes the largest clique size is NP-complete. • A suboptimal triangulation is acceptable (poly-time) and doesn't introduce many extra dimensions.
Triangulation Examples (graph figures)
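A sketch of one common poly-time heuristic for this step, greedy min-fill elimination (the lecture does not commit to a particular heuristic); the adjacency-dict representation and the 4-cycle example are illustrative:

```python
from itertools import combinations

def triangulate(adj):
    """Greedy triangulation by node elimination (min-fill heuristic).

    adj: dict mapping node -> set of neighbours. Returns the set of
    fill-in edges added. Eliminating nodes in the chosen order and
    connecting each eliminated node's remaining neighbours makes the
    graph chordal; min-fill is a common poly-time heuristic, not
    guaranteed optimal.
    """
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    fill = set()
    remaining = set(adj)
    while remaining:
        # Pick the node whose elimination adds the fewest fill-in edges.
        def fill_count(v):
            ns = adj[v] & remaining
            return sum(1 for a, b in combinations(ns, 2) if b not in adj[a])
        v = min(remaining, key=fill_count)
        ns = adj[v] & remaining
        for a, b in combinations(ns, 2):
            if b not in adj[a]:
                fill.add(frozenset((a, b)))
                adj[a].add(b)
                adj[b].add(a)
        remaining.remove(v)
    return fill

# A chordless 4-cycle A-B-C-D needs exactly one chord:
cycle = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"A", "C"}}
print(triangulate(cycle))   # one fill edge, e.g. {frozenset({'B', 'D'})}
```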
Constructing Junction Trees • Junction trees must satisfy the Running Intersection Property: every clique node on the path between clique nodes V and W must include all of the nodes in V ∩ W. • Junction trees will have maximal separator cardinality. • Example: cliques ABD, BCD, CDE connected through separators BD and CD.
Forming a Junction Tree • Given a set of cliques, connect the nodes s.t. the Running Intersection Property holds. • Maximize the cardinality of the separators. • Maximum Spanning Tree (Kruskal's algorithm; see the sketch below): • Initialize a tree with no edges. • Calculate the size of the separators between all pairs: O(N²). • Connect the two cliques with the largest separator cardinality without creating a loop. • Repeat until all nodes are connected.
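A sketch of this construction, using the ABD/BCD/CDE cliques from the earlier example; the union-find and tuple representations are implementation choices, not from the lecture:

```python
from itertools import combinations

def junction_tree_edges(cliques):
    """Kruskal-style maximum spanning tree over cliques.

    Edge weight = separator cardinality |V & W|. Connecting cliques in
    decreasing weight order (skipping edges that would form a loop)
    yields a tree satisfying the running intersection property when the
    cliques come from a triangulated graph.
    """
    # All candidate separators, heaviest first: O(N^2) pairs.
    candidates = sorted(
        ((len(v & w), i, j)
         for (i, v), (j, w) in combinations(enumerate(cliques), 2)),
        reverse=True,
    )
    parent = list(range(len(cliques)))     # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    tree = []
    for weight, i, j in candidates:
        if weight > 0 and find(i) != find(j):
            parent[find(i)] = find(j)
            tree.append((cliques[i], cliques[j], weight))
    return tree

cliques = [frozenset("ABD"), frozenset("BCD"), frozenset("CDE")]
for v, w, size in junction_tree_edges(cliques):
    print(sorted(v), "--", sorted(v & w), "--", sorted(w))
```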
Propagating Probabilities • We have a valid junction tree. • What can we do with it? • Probabilities in Junction Trees: • De-absorb smaller cliques from maximal cliques. • Doesn’t change anything, but is a less compact description.
Conversion from Directed Graph • Example conversion: the chain X1 → X2 → X3 → X4 becomes cliques (X1,X2), (X2,X3), (X3,X4) with separators (X2) and (X3). • Represent the CPTs as potential and separator functions (with a normalizer): p(x1)p(x2|x1)p(x3|x2)p(x4|x3) = ψ(x1,x2)ψ(x2,x3)ψ(x3,x4) / (φ(x2)φ(x3)), with ψ(x1,x2) = p(x1)p(x2|x1), ψ(x2,x3) = p(x3|x2), ψ(x3,x4) = p(x4|x3), and the separators φ initialized to 1.
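A sketch of this initialization for the binary chain, with randomly generated (illustrative) CPTs:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_cpt(shape):
    """Random CPT: the last axis (the child) sums to 1 per parent value."""
    t = rng.random(shape)
    return t / t.sum(axis=-1, keepdims=True)

# Chain X1 -> X2 -> X3 -> X4 over binary variables (illustrative CPTs).
p_x1 = random_cpt((2,))
p_x2_given_x1 = random_cpt((2, 2))
p_x3_given_x2 = random_cpt((2, 2))
p_x4_given_x3 = random_cpt((2, 2))

# Clique potentials over (X1,X2), (X2,X3), (X3,X4); separators over X2, X3
# start at 1, so the product of potentials over separators equals p(x).
psi_12 = p_x1[:, None] * p_x2_given_x1   # absorbs p(x1) p(x2|x1)
psi_23 = p_x3_given_x2                   # p(x3|x2)
psi_34 = p_x4_given_x3                   # p(x4|x3)
phi_2 = np.ones(2)
phi_3 = np.ones(2)
```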
Junction Tree Algorithm • Goal: make marginals consistent. • The Junction Tree Algorithm sends messages between cliques and separators until this consistency is reached.
Message Passing • Send a message from a clique to a separator: the message is what the clique thinks the separator's marginal should be, e.g. from clique (A,B) to separator B, φ*_B = Σ_A ψ(A,B). • Rescale the neighboring clique by each message from the separator s.t. agreement is reached: ψ*(B,C) = ψ(B,C) · φ*_B / φ_B. • If the cliques agree on the separator, we are finished; otherwise, iterate (see the sketch below).
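A minimal sketch of one round of message passing between two cliques sharing separator B, using the Hugin-style update described above; the potential tables are made up:

```python
import numpy as np

# Two cliques (A,B) and (B,C) sharing separator B; hypothetical tables.
psi_ab = np.array([[4.0, 1.0],
                   [1.0, 2.0]])
psi_bc = np.array([[3.0, 1.0],
                   [1.0, 3.0]])
phi_b = np.ones(2)                       # separator initialized to 1

# Message (A,B) -> B: what clique (A,B) thinks the marginal of B is.
phi_b_new = psi_ab.sum(axis=0)
psi_bc *= (phi_b_new / phi_b)[:, None]   # rescale (B,C) by the update ratio
phi_b = phi_b_new

# Message (B,C) -> B: pass back the other way.
phi_b_new = psi_bc.sum(axis=1)
psi_ab *= (phi_b_new / phi_b)[None, :]
phi_b = phi_b_new

# After one sweep in each direction the cliques agree on B, and the
# represented p(x) = psi_ab * psi_bc / phi_b is unchanged throughout.
print(psi_ab.sum(axis=0), psi_bc.sum(axis=1), phi_b)
```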
Junction Tree Algorithm • When convergence is reached, clique potentials are marginals and separator potentials are submarginals. • The represented distribution p(x) is consistent (unchanged) across all of the message passing. • This implies that, so long as p(x) is correctly represented in the potential functions, the JTA can be used to make each potential function correspond to an appropriate marginal without impacting the overall probability function.
Converting a DAG to a Junction Tree • Initialize separators to 1 and clique tables to CPTs. • Run the JTA to convert the potential functions (CPTs) to marginals.
Evidence in a Junction Tree • Initialize as usual, then update with a slice rather than the whole table. • The resulting clique potentials are proportional to conditionals given the evidence.
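Continuing the two-clique sketch above, observing C = 1 amounts to zeroing out the non-matching slice before passing messages; the tables are the same made-up ones:

```python
import numpy as np

psi_ab = np.array([[4.0, 1.0],
                   [1.0, 2.0]])
psi_bc = np.array([[3.0, 1.0],
                   [1.0, 3.0]])

# Observe C = 1: keep only the matching slice of the (B,C) potential.
psi_bc[:, 0] = 0.0                       # zero out the C = 0 slice

# One forward/backward sweep as before (separator starts at 1).
phi_b = psi_ab.sum(axis=0)
psi_bc *= phi_b[:, None]
phi_b_new = psi_bc.sum(axis=1)
psi_ab *= (phi_b_new / phi_b)[None, :]

# Clique potentials are now proportional to p(A, B | C = 1); the
# normalizer is computed only at the end, from any one clique.
p_a = psi_ab.sum(axis=1)
print(p_a / p_a.sum())
```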
Efficiency of the Junction Tree Algorithm • Construct CPTs • Polynomial in # of data points • Moralization • Polynomial in # of nodes • Introduce Evidence • Polynomial in # of nodes • Triangulate • Suboptimal = polynomial. Optimal = NP • Construct Junction Tree • Polynomial in the number of cliques • Identifying cliques = polynomial in the number of nodes • Propagate Probabilities • Polynomial in the number of cliques • Exponential in the size of cliques
Hidden Markov Models • A powerful graphical model for describing sequential information: hidden states Q1 … Q4 emit observations X1 … X4.
Research Projects • Run a machine learning experiment: • Identify a problem/task. • Find appropriate data. • Implement one or more ML algorithms. • Evaluate the performance. • Write a report of the experiment (4 pages including references): • Abstract: one paragraph describing the experiment. • Introduction: describe the problem/task. • Data: describe the data set, features extracted, cleaning processes. • Method: describe the algorithm/approach. • Results: present and discuss results. • Conclusion: summarize the experiment and results. • Teams of two people are acceptable; this requires a report from each participant (written independently) describing who was responsible for which components of the work.
Sample Problems/Tasks • Vision/Graphics • Object Classification • Facial Recognition • Fingerprint Identification • Handwriting recognition • Non-English languages? • Language • Topic classification • Sentiment analysis • Speech recognition • Speaker identification • Punctuation restoration • Semantic Segmentation • Recognition of Emotion, Sarcasm, etc. • SMS Text normalization • Chat participant ID • Twitter classification • Twitter threading
Sample Problems/Tasks • Games • Chess • Checkers • Poker • Blackjack • Go • Recommenders (Collaborative Filtering) • Netflix • Courses • Jokes • Books • Facebook • Video Classification • Motion classification • Segmentation
ML Topics to explore in the project • L1-regularization • Non-linear kernels • Loopy belief propagation • Non-parametric Belief propagation • Soft-decision trees • Analysis of Neural Network Hidden Layers • Structured Learning • Generalized Expectation • One-class learning • Evaluation Measures • Cluster Evaluation • Semi-supervised evaluation • Graph Embedding • Dimensionality Reduction • Feature Selection • Graphical Model Construction • Non-parametric Bayesian Methods • Latent Dirichlet Allocation
Next Time • Hidden Markov Models • Sampling in Graphical Models