Markov Random Fields (MRF) Presenter: Kuang-Jui Hsu Date: 2011/5/23 (Tues.)
Outline • Introduction • Conditional Independence Properties • Factorization Properties • Illustration: Image De-noising • Relation to Directed Graphs
Introduction • Based on an undirected graph • The MRF model has a simple form and is easy to use • Its structure is defined by conditional independence properties
Conditional Independence Properties • In an undirected graph, consider three sets of nodes, denoted A, B, and C, such that A is conditionally independent of B given C • Shorthand notation: A ⫫ B | C ⟺ p(A | B, C) = p(A | C) (the conditional independence property)
Simple Form • A node is conditionally independent of all other nodes when conditioned only on its neighbouring nodes (its Markov blanket)
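In symbols, a minimal sketch of this local Markov property, writing ne(i) for the set of neighbours of node i:

```latex
% Local Markov property: conditioned on its neighbours ne(i),
% node x_i is independent of all remaining nodes.
p\bigl(x_i \mid \mathbf{x}_{\setminus i}\bigr) \;=\; p\bigl(x_i \mid x_{\mathrm{ne}(i)}\bigr)
```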
Factorization Properties • In a directed graph, the joint distribution factorizes into conditional distributions • Generalized form: p(x) = ∏_k p(x_k | pa_k), where pa_k denotes the parents of node x_k
In an Undirected Graph • Consider two nodes x_i and x_j that are not connected by a link • They must be conditionally independent given all other nodes, because every path between them passes through observed nodes • So the conditional independence property can be expressed as p(x_i, x_j | x_{\{i,j}}) = p(x_i | x_{\{i,j}}) p(x_j | x_{\{i,j}}), where x_{\{i,j}} denotes the set x of all variables with x_i and x_j removed (the factorization property)
Clique • This leads us to a graphical concept: the clique • Clique: a subset of nodes such that every pair of nodes in the subset is connected by a link • Maximal clique: a clique to which no other node of the graph can be added without it ceasing to be a clique
Potential Function • Define the factors in the decomposition of the joint distribution as functions of the variables in the cliques • Generally, it suffices to consider only the maximal cliques, because all other cliques are subsets of maximal cliques
Potential Function • Define a potential function ψ_C(x_C) over each maximal clique C of the graph, where x_C is the set of variables in that clique; each potential is equal to zero or positive • The joint distribution is the normalized product of potentials: p(x) = (1/Z) ∏_C ψ_C(x_C) • Partition function: the normalization constant Z = Σ_x ∏_C ψ_C(x_C)
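A minimal sketch in Python of how this works on a toy model (the three-node chain, the coupling strength beta, and the function names are all illustrative, not part of the original slides): the unnormalized product of clique potentials is evaluated on every joint state and summed to obtain Z.

```python
import itertools
import numpy as np

# Hypothetical 3-node chain x1 - x2 - x3 with binary states {-1, +1}.
# The maximal cliques are the pairs {x1, x2} and {x2, x3}.
states = [-1, +1]
beta = 1.0  # illustrative coupling strength

def potential(a, b):
    # Pairwise clique potential, positive by construction (exponential form).
    return np.exp(beta * a * b)

def unnormalized(x):
    x1, x2, x3 = x
    return potential(x1, x2) * potential(x2, x3)

# Partition function: sum over all K^M = 2^3 joint states.
Z = sum(unnormalized(x) for x in itertools.product(states, repeat=3))

# Normalized joint distribution p(x) = (1/Z) * prod_C psi_C(x_C).
p = {x: unnormalized(x) / Z for x in itertools.product(states, repeat=3)}
print(Z, sum(p.values()))  # the probabilities sum to 1
```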
Partition Function • For a model with M discrete nodes each having K states, evaluating Z involves summing over K^M states, so the cost grows exponentially with the size of the model (e.g. K = 2 and M = 100 already gives 2^100 ≈ 1.3 × 10^30 terms) • The normalization constant is one of the major limitations of undirected models • It is nevertheless needed for parameter learning, because Z is a function of any parameters that govern the potential functions
Connection between Conditional Independence and Factorization • Define UI: the set of distributions such that, for any node x_i, the conditional independence property p(x_i | x_{\ i}) = p(x_i | ne(i)) holds, where ne(i) is the neighbourhood of x_i and x_{\ i} denotes all nodes except x_i • Define UF: the set of distributions that can be expressed as a factorization over the maximal cliques, p(x) = (1/Z) ∏_C ψ_C(x_C) • The Hammersley–Clifford theorem states that the sets UI and UF are identical
Potential Function Expression • Since the potential functions are restricted to be strictly positive, it is convenient to express them as exponentials: ψ_C(x_C) = exp{−E(x_C)} • E(x_C) is called an energy function, and the exponential representation is called the Boltzmann distribution • The total energy is obtained by adding the energies of each of the maximal cliques, so p(x) = (1/Z) exp{−Σ_C E(x_C)}
Illustration: Image De-noising • Noisy image • Described by an array of binary pixel values y_i ∈ {−1, +1}, where the index i = 1, ..., D runs over all pixels
Illustration: Image De-noising • Noise-free image • Described by an array of binary pixel values x_i ∈ {−1, +1} • The noisy image is obtained by taking the noise-free image and randomly flipping the sign of pixels with some small probability
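A small sketch of how such a noisy observation y can be generated from the noise-free image x (the 10% flip probability, image size, and function name are illustrative assumptions, not fixed by the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def add_flip_noise(x, flip_prob=0.1):
    """Randomly flip the sign of each binary (+1/-1) pixel with probability flip_prob."""
    flips = rng.random(x.shape) < flip_prob
    return np.where(flips, -x, x)

# x: noise-free binary image with values in {-1, +1}
x = rng.choice([-1, 1], size=(32, 32))
y = add_flip_noise(x)
```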
Create the MRF Model • There is a strong correlation between neighbouring pixels x_i and x_j • There is also a strong correlation between x_i and y_i • MRF model: the graph has two types of cliques, each of which contains two variables • The cliques of the form {x_i, y_i} use an energy function of the form −η x_i y_i • The cliques of the form {x_i, x_j} use an energy function of the form −β x_i x_j • The parameters η and β are positive, and i and j index neighbouring pixels
The Energy Function • The complete energy function: E(x, y) = h Σ_i x_i − β Σ_{i,j} x_i x_j − η Σ_i x_i y_i • The extra term h x_i biases the model towards pixel values of one particular sign (positive or negative) • The joint distribution: p(x, y) = (1/Z) exp{−E(x, y)}
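A minimal sketch of this energy function in Python (the parameter values and the 4-connected neighbourhood, with each pair counted once, are illustrative assumptions):

```python
import numpy as np

def energy(x, y, h=0.0, beta=1.0, eta=2.1):
    """E(x, y) = h*sum_i x_i - beta*sum_{i,j} x_i x_j - eta*sum_i x_i y_i."""
    # Sum of products over horizontal and vertical neighbouring pairs.
    pairwise = np.sum(x[:, :-1] * x[:, 1:]) + np.sum(x[:-1, :] * x[1:, :])
    return h * np.sum(x) - beta * pairwise - eta * np.sum(x * y)
```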
Solve by ICM • For the purpose of image restoration, we want to find an image x having a high probability (ideally the maximum probability) • Use a simple iterative technique called iterated conditional modes (ICM) • This is simply an application of coordinate-wise gradient ascent
The Steps of ICM • Initialize the variables by setting x_i = y_i for all i • Take one node x_j at a time and evaluate the total energy for the two states x_j = +1 and x_j = −1, with all other variables held fixed • Choose the state with the lower energy and update x_j • Repeat, sweeping over the nodes, until convergence (see the sketch below)
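A sketch of ICM for this model in Python (it reuses the h, beta, eta parameters of the energy sketch above; the function name, sweep limit, and 4-connected neighbourhood are illustrative assumptions):

```python
def icm_denoise(y, h=0.0, beta=1.0, eta=2.1, max_sweeps=20):
    x = y.copy()                      # initialise x_i = y_i
    H, W = x.shape
    for _ in range(max_sweeps):
        changed = False
        for i in range(H):
            for j in range(W):
                # Sum over the 4-connected neighbours of pixel (i, j).
                nb = 0
                if i > 0: nb += x[i - 1, j]
                if i < H - 1: nb += x[i + 1, j]
                if j > 0: nb += x[i, j - 1]
                if j < W - 1: nb += x[i, j + 1]
                # Local energy of state s: h*s - beta*s*nb - eta*s*y_ij.
                best = min((+1, -1),
                           key=lambda s: h * s - beta * s * nb - eta * s * y[i, j])
                if best != x[i, j]:
                    x[i, j] = best
                    changed = True
        if not changed:               # converged: no pixel changed in a full sweep
            break
    return x
```

Only the terms of E(x, y) that involve the pixel being updated are compared, since all other terms cancel between the two candidate states; this is why a full evaluation of the energy is not needed at each step.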
Result • Figure: restored images obtained using ICM and using graph-cut
Relation to Directed Graphs • Consider the problem of taking a model that is specified using a directed graph and converting it to an undirected graph (figure: a directed graph and the corresponding undirected graph)
Relation to Directed Graphs • This is easily done by identifying the clique potentials of the undirected graph with the conditional distributions of the directed graph
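As a concrete case (a sketch assuming the figure shows the directed chain x1 → x2 → ... → xN), the directed factorization and the identification of the clique potentials read:

```latex
p(\mathbf{x}) = p(x_1)\,p(x_2 \mid x_1)\cdots p(x_N \mid x_{N-1})
             = \frac{1}{Z}\,\psi_{1,2}(x_1,x_2)\,\psi_{2,3}(x_2,x_3)\cdots\psi_{N-1,N}(x_{N-1},x_N)
% identify the pairwise potentials with the conditional distributions:
\psi_{1,2}(x_1,x_2) = p(x_1)\,p(x_2 \mid x_1), \qquad
\psi_{n-1,n}(x_{n-1},x_n) = p(x_n \mid x_{n-1}), \quad n = 3,\dots,N
% here Z = 1, because the potentials are already normalized distributions.
```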
Relation to Directed Graphs • Consider how to generalize this construction • This can be achieved if the clique potentials of the undirected graph are given by the conditional distributions of the directed graph • We must ensure that the set of variables that appears in each of the conditional distributions is a member of at least one clique of the undirected graph
Generalize This Construction • For nodes having just one parent, this is achieved simply by replacing the directed link with an undirected link between the node and its parent
Convert the Directed Graph to the Undirected Graph • For nodes having more than one parent, we must add links between all pairs of parents ("marrying the parents") and then drop the arrows; the resulting undirected graph is called the moral graph • In the example, the conditional distribution p(x4 | x1, x2, x3) involves the four variables, so they must belong to a single clique if this conditional distribution is to be absorbed into a clique potential • The process has become known as moralization
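A small sketch of moralization in Python (the representation of the DAG as a dict mapping each node to its list of parents, and the function name, are illustrative assumptions):

```python
from itertools import combinations

def moralize(parents):
    """Return the undirected (moral) graph edges for a DAG given as {node: [parents]}."""
    edges = set()
    for child, pa in parents.items():
        # Drop directions: link the child to each of its parents.
        for p in pa:
            edges.add(frozenset((p, child)))
        # "Marry the parents": add a link between every pair of parents.
        for p, q in combinations(pa, 2):
            edges.add(frozenset((p, q)))
    return edges

# Example: x4 has parents x1, x2, x3, as in the construction above.
print(moralize({"x4": ["x1", "x2", "x3"]}))
```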
Convert the Directed Graph to the Undirected Graph • The conversion discards some conditional independence properties • In fact, we could simply use a fully connected undirected graph • However, this would discard all conditional independence properties • Moralization adds the fewest extra links and so retains the maximum number of independence properties
Special Graphs • There are two types of graphs that express different relationships to the conditional independence properties of a distribution • Type 1: dependence map (D-map) • Type 2: independence map (I-map)
Dependence Map (D-map) • Every conditional independence statement satisfied by the distribution is reflected in the graph • A completely disconnected graph (no links) is a trivial D-map for any distribution
Independence Map (I-map) • Every conditional independence statement implied by the graph is satisfied by the distribution • A fully connected graph is a trivial I-map for any distribution • A perfect map: a graph that is both an I-map and a D-map of the distribution
Perfect Map • Figure: Venn diagram of the set P of all distributions over a given set of variables, together with the subset of distributions that have a perfect map as a directed graph and the subset that have a perfect map as an undirected graph