Dirichlet process tutorial Bryan Russell
Goals • Intuitive understanding of Dirichlet processes and applications • Minimize math and maximize pictures • Motivate you to go through the math to understand implementation
Disclaimers • What I’m about to tell you applies more generally • We’ll gloss over lots of math (especially measure theory); look at the original papers for details
What’s this good for? • Principled, Bayesian method for fitting a mixture model with an unknown number of clusters • Because it’s Bayesian, we can build hierarchies (e.g. hierarchical Dirichlet processes, HDPs) and integrate with other random variables in a principled way
Gaussian mixture model • Multinomial weights: prior probabilities of the mixture components • For each data point, choose a cluster h according to these weights • Generate the point x from Gaussian component h
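The generative process above can be sketched in a few lines. This is an illustrative example, not code from the tutorial; all variable names and parameter values are assumptions.

```python
# Sketch: sampling from a finite Gaussian mixture with K components.
import numpy as np

rng = np.random.default_rng(0)

K = 3
weights = rng.dirichlet(np.ones(K))      # multinomial mixture weights
means = rng.normal(0.0, 5.0, size=K)     # one Gaussian mean per component
stds = np.full(K, 1.0)                   # fixed unit variances for simplicity

N = 1000
h = rng.choice(K, size=N, p=weights)     # choose a cluster per data point
x = rng.normal(means[h], stds[h])        # generate x from component h
```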
Let us be more Bayesian… • Put a prior over the mixture parameters • For Gaussian mixtures, this is a normal-inverse-Wishart density
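A draw from a normal-inverse-Wishart prior yields a full set of Gaussian parameters (mean and covariance). A hedged sketch, assuming standard hyperparameter names (mu0, kappa0, nu0, Psi0) that are not in the slides:

```python
# Sketch: sample (mu, Sigma) from a normal-inverse-Wishart prior.
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(1)

d = 2
mu0 = np.zeros(d)   # prior mean
kappa0 = 1.0        # pseudo-count controlling confidence in mu0
nu0 = d + 2         # inverse-Wishart degrees of freedom (must exceed d - 1)
Psi0 = np.eye(d)    # inverse-Wishart scale matrix

# Sigma ~ IW(nu0, Psi0), then mu | Sigma ~ N(mu0, Sigma / kappa0)
Sigma = invwishart.rvs(df=nu0, scale=Psi0, random_state=rng)
mu = rng.multivariate_normal(mu0, Sigma / kappa0)
```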
Suppose we do not know the number of clusters • We could sample Gaussian parameters separately for each data point • However, the parameters may then all be unique, i.e. one Gaussian component per data point: overfitting
Dirichlet processes to the rescue • Draws G ~ DP(α, G₀) have a nice clustering property • G₀ is a base density over the parameters (here, normal-inverse-Wishart); α is the concentration parameter • A draw G is discrete with probability one
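One way to see the discreteness is the stick-breaking construction of a DP draw: weights β_k = v_k ∏_{j&lt;k}(1 − v_j) with v_k ~ Beta(1, α), and atoms θ_k drawn from the base density G₀. A sketch under assumed values (the truncation level T is purely an implementation convenience):

```python
# Sketch: stick-breaking construction of a truncated DP draw G ~ DP(alpha, G0).
import numpy as np

rng = np.random.default_rng(2)

alpha = 2.0   # concentration parameter
T = 100       # truncation level

v = rng.beta(1.0, alpha, size=T)
remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
beta = v * remaining                    # stick-breaking weights
atoms = rng.normal(0.0, 5.0, size=T)    # atoms from a toy base G0 = N(0, 25)

# G places mass beta[k] at atoms[k]: it is discrete, so repeated draws
# from G land on the same atoms, which is exactly the clustering property.
```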
Visualizing Dirichlet process draws • Think of these as prior weights over the parameters
DP mixture model • This model has a bias to “bunch” parameters together • The concentration parameter controls this “bunching”: lower values yield fewer clusters, higher values more
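The effect of the concentration parameter can be seen through the Chinese restaurant process view of the DP prior, where each point joins an existing cluster with probability proportional to its size, or starts a new one with probability proportional to α. An illustrative sketch (function name and parameter values are my own, not from the slides):

```python
# Sketch: CRP(alpha) prior over cluster assignments; the expected number
# of clusters grows roughly like alpha * log(n).
import numpy as np

def crp_assignments(n, alpha, rng):
    """Sample cluster assignments for n points under a CRP(alpha) prior."""
    counts = []                    # number of points at each "table"
    z = np.empty(n, dtype=int)
    for i in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)       # open a new table (new cluster)
        else:
            counts[k] += 1
        z[i] = k
    return z

rng = np.random.default_rng(3)
few = len(np.unique(crp_assignments(2000, 0.5, rng)))
many = len(np.unique(crp_assignments(2000, 20.0, rng)))
# Typically few < many: lower alpha finds fewer clusters.
```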