Dynamic Non-Parametric Mixture Models and The Recurrent Chinese Restaurant Process (SDM, 2008) Timeline: A Dynamic Hierarchical Dirichlet Process Model for Recovering Birth/Death and Evolution of Topics in Text Stream (UAI 2010) Amr Ahmed and Eric P. Xing Presented by Bo Chen
Outline • 1. Dirichlet process (DP) • 2. Temporal Dirichlet process mixture model and the recurrent Chinese restaurant process (RCRP) • 3. Hierarchical Dirichlet process (HDP) • 4. Dynamic hierarchical Dirichlet process • 5. Inference • 6. Experimental results • 7. Conclusions
Dirichlet Process (DP) Let G ~ DP(α, G0) and θ1, …, θn | G ~ G. If we integrate out G, the posterior of G is again a DP: G | θ1, …, θn ~ DP(α + n, (α G0 + Σi δθi) / (α + n)). • Perspectives: • Chinese restaurant process or Pólya urn model • Stick-breaking process • A limit of finite mixture models • Normalized gamma process
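A minimal sketch of the Chinese restaurant process perspective listed above, assuming only a concentration parameter alpha; the function name `crp_sample` and the use of numpy are my own choices, not material from the slides.

```python
import numpy as np

def crp_sample(n, alpha, rng=None):
    """Chinese restaurant process: customer i joins existing table k with
    probability counts[k] / (i + alpha) and opens a new table with
    probability alpha / (i + alpha)."""
    rng = np.random.default_rng() if rng is None else rng
    assignments, counts = [], []
    for i in range(n):
        weights = np.array(counts + [alpha], dtype=float)
        k = rng.choice(len(weights), p=weights / weights.sum())
        if k == len(counts):
            counts.append(1)        # open a new table (new cluster)
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments, counts
```

For example, `crp_sample(1000, 1.0)` typically occupies on the order of alpha * log(n) tables, reflecting the DP's rich-gets-richer clustering behaviour.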
The Temporal Dirichlet Process Mixture Model Intuition: The TDPM seeks to model the evolution of cluster parameters over time using any time-series model, and to capture the evolution of cluster popularity over time via the rich-gets-richer effect, i.e. the popularity of cluster k at time t is proportional to how many data points were associated with cluster k at time t-1. Notation:
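A sketch of the cross-epoch rich-gets-richer coupling described above, under the simplifying assumption that only the previous epoch's counts are carried over as pseudo-counts; this is my illustrative reading of the RCRP, not code from the paper, and `rcrp_epoch` and its arguments are made-up names.

```python
import numpy as np

def rcrp_epoch(n_customers, alpha, prev_counts, rng=None):
    """Seat n_customers at epoch t. A cluster's seating weight is
    n_{k,t-1} (carried over from epoch t-1) plus the customers already
    seated at it in epoch t; a new cluster is opened with weight alpha."""
    rng = np.random.default_rng() if rng is None else rng
    weights = dict(prev_counts)                 # pseudo-counts from epoch t-1
    counts_t = {k: 0 for k in prev_counts}      # counts generated at epoch t
    assignments = []
    next_id = max(prev_counts, default=-1) + 1
    for _ in range(n_customers):
        ids = list(weights)
        w = np.array([weights[k] for k in ids] + [alpha], dtype=float)
        j = rng.choice(len(w), p=w / w.sum())
        if j == len(ids):                       # brand-new cluster born at epoch t
            k, next_id = next_id, next_id + 1
            weights[k] = 0.0
            counts_t[k] = 0
        else:
            k = ids[j]
        weights[k] += 1
        counts_t[k] += 1
        assignments.append(k)
    return assignments, counts_t                # counts_t seeds epoch t+1
```

A cluster that attracts no customers in epoch t contributes zero weight at epoch t+1, which is exactly the "forgets too quickly" issue discussed on the next slide.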
Modeling Higher-Order Dependencies One problem with the above construction of the TDPM is that it forgets too quickly: cluster popularity at time t+1 depends only on the usage pattern at time t, ignoring all information from before epoch t. Moreover, once a cluster is dead it can never be revived. Clearly, in some applications one might want to give a cluster some slack before declaring it dead.
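As I understand the remedy, the pseudo-counts that seed epoch t are built from a window of the last W epochs, discounted by an exponential decay kernel, so a cluster only dies after going unused for W consecutive epochs. A hedged sketch of that idea (function and argument names are mine):

```python
import numpy as np

def decayed_prior_counts(history, window, lam):
    """Build the pseudo-counts that seed epoch t from the last `window`
    epochs, discounting epoch t-delta by exp(-delta / lam).
    history: list of {cluster: count} dicts, history[-1] being epoch t-1."""
    prior = {}
    for delta in range(1, min(window, len(history)) + 1):
        decay = np.exp(-delta / lam)
        for k, n in history[-delta].items():
            prior[k] = prior.get(k, 0.0) + decay * n
    return prior
```

With window = 1 and a large lam this reduces to the basic construction above; wider windows give a cluster several epochs of slack before it is declared dead.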
Non-Parametric Dynamic Topic Model Cons: the simple DTM above generates each document x from a single component (topic), which makes it suboptimal for modeling multi-topic documents.
Building Infinite Dynamic Topic Models (iDTM) Using the recurrent Chinese restaurant franchise (RCRF) process as a prior over word assignments to topics in a mixed-membership model, we obtain the infinite dynamic topic model (iDTM).
Inference via Gibbs Sampling (1) In order to calculate the last two lines, the authors use the following approximation:
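The approximation itself appears in the slide's equations, which are not reproduced here; for orientation only, below is the generic shape of a collapsed Gibbs draw for one word's topic indicator in an LDA-style mixed-membership model. This is a sketch of the flavour of such updates, not the authors' exact conditional, and all names are mine.

```python
import numpy as np

def sample_word_topic(w, doc_topic, topic_word, alpha, eta, rng):
    """Draw z for one word with id w, given count arrays from which this
    word's previous assignment has already been removed:
      doc_topic:  length-K counts of topics in the current document
      topic_word: K x V counts of words per topic
    P(z = k | rest) ~ (doc_topic[k] + alpha) *
                      (topic_word[k, w] + eta) / (topic_word[k].sum() + V * eta)."""
    V = topic_word.shape[1]
    scores = (doc_topic + alpha) * (topic_word[:, w] + eta) \
             / (topic_word.sum(axis=1) + V * eta)
    return rng.choice(len(scores), p=scores / scores.sum())
```

The iDTM conditional additionally couples counts across epochs through the RCRF prior and the evolving topic parameters, so it is more involved than this vanilla LDA form.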
Inference via Gibbs Sampling (2) Sample the whole topic chain via M-H:
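A hedged sketch of what a Metropolis-Hastings move over a whole topic chain can look like, assuming (as in dynamic topic models) that a topic's natural parameters follow a Gaussian random walk across epochs and emit words through a softmax. This is a generic random-walk M-H, not the proposal distribution used in the paper, and every name here is mine.

```python
import numpy as np

def log_softmax(x):
    x = x - x.max()
    return x - np.log(np.exp(x).sum())

def mh_resample_topic_chain(phi, counts, sigma, step, rng):
    """One random-walk M-H update of an entire topic chain.
    phi:    (T, V) natural parameters of one topic at each epoch,
            tied by phi_t ~ N(phi_{t-1}, sigma^2 I) (flat prior on phi_0 here).
    counts: (T, V) counts of words assigned to this topic at each epoch."""
    def log_post(chain):
        dyn = -0.5 * np.sum((chain[1:] - chain[:-1]) ** 2) / sigma ** 2   # dynamics prior
        lik = sum(np.dot(counts[t], log_softmax(chain[t])) for t in range(len(chain)))
        return dyn + lik
    prop = phi + step * rng.standard_normal(phi.shape)     # symmetric proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(phi):
        return prop, True      # accept the whole chain
    return phi, False          # reject and keep the current chain
```

Proposing the whole chain jointly, rather than epoch by epoch, keeps the dynamics prior consistent across the accepted state.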
Simulation Results Epochs: 20; Vocabulary: 16; Topics: 8; Docs in each epoch: 100; Words in each doc: 50
Timeline of the NIPS Conference (1) Years: 1987-1999; Vocabulary: 3379; Documents: 1740
Conclusions and Future Work • 1. Addressed the problem of modeling time-varying document collections with a topic model, iDTM, that can adapt the number of topics, the word distributions of topics, and the topics' trends over time. • 2. Extend the Gibbs sampler to sample all the hyperparameters of the model. • 3. Extend the model to evolve an HDP at several levels: for instance, lower levels might correspond to conferences and the highest level to time. This framework would enable us to understand topic evolution within and across different conferences or disciplines.