Modeling and Estimation of Dependent Subspaces


  1. Modeling and Estimation of Dependent Subspaces. J. A. Palmer¹, K. Kreutz-Delgado², B. D. Rao², Scott Makeig¹. ¹Swartz Center for Computational Neuroscience, ²Department of Electrical and Computer Engineering, University of California San Diego. September 11, 2007

  2. Outline • Previous work on adaptive source densities • Types of dependency • Variance dependency • Skew dependency • Non-radially symmetric dependency • Normal Variance-Mean Mixtures • Examples from EEG

  3. Independent Source Densities A general classification of sources: sub- and super-Gaussian • Super-Gaussian = more peaked than Gaussian, heavier tail • Sub-Gaussian = flatter, more uniform, shorter tail than Gaussian • [Figure: example densities labeled Super-Gaussian, Gaussian, Sub-Gaussian, and Sub- AND Super-Gaussian]
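As a quick illustration of this classification (not part of the original slides), excess kurtosis is the usual rule of thumb: positive for super-Gaussian, negative for sub-Gaussian, zero for Gaussian. A minimal sketch in Python:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
n = 100_000

samples = {
    "Laplacian (super-Gaussian)": rng.laplace(size=n),
    "Gaussian": rng.normal(size=n),
    "Uniform (sub-Gaussian)": rng.uniform(-1, 1, size=n),
}

for name, x in samples.items():
    # Excess kurtosis: > 0 indicates a super-Gaussian (peaked, heavy-tailed)
    # density, < 0 a sub-Gaussian (flatter, light-tailed) one.
    print(f"{name:28s} excess kurtosis = {kurtosis(x):+.2f}")
```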

  4. Extended Infomax The (independent) source models used in the Extended Infomax algorithm (Lee) are: • Super-Gaussian: logistic density • Sub-Gaussian: Gaussian mixture
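The transcript drops the density formulas. The forms usually quoted for these two models (an assumption here, not read off the slide, and possibly differing from the slide's exact parameterization) are a logistic density for the super-Gaussian case and a symmetric two-Gaussian mixture for the sub-Gaussian case:

```latex
% Hedged reconstruction of the usual Extended Infomax source models;
% the exact parameterization on the original slide may differ.
\begin{align*}
  \text{super-Gaussian (logistic):} \quad
    p(s) &= \frac{e^{-s}}{\left(1 + e^{-s}\right)^{2}}
          = \tfrac{1}{4}\operatorname{sech}^{2}(s/2), \\[4pt]
  \text{sub-Gaussian (Gaussian mixture):} \quad
    p(s) &= \tfrac{1}{2}\,\mathcal{N}(s;\mu,\sigma^{2})
          + \tfrac{1}{2}\,\mathcal{N}(s;-\mu,\sigma^{2}).
\end{align*}
```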

  5. Scale Mixture Representation • Gaussian Scale Mixtures (GSMs) are mixtures of Gaussian densities with different variances, but all zero mean: p(x) = ∫ N(x; 0, ξ) p(ξ) dξ • A random variable with a GSM density can be represented as the product of a standard Normal random variable Z and an arbitrary non-negative random variable ξ: X = ξ^{1/2} Z • Sums of a random number of random variables lead to GSMs (Renyi) • Multivariate densities can be modeled by the product of a non-negative scalar and a Gaussian random vector: X = ξ^{1/2} Z

  6. Super-Gaussian Mixture Model • Generalization of the Gaussian mixture model to super-Gaussian mixtures • The update rules are similar to those of the Gaussian mixture model, but include additional variational scale parameters

  7. Gaussian Scale Mixture Examples 1 • Generalized Gaussian, 0 < α < 2: p(x) ∝ exp(−|x|^α) • Mixing density is related to the positive alpha-stable density S_{α/2}
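A sketch of the representation behind this bullet (the symbol α for the shape parameter is my reconstruction of the garbled transcript, chosen so that the stable index α/2 reads consistently):

```latex
% Generalized Gaussian density and its Gaussian scale mixture form.
% The mixing density f is related to a positive alpha-stable density of
% index alpha/2, which generally has no closed form (only its Laplace
% transform does).
\begin{align*}
  p(x) &\propto \exp\!\left(-|x|^{\alpha}\right), \qquad 0 < \alpha < 2, \\[4pt]
  p(x) &= \int_{0}^{\infty} \mathcal{N}\!\left(x;\,0,\,\xi\right) f(\xi)\, d\xi,
  \qquad X \stackrel{d}{=} \xi^{1/2} Z,\ \ Z \sim \mathcal{N}(0,1).
\end{align*}
```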

  8. Gaussian Scale Mixture Examples 2 • Generalized Logistic, with shape parameter > 0 • Mixing density is the Generalized Kolmogorov density

  9. Gaussian Scale Mixture Examples 3 • Generalized Hyperbolic • Mixing density is the Generalized Inverse Gaussian (GIG)
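For reference, a commonly quoted form of this relationship, written for the symmetric zero-mean case and with a parameterization that is assumed here rather than read off the slide:

```latex
% Symmetric Generalized Hyperbolic density as a GSM with a
% Generalized Inverse Gaussian (GIG) mixing density.
\begin{align*}
  \mathrm{GIG}(\xi;\lambda,\delta,\gamma)
    &\propto \xi^{\lambda-1}
       \exp\!\left(-\tfrac{1}{2}\left(\delta^{2}/\xi + \gamma^{2}\xi\right)\right),
       \qquad \xi > 0, \\[4pt]
  p(x) &= \int_{0}^{\infty} \mathcal{N}(x;0,\xi)\,
          \mathrm{GIG}(\xi;\lambda,\delta,\gamma)\, d\xi .
\end{align*}
```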

  10. Dependent Subspaces • Dependent sources are modeled by a Gaussian scale mixture, i.e. a Gaussian vector with a common scalar multiplier, yielding “variance dependence” (see the sampling sketch below)
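A minimal sketch of this generative model (variable names and the gamma mixing density are illustrative choices, not taken from the slides): all components in the subspace share one random scale per sample, so they are uncorrelated but their powers co-vary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100_000, 3          # samples, subspace dimension

# Shared non-negative scale per sample (a gamma draw, chosen only for
# illustration) and an independent standard Gaussian vector.
xi = rng.gamma(shape=2.0, scale=1.0, size=n)
z = rng.standard_normal((n, d))
x = np.sqrt(xi)[:, None] * z          # x = xi^{1/2} z, common scalar multiplier

# Components are (nearly) uncorrelated ...
print("corr(x1, x2)     =", np.round(np.corrcoef(x[:, 0], x[:, 1])[0, 1], 3))
# ... but their powers are positively correlated: variance dependence.
print("corr(x1^2, x2^2) =", np.round(np.corrcoef(x[:, 0]**2, x[:, 1]**2)[0, 1], 3))
```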

  11. Dependent Multivariate Densities • Multiply a Gaussian vector by a common scalar • Consider a univariate GSM evaluated at the radius ‖x‖ • Taking derivatives of both sides relates the univariate and multivariate Gaussian densities

  12. Dependent Multivariate Densities • Thus, given a univariate GSM, we can form a multivariate GSM • Define the linear operator V • Then the multivariate density is obtained by applying V to the univariate GSM • Posterior moments can be calculated for EM (see the sketch below)
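The transcript omits the equations on slides 11 and 12. A sketch of the identity that appears to be in use, written for R³ (the operator name V follows the slide; the constant and exact form are my reconstruction):

```latex
% For a zero-mean Gaussian, differentiating in the radius r relates the
% univariate and trivariate densities:
%   -(1/r) d/dr N(r; 0, xi) = (1/xi) N(r; 0, xi) = 2*pi * N_3(x; 0, xi I),  r = ||x||.
% Applying this under the mixing integral of a univariate GSM
% p(x) = \int N(x; 0, xi) f(xi) dxi gives a multivariate GSM on R^3:
\begin{align*}
  (Vp)(r) &:= -\frac{1}{r}\,\frac{d p}{d r}(r), \\[4pt]
  p_{3}(\mathbf{x}) &= \int_{0}^{\infty} \mathcal{N}_{3}(\mathbf{x};0,\xi I)\, f(\xi)\, d\xi
                     \;=\; \frac{1}{2\pi}\,(Vp)\big(\lVert \mathbf{x} \rVert\big).
\end{align*}
```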

  13. Examples in R3 • Given a univariate GSM p(x), a dependent multivariate density in R3 is obtained as above (worked example below) • Example: Generalized Gaussian • Example: Generalized Logistic
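Carrying the R³ construction above through for the generalized Gaussian case (again a reconstruction; the slide's normalization constants are not recoverable from the transcript):

```latex
% Univariate generalized Gaussian, p(r) proportional to exp(-r^alpha),
% pushed to R^3 via p_3(x) = (1/(2*pi)) * (-p'(r)/r), with r = ||x||.
% Setting alpha = 2 recovers the Gaussian, as a sanity check.
\begin{align*}
  p(r) &\propto \exp\!\left(-r^{\alpha}\right)
  \quad\Longrightarrow\quad
  -\frac{p'(r)}{r} \propto \alpha\, r^{\alpha-2} \exp\!\left(-r^{\alpha}\right), \\[4pt]
  p_{3}(\mathbf{x}) &\propto
     \lVert \mathbf{x} \rVert^{\alpha-2}
     \exp\!\left(-\lVert \mathbf{x} \rVert^{\alpha}\right),
  \qquad \mathbf{x} \in \mathbb{R}^{3}.
\end{align*}
```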

  14. Non-radial Symmetry • Use Generalized Gaussian vectors to model non-radially symmetric dependence:

  15. Generalized Hyperbolic • The Generalized Hyperbolic density (Barndorff-Nielsen, 1982) is a GSM • The same construction applies to a Generalized Gaussian scale mixture • The posterior over the scale is Generalized Inverse Gaussian (see the sketch below)
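A sketch of why the scale posterior is GIG, written for the plain Gaussian scale mixture case; the Generalized Gaussian version on the slide replaces the squared term by a power of |x|, but that form is not recoverable from the transcript:

```latex
% With a GIG prior on the scale and a zero-mean Gaussian likelihood,
% the scale posterior is again GIG with updated parameters:
\begin{align*}
  p(\xi \mid x)
    &\propto \mathcal{N}(x;0,\xi)\;\mathrm{GIG}(\xi;\lambda,\delta,\gamma) \\
    &\propto \xi^{(\lambda - \tfrac{1}{2}) - 1}
       \exp\!\left(-\tfrac{1}{2}\left(\frac{\delta^{2}+x^{2}}{\xi} + \gamma^{2}\xi\right)\right)
     \;=\; \mathrm{GIG}\!\left(\xi;\;\lambda-\tfrac{1}{2},\;\sqrt{\delta^{2}+x^{2}},\;\gamma\right).
\end{align*}
```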

  16. Hypergeneralized Hyperbolic • This yields the “Hypergeneralized Hyperbolic” density • The posterior moment needed for EM is given in closed form (sketch below)
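The EM E-step typically needs the posterior expectation of the inverse scale. For the Gaussian-likelihood case sketched on the previous slide this has a standard Bessel-function form; the Generalized Gaussian analogue used for the Hypergeneralized Hyperbolic density is not recoverable from the transcript:

```latex
% Moments of a GIG(lambda, delta, gamma) variable are ratios of modified
% Bessel functions K_nu; applying this to the posterior from the previous
% slide (Gaussian case) gives the quantity used in the EM E-step:
\begin{align*}
  \mathbb{E}\!\left[\xi^{-1} \mid x\right]
    &= \frac{\gamma}{\sqrt{\delta^{2}+x^{2}}}\;
       \frac{K_{\lambda-\tfrac{3}{2}}\!\big(\gamma\sqrt{\delta^{2}+x^{2}}\big)}
            {K_{\lambda-\tfrac{1}{2}}\!\big(\gamma\sqrt{\delta^{2}+x^{2}}\big)}.
\end{align*}
```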

  17. Generalized Gauss. Scale Mixtures • More generally, evaluate a multivariate GSM at x^{p/2} • Integrating this over R^d gives the normalizing constant • Thus, given a multivariate GSM, we can formulate a multivariate GGSM (sketch below)
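One way to read this construction (an assumption on my part; the transcript omits the equations): push each coordinate through a signed power map before evaluating the GSM, then renormalize over R^d.

```latex
% Hedged sketch: given a multivariate GSM q on R^d, define a
% Generalized Gaussian scale mixture (GGSM) by evaluating q at the
% componentwise signed power x_i -> sign(x_i) |x_i|^{p/2} and
% renormalizing over R^d:
\begin{align*}
  s(\mathbf{x}) &:= \big(\operatorname{sign}(x_1)\,|x_1|^{p/2},\,\dots,\,
                         \operatorname{sign}(x_d)\,|x_d|^{p/2}\big), \\[4pt]
  p(\mathbf{x}) &= \frac{q\big(s(\mathbf{x})\big)}{Z_p},
  \qquad Z_p = \int_{\mathbb{R}^{d}} q\big(s(\mathbf{x})\big)\, d\mathbf{x}.
\end{align*}
```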

  18. Skew Dependence • Skew is modeled with “location-scale mixtures”:

  19. Skew Models • Start from a multivariate GSM • Now, for any drift vector β, shift the conditional mean in proportion to the scale • This can be written in a location-scale mixture form • This is equivalent to the generative model sketched below
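The equations are missing from the transcript; the standard normal variance-mean mixture (Barndorff-Nielsen), which matches the outline item "Normal Variance-Mean Mixtures", reads as follows, with β the skew/drift vector (symbols assumed, not read off the slide):

```latex
% Normal variance-mean (location-scale) mixture: conditionally Gaussian
% with both the mean shift and the covariance scaled by the same latent xi.
\begin{align*}
  \mathbf{x} &= \boldsymbol{\mu} + \xi\,\boldsymbol{\beta} + \xi^{1/2}\,\mathbf{z},
  \qquad \mathbf{z} \sim \mathcal{N}(\mathbf{0},\Sigma),\ \ \xi \sim f(\xi), \\[4pt]
  \mathbf{x} \mid \xi &\sim \mathcal{N}\!\big(\boldsymbol{\mu} + \xi\,\boldsymbol{\beta},\ \xi\,\Sigma\big),
  \qquad
  p(\mathbf{x}) = \int_{0}^{\infty}
     \mathcal{N}\!\big(\mathbf{x};\,\boldsymbol{\mu} + \xi\boldsymbol{\beta},\ \xi\Sigma\big)\, f(\xi)\, d\xi .
\end{align*}
```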

  20. EEG sources [scalp-map figures]: brain sources, ocular sources, scalp muscle sources, external EM sources, heartbeat

  21. Pairwise Mutual Information

  22. Maximize Block Diagonality

  23. Variance Dependency • Variance dependence can be estimated directly using 4th-order cross moments • Find the covariance of source power, cov(s_i², s_j²) • Finds components whose activations are “active” at the same times, i.e. “co-modulated” (see the sketch below)
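A minimal sketch of this estimator (array and function names are mine; the original analysis also sums the power covariance over time lags, per the next slide):

```python
import numpy as np

def power_correlation(S: np.ndarray) -> np.ndarray:
    """Correlation of instantaneous power between sources.

    S : array of shape (n_sources, n_samples), e.g. ICA activations.
    Returns an (n_sources, n_sources) matrix; large off-diagonal entries
    indicate co-modulated (variance-dependent) source pairs.
    """
    power = S ** 2               # instantaneous power of each source
    return np.corrcoef(power)    # 4th-order cross statistics of S

# Toy usage: two variance-dependent sources plus one independent source.
rng = np.random.default_rng(0)
xi = rng.gamma(2.0, 1.0, size=10_000)                   # shared scale
S = np.vstack([
    np.sqrt(xi) * rng.standard_normal(10_000),          # source 1, shares xi
    np.sqrt(xi) * rng.standard_normal(10_000),          # source 2, shares xi
    rng.standard_normal(10_000),                        # independent source 3
])
print(np.round(power_correlation(S), 2))
```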

  24. Mutual Information/Power Covariance • Most of the dependence in mutual information is captured by the covariance of power (summed over 50 lags) • Some pairs of sources are more sensitive to variance dependence

  25. Variance Dependent Sources

  26. • Marginal histograms are “sparse” • However, the joint (product) density is approximately “radially symmetric” • Radially symmetric non-Gaussian densities are dependent

  27. Conclusion • We described a general framework for modeling dependent sources • Estimation of model parameters is carried out using the EM algorithm • Models include variance dependence, non-radially symmetric dependence, and skew dependence • Application to the analysis of EEG sources
