Modeling and Estimation of Dependent Subspaces J. A. Palmer¹, K. Kreutz-Delgado², B. D. Rao², Scott Makeig¹ ¹Swartz Center for Computational Neuroscience, ²Department of Electrical and Computer Engineering, University of California San Diego September 11, 2007
Outline • Previous work on adaptive source densities • Types of dependency • Variance dependency • Skew dependency • Non-radially symmetric dependency • Normal Variance-Mean Mixtures • Examples from EEG
Independent Source Densities • A general classification of sources: sub- and super-Gaussian • Super-Gaussian = more peaked than Gaussian, heavier tails • Sub-Gaussian = flatter, more uniform, shorter tails than Gaussian
[Figure: example densities — super-Gaussian, Gaussian, sub-Gaussian, and one that is sub- AND super-Gaussian]
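As an aside (a common convention, not stated on the slide), this classification is often summarized by the excess kurtosis:
$$ \kappa(x) \;=\; \frac{\mathbb{E}[x^4]}{\left(\mathbb{E}[x^2]\right)^2} - 3, \qquad \kappa > 0 \ \text{(super-Gaussian)}, \qquad \kappa < 0 \ \text{(sub-Gaussian)} $$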
Extended Infomax • The (independent) source models used in the Extended Infomax algorithm (Lee) are: • Super-Gaussian (logistic) • Sub-Gaussian (Gaussian mixture)
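A sketch of the two lost formulas, assuming the standard Extended Infomax forms (the logistic density and a symmetric two-component Gaussian mixture):
$$ p_{\text{super}}(s) \;=\; \frac{e^{-s}}{\left(1+e^{-s}\right)^{2}}, \qquad p_{\text{sub}}(s) \;=\; \tfrac{1}{2}\, N(s;\,\mu,\sigma^2) + \tfrac{1}{2}\, N(s;\,-\mu,\sigma^2) $$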
Scale Mixture Representation • Gaussian scale mixtures (GSMs) are mixtures of Gaussian densities with different variances, but all zero mean • A random variable with a GSM density can be represented as the product of a standard Normal random variable Z and an arbitrary non-negative random variable ξ: X = ξ^{1/2} Z • Sums of a random number of random variables lead to GSMs (Rényi) • Multivariate densities can be modeled as the product of a non-negative scalar and a Gaussian random vector: X = ξ^{1/2} Z
[Figure: a Gaussian scale mixture built from Gaussians of different variances]
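The scale-mixture integral behind the slide (a reconstruction; the mixing density is written p(ξ)):
$$ p(x) \;=\; \int_0^\infty N(x;\,0,\xi)\, p(\xi)\, d\xi, \qquad X = \xi^{1/2} Z, \quad Z \sim N(0,1), \quad \xi \ge 0 $$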
Super-Gaussian Mixture Model • Generalization of the Gaussian mixture model to super-Gaussian mixtures: • The update rules are similar to those of the Gaussian mixture model, but include additional variational (scale) parameters
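A sketch of the mixture, assuming each component is a shifted, scaled super-Gaussian GSM density q (the symbols π_j, μ_j, σ_j are illustrative):
$$ p(x) \;=\; \sum_{j=1}^{m} \frac{\pi_j}{\sigma_j}\, q\!\left(\frac{x-\mu_j}{\sigma_j}\right), \qquad q(x) \;=\; \int_0^\infty N(x;\,0,\xi)\, p(\xi)\, d\xi $$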
Gaussian Scale Mixture Examples 1 • Generalized Gaussian, 0 < ρ < 2: • Mixing density is related to the positive alpha-stable density S_{ρ/2}:
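The generalized Gaussian density in its standard form (a reconstruction; the lost exponent symbol is read as ρ):
$$ p(x) \;\propto\; \exp\!\left(-|x|^{\rho}\right), \qquad 0 < \rho < 2 $$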
Gaussian Scale Mixture Examples 2 • Generalized Logistic, α > 0: • Mixing density is the Generalized Kolmogorov density:
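One standard form of the generalized logistic density, offered as an assumption since the slide's formula is lost (it is symmetric and reduces to the ordinary logistic at α = 1):
$$ p(x) \;=\; \frac{\Gamma(2\alpha)}{\Gamma(\alpha)^2}\, \frac{e^{-\alpha x}}{\left(1 + e^{-x}\right)^{2\alpha}} $$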
Gaussian Scale Mixture Examples 3 • Generalized Hyperbolic: • Mixing density is Generalized Inverse Gaussian:
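The Generalized Inverse Gaussian mixing density in a standard parameterization (a reconstruction of the lost formula):
$$ p(\xi) \;=\; \frac{(\gamma/\delta)^{\lambda}}{2 K_{\lambda}(\delta\gamma)}\, \xi^{\lambda-1} \exp\!\left(-\tfrac{1}{2}\left(\delta^2 \xi^{-1} + \gamma^2 \xi\right)\right), \qquad \xi > 0 $$
where K_λ is the modified Bessel function of the second kind.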
Dependent Subspaces • Dependent sources are modeled by a Gaussian scale mixture, i.e. a Gaussian vector with a common scalar multiplier, yielding "variance dependence":
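A reconstruction of the subspace model (Σ and the notation are assumed):
$$ \mathbf{x} \;=\; \xi^{1/2} \mathbf{z}, \quad \mathbf{z} \sim N(\mathbf{0}, \Sigma) \qquad\Longrightarrow\qquad p(\mathbf{x}) \;=\; \int_0^\infty N(\mathbf{x};\,\mathbf{0},\,\xi\Sigma)\, p(\xi)\, d\xi $$
The shared ξ makes all coordinates of the subspace large or small together: they can be uncorrelated yet dependent through their common variance.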
Dependent Multivariate Densities • Multiply a Gaussian vector by a common scalar: • For a GSM evaluated at √s: • Taking derivatives of both sides (the derivation is reconstructed after the next slide):
Dependent Multivariate Densities • Thus, given a univariate GSM, we can form a multivariate GSM: • Define the linear operator V: • Then we have: • Posterior moments can be calculated for EM:
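A reconstruction of the derivative argument on the two slides above (the constants follow from the normal density and should be checked against the paper). Writing the univariate GSM at √s:
$$ p(\sqrt{s}) \;=\; \int_0^\infty (2\pi\xi)^{-1/2}\, e^{-s/(2\xi)}\, p(\xi)\, d\xi $$
each derivative in s pulls down a factor of (2ξ)^{-1}. Define the linear operator
$$ (Vf)(x) \;=\; -\frac{1}{x}\, \frac{df}{dx}(x) $$
Then a dependent multivariate GSM on R^d (d odd) is
$$ p_d(\mathbf{x}) \;=\; (2\pi)^{-(d-1)/2}\, \big(V^{(d-1)/2} p\big)(\|\mathbf{x}\|) $$
and the posterior moment needed for EM follows from the same identity:
$$ \mathbb{E}\big[\xi^{-1} \,\big|\, x\big] \;=\; -\,\frac{p'(x)}{x\, p(x)} $$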
Examples in R3 • Given a univariate GSM p(x), a dependent multivariate density in R3 is given by: • Example: Generalized Gaussian: • Example: Generalized Logistic:
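A worked instance of the R3 formula implied by the operator V above, p₃(𝐱) = −p′(r)/(2πr) with r = ‖𝐱‖, for the generalized Gaussian p(x) ∝ e^{−|x|^ρ} (normalization omitted):
$$ p_3(\mathbf{x}) \;\propto\; r^{\rho-2}\, e^{-r^{\rho}}, \qquad r = \|\mathbf{x}\| $$
The generalized logistic example follows the same pattern with its own p′.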
Non-radial Symmetry • Use Generalized Gaussian vectors to model non-radially symmetric dependence:
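A sketch of the construction (notation assumed): replace the Gaussian vector in the scale mixture with a generalized Gaussian vector sharing one scale ξ:
$$ p(\mathbf{x}) \;=\; \int_0^\infty \left(\frac{c_\rho}{\xi^{1/\rho}}\right)^{\!d} \exp\!\left(-\frac{1}{\xi} \sum_{i=1}^{d} |x_i|^{\rho}\right) p(\xi)\, d\xi $$
The shared ξ keeps the coordinates dependent, while the density is a function of Σᵢ|xᵢ|^ρ rather than ‖𝐱‖², hence not radially symmetric unless ρ = 2. (c_ρ is a hypothetical normalizing constant.)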
Generalized Hyperbolic • The Generalized Hyperbolic density (Barndorff-Nielsen, 1982) is a GSM: • For a Generalized Gaussian scale mixture: • The posterior is Generalized Inverse Gaussian:
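Reconstructed from standard Generalized Hyperbolic facts (the slide's formulas are lost): with the GIG mixing density from Examples 3,
$$ p(x) \;=\; \int_0^\infty N(x;\,0,\xi)\; \mathrm{GIG}(\xi;\,\lambda,\delta,\gamma)\, d\xi $$
and by conjugacy the scale posterior is again GIG:
$$ p(\xi \mid x) \;=\; \mathrm{GIG}\!\left(\xi;\; \lambda-\tfrac{1}{2},\; \sqrt{\delta^2+x^2},\; \gamma\right) $$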
Hypergeneralized Hyperbolic • This yields the “Hypergeneralized Hyperbolic density” • The posterior moment for EM is given by:
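The EM moment follows from the standard GIG expectation; a reconstruction for the Gaussian case (the generalized Gaussian variant changes the posterior parameters but keeps the Bessel-ratio form). If ξ | x ~ GIG(λ̄, δ̄, γ) with λ̄ = λ − 1/2 and δ̄ = √(δ² + x²) from the previous slide, then
$$ \mathbb{E}\big[\xi^{-1} \mid x\big] \;=\; \frac{\gamma}{\bar{\delta}}\, \frac{K_{\bar{\lambda}-1}(\bar{\delta}\gamma)}{K_{\bar{\lambda}}(\bar{\delta}\gamma)} $$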
Generalized Gauss. Scale Mixtures • More generally, evaluating a multivariate GSM at x^{p/2} (componentwise): • Integrating this over R^d we get: • Thus, given a multivariate GSM, we can formulate a multivariate GGSM:
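My reading of the lost construction, offered as a sketch: let g be a radially symmetric multivariate GSM and substitute componentwise powers,
$$ f(\mathbf{x}) \;=\; c\; g\!\left(|x_1|^{p/2},\, \ldots,\, |x_d|^{p/2}\right) $$
Since g depends on its argument only through the squared norm, f depends on Σᵢ|xᵢ|^p, i.e. it is a multivariate generalized Gaussian scale mixture; the constant c is fixed by integrating over R^d.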
Skew Dependence • Skew is modeled with “location-scale mixtures”:
Skew Models • For a multivariate GSM: • Now, for any vector β, we have: • This can be written in the form: • This is equivalent to the following model:
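A reconstruction of the normal variance-mean mixture named in the outline (β and μ are the conventional symbols for the lost ones):
$$ \mathbf{x} \;=\; \boldsymbol{\mu} + \xi\boldsymbol{\beta} + \xi^{1/2}\,\Sigma^{1/2}\mathbf{z}, \quad \mathbf{z} \sim N(\mathbf{0}, I) \qquad\Longleftrightarrow\qquad p(\mathbf{x}) \;=\; \int_0^\infty N\!\big(\mathbf{x};\, \boldsymbol{\mu}+\xi\boldsymbol{\beta},\, \xi\Sigma\big)\, p(\xi)\, d\xi $$
Expanding the quadratic form shows the density factors into a symmetric GSM times the skew factor exp(βᵀΣ⁻¹(x − μ)), which is presumably the "written in the form" step on the slide.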
[Figure: EEG independent component categories — brain sources, ocular sources, scalp muscle sources, external EM sources, heartbeat]
Variance Dependency • Variance dependence can be estimated directly using 4th-order cross moments • Find the covariance of source power: • Finds components whose activations are "active" at the same times, i.e. co-modulated
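The power-covariance statistic as I read it (the slide's exact normalization is lost):
$$ C_{ij} \;=\; \operatorname{cov}\big(s_i^2,\, s_j^2\big) \;=\; \mathbb{E}\big[s_i^2 s_j^2\big] - \mathbb{E}\big[s_i^2\big]\,\mathbb{E}\big[s_j^2\big] $$
Large C_{ij} flags source pairs whose power is co-modulated even when the sources themselves are uncorrelated.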
Mutual Information / Power Covariance • Most of the dependence measured by mutual information (summed over 50 lags) is captured by the covariance of power • Some pairs of sources are more sensitive to variance dependence
• However, the product density is approximately "radially symmetric" • Radially symmetric non-Gaussian densities are dependent • Marginal histograms are "sparse"
Conclusion • We described a general framework for modeling dependent sources • Model parameters are estimated using the EM algorithm • The models include variance dependence, non-radially symmetric dependence, and skew dependence • Applied to the analysis of EEG sources