Independent Factor Analysis
H. Attias, University of California
1. Statistical Modeling and Blind Source Separation
• BSS (blind source separation) problem
• L' sensors, L source signals
• Source signals: mutually independent
• Sources are not observable and unknown
• Mixing process (linear) and noise: unknown
• Ordinary factor analysis
• Cannot perform BSS
• Gaussian model for p(x_j): 2nd-order statistics only -> rotation-invariant in factor space (see the note below)
• Attacks: projection pursuit, generalized additive models
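Why the Gaussian factor model is rotation-invariant can be stated in one line; the notation here ($H$ the loading/mixing matrix, $\Lambda$ the noise covariance) follows the model introduced in Section 2:

```latex
p(y) = \mathcal{N}\!\left(y;\, 0,\; HH^{\top} + \Lambda\right),
\qquad
(HR)(HR)^{\top} = H R R^{\top} H^{\top} = H H^{\top}
\quad \text{for any orthogonal } R .
```

The likelihood depends on $H$ only through $HH^{\top}$, so $H$ and $HR$ fit the data equally well; second-order statistics alone cannot pin down the factor rotation, which is why ordinary factor analysis cannot separate sources.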
ICA
• Mixing is square (L' = L), invertible, instantaneous, and noiseless
• Non-Gaussian p(x_j): not rotation-invariant; the maximum-likelihood mixing matrix is unique
• p(x_j): restricted
• Gradient-ascent maximization methods (one common rule is sketched below)
IFA
• p(x_j): non-Gaussian
• Generative model: independent sources
• EM method
• 2 steps
• Learning the IF model: mixing matrix, noise covariance, source densities
• Source reconstruction
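The slides do not say which gradient rule is meant; one widely used instance (the natural-gradient infomax update, not necessarily the variant Attias compares against) is:

```latex
\Delta W \;\propto\; \left(I - E\!\left[\varphi(u)\, u^{\top}\right]\right) W,
\qquad u = W y, \quad \varphi(u) = \tanh(u)
\;\; \text{(a choice suited to super-Gaussian sources)} .
```

Rules of this kind rely on the square, noiseless mixing assumed above; IFA drops both assumptions, at the price of the EM machinery of Section 3.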
2. Independent Factor Generative Model
• Sensor data: $y = Hx + u$, with an $L' \times L$ mixing matrix $H$
• Noise: Gaussian, $u \sim \mathcal{N}(0, \Lambda)$ with covariance matrix $\Lambda$
• IF parameters: $W = (H, \Lambda, \{w_{i,q_i}, \mu_{i,q_i}, \nu_{i,q_i}\})$
• Model sensor density: $p(y \mid W) = \int p(x)\, \mathcal{N}(y;\, Hx, \Lambda)\, dx$, itself a mixture of Gaussians
Source Model: Factorial Mixture of Gaussians
• p(x_i): needs to be both general and tractable
• MOG (mixture-of-Gaussians) model
• The discrete labels q_i act as hidden states (the densities are written out below)
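In the paper's notation ($w$, $\mu$, $\nu$ are the mixture weights, means, and variances; $q = (q_1, \ldots, q_L)$ is the collective hidden state):

```latex
p(x_i) = \sum_{q_i} w_{i,q_i}\, \mathcal{N}\!\left(x_i;\, \mu_{i,q_i}, \nu_{i,q_i}\right),
\qquad
p(x) = \prod_{i=1}^{L} p(x_i) = \sum_{q} w_q\, \mathcal{N}\!\left(x;\, \mu_q, V_q\right),
\quad \text{where}
\quad
w_q = \prod_i w_{i,q_i},
\quad
\mu_q = \left(\mu_{1,q_1}, \ldots, \mu_{L,q_L}\right),
\quad
V_q = \mathrm{diag}\!\left(\nu_{1,q_1}, \ldots, \nu_{L,q_L}\right).
```

The joint source density is thus a MOG over the collective states $q$, but one whose parameters are tied across sources.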
Strongly constrained
• Modifying the mean and variance of a single source state q_i shifts a whole column of collective states q => "factorial MOG"
Sensor Model
Generation of sensor signal y
• (i) Pick a unit q_i for each source i with probability $w_{i,q_i}$
• (ii) Draw each source signal $x_i$ from the corresponding Gaussian $\mathcal{N}(\mu_{i,q_i}, \nu_{i,q_i})$; then mix and add noise, $y = Hx + u$ (a sampling sketch follows)
• Top-down first-order Markov chain: states -> sources -> sensors
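A minimal sampling sketch of this two-step generation, in numpy. All of the numbers (2 sources, 3 states per source, 2 sensors, the parameter values) are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
L, n_states, L_prime, T = 2, 3, 2, 1000

w = np.full((L, n_states), 1.0 / n_states)      # mixing proportions w_{i,q_i}
mu = rng.normal(size=(L, n_states))             # state means mu_{i,q_i}
nu = rng.uniform(0.5, 1.5, size=(L, n_states))  # state variances nu_{i,q_i}
H = rng.normal(size=(L_prime, L))               # mixing matrix
Lam = 0.1 * np.eye(L_prime)                     # noise covariance Lambda

# (i) pick a hidden state q_i for each source, independently at each time point
q = np.stack([rng.choice(n_states, size=T, p=w[i]) for i in range(L)])

# (ii) draw each source from its selected Gaussian, then mix and add noise
rows = np.arange(L)[:, None]
x = mu[rows, q] + np.sqrt(nu[rows, q]) * rng.normal(size=(L, T))
u = rng.multivariate_normal(np.zeros(L_prime), Lam, size=T).T
y = H @ x + u  # sensor signals, one column per time point
```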
Co-adaptive MOG
• Rotation and scaling of a whole line of states
3. Learning the IF Model
• Error function and maximum likelihood
• Kullback-Leibler distance (spelled out below)
• Maximizing E <=> maximizing the likelihood of the data
• Relation to the mean-square point-by-point distance
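Writing $p_0(y)$ for the observed sensor density and $p(y \mid W)$ for the model density, the KL distance splits into an entropy term that does not depend on $W$ and the expected log-likelihood (a standard identity; the sign convention for E on the slide is assumed to match):

```latex
\mathrm{KL}\!\left[p_0 \,\middle\|\, p(\cdot \mid W)\right]
= \int p_0(y) \log \frac{p_0(y)}{p(y \mid W)}\, dy
= \underbrace{-\mathcal{H}[p_0]}_{\text{const in } W}
\;-\; \int p_0(y) \log p(y \mid W)\, dy ,
```

so minimizing the KL distance over $W$ is the same as maximizing the expected log-likelihood of the data.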
EM Algorithm
• (E) step: calculate the expected value of the complete-data likelihood
• (M) step: minimize the resulting expression with respect to the IF parameters W (a toy illustration of the E/M structure follows)
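The actual IFA updates (for H, Lambda, and the source MOG parameters) are derived in the paper; purely to illustrate the E/M skeleton, here is EM for a one-dimensional mixture of Gaussians, a much simpler model, on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1.0, 300), rng.normal(3, 0.5, 200)])

K = 2
w = np.full(K, 1.0 / K)   # mixing proportions
mu = rng.normal(size=K)   # component means
nu = np.ones(K)           # component variances

for _ in range(50):
    # E step: posterior responsibility of each component for each point
    log_r = (np.log(w) - 0.5 * np.log(2 * np.pi * nu)
             - 0.5 * (data[:, None] - mu) ** 2 / nu)
    r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)

    # M step: closed-form optimum of the expected complete-data likelihood
    Nk = r.sum(axis=0)
    w = Nk / len(data)
    mu = (r * data[:, None]).sum(axis=0) / Nk
    nu = (r * (data[:, None] - mu) ** 2).sum(axis=0) / Nk

print(w, mu, nu)  # should roughly recover the generating parameters
```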
4. Recovering the Sources
• If the mixing is noise-free and invertible, the sources are recovered exactly by inverting it: $\hat{x} = H^{-1} y$
• In general, 2 ways
• LMS estimator, MAP estimator
• Both are non-linear functions of the data
• Each satisfies a different optimality criterion
LMS Estimator
• Minimizes the mean-square reconstruction error; the minimizer is the posterior mean (see below)
MAP Estimator
• Maximizes the source posterior p(x | y)
• Simple way: iterative gradient ascent on the log-posterior (see below)
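Written out (standard estimation theory, consistent with the definitions on the slide; $\eta$ is a step size):

```latex
\hat{x}^{\mathrm{LMS}}(y)
= \arg\min_{\hat{x}} \; E\!\left[\|x - \hat{x}\|^2 \,\middle|\, y\right]
= E[x \mid y],
\qquad
\hat{x}^{\mathrm{MAP}}(y) = \arg\max_{x} \; p(x \mid y),
\qquad
x \;\leftarrow\; x + \eta\, \nabla_x \left[\log p(y \mid x) + \log p(x)\right]
\;\; \text{(gradient ascent to the MAP estimate)} .
```

Both estimators depend non-linearly on $y$ because the MOG source prior $p(x)$ is non-Gaussian.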
5. IFA: Simulation Results
• 5-second-long speech, music, and synthesized signals
6. IFA with Many Sources: Factorized Variational Approximation
• EM becomes intractable as the number of sources in the IF model increases (the number of collective states grows exponentially with the number of sources)
• The source of the intractability is the choice of p' as the exact posterior
• Variational approach: feedforward probabilistic models
Factorized Posterior
• In the IF model, sources conditioned on a data vector are correlated: the posterior source covariance is non-diagonal
• In the factorized variational approximation, the sources are independent even when conditioned on a data vector (see below)
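In symbols, with $p'$ the approximating posterior (a factorization of this general form; the exact parameterization used in the paper carries additional variational parameters):

```latex
p(x, q \mid y) \;\propto\; p(y \mid x)\, p(x \mid q)\, p(q)
\quad \text{(exact: couples all sources through } H\text{)},
\qquad
p'(x, q \mid y) = \prod_{i=1}^{L} p'(x_i, q_i \mid y).
```

The factorized form is what makes the E step tractable: posterior expectations decompose source by source instead of requiring a sum over all collective states q.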
Mean-Field Equations
• Learning rules for the variational parameters are derived similarly, by fixing W = W' and solving the stationarity conditions (sketched below)
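In generic form (the concrete IFA fixed-point formulas are in the paper; $F$ and $\lambda$ are generic notation assumed here): with $F(\lambda)$ the variational error, i.e. the KL distance from the factorized posterior $p'_\lambda$ to the exact posterior, the mean-field equations are the stationarity conditions

```latex
\frac{\partial F}{\partial \lambda_j} = 0
\quad \text{for each variational parameter } \lambda_j ,
```

a coupled set of fixed-point equations solved by iterating them to convergence for each data vector.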