
Machine Learning

Explore the concepts and techniques of Blind Source Separation using Independent Component Analysis (ICA). Learn how to recover original source signals from mixed observations, and discover applications in image processing, biomedical signal processing, speech and audio processing, telecommunication, bioinformatics, and financial data analysis.


Presentation Transcript


  1. Machine Learning • Supervised Learning • Classification and Regression • K-Nearest Neighbor Classification • Fisher’s Criteria & Linear Discriminant Analysis • Perceptron: Linearly Separable • Multilayer Perceptron & EBP & Deep Learning, RBF Network • Support Vector Machine • Ensemble Learning: Voting, Boosting (AdaBoost) • Unsupervised Learning • Dimensionality Reduction: Principal Component Analysis • Independent Component Analysis • Clustering: K-means • Semi-supervised Learning & Reinforcement Learning

  2. Motivation • Cocktail Party Problem

  3. [Figure: observed mixtures expressed as observations = mixing matrix × sources]

  4. Problem Statement • Unknown mutually independent source signals s with zero means • The model for sensor signals: x = As • Problem: to recover the original signals except for a permutation of indices and scales • Estimation of an unmixing matrix W such that u = Wx
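A minimal numerical sketch of this model (the mixing matrix and sizes below are hypothetical): if A were known, W = A⁻¹ would recover the sources exactly; ICA must find such a W blindly, from x alone.

```python
import numpy as np

# Hypothetical illustration of the BSS model x = A s.
rng = np.random.default_rng(0)
n_samples = 5000

# Two zero-mean, non-Gaussian (uniform) independent sources.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n_samples))

# Unknown square mixing matrix A (illustrative values).
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
x = A @ s                    # sensor observations

# A perfect unmixing matrix W recovers the sources: u = W x = s.
W = np.linalg.inv(A)
u = W @ x
print(np.allclose(u, s))     # True
```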

  5. Blind Source Separation by ICA • Unknown sources and the mixing environment [Figure: sources → unknown mixing environment → initial separating system]

  6. Assumptions and limitations on ICA • Basic assumptions: • Individual source signals are statistically independent of the other source signals. • # of sources ≤ # of observations • x = As, s = Wx • Limitations: • Scale and sign indeterminacy • We cannot determine the variance of the sources: x = As = (A/α)(αs) = A′s′ • Permutation indeterminacy • We cannot determine the order of the sources • Sources should be non-Gaussian; at most one source can be Gaussian.
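Both indeterminacies can be checked numerically; the matrices and the scale factor below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)
s = rng.uniform(-1, 1, size=(2, 1000))
A = np.array([[2.0, 1.0], [1.0, 3.0]])   # hypothetical mixing matrix
alpha = 5.0

# Scale indeterminacy: x = A s = (A / alpha) (alpha s).
x1 = A @ s
x2 = (A / alpha) @ (alpha * s)
print(np.allclose(x1, x2))               # True: the scaled pair is indistinguishable

# Permutation indeterminacy: permuting the sources and the columns of A
# in the same way produces the same observations.
P = np.array([[0, 1], [1, 0]])           # permutation matrix
x3 = (A @ P.T) @ (P @ s)
print(np.allclose(x1, x3))               # True
```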

  7. InfoMax Algorithm • What is Entropy? • Definition: the average amount of information of the source • Ex) Two-symbol sources [Figure: binary entropy H(p) vs. p, maximized at p = 1/2]

  8. Entropy • Differential Entropy • Bounded Case: Uniform Distribution • Unbounded Case: Normal Distribution (Gaussian) • Joint Entropy • Mutual Information
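The entropy facts on these two slides can be verified numerically (the values below are standard textbook results, not taken from the slides):

```python
import numpy as np

# Entropy of a two-symbol source, H(p) = -p log2 p - (1-p) log2(1-p),
# maximized at p = 1/2.
def binary_entropy(p):
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

print(binary_entropy(0.5))   # 1.0 bit

# Differential entropies (in nats):
#   uniform on [a, b]: log(b - a)
#   Gaussian with variance sigma^2: 0.5 * log(2 * pi * e * sigma^2)
h_uniform = np.log(2.0)                          # uniform on [0, 2], variance 1/3
h_gauss = 0.5 * np.log(2 * np.pi * np.e / 3.0)   # Gaussian with the same variance
print(h_gauss > h_uniform)   # True: the Gaussian maximizes entropy for fixed variance
```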

  9. Objective Function of ICA • Minimizing mutual information • Maximizing entropy H(y) [Diagram: s → A → x → W → u]

  10. Probability Density Function

  11. Maximizing Output Entropy: InfoMax • Joint entropy at the outputs • Pdf of the outputs

  12. One Input One Output • Stochastic gradient ascent learning rule

  13. N→N Network • Stochastic gradient ascent learning rule • Jacobian of the transform

  14. Learning rule, written in terms of the score function φ(u)
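A single step of the InfoMax stochastic gradient rule can be sketched as follows (all numbers are hypothetical; for the logistic nonlinearity y = 1/(1 + exp(−u)), the entropy-ascent update for an N→N network is ΔW = η[(Wᵀ)⁻¹ + (1 − 2y)xᵀ], where the (1 − 2y) term carries the score-function contribution):

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 0.01                                       # learning rate (assumed)
W = np.eye(2) + 0.1 * rng.normal(size=(2, 2))    # current unmixing estimate
x = rng.normal(size=(2, 1))                      # one observation vector

u = W @ x                                        # network activation
y = 1.0 / (1.0 + np.exp(-u))                     # logistic output
dW = eta * (np.linalg.inv(W.T) + (1.0 - 2.0 * y) @ x.T)
W = W + dW                                       # one stochastic ascent step
```

In practice this update is applied sample by sample over many passes through the data.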

  15. Natural Gradient • Natural Gradient (or Relative Gradient) (Amari, Neural Computation, 1998 [3]) • How can we decide the score function? → Extended ICA (T.-W. Lee et al., 1999 [4])
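The natural-gradient form ΔW = η(I − φ(u)uᵀ)W avoids the matrix inversion of the plain gradient. A minimal batch sketch, assuming two unit-variance Laplacian (super-Gaussian) sources, a hypothetical mixing matrix, and φ(u) = tanh(u) as the score function:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 2, 10000
s = rng.laplace(size=(n, T))
s /= s.std(axis=1, keepdims=True)     # zero-mean, unit-variance sources
A = np.array([[1.0, 0.5],
              [0.7, 1.0]])            # hypothetical mixing matrix
x = A @ s

W, eta = np.eye(n), 0.05
for _ in range(1000):
    u = W @ x
    phi = np.tanh(u)                  # score function for super-Gaussian sources
    W += eta * ((np.eye(n) - (phi @ u.T) / T) @ W)

# W @ A should approach a scaled permutation matrix (scale/order ambiguity).
P = W @ A
print(np.round(P / np.abs(P).max(axis=1, keepdims=True), 2))
```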

  16. Flexible ICA: Generalized Gaussian density model (S. Choi et al., 2000 [6])

  17. III. Comparison with PCA and Unifying Framework • PCA • Calculate the covariance matrix • Using SVD, find M eigenvalues and eigenvectors. • Choose the L eigenvectors corresponding to the L largest eigenvalues. [Figure: L largest eigenvalues vs. M−L small eigenvalues]
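The PCA recipe on this slide can be sketched directly (the data and the choice L = 2 below are illustrative):

```python
import numpy as np

# PCA: covariance -> eigendecomposition -> keep the L largest components.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ np.array([[3.0, 0.0, 0.0],
                                          [0.0, 1.0, 0.0],
                                          [0.0, 0.0, 0.1]])
X -= X.mean(axis=0)                    # zero-mean the data

C = (X.T @ X) / len(X)                 # covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)   # ascending eigenvalues
order = np.argsort(eigvals)[::-1]      # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

L = 2                                  # keep the L largest components
X_reduced = X @ eigvecs[:, :L]
print(X_reduced.shape)                 # (500, 2)
```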

  18. Comparison of PCA and ICA [Figure: PCA basis vs. ICA basis] • Independent = uncorrelated? • Only for Gaussian data: dependence = correlation • Generally, dependence ≠ correlation

  19. Whitened signal and independent signal [Figure: scatter plots of the observations, the whitened signal, and the independent signal]
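Whitening makes the observations uncorrelated (identity covariance) but, for non-Gaussian data, not independent; ICA must still find a rotation. A sketch with hypothetical uniform sources and mixing:

```python
import numpy as np

rng = np.random.default_rng(0)
s = rng.uniform(-1, 1, size=(2, 20000))        # independent, non-Gaussian sources
x = np.array([[1.0, 0.8], [0.2, 1.0]]) @ s     # mixed observations

C = np.cov(x)
d, E = np.linalg.eigh(C)
V = E @ np.diag(d ** -0.5) @ E.T               # whitening matrix C^(-1/2)
z = V @ x

print(np.allclose(np.cov(z), np.eye(2)))       # True: whitened, i.e. uncorrelated
# But z is still, in general, a rotation away from the independent sources.
```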

  20. Unifying Information-Theoretic Framework (1) • Maximizing H(y): InfoMax • Minimizing mutual information • Negentropy maximization

  21. Unifying Information-Theoretic Framework (2) • Maximum likelihood estimation

  22. Unifying Information-Theoretic Framework (3) • Higher-order moments and cumulants • Nonlinear PCA • Bussgang algorithms (Te-Won Lee et al., Computers and Mathematics with Applications, 2000 [1])

  23. Applications of ICA • Image processing • e.g. denoising, feature extraction, etc. • Biomedical data analysis • e.g. analysis of EEG, MEG, ECG, etc. • Speech and audio processing • e.g. speech separation, feature extraction • Telecommunication • e.g. MIMO systems, etc. • Bioinformatics • e.g. micro-array data analysis, etc. • Financial data analysis

  24. Image Processing • Independent Components of Natural Scenes (Bell and Sejnowski, Vision Research, 1997 [2])

  25. Independent Components are Edge Filters

  26. Biomedical Signal Processing • Blind Source Separation of EEG Signals

  27. Convolved Mixtures: ANC vs. BSS • ANC [Figure: signal source x(n) and noise source r1(n) pass through an unknown mixing system; adaptive filter W(z) produces output u1(n)] • BSS [Figure: sources 1 and 2 pass through an unknown mixing system to x1(n), x2(n); cross-filters W12(z), W21(z) produce outputs u1(n), u2(n)]

  28. Adaptive Noise Canceling • System output • Minimize the output power • The LMS algorithm
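A minimal LMS noise-canceling sketch under assumed signals: the primary input is a sinusoidal desired signal plus reference noise filtered by a hypothetical 3-tap noise path h; minimizing the output power drives the adaptive taps w toward h, canceling the noise while leaving the signal.

```python
import numpy as np

rng = np.random.default_rng(0)
T, mu = 50000, 0.01
s = np.sin(0.05 * np.arange(T))             # desired signal
r = rng.normal(size=T)                      # reference noise source
h = np.array([0.8, -0.4, 0.2])              # unknown noise path (assumed)
noise = np.convolve(r, h)[:T]
x = s + noise                               # primary sensor input

w = np.zeros(3)
out = np.empty(T)
for n in range(T):
    rvec = np.array([r[n - k] if n - k >= 0 else 0.0 for k in range(3)])
    u = x[n] - w @ rvec                     # canceller output
    w += mu * u * rvec                      # LMS update: minimize E[u^2]
    out[n] = u

print(np.round(w, 2))                       # approaches h
```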

  29. ICA-Based Approach to ANC [8] • Maximizing entropy • Set dummy output • Learning rules of adaptive filter coefficients in ANC (Park et al., 2002) [Figure: network with inputs x, r and outputs y1, y2 with adaptive weights w(k)]

  30. [Figure: waveforms of the known noise source, the input signal with noise, and the output signal after processing]

  31. ANC + Blind Source Separation • Hyung-Min Park, Sang-Hoon Oh and Soo-Young Lee, 2001

  32. [Figure: waveforms of Mic. 1, Mic. 2, Output 1, and Output 2]

  33. Time Domain Approach to BSS • Convolved mixtures • Feedback architecture • Learning rules
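The convolved-mixture model, x_i(n) = Σ_j (a_ij ∗ s_j)(n), can be sketched with hypothetical short mixing filters:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
s = rng.laplace(size=(2, T))                    # two source signals

# 2x2 matrix of short mixing filters (illustrative impulse responses).
A = [[np.array([1.0, 0.3]), np.array([0.5, 0.2])],
     [np.array([0.4, 0.1]), np.array([1.0, -0.2])]]

x = np.zeros((2, T))
for i in range(2):
    for j in range(2):
        x[i] += np.convolve(s[j], A[i][j])[:T]  # sum of filtered sources
print(x.shape)                                  # (2, 1000)
```

A separating system must then apply filters, not just a matrix, which motivates the feedback architecture on this slide.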

  34. Frequency Domain Approach to BSS (1) • In the frequency domain • Complex score function • Learning rule
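The frequency-domain view rests on the fact that circular convolution in time corresponds to per-bin multiplication of DFTs, so the mixture becomes X1(f) = A11(f)S1(f) + A12(f)S2(f) at each frequency. A sketch with hypothetical filters:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
s = rng.laplace(size=(2, N))
a11, a12 = np.array([1.0, 0.3]), np.array([0.5, 0.2])  # illustrative filters

# Per-bin multiplication of DFTs...
X1 = (np.fft.fft(a11, N) * np.fft.fft(s[0]) +
      np.fft.fft(a12, N) * np.fft.fft(s[1]))
x1 = np.real(np.fft.ifft(X1))

# ...equals circular convolution in the time domain.
x1_direct = (a11[0] * s[0] + a11[1] * np.roll(s[0], 1) +
             a12[0] * s[1] + a12[1] * np.roll(s[1], 1))
print(np.allclose(x1, x1_direct))   # True
```

This is why an instantaneous ICA can be run independently in each frequency bin, at the cost of the permutation and data-shortage issues discussed on the next slide.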

  35. Frequency Domain Approach to BSS (2) • Performance limitation • Contradiction between covering long reverberation and having sufficient learning data • Long reverberation → long frame size • Long frames → small number of frames → insufficient learning data • Mixtures combined from different time ranges of ICs • Delayed mixtures

  36. Filter Bank Approach to BSS[9] • 2x2 network for the oversampled filter bank approach to BSS

  37. Nonuniform Filter Bank Approach to BSS • 2x2 network for the nonuniform oversampled filter bank approach to BSS

  38. V. Independent Vector Analysis • Independent Component Analysis • Independent Vector Analysis [11] [Figure: matrix factorization models of ICA and IVA]

  39. Model Definition of IVA [Figure: per-frequency mixtures x = As; components are dependent across frequencies within a source vector, independent across sources]

  40. VI. Summary • ICA is motivated by the cocktail party problem. • It is also a method for data-driven linear decomposition. • InfoMax algorithm • Unifying information-theoretic framework • Successful applications in many areas • IVA is motivated by BSS of convolutive mixtures in the frequency domain.

  41. References • Te-Won Lee, M. Girolami, A. J. Bell, and T. J. Sejnowski, “A unifying information-theoretic framework for independent component analysis,” Computers and Mathematics with Applications, Vol. 39, pp. 1-21, 2000. • A. J. Bell and T. J. Sejnowski, “The “independent components” of natural scenes are edge filters,” Vision Research, Vol. 37, pp. 3327-3338, 1997. • S. Amari, “Natural gradient works efficiently in learning,” Neural Computation, Vol. 10, pp. 251-276, Feb. 1998. • T.-W. Lee, M. Girolami, and T. J. Sejnowski, “Independent component analysis using an extended infomax algorithm for mixed sub-Gaussian and super-Gaussian sources,” Neural Computation, Vol. 11, pp. 417-441, 1999. • M. Girolami, A. Cichocki, and S.-I. Amari, “A common neural-network model for unsupervised exploratory data analysis and independent component analysis,” IEEE Trans. Neural Networks, Vol. 9, No. 6, Nov. 1998. • S. Choi, A. Cichocki, and S. Amari, “Flexible independent component analysis,” J. VLSI Signal Processing Systems for Signal, Image, and Video Technology, Vol. 26, pp. 25-38, Aug. 2000.

  42. References • S.-H. Oh, A. Cichocki, S. Choi, S.-I. Amari, and S.-Y. Lee, “Comparison of ICA/BSS algorithms in noisy environment,” Proc. ICONIP, Vol. 2, pp. 1192-1197, Nov. 2000. • Hyung-Min Park, Sang-Hoon Oh, and Soo-Young Lee, “Adaptive noise cancelling based on independent component analysis,” IEE Electronics Letters, Vol. 38, No. 15, pp. 832-833, July 2002. • Hyung-Min Park, Chandra Shekhar Dhir, Sang-Hoon Oh, and Soo-Young Lee, “A filter bank approach to independent component analysis for convolved mixtures,” Neurocomputing, Vol. 69, pp. 2065-2077, Oct. 2006. • Jong-Hwan Lee, Sang-Hoon Oh, and Soo-Young Lee, “Binaural semi-blind dereverberation of noisy convoluted speech signals,” accepted for publication in Neurocomputing. • T. Kim, T. Eltoft, and T.-W. Lee, “Independent vector analysis: an extension of ICA to multivariate components,” Proc. ICA2006, pp. 165-172, 2006. • J.-H. Lee, et al., “Independent vector analysis (IVA): multivariate approach for fMRI group study,” NeuroImage, Vol. 40, pp. 86-109, 2008.
