
EEG Signal Processing Techniques: PCA and ICA Algorithms

Learn about preprocessing methods like visual inspection, filtering, PCA, and ICA for EEG data analysis. Understand matrix representation, FastICA, and mathematical formulations. Explore the concepts of non-Gaussianity and entropy in signal processing.


Presentation Transcript


  1. Brain Electrophysiological Signal Processing: Preprocessing. Kaushik Majumdar, Indian Statistical Institute, Bangalore Center, kmajumdar@isibang.ac.in. ME (Signal Processing), IISc: Neural Signal Processing, Spring 2014.

  2. Heart-Rate and Muscle Artifacts in EEG (Benbadis and Rielo, 2008: http://emedicine.medscape.com/article/1140247-overview).

  3. Preprocessing
  • Visual inspection
  • Filtering (a band-pass sketch follows below)
  • Principal Component Analysis (PCA)
  • Independent Component Analysis (ICA)
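As a concrete illustration of the filtering step, here is a minimal band-pass sketch in Python with SciPy. The sampling rate, band edges, filter order, and the random stand-in data are all assumptions for illustration, not values from the slides.

```python
# A minimal band-pass filtering sketch for multi-channel EEG.
# Sampling rate, band edges, and data are hypothetical.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256.0                       # sampling rate in Hz (assumed)
low, high = 0.5, 40.0            # typical EEG band of interest (assumed)

b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 1024))    # stand-in for real EEG data (channels x samples)

filtered = filtfilt(b, a, eeg, axis=1)  # zero-phase band-pass, applied per channel
```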

  4. Matrix Representation of Multi-Channel EEG
  • M is an m x n matrix whose m rows represent the m EEG channels and whose n columns represent the n time points.
  • During EEG processing we often seek a matrix W such that WM is the processed signal.
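This matrix view is easy to make concrete in code. Below is a minimal NumPy sketch; the shapes are assumed, and the particular W (common average re-referencing) is a hypothetical choice for illustration, since the slide leaves W generic.

```python
# A minimal sketch of the matrix view: M holds one EEG channel per row and
# one time point per column; a processing step is the left-multiplication WM.
import numpy as np

m, n = 8, 1000                          # 8 channels, 1000 time points (assumed)
rng = np.random.default_rng(1)
M = rng.standard_normal((m, n))         # stand-in for a real EEG data matrix

W = np.eye(m) - np.ones((m, m)) / m     # common average reference (hypothetical W)
processed = W @ M                       # WM: the processed signal, shape (m, n)

assert processed.shape == (m, n)
assert np.allclose(processed.sum(axis=0), 0)  # each column now sums to zero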

  5. EOG Identification by Principal Component Analysis (PCA) (Majumdar, under preparation, 2013).

  6. PCA Algorithm (cont.)

  7. PCA Algorithm (cont.): PCA as a rotation followed by stretching or contracting along the principal axes.
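The algorithm itself did not survive in the transcript, but the standard form implied by the rotation/stretching picture is short. A minimal sketch, assuming the usual covariance-eigendecomposition version of PCA: the eigenvector matrix rotates the centered data, and the eigenvalues stretch or contract each rotated axis.

```python
# A minimal PCA sketch on a channel-by-time EEG matrix (shapes assumed).
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((8, 1000))          # stand-in EEG matrix (channels x time)

Mc = M - M.mean(axis=1, keepdims=True)      # center each channel
C = (Mc @ Mc.T) / Mc.shape[1]               # channel covariance matrix, m x m

eigvals, eigvecs = np.linalg.eigh(C)        # eigenvalues ascending, orthonormal vectors
order = np.argsort(eigvals)[::-1]           # sort components by variance, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

components = eigvecs.T @ Mc                 # rotated (principal) components
```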

  8. Performance of PCA in EOG Removal (Wallstrom et al., Int. J. Psychophysiol., 53: 105-119, 2004).

  9. Independent Component Analysis (ICA)
  • In PCA the data components are assumed to be mutually orthogonal, which is too restrictive.
  [Figure: original data sets and their PCA components.]

  10. ICA (cont.)
  • PCA gives poor results when the covariance matrix has eigenvalues close to one another, since the principal directions are then poorly determined (see the sketch below).
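A small numerical illustration of this point, with made-up covariance matrices: a tiny perturbation swings the leading principal direction by tens of degrees when the eigenvalues are close, but barely at all when they are well separated.

```python
# Eigenvector instability under a small perturbation (toy numbers).
import numpy as np

C_close = np.array([[1.00, 0.00],
                    [0.00, 1.01]])          # eigenvalues 1.00 and 1.01 (close)
C_far   = np.array([[1.00, 0.00],
                    [0.00, 4.00]])          # eigenvalues 1.00 and 4.00 (well separated)

perturb = 0.02 * np.array([[0.0, 1.0],
                           [1.0, 0.0]])     # small symmetric perturbation

for C in (C_close, C_far):
    _, v0 = np.linalg.eigh(C)
    _, v1 = np.linalg.eigh(C + perturb)
    # angle between the leading eigenvectors before and after the perturbation
    angle = np.degrees(np.arccos(abs(v0[:, -1] @ v1[:, -1])))
    gap = np.diff(np.linalg.eigvalsh(C))[0]
    print(f"eigenvalue gap {gap:.2f}: leading direction moved {angle:.1f} degrees")
# gap 0.01: direction moves ~38 degrees; gap 3.00: it moves ~0.4 degrees
```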

  11. ICA as Blind Source Separation (BSS)
  Four musicians (sources S1 to S4) are playing in a room. From outside, only the music can be heard through four microphones (sensors 1 to 4); no one can be seen. How can the music heard outside be decomposed into its four sources?

  12. Mathematical Formulation
  x = As + n,
  where A is the mixing matrix, x is the sensor vector, s is the source vector, and n is noise, which is to be eliminated by filtering.

  13. Mathematical Formulation (cont.)
  Given x, find W such that s = Wx; ideally W is the inverse of the mixing matrix A. Any technique for estimating such a W is called an ICA technique, or more generally a BSS technique.
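A toy rendering of this formulation, with a made-up 2 x 2 mixing matrix A so the ideal unmixing W (the inverse of A) can be verified. In real ICA the whole point is that A is unknown and W must be estimated from x alone.

```python
# Toy demo of x = As + n and recovery with the ideal W = inv(A).
import numpy as np

rng = np.random.default_rng(3)
s = rng.uniform(-1, 1, size=(2, 5000))      # two independent, non-Gaussian sources
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                   # hypothetical mixing matrix
n = 0.01 * rng.standard_normal(s.shape)      # small sensor noise

x = A @ s + n                                # sensor signals
W = np.linalg.inv(A)                         # ideal unmixing matrix
s_hat = W @ x                                # recovered sources (up to noise)

print(np.allclose(s, s_hat, atol=0.1))       # True: recovery is close
```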

  14. ICA Algorithm: FastICA (Hyvarinen and Oja, Neural Networks, 13: 411-430, 2000)
  Whitening:
  • Normalization: make the mean zero.
  • Make the variance one, i.e., transform x so that E[xx^T] = I, where E is expectation, x is the vector of signals, and I is the identity matrix.

  15. FastICA (cont.)
  Let the covariance matrix have the eigendecomposition E[xx^T] = BDB^T, where B is the orthogonal matrix of eigenvectors and D the diagonal matrix of eigenvalues. The whitened signal z = BD^(-1/2)B^T x then satisfies E[zz^T] = I. Whitening complete.
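A minimal whitening sketch along these lines; the covariance used to generate the stand-in data is arbitrary.

```python
# Whitening via eigendecomposition of the sample covariance (toy data).
import numpy as np

rng = np.random.default_rng(4)
x = np.linalg.cholesky(np.array([[2.0, 0.8],
                                 [0.8, 1.0]])) @ rng.standard_normal((2, 10000))

x = x - x.mean(axis=1, keepdims=True)         # step 1: zero mean
C = (x @ x.T) / x.shape[1]                    # sample covariance E[xx^T]

d, B = np.linalg.eigh(C)                      # C = B diag(d) B^T
z = B @ np.diag(d ** -0.5) @ B.T @ x          # whitened signal

print(np.round((z @ z.T) / x.shape[1], 3))    # ~ identity matrix
```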

  16. Non-Gaussianity
  • ICA is appropriate only when the probability distribution of the data set is non-Gaussian.
  • A Gaussian distribution has the form p(x) = (1 / sqrt(2 pi sigma^2)) exp(-(x - mu)^2 / (2 sigma^2)).

  17. Entropy of Gaussian Variable
  • A Gaussian variable has the largest entropy among all random variables with equal variance (for a proof see Cover and Thomas, Elements of Information Theory). Here we give an intuitive argument.

  18. Entropy of a Random Variable X
  H(X) = -sum_i p(x_i) log p(x_i). A widely spread distribution carries more information (high entropy); a deterministic outcome carries less (zero) information.
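A quick numeric check of this picture on discrete distributions (the distributions are made up for illustration): the uniform distribution attains the maximum, log2(8) = 3 bits, while a deterministic outcome has zero entropy.

```python
# Entropy of a few discrete distributions (made-up examples).
import numpy as np
from scipy.stats import entropy

uniform = np.full(8, 1 / 8)                 # 8 equally likely outcomes
peaked = np.array([0.9, 0.05, 0.03, 0.02])  # one outcome dominates
certain = np.array([1.0, 0.0, 0.0])         # deterministic outcome

for name, p in [("uniform", uniform), ("peaked", peaked), ("certain", certain)]:
    print(f"{name:8s} H = {entropy(p, base=2):.3f} bits")
# uniform  H = 3.000 bits
# peaked   H = 0.618 bits
# certain  H = 0.000 bits
```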

  19. Gaussian Random Variable Has Highest Entropy: Intuitive Proof
  • By the Central Limit Theorem (CLT), the mean of a class of random variables (the class signified by uniform variance) follows a normal distribution as the number of members in the class tends to infinity.
  • Infinitely many observations hold the maximum amount of information.

  20. Intuitive Proof (cont.)
  • Therefore a random variable with a normal distribution has the highest information content, and hence the highest entropy.
  • If each variable in a class of random variables takes only a finite number of values, the one with the uniform distribution has the highest entropy.

  21. Non-Gaussianity as Negentropy
  J(y) = H(y_gauss) - H(y), where H is entropy, y_gauss is a Gaussian variable with the same covariance as y, and J is the negentropy. J is to be maximized; when J is maximal, y reduces to a single independent component. This can be shown by comparing the kurtosis of a single component with that of a sum of components including it (see Hyvarinen and Oja, 2000, p. 7).
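The kurtosis argument can be checked numerically on toy data: a single uniform source has excess kurtosis of about -1.2, while a normalized sum of eight such sources is much closer to Gaussian (kurtosis near zero).

```python
# Kurtosis of one component versus a sum of components (toy data).
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(5)
s = rng.uniform(-1, 1, size=(8, 100_000))    # 8 independent uniform sources

single = s[0]                                 # one component
mixture = s.sum(axis=0) / np.sqrt(8)          # normalized sum of all components

print(f"single component kurtosis: {kurtosis(single):+.3f}")    # ~ -1.2
print(f"sum of components kurtosis: {kurtosis(mixture):+.3f}")  # ~ -0.15 (near 0)
```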

  22. Steps of FastICA after Whitening
  1. Choose an initial (e.g., random) weight vector w.
  2. Update w <- E[x g(w^T x)] - E[g'(w^T x)] w.
  3. Normalize: w <- w / ||w||.
  4. If not converged, go back to step 2.
  Here g is in the form of either of the two nonlinearities g(u) = tanh(a u), 1 <= a <= 2, or g(u) = u exp(-u^2 / 2).
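A compact NumPy sketch of this one-unit iteration, assuming the data have already been centered and whitened and using the tanh nonlinearity with a = 1:

```python
# One-unit FastICA on centered, whitened data x (signals x samples).
import numpy as np

def fastica_one_unit(x, max_iter=200, tol=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(x.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        u = w @ x                                   # projections w^T x
        g = np.tanh(u)                              # nonlinearity g(u) = tanh(u)
        g_prime = 1.0 - g ** 2                      # its derivative
        w_new = (x * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:         # converged (up to sign)
            return w_new
        w = w_new
    return w

# Toy demo: whiten a mixture of two uniform sources, then extract one.
rng = np.random.default_rng(6)
s = rng.uniform(-1, 1, size=(2, 20_000))            # independent sources
x = np.array([[1.0, 0.5], [0.3, 1.0]]) @ s          # mixed signals
x -= x.mean(axis=1, keepdims=True)
d, B = np.linalg.eigh(np.cov(x))
x = B @ np.diag(d ** -0.5) @ B.T @ x                # whitened
comp = fastica_one_unit(x) @ x
print([round(abs(np.corrcoef(comp, si)[0, 1]), 3) for si in s])  # one value ~ 1.0
```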

  23. Exercise
  • ICA is implemented in EEGLAB: the runica function provides an Infomax implementation, and FastICA can be used as an alternative backend. Remove artifacts from sample EEG data using the ICA implementation in EEGLAB (a rough Python analogue follows below).
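For readers working in Python rather than MATLAB, a rough analogue of the exercise using the MNE package is sketched below. This assumes MNE (and scikit-learn, for its FastICA backend) is installed; the sample dataset is downloaded on first use, and the excluded component index is hypothetical, to be chosen by visual inspection just as in EEGLAB.

```python
# A rough Python analogue of the EEGLAB exercise, using MNE.
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif(
    mne.datasets.sample.data_path() / "MEG" / "sample" / "sample_audvis_raw.fif",
    preload=True)
raw.filter(1.0, 40.0)                     # high-pass helps ICA convergence

ica = ICA(n_components=20, method="fastica", random_state=0)
ica.fit(raw)

ica.exclude = [0]                         # hypothetical artifact component index
clean = ica.apply(raw.copy())             # reconstruct the data without it
```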

  24. Concept of Independence in PCA and ICA
  • In PCA, independence means orthogonality, i.e., the pairwise dot products are zero.
  • In ICA, independence means statistical independence. Let x and y be random variables, p(x) the probability distribution function of x, and p(x,y) the joint probability distribution function of (x,y). If p(x,y) = p(x).p(y) holds, we say x and y are statistically independent (a rough empirical check follows below).
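The factorization condition can be checked, roughly, on samples with a 2-D histogram. The data and the helper name below are made up for illustration, and the bin count is arbitrary.

```python
# Empirical check of p(x, y) = p(x) p(y) via 2-D histograms (toy data).
import numpy as np

rng = np.random.default_rng(7)
x = rng.standard_normal(200_000)
y_indep = rng.standard_normal(200_000)             # independent of x
y_dep = x + 0.1 * rng.standard_normal(200_000)     # strongly dependent on x

def max_factorization_gap(x, y, bins=10):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    joint /= joint.sum()                           # empirical p(x, y)
    px, py = joint.sum(axis=1), joint.sum(axis=0)  # marginals p(x), p(y)
    return np.abs(joint - np.outer(px, py)).max()

print(max_factorization_gap(x, y_indep))  # ~ 0: the joint factorizes
print(max_factorization_gap(x, y_dep))    # clearly > 0: it does not
```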

  25. Independence (cont.)
  • If nonzero vectors v1 and v2 are orthogonal, they are linearly independent. Suppose not; then a1v1 + a2v2 = 0 with some ai nonzero. Taking the dot product with v1 gives a1(v1.v1) + a2(v2.v1) = a1(v1.v1) = 0, so a1 = 0; similarly a2 = 0, a contradiction.
  • If v1 = c.v2 then both must have the same probability distribution, i.e., p(v1,v2) = p(v1) = p(v2). If v1 and v2 are linearly independent, p(v1,v2) = p(v1).p(v2) may or may not hold.
  • If p(v1,v2) = p(v1).p(v2) holds, then v1 and v2 are linearly independent.

  26. Conditions for ICA Applicability
  • The sources are statistically independent.
  • Propagation delays in the mixing medium are negligible. The sources are time-varying, and delays in the mixing medium may affect sources at different locations differently, thereby corrupting their temporal structure.
  • The number of sources equals the number of sensors.

  27. References
  • Benbadis and Rielo, EEG artifacts, eMedicine, 2008. Available online at http://emedicine.medscape.com/article/1140247-overview.
  • Hyvarinen and Oja, Independent component analysis: algorithms and applications, Neural Networks, vol. 13, pp. 411-430, 2000.
  • Majumdar, A Brief Survey of Quantitative EEG Analysis, Chapter 2 (under preparation).
