An AdaBoost-Based Weighting Method for Localizing Human Brain Magnetic Activity
R. Takashima, T. Takiguchi, Y. Ariki (Kobe University); T. Imada, J.-F. L. Lin, P. K. Kuhl (I-LABS, University of Washington); M. Kawakatsu, M. Kotani (Tokyo Denki University)
Poster 3-Q-26
Abstract
This paper shows that pattern classification based on machine learning is a powerful tool for analyzing human brain activity data obtained by magnetoencephalography (MEG). An AdaBoost algorithm can simultaneously estimate both the classification boundary and the weight of each MEG sensor. The estimated weight indicates how useful the corresponding sensor is for classifying the MEG response patterns.

Recording of MEG Responses to Vowels
• A 122-channel whole-scalp Neuromag MEG system was used.
• Subjects: four right-handed Japanese speakers.
• Stimuli: two speech sounds (Japanese vowels), /a/ and /o/. [Timing diagram: 1,300 ~ 1,500 ms; 200 ms; 200 ms]
• The subject's task was to press a reaction key with the index finger for the stimulus /a/ and another reaction key with the middle finger for the stimulus /o/.
• Mean reaction times: 495.1 ms for /a/ and 497.3 ms for /o/.
• Averaging: more than 80 epochs per condition.

Feature Extraction
The signal obtained by averaging over 80 MEG epochs was converted into a representation more amenable to subject-independent recognition. Let x_m(t) denote the observation at the m-th sensor at time t; a magnitude feature is computed from it, and the normalized MEG magnitude features over all sensors form the feature vector.

Analysis Results (subject-dependent model)
[Figure: MEG-sensor weight maps for the subject-dependent model at latencies -100 ~ 0 ms, 0 ~ 100 ms, and 50 ~ 150 ms.]

MEG-Sensor Weighting Based on AdaBoost
Training data: normalized feature vectors for /a/ and for /o/.
1. Initialize the training-data weights uniformly.
2. For n = 1, …, N:
 • For each dimension, build a decision stump (weak learner) and calculate its error using the training-data weights; the threshold that minimizes the error for each dimension is found by grid search.
 • The decision stump that yields the minimum error among all dimensions is defined as the n-th weak learner.
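The boosting loop described here can be sketched in a few lines of NumPy. This is a minimal illustration of standard AdaBoost with per-dimension decision stumps and grid-searched thresholds, not the authors' implementation; all names, array shapes, and the choice of candidate thresholds are hypothetical.

```python
import numpy as np

def train_adaboost_stumps(X, y, n_rounds=10):
    """Sketch of AdaBoost with decision stumps (hypothetical interface).

    X : (I, M) array of I feature vectors with M dimensions (e.g. sensors)
    y : (I,) labels in {-1, +1} (e.g. -1 for /a/, +1 for /o/)
    Returns a list of weak learners as (dim, threshold, polarity, alpha).
    """
    I, M = X.shape
    w = np.full(I, 1.0 / I)                 # initialize training-data weights
    learners = []
    for _ in range(n_rounds):
        best = None                          # (error, dim, threshold, polarity)
        for m in range(M):                   # grid search over each dimension
            for thr in np.unique(X[:, m]):   # candidate thresholds: observed values
                for pol in (1, -1):
                    pred = pol * np.where(X[:, m] > thr, 1, -1)
                    err = w[pred != y].sum() # weighted error of this stump
                    if best is None or err < best[0]:
                        best = (err, m, thr, pol)
        err, m, thr, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)        # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)        # weak-learner confidence
        pred = pol * np.where(X[:, m] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)       # raise weights of misclassified data
        w /= w.sum()
        learners.append((m, thr, pol, alpha))
    return learners

def predict(learners, X):
    """Sign of the confidence-weighted vote of all weak learners."""
    score = sum(a * p * np.where(X[:, m] > t, 1, -1)
                for m, t, p, a in learners)
    return np.sign(score)
```

Because each round selects a single dimension, the learned stumps directly record which feature dimensions (and hence which sensors) were most useful for separating the two response classes.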
 • Update the training-data weights using the n-th weak learner's result. (This increases the training-data weights of the data misclassified by the n-th weak learner.)
The MEG-sensor weight is then calculated from the trained weak learners and their confidences.

Analysis Results (subject-independent model)
[Figure: MEG-sensor weight maps for the subject-independent model at latencies -100 ~ 0 ms, 0 ~ 100 ms, 50 ~ 150 ms, 100 ~ 200 ms, 150 ~ 250 ms, and 200 ~ 300 ms.]
• The larger weights in the latency ranges 50 ~ 150 ms, 100 ~ 200 ms, and 150 ~ 250 ms are seen in the left language area.

Future work
• Noise-robust feature extraction
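The sensor-weight calculation can be sketched as follows. The poster's exact formula is not reproduced above, so the aggregation rule below, which sums each weak learner's confidence alpha onto the sensor whose feature it selected, is an assumption, as is the (dim, threshold, polarity, alpha) tuple format for a trained weak learner.

```python
import numpy as np

def sensor_weights(learners, n_sensors):
    """Hypothetical sketch: per-sensor weights from a trained AdaBoost ensemble.

    learners  : list of (dim, threshold, polarity, alpha) weak learners,
                where `dim` indexes the sensor feature the stump was built on
                and `alpha` is its confidence (assumed format, not the poster's)
    n_sensors : total number of MEG sensors
    """
    weights = np.zeros(n_sensors)
    for dim, _thr, _pol, alpha in learners:
        weights[dim] += alpha        # each round votes for the sensor it used
    total = weights.sum()
    return weights / total if total > 0 else weights
```

Under this reading, a sensor receives a large weight when it is selected often, or with high confidence, by the boosting rounds, which is how the weight maps over latency windows can highlight task-relevant regions such as the left language area.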