An Emotion Recognition Journey Lucy Kuncheva
A long time ago in a galaxy far, far away...
I’d better move to Cardiff... Around the summer of 2008, we came into contact with the School of Psychology at Bangor University. I’d better move to Brunel...
Areas of activation in the brain in response to emotion stimuli. Amygdala
Areas of activation in the brain in response to emotion stimuli. The limbic system
fMRI data were acquired from 16 right-handed healthy US college male students (aged 20–25).
How far are we from MIND READING? This far...
March 2009 EPSRC proposal New approaches for fMRI data analysis
April 2010 FP7 proposal Lost an entire month of my life to this....
Kuncheva L.I., J. J. Rodriguez, C. O. Plumpton, D. E. J. Linden and S. J. Johnston, Random Subspace Ensembles for fMRI Classification, IEEE Transactions on Medical Imaging, 29 (2), 2010, 531-542.
Kuncheva L.I. and J. J. Rodriguez, Classifier Ensembles for fMRI Data Analysis: An Experiment, Magnetic Resonance Imaging, 28 (4), 2010, 583-593.
Kuncheva L.I. and C. O. Plumpton, Choosing parameters for Random Subspace ensembles for fMRI classification, Proc. Multiple Classifier Systems (MCS'10), Cairo, Egypt, LNCS 5997, 2010, 54-63.
Plumpton C. O., L. I. Kuncheva, N. N. Oosterhof and S. J. Johnston, Naive random subspace ensemble with linear classifiers for real-time classification of fMRI data, Pattern Recognition, 45 (6), 2012, 2101-2108.
Plumpton C. O., L. I. Kuncheva, D. E. J. Linden and S. J. Johnston, On-line fMRI Data Classification Using Linear and Ensemble Classifiers, Proc. ICPR 2010, Istanbul, Turkey, 2010, 4312-4315.
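For readers who have not met the technique behind these papers, here is a minimal sketch of a random subspace ensemble of linear classifiers on synthetic data with fMRI-like proportions (many voxels, few labelled scans). The data, subspace size and ensemble size are made-up illustrations, not the settings used in the publications above.

    # Illustrative sketch only: random subspace ensemble of linear classifiers.
    # Synthetic data and hypothetical parameters, not the published pipeline.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 120, 5000              # few labelled scans, many voxels
    X = rng.normal(size=(n_trials, n_voxels))
    y = rng.integers(0, 2, size=n_trials)       # two stimulus classes
    X[y == 1, :50] += 0.6                       # a small informative voxel subset

    n_members, subspace_size = 50, 100          # L classifiers, d voxels each
    members = []
    for _ in range(n_members):
        voxels = rng.choice(n_voxels, size=subspace_size, replace=False)
        clf = LinearDiscriminantAnalysis().fit(X[:80][:, voxels], y[:80])
        members.append((voxels, clf))

    # Majority vote of the ensemble on the held-out trials
    votes = np.array([clf.predict(X[80:][:, v]) for v, clf in members])
    print("hold-out accuracy:", ((votes.mean(axis=0) > 0.5) == y[80:]).mean())

Each member sees only a small random subset of voxels, which keeps the individual linear classifiers trainable despite the huge voxel-to-scan ratio; the vote then combines them.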
Colin Steele, 08/09 3rd year project: fMRI data analysis
Adam Williams, 09/10 3rd year project: Emotion recognition from fMRI data
Jamie Blacker, 09/10 3rd year project: Fractals and fMRI
Joe Freeman, 10/11 3rd year project: fMRI Visualiser
Joey Owen, 10/11 3rd year project: fMRI voxel selection
Tom Gardner, 10/11 3rd year project: Environments for emotion recognition
Cat Plumpton, PhD: real-time fMRI data analysis
2010 End of the fMRI era... Genre in crisis
Summer 2010. Brain-computer interface through EEG. Peripheral devices, game accessories, EEG headsets: inexpensive, accessible. Enter Tom Christy!
Summer 2010 And just like that... The idea was born... A game controlled by emotion
Summer 2010 Sa’ad Martin
Autumn 2010 Not as easy as it looked...
AFFECTIVE COMPUTING. TAC celebrates its 5th Anniversary. The Galvactivator: a glove that senses and communicates skin conductivity.
“Affective Computing is an area of computing that relates to, arises from, or influences emotions.” Rosalind Picard, 1995
The two-dimensional valence-arousal model: valence runs from NEGATIVE to POSITIVE, arousal from PASSIVE to ACTIVE, giving four quadrants: HAHV (excited, happy), LAHV (calm, content), HALV (angry, fearful), LALV (depressed, sad).
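As a tiny illustration of the model above, the sketch below maps a (valence, arousal) reading onto one of the four quadrants; the zero thresholds and the example values are assumptions for the sketch, not part of Picard's definition.

    # Illustrative only: label a (valence, arousal) reading with its quadrant.
    # The zero thresholds on both axes are an assumption.
    def quadrant(valence: float, arousal: float) -> str:
        v = "HV" if valence >= 0 else "LV"      # high / low valence
        a = "HA" if arousal >= 0 else "LA"      # high / low arousal
        return a + v

    print(quadrant(0.7, 0.8))    # HAHV: excited, happy
    print(quadrant(0.6, -0.5))   # LAHV: calm, content
    print(quadrant(-0.6, 0.7))   # HALV: angry, fearful
    print(quadrant(-0.7, -0.6))  # LALV: depressed, sad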
EXPRESSION OF EMOTION - MODALITIES
Physiological, central nervous system: EEG, fMRI, fNIRS
Physiological, peripheral nervous system: pulse rate, pulse variation, respiration, skin temperature, galvanic skin response, blood pressure, EMG
Behavioural: facial expression, eye tracking, gesture, speech, posture
Interaction with the computer: pressure on mouse, drag-click speed, dialogue with tutor
Detecting Stress During Real-World Driving Tasks Using Physiological Sensors. Jennifer A. Healey and Rosalind W. Picard, IEEE Transactions on Intelligent Transportation Systems, Vol. 6, No. 2, June 2005.
The subject wore five physiological sensors:
• an electrocardiogram (EKG) on the chest
• an electromyogram (EMG) on the left shoulder
• a chest cavity expansion respiration sensor (Resp.) around the diaphragm
• a skin conductivity sensor on the left hand
• a skin conductivity sensor on the left foot
AMBER Experiment #1 (Tom). Sensors: NeuroSky EEG, galvanic skin response, pulse signal.
• (+) Positive emotion: whale and ocean sounds...
• (-) Negative emotion: Michael Jackson, Cheeky Girls
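To make the setup concrete, here is a minimal sketch of how per-trial summaries of the three streams (EEG, skin response, pulse) could be stacked into one feature vector for a positive-vs-negative classifier. The sampling rate, window length, features and classifier are assumptions for illustration, not the actual AMBER processing.

    # Illustrative sketch: stack simple per-trial features from three signal
    # streams and train a linear positive-vs-negative classifier.
    # All numbers are synthetic; feature choices are assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n_trials, fs, seconds = 60, 128, 10            # hypothetical sampling setup
    eeg   = rng.normal(size=(n_trials, fs * seconds))
    gsr   = rng.normal(size=(n_trials, fs * seconds))
    pulse = rng.normal(size=(n_trials, fs * seconds))
    y = rng.integers(0, 2, size=n_trials)          # 1 = positive, 0 = negative

    def features(trial):
        # crude per-stream summaries: mean, standard deviation, mean slope
        return [trial.mean(), trial.std(), np.diff(trial).mean()]

    X = np.hstack([
        np.array([features(t) for t in eeg]),
        np.array([features(t) for t in gsr]),
        np.array([features(t) for t in pulse]),
    ])                                             # shape: (n_trials, 9)

    clf = LogisticRegression().fit(X[:40], y[:40])
    print("hold-out accuracy:", clf.score(X[40:], y[40:]))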
Results: individual classifiers vs ensembles.
A.M.B.E.R. Advanced Multimodal Biometric Emotion Recognition
Cast: NeuroSky, EDA, pulse reader, my lovely plant, Tom’s R2D2 (show off!), EMOTIV, Nia, serious wired-up man Tom Christy, very important supervisor Lucy Kuncheva
A.M.B.E.R. Advanced Multimodal Biometric Emotion Recognition Media stars overnight!
HARDWARE: Tom. DATA & Collaborators: Lucy.
HARDWARE (Tom): 2011, 2012, 2013.
2013. The “emotional mouse”: Version 1 with a pulse reader; reincarnation as Version 2 with EDA sensors (galvanic skin response).
In the meantime: Guillaume Thierry. This is what Google returned 2nd on “Guillaume Thierry”!
Me: Would you like to collaborate on emotion recognition from EEG data?
Guillaume: Yes, of course, but listen to what a fantastic idea occurred to me just now!!!
Christoph Klein
March 2011 EPSRC proposal Stephan Boehm Let’s give it a go
Seminar May 2012 On-going collaboration with the University of Salzburg Hello Salzburg!
Hello Ramon Mollineda! Hello Juan Rodriguez!
Video: Arch Enemy (My Apocalypse) Volunteers Participants All
Provoked? Acted? Spontaneous? Self-reported?
So, WHAT are we RECOGNISING?
• Emotions are very difficult to define and explicate.
• Experiments for provoking emotion vary considerably, and so do the results reported in the literature (from near chance to 95% accuracy).
• Most emotion-measuring modalities are intrusive and annoying.
• Emotions are individual for each person.
• The measured signals are difficult to analyse.
• There is a bottleneck of idiosyncratic feature extraction and parameter tuning.
• There is no unified protocol, and benchmark data collections are not available.
• There is no consensus about the type of experiment to validate a hypothesis (provoked, controlled, acted, spontaneous emotion).
Are we doomed? Maybe not... We can detect CHANGE in the physiological responses and the EEG, which may be associated with some emotion. If we need to ACT upon detecting an emotion rather than NAMING it, we may still have a chance.
Tom, Lucy. Simple, transparent and generic technologies for feature extraction. The emotional mouse. State-of-the-art data analysis. Other ingenious input devices. User-friendly EEG headsets. Unified protocols, benchmark data.
THE END