EMOTION ANALYSIS IN MAN-MACHINE INTERACTION SYSTEMS
T. Balomenos, A. Raouzaiou, S. Ioannou, A. Drosopoulos, K. Karpouzis and S. Kollias
Image, Video and Multimedia Systems Laboratory, National Technical University of Athens
Outline
• Facial Expression Estimation
• Face Detection
• Facial Feature Extraction
• Anatomical Constraints - Anthropometry
• FP Localization
• FAP Calculation
• Expression Profiles
• Expression Confidence Enforcement - Gesture Analysis
Face Detection
Sliding-window classification with two quick-rejection stages followed by a reduced SVM classifier:
• Quick rejection 1: window variance > T, otherwise "no face"
• Quick rejection 2: number of skin-colour pixels > T, otherwise "no face"
• Preprocessing: subtract the mean, divide by the standard deviation
• Subspace projection Y = L^T X
• Classification: reduced SVM on Y decides face / no face
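A minimal sketch of this cascade, assuming a sliding-window detector; the threshold values, the skin-colour test and the names (T_VAR, T_SKIN, L, svm_decision) are illustrative assumptions, not the authors' exact implementation:

import numpy as np

T_VAR = 200.0   # assumed variance threshold for the first quick-rejection stage
T_SKIN = 0.4    # assumed minimum fraction of skin-colour pixels

def is_skin(ycrcb):
    # crude Cr/Cb skin-colour test; thresholds are indicative only
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return (cr > 133) & (cr < 173) & (cb > 77) & (cb < 127)

def classify_window(gray, ycrcb, L, svm_decision):
    """Return True if the candidate window is classified as a face."""
    # quick rejection 1: near-uniform windows cannot contain a face
    if gray.var() < T_VAR:
        return False
    # quick rejection 2: too few skin-colour pixels
    if is_skin(ycrcb).mean() < T_SKIN:
        return False
    # preprocessing: subtract the mean, divide by the standard deviation
    x = (gray.astype(float) - gray.mean()) / (gray.std() + 1e-8)
    # subspace projection Y = L^T X
    y = L.T @ x.ravel()
    # reduced-set SVM decision on the projected vector
    return svm_decision(y) > 0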
Multiple-cue Facial Feature boundary extraction: eyes & mouth, eyebrows, nose
• Edge-based mask
• Intensity-based mask
• NN-based mask (Y, Cr, Cb and DCT coefficients of the neighbourhood)
Each mask is validated independently; a fusion sketch follows below.
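As a rough illustration of the fusion step, assuming majority voting between the three independently validated masks (the voting rule is an assumption, not the authors' exact validation scheme):

import numpy as np

def fuse_feature_masks(edge_mask, intensity_mask, nn_mask, min_votes=2):
    # each input is a boolean array marking candidate feature-boundary pixels;
    # a pixel is kept only if at least min_votes of the validated masks agree
    votes = edge_mask.astype(int) + intensity_mask.astype(int) + nn_mask.astype(int)
    return votes >= min_votes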
Final mask validation through Anthropometry
Facial distances, measured separately for males and females by the US Army over a 30-year period. The measured distances are normalized by dividing by Distance 7, the distance between the inner corners of the left and right eyes, two points that a human cannot move.
Anthropometry-based confidence
DA5n, DA10n: distances in the figures, normalized by division with distance DA7 (DA5n = DA5/DA7, DA10n = DA10/DA7)
DAewn: normalized eye width, calculated from DA5 and DA7: DAewn = ((DA5 - DA7)/2)/DA7
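A minimal sketch of the normalization and a possible confidence measure; the Gaussian membership and the worst-case combination are assumptions, and the anthropometric means and deviations (mean_n, std_n) would come from the measurement tables mentioned above:

import numpy as np

def anthropometric_confidence(DA5, DA7, DA10, mean_n, std_n):
    # normalise the measured distances by DA7 (inner eye-corner distance)
    DA5n = DA5 / DA7
    DA10n = DA10 / DA7
    DAewn = ((DA5 - DA7) / 2.0) / DA7          # normalised eye width
    measured = np.array([DA5n, DA10n, DAewn])
    # Gaussian membership: 1.0 when a distance matches the anthropometric mean
    conf = np.exp(-0.5 * ((measured - mean_n) / std_n) ** 2)
    # conservative choice: the mask confidence is the worst of the three scores
    return float(conf.min())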
FAP-based description (Facial Animation Parameters)
In MMI environments, discrete features:
• offer a neat, symbolic representation of expressions
• are not constrained to a specific face model
• are suitable for face cloning applications
• are MPEG-4 compatible: unified treatment of the analysis and synthesis parts
FAPs estimation
• Absence of a clear quantitative definition of FAPs
• It is possible to model FAPs through FDP feature-point movement, using distances s(x, y)
e.g. close_t_r_eyelid (F20) - close_b_r_eyelid (F22): D13 = s(3.2, 3.4), f13 = D13 - D13_NEUTRAL
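A small sketch of this modelling, assuming 2-D feature-point coordinates; the helper names and the optional FAP-unit normalisation are illustrative, only the D13/f13 relation comes from the slide:

import math

def s(p, q):
    # Euclidean distance between two FDP feature points, e.g. s(3.2, 3.4)
    return math.hypot(p[0] - q[0], p[1] - q[1])

def fap_value(fp_a, fp_b, neutral_distance, fapu=1.0):
    # model a FAP as the deviation of a feature-point distance from its
    # neutral-face value, e.g. D13 = s(3.2, 3.4), f13 = D13 - D13_NEUTRAL;
    # fapu is an assumed MPEG-4 FAP unit used for face-size normalisation
    D = s(fp_a, fp_b)
    return (D - neutral_distance) / fapu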
Sample Profiles of Anger
A1: F4[22, 124], F31[-131, -25], F32[-136, -34], F33[-189, -109], F34[-183, -105], F35[-101, -31], F36[-108, -32], F37[29, 85], F38[27, 89]
A2: F19[-330, -200], F20[-335, -205], F21[200, 330], F22[205, 335], F31[-200, -80], F32[-194, -74], F33[-190, -70], F34[-190, -70]
A3: F19[-330, -200], F20[-335, -205], F21[200, 330], F22[205, 335], F31[-200, -80], F32[-194, -74], F33[70, 190], F34[70, 190]
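A hedged sketch of checking a measured FAP vector against such a profile; the system uses a fuzzy-rules architecture, so the simple in-range counting below is only indicative:

def profile_activation(fap_values, profile):
    # fap_values: dict FAP index -> measured value, e.g. {4: 60, 31: -90, ...}
    # profile: dict FAP index -> (low, high) range, e.g. anger profile A1 below
    hits = 0
    for fap, (low, high) in profile.items():
        value = fap_values.get(fap)
        if value is not None and low <= value <= high:
            hits += 1
    # fraction of profile FAPs whose measured value falls inside its range
    return hits / len(profile)

A1 = {4: (22, 124), 31: (-131, -25), 32: (-136, -34), 33: (-189, -109),
      34: (-183, -105), 35: (-101, -31), 36: (-108, -32), 37: (29, 85), 38: (27, 89)}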
Gesture Analysis
• Gestures are too ambiguous to indicate emotion on their own
• Gestures are used to support the confidence outcome of facial expression analysis
• HMM gesture-class probabilities are mapped to emotional states through a transformation table
• Cr/Cb-based hand detection
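A minimal sketch of the two gesture-related components, assuming indicative Cr/Cb thresholds and a placeholder transformation table (the gesture classes and weights below are illustrative, not the authors' values):

def hand_mask(ycrcb):
    # skin-colour segmentation in the Cr/Cb plane for hand detection
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return (cr > 133) & (cr < 173) & (cb > 77) & (cb < 127)

# hypothetical transformation table: support(emotion | gesture class)
GESTURE_TO_EMOTION = {
    "hand_clapping":   {"joy": 0.7, "surprise": 0.3},
    "hands_over_head": {"joy": 0.6, "surprise": 0.4},
    "italianate":      {"anger": 0.6, "disgust": 0.4},
}

def emotion_support(hmm_class_probs):
    # combine HMM gesture-class probabilities with the table to obtain
    # per-emotion support scores that reinforce the facial estimate
    support = {}
    for gesture, p in hmm_class_probs.items():
        for emotion, w in GESTURE_TO_EMOTION.get(gesture, {}).items():
            support[emotion] = support.get(emotion, 0.0) + p * w
    return support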
Emotion analysis system overview
(System diagram legend) G: the value of a corresponding FAP; f: values derived from the calculated distances
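A brief sketch of how the final decision might combine the two modalities, with assumed weights; in the actual system the fusion is performed by the fuzzy-rules architecture, so this is only a simplified view:

def fuse_modalities(face_conf, gesture_support, w_face=0.75, w_gesture=0.25):
    # face_conf, gesture_support: dicts mapping emotion -> confidence in [0, 1]
    emotions = set(face_conf) | set(gesture_support)
    fused = {e: w_face * face_conf.get(e, 0.0) + w_gesture * gesture_support.get(e, 0.0)
             for e in emotions}
    # the recognised emotion is the one with the highest fused confidence
    return fused, max(fused, key=fused.get)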
System Interface
The interface displays the calculated FP distances, the rules activated and the recognised emotion.
Conclusions
• Estimation of a user's emotional state based on a fuzzy-rules architecture
• MPEG-4: a compact and established means for HCI
• Evaluation approach based on anthropometric models and measurements
• The described developments are being validated in the framework of the IST ERMIS and HUMAINE projects