This project detects and models facial expressions in real time using a webcam. The system enhances web meetings and eliminates the need for specific hardware, making it more accessible to users.
LYU0603 A Generic Real-Time Facial Expression Modelling System Supervisor: Prof. Michael R. Lyu Group Member: Cheung Ka Shun (05521661) Wong Chi Kin (05524554)
Outline • Project Overview • Motivation • Objective • System Architecture • Face Coordinate Filter • Implementation • Facial expression analysis • Face modelling • Demonstration • Future work • Conclusion
Project Overview • Detect the facial expression • Draw the corresponding model
Motivation • Face recognition technology has become more common • Webcams now have high enough resolution • Computation power is now high enough
Objective • Enrich the functionality of the webcam • Make net-meetings more interesting • Users are not required to pay extra for specific hardware • Recognize human faces generically
Face Coordinate Filter • Our system is built on top of this filter • Input: video source • Output: the coordinates of the face mesh vertices
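A minimal sketch of how downstream code might view the filter's per-frame output; the structure and names below are illustrative assumptions, not the filter's actual interface.

    // Hypothetical view of the face coordinate filter's per-frame output.
    // The real filter is a DirectShow component; these names are assumptions.
    #include <vector>

    struct MeshVertex {
        float x;   // horizontal position in the face mesh coordinate system
        float y;   // vertical position in the face mesh coordinate system
    };

    struct FaceMeshFrame {
        std::vector<MeshVertex> vertices;  // one entry per face mesh vertex (eyes, lips, ...)
    };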
Implementation – Calibration • Map face mesh coordinates to pixel coordinates
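The mapping formula on the original slide is shown only as an image, so the following is a minimal sketch of one plausible calibration, assuming a linear scale-and-offset from mesh units to pixels; the scale and offset names are placeholders.

    // Hypothetical linear calibration from face mesh coordinates to pixel coordinates.
    // scaleX/scaleY and offsetX/offsetY would be obtained during calibration; the
    // exact formula used by the project is assumed here.
    struct PixelPoint { int x; int y; };

    PixelPoint meshToPixel(float meshX, float meshY,
                           float scaleX, float scaleY,
                           float offsetX, float offsetY)
    {
        PixelPoint p;
        p.x = static_cast<int>(meshX * scaleX + offsetX);
        p.y = static_cast<int>(meshY * scaleY + offsetY);
        return p;
    }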
Implementation – Facial Expression Analysis • We assume that the face coordinate filter works properly • Detect eye blinking and mouth opening from the vertex coordinates • Record the coordinate changes with sample movies • Plot graphs • Statistical analysis
Implementation – Facial Expression Analysis • Use vertex pairs (33, 41), (34, 40), (35, 39) (Figure: part of the face mesh – eye)
Implementation – Facial Expression Analysis • Use vertex pairs (69, 77), (70, 76), (71, 75) – outer bound of the lips • Use vertex pairs (93, 99), (94, 98), (95, 97) – inner bound of the lips (Figure: part of the face mesh – mouth)
Implementation – Facial Expression Analysis • The eye is treated as closed when the distance between the two vertices of a pair is less than 1 unit • Other factors also affect the measured distance: • the distance between the camera and the user • the user moving his or her head quickly
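A minimal sketch of the distance test described above, using the eye vertex pairs (33, 41), (34, 40), (35, 39) and the 1-unit threshold from the slide; the coordinate-array layout is an assumption.

    #include <cmath>

    // Euclidean distance between two mesh vertices given as (x, y) pairs.
    static float vertexDistance(float x1, float y1, float x2, float y2)
    {
        return std::sqrt((x1 - x2) * (x1 - x2) + (y1 - y2) * (y1 - y2));
    }

    // Returns true if every eye vertex pair is closer than the 1-unit threshold,
    // i.e. the eye is treated as closed.  'x' and 'y' are indexed by mesh vertex id.
    bool isEyeClosed(const float* x, const float* y)
    {
        const int pairs[3][2] = { {33, 41}, {34, 40}, {35, 39} };  // eye vertex pairs from the slide
        for (int i = 0; i < 3; ++i) {
            int a = pairs[i][0], b = pairs[i][1];
            if (vertexDistance(x[a], y[a], x[b], y[b]) >= 1.0f)    // 1-unit threshold from the slide
                return false;
        }
        return true;
    }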
Implementation – Facial Expression Analysis • Three methods • Competitive Area Ratio • Horizontal Eye-Balance • Nearest-Colour Convergence
Competitive Area Ratio • To detect whether the mouth is opened or closed
Competitive Area Ratio • We can get the area of a triangle from its vertex coordinates: Area = 0.5 * |x1(y2 - y3) + x2(y3 - y1) + x3(y1 - y2)|
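A minimal sketch of the area computation using the standard triangle-area (shoelace) formula; the mouth-open decision shown here, comparing an inner-lip area against a reference area via a ratio threshold, is an assumption about how the "competitive area ratio" is applied.

    #include <cmath>

    // Area of a triangle given its three vertex coordinates (shoelace formula).
    float triangleArea(float x1, float y1, float x2, float y2, float x3, float y3)
    {
        return 0.5f * std::fabs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2));
    }

    // Hypothetical use: compare the area enclosed by the inner-lip vertices against a
    // reference area; a large ratio suggests the mouth is opened.  The exact vertices
    // and threshold used by the project are assumptions.
    bool isMouthOpened(float innerLipArea, float referenceArea, float ratioThreshold)
    {
        return referenceArea > 0.0f && (innerLipArea / referenceArea) > ratioThreshold;
    }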
Horizontal Eye-Balance • To detect whether the head is inclined
Horizontal Eye-Balance – Approach I (figure)
Horizontal Eye-Balance – Approach I: however… (figure)
Horizontal Eye-Balance – Approach II (figure)
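The two approaches appear only as figures in the original slides, so the following is a generic, hedged sketch of incline detection from eye vertices: it measures the angle of the line joining the left and right eye corners. The choice of corner vertices and the tolerance are assumptions, not the project's actual approach.

    #include <cmath>

    // Inclination angle (in degrees) of the line joining the left and right eye corner
    // vertices.  A near-zero angle means the head is roughly level.
    float headInclineDegrees(float leftX, float leftY, float rightX, float rightY)
    {
        return std::atan2(rightY - leftY, rightX - leftX) * 180.0f / 3.14159265f;
    }

    // The tolerance value is an assumption for illustration only.
    bool isHeadInclined(float leftX, float leftY, float rightX, float rightY,
                        float toleranceDegrees = 10.0f)
    {
        return std::fabs(headInclineDegrees(leftX, leftY, rightX, rightY)) > toleranceDegrees;
    }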
Nearest-Colour Convergence • Retrieve the pixel colours in the specific area • Split each pixel colour into three channels (RGB) • Take the average value of each channel
Nearest-Colour Convergence • Compute the colour-space difference between the channel averages and a reference colour • The eye is classified as closed when this difference is small enough
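The difference formula and the closed-eye condition on the original slide are images, so the sketch below is a hedged reading of the method's name: average the RGB channels over the eye region and treat the eye as closed when that average converges to a reference skin colour. The reference colour and threshold are assumptions.

    #include <cmath>
    #include <vector>

    struct RGB { float r, g, b; };

    // Average colour of a sampled pixel region (one entry per pixel).
    RGB averageColour(const std::vector<RGB>& pixels)
    {
        RGB avg = { 0.0f, 0.0f, 0.0f };
        if (pixels.empty()) return avg;
        for (const RGB& p : pixels) { avg.r += p.r; avg.g += p.g; avg.b += p.b; }
        float n = static_cast<float>(pixels.size());
        avg.r /= n; avg.g /= n; avg.b /= n;
        return avg;
    }

    // Euclidean distance in RGB colour space.
    float colourDistance(const RGB& a, const RGB& b)
    {
        return std::sqrt((a.r - b.r) * (a.r - b.r) +
                         (a.g - b.g) * (a.g - b.g) +
                         (a.b - b.b) * (a.b - b.b));
    }

    // Assumed condition: the eye is treated as closed when the region's average colour
    // is within 'threshold' of a reference skin colour (eyelid covering the eye).
    bool isEyeClosedByColour(const std::vector<RGB>& eyeRegion,
                             const RGB& skinReference, float threshold)
    {
        if (eyeRegion.empty()) return false;
        return colourDistance(averageColour(eyeRegion), skinReference) < threshold;
    }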
Direct 3D • The characters that will be used in the system's modelling
Texture Sampler • Character textures for the "mouth opened" and "eye closed" states
Texture Sampler • Pre-production of the texture images
Texture Sampler • Loading the texture • Mapping object coordinates to texel coordinates
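A minimal Direct3D 9 sketch of loading a texture and carrying texel coordinates alongside object coordinates; the file name, vertex layout, and FVF flags are assumptions about how the project sets this up.

    #include <d3d9.h>
    #include <d3dx9.h>   // D3DX utility library used here for texture loading

    // Vertex format carrying an object-space position and texel coordinates.
    struct TexturedVertex {
        float x, y, z;   // object coordinates
        float u, v;      // texel coordinates in [0, 1]
    };
    #define FVF_TEXTUREDVERTEX (D3DFVF_XYZ | D3DFVF_TEX1)

    // Load a character texture from disk; the file name is a placeholder.
    LPDIRECT3DTEXTURE9 loadCharacterTexture(LPDIRECT3DDEVICE9 device)
    {
        LPDIRECT3DTEXTURE9 texture = NULL;
        if (FAILED(D3DXCreateTextureFromFileA(device, "character.png", &texture)))
            return NULL;   // texture file missing or unreadable
        return texture;
    }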
Texture Sampler • Prepare the index buffer
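A minimal sketch of preparing an index buffer in Direct3D 9; describing each textured region as a quad made of two triangles is an assumption for illustration.

    #include <d3d9.h>
    #include <cstring>

    // Create and fill a 16-bit index buffer describing two triangles that form a quad.
    LPDIRECT3DINDEXBUFFER9 createQuadIndexBuffer(LPDIRECT3DDEVICE9 device)
    {
        const WORD indices[6] = { 0, 1, 2,   2, 1, 3 };   // two triangles covering a quad

        LPDIRECT3DINDEXBUFFER9 indexBuffer = NULL;
        if (FAILED(device->CreateIndexBuffer(sizeof(indices), 0, D3DFMT_INDEX16,
                                             D3DPOOL_MANAGED, &indexBuffer, NULL)))
            return NULL;

        void* data = NULL;
        if (SUCCEEDED(indexBuffer->Lock(0, sizeof(indices), &data, 0))) {
            std::memcpy(data, indices, sizeof(indices));  // copy the index data into the buffer
            indexBuffer->Unlock();
        }
        return indexBuffer;
    }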
Facial Expression Modelling • Update the object coordinates • Normalize the coordinate geometry
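A minimal sketch of the normalization step, assuming the tracked pixel coordinates are rescaled into a [-1, 1] range before the object coordinates are updated; the slide does not state the exact normalization the project uses.

    // Map a tracked pixel coordinate into a normalized [-1, 1] range, assuming the frame
    // width and height are known.  Flipping Y so that up is positive is a common
    // screen-to-geometry convention and an assumption here.
    void normalizeCoordinate(float pixelX, float pixelY,
                             float frameWidth, float frameHeight,
                             float& outX, float& outY)
    {
        outX = (pixelX / frameWidth) * 2.0f - 1.0f;
        outY = 1.0f - (pixelY / frameHeight) * 2.0f;
    }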
Demonstration • We are going to play a movie clip that demonstrates our system
Future Work • Improve the precision of face detection • Use a 3-D model instead of a 2-D texture • Allow net-meeting software to use the system
Conclusion • We have learnt DirectShow and Direct3D • We have developed methods to detect facial expressions
End • Thank you! • Q&A