Computational Video Group / Groupe Vidéo Informatique
National Research Council Canada / Conseil national de recherches Canada
Institute for Information Technology / Institut de technologie de l'information
In Proc. IEEE Conf. on Automatic Face and Gesture Recognition, Washington DC, May 21-22, 2002
Dmitry O. Gorodnichy, www.cv.iit.nrc.ca/~dmitry
(2) Face-Tracking Based User Interfaces • Replacing the cumbersome track-ball (track-stick) on laptops. • An extra degree of control (e.g. to switch the focus of attention). • Hands-free control (e.g. for users with disabilities). • Interactive games: more physical, more entertaining, 3D control, multiple users. Fig. 1. A user plays an aim-n-shoot Bubble-Frenzy game, aiming the turret by pointing with her nose (a slight rotation of the head allows precise aiming over a 180° range).
(3) Key Issues and Approaches 1. Speed (in real time). 2. Affordability (with cheap, easy-to-install, but low-quality USB cameras). 3. Robustness (to normal head motion). 4. Precision (with pixel precision). • Fig. 2. Two users play a virtual ping-pong game, bouncing the ball with their heads. Image-based tracking allows heads to be tracked, but it does not allow pinpointing with the head. Image-based Face Tracking: - Uses global facial cues: skin colour, head shape, head motion (see the sketch below). - Does not require high-quality images; robust, but not precise.
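A minimal sketch of the image-based side of such a tracker: skin-colour segmentation of the face region from a generic webcam frame. This is not the NRC implementation; the HSV thresholds, function names, and use of OpenCV 4.x are illustrative assumptions.

```python
import cv2
import numpy as np

def locate_face_region(frame_bgr, lower_hsv=(0, 40, 60), upper_hsv=(25, 180, 255)):
    """Rough face localization from skin colour alone (robust but imprecise).

    Returns the centroid and bounding box of the largest skin-coloured blob,
    or None if no skin-like pixels are found. Thresholds are illustrative only.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv, np.uint8), np.array(upper_hsv, np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # OpenCV 4.x returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
    return (cx, cy), cv2.boundingRect(blob)   # centroid + head bounding box
```

Such a blob gives the local area of interest for the feature tracking described next, but by itself it cannot pin-point a facial feature.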
(4) Question: What features to use? Feature-based Face Tracking • Should be used for precise tracking. However, it is not robust. • [Bradski, Toyama, Gee, Cipolla, Zelinsky, Matsumoto, Yang, Baluja, Newman, …] • … "still not ready for practical implementation" • Fig. 3. Tracking eyes (from [Gorodnichy97]). • A feature f is associated with a vector V_f (obtained by centering a mask on the feature). • Features are tracked by template matching with V_f in the local area of interest (calculated with image-based cues); a sketch follows below. • The pixel u=(i,j) with the largest score s(V_u, V_f) is returned. Proposition 1: Robust and precise tracking can be achieved by designing a feature template that is invariant to head motion.
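A minimal sketch of the feature-tracking step just described: the template V_f is matched inside a local region of interest and the best-scoring pixel u is returned. The function names are hypothetical, and normalized cross-correlation is used as a stand-in for the score s, which the slide does not specify.

```python
import cv2

def track_feature(frame_gray, template_vf, roi):
    """Return the pixel u inside `roi` that maximizes the score s(V_u, V_f).

    frame_gray  : current grayscale frame
    template_vf : small patch V_f centred on the feature in an earlier frame
    roi         : (x, y, w, h) local area of interest, e.g. derived from
                  image-based cues such as the skin-colour blob above
                  (must be larger than the template)
    """
    x, y, w, h = roi
    search = frame_gray[y:y + h, x:x + w]
    # Normalized cross-correlation as the matching score s(V_u, V_f)
    scores = cv2.matchTemplate(search, template_vf, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, (bx, by) = cv2.minMaxLoc(scores)
    th, tw = template_vf.shape[:2]
    # Convert from the top-left corner of the best patch to its centre pixel u
    u = (x + bx + tw // 2, y + by + th // 2)
    return u, best_score
```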
(5) Edge-based Features – not good • Features are conventionally thought of as visually distinctive (i.e. with a large image gradient ∇I(f)). • Hence, the commonly used features are edge-based, such as corners of brows, eyes, lips, nostrils, etc. • They, however, are not robust and not always visible. Desired feature properties (formalized below): 1. Uniqueness: s(V_f, V_u) is minimal for pixels u other than f. 2. Robustness: s(V_f at t=0, V_f at time t) stays maximal as the head moves. 3. Continuity (for sub-pixel accuracy): the closer a pixel u in the image is to the pixel corresponding to f, the larger the score between V_u and V_f. (Evidence-based convolution can then be applied to refine the feature position u.)
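One way to write the three requirements above in standard notation; this is a reconstruction for readability, not the authors' exact formulation.

```latex
\begin{align*}
\text{Uniqueness:}\quad & s(V_f, V_u) \to \min
   && \text{for pixels } u \neq f \text{ in the search area},\\
\text{Robustness:}\quad & s\!\left(V_f^{\,t=0},\, V_f^{\,t}\right) \to \max
   && \text{as the head moves over time } t,\\
\text{Continuity:}\quad & \|u - f\| \text{ decreases } \Rightarrow\; s(V_u, V_f) \text{ increases}
   && \text{(enables sub-pixel refinement).}
\end{align*}
```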
(6) Convex-shape features – much better • Definition 1: A convex-shape feature is defined as an extremum of a convex-shape surface. • Shape-from-Shading theory shows that such features exhibit the desired properties (for a fixed camera-user-light configuration). Nose feature • Definition 2: The nose feature is the extremum of the curvature of the 3D nose surface, defined as z=f(x,y) in a camera-centred coordinate system (see the curvature sketch below). • Thus defined, the nose feature is very robust, can be detected with sub-pixel precision, PLUS, it is always visible!
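A sketch of how a convex-shape feature could be located on a surface z=f(x,y), using the Gaussian curvature of a Monge patch. This is a generic reconstruction under stated assumptions (a dense surface z is available and the tip is a local maximum of z toward the camera), not the paper's detector.

```python
import numpy as np

def convex_shape_feature(z):
    """Locate the extremum of surface curvature on z = f(x, y).

    z : 2-D array of surface heights in a camera-centred frame
        (assumed to increase toward the camera, so the nose tip is a local max).
    Returns the (row, col) of the convex point with the largest Gaussian curvature.
    """
    zy, zx = np.gradient(z)           # first derivatives along rows and columns
    zyy, zyx = np.gradient(zy)        # second derivatives
    zxy, zxx = np.gradient(zx)
    # Gaussian curvature of a Monge patch z = f(x, y)
    K = (zxx * zyy - zxy ** 2) / (1.0 + zx ** 2 + zy ** 2) ** 2
    convex = (zxx + zyy) < 0          # keep cap-like (locally convex) points only
    K = np.where(convex, K, -np.inf)
    return np.unravel_index(np.argmax(K), z.shape)
```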
(7) Nouse™ Face Tracking Technology • Based on tracking the convex-shape nose feature. • Enables precise hands-free 2D control in a) joystick or b) mouse modes (sketched below). • Allows aiming and drawing with the nose. • Just think of your nose as a chalk or a joystick handle! • NB: Left/right head motion is very natural and can easily be applied for control, provided it can be tracked precisely. • Affordable and downloadable. Uses a generic USB camera! Zero initialization of Nouse. Using Nouse for Painting.
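A sketch of the two control modes mentioned above: the tracked nose position is mapped either absolutely onto the screen (mouse-like) or as a rate command relative to a neutral, zero-initialized position (joystick-like). The gains, dead zone, and function names are illustrative assumptions, not the Nouse implementation.

```python
def nose_to_cursor_mouse(nose_xy, frame_size, screen_size):
    """'Mouse' mode: the nose position maps absolutely onto screen coordinates."""
    (nx, ny), (fw, fh), (sw, sh) = nose_xy, frame_size, screen_size
    return int(nx / fw * sw), int(ny / fh * sh)


def nose_to_cursor_joystick(nose_xy, neutral_xy, cursor_xy, gain=0.5, dead_zone=3):
    """'Joystick' mode: displacement from a neutral (zero-initialized) nose
    position acts as a velocity command that nudges the current cursor."""
    dx, dy = nose_xy[0] - neutral_xy[0], nose_xy[1] - neutral_xy[1]
    dx = 0 if abs(dx) < dead_zone else dx   # small dead zone suppresses jitter
    dy = 0 if abs(dy) < dead_zone else dy
    return cursor_xy[0] + gain * dx, cursor_xy[1] + gain * dy
```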
(8) Performance: Robustness & Precision
[Figure panels: the range of head motion tracked; robustness to rotation; robustness to scale; 'Yes' motion; 'No' motion. Test: the user rotates his head only (the shoulders do not move).]
(9) On Importance of Two Cameras • For humans: it is much easier to track with two eyes than with one eye. • Two views not only extend tracking from 2D to 3D, but also make tracking more precise and robust! • For computers, however: 1. The relationship between the "eyes" (cameras) is not known. 2. Tracking of features is not robust (to rotation and scale). • StereoTracker from CVG NRC: tracks the face in 3D with two USB cameras to control a virtual man, by using 1) Projective Vision Theory and 2) robust nose feature tracking.
(10) StereoTracking with USB webcams • Stage 1: Self-calibration • The relationship between the cameras is represented by the fundamental matrix F: u_leftᵀ F u_right = 0. • F can be found automatically for any two cameras observing the same scene, using www.cv.iit.nrc.ca/research/PVT: find corners → matching → filtering → robust solution with 7 selected corners (RANSAC) → F. • Stage 2: Feature selection and calibration verification • Select features in one image. • Verify that the epipolar line passes through each feature in the second image. • Use the nose-tip feature and two other common features (e.g. brow corners). A sketch of both stages follows below. • More at www.visioninterface.org/vi2002
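A sketch of the two stages, with an off-the-shelf RANSAC routine (cv2.findFundamentalMat) standing in for the PVT corner-matching pipeline. This is not the original PVT code; function names and parameters are illustrative assumptions.

```python
import cv2
import numpy as np

# --- Stage 1: self-calibration --------------------------------------------
def self_calibrate(pts_left, pts_right):
    """Estimate the fundamental matrix F from matched corners so that
    u_left^T F u_right = 0 for true correspondences.

    pts_left, pts_right : Nx2 arrays of matched corner coordinates.
    Note: OpenCV returns F with points2^T F points1 = 0, so the right-image
    points are passed first to match the slide's convention.
    """
    F, inlier_mask = cv2.findFundamentalMat(
        np.float32(pts_right), np.float32(pts_left), cv2.FM_RANSAC, 1.0, 0.99)
    return F, inlier_mask

# --- Stage 2: verify a selected feature against the epipolar geometry ------
def epipolar_residual(F, x_left, x_right):
    """Distance of the left-image feature from the epipolar line induced by
    its position in the right image (should be near zero if F is good)."""
    xl = np.array([x_left[0], x_left[1], 1.0])
    xr = np.array([x_right[0], x_right[1], 1.0])
    line = F @ xr                      # epipolar line a*x + b*y + c = 0 in the left image
    return abs(xl @ line) / np.hypot(line[0], line[1])
```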
(11) Using Nose for StereoTracking Proposition 2: With F known, the tracked 3D feature is the one that minimizes the epipolar error, i.e. the deviation from the epipolar constraint u_leftᵀ F u_right = 0 (a sketch follows below). Proposition 3: First detect the convex-shape nose feature. Then use the rigidity constraint to find the other features.
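Proposition 2 can be illustrated as follows. The slide's exact error formula was shown in a figure and is not reproduced here; the symmetric point-to-epipolar-line distance is used as a common stand-in, and all names are hypothetical.

```python
import numpy as np

def symmetric_epipolar_error(F, x_left, x_right):
    """Symmetric point-to-epipolar-line distance for a candidate match
    (a standard stand-in for the epipolar error of Proposition 2)."""
    xl = np.array([x_left[0], x_left[1], 1.0])
    xr = np.array([x_right[0], x_right[1], 1.0])
    l_left = F @ xr            # epipolar line in the left image
    l_right = F.T @ xl         # epipolar line in the right image
    d_left = abs(xl @ l_left) / np.hypot(l_left[0], l_left[1])
    d_right = abs(xr @ l_right) / np.hypot(l_right[0], l_right[1])
    return d_left + d_right

def pick_stereo_match(F, x_left, candidates_right):
    """Among candidate nose positions in the right image, keep the one that
    minimizes the epipolar error w.r.t. the detection in the left image."""
    errors = [symmetric_epipolar_error(F, x_left, c) for c in candidates_right]
    return candidates_right[int(np.argmin(errors))]
```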
(12) Conclusions • The nose is a unique feature. Humans are lucky to have it! • The nose allows us to track a face very robustly and precisely. • Pointing with the nose is natural. • This makes 2D perceptual user interfaces a reality! • The nose helps recover other facial features. • Two cameras (even low-quality webcams) make tracking more robust. • This makes 3D face tracking affordable, precise and robust. • Use your Nose as a Mouse! – Use Nouse! • Nouse™ is open for public evaluation at www.cv.iit.nrc.ca/research/Nouse • Acknowledgements • Nouse™ is a trademark of the Computational Video Group, IIT NRC. • Work done with Gerhard Roth and Shazad Malik. • The Bubble-Frenzy game is provided by www.extendedreality.com