Basic Working of an Eye-Tracker Mr DJ Wium Department of Computer Science and Informatics UFS
EYE-TRACKERS COME IN VARIOUS SHAPES AND SIZES EOG Tower-mounted Mobile Phone Scleral Coil Portable Desktop Wearable
MY FOCUS: VIDEO-BASED EYE-TRACKERS USING FEATURE DETECTION Video-based? Any eye-tracker that uses video frames to determine where one is looking.
Appearance-Based vs. Feature-Based Eye-Tracking Appearance-based: the eye image is processed as a whole, without detecting individual features. Feature-based: specific eye features are detected in the image.
All eye-trackers in use today are video-based and use feature detection.
FEATURES THAT ARE DETECTED • Pupil Centre • Glints (Corneal Reflections)
What are glints? • Glints are reflections of light (in this case infra-red) off the cornea. Also known as the first Purkinje image.
HARDWARE REQUIRED FOR FEATURE DETECTION • A video camera that is sensitive to both optical and IR. • One or more IR light source(s). • This is what causes the glints! • IR is not visible and thus not distracting. • Try this later on: Look at the eye-tracker using your phone’s camera.
Pupil and Glint Detection • Various algorithms available. • All algorithms are based on colour contrast. • Has to be performed for every frame! • Important to do it efficiently to ensure a high frequency. • Examples: Starburst, ellipse fitting.
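The colour-contrast idea behind these algorithms can be sketched very simply: under IR illumination the pupil appears dark and the glint near-saturated, so thresholding plus a centroid already gives rough feature positions. The sketch below is a minimal illustration on a synthetic frame, not any of the named algorithms (Starburst and ellipse fitting are considerably more sophisticated); the threshold values are assumptions.

```python
import numpy as np

def detect_pupil_centre(frame, threshold=50):
    """Pupil as the centroid of dark pixels (simple intensity threshold)."""
    ys, xs = np.nonzero(frame < threshold)   # pupil is dark under IR light
    if len(xs) == 0:
        return None                          # no pupil candidate in this frame
    return xs.mean(), ys.mean()

def detect_glint(frame, threshold=240):
    """Glint as the centroid of near-saturated bright pixels."""
    ys, xs = np.nonzero(frame > threshold)
    if len(xs) == 0:
        return None
    return xs.mean(), ys.mean()

# Synthetic 100x100 frame: mid-grey background, dark pupil disc, bright glint dot.
frame = np.full((100, 100), 128, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
frame[(xx - 40) ** 2 + (yy - 50) ** 2 < 15 ** 2] = 20    # pupil centred at (40, 50)
frame[(xx - 55) ** 2 + (yy - 45) ** 2 < 3 ** 2] = 255    # glint centred at (55, 45)

print(detect_pupil_centre(frame))   # centroid near (40, 50)
print(detect_glint(frame))          # centroid near (55, 45)
```

Real implementations refine such a rough estimate, e.g. by fitting an ellipse to the pupil boundary, which is what makes sub-pixel accuracy possible.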
How do the features tell where I am looking? • Relative positions of pupil centre and glint(s) carry information about the position of the eyes relative to the eye-tracker. • Two approaches to using this information: • Do regression on the pupil-glint vectors. • Reconstruct the eye’s position in 3D space using an eye model. • Both approaches require calibration.
Calibration • Calibration is required because of: • Differences in participants’ eyes. • Changes in environmental conditions (lighting etc.). • Changes in relative position of hardware and participant. • Demonstration of Tobii Studio calibration.
Regression – An Analogy • A little more complex than this, but the same principle is used. • Red dots indicate the known pupil-glint vectors (y) for specific screen positions (x). • Where the moving dot paused and shrank. • We calculate the blue line that fits the red dots best. • In our case the blue line represents pupil-glint vectors for every position on the screen. • Our line is more complex, but still a function of screen position (x). • For every pupil-glint vector (y) we now get, we can determine the screen position (x).
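The analogy above can be made concrete with a least-squares fit: calibration collects pupil-glint vectors at known screen targets, a polynomial mapping is fitted, and new vectors are then mapped back to screen coordinates. This is a hedged sketch only; the nine-point grid, the second-order polynomial terms, and the `true_map` stand-in for the real eye geometry are all assumptions, not any vendor's actual calibration procedure.

```python
import numpy as np

def design_matrix(v):
    """Second-order polynomial terms of the pupil-glint vector (vx, vy)."""
    vx, vy = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

def true_map(s):
    """Hypothetical stand-in for the eye geometry: screen position -> vector."""
    return 0.01 * s + 0.000002 * s**2

# Hypothetical calibration: 9 screen targets and the vectors observed there.
screen = np.array([(x, y) for y in (100, 400, 700) for x in (100, 800, 1500)], float)
vectors = true_map(screen)

# Fit the mapping from pupil-glint vector back to screen position.
coeffs, *_ = np.linalg.lstsq(design_matrix(vectors), screen, rcond=None)

# Gaze estimation for a new frame: new vector -> estimated screen coordinates.
v_new = true_map(np.array([[760.0, 390.0]]))
print(design_matrix(v_new) @ coeffs)   # close to the true target (760, 390)
```

Note the direction of the fit: calibration learns vector-to-screen, so each subsequent frame needs only one matrix multiplication, which is why regression-based gaze estimation is cheap per frame.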
Eye Model • The type of calibration required depends on the number of cameras and IR light sources used.
What does the raw data look like? • Example of Excel sheet with raw data exported from Tobii Studio. • Demonstration of raw data in graphical form.
Events • Hard to interpret raw data, therefore it needs to be grouped into ‘events’. • Mostly, eye movements can be classified as fixations and saccades. • Fixations: The eyes focus on a single spot. This is when cognitive processing takes place. • Saccades: The eyes move from one fixation to another.
Calculation of Events • Fixation Filters • Group together raw data points that are in close proximity, both spatially and temporally. • The remaining data points between fixations form saccades.
Fixation Filters • Numerous filters exist. • Filters have numerous parameters and thresholds, because different scenarios require different ways of calculating fixations. • The I-VT filter • Gap fill-in: adds missing data by linear interpolation. • Noise reduction: smooths data by averaging. • Velocity calculator: adjusts maximum eye velocity inside fixations. • Merge adjacent fixations: merges fixations that are in close proximity. • Discard short fixations: discards fixations that are unrealistically short.
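The core of the I-VT (velocity-threshold) filter is small: compute the angular velocity between consecutive samples and label slow samples as fixation, fast ones as saccade. The sketch below shows only that core, omitting the gap fill-in, noise-reduction and merging stages listed above; the 30°/s threshold and the synthetic 60 Hz recording are assumptions for illustration.

```python
import numpy as np

def ivt_classify(gaze, timestamps, velocity_threshold=30.0):
    """Label each sample 'fixation' or 'saccade' by point-to-point velocity.

    gaze: (N, 2) gaze angles in degrees; timestamps in seconds.
    Minimal I-VT core only; real filters add fill-in, smoothing and merging.
    """
    deltas = np.diff(gaze, axis=0)
    dt = np.diff(timestamps)
    speed = np.linalg.norm(deltas, axis=1) / dt             # degrees per second
    labels = np.where(speed < velocity_threshold, "fixation", "saccade")
    return np.append(labels, labels[-1])                    # last sample reuses label

# Synthetic 60 Hz recording: fixate near (10, 10), jump to (20, 12), fixate again.
t = np.arange(20) / 60.0
gaze = np.array([[10, 10]] * 10 + [[20, 12]] * 10, float)
gaze += np.random.default_rng(0).normal(0, 0.05, gaze.shape)   # measurement noise

labels = ivt_classify(gaze, t)
print(labels)   # mostly 'fixation', with 'saccade' at the jump
```

Consecutive fixation-labelled samples would then be grouped into one fixation event, whose centre and duration are what most analyses actually use.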
Data Quality • Major measures of data quality are: • Accuracy: the closeness of the calculated point of regard to the true one. • Precision: the spread of the raw data points during a fixation. • Robustness: % of participants that can be tracked with an eye-tracker. • Trackability: % of frames for which the eyes are tracked. (Figure: illustrations of poor accuracy vs. poor precision.)
Typical Accuracy Values • Measured in degrees (°) of visual angle. • If one sits at a typical distance of 60cm from the stimulus, 1 cm ≈ 1°. • Acceptable accuracy depends on type of study. • If two different areas of interest are in close proximity, good accuracy is required. • Most manufacturers claim 0.5°, but reality is 0.7° - 1.7°. • Based on several sources and personal experience.
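The "1 cm ≈ 1° at 60 cm" rule of thumb follows directly from the visual-angle formula, angle = 2·atan(size / 2·distance), and can be checked in a few lines:

```python
import math

def visual_angle_deg(size_cm, distance_cm):
    """Visual angle subtended by an object of size_cm seen from distance_cm."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

# At 60 cm viewing distance, 1 cm on screen subtends roughly one degree:
print(round(visual_angle_deg(1.0, 60.0), 3))   # 0.955

# Conversely, a 1 degree accuracy error corresponds to about 1 cm on screen:
error_cm = 2 * 60.0 * math.tan(math.radians(1.0) / 2)
print(round(error_cm, 3))   # 1.047
```

So a real-world accuracy of 0.7°–1.7° means the reported gaze point can be off by roughly 0.7–1.8 cm at that distance, which is why closely spaced areas of interest are problematic.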
Thank You Dankie