Past Projects, Current Work, & Future Potential Eye Tracking Research Andrew T. Duchowski
What is Eye Tracking? • Method for calculating what a user is looking at • Accomplished through computer vision techniques that image the pupil and the corneal reflection of infrared light • Can be done in real time, so the eyes can be used to point at windows, icons, and menus in place of a hand-controlled mouse • Or… eye movements can be recorded for offline analysis • Recent improvements in camera technology have revitalized the field
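The pupil/corneal-reflection technique above reduces to a calibration problem: the user fixates known on-screen targets, and a mapping from the measured pupil-glint vector to screen coordinates is fitted. A minimal sketch of such a mapping, using a second-order polynomial fitted by least squares — a generic textbook approach, not the Tobii system's actual algorithm; all calibration values below are made up for illustration:

```python
import numpy as np

def fit_gaze_mapping(pg_vectors, screen_points):
    """Fit a second-order polynomial mapping from pupil-glint
    vectors (vx, vy) to screen coordinates via least squares."""
    V = np.asarray(pg_vectors, dtype=float)
    S = np.asarray(screen_points, dtype=float)
    vx, vy = V[:, 0], V[:, 1]
    # Design matrix: 1, vx, vy, vx*vy, vx^2, vy^2
    A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])
    coeffs, *_ = np.linalg.lstsq(A, S, rcond=None)
    return coeffs

def estimate_gaze(coeffs, vx, vy):
    """Map one pupil-glint vector to an (x, y) screen position."""
    a = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return a @ coeffs

# Calibration: the user fixates known on-screen targets while the
# tracker records the pupil-glint vector for each one (toy data).
calib_vectors = [(-0.2, -0.1), (0.0, -0.1), (0.2, -0.1),
                 (-0.2,  0.1), (0.0,  0.1), (0.2,  0.1)]
calib_targets = [(160, 120), (640, 120), (1120, 120),
                 (160, 600), (640, 600), (1120, 600)]
coeffs = fit_gaze_mapping(calib_vectors, calib_targets)
x, y = estimate_gaze(coeffs, 0.0, -0.1)
```

Once calibrated, each new video frame yields one pupil-glint vector, so gaze position can be estimated at the camera's frame rate — this is what makes real-time pointing feasible.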
Old(er) Technology • Windows 95 PC • Requires chinrest • Cumbersome and difficult to operate Fig.1: VR and eye-tracking equipment
State-of-the-Art Next Generation Technology • Tobii ET-1750 eye tracker • Apple Mac G5 (ET client) • Dell PC (ET server) • 3-headed display Fig.2: Tobii user Fig.3: Tobii setup
Selected Past/Current Projects (~$1.3M) • NASA ($265,742, with IE): Visual inspection training • diagnostic eye tracking for aircraft inspection • NSF CAREER ($212,022): Computer graphics, education, service • real-time Level Of Detail, course development • NSF ITR ($313,361, with IE): Eye tracked avatars in Virtual Reality • eye tracked real-time “look at this” communication • NSF ATE ($389,655, with IE): Tech. transfer to Greenville Tech • aircraft inspection training and education • DoD/Navy/SPAWAR Charleston ($115,528): Usability engineering • interface design through eye movement analysis
NASA: Visual Inspection • Visual inspection in VR cargo bay (VRST 2001) • 2½ D scanpaths (EuroGraphics 2002) • Eye movement analysis in 3D (ETRA 2002) • 3D scanpath comparison (VSS 2003) Fig.4: Real and virtual aircraft cargo bay (2½ D and 3D models) and eye movements
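Scanpath comparison is often done by encoding each fixation sequence as a string of region labels and computing a string-edit distance between them. A minimal sketch of that classic approach — the generic Levenshtein measure, not necessarily the 3D comparison algorithm presented at VSS 2003:

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance between two
    fixation sequences encoded as strings of region labels."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def scanpath_similarity(a, b):
    """Normalized similarity in [0, 1]; 1 means identical sequences."""
    if not a and not b:
        return 1.0
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))

# Two subjects visiting regions A, B, C, D in slightly different order:
s = scanpath_similarity("ABCD", "ABD")
```

Extending this to 3D environments requires mapping gaze samples onto regions of 3D geometry first, which is the harder part of the problem.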
NASA: Visual Inspection • Our most recent success: our first publication at SIGCHI (nominated for best paper, top 5% of 371 submissions)
NSF CAREER: Real-Time Graphics and Video • Gaze-contingent terrain generation showed the feasibility of real-time graphics changing with gaze direction (Smart Graphics 2000) • Models gave a 10-fold speedup in frame rate (EuroGraphics 2001) • Video demonstrates in-hardware image processing on programmable graphics cards (GPUs) (ADC 04 invited talk) Fig.5: Gaze-contingent terrain, model, video
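Gaze-contingent rendering of this kind typically selects a level of detail (LOD) for each part of the scene from its angular eccentricity relative to the gaze direction: geometry near the point of regard is rendered finely, peripheral geometry coarsely. A minimal sketch of that selection step — the threshold values are illustrative assumptions, not the published model's parameters:

```python
import math

def lod_for_vertex(gaze_dir, vertex_dir, thresholds=(5.0, 15.0, 30.0)):
    """Pick a discrete level of detail from the angular eccentricity
    (degrees) between the gaze direction and the direction to a model
    vertex; smaller angle -> finer detail (level 0 is the finest)."""
    dot = sum(g * v for g, v in zip(gaze_dir, vertex_dir))
    norm = (math.sqrt(sum(g * g for g in gaze_dir)) *
            math.sqrt(sum(v * v for v in vertex_dir)))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    for lod, t in enumerate(thresholds):
        if angle <= t:
            return lod
    return len(thresholds)  # coarsest level in the far periphery

# Looking straight at the vertex -> finest level of detail.
lod = lod_for_vertex((0.0, 0.0, -1.0), (0.0, 0.0, -1.0))
```

The speedup comes from spending polygon budget only where the viewer can actually resolve detail; since visual acuity falls off steeply outside the fovea, coarse peripheral geometry is largely unnoticed.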
NSF CAREER: Course Development Fig.6: Example of student work: VR maze navigation experiment • Quality of coursework is steadily improving • Scanpaths showed user attention to exocentric map during navigation of a 3D maze (EGVE 04)
NSF CAREER: Course Development • Our most recent success: undergraduate paper at Graphics Interface 2005 (DuPont award)
NSF CAREER: Service & Scholarship Fig.7: ETRA conference proceedings and text • Eye Tracking Research & Applications conference brings together researchers from around the world, now in its 4th year
NSF ITR: Eye Tracked Avatars • Research showed advantage of visual deictic reference in VR for training purposes (ETRA 04) Fig.8: Avatars in Collaborative Virtual Environment (CVE)
NSF ATE: Technology Transfer • Currently transferring VR technology to Greenville Tech for aircraft inspection training • Expanding the training simulators to borescope inspection Fig.9: Physical and virtual engine components for inspection Fig.10: User working with WindowVR
DoD/Navy/SPAWAR Charleston: Usability Engineering • Collaboration between the military (Navy), industry (EMA), and academia (Clemson University) • Using eye movements to design user interfaces (project just started) • Navy expects to apply to design of Command & Control applications Fig.11: Collecting scanpaths and “hot spots”
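Deriving scanpaths and "hot spots" from raw gaze data starts with fixation detection, i.e., grouping noisy gaze samples into stable fixations. A minimal sketch of the widely used dispersion-threshold (I-DT style) approach — the threshold values here are illustrative defaults, not the project's settings:

```python
def _dispersion(window):
    """Bounding-box dispersion (width + height) of (x, y) samples."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, dispersion=30.0, min_samples=5):
    """Dispersion-threshold fixation detection: any window of at least
    min_samples gaze points whose dispersion stays under the threshold
    (pixels) is reported as one fixation centroid."""
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        window = samples[i:i + min_samples]
        if _dispersion(window) > dispersion:
            i += 1          # no fixation starting here; slide ahead
            continue
        j = i + min_samples
        # Grow the window while dispersion stays below threshold.
        while j < n and _dispersion(samples[i:j + 1]) <= dispersion:
            j += 1
        window = samples[i:j]
        cx = sum(p[0] for p in window) / len(window)
        cy = sum(p[1] for p in window) / len(window)
        fixations.append((cx, cy))
        i = j
    return fixations

# Two stable gaze clusters separated by a saccade (toy data):
samples = [(100, 100), (102, 101), (99, 103), (101, 100), (100, 102),
           (400, 300), (402, 299), (401, 301), (399, 300), (400, 302)]
fixes = detect_fixations(samples)
```

Accumulating fixation centroids (optionally weighted by duration) over many users yields the hot-spot maps used to evaluate interface layouts.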
Future Projects • NSF / DoD: seeking funding for next-generation eye tracking equipment • currently only one tracker for the entire class (and me!) • Future projects: • NSF? • DoD? • BMW / ICAR? Fig.12: UAV navigation training Fig.13: HUD design/analysis
References
Danforth, R., Duchowski, A., Geist, R., and McAliley, E. (2000). A Platform for Gaze-Contingent Virtual Environments. In Smart Graphics (Papers from the 2000 AAAI Spring Symposium, Technical Report SS-00-04), Menlo Park, CA, AAAI, pp. 66-70.
Duchowski, A., Cournia, N., and Murphy, H. (2004). Gaze-Contingent Displays: Review and Current Trends. In Adaptive Displays Conference, 7 August, Los Angeles, CA.
Duchowski, A., Medlin, E., Cournia, N., Gramopadhye, A., Melloy, B., and Nair, S. (2002). 3D Eye Movement Analysis for VR Visual Inspection Training. In Eye Tracking Research & Applications (ETRA), 25-27 March, New Orleans, LA, ACM, pp. 103-110, 155.
Duchowski, A., Medlin, E., Gramopadhye, A., Melloy, B., and Nair, S. (2001). Binocular Eye Tracking in VR for Visual Inspection Training. In VRST Proceedings, ACM, pp. 1-8.
Duchowski, A., Marmitt, G., Desai, R., Gramopadhye, A., and Greenstein, J. (2003). Algorithm for Comparison of 3D Scanpaths in Virtual Reality. In VSS 03 Proceedings, Vision Sciences Society (posters).
Marmitt, G. and Duchowski, A. (2002). Modeling Visual Attention in VR: Measuring the Accuracy of Predicted Scanpaths. In EuroGraphics 2002 (Short Presentations), 2-6 September, Saarbrücken, Germany, EuroGraphics.
Murphy, H. and Duchowski, A. (2002). Perceptual Gaze Extent & Level Of Detail in VR: Looking Outside the Box. In SIGGRAPH 02 Conference Abstracts & Applications, San Antonio, TX, ACM.
Murphy, H. and Duchowski, A. (2001). Gaze-Contingent Level Of Detail. In EuroGraphics (Short Presentations), Manchester, UK, EuroGraphics.
Vembar, D., Iyengar, N., Duchowski, A., Clark, K., Hewitt, J., and Pauls, K. (2004). Effect of Visual Cues on Human Performance in Navigating Through a Virtual Maze. In EuroGraphics Symposium on Virtual Environments, 8-9 June, Grenoble, France, EuroGraphics.