eyePROXY enables spontaneous, minimally distracting remote communication using physical surrogates and eye gestures. Grounded in attention research, the system augments verbal interactions with visual cues.
eyePROXY
Lenko Grigorov, CISC 839
Supervisor: Roel Vertegaal
Additional support by Skaburskis A. and Changuk S.
Motivation
• Nonverbal communication: communication using a number of cues; precedes verbal communication
• Physical surrogates: make physical communication cues available remotely
• Spontaneous interaction: interaction established without the need for complex procedures
• Availability of the other party: with participants separated geographically, is the other party available?
• Interruption level: the device should be minimally distracting and provide different levels of interruption
Research
• Based on work by Baha Jabarin and James Wu
  • Jabarin B., Wu J. The physical proxy: an attention-based mechanism for establishing distributed verbal communication, 2002
• Related research
  • Kuzuoka H., Greenberg S. Mediating Awareness and Communication through Digital but Physical Surrogates, CHI '99, 1999
  • Vertegaal R. The GAZE Groupware System, CHI '99, 1999
  • Greenberg S. Peepholes: Low cost awareness of one's community, ACM SIGCHI '96, 1996
  • Tang J. et al. Supporting distributed groups with a Montage of lightweight interactions, CSCW '94, 1994
  • Vertegaal R. Designing Attentive Interfaces, ETRA '02, 2002
  • others
Prototype
• Sony camera rotates the eyeballs: no visual inspection, no eye tracking by the camera itself
• eyeTracker reports the user's eye position relative to the current camera position, so the eyeballs can target the user
• Calls controlled through eye gaze: the GUI is used only for setup, ringing, and sending eye gestures
• Proxies communicate via the Internet: exchange of state information
• H.323 VoIP connection
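The state exchange between proxies can be sketched as follows. This is a minimal illustration, assuming a simple JSON message format; the actual wire format used by eyePROXY is not described on the slides, and the H.323 voice channel is not shown.

```python
# Hypothetical sketch of the proxy-to-proxy state exchange over the Internet.
# Message format (JSON with "state"/"gesture" keys) is an assumption for
# illustration; the voice call itself travels over a separate H.323 channel.

import json

def encode_state(state, gesture=None):
    """Serialize the local proxy's state into a message for the remote proxy."""
    msg = {"state": state}
    if gesture is not None:
        msg["gesture"] = gesture
    return json.dumps(msg).encode()

def decode_state(raw):
    """Parse a state message received from the remote proxy."""
    return json.loads(raw.decode())

# Example round trip: tell the remote proxy we are tracking our user.
wire = encode_state("tracking_user")
remote_view = decode_state(wire)
```

Keeping the control channel (state messages) separate from the media channel (H.323 audio) lets each proxy render the other party's attention state even when no call is active.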
Overview
Project Development
• Added eye tracker to detect gaze
• Implemented states and event handlers
• Developed a set of eye gestures
• Incorporated VoIP system
Eye tracking
• Implementation using the eyeCONTACT sensor
  • Analogue video input
  • MS DirectShow filter to locate pupils: reports pupil coordinates
• If no pupils are found: locate by scanning along a path
• Tracking: keep the pupils centered in the image
• Allows for
  • Natural interaction
  • User attention
  • … but it is very hard to locate users
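The scan-then-track behavior above can be sketched as a single control step. This is an assumption-laden illustration (frame size, gain, and scan waypoints are invented): when the pupil filter reports nothing, the camera steps along a preset scan path; once pupils are reported, a proportional controller pans and tilts to keep them centered.

```python
# Sketch of the locate-then-track loop (all constants are hypothetical).
# pupils: (x, y) pupil centroid from the detection filter, or None if no
# pupils were found in the current frame.

FRAME_W, FRAME_H = 320, 240   # assumed analogue-video frame size
GAIN = 0.05                   # proportional gain, degrees per pixel of error

# Pan/tilt waypoints (degrees) visited while searching for a user.
SCAN_PATH = [(-30, 0), (0, 0), (30, 0), (0, 10)]

def next_camera_move(pupils, scan_index):
    """Return (pan_delta, tilt_delta, new_scan_index) for one control step."""
    if pupils is None:
        # Searching: step to the next waypoint on the scan path.
        pan, tilt = SCAN_PATH[scan_index % len(SCAN_PATH)]
        return pan, tilt, scan_index + 1
    # Tracking: drive the centering error toward zero.
    err_x = pupils[0] - FRAME_W / 2
    err_y = pupils[1] - FRAME_H / 2
    return GAIN * err_x, GAIN * err_y, scan_index

# Example: pupils detected right of center -> small pan right, no tilt.
pan, tilt, idx = next_camera_move((200, 120), scan_index=0)
```

The proportional step keeps camera motion smooth while tracking, while the fixed scan path gives the "searching" state predictable, legible movement.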
States and events (1)
• Explicit states
  • States for eye attention, call, proxy movements…
  • Can be communicated via the network
• eyePROXYs learn about the other party
  • Via an exchange of messages
States and events (2)
• Events: onRing(), onEyeContact(), onEyesLost(), onHangup(), onNod(), onBeep(), …
• States: idle, searching, tracking_user, call_request, in_call, in_call/idle, …
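The states and events above can be wired together as a small state machine. The state and event names come from the slides; the transitions below are plausible assumptions for illustration only, not the actual eyePROXY transition table.

```python
# Minimal state-machine sketch using the slide's states and events.
# The transition table is an assumption: e.g. a ring while idle becomes a
# call request, and a nod gesture accepts it.

class EyeProxy:
    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        """Apply one event; unknown (state, event) pairs leave the state unchanged."""
        transitions = {
            ("idle", "onRing"): "call_request",
            ("idle", "onEyeContact"): "tracking_user",
            ("tracking_user", "onEyesLost"): "searching",
            ("searching", "onEyeContact"): "tracking_user",
            ("call_request", "onNod"): "in_call",
            ("in_call", "onEyesLost"): "in_call/idle",
            ("in_call/idle", "onEyeContact"): "in_call",
            ("in_call", "onHangup"): "idle",
        }
        self.state = transitions.get((self.state, event), self.state)
        return self.state

proxy = EyeProxy()
proxy.handle("onRing")   # idle -> call_request
proxy.handle("onNod")    # call_request -> in_call (accepted via eye gesture)
```

Because each state is explicit, the current (state, event) pair can also be serialized and sent to the remote proxy, matching the slide's point that states "can be communicated via the network."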
Eye gestures (1)
• Set of "eye" movements
• Communicate information non-verbally, as in real conversations
• Approximate real eye cues: a shake when the other party is not available, looking away at the end of a conversation…
• Coupled with the states of the system: gives the user feedback about state changes
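A gesture of this kind can be represented as a scripted sequence of eyeball poses. The sketch below is hypothetical (gesture names, angles, and the motor-control callback are all invented) and only illustrates the idea of approximating real eye cues with pan/tilt sequences.

```python
# Hypothetical encoding of eye gestures as pan/tilt pose sequences.
# Each tuple is (pan_degrees, tilt_degrees, hold_seconds); "shake" signals
# that the other party is unavailable, "look_away" ends a conversation.

GESTURES = {
    "shake":     [(-15, 0, 0.2), (15, 0, 0.2), (-15, 0, 0.2), (0, 0, 0.2)],
    "look_away": [(45, -10, 1.0), (0, 0, 0.5)],
    "nod":       [(0, -10, 0.2), (0, 10, 0.2), (0, 0, 0.2)],
}

def play_gesture(name, move_eyeballs):
    """Run a gesture by feeding each pose to a motor-control callback."""
    for pan, tilt, hold in GESTURES[name]:
        move_eyeballs(pan, tilt, hold)

# Example: record the poses instead of driving real motors.
poses = []
play_gesture("shake", lambda p, t, h: poses.append((p, t)))
```

Tying each gesture to a state transition (e.g. playing "shake" on entering an unavailable state) is what gives the user feedback about state changes without any on-screen display.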
Eye gestures (2)
Conclusion
• Performs well as a physical proxy for a remote individual
• Availability of the other party is verified with little disturbance
• Facilitates casual communication
  • Both establishing and maintaining it
• Augments verbal communication with visual cues
  • Eye gestures
Future development
• Integration with the eyeReason server
• Addition of another filter to locate the head
  • Kalman filter; recognizes skin
• Recognition of the user
• Easier configuration
• Support for multiple proxies
• etc.