
Effect of Shared-attention for Human-Robot Interaction


Presentation Transcript


  1. Effect of Shared-attention for Human-Robot Interaction Junji Yamato jy@acm.org NTT Communication Science Labs., NTT Corp. Japan Kazuhiko Shinozawa, Futoshi Naya ATR Intelligent Robot and Communication Labs.

  2. Aim • To build a social robot/agent • Sub-goals: to establish • Evaluation methods • Design guidelines for human-robot/agent communication

  3. Method • Measure the influence of the agent/robot on users • Metric: acceptance ratio of the agent/robot's recommendations
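
As a concrete illustration of this metric, here is a minimal Python sketch (not the authors' code; the Trial fields are hypothetical) of how the acceptance ratio could be computed from per-question logs.

```python
from dataclasses import dataclass

@dataclass
class Trial:
    recommended: str  # color name the robot/agent recommended
    chosen: str       # color name the subject actually selected

def acceptance_ratio(trials: list[Trial]) -> float:
    """Fraction of questions on which the subject followed the recommendation."""
    accepted = sum(1 for t in trials if t.chosen == t.recommended)
    return accepted / len(trials)
```

For example, a subject who follows 18 of the 30 recommendations has an acceptance ratio of 0.6.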

  4. Color-name selection task • No "correct" answer, so subjects are easily influenced • Examples: Blue or green? Cobalt green or emerald green? Skin color or KARE-IRO? SUMIRE-IRO or AYAME-IRO? ... • Total: 30 questions (taken from a color-name textbook)

  5. Four experiments • Compared agent and robot • Compared agent and robot in the physical world • Measured the effect of eye contact • Measured the effect of shared attention. Detailed description of Experiments 1 and 2: Shinozawa, K., Naya, F., Yamato, J., and Kogure, K. Differences in Effect of Robot and Screen Agent Recommendations on Human Decision-Making, IJHCS (to appear). Experiments 1 and 2, and a description of the K4 robot: Yamato, J., Shinozawa, K., Brooks, R., and Naya, F. Human-Robot Dynamic Social Interaction. NTT Technical Review 1, 6 (2003), 37-43. Available online at http://www.ntt.co.jp/tr/ (Back number -> Sep. 2003).

  6. Experiment 1: Compare agent and robot (photos: agent and robot) • Conditions: 30 questions, 30 subjects in each group; same question sequences, same voice, similar gestures • Measurement: acceptance ratio, questionnaire

  7. Experiment 1: Robot

  8. Experiment 1: Results • Acceptance: agent > robot (p < .01) • Familiarity: independent
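
The slide reports p < .01 but does not name the statistical test; as a hedged sketch only, the group comparison could be run like this with a Mann-Whitney U test, one reasonable choice for two independent groups of per-subject acceptance ratios.

```python
from scipy.stats import mannwhitneyu

def compare_acceptance(agent_ratios, robot_ratios):
    """Two-sided Mann-Whitney U test on per-subject acceptance ratios
    (agent group vs. robot group, 30 subjects each)."""
    stat, p = mannwhitneyu(agent_ratios, robot_ratios, alternative="two-sided")
    return stat, p
```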

  9. Gap between expectation and result • Initial expectation: the robot has more influence because it lives in the same 3D world as the subjects • Actual result: agent ○, robot ×

  10. Experiment 2: Comparison in the physical world (setup photo: color plate and button boxes) • No recommendation (30 subjects) • Recommendation by robot (31 subjects) • Recommendation by agent (30 subjects)

  11. Experiments

  12. Experiment 2: Results • Selection ratio: robot > agent (p < 0.05); robot >> no recommendation (p < 0.01)

  13. Embodiment and communication (Experiments 1 and 2 results) • Media world: agent ○, robot × • Physical world: agent ×, robot ○ • Consistency between embodiment and task environment matters.

  14. Why is the robot better? • Its gaze is easy to detect • Eye contact • Shared attention / joint attention → Next: measure the effect of eye contact and of shared attention

  15. Experiment 3: Effect of eye contact (mutual gaze) • Eye contact was established by face tracking • Eye-contact time: the period during which the subject looked at the robot and the robot looked at the subject • Is eye-contact time related to selection ratio? • Two groups (14 subjects each): eye contact vs. no eye contact
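
A minimal sketch of the eye-contact-time measure, assuming per-frame boolean gaze logs sampled at 50 frames per second (a later slide states 50 counts = 1 sec); the log format is an assumption, not the authors' implementation.

```python
FRAME_RATE = 50  # assumed sampling rate: 50 counts = 1 sec

def eye_contact_time(robot_looks_at_subject, subject_looks_at_robot):
    """Seconds during which the robot and the subject look at each other
    simultaneously (mutual gaze), given per-frame boolean sequences."""
    frames = sum(1 for r, s in zip(robot_looks_at_subject, subject_looks_at_robot)
                 if r and s)
    return frames / FRAME_RATE
```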

  16. Robots

  17. Selection ratio • Higher selection ratio for the eye-contact group • K4: no E.C. < E.C. (p = 0.012) • Rabbit: no E.C. < E.C. (p = 0.003)

  18. Experiment 4: Effect of shared attention • Shared attention (SA): the period during which the robot looks at an object and the subject looks at the same object (color plate, button box) • Is there a correlation between SA time and selection ratio?
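
Analogously to the eye-contact sketch above, shared-attention time can be sketched as the time both gaze targets coincide on a task object; the gaze-target labels below are hypothetical.

```python
FRAME_RATE = 50  # 50 counts = 1 sec, as stated on the scatter-plot slide

def shared_attention_time(robot_target, subject_target,
                          objects=("color_plate", "button_box")):
    """Seconds during which the robot and the subject look at the same
    task object, given per-frame gaze-target labels."""
    frames = sum(1 for r, s in zip(robot_target, subject_target)
                 if r == s and r in objects)
    return frames / FRAME_RATE
```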

  19. Establishing shared attention • The robot looks at the color plate and the button box according to a prepared (scripted) program • Eye contact is established by face tracking • Example: video
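
A rough sketch, under stated assumptions, of the control logic described here: object-directed gaze follows a prepared schedule, while eye contact is driven by face tracking. run_gaze_script, face_position, and look_at are hypothetical placeholders, not a real robot API.

```python
import time

def run_gaze_script(script, face_position, look_at):
    """script: list of (duration_sec, target) pairs prepared in advance;
    target is "subject", "color_plate", or "button_box"."""
    for duration, target in script:
        if target == "subject":
            pos = face_position()   # face-tracker output (None if no face found)
            if pos is not None:
                look_at(pos)        # turn toward the face to establish eye contact
        else:
            look_at(target)         # gaze at the scripted task object
        time.sleep(duration)        # hold the gaze for the scripted duration
```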

  20. Experimental conditions • 28 subjects • SA time = 51.7 sec (total over 30 questions), longer than in Experiment 3 • Selection ratio: average 0.57, S.D. = 0.14 • Some subjects were positive and others were not; the questionnaire showed a clear contrast. Negative example: "The robot is prompting the wrong choice. I feel the robot forced me to select its recommendation."

  21. SA time and selection ratio • No correlation (scatter plot: selection ratio vs. shared-attention time in counts; 50 counts = 1 sec)

  22. Clustering subjects by TEG (egogram) • The egogram is based on transactional analysis • A questionnaire measures the three ego-state categories: • CP, NP (Critical Parent, Nurturing Parent) • A (Adult) • FC, AC (Free Child, Adapted Child) • The TEG (Tokyo University Egogram) is common in Japan
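
As an illustration of the grouping step, a hypothetical median split on the TEG Adapted Child (AC) score might look as follows; the slides do not state how the high/low cut was actually made, so the threshold here is an assumption.

```python
from statistics import median

def split_by_ac(subjects):
    """subjects: list of dicts holding a TEG 'AC' score plus experiment measures.
    Returns (high_ac, low_ac) groups split at the median AC score."""
    cut = median(s["AC"] for s in subjects)
    high_ac = [s for s in subjects if s["AC"] >= cut]
    low_ac = [s for s in subjects if s["AC"] < cut]
    return high_ac, low_ac
```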

  23. High/low TEG scores and SA time • Strong correlation between SA time and acceptance ratio for the high-AC (Adapted Child) group

  24. SA time and selection ratio (high-AC & low-CP group) • Positive correlation (Spearman's r = 0.51, p = 0.051)
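
The reported statistic is Spearman's rank correlation, which could be reproduced for this subgroup with a call like the following (variable names are placeholders):

```python
from scipy.stats import spearmanr

def sa_selection_correlation(sa_times, selection_ratios):
    """Spearman's rank correlation between per-subject SA time and
    selection ratio; returns (rho, p)."""
    rho, p = spearmanr(sa_times, selection_ratios)
    return rho, p
```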

  25. SA time and selection ratio • Within the high-AC group, the high-SA subgroup had a higher selection ratio (p < 0.05)

  26. Results and Discussion • High-AC subjects (obedient type) showed a positive correlation between SA time and selection ratio. • No significant difference in SA time itself or in selection ratio between the high-AC and low-AC groups. • Eye contact and shared attention promote close communication; some people like such an intimate relation and others do not. It depends on personality. • SA is effective even when it is not "actually" realized: we do not need to develop image-understanding technology; we just have to fake it.
