
Improving Human Computer Interaction in a Classroom Environment using Computer Vision


Presentation Transcript


  1. Improving Human Computer Interaction in a Classroom Environment using Computer Vision by Joshua Flachsbart, David Franklin and Kristian Hammond, in Proceedings of the 5th International Conference on Intelligent User Interfaces (IUI 2000). Presentation by Mika P. Nieminen

  2. Overview • Aims to improve HCI by combining multiple (visual) input modes • Discusses the chosen input modes, their constraints, contexts and utilisation

  3. Environment: Intelligent Classroom • A confined space enables better sensing, plan (task) recognition and plan execution (helping the user) • Currently supported functions: automatic filming of a lecture, PowerPoint presentation aids, VCR control, person tracking and icon recognition

  4. Tasks out of bulk data • Uses contextual information, mostly from visual sensors, to locate the user and infer their goals/tasks • Hand locations are important for pointing and for drawing command icons

  5. Person tracking methods 1/2 • Background Subtraction Input Mode (BSIM) • Slower • Does not recognise different users • Loses users who stop moving
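The paper does not include code, but the background-subtraction idea can be illustrated with a short sketch. The fragment below uses OpenCV's MOG2 subtractor (a modern stand-in, not the Classroom's actual implementation) to pull the largest moving blob out of a camera feed; the camera index, kernel size and thresholds are illustrative assumptions. It also shows why an immobile user is eventually lost: someone who stands still is absorbed into the background model.

```python
import cv2

# Classroom camera feed (index 0 is an assumption, not from the paper).
cap = cv2.VideoCapture(0)
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Pixels that differ from the learned background are marked as foreground.
    mask = subtractor.apply(frame)
    # Drop shadow pixels (value 127) and speckle noise.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Treat the largest foreground blob as the tracked person.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        person = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(person)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    # Note: if the person stops moving, they slowly merge into the background
    # model and the blob disappears, mirroring the slide's "loses immobile users".

    cv2.imshow("BSIM sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```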

  6. Person tracking methods 2/2 • Color Histogram Input Mode • The user's colour histogram must be taught beforehand • Does not track hands • Uses the same engine as BSIM
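As an illustration of the colour-histogram idea (again not the authors' code), the sketch below "teaches" a hue histogram from a hand-picked region of the first frame and then tracks it with OpenCV back-projection plus CamShift. The ROI coordinates and saturation/value thresholds are placeholder assumptions.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if not ok:
    raise SystemExit("No camera frame available")

# "Teach" the user's histogram from a hand-picked region of the first frame
# (these coordinates are placeholders, not values from the paper).
x, y, w, h = 300, 200, 80, 160
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
sat_mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
hist = cv2.calcHist([hsv_roi], [0], sat_mask, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

track_window = (x, y, w, h)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Back-project the learned histogram and shift the window toward the
    # region that best matches the user's colours.
    back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
    pts = cv2.boxPoints(rot_rect).astype(np.int32)
    cv2.polylines(frame, [pts], True, (255, 0, 0), 2)

    cv2.imshow("colour histogram sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:
        break

cap.release()
cv2.destroyAllWindows()
```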

  7. Icon Recognition • Icons are drawn on the board • Remembers learnt icons • Recognition by matching salient features • Video 'smart' zooms in using context
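The slide's salient-feature matching can be approximated with off-the-shelf descriptors. The sketch below matches ORB keypoints between a stored icon template and a whiteboard frame and declares a recognition when enough matches survive; the file names and the match-count threshold are hypothetical, and ORB is a stand-in for whatever features the Classroom actually used.

```python
import cv2

# A previously learnt icon and the current whiteboard frame
# (file names are placeholders, not from the paper).
icon = cv2.imread("play_icon.png", cv2.IMREAD_GRAYSCALE)
board = cv2.imread("whiteboard_frame.png", cv2.IMREAD_GRAYSCALE)
if icon is None or board is None:
    raise SystemExit("Missing example images")

# Detect salient keypoints and binary descriptors in both images.
orb = cv2.ORB_create(nfeatures=500)
kp_icon, des_icon = orb.detectAndCompute(icon, None)
kp_board, des_board = orb.detectAndCompute(board, None)

# Match icon features against the board and keep the strongest matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_icon, des_board), key=lambda m: m.distance)

# Declare the icon "recognised" if enough features match well
# (the threshold of 20 is an illustrative choice).
if len(matches) >= 20:
    print("Icon recognised; the Classroom could now trigger its command.")
else:
    print("Icon not found in this frame.")
```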

  8. Future Work • Show that multiple input modes function better than a single one • Improvements under development: switching input modes on the fly, and allowing input modes to adapt on the fly (e.g. handling 'fading' persons)

  9. Conclusions • Sounds nice, might work, needs more study • Limited applications
