
1. Combining Human and Machine Capabilities for Improved Accuracy and Speed in Visual Recognition Tasks
Research Experiment Design Sprint: IVS Flower Recognition System
Amir Schur

2. Problem Statement
• Investigate human-computer interaction in applications of pattern recognition where:
  • higher accuracy is required than is currently achievable by automated systems, and
  • there is enough time for a limited amount of human interaction
• Measure the degree of accuracy and speed improvement attributable to the human interaction

3. Background
• Most existing automated visual recognition systems yield results far below production level
• The fully manual task is too cumbersome and time-consuming
• We need a middle ground where speed is greatly increased and accuracy remains acceptable

4. Interactive Visual System (IVS) for Flower Identification
• Originally developed at RPI as a desktop PC system called "Caviar"
• Later developed into a handheld application, called IVS, in a joint RPI/Pace project

5. Overview of Interactive Tasks: IVS Flower Identification System
• Object segmentation
• Feature extraction (numerous tasks)
• Matching/classifying
Each task can be done by a human only, automated only, or a combination of human and computer.

6. Detail of IVS Interactive Tasks
IVS for flower identification has six system activities:
• Determining the dominant color of the flower petal
• Determining the secondary (less dominant) color of the flower petal
• Determining the color of the stamen or center portion of the flower
• Counting the number of petals
• Getting the horizontal and vertical bounds of the target flower
• Getting the horizontal and vertical bounds of a flower petal
The original software was developed in Java for desktop and Palm Pilot. It uses k-nearest-neighbor classification (so accurate training data is required), and color determination uses the RGB color space.
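The slide does not show IVS's actual matching code; as a minimal sketch, assuming a flower is reduced to a numeric feature vector (petal count plus a dominant RGB color, both hypothetical choices here), k-nearest-neighbor matching against labeled training data might look like:

```java
import java.util.*;

public class KnnFlowerMatcher {
    // A labeled training example: feature vector plus flower name
    // (the feature layout here is illustrative, not the IVS schema).
    static class Sample {
        final double[] features;
        final String label;
        Sample(double[] features, String label) {
            this.features = features;
            this.label = label;
        }
    }

    // Euclidean distance between two feature vectors of equal length.
    static double distance(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return Math.sqrt(sum);
    }

    // Classify by majority vote among the k nearest training samples.
    static String classify(List<Sample> training, double[] query, int k) {
        List<Sample> sorted = new ArrayList<>(training);
        sorted.sort(Comparator.comparingDouble(s -> distance(s.features, query)));
        Map<String, Integer> votes = new HashMap<>();
        for (Sample s : sorted.subList(0, Math.min(k, sorted.size()))) {
            votes.merge(s.label, 1, Integer::sum);
        }
        return Collections.max(votes.entrySet(), Map.Entry.comparingByValue()).getKey();
    }

    public static void main(String[] args) {
        // Features: [petalCount, dominantR, dominantG, dominantB] (made-up data)
        List<Sample> training = List.of(
            new Sample(new double[]{5, 250, 240, 60}, "buttercup"),
            new Sample(new double[]{5, 245, 235, 70}, "buttercup"),
            new Sample(new double[]{21, 255, 255, 255}, "daisy"),
            new Sample(new double[]{21, 250, 250, 245}, "daisy"));
        String result = classify(training, new double[]{20, 252, 252, 250}, 3);
        System.out.println(result); // prints "daisy"
    }
}
```

Because k-NN has no training phase beyond storing samples, classification quality depends entirely on the labeled data, which is why the slide stresses accurate training data.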

7. Design Sprint: Overview
• Three separate experiments:
  • human only
  • machine only (no human subjects necessary)
  • human and machine combined
• Currently IVS has:
  • 75 training photos (three photos each of 25 flowers)
  • 20 test photos
• Half the subjects will start with human-only identification, followed by machine + human, using the existing 20 test images
• The other half will do the reverse: human + machine, then human only. This group will also collect new flower images. (Balanced experimental design: no subjects get an unfair advantage.)

8. Design Sprint: Human-Only Scenario
• Capture time and accuracy
• Ideas:
  • Use the IVS test photos
  • Use a good flower guide to identify the photos

9. Design Sprint: Machine-Only Scenario
• Capture time and accuracy
• Ideas:
  • First iteration: use the existing 20 test photos
  • Record the top 10 choices
  • Acquire more digital images, with correct identifications, to enlarge the training data

10. Design Sprint: Human + Machine
• Capture time and accuracy of the combined human + machine activity
• Ideas:
  • Segregate each available automated task: run all tasks automated except one, which is done by human input
  • Segregate groups of tasks (e.g., color determination, background segmentation): perform one group with the computer and another with the human
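The first idea above is a leave-one-out design over the six IVS tasks. A small sketch (task names abbreviated from slide 6; not part of the IVS codebase) can enumerate the experimental conditions it produces:

```java
import java.util.*;

public class AblationDesign {
    enum Mode { HUMAN, MACHINE }

    // The six IVS system activities from slide 6, abbreviated.
    static final List<String> TASKS = List.of(
        "dominantColor", "secondaryColor", "centerColor",
        "petalCount", "flowerBounds", "petalBounds");

    // Leave-one-out conditions: in each condition exactly one task is
    // assigned to the human, and all the others run automated.
    static List<Map<String, Mode>> leaveOneOut() {
        List<Map<String, Mode>> conditions = new ArrayList<>();
        for (String humanTask : TASKS) {
            Map<String, Mode> condition = new LinkedHashMap<>();
            for (String t : TASKS) {
                condition.put(t, t.equals(humanTask) ? Mode.HUMAN : Mode.MACHINE);
            }
            conditions.add(condition);
        }
        return conditions;
    }

    public static void main(String[] args) {
        System.out.println(leaveOneOut().size() + " conditions"); // prints "6 conditions"
    }
}
```

Comparing per-condition time and accuracy against the machine-only baseline then isolates the contribution of human input to each individual task.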

11. Anticipated Experimental Outcomes
[Figure: time vs. accuracy in visual recognition tasks, with curves for machine only, human only, and machine + human]

12. Analysis of Results
• The time required by a human will dictate the need for machine assistance: how much time is saved by using the human + machine tool?
• The accuracy of human + machine will dictate the need for such a tool: can it achieve the same level of accuracy as human only?
• What is the maximum capability of machine only in terms of time and accuracy?

13. Possible Extensions
• Many functions can be extended:
  • Utilize different automated methods for color recognition (HSB, LAB, YCbCr, etc.)
  • Utilize automated texture-based methods (Gabor, GIST)
  • Utilize automated contour-based pattern recognition (distance vs. angles, distance projections, min/max ratio, area ratio, automated petal counting)
  • More seamless integration of human and machine input. Currently it is one or the other: the user cannot correct the machine's cropping and outlining results, and cannot correct the machine's color determination.
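For the first extension, the standard Java library already provides an RGB-to-HSB conversion (`java.awt.Color.RGBtoHSB`), so moving color determination from RGB to HSB would not require new conversion code. A minimal illustration:

```java
import java.awt.Color;
import java.util.Arrays;

public class ColorSpaceDemo {
    public static void main(String[] args) {
        // Convert pure red from RGB to HSB; components come back
        // as floats in [0, 1]: hue, saturation, brightness.
        float[] hsb = Color.RGBtoHSB(255, 0, 0, null);
        System.out.println(Arrays.toString(hsb)); // prints "[0.0, 1.0, 1.0]"
    }
}
```

HSB separates hue from brightness, which can make dominant-color matching less sensitive to lighting differences between photos than raw RGB distances; LAB and YCbCr conversions would need a third-party library or hand-written formulas.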
