
Using Artificial Queries to Evaluate Image Retrieval


Presentation Transcript


  1. Using Artificial Queries to Evaluate Image Retrieval Nicholas R. Howe Department of Computer Science Cornell University

  2. How Do We Compare Image Retrieval Algorithms?
  • Different research groups use images from different sources.
  • Image sets are of different sizes.
  • Tasks are different.
  • Each researcher identifies a set of queries and targets through subjective criteria.
  • Keys can't be shared because image sets are not standard.
  Answer: badly!
  Workshop on Content-Based Access of Image and Video Libraries

  3. How It's Usually Done
  • Each researcher tests a proposed algorithm against a few baselines (e.g., color histograms).
  • No data to compare the latest techniques.
  • Test sets differ.
  • Implementations of the baselines may also differ.

  4. Some Difficulties
  • Given a query, which target is most relevant?
  • Context will determine the answer.

  5. What Should a Good Test Do?
  • Provide comparable results even with different image sets.
  • Offer insight into the behavior of different retrieval algorithms.
  • Run quickly.
  • Be easy to implement.

  6. Proposal: Altered-Image Queries
  • Take an image from the library and alter it to form the query.
  • Retrieve from the image library and look for the rank of the original image in the retrieved results.
  [Figure: an altered image is used as a query against the image library; results come back ranked 1, 2, 3, etc., and the rank of the original image is recorded.]
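The slide gives no code, but the proposed evaluation is easy to sketch: alter a library image, use it as the query, and record where the original lands in the ranked results. A minimal sketch in NumPy, assuming images have already been reduced to feature vectors and that retrieval ranks by Euclidean distance (the feature vectors and function name here are illustrative, not from the talk):

```python
import numpy as np

def rank_of_original(query_features, library_features, original_index):
    """Return the rank (1 = best match) of the original image when the
    library is sorted by distance to the altered-image query."""
    # Euclidean distance from the query to every library image.
    dists = np.linalg.norm(library_features - query_features, axis=1)
    order = np.argsort(dists)                       # best match first
    return int(np.where(order == original_index)[0][0]) + 1

# Toy example: five "images" described by three-feature vectors.
library = np.array([[0.0, 0.0, 0.0],
                    [1.0, 1.0, 1.0],
                    [0.2, 0.1, 0.0],
                    [5.0, 5.0, 5.0],
                    [0.9, 1.1, 1.0]])
altered_query = np.array([0.15, 0.1, 0.0])          # altered copy of image 2
print(rank_of_original(altered_query, library, original_index=2))  # → 1
```

Running the same loop over many altered images yields the distribution of ranks that the later slides summarize.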

  7. The Crop Test
  • Crop the image to k% of its original area.
  • Simulates a close-up shot of the same subject.
  [Figure: original image and its Crop-50 version.]
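A crop to k% of the *area* scales each side by the square root of k/100. A minimal sketch, assuming a centred crop on an image stored as a NumPy array (the talk does not specify where the crop window is placed):

```python
import numpy as np

def crop_test(image, k):
    """Crop a centred window covering k% of the original area.
    Simulates a close-up shot of the same subject."""
    h, w = image.shape[:2]
    scale = (k / 100.0) ** 0.5           # linear scale giving k% of area
    ch, cw = max(1, round(h * scale)), max(1, round(w * scale))
    top, left = (h - ch) // 2, (w - cw) // 2
    return image[top:top + ch, left:left + cw]

img = np.arange(100).reshape(10, 10)     # toy 10x10 "image"
print(crop_test(img, 50).shape)          # → (7, 7), roughly half the area
```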

  8. The Jumble Test
  • Shuffle the tiles of the image divided on an h×k grid.
  • Simulates an image with similar elements in a different arrangement.
  [Figure: original image and its Jumble-4×4 version.]
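The jumble test can be sketched as cutting the image into an h×k grid of tiles and reassembling them in a random order. A minimal version, assuming the image dimensions divide evenly by h and k:

```python
import random

import numpy as np

def jumble_test(image, h, k, seed=0):
    """Split the image into an h-by-k grid of tiles and shuffle them.
    Simulates similar elements in a different arrangement."""
    rows, cols = image.shape[:2]
    th, tw = rows // h, cols // k        # tile height and width
    tiles = [image[r*th:(r+1)*th, c*tw:(c+1)*tw]
             for r in range(h) for c in range(k)]
    random.Random(seed).shuffle(tiles)   # random permutation of tiles
    rows_of_tiles = [np.hstack(tiles[r*k:(r+1)*k]) for r in range(h)]
    return np.vstack(rows_of_tiles)

img = np.arange(64).reshape(8, 8)
jumbled = jumble_test(img, 4, 4)
# Same pixel values, different arrangement.
print(sorted(jumbled.ravel()) == sorted(img.ravel()))  # → True
```

Because only the arrangement changes, global statistics such as a color histogram are unaffected, which is exactly what makes this test diagnostic.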

  9. The Low-Con Test
  • Decrease the contrast to k% of its original range.
  • Simulates altered lighting conditions and/or camera differences.
  [Figure: original image and its Low-Con-80 version.]
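One simple way to shrink contrast to k% of its original range is to compress pixel values toward the image mean; the talk does not specify the exact transform, so this is an illustrative choice:

```python
import numpy as np

def low_con_test(image, k):
    """Compress pixel values toward the mean so the contrast range
    shrinks to k% of the original. Simulates lighting/camera changes."""
    img = image.astype(float)
    mean = img.mean()
    return mean + (img - mean) * (k / 100.0)

img = np.array([[0.0, 100.0], [50.0, 150.0]])    # value range: 150
out = low_con_test(img, 80)
print(out.max() - out.min())                     # → 120.0, i.e. 80% of 150
```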

  10. Typical Results
  • Most retrievals are at low rank; a few retrievals are at much higher rank.
  • Median and mean summarize the results of multiple repetitions.
  [Figure: histogram of retrieved ranks; median 26, mean 205.]
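Because a few very high ranks pull the mean far above the median, both statistics are worth reporting. A sketch with made-up rank data (not the figures from the talk) showing the skew:

```python
import numpy as np

# Hypothetical retrieved ranks from repeated altered-image queries:
# most originals come back near the top, one at a very high rank.
ranks = np.array([1, 2, 3, 4, 5, 8, 12, 26, 40, 150, 2000])
print(np.median(ranks))        # → 8.0   (robust to the outlier)
print(round(ranks.mean(), 1))  # → 204.6 (dominated by the outlier)
```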

  11. Difficulty of Altered-Image Queries
  • Both the mean and the median increase with difficulty.
  • Note the order-of-magnitude changes.

  12. How Stable Are Altered-Image Queries?
  • Ran Crop-50 on three entirely different sets of 6000 images.
  • Some consistency even with different test sets.
  • Look for order-of-magnitude change.

  13. Does the Number of Images Matter?
  • Found a linear dependence on the number of images.

  14. How Many Queries Must Be Run?
  • Querying a small percentage of the total image set gives a decent figure.

  15. Comparing Algorithms Using Altered-Image Queries
  • Three algorithms compared using altered-image queries.
  • Especially good or bad performance can be identified.

  16. Final Thoughts
  • Altered-image queries are well defined, easy to implement, and consistent over different image sets.
  • They are a useful addition to our evaluation toolkit.
  • They also offer diagnostic potential.
