1. Evaluation of Texture Features for Content-based Image Retrieval
Zahra Mansoori
z_mansoori@ce.sharif.edu
2. Contents Introduction to image retrieval
Introduction to texture
Methods of extracting texture features
Evaluation of some approaches and results
References
3. Content-based Image Retrieval An image search engine that works on image content (color, texture, shape) instead of annotated text
Consists of:
Primitive image database
Feature Extraction Module
Indexing Module
Search and Retrieval Module
User interface (Input: query image, Output: Similar images)
Examples: IBM QBIC, Virage, VisualSEEK, …
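To make the module structure concrete, here is a minimal sketch (not from the original slides) of how the pieces listed above could fit together; all class and function names are illustrative assumptions.

```python
from typing import Callable, Dict, List, Tuple
import numpy as np

class CBIRSystem:
    """Illustrative skeleton tying together the CBIR modules listed above."""

    def __init__(self,
                 extract: Callable[[np.ndarray], np.ndarray],
                 distance: Callable[[np.ndarray, np.ndarray], float]):
        self.extract = extract                  # feature-extraction module
        self.distance = distance                # similarity measure (e.g. Manhattan)
        self.index: Dict[str, np.ndarray] = {}  # indexing module: image id -> feature vector

    def add_image(self, image_id: str, image: np.ndarray) -> None:
        """Add a database image by storing its feature vector."""
        self.index[image_id] = self.extract(image)

    def query(self, image: np.ndarray, top_k: int = 10) -> List[Tuple[str, float]]:
        """Search-and-retrieval module: rank stored images by distance to the query."""
        q = self.extract(image)
        scores = [(iid, self.distance(q, f)) for iid, f in self.index.items()]
        return sorted(scores, key=lambda s: s[1])[:top_k]
```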
4. CBIR CBIR modules and flowchart (figure)
5. What is texture? A key component of human visual perception of the nature and three-dimensional shape of physical objects
Can be regarded as a similarity grouping in an image
One of the essential features to consider when querying an image database
Normally defined by grey levels
6. Texture analysis Routinely:
the image is converted into grayscale mode
batches of pixels are inspected to find the relationships between them
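As a small illustration of the grayscale-conversion step, the sketch below uses a standard luminance weighting; the exact conversion is not specified in the slides, so the weights are an assumption.

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image to a single grey-level channel.

    The ITU-R BT.601 luminance weights are an assumption; the slides only
    say that the image is converted to grayscale before texture analysis."""
    weights = np.array([0.299, 0.587, 0.114])
    return rgb.astype(np.float64) @ weights
```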
7. Methods of analyzing Approaches to texture analysis are usually categorized into
Structural,
Statistical,
Model-based and
Transform
8. Structural approaches Represent texture by well-defined primitives called microtexture and a hierarchy of spatial arrangements of those primitives
To describe a texture, one defines the primitives and the placement rules
9. Statistical approaches Represent the texture indirectly by the non-deterministic properties
These properties govern the distributions and relationships between the grey levels of an image
10. Model-based approaches Attempt to interpret an image texture by use of a generative image model
11. Transform approaches Represent an image in a space whose co-ordinate system (such as frequency or size) has an interpretation closely related to the characteristics of its texture
12. Problem & Experimental Set-up Evaluate three texture extraction methods for use in content-based image retrieval
Image Collection: Corel Collection
Similarity Measure: Manhattan metric
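The Manhattan (L1, city-block) metric used as the similarity measure can be sketched as follows; the feature vectors in the example are made up purely for illustration.

```python
import numpy as np

def manhattan(a: np.ndarray, b: np.ndarray) -> float:
    """Manhattan (L1) distance between two feature vectors."""
    return float(np.abs(a - b).sum())

# Example: rank three stored feature vectors against a query vector.
query = np.array([0.2, 0.5, 0.1])
database = {"img1": np.array([0.1, 0.4, 0.2]),
            "img2": np.array([0.9, 0.9, 0.9]),
            "img3": np.array([0.2, 0.5, 0.0])}
ranking = sorted(database, key=lambda name: manhattan(query, database[name]))
print(ranking)  # ['img3', 'img1', 'img2'] -- smaller distance means more similar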
13. Co-occurrence matrix Definition
One of the earliest methods
Also called GLCM, which stands for Gray-Level Co-occurrence Matrix
Extracts second-order statistics from an image
A very successful method
14. Co-occurrence matrix (cont.) Let C be the GLCM, so Ca,d(i,j) will be the co-occurrence of pixels with grey values i and j at a given distance d and in a given direction a
Can be symmetric or asymmetric
Usually:
All pixel intensities are quantized into a smaller number of gray levels (8, 16, 64, …). For example, if 8 is selected, the target matrix will be 8 x 8.
Typical values of a are 0, 45, 90 and 135 degrees. Using all of them may bring more accuracy.
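A minimal sketch of building such a matrix, assuming 8-bit input images and a single offset; the quantization to a small number of gray levels and the optional symmetrisation follow the description above.

```python
import numpy as np

def glcm(gray: np.ndarray, levels: int = 8, d: int = 1,
         offset: tuple = (0, 1), symmetric: bool = True) -> np.ndarray:
    """Gray-level co-occurrence matrix C[i, j]: how often gray level i occurs
    at distance d from gray level j along the given direction
    (offset (0, 1) corresponds to a = 0 degrees)."""
    # Quantise pixel intensities (assumed 0..255) into the smaller number of levels.
    q = (gray.astype(np.float64) * levels / 256.0).astype(int).clip(0, levels - 1)
    dr, dc = offset[0] * d, offset[1] * d
    C = np.zeros((levels, levels), dtype=np.float64)
    rows, cols = q.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                C[q[r, c], q[r2, c2]] += 1
    if symmetric:
        C = C + C.T
    return C / C.sum()  # normalise to joint probabilities
```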
15. Co-occurrence matrix Calculating the co-occurrence matrix from a grayscale image (figure)
16. Co-occurrence matrix (cont.) Feature Extraction:
Once the GLCM has been created, various features can be computed from it.
Many of these features are supported by MATLAB
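For illustration, a few commonly used GLCM features (including homogeneity, which the evaluation below found to be the best single feature) can be computed from a normalised matrix like this; the particular feature selection is an assumption, since the slide does not list the features explicitly.

```python
import numpy as np

def glcm_features(P: np.ndarray) -> dict:
    """Haralick-style features computed from a normalised co-occurrence matrix P."""
    i, j = np.indices(P.shape)
    nonzero = P[P > 0]
    return {
        "contrast":    float(((i - j) ** 2 * P).sum()),
        "energy":      float((P ** 2).sum()),
        "homogeneity": float((P / (1.0 + np.abs(i - j))).sum()),
        "entropy":     float(-(nonzero * np.log2(nonzero)).sum()),
    }
```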
17. Co-occurrence matrix – Evaluation Results Distances of 1 to 4 pixels gave the best performance
There were no significant differences between symmetric and asymmetric matrices
Tiling the image gave a large increase in retrieval performance, which flattened out by 9 x 9 tiles
The concatenated (cat) features gave better results at all points than the rotationally invariant summed matrices (sum)
The best feature was homogeneity
18. Co-occurrence features Mean average precision vs. retrieval (figure)
19. Tamura Extracts features that correspond to human perception
Contains six textural features:
Coarseness
Contrast
Directionality
Line-likeness
Regularity
Roughness
20. Tamura (cont.) The first three are the most important
Coarseness
direct relationship to scale and repetition rates
calculated for each point of the image
Contrast
dynamic range of gray levels in an image
calculated for non-overlapping neighborhoods
Directionality
Measures the total degree of directionality
calculated for non-overlapping neighborhoods
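As a concrete example of one of these features, the sketch below computes Tamura contrast using the standard formula (standard deviation divided by the fourth root of the kurtosis) over non-overlapping neighbourhoods, as described above; the 16 x 16 neighbourhood size is an assumption.

```python
import numpy as np

def tamura_contrast(patch: np.ndarray) -> float:
    """Tamura contrast of a gray-level patch: sigma / kurtosis**0.25."""
    g = patch.astype(np.float64)
    sigma = g.std()
    if sigma == 0:
        return 0.0
    kurtosis = ((g - g.mean()) ** 4).mean() / sigma ** 4
    return float(sigma / kurtosis ** 0.25)

def contrast_map(gray: np.ndarray, n: int = 16) -> np.ndarray:
    """Contrast for non-overlapping n x n neighbourhoods (n = 16 is an assumption)."""
    rows, cols = gray.shape
    return np.array([[tamura_contrast(gray[r:r + n, c:c + n])
                      for c in range(0, cols - n + 1, n)]
                     for r in range(0, rows - n + 1, n)])
```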
21. Tamura (cont.) Another approach: the Tamura CND image
Spatial joint coarseness-contrast-directionality distribution (viewed as an RGB distribution)
Extract a color-histogram-style feature from the Tamura CND image
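Assuming the per-pixel coarseness, contrast and directionality maps have already been computed and scaled to a common range, the histogram-style feature could be sketched as follows; the number of bins is an assumption.

```python
import numpy as np

def cnd_histogram(coarseness: np.ndarray, contrast: np.ndarray,
                  directionality: np.ndarray, bins: int = 8) -> np.ndarray:
    """Joint histogram of the three per-pixel maps, treated like the three
    channels of an RGB image (color-histogram-style feature)."""
    samples = np.stack([coarseness.ravel(),
                        contrast.ravel(),
                        directionality.ravel()], axis=1)
    hist, _ = np.histogramdd(samples, bins=bins)
    return hist.ravel() / hist.sum()  # flattened, normalised feature vector
```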
22. Tamura – Evaluation Results Increasing the k value for coarseness decreases the performance
Optimum value = 2
Performance of directionality is poor
23. Tamura features Mean average precision vs. retrieval (figure)
24. Gabor filter A special case of the short-time Fourier transform
Time-frequency analysis
Used to model the responses of the human visual system
A two-dimensional Gabor function (sketched after this list)
Advantage/disadvantage:
Very popular
Time-consuming calculation
Generates a complete but non-orthogonal basis set, so redundancy of data occurs
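The two-dimensional Gabor function mentioned above is a Gaussian envelope modulated by a sinusoid; a minimal sketch of its real part follows, with kernel size and parameter defaults chosen as assumptions.

```python
import numpy as np

def gabor_kernel(size: int = 21, sigma: float = 4.0, theta: float = 0.0,
                 lambd: float = 8.0, gamma: float = 0.5) -> np.ndarray:
    """Real part of a 2-D Gabor function: a Gaussian envelope modulated by a
    sinusoid at orientation theta and wavelength lambd (parameters assumed)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(np.float64)
    x_t = x * np.cos(theta) + y * np.sin(theta)   # rotate coordinates to theta
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t ** 2 + gamma ** 2 * y_t ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * x_t / lambd)
    return envelope * carrier
```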
25. Gabor filter (cont.) Manjunath et al. reduced the redundancy by using Gabor wavelet functions
The features are computed by
filtering the image with a bank of orientation- and scale-sensitive filters, and
computing the mean and standard deviation of the output in the frequency domain
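A simplified sketch of this feature computation, using spatial convolution rather than an explicit frequency-domain implementation and reusing the gabor_kernel sketch from the previous slide; the per-scale parameter values are assumptions.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_bank_features(gray: np.ndarray, scales: int = 2, orientations: int = 4) -> np.ndarray:
    """Filter the image with a bank of scale- and orientation-sensitive Gabor
    filters and keep the mean and standard deviation of each absolute response."""
    features = []
    for s in range(scales):
        for o in range(orientations):
            # gabor_kernel is the sketch from the previous slide; sigma/lambd
            # doubling per scale is an assumed, illustrative choice.
            kernel = gabor_kernel(sigma=2.0 * 2 ** s,
                                  lambd=4.0 * 2 ** s,
                                  theta=np.pi * o / orientations)
            response = np.abs(convolve2d(gray.astype(np.float64), kernel, mode="same"))
            features.extend([response.mean(), response.std()])
    return np.array(features)  # 2 * scales * orientations values
```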
26. Gabor filter – Evaluation Results Works better for homogeneous textures of fixed size because of the specific filter dictionary
Widely used to search for an individual texture tile in an aerial-image database
Best-performing usage:
Tile the image 7 x 7 and apply the filters to each tile
Just 2 scales and 4 orientations
27. Gabor wavelet Mean average precision vs. retrieval (figure)
28. References Howarth P. and Ruger S., "Evaluation of Texture Features for Content-Based Image Retrieval," in Proc. International Conference on Image and Video Retrieval (CIVR 2004), Dublin, Ireland, 2004.
Deselaers Th., "Features for Image Retrieval," 2003.
Materka A. and Strzelecki M., "Texture Analysis Methods – A Review," Technical University of Lodz, Institute of Electronics, COST B11 report, Brussels, 1998.
Manjunath B.S. and Ma W.Y., "Texture Features for Browsing and Retrieval of Image Data," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 8, pp. 837-842, 1996.
Schettini R., Ciocca G. and Zuffi S., "A Survey of Methods for Color Image Indexing and Retrieval in Image Databases."
29. Appendix: Performance measures of an Information Retrieval System Every document is known to be either relevant or non-relevant to a particular query
Precision: The fraction of the documents retrieved that are relevant to the user's information need
Precision = |Relevant images ∩ Retrieved images| / |Retrieved images|
Recall: The fraction of the documents that are relevant to the query that are successfully retrieved
Recall = |Relevant images ∩ Retrieved images| / |Relevant images|
Average Precision: Precision and recall are based on the whole list of documents returned by the system. Average precision emphasizes returning more relevant documents earlier. It is the average of the precisions computed after truncating the list after each of the relevant documents in turn:
AveP = Σ (r = 1 to N) [ P(r) · rel(r) ] / |Relevant images|,
where r is the rank, N the number retrieved, rel() a binary function on the relevance of a given rank, and P() the precision at a given cut-off rank.
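These three measures can be computed for a single ranked result list as follows; the example documents and relevance judgements are made up for illustration.

```python
def precision_recall_avep(retrieved: list, relevant: set):
    """Precision, recall and average precision for one ranked retrieval list."""
    hits = [1 if doc in relevant else 0 for doc in retrieved]
    retrieved_relevant = sum(hits)
    precision = retrieved_relevant / len(retrieved)
    recall = retrieved_relevant / len(relevant)
    # Average precision: precision measured at the rank of each relevant document.
    precisions_at_hits = [sum(hits[:r + 1]) / (r + 1) for r, h in enumerate(hits) if h]
    avep = sum(precisions_at_hits) / len(relevant)
    return precision, recall, avep

# Example: 3 of the 4 relevant images are retrieved, at ranks 1, 3 and 4.
p, r, ap = precision_recall_avep(["a", "x", "b", "c", "y"], {"a", "b", "c", "d"})
print(p, r, ap)  # 0.6, 0.75, (1/1 + 2/3 + 3/4) / 4 ≈ 0.604
```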