Finding Similar Image Quickly Using Object Shapes Heng Tao Shen Dept. of Computer Science National University of Singapore Presented by Chin-Yi Tsai
Outline • Motivation • Related Work • A Hierarchical Partitioning Framework Via Angle Mapping • Hierarchical Partitioning With Shape Representations • Hierarchical Partitioning With Angle Vectors • Experiments • Conclusion
Introduction • There are two requirements for a content-based image retrieval system: effectiveness and efficiency • Identifying relevant images quickly from a large image collection • A framework for fast image retrieval based on the shapes of objects extracted from within images
Introduction (cont'd) [Figure: the partitioning hierarchy, with logical images at the bottom and level-N representations above them, then levels N-1, N-2, ...; moving up the hierarchy gives coarser approximations, fewer partitions, and lower dimensionality]
Introduction (cont'd) • Angle mapping (AM) replaces a sequence of connected edges by a smaller number of edges • prune when the angle exceeds a threshold • and the edge length is below a threshold • This reduces the dimensionality of the representation
Introduction (cont'd) • Two hierarchical structures to facilitate speedy retrieval • Hierarchical Partitioning on Shape Representation (HPSR) • uses the shape representation as the indexing key • Hierarchical Partitioning on Angle Vector (HPAV) • uses angle information as the indexing key
Related Work • Content-based retrieval systems map physical objects into logical representations • Color histograms, 2D-strings, symbolic images • Decomposing an image into its individual objects • Several indexing structures for high-dimensional data • R-tree, R+-tree, R*-tree, TV-tree, NR-tree • HPSR, HPAV (Angle Mapping)
A Hierarchical Partitioning Framework Via Angle Mapping • The framework maps a high-dimensional representation into multiple lower-dimensional levels • AM approximates a shape based on the angles between edges • Sharper angles and longer edges carry more important information about the shape • Angle Interval (AI) • AI[i] = (90 + 90 * (i - 1) / N, 90 + 90 * i / N] or • AI[i] = (270 - 90 * (i - 1) / N, 270 - 90 * i / N] • If N = 3: (150, 180], (120, 150], (90, 120] • Prune Length Threshold (PLT)
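As a concrete reading of the interval definition above, a minimal sketch (the function name `angle_interval` is ours, and only the (90, 180] branch is shown, assuming angles of at most 90 degrees are never pruned):

```python
def angle_interval(angle, n):
    """Return the index i such that `angle` lies in
    AI[i] = (90 + 90*(i-1)/n, 90 + 90*i/n], or 0 if it lies in
    no pruning interval. Larger i means a flatter, less
    informative angle."""
    if not 90 < angle <= 180:
        return 0  # sharp angles carry important shape information
    for i in range(1, n + 1):
        lo = 90 + 90 * (i - 1) / n
        hi = 90 + 90 * i / n
        if lo < angle <= hi:
            return i
    return 0

# With n = 3 the intervals are (90, 120], (120, 150], (150, 180].
```

For example, with N = 3 an angle of 160 degrees falls in AI[3], the flattest interval, and would be pruned first.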
A Hierarchical Partitioning Framework Via Angle Mapping [Figure: a shape's logical representation and its level-3, level-2, and level-1 approximations, obtained with AI[3] = (150, 180], AI[2] = (120, 150], AI[1] = (90, 120]]
A Hierarchical Partitioning Framework • Dimension Reduction Ratio • DRR(i, i-1) = [Dim(R(i)) - Dim(R(i-1))] / Dim(R(i)) • Example: (9 - 3) / 9 = 2/3
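The ratio is a direct computation; a one-line sketch (the function name is ours):

```python
def drr(dim_level_i, dim_level_i_minus_1):
    """Dimension reduction ratio between consecutive levels:
    DRR(i, i-1) = (Dim(R(i)) - Dim(R(i-1))) / Dim(R(i))."""
    return (dim_level_i - dim_level_i_minus_1) / dim_level_i

# The slide's example: a 9-edge shape reduced to 3 edges gives 2/3.
```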
Algorithm: A Hierarchical Partitioning Framework • 1. Find the shape's dominating outline and initialize it as the level-N representation • 2. For each level i from N down to 1: • 2.1 Check whether an angle lies in AI[i] • 2.2 Check whether the edge is shorter than PLT • 2.3 Repeat 2.1 and 2.2 until no more edges can be pruned • 2.4 Obtain the level-i representation • 2.5 If there are too many shapes at level i: • 2.5.1 Cluster similar shapes • 2.5.2 Identify a representative from the shapes (example with N = 2)
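The pruning loop in step 2 can be sketched as follows, assuming a shape is a closed polygon given as vertex coordinates and that a vertex is dropped only when both tests pass (its angle falls in the interval and its outgoing edge is shorter than PLT); whether the paper combines the two tests this way is our reading, and all names are ours:

```python
import math

def interior_angle(p, q, r):
    """Angle at vertex q formed by edges p->q and q->r, in degrees."""
    a1 = math.atan2(p[1] - q[1], p[0] - q[0])
    a2 = math.atan2(r[1] - q[1], r[0] - q[0])
    ang = abs(math.degrees(a1 - a2))
    return min(ang, 360 - ang)

def prune_level(vertices, lo, hi, plt):
    """One AM pass: remove vertices whose angle lies in (lo, hi]
    (near-flat corners) and whose outgoing edge is shorter than
    plt, repeating until no vertex qualifies (steps 2.1-2.3)."""
    verts = list(vertices)
    changed = True
    while changed and len(verts) > 3:   # keep at least a triangle
        changed = False
        for k in range(len(verts)):
            p = verts[k - 1]
            q = verts[k]
            r = verts[(k + 1) % len(verts)]
            edge = math.dist(q, r)
            if lo < interior_angle(p, q, r) <= hi and edge < plt:
                del verts[k]            # merge the two edges at q
                changed = True
                break
    return verts
```

For instance, a square with a near-collinear extra vertex on one side loses that vertex when pruned with AI = (150, 180] and a PLT larger than the short edge.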
Hierarchical Partitioning With Shape Representations • Two shapes that are similar are grouped into a cluster • Representation Reduction Ratio • RRR(i, i-1) = [NumberOfNodesAtLevel(i) - NumberOfNodesAtLevel(i-1)] / NumberOfNodesAtLevel(i) • Example: 6 nodes at level N reduced to 4 nodes at level N-1 gives RRR(N, N-1) = (6 - 4) / 6 = 1/3
Algorithm: HPSR • 1. Initialize n to N • 2. For all logical shapes, apply AM to get their level-n representations • 3. Merge identical representations into partitions • 4. If the number of partitions is too big, cluster similar representations given a distance threshold • 5. For each partition, produce the partition's representative as a node at level n • 6. Build connections between level-n nodes and their children • 7. If n > 1, map the level-n nodes to level-(n-1) representations, decrease n by 1, and go back to step 3 (2-level HPSR indexing)
Algorithm: Query Processing in HPSR • 1. Generate the query shape's N multi-level representations • 2. Initialize level n to 1 • 3. Compute the similarity between the query shape's level-n representation and the level-n nodes • 4. If no similar node is found, return null • 5. If the leaf level is reached, fetch the shapes' logical representations, compute their similarity with the query shape, and return the similar shapes • 6. Otherwise, increase n by 1, retrieve the similar nodes' children as the level-n nodes, and go back to step 3 (2-level HPSR indexing)
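The top-down search above can be sketched with a toy tree, assuming each node stores its representative and its children, leaves carry the logical shape, and `similar` returns a distance (smaller means more alike); this data layout and all names are illustrative, not the paper's:

```python
def query_hpsr(query_reps, root_nodes, similar, threshold):
    """Top-down HPSR search: at each level keep only children of
    nodes whose representative is within `threshold` of the
    query's representation at that level.

    query_reps lists the query's representations from the
    coarsest (level 1, the root level) to the finest (level N);
    nodes are dicts {"rep": ..., "children": [...]}, and leaves
    are dicts {"rep": ..., "shape": ...}."""
    candidates = root_nodes
    for level, q_rep in enumerate(query_reps):
        matched = [c for c in candidates
                   if similar(q_rep, c["rep"]) <= threshold]
        if not matched:
            return []                       # step 4: nothing similar
        if level == len(query_reps) - 1:    # step 5: leaf level
            return [m["shape"] for m in matched]
        candidates = [child for m in matched for child in m["children"]]
    return []
```

With scalar representatives and absolute difference as the toy distance, a two-level tree is filtered level by level, so only children of similar level-1 nodes are ever compared at level 2.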
HPSR Matching • To measure the similarity between two dominating outlines, the paper applies the turning function a(x) [Figure: a turning function plotted against normalized arc length]
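A minimal sketch of the turning-function representation itself, assuming a closed polygon given as a counterclockwise vertex list; comparing two outlines would then take, for example, a distance between the two step functions, which is not shown here (all names are ours):

```python
import math

def turning_function(vertices):
    """Return the turning function of a closed polygon as a list
    of (normalized_arc_length, cumulative_turn_degrees) steps,
    one step per vertex starting at vertices[0]."""
    n = len(vertices)
    perimeter = sum(math.dist(vertices[k], vertices[(k + 1) % n])
                    for k in range(n))
    steps, s, turn = [], 0.0, 0.0
    for k in range(n):
        p, q, r = vertices[k - 1], vertices[k], vertices[(k + 1) % n]
        h1 = math.atan2(q[1] - p[1], q[0] - p[0])   # incoming heading
        h2 = math.atan2(r[1] - q[1], r[0] - q[0])   # outgoing heading
        d = math.degrees(h2 - h1)
        d = (d + 180) % 360 - 180                   # wrap into (-180, 180]
        turn += d
        steps.append((s / perimeter, turn))
        s += math.dist(q, r)
    return steps
```

A counterclockwise simple polygon accumulates a total turn of 360 degrees, which makes the representation easy to sanity-check.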
Hierarchical Partitioning With Angle Vectors • AI[3] = (150, 180], AI[2] = (120, 150], AI[1] = (90, 120], AI[0] = (0, 90] • AV for the logical representation: (5, 3, 0, 1) • AV for the level-3 representation: (3, 3, 1) • AV for the level-2 representation: (4, 2) • AV for the level-1 representation: (3)
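Building an angle vector amounts to histogramming a shape's angles over the intervals AI[0..N]; a sketch under that reading (the function name is ours, and angles are assumed to lie in (0, 180]):

```python
def angle_vector(angles, n):
    """Count a shape's angles per interval: av[0] counts angles
    in AI[0] = (0, 90]; av[i] for i >= 1 counts angles in
    AI[i] = (90 + 90*(i-1)/n, 90 + 90*i/n]."""
    av = [0] * (n + 1)
    for a in angles:
        if a <= 90:
            av[0] += 1
        else:
            for i in range(1, n + 1):
                if 90 + 90 * (i - 1) / n < a <= 90 + 90 * i / n:
                    av[i] += 1
                    break
    return av
```

Because each level has a fixed number of intervals, every node at a given HPAV level has the same small, fixed dimensionality regardless of how many edges the underlying shape has.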
[Figure: the HPSR tree and the corresponding HPAV tree] • 1. To construct the HPAV structure, we need to build the HPSR structure first • 2. The only difference between HPSR and HPAV is the node representation • 3. In the HPAV tree, each level has a much lower, fixed dimensionality • 4. HPAV has lower storage requirements
Similarity measure for AV comparison Definition:
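The definition on the original slide did not survive extraction. Purely as an illustration of comparing two count vectors of equal length, one common choice is the L1 (Manhattan) distance; this is our assumption, not the paper's definition:

```python
def av_distance(av1, av2):
    """Illustrative AV comparison: L1 distance between two angle
    vectors of equal length (the paper's actual definition is on
    the original slide and is not reproduced here)."""
    return sum(abs(a - b) for a, b in zip(av1, av2))
```

Identical vectors give distance 0, and each angle that moves between intervals contributes 2 to the distance.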
Experiments • Setup • Pentium III 700 MHz with 128 MB of RAM • Evaluate the two structures, HPSR and HPAV, on their effectiveness and efficiency, as well as their storage requirements
Tuning the Mapping Level and PLT • During the angle mapping process, two parameters may affect the results • the mapping level N and the Prune Length Threshold (PLT) • Choose optimal values for N and PLT
[Figure: the tuning trade-off; more angle intervals lead to too many levels, while each additional level has lower dimensionality but gives a coarser approximation]
Conclusion • A framework for partitioning an image database based on the shapes of its objects • Uses a hierarchy of shape approximations that reduces the dimensionality of shapes (Angle Mapping) • Meets users' performance requirements • Two indexing structures based on the framework • HPSR and HPAV