
What is a Good Nearest Neighbors Algorithm for Finding Similar Patches in Images?

Presentation Transcript


What is a Good Nearest Neighbors Algorithm for Finding Similar Patches in Images?
Neeraj Kumar*, Li Zhang†, Shree K. Nayar*
*Columbia University, †University of Wisconsin-Madison
http://www.cs.columbia.edu/CAVE/projects/nnsearch/

Motivation

• Fast search would greatly speed up many vision algorithms:
  • Object Recognition, e.g., using SIFT [Lowe 2003]
  • Image Denoising using non-local means [Buades et al. 2005]
  • Shape Matching using self-similarity [Shechtman & Irani 2007]
  • Texture Synthesis using image quilting [Efros & Freeman 2002]
• Brute-force search for all patches would take >250 hrs/image!

The Nearest Neighbors Problem

Given a set of points P and a query point q, the NN problem is (a brute-force reference sketch is given after the transcript):
• ε-NN Search: find the set of points P_C within distance ε of q, i.e. P_C = { p ∈ P : d(p, q) ≤ ε }
• k-NN Search: find the set P_C of the k points of P closest to q, i.e. P_C ⊆ P, |P_C| = k, and d(p, q) ≤ d(p′, q) for all p ∈ P_C and p′ ∈ P ∖ P_C

Example results for query point q:
• ε-NN: p4, p6, p8
• 2-NN: p6, p8
• 4-NN: p6, p8, p4, p3

Nearest Neighbor Approaches

All methods organize the points into a tree structure; the only difference is the function used to split a set of points into child nodes, shown below (a runnable sketch of the four splits follows the transcript).

kd-Tree
  dim = argmax_dim[var(pts_dim)]
  split = median(pts_dim)
  L_pts = [pts_dim < split]
  R_pts = [pts_dim > split]

PCA Tree
  axis = pca(pts).eigvec[0]
  curpts = project(pts, axis)
  split = median(curpts)
  L_pts = [curpts < split]
  R_pts = [curpts > split]

Ball Tree
  pt1, pt2 = chooseRefPts(pts)
  d1 = d(pts, pt1)
  d2 = d(pts, pt2)
  L_pts = [d1 < d2]
  R_pts = [d2 < d1]

Vantage Point (vp) Tree
  pt = chooseRefPt(pts)
  split = median(d(pts, pt))
  L_pts = [d(pts, pt) < split]
  R_pts = [d(pts, pt) > split]

Related Work

• Surveys: [Chavez et al. 2001, Shakhnarovich et al. 2006]
• Approximate NN: [Arya et al. 1998, Indyk and Motwani 1998]
• Nearest Neighbors using L∞: [Nene and Nayar 1997]
• Performance Comparisons: [Mikolajczyk and Matas 2007]

Evaluation Dataset

• Random Images: no similarity within images
• Video Frames: high similarity within frames

Performance Evaluation

Construction and Search Performance
[Plots compare the kd-Tree, PCA Tree, Ball Tree, and vp-Tree.]
• Construction cost is the number of distance function evaluations
• Search speed is the improvement over brute-force search

Search Speed vs. Distance/Patch Size
• Exponential dropoff in speed
• Fewer points within the same "average per-pixel" distance
• Closest k points increasingly distant

Search Speed vs. Input Set Size

Implementation Tricks

We use various optimizations on all trees:
• Compute Lp norms using lookup tables
• Pre-calculate distances within each leaf node
• Use priority queues for k-NN searches (a sketch of this trick follows the transcript)

Results

Example query patches (search time per query patch):
• Query patch 1: Brute-Force Search: 2125 ms, vp-Tree Search: 1.25 ms
• Query patch 2: Brute-Force Search: 1375 ms, vp-Tree Search: 3.43 ms

Conclusions
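
The two search problems defined under "The Nearest Neighbors Problem" have a direct brute-force baseline, which is also the baseline the timings in "Results" are compared against. The sketch below is my own NumPy illustration, not code from the poster; the function names eps_nn and k_nn and the use of the Euclidean (L2) distance are assumptions.

import numpy as np

def eps_nn(pts, q, eps):
    # ε-NN search: every point of pts within distance eps of the query q.
    dists = np.linalg.norm(pts - q, axis=1)
    return pts[dists <= eps]

def k_nn(pts, q, k):
    # k-NN search: the k points of pts closest to q.
    dists = np.linalg.norm(pts - q, axis=1)
    return pts[np.argsort(dists)[:k]]

# Tiny usage example with random 2-D points (illustrative data only).
pts = np.random.rand(100, 2)
q = np.array([0.5, 0.5])
print(eps_nn(pts, q, 0.1).shape, k_nn(pts, q, 4).shape)

Both functions scan every point, which is exactly the cost the tree structures above are designed to avoid.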
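The four splitting rules listed under "Nearest Neighbor Approaches" can be written almost verbatim in NumPy. This is a minimal sketch of a single level of splitting, not the authors' implementation; the function names, the random choice of reference points, the Euclidean distance, and sending points that fall exactly on the split to the right child (the poster's pseudocode uses strict inequalities on both sides) are all assumptions.

import numpy as np

def kd_split(pts):
    # kd-tree: split on the dimension of maximum variance, at its median.
    dim = np.argmax(pts.var(axis=0))
    split = np.median(pts[:, dim])
    return pts[pts[:, dim] < split], pts[pts[:, dim] >= split]

def pca_split(pts):
    # PCA tree: project onto the principal eigenvector, split at the median.
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt[0]
    split = np.median(proj)
    return pts[proj < split], pts[proj >= split]

def ball_split(pts, rng=np.random.default_rng(0)):
    # Ball tree: pick two reference points, assign each point to the nearer one.
    i, j = rng.choice(len(pts), size=2, replace=False)
    d1 = np.linalg.norm(pts - pts[i], axis=1)
    d2 = np.linalg.norm(pts - pts[j], axis=1)
    return pts[d1 < d2], pts[d1 >= d2]

def vp_split(pts, rng=np.random.default_rng(0)):
    # Vantage point tree: one reference point, split at the median distance to it.
    vp = pts[rng.integers(len(pts))]
    d = np.linalg.norm(pts - vp, axis=1)
    split = np.median(d)
    return pts[d < split], pts[d >= split]

A full tree is built by applying the chosen split recursively until each node holds fewer than some leaf-size threshold of points.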
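One of the items under "Implementation Tricks" is the use of priority queues for k-NN searches. A common way to realize this is a bounded max-heap keyed on distance, sketched below; this illustrates the general idea and is not the authors' code, and the helper name update_k_best is my own.

import heapq

def update_k_best(heap, dist, idx, k):
    # Maintain the k closest candidates seen so far. heapq is a min-heap,
    # so distances are negated to keep the worst of the k best at the root.
    if len(heap) < k:
        heapq.heappush(heap, (-dist, idx))
    elif dist < -heap[0][0]:
        heapq.heapreplace(heap, (-dist, idx))
    # Return the current k-th best distance; during tree traversal a subtree
    # can be pruned when its minimum possible distance exceeds this value.
    return -heap[0][0] if len(heap) == k else float("inf")

After traversal, popping the heap yields the k nearest candidates in decreasing order of distance.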
