Class-Specific Hough Forests for Object Detection Zhen Yuan Hsu Advisor: S.J. Wang Gall, J., Lempitsky, V.: Class-specific Hough forests for object detection. In: IEEE CVPR (2009)
Outline • Related work • Why we use Random forest • What's a Hough forest • How Hough forests work for object detection
Implicit shape models: Training • Extract 25x25 patches around Harris corners. • Generate a codebook of local appearance patches using clustering. • For each cluster, extract its center and store it in the codebook. • For each codebook entry, store all positions it was found relative to the object center.
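As a rough illustration of this codebook-building step, here is a minimal Python sketch. It assumes the 25x25 patches and their offsets to the object center are already extracted around Harris corners; the use of scikit-learn's KMeans and the name build_codebook are illustrative assumptions, not part of the original slides.

```python
# Minimal sketch of ISM codebook training. Assumes the 25x25 patches and their
# offsets to the object center are already extracted around Harris corners;
# scikit-learn's KMeans stands in for the clustering step (an assumption).
import numpy as np
from sklearn.cluster import KMeans

def build_codebook(patches, offsets, n_clusters=200):
    """patches: (N, 625) flattened 25x25 patches; offsets: (N, 2) patch-to-center vectors."""
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(patches)
    codebook = km.cluster_centers_                       # one prototype per cluster
    # For each codebook entry, store every offset at which a member patch occurred.
    entry_offsets = [offsets[km.labels_ == k] for k in range(n_clusters)]
    return codebook, entry_offsets
```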
Implicit shape models: Testing • Given a test image, extract patches and match each one to a codebook entry • Cast votes for possible positions of the object center (match, then apply the stored offsets) • Search for maxima in the voting space • Extract a weighted segmentation mask based on the stored masks for the codebook occurrences
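A minimal sketch of the matching-and-voting step, continuing the assumptions above; the nearest-neighbour matching rule and the function cast_votes are illustrative choices, not the exact ISM implementation.

```python
# Minimal sketch of ISM testing: match each test patch to its nearest codebook
# entry and cast that entry's stored offsets as votes for the object center.
import numpy as np

def cast_votes(test_patches, positions, codebook, entry_offsets, image_shape):
    """test_patches: (M, 625); positions: (M, 2) patch locations (x, y)."""
    votes = np.zeros(image_shape[:2])                    # 2D voting space
    for patch, (px, py) in zip(test_patches, positions):
        k = int(np.argmin(np.linalg.norm(codebook - patch, axis=1)))  # nearest entry
        for dx, dy in entry_offsets[k]:
            cx, cy = int(px + dx), int(py + dy)          # hypothesised object center
            if 0 <= cy < votes.shape[0] and 0 <= cx < votes.shape[1]:
                votes[cy, cx] += 1.0 / len(entry_offsets[k])
    return votes                                         # maxima are searched afterwards
```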
Why we use Random forest • Time and training-data considerations → random forest
Decision tree [Figure: a two-level decision tree; the root tests x1 > w1 and the next node tests x2 > w2 (yes/no branches), partitioning the (x1, x2) plane at thresholds w1 and w2]
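A toy sketch of how a sample is routed through such threshold splits; the dictionary-based node representation is purely illustrative.

```python
# Toy illustration of the threshold splits in the figure (x1 > w1, then x2 > w2):
# an internal node routes a sample left or right, a leaf returns its category.
def predict(node, x):
    while not node["is_leaf"]:
        # e.g. node["dim"] = 0 with node["thresh"] = w1 implements the test x1 > w1
        node = node["right"] if x[node["dim"]] > node["thresh"] else node["left"]
    return node["category"]
```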
A Forest [Figure: a forest of trees t1 … tT; a sample v is passed down each tree through split nodes to a leaf node, and each leaf stores a category c]
What’s Randomness Randomness – Data and Split fuction for each node: Split fuction is randomlyselected.
Binary Tests (split node) • Selected during training from a random subset of all split functions • A test compares two pixel positions p and q of a 16x16 image patch in one feature channel a, against a threshold choice (see the sketch below)
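A minimal sketch of such a pixel-comparison binary test, assuming the patch is given as a stack of feature channels; the parameterisation (p, q, channel a, threshold tau) follows the slide, but the function itself is illustrative.

```python
# Sketch of the pixel-comparison binary test: compare two positions p and q of
# a 16x16 patch in one feature channel a against a threshold tau.
def binary_test(patch_channels, a, p, q, tau):
    """patch_channels: (C, 16, 16) feature channels; p, q: (x, y) positions."""
    return 0 if patch_channels[a, p[1], p[0]] < patch_channels[a, q[1], q[0]] + tau else 1
```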
Randomness - Split function • Try several lines, chosen at random • Keep the line that best separates the data (information gain) • Recurse (see the sketch below)
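A hedged sketch of this randomized split selection using information gain; sample_test (a generator of random split functions) and the candidate count are assumptions for illustration.

```python
# Sketch of randomized split selection: draw a few candidate split functions at
# random and keep the one with the highest information gain.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p + 1e-12))

def best_random_split(samples, labels, sample_test, n_candidates=20):
    """labels: (N,) numpy array; sample_test(): returns a random split function."""
    best_gain, best_test = -np.inf, None
    for _ in range(n_candidates):
        test = sample_test()                             # random split function
        mask = np.array([test(s) == 0 for s in samples])
        if mask.all() or (~mask).all():
            continue                                     # degenerate split, skip
        gain = entropy(labels) - (mask.mean() * entropy(labels[mask])
                                  + (~mask).mean() * entropy(labels[~mask]))
        if gain > best_gain:
            best_gain, best_test = gain, test
    return best_test
```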
Random forest for object detection • Object localization x: regression • Classifying whether a patch belongs to object class c: classification
What’s Hough forest Random forest Hough vote Hough forest
Hough Forests: Training • Supervised learning • Labels: negative or background samples (blue), positive samples (red), offset vectors (green) • Feature of the local patch
Hough Forests: Training [Figure: a tree with split nodes and leaf nodes; each leaf stores CL, the positive sample patch proportion]
Leaves store two important pieces of information for voting: 1. CL: the positive sample patch proportion 2. DL = {di}, i ∈ A: the stored offset vectors Stop criteria (leaf condition): 1. the number of image patches < ε 2. a threshold based on the minimum of the uncertainty (class-label or offset vector) A sketch of leaf construction follows below.
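A minimal sketch of what a leaf stores once a stopping criterion fires; the dictionary layout is an illustrative choice.

```python
# Sketch of leaf construction: once a stopping criterion fires, the leaf keeps
# CL (positive-patch proportion) and DL (offset vectors of the positive patches).
import numpy as np

def make_leaf(labels, offsets):
    """labels: (N,) with 1 = object patch, 0 = background; offsets: (N, 2)."""
    cl = float(np.mean(labels == 1))                     # CL: positive sample proportion
    dl = offsets[labels == 1]                            # DL: offsets of positive patches
    return {"is_leaf": True, "CL": cl, "DL": dl}
```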
Quality of Binary Tests • Goal: minimize the class-label uncertainty and the offset uncertainty • The type of uncertainty is randomly selected for each node • Class-label uncertainty: U1(A) = -|A| · Σc p(c|A) ln p(c|A) • Offset uncertainty: U2(A) = Σi: ci=1 ||di − d̄A||², where d̄A is the mean offset of the positive patches • A = the set of all image patches {(Ii, ci, di)}, ci = class label
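A sketch of the two impurity measures used to score candidate binary tests; the formulas follow the cited paper as reconstructed above, and scaling details should be checked against the original.

```python
# Sketch of the two impurity measures used to score candidate binary tests.
import numpy as np

def class_label_uncertainty(labels):
    # U1(A) = -|A| * sum_c p(c|A) * ln p(c|A)
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -len(labels) * float(np.sum(p * np.log(p + 1e-12)))

def offset_uncertainty(labels, offsets):
    # U2(A) = sum over positive patches of ||d_i - mean(d)||^2
    d = offsets[labels == 1]
    if len(d) == 0:
        return 0.0
    return float(np.sum((d - d.mean(axis=0)) ** 2))
```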
[Figure: original image → interest points → matched patches → detection at position y]
Detection • A patch at position y reaches a leaf that stores: 1. CL: the positive sample patch proportion 2. DL = {di}, i ∈ A • Possible center of the object: y + di
Hough vote [Figure: a patch at position y casts probabilistic votes at offsets d1, d2, d3. Source: B. Leibe]
Hough vote • For a location x, a given image patch I(y), and a tree T • x: center of the bounding box, x ≈ y + di • Confidence vote: 1. CL = weight 2. di: offset vector • Over all trees: the per-tree votes are averaged, p(E(x) | I(y)) = (1/T) Σt p(E(x) | I(y); Tt) • Accumulation over all image patches: the averaged votes from every patch are summed into the Hough image (see the sketch below)
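A minimal sketch of this vote accumulation over trees and patches; pass_down (tree traversal to a leaf) is a hypothetical helper, and the plain additive accumulation into a 2D Hough image is a simplification of the paper's Gaussian-smoothed votes.

```python
# Sketch of vote accumulation: each patch at position y is passed down every
# tree; the reached leaf adds CL/|DL|-weighted votes at y + d_i for every
# stored offset d_i, and the result is averaged over trees.
import numpy as np

def hough_votes(patches, positions, trees, image_shape, pass_down):
    """pass_down(tree, patch) -> leaf dict {"CL": float, "DL": (n, 2) array}."""
    votes = np.zeros(image_shape[:2])
    for patch, (px, py) in zip(patches, positions):
        for tree in trees:
            leaf = pass_down(tree, patch)
            if len(leaf["DL"]) == 0:
                continue
            w = leaf["CL"] / len(leaf["DL"])             # confidence weight per offset
            for dx, dy in leaf["DL"]:
                cx, cy = int(px + dx), int(py + dy)      # hypothesised center x ≈ y + d_i
                if 0 <= cy < votes.shape[0] and 0 <= cx < votes.shape[1]:
                    votes[cy, cx] += w
    return votes / len(trees)                            # average over trees
```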
Multi-Scale and Multi-Ratio • Multi Scale: 3D Votes (x, y, scale) • Multi-Ratio: 4D Votes (x, y, scale, ratio)
UIUC Cars - Multi Scale [Figure: example detections at the equal error rate (EER), marked correct or wrong]
Reference • http://mi.eng.cam.ac.uk/~tkk22/iccv09_tutorial • Pedestrian detection using Hough forests (利用霍夫森林建構行人偵測技術), 陳仕儒, Master's thesis, Department of Electrical Engineering, Tsing Hua University, 2012 • An Introduction to Random Forests for Multi-class Object Detection, J. Gall