Tracking by Sampling Trackers Junseok Kwon* and Kyoung Mu Lee, Computer Vision Lab., Dept. of EECS, Seoul National University, Korea Homepage: http://cv.snu.ac.kr
Goal of Visual Tracking • Robustly track the target in real-world scenarios [example frames #1 and #43 of a test sequence]
Bayesian Tracking Approach • Observation cues such as edge and intensity are combined to obtain the Maximum a Posteriori (MAP) estimate of the target state.
State Sampling • The visual tracker explores a state space of x position, y position, and scale. • The MAP estimate is obtained by Monte Carlo sampling of states, guided by the visual tracker.
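As a rough illustration only (not the paper's exact sampler), the MAP-by-sampling idea can be sketched as follows; the Gaussian random-walk proposal, the (x, y, scale) state layout, and the `likelihood` callable are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def map_estimate_by_sampling(likelihood, prev_state, n_samples=300):
    """MAP estimate of the target state by Monte Carlo sampling.

    The state is (x, y, scale); candidates are drawn from a Gaussian around
    the previous state (a simple random-walk proposal chosen for illustration).
    `likelihood(state)` stands in for the tracker's observation model and
    should return a score; all names here are placeholders.
    """
    noise = rng.normal(loc=0.0, scale=[10.0, 10.0, 0.05], size=(n_samples, 3))
    candidates = np.asarray(prev_state, dtype=float) + noise
    scores = np.array([likelihood(s) for s in candidates])
    return candidates[int(np.argmax(scores))]  # highest-scoring sample ~ MAP
```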
Problem of Previous Works • The tracking environment changes over time, but the visual tracker stays fixed. • A fixed visual tracker cannot reflect the changing tracking environment well, so conventional trackers have difficulty obtaining good samples.
Our Approach: Tracker Sampling • Sample the tracker itself as well as the state. • Tracker sampling draws trackers #1, #2, ..., #M from a tracker space; each sampled tracker then performs state sampling over x position, y position, and scale.
Two Challenges • How is the tracker space defined? • When, and which, tracker should be sampled?
Challenge 1: Tracker Space • No previous work has attempted to define a tracker space. • The space is very difficult to design because a visual tracker is hard to describe formally.
Bayesian Tracking Approach • Go back to the Bayesian tracking formulation and its updating rule (written out below).
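For reference, the standard recursive Bayesian updating rule behind this formulation can be written as follows; the notation ($X_t$ for the state, $Y_{1:t}$ for the observations) is the usual textbook form, not copied from the slides.

```latex
% Recursive Bayesian updating rule (standard form)
p(X_t \mid Y_{1:t}) \;\propto\; p(Y_t \mid X_t)
    \int p(X_t \mid X_{t-1})\, p(X_{t-1} \mid Y_{1:t-1})\, dX_{t-1}

% MAP estimate of the target state
\hat{X}_t = \arg\max_{X_t} \, p(X_t \mid Y_{1:t})
```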
Bayesian Tracking Approach • What are the important ingredients of a visual tracker? 1. Appearance model 2. Motion model 3. State representation type 4. Observation type
Tracker Space • Appearance model • Motion model • State representation • Observation
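As a rough code illustration of how a point in this tracker space might be represented (the container, field types, and names below are my own, not from the paper):

```python
from dataclasses import dataclass
from typing import Sequence

import numpy as np

# Hypothetical container: one basic tracker is a combination of the four
# ingredients listed above; each ingredient is kept abstract here.
@dataclass
class BasicTracker:
    appearance_model: np.ndarray           # e.g. a sparse principal component of the target
    motion_model: np.ndarray               # e.g. a mean motion vector of a cluster
    state_representation: Sequence[tuple]  # e.g. which fragments of the target are used
    observation_type: float                # e.g. the variance of a Gaussian filter

def tracker_space(appearances, motions, state_reps, observations):
    """The tracker space as all combinations of the sampled ingredients."""
    return [BasicTracker(a, m, s, o)
            for a in appearances
            for m in motions
            for s in state_reps
            for o in observations]
```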
Challenge 2: Tracker Sampling • A tracker #m in the tracker space is specified by its appearance model, motion model, state representation type, and observation type. • When, and which, tracker should be sampled so as to reflect the current tracking environment?
Reversible Jump MCMC • We use the RJ-MCMC method for tracker sampling. • Add/delete moves update the sets of sampled appearance models, motion models, state representation types, and observation types, which together define the sampled basic trackers.
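A minimal sketch of the add/delete move pattern, assuming a generic Metropolis-style acceptance on a set-level score; the proposal probabilities, dimension-matching terms, and the paper's exact acceptance ratios are deliberately omitted.

```python
import math
import random

def rjmcmc_step(current_set, candidates, log_score):
    """One add/delete move over a set of sampled models (generic RJ-MCMC style).

    `log_score(model_set)` is a placeholder that should return a log-score of
    the whole set, e.g. a total likelihood over recent frames.
    """
    proposal = list(current_set)
    if proposal and (not candidates or random.random() < 0.5):
        del proposal[random.randrange(len(proposal))]   # "delete" move
    elif candidates:
        proposal.append(random.choice(candidates))      # "add" move
    else:
        return current_set

    # Metropolis-Hastings acceptance on the change in set score.
    accept_prob = min(1.0, math.exp(log_score(proposal) - log_score(current_set)))
    return proposal if random.random() < accept_prob else current_set
```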
Sampling of Appearance Model • Make candidate appearance models using Sparse Principal Component Analysis (SPCA)*. • The candidates are principal components of the target appearance. * A. d'Aspremont et al. A direct formulation for sparse PCA using semidefinite programming. SIAM Review, 2007.
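A rough sketch of how such candidates could be generated with an off-the-shelf implementation; scikit-learn's `SparsePCA` is used as a stand-in for the cited SDP-based formulation, and the patch shape and component count are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

def appearance_candidates(target_patches, n_candidates=5):
    """Candidate appearance models as sparse principal components.

    `target_patches` is assumed to be an array of shape (n_frames, H, W)
    holding the target appearance observed in recent frames.
    """
    n, h, w = target_patches.shape
    X = target_patches.reshape(n, h * w).astype(float)
    spca = SparsePCA(n_components=n_candidates, alpha=1.0, random_state=0)
    spca.fit(X)
    # Each sparse principal component is one candidate appearance model.
    return [c.reshape(h, w) for c in spca.components_]
```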
Sampling of Appearance Model • Our method keeps a limited number of appearance models. • An appearance model is accepted with an acceptance ratio that favors models which, when adopted as the target reference, increase the total likelihood score over recent frames.
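A hedged sketch of that acceptance test: it only shows the "accept if the total likelihood over recent frames improves" idea in Metropolis form; `log_likelihood` and all other names are placeholders, not the paper's exact ratio.

```python
import math
import random

def accept_appearance(new_model, old_model, recent_frames, log_likelihood):
    """Accept `new_model` if it raises the total likelihood of recent frames
    when adopted as the target reference.

    `log_likelihood(frame, model)` stands in for the tracker's own
    observation likelihood.
    """
    score_new = sum(log_likelihood(f, new_model) for f in recent_frames)
    score_old = sum(log_likelihood(f, old_model) for f in recent_frames)
    accept_prob = min(1.0, math.exp(score_new - score_old))
    return random.random() < accept_prob
```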
Sampling of Motion Model • Make candidate motion models using K-Harmonic Means clustering (KHM)*. • The candidates are the mean vectors of clusters of recent motion vectors. * B. Zhang, M. Hsu, and U. Dayal. K-harmonic means - a data clustering algorithm. HP Technical Report, 1999.
Sampling of Motion Model • Our method keeps a limited number of motion models. • A motion model is accepted with an acceptance ratio that favors models which, when set to the mean vector of their cluster, decrease the total clustering error of motion vectors over recent frames.
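A rough sketch of both steps for the motion model; scikit-learn's `KMeans` is used here as a stand-in for the cited K-harmonic means algorithm, and the acceptance test is again a generic "accept if the clustering error drops" rule rather than the paper's exact ratio.

```python
import math
import random

import numpy as np
from sklearn.cluster import KMeans

def motion_candidates(motion_vectors, n_candidates=3):
    """Candidate motion models as cluster mean vectors of recent motion vectors."""
    km = KMeans(n_clusters=n_candidates, n_init=10, random_state=0)
    km.fit(np.asarray(motion_vectors, dtype=float))
    return list(km.cluster_centers_), float(km.inertia_)  # candidates, clustering error

def accept_motion(new_error, old_error):
    """Accept a motion model that lowers the total clustering error."""
    accept_prob = min(1.0, math.exp(old_error - new_error))
    return random.random() < accept_prob
```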
Sampling of State Representation • Make candidate state representation types using the Vertical Projection of Edge (VPE)*. • The candidates describe the target as different combinations of multiple fragments (each fragment carrying position, intensity, and edge cues). * F. Wang, S. Yu, and J. Yang. Robust and efficient fragments-based tracking using mean shift. Int. J. Electron. Commun., 64(7):614–623, 2010.
Sampling of State Representation • Our method keeps a limited number of state representation types. • A state representation type is accepted with an acceptance ratio that favors types which reduce the total variance of the target appearance within each fragment over recent frames.
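A minimal sketch of that per-fragment variance criterion, assuming aligned target patches and fragments given as slice pairs; the frame layout, fragment encoding, and Metropolis-style acceptance are illustrative assumptions.

```python
import math
import random

import numpy as np

def fragment_variance(frames, fragments):
    """Total appearance variance inside each fragment, summed over fragments.

    `frames` is assumed to be an array (n_frames, H, W) of aligned target
    appearances; `fragments` is a list of (row_slice, col_slice) pairs
    describing one candidate state representation.
    """
    total = 0.0
    for rows, cols in fragments:
        patch = frames[:, rows, cols]                # one fragment over time
        total += float(np.var(patch, axis=0).sum())
    return total

def accept_state_representation(new_frags, old_frags, frames):
    """Prefer the representation whose fragments have lower appearance variance."""
    accept_prob = min(1.0, math.exp(fragment_variance(frames, old_frags)
                                    - fragment_variance(frames, new_frags)))
    return random.random() < accept_prob
```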
Sampling of Observation • Make candidate observation types using a Gaussian Filter Bank (GFB)*. • The candidates are the responses of multiple Gaussian filters with different variances. * J. Sullivan, A. Blake, M. Isard, and J. MacCormick. Bayesian object localisation in images. IJCV, 44(2):111–135, 2001.
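A small sketch of such a filter bank using SciPy's `gaussian_filter`; the specific sigma values are arbitrary placeholders, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def observation_candidates(image, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Candidate observation types as responses of a Gaussian filter bank.

    Each candidate is the image smoothed with a Gaussian of a different
    variance; the sigma values are illustrative only.
    """
    image = np.asarray(image, dtype=float)
    return {sigma: gaussian_filter(image, sigma=sigma) for sigma in sigmas}
```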
Sampling of Observation • Our method keeps a limited number of observation types. • An observation type is accepted with an acceptance ratio that favors types under which foregrounds become more similar to each other, and foregrounds become more distinct from backgrounds, over recent frames.
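As an illustration of that criterion, the sketch below uses a simple mean-distance proxy for foreground compactness versus foreground-background separation; this proxy and the Metropolis-style acceptance are my assumptions, not the measure used in the paper.

```python
import math
import random

import numpy as np

def separability(fg_features, bg_features):
    """Proxy score: foreground compactness vs. foreground-background distance.

    `fg_features` and `bg_features` are arrays (n_samples, d) of observation
    responses collected from recent frames under one observation type.
    """
    fg = np.asarray(fg_features, dtype=float)
    bg = np.asarray(bg_features, dtype=float)
    within = float(np.mean(np.linalg.norm(fg - fg.mean(axis=0), axis=1)))
    between = float(np.linalg.norm(fg.mean(axis=0) - bg.mean(axis=0)))
    return between - within  # larger = foregrounds tighter and further from background

def accept_observation(new_sep, old_sep):
    accept_prob = min(1.0, math.exp(new_sep - old_sep))
    return random.random() < accept_prob
```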
Overall Procedure • Tracker sampling: draw trackers #1, #2, ..., #M from the tracker space. • State sampling: each sampled tracker samples states (x position, y position, scale), with interaction between the trackers.
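The overall loop can be summarized in pseudocode-like Python; every callable and name below is a placeholder for a component described on the earlier slides, and the interaction step is left fully abstract.

```python
def track_sequence(frames, sample_trackers, sample_state, interact):
    """High-level loop of the method (illustrative sketch).

    `sample_trackers` returns the current set of sampled basic trackers,
    `sample_state` runs state sampling for one tracker on one frame, and
    `interact` fuses the per-tracker estimates into the final MAP estimate.
    """
    estimates = []
    for frame in frames:
        trackers = sample_trackers(frame)                     # tracker sampling
        states = [sample_state(t, frame) for t in trackers]   # state sampling
        estimates.append(interact(states))                    # interaction
    return estimates
```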
Qualitative Results Iron-man dataset
Qualitative Results Matrix dataset
Qualitative Results Skating1 dataset
Qualitative Results Soccer dataset
Quantitative Results Average center location errors in pixels. MC: Khan et al. MCMC-based particle filtering for tracking a variable number of interacting targets. PAMI 2005. IVT: Ross et al. Incremental learning for robust visual tracking. IJCV 2008. MIL: Babenko et al. Visual tracking with online multiple instance learning. CVPR 2009. VTD: Kwon et al. Visual tracking decomposition. CVPR 2010.
Summary • Visual tracker sampler: a new framework that samples the visual tracker itself as well as the state. • An efficient sampling strategy for sampling the visual tracker.