
Robust Multi-Pedestrian Tracking in Thermal-Visible Surveillance Videos






Presentation Transcript


  1. Robust Multi-Pedestrian Tracking in Thermal-Visible Surveillance Videos Alex Leykin, Yang Ran, and Riad Hammoud

  2. Goal Create a pedestrian tracker that operates in: • Varying illumination conditions • Crowded environments To achieve this, we create a fusion pedestrian tracker that uses input from: • IR camera • RGB camera Our approach consists of three stages: BG subtraction, Bayesian tracker, pedestrian classifier

  3. Background Model Two stacks of codeword values (codebooks): • Color: μRGB, Ilow, Ihigh • Thermal: tlow, thigh

  4. Adaptive Background Update • Match pixel p to codebook b if: Ilow < I(p) < Ihigh, (RGB(p) ∙ μRGB) < TRGB, t(p)/thigh > Tt1, and t(p)/tlow > Tt2 • If there is no match, create a new codeword • Else, update the matching codeword with the new pixel information • If more than one codeword matches, merge the matching codewords
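The match test above can be sketched as a small predicate. This is an illustrative reading of the slide, not the authors' implementation: the codeword fields (mu_rgb, I_low, I_high, t_low, t_high), the color-distortion test, and all threshold values are assumptions.

```python
import numpy as np

# Hypothetical codebook match test, loosely following the slide's conditions.
# Field names and thresholds (T_RGB, T_t1, T_t2) are illustrative assumptions.

def matches_codeword(rgb, intensity, thermal, cw,
                     T_RGB=0.1, T_t1=0.8, T_t2=1.2):
    """Return True if pixel (rgb, intensity, thermal) matches codeword cw."""
    rgb = np.asarray(rgb, dtype=float)
    mu = np.asarray(cw["mu_rgb"], dtype=float)
    # Brightness must fall inside the codeword's learned intensity range.
    if not (cw["I_low"] < intensity < cw["I_high"]):
        return False
    # Color distortion: one minus the cosine of the angle between the pixel
    # color and the codeword's mean color (small means similar hue).
    cos_angle = rgb @ mu / (np.linalg.norm(rgb) * np.linalg.norm(mu) + 1e-9)
    if 1.0 - cos_angle >= T_RGB:
        return False
    # Thermal ratios against the stored low/high thermal bounds.
    return thermal / cw["t_high"] > T_t1 and thermal / cw["t_low"] > T_t2
```

A pixel that matches no codeword would trigger creation of a new codeword; multiple matches would be merged, per the slide.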

  5. Subtraction Results Color model only Combined color and thermal model

  6. Tracking The location of each pedestrian is estimated probabilistically based on: • The current image • A model of pedestrians • A model of obstacles The goal of our tracking system is to find the candidate state x′ (a set of bodies along with their parameters) which, given the last known state x, best fits the current observation z: P(x′ | z, x) ∝ P(z | x′) · P(x′ | x), where P(z | x′) is the observation likelihood and P(x′ | x) is the state prior

  7. Tracking – Accepting the State x′ and x are the candidate and current states, P(x) is the stationary distribution of the Markov chain, and mt is the proposal distribution. A candidate state x′ is drawn with probability mt(x′ | x) and then accepted with probability α(x, x′)
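The draw-then-accept step above is the standard Metropolis-Hastings update; a minimal sketch follows, assuming the usual acceptance ratio α(x, x′) = min(1, P(x′)·mt(x | x′) / (P(x)·mt(x′ | x))), with caller-supplied log densities (the function names are illustrative, not from the paper).

```python
import math
import random

# Illustrative Metropolis-Hastings step for the tracker's Markov chain.
# log_P(x): log of the stationary (posterior) density P.
# log_m(a, b): log of the proposal density m_t(a | b).

def mh_step(x, propose, log_P, log_m, rng=random.random):
    """Draw x' ~ m_t(. | x) and accept it with probability alpha(x, x')."""
    x_new = propose(x)
    # log alpha = log[ P(x') m_t(x | x') ] - log[ P(x) m_t(x' | x) ]
    log_alpha = (log_P(x_new) + log_m(x, x_new)) - (log_P(x) + log_m(x_new, x))
    if math.log(max(rng(), 1e-300)) < min(0.0, log_alpha):
        return x_new          # accept the candidate state
    return x                  # reject: stay at the current state
```

With a symmetric proposal the mt terms cancel and the ratio reduces to P(x′)/P(x).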

  8. Tracking: Priors Constraints on the body parameters: • N(hμ, hσ²) and N(wμ, wσ²): body height and width • U(x)R and U(y)R: body coordinates weighted uniformly within the rectangular region R of the floor map • N(μdoor, σdoor): distance to the closest door (for new bodies) Temporal continuity: • d(xt, x′t−1) and d(yt, y′t−1): variation from the Kalman-predicted position • d(wt, wt−1) and d(ht, ht−1): variation from the previous size
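A sketch of how these priors could combine into a single log-prior score for a candidate body. All parameter values (mean height 1.7 m, etc.) are illustrative placeholders, not values from the paper.

```python
import math

# Hypothetical log-prior over a candidate body, combining the slide's terms:
# Gaussian priors on width/height, temporal-continuity penalties, and a
# Gaussian on the distance to the closest door for newly created bodies.

def log_gauss(x, mu, sigma):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def body_log_prior(w, h, prev=None, door_dist=None,
                   w_mu=0.5, w_sigma=0.1, h_mu=1.7, h_sigma=0.15,
                   cont_sigma=0.05, door_sigma=1.0):
    lp = log_gauss(w, w_mu, w_sigma) + log_gauss(h, h_mu, h_sigma)
    if prev is not None:          # temporal continuity: d(w_t, w_{t-1}), d(h_t, h_{t-1})
        pw, ph = prev
        lp += log_gauss(w - pw, 0.0, cont_sigma) + log_gauss(h - ph, 0.0, cont_sigma)
    if door_dist is not None:     # new bodies: distance to the closest door
        lp += log_gauss(door_dist, 0.0, door_sigma)
    return lp
```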

  9. Tracking Likelihoods: Distance Weight Plane Problem: blob trackers ignore blob position in 3D (see Zhao and Nevatia, CVPR 2004). Solution: employ a “distance weight plane” Dxy = |Pxyz − Cxyz|, where P and C are the world coordinates of the reference point and the camera, respectively
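A toy construction of such a plane: each pixel's weight is the Euclidean distance between the camera center and the 3D point the pixel back-projects to. The `back_project` callable is a stand-in for a real camera model and is an assumption of this sketch.

```python
import numpy as np

# Distance weight plane sketch: D[y, x] = |P_xyz - C_xyz|, where C is the
# camera center and P the 3D reference point for pixel (x, y).

def distance_weight_plane(height, width, back_project, C):
    """back_project(x, y) -> 3D world point; C: camera center (x, y, z)."""
    C = np.asarray(C, dtype=float)
    D = np.zeros((height, width))
    for y in range(height):
        for x in range(width):
            P = np.asarray(back_project(x, y), dtype=float)
            D[y, x] = np.linalg.norm(P - C)
    return D
```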

  10. Tracking Likelihoods: Z-buffer 0 = background, 1 = furthermost body, 2 = next closest body, etc.
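A toy z-buffer in the slide's convention, with bodies simplified to axis-aligned rectangles; labels are assigned far-to-near so a nearer body overwrites a farther one. The rectangle representation is an assumption for illustration.

```python
import numpy as np

# Z-buffer sketch: 0 = background, 1 = furthermost body, 2 = next closest,
# and so on. Nearer bodies are painted last, so they occlude farther ones.

def render_zbuffer(shape, bodies):
    """bodies: list of (distance, (x, y, w, h)) rectangles in image coords."""
    Z = np.zeros(shape, dtype=int)
    # Sort far to near; nearer bodies get larger labels and paint later.
    for label, (_, (x, y, w, h)) in enumerate(
            sorted(bodies, key=lambda b: -b[0]), start=1):
        Z[y:y + h, x:x + w] = label
    return Z
```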

  11. Tracking: Likelihoods The color observation likelihood is based on the Bhattacharyya distance between the candidate and observed color histograms. Implementing the z-buffer (Z) and distance weight plane (D) allows the multiple-body configuration to be computed in one computationally efficient step. Let I be the set of all blob pixels and O the set of body pixels
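The Bhattacharyya distance between two normalized histograms is standard; a minimal sketch (the histogram binning and normalization choices here are generic, not the paper's):

```python
import numpy as np

# Bhattacharyya distance between two color histograms, as used for the
# color observation likelihood. Inputs are normalized to sum to 1 first.

def bhattacharyya_distance(h1, h2):
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    bc = np.sum(np.sqrt(h1 * h2))           # Bhattacharyya coefficient in [0, 1]
    return np.sqrt(max(0.0, 1.0 - bc))      # 0 for identical histograms
```

Identical histograms give distance 0; histograms with no overlapping mass give distance 1.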

  12. Tracking: Jump-Diffuse Transitions • Add a new body • Delete a body • Recover a recently deleted body • Change body dimensions • Change body position (optimize with mean shift)

  13. Tracking: Anisotropic Weighted Mean Shift [Figure: classic mean shift vs. our mean shift over frames t−1, t]
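A minimal sketch of one weighted mean-shift iteration with a flat kernel: the new center is the weight-averaged position of the points inside the current window. In the tracker the weights would come from the appearance model; with uniform weights this reduces to the classic (unweighted) mean shift. This is a generic illustration, not the paper's anisotropic variant.

```python
import numpy as np

# One weighted mean-shift centroid update over 2D points with a flat
# (uniform) kernel of the given bandwidth.

def mean_shift_step(points, weights, center, bandwidth):
    points = np.asarray(points, dtype=float)
    weights = np.asarray(weights, dtype=float)
    d = np.linalg.norm(points - center, axis=1)
    inside = d < bandwidth                  # flat kernel: points in the window
    w = weights[inside]
    if w.sum() == 0:
        return np.asarray(center, dtype=float)
    # Weighted average of the in-window points becomes the new center.
    return (points[inside] * w[:, None]).sum(axis=0) / w.sum()
```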

  14. Tracking Results

  15. Finding Gait in Spatio-temporal Space Symmetries of the gait patterns • Periodic Pattern Grouping Theory: • A two-dimensional pattern that repeats along one dimension is called a frieze pattern in the mathematics and geometry literature • Group theory provides a powerful tool for analyzing such patterns • Mapping gait into repetitive texture • Translational symmetry: Class P4 • Detection: verifying spatio-temporal texture • Localization: extract orientation (trajectory), frequency (period), representative motif (signature)

  16. Finding Gait in Spatio-temporal Space Classifying pedestrians: X-t image → extract lattice → signature → results. Details in Y. Ran, I. Weiss, Q. Zheng, and L. S. Davis. Pedestrian detection via periodic motion analysis. IJCV 2007
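The periodicity analysis above needs the dominant temporal frequency of the gait pattern. A toy period estimate via a 1-D FFT of a per-frame silhouette measurement (e.g. bounding-box width) illustrates the idea; this is a simplification of the frieze-pattern analysis, not the paper's method.

```python
import numpy as np

# Toy gait-period estimate: find the dominant frequency of a 1-D signal
# sampled once per frame, then invert it to get the period.

def estimate_period(signal, fps=1.0):
    x = np.asarray(signal, dtype=float) - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(x))
    k = 1 + int(np.argmax(spectrum[1:]))    # skip the DC bin
    freq = k * fps / len(x)                 # cycles per second (per frame if fps=1)
    return 1.0 / freq
```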

  17. Classification Results

  18. Tracking Results

  19. Pedestrian Detection

  20. Contributions • Robust to illumination changes • Resolving track initialization ambiguity with MCMC • Non-unique body-blob correspondence • Gait detector runs in real time

  21. Future Work • Extend binary background mask with foreground probability values • Incorporate these probabilities into appearance-based fitness equation for particle filter-based tracker • Utilize tracklet stitching (via particle tracker) to decrease the number of broken paths

  21. Acknowledgements Organizers of OTCBVS Benchmark Dataset Collection http://www.cse.ohio-state.edu/otcbvs-bench

  23. Thank you! alexleykin.zapto.org
