Multi-camera Video Surveillance: Detection, Occlusion Handling, Tracking and Event Recognition Oytun Akman
Overview • Surveillance Systems • Single Camera Configuration • Moving Object Detection • Tracking • Event Recognition • Multi-camera Configuration • Moving Object Detection • Occlusion Handling • Tracking • Event Recognition
Single Camera Configuration • Moving Object Detection (MOD) • Tracking • Event Recognition
Single Camera Configuration – Moving Object Detection (MOD) • Input Image − Background Image = Foreground Mask
Single Camera Configuration – Moving Object Detection (MOD) • Frame Differencing (M. Piccardi, 1996) • Eigenbackground Subtraction (N. Oliver, 1999) • Parzen Window (KDE) Based MOD (A. Elgammal, 1999) • Mixture of Gaussians Based MOD (W. E. Grimson, 1999)
Single Camera Configuration – MOD – Frame Differencing • Foreground mask detection • Background model update
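The two steps above can be sketched as follows; this is a minimal illustration, not the thesis implementation, and the threshold and learning rate `alpha` are assumed values:

```python
import numpy as np

def frame_difference(frame, background, threshold=25):
    """Foreground mask: pixels whose absolute difference from the
    background model exceeds the threshold."""
    diff = np.abs(frame.astype(np.float64) - background)
    return diff > threshold

def update_background(frame, background, mask, alpha=0.05):
    """Running-average update; foreground pixels keep the old model so
    moving objects do not bleed into the background."""
    blended = (1 - alpha) * background + alpha * frame
    return np.where(mask, background, blended)

# Toy example: a flat background and a frame with a bright "object".
background = np.full((4, 4), 50.0)
frame = background.copy()
frame[1:3, 1:3] = 200.0          # the moving object
mask = frame_difference(frame, background)
background = update_background(frame, background, mask)
```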
Single Camera Configuration – MOD – Eigenbackground Subtraction • Principal Component Analysis (PCA) • Reduce the data dimension • Capture the major variance • Reduced data represents the background model (http://web.media.mit.edu/~tristan/phd/dissertation/chapter5.html)
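A compact sketch of the idea, assuming a ramp image under varying illumination as the training sequence: training frames are stacked as rows, PCA (via SVD) captures the major variance, and a new frame is reconstructed from the reduced eigenspace; large reconstruction residuals mark foreground. The number of components and threshold are illustrative:

```python
import numpy as np

# Training frames: the same scene under changing illumination.
base = np.arange(16.0)
frames = np.stack([(1 + 0.1 * i) * base for i in range(10)])

mean = frames.mean(axis=0)
U, S, Vt = np.linalg.svd(frames - mean, full_matrices=False)
eigvecs = Vt[:1]                      # keep the top component (assumed)

new_frame = 1.5 * base
new_frame[5] += 100.0                 # inject a foreground pixel
coeffs = eigvecs @ (new_frame - mean) # project onto the eigenspace
recon = mean + eigvecs.T @ coeffs     # reconstruct the background
mask = np.abs(new_frame - recon) > 20.0   # threshold is an assumption
```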
Single Camera Configuration – MOD – Parzen Window Based • The probability of observing a pixel intensity value is estimated nonparametrically from previously observed sample intensities
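A minimal sketch of the kernel density estimate: the probability of a new intensity is the average of Gaussian kernels centered at recent sample intensities. The bandwidth and decision threshold are assumed values:

```python
import numpy as np

def kde_probability(x, samples, sigma=5.0):
    """Parzen-window estimate with a Gaussian kernel of bandwidth sigma."""
    z = (x - samples) / sigma
    kernels = np.exp(-0.5 * z ** 2) / (sigma * np.sqrt(2 * np.pi))
    return kernels.mean()

samples = np.array([100.0, 102.0, 98.0, 101.0, 99.0])  # recent background
p_bg = kde_probability(100.0, samples)   # a typical background intensity
p_fg = kde_probability(200.0, samples)   # a very different intensity
is_foreground = p_fg < 1e-6              # threshold is an assumption
```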
Single Camera Configuration – MOD – Mixture of Gaussians Based • Each pixel is modeled by a mixture of K Gaussian distributions • The probability of observing pixel value x_N at time N is P(x_N) = Σ_{j=1}^{K} w_j η(x_N; μ_j, Σ_j), where η is a Gaussian density and, assuming that the R, G, B channels are independent, Σ_j = σ_j² I
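The mixture probability can be evaluated per channel as below; the weights, means, and variances are illustrative values, not learned ones:

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

weights = np.array([0.7, 0.2, 0.1])      # mixture weights, sum to 1
means   = np.array([50.0, 120.0, 200.0])
sigmas  = np.array([5.0, 10.0, 20.0])

def mog_probability(x):
    """P(x) as the weighted sum of the K Gaussian components."""
    return float(np.sum(weights * gaussian(x, means, sigmas)))

# A value near a heavy, low-variance mode is likely background.
p_bg = mog_probability(50.0)
p_fg = mog_probability(90.0)
```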
Single Camera Configuration – Tracking • Object Association • Mean-shift Tracker (D. Comaniciu, 2003) • Cam-shift Tracker (G. R. Bradski, 1998) • Pyramidal Kanade-Lucas-Tomasi Tracker (KLT) (J. Y. Bouguet, 1999) (A constant-velocity Kalman filter is associated with each tracker)
Single Camera Configuration – Tracking – Object Association • O_i(t) = O_j(t+1) if • Bounding boxes overlap • D(O_i(t), O_j(t+1)) < threshold_md, where D() is a distance metric between the color histograms of the objects • Kullback-Leibler divergence • Bhattacharyya coefficient
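The histogram comparison can be sketched with the Bhattacharyya coefficient (1 for identical normalized histograms, 0 for disjoint ones); the 3-bin histograms below are illustrative:

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two (unnormalized) histograms."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))

hist_t  = np.array([10.0, 30.0, 60.0])   # object O_i at time t
hist_t1 = np.array([12.0, 28.0, 60.0])   # candidate O_j at time t+1
other   = np.array([60.0, 30.0, 10.0])   # an unrelated object

same_score  = bhattacharyya(hist_t, hist_t1)   # high: likely a match
other_score = bhattacharyya(hist_t, other)     # low: likely not
```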
Single Camera Configuration – Tracking – Mean-shift Tracker • The similarity function between the target model q and the candidate model p(y) is the Bhattacharyya coefficient ρ(y) = Σ_{u=1}^{m} √(p_u(y) q_u), where p and q are m-bin color histograms (http://www.lisif.jussieu.fr/~belaroussi/face_track/CamshiftApproach.htm)
Single Camera Configuration – Tracking – Mean-shift Tracker – Simulation Result
Single Camera Configuration – Tracking – Cam-shift Tracker • A backprojection image (probability distribution image) is calculated • The mean-shift algorithm is used to find the mode of the probability distribution image around the previous target location
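The mode-seeking step can be sketched on a synthetic backprojection image: a window is iteratively moved to the centroid of the probability mass it covers until it settles on the mode. The image, window size, and iteration count are illustrative:

```python
import numpy as np

h, w = 50, 50
ys, xs = np.mgrid[0:h, 0:w]
# Synthetic backprojection: a single probability peak at (x=30, y=20).
prob = np.exp(-((xs - 30) ** 2 + (ys - 20) ** 2) / 20.0)

cx, cy, half = 10, 10, 8          # initial window center and half-size
for _ in range(30):
    x0, x1 = max(cx - half, 0), min(cx + half + 1, w)
    y0, y1 = max(cy - half, 0), min(cy + half + 1, h)
    win = prob[y0:y1, x0:x1]
    m = win.sum()
    # Move the window center to the centroid of the covered mass.
    cx = int(round((win * xs[y0:y1, x0:x1]).sum() / m))
    cy = int(round((win * ys[y0:y1, x0:x1]).sum() / m))
```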
Single Camera Configuration – Tracking – Cam-shift Tracker – Simulation Result
Single Camera Configuration – Tracking – Pyramidal KLT • The optical flow d = [dx dy] of a good feature point (corner) is found by minimizing the error function ε(d) = Σ_{x∈W} (I(x) − J(x + d))² over a window W around the point (http://www.suri.it.okayama-u.ac.jp/research/2001/s-takahashi/s-takahashi.html)
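A single-level Lucas-Kanade sketch (the pyramidal version repeats this coarse-to-fine): linearizing the error gives the least-squares system [Ix Iy] d = −It inside the window. The synthetic image I(x, y) = x·y is bilinear, so a one-pixel shift is recovered exactly; the window size is an assumed value:

```python
import numpy as np

ys, xs = np.mgrid[0:20, 0:20].astype(np.float64)
I = xs * ys                 # frame at time t
J = (xs - 1.0) * ys         # frame at t+1: pattern shifted right by 1

Iy, Ix = np.gradient(I)     # np.gradient returns d/drow, then d/dcol
It = J - I                  # temporal derivative

win = (slice(5, 15), slice(5, 15))           # 10x10 window (assumed)
A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)
b = -It[win].ravel()
d, *_ = np.linalg.lstsq(A, b, rcond=None)    # d = [dx, dy]
```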
Single Camera Configuration – Tracking – Pyramidal KLT – Simulation Results
Single Camera Configuration – Event Recognition – Hidden Markov Models (HMM) • GM-HMMs, trained on proper object trajectories, are used to model the traffic flow (F. Porikli, 2004) (F. Bashir, 2005) • m: the frame in which the object enters the FOV • n: the frame in which the object leaves the FOV
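Recognition scores a trajectory's likelihood under each trained model. The slides use continuous GM-HMMs; the sketch below shows the same forward-algorithm computation for a tiny discrete-observation HMM with illustrative parameters:

```python
import numpy as np

pi = np.array([1.0, 0.0])                 # initial state distribution
A  = np.array([[0.9, 0.1],                # state transition matrix
               [0.2, 0.8]])
B  = np.array([[0.8, 0.2],                # emission: P(obs | state)
               [0.3, 0.7]])

def likelihood(obs):
    """Forward algorithm: P(observation sequence | model)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return float(alpha.sum())

p = likelihood([0, 1])
```

A trajectory consistent with the model (here, repeated observation 0 from the dominant state) scores higher than an inconsistent one, which is how normal and abnormal trajectories are separated.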
Single Camera Configuration – Event Recognition – Simulation Result
Multi-camera Configuration • Background Modeling • Occlusion Handling • Tracking • Event Recognition
Multi-camera Configuration – Background Modeling • Three background modeling algorithms • Foreground Detection by Unanimity • Foreground Detection by Weighted Voting • Mixture of Multivariate Gaussians Background Model
Multi-camera Configuration – Background Modeling • A common field of view must be defined to specify the region in which the cameras will cooperate
Multi-camera Configuration – Background Modeling – Unanimity • A pixel is labeled foreground only if both views agree: if (x is foreground) && (x′ is foreground) → foreground, where x′ is the corresponding pixel in the other view
Multi-camera Configuration – Background Modeling – Weighted Voting • w₁ and w₂ are the coefficients that adjust the contributions of the cameras. Generally, the contribution of the first camera (the reference camera, with better positioning) is larger than that of the second one: w₁ > w₂
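Both fusion rules can be sketched on two per-camera masks (already warped to a common view); the weights and threshold below are illustrative. Note how weighted voting keeps a pixel that only the reference camera detects, which unanimity would discard:

```python
import numpy as np

mask_cam1 = np.array([[1, 1, 0],
                      [0, 1, 0]], dtype=bool)   # reference camera
mask_cam2 = np.array([[1, 0, 0],
                      [0, 1, 1]], dtype=bool)

# Unanimity: both cameras must agree.
unanimity = mask_cam1 & mask_cam2

# Weighted voting: reference camera weighted more heavily.
w1, w2 = 0.7, 0.3                  # illustrative coefficients, w1 > w2
score = w1 * mask_cam1 + w2 * mask_cam2
weighted = score > 0.5             # threshold is an assumption
```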
Multi-camera Configuration – Background Modeling – Weighted Voting
Multi-camera Configuration – Background Modeling – Mixture of Multivariate Gaussians • Each pixel is modeled by a mixture of K multivariate Gaussian distributions
Multi-camera Configuration – Background Modeling – Mixture of Multivariate Gaussians (Figure: input image, mixture-of-multivariate-Gaussians result, single-camera MOG result)
Multi-camera Configuration – Background Modeling – Conclusions • Projection errors due to the planar-object assumption • Erroneous foreground masks • False segmentation results • Cameras must be mounted at high altitudes relative to object heights • Background modeling by unanimity • False-segmented regions are eliminated • Any camera failure causes a failure in the final mask • Solved by weighted voting • The multivariate MOG method can segment vehicles missed by the single-camera MOG method
Multi-camera Configuration – Occlusion Handling • Occlusion • A primary issue for surveillance systems • False foreground segmentation results • Tracking failures • Difficult to solve with a single-camera configuration • Occlusion-free view generation using multiple cameras • Utilization of 3D information • Availability of different points of view
Multi-camera Configuration – Occlusion Handling – Block Diagram
Multi-camera Configuration – Occlusion Handling – Background Subtraction • Foreground masks are obtained using background subtraction
Multi-camera Configuration – Occlusion Handling – Oversegmentation • The foreground mask is oversegmented using the “Recursive Shortest Spanning Tree” (RSST) and K-means algorithms (Figure: RSST and K-means results)
Multi-camera Configuration – Occlusion Handling – Top-view Generation
Multi-camera Configuration – Occlusion Handling – Top-view Generation • The corresponding match of a segment is found by comparing the color histograms of the target segment and the candidate segments on the epipolar line (Figure: RSST and K-means results)
Multi-camera Configuration – Occlusion Handling – Clustering • Segments are grouped by a “shortest spanning tree” algorithm with a weight function over segment pairs (Figure: RSST and K-means results)
Multi-camera Configuration – Occlusion Handling – Clustering • After cutting the edges whose weights exceed a certain threshold (Figure: RSST and K-means results)
Multi-camera Configuration – Occlusion Handling – Conclusions • Successful results for partially occluded objects • Under strong occlusion • Epipolar matching fails • Objects are oversegmented or undersegmented • The problem is solved if one of the cameras can see the object without occlusion • RSST and K-means • produce similar results • K-means has better real-time performance
Multi-camera Configuration – Tracking – Kalman Filters • Advantage: continuous and correct tracking as long as one of the cameras is able to view the object • Tracking is performed in both views using Kalman filters • 2D constant-velocity state model: s = [x, y, v_x, v_y]^T • State transition model: s_{k+1} = F s_k + w_k • Observation model: z_k = H s_k + v_k
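A minimal constant-velocity Kalman filter sketch with position-only observations; the noise covariances and the simulated trajectory are illustrative:

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)   # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # observe position only
Q = np.eye(4) * 1e-4                          # process noise (assumed)
R = np.eye(2) * 1e-2                          # measurement noise (assumed)

s = np.zeros(4)        # state [x, y, vx, vy]
P = np.eye(4)          # state covariance

def step(s, P, z):
    # Predict
    s = F @ s
    P = F @ P @ F.T + Q
    # Update with measurement z
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    s = s + K @ (z - H @ s)
    P = (np.eye(4) - K @ H) @ P
    return s, P

# Track an object moving at a constant (2, 1) px/frame.
for k in range(1, 20):
    s, P = step(s, P, np.array([2.0 * k, 1.0 * k]))
```

After a few frames the velocity estimate converges, which is what lets the filter coast through short detection gaps.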
Multi-camera Configuration – Tracking – Object Matching • Objects in different views are related to each other via homography
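The matching step can be sketched as mapping an object's position through the inter-view homography (in homogeneous coordinates) and picking the nearest candidate in the other view. The matrix H below is an illustrative translation, not a calibrated homography:

```python
import numpy as np

H = np.array([[1.0, 0.0, 10.0],     # illustrative: shift x by +10
              [0.0, 1.0, -5.0],     # and y by -5
              [0.0, 0.0,  1.0]])

def map_point(H, pt):
    """Map a 2D point through a homography (homogeneous coordinates)."""
    x = H @ np.array([pt[0], pt[1], 1.0])
    return x[:2] / x[2]

obj_view1 = (40.0, 25.0)                 # object centroid in view 1
candidates_view2 = [(90.0, 90.0), (50.0, 20.0), (10.0, 10.0)]

mapped = map_point(H, obj_view1)         # expected location in view 2
dists = [np.hypot(mapped[0] - c[0], mapped[1] - c[1])
         for c in candidates_view2]
match = candidates_view2[int(np.argmin(dists))]
```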
Multi-camera Configuration – Tracking – Simulation Results • Multi-camera Tracking
Multi-camera Configuration – Tracking – Simulation Results • Single-camera Tracking (each view)
Multi-camera Configuration – Tracking – Simulation Results • Multi-camera Tracking
Multi-camera Configuration – Tracking – Simulation Results • Single-camera Tracking (each view)
Multi-camera Configuration – Event Recognition – Trajectories • Trajectories extracted from both views are concatenated to obtain a multi-view trajectory
Multi-camera Configuration – Event Recognition – Viterbi Distances of Training Samples
Multi-camera Configuration – Event Recognition – Simulation Results with Abnormal Data • Average distance to GM_HMM_1: 10.20 • Average distance to GM_HMM_2: 10.06 • Average distance to GM_HMM_1+2: 20.04