
Personal Driving Diary: Constructing a Video Archive of Everyday Driving Events



Presentation Transcript


  1. Personal Driving Diary: Constructing a Video Archive of Everyday Driving Events. IEEE Workshop on Motion and Video Computing (WMVC) / IEEE Workshop on Applications of Computer Vision (WACV), 2011. M. S. Ryoo, Jae-Yeong Lee, Ji Hoon Joung, Sunglok Choi, and Wonpil Yu, Electronics and Telecommunications Research Institute

  2. Introduction • The personal driving diary illustrates important driving events of the user. • It enables interactive search of video segments. • It helps the user analyze his/her driving habits and patterns. • The objective is to construct a system that automatically annotates and summarizes driving videos.

  3. Framework

  4. Geometry component (1/2) • Visual odometry [9] • Measures the self-motion of the camera.

  5. Geometry component (2/2) • Visual odometry • SIFT feature detection for each frame • Matching is performed using KLT optical flow • A ground plane is estimated using regular patterns on the ground (e.g. lanes and crosswalks) • This enables global localization of other objects on the plane.
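The per-frame SIFT detection and KLT matching described on this slide can be sketched with OpenCV as below. This is a minimal illustration, not the visual odometry of [9]: the essential-matrix pose recovery and the intrinsics matrix K stand in for details the slides do not give, and the ground-plane estimation step is omitted.

```python
import cv2
import numpy as np

sift = cv2.SIFT_create()

def track_frame_pair(prev_gray, cur_gray, K):
    # 1. Detect SIFT keypoints in the previous frame.
    kps = sift.detect(prev_gray, None)
    pts_prev = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)

    # 2. Match them into the current frame with KLT (pyramidal Lucas-Kanade).
    pts_cur, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, pts_prev, None)
    good_prev = pts_prev[status.ravel() == 1]
    good_cur = pts_cur[status.ravel() == 1]

    # 3. Estimate the camera self-motion from the matches.
    #    K (camera intrinsics) and the essential-matrix formulation are
    #    assumptions of this sketch, not taken from the paper.
    E, inliers = cv2.findEssentialMat(good_cur, good_prev, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, good_cur, good_prev, K, mask=inliers)
    return R, t  # rotation and unit-scale translation between the two frames
```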

  6. Detection component (1/3)

  7. Detection component (2/3) • Pedestrian detection • Adopts histogram of oriented gradients (HOG) features [3] and applies a sliding-window method • Windows with few vertical edges are filtered out
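A hedged sketch of this detection step: OpenCV's default HOG person detector performs the sliding-window search, and a simple Sobel-based check stands in for the "few vertical edges" filter. The edge measure and the threshold are assumptions, since the exact criterion is not given on the slide.

```python
import cv2
import numpy as np

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(frame, edge_thresh=10.0):
    # Sliding-window HOG detection over multiple scales.
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kept = []
    for (x, y, w, h) in boxes:
        window = gray[y:y + h, x:x + w]
        # Vertical edges respond to the horizontal gradient (Sobel in x);
        # edge_thresh is an illustrative value.
        vertical_edges = cv2.Sobel(window, cv2.CV_32F, 1, 0, ksize=3)
        if np.mean(np.abs(vertical_edges)) > edge_thresh:
            kept.append((x, y, w, h))
    return kept
```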

  8. Detection component (3/3) • Vehicle detection • Applies Viola and Jones' method [15] to detect the rear view of appearing vehicles • Rectangle features [15] • [15] P. Viola and M. Jones. Rapid object detection using a boosted cascade of simple features. In CVPR, 2001.
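Applying a trained Viola-Jones cascade in OpenCV looks roughly as follows. The cascade file name is a placeholder: a rear-view vehicle cascade would have to be trained with the boosted rectangle-feature framework of [15], and OpenCV does not ship one.

```python
import cv2

# "rear_vehicle_cascade.xml" is a hypothetical trained cascade file.
cascade = cv2.CascadeClassifier("rear_vehicle_cascade.xml")

def detect_vehicles(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Multi-scale sliding-window scan with the boosted cascade of
    # rectangle (Haar-like) features.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
```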

  9. Tracking component • A single hypothesis is maintained for each object • Relies on a color appearance model of the object • Each object hypothesis is computed from its position, size, and color histogram
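One way such a single-hypothesis, color-appearance tracker could be written is sketched below. The histogram parameters, the correlation-based matching score, and the update rule are illustrative assumptions, not the paper's exact formulation.

```python
import cv2

def color_histogram(frame, box):
    # Hue/saturation histogram of the object patch (bin counts are assumptions).
    x, y, w, h = box
    patch = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([patch], [0, 1], None, [30, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def update_hypothesis(hypothesis, frame, detections, min_similarity=0.5):
    # hypothesis = {"box": (x, y, w, h), "hist": ...}: pick the detection whose
    # color histogram best matches the stored appearance model.
    best, best_score = None, min_similarity
    for box in detections:
        score = cv2.compareHist(hypothesis["hist"],
                                color_histogram(frame, box),
                                cv2.HISTCMP_CORREL)
        if score > best_score:
            best, best_score = box, score
    if best is not None:
        hypothesis["box"] = best
        hypothesis["hist"] = color_histogram(frame, best)
    return hypothesis
```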

  10. Event analysis component • Its role is to label all ongoing events of the vehicle given a continuous video sequence. • Events are recognized by hierarchically analyzing the relationships among the detected sub-events. • Uses a spatio-temporal relationship decision tree.

  11. Decision Trees • Rules for classifying data using attributes. • The tree consists of decision (intermediate) nodes and leaf nodes. • A decision node has two (or more) branches, each representing a value of the attribute being tested. • A leaf node produces a homogeneous result (all samples in one class), which requires no further classification testing.

  12. Decision Tree Example • [Figure: a decision tree over weather data. The root decision node tests the feature Outlook (branches sunny / rain / overcast); one branch leads directly to a Yes leaf, and the others lead to further tests on Humidity (high / normal) and Windy (false / true), whose leaves give the Yes / No result.]
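For concreteness, a hand-coded tree of this shape is shown below. The branch outcomes follow the standard "play tennis" weather example that this figure appears to reproduce; they are assumptions here, not read from the slide.

```python
# Decision nodes test a feature; leaf nodes are plain class labels.
weather_tree = {
    "feature": "Outlook",
    "branches": {
        "sunny":    {"feature": "Humidity",
                     "branches": {"high": "No", "normal": "Yes"}},
        "overcast": "Yes",
        "rain":     {"feature": "Windy",
                     "branches": {"false": "Yes", "true": "No"}},
    },
}

def classify(tree, sample):
    # Walk down the tree until a leaf (a class label) is reached.
    while isinstance(tree, dict):
        tree = tree["branches"][sample[tree["feature"]]]
    return tree

print(classify(weather_tree, {"Outlook": "sunny", "Humidity": "normal"}))  # Yes
```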

  13. Entropy • The attribute that maximizes the gain, Gain = E(current set) − E(all child sets), is chosen for the split. • Entropy measures the homogeneity of a sample: E = −Σ p_i log2(p_i). • Example: E = −(0.5 log2 0.5 + 0.5 log2 0.5) = 1.0 • Example: E = −(0.1 log2 0.1 + 0.9 log2 0.9) ≈ 0.47
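The slide's two example values can be reproduced with a few lines of Python; the sample-weighted form of E(all child sets) in information_gain below is the standard definition and is assumed here.

```python
import math

def entropy(probabilities):
    # E = -sum(p * log2(p)); terms with p == 0 contribute nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def information_gain(parent_probs, child_splits):
    # child_splits: list of (weight, probabilities) pairs, one per child set,
    # where weight is the fraction of samples falling into that child.
    children = sum(w * entropy(p) for w, p in child_splits)
    return entropy(parent_probs) - children

print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.1, 0.9]))  # ~0.469
```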

  14. Spatio-Temporal Relationship Decision Tree • A binary decision tree • Each node describes a condition on a particular sub-event (e.g. its duration being greater than a certain threshold)

  15. Spatio-Temporal Relationship Decision Tree • The system recognizes the sub-events using four types of features extracted from local 3-D XYT trajectories. • The time intervals of all occurring sub-events are recognized and provided to the system for further analysis. • Each node describes either a condition on a particular sub-event or a relationship between two sub-events (see the sketch below).
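A hedged sketch of how the two node types (a condition on one sub-event, a temporal relationship between two sub-events) might be evaluated over the recognized time intervals. The field names, the duration condition, and the "before" relationship test are illustrative, not the paper's formulation.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    name: str     # sub-event label, e.g. "vehicle_stopping" (hypothetical)
    start: float  # start time (frame index or seconds)
    end: float    # end time

def duration_condition(intervals, name, min_duration):
    # True if the named sub-event occurs and lasts at least min_duration,
    # i.e. the kind of single sub-event condition tested at a tree node.
    return any(iv.name == name and iv.end - iv.start >= min_duration
               for iv in intervals)

def before_relationship(intervals, name_a, name_b):
    # True if some occurrence of sub-event A ends before sub-event B starts:
    # one possible pairwise temporal relationship a node could test.
    return any(a.name == name_a and b.name == name_b and a.end <= b.start
               for a in intervals for b in intervals)
```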

  16. Experiments • Dataset of driving events • The dataset is segmented into 52 scenes, each containing 0 to 3 events.
