
Rule-based Action Recognition using Object Trajectories



  1. Rule-based Action Recognition using Object Trajectories UCF VIRAT Efforts

  2. Recognition Process • Input: • Time-ordered series of 2-dimensional locations of the object centroid in image coordinates (a trajectory) • Output: • Action(s) pertaining to the given trajectory • Method: • Compute multiple discriminative features for each trajectory • Classify action(s) using a comprehensive set of rules
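A minimal sketch of this input/output mapping, assuming Python/NumPy and invented speed thresholds (the slides give neither code nor concrete values; the real method uses the full feature and rule set described on the following slides):

```python
import numpy as np

def recognize(trajectory, fps=30.0):
    """Toy end-to-end example: label a trajectory from average centroid speed alone.
    Thresholds (in pixels/second) are invented for illustration only."""
    xy = np.asarray(trajectory, dtype=float)                 # shape (T, 2), time-ordered
    speeds = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fps
    v_avg = speeds.mean() if len(speeds) else 0.0
    if v_avg < 5.0:
        return "standing"
    return "walking" if v_avg < 60.0 else "running"
```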

  3. Trajectory Features • Dynamics based • For a curve, r(t) = {(x_0, y_0), …, (x_t, y_t)} • Instantaneous speed, v = ||dr/dt|| • Average speed, v_avg = (∑ v_i) / n • Acceleration, a = ||dv/dt|| • Arc length, s = ∫ v dt
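As a rough illustration of these formulas (Python/NumPy, with finite differences standing in for the derivatives; not the authors' code):

```python
import numpy as np

def dynamics_features(xy, dt=1.0):
    """Dynamics-based features for r(t) = {(x_0, y_0), ..., (x_t, y_t)}.
    xy has shape (T, 2); dt is the (assumed uniform) time step between samples."""
    dr = np.diff(xy, axis=0) / dt          # velocity dr/dt (finite difference)
    v = np.linalg.norm(dr, axis=1)         # instantaneous speed ||dr/dt||
    v_avg = v.mean()                       # average speed (sum of v_i) / n
    a = np.abs(np.diff(v)) / dt            # acceleration ||dv/dt||
    s = np.sum(v) * dt                     # arc length, integral of v dt
    return {"speed": v, "avg_speed": v_avg, "acceleration": a, "arc_length": s}
```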

  4. Trajectory Features • Shape based • Capture geometrical information • Tangent vector, v = dr/dt • Unit tangent vector, T = v / ||v|| • Curvature, k(t) = ||dT/ds|| = ||T'|| / ||v|| • Four-point cross ratio, Cr(p1, p2, p3, p4) = (p3 − p1)(p4 − p2) / ((p4 − p1)(p3 − p2))
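A discrete sketch of the shape-based features (Python/NumPy again; evaluating the cross ratio by treating 2-D points as complex numbers is an assumption on my part, not something stated on the slide):

```python
import numpy as np

def shape_features(xy, dt=1.0):
    """Unit tangent and curvature along a trajectory, via finite differences."""
    v = np.diff(xy, axis=0) / dt                         # tangent vector dr/dt
    speed = np.linalg.norm(v, axis=1, keepdims=True)
    T = v / np.maximum(speed, 1e-9)                      # unit tangent T = v / ||v||
    dT = np.diff(T, axis=0) / dt                         # T'
    k = np.linalg.norm(dT, axis=1) / np.maximum(speed[1:, 0], 1e-9)  # ||T'|| / ||v||
    return {"unit_tangent": T, "curvature": k}

def cross_ratio(p1, p2, p3, p4):
    """Four-point cross ratio Cr = (p3-p1)(p4-p2) / ((p4-p1)(p3-p2))."""
    z1, z2, z3, z4 = (complex(p[0], p[1]) for p in (p1, p2, p3, p4))
    return (z3 - z1) * (z4 - z2) / ((z4 - z1) * (z3 - z2))
```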

  5. Rules • Rules are based on quantization of feature values • A decrease followed by an increase in the unit tangent vector indicates a right turn (and vice versa for a left turn) • Two consecutive increases or decreases indicate a U-turn

  6. Rules [Figure: three example trajectories with their dT/dt profiles, labelled Going Straight, Turn Right, and U-turn]
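A toy version of the turn rules on the two slides above, using the net change in heading (tangent direction) instead of quantized dT/dt; the thresholds and the left/right sign convention are assumptions for illustration:

```python
import numpy as np

def classify_turn(xy, straight_deg=15.0, uturn_deg=135.0):
    """Classify a trajectory as going straight, turning, or a U-turn from the
    net change in heading. Thresholds are invented; with image coordinates
    (y pointing down) the left/right convention below flips."""
    v = np.diff(np.asarray(xy, dtype=float), axis=0)
    heading = np.unwrap(np.arctan2(v[:, 1], v[:, 0]))    # continuous heading angle
    total_turn = np.degrees(heading[-1] - heading[0])    # net direction change
    if abs(total_turn) < straight_deg:
        return "going straight"
    if abs(total_turn) > uturn_deg:
        return "U-turn"
    return "turn left" if total_turn > 0 else "turn right"
```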

  7. Rules • Accelerate / Decelerate events are detected directly from features • Deceleration to zero speed = Stopping • Maintaining speed and direction = Maintain distance ….
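These speed-based events could be detected along the lines of the following sketch (thresholds are placeholders, not values from the slides):

```python
import numpy as np

def speed_events(speed, accel_thresh=2.0, stop_thresh=0.5):
    """Detect Accelerate / Decelerate / Stopping from a 1-D speed profile."""
    events = set()
    dv = np.diff(speed)
    if np.any(dv > accel_thresh):
        events.add("accelerate")
    if np.any(dv < -accel_thresh):
        events.add("decelerate")
    # Deceleration down to (near) zero speed is labelled Stopping.
    if "decelerate" in events and speed[-1] < stop_thresh:
        events.add("stopping")
    return sorted(events)
```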

  8. Results • Trajectories extracted from the UCF Aerial Actions dataset • Handles walking, running, turning left and right, and U-turns

  9. Results Walking Forward

  10. Results Turn Right

  11. Results U-turn

  12. Results U-turn

  13. Results Turn Right

  14. Results Turn Left

  15. Results Running Forward

  16. VIRAT Events • Person Actions: Standing, Walking, Running, Digging, Gesturing, Carry Object • Vehicle Actions: Accelerate, Decelerate, Turning, Stopping, U-turn, Maintain distance

  17. Ideas & Future Work • Object Classification • Discriminate between similar trajectories of different objects • Separate person, vehicle and facility detectors • Person-Vehicle and Person-Facility events • Multiple Trajectories per object • Track multiple points on each object • Recognize stationary trajectories using object kinematics • Gesturing, Digging • Can also help or eliminate object detector / classifier

  18. Ideas & Future Work • Geo-Registration • Distance, speed, and acceleration in image plane may not be useful • Allow use of computed distances on ground • Do not have to be absolute longitude and latitude • Accelerate, Decelerate, Maintain Distance • Invariant features • Trajectory features should be invariant to: • Changes in view • Changes in scale • Projective invariant features can eliminate need for geo-registration

  19. Ideas & Future Work • Learning based framework • Learning and clustering of discriminative features • More robust compared to rule based methods • Representative of training trajectory samples • Associated confidence for each recognition • Recognition of unseen and composite events • Quality of Input Tracks • Availability of object trajectories is assumed • Features are highly dependant on tracking accuracy • Broken, merged and split tracks severely affect performance

  20. VIRAT Events • With provision of object recognition / classification, multiple trajectories and geo-registration: • Person Actions: Standing, Walking, Running, Digging, Gesturing, Carry Object • Vehicle Actions: Accelerate, Decelerate, Turning, Stopping, U-turn, Maintain distance • Multi-agent Actions: Loading, Unload, Open trunk, Close trunk, Getting into car, Getting out of car, Enter/exit building

  21. Thank You!
