Camera Roll-out and Optimization
Rick Kjeldsen
Camera Deployment Overview
• Select Cameras & Define Use Case
• Select Profile
  • On basis of use case
• Configure in SVS
  • Create Camera / View
  • Calibrate: RoU, Object Type, Color, Camera Mvt Detect
• Alert Deployment and Optimization
• Evaluation and Deployment
  • Event Analysis
  • Video Pre-processing
  • RoU tweaks
(Responsibility for the steps is split between Customer and IBM.)
Alert Deployment and Optimization
• Select Cameras
• Initial Configuration
  • Classify view
  • Select Initial Alert Configuration
  • Define Alert
• Test / Tune
  • Collect perf data
  • Evaluate
  • Remediate
• Staged Event Testing
• Deploy or Reject
• Maintenance
  • Monitor
  • Adjudicate Alerts
  • Additional tuning
(As above, the steps are divided between Customer and IBM.)
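The phase sequence above can be read as a simple state flow. The sketch below is only an illustrative Python rendering of that flow; the phase names come from this slide, but the transition map and the Customer/IBM notes in the comments are assumptions, not an official process definition.

```python
from enum import Enum, auto

class Phase(Enum):
    """Deployment phases as named on this slide."""
    SELECT_CAMERAS = auto()
    INITIAL_CONFIGURATION = auto()
    TEST_TUNE = auto()
    STAGED_EVENT_TESTING = auto()
    DEPLOY = auto()
    REJECT = auto()
    MAINTENANCE = auto()

# Assumed transitions: each phase maps to the phases that may follow it.
NEXT_PHASES = {
    Phase.SELECT_CAMERAS:        [Phase.INITIAL_CONFIGURATION],   # customer selects cameras
    Phase.INITIAL_CONFIGURATION: [Phase.TEST_TUNE],               # classify view, define alert
    Phase.TEST_TUNE:             [Phase.STAGED_EVENT_TESTING,
                                  Phase.DEPLOY, Phase.REJECT],    # iterate, then deploy or reject
    Phase.STAGED_EVENT_TESTING:  [Phase.TEST_TUNE, Phase.DEPLOY], # customer stages, IBM evaluates
    Phase.DEPLOY:                [Phase.MAINTENANCE],             # monitor, adjudicate alerts
    Phase.MAINTENANCE:           [Phase.TEST_TUNE],               # additional tuning as needed
    Phase.REJECT:                [],
}
```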
Initial Camera Configuration - details
Initial Alert Configuration
• Select alert type on the basis of view characteristics
  • e.g. Motion Detection vs. Directional Motion vs. Region
• Select initial alert parameters (see the sketch below)
  • RoI
  • Timeouts
  • Size
  • etc.
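For illustration only, the initial parameters listed above could be captured in a structure like the one below. The field names and values are hypothetical placeholders, not SVS configuration syntax.

```python
# Hypothetical initial alert configuration; the keys simply mirror the
# parameters listed on this slide and are not SVS field names.
initial_alert_config = {
    "alert_type": "directional_motion",   # chosen from view characteristics (assumed value)
    "roi": [(120, 80), (620, 80), (620, 460), (120, 460)],  # region-of-interest polygon, pixels (assumed)
    "timeout_s": 30,                      # suppress repeat alerts for this many seconds (assumed)
    "min_object_size_px": 40 * 40,        # ignore objects smaller than this area (assumed)
    "max_object_size_px": 300 * 300,      # ignore objects larger than this area (assumed)
}
```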
Test/Tune loop - details
Iterate as needed (weekly):
• Alert Analysis
  • Estimate # FPs
  • Estimate # wild TPs
  • Determine trends/patterns
• Evaluation of FPR & HR (see the sketch after this list)
  • Deploy when FP < Threshold
  • Keep tuning while options remain
  • Recommend Reject when FPs stay high, remediation is not effective, or no options remain; review w/ customer
• FP Tuning
  • Address primary cause of FPs
  • If HR impact, review w/ customer
• HR Tuning (as wild events become available)
  • Address primary cause of low HR
  • Detailed analysis of results
  • Select / Apply Remediation
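A minimal sketch of the per-iteration evaluation step, assuming the week's alerts have already been adjudicated into true and false positives and that missed wild events have been estimated as described above. The function name and both thresholds are placeholders, not values from this deck.

```python
def evaluate_alert(true_positives: int, false_positives: int, missed_events: int,
                   days_observed: float, max_fp_per_day: float, min_hit_rate: float) -> str:
    """Turn one iteration's adjudicated counts into a rough recommendation."""
    fp_per_day = false_positives / days_observed
    total_events = true_positives + missed_events
    hit_rate = true_positives / total_events if total_events else float("nan")

    if fp_per_day <= max_fp_per_day and (total_events == 0 or hit_rate >= min_hit_rate):
        return "deploy"
    return "continue FP/HR tuning, or recommend reject if no options remain"

# Example: 4 false positives over 7 days, 9 of 10 wild events detected.
print(evaluate_alert(true_positives=9, false_positives=4, missed_events=1,
                     days_observed=7.0, max_fp_per_day=1.0, min_hit_rate=0.65))
```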
Staged Event testing
• Needed when wild events are rare
• Customer stages events
  • As many as reasonable
  • Revisit cameras when possible
• IBM evaluates results and tunes as needed
• Options (depend on the customer's needs)
  • All cameras or a sample?
  • Only after FP tuning?
  • Cause to reject a camera?
• Absolute performance numbers are unreliable without a large number of events! (See the sketch below.)
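The warning about small event counts can be quantified. The sketch below computes a 95% Wilson score interval for a hit rate estimated from a handful of staged events; it assumes the staged events behave like independent trials, which is itself an approximation.

```python
import math

def wilson_interval(hits: int, trials: int, z: float = 1.96):
    """95% Wilson score interval for a hit rate estimated from `trials` events."""
    if trials == 0:
        return (0.0, 1.0)
    p = hits / trials
    denom = 1 + z * z / trials
    center = (p + z * z / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z * z / (4 * trials * trials))
    return (max(0.0, center - half), min(1.0, center + half))

# 4 detections in 5 staged events looks like an 80% hit rate, but the 95%
# interval is roughly 38% to 96%: far too wide to judge a camera on.
print(wilson_interval(4, 5))
# With 40 detections in 50 events, the same 80% estimate narrows to roughly 67% to 89%.
print(wilson_interval(40, 50))
```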
Typical Alert tuning success rate
• 70% deployed with minimal reduction in Hit Rate
  • 35% with minimal tuning (1-2 iterations)
  • 15% with moderate tuning (2-3 iterations)
  • 15% where tuning has a minor HR impact
• 15% could be deployed with a more severe HR impact
• 15% will be rejected
Hit Rate acceptance testing
• Odds of failing an acceptance test (a worked calculation follows below)
  • Assumptions:
    • Actual alert detection rate: 75%
    • Minimum acceptable performance: 65%
• Message:
  • An alert that performs better than required still has a good chance of failing an acceptance test with a small number of trials.
  • If your customer insists on testing every camera, you must insist on a fair test: lots of trials.
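The message above can be made concrete with a short binomial calculation. The model assumes independent trials, a constant 75% detection probability, and a pass criterion of detecting at least 65% of the staged events; the trial counts below are illustrative.

```python
import math

def prob_fail(n_trials: int, true_rate: float = 0.75, required_rate: float = 0.65) -> float:
    """Probability of failing the acceptance test under a binomial model:
    fewer than ceil(required_rate * n_trials) of the n_trials events detected."""
    required_hits = math.ceil(required_rate * n_trials)
    return sum(math.comb(n_trials, k) * true_rate**k * (1 - true_rate)**(n_trials - k)
               for k in range(required_hits))

# With only 5 trials, a 75%-accurate alert fails a 65% acceptance bar about
# 37% of the time under this model; the risk shrinks as the trial count grows.
for n in (5, 10, 20, 50):
    print(f"{n:3d} trials: P(fail) = {prob_fail(n):.2f}")
```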
Misc
• Camera management tools
  • Currently spreadsheet-based
  • Built-in support in a future release