VACE Executive Brief for MLMI • Dennis Moellman, VACE Program Manager
Briefing Outline • Introduction • Phase II • Evaluation • Technology Transfer • Phase III • Conclusion
Introduction What is ARDA/DTO/VACE? • ARDA – Advanced Research and Development Activity • A high-risk/high-payoff R&D effort sponsored by the US DoD/IC • ARDA is taking on a new identity • In FY2007, under the DNI • Reports to: ADNI(S&T) • Renamed: Disruptive Technology Office (DTO) • VACE – Video Analysis and Content Extraction • A three-phase initiative begun in 2000 and ending in 2009 • Winding down Phase II • Entering Phase III
Context Video Exploitation Barriers • Problem Statement: • Video is an ever-expanding source of imagery and open-source intelligence, and as such it commands a place in all-source analysis. • Research Problem: • Lack of robust software automation tools to assist human analysts: • Human operators are required to monitor video signals manually • Human intervention is required to annotate video for indexing purposes • Content-based routing driven by automated processing is lacking • Flexible ad hoc search and browsing tools do not exist • Video Extent: • Broadcast news; surveillance; UAV; meetings; and ground reconnaissance
Research Approach Video Exploitation • Research Objectives: • Basic technology breakthroughs • Video analysis system components • Video analysis systems • Formal evaluations: procedures, metrics, and data sets • Evaluate Success: Quantitative Testing • Accuracy: current <Human; need >>Human • Speed: current >Real time; need <<Real time • Technology Transition • Over 70 technologies identified as deliverables • 50% have been delivered to the government • Over 20 undergoing government evaluation
Management Approach Geared for Success Management Philosophy – NABC • N – Need • A – Approach • B – Benefit • C – Competition
System View (diagram): Source video flows through technology enhancement filters into the extraction, recognition, and understanding engines, producing intelligent content services and visualization tailored to language/user interests (concept, applications, and reference layers shown).
VACE Interests Technology Roadmap (spanning Phase 1, Phase 2, Phase 3, and future work) • Content Extraction: object detection & tracking; object/scene classification; object recognition; object modeling; simple event detection; event recognition; complex event detection; scene modeling; event understanding; mensuration • Intelligent Content Services: indexing; video browsing; summarization; filtering; advanced query/retrieval using Q&A technologies; content-based routing; video mining; change detection; video monitoring • Enabling Technologies: image enhancement/stabilization; camera parameter estimation; multi-modal fusion; integrity analysis; motion analysis; event ontology; event expression language; automated annotation language; evaluation
Funding Commitment to Success (pie charts): FY06 allocations – 39%, 36%, 11%, 10%, 4%; FY07 allocations – 64%, 20%, 12%, 4% (category labels not recoverable from the slide export).
Phase II Programmatics • Researcher Involvement: • Fourteen contracts • Researchers represent a cross section of industry and academia throughout the U.S. partnering to reach a common goal • Government Involvement: • Tap technical experts, analysts and COTRs from DoD/IC agencies • Each agency is represented on the VACE Advisory Committee, an advisory group to the ARDA/DTO Program Manager
Phase II Demographics: Prime Contractors (14) and Subcontractors (14) • Carnegie Mellon Univ. (2) (Robotics Inst.; Informedia) • IBM T. J. Watson Center • Univ. of Washington • Wright State Univ. • Univ. of Chicago • Univ. of Illinois at Urbana-Champaign (2) • Boeing Phantom Works • Purdue Univ. • Virage • TASC • AFIT • MIT • BBN • SRI • Salient Stills • Alphatech • Columbia Univ. • Univ. of Southern California • Sarnoff Corp. (2) • Univ. of Maryland (2) • Univ. of Southern California / Information Sciences Inst. • Georgia Inst. of Tech. • Telcordia Technologies • Univ. of Central Florida (prime vs. subcontractor roles shown on the original map are not recoverable here)
Phase II Projects
Evaluation Goals • Programmatic: • Inform ARDA/DTO management of progress/challenges • Developmental: • Speed progress via iterative self testing • Enable research and evaluation via essential data and tools – build lasting resources • Key is selecting the right tasks and metrics • Gear evaluation tasks to research suite • Collect data to support all research
Evaluation The Team: NIST; USF; Video Mining
Evaluation NIST Process • Planning: determine sponsor requirements; task definitions; protocols/metrics; data identification; assess required/existing resources; develop detailed plans with researcher input; rollout schedule • Evaluation Resources: training data; development data; evaluation data; ground truth and other metadata; scoring and truthing tools • Evaluation: dry-run shakedown; formal evaluation • Products/Results: technical workshops and reports; recommendations
Evaluation NIST Mechanics (diagram): video data is processed by the algorithms to produce system output and annotated to produce ground truth; evaluation measures compare the two to yield results.
Evaluation 2005–2006 Evaluations (table) • Legend: P = Person; F = Face; V = Vehicle; T = Text
Evaluation Quantitative Metrics • Evaluation Metrics: • Detection: SFDA (Sequence Frame Detection Accuracy) • Metric for determining the accuracy of a detection algorithm with respect to space, time, and the number of objects • Tracking: STDA (Sequence Tracking Detection Accuracy) • Metric for determining detection accuracy along with the ability of a system to assign and track the ID of an object across frames • Text Recognition: WER (Word Error Rate) and CER (Character Error Rate) • In-scene and overlay text in video • Focused Diagnostic Metrics (11)
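The recognition and detection metrics above can be illustrated with a rough sketch. This is not the official VACE/NIST scoring code: WER is the standard edit-distance formulation, and the per-frame detection score below only captures the spirit of FDA (matched-box overlap normalized by the mean object count), using greedy matching where the published metric uses an optimal assignment.

```python
from typing import List, Tuple

def word_error_rate(reference: List[str], hypothesis: List[str]) -> float:
    """WER = (substitutions + deletions + insertions) / len(reference),
    via Levenshtein edit distance over word tokens."""
    n, m = len(reference), len(hypothesis)
    # dp[i][j] = edit distance between reference[:i] and hypothesis[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i
    for j in range(m + 1):
        dp[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution/match
    return dp[n][m] / n

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

def frame_detection_accuracy(gt_boxes: List[Box], det_boxes: List[Box]) -> float:
    """Simplified single-frame detection accuracy in the spirit of FDA:
    summed IoU of matched ground-truth/detection pairs, normalized by the
    mean object count (N_G + N_D) / 2."""
    def iou(a: Box, b: Box) -> float:
        ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
        iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
        inter = ix * iy
        union = ((a[2] - a[0]) * (a[3] - a[1])
                 + (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / union if union > 0 else 0.0

    if not gt_boxes and not det_boxes:
        return 1.0  # nothing to detect, nothing detected
    if not gt_boxes or not det_boxes:
        return 0.0  # all misses or all false alarms
    # Greedy one-to-one matching by IoU (a stand-in for optimal assignment).
    remaining = list(det_boxes)
    overlap = 0.0
    for g in gt_boxes:
        scores = [iou(g, d) for d in remaining]
        best = max(scores)
        if best > 0:
            overlap += best
            remaining.pop(scores.index(best))
    return overlap / ((len(gt_boxes) + len(det_boxes)) / 2)
```

Averaging `frame_detection_accuracy` over the frames of a clip that contain objects gives a sequence-level score analogous to SFDA; STDA additionally penalizes identity switches across frames.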
Evaluation Phase II Best Results
Evaluation Face Detection: BNews (Score Distribution)
Evaluation Text Detection: BNews (SFDA Score distribution)
Evaluation Open Evaluations and Workshops – International • Benefit of open evaluations • Knowledge of others' capabilities and community feedback • Increased competition → faster progress • Benefit of evaluation workshops • Encourage peer review and information exchange, minimize "wheel reinvention", focus research on common problems, provide a venue for publication • Current VACE-related open evaluations • VACE: Core Evaluations • CLEAR: Classification of Events, Activities, and Relationships • RT: Rich Transcription • TRECVID: Text Retrieval Conference Video Track • ETISEO: Évaluation du Traitement et de l'Interprétation de Séquences Vidéo (Evaluation of Video Sequence Processing and Interpretation)
Evaluation Expanded
Evaluation Schedule
TECH TRANSFER DTO Test and Assessment Activities Purpose: move technology from the lab to operations • Technology Readiness Activity • An independent repository for test and assessment • Migrate technology out of the lab environment • Assess technology maturity • Provide recommendations to DTO and researchers
TECH TRANSFER DoD Technology Readiness Levels (TRL)
Technology Transfer Applying TRL (diagram): DoD technology risk scale from TRL 1–3 (contractor test facility) through TRL 4–5 (Info-X test facility) and TRL 6–7 (IC/DoD test facilities; unclassified and classified) to TRL 8–9 (production), with risk decreasing and DTO control giving way to DTO influence as maturity increases. Used in assessing a project's • Technology maturity • Risk level • Commercialization potential
Technology Transfer TRA Maturity Assessments
Phase III BAA Programmatics • Contracting Agency: DOI, Ft. Huachuca, AZ • DOI provides the COR • ARDA/DTO retains DoD/IC agency COTRs and adds more • Currently in the proposal review process • Spans three fiscal years and four calendar years • Remains open through 6/30/08 • Funding objective: $30M over the program life • Anticipated to grow in FY07 and beyond • Addresses the same data-source domains as Phase II • Will conduct formal evaluations • Will conduct maturity evaluations and tech transfer
Phase III BAA Programmatics • Emphasis on technology and a system approach • Move up the technology path where applicable • Stress ubiquity • Divided into two tiers: • Tier 1: one-year base with an option year • Technology focus • Open to all – US and international • More awards at lower funding levels • Tier 2: two-year base with option year(s) • Comprehensive component/system-level initiative • Prime contractor must be US • Fewer awards at higher funding levels
Phase III BAA Schedule
Summary Take-Aways • VACE is interested in: • Solving real problems with risky, radical approaches • Processing multiple data domains and multimodal data • Developing technology point solutions as well as component/system solutions • Evaluating the technology process • Transferring technology into the users' space
Conclusion Potential DTO Collaboration • Invitations: • Welcome to participate in VACE Phase III • Welcome to participate in VACE Phase III Evaluations
Contacts Dennis Moellman, Program Manager • Phones: 202-231-4453 (Dennis Moellman); 443-479-4365 (Paul Matthews); 301-688-7092 (DTO Office); 800-276-3747 (DTO Office) • FAX: 202-231-4242 (Dennis Moellman); 301-688-7410 (DTO Office) • E-Mail: dennis.moellman@dia.mil (Internet Mail); pmmatth@nsa.gov • Location: Room 12A69, NBP #1, Suite 6644, 9800 Savage Road, Fort Meade, MD 20755-6644