Collaboration to Meet Future T&E Needs
ITEA, 14 September 2010
Mr. Mike Crisp, Deputy Director, Air Warfare, Operational Test and Evaluation
Need vs. Speed vs. Cost: A Balancing Act in Acquisition
• Requirements Process stressed in Meeting Today's Needs
• DoD Budgets Under Pressure
• Warfighter Demand Cycle inside the Acquisition OODA Loop
• Concurrent Development, Production, Operations, and Support
  • Program Managers doing it all
• Configuration Control
  • How many one- or two-of-a-kind systems can we manage?
• DoD Acquisition Life Cycle Framework
  • Complex Systems Integration in Fluid Operational Environments
  • Adaptive and flexible enough to support rapid fielding?
  • JUONS, Rapid Block Upgrades, Milestones, Increments
Innovative and Adaptive Approaches Required
Challenges for Building an Effective Integrated Test Partnership
• Limited Test Resources: Range Capabilities, Size, Forces (Manpower & Materiel)
• Limited Test Time: Fielding & Production Schedules, Range Availability
• Limited Test Articles: Unit Cost, Production Time
• Requirements Validation: Do system requirements satisfy mission needs? Are they unambiguous, testable, and technically realistic?
• Test Planning: Integrated test approach or serial? Active participation in the T&E WIPT? Early identification of operational test requirements?
• Test Execution: Operationally representative test environments?
The Integrated Test Approach and Partnering Can Be Improved
DOT&E Initiatives
• Integrated Testing: Developmental, Operational, Live Fire
• Testers Engage Early: Requirements are Unambiguous, Testable, Relevant to Mission Accomplishment, and Technically Realistic
• Improve Suitability: Reliability Growth
• Field New Capabilities Rapidly: Accelerated Testing
Field New Capabilities Rapidly: Accelerated Testing and Rapid Fielding
• Number One Priority of the SECDEF
• Examples:
  • Extended Range Multi-Purpose (ERMP) UAS: Early Fielding Report based on Customer Test
  • Project ODIN: ATEC Observations Led to Improved Effectiveness
  • MRAP: DoD 5000 Process Adapted for Rapid Development
[Images: ERMP UAS, Counter-IED operations, MRAP live fire test]
Testers Engage Early
• DOT&E Action Officer as a Team Member
• Requirements Validation
  • Do system requirements satisfy mission needs?
  • Are they unambiguous, testable, and technically realistic?
• Test Planning
  • Active Participation in the T&E WIPT
  • Early Identification of Operational Test Requirements
• Test Execution
  • Operationally Representative Test Environments
Integrated Testing: Developmental, Operational, Live Fire
• Weapon Systems Acquisition Reform Act
  • Established the Director, Developmental Test and Evaluation
  • Formally Implemented Integrated Testing
• Test Process as a Continuum
  • Developmental Testing
  • Live Fire Testing
  • Operational Testing
• Optimize Testing for Program Execution
  • Design of Experiments
• Operational Test Evaluation Utilizes All Data
Improve Suitability: Reliability Growth
• After System Effectiveness, Suitability is the Key Concern
• Systems are Frequently Judged Not Suitable
• Reliability and Maintainability Account for 80% of Not Suitable Ratings (see the reliability growth sketch below)
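Because the briefing points to reliability growth as the lever for improving suitability, a minimal sketch of one common way to track demonstrated growth may help. It fits a Crow-AMSAA (power-law NHPP) model to entirely hypothetical failure times; the numbers and the choice of model are assumptions for illustration, not DOT&E's prescribed method or any program's data.

```python
# Illustrative only: Crow-AMSAA (NHPP) reliability growth fit on hypothetical
# failure times from a time-truncated growth test. Not actual program data.
import math

failure_times = [32, 91, 160, 280, 410, 600, 790, 1010, 1250, 1530]  # hours (made up)
T = 1800.0  # total test time in hours (assumed)

n = len(failure_times)
# MLEs for the power-law model E[N(t)] = lam * t**beta (time-truncated case)
beta_hat = n / sum(math.log(T / t) for t in failure_times)
lam_hat = n / T ** beta_hat

# Instantaneous MTBF at end of test: 1 / (lam * beta * T**(beta - 1))
mtbf_now = 1.0 / (lam_hat * beta_hat * T ** (beta_hat - 1))
print(f"beta = {beta_hat:.2f}  (< 1 indicates reliability growth)")
print(f"instantaneous MTBF at {T:.0f} h ~ {mtbf_now:.0f} h")
```

Tracking the estimated instantaneous MTBF against the requirement across test phases is one way to show whether reliability growth is keeping pace before operational test.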
Implementation (Continued)
• Integrated Testing: Test Process as a Continuum
  • Developmental Testing
  • Live Fire Testing
  • Operational Testing
• Use Statistical Analysis
  • Data Collected over a Long Period and Across Multiple Tests
  • Design of Experiments
  • Normalization of Data (see the pooling sketch below)
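As a loose illustration of the "normalization of data" bullet above, the sketch below standardizes hypothetical results from separate DT, integrated DT/OT, and OT events before pooling them for a combined analysis. The event labels and numbers are invented; this is one simple normalization approach, not the briefing's specific procedure.

```python
# Minimal sketch with made-up numbers: standardize data from each test event
# so results collected over a long period and multiple tests can be pooled.
import statistics

events = {
    "DT":    [12.1, 9.8, 14.3, 11.0],   # hypothetical miss distances (m)
    "DT/OT": [10.5, 13.2, 9.1, 12.7],
    "OT":    [11.9, 15.0, 10.2, 13.8],
}

pooled = []
for name, values in events.items():
    mu, sd = statistics.mean(values), statistics.stdev(values)
    # z-scoring within each event removes event-to-event offsets before pooling
    pooled.extend((v - mu) / sd for v in values)

print(f"pooled sample size: {len(pooled)}, pooled mean ~ {statistics.mean(pooled):.2f}")
```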
Common Activities
• Replicating the "real world" environment as closely as practical
• Need for a distributed live/virtual/constructive (LVC) representation of the Joint Operational Environment
• Development and use of validated Tactics, Techniques, and Procedures (TTPs)
• Development of COCOM/Joint Staff approved Joint Mission Threads
• Data collection, management, archiving, and retrieval processes
The Test and Training Communities are Missing Opportunities for Cost-Effective, Relevant Partnerships
Bottom Line on What's Needed
• Strong partnership among the COCOMs, Test, and Training communities
• Committed leadership for strategic institutional funding, co-investment, and cross-community management of joint test and training resources and infrastructure
• Need to overcome the preference for OPM (Other People's Money)
• Need to continue working toward implementation of joint test and training roadmaps
It Takes Time and Effort, but the Potential for Increased Efficiencies and Effectiveness Justifies Both
Design of Experiments (DOE)
• Test planning is a science!
• DOE is a scientific tool for developing robust test plans.
• DOE equips us to determine:
  • Power
  • Confidence
  • Breadth of coverage
• The iterative nature of DOE facilitates Integrated Test design (see the power sketch below).
[Diagram: iterative DOE cycle of Plan, Design, Test, Analyze, Manage]
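To make "power" and "confidence" concrete, here is a minimal simulation-based sketch of how a planner might check whether a candidate number of runs can detect a given effect for one notional two-level factor. The normal-response assumption, effect size, sigma, and alpha below are illustrative choices, not program values or the briefing's method.

```python
# Minimal sketch: Monte Carlo estimate of test power for one two-level factor,
# assuming normally distributed responses. All numbers below are notional.
import random
from scipy import stats

def simulated_power(runs_per_level, effect, sigma, alpha=0.20, trials=5000):
    """Estimate the probability a two-sample t-test detects a shift of `effect`."""
    detections = 0
    for _ in range(trials):
        low = [random.gauss(0.0, sigma) for _ in range(runs_per_level)]
        high = [random.gauss(effect, sigma) for _ in range(runs_per_level)]
        _, p_value = stats.ttest_ind(low, high)
        detections += p_value < alpha
    return detections / trials

# e.g., can 8 runs per level detect a one-sigma shift at 80% confidence (alpha = 0.20)?
print(f"estimated power ~ {simulated_power(8, effect=1.0, sigma=1.0):.2f}")
```

Iterating on runs per level, effect size, and alpha is exactly the power/confidence/coverage trade the slide describes.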
Design of Experiments
• Integrated Test, DOE Approach:
  • Inform each stage of testing from previous tests.
  • Span the battlespace through a compilation of tests.
  • Use screening experiments in DT to ensure a rigorous OT (see the screening-design sketch below).
• DOE Provides:
  • The most powerful allocation of test resources for a given number of tests.
  • A scientific, structured, objective way to plan tests.
  • An approach to integrated test.
  • A structured, mathematical analysis for summarizing test results.
[Diagram: Developmental Test, DT/OT Testing, Operational Test continuum]
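As an illustration of screening with DOE (not any program's actual test matrix), the sketch below generates a 2^(4-1) half-fraction for four hypothetical two-level factors using the defining relation D = ABC, so main effects can be screened in eight runs instead of the sixteen a full factorial would require.

```python
# Illustrative only: build a 2^(4-1) fractional factorial (resolution IV)
# screening design for four hypothetical two-level factors.
from itertools import product

factors = ("Range", "Mach", "Altitude", "TargetType")  # notional factor names

runs = []
for a, b, c in product((-1, +1), repeat=3):
    d = a * b * c                      # generator: D = ABC
    runs.append(dict(zip(factors, (a, b, c, d))))

for i, run in enumerate(runs, start=1):
    print(i, run)
```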
Design of Experiments: JASSM-ER
• Binomial Distribution was the Primary Method Used to Establish the JASSM-ER Integrated Test Design
  • Resulted in a 21-Shot Design
  • Assumes ≤ 2 Failures to Demonstrate 0.80 Reliability with 80% Confidence (see the worked check below)
• 24 Factors Considered; Design Refined by Removing Factors Previously Evaluated in the Baseline JASSM OT
  • Primary Factors: Range, Release Mach, Release Altitude, and Target Type
  • Ensures Key Performance Differences between the Baseline JASSM and JASSM-ER are Evaluated
• 21-Event Test Design Created Using a ¾-Fractional Design to Estimate Main Effects and Interactions with Range
• Uses 5 Combined DT/OT Shots
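A quick worked check of the binomial argument on this slide: passing means no more than 2 failures in 21 shots, and the demonstrated confidence for 0.80 reliability is one minus the probability of still passing if the true reliability were only 0.80.

```python
# Worked check of the 21-shot, <= 2-failure criterion quoted above.
from scipy.stats import binom

n_shots, max_failures, reliability = 21, 2, 0.80

# Chance of still passing if true reliability were only 0.80 (consumer's risk)
pass_prob = binom.cdf(max_failures, n_shots, 1 - reliability)
print(f"demonstrated confidence = {1 - pass_prob:.2f}")  # ~0.82, i.e. at least 80%
```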
Conclusion
• Our Goals
  • Greater Rigor and Effectiveness in Testing
  • Introduce New Statistical Developments
• Results
  • Better Confidence, Power, and Breadth in Tests
• Our Products
  • More Effective and Suitable Systems
• Rigor is Key to Constructing and Executing Efficient Tests
• Only Efficient Testing Can Be Defended in the Current and Future Budget Environment