
An Integrating Framework for Interdisciplinary Military Analyses


Presentation Transcript


  1. Approved for public release; distribution unlimited
  An Integrating Framework for Interdisciplinary Military Analyses
  80th MORS Symposium, US Air Force Academy, 11-14 June 2012
  James N. Walbert, Ph.D., SURVICE Engineering Company, jim.walbert@survice.com, 703-221-7370
  Paul H. Deitz, Ph.D., US AMSAA, APG, MD 21005-5071, paul.h.deitz.civ@mail.mil, 410-278-2786, DSN: 298-2786
  LTC(R) Britt E. Bray, Dynamics Research Corporation, BBray@drc.com, 785-550-5573
  Wednesday 10:30 AM, Room 3H2, Group CG D – Resources/Readiness/Training
  Version Date: 7 June 2012. Abstract Number CG-2790

  2. Problem Statement
  • In the main, acquisition programs are pursued without a detailed explanation, in a standard language, of the value added in operational context relative to higher- and lower-level missions.
  • Effectiveness analyses (e.g., requirements, wargames, test, evaluation activities) are therefore not documented in a way that clearly relates system requirements to operational necessity using approved doctrinal terms.
  • Absent formal mission descriptions:
    • Materiel and soldier performance metrics are evaluated with incomplete knowledge of risk vs. reward trade-offs.
    • Acquisition activities proceed without standard, shareable performance and effectiveness metrics.
    • Specific analytic and test activities are prosecuted in isolation, without the ability to integrate them holistically.
    • System-of-systems analyses proceed in the absence of the requisite operational "team" context, obtainable only from formal operational specification.

  3. Towards a Solution
  • This requires a Defense-wide framework, language, and processes common to and shared by all participants.
  • Establish the pieces and how they fit together.
  • Resolve semantics and syntax issues.
  • Since it's about mission success, better start with the mission.
  • Objective elements [facts!] are inherently quantifiable.
  • Subjective elements [expert opinion!] must nevertheless be framed quantitatively.
  Everyone is entitled to his own opinions, but not his own facts!

  4. The Blind Men & the Elephant: the "bottom-up" conundrum
  [Diagram: a single elephant in X-Y-Z space, perceived piecewise as a spear, a snake, a fan, a rope, a tree, and a wall.]
  Single Object: Multiple Perceived Projections

  5. Today's World: Multiple Defense Analytics
  Metrics are still developed in an ad hoc, "bottom-up" fashion.
  [Diagram: Defense Analytic Challenges surrounded by separate communities (Requirements, Research, Cost, Dev Test, Op Test, Effectiveness Analysis, Material Analysis, Human Dimension Analysis, Integrated OR/SA, . . .), each working independently.]
  No single reference object; ad hoc connections.

  6. Three Mappings from 3-D to 2-D Spaces
  [Diagram: a desired shape in X-Y-Z space projected onto the X-Y, X-Z, and Y-Z planes.]
  In general, mappings are only defined from higher to lower spaces!

  7. Materiel in n Space
  [Diagram: the desired system in an abstract n-dimensional space, projected onto the Mobility, Logistics, and Survivability planes.]
  Taking projections beyond 3-D geometry to abstract spaces!
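  As a rough illustration of the projection idea (not part of the original briefing), the sketch below projects a notional higher-dimensional "system state" onto lower-dimensional analytic views. The attribute names and the grouping into planes are hypothetical; the point is that each projection discards information, so the lower-dimensional views cannot, in general, be recombined to recover the full system.

```python
# Illustrative sketch only: projecting a notional system described in a
# higher-dimensional attribute space onto lower-dimensional analytic views.
# Attribute names and the grouping into "planes" are hypothetical examples.
system = {
    "armor_mass": 4.2, "engine_power": 0.9, "fuel_capacity": 0.7,
    "spares_volume": 0.5, "signature": 0.3,
}

# Each "plane" keeps only a subset of the attributes (a projection).
planes = {
    "mobility":      ["engine_power", "fuel_capacity", "armor_mass"],
    "logistics":     ["fuel_capacity", "spares_volume"],
    "survivability": ["armor_mass", "signature"],
}

def project(state, keep):
    """Map the full (higher-dimensional) state onto a lower-dimensional view."""
    return {k: state[k] for k in keep}

views = {name: project(system, keep) for name, keep in planes.items()}
for name, view in views.items():
    print(name, view)

# The mapping runs only from the higher space to the lower ones: no single
# view contains enough information to reconstruct `system`, which is the
# "bottom-up" conundrum the briefing describes.
```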

  8. How are missions prosecuted?
  • How do the professionals do it?
  • For many years, warfighters have used the Military Decision-Making Process [MDMP] as the underlying structure for planning, structuring, organizing, and executing all manner of missions (whether "kinetic" or not).

  9. MDMP Structure: MOUT Mission Decomposition‡
  [Diagram: a mission decomposition with top-down and horizontal linkages across the levels of war.]
  • NATIONAL (SN 1, SN 8, SN 8.1.3, SN 8.1.5, SN 8.1.9): Conduct Strategic Deployment and Redeployment; Foster Multinational and Interagency Relations; Conduct Foreign Humanitarian Assistance and Humanitarian and Civic Assistance; Support Peace Operations; Cooperate with and Support NGOs and PVOs.
  • STRATEGIC (ST 8.1, ST 8.2.4, ST 8.2.11, ST 8.2.12): Foster Alliance and Regional Relations and Security Arrangements; Coordinate Humanitarian and Civic Assistance Programs; Cooperate With and Support Nongovernmental Organizations (NGOs) in Theater; Cooperate With and Support Private Voluntary Organizations (PVOs) in Theater.
  • OPERATIONAL: OP 1.2.3 Concentrate Forces in Theater of Operations; BTT 1.2.1 Concentrate Tactical Forces.
  • TACTICAL (U-ART 1.2.2.3.3.1, 1.2.6.1.1.2.1.1, 1.2.6.1.1.2.1.2, 1.2.6.1.1.2.1.3, 1.2.6.1.1.2.1.4, 4.5.1.1.1): enemy team fires on disabled vehicle from church tower; gunner of disabled vehicle returns fire on church tower with 25mm auto-gun; platoon leader receives order from company commander to break contact with enemy; platoon leader orders evacuation of casualties; platoon leader calls company commander, reports the incident, and informs the commander of significant damage to the church and several civilian casualties; platoon leader orders gunner to cease fire on church.
  ‡ Mission build by Dynamics Research Corporation, Nov 2000.
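  The decomposition above is essentially a tree linking national, strategic, and operational tasks down to tactical events. A minimal sketch of one way to hold such a hierarchy in code follows; the task codes are taken from the slide, but the parent/child links shown and the data structure itself are illustrative assumptions, not the actual DRC mission-build format.

```python
# Minimal sketch (assumption, not the actual DRC mission-build format):
# a task-decomposition tree linking levels of war down to tactical events.
from dataclasses import dataclass, field

@dataclass
class TaskNode:
    code: str            # e.g., a UJTL / AUTL identifier from the slide
    title: str
    level: str           # NATIONAL, STRATEGIC, OPERATIONAL, TACTICAL
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

# Parent/child links below are illustrative only.
root = TaskNode("SN 1", "Conduct Strategic Deployment and Redeployment", "NATIONAL")
st = root.add(TaskNode("ST 8.1", "Foster Alliance and Regional Relations and Security Arrangements", "STRATEGIC"))
op = st.add(TaskNode("OP 1.2.3", "Concentrate Forces in Theater of Operations", "OPERATIONAL"))
bt = op.add(TaskNode("BTT 1.2.1", "Concentrate Tactical Forces", "TACTICAL"))
bt.add(TaskNode("U-ART 1.2.2.3.3.1", "Receive enemy fire", "TACTICAL"))
bt.add(TaskNode("U-ART 4.5.1.1.1", "Order to break contact", "TACTICAL"))

def walk(node, depth=0):
    """Print the decomposition from the top level down."""
    print("  " * depth + f"{node.level}: {node.code} - {node.title}")
    for c in node.children:
        walk(c, depth + 1)

walk(root)
```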

  10. The Military Decision-Making Process [MDMP]
  • The MDMP is all about mission planning and task execution, monitoring results, and assessment of progress against mission objectives. Tasks are ubiquitous!
  • When informed by key reference missions, the MDMP should serve as the single integrating framework for the community.
  • Materiel Requirements should derive from successful task execution, under appropriate conditions and standards.
  Tasks → Capabilities → Materiel/People (must be worked iteratively!)

  11. The MDMP & MMF
  • Since the Live Fire (LF) programs of the 1980s, Army V/L modelers have searched for supporting frameworks/data structures.
  • An early structure, the "V/L Taxonomy", was developed in 1985.
  • The "Missions & Means Framework" [MMF] followed in 2002.
  • The MMF is an attempt to formalize the MDMP!
  • Some of the MMF structure and symbolism will be used in what follows.

  12. So how are Tasks executed? [1/2]
  [Diagram: the four MMF levels (1. Interactions, Effects; 2. Personnel, Units / Components, Systems; 3. Functions, Services; 4. Tasks, Operations) connected by the operators O1,2, O2,3, O3,4, and O4,1.]
  Tasks initiate interactions (O4,1), which change components (O1,2).

  13. Intraplatform Component Linkage
  [Diagram, Level 2 (Personnel, Units / Components, Systems): Components 1 through n linked within a single platform.]
  • Platform assessment is based on the linked components.
  • Linkages can be: mechanical, electrical, hydraulic, radiative, conductive.
  Individual Task Performance!
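  One way to read this slide: platform-level task performance is assessed from the states of linked components, not the other way around. The sketch below is purely illustrative (hypothetical component names and criticality logic) and shows a platform-level function being derived from individual component states.

```python
# Illustrative sketch (hypothetical components and logic): assessing a
# platform-level function from the states of its linked components.
component_state = {          # 1.0 = fully functional, 0.0 = destroyed
    "engine": 1.0,
    "transmission": 0.6,     # degraded, e.g., by a hydraulic-line hit
    "left_track": 1.0,
    "right_track": 0.0,      # severed
    "driver": 1.0,
}

# Mobility here requires engine AND transmission AND both tracks AND driver;
# the weakest linked component bounds the capability (a simple fault-tree view).
def mobility_capability(states):
    required = ["engine", "transmission", "left_track", "right_track", "driver"]
    return min(states[c] for c in required)

print("Mobility capability:", mobility_capability(component_state))  # -> 0.0
```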

  14. Interplatform Linkage: Key SoS Construct
  [Diagram, Level 2 (Personnel, Units / Components, Systems): Platforms 1 through n linked into a system of systems.]
  • SoS assessment is based on the linked platforms.
  • Linkages can be: time based, event based, effects based, mechanical, electrical, hydraulic, radiative, conductive.
  Collective Task Performance!

  15. So how are Tasks executed? [2/2]
  [Diagram: the Task Cycle around the four MMF levels and operators.]
  Tasks initiate interactions (O4,1), which change components [with structure] (O1,2), which change capabilities (O2,3), which execute tasks (O3,4): the Task Cycle.
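  A minimal sketch (assumed names, not MMF software) of the task cycle these two slides describe: a Level-4 task initiates a Level-1 interaction (O4,1), the interaction changes Level-2 component states (O1,2), component states map to Level-3 capabilities (O2,3), and capabilities determine how the task executes (O3,4).

```python
# Minimal sketch of one MMF task cycle (function names and effects are assumptions).
def O_4_1(task):
    """Level 4 -> Level 1: the task initiates an interaction."""
    return {"interaction": "ballistic_hit", "initiated_by": task}

def O_1_2(interaction, components):
    """Level 1 -> Level 2: the interaction changes component states."""
    damaged = dict(components)
    damaged["right_track"] = 0.0            # notional effect of the hit
    return damaged

def O_2_3(components):
    """Level 2 -> Level 3: component states map to capabilities."""
    return {"mobility": min(components.values())}

def O_3_4(capabilities, task):
    """Level 3 -> Level 4: capabilities determine task execution."""
    return {"task": task, "executable": capabilities["mobility"] > 0.0}

components = {"engine": 1.0, "left_track": 1.0, "right_track": 1.0}
interaction = O_4_1("break_contact")
components = O_1_2(interaction, components)
capabilities = O_2_3(components)
print(O_3_4(capabilities, "break_contact"))   # one pass around the task cycle
```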

  16. Supporting Contexts‡ [1/4]
  [Diagram: the four MMF levels and the operators O1,2, O2,3, O3,4, O4,1.]
  These Principal Elements are necessary, but not sufficient, to define a full representation of the MDMP.
  ‡ The OPFOR is not shown!

  17. Supporting Contexts [2/4]
  [Diagram: the four levels and operators, now indexed by Level 5.]
  Level 5: Index (Location & Time).

  18. Supporting Contexts [3/4]
  [Diagram: the four levels and operators, with Levels 5 and 7 added.]
  Level 7: Why = OWNFOR Purpose, Mission.

  19. Supporting Contexts [4/4]‡
  [Diagram: the four levels and operators, with Levels 5, 6, and 7 added.]
  Level 6: Context, Environment (Military, Civil, Physical, etc.).
  Context is critical for all mapping levels!
  ‡ The OPFOR is not shown!

  20. Interactions between Opposing Forces
  [Diagram: the full four-level/operator structure drawn twice, once for OWNFOR and once for OPFOR.]
  Self and cross interactions: interactions occur within each force and between the opposing forces.

  21. Important Takeaway
  [Diagram: the four MMF levels and operators.]
  Note: Tasks initiate interactions which directly change the physical/biological/psychological state of components! Capabilities (Functions) & Tasks (i.e., Utilities) derive from component physical & functional state changes.
  Aggregation/integration of multiple interactions/effects must take place at Level 2, NOT at Levels 3 or 4!
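  A toy numerical illustration (hypothetical components and values, not from the briefing) of this takeaway: accumulating two hits on the same component set and then mapping up to a capability is not the same as scoring each hit separately at Level 3/4 and combining the results afterwards.

```python
# Toy illustration (hypothetical values) of why aggregation belongs at Level 2.
def mobility(components):
    """Level 2 -> Level 3 mapping: either engine can drive the platform."""
    return max(components["engine_1"], components["engine_2"])

pristine = {"engine_1": 1.0, "engine_2": 1.0}
hit_a = {"engine_1": 0.0}   # hit A kills engine 1
hit_b = {"engine_2": 0.0}   # hit B kills engine 2

# Correct: accumulate both hits on the SAME component set, then map up.
print("Aggregated at Level 2:", mobility({**pristine, **hit_a, **hit_b}))   # 0.0

# Incorrect: score each hit against a pristine platform at Level 3/4,
# then combine the capability losses with a survivor-sum style rule.
loss_a = 1.0 - mobility({**pristine, **hit_a})      # 0.0 (engine 2 still runs)
loss_b = 1.0 - mobility({**pristine, **hit_b})      # 0.0 (engine 1 still runs)
combined_loss = 1.0 - (1.0 - loss_a) * (1.0 - loss_b)
print("Combined at Level 3/4:", 1.0 - combined_loss)   # 1.0, i.e., full mobility (wrong)
```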

  22. Typical Lumped-Task Simulation [1/2]
  [Diagram: from test initial conditions, a task runs through the entire Task Cycle (O4,1, O1,2, O2,3, O3,4) and yields only a single estimate or test result, with no intermediate results.]
  • In vulnerability/lethality analyses, this metric is often "defined" as a probability!
  • As such, it is an ill-defined utility, atypically task related!

  23. Typical Lumped-Task Simulation [2/2]
  • Lumped metrics are problematic wrt both interpretation and integration with other parameters!
  • Without context and intermediate results, the contribution of each of the three components (physical state change, capability change, change in mission challenge) cannot be apportioned to create data extensibility.
  • The inability to define the "PK" metrics objectively/quantitatively, as well as the lack of objective intermediate damage and performance metrics, contributed greatly to the Live Fire Program issues in the 1980s.

  24. A "Lego" Collection of Mission/Performance Elements
  Ability to mix & match Levels & Operators.
  • Level 4 (Tasks, Operations): ≈ 2200 Universal Joint Tasks, ≈ 350 Condition Descriptors, ≈ 4 Standards per Task; ≈ 680 Army Universal Tasks, ≈ 350 Condition Descriptors, ≈ 4 Standards per Task.
  • Level 3 (Functions, Services): capabilities described in task-compatible metrics; Degraded States Mapping Methodology well established.
  • Level 2 (Personnel, Units / Components, Systems): unlimited geometric & material configurations/structures.
  • Level 1 (Interactions, Effects): Ballistic Effects, Jamming, Damage Repair, Chemical, Resupply, Repair, Laser Damage, Sleep, Directed Energy, Nuclear, Physics of Failure, Logistics Burdens, Reliability, Fair Wear & Tear, Fatigue, Heat Stress, . . .
  [Diagram: many copies of the four-level stack connected by the operators O1,2, O2,3, O3,4, O4,1.]
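  A minimal data-structure sketch (field names, the task code, and the example values are all assumed for illustration) of one "lego" element: a task carrying its condition descriptors and standards, in the spirit of the UJTL/AUTL task-condition-standard triplets listed above.

```python
# Sketch (assumed field names and example values): a task "lego" element.
from dataclasses import dataclass

@dataclass
class TaskElement:
    code: str          # UJTL or AUTL task identifier (value below is hypothetical)
    title: str         # task title (hypothetical)
    conditions: list   # condition descriptors under which the task is performed
    standards: dict    # measures and the criteria that define successful execution

breach = TaskElement(
    code="ART 1.2.1",                         # hypothetical identifier
    title="Conduct breach of an obstacle",    # hypothetical title
    conditions=["night", "MOPP level 2", "urban terrain"],
    standards={"time to breach": "<= 10 min", "friendly casualties": "none"},
)
print(breach)
```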

  25. Sequence of Task Cycles Forms a TOEL (Time-Ordered Event List)
  [Diagram: Task #1, Task #2, Task #3, . . . Task #n arrayed along a time line, with the communities of interest (Material Analysis, Integrated OR/SA, Op Test, Requirements, Human Dimension, Research, Effectiveness Analysis, Dev Test) attached to it; the diagram also asks whether the communities' Venn data sets are separate or overlapping.]
  • Missions are composed of task sequences.
  • Following task initiation, an event cycle occurs.
  • As a result, material, capability, and utility changes may follow.
  • When the "lego" elements are developed at this level of resolution, they can be combined endlessly with great extensibility.
  • All communities of interest can focus on the specific elements with clarity, define sharing or exclusivity with others, resolve precedence, dependencies, . . .
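  A sketch (assumed structure, reading TOEL as a time-ordered event list) of how a mission might be held as a sequence of task cycles, each entry recording when the cycle fires and the state and capability changes it produces; the event names and values are hypothetical.

```python
# Sketch (assumed structure): a mission as a time-ordered list of task-cycle events.
from dataclasses import dataclass, field

@dataclass
class TaskCycleEvent:
    t: float                      # mission time at which the task initiates
    task: str                     # task identifier or short description
    state_changes: dict = field(default_factory=dict)       # Level-2 deltas
    capability_changes: dict = field(default_factory=dict)  # Level-3 deltas

toel = [
    TaskCycleEvent(t=0.0,  task="move to contact"),
    TaskCycleEvent(t=12.5, task="return fire",
                   state_changes={"25mm ammo": -40}),
    TaskCycleEvent(t=14.0, task="break contact",
                   state_changes={"right_track": 0.0},
                   capability_changes={"mobility": 0.0}),
]

# Communities of interest (test, V/L, human dimension, ...) can each attach to
# the specific events and levels they care about, rather than to a lumped result.
for ev in sorted(toel, key=lambda e: e.t):
    print(f"t={ev.t:5.1f}  {ev.task:16s}  {ev.state_changes}  {ev.capability_changes}")
```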

  26. Analysis, Evaluation & DT Issues
  • Individual lego elements combine into task cycles, define model elements, and focus Developmental Testing (model assessment, test planning).
  • For a particular system under study, identify which Levels and Operators are insufficiently understood.
  [Diagram: a single DT cycle around the operators O1,2, O2,3, O3,4, O4,1.]

  27. Analysis, Evaluation & OT Issues
  • Sequences of task cycles define and focus Operational Testing.
  • Note: a sequence of interactions/effects (Level 1) accumulated by the same person/unit/component/system (Level 2) IS NOT the same as a sequence of interactions, each applied to a pristine person/unit/component/system and followed by post-processing!
  • Parallel chains of task cycles connected by common purpose define and focus systems-of-systems OT via Collective Tasks (Platform 1, Platform 2, . . . Platform p forming an SoS).

  28. The Survivor Sum Rule
  For fifty years, vulnerability analysts and modelers have been taking Level 4, so-called "probabilities", and combining them using the Survivor Sum Rule,‡ e.g.:
  • Ballistic vulnerability example: total PK for an n-shot ballistic volley:
    PK_Total = 1 – { [1 – PK1] x [1 – PK2] x . . . x [1 – PKn] }
  • Survivability example: total PS for n survivability-related events (e.g., encounter, engagement, hit, damage, kill):
    PS_Total = 1 – { [1 – PE1] x [1 – PE2] x . . . x [1 – PEn] }
  ‡ Caveat Emptor: The Survivor Sum Rule applies only when the metrics are both true probabilities and independent! Here, neither condition holds! Sorry and good luck!
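  A small numerical sketch of the rule and its caveat. The survivor-sum computation below follows the formula on the slide; the "perfectly correlated shots" comparison is an added illustrative assumption, included only to show why the independence condition matters.

```python
# Survivor Sum Rule as written on the slide, plus a check of the independence caveat.
from math import prod

def survivor_sum(p_events):
    """PK_Total = 1 - product(1 - PK_i); valid only for independent true probabilities."""
    return 1.0 - prod(1.0 - p for p in p_events)

# Three-shot volley, each shot with PK = 0.4, assumed independent:
print("survivor-sum volley PK:", survivor_sum([0.4, 0.4, 0.4]))   # 0.784

# Caveat illustration: if the shot outcomes are perfectly correlated (e.g., all
# three shots either defeat the same vulnerable component or all fail together),
# the true volley PK is just 0.4, not 0.784; the rule overstates it whenever the
# inputs are not independent, true probabilities.
print("perfectly correlated volley PK:", 0.4)
```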

  29. Summary [1/3]
  [Diagram: the MOUT mission decomposition from slide 9 (NATIONAL, STRATEGIC, OPERATIONAL, TACTICAL task codes and vignette events), alongside the National, Strategic, Operational, and Tactical Mission Abstractions, the task sequence Task #1, Task #2, Task #3, . . . Task #n, the platform history, and the Venn data-set question.]
  • Full operational context is established identically for all levels of war and made manifest for:
    • all materiel/people players, and
    • all supporting disciplines.

  30. Summary [2/3]
  • Standard semantics and syntax are established across all levels.
  • The lego element abstractions are identically applicable to analysis and test, establishing the key symmetry requisite for validation.
  • ≈ 2200 Universal Joint Tasks, ≈ 350 Condition Descriptors, ≈ 4 Standards per Task; ≈ 680 Army Universal Tasks, ≈ ?? Condition Descriptors, ≈ 4 Standards per Task.
  [Diagram: the National, Strategic, Operational, and Tactical Mission Abstractions, the task sequence Task #1 through Task #n, and the platform history.]

  31. Summary [3/3]: An Integration Strategy
  • Link Task Definition to Task Execution.
  • Share Task Execution methods, measures & data structures across the Community.
  [Diagram: the Tactical Mission Abstraction and the task sequence Task #1 through Task #n mapped onto the levels: 4. Tasks (capabilities matched to tasks); 3. Capabilities (described in task-compatible metrics); 2. Components + Context (geometry/materiel specifications); 1. Interactions (material/people interactions: Ballistic Effects, Jamming, Damage Repair, Chemical, Resupply, Repair, Laser Damage, Sleep, Directed Energy, Nuclear, Physics of Failure, Logistics Burdens, Reliability, Fair Wear & Tear, Fatigue, Heat Stress, . . .).]

  32. [Closing slide: repeats the title slide and contact information from slide 1.]
