
Achieving a Decision Paradigm for Distributed Warfare Resource Management



  1. Achieving a Decision Paradigm for Distributed Warfare Resource Management Bonnie Young Naval Postgraduate School Professor, Systems Engineering bwyoung@nps.edu 703-407-4531

  2. A Complex Decision Space

  3. What makes the Decision Space Complex? • Time-criticality • Threat complexity • Prioritization of operational objectives • Limits to situational awareness • Changing nature of operation • Distribution and heterogeneity of warfare assets • Command and control complexity

  4. Sensor Resources Leading to Decision Complexity
  • Types of Sensors: UAV Sensors, Satellite-based Sensors, LiDAR, Passive Sonar, Infrared Search & Track, X-band Radar, Ship-based Radar, Synthetic Aperture Radar, Active Sonar, Hyperspectral Imaging
  • Sensor Missions: Surveillance, Track Quality, Combat ID, Enhance Situational Awareness, Fire Control Support, Illuminate Target, Boost Phase Detection, Ranging
  • Sensor Constraints: Field of View, Weather, Sensor Status, Sensor Configuration, Sensor Location, Day or Night, Sensor Geometry, Platform Considerations, Sensor Health

  5. Warfare Resources Leading to Decision Complexity
  • Beyond sensors, additional resource types add to decision complexity: Comms, Platforms, Weapons
  • (Sensor types, missions, and constraints as on the previous slide)

  6. Over-Arching Objective • To most effectively use warfare resources to meet tactical operational objectives

  7. Strategies • Use warfare resources collaboratively as Systems of Systems (SoS) • Use a Network-Centric Warfare (NCW) approach to network distributed assets • Achieve situational awareness to support resource tasking/operations • Fuse data from multiple sources • Employ common processes across distributed warfare resources • Use decision-aids to support C2

  8. JDL Data Fusion Model: Data-Centric Framework
  • Level 0 Processing: Signal/Feature assessment
  • Level 1 Processing: Entity assessment
  • Level 2 Processing: Situation assessment
  • Level 3 Processing: Impact assessment
  • Level 4 Processing: Process assessment
  • Sources: Local (Intel, EW, Sonar, Radar, . . .), Distributed, Databases
  • Database management system: Support Database, Fusion Database
  • A Human/Computer Interface connects to the Data Fusion Domain; Resource Management sits external to it

  9. Functionality of the 4 Levels

  10. Shift to a Decision-Centric Framework
  • Resource Management (includes Level 4 Processing) at the center
  • Managed warfare resources: Comms, Platforms, Weapons
  • Human/Computer Interface

  11. Resource Management
  • Decision Engine: • Translate prioritized COA actions into resource tasks • Generate allocation options and select optimum • Issue tasks to warfare resources
  • Inputs: Operational Picture, Environment Picture, C2 Picture, Resource Picture; Mission/Threat Assessment & Prioritization; Wargaming (Event/Consequence Prediction); Commanders & Operators
  • Data Fusion Processes feed the pictures from the Warfare Resources: Sensors, Communications, Weapons, Warfighting Units, Weather/Mapping/Intel Sources

  12. Conceptual RM Capability • Architecture Considerations • Distributed RM “instances” • Synchronization • Hybrid: dummy C2 nodes and RM C2 nodes • Continuous On-going RM Process • Operational situation/missions are changing • Decision assessments must change in response—instead of a single assessment • Level of Automation • How much of the RM concept is automated? • RM is a decision-aid • Human C2 decision-makers must be able to manipulate information, prioritizations, and taskings

  13. Applying Systems Engineering Methods to Distributed Resource Management • An analogy exists between the SE design process and operational C2 decisions

  14. Resource Management Decision Assessments
  • Performance: OMOE Decision Engine
  • "Cost": Decision Cost Engine
  • "Risk": Decision Confidence Engine

  15. Examples of Resource Tasking
  • Two tasking options are compared: each of Tasks 1–11 is assigned either to an individual system or to a System of Systems (SoS 1–5) composed of Systems 1–9
  • Option 1 and Option 2 allocate the same set of tasks across different system/SoS groupings
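One way to sketch the two options above is as mappings from tasks to the systems (or SoS groupings) assigned to them. The specific assignments below are hypothetical, since the slide's diagram does not survive in the transcript:

```python
# Hypothetical task-to-resource assignments for two competing tasking options
option_1 = {
    "Task 1": ["System 1"],
    "Task 2": ["System 2", "System 3"],  # a small SoS
    "Task 3": ["System 4"],
}
option_2 = {
    "Task 1": ["System 5"],
    "Task 2": ["System 2"],
    "Task 3": ["System 6", "System 7"],
}

def systems_committed(option):
    """Distinct systems an option commits across all of its tasks."""
    return {s for systems in option.values() for s in systems}
```

Comparing `systems_committed` across options shows which alternative leaves more resources free for other missions.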

  16. Measuring System Effectiveness & Performance
  • Hierarchy: Overall Operational Environment → Missions → System/SoS → Subsystem
  • Corresponding measures: OMOE → MOEs → MOPs → Technical Parameters (TPs)
  • OMOE = Σ wᵢ · MOEᵢ
  • MOE = Σ wᵢ · MOPᵢ
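As a sketch, both rollup equations on this slide reduce to one weighted-sum function; the weights and scores below are illustrative, not from the source:

```python
def weighted_rollup(weights, scores):
    """Weighted sum of normalized (0..1) measures: sum(w_i * x_i).

    Serves both MOE = sum(w_i * MOP_i) and OMOE = sum(w_i * MOE_i);
    weights are assumed to sum to 1 so results stay on the 0..1 scale.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * x for w, x in zip(weights, scores))

# MOE from three MOPs (illustrative weights and scores)
moe = weighted_rollup([0.5, 0.3, 0.2], [0.9, 0.7, 0.8])   # 0.82

# OMOE from two MOEs
omoe = weighted_rollup([0.6, 0.4], [moe, 0.75])           # 0.792
```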

  17. Examples of Performance Measures
  • System OMOE: Provide Situational Awareness
  • System MOEs: Detect and track fast-moving objects of interest; Provide Area of Interest (AOI) surveillance coverage; Correctly identify objects of interest; Provide sensor coverage during day and night
  • System MOPs: Daytime capability, Nighttime capability, Sensor accuracy, Task turnaround time, Field of view (FOV), Sensor alignment targets, Scan speed, Range, Search pattern, Search volume, Sensor processing, Time in sensor view prior to identification, Range of identification, Dwell time

  18. Hierarchy of Performance Effectiveness
  • SoS OMOE = Σ wᵢ · MOEᵢ (note – these are SoS MOEs)
  • SoS MOE = Σ wᵢ · MOPᵢ (note – these are System MOPs)
  • Each SoS MOE rolls up weighted System MOPs (W₁·MOP₁, W₂·MOP₂, …) contributed by the individual systems (Systems 1–5) that make up the SoS
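The multi-level hierarchy can be sketched as a recursive rollup, where each node is either a leaf MOP score or a weighted list of children; the structure and numbers here are hypothetical:

```python
def rollup(node):
    """Recursively compute the weighted score of a hierarchy node.

    A node is either a leaf score (float) or a list of (weight, child)
    pairs, combined as a weighted sum at each level.
    """
    if isinstance(node, (int, float)):
        return node
    return sum(w * rollup(child) for w, child in node)

# SoS OMOE <- SoS MOEs <- System MOPs (illustrative weights and scores)
sos_hierarchy = [
    (0.5, [(0.6, 0.9), (0.4, 0.7)]),  # MOE 1 from System 1's MOPs -> 0.82
    (0.5, [(0.3, 0.8), (0.7, 0.6)]),  # MOE 2 from System 2's MOPs -> 0.66
]
sos_omoe = rollup(sos_hierarchy)      # 0.74
```

The same function handles arbitrarily deep hierarchies (TPs under MOPs, systems under SoS) without special cases per level.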

  19. Warfare Resources Tasking Alternatives
  • Warfare Resources Tasking Alternatives 1–3 are evaluated against Missions 1–5; each alternative supports a different subset of the missions

  20. Resource Management Decision Assessments
  • Performance: OMOE Decision Engine
  • "Cost": Decision Cost Engine
  • "Risk": Decision Confidence Engine

  21. Cost Considerations for Resource Management • Operational Costs – defensive weapons, fuel, power • Maintenance Costs (due to usage) – preventive maintenance, spares, repairs • Safety Costs – manned vs. unmanned Remember! For RM, the systems are already developed and paid for—so cost is treated differently

  22. Decision Cost Engine Concept • Provides methods to quantitatively represent the cost associated with the use of each warfare resource • May provide relative cost levels or values • Relative values are used to further refine the overall relative ranking of resource tasking decision alternatives

  23. Decision Cost Engine: 3 Concepts • “After the fact” – shifting OMOE scores up or down based on relative cost levels • “Red Flag” – associating an “identifier” with very costly warfare resources to highlight decision alternatives that include their use • “Hierarchical Weightings” – the most comprehensive approach would assign cost ratings to all resources and weightings to compute an overall “cost” for each decision option
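The three concepts can be sketched as follows; the penalty factor, resource names, and cost ratings are hypothetical placeholders:

```python
COSTLY_RESOURCES = frozenset({"interceptor_missile", "satellite_retask"})  # hypothetical

def after_the_fact(omoe, cost_level, penalty=0.05):
    """'After the fact': shift an option's OMOE down by its relative cost level (0..1)."""
    return omoe - penalty * cost_level

def red_flag(resources_used):
    """'Red flag': mark any option that uses a very costly resource."""
    return bool(COSTLY_RESOURCES & set(resources_used))

def hierarchical_cost(weights, cost_ratings):
    """'Hierarchical weightings': overall relative cost as a weighted sum of
    per-resource cost ratings."""
    return sum(w * c for w, c in zip(weights, cost_ratings))
```

For example, `after_the_fact(0.80, 0.5)` nudges a moderately costly option's score down to 0.775, refining the relative ranking without recomputing the OMOE itself.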

  24. Combining Performance and Cost Assessments
  • Options A, B, and C are plotted on OMOE (performance) vs. Cost axes; the option closest to the ideal point (maximum OMOE at minimum cost) is preferred
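A minimal sketch of selecting the option nearest the ideal point, assuming OMOE and cost are both normalized to 0..1; the option values are illustrative:

```python
import math

IDEAL = (1.0, 0.0)  # maximum performance, zero cost

def distance_to_ideal(omoe, cost):
    """Euclidean distance from an option's (OMOE, cost) point to the ideal point."""
    return math.hypot(IDEAL[0] - omoe, IDEAL[1] - cost)

# Options A, B, C as (OMOE, cost) pairs, echoing the slide's scatter plot
options = {"A": (0.9, 0.7), "B": (0.8, 0.3), "C": (0.6, 0.2)}
best = min(options, key=lambda name: distance_to_ideal(*options[name]))  # "B"
```

Here B wins: A performs better but at much higher cost, and C is cheap but underperforms. Other trade-off rules (e.g., weighted distance) would slot in by changing only the distance function.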

  25. Resource Management Decision Assessments
  • Performance: OMOE Decision Engine
  • "Cost": Decision Cost Engine
  • "Risk": Decision Confidence Engine

  26. Decision Confidence Engine • Determines a “level of confidence” associated with each resource tasking option • Based on: • Information reliability (or “goodness”) • Data fusion performance • Sensor error • Communication error • Computational error • Mis-associations, incorrect identifications, dropped tracks, poor track quality, etc.

  27. Sources of Decision Error • Sensor Observations (SO) • Communications (C) • Data Fusion Processing (DFP) • Association (A) • Attribution (At) • Identification (Id) • Threat Prioritization (TP) • Mission Identification/Prioritization (MP) • Resource Information (Health, Status, Configuration, Location, etc.) (RI) • Notional Decision Confidence Level: • P_DecisionAccuracy = P_SO × P_C × P_DFP × P_A × P_At × P_Id × P_TP × P_MP × P_RI
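The notional confidence product can be sketched directly; the per-stage probabilities below are illustrative, but they show how even small per-stage errors compound:

```python
from math import prod

# Illustrative per-stage accuracy probabilities (not from the source)
stage_accuracy = {
    "sensor_observation": 0.98,
    "communications": 0.99,
    "data_fusion_processing": 0.97,
    "association": 0.96,
    "attribution": 0.97,
    "identification": 0.95,
    "threat_prioritization": 0.98,
    "mission_prioritization": 0.97,
    "resource_information": 0.99,
}

# P_DecisionAccuracy = product of all stage probabilities
p_decision = prod(stage_accuracy.values())  # roughly 0.78
```

Nine stages at 95–99% accuracy each still leave only about 78% overall decision confidence, which is why every error source in the chain matters.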

  28. Decision Confidence Engine (continued) • Hierarchical probability model that includes all possible sources of error • As the operational situation changes, the model is updated with new error estimates • Errors are combined hierarchically to calculate an overall confidence level for each resource tasking option

  29. Summary: Comparison of Systems Engineering Assessment with Resource Management
  Decision Assessment for RM Operations:
  • Systems are in operation
  • Goal: select the most operationally effective SoS/resource tasking
  • Continuum of decisions
  • Projected performance against actual operational missions/threats
  • Cost in terms of known cost to operate & maintain; safety
  • Risk in terms of decision uncertainty or level of confidence
  Decision Assessment for System Design:
  • System is in design phase
  • Goal: select the most operationally effective design
  • Single decision
  • Projected performance against operational mission requirements
  • Cost in terms of estimated $ for acquisition and total lifecycle
  • Risk in terms of ability to meet requirements

  30. Conclusions • A decision framework providing decision assessment methodologies can address the complexity involved in effective resource management for tactical operations. • Applications from Systems Engineering provide methods for operational performance, cost, and risk assessments of resource tasking alternatives. • Future command and control stands to benefit from adopting a decision paradigm in addition to the traditional data-focused perspective.

  31. Future Work • Objective hierarchy modeling • Techniques for generating resource tasking alternatives • Continued development of the OMOE decision engine, cost decision engine, and decision confidence engine • Designing warfare resources with an emphasis on being "taskable" and having "multiple uses"
