PMT-352 Systems Engineering Seminar
DragonFly JRATS Simulation
Version 2.1, 2-5-14
Introduction and Objectives
• Apply the systems engineering technical management processes and effectively implement the technical processes and the overall system acquisition process, using aspects of the DragonFly simulation.
• This seminar is designed to help prevent teams from making cost/performance tradeoffs in an ad hoc manner, without really appreciating the rigor of the SE process model.
• This seminar is intended to provide an appreciation for how the SE process would actually be applied to a system like the JUGV.
Defense Acquisition Guide Systems Engineering Process Model
• Technical Management Processes (always on-going): Technical Planning, Requirements Management, Configuration Management, Interface Management, Decision Analysis, Risk Management, Data Management, Technical Assessment
• Technical Processes: Stakeholder Requirements Definition, Requirements Analysis, Architecture Design, Implementation, Integration, Verification (DT), Validation (OT), Transition
Start at the Beginning: Stakeholder Requirements Definition
• The JCIDS documents (JRATS ICD and Draft CDD) provide the primary input to the SE process.
• Requirements immediately start to evolve when we add the JRATS CONOPS.
• The JRATS architectural views help identify the "context," identify interfaces, and clarify "what's required."
• Stakeholder Requirements Definition is an important part of the SE process, supported by the Requirements Management and Interface Management processes.
JUGV Stakeholder Requirements Definition
• Detailed stakeholder capability needs are turned into good technical requirements.
• The JCIDS documents (Draft CDD) and the CONOPS detail the operational requirements, Key Performance Parameters (KPPs), and other required capabilities.
• Other requirements are derived from statutory, regulatory, certification, design, and interface sources:
- Statutory & Regulatory/Certification. Example: Information Assurance, Spectrum Supportability
- Design Considerations. Example: Human Systems Integration, Electro-Magnetic Environmental Effects
- Interface Requirements. Example: DragonFly Unmanned Air Vehicle (UAV)
JUGV Technical Assessment Process
• As part of the Technical Assessment Process, the program office will plan and hold technical reviews.
• One of the most important is the System Requirements Review (SRR).
• The SRR ensures that the PMO, user, and contractor all have a common understanding of, and agreement on:
- the system-level technical requirements, and
- the cost, schedule, and risks associated with realizing a system that meets those requirements.
JUGV Requirements Analysis
• Functional analysis (verbs): extract the top-level functions and determine their performance requirements and constraints.
• This may cause us to reassess how requirements are stated.
• Good requirements are verifiable, clear, concise, consistent, traceable, feasible, and necessary.
JUGV Requirements Analysis: Functional Flow Block Diagram (FFBD)
Another way to "see" this process is with a Functional Flow Block Diagram (FFBD), the primary functional analysis technique. The FFBD:
• Indicates the logical and sequential relationships among functions
• Shows the entire "network of actions" and the "logical sequence"
• Does NOT prescribe a time duration to or between functions
- HOWEVER: a timeline analysis will be done based on the FFBD
• Does NOT show "how" a function is to be performed
- HOWEVER: the "how" will have to be identified for each block
JUGV Requirements Analysis Using a Functional Flow Block Diagram
• Top level: divide all functions into logical groups (blocks 1.0 through 6.0), such as Start, Transit to Op Area, Conduct Mission Operations, Conduct R&S / Target & Attack, Transit to Base Area, and Shutdown.
• Second- and third-level blocks (4.1 through 4.4 and 4.3.1 through 4.3.11) decompose these into functions such as Detect, Locate, Identify, Track, Decide, Designate, Communicate, Safe Launcher, Load, Arm Weapon, Launch Weapon, Guide Weapon, Kill Target, and Detect Mines, connected by AND/OR logic.
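To make the FFBD structure concrete, here is a minimal sketch (not part of the seminar material) that represents the top-level functional flow as a directed graph; the block names come from the slide above, while the data structure and helper function are illustrative assumptions.

```python
# Minimal sketch of a Functional Flow Block Diagram (FFBD) as a directed graph.
# Block numbers/names follow the slide; the representation itself is illustrative.

ffbd = {
    "1.0 Start": ["2.0 Transit to Op Area"],
    "2.0 Transit to Op Area": ["3.0 Conduct Mission Operations"],
    "3.0 Conduct Mission Operations": ["4.0 Conduct R&S / Target & Attack"],
    "4.0 Conduct R&S / Target & Attack": ["5.0 Transit to Base Area"],
    "5.0 Transit to Base Area": ["6.0 Shutdown"],
    "6.0 Shutdown": [],
}

def walk(ffbd, start):
    """Yield blocks in the logical sequence defined by the FFBD (no time durations implied)."""
    block = start
    while block is not None:
        yield block
        successors = ffbd.get(block, [])
        block = successors[0] if successors else None

for block in walk(ffbd, "1.0 Start"):
    print(block)
```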
JUGV Architecture Design: Logical Decomposition
• Group functions in a way that they can be realized by a physical component or sub-system ("allocated" into logical groups).
• A first cut at a design: decide which function groups can be COTS/NDI/GFE and which will be developed.
• For functions that could be realized in a number of ways, conduct a trade-off analysis (Decision Analysis) to decide on the best solution.
• Technical management processes in play throughout Architecture Design: Risk Management, Interface Management, Decision Analysis.
JUGV Logical Decomposition. Example: Target & Attack Function
An initial cut at a physical architecture: some elements are SPECIFIC (a single solution is already identified), while others present OPTIONS, for which a trade-off analysis is required.
JUGV Functional Allocation Table: Targeting and Attack Sub-system
• Functions (verbs) are allocated to physical components (nouns).
• The allocation "loops" back to the requirements, providing traceability (Requirements Management).
• The allocation also helps shape the WBS; a small illustrative sketch follows.
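As a rough illustration of how a functional allocation table supports traceability, the sketch below maps functions (verbs) to the physical components (nouns) they are allocated to and groups them into a WBS-like view; the requirement IDs are hypothetical placeholders.

```python
# Illustrative functional allocation for the Targeting & Attack sub-system.
# Function and component names follow the seminar; requirement IDs are hypothetical.

from collections import defaultdict

allocation = [
    # (function / verb, allocated component / noun, traced requirement)
    ("Detect",         "RADAR",              "SYS-REQ-041"),
    ("Identify",       "IFF",                "SYS-REQ-042"),
    ("Designate",      "LASER",              "SYS-REQ-043"),
    ("Track / Decide", "Targeting Computer", "SYS-REQ-044"),
    ("Launch Weapon",  "Launcher",           "SYS-REQ-045"),
]

# Grouping by component is what helps shape the WBS.
wbs_view = defaultdict(list)
for function, component, req in allocation:
    wbs_view[component].append((function, req))

for component, funcs in wbs_view.items():
    print(component)
    for function, req in funcs:
        print(f"  {function:<15} traces to {req}")
```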
JUGV N² Diagram: To Analyze Interfaces and Interactions
• Problems occur at "interfaces." Identifying them in advance is crucial to effective Risk Management.
• N² diagrams are used to identify and analyze interface requirements between physical components or functions.
• Components (or functions) are shown on the diagonal; interfaces are then identified in the off-diagonal cells.
• Interface types and requirements are identified for each component. They could include:
- Electrical
- Mechanical
- Hydraulic
- Heating / Cooling
- User interface
• TMPs in play throughout Architecture Design: Interface Management, Risk Management.
JUGV N² Diagram to Analyze Interfaces and Interactions (example diagram).
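A minimal sketch of an N² interface matrix, assuming a small component set from the targeting and attack slides: components sit on the diagonal, and each off-diagonal cell records the interface (if any) between a pair of components. The specific interface assignments here are illustrative, not the DragonFly design.

```python
# Sketch of an N^2 diagram as a matrix: components on the diagonal,
# interfaces in the off-diagonal cells. Interface assignments are illustrative.

components = ["Targeting Cmptr", "RADAR", "IFF", "LASER", "Launcher"]

interfaces = {  # (component A, component B): interface type (hypothetical)
    ("Targeting Cmptr", "RADAR"):    "Ethernet",
    ("Targeting Cmptr", "IFF"):      "RS-422",
    ("Targeting Cmptr", "LASER"):    "Ethernet",
    ("Targeting Cmptr", "Launcher"): "MIL-STD-1553B",
}

def lookup(a, b):
    """Interfaces are undirected here, so check both orderings of the pair."""
    return interfaces.get((a, b)) or interfaces.get((b, a)) or "-"

for i, row in enumerate(components):
    cells = [row if i == j else lookup(row, col) for j, col in enumerate(components)]
    print("".join(f"{c:<17}" for c in cells))
```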
JUGV Architecture Design: Design Trade Study / Trade-off Analysis (Decision Analysis starts here)
• Steps: define candidate solutions; define assessment criteria; assign weights to the criteria; assign an MOE or MOP to each candidate solution; perform a sensitivity analysis on the results.
• Example: a trade study to choose the combination of COTS RADAR and GFE targeting computer that optimizes our system according to the selection criteria, defined as weight, range, power requirements, and life cycle cost.
• Note: the DragonFly simulation SW will provide the component data for you to use as input to the selection criteria categories for each combination of RADAR and targeting computer.
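The weighted-scoring arithmetic behind such a trade study can be sketched as follows; the candidate names, normalized scores, and weights are made-up placeholders rather than DragonFly data.

```python
# Sketch of a weighted-scoring trade study. The criteria match the seminar
# (weight, range, power, life cycle cost); scores and weights are placeholders.

criteria_weights = {"weight": 0.20, "range": 0.35, "power": 0.15, "lcc": 0.30}

# Normalized 0-10 scores per RADAR / targeting-computer combination (hypothetical).
candidates = {
    "Option 3 (RADAR B + Targeting Computer 1)": {"weight": 7, "range": 8, "power": 6, "lcc": 7},
    "Option 6 (RADAR C + Targeting Computer 2)": {"weight": 6, "range": 9, "power": 7, "lcc": 6},
}

def total_score(scores, weights):
    """Weighted sum of the criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in candidates.items():
    print(f"{name}: {total_score(scores, criteria_weights):.2f}")
```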
JUGV Decision Analysis. Example: Trade Study Results for RADAR/Targeting Computer Selection
Trade-off analysis for the COTS RADAR and GFE targeting computer combinations: nine different options were considered, based on the components available in the DragonFly simulation.
JUGV Decision Analysis: Sensitivity Analysis
What is the impact on the scores if we change one of the weights?
• The LCC constraint, together with the threshold (T) and objective (O) values, bounds the trade space.
• Options 3 & 6 are within our cost and performance criteria.
• What's the best choice? Would it still be the best choice if the weights were changed? A sketch of this check follows.
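One way to run the sensitivity check described above is to perturb a single weight, renormalize, and see whether the preferred option changes. This sketch reuses the same hypothetical candidates and weights as the previous example.

```python
# Sensitivity analysis sketch: vary one criterion weight and check whether
# the preferred option changes. All numbers are illustrative placeholders.

base_weights = {"weight": 0.20, "range": 0.35, "power": 0.15, "lcc": 0.30}
candidates = {
    "Option 3": {"weight": 7, "range": 8, "power": 6, "lcc": 7},
    "Option 6": {"weight": 6, "range": 9, "power": 7, "lcc": 6},
}

def score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

def renormalize(weights):
    total = sum(weights.values())
    return {c: w / total for c, w in weights.items()}

# Sweep the life-cycle-cost weight and report the winner at each setting.
for lcc_weight in (0.10, 0.30, 0.50):
    w = renormalize({**base_weights, "lcc": lcc_weight})
    winner = max(candidates, key=lambda name: score(candidates[name], w))
    print(f"LCC weight {lcc_weight:.2f}: best option is {winner}")
```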
JUGV Evolving Technical Baseline (example of requirements traceability down to the JUGV Targeting Computer Configuration Item)
• Functional Baseline, established at the System Functional Review (SFR):
- What the system must do (functions) and how well it must do it (performance), at the "system" level
- Defines the interfaces/dependencies among the functions, functional groups, and the environment
- Main artifacts: System Performance Spec and Subsystem/Segment Specs
• Allocated Baseline, established at the Preliminary Design Review (PDR)
• Product Baseline, established at the Critical Design Review (CDR)
• Technical management processes in play: Technical Planning, Technical Assessment (technical reviews)
JUGV Evolving Technical Baseline (continued)
• Tracing requirements through the technical reviews:
- Identifies requirements "creep"
- Increases confidence in meeting stakeholder expectations
• The baselines provide:
- The common reference point for Configuration Management
- The basis for "verification" activities
• TMPs in play throughout Architecture Design: Requirements Management, Configuration Management, Technical Assessment
Decomposition of Requirements and Traceability from Baseline to Baseline Completes the "Design"
Example: JUGV Targeting Software Configuration Item
• Functional Baseline: "System" Performance Requirement (JUGV system)
• Allocated Baseline (performance of the CIs that make up the system): "Item" Performance Specification (Targeting Software CI)
• Product Baseline (details of the components and modules that make up the CIs): Item Detail Specification (Targeting Software ROE Module), including pseudo code, flow charts, use case diagrams, sequence diagrams, state diagrams, structure diagrams, database structure, data definitions, data storage requirements, etc.
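As a small illustration of holding this baseline-to-baseline traceability as data, the sketch below builds the specification tree for the example above; the class and the exact spec titles are illustrative assumptions.

```python
# Sketch of baseline-to-baseline traceability for the Targeting Software CI.
# Spec titles follow the slide; the data structure is illustrative.

from dataclasses import dataclass, field

@dataclass
class Spec:
    baseline: str              # Functional, Allocated, or Product
    title: str
    children: list = field(default_factory=list)

spec_tree = Spec(
    "Functional", "JUGV System Performance Specification",
    children=[
        Spec("Allocated", "Targeting Software CI Item Performance Specification",
             children=[
                 Spec("Product", "Targeting Software ROE Module Item Detail Specification"),
             ]),
    ],
)

def walk(spec, depth=0):
    """Print the spec tree, showing which baseline each level belongs to."""
    print("  " * depth + f"[{spec.baseline}] {spec.title}")
    for child in spec.children:
        walk(child, depth + 1)

walk(spec_tree)
```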
Crafting Technical Performance Measures (TPMs): a critical step for Systems Engineers (Technical Assessment)
TPMs:
• Are selected attributes that are measurable through analysis from the early stages of design and development
• Allow Systems Engineers and PMs to track progress over time
• Are an integral part of the logical decomposition and the architecture design process
• Provide a mechanism for facilitating early awareness of problems
• Should be based on parameters that drive costs, are on the critical path, or represent high-risk factors
• Add a third dimension, "technical achievement," to the cost and schedule strengths of EVM
Example: the Probability of Kill KPP (next slide)
Example JUGV TPM Decomposition
• Measure of Effectiveness (KPP): Probability of Kill (Pk) = the percentage of time that an attack on a single target results in rendering the target incapable of performing its mission
• Measures of Performance: Target Track Accuracy (meters), Targeting Data Update Rate (milliseconds), Max Weapon Range (meters)
• Technical Performance Measures: Targeting Software Memory Utilization (% of total RAM capacity), Targeting Algorithm Running Time (milliseconds)
• The TPM chart plots planned progress over time, with tolerance bands and the threshold and objective values, against actual measured progress (e.g., Targeting Algorithm Running Time in ms).
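As a rough illustration of tracking a TPM over time, the sketch below compares measured values of the targeting algorithm running time against a planned profile with a tolerance band; all of the numbers, and the review events used as measurement points, are invented for illustration.

```python
# TPM tracking sketch: planned profile with a tolerance band vs. measured values.
# The TPM (targeting algorithm running time, ms) is from the slide; the data are invented.

THRESHOLD_MS = 50.0   # value that must not be exceeded (threshold)
OBJECTIVE_MS = 30.0   # desired value (objective)

# (review event, planned value, tolerance +/-, measured value) -- hypothetical.
history = [
    ("SFR", 70.0, 10.0, 74.0),
    ("PDR", 55.0,  8.0, 60.0),
    ("CDR", 45.0,  5.0, 48.0),
]

for event, planned, tol, actual in history:
    in_band = abs(actual - planned) <= tol
    status = "within tolerance" if in_band else "OUT of tolerance: early warning of a problem"
    print(f"{event}: planned {planned} +/- {tol} ms, measured {actual} ms -> {status}")

latest = history[-1][3]
print("Latest value meets the threshold" if latest <= THRESHOLD_MS else "Latest value exceeds the threshold")
```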
JUGV Implementation
"Design" involves two primary efforts: detailed design down to the lowest system elements, and "realization," the fabrication/production of actual products.
• Plans, designs, analyses, requirements, and drawings are realized into actual products.
• Ensure the detail design is properly captured in the design-phase artifacts.
• Make, buy, or reuse system components; verify that each element, whether bought, made, or reused, meets its specification.
• The acquisition strategy drives the implementation strategy, applied to the JUGV WBS; the SEP documents how decisions are to be made.
• Example, Targeting & Attack segment make/buy/reuse decisions: RADAR (buy), Target H/W (buy), Target S/W (make), IFF (re-use), LASER (buy), Launcher (re-use), Missile (re-use).
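The make/buy/reuse decisions in the example can be carried as simple data keyed to the WBS, as in this illustrative sketch; the decisions follow the slide, but the WBS element numbering is hypothetical.

```python
# Make / buy / reuse decisions for the Targeting & Attack segment.
# Decisions follow the slide; WBS element numbers are hypothetical.

from collections import Counter

sourcing = {
    "1.1 RADAR":      "buy",
    "1.2 Target H/W": "buy",
    "1.3 Target S/W": "make",
    "1.4 IFF":        "re-use",
    "1.5 LASER":      "buy",
    "1.6 Launcher":   "re-use",
    "1.7 Missile":    "re-use",
}

# Quick roll-up of the implementation strategy for the segment.
print(Counter(sourcing.values()))   # Counter({'buy': 3, 're-use': 3, 'make': 1})
```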
JUGV Schematic Block Diagram (putting the pieces together): Targeting & Attack Sub-system
• Components: Targeting Computer, RADAR, IFF, LASER, Launcher
• Interfaces include RS-422, Ethernet, MIL-STD-1553B, MIL-STD-1760, and encoded LASER energy
• TMPs in play: Interface Management, Configuration Management
• Integration happens at each and every level in the system architecture. It starts with basic parts, such as resistors, then proceeds up through:
- Components (e.g., printed circuit boards)
- Subassemblies (e.g., actuators)
- Assemblies (e.g., antenna)
- Subsystems (e.g., target & attack subsystem)
- Segments (e.g., mission system segment)
- Full systems (e.g., JUGV)
JUGV Schematic Block Diagram (putting the pieces together): integrating the sub-systems
• Sub-systems: Targeting & Attack, Vehicle Control & Navigation, Remote Operator Comms, Electrical and Mechanical Subsystems, Vehicle Chassis, Engine and Drive Train
• Integration within a sub-system is a major task; integrating sub-system to sub-system is a HUGE task.
• Major challenges lie in Interface Management and Configuration Management.
JUGV Verification (across three dimensions). Think "DT."
"Confirming system elements meet the design-to or build-to spec. Did you build it right?"
• System levels: verification runs top to bottom, from the JUGV system, through the Target & Attack, Survivability, Chassis, Drive Train, and S&R segments, down to the lowest piece-parts (RADAR, H/W, S/W, IFF, LASER, Launcher, Missile), whether bought, made, or re-used.
• Methods: Analysis, Inspection, Demonstration, Test.
• Stages: Development, Qualification, Acceptance, Operations & Maintenance.
• Verification requires early and continuous planning throughout all phases.
JUGV Verification and Technical Planning
• Verification requirements provide the basis for the TEMP.
• As they are developed and refined, the SEP should be updated to describe how test results will feed back into the system design.
• The SEP should also describe the tools used for tracking and maintaining traceability of verification requirements.
• Verification matrices are developed for each level of the system (a small sketch follows).
• You never really have a "good" requirement until you have "verification" requirements to go with it.
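A minimal sketch of a verification matrix for one level of the system: each requirement is paired with the verification method(s) planned to close it. The requirement IDs and method assignments are illustrative, not taken from the JUGV specifications.

```python
# Verification matrix sketch: one row per requirement, one column per method.
# Methods (Analysis, Inspection, Demonstration, Test) follow the seminar;
# requirement IDs and assignments are hypothetical.

METHODS = ["Analysis", "Inspection", "Demonstration", "Test"]

verification_matrix = {
    "SYS-REQ-044 Target track accuracy":      {"Analysis", "Test"},
    "SYS-REQ-045 Max weapon range":           {"Analysis", "Demonstration"},
    "SYS-REQ-051 Targeting S/W memory usage": {"Test"},
}

print(f"{'Requirement':<42}" + "".join(f"{m:<15}" for m in METHODS))
for req, methods in verification_matrix.items():
    row = "".join(f"{('X' if m in methods else '-'):<15}" for m in METHODS)
    print(f"{req:<42}" + row)
```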
JUGV Validation: Operational Test Design Example. Think "OT."
"Confirming system elements meet stakeholder requirements. Did you get the requirements right?"
• Developed by the JCIDS process (JUGV CDD, JUGV CONOPS):
- KPPs: Probability of Kill (Pk) / Ao
- Mission Task Analysis: Attack by fire an enemy force or position (identify the enemy, fix the enemy location, engage the enemy with weapons); Provide maintenance support (identify failed components, remove and replace failed components in the field)
• Developed by the T&E WIPT:
- Critical Operational Issues: Can the JUGV kill its intended targets? Can the JUGV be maintained in an operational environment?
- Measures of Effectiveness: Target Detection Range
- Measures of Suitability: Mean Time to Fault Locate
- Measures of Performance: RADAR Range, Built-in Test (BIT) False Alarm Rate
• Developed by the operational testers: operational scenarios, OT data requirements, OT resource requirements, OT framework/OT plan, TEMP
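The flow from critical operational issues down to measures of performance can be sketched as nested data, as below; the entries mirror the slide, while the nesting itself is an illustrative assumption.

```python
# Sketch of the operational test design hierarchy: COI -> KPP/MOE/MOS -> MOP.
# Entries follow the slide; the structure is illustrative.

ot_design = {
    "COI-1: Can the JUGV kill its intended targets?": {
        "KPP: Probability of Kill (Pk)": [],
        "MOE: Target Detection Range": ["MOP: RADAR Range"],
    },
    "COI-2: Can the JUGV be maintained in an operational environment?": {
        "MOS: Mean Time to Fault Locate": ["MOP: Built-in Test (BIT) False Alarm Rate"],
    },
}

for coi, measures in ot_design.items():
    print(coi)
    for measure, mops in measures.items():
        print(f"  {measure}")
        for mop in mops:
            print(f"    {mop}")
```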
Verification and Validation: VERIFICATION REDUCES VALIDATION RISK.
JUGV Transition
• Transition a system element to the next level of the physical architecture (e.g., the Targeting & Attack Sub-system into the JUGV System). Focus on Interface Management, Technical Data Management, and Integration.
• Transition the end item to the user in the operational environment (the JUGV System to the user). Focus on operational integration and the Integrated Logistics Support elements (training and maintenance plans, supply provisions, technical publications, support equipment, PHS&T, etc.).
Recursive and Iterative Systems Engineering
• Recursive: the repeated application of processes to design next-lower-layer system products, or to realize next-upper-layer end products, within the system structure (system, sub-system, components).
• Iterative: the application of a process to the same product or set of products to correct a discovered discrepancy or other variation from requirements.
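A toy sketch of the recursive idea: the same define-then-realize pattern is applied at every layer of the system structure (system, sub-system, component). The layer contents and the two process steps are illustrative.

```python
# Toy sketch of recursive systems engineering: the same process is applied
# at each layer of the system structure. Layer contents are illustrative.

system_structure = {
    "JUGV System": {
        "Targeting & Attack Sub-system": {
            "Targeting Computer (component)": {},
            "RADAR (component)": {},
        },
        "Chassis Sub-system": {},
    },
}

def apply_process(element, children, depth=0):
    """Recursively 'design' each layer on the way down, then 'realize' it on the way back up."""
    print("  " * depth + f"Design:  {element}")
    for child, grandchildren in children.items():
        apply_process(child, grandchildren, depth + 1)
    print("  " * depth + f"Realize: {element}")

for top_level, subs in system_structure.items():
    apply_process(top_level, subs)
```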
Recursive and Iterative Systems Engineering: the Vee Model
Left side of the Vee (definition):
• Stakeholder Requirements Definition: stakeholder requirements, CONOPS, and validation planning
• Requirements Analysis: system performance specification and verification planning
• Architecture Design: configuration item performance specification and verification planning
• Implementation: configuration item detail specification and verification procedures; fabricate, code, buy, or reuse
Right side of the Vee (realization):
• Inspect and test to the detail specification (Verification)
• Assemble the configuration items and verify to the CI performance specification (Verification)
• Integrate the system and verify to the system specification (Verification)
• Validate the system to the stakeholder requirements and CONOPS (Validation), then Transition
The technical management processes span the entire Vee: Technical Planning, Requirements Management, Configuration Management, Interface Management, Decision Analysis, Risk Management, Data Management, Technical Assessment.