1. AIAA S-102 Mission Assurance Standards Working Group
Tyrone Jackson, CRE
Chair, S-102 Mission Assurance Standards Working Group
P.O. Box 2294, Hawthorne, CA 90251
Phone: (310) 926-0297
jacksont@simanima.com
2. Pre-briefing Quiz
QUESTION 1 - Who believes the life cycle acquisition and operation of any system or system of systems can be implemented in 5 basic steps?
ANSWER - DoD
3. Pre-briefing Quiz (Cont.)
QUESTION 2 - Who believes the maximum capability of any project process can be achieved and maintained in 5 basic steps?
ANSWER - DoD and the Software Engineering Institute (SEI)
Capability Maturity Model Integration (CMMI) levels:
Initial
Managed
Defined
Quantitatively Managed
Optimizing
4. Pre-briefing Quiz (Cont.)
QUESTION 3 - Who believes that any problem encountered in a project process can be solved in 5 basic steps?
ANSWER - Defense Acquisition University (DAU)
Modified DAU creative problem-solving steps:
Identify the goal or challenge and gather data
Define the problem
Generate potential solutions
Select and, if possible, strengthen the solution
Develop a plan to implement the chosen solution
5. Pre-briefing Quiz (Cont.)
QUESTION 4 - Who believes that the degree to which Safety, Reliability and Quality Assurance (SR&QA) programs will identify and eliminate or control safety-critical and mission-critical risks can be pre-established by selecting mission assurance processes that (1) are commensurate with the product’s unit-value/criticality and systems engineering life cycle phase, and (2) have appropriate capability levels based on 5 basic steps?
ANSWER - The AIAA S-102 Mission Assurance Standards Working Group
6. S-102 Mission Assurance Definitions
mission assurance - The program-wide identification, evaluation, and mitigation or control of all existing and potential deficiencies that pose a threat to system safety or mission success, throughout the product’s useful life and post-mission disposal.
deficiencies - Damage-threatening hazards, mission-impacting failures, and system performance anomalies that result from unverified requirements, optimistic assumptions, unplanned activities, ambiguous procedures, undesired environmental conditions, latent physical faults, inappropriate corrective actions, and operator errors.
7. S-102 MASWG Strategic Goals
Define, develop, implement, track, and update a comprehensive volume-set of capability-based mission assurance standards and guides
Participate in U.S. and foreign industry standards working groups to promote integrated risk management approaches that consistently address all project domain risk areas, especially mission assurance
Assist and mentor small through large companies to develop capability-based mission assurance Command Media
8. S-102 MASWG Strategic Goals (Cont.)
Develop and improve the capability-based mission assurance expertise of beginner through expert engineers
Industries worldwide are changing from being dominated by a small number of “mega companies” that design, manufacture, and distribute products to being dominated by many smaller companies that design products only and outsource their manufacture and distribution.
To be regularly employed in a culture that increasingly relies on a “just in time” workforce, engineers will have to be versatile and augment their specialized skills with broad expertise in mission assurance.
9. S-102 MASWG Tactical Goals
On an industry-need basis, develop 40 AIAA capability-based mission assurance standards that are grouped by project management, engineering, and test domains
Participate in IEEE reliability standards working groups and ISO technical committees to promote consistent and integrated mission assurance risk management approaches
Build generic Command Media templates based on predefined processes needed to control each of five levels of life cycle product unit-value/criticality risk
Establish a webpage to coordinate the development and validation of “open source” S-102 compliant mission assurance tools
10. Packet One S-102 Standards
S-102.1.4 FRACAS (Failure Reporting, Analysis, and Corrective Action System)
S-102.1.5 Failure Review Board
S-102.2.2 System Reliability Modeling
S-102.2.4 Product FMECA
S-102.2.5 Sneak Circuit Analysis
S-102.2.11 Anomaly Detection & Response
S-102.2.18 Fault Tree Analysis
11. Packet Two S-102 Standards
S-102.0.1 Mission Assurance (MA) Program General Requirements
S-102.1.1 MA Program Planning
S-102.1.6 Critical Item Risk Management
S-102.2.1 Functional Diagram Modeling
S-102.2.3 Component Reliability Predictions
S-102.2.13 Operational Dependability and Availability Modeling
S-102.2.14 Hazard Analysis
S-102.2.15 Software Component Reliability Predictions
S-102.2.17 Event Tree Analysis
12. Packet Three S-102 Standards
S-102.1.2 Subcontractor & Supplier MA Management
S-102.1.3 MA Working Group
S-102.2.7 Finite Element Analysis
S-102.2.6 Design Concern Analysis
S-102.2.16 Process FMECA
S-102.2.22 Stress & Damage Simulation Analysis
S-102.1.8 Quality Assurance
S-102.1.9 Configuration Management
S-102.3.1 Environmental Stress Screening
13. Packet Four S-102 Standards
S-102.1.7 Project Mission Assurance Database
S-102.2.9 Human Error Predictions
S-102.2.10 Environmental Event Survivability Predictions
S-102.2.12 Maintainability Predictions
S-102.2.20 Similarity & Allocations Analysis
S-102.2.21 Component Engineering
S-102.3.2 Reliability Development/Growth Testing
S-102.3.4 Reliability Life Testing
S-102.3.7 Product Safety Testing
14. Packet Five S-102 Standards
S-102.1.10 Environmental Safety Assurance
S-102.2.8 Worst Case Analysis
S-102.2.19 Fishbone Analysis
S-102.3.3 Reliability, Maintainability, and Availability Demonstration Testing
S-102.3.5 Design of Experiments
S-102.3.6 Ongoing Reliability Testing
15. S-102 Standards Development Process
Every S-102 Mission Assurance (MA) standard includes an Annex B that provides consistent criteria for measuring the capability of a particular MA process to identify, assess, and control Safety, Reliability and Quality Assurance (SR&QA) related risk factors
As an MA process grows in capability during the product development life cycle, it transitions through increasing levels of accuracy and precision until it reaches the level that is specified in the Mission Assurance Program (MAP) Plan
Defining a specific capability level requires asking the question, “How much effort must be put into this process to ensure risk is controlled at a level that is commensurate with the unit-value/criticality of the product being developed or operated?” (see the sketch after this list)
Capability rating criteria follow the logical order of activities necessary to improve the effectiveness of a particular MA process in stages
These capability rating criteria can help an organization plan process improvement strategies by identifying the current capability levels of their processes and the most critical areas where improvements are needed
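To make the selection question above concrete, the sketch below maps a product’s unit-value/criticality category to the minimum MA process capability level a Mission Assurance Program (MAP) Plan might specify. This is a hedged illustration in Python: the category names mirror the five levels described on the next slide, but the mapping and function names are hypothetical, not taken from the S-102 standards.

    # Hypothetical sketch: pick the minimum MA process capability level
    # commensurate with a product's unit-value/criticality category.
    # The five categories mirror this briefing; the mapping is illustrative,
    # not quoted from any S-102 standard.
    UNIT_VALUE_TO_CAPABILITY_LEVEL = {
        "low": 1,         # baseline activities only
        "medium": 2,
        "high": 3,
        "very-high": 4,
        "ultra-high": 5,
    }

    def required_capability_level(category: str) -> int:
        """Minimum capability level for a unit-value/criticality category."""
        if category not in UNIT_VALUE_TO_CAPABILITY_LEVEL:
            raise ValueError(f"Unknown unit-value/criticality category: {category!r}")
        return UNIT_VALUE_TO_CAPABILITY_LEVEL[category]

    print(required_capability_level("high"))  # -> 3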
16. S-102 Process Capability Level Rating
The key activities of a particular Mission Assurance process are grouped according to five increasing levels of capability
Capability Level 1 activities are the set of “baseline” activities that make up the minimum effort required to identify and control risk for a low unit-value/criticality product
Capability Level 2 activities include all the Level 1 activities plus additional activities that represent the minimum effort required to identify and control risk for a medium unit-value/criticality product
Capability Level 3 activities include all the Level 1 and 2 activities plus additional activities that represent the minimum effort required to identify and control risk for a high unit-value/criticality product
Capability Level 4 activities include all the Level 1, 2, and 3 activities plus additional activities that represent the minimum effort required to identify and control risk for a very-high unit-value/criticality product
Capability Level 5 activities include all the Level 1, 2, 3, and 4 activities plus additional activities that represent the minimum effort required to identify and control risk for an ultra-high unit-value/criticality product
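Because each level is cumulative, the activity set at Level N is the union of everything added at Levels 1 through N. The Python sketch below shows that structure with invented placeholder activity names; the real key activities are defined per process in each standard’s Annex B.

    # Hypothetical sketch of cumulative capability levels: the activity set
    # at Level N is everything added at Levels 1..N. Activity names are
    # invented placeholders, not taken from any S-102 Annex B.
    ACTIVITIES_ADDED_AT_LEVEL = {
        1: {"collect failure reports", "perform baseline analysis"},
        2: {"trend recurring failures"},
        3: {"verify corrective actions"},
        4: {"quantify residual risk"},
        5: {"independently audit results"},
    }

    def activities_for_level(level: int) -> set:
        """All key activities required at capability level 1-5 (cumulative)."""
        if level not in ACTIVITIES_ADDED_AT_LEVEL:
            raise ValueError("Capability level must be 1 through 5")
        required = set()
        for lower in range(1, level + 1):
            required |= ACTIVITIES_ADDED_AT_LEVEL[lower]
        return required

    assert activities_for_level(3) >= activities_for_level(2)  # Level 3 carries over Level 2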
17. Characteristics of Capability-based Processes
For a Mission Assurance process to be transitioned from a lower capability level to a higher one, the process must:
Carry over activities from the lower-level process to the higher-level process; and
Add new activities to the higher-level process; and/or
Add modified activities from the lower-level process to the higher-level process.
Some Mission Assurance processes are not implemented in certain life cycle phases of certain products because the capability of the process, as defined in the applicable S-102 standard, does not include identifying and controlling the Mission Assurance risk factors associated with that product in those phases.
The responsibility for implementing some Mission Assurance processes (e.g., Hazard Analysis, Product FMECA, and Critical Item Risk Management) must be shared among several project engineering disciplines, not delegated to only one discipline. In other words, do not assign Hazard Analysis, Product FMECA, and Critical Item Risk Management to separate Safety, Reliability, and Quality Assurance (SR&QA) organizations, respectively.
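Read as a rule, the transition criteria above amount to a superset check: the higher-level process must retain every lower-level activity (counting a modified activity as retaining its lower-level counterpart) and add at least one new or modified activity. A minimal, hypothetical check, reusing activities_for_level() from the previous sketch:

    # Hypothetical check of the transition rules above. Treats a modified
    # activity as still "carried over"; only additions are tested here.
    def is_valid_transition(lower_set: set, higher_set: set) -> bool:
        """True if the higher level carries over all lower-level activities
        and adds at least one new or modified activity."""
        return higher_set >= lower_set and higher_set != lower_set

    assert is_valid_transition(activities_for_level(2), activities_for_level(3))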
18. Process Capability Rating Criteria Sample
19. Product Unit-Value/Criticality Category Guide
20. Structure of Data Maturity Rating Criteria
The key data products of a particular Mission Assurance process are grouped according to five increasing levels of maturity
A Maturity Level 1 data product is likely to be the least accurate and is appropriate only for a low unit-value product assessment
A Maturity Level 2 data product is likely to be more accurate than a Maturity Level 1 data product but less accurate than a Maturity Level 3 data product, and is appropriate for a medium unit-value product assessment
A Maturity Level 3 data product is likely to be more accurate than a Maturity Level 2 data product but less accurate than a Maturity Level 4 data product, and represents the minimum acceptable accuracy for high unit-value product assessments
A Maturity Level 4 data product is likely to be more accurate than a Maturity Level 3 data product but less accurate than a Maturity Level 5 data product, and represents the average expected accuracy for high unit-value product assessments
A Maturity Level 5 data product is likely to be more accurate than a Maturity Level 4 data product, and represents the maximum possible accuracy for high unit-value product assessments
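Stated as a rule of thumb, a data product is acceptable for an assessment only if its maturity level meets the minimum for the product’s unit-value category: Level 1 for low, Level 2 for medium, and Level 3 or higher for high unit-value assessments. The hypothetical Python helper below encodes that threshold; the names are illustrative only.

    # Hypothetical sketch: minimum acceptable data maturity level per
    # unit-value category, per the criteria summarized on this slide.
    # For "high", Levels 3/4/5 correspond to minimum acceptable, average
    # expected, and maximum possible accuracy.
    MIN_MATURITY_FOR_ASSESSMENT = {"low": 1, "medium": 2, "high": 3}

    def maturity_is_acceptable(category: str, maturity_level: int) -> bool:
        """True if a data product's maturity meets the category's minimum."""
        return maturity_level >= MIN_MATURITY_FOR_ASSESSMENT[category]

    assert maturity_is_acceptable("high", 4)      # average expected accuracy
    assert not maturity_is_acceptable("high", 2)  # below minimum acceptable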
21. Data Maturity Level Rating Criteria Sample
22. S-102 MASWG Develops and Validates Open Source Tools to Aid Execution of Mission Assurance Programs
23. Conclusions
The S-102 MASWG is defining, developing, implementing, tracking, and updating a comprehensive volume-set of capability-based mission assurance standards and guides
The S-102 MASWG is participating in U.S. and foreign industry standards working groups to promote integrated risk management approaches that consistently address all project domain risk areas, especially mission assurance
The S-102 MASWG is assisting and mentoring a small company to develop capability-based mission assurance Command Media
http://www.spacewx.com/
24. Conclusions (Cont.)
The S-102 MASWG is developing an open source reliability modeling tool to improve the capability-based mission assurance expertise of beginner through expert engineers