
CxSI Validation from “The Way We Were” to “Where We (think) We Are”




Presentation Transcript


1. CxSI Validation from “The Way We Were” to “Where We (think) We Are”
Objectives:
• Convey processes and plans used by CxSI IV&V
• Share best practices and lessons learned
• Provide a context for discussion about modeling and validation
Agenda:
• March 29 Brown Bag Recap, CxP Requirements
• The CxSI SRM
• Validation Plans and Status
Contacts: christina.d.moats@nasa.gov, cxsi_team@ivv.nasa.gov

2. March 29 Brown Bag Recap (Initial Validation Plans)
• CxSI initial framework for validation (“Ready to Use” version presented 7/16/07)
• Best practices/lessons learned

3. Current Efforts Follow Plans Developed in March 2007
• Three tasks will be complete by July 27, 2007:
• Develop the behavior model (SRM)
• Analysis preparation
• Validation methods
• Validation analysis will occur from July through September 2007
Note: this process is expanded in later slides

4. CxSI Behavior Model Generation and L2 Validation Process Based on This Model
[Timeline figure: model generation Feb-July 2007; validation Aug-Sept 2007]
Note: this process is expanded in later slides

5. Constellation Requirements
Requirements level definitions:
• Level 1: Needs, Goals and Objectives (NGO)
• Level 2: CxP Architecture Requirements Document (CARD)
• Level 3: System Requirements Documents (SRDs)
• Level 4: Element requirements
(Note: CxP requirements are managed in the Cradle database)
Artifact structure:
• CARD: Section 3.2 holds architecture requirements; Section 3.7 holds system requirements (allocations of architecture requirements)
• SRDs: Section 3.2 holds system requirements; Section 3.7 holds element requirements (allocations of system requirements); and so on down the hierarchy
• Section 3.7 of a parent document is the same as Section 3.2 of its child
CxSI requirements validation objectives:
• Validate the Level 2 requirements → Section 3.2 of the CARD
• Verify the trace from Level 2 to Level 3 → Section 3.2 to Section 3.7 of the CARD
• Verify that Section 3.7 of the CARD flows down to Section 3.2 of the appropriate SRDs (see the sketch below)
SRD Section 3.2 is the analysis starting point for the other CxP IV&V teams
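As a concrete illustration of the flow-down objective, a minimal sketch in Python (requirement IDs and document structures are hypothetical; the actual requirements live in the Cradle database, and this is not the team's tooling):

```python
# Hypothetical structures: each document maps a section number to the set of
# requirement IDs it contains. All IDs below are invented for illustration.
card = {
    "3.2": {"CA-0001", "CA-0002"},                        # architecture reqs
    "3.7": {"CA-0101", "CA-0102", "CA-0103", "CA-0104"},  # system allocations
}
srds = {
    "CEV-SRD": {"3.2": {"CA-0101", "CA-0102"}},
    "CLV-SRD": {"3.2": {"CA-0103"}},
}

def check_flowdown(card, srds):
    """Verify every CARD Section 3.7 allocation reappears in some SRD 3.2."""
    flowed = set().union(*(doc["3.2"] for doc in srds.values()))
    missing = card["3.7"] - flowed   # allocations that never flowed down
    extras = flowed - card["3.7"]    # SRD items with no CARD parent
    return missing, extras

missing, extras = check_flowdown(card, srds)
print("Not flowed down:", sorted(missing))  # ['CA-0104']
print("No CARD parent:", sorted(extras))    # []
```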

6. Detailed Plans Based on the Framework Support Integrated Planning and Execution
[Figure: plan structure in three parts (Part A, Part B, Part C)]

7. Framework and Planning: Best Practices/Lessons Learned
Best practices:
• CxSI spent time benchmarking modeling as part of development and extrapolating what we learned into an overall IV&V framework
• This framework has had good longevity for our team
• Because our work was a first for a key NASA initiative, we planned and performed as a group:
• Socialization of ideas facilitated innovation
• Socialization of products resulted in high rigor
• The initiative was embraced by the team
• Understanding up front how we would use the model for validation told us when we were “done enough”

8. How Do We Validate Our Model?
• The CxSI SRM
• Dependability criteria
• SRM readiness to support requirements validation
• Requirements assessment
• Peer review
• SRM: best practices/lessons learned

9. Constellation Behavior Model Hierarchy
The CxSI SRM addresses four primary mission goals: ISS, Moon Sortie, Moon Base, and Mars

10. Dependability Criteria
• Availability: the probability that a system is operating correctly and is ready to perform its desired functions
• Consistency: the property that invariants always hold true in the system
• Correctness: a characteristic of a system that precisely exhibits predictable behavior at all times, as defined by the system specifications
• Reliability: the property that a system can operate continuously without experiencing a failure
• Robustness: a characteristic of a system that is failure and fault tolerant
• Safety: the property of avoiding a catastrophic outcome when a system fails to operate correctly
• Recoverability: the ease with which a failed system can be restored to operational use
• We use a process similar to the development of the BMK in the Caffall thesis to assure completeness of the model:
• BMK development analyzed assertions to assure they addressed the seven dependability criteria (Section 8H, Analysis of Prototype)
• CxSI applied these criteria against the model instead of against assertions (see the sketch below)
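One way to picture how the criteria are applied "against the model": track which of the seven each supporting analysis has covered for each model element, and report gaps. A sketch with hypothetical element names (the slides do not describe the actual bookkeeping method):

```python
CRITERIA = ["availability", "consistency", "correctness", "reliability",
            "robustness", "safety", "recoverability"]

# Hypothetical record: model element -> criteria addressed so far by analysis.
coverage = {
    "Launch":       {"safety", "reliability", "availability"},
    "Ascent Abort": {"safety", "robustness", "recoverability"},
}

def report_gaps(coverage):
    """List dependability criteria not yet analyzed for each model element."""
    for element, done in coverage.items():
        gaps = [c for c in CRITERIA if c not in done]
        print(f"{element}: missing {gaps if gaps else 'none'}")

report_gaps(coverage)
```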

11. Sample Supporting Analysis to Ensure Dependability in Our Model
• CxSI dependability analysis required supporting analyses:
• Anytime failure/continuous services analysis, performed as a separate analysis (availability and robustness)
• Operational environment analysis
• Hazard analysis, which supports model dependability (robustness)
• End-to-end use case views further increased model dependability:
• Safety
• Consistency
• Recoverability
• Reliability

12. Sample End-to-End Analysis for Dependability in Our Model
Use cases are parsed into main success scenario (MSS) steps, with extensions:
• Level 2 MSS steps
• Assertions
• Extension conditions
• Extension handling steps (with frequency of occurrence)
• Action verbs for each actor (with frequency of occurrence)
• Objects that get acted on (with frequency of occurrence)
End-to-end summaries, built with end-to-end filter criteria, facilitate availability, consistency, reliability, safety, and recoverability analysis (a parsing sketch follows below)
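A sketch of the frequency bookkeeping the parsed-steps view implies, with invented steps (the real parsing operates on the use case documents themselves, not tuples like these):

```python
from collections import Counter

# Hypothetical parsed main-success-scenario steps: (actor, action verb, object).
steps = [
    ("CLV", "ignite",   "first stage"),
    ("CEV", "separate", "CLV"),
    ("CEV", "deploy",   "solar arrays"),
    ("CEV", "separate", "service module"),
]

# Frequency of action verbs per actor, and of objects acted on, as the
# end-to-end summaries call for.
verbs_by_actor = Counter((actor, verb) for actor, verb, _ in steps)
objects = Counter(obj for _, _, obj in steps)

for (actor, verb), n in verbs_by_actor.most_common():
    print(f"{actor} {verb}: {n}")
print(objects.most_common())
```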

13. Necessary Model Elaboration Against L2 Requirements
• An assessment of the L2 requirements was made to determine the need for elaboration of the model
• CxSI emphasizes a consistent level of model abstraction:
• Level 1: independent of architecture elements and design reference missions
• Level 2: driven by the architecture and associated behavior provided in the system ops-con
• In particular, elaboration needs for Moon Base and Mars had to be examined to determine the need for Level 2 elaboration of future system goals
• The assessment resulted in some (but not many) elaborations to our Level 1 use cases and our Level 2 launch model

14. Review of the SRM
• Peer review:
• In addition to our internal CxSI reviews, peer reviews are held when we complete major portions of our model, further increasing model confidence
• Peer reviews include a combination of domain and modeling experts from the facility; reviewers are asked to focus on various aspects of the model
• A formal schedule is used for the review; RIDs are tracked, and a PITS database is set up to track the fixes we need to incorporate
• Feedback has been great and has resulted in a better model and better processes
• Project review: CxSI is targeting early August for the CxP SE&I review
[Figure: review flow (RIDs, state machine)]

15. SRM: Best Practices, Lessons Learned
Best practices:
• Rigorous peer review (internal and external) led to confidence in our model
• Taking all comments seriously took a good amount of time (six weeks were allocated)
• The dependability criteria, and how they were used for the BMK, provided a framework for model validation
• We have a repeatable process (four months to generate the Lunar Sortie model, six weeks to generate the ISS model)
• Generating our validation plans in parallel with the model let us know when we were at the “good enough” stage
Lessons learned:
• Modeling was even more iterative than any of us expected; it was worth the time to get on the same page as a team
• We had to learn not to sweat the small stuff
• Terminology discussions got old; we created an “open issues” field, which helped
• After the fact, we are performing a rigorous mapping of the OpsCon to our model; this work could have been performed in parallel with model development
• As we better understand how to use our model, more updates are likely (they are expected and planned for)

16. How Do We Validate the CxP Level 2 Requirements?
• Validation plans
• Validation generation process and status
• Validation analysis tasks:
• SRM (behavior, interface)
• Requirements trace
• Requirements evaluation
• Other (safety, performance, design)

17. Generation of CxSI Validation Plans
• In preparation for validation analysis, we classified the CxP Level 2 requirements into the following categories:
• Behavior: requirements driven by the behavior of the system
• Interface: requirements driven by interface interactions, or necessary for interfacing across systems
• Performance: requirements that indicate thresholds of goodness
• Design Implementation: trade-based requirements; they could drive lower-level behavior, but at Level 2 they are driven by the design
• Safety
• The validation plans are tailored to these categories
• e.g., the SRM is used to validate the behavior and interface requirements, but the Loss of Mission requirement stems from trades and has a performance aspect
• Our process to generate the CxSI validation plans involved:
• An initial draft: a “blue sky” view of what needed to be done for requirements validation
• Reconciliation of this view with the Facility WBS
• Team peer review (CxSI team and JWST-Vorndran)
• (Update) Infusing CxP processes into our assessment, both as information and as a “standard”
(An illustrative classification sketch follows below.)
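Purely as an illustration of the binning idea, a keyword-based first pass (the actual classification was analyst judgment, not automated; the keyword lists and requirement text are invented):

```python
# Invented keyword hints per category; 'behavior' is the default bin.
CATEGORY_HINTS = {
    "interface":   ["interface", "exchange", "transfer"],
    "performance": ["within", "no more than", "probability", "rate"],
    "safety":      ["hazard", "abort", "loss of crew"],
}

def first_pass_bin(req_text):
    """Suggest candidate categories for a requirement; an analyst confirms."""
    text = req_text.lower()
    hits = [cat for cat, words in CATEGORY_HINTS.items()
            if any(w in text for w in words)]
    return hits or ["behavior"]

print(first_pass_bin("The CEV shall abort within 2 seconds of command."))
# ['performance', 'safety'] -- a requirement can land in several bins
```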

18. CxSI Validation Analysis Tasks Are Binned into Four Categories
1. SRM-related tasks → for behavior and interface requirements
• C1. Validate behavior requirements and interface interactions against the SRM
• C2. Interface requirements (trades and trace-related activities)
2. Requirements trace tasks → for all requirements
• C3. Level 1 to Level 2 trace (verify project trace)
• C4. Level 2 to Level 3 trace (verify project trace)
• C5. Artifact consistency: verify that CARD Section 3.7 is the same as the associated SRD Section 3.2
3. Requirements evaluation tasks → for all requirements
• C6. Requirements evaluation, per requirement
• C7. Requirements evaluation, per mission goal
• C8. Compare mission-goal requirements for completeness
4. Other tasks
• C9. Safety requirements → for safety requirements
• C10. Design implementation → for design/trade-related requirements
• C11. Performance requirements → for performance-related requirements
Note: this framework is expanded in later slides. Task IDs (C1, C2, etc.) are used in planning

19. CxSI IV&V Analysis: multiple tasks, tools, and methods were utilized to increase assurance of coverage

20. SRM-Related Validation Tasks
• The SRM is used to validate behavior requirements and interface interactions. Issues raised from this analysis require CxSI to assure that the model itself is correct
• Specific plans:
• Evaluate attributes of system behavior extracted from the behavior model to ensure complete coverage of requirements at a consistent level of abstraction. Perform a detailed correlation between the SRM and the behavior requirements from the CARD. This trace must occur in a two-way manner (see the sketch below):
• 1) The model represents all behavior requirements in the CARD, and the CARD requirements are correct (note: if the SRM needs to be updated, the CxSI process is to find a reference for the change somewhere other than the requirements, or to carry the behavior as an assumption)
• 2) All behaviors (at the given level of abstraction) are represented in the requirements
• In performing the correlation, all fields of the UC/AD should be evaluated for completeness in the requirements
• Where behavior is represented in more detail in the IV&V model than in the CARD requirements, IV&V should either:
• a) write an issue to capture the missing behavior, even though it may be in a lower-level document, or
• b) go into the lower-level requirements to determine whether the behavior is captured appropriately, and evaluate whether a (lower-level) TIM should be generated recommending that a requirement move from Level 3/4 to Level 2
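A set-based sketch of the two-way check (the correlation map and all IDs are hypothetical; in practice this data lives in the correlation database described on later slides):

```python
# Hypothetical correlation: CARD behavior requirement -> SRM elements it maps to.
req_to_model = {
    "CA-0456": {"UC-Launch", "AD-Ascent"},
    "CA-0789": set(),                  # direction 1 gap: requirement not modeled
}
model_elements = {"UC-Launch", "AD-Ascent", "UC-Abort"}

# Direction 1: every CARD behavior requirement maps to at least one SRM element.
unmodeled = {r for r, elems in req_to_model.items() if not elems}

# Direction 2: every SRM behavior is cited by at least one requirement.
cited = set().union(*req_to_model.values())
unrequired = model_elements - cited    # modeled behavior with no L2 requirement

print("Requirements with no model coverage:", sorted(unmodeled))   # ['CA-0789']
print("Modeled behavior with no requirement:", sorted(unrequired)) # ['UC-Abort']
```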

21. Validation Data Capture
Products: the CxSI IV&V SRM and the CARD requirements
• Updates to both the model and the CARD are expected
• Validation results will correspond to a snapshot of these products
• Results must be captured in a way that supports assessing change impacts and re-validation

22. SRM Behavior Validation Analysis Data Capture
• The CxSI team is preparing for validation of the CARD requirements
• This involves creating a correlation map between model elements and requirements
• Capabilities for capturing validation results are needed in the near term (the CARD validation task is scheduled for August and September 2007)
• The CxSI IV&V models developed thus far are in Microsoft Office products
• We recognize that the selected common tool set (Together) requires some augmentation for IV&V processes
• The CxSI IV&V model will be migrated to the selected tool for long-term maintenance
• Near-term solution: an Access database
• Correlation data must be captured in a way that supports validation analysis views (a many-to-many relationship)
• Ideally, this database, and the associated lessons learned from our analysis, can provide insight into the IV&V tailoring of the Together tool
Note: the “how” of this task is TBD; multiple approaches have come up in CxSI discussions. Convergence on a single method will occur after the task has begun (unlike other tasks, where the method is known before analysis starts)

23. Requirements-to-Model Correlation: Access Database (note: still under construction)
The CxSI Access database facilitates our model validation analysis. It contains the CARD requirements and the CxSI model (use cases), provides a link capability, and produces an output report correlating the CxSI model to the CARD Level 2 requirements.
• Step 1, perform correlation:
• The database contains the requirements and the model
• The analyst links requirements to the elements of the model
• Step 2, perform validation analysis:
• Database reports enhance analysis by showing the created links
• The analyst performs correctness and completeness analysis between requirements and model
• Discrepancies are analyzed and addressed (TIMs)
• Database construction:
• Analysis reports are being upgraded to include requirement/model text and a means to capture analysis
• Widow and orphan reports (for requirements or model elements with no links)
• Merge/replace with the Facility tool when appropriate
(A minimal schema sketch follows below.)
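A minimal stand-in for that correlation database using SQLite (the team used Access; the schema and IDs here are hypothetical), showing the many-to-many link table and the widow/orphan queries:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE requirement   (req_id  TEXT PRIMARY KEY, text TEXT);
    CREATE TABLE model_element (elem_id TEXT PRIMARY KEY, kind TEXT);
    CREATE TABLE link          (req_id  TEXT, elem_id TEXT);  -- many-to-many
""")
db.executemany("INSERT INTO requirement VALUES (?, ?)",
               [("CA-0456", "The system shall ..."), ("CA-0789", "...")])
db.executemany("INSERT INTO model_element VALUES (?, ?)",
               [("UC-Launch", "use case"), ("UC-Abort", "use case")])
db.execute("INSERT INTO link VALUES ('CA-0456', 'UC-Launch')")

# Widow report: requirements with no linked model element.
widows = db.execute("""SELECT req_id FROM requirement
                       WHERE req_id NOT IN (SELECT req_id FROM link)""").fetchall()

# Orphan report: model elements that no requirement points at.
orphans = db.execute("""SELECT elem_id FROM model_element
                        WHERE elem_id NOT IN (SELECT elem_id FROM link)""").fetchall()

print("Widows:", widows)    # [('CA-0789',)]
print("Orphans:", orphans)  # [('UC-Abort',)]
```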

24. Requirements Trace Validation Tasks
• Same as the trace work performed under prior WBS 3.1
• CxSI will perform the trace from Level 1→2 and Level 2→3
• At Level 3, an additional consistency check is necessary across artifacts
• Specific plans. For the provided trace, use the parent→child trace as the primary artifact:
• 1) Ensure appropriate links exist
• 2) Ensure that the linked items at the lower level constitute a correct and complete decomposition of the upper-level requirements
• 3) Review widows/orphans as a potential source of issues
• 4) Ensure that the child→parent and parent→child traces show equivalent information (see the sketch below)
• In analysis setup:
• Reconcile results with the trace performed by the Civil Servants
• Ensure rationale explanations neither contradict the trace nor are needed to complete it
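A sketch of plan steps 3 and 4, comparing the two trace exports (IDs invented; the real traces come from the project's Cradle database):

```python
# Hypothetical traces exported in both directions.
parent_to_child = {"CA-0001": {"CA-0101", "CA-0102"}, "CA-0002": set()}
child_to_parent = {"CA-0101": {"CA-0001"}, "CA-0102": {"CA-0001"},
                   "CA-0103": set()}

def invert(trace):
    """Flip a trace mapping so the two exports can be compared directly."""
    out = {}
    for src, dests in trace.items():
        for d in dests:
            out.setdefault(d, set()).add(src)
    return out

# Step 4: both exports must describe the same set of links.
equivalent = invert(parent_to_child) == {c: p for c, p in child_to_parent.items() if p}

# Step 3: widows (untraced parents) and orphans (untraced children).
widows  = [p for p, c in parent_to_child.items() if not c]   # ['CA-0002']
orphans = [c for c, p in child_to_parent.items() if not p]   # ['CA-0103']
print(equivalent, widows, orphans)
```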

25. Requirements Evaluation Validation Tasks
• Same as the requirements evaluation work performed under prior WBS 3.2
• CxSI will perform evaluations in three areas:
• On a requirement-by-requirement basis (correctness)
• Within a mission goal (correctness)
• Across mission goals (consistency)
• Specific plans:
• Per requirement: is the requirement atomic (a single thought), precise (unambiguous), and testable; in short, a good requirement, such that if you were designing to it you would know what was expected, and if not, why not? (See the Req't Eval Worksheet; the specific criteria may evolve. A heuristic sketch follows below.)
• Additionally, check that the requirement and its rationale are consistent (a lower-level issue), and ensure that information in the rationale does not belong in the requirement itself (a higher-level completeness issue)
• Per mission goal: evaluate requirements by mission goal to ensure that multiple interpretations cannot occur across suites of requirements and that contradictions do not exist within requirements. This check ensures consistency within a mission
• Across mission goals: correlate related requirements for each mission against each other to ensure a consistent level of depth across the four missions: ISS, Moon Sortie, Moon Base, and Mars
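The per-requirement sketch referenced above, with a few heuristic red flags (heuristics only; the real evaluation is analyst judgment against the worksheet criteria, and the keyword lists are invented):

```python
import re

def evaluate(req_text):
    """Flag common shall-statement problems for an analyst to review."""
    flags = []
    if " and " in req_text or ";" in req_text:
        flags.append("possibly not atomic (compound statement)")
    for word in ("adequate", "appropriate", "as required", "user-friendly"):
        if word in req_text.lower():
            flags.append(f"imprecise term: '{word}'")
    if not re.search(r"\bshall\b", req_text):
        flags.append("no 'shall' -- may not be testable as written")
    return flags

print(evaluate("The system shall provide adequate power and thermal control."))
# ['possibly not atomic (compound statement)', "imprecise term: 'adequate'"]
```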

26. Other (Safety, Design, Performance) Validation Tasks
• Performed under prior WBS 3.2
• A combination of:
• Tracing to standards/policies
• Identifying the underlying trade or rationale
• Assessing technology readiness
• Domain understanding
Specific plans:
• Safety: Shuttle and ISS safety requirements flow from Program standards documents, which are probably related to NASA standards. CxP should have something analogous (if it does not, we should use the Shuttle/ISS standards). The analysis performed for the Level 2 requirements involved adherence to Constellation safety standards. Safety requirements are evaluated against the identified aspects/criteria (Caffall) to assess them in terms of system-software safety: fault avoidance, fault warning, fault correction, fault tolerance, fail operational, and fail safe. Note that some safety requirements are references to other documents/standards; as part of this activity, a decomposition of the referenced documents is likely necessary for verification. Also, some safety requirements overlap with, and will also be mapped into, performance, behavior, or design requirements.
• Design implementation requirements: analyze design implementation requirements to ensure they are appropriate constraints on system behavior at this level for subsequent decomposition. Provide rationale for why design-based requirements are appropriate, e.g., traced to a trade study. If a requirement cannot be traced to a trade study, it is treated as an IV&V assumption and carried as such.
• Performance requirements: for each performance requirement, analyze whether the requirement is appropriate in defining the system goal. Appropriateness is viewed in two ways: 1) the applicability of the requirement to meeting the overall performance objective (assessed via analysis), and 2) the reasonableness of the requirement from a technology-readiness point of view (for CxSI safety performance, extrapolate these requirements from Shuttle/ISS safety).

27. Where We Are Now
• Finalizing a report with all analysis results:
• C1, C8: missing requirements
• C6, C10: process problems manifesting as technical problems
• C1, C2: SRM ≠ requirement level of detail
• C10: O&M requirements bin
• C3, C4, C5: requirement traceability/consistency
• C6: requirement clarity vs. requirement rationale vs. requirement verification
• We always need to incorporate the Project's plans/standards into the IV&V analysis
• Next:
• Continue C1 with the L3 requirements
• Update all analyses with future CARD/SRD releases
• Apply more detailed scrutiny to the interfaces
• Use the validation results to focus future efforts

28. Conclusion
• Charts are available on the Validation Workshop website
• Contact the CxSI team with questions/comments:
• Christina.D.Moats@nasa.gov
• cxsi_team@ivv.nasa.gov

29. Appendix 1: SRM Development (Part A), including Dependability Checks on the SRM

  30. Appendix 2 – Validation Plans
