Requirements Verification
Bill Fournier
billfournier4@gmail.com | 202 255 0760
Oct 2014
Agenda
• Definitions
• Confidence
• Top-level planning concepts
• Requirements understanding & traces
• Relationship to other topics
• Similarities, differences, and synergy with other types of verification
• Methods
• Events
• Redo / back-ups and flexibility
• Closure
• Lessons learned
Verification
• Definition
  • Confirm or prove by Verifiable Objective Evidence (VOE) that the requirement has been met with the final configuration
  • Answers the question, "Did we build it right?"

Verifiable Objective Evidence
• Definition: factual proof that a requirement has been met or exceeded
  • Documented trace and path to the evidence
  • As objective as possible
  • Evidence beyond a reasonable doubt
• Legalese: verify the "shall"s, not the "will"s, "may"s, or "should"s
• Perspective: a four-step rigorous quality process of plan, do, analyze, close
  • Planning and influencing happen early; execution is the right side of the Verification "Vee"
• Types of verification
  • GMD (Ground-based Midcourse Defense) example
Requirements
• What the system must do
  • Not the Statement of Work (SOW)
• Stakeholder vs. technical requirements
• Requirements Understanding (RU) or context statements are needed when a baselined requirement has more than one reasonable interpretation
• Parallel sections 3 (requirements) and 4 (verification) of the specification
• RVTM
• Accuracy
• Confidence
• States, modes
  • Normal, stressing, both, all
• Compound requirements
Requirements Tracing
• Up and down
  • Requirements trace from parent to children and from children to parents
• Parents
  • Requirements which spawn other, lower-level requirements
• Children
  • Requirements spawned from a higher-level requirement
• Orphans
  • Requirements at the second level and below without parents
• Tight
  • All children and grandchildren completely make up a parent requirement
  • Large implication for reusing lower-level (subsystem) verification closures
• A minimal trace check is sketched below
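As a minimal sketch of the orphan check, assuming a hypothetical in-memory trace keyed by requirement ID (a real project would query its requirements-management tool instead):

```python
# Hypothetical trace: {child_id: parent_id or None}.
trace = {
    "SYS-1":  None,      # top-level requirement, no parent expected
    "SUB-11": "SYS-1",   # child of SYS-1
    "SUB-12": "SYS-1",
    "SUB-99": None,      # second-level requirement with no parent: an orphan
}

TOP_LEVEL = {"SYS-1"}    # hypothetical set of legitimate top-level IDs

def find_orphans(trace, top_level):
    """Requirements below the top level that have no parent."""
    return [rid for rid, parent in trace.items()
            if parent is None and rid not in top_level]

def children_of(trace, parent_id):
    """All requirements that claim parent_id as their parent."""
    return [rid for rid, parent in trace.items() if parent == parent_id]

print(find_orphans(trace, TOP_LEVEL))   # ['SUB-99']
print(children_of(trace, "SYS-1"))      # ['SUB-11', 'SUB-12']
```

Note that a "tight" trace additionally requires that the children together cover the parent's full scope, which is an engineering judgment no structural check like this can make on its own.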
Validation
• Definition
  • Assessing that we meet what the user needs
• Perspective
  • Did we do the right thing (big picture), not did we do what we planned in the specification
  • The difference from verification: more nebulous, more sampling, and focused on the user environment and how the system is really used
• Types
  • Requirements validation (system level)
  • System validation
  • Software Independent Verification & Validation (IV&V)
  • Model & Simulation Verification, Validation & Accreditation (VV&A)
Relationship to Validation
• Verification handles 1000s of requirements versus validation's 10s
• Verification assumes the SEs' translation of stakeholder/user needs into specifications was correct
• Verification assumes the user will use the system the way the SE assumed, and with the other related systems assumed
• Validation does not make these translation, user, and related-system assumptions; it tries to see the system really do the job, end to end, at a high level
Accreditation
• Definition: approval by an authority that an M&S is adequate for its intended usage
• DoD requires it for an M&S used for verification
• Analyzing the data pedigree of the M&S for its intended use
  • Ideally physics-based
  • Test data
  • Correlation with a more credible M&S
Assessment
• Definition
  • Assess the ability and progress to meet requirements with less than the final configuration and/or verification environment
• Perspective
  • Risk reduction and "what if"
• Types
  • Risk reduction, like a TPM, as an early warning indicator for the system
  • "Verification light" to reduce the risk that formal verification will not close well
Configuration
• Definition
  • An item that is controlled
• Types: baselines with increasing levels of control
  • Functional
  • Allocated
  • Production…
• No one size fits all
  • Depends on technology, maturity, supportability, and acquisition/contracting strategy
Confidence
• Definition
  • Degree of comfort that it is real
• Conditions
  • Temperature, threat, states, modes
• Operator
  • Field operator, engineer, scientist
• Pedigree
  • Comfort with the data's applicability
  • With how much precision should you state results?
• M&S fidelity: low to high
• Missile Defense Agency examples
• Interrelated requirements
• Runs
  • Design of Experiments (DOE)
  • Normal / exponential distributions
  • Bayesian statistics (see the sketch below)
• Key verification issue: confidence per dollar
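As a hedged illustration of the Bayesian view of confidence, the sketch below (using the third-party scipy library) computes the posterior probability that a requirement's true success rate meets a threshold, assuming pass/fail runs and a uniform prior; the run counts and the 0.90 threshold are invented for illustration, not taken from the slides.

```python
from scipy.stats import beta

n, k = 20, 19          # hypothetical: 19 successes in 20 verification runs
p_req = 0.90           # hypothetical requirement: >= 90% success probability

# With a uniform Beta(1, 1) prior, the posterior over the true success
# probability after k successes in n runs is Beta(1 + k, 1 + n - k).
posterior = beta(1 + k, 1 + n - k)

# Confidence that the requirement is actually met, given the data.
confidence = posterior.sf(p_req)
print(f"P(true success rate >= {p_req}) = {confidence:.3f}")
```

More runs raise this number slowly, which is one concrete way to frame the "confidence per dollar" trade.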
Rigor
• Definition
  • Repeatability, exactness, and realism, with stated assumptions and conditions
• Rigor
  • Requires planning and being methodical
  • May require more money up front
  • May save money by preventing rework
  • Should allow an identical redo by another organization, with the same results within randomness and measurement error
• Repeatability: can an outside organization repeat it?
• Assumptions agreed with the customer as reasonable and acceptable
• The rigor assumed relates to the confidence the customer is willing to pay for
Verification Cost
Cost drivers include:
• Method choice
• Tools and Computer-Aided Systems Engineering (CASE) tools
• Environment
• Standard accuracies
• Confidence
• Not process
• Event efficiencies
• Data reuse / trace
• Envelope assumption
• Requirements understanding
• Environment: normal vs. 3 standard deviations vs. worst case
• Regression / reverification
• Navy five interrelated requirements example
Process Choice, Timing & Training
• The INCOSE Handbook's Verification process task is, in effect, the verification closure process performed after requirements are done and a good RVTM exists
• Factors that influence the choice
  • Experience
  • Schedule
  • Cost
  • Quality
  • Confidence
• Process training
Approaches, Perspectives, Themes
• Trust but verify
• Communication
  • Clear, consistent, one definition, requirements understanding
• Quality and configuration-management perspective
• Tailoring
  • System, testability, rigor, cost, variability, stressing, and confidence
• Risk-based prioritization
• Plan early at appropriate levels, a lead time away; levels of planning relate to design baseline levels
• Maximize risk-based confidence per dollar
Other Types of Verification
• Requirements verification vs. SW IV&V vs. M&S VV&A
• Similarities
  • Trying to answer the same questions for the system, software, or M&S
• Differences
  • Narrowing scope and degree of formality; all requirements versus a sampling of them
• Synergy
  • With care, events and data can sometimes be shared among the three types of verification
Why Verification Methods?
Four main reasons for using verification methods:
• Plan events: lump similar requirements together
• Identify long-lead tool items (realize early that a new M&S or test tool is needed)
• Estimate price / schedule effort (early estimates from the number of requirements per method, as tallied below)
• Take advantage of specialization (for example, test engineers or performance analysts)
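A tiny sketch of the estimating use: tally planned requirements per method to size events and budgets early. The requirement IDs and method assignments here are hypothetical.

```python
from collections import Counter

# Hypothetical planned method per requirement.
planned_methods = {
    "SYS-1": "Test",
    "SYS-2": "Analysis",
    "SYS-3": "Analysis",
    "SYS-4": "Inspection",
    "SYS-5": "Demonstration",
}

print(Counter(planned_methods.values()))
# Counter({'Analysis': 2, 'Test': 1, 'Inspection': 1, 'Demonstration': 1})
```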
Methods
• Definition
  • A top-level technique to provide data
  • The point here is to reduce method overlaps
• Reuse with integration and validation
• Decisions involved in selecting methods: cost, schedule, confidence, feasibility
• Test events may include both test and demonstration methods
• An old definition is still acceptable: tests are objective, demos are subjective
• INCOSE Handbook definitions, paraphrased:
  • Test: active test with heavy instrumentation, quantitative requirements
  • Demonstration: active test with limited instrumentation, qualitative requirements
  • Analysis: mathematical formula or M&S
  • Inspection (Examination): passive visual review of product, process, or data
  • Certification
Verification Plan
• Start early at the top level; include the verification environment
• Relationship to other plans
• Maturity
• A useful document
  • Tailored
  • Consistent
• Assumptions and resources
• Standards
• Process and closure process
• Dependencies, long-lead items, critical items
• At least three levels of planning
Planning for Methods
• Confidence required vs. cost
• Priority of requirements
• Some obvious choices
• Some families of requirements can be easily combined
• Some requirements need multiple methods
• Requirements Verification Traceability Matrix (RVTM); a minimal sketch follows
• Organizational preferences
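The sketch below shows one minimal way an RVTM row could be represented; the fields echo the closure topics later in the deck, but the schema, IDs, events, and criteria are assumptions for illustration, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class RvtmRow:
    req_id: str
    method: str            # Test, Demonstration, Analysis, Inspection
    event: str             # verification event that produces the evidence
    success_criteria: str
    status: str = "Open"   # e.g., Open -> Planned -> Closed

# Hypothetical example rows.
rvtm = [
    RvtmRow("SYS-1", "Test", "Flight Test 3", "CEP <= 10 m"),
    RvtmRow("SYS-2", "Analysis", "Performance M&S run set A", "Pd >= 0.9"),
]

open_rows = [r.req_id for r in rvtm if r.status != "Closed"]
print(open_rows)   # ['SYS-1', 'SYS-2']
```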
Process Selection Criteria
• Quality of the process
  • Most processes are of adequate quality
• Other examples:
  • Experience with the process
  • Tools / databases for the process
  • Cost and fit for the system being verified
  • Verification environment
  • Verification risks of the top few percent of requirements
Tools
• Plan requirements and lead time for:
  • M&S and analysis tools
  • Test tools (screen captures, instrumentation on the test unit while flying, outside sensors, truth sources)
  • Inspection tools
  • Requirements / trace and version tools
  • CAD/CAM
  • CASE tools
• Tool calibration against other tools, physics, or more credible events
• Version-management tools
• Metrics
Test
• Used beyond verification for assessment and risk reduction
• The test method and test events are different but overlap
• Test vs. demonstration
• Quantitative requirements
• The most popular method, and the best comfort
• Can be very expensive or impossible
• Challenge: normal vs. stressing conditions and envelope concerns
• Challenge: requirements may interrelate, and stressing all of them at once may be impossible
Analysis
• Probability distributions
• Mathematical analysis
• M&S analysis: allows lots of runs and unsafe conditions
  • Repeatable (see the sketch below)
  • Assumptions
  • Calibration / anchoring
  • M&S fidelity
  • Accreditation for intended usage
• Cost is usually primarily tool cost
• Bayesian statistics and/or Design of Experiments (DOE)
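To make the repeatability point concrete, here is a minimal seeded Monte Carlo sketch: with a fixed seed, another organization rerunning the same run matrix gets exactly the same counts. The toy model and the 0.95 criterion are assumptions for illustration, not a real M&S.

```python
import random

def simulate_one(rng):
    """Stand-in for one M&S run; True if the run meets the criterion."""
    return rng.gauss(mu=1.0, sigma=0.05) >= 0.95

rng = random.Random(42)          # fixed seed -> repeatable run matrix
runs = 10_000
successes = sum(simulate_one(rng) for _ in range(runs))
print(f"{successes}/{runs} runs met the criterion "
      f"({successes / runs:.3f} observed success rate)")
```

Recording the seed, tool versions, and assumptions alongside the run matrix is what lets the redo be identical within randomness and measurement error.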
Detailed Analysis Plans
• Intermediate level
• Assumptions
• Tools
• Run matrix
• Analysis method
• Ability for someone else to repeat
• Resources
• Tie to verification
Inspection
• Product or data examination
• Preplan the qualification of the inspector and whether the data will be easily available
• Lower-level data roll-up
• Lowest-cost method, tied with mathematical analysis
• More common at higher levels of systems
• More common for design-consideration requirements
• Key question in the trace roll-up: do the children fully equal the parent?
Events
• Definition
  • A specific, formal set of activities to generate data
• Reasons
  • Lump similar requirements together, use specialization, and plan events
  • Reuse with integration and validation
• "Test" definition confusion: event vs. method vs. test/verification
  • Test serves verification, assessment, and risk-reduction purposes
• Common types for verification
  • Flight, sea, ground, and space tests
  • Qualification test
  • Integration test
  • Environmental test
  • Analysis events
    • Performance, communication, Reliability, Availability, Maintainability (RAM)
  • Inspection events
    • Quality, design, standards compliance, and lower-level data
  • Maintenance demonstration
Event Types
• Test / demonstration
  • Development
  • Qualification
  • Environment
  • RMA
  • Acceptance
  • Operational
• Part / subsystem / system / system-of-systems levels
• Analysis events
• Inspection events
• Tag-on events
Verification Closure Schedule
• Dependent on event reports and configuration
• Key is picking the right events
  • Back-up events if a redo is needed
• All requirement verifications are not created equal
  • 80/20 rule
  • Not linear
• What is your long pole?
Closure Forms / Topics for a Closure Report
• Method(s)
• Tool(s)
• Success criteria
• Test cases / scenarios
• Test conditions
• Assumptions / requirements understanding
• Summary
• Individuals involved
• Data source
• Configuration
• Rationale
• Events
• Constraints
Verification Closure Roll-Up / Trace
• Databases
• Tools
• Requirements wording & context
• Verification method definition
• Trace (tight); a roll-up sketch follows
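A minimal sketch of the roll-up logic under a tight trace, assuming hypothetical child closure statuses: a parent closes only when every one of its children has closed.

```python
children = {"SYS-1": ["SUB-11", "SUB-12"]}     # parent -> child IDs
closed = {"SUB-11": True, "SUB-12": False}     # child closure status

def parent_closable(parent_id):
    """A parent can close only if all of its children are closed."""
    return all(closed.get(c, False) for c in children[parent_id])

print(parent_closable("SYS-1"))   # False: SUB-12 is still open
```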
Cost
• Automation with quality control
• Sampling with independent repeats
• Automated test
• Tolerances and conditions
• Number of runs (see the arithmetic sketch below)
• Trace and verification-closure reuse
• Combining requirements into shared events
• Multipurpose events and their cost risk
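On number of runs, one classic piece of arithmetic is the zero-failure demonstration test: to show reliability R at confidence C with no failures allowed, n >= ln(1 - C) / ln(R). The sketch below uses illustrative 90%/90% targets, not figures from the slides.

```python
import math

def zero_failure_runs(reliability, confidence):
    """Runs with no failures needed to show `reliability` at `confidence`."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

print(zero_failure_runs(0.90, 0.90))   # 22
```

Small increases in the required reliability or confidence drive the run count, and hence the cost, up sharply.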
Redo / Reverify
• Triggered by failure (a failed event or no event data) or by change
• More than just software regression testing
  • Some people call this regression
  • You cannot 100% test software
• Scope to the portion changed or failed
• Impact / risk assessment
• Replan
• Rerun / redo
  • Rerun one run, all the original runs, or something in between (a selection sketch follows)
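A minimal selection sketch, assuming a hypothetical map from each requirement's closure evidence to the components it depends on; anything touching a changed component is flagged for reverification.

```python
# Hypothetical: which components each requirement's evidence depends on.
evidence_depends_on = {
    "SYS-1": {"guidance_sw", "airframe"},
    "SYS-2": {"guidance_sw"},
    "SYS-3": {"radar"},
}
changed = {"guidance_sw"}

# Flag requirements whose evidence touches any changed component.
to_reverify = sorted(r for r, deps in evidence_depends_on.items()
                     if deps & changed)
print(to_reverify)   # ['SYS-1', 'SYS-2']
```

How many of the original runs to repeat for each flagged requirement remains an impact/risk judgment, from a single spot-check run up to the full original run matrix.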
Lessons Learned
• Good requirements
• All requirements are not equal
• Flexibility of methods and events
• Willingness to trade off a weighted summation of confidence and cost per requirement
• Importance of clear, consistent communication
• Long-term planning with an appropriate level of detail and long-lead items for verification