New Development Techniques: New Challenges for Verification and Validation
Mats Heimdahl
Critical Systems Research Group
Department of Computer Science and Engineering
University of Minnesota
4-192 EE/CS; 200 Union Street SE
Minneapolis, MN 55455
How we Develop Software
[V-model diagram] Concept Formation → Requirements → Specification → System Design → Implementation → Object Code, with Analysis alongside development and Unit Test → Integration Test → System Test on the verification side.
Validation and Verification
Verification: Are we building the thing right?
Validation: Are we building the right thing?
[Diagram spanning Concept Formation, Requirements, Specification, System Design, Implementation, and Integration]
Model-Based Development
[Diagram] The specification model is the central artifact: properties, analysis, testing, prototyping, and visualization all work from the model, and code is derived from it.
Model-Based Development Tools
Commercial Products:
• Esterel Studio and SCADE Studio from Esterel Technologies
• SpecTRM from Safeware Engineering
• Rhapsody from I-Logix
• Simulink and Stateflow from The MathWorks
• Rose Real-Time from Rational
• Etc., etc.
Research Tools (many): RSML-e and Nimbus
• Simulations of the environment
• RSML-e formal models (~20 running concurrently)
How we Will Develop Software
[Diagram] Concept Formation → Requirements and Properties → System Specification/Model → Implementation → Integration, with Analysis at the specification level and testing as Specification Test, Integration Test, and System Test.
FGS/FMS Mode Logic: RSML-e and Nimbus
• Simulations of the environment
• RSML-e formal models (~20 running concurrently)
Early Validation of Requirements Using Model-Checking (NuSMV)
• Proved Over 300 Properties in Less Than an Hour
• Found Several Errors in Our Models Using Model-Checking
• Substantially Revised the Shalls to Correct the Errors
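As a toy illustration of what a model checker does, here is a minimal explicit-state sketch in Python: exhaustive breadth-first exploration of every reachable state of a made-up three-mode machine (not the actual FGS mode logic), checking a safety invariant in each state. NuSMV works symbolically over a far richer input language; only the exhaustive-exploration idea is shown.

```python
from collections import deque

# Toy flight-guidance mode logic (hypothetical, NOT the FGS model):
# OFF -> ARMED -> ACTIVE, with a 'disengage' event from anywhere.
EVENTS = ["arm", "capture", "disengage"]

def step(mode, event):
    """Next mode for a given event; unmatched events leave the mode unchanged."""
    if event == "disengage":
        return "OFF"
    if (mode, event) == ("OFF", "arm"):
        return "ARMED"
    if (mode, event) == ("ARMED", "capture"):
        return "ACTIVE"
    return mode

def invariant(mode):
    # Safety property: the mode variable never leaves its declared range.
    return mode in {"OFF", "ARMED", "ACTIVE"}

def model_check(initial="OFF"):
    """Breadth-first exploration of all reachable states, as a model checker does."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        mode = frontier.popleft()
        if not invariant(mode):
            return False, mode          # counterexample state
        for event in EVENTS:
            nxt = step(mode, event)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

ok, cex = model_check()
print(ok)  # True: the invariant holds in every reachable state
```

The key property of this procedure, and of model checking generally, is that it either proves the invariant over all reachable states or returns a concrete counterexample.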
Early Validation of Requirements Using Theorem Proving (PVS) • Proved Several Hundred Properties Using PVS • More Time Consuming than Model-Checking • Use When Model-Checking Won’t Work
A Simplified Development Model
[Timeline chart] Requirements and Specification → Code (with Unit Test) → System Test, plotted over Time.
Ongoing Research
[Diagram annotations]
• Specification models and notations: RSML-e, SCR, SpecTRM, Statecharts, Esterel, SCADE, Simulink, UML, etc.
• Research groups: Minnesota, Pennsylvania, George Mason, NRL, NASA Ames, etc.; CMU, SRI, Stanford, UC Berkeley, VERIMAG, NASA, etc.
• Code: proof-carrying code, provably correct compilers, test for correctness
Problems…
• Tested enough? Can we trust the results?
• Can we trust the execution environment?
• Are the languages usable (syntax and semantics)?
• Can the tools play nice together?
• Can we really trust the code?
Benefits of Modeling
[Chart: Savings vs. Time, curves annotated “Fewer Bugs”]
Code Generation
[Chart: Savings vs. Time, curves annotated “Fewer Bugs”] Coding effort greatly reduced.
Qualified Code Generation (theory)
[Chart: Savings vs. Time] Unit testing is eliminated for generated code and moved to the specification level.
Code Generation Concerns
• Is our model “right”?
• Can we trust the execution environment?
• Can we trust our analysis tools?
• Can we trust our properties?
• Can we trust the code generator?
[Diagram spanning Concept Formation, Requirements and Properties, System Specification/Model, Implementation, and Integration]
“Correct” Code Generation
• Provably correct compilers: very hard (and often not convincing)
• Proof-carrying code: total correctness required
• Base all specification testing on the generated code: lose the benefits of working at the specification level
• Generate test suites from the specification: compare specification behavior with the generated code to better trust your specification testing; unit testing is then not eliminated, but completely automated
[Diagram: Specification/Model generates Specification-Based Tests and implementation code; Specification Output is compared with Implementation Output]
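The comparison idea (run the same generated tests through the executable specification and the generated code, and flag any divergence) can be sketched in Python. Both step functions below are hypothetical stand-ins for illustration, not RSML-e or a real code generator.

```python
import itertools

# Hypothetical example: the same toy mode logic expressed twice --
# once as the executable "specification", once as "generated code".
def spec_step(mode, event):
    """Executable specification (reference semantics)."""
    if event == "disengage":
        return "OFF"
    if (mode, event) == ("OFF", "arm"):
        return "ARMED"
    if (mode, event) == ("ARMED", "capture"):
        return "ACTIVE"
    return mode

# "Generated code": a table-driven implementation, as a code generator
# might emit. Any error in the table is caught by the comparison below.
TABLE = {("OFF", "arm"): "ARMED", ("ARMED", "capture"): "ACTIVE"}

def generated_step(mode, event):
    if event == "disengage":
        return "OFF"
    return TABLE.get((mode, event), mode)

def spec_based_tests(length=4, events=("arm", "capture", "disengage")):
    """Generate all event sequences up to a given length as test cases."""
    for n in range(1, length + 1):
        yield from itertools.product(events, repeat=n)

def compare(initial="OFF"):
    """Run every generated test on both versions; a mismatch flags a
    code-generator (or execution-environment) fault."""
    for seq in spec_based_tests():
        s = i = initial
        for ev in seq:
            s, i = spec_step(s, ev), generated_step(i, ev)
            if s != i:
                return seq  # failing test case
    return None

print(compare())  # None: implementation agrees with the specification
```

Because the test cases are generated mechanically from the specification, the "unit testing" of the generated code costs essentially nothing once the harness exists.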
Specification Testing
• Certify the execution environment? Too costly and probably impossible
• Specification-based testing: any discrepancy means either the code generator is wrong, the execution environment is wrong, or the target platform is faulty
• When have we tested enough? Specification coverage criteria
• What is adequate coverage? Criteria good for measurement are not good for generation: tests may technically cover the specification, yet be useless
• Do we reveal faults? A tradeoff between the impossible and the inadequate
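Specification coverage can be made concrete with a small sketch: measuring transition coverage of a test suite against a toy mode-logic model. The model and the criterion are illustrative assumptions only; actual RSML-e coverage criteria are richer. The example also shows how a plausible-looking suite can cover well under half the transitions.

```python
# Toy specification as a transition table (illustrative, not RSML-e):
TRANSITIONS = {  # (source mode, event) -> target mode
    ("OFF", "arm"): "ARMED",
    ("ARMED", "capture"): "ACTIVE",
    ("OFF", "disengage"): "OFF",
    ("ARMED", "disengage"): "OFF",
    ("ACTIVE", "disengage"): "OFF",
}

def step(mode, event):
    return TRANSITIONS.get((mode, event), mode)

def transition_coverage(test_suite, initial="OFF"):
    """Fraction of specification transitions exercised by a test suite."""
    covered = set()
    for seq in test_suite:
        mode = initial
        for event in seq:
            if (mode, event) in TRANSITIONS:
                covered.add((mode, event))
            mode = step(mode, event)
    return len(covered) / len(TRANSITIONS)

# A suite that drives the model to its final mode still misses transitions:
weak_suite = [("arm", "capture")]
print(transition_coverage(weak_suite))   # 0.4 -- only 2 of 5 transitions

full_suite = [("arm", "capture", "disengage"),
              ("arm", "disengage"),
              ("disengage",)]
print(transition_coverage(full_suite))   # 1.0
```

Note that full transition coverage says nothing about whether the tests check meaningful outputs; that gap is exactly the measurement-versus-generation tradeoff on the slide.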
Proof Techniques (theory)
[Chart: Savings vs. Time] Testing is reduced, since properties are proved correct at the specification stage (proofs performed early).
Verification Trust
• We need properties (requirements)!!! Often lost in the modeling “frenzy”
• How do we trust our proofs?
• Does proof validity carry over to the production environment?
[Diagram spanning Concept Formation, Requirements and Properties, System Specification/Model, Implementation, and Integration]
Proof Techniques
• Certify analysis tools? Too costly and probably impossible
• Use redundant proof paths: technically feasible, but is the redundancy “trustworthy”?
• Cost… automation is key; must keep analysis cost under control
• Generate test suites from the specification: low cost, since it is already done for the code generator
• Many languages and many analysis techniques: trusted translators?
[Diagram: RSML-e translated to PVS (theorem prover), SMV (model checker), and SAL (state exploration)]
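The redundant-proof-path idea can be sketched as two independent checking procedures for the same property whose verdicts are cross-checked; disagreement signals a fault somewhere in the toolchain. The model and both procedures are toy stand-ins, not the PVS/SMV/SAL back ends.

```python
from collections import deque
from itertools import product

# Toy mode logic shared by both proof paths (illustrative only).
MODES = ["OFF", "ARMED", "ACTIVE"]
EVENTS = ["arm", "capture", "disengage"]

def step(mode, event):
    if event == "disengage":
        return "OFF"
    if (mode, event) == ("OFF", "arm"):
        return "ARMED"
    if (mode, event) == ("ARMED", "capture"):
        return "ACTIVE"
    return mode

def invariant(mode):
    # Range invariant (deliberately simple; a real model has richer properties).
    return mode in MODES

def check_by_exploration(initial="OFF"):
    """Path 1: breadth-first reachable-state exploration."""
    seen, todo = {initial}, deque([initial])
    while todo:
        m = todo.popleft()
        if not invariant(m):
            return False
        for e in EVENTS:
            n = step(m, e)
            if n not in seen:
                seen.add(n)
                todo.append(n)
    return True

def check_by_bounded_paths(initial="OFF", depth=6):
    """Path 2: independent procedure enumerating all event sequences
    up to a bound and checking the invariant along each path."""
    for n in range(depth + 1):
        for seq in product(EVENTS, repeat=n):
            m = initial
            for e in seq:
                m = step(m, e)
                if not invariant(m):
                    return False
    return True

# Redundant verdicts must agree; disagreement flags a toolchain fault.
print(check_by_exploration() == check_by_bounded_paths())  # True
```

The value of the redundancy comes from the two procedures sharing as little code as possible, which is exactly why translating one RSML-e model to several independent analysis tools is attractive.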
Proof Techniques (worst case)
[Chart: Savings vs. Time] An added burden that cannot be leveraged later; most analysis is neither easy nor cheap!
Regression Verification
• Iterated weekly? Daily? Hourly?
• 100s, if not 1000s, of properties over a large, evolving model
• Abstraction cost amortized
• Impact of change on the abstraction
• Approximate techniques in day-to-day activities
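One way hourly regression verification could stay affordable is to cache each property's verdict keyed by a fingerprint of just the model slice it depends on, and re-run the expensive analysis only when that slice changes. The dependency tracking and fingerprint scheme below are hypothetical sketches, not features of the tools discussed in the talk.

```python
import hashlib

def fingerprint(model, deps):
    """Hash of only the model elements a property depends on."""
    text = "".join(f"{k}={model[k]};" for k in sorted(deps))
    return hashlib.sha256(text.encode()).hexdigest()

class RegressionVerifier:
    def __init__(self, check):
        self.check = check   # the (expensive) analysis back end
        self.cache = {}      # property name -> (fingerprint, result)

    def verify(self, name, model, deps):
        fp = fingerprint(model, deps)
        hit = self.cache.get(name)
        if hit and hit[0] == fp:
            return hit[1], "cached"      # model slice unchanged: reuse result
        result = self.check(name, model) # slice changed: re-run the analysis
        self.cache[name] = (fp, result)
        return result, "re-verified"

# Usage with a stand-in checker that always succeeds:
rv = RegressionVerifier(check=lambda name, model: True)
model = {"mode_logic": "v1", "autopilot": "v1"}
print(rv.verify("P1", model, deps=["mode_logic"]))  # (True, 're-verified')
model["autopilot"] = "v2"                           # unrelated change
print(rv.verify("P1", model, deps=["mode_logic"]))  # (True, 'cached')
```

With hundreds of properties, most edits touch only a few slices, so most verdicts come from the cache and the amortized cost per iteration stays small.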
Can We Achieve the Goal? Yes!
• Redundant proof process (PVS, SMV, Prover, SAL, …)
• Abbreviated system testing augmented with generated tests
• Specification testing with test-case generation
• Automated unit testing (to MC/DC?) to check the code generator and the specification execution environment
• Verifiable code generator
[Chart: Savings vs. Time]
Perfection is Not Necessary
• We only need to be better than what we are now (fewer missed faults)…
• How do we demonstrate this?
• Empirical studies are of great importance
Education of Regulatory Agencies • Regulatory agencies are very conservative • And rightly so… • Avionics software is very good • We need to understand regulatory and industry concerns to get our techniques into practice • We need to have convincing evidence that our techniques work and are effective
New Challenges for V&V
• Validate models
• The models must satisfy the “real” requirements
• Validate the properties used in analysis
• Model testing is crucial to success
• Validate tools
• We will rely heavily on tools for model validation; can we trust them?
• Creative use of testing is necessary
• Verify and validate generated code
• Can we trust that the translation was correct?
• Test automation is crucial
• Includes output to analysis tools
• Adapt to the various modeling notations
• Models will not come in one language
• Translation between notations and tools