
New Development Techniques: New Challenges for Verification and Validation



Presentation Transcript


  1. New Development Techniques: New Challenges for Verification and Validation
Mats Heimdahl
Critical Systems Research Group
Department of Computer Science and Engineering
University of Minnesota
4-192 EE/CS; 200 Union Street SE
Minneapolis, MN 55455

  2. Domain of Concern

  3. How We Develop Software
• Concept Formation → Requirements Specification → System Design → Implementation → Integration
• Testing against each level: Unit Test (object code), Integration Test, System Test
• Analysis throughout

  4. Validation and Verification
• Verification: Are we building the thing right?
• Validation: Are we building the right thing?
• Spans Concept Formation, Requirements Specification, System Design, Implementation, and Integration

  5. Model-Based Development
[Figure: the specification model at the center, supporting properties, analysis, testing, prototyping, visualization, and code]

  6. Model-Based Development Tools
Commercial products:
• Esterel Studio and SCADE Studio from Esterel Technologies
• SpecTRM from Safeware Engineering
• Rhapsody from I-Logix
• Simulink and Stateflow from The MathWorks
• Rose RealTime from Rational
• Etc., etc.

  7. Research Tools (many): RSML-e and Nimbus
[Figure: simulations of the environment driving RSML-e formal models (~20 running concurrently)]

  8. How We Will Develop Software
• Concept Formation → Requirements (Properties) → System Specification/Model → Implementation → Integration
• Testing against each level: Specification Test, Integration Test, System Test
• Analysis throughout

  9. FGS/FMS Mode Logic: RSML-e and Nimbus
[Figure: simulations of the environment driving RSML-e formal models (~20 running concurrently)]

  10. Sample RSML-e Specification

  11. Capture Requirements as Shalls

  12. Translated All the Shalls into SMV Properties

  13. Early Validation of Requirements Using Model Checking (NuSMV)
• Proved over 300 properties in less than an hour
• Found several errors in our models using model checking
• Substantially revised the shalls to correct the errors
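To make the shall-to-property step concrete, here is a minimal Python sketch of what a model checker such as NuSMV does with an invariant property. The mode-logic model and all names are hypothetical, chosen only to illustrate exhaustive exploration of reachable states:

```python
from collections import deque

# Toy mode-logic model (hypothetical, for illustration only):
# a state is (roll_active, heading_active); events switch modes.
EVENTS = ["select_roll", "select_heading", "deselect"]

def step(state, event):
    if event == "select_roll":
        return (True, False)      # selecting roll deactivates heading
    if event == "select_heading":
        return (False, True)      # selecting heading deactivates roll
    return (False, False)         # deselect clears both modes

def invariant(state):
    # Shall: "At most one lateral mode shall be active at any time."
    roll, heading = state
    return not (roll and heading)

def check(initial):
    """Exhaustively explore reachable states, as a model checker would."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return False, s       # counterexample state
        for e in EVENTS:
            t = step(s, e)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None

ok, cex = check((False, False))
print(ok)  # True: the invariant holds in every reachable state
```

A real checker works symbolically rather than state by state, but the verdict is the same kind: either the shall holds in all reachable states, or a concrete counterexample is produced.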

  14. Early Validation of Requirements Using Theorem Proving (PVS)
• Proved several hundred properties using PVS
• More time consuming than model checking
• Use when model checking won’t work

  15. Model-Based Development Examples

  16. A Simplified Development Model
[Figure: timeline of effort across Requirements and Specification, Code, Unit Test, and System Test]

  17. Ongoing Research
• Specification models: RSML-e, SCR, SpecTRM, Statecharts, Esterel, SCADE, Simulink, UML, etc.
• Groups: Minnesota, Pennsylvania, George Mason, NRL, NASA Ames, CMU, SRI, Stanford, UC Berkeley, VERIMAG, NASA, etc.
• Code: proof-carrying code, provably correct compilers, test for correctness

  18. Problems…
• Are the languages usable, syntax and semantics? Can they play nice together?
• Tested enough? Can we trust the results?
• Can we trust the execution environment?
• Can we really trust the code?

  19. Benefits of Modeling
[Figure: fewer “bugs” at each stage yield time savings]

  20. Code Generation
• Coding effort greatly reduced
[Figure: fewer “bugs” and reduced coding effort yield time savings]

  21. Qualified Code Generation (theory)
• Unit testing eliminated for generated code; that effort moves to the specification level
[Figure: time savings]

  22. Code Generation Concerns
• Is our model “right”?
• Can we trust our properties?
• Can we trust our analysis tools?
• Can we trust the execution environment?
• Can we trust the code generator?

  23. “Correct” Code Generation
• Provably correct compilers: very hard (and often not convincing)
• Proof-carrying code: total correctness required
• Base all specification testing on the generated code: lose the benefits of working at the specification level
• Generate test suites from the specification
  • Compare specification behavior with the generated code to better trust your specification testing
  • Unit testing is now not eliminated, but completely automated
[Figure: specification-based tests generated from the specification/model are run against the implementation and the outputs compared]
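The comparison loop described above can be sketched as follows. Both the specification and the “generated code” here are toy stand-ins (all names hypothetical); the point is the shape of the check: enumerate tests from the specification, run both artifacts, and compare:

```python
import itertools

# Hypothetical specification: a function from input sequences to output
# sequences. The "generated code" is the code-generator output under test.
def spec(inputs):
    # Spec: output a running count of "on" events, saturating at 3;
    # "off" resets the count.
    count, outputs = 0, []
    for i in inputs:
        if i == "on":
            count = min(count + 1, 3)
        elif i == "off":
            count = 0
        outputs.append(count)
    return outputs

def generated_code(inputs):
    # Stand-in for the generated implementation; behaviorally equivalent.
    count, outputs = 0, []
    for i in inputs:
        count = 0 if i == "off" else min(count + 1, 3)
        outputs.append(count)
    return outputs

def spec_based_tests(alphabet=("on", "off"), max_len=4):
    """Enumerate every input sequence up to max_len as a test suite."""
    for n in range(1, max_len + 1):
        yield from itertools.product(alphabet, repeat=n)

def conformance_check():
    for test in spec_based_tests():
        if spec(list(test)) != generated_code(list(test)):
            return list(test)     # discrepancy: a failing test case
    return None

print(conformance_check())  # None: implementation matches the spec on all tests
```

Any non-None result is exactly the automated unit test the slide describes: a concrete input sequence on which the generated code disagrees with the specification.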

  24. Specification Testing
• Certify the execution environment? Too costly and probably impossible
• Specification-based testing: any discrepancy means the code generator is wrong, the execution environment is wrong, or the target platform is faulty
• When have we tested enough? Specification coverage criteria, but what is adequate coverage?
• Criteria suited to measurement are not good for generation: tests can technically cover the specification yet be useless. Do they reveal faults?
• Tradeoff between the impossible and the inadequate
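One way to read the coverage concern: a suite can achieve, say, full transition coverage of a specification while checking nothing about the outcomes. A minimal sketch (toy state machine, all names hypothetical) of measuring transition coverage:

```python
# Toy two-state specification (hypothetical): a lamp toggled by "press",
# with an automatic "timeout" that switches it off.
TRANSITIONS = {
    ("off", "press"): "on",
    ("on", "press"): "off",
    ("on", "timeout"): "off",
}

def run(state, inputs, covered):
    """Execute one test against the spec, recording taken transitions."""
    for i in inputs:
        key = (state, i)
        if key in TRANSITIONS:
            covered.add(key)
            state = TRANSITIONS[key]
    return state

def transition_coverage(test_suite):
    covered = set()
    for test in test_suite:
        run("off", test, covered)
    return len(covered) / len(TRANSITIONS)

# This suite exercises every transition yet asserts nothing about outputs,
# illustrating "technically covering the specification, but with useless tests".
suite = [["press", "press"], ["press", "timeout"]]
print(transition_coverage(suite))  # 1.0
```

The metric is honest about what was *exercised*, but says nothing about what was *checked*, which is exactly the gap between criteria for measurement and criteria for generation.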

  25. Proof Techniques (theory)
• Reduced testing, since properties are proved correct at the specification stage
[Figure: proofs performed early; time savings]

  26. Verification Trust
• We need properties (requirements)! They are often lost in the modeling “frenzy”
• How do we trust our proofs?
• Is a proof valid in the production environment?

  27. Trusted Translators?
• Certify the analysis tools? Too costly and probably impossible
• Use redundant proof paths: technically feasible, but is the redundancy “trustworthy”?
• Cost: automation is key; we must keep analysis cost under control
• Generate test suites from the specification: low cost, since it is already done for the code generator
• Many languages and many analysis techniques
[Figure: RSML-e translated to the PVS theorem prover, the SMV model checker, and SAL state exploration]
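The redundant-proof-path idea can be sketched as running two independently implemented checkers over the same model and accepting only agreeing verdicts. The toy model and all names below are hypothetical:

```python
import itertools

# Toy model (hypothetical): a counter modulo 4 driven by "tick"/"reset".
# Property: the counter value always stays in the range [0, 4).
def step(s, e):
    return (s + 1) % 4 if e == "tick" else 0

def prop(s):
    return 0 <= s < 4

def check_by_reachability(initial=0):
    """Path 1: explicit-state reachability analysis."""
    seen, frontier = {initial}, [initial]
    while frontier:
        s = frontier.pop()
        if not prop(s):
            return False
        for e in ("tick", "reset"):
            t = step(s, e)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True

def check_by_bounded_simulation(depth=8, initial=0):
    """Path 2 (independent): simulate every event sequence up to a bound."""
    for n in range(depth + 1):
        for seq in itertools.product(("tick", "reset"), repeat=n):
            s = initial
            for e in seq:
                s = step(s, e)
                if not prop(s):
                    return False
    return True

verdicts = (check_by_reachability(), check_by_bounded_simulation())
print(verdicts[0] and verdicts[0] == verdicts[1])  # True: the paths agree
```

A shared bug in the model translation can still fool both paths, which is the slide’s point: redundancy raises confidence but is itself not automatically “trustworthy”.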

  28. Proof Techniques (worst case)
• Most analysis is neither easy nor cheap!
• An added burden that cannot be leveraged later
[Figure: added effort without later savings]

  29. Regression Verification
• Iterated weekly? Daily? Hourly?
• Abstraction cost amortized
• Impact of change on the abstraction
• Approximate techniques in day-to-day activities
[Figure: a large evolving model with 100s, if not 1000s, of properties feeding an analysis result]

  30. Can We Achieve the Goal? Yes!
• Redundant proof process (PVS, SMV, Prover, SAL, …)
• Specification testing and test-case generation
• Automated unit testing (to MC/DC?) to check the code generator and the specification execution environment
• Verifiable code generator
• Abbreviated system testing augmented with generated tests
[Figure: time savings]

  31. Perfection is Not Necessary
• We only need to be better than what we are now: no more missed faults than today
• How do we demonstrate this? Empirical studies are of great importance

  32. Education of Regulatory Agencies
• Regulatory agencies are very conservative
  • And rightly so: avionics software is very good
• We need to understand regulatory and industry concerns to get our techniques into practice
• We need convincing evidence that our techniques work and are effective

  33. New Challenges for V&V
• Validate models
  • The models must satisfy the “real” requirements
  • Validate the properties used in analysis
  • Model testing crucial to success
• Validate tools
  • We will rely a lot on tools for model validation; can we trust them?
  • Creative use of testing necessary
• Verify and validate generated code
  • Can we trust that the translation was correct?
  • Test automation crucial, including output to analysis tools
• Adapt to the various modeling notations
  • Models will not come in one language
  • Translation between notations and tools

  34. Discussion
