Requirements-Based Testing
Dr. Mats P. E. Heimdahl, University of Minnesota Software Engineering Center
Dr. Steven P. Miller and Dr. Michael W. Whalen, Advanced Computing Systems, Rockwell Collins, 400 Collins Road NE, MS 108-206, Cedar Rapids, Iowa 52498, spmiller@rockwellcollins.com
Outline of Presentation Motivation Validation Testing Conformance Testing What’s Next
How We Develop Software SW High-Level Reqs. Development HW/SW Integration Testing SW Design Description Dev. (SW Low-Level Reqs. & SW Arch.) SW Integration Testing SW Source Code Dev. SW Low-Level Testing SW Integration (Executable Code Production)
How We Will Develop Software (From V to a Y) SW High-Level Reqs. Development HW/SW Integration Testing Software Model SW Integration Testing How do we know our model is correct? Validation Testing Formal Verification Can we trust the code generator? Conformance Testing SW Integration (Executable Code Production)
Outline of Presentation Motivation Validation Testing Conformance Testing What’s Next
How We Will Develop Software (From V to a Y) SW High-Level Reqs. Development HW/SW Integration Testing Software Model SW Integration Testing How do we know our model is correct? SW Integration (Executable Code Production)
Modeling Process Desired Model Properties High-Level Requirements Software Model Low-Level Requirements SW Integration (Executable Code Production) SW High-Level Reqs. Development
Problem—Modeling Frenzy Headfirst into modeling SW High-Level Reqs. Development Desired Model Properties Software Model How do we know the model is “right”? How do we test the model? SW Integration (Executable Code Production)
One Solution: Redefine Requirements System Development Processes (ARP 4754) Software Development Processes (DO-178B) System Reqs. Development HW/SW Integration Testing The model is the requirements Software Model SW Integration Testing Use Engineering Judgment when Testing SW Integration (Executable Code Production)
Testing Does not go Away System Reqs. Development HW/SW Integration Testing Software Model SW Integration Testing Extensive Testing (MC/DC) SW Integration (Executable Code Production)
It Simply Moves System Reqs. Development HW/SW Integration Testing Software Model SW Integration Testing Extensive Testing (MC/DC) SW Integration (Executable Code Production)
Do it Right! Desired Model Properties Analysis(Model Checking,Theorem Proving) Software Model SW Integration (Executable Code Production) SW High-Level Reqs. Development Specification Test – Is the Model Right?
How Much to Test? State Coverage Masking MC/DC? MC/DC Decision Coverage? Transition Coverage? Something New?? Def-Use Coverage? Where Do the Tests Come From?
Requirements Based Testing Properties are Requirements… Desired Model Properties Software Model SW Integration (Executable Code Production) SW High-Level Reqs. Development Cover the Properties!
Requirements Based Testing Advantages • Objective Measurement of Model Validation Efforts • Requirements Coverage in Model-based Development • Help Identify Missing Requirements • Measure Coverage of the Model • Basis for Automated Generation of Requirements-based Tests • Even If Properties Are Not Used for Verification, They Can Be Used for Test Automation How Are Properties “Covered” with Requirements-based Tests?
Property Coverage “If the onside FD cues are off, the onside FD cues shall be displayed when the AP is engaged” • G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On)) • Property Automata Coverage • Cover a Synchronous Observer Representing the Requirement (Property) • Structural Property Coverage • Demonstrate Structurally “Interesting” Ways in Which the Requirement (Property) Is Met
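The synchronous-observer idea can be sketched in Python: a small state machine stepped in lock-step with the model that flags any violation of the FD/AP requirement above. This is a minimal illustration, not code from the talk; the class and variable names are hypothetical.

```python
class FDObserver:
    """Synchronous observer for
    G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On)).
    Stepped once per model step; `ok` latches False on a violation."""

    def __init__(self):
        self.prev_triggered = False  # previous step had FD off and AP disengaged
        self.ok = True

    def step(self, onside_fd_on: bool, ap_engaged: bool) -> bool:
        # Discharge the obligation armed by the previous step, if any:
        # if the AP engages now, the onside FD cues must be on now.
        if self.prev_triggered and ap_engaged and not onside_fd_on:
            self.ok = False
        # Record whether this step arms the obligation for the next step.
        self.prev_triggered = (not onside_fd_on) and (not ap_engaged)
        return self.ok


def run_trace(trace):
    """Feed a list of (onside_fd_on, ap_engaged) pairs to the observer."""
    obs = FDObserver()
    for fd, ap in trace:
        obs.step(fd, ap)
    return obs.ok
```

Covering such an observer then means driving the model so that the observer's internal states and transitions are all exercised by the validation test suite.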
Property Automata Coverage • Cover Accepting State Machine As Opposed to Structure of Property • Büchi Coverage • State Coverage, Transition Coverage, Lasso Coverage…
Alternative Machine • Different synthesis algorithms give different automata • Will affect the test cases required for coverage
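Measuring coverage of a property automaton can be sketched as follows: run each test trace through the (deterministic, for simplicity) automaton and compute the fraction of transitions exercised. The two-state automaton below is a hypothetical example; as the slide notes, a different synthesis algorithm would yield a different machine and hence different tests for full coverage.

```python
def run_automaton(delta, start, trace):
    """Run a deterministic property automaton over one test trace; return
    the sets of states visited and transitions taken."""
    state = start
    states, taken = {start}, set()
    for sym in trace:
        nxt = delta[(state, sym)]
        taken.add((state, sym, nxt))
        states.add(nxt)
        state = nxt
    return states, taken


def transition_coverage(delta, start, traces):
    """Fraction of the automaton's transitions exercised by a test suite."""
    covered = set()
    for t in traces:
        covered |= run_automaton(delta, start, t)[1]
    return len(covered) / len(delta)


# Hypothetical 2-state automaton (e.g., tracking "request pending"):
delta = {('idle', 'a'): 'waiting', ('idle', 'b'): 'idle',
         ('waiting', 'a'): 'waiting', ('waiting', 'b'): 'idle'}
print(transition_coverage(delta, 'idle', [['a', 'b']]))  # 0.5
```

State coverage and lasso coverage would be computed analogously over the visited-state sets and accepting cycles.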
Structural Property Coverage • Define Structural Coverage Criteria for the Property Specification • Traditional Condition-based Criteria such as MC/DC Are Prime Candidates • Property Coverage Is Different from Code Coverage • Coverage of Code and Models • Evaluate a decision with a specific combination of truth values in the decision • Coverage of Properties • Run an execution scenario that illustrates a specific way a requirement (temporal property) is satisfied
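The condition-based idea can be made concrete with a brute-force computation of MC/DC independence pairs: for each condition, pairs of input vectors that differ only in that condition and flip the decision's outcome. A minimal sketch (not from the talk), applied to the antecedent of the example property:

```python
from itertools import product


def mcdc_pairs(decision, n):
    """Brute-force MC/DC independence pairs for an n-condition decision:
    for each condition i, collect pairs of input vectors differing only
    in condition i whose outcomes differ, showing i's independent effect."""
    pairs = {}
    for i in range(n):
        pairs[i] = []
        for vec in product([False, True], repeat=n):
            flipped = list(vec)
            flipped[i] = not flipped[i]
            flipped = tuple(flipped)
            if decision(*vec) != decision(*flipped):
                pairs[i].append((vec, flipped))
    return pairs


# Antecedent of the example property: !Onside_FD_On & !Is_AP_Engaged
antecedent = lambda fd, ap: (not fd) and (not ap)
pairs = mcdc_pairs(antecedent, 2)
```

For a property, each such pair becomes an obligation to find an execution scenario exercising that combination, rather than a single decision evaluation as in code coverage.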
Example • G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On)) • Demonstrate That Somewhere Along Some Execution Trace Each MC/DC Case Is Met • Only the “positive” MC/DC cases • The negative cases should have no traces • In the Case of G(p), Globally p Holds, we Need to Find a Test Where • in the prefix the requirement p is met • we reach a state of the trace where the requirement p holds because of the specific MC/DC case of interest – let us call this case a • then the requirement p keeps on holding through the remainder of the trace • i.e., a trace satisfying p U (a U X(G p)), with states labeled p, p, a, p, p, p, …
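A test obligation of this shape can be checked mechanically on finite traces. The sketch below is a coarse finite-trace approximation of p U (a U X(G p)), assuming predicates over individual states; a real tool would use bounded LTL semantics, and the function name is hypothetical.

```python
def witnesses_case(trace, p, a):
    """Coarse finite-trace check that a test witnesses the obligation
    p U (a U X(G p)) for a G(p) requirement: the requirement predicate p
    must hold at every step (G p over the whole finite trace), and the
    specific MC/DC case of interest `a` must be exercised at some step."""
    return all(p(s) for s in trace) and any(a(s) for s in trace)
```

A generated test that violates p anywhere, or never reaches the case a, does not demonstrate that particular way of satisfying the requirement.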
Summary • Objective Measurement of Model Validation Efforts • Requirements Coverage in Model-based Development • Help Identify Missing Requirements • Basis for Automated Generation of Requirements-based Tests • Even If Properties Are Not Used for Verification, They Can Be Used for Test Automation and Test Measurement • Challenges • How Are Properties Specified? • Combination of Observers and Temporal Properties • What Coverage Criteria Are Suitable? • How Is Automation Achieved? • How Do We Eliminate “Obviously” Bad Tests? Should We? • How Do We Generate “Realistic” Test-cases? • Rigorous Empirical Studies Badly Needed
Outline of Presentation Motivation Validation Testing Conformance Testing What’s Next
How we Will Develop Software(From V to a Y) SW High-Level Reqs. Development HW/SW Integration Testing Software Model SW Integration Testing Can we trust the code generator? SW Integration (Executable Code Production)
“Correct” Code Generation: How? • Provably Correct Compilers: Very Hard (and Often Not Convincing) • Proof-Carrying Code • Generate Test Suites From the Model and Compare Model Behavior With Generated Code (Generate Specification/Model Output; Generate Specification-Based Tests; Compare Against Implementation Output) • Unit Testing Is Now Not Eliminated, but Largely Automated
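The compare-model-against-code step can be sketched as a harness that runs the same generated test vectors through the specification model and the implementation and reports the first mismatch. The saturating-counter spec and the buggy "generated" implementation below are hypothetical examples, not from the talk.

```python
def conformance_test(model_step, impl_step, test_vectors):
    """Run the same input vectors through the specification model and the
    generated implementation; report the first output mismatch, if any.
    Both functions map an input to an output (stateful systems would
    thread their state through explicitly)."""
    for i, inputs in enumerate(test_vectors):
        expected = model_step(inputs)
        actual = impl_step(inputs)
        if expected != actual:
            return (i, inputs, expected, actual)  # conformance failure
    return None  # implementation agrees with the model on every vector


# Hypothetical example: a saturating counter vs. a buggy code generator.
model = lambda x: min(x + 1, 10)      # spec saturates at 10
buggy_impl = lambda x: x + 1          # generated code forgot the saturation
print(conformance_test(model, buggy_impl, [0, 5, 10]))  # (2, 10, 10, 11)
```

Whether such a suite actually catches code-generator faults depends on the coverage criterion used to generate the vectors, which is the question the experiment below addresses.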
Existing Capabilities • Several Commercial and Research Tools for Test-Case Generation • T-VEC: Theorem Proving and Constraint Solving Techniques • Reactis from Reactive Systems Inc.: Random, Heuristic, and Guided Search • University of Minnesota: Bounded Model Checking • NASA Langley: Bounded Model Checking / Decision Procedures / Constraint Solving • Tools Applicable to Relevant Notations • In Our Case, Simulink
An Initial Experiment • Used a Model of the Mode Logic of a Flight Guidance System As a Case Example • Fault Seeding • Representative Faults • Generated 100 Faulty Specifications • Generate Test Suites • Selection of Common (and Not So Common) Criteria • Fault Detection • Ran the Test Suites Against the Faulty Specifications • Recorded the Total Number of Faults Detected
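The fault-seeding measurement in the experiment can be sketched in miniature: seed faulty variants ("mutants") of a specification, run a candidate test suite against each, and record the fraction of faults detected. The spec and the two seeded faults below are hypothetical stand-ins for the flight guidance mode logic.

```python
def fault_detection_rate(spec, mutants, test_suite):
    """Fraction of seeded faults a test suite detects: a mutant counts as
    detected if any test input yields an output differing from the spec."""
    detected = sum(1 for m in mutants
                   if any(spec(x) != m(x) for x in test_suite))
    return detected / len(mutants)


# Hypothetical spec with two hand-seeded, representative faults:
spec = lambda x: x + 1 if x > 0 else 0
mutants = [
    lambda x: x + 1 if x > 1 else 0,   # boundary fault: > 0 became > 1
    lambda x: x + 2 if x > 0 else 0,   # constant fault: + 1 became + 2
]
print(fault_detection_rate(spec, mutants, [0]))        # weak suite: 0.0
print(fault_detection_rate(spec, mutants, [0, 1, 5]))  # stronger suite: 1.0
```

Comparing this rate across suites generated for different coverage criteria, at comparable suite sizes, is the "same effort" comparison reported in the results.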
Fault Finding Results (chart omitted; callout: “Same Effort”)
Model “Cheats” Test Generator (FCS Architecture diagram)
Summary • Automated Generation of Conformance Tests • Current Technology Largely Allows This Automation • Challenges • Development of Suitable Coverage Criteria • Effect of Test Set Size on Test Set Effectiveness • Effect of Model Structure on Coverage Criteria Effectiveness • Traceability of Tests to Constructs Tested • Empirical Studies of Great Importance
Outline of Presentation Motivation Validation Testing Conformance Testing What’s Next
New Challenges for Testing • Model Validation – Requirements-based Testing • How Do We Best Formalize the Requirements? • What Coverage Criteria Are Feasible? • Which Coverage Criteria Are Effective (If Any)? • How Do We Generate “Realistic” Tests? • Will This Be a Practical (Tractable) Solution? • Conformance Testing • What Coverage Criteria Are Effective? • Detecting Faults From Manual Coding • Detecting Faults From Code Generation • Relationship Between Model Structure and Criteria Effectiveness • Traceability From Tests to Model • Relationship Between Model Coverage and Code Coverage • Optimizations in Code Generator Will Compromise Coverage
Perfection Is Not Necessary • Tools and Models Only Need To Be Better Than Manual Processes (Faults Missed Manually ≥ Faults Missed by Tools)… I Think Many Already Are • How Do We Demonstrate This? • Empirical Studies Are of Great Importance
DO-178B Test Objectives: Requirements-Based Testing and Conformance Testing • The executable code complies with the high-level requirements. • The executable code complies with the specification (low-level requirements). • Test coverage of high-level requirements is achieved. • Test coverage of the specification (low-level requirements) is achieved. • Test coverage of the executable code is achieved.