
Reanimating Test Logs for UML Model Verification

Explore how to leverage test data by reanimating test logs against System Reference Models (SRMs): verify the SRMs against the logs, and the logs against the SRMs. This approach is particularly useful for flight control software and related domains, where testing facilities are expensive and difficult to access.


Presentation Transcript


  1. Executing UML Models as Verification Oracles Tom Gullion SRMV Toolsmith tgullion@msisinc.com (608) 315-2780

  2. Overview • Charter • Execute UML models as oracles of the behavior exhibited by software under test • Challenge • Testing facilities for flight control software and related domains are expensive in terms of cost, resources, and access • Verification teams often find it difficult to get adequate access to those testing facilities • Goal • Reanimate test logs from the test labs against System Reference Models (SRMs) • That is, verify the SRM against the test logs. And vice versa!

  3. The Challenge • Testing facilities for flight control software and related domains are expensive in terms of cost, resources, and access • Verification teams often find it difficult to get adequate access to those testing facilities • Access to logs from test runs is common • Access to software simulations may happen • Access to live test rigs could possibly happen… • How can we leverage this test data in verification?

  4. Test Data Reanimation • Data from a test log is used as a stream of input events • SRM is implemented/extended as a test oracle • Behavior of the model, under that stimulus stream, is compared with the behavior logged for the software under test
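The comparison step described on this slide can be sketched as a short replay loop. Everything here (the event tuples, the oracle callable, the log dictionary, the `reanimate` name) is an illustrative assumption, not the actual SRMV tooling:

```python
# Minimal sketch of the reanimation loop (hypothetical API, not the
# real SRMV implementation).

def reanimate(input_events, oracle, logged_behavior):
    """Replay each logged input event through the SRM oracle and
    compare the model's response with the behavior recorded for the
    software under test. Returns the list of mismatching frames."""
    mismatches = []
    for frame, command in input_events:
        actual = oracle(command)            # behavior the model exhibits
        expected = logged_behavior[frame]   # behavior recorded in the log
        if actual != expected:
            mismatches.append((frame, expected, actual))
    return mismatches

# Toy oracle: maps commands directly to phases. A real oracle would be
# a state machine derived from the SRM.
PHASES = {"arm": "Armed", "enable": "Ready", "eject": "Ejected"}
events = [(3330, "arm"), (6322, "enable"), (6323, "eject")]
log = {3330: "Armed", 6322: "Ready", 6323: "Ejected"}

print(reanimate(events, PHASES.get, log))  # -> [] (no mismatches: SUCCESS)
```

An empty mismatch list corresponds to the "Scenario complete: SUCCESS" verdict shown later in the deck.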

  5. Basic Reanimation Approach • A test definition drives the command/event stack • The stack data is converted to a stream of events • The SRM is executed as a test oracle • The behavior of the model, under that stimulus stream, is compared with the behavior logged for the software under test

  6. Test log expectations • Logs from testing / simulation facilities will have timeframes, command names, parameter data and other metadata • We won’t be interested in all of it (at least, at first) • We may even prefer to extract relevant data from multiple sources for our own scenarios (e.g., testing across component boundaries)
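Extracting only the relevant fields might look like the sketch below. The line format (frame number, subsystem, command, optional parameters, `COMMENT` markers) is an assumption based on the script shown later in this deck:

```python
# Hypothetical log extractor: keep only (frame, command, params) for
# the subsystems we care about, skipping comments and unrelated entries.

def parse_log(lines, subsystems):
    """Yield (frame, command, params) tuples for the given subsystems."""
    for line in lines:
        line = line.strip()
        if not line or line.startswith("COMMENT"):
            continue  # metadata we are not interested in (at least, at first)
        frame, subsystem, command, *params = line.split()
        if subsystem in subsystems:
            yield int(frame), command, params

raw = [
    "COMMENT START SCRIPT Flight with Eject",
    "3330 EJECTION_SEAT arm",
    "4532 FLIGHT_MODE taxi",
    "6322 EJECTION_SEAT enable",
]
print(list(parse_log(raw, {"EJECTION_SEAT"})))
# -> [(3330, 'arm', []), (6322, 'enable', [])]
```

The same filter could be pointed at multiple source logs to assemble a cross-component scenario.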

  7. Case study: Ejection Seat Behavior

  8. Test scenario • Example: the testing facility ran a simulation of a flight where the crew ejected

  9. Reanimate Ejection Seat Events • Focus only on ejection seat commands in test logs • Configure model runner environment • Execute commands against SRM to reanimate the events

  10. Reanimation details: a test script is played back against SRM model element(s).

  Input events (Frame, Command, Parameter(s)):
    COMMENT START SCRIPT Flight with Eject
    3330 EJECTION_SEAT arm
    6322 EJECTION_SEAT enable
    6323 EJECTION_SEAT eject
    6326 EJECTION_SEAT separate
    6330 EJECTION_SEAT releaseMain
    COMMENT END SCRIPT

  Results:
    FRAME: 3330 EJECTION SEAT PHASE CHANGE TO Armed
    FRAME: 6322 EJECTION SEAT PHASE CHANGE TO Ready
    FRAME: 6323 EJECTION SEAT PHASE CHANGE TO Ejected
    FRAME: 6326 EJECTION SEAT PHASE CHANGE TO Separated
    FRAME: 6330 EJECTION SEAT PHASE CHANGE TO MainChute
    Scenario complete: SUCCESS
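The ejection-seat oracle behind this playback can be sketched as a transition table. The table below is reconstructed from the phase changes visible in the results; the initial `Safed` state and the exact SRM structure are assumptions:

```python
# Hypothetical ejection-seat state machine, reconstructed from the
# reanimation output shown in the slides.
TRANSITIONS = {
    ("Safed", "arm"): "Armed",
    ("Armed", "enable"): "Ready",
    ("Ready", "eject"): "Ejected",
    ("Ejected", "separate"): "Separated",
    ("Separated", "releaseMain"): "MainChute",
}

def play(script, initial="Safed"):
    """Drive the state machine with (frame, command) pairs and emit
    phase-change lines in the style of the reanimation results."""
    state = initial
    out = []
    for frame, command in script:
        state = TRANSITIONS[(state, command)]  # KeyError = illegal event order
        out.append(f"FRAME: {frame} EJECTION SEAT PHASE CHANGE TO {state}")
    return out

script = [(3330, "arm"), (6322, "enable"), (6323, "eject"),
          (6326, "separate"), (6330, "releaseMain")]
for line in play(script):
    print(line)
```

Because illegal orderings raise a `KeyError`, an out-of-sequence command in the log would surface immediately as an oracle failure.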

  11. Ejection seat scenario results • Use the reanimation results to verify that the logged events match the SRM's modeled behavior • But this scenario only verifies a single slice of the SRM. Let’s expand the reanimation to include flight modes…

  12. Reanimating multiple state machines: the input events are played back against SRM model element(s).

  Input events (Frame, Command, Parameter(s)):
    3320 FLIGHT_MODE preflight
    3330 EJECTION_SEAT arm
    4532 FLIGHT_MODE taxi
    5832 FLIGHT_MODE takeoff
    6322 EJECTION_SEAT enable
    6323 EJECTION_SEAT eject
    6326 EJECTION_SEAT separate
    6330 EJECTION_SEAT releaseMain
    COMMENT END SCRIPT

  Results:
    FRAME: 3320 FLIGHT MODE CHANGE TO PreFlight
    FRAME: 3330 EJECTION SEAT PHASE CHANGE TO Armed
    FRAME: 4532 FLIGHT MODE CHANGE TO Taxi
    FRAME: 5832 FLIGHT MODE CHANGE TO Flight
    FRAME: 6322 EJECTION SEAT PHASE CHANGE TO Ready
    FRAME: 6323 EJECTION SEAT PHASE CHANGE TO Ejected
    FRAME: 6326 EJECTION SEAT PHASE CHANGE TO Separated
    FRAME: 6330 EJECTION SEAT PHASE CHANGE TO MainChute
    Scenario complete: SUCCESS

  13. Flight with Ejection seat scenario results • Now we’ve verified that logged events match SRM modeled behavior in two state machines • This scenario only verifies proper event sequencing within its scope; it does not yet check preconditions and other rules • Extend interfaces defined in the SRM to implement rules…

  14. Reanimating the SRM: the input events are played back against SRM model element(s).

  Input events (Frame, Command, Parameter(s)):
    3320 FLIGHT_MODE preflight
    3330 EJECTION_SEAT arm
    4532 FLIGHT_MODE taxi
    5832 FLIGHT_MODE takeoff
    6322 EJECTION_SEAT enable
    6323 EJECTION_SEAT eject
    6326 EJECTION_SEAT separate
    6330 EJECTION_SEAT releaseMain
    COMMENT END SCRIPT

  Results:
    FRAME: 3320 FLIGHT MODE CHANGE TO PreFlight
    FRAME: 3330
      FlightController: Ejection seat armed
      EJECTION SEAT PHASE CHANGE TO Armed
    FRAME: 4532 FLIGHT MODE CHANGE TO Taxi
    FRAME: 5832 FLIGHT MODE CHANGE TO Flight
    FRAME: 6322 EJECTION SEAT PHASE CHANGE TO Ready
    FRAME: 6323
      FlightController: Canopy released
      FlightController: Ejection seat positioned
      FlightController: Ejection seat ejected
      FlightController: main chute deployed
      EJECTION SEAT PHASE CHANGE TO Ejected
    FRAME: 6326 EJECTION SEAT PHASE CHANGE TO Separated
    FRAME: 6330 EJECTION SEAT PHASE CHANGE TO MainChute
    Scenario complete: SUCCESS
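Layering rules on top of pure event sequencing (as slide 13 suggests) might look like the sketch below. The `FlightController` class name matches the slide, but its interface and the specific precondition (the seat may only be armed once a flight mode has been set) are hypothetical, not rules taken from the SRM:

```python
# Sketch of a flight-controller oracle that checks a precondition in
# addition to tracking state. The rule shown is illustrative only.

class FlightController:
    PHASE = {"arm": "Armed", "enable": "Ready", "eject": "Ejected"}

    def __init__(self):
        self.flight_mode = None
        self.seat_phase = "Safed"

    def on_event(self, subsystem, command):
        if subsystem == "FLIGHT_MODE":
            self.flight_mode = command
        elif subsystem == "EJECTION_SEAT":
            # Hypothetical precondition: a flight mode must be set
            # before the seat can be armed.
            if command == "arm" and self.flight_mode is None:
                raise ValueError("arm commanded before any flight mode set")
            self.seat_phase = self.PHASE.get(command, command)

ctrl = FlightController()
ctrl.on_event("FLIGHT_MODE", "preflight")
ctrl.on_event("EJECTION_SEAT", "arm")
print(ctrl.flight_mode, ctrl.seat_phase)  # -> preflight Armed
```

A rule violation in the replayed log raises immediately, giving the verification team a precise frame at which the logged behavior diverges from the modeled rules.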

  15. Complete scenario results • Now we’ve verified that logged events match SRM modeled behavior in two state machines as well as a flight controller class • From this framework, it is now possible to add telemetry data and other command data • This structure is set up for reuse across other reanimations

  16. Extensible Reanimation Approach • Pluggable input data types: SRM test definition(s), adverse condition data, Software Simulator, live data feed, Live Test Rig • Pluggable exec environments • Each input source feeds the command/event stack that the SRM executes against during reanimation

  17. Extensible Reanimation Approach • Pluggable input streams allow for future use of • Other test logs (related tests, adverse conditions, etc.) • "live" data feeds • accredited simulations as live or stale data sources • Some or all of these data sources can be used without changing the overall approach
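One way to realize this pluggability is to make every source (log file, simulator, live rig) present the same event-stream interface, so the replay machinery never changes. The class and protocol names below are assumptions for illustration:

```python
# Sketch of pluggable input sources: anything that yields
# (frame, command) events can feed the reanimation, whether it is a
# stale log, an accredited simulation, or a live test rig.
from typing import Iterator, Protocol, Tuple

class EventSource(Protocol):
    def events(self) -> Iterator[Tuple[int, str]]: ...

class LogFileSource:
    """Stale data source: replays previously captured log lines."""
    def __init__(self, lines):
        self.lines = lines

    def events(self):
        for line in self.lines:
            frame, _subsystem, command = line.split()
            yield int(frame), command

def run(source: EventSource):
    """The replay machinery only sees the EventSource protocol."""
    return [f"{frame}:{command}" for frame, command in source.events()]

print(run(LogFileSource(["3330 EJECTION_SEAT arm"])))  # -> ['3330:arm']
```

A `SimulatorSource` or `LiveRigSource` implementing the same `events()` method would slot in without touching `run`, which is the "without changing the overall approach" property the slide claims.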

  18. Conclusions • This approach is appropriate for complex collections of analysis cases • Simple analysis cases probably do not require reanimation • We are building part of a larger construct • Analyze adverse conditions against specific test logs or accredited simulations • Enables use/reuse of the SRM without extensive modification • Enables traceability from reanimation results back through the SRM to the originating requirements • Allows distributed machines to play different roles (data source vs. data analysis) • Offers flexibility for machines (real and virtual) to interface with existing infrastructure (IV&V is not required to manage all machines) • This approach embraces the existing overall framework and is designed to scale

  19. Applying the 3 Questions • What is the system supposed to do? • Verified by success and failure scenarios • What is the system not supposed to do? • Verified by success and failure scenarios • What does the system do under adverse conditions? • Verified by adverse-conditions test data • Can predict possible failure scenarios that don’t seem adequately handled and propose test scenarios for the dev team

  20. Acknowledgements • Thanks to NASA IV&V for the opportunity to present these concepts • Special thanks to Frank Huy, Bill Stanton, Todd Gauer, Karl Frank, Steve Driskell, Dan Sivertson, Tom Hempler and Jeff Micke for technical contributions and assistance

  21. Questions? / Comments? Thank you! Tom Gullion tgullion@msisinc.com (608) 315-2780
