
DEVS Agents to Support Conformance Testing of Emerging Defense Information Standards


Presentation Transcript


  1. DEVS Agents to Support Conformance Testing of Emerging Defense Information Standards Bernard P. Zeigler, Arizona Center for Integrative Modeling and Simulation, Tucson, Arizona, www.acims.arizona.edu, and Joint Interoperability Test Command (JITC), Fort Huachuca, Arizona

  2. Outline • “Agent-supported simulation deals with the use of agents as a support facility to enable computer assistance in problem solving or enhancing cognitive capabilities.” (ADS’06 definition) • The agent-supported simulation metaphor applies to testing the conformance of multi-agent systems to complex defense information standards • Structure control agents induce structural change in themselves or others to effectuate different behaviors under different circumstances • Structure control implemented in Dynamic Structure DEVS enables automation of standards conformance testing • The DEVS formalism is capable of capturing the information-processing complexities, including the dynamics, underlying the MIL-STD-6016C standard • DEVS modeling and simulation methodology is becoming a core technology for automated testing of military tactical data link standards

  3. Background: Simulation-based development implies the need for simulation-based testing (Diagram: a System Under Test (SUT), specified as an abstract model, e.g. in UML, and a Test Device send/receive messages over connecting middleware, e.g. HLA, across a network) • Distributed simulation allows testing a system that is first formulated as an abstract model • Both the System Under Test (SUT) and the test device are coupled by a common interface • This raises the question: how to develop the Test Device in an authoritative manner?

  4. Problem: Conflicting Requirements • To deal with the increasing complexity and advanced decision capabilities of C4ISR systems => testing methodology has to become more rigorous, in-depth, and thorough • To keep up with the rapid change and short development life cycles expected from system builders => tests have to be ready to conduct on time scales compatible with the agile development strategies of new systems • Solution: employ DEVS-based M&S • to increase capabilities for simulation-based testing • and • as a basis to increase the automation of testing processes

  5. (Diagram: Link-16 platforms and links: Theater DSP/SBIRS warning, ABL, AWACS, JLENS, F-15, THAAD TEL, PATRIOT, MEADS, AVENGER, ATACMS, SIS (MSCS), AEGIS (CEP)) • Testing of interface standards is a focus area for automated simulation-based testing • Link-16 is required in all Joint and multi-national operations • The Joint Interoperability Test Command (JITC) is developing an automated test generation methodology as its core technology for testing conformance of systems to the Link-16 specification • This methodology is fundamentally enabled by the DEVS formalized modeling and simulation approach

  6. Transaction Level example: P.1.2 = Drop Track (Diagram: a transaction proceeds through 1 Preparation, 2 Processing, 3 Modify C2 Record for TN, and Transmit Msg at DEVS times t1 through t4; Processing involves rule processing, constraints/(exception) rules, validity checking, track display, time outs, operator decisions, and periodic messages; consequent processing may stop, do nothing, raise alerts, or jump (stimuli) to other transactions of the specification; inputs to and outputs from the system are DEVS events) DEVS Representation of Link-16: system theory provides levels of structure/behavior

  7. Recent Successful Application The JITC employed such testing for the initial major milestone evaluation of the Integrated Architecture Behavior Model (IABM) developed by the Joint Single Integrated Air Picture (SIAP) System Engineering Organization (JSSEO) in 2005. The test exercise produced significant results that uncovered flaws in the model design and added acknowledged value to the model development.

  8. Types of Distributed Simulation Testing

  9. Multiplatform Distributed Simulation: controlled testing (Diagram: three Platforms (System, Component) connected to a Test Driver) The Test Driver controls the scenario

  10. Multiplatform Distributed Simulation: uncontrolled testing (Diagram: three Platforms (System, Component), each with an Observer, reporting to a Test Coordinator) Distributed Observers look for opportunities to test

  11. Test Driver for Controlled Testing (Diagram: a Coupled Test Model containing Component Test Models 1, 2, and 3, each exchanging messages Jx1,data1 through Jx4,data4 with the SUT over middleware)

  12. Test Model Generation for Controlled Testing (Diagram: over times t1 through t4, the SUT model performs receiveAndProcess(Jx1,data1), receiveAndProcess(Jx2,data2), receiveAndProcess(Jx3,data3), and transmit(Jx4,data4); the mirrored Test Model performs holdSend(Jx1,data1,t1), holdSend(Jx2,data2,t2), holdSend(Jx3,data3,t3), and waitReceive(Jx4,data4)) Mirroring (flipping) the transactions of a SUT model (system model behavior selected as a test case) allows automated creation of a test model
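The mirroring step is mechanical enough to sketch. Below is a hypothetical Python illustration only (the actual test models are DEVS models; the trace representation and the function name `mirror_transactions` are invented here): each receiveAndProcess step of the SUT behavior becomes a holdSend of the test model, and each transmit becomes a waitReceive.

```python
# Hypothetical sketch: "flip" an SUT behavior trace into a test model.
# An SUT trace is a list of (primitive, message, time) steps.

def mirror_transactions(sut_trace):
    test_model = []
    for primitive, msg, t in sut_trace:
        if primitive == "receiveAndProcess":
            # the SUT consumes this message, so the tester must send it
            test_model.append(("holdSend", msg, t))
        elif primitive == "transmit":
            # the SUT emits this message, so the tester must wait for it
            test_model.append(("waitReceive", msg, t))
    return test_model

sut_trace = [
    ("receiveAndProcess", "Jx1,data1", 1),
    ("receiveAndProcess", "Jx2,data2", 2),
    ("receiveAndProcess", "Jx3,data3", 3),
    ("transmit", "Jx4,data4", 4),
]
test_model = mirror_transactions(sut_trace)
```

Under these assumptions, the trace above yields exactly the holdSend/waitReceive sequence shown on the slide.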

  13. Test Manager for Opportunistic Testing • Replace Test Models by Test Detectors • Deploy Test Detectors in parallel, fed by the Observer • A Test Detector activates a test when its conditions are met • Test results are sent to a Collector for further processing (Diagram: the Observer of the SUO feeds the Test Manager, which contains Test Detectors 1, 2, and 3 and a Results Collector; messages Jx1,data1 through Jx4,data4 also flow to other federates)

  14. Test Detector Generation for Opportunistic Testing (Diagram: over times t1 through t4, the SUO performs receiveAndProcess(Jx1,data1), receiveAndProcess(Jx2,data2), receiveAndProcess(Jx3,data3), and transmit(Jx4,data4); the Test Detector performs processDetect(Jx1,data1,t1), processDetect(Jx2,data2,t2), processDetect(Jx3,data3,t3), and waitReceive(Jx4,data4)) The Test Detector watches for the arrival of the given subsequence of messages to the SUO and then watches for the corresponding system output • Define a new primitive, processDetect, that replaces holdSend • The Test Detector • tries to match the initial subsequence of messages received by the SUO • when the initial subsequence is successfully matched, enables waitReceive (or waitNotReceive) to complete the test
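A minimal sketch of this matching logic, in Python with invented names (the real detectors are DEVS models): processDetect steps advance only while the observed inputs match the expected initial subsequence, and waitReceive is enabled only once the whole subsequence has matched.

```python
# Hypothetical sketch of a Test Detector's matching logic.

class TestDetector:
    def __init__(self, detect_seq, expected_output):
        self.detect_seq = list(detect_seq)  # inputs the SUO should receive
        self.expected = expected_output     # output the SUO should emit
        self.pos = 0                        # progress through detect_seq
        self.result = None                  # None while the test is running

    def process_detect(self, msg):
        # advance only on a match of the next expected input
        if self.result is None and self.pos < len(self.detect_seq):
            if msg == self.detect_seq[self.pos]:
                self.pos += 1

    def wait_receive(self, msg):
        # enabled only after the full initial subsequence has matched
        if self.result is None and self.pos == len(self.detect_seq):
            self.result = "pass" if msg == self.expected else "fail"

det = TestDetector(["Jx1,data1", "Jx2,data2", "Jx3,data3"], "Jx4,data4")
for m in ["Jx1,data1", "Jx2,data2", "Jx3,data3"]:
    det.process_detect(m)
det.wait_receive("Jx4,data4")   # completes the test: result is "pass"
```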

  15. Observer & System Under Observation (SUO) (Diagram: an Observer taps into the inports and outports of the SUO, a system such as a DEVS model, and exposes its own outports) The Observer gathers input/output data and forwards it for testing
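A sketch of the tap in Python (illustrative only; the class and method names are invented, and a real Observer would be a DEVS component coupled to the SUO's ports):

```python
# Hypothetical sketch: tap the SUO's inports and outports, log every
# observed message, and forward it toward the test side.

class Observer:
    def __init__(self, forward):
        self.forward = forward          # callback toward the Test Manager
        self.log = []                   # gathered input/output data

    def tap_input(self, port, msg):     # tap on an SUO inport
        self.log.append(("in", port, msg))
        self.forward(("in", port, msg))

    def tap_output(self, port, msg):    # tap on an SUO outport
        self.log.append(("out", port, msg))
        self.forward(("out", port, msg))

forwarded = []
obs = Observer(forwarded.append)
obs.tap_input("inport", "requestImmediateCAS")
obs.tap_output("outport", "CASResourcesSpec")
```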

  16. Example: Joint Close Air Support (JCAS) Scenario, Natural Language Specification • JTAC works with ODA • JTAC is supported by a Predator • JTAC sends requestImmediateCAS to AWACS • AWACS passes requestImmediateCAS to CAOC • CAOC assigns USMCAircraft to JTAC • CAOC sends readyOrder to USMCAircraft • USMCAircraft sends sitBriefRequest to AWACS • AWACS sends sitBrief to USMCAircraft • USMCAircraft sends requestForTAC to JTAC • JTAC sends TACCommand to USMCAircraft • USMCAircraft sends deconflictRequest to UAV • USMCAircraft gets targetLocation from UAV

  17. Observer of AWACS within JCAS • The Observer is connected to the SUO and monitors its I/O traffic • Data is gathered by the Observer: addObserver(USMCAircraft, JCASNUM1);

  18. Test Detector Prototype: Sequence Matcher • processDetect(J2.2,data1,t1) • processDetect(J3.2,data2,t2) • waitReceive(J7.0,data3,t3) • Sequential triggering, the same as in test models

  19. Example of Effect of State: AWACS Rules • R1: if phase = "passive" & receive = "ImmediateCASIn" then output = "CASResourcesSpec" & state = "doSurveillance" • R2: if state = "doSurveillance" & receive = "sitBriefRequestIn" then output = "sitBriefOut" & phase = "passive" • matchsequence 1 (initial state = passive): processDetect(ImmediateCASIn,"",1); waitReceive(CASResourcesSpec,"") • matchsequence 2 (initial state = doSurveillance): processDetect(sitBriefRequestIn,"",1); waitReceive(sitBriefOut,"") • The detector needs to know the state to enable the second sequence
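The two rules can be read as a table-driven state machine. A minimal Python sketch (names invented for illustration) shows why a stateless matcher is not enough: sitBriefRequestIn produces sitBriefOut only in state doSurveillance.

```python
# (state, received message) -> (output message, next state), per R1 and R2.
RULES = {
    ("passive", "ImmediateCASIn"): ("CASResourcesSpec", "doSurveillance"),
    ("doSurveillance", "sitBriefRequestIn"): ("sitBriefOut", "passive"),
}

def step(state, msg):
    """Apply whichever rule matches; otherwise no output, state unchanged."""
    return RULES.get((state, msg), (None, state))

out, state = step("passive", "ImmediateCASIn")      # R1 fires
# out == "CASResourcesSpec", state == "doSurveillance"
out2, state = step(state, "sitBriefRequestIn")      # R2 fires only now
# out2 == "sitBriefOut", state == "passive"
```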

  20. Solution: make activation of matchsequence2 conditional on matchsequence1 matchsequence2 can only start when matchsequence1 has successfully been performed
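One way to sketch this conditional activation in Python (names invented; the actual mechanism lives in the DEVS test models): matchsequence2 is armed only after matchsequence1 reports a pass.

```python
# Hypothetical sketch: chain two match sequences so the second can only
# start once the first has succeeded.

class MatchSequence:
    def __init__(self, trigger, expected):
        self.trigger, self.expected = trigger, expected
        self.triggered = False
        self.result = None

    def observe(self, msg_in=None, msg_out=None):
        if msg_in == self.trigger:
            self.triggered = True
        elif self.triggered and msg_out is not None and self.result is None:
            self.result = "pass" if msg_out == self.expected else "fail"

def run_chained(events, seq1, seq2):
    active = seq1
    for kind, msg in events:
        active.observe(msg_in=msg if kind == "in" else None,
                       msg_out=msg if kind == "out" else None)
        if active is seq1 and seq1.result == "pass":
            active = seq2   # arm matchsequence2 only after seq1 passes
    return seq1.result, seq2.result

seq1 = MatchSequence("ImmediateCASIn", "CASResourcesSpec")
seq2 = MatchSequence("sitBriefRequestIn", "sitBriefOut")
events = [("in", "ImmediateCASIn"), ("out", "CASResourcesSpec"),
          ("in", "sitBriefRequestIn"), ("out", "sitBriefOut")]
results = run_chained(events, seq1, seq2)
```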

  21. Observation Test of AWACS (Diagram: the Observer of AWACS sits between AWACS and the Test Manager)

  22. Problem with a Fixed Set of Test Detectors • After a test detector has been started up, a message may arrive that requires it to be re-initialized • The parallel search and processing required by the fixed presence of multiple test detectors under the test manager may limit the processing and/or the number of monitor points • A fixed set does not allow changing from one test focus to another in real time, e.g. going from format testing to correlation testing once format testing has been satisfied • Solution: • on-demand inclusion of test detector instances • remove a detector when it is known to be "finished" • employ DEVS variable structure capabilities • this requires intelligence to decide inclusion and removal

  23. Dynamic Test Suite: Features • Test Detectors are inserted into the Test Suite by Test Control • Test Control selects Detectors based on the incoming message • Test Control passes on the just-received message and starts up the Test Detector • Each Detector stage starts up the next stage and removes itself from the Test Suite as soon as the result of its test is known • If the outcome is a pass (the test is successful), then the next stage is started up
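These features can be sketched as a small Python simulation (all names hypothetical; the real suite uses DEVS variable structure, not Python lists): Test Control maps recognized messages to detector factories, inserts a detector on demand, and removes it the moment its verdict is known.

```python
# Hypothetical sketch of Test Control with an initially empty test suite.

class ResponseDetector:
    """Verdict is 'pass' once `expected` is observed; any other output fails."""
    def __init__(self, name, expected):
        self.name, self.expected = name, expected
        self.result = None

    def observe(self, msg):
        if self.result is None and msg.startswith("out:"):
            self.result = "pass" if msg == self.expected else "fail"

class TestSuite:
    def __init__(self, factories):
        self.factories = factories      # message -> detector factory
        self.active, self.results = [], []

    def on_message(self, msg):
        if msg in self.factories:       # Test Control: start a detector
            self.active.append(self.factories[msg]())
        for det in list(self.active):   # feed the just-received message
            det.observe(msg)
            if det.result is not None:  # verdict known: record and remove
                self.results.append((det.name, det.result))
                self.active.remove(det)

suite = TestSuite({
    "in:requestImmediateCAS":
        lambda: ResponseDetector("CASRequest", "out:CASResourcesSpec"),
})
for msg in ["in:requestImmediateCAS", "out:CASResourcesSpec"]:
    suite.on_message(msg)
```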

  24. Dynamic Inclusion/Removal of Test Detectors (Diagram: Test Manager containing Test Control and the Active Test Suite) • When a message arrives, induced test detectors are added into the test set: addModel("test detector"); addCoupling2("Test Manager", "Jmessage", "test detector", "Jmessage"); • A test detector subcomponent removes its enclosing test detector when the test case result is known (either pass or fail): removeAncestorBrotherOf("TestControl");

  25. Adding Ports and Coupling by Relation • Methods can be issued by any atomic model: • add my ports and coupling to an ancestor in relation to another model, e.g. find my ancestor that is a brother of the given model, and add my ports and associated internal and external coupling to this ancestor • Example: in a hierarchy where A contains B and C, and C contains D and E, addMyPortsToAncestorBrotherOf(B) issued by E adds E's ports to C, which is the ancestor brother of B, and does the associated coupling

  26. Adding Ports and Coupling (cont'd) • Methods can be issued by any atomic model: • add output ports and coupling from source to destination: adds the outports of the source as input ports of the destination and couples them, e.g. addOutportsNcoupling(C,B) • add input ports and coupling from source to destination: adds the inports of the source as outputs of the destination and couples them, e.g. addInportsNcoupling(C,B) • Issued by E, these methods allow E to communicate with B (via ports in1 and out1)

  27. Removing Models by Name and by Relation • Methods can be issued by any atomic model: • remove model by name: accepts the unique name of any atomic or coupled model within the hierarchy and removes it and its immediate coupling, e.g. removeModel2(C) • remove model by relation to another model: e.g., remove the ancestor that is a brother of the given model; this can be used to avoid providing atomic models with the names of other models • Example: removeAncestorBrotherOf(B) issued by D removes its parent, which in this case is its ancestor that is a brother of B
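The "by relation" idea can be sketched with a plain tree in Python (hypothetical classes; the actual methods operate on DEVS coupled models and also clean up coupling, which is omitted here):

```python
# Hypothetical sketch: remove the issuing model's ancestor that is a
# sibling ("brother") of a named model, without knowing that ancestor's name.

class Model:
    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent
        self.children = []
        if parent:
            parent.children.append(self)

    def remove_ancestor_brother_of(self, other_name):
        node = self.parent
        while node is not None and node.parent is not None:
            sibling_names = [c.name for c in node.parent.children]
            if other_name in sibling_names and node.name != other_name:
                node.parent.children.remove(node)  # prunes the whole subtree
                return node.name
            node = node.parent
        return None

# Hierarchy from the slide: A contains B and C; C contains D and E.
root = Model("A")
b = Model("B", root)
c = Model("C", root)
d = Model("D", c)
e = Model("E", c)
removed = e.remove_ancestor_brother_of("B")  # removes C (with D and E inside)
```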

  28. AWACS Opportunistic Testing in JCAS (Diagram: the CAS model with AWACS observation, Test Control, and an initially empty Test Suite)

  29. AWACS Opportunistic Testing in JCAS (cont'd) • Test Control observes the CAS request message to AWACS • Test Control adds the appropriate Test Detector and connects it into the interface

  30. AWACS Opportunistic Testing in JCAS (cont'd) • The first stage detector verifies receipt of the request message and prepares to start up the second stage • Test Control passes on the start signal and the request message

  31. AWACS Opportunistic Testing in JCAS (cont'd) • The first stage detector removes itself from the test suite • The second stage waits for the expected response from AWACS to the request

  32. AWACS Opportunistic Testing in JCAS (cont'd) • The second stage observes the correct AWACS response, removes itself, and starts up the second part

  33. AWACS Opportunistic Testing in JCAS (cont'd) • At some later time, the second part of the Test Detector observes the situation brief request message to AWACS • Its first stage removes itself and starts up the second stage

  34. AWACS Opportunistic Testing in JCAS (cont'd) • The second stage observes the situation brief output from AWACS, thus passing the test • It removes itself and the enclosing Test Detector

  35. Summary • The DEVS formalism is capable of capturing the information-processing complexities, including the dynamics, underlying the MIL-STD-6016C standard • DEVS modeling and simulation methodology is becoming a core technology for automated testing of military tactical data link standards • Structure control implemented in Dynamic Structure DEVS enables automation of standards conformance testing • Structure control agents induce structural change in themselves or others to effectuate different behaviors under different circumstances • The agent-supported simulation metaphor applies to testing the conformance of multi-agent systems to complex defense information standards
