ATM Meeting #13 Test Description “DiagTestDescription” section * design decision in blue
Topics
• Test Actions
• Pre- & Post-Conditions, Initialization & Termination
• Parameters & Variables
• Integration of IEEE P-1641 signals
Test Actions
• Objective: Formal representation of Test behavior, through a sequence of Test Actions
• Background
  • The “TRD DTI Sheet” model lacks support for
    • Sequencing of stimulus & measurement operations
    • Operations other than “stimulate” & “measure”, ex. bus transfers
    • Multiple measurement operations
Test Actions...
• Background (cont’d)
  • TRD developers use a “DTI Continuation Sheet”, with free-form text
  • In some cases, the format to be used in DTI Continuation Sheets is specified in organization-specific guidelines (loosely)
• A standard format for Test Actions would:
  • Improve sharing of test description documentation by different organizations
  • Improve communication between the Test Description developer and the Test Program developer by enforcing a more rigorous specification format
  • Improve data validation
  • Support automatic code generation, etc.
Test Actions...
• Use Case (Jose Gonzales, Indra)
  Action 1: apply 28 VDC at UUT points HI J1-10, LO J1-30
  Action 2: apply a short between UUT points J2-7 and J2-9
  Action 3: send word "AABF" through the RS232 serial link, UUT connector J10
  Action 4: read (measure) digital value at UUT point J3-2
  Action 5: verify digital value
• The DTI Sheet model
  • Does not support Action 2 (and other possible actions)
  • Cannot express the sequence of stimuli and responses (in some cases they could be "interlaced")
  • Does not provide a way to associate limits with responses
Test Actions...
• Design Requirements (original document)
  1. Support description of a Test by an ordered list of Test Actions.
    1.1. For simplicity, support only linear sequences (i.e., no branches). The free-form description (see below) should be used when Test behavior is more complex.
  2. Keep support for free-form description of test behavior. In the same way stimuli & measurement data are in many situations insufficient for specifying test behavior, Test Actions may be insufficient in some cases.
  3. Do not restrict the types of actions. May be achieved as follows: (1) do not define Action types, or (2) define a set of common types & allow extension (TBD).
  4. Support parameters for each Test Action.
    4.1. If a set of common action types is defined, these definitions may include required & optional parameters, or not (TBD).
    4.2. Parameters can receive values from Test Inputs.
    4.3. Parameters can return values to Test Outputs (?).
  5. Support the specification of UUT Connections. These may be represented by parameters, or by specialized elements (TBD).
Test Actions... • Example
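A minimal sketch, for illustration only, of how the Indra use case above might be captured as an ordered list of Test Actions per design requirements 1, 4 and 5; all element and attribute names (TestActions, Action, Parameter, Connection) are hypothetical placeholders, not taken from the posted Design Document:

  <TestActions>
    <!-- Action 1: apply 28 VDC between UUT points J1-10 (HI) and J1-30 (LO) -->
    <Action name="ApplyPower" type="Stimulus">
      <Parameter name="Voltage" value="28" units="VDC"/>
      <Connection hi="J1-10" lo="J1-30"/>
    </Action>
    <!-- Action 4: read (measure) digital value at UUT point J3-2 -->
    <Action name="ReadDigital" type="Measurement">
      <Parameter name="ReturnTo" value="DigitalValue"/>  <!-- value returned to a Test Output -->
      <Connection hi="J3-2"/>
    </Action>
  </TestActions>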
Test Actions...
• Discussions
  • Jose Gonzales
    • Prefer to define a set of common actions plus an extension mechanism, rather than leaving it completely open.
    • Establishing a predefined set of arguments for each action could be difficult. Perhaps we could leave it open. In any case, all parameters should be optional.
    • A test action parameter should include as a minimum:
      • Name
      • Value
      • Units
      • Qualifier – optional (maybe)
      • Type (desirable)
    • Connections should be a special type of parameter.
Test Actions...
• Discussions (cont’d)
  • Chris Gorringe
    • For Connections, use IEEE P1641 connector names.
    • The sequence in Example 2 can be described as an IEEE P1641 STD entity:

      <Signal xmlns="STDBSC" xmlns:tsf="TSFLIB">
        <TwoWire hi="J1-10" lo="J1-30" In="s1" name="s28vdc"/>
        <tsf:DC_SIGNAL dc_ampl="28.0 V" name="s1"/>
        <TwoWire name="short" hi="J2-7" lo="J2-9" In="s2"/>
        <Constant name="s2" amplitude="0 Ohm" Sync="s28vdc"/>
        <Series name="SerialLink" via="J10" In="s3"/>
        <tsf:RS_232 name="s3" data_word="AABF" Sync="short"/>
        <Series name="DigitalSignal" via="J3-2" Sync="SerialLink"/>
        <Measure name="DigitalValue" In="DigitalSignal" As="tsf:SerialDigital" attribute="data"/>
      </Signal>

    • Alternative: use the IEEE P1641 Test Procedure Language (TPL)
      • Dynamic test description, as opposed to the above example, which is “static”
      • TPS statements: Setup, Reset, Connect, Disconnect, Enable, Disable, Read, Change, Compare, Wait_For
  • Ion Neag:
    • Maybe all we have to do is provide a hook for IEEE P1641 test description
Test Actions...
• Summary
  • Possible solutions:
    1. Free-form text only. Anything beyond that is specifying test implementation.
    2. Define Test Actions in ATML, along the lines described in the posted Design Document. This solution offers a formal model for information that is frequently present in TRDs as free-form text. Formalization improves data exchange between organizations and supports automatic data processing (ex. code generation).
    3. Use 1641
      a) Hook for a BSC-based model. Offers more advanced sequencing capabilities (ex. actions that trigger each other). This model may be IEEE 1641.
      b) Define an XML model for TPL. Extensible.
      c) Support a) and b) above
Test Actions...
• Decisions
  1. Define an XML model for 1641 TPL.
    • Jose, Chris to compare with the Indra use case.
    • Ion to develop the schema.
    • Question: should the model be extensible (by defining new actions)? Look at this after having an initial design.
    • May need to add support for specifying requirements (ex. accuracy).
  2. Provide a hook for other XML representations of Test behavior. May be organization-specific. In any case, it must be defined by a schema.
  3. Support free-form text. To be used only if 1 and 2 do not have sufficient capabilities.
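A minimal sketch of what an XML model for the 1641 TPL statements listed earlier (Setup, Connect, Read, Compare, etc.) might look like; the element and attribute names are hypothetical and do not anticipate the schema Ion is to develop:

  <TestProcedure name="SerialLinkTest">
    <Setup signal="s28vdc"/>                            <!-- configure the 28 VDC stimulus -->
    <Connect signal="s28vdc" hi="J1-10" lo="J1-30"/>    <!-- apply it at the UUT points -->
    <Read signal="DigitalSignal" into="DigitalValue"/>  <!-- measure the digital response -->
    <Compare value="DigitalValue" expected="1"/>        <!-- expected value is illustrative only -->
    <Disconnect signal="s28vdc"/>
  </TestProcedure>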
Preconditions & Postconditions
• Objective: Provide additional sequencing information to application executives.
  • Necessary when using dynamic reasoners
    • Example: power/stim short test; power up; power down
  • It may be useful in conjunction with static fault trees.
    • Examples?
Preconditions & Postconditions...
• Issue 1: Meaning of "pre-conditions" and "post-conditions"
  • Use Case documents present this feature in two different ways:
    a. Pre-conditions are tests (or actions, or sequences?) that must have been executed before a particular test (and possibly must have returned a particular outcome - ex. T1000 has PASSED).
    b. Post-conditions describe the effect of a particular test on the state of the UUT (ex. POWER is ON) or the TPS (ex. T1000 has PASSED). Pre-conditions are requirements regarding the state of the UUT or TPS that must be fulfilled before running a particular test. Before running a test, the Application Executive looks at its pre-conditions, identifies the tests whose post-conditions match the pre-conditions and runs those tests.
  • Question: Which approach/approaches do we want to support in the schema design?
    • Chris G.: not (a)
    • Joe Stanco: Initialization is a set of test actions required before a test (group) can be performed. Termination is a set of test actions required after completion of a test (group). The state(s) of the entity prior to the test is the pre-condition. The state(s) after, the post-condition.
  • Decisions
    • Try an implementation of b), with an example, then revisit the choice between a) and b)
    • For now, execute Tests for achieving pre-conditions (rather than creating a new type of “action”). May revisit later; possible solution: define a common base type, specialize it as a Test and as the “entity” that is executed to achieve pre-conditions.
    • Tentatively support Exit conditions (see next slide)
      • Can be specified for Tests & sequences
      • Revisit after defining Groups
Preconditions & Postconditions...
• Design decision:
  • Pre-condition: must have been achieved before running the Test (IEEE 100); entry state (Aeroflex)
  • Post-condition: condition that is guaranteed to be true after Test execution (IEEE 100); target state (Aeroflex); consequence; effect
  • Exit condition: must be achieved after running the Test, unless the next Test requires a different pre-condition (Chris)
• Diverse opinions on the utility of this use case; Ion to define the schema; Chris to build an example; group to re-assess utility based on the example
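To make the design decision concrete, a minimal sketch of how pre-, post- and exit conditions might be attached to a Test; the element names (Test, PreCondition, PostCondition, ExitCondition) and the state syntax are hypothetical placeholders pending the schema from Ion and the example from Chris:

  <Test name="T1000">
    <PreCondition state="UUT.POWER" value="ON"/>    <!-- must hold before the Test runs -->
    <PostCondition state="UUT.POWER" value="ON"/>   <!-- guaranteed to be true after Test execution -->
    <ExitCondition state="UUT.POWER" value="OFF"/>  <!-- to be achieved afterwards, unless the next Test requires a different pre-condition -->
    <!-- Test body (actions, inputs, outputs) omitted -->
  </Test>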
Preconditions & Postconditions...
• Issue 2: Mixing pre/post-conditions and fault trees
  • Pre/post-conditions are necessary for the "dynamic reasoner" use case. With fault trees, identical behavior can be achieved by embedding directly in the fault tree the tests to be executed before and after a particular test. The following points were made:
    • Using pre/post-conditions in conjunction with fault trees may be confusing when looking at test descriptions, because sequencing is specified in two different ways, which the reader has to "aggregate" mentally.
    • Test executives support mixing pre/post-conditions and fault trees. Users make use of this capability.
  • Question: Should we restrict the use of pre/post-conditions in conjunction with fault trees?
    • Chris G.: Not sure we need to - is there a specific example that highlights this?
    • Joe S.: Do not restrict.
  • Arguments for allowing the mix:
    • Using pre- & post-conditions facilitates use with reasoners
    • Easier to write sequences
    • Can invoke individual tests
    • Supports fault trees generated from FMECA data
  • Arguments against allowing it:
    • Readability
  • Decision: write the statement that would restrict the mix of pre- & post-conditions; post & discuss in telecons
Preconditions & Postconditions...
• Issue 3: Initialization/termination
  • Initialization/termination tests or sequences can be specified for a TPS (or sequence? or test block?). These tests/sequences can perform initialization and cleanup operations on the UUT, the test station and the TPS.
  • In the "dynamic reasoner" use case, identical behavior can be achieved through pre/post-conditions.
  • With fault trees, identical behavior can be achieved by embedding directly in the fault tree the tests to be executed at the beginning and the end of a sequence. However, specifying a termination test/sequence is more convenient than embedding it in all the terminal branches of the fault tree. On the other hand, there are cases when behavior at TPS termination depends on the way it is terminated (i.e., the terminal branch); in this case, a unique termination test/sequence cannot be used.
  • Questions:
    • Should we support the initialization/termination approach?
      • Chris G.: Yes, but different from conditions.
      • Joe Stanco: Yes
    • Should these be initialization/termination tests, sequences or both?
      • Chris G.: All
    • Specified at which level?
      • Chris G.: Highest level possible; suggests the notion of a Group
      • Joe S.: The Group notation is the highest level for the class it represents.
  • Decisions
    • For now: Initialization/termination sequences; can be specified at the sequence level only
    • Revisit after defining Groups
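A minimal sketch of the decision above (initialization/termination sequences, specified at the sequence level only); the element names (Sequence, Initialization, Termination, TestRef) are hypothetical, not taken from an agreed schema:

  <Sequence name="PowerSupplyTests">
    <Initialization>
      <TestRef name="ApplyStationPower"/>   <!-- runs once, before the first test in the sequence -->
    </Initialization>
    <TestRef name="T1000"/>
    <TestRef name="T1010"/>
    <Termination>
      <TestRef name="RemoveStationPower"/>  <!-- runs once, after the sequence completes -->
    </Termination>
  </Sequence>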
Parameters & Variables
• Objective: Enable the transmission of data between Tests
  • Example: The result of a Test is used for calculating the result of a different Test
• Supported by the “TRD DTI Sheet” model - store measurement results in “variables” (ex. “INTO VAR1”)
• Potential problems:
  • Tests must be executed in the proper order; not guaranteed with dynamic reasoners
    • May be specified as a precondition
  • More?
Parameters & Variables...
• Suggested design:
  • Variables can be defined under Diag_Test_Description
  • Same data types as Test inputs & outputs
  • Test outputs can be stored in variables
  • Test inputs can be retrieved from variables
• Questions:
  0. Support variables?
  1. Support specification of initial values for variables?
    • Can be used as “constants”.
  2. Support storage of signals into variables (in addition to parameters)?
  3. Do values of variables get stored in Test Results? When, during execution?
    • If initial values are supported for variables, these values may be stored, for traceability.
      • Not necessary, if the Test Description instance document is available
    • Other than that: because variable values can originate only from Test outputs, which are stored in Test Results, there is no need to store the values of variables.
• Decision (Jan 11)
  • Identified 2 solutions: variables & referencing outputs from inputs (see Journal document)
  • Try 2 XML models & examples; revisit
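Of the two solutions identified on Jan 11, a minimal sketch of the variables-based one; the names (Variables, Variable, Output, Input, storeInto, fromVariable) are hypothetical, intended only as one of the two XML models to be tried:

  <Diag_Test_Description>
    <Variables>
      <!-- defined under Diag_Test_Description, same data types as Test inputs & outputs -->
      <Variable name="VAR1" type="Decimal" units="V"/>
    </Variables>
    <Test name="T2000">
      <Output name="SupplyVoltage" storeInto="VAR1"/>       <!-- Test output stored in the variable -->
    </Test>
    <Test name="T2010">
      <Input name="ReferenceVoltage" fromVariable="VAR1"/>  <!-- Test input retrieved from the variable -->
    </Test>
  </Diag_Test_Description>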
Parameters & Variables...
• Alternatives (Jan. 12)
  1. Variables
    • Readability – meaningful names
    • Can use as constants
    • Can go through sequence boundaries; is this good?
  2. Outputs -> Inputs
    • Readability – simpler; data flow analysis simpler
    • Can derive dependencies (pre-conditions) easily
    • Encapsulation – ex. cannot pass data through sequence call boundaries without explicitly passing as arguments
  3. Outputs -> Inputs + Constants
  4. Outputs -> Inputs + Variables
• Decisions (Jan 12)
  • Experiment first with solution (2)
  • Inputs can originate from outputs of multiple Tests; if several of these Tests were previously executed, the output of the last one is used
  • Re-assess decision based on examples
  • If (2) is preserved, look at adding support for constants
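And a matching sketch of solution (2), chosen for the first experiment, in which a Test input references the outputs of other Tests directly; the names (Output, Input, Source, fromTest, fromOutput) are again hypothetical:

  <Diag_Test_Description>
    <Test name="T2000">
      <Output name="SupplyVoltage" type="Decimal" units="V"/>
    </Test>
    <Test name="T2005">
      <Output name="SupplyVoltage" type="Decimal" units="V"/>
    </Test>
    <Test name="T2010">
      <Input name="ReferenceVoltage">
        <!-- an input may reference outputs of multiple Tests; if several of those Tests
             were previously executed, the output of the last one is used -->
        <Source fromTest="T2000" fromOutput="SupplyVoltage"/>
        <Source fromTest="T2005" fromOutput="SupplyVoltage"/>
      </Input>
    </Test>
  </Diag_Test_Description>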
Integration of P1641 signal descriptions
• Objective:
  • Support specification of signals as Test inputs & outputs
• Background
  • IEEE 1641 defines
    • A set of signals - the Test Signal Framework for ATLAS
      • Includes most of the signals defined in IEEE 716-95 C/ATLAS
      • An XML schema defines the format
      • An XML instance document (conformant to the schema) defines the signals
    • A mechanism for defining signals
      • Provides extensibility
Integration of P1641 signal descriptions...
• Issues:
  • What to support in Test Outputs
    1. Signal type & one or more signal attributes
      • What if we don’t know the signal type?
        • The signal type is what is supposed to be present when the UUT is good
      • Measurements that are signal type-independent can be returned as ValueType; signals are useful only when they aggregate multiple attributes, or provide semantic context for attributes
      • Also enables passing of Test output values to Test inputs
    2. Signal attribute (outside a signal type context)
      • May be signal-type independent (ex. current)? Look at SMML for examples.
    • Decision: support 1.
      • Signal values must be stored in Test Results
        • Add support in a future schema revision?
        • Will probably end up as a data type in Common? Extend ValueType?
  • Compatibility with signal-oriented description of instrument capabilities
    • Support specification of requirements via range, resolution, accuracy (in addition to “immediate” stimulus values)?
    • At which level – Test, Test Description, both?
  • Other
    • Should be able to specify requirements for signal attributes other than the one to measure; these are Test inputs
  • Issues to be considered in design:
    • Use the 1641 schema & instance document to validate
    • Combine Signals & ValueType
    • Experiment with signal type extensibility
    • Revisit range, resolution, accuracy after a joint discussion with Instruments
    • Bring up the Signals issue in the Framework discussion
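A minimal sketch of the chosen approach for Test Outputs (option 1: a signal type plus one or more signal attributes); the element names (Output, Signal, Attribute) are hypothetical, and tsf:SerialDigital is borrowed from the P1641 TSF example earlier in these notes:

  <Output name="DigitalValue">
    <!-- the signal type is what is supposed to be present when the UUT is good;
         it provides semantic context for the measured attribute -->
    <Signal type="tsf:SerialDigital">
      <Attribute name="data"/>  <!-- the attribute whose measured value is returned and stored in Test Results -->
    </Signal>
  </Output>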