Embedded Systems: System Implementation— Hardware Test and Debug
System Development: the V-cycle model ("ideal"), with code / hardware at the bottom of the V. [fig_09_09]
Question: what are the detailed activities at each level? What is the plan? Example: hardware test and debug.
Good idea: keep a record of your "best practices" (organizational / individual).
Testing: important terms
• UUT / DUT: unit (device) under test
• Resolution: the finest measurement increment possible. Example: three levels of resolution = three decimal places
• Mean, variance: note these imply REPEATED tests (computers and automation mean there is no excuse for scrimping on repetitions)
• RMS (root mean square): how widely do the measurements in a repeated set spread about the mean? Let $x_i$ be the $i$th measurement, $\bar{x}$ the mean, and $v_i = (x_i - \bar{x})^2$; then $\mathrm{RMS} = \sqrt{(v_1 + \cdots + v_N)/N} = \sqrt{\tfrac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})^2}$
• Bias: how closely does the mean approach the true value?
• Residual: measured value minus mean
• Statistical tolerance level: how much variability is due to the test system itself? Test limits must lie outside this
• Test limits: upper and lower physical limits of the measurement
• Golden unit: a unit whose behavior is completely known; can be used as a standard
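A minimal sketch (plain Python) of how mean, residuals, RMS, and bias relate; the voltage readings and the 5.000 V reference are assumptions for illustration only:

```python
import math

def summarize(measurements, true_value):
    """Compute mean, residuals, RMS spread, and bias for repeated measurements."""
    n = len(measurements)
    mean = sum(measurements) / n
    residuals = [x - mean for x in measurements]        # measured value minus mean
    rms = math.sqrt(sum(r * r for r in residuals) / n)  # spread about the mean
    bias = mean - true_value                            # how far the mean sits from truth
    return mean, residuals, rms, bias

# Hypothetical example: five repeated voltage readings against a 5.000 V reference
readings = [5.002, 4.998, 5.001, 5.003, 4.999]
mean, residuals, rms, bias = summarize(readings, true_value=5.000)
print(f"mean = {mean:.4f} V, RMS = {rms:.4f} V, bias = {bias:+.4f} V")
```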
Why do we test? Principal reasons. [fig_10_00]
• Example: UUT = AND gate
• Possible plan:
• Ensure logical functionality: this is an AND gate (see the sketch below)
• Verify signal values: V_OHmin, V_OLmax, V_IHmin, V_ILmax
• Verify dynamic behavior; confirm values of t_PDHL, t_PDLH, t_rise, t_fall
[fig_10_01]
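A sketch of the functionality step: exhaustively drive all four input combinations and compare against the AND truth table. `read_gate_output` is a hypothetical stand-in for whatever bench or instrument-control call actually drives the UUT:

```python
from itertools import product

def read_gate_output(a: int, b: int) -> int:
    # Stand-in for real hardware access; here it simulates an ideal AND gate.
    return a & b

def test_and_gate_functionality() -> bool:
    """Exhaustively check all four input combinations against the truth table."""
    for a, b in product((0, 1), repeat=2):
        expected = a & b
        actual = read_gate_output(a, b)
        if actual != expected:
            print(f"FAIL: A={a}, B={b} -> got {actual}, expected {expected}")
            return False
    return True

print("AND gate functional test:", "PASS" if test_and_gate_functionality() else "FAIL")
```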
Test specification:
Step 1: test logical functionality (per the plan above)
Step 2: test voltage levels; must decide on an error tolerance (e.g., ±0.003 VDC)
Step 3: test timing; must decide on an error tolerance (e.g., ±0.001 ns)
A tolerance check is sketched below.
[fig_10_02]
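Steps 2 and 3 both reduce to a pass/fail comparison against a symmetric tolerance; a minimal sketch (the 5.000 V nominal is an assumed value; the ±0.003 VDC tolerance is from the specification above):

```python
def within_tolerance(measured: float, nominal: float, tol: float) -> bool:
    """Pass/fail check against a symmetric error tolerance."""
    return abs(measured - nominal) <= tol

# Hypothetical Step 2 check: V_OH measured as 4.998 V against an assumed
# 5.000 V nominal, using the +/-0.003 VDC tolerance from the specification.
print(within_tolerance(4.998, 5.000, 0.003))   # True -> within limits
```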
Carry out the tests:
• Requires "egoless design"
• Do a design review: what, if anything, needs to change?
• Note that prototype and production tests may need to differ
[fig_10_02]
Terms:
• Black box
• White box
• Gray box: a mixture of white and black box. Example: the system includes parts from an outside vendor
There will be a series of tests (the "V" strategy): individual components, then more complex modules, ..., then the entire system.
Hardware and software strategies may differ; UML can provide some measure of commonality in terms of strategy.
[fig_10_02]
Be aware of common problems. Example: a component works individually, and part of the system works, but when the component is added the system no longer works. Why? As current demand increases, the drop across the power supply's internal impedance increases, so the output voltage decreases. Fix: increase the current limit if the power supply is not part of the design under test, or modify the design. (A worked example follows.)
[fig_10_07]
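A quick numeric sketch of this failure mode; the 5 V open-circuit voltage, 0.5 Ω internal impedance, and load currents are assumed values, not from the slides:

```python
def loaded_output(v_open: float, r_internal: float, i_load: float) -> float:
    """Output voltage after the drop across the supply's internal impedance."""
    return v_open - i_load * r_internal

V_OPEN, R_INT = 5.0, 0.5   # assumed supply: 5 V open-circuit, 0.5 ohm internal
print(loaded_output(V_OPEN, R_INT, 0.4))  # 4.8 V: system alone, still in spec
print(loaded_output(V_OPEN, R_INT, 1.2))  # 4.4 V: component added, may brown out
```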
Testing combinational logic: path sensitizing. [fig_10_09]
Example: test C, then B / A (a path-sensitizing sketch follows). [fig_10_11]
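The idea of path sensitizing, sketched on an assumed circuit F = (A AND B) OR C (not necessarily the circuit of fig_10_11): hold the other inputs so the gate chain passes the input under test straight through to the output, then check both values of that input:

```python
def circuit(a, b, c, stuck=None):
    """Evaluate F = (A AND B) OR C, optionally forcing an input stuck-at value."""
    faults = stuck or {}
    a = faults.get("A", a)
    b = faults.get("B", b)
    c = faults.get("C", c)
    return (a & b) | c

# Sensitize the A path: B=1 makes the AND pass A; C=0 makes the OR pass
# the AND output. F then equals A, so any stuck-at fault on A is visible.
for a in (0, 1):
    good = circuit(a, b=1, c=0)
    faulty = circuit(a, b=1, c=0, stuck={"A": 0})   # model A stuck-at-0
    if good != faulty:
        print(f"A={a} detects A stuck-at-0 (good F={good}, faulty F={faulty})")
```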
Example: shared path (OR gate). [fig_10_14]
Example: shared path (AND gate): test A, C, then B. [fig_10_17]
Masking / untestable faults
• Example: a 1→0 transition on B can produce a transient 1 (glitch) on D
• A redundant term can be included to mask this, but if the bottom AND gate is stuck at 1 we get an untestable potential fault
• In practice: maximize what you can test; the strategy will be design-specific (see the simulation sketch below)
[fig_10_18]
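A simulation sketch of why a fault on redundant logic can be untestable, using an assumed circuit F = A·B + B'·C with the consensus term A·C added as the hazard cover (fig_10_18 may differ; depending on the realization the equivalent fault is the AND term stuck at 0 or the NAND output stuck at 1):

```python
from itertools import product

def f_good(a, b, c):
    # F = A*B + B'*C + A*C; A*C is the redundant (consensus) hazard cover.
    return (a & b) | ((1 - b) & c) | (a & c)

def f_faulty(a, b, c):
    # Redundant term forced inactive (its gate stuck at the value that
    # removes the term): by the consensus theorem, F is logically unchanged.
    return (a & b) | ((1 - b) & c)

# Exhaustive check: no static input vector distinguishes good from faulty,
# so this fault cannot be caught by truth-table testing.
detecting = [v for v in product((0, 1), repeat=3) if f_good(*v) != f_faulty(*v)]
print("vectors detecting the fault:", detecting)   # -> []
```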
Other cases that will be encountered (examples):
• Single variable, multiple paths: path sensitizing
• Bridge faults
[fig_10_20]
Sequential logic: scan-path testing, example (see the sketch below). [fig_10_25]
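A behavioral sketch of a scan path (details assumed; fig_10_25 may differ): in test mode the flip-flops form a shift register, so a test state is shifted in serially, one functional clock is applied, and the captured next state is shifted out for comparison. The 3-bit `next_state` logic here is an arbitrary stand-in:

```python
def next_state(state):
    """Hypothetical combinational next-state logic for a 3-bit machine."""
    s0, s1, s2 = state
    return [1 - s2, s0, s1]

def scan_test(test_state):
    chain = [0, 0, 0]
    # 1) Test mode: shift the desired state in serially, one bit per clock.
    for bit in reversed(test_state):
        chain = [bit] + chain[:-1]
    # 2) Functional mode: apply ONE system clock; the logic computes the next state.
    chain = next_state(chain)
    # 3) Test mode again: shift the captured state out for comparison.
    captured = []
    for _ in range(len(chain)):
        captured.append(chain[-1])
        chain = [0] + chain[:-1]
    return list(reversed(captured))

observed = scan_test([1, 1, 0])
print(observed, "expected:", next_state([1, 1, 0]))   # [1, 1, 1] both ways
```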
Adding coded state values. [fig_10_28]
Test steps (first two rows of the table). [table_10_01]
• Testing terms:
• Alpha: in-house
• Beta: friendly outside users
• Verification: "proof" of the product; "only as good as the test suite"
• Validation: the verification tests are testing what they claim
• Acceptance: customer tests at acceptance and during use
• Production: ensure quality, execute quickly; these do not "add value"
• Self tests: built-in; note that testing is overhead and can introduce its own errors (a startup self-test sketch follows). Run:
• on startup
• on demand
• in the background
[fig_10_30]
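A minimal built-in self-test sketch for the "on startup" case; the RAM pattern check and additive firmware checksum are common power-on ingredients, and all names and values here are assumptions for illustration:

```python
def ram_test(ram):
    """Write/read back a pattern and its complement in every cell."""
    for pattern in (0x55, 0xAA):
        for addr in range(len(ram)):
            ram[addr] = pattern
            if ram[addr] != pattern:
                return False
    return True

def checksum_ok(image, expected):
    """Simple additive checksum over the stored firmware image."""
    return sum(image) & 0xFF == expected

ram = bytearray(256)                  # stand-in for the device's RAM
firmware = bytes([0x12, 0x34, 0x56])  # stand-in for the stored image
ok = ram_test(ram) and checksum_ok(firmware, expected=0x9C)
print("startup self-test:", "PASS" if ok else "FAIL")
```

Since self tests are overhead and can themselves introduce errors, a design choice worth noting: keep the startup variant short, and push longer checks to the on-demand or background variants.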