
Stimulus and Response


Presentation Transcript


  1. Stimulus and Response

  2. Stimulus and Response • Simple Stimulus • Verifying the Output • Self-Checking Testbenches • Complex Stimulus • Complex Response • Predicting the Output

  3. Simple Stimulus • Generating Stimulus is the process of providing input signals to the DUV • Every input to the DUV is an output from a stimulus model • Any deterministic waveform is easy to generate

  4. Verilog Example 1
  `timescale 1ns/1ns
  module testbench;
    …
    reg clk;
    parameter cycle = 15;
    always begin
      #(cycle/2);     // integer division: cycle/2 = 7
      clk = 1'b0;
      #(cycle/2);     // so the generated period is 14 ns, not 15 ns
      clk = 1'b1;
    end
  endmodule

  5. Verilog Example 2
  `timescale 1ns/1ns
  module testbench;
    …
    reg clk;
    parameter cycle = 15;
    always begin
      #(cycle/2.0);   // real division: cycle/2.0 = 7.5
      clk = 1'b0;
      #(cycle/2.0);   // but 7.5 ns still cannot be represented with 1 ns precision
      clk = 1'b1;
    end
  endmodule

  6. Verilog Example 3
  `timescale 1ns/100ps
  module testbench;
    …
    reg clk;
    parameter cycle = 15;
    always begin
      #(cycle/2.0);   // 7.5 ns is representable with 100 ps precision
      clk = 1'b0;
      #(cycle/2.0);   // the generated period is the intended 15 ns
      clk = 1'b1;
    end
  endmodule

  7. Simple Stimulus (Cont) – Complex Waveforms • Complex waveforms • Care must be taken not to overconstrain the waveform generation or limit it to a subset of its possible variations • Make sure there are many instances of the absolute min and max values • Controlled randomization
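
A minimal sketch of the "controlled randomization" idea, in the style of the earlier Verilog examples (the module name, the MIN/MAX bounds, and the 1-in-10 weighting are assumptions, not from the slides):

  module pulse_gen;
    parameter MIN = 5, MAX = 50;      // assumed legal range of the pulse width
    integer   width;
    reg       pulse;

    initial begin
      pulse = 1'b0;
      repeat (100) begin
        // Force the absolute extremes now and then; otherwise pick a
        // random width anywhere in [MIN, MAX].
        case ({$random} % 10)
          0:       width = MIN;
          1:       width = MAX;
          default: width = MIN + ({$random} % (MAX - MIN + 1));
        endcase
        #(width) pulse = ~pulse;      // hold the current level for 'width' ns
      end
    end
  endmodule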

  8. Simple Stimulus (Cont) – Synchronized Waveforms • Synchronized Waveforms • Stimulus for a DUV is never composed of 1 signal. You must synchronize all inputs to the DUV properly • In a synchronous design, most signals should be aligned with the clock

  9. Sample of Synchronized Waveforms
  Sample 5-9 (blocking assignments):
  always begin
    #50 clk = 1'b0;
    #50 clk = 1'b1;
  end
  initial begin
    rst = 1'b0;
    #150 rst = 1'b1;
    #200 rst = 1'b0;
  end
  Sample 5-10 (non-blocking assignments):
  always begin
    #50 clk <= 1'b0;
    #50 clk <= 1'b1;
  end
  initial begin
    rst = 1'b0;
    #150 rst <= 1'b1;
    #200 rst <= 1'b0;
  end

  10. Simple Stimulus (Cont) – Generating Waveforms • We talked about delta cycles – they can be equivalent to real delays • If, due to delta-cycle problems, you miss a value at one clock edge, you will get that value on the next! • Everything must be aligned! • Generating data waveforms • If not done properly, this can produce race conditions • You cannot ensure that the total number of delta cycles between clock and data is maintained, or at least kept in favor of the data signal • Interface specs never specify 0-delay values, so when generating synchronous data, always provide a real delay between the active edge and the transition on the data signal
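
A minimal sketch of the last point: drive synchronous data a real delay after the active clock edge so clock and data can never race (Tdelay and the signal names are assumptions):

  module data_gen;
    parameter Tdelay = 2;            // assumed clock-to-data delay, must be > 0
    reg clk, data;

    always begin
      #50 clk = 1'b0;
      #50 clk = 1'b1;
    end

    // Data changes a real delay after the active edge, never in the same
    // time step as clk, so any process sampling at the edge sees the
    // previous, stable value.
    always @(posedge clk)
      #(Tdelay) data = $random;
  endmodule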

  11. Simple Stimulus (Cont) – Encapsulating and Abstracting Waveform Generation • Encapsulating waveform generation • Generation of a waveform may need to be repeated during the simulation • Place the generation in a subprogram and call that subprogram with the vector to be applied as its input • Abstracting waveform generation • Using synchronous test vectors (as above) is cumbersome and hard to interpret (maintainability) • Easier if the operations accomplished by the vectors are abstracted! • Try to apply the worst possible combinations of inputs

  12. Simple Stimulus (Cont) – Abstraction Waveform Generation Example • D flip-flop with a 2-to-1 input mux and synchronous reset • Inputs: rst, d0, d1, sel, clk • Output: d_out • Subprograms needed: • Reset • Load input d0 • Load input d1

  13. Abstraction Waveform Generation Example (Cont) • Reset • Worst possible condition: • d0 = 1 • d1 = 1 • sel randomly set • Load d0 • Worst possible condition: • d1 is the complement of d0

  14. Abstraction Waveform Generation Example (Cont)
  task sync_reset;
  begin
    rst <= 1'b1;
    d0  <= 1'b1;
    d1  <= 1'b1;
    @(posedge clk);
    #(Thold);
    {rst, d0, d1, sel} <= 4'bxxxx;
    #(cycle - Thold - Tsetup);
  end
  endtask

  task load_d0;
    input data;
  begin
    rst <= 1'b0;
    d0  <= data;
    d1  <= ~data;
    sel <= 1'b0;
    @(posedge clk);
    #(Thold);
    {rst, d0, d1, sel} <= 4'bxxxx;
    #(cycle - Thold - Tsetup);
  end
  endtask

  15. Abstraction Waveform Generation Example (Cont)
  initial begin
    sync_reset;
    load_d0(1'b1);
    sync_reset;
    load_d1(1'b1);
    load_d0(1'b0);
    load_d1(1'b1);
    sync_reset;
    …..
  end
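
The load_d1 task called above is not shown in the transcript; by analogy with load_d0, it would presumably look like this (a sketch, not taken from the original slides):

  task load_d1;
    input data;
  begin
    rst <= 1'b0;
    d0  <= ~data;                    // worst case: d0 is the complement of d1
    d1  <= data;
    sel <= 1'b1;
    @(posedge clk);
    #(Thold);
    {rst, d0, d1, sel} <= 4'bxxxx;
    #(cycle - Thold - Tsetup);
  end
  endtask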

  16. Verifying the Output • Generating stimulus is only about 30% of the job; 70% is verifying the output • The most obvious method is visual inspection • ASCII output • Waveforms

  17. Producing Simulation Results • Which signals are significant changes over time • To determine what is correct, you must model this knowledge • Producing the proper simulation results involves modeling the behavior of the signal sampling • Sample at regular intervals (clk) • Sample on signals of interest (only when they change)
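
A minimal sketch of the two sampling strategies in Verilog (clk and q are placeholders for real DUV signals):

  module sample_tb;
    reg clk, q;

    always #50 clk = ~clk;
    always @(posedge clk) q <= $random;   // stand-in for a DUV output

    initial begin
      clk = 1'b0;
      q   = 1'b0;
      #1000 $finish;
    end

    // Sample at a regular interval: report q at every rising clock edge.
    always @(posedge clk)
      $strobe("%t: q = %b", $time, q);

    // Sample only the signal of interest: report q only when it changes.
    always @(q)
      $display("%t: q changed to %b", $time, q);
  endmodule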

  18. Minimizing Sampling • Minimizing the sampling improves the simulator performance • In VHDL only put interesting signals on sensitivity list or use: ‘wait until <interesting condition>’ • In Verilog use • $monitor(“…”, <signal list>) • $monitoroff • $monitoron
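
A short sketch of the Verilog calls, assuming the signals of the flip-flop testbench from slides 12-15 and an arbitrary quiet window:

  initial begin
    // A single $monitor is active at a time; it prints a line whenever any
    // signal in its list changes value.
    $monitor("%t: rst=%b d0=%b d1=%b sel=%b d_out=%b",
             $time, rst, d0, d1, sel, d_out);
    // Silence it during an uninteresting phase, then re-enable it.
    #1000 $monitoroff;
    #5000 $monitoron;
  end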

  19. Visual Inspection of Waveforms • Results are easier to understand when plotted over time • The advantage is that signals are plotted continuously over time, not only at the sampled points of a text view • How to turn it on is tool dependent • There is a performance impact, so minimize the total number of signals viewed • Mostly used for debug

  20. Self-Checking Testbenches • Use self-checking – different techniques • Specify input and expected output for each clock cycle • Problems • Difficult to maintain • Difficult to specify • Difficult to debug • Require perfectly synchronous interfaces
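
A minimal sketch of the cycle-by-cycle vector style, reusing the flip-flop signals from slides 12-15 (the vector files and the 256-cycle depth are assumptions); maintaining such tables by hand is exactly what makes this approach painful:

  reg [3:0] stim_vec [0:255];     // {rst, d0, d1, sel}, one vector per cycle
  reg       exp_vec  [0:255];     // expected d_out, one value per cycle
  integer   i;

  initial begin
    $readmemb("stim.vec", stim_vec);       // hypothetical vector files
    $readmemb("expect.vec", exp_vec);
    for (i = 0; i < 256; i = i + 1) begin
      {rst, d0, d1, sel} <= stim_vec[i];   // apply this cycle's inputs
      @(posedge clk);
      #(Thold);
      if (d_out !== exp_vec[i])            // check this cycle's output
        $display("%t: ERROR cycle %0d: d_out=%b expected=%b",
                 $time, i, d_out, exp_vec[i]);
    end
  end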

  21. Self-Checking Testbenches (Cont) • Golden Vectors – Set of reference simulation results • DUV vectors are captured and then compared against the golden set. • If results are stored in ASCII format, use diff command • Some tools allow for waveform comparisons • Significant maintenance • Separate clock domain references
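
A minimal sketch of capturing results in ASCII form so they can be compared against a golden file (the file names are placeholders):

  integer log_file;

  initial log_file = $fopen("run.log");    // capture this run's results

  // Record the sampled output once per clock cycle in a stable text format.
  always @(posedge clk)
    $fdisplay(log_file, "%0t d_out=%b", $time, d_out);

After the run, "diff run.log golden.log" (or a waveform-compare tool) flags any deviation from the golden reference.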

  22. Self-Checking Testbenches (Cont) • Run-Time Result Verification • Results compared in parallel with the stimulus generation • Use a reference model • The DUV and reference model are subjected to same stimulus • Outputs of both, DUV and reference model, are constantly monitored and compared.
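
A minimal sketch of run-time comparison against a reference model (the module and port names are assumptions; clock and stimulus generation are omitted):

  module compare_tb;
    reg  clk, rst, d0, d1, sel;
    wire duv_out, ref_out;

    // Hypothetical instances: the actual design and a behavioral reference
    // model, both wired to the same stimulus.
    mux_ff     duv    (.clk(clk), .rst(rst), .d0(d0), .d1(d1),
                       .sel(sel), .d_out(duv_out));
    mux_ff_ref refmod (.clk(clk), .rst(rst), .d0(d0), .d1(d1),
                       .sel(sel), .d_out(ref_out));

    // Outputs of both models are monitored and compared on every cycle.
    always @(posedge clk)
      if (duv_out !== ref_out)
        $display("%t: MISMATCH duv=%b ref=%b", $time, duv_out, ref_out);
  endmodule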

  23. Self-Checking Testbenches (Cont) • Focus on operations instead of input and output vectors • Include the verification in the subprograms that encapsulate the operations • Instead of simply applying stimulus, include the checking; then just run the operations, individually or in sequence • Must verify that the operations are being performed.

  24. Complex Stimulus • Talked about simple stimulus • Complex stimulus includes feedback from DUV to the stimulator • Most desirable is a bus-functional model that is configurable.

  25. Complex Stimulus (Cont) • Feedback between stimulus and design • The generator can wait for feedback before continuing • Include timing and functional verification in the feedback monitoring • Using feedback can cause deadlock during testing • The DUV may not provide feedback, and the model may not provide any more stimulus until there is feedback • Eliminate the possibility of deadlock! • E.g. a timeout (flagged as an error!) after which the test continues • Or the testcase fails and stops immediately
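
A minimal sketch of a feedback wait that cannot deadlock, using the named-fork/disable timeout idiom (the ack signal and the 1000-cycle bound are assumptions):

  // Wait for an acknowledge from the DUV, but never hang the testcase.
  task wait_for_ack;
    fork : watchdog
      // Normal completion: the DUV answers.
      @(posedge ack) disable watchdog;
      // Deadlock protection: give up after 1000 clocks, flag it, continue.
      begin
        repeat (1000) @(posedge clk);
        $display("%t: ERROR - timeout waiting for ack", $time);
        disable watchdog;
      end
    join
  endtask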

  26. Complex Stimulus (Cont) • Asynchronous interfaces • Most vectors are inherently synchronous • Many interfaces are specified in an asynchronous fashion (even though the implementation is synchronous – FSMs, flip-flops, etc.) • If a clock is not part of the specification, it should not be part of the verification nor part of the stimulus • Behavioral models do not need a clock • Need to think of all failure modes

  27. Complex Stimulus (Cont) – Example • CPU operations encapsulated using procedures • Encapsulated complex stimulus models are known as BFMs (Bus-Functional Models) • If you had a specification for a 386SX read cycle, using vector stimulus would be inefficient – why?

  28. Complex Stimulus (Cont) - Example • Extend the CPU operations to include writes. Using test vectors, can’t do read-modify-write operations. • To do this, you’ll want to use the value returned on a read for a future write (after modifying it).
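
A minimal sketch of a read-modify-write built on encapsulated CPU operations (the cpu_read/cpu_write tasks, the address, and the modification are assumptions; their bus-level implementation is not shown here):

  reg [31:0] val;

  initial begin
    cpu_read (32'h0000_1234, val);       // task with an output argument
    val = val | 32'h0000_0001;           // modify: set the low bit
    cpu_write(32'h0000_1234, val);       // write the modified value back
  end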

  29. Complex Stimulus (Cont) • Configurable operations • Some interfaces have signals or behaviors that are configurable • You don't want to create nearly identical models – maintenance issues • Simple configurable elements become complex when grouped • Solution: create one model with configurable operations; you can then use the model however you need to

  30. Complex Response • We identified that visual inspection is not the way to go – and that was with simple responses; what about complex responses? • This must be automated; one way is with BFMs • What is a complex response?

  31. Complex Response – (Cont) • Example – UART transmit path • Waiting for the output before applying the next input would prevent the ability to stress the design (or cause interesting conditions) • Filling up the FIFO is one such condition • To stress the DUV under maximum conditions, you must decouple generation from checking.

  32. Complex Response – (Cont) • How do you deal with unknown or variable latency? • This latency is usually a by-product of the architecture or implementation. You may not care what it is. • If it is a by-product of the implementation and not a design requirement, why enforce one in verification?

  33. Complex Response – (Cont) • How to verify output independently? • Put output checkers and stimulus generators in separate execution threads. • Processes in VHDL • Always/initial and fork/join in Verilog • Must synchronize to know when to start checking, etc.
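
A minimal sketch of decoupled generation and checking in Verilog (the send and check_output tasks are placeholders for BFM and monitor operations):

  initial begin
    fork
      begin : generator
        repeat (16) send($random);       // drive the DUV as fast as it accepts
      end
      begin : checker
        repeat (16) check_output;        // drain and check at the DUV's pace
      end
    join
    $finish;
  end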

  34. Complex Response – (Cont) • Earlier we encapsulated input operations – we can do the same for outputs • For stimulus, the subprograms took their arguments as stimulus • For output operations, the arguments are the expected results (results that the DUV should output) • The implementation should be as configurable as the stimulus • Remember – consider all possible failure modes.

  35. Complex Response – (Cont) • This procedure ‘recv’ is very limited • It can only be used in the current scope • You pass in the expected value and it compares the actual output against this predefined expectation • What if the output is to be ignored until a predetermined sequence of events? Or data? • What if the output needs to be fed back to the stimulus model? • What if…? • What if…? • The solution is to create a more generic output monitor.

  36. Complex Response – (Cont) • Generic output monitor: • Return the data that the DUV produced back to the caller! • The ‘higher authority’ now decides what is correct and what is not. It also controls the stimulus model, so it knows more about the state of the environment and what is to be tested • The other things (protocols, etc.) are still verified • But do not arbitrarily constrain the input.
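
A minimal sketch of such a generic monitor for the UART transmit path of slide 31: it checks the serial protocol itself but returns the received byte to the caller, which decides whether the value is correct (tx and bit_period are assumptions):

  task recv;
    output [7:0] data;
    integer i;
  begin
    @(negedge tx);                       // detect the start bit
    #(bit_period / 2);
    if (tx !== 1'b0)
      $display("%t: ERROR - start bit not stable", $time);
    for (i = 0; i < 8; i = i + 1) begin
      #(bit_period);
      data[i] = tx;                      // sample each data bit mid-cell
    end
    #(bit_period);
    if (tx !== 1'b1)
      $display("%t: ERROR - missing stop bit", $time);
  end
  endtask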

  37. Complex Response – (Cont) • Monitoring multiple possible operations • You may have a situation where more than one type of output may be OK (branch prediction, out-of-order processing, etc.) • You can't predict the output unless you model the details of the implementation • If you verify for a particular order, you are overconstraining the environment (heading toward directed tests).

  38. Complex Response – (Cont) • How do you write an encapsulated output monitor for this? • First, write a monitor that identifies the next cycle • It verifies the preamble common to all operations on the output interface until the operation becomes unique • It then returns any information collected so far to the testbench • The testbench is left to call the appropriate subprograms to complete the verification.

  39. Complex Response – (Cont) • We defined a stimulator as a model that has outputs. If a monitor must provide output back to the DUV, is it not a stimulator? • A stimulator (or generator) is a model that initiates a transaction • A monitor is a model that may or may not respond to an operation initiated by the DUV.

  40. Complex Response – (Cont) • Monitoring bidirectional interfaces • Example: bridge chip • Cycles initiated on the on-chip bus are translated to PCI transactions (if the addresses match) • Allows master devices (on-chip) to transparently access slave devices on the PCI bus

  41. Complex Response – (Cont) • What do you need to verify this? • On-chip bus cycle generator • PCI bus cycle monitor • Could you use a memory as the slave device instead of monitor? • Have the generator write to PCI space then read it back, and continue doing this in a random fashion?

  42. Complex Response – (Cont) • Using the PCI monitor reduces the risk • Have the monitor detect the PCI cycle and notify the testbench, along with the address being read or written • The testbench then decides whether it is correct.

  43. Complex Response – (Cont) • Let's slice the PCI cycle into multiple monitors • One to handle the preamble and the type of cycle • One to handle each data transfer • Input/output of the monitor carries the data read/written • An output indicates whether to continue with more data transfers or terminate • One to handle cycle termination • Now that the cycle is broken up, the master and slave can throttle the transfer rate by asserting irdy_n and trdy_n (these can be parameters for randomization).

  44. Complex Response – (Cont) • By using the generic PCI bus monitor, the testcase becomes shorter. The monitor provides access to all bus values • It also provides an easy mechanism to catch exceptions • It also enables the use of a few addresses to provide a sufficient test suite.

  45. Predicting the Output • Unstated assumption with self-checking testbenches is that you have detailed knowledge of the output to be expected • This is the most crucial factor • Knowing exactly which output to expect and when determines the functional correctness.

  46. Predicting the Output – (Cont) • Data formatters • Expected output = data input (reformatted) • Simplest output prediction process • May want to forward data to the monitor one value at a time • Due to pipelines and latency, this may constrain the generator (it can't change the data until it has been checked) • Instead of one value at a time, use a FIFO structure • May want to use a global array that both generator and monitor use • Another approach (similar to the global array) is to read in values from a file; this can be more dynamic.
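
A minimal sketch of the FIFO idea: a scoreboard shared by the generator and the monitor, so the generator never has to wait for a value to be checked (the width, depth, and task names are assumptions):

  reg [7:0] expected [0:63];
  integer   wr_ptr, rd_ptr;

  initial begin wr_ptr = 0; rd_ptr = 0; end

  // Generator side: push the expected (reformatted) value as each input is sent.
  task push_expected;
    input [7:0] value;
  begin
    expected[wr_ptr] = value;
    wr_ptr = (wr_ptr + 1) % 64;
  end
  endtask

  // Monitor side: pop the oldest expected value and compare it with what the
  // DUV actually produced.
  task check_actual;
    input [7:0] actual;
  begin
    if (actual !== expected[rd_ptr])
      $display("%t: ERROR - got %h, expected %h",
               $time, actual, expected[rd_ptr]);
    rd_ptr = (rd_ptr + 1) % 64;
  end
  endtask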

  47. Predicting the Output – (Cont) • Packet Processors • Portion of packet is transformed somehow • Other portion is untouched • Use untouched field to encode the expected transformation • Simplifies testbench • All controls for stimulus and expect generation are in one location. • Include all necessary information in payload to determine correctness.
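
A minimal sketch of encoding the expected transformation in the untouched field (the 16-bit header / 32-bit payload layout and the helper routines expected_xform and drive_packet are all assumptions):

  // Generator side: hide the expected transformed header in the payload,
  // which the DUV passes through untouched.
  task send_packet;
    input [15:0] header;
  begin
    drive_packet(header, {16'h0000, expected_xform(header)});
  end
  endtask

  // Monitor side: the expected header travels with the packet itself.
  task check_packet;
    input [15:0] out_header;
    input [31:0] out_payload;
  begin
    if (out_header !== out_payload[15:0])
      $display("%t: ERROR - header %h, expected %h",
               $time, out_header, out_payload[15:0]);
  end
  endtask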

  48. Predicting the Output – (Cont) • Complex Transformations • Expected output can only be determined by reproducing same transformation. • use alternative means (different algorithm) • Example was the DSP (using reals)
