Verification Techniques for Macro Blocks (IP)
• Overview
• Inspection as Verification
• Adversarial Testing
• Testbench Design
• Timing Verification
Overview of Macro Verification
• Overall verification plan
• Strategies
  • Sub-block simulation
  • Macro simulation
  • Prototyping
  • Limited production
Overall Verification Plan
• Functional verification based on
  • Requirements
  • Specifications
  • Behavioral models
  • Extended functions (for generalization and re-use)
• Engineering review of verification plan
• Documentation of verification steps
Strategy: Bottom-Up Testing
• Verify individual sub-blocks
  • Focus on coverage of data values and internal states
• Verify complete macro (block)
  • Focus on interfaces and timing among sub-blocks
  • Test I/O timing and “corner cases”
• Perform prototyping
  • Proto-board
  • PC board
  • FPGA
  • Platform based design*
Verification vs. Validation
• In the hardware world:
  • Verification means testing
  • Validation means “formal proof”
• In the software world:
  • Verification means “formal proof”
  • Validation means testing
Don’t be confused…
Verification vs. Manufacturing Testing
• Verification confirms that the design is correct
  • It meets the requirements and specifications
  • It meets timing, power, and area goals
  • It meets the re-use goals
  • It assumes a debugging cycle
• Manufacturing test
  • Fixed set of tests (vectors from above)
  • Go/No-Go tests
Fundamental Ideas
• Controllability
  • Can you wiggle every wire?
  • Can you set every bit?
• Observability
  • Can you see the state of every bit?
  • Can you see the value on every wire?
During validation, we can do this in the simulator – especially in a bottom-up mode.
(In manufacturing test this is not so easy.)
Types of Verification Tests
• Compliance Testing
  • Against specifications and/or standards
• Corner Case Testing
  • Try to break the design with complex scenarios
  • Example: the prompt asks “Answer Yes or No [Y]?” and the user types “fred”
• Random Testing
  • Use statistical models to generate data patterns (see the sketch below)
• Real Code Testing
  • Work with the software team, use real data
• Regression Testing
  • Keep all tests in a suite and add to them
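A minimal sketch of corner-case, random, and regression testing in Python, assuming a hypothetical 8-bit saturating adder as the design under test and an identical golden model to predict results; in a real flow the stimulus would be driven into the RTL by the testbench or by a tool such as Specman or Vera.

```python
import random

def saturating_add8(a, b):
    """Hypothetical DUT model: 8-bit add that saturates at 255."""
    return min(a + b, 255)

def reference_add8(a, b):
    """Golden reference model used to predict the expected output."""
    return min(a + b, 255)

# Corner-case testing: deliberately hit the boundaries of the data range.
corner_values = [0, 1, 127, 128, 254, 255]
corner_tests = [(a, b) for a in corner_values for b in corner_values]

# Random testing: draw operands from a statistical model (uniform here).
random.seed(1)
random_tests = [(random.randint(0, 255), random.randint(0, 255)) for _ in range(1000)]

# Regression testing: keep every test in the suite and add to it over time.
regression_suite = corner_tests + random_tests

failures = 0
for a, b in regression_suite:
    got = saturating_add8(a, b)
    expected = reference_add8(a, b)
    if got != expected:
        failures += 1
        print(f"FAIL: a={a} b={b} got={got} expected={expected}")

print(f"{len(regression_suite)} tests run, {failures} failures")
```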
Testing in the Product Development Cycle
(Figure: bugs found vs. time)
• More complete tests find more subtle bugs
• Fixes often make the design more complex
• Test time goes up
• New bugs are introduced by fixes
• Repeat…
Verification Tools
• Simulation
  • Functional, cycle, RTL, gate level, circuit level
• Testbench automation tools
  • Verisity (and others) testing language
• Code coverage tools
  • Measurements of what got wiggled
• Hardware modeling
  • Comparison against / with real hardware
• Emulation
  • High-speed hardware boxes interfaced to the simulator
• Prototyping
“Static” Tools
• Static timing analysis tools
• “Lint-like” tools
• Formal verification tools
  • Model checking
    • Build an abstract model of the design and its environment
    • Uses “smart” exhaustive methods for safety and liveness tests (see the sketch below)
    • Liveness: something good will eventually happen
    • Safety: nothing bad will ever happen
  • Theorem proving
    • Uses automatic theorem provers to verify assertions about the design
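As a rough illustration of model checking, the sketch below exhaustively explores the reachable states of a toy two-requester arbiter written directly in Python and checks the safety property that both grants are never asserted in the same cycle. The arbiter and its state encoding are invented for illustration; real model checkers work from a formal model of the RTL, not hand-written Python.

```python
from itertools import product

def arbiter_next(state, req0, req1):
    """Toy arbiter: grant at most one requester, alternating on simultaneous requests."""
    last = state  # state remembers which side was granted last
    if req0 and req1:
        grant0, grant1 = (last == 1), (last == 0)
    else:
        grant0, grant1 = bool(req0), bool(req1)
    next_state = 0 if grant0 else (1 if grant1 else last)
    return next_state, grant0, grant1

# Exhaustive reachability search ("smart" in real tools, brute force here).
reachable = {0}
frontier = [0]
violations = []
while frontier:
    state = frontier.pop()
    for req0, req1 in product([0, 1], repeat=2):
        nxt, g0, g1 = arbiter_next(state, req0, req1)
        if g0 and g1:                       # safety: "nothing bad ever happens"
            violations.append((state, req0, req1))
        if nxt not in reachable:
            reachable.add(nxt)
            frontier.append(nxt)

print("reachable states:", sorted(reachable))
print("safety violations:", violations or "none")
```

Liveness properties (e.g., every request is eventually granted) need reasoning about infinite execution paths and are checked with different algorithms than this simple reachability sweep.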
Inspection as Verification
“The fastest, cheapest, and most effective way to detect and remove bugs is by careful inspection of the design and the code.”
• Design reviews
  • A large group reviews the requirements, specifications, assumptions, and overall design
• Code reviews
  • Smaller groups perform line-by-line review of the implementation
Reviews must be done in a collegial atmosphere with the goal of enhanced quality, not to assess designer performance.
Prototyping
• Use prototyping when:
  • You can afford it
    • $cost
    • $time
  • You cannot wait for the time it takes to find bugs using available simulation techniques
• The prototype is just a fast simulator for the device itself
Adversarial Testing
The designer’s tests have the goal of showing that the system will work…
• Adversarial testing uses a verification team of experts in testing all kinds of designs
• Goal: develop tests to show that the system will fail
• Report failures, and the test cases that caused them, to the design team
Sub-Block Simulation
• Use automated response checking
  • It is not appropriate for the designer to check waveforms “by eye”
  • How many times will it be done?
• Use the automated tests to build the macro-block testbench
• Use assertions in code to test assumptions (see the sketch below)
• Test sub-blocks for 100% statement and path coverage
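A minimal sketch of assertions in code plus automated response checking, assuming a hypothetical 4-entry FIFO model written in Python; in HDL the same idea is expressed with assert statements or SVA/PSL properties, so that assumption violations stop the run instead of hiding in a waveform.

```python
class FifoModel:
    """Hypothetical 4-entry FIFO used only to illustrate embedded assertions."""
    DEPTH = 4

    def __init__(self):
        self.data = []

    def push(self, value):
        # Assertion guarding an interface assumption: never push when full.
        assert len(self.data) < self.DEPTH, "protocol violation: push on full FIFO"
        self.data.append(value)

    def pop(self):
        # Assertion guarding the complementary assumption: never pop when empty.
        assert self.data, "protocol violation: pop on empty FIFO"
        return self.data.pop(0)

# Automated response checking: compare against expectations, don't eyeball waves.
fifo = FifoModel()
expected = []
for value in range(4):
    fifo.push(value)
    expected.append(value)
for value in expected:
    got = fifo.pop()
    assert got == value, f"data corruption: expected {value}, got {got}"
print("sub-block test passed")
```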
Testbench
• The testbench for each component
  • Is the model of its external world
  • Must generate (all) legal input sequences
  • Must check output sequences
  • Must alert the designer and log discrepancies between expected and actual results
• The testbench is often more complex than the component itself
Sub-block Testbench Design
(Figure: Input Transaction Generator → Device Under Test (DUT) → Output Transaction Checker)
• Assume simple I/O relationships
• Input transaction data types, sequences
• Output transaction data types, sequences
Input Stimulus
• Generate “all” legal sequences of inputs
• Look at the functionality of the DUT
• Check corner cases
• Achieve 100% coverage
Output Checking
• Check for legal sequences of outputs
• Look at the functionality of the DUT
• Check corner cases
• Check input/output relationships (a sketch of the generator/DUT/checker structure follows)
• Ideally each sub-block has
  • Simple functionality
  • Simple I/O
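A minimal sketch of the generator / DUT / checker structure from the figure above, assuming a hypothetical parity-encoder sub-block; the checker verifies the input/output relationship rather than comparing against hard-coded vectors, so it works for corner-case and random stimulus alike.

```python
import random

def dut_parity_encode(word):
    """Hypothetical DUT: append an even-parity bit to an 8-bit word."""
    parity = bin(word).count("1") % 2
    return (word << 1) | parity

def input_transaction_generator(n, seed=0):
    """Generate legal input transactions: corner values first, then random ones."""
    rng = random.Random(seed)
    yield from (0x00, 0x01, 0x7F, 0x80, 0xFE, 0xFF)   # corner cases
    for _ in range(n):
        yield rng.randrange(0x100)

def output_transaction_checker(word, result):
    """Check the input/output relationship: data preserved, total 1-count even."""
    assert result >> 1 == word, "data field corrupted"
    assert bin(result).count("1") % 2 == 0, "parity bit incorrect"

for word in input_transaction_generator(1000):
    output_transaction_checker(word, dut_parity_encode(word))
print("all transactions checked")
```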
Getting Ready for Integration
• Timing checks
  • Timing plan for each sub-block
  • Clocking, handshaking
  • Timing budgets, per-cycle latency budget
• Area and power checks
  • Are the sub-blocks within their budgets?
Concurrent Design, Synthesis & Testbench Development
(Figure: parallel flows for the Test Team, Design Team, and Synthesis Team, starting from the functional specifications, documentation, and behavioral model)
• Test Team: develop the macro testbench and test cases from the macro specifications / behavioral model
• Design Team: macro partitioning and block specifications; specify blocks; code and test; block integration; test the full block
• Synthesis Team: coordinate timing; synthesize the block designs; synthesize the full block
• Finally: validate against the specification and test plan / productization
Integration Steps
• Integrating the sub-blocks
  • Top-level netlist
  • Final synthesis
  • Final testing
• Manufacturing testability
  • Scan insertion
  • ATPG
  • Test coverage
Timing Verification
• Post-synthesis timing
  • “More real” delays
  • Delay models based on statistical models
  • No layout yet, so the delays are still not real
• Check for clock/data skew
• Signal loading, critical path analysis (a slack-calculation sketch follows)
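A rough sketch of the idea behind post-synthesis timing checks: sum statistical gate delays along register-to-register paths and compare the worst path, plus skew, against the clock period. The netlist, delay values, and clock numbers below are invented for illustration; real static timing analysis also models wire load, setup/hold margins, and multiple corners.

```python
from functools import lru_cache

# Hypothetical post-synthesis netlist: node -> (statistical delay in ns, fanins).
netlist = {
    "ff_a":   (0.0, []),             # launching flip-flops (path start points)
    "ff_b":   (0.0, []),
    "and1":   (0.8, ["ff_a", "ff_b"]),
    "xor1":   (1.1, ["and1", "ff_b"]),
    "mux1":   (0.6, ["xor1", "ff_a"]),
    "ff_out": (0.2, ["mux1"]),       # capturing flip-flop (path end point)
}

@lru_cache(maxsize=None)
def arrival_time(node):
    """Longest-path arrival time at a node (memoized recursion over fanins)."""
    delay, fanins = netlist[node]
    return delay + max((arrival_time(f) for f in fanins), default=0.0)

clock_period_ns = 3.0
clock_skew_ns = 0.2
worst = arrival_time("ff_out")
slack = clock_period_ns - clock_skew_ns - worst
print(f"critical path = {worst:.2f} ns, slack = {slack:.2f} ns")
print("timing met" if slack >= 0 else "timing VIOLATED")
```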
Post-Synthesis Testbench Development
• Refine the behavioral model for testing and verification
• Wrappers for each sub-block as needed (for signal compatibility before/after synthesis)
  • After synthesis every wire is a “bit”
• Document the test plan and test vectors
Macro Testbench
• Design is more complex
  • Functional complexity
  • I/O complexity
• More people involved
• Testbench will be delivered with the product
• Testbench will be used to develop scan path and manufacturing tests
Generating Testbenches
• Not simple “do scripts”
• Complex state machines (a sketch of this loop follows):
  • Loop:
    • Generate stimulus
    • Get output
    • Check results
    • Log results
    • Change stimulus parameters
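A minimal sketch of that loop in Python, reusing the hypothetical parity-encoder DUT from the sub-block example; a real macro testbench implements the same loop as state machines in HDL or in a verification language.

```python
import logging
import random

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

def dut_parity_encode(word):
    """Hypothetical DUT (stand-in for the macro under test)."""
    return (word << 1) | (bin(word).count("1") % 2)

rng = random.Random(42)
params = {"max_value": 0x0F}     # stimulus parameters, widened on each pass
failures = 0

for pass_number in range(4):                                  # loop
    for _ in range(100):
        stimulus = rng.randrange(params["max_value"] + 1)     # generate stimulus
        response = dut_parity_encode(stimulus)                # get output
        ok = (response >> 1 == stimulus and                   # check results
              bin(response).count("1") % 2 == 0)
        if not ok:
            failures += 1
            logging.error("stimulus=%#x response=%#x", stimulus, response)  # log results
    logging.info("pass %d done, max_value=%#x, failures=%d",
                 pass_number, params["max_value"], failures)
    params["max_value"] = min(params["max_value"] * 16 + 15, 0xFF)  # change stimulus parameters
```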
Testbench Simple Example
(Figure: DUT connected to a ROM, memory, register file, I/O block, and clock/timing generator over I/O, address, and data buses)
• Simulate buses
  • Check bus timing
  • Check data ranges
• Simulate ROM
  • Also, supply data values
  • Check address ranges (see the sketch below)
• Register file
  • Also, r/w timing
  • Overwriting data
• I/O
  • Also, response to asynchronous events
  • Response to out-of-range data
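A minimal sketch of one bench component from the example above: a ROM model that both supplies data values and checks address ranges, logging violations instead of stopping the run. The size and contents are invented for illustration.

```python
class RomModel:
    """Hypothetical 256-word ROM bench component: supplies data and checks addresses."""

    def __init__(self, contents):
        self.contents = list(contents)
        self.errors = []

    def read(self, address):
        # Checking side of the bench component: flag out-of-range addresses.
        if not 0 <= address < len(self.contents):
            self.errors.append(address)
            return 0  # drive a benign value so the simulation can continue
        return self.contents[address]

rom = RomModel(range(256))
assert rom.read(0x10) == 0x10     # normal read supplies the stored data value
rom.read(0x1FF)                   # out-of-range access is logged, not fatal
print("address errors:", [hex(a) for a in rom.errors])
```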
Help!
• Use existing models for bench components
  • Have a test team, with their own test library
  • Take library components and enhance them with checking/logging functions
• Use behavioral models for the bench
  • Bus functional models
  • Need to have statistical bounds on input/output timing
• Use random generators for stimulus sequences
• Use C/C++ models and co-simulation for the bench
  • Need to have a “master bench controller” to coordinate I/O behaviors on all components
Use the Behavioral Model
(Figure: the Input Transaction Generator drives both the synthesized block and the behavioral model; the Output Transaction Checker compares their outputs — a sketch of this comparison follows)
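A minimal sketch of that comparison: drive the same transactions into a stand-in for the synthesized block and into the behavioral model, and let the checker compare them transaction by transaction. Both models here are hypothetical Python functions used only for illustration; in practice the synthesized netlist runs in a gate-level simulator alongside the behavioral model.

```python
import random

def behavioral_model(word):
    """Golden behavioral model of the macro (hypothetical)."""
    return (word << 1) | (bin(word).count("1") % 2)

def synthesized_block(word):
    """Stand-in for the post-synthesis netlist simulation (hypothetical)."""
    return (word << 1) | (bin(word).count("1") % 2)

rng = random.Random(7)
mismatches = []
for _ in range(1000):
    word = rng.randrange(0x100)                            # input transaction generator
    if synthesized_block(word) != behavioral_model(word):  # output transaction checker
        mismatches.append(word)

print("mismatches:", mismatches or "none")
```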
Use Commercial Tools
• Vera – Synopsys
• Specman Elite – Verisity
Both tools use abstract data types to allow the user to define data ranges, states, and scenarios. The tools support statistical models for stimulus, checking, and logging (beyond the scope of this class).
Test, Test, Test
• Back to Requirements and Specifications
• Corner Cases
• Random Cases
• Regression Testing
• Code Coverage
Types of Coverage (a sketch of coverage counting follows)
• Statement Coverage
  • Number of times each statement was executed
• Branch Coverage
  • Each side of each if/then/else executed
• Condition Coverage
  • Checks each boolean clause in a branch condition
• Path Coverage
  • Paths between blocks (if – if, if – else, else – if)
• Trigger Coverage
  • Each signal on a sensitivity list
• Toggle Coverage
  • 1/0 transition on every signal
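A minimal sketch of how a coverage tool accumulates these counts, here by instrumenting a hypothetical one-cycle model by hand with statement, branch, and toggle counters; real tools instrument the HDL automatically and report the results per line and per signal.

```python
from collections import Counter

coverage = Counter()

def dut_step(enable, data, prev_out):
    """Hypothetical one-cycle model, instrumented for coverage by hand."""
    coverage["stmt:dut_step"] += 1               # statement coverage
    if enable:
        coverage["branch:enable_true"] += 1      # branch coverage (then side)
        out = data & 0x0F
    else:
        coverage["branch:enable_false"] += 1     # branch coverage (else side)
        out = prev_out
    if out != prev_out:
        coverage["toggle:out"] += 1              # toggle coverage (value changed)
    return out

out = 0
for enable, data in [(1, 0x3A), (0, 0x00), (1, 0x05), (1, 0x05)]:
    out = dut_step(enable, data, out)

for point, hits in sorted(coverage.items()):
    print(f"{point:22s} {hits}")
```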
Still Not Enough
• Coverage does not verify that the code works correctly, just that it was run
• Does not verify that the design meets specifications or requirements
• Does not check the results of optimization and synthesis tools