Teaching Functional Verification Workshop, DAC 2002, Sunday June 9th. Testbench Automation Concepts. Matthew J Morley
Agenda • The University Program • Functional Verification at a Glance • Key Testbench Concepts • Setting Objectives • Modeling and Data Abstraction • Data Injection and Collection • Self Checking • Summary
The University Program • A low-key affair since October 1999 • http://www.verisity.com/programs/university • Currently around 15 member universities • Canada, USA, Europe and Asia • You gain: • Access to Verisity’s products and documentation • e language reference materials • Entry into the Verification Vault upon request • On-line use of the Verification Advisor • We offer limited support for research projects and course development around e and Specman Elite
Verification Advisor • An extensive repository of verification lore • Organized into design patterns • HTML-based for easy access • Kept in sync with major Specman releases
Verification Advisor • Provides expert guidance from the beginning: • What should I do first? What is the right design process? • What kind of data structures and checking schemes do I need for this task? • As well as concrete, clearly documented examples for the most complex activities: • Selecting verification components for data-transfer devices, bus-client devices, and protocols like the PCI bus • Building a fully functional eVC from scratch
The Functional Verification Space
[Diagram: the verification process (environment development, writing tests) plotted against design stage (module, chip, system; IP, SoC), with an arrow marking the trend toward system-level verification]
Agenda • The University Program • Functional Verification at a Glance • Key Testbench Concepts • Setting Objectives • Modeling and Data Abstraction • Data Injection and Collection • Self Checking • Summary
Functional View
[Diagram: the design/interface specification and functional test plan drive constraint-driven test generation, data & temporal checking, and functional coverage analysis, all operating around the HDL simulator and the HDL device under test]
Process View
[Diagram: the design spec feeds environment development and the functional test plan; the resulting test environment and tests run through simulation against the DUT, with coverage analysis feeding back into verification]
Agenda • The University Program • Functional Verification at a Glance • Key Testbench Concepts • Setting Objectives • Modeling and Data Abstraction • Data Injection and Collection • Self Checking • Summary
Setting Objectives We need to establish measurable criteria in order to judge how thorough the verification effort has been • What do we want to achieve overall? • What information do we have, and is it adequate? • What features are present, what behaviors are expected, and how do these features interact? • What are the misbehaviors? • How do we codify these as individual goals?
Coverage Refining a test-plan item into a measurable coverage goal: • Test that an interrupt is handled during a JSR. • The instruction decoder received an interrupt signal during a JSR instruction. • For each interrupt received, record the opcode in the decode register.
Coverage Group

extend instr_s {
  event interrupt;  -- emitted when an interrupt is observed
  cover interrupt is {
    item opcode using ranges = {
      range([ADD..SHFL], "Arithmetic Op");
      range([JMP..JSR], "Branch Op");
    };
  };
};
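In a complete environment the bare event declaration above would be tied to something observable. A minimal sketch of one way to do that, binding the event directly to a DUT signal (the name 'top.intr' is an assumption, and this definition would replace the bare declaration, since an e event is defined only once):

extend instr_s {
  -- fire the coverage event whenever the DUT's interrupt line rises
  -- ('top.intr' is an assumed signal name)
  event interrupt is rise('top.intr') @sim;
};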
Modeling Structure & Data Start much as the designer would, modeling: • Overall system architecture • Major data flows and functional elements But pay much closer attention to: • (structure) Peripherals, I/O channels • (behavior) BFMs that drive and read out data • (data) The data themselves
Data Modelling
Upping the level of abstraction somewhat…

struct frame {
  kind     : [LLC, L2, ETHERNET];
  llc      : LLCHeader;
  destAddr : uint (bits: 48);
  srcAddr  : uint (bits: 48);
  size     : int;
};

Physical data and virtual (auxiliary) data both play an important role in the verification environment.
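In e this distinction is carried by a field marker: fields prefixed with % are physical and are included when the struct is packed, while unmarked fields are virtual and skipped. A trimmed sketch of the frame with the split made explicit (which fields to mark physical is a design choice, not something the slides fix):

struct frame {
  %kind     : [LLC, L2, ETHERNET];   -- physical: packed into the bitstream
  %destAddr : uint (bits: 48);
  %srcAddr  : uint (bits: 48);
  size      : int;                   -- virtual: environment bookkeeping only
};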
Constraining Data Inputs
Each frame has a payload of a certain size:

extend frame {
  payload : list of byte;
  keep payload.size() in [0..size];
};

An individual test may specify a more specific size…

extend frame {
  keep size == 0;
};
Test Generation This is what we mean by constrained random test generation: • The testbench defines the underlying test infrastructure • Individual test files constrain that environment with specific goals in mind • The generator creates a whole family of test stimuli given different random seeds
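As a sketch, an individual test file layered on the frame environment might look like this (the particular bias toward short Ethernet frames is illustrative, not from the slides):

extend frame {
  keep soft kind == ETHERNET;   -- soft constraint: a default any test can override
  keep size in [0..64];         -- hard constraint: short frames only
};

Rerunning the same test with different random seeds then produces a family of related stimuli.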
Input Transactor: Packing
[Diagram: the random generator draws frames from the data space subject to the test criteria; each frame is packed and driven into the DUT through a BFM]

struct frame {
  kind     : [LLC, L2, ETHERNET];
  llc      : LLCHeader;
  destAddr : uint (bits: 48);
  srcAddr  : uint (bits: 48);
  size     : int;
  payload  : list of byte;
};

serial_bits = pack(packing.low, current_frame);
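A minimal sketch of the driving side of the BFM, assuming a hypothetical clock signal 'top.clk', serial input pin 'top.rx_bit', and method name drive_frame (none of these appear in the slides):

extend sys {
  event clk is rise('top.clk') @sim;   -- assumed DUT clock

  drive_frame(f : frame) @clk is {
    -- flatten the frame into bits, then drive one bit per clock cycle
    var bits : list of bit = pack(packing.low, f);
    for each (b) in bits {
      'top.rx_bit' = b;
      wait cycle;
    };
  };
};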
Output Transactor: Data Checking
[Diagram: a BFM captures the DUT's output and unpack() reconstructs frames for checking]

Our favorite checking scheme involves something like a scoreboard: • Upon generation/injection, compute the expected packet header and destination channel • Enter the packet into the scoreboard • When packets emerge, check them off the scoreboard Dilemma: how do we determine which transmitted packet an emerging packet corresponds to? (A sketch follows below.)
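A minimal scoreboard sketch, assuming (as one possible choice) that the source/destination address pair is enough to match an emerging frame against an injected one; all names here are illustrative:

extend sys {
  !scoreboard : list of frame;   -- '!' : do not randomly generate this field

  add_expected(f : frame) is {
    scoreboard.add(f);
  };

  check_received(f : frame) is {
    -- match on the address pair; real environments often need sequence
    -- numbers or payload comparison to disambiguate reordered packets
    var idx : int = scoreboard.first_index(it.destAddr == f.destAddr and
                                           it.srcAddr == f.srcAddr);
    check that idx != UNDEF else
      dut_error("Unexpected frame at DUT output");
    if idx != UNDEF {
      scoreboard.delete(idx);
    };
  };
};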
Temporal Checking
[Diagram: the checker observes the DUT's input and output interfaces]

expect @exec_intr => {
  [..9];                        -- within at most nine cycles...
  true('service' == 0);         -- ...'service' is sampled low
  [3] * (not rise('service'))   -- and does not rise for three cycles
} @clk;
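The events the check references must be defined somewhere. A speculative sketch, reusing the clk event from the input transactor (the signal names and the opcode literal standing in for JSR are assumptions):

extend sys {
  -- fire when an interrupt arrives while a JSR is executing
  event exec_intr is true('top.intr' == 1 and
                          'top.opcode' == 0x20) @clk;   -- 0x20 stands in for JSR
};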
Summary
[Diagram: an abstract test file drives the data generator; stimulus flows into the DUT, responses are logged and checked, and coverage is recorded in a database]

• Data generation & stimulus injection • Data retrieval & coverage recording • Check data integrity & transformation • Check sequential & temporal behavior