Sequential Equivalence Checking Across Arbitrary Design Transformation: Technologies and Applications
Viresh Paruthi, IBM Corporation
J. Baumgartner, H. Mony, R. L. Kanzelman
Formal Methods in Computer-Aided Design, 2006
Outline
• Equivalence Checking Overview
  • Combinational Equivalence Checking (CEC)
  • Sequential Equivalence Checking (SEC)
• Use of SEC within IBM
• IBM's SEC Solution
• SEC Applications
• SEC Challenges
• Conclusion
Equivalence Checking
[Figure: two designs, Logic 1 with registers R1 and Logic 2 with registers R2, driven by the same input stream {x0, x1, …}; their outputs are compared and checked to be {0, 0, …}]
• A technique to check equivalent behavior of two designs
• Validates that certain design transforms preserve behavior
  • Logic synthesis, manual redesign do not introduce bugs
• Often done formally to save resources, eliminate risk
Combinational Equivalence Checking (CEC)
[Figure: latches are cut into free pseudo-state variables S; Logic 1 and Logic 2 are evaluated over the inputs X and state S, and each output/next-state pair is XORed and checked to be 0]
• No sequential analysis: latches treated as cutpoints
• Equivalence check over outputs + next-state functions (sketched below)
• Though NP-complete, CEC is a scalable + mature technology
• CEC is the most prevalent formal verification application
  • Often mandated to validate synthesis
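To make the cutpoint formulation concrete, here is a minimal sketch in Python (the netlists old_logic/new_logic are invented for illustration): latches become free state variables s, and the output and next-state functions of both versions are compared exhaustively, where an industrial CEC tool would use SAT/BDD reasoning instead.

```python
from itertools import product

# Hypothetical output/next-state functions for two versions of a tiny
# design; latches are cut, so the current-state bits s are free variables.
def old_logic(x, s):
    out = (x[0] and s[0]) or x[1]          # primary output
    nxt = (not s[0]) ^ x[0]                # next-state function of the latch
    return out, nxt

def new_logic(x, s):                       # combinationally resynthesized
    out = not ((not (x[0] and s[0])) and (not x[1]))  # De Morgan rewrite
    nxt = (not s[0]) ^ x[0]
    return out, nxt

# CEC: compare outputs AND next-state functions over every assignment to
# the inputs and cutpoints (exhaustive here; real tools use SAT/BDDs).
def cec(f, g, n_in, n_state):
    for bits in product([False, True], repeat=n_in + n_state):
        x, s = bits[:n_in], bits[n_in:]
        if f(x, s) != g(x, s):
            return bits                    # counterexample assignment
    return None                            # combinationally equivalent

print(cec(old_logic, new_logic, 2, 1))     # -> None (equivalent)
```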
Sequential Equivalence Checking (SEC)
• Latch cutpointing requirement severely limits CEC applicability
  • Cannot handle retimed designs, state machine re-encoding, …
  • Cutpointing may cause mismatches in unreachable states
    • Often requires manual introduction of constraints over cutpoints
• SEC overcomes these CEC limitations
  • Supports arbitrary design changes that do not impact I/O behavior
  • Does not require a 1:1 latch or hierarchy correspondence
    • Known mappings can be leveraged to reduce problem complexity
  • Check restricted to reachable states
  • Explores sequential behavior of the design to assess I/O equivalence
SEC Is Computationally Expensive
• Sequential verification: more complex than combinational
  • Higher complexity class: PSPACE vs. NP
  • Model checking is thus less scalable than CEC
• SEC deals with 2x the size of model checking!
  • A composite model is built over both designs being equivalence-checked (sketched below)
• However, tuned algorithms exist to scale SEC better in practice
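As a toy illustration of the composite model, the sketch below (machines and encodings invented) forms the product of an original counter and a one-hot re-encoded version, then walks the reachable pairs of states checking that the outputs agree under every input; real SEC operates symbolically on netlists rather than by explicit enumeration.

```python
from collections import deque

# Two toy Mealy machines claimed I/O-equivalent: a 2-bit counter and a
# one-hot re-encoding of it. States differ; only I/O behavior must match.
def step_old(s, x):   # s in {0,1,2,3}; returns (next_state, output)
    return (s + (1 if x else 0)) % 4, (s == 3)

def step_new(s, x):   # s is a one-hot 4-tuple
    idx = s.index(1)
    n_idx = (idx + (1 if x else 0)) % 4
    return tuple(1 if i == n_idx else 0 for i in range(4)), (idx == 3)

# SEC over the composite model: explore reachable pairs (s_old, s_new)
# from the initial states and check that outputs agree on every input.
def sec(init_old, init_new):
    seen, frontier = set(), deque([(init_old, init_new)])
    while frontier:
        so, sn = frontier.popleft()
        if (so, sn) in seen:
            continue
        seen.add((so, sn))
        for x in (False, True):
            no, oo = step_old(so, x)
            nn, on = step_new(sn, x)
            if oo != on:
                return (so, sn, x)   # reachable mismatch: counterexample
            frontier.append((no, nn))
    return None                      # equivalent on all reachable states

print(sec(0, (1, 0, 0, 0)))          # -> None
```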
SEC Paradigms
• Various SEC paradigms exist
• Initialized approaches
  • Check equivalent behavior from user-specified initial states
  • Assumes that designs can be brought into known reset states
• Uninitialized approaches, e.g., alignability analysis
  • Require designs to share a common reset mechanism
  • Compute the reset mechanism concurrently with checking equivalence from a reset state
IBM's Approach: Initialized SEC
• More flexible:
  • Enables checking specific modes of operation
  • Applicable even if initialization logic is altered (or not yet implemented)
  • Applicable even to designs that are not exactly equivalent
    • Pipeline stage added? Check equivalence modulo a 1-clock delay (sketched below)
    • data_out differs when data_valid=0? Check equivalence only when data_valid=1
• More scalable: 1,000s to even 100,000+ state elements
  • Reset mechanism computation adds (needless) complexity
• Validation of the reset mechanism can be done independently
  • Functional verification performed w.r.t. power-on reset states
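The "pipeline stage added" case can be phrased as a check that the NEW output stream equals the OLD stream shifted by one cycle. A minimal sketch, with an invented one-bit design and exhaustive bounded input enumeration standing in for a proof:

```python
from itertools import product

# OLD: combinational pass-through of the input. NEW: the same function
# with one added pipeline register, so its output lags by one clock.
def run_old(xs):
    return [not x for x in xs]            # out[t] = f(in[t])

def run_new(xs):
    reg, out = False, []                  # pipeline register, reset to 0
    for x in xs:
        out.append(reg)                   # out[t] = f(in[t-1])
        reg = not x
    return out

# "Partial equivalence": require new_out[t+1] == old_out[t]; skip the
# first NEW output, which reflects the reset value rather than real data.
def equiv_mod_delay(depth=6):
    for xs in product([False, True], repeat=depth):
        old, new = run_old(xs), run_new(xs)
        if old[:-1] != new[1:]:
            return xs                     # mismatching input sequence
    return None

print(equiv_mod_delay())                  # -> None (equal modulo 1 delay)
```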
SEC Usage at IBM
• IBM's SEC toolset: SixthSense
  • Developed primarily for custom microprocessor designs
  • Also used on ASICs, and for (semi-)formal functional verification
• Use CEC to validate combinational synthesis
  • Verity is IBM's CEC toolset
  • Also used for other specific purposes, e.g., ECO verification
• Use SEC for pre-synthesis HDL comparisons
  • Sequential optimizations manually reflected in HDL
  • SEC efficiently eliminates the risk of such optimizations
SixthSense Horsepower
• SixthSense is a system of cooperating algorithms
  • Transformation engines (simplification/reduction algorithms)
  • Falsification engines
  • Proof engines
• Unique Transformation-Based Verification (TBV) framework
  • Exploits maximal synergy between the various algorithms
    • Retiming, redundancy removal, localization, induction, …
  • Incrementally chops the problem into simpler sub-problems until solvable
• Transformations yield exponential speedups to bug-finding (semi-formal) as well as proof (formal) applications
Transformation-Based Verification (TBV)
[Figure: a chain of engines. Design N enters a Redundancy Removal Engine, yielding Design N'; that enters a Retiming Engine, yielding Design N''; that enters a Target Enlargement Engine, yielding Design N'''. Result N''' is mapped back through each engine (to Result N'', Result N', and finally Result N), so the answer is stated on the original design]
Transformation-Based Verification Framework
[Figure: problem decomposition via synergistic transformations. A design + driver + checker with 140,627 registers passes through a Combinational Optimization Engine (119,147 registers), a Retiming Engine (100,902 registers), and a Localization Engine (132 registers) before a Reachability Engine solves it; the counterexample trace is lifted back stage by stage (optimized; optimized, retimed; optimized, retimed, localized) into a trace consistent with the original design]
• These transformations are completely transparent to the user
• All results are reported in terms of the original design (the chaining is sketched below)
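A minimal sketch of how such engine chaining might be structured (the Engine interface and the toy reduction are hypothetical, not the actual SixthSense API): each engine reduces the model it receives, and any trace found on the innermost model is lifted back, engine by engine, into the terms of the original design.

```python
# Hypothetical TBV engine interface: transform() simplifies a model, and
# lift() maps a trace found on the simplified model back to parent terms.
class Engine:
    def transform(self, model): ...      # returns a reduced model
    def lift(self, trace): ...           # maps child trace to parent terms

class DropDeadLatches(Engine):
    """Toy 'reduction': remove latches not in any output's support."""
    def transform(self, model):
        self.removed = {k: v for k, v in model["latches"].items()
                        if k not in model["support"]}
        kept = {k: v for k, v in model["latches"].items()
                if k in model["support"]}
        return {**model, "latches": kept}
    def lift(self, trace):
        # Re-insert removed latches at their reset values on every cycle.
        return [{**cyc, **self.removed} for cyc in trace]

def verify(model, engines, solve):
    applied = []
    for e in engines:                    # incrementally simplify
        model = e.transform(model)
        applied.append(e)
    trace = solve(model)                 # innermost engine: prove/falsify
    for e in reversed(applied):          # lift the result back, layer by layer
        if trace is not None:
            trace = e.lift(trace)
    return trace                         # None = proof; else a full trace

model = {"latches": {"a": 0, "b": 0}, "support": {"a"}}
print(verify(model, [DropDeadLatches()],
             solve=lambda m: [{"a": 1}]))  # -> [{'a': 1, 'b': 0}]
```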
Example SixthSense Engines
• Combinational rewriting
• Sequential redundancy removal
• Min-area retiming
• Sequential rewriting
• Input reparameterization
• Localization
• Target enlargement
• State-transition folding
• Isomorphic property decomposition
• Unfolding
• Semi-formal search
• Symbolic sim: SAT + BDDs
• Symbolic reachability
• Induction
• Interpolation
• …
• Expert System Engine automates optimal engine-sequence experimentation
Key to Scalability: Assume-then-Prove Framework
1. Guess redundancy candidates
   • Equivalence classes of gates
2. Create speculatively-reduced model
   • Add a miter (XOR) over each candidate and its equivalence-class representative
   • Replace fanout references by representatives
3. Attempt to prove each miter unassertable
4. If all miters are proven unassertable, the corresponding gates can be merged; else, refine to separate unproven candidates and go to Step 2 (the loop is sketched below)
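A drastically simplified rendering of this loop, on a purely combinational toy netlist (invented gate functions), with exhaustive checks standing in for the SAT/induction proofs and with the speculative replacement of fanout references omitted:

```python
from itertools import product
import random

def assume_then_prove(gates, n_inputs, n_sim=32):
    # Step 1: guess candidates: group gates by random-simulation signature.
    vecs = [tuple(random.choice([False, True]) for _ in range(n_inputs))
            for _ in range(n_sim)]
    buckets = {}
    for name, fn in gates.items():
        buckets.setdefault(tuple(fn(v) for v in vecs), []).append(name)
    classes = [c for c in buckets.values() if len(c) > 1]
    while True:                          # Steps 2-4, with refinement
        refined = False
        for cls in classes:
            rep = cls[0]
            for g in cls[1:]:
                # The miter (rep XOR g) must be unassertable for all inputs.
                cex = next((v for v in product([False, True],
                                               repeat=n_inputs)
                            if gates[rep](v) != gates[g](v)), None)
                if cex is not None:      # asserted miter: separate the
                    cls.remove(g)        # candidate (a real tool re-classes it)
                    refined = True
        if not refined:
            return [c for c in classes if len(c) > 1]  # provably mergeable

gates = {"g1": lambda v: v[0] and v[1],
         "g2": lambda v: not (not v[0] or not v[1]),   # same function
         "g3": lambda v: v[0] or v[1]}
print(assume_then_prove(gates, 2))       # -> [['g1', 'g2']] (w.h.p.)
```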
Assume-then-Prove Framework
• Speculative reduction greatly enhances scalability
• Generalizes CEC
  • Sequential analysis only needed over sequentially redesigned logic
• The proof step is the most costly facet
  • Most equivalences are solved by lower-cost algorithms, e.g., induction (sketched below)
  • However, some equivalences can be very difficult to prove
    • Failure to prove a cutpoint often degrades into an inconclusive SEC run
• Novel SixthSense technology: leverage synergistic algorithms to solve these harder proofs
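To give a flavor of the lower-cost proof step, here is a toy k-induction check of a single miter over two sequentially redundant 2-bit counters (an invented model; exhaustive loops stand in for the SAT queries a real engine would issue):

```python
from itertools import product

# The state holds two 2-bit counters that are sequentially redundant; the
# miter asserts if they ever differ on any reachable path.
STATES = list(product(range(4), range(4)))
INIT = (0, 0)

def step(s, x):                        # both counters count input pulses
    return ((s[0] + x) % 4, (s[1] + x) % 4)

def miter_ok(s):                       # miter unasserted: candidates agree
    return s[0] == s[1]

def k_induction(k):
    # Base case: the miter stays unasserted for k steps from the initial state.
    frontier = {INIT}
    for _ in range(k + 1):
        if not all(miter_ok(s) for s in frontier):
            return False               # real counterexample
        frontier = {step(s, x) for s in frontier for x in (0, 1)}
    # Inductive step: k+1 consecutive miter-free states, starting from ANY
    # state (reachable or not), must be followed only by miter-free states.
    stack = [[s] for s in STATES if miter_ok(s)]
    while stack:
        path = stack.pop()
        if len(path) == k + 1:
            if not all(miter_ok(step(path[-1], x)) for x in (0, 1)):
                return None            # inconclusive: deepen k, or refine
            continue
        for x in (0, 1):
            n = step(path[-1], x)
            if miter_ok(n):
                stack.append(path + [n])
    return True

print(k_induction(1))                  # -> True: miter proven unassertable
```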
Causes of Refinement
• Asserted miter: incorrect candidate guessing
• Resource limitations preclude a proof
  • Induction becomes expensive with depth
  • Approximation weakens the power of reachability
• Refinement weakens the induction hypothesis
  • Immediate separation of candidate gates
  • Avalanche of future resource-gated refinements
• End result?
  • Suboptimal redundancy removal
  • Inconclusive equivalence check
SixthSense: Enhanced Redundancy Proofs
• Use of a robust variety of synergistic transformation and verification algorithms
  • Enables the best proof strategy per miter
  • Exponential run-time improvements
    • Greater speed and scalability
    • Greater degree of redundancy identified
• Powerful use of Transformation-Based Verification
  • Synergistically leverages transformations to simplify large problems
  • Reduction in model size, number of distinct miters
  • Transformation alone sufficient for many proofs
Benefits of Transformation-Based Verification
• Reduction in model size, number of distinct miters
  • Useful regardless of proof technique
• Transformations alone sufficient for many proofs
  • Sub-circuits differing by retiming and resynthesis solved using polynomial-resource transformations
  • Scales to aggressive design modifications
• Leverage an independent proof strategy on each miter
  • Different algorithms suited to different problems
  • Entails exponential differences in run-times
TBV on the Reduced Model
• Methodology restrictions
  • Retiming may render name- and structure-based candidate guessing ineffective
• Synergistic increase in reduction potential
  • TBV flows more effective after merging
• Applying TBV before + after induction-based redundancy removal insufficient
  • Need to avoid resource-gated refinement
"Exploiting Suspected Redundancy without Proving It", DAC 2005
Redundancy Removal Results
• Induction alone is unable to solve all properties
• TBV solves all properties, and faster than induction
TBV on the Speculatively-Reduced Model
[Results chart; engine legend: RET = retiming, LOC = localization, COM = combinational reduction, CUT = reparameterization]
Enhanced Search without Proofs
• Use miters as filters
  • No miter asserted => search remains within states for which speculative merging is correct
    • i.e., search results are valid on the original model as well (sketched below)
  • Miters need not be proven unassertable
    • Enables exploitation of redundancy that holds only for an initial bounded time-frame
• Faster and deeper bounded falsification
• Improved candidate guessing using the speculatively-reduced model
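A small sketch of miters acting as filters during bounded search (toy model, invented target): paths on which any miter asserts are discarded, so a target hit found within the bound is also valid on the original, unreduced design.

```python
# Two state bits, the second speculatively merged onto the first; the
# miter compares them at every step of the bounded search.
def step(s, x):
    return (s[0] ^ x, s[1] ^ x)

def bounded_search(init, target, depth):
    frontier = [init]
    for t in range(depth):
        nxt = []
        for s in frontier:
            for x in (0, 1):
                n = step(s, x)
                if n[0] != n[1]:       # miter asserted: merging unsound on
                    continue           # this path, so discard the path
                if target(n):
                    return t + 1       # cycle of a trustworthy target hit
                nxt.append(n)
        frontier = nxt
    return None                        # no trusted hit within the bound

print(bounded_search((0, 0), target=lambda s: s[0] == 1, depth=3))  # -> 1
```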
Bounded Falsification Results (% improvement)
Miter Validation Results (% improvement)
SixthSense Sequential Equivalence Checking
[Figure: the OLD and NEW designs, together with drivers (stimulus), checkers, a black-box list, a mapping file, and initialization data, feed SixthSense; the initialized OLD and NEW designs receive identical inputs and their outputs are compared (=?). The result is either a mismatch trace consistent with the original design or a proof of equality]
Running the Sequential Equivalence Check
• Little manual effort to use
• Produces a counterexample showing an output mismatch
  • With respect to the specified initial state(s)
  • Trace is short, with minimal activity, to simply illustrate the mismatch
• Or proves that no such trace exists
  • Proof of equivalence
• Mandatory inputs:
  • Requires OLD and NEW versions of the design
Running the Seq Equiv Check: Optional Inputs
• Initialization data; equivalence checked w.r.t. the given initial values
• Mapping file
  • Indicates I/O signal renaming/polarities, adds cutpoints, omits checks, …
• Drivers: filter input stimuli to prevent spurious mismatches
• Black Box file, to easily delete components from the design
  • Outputs correlated, driven randomly; inputs correlated, made targets
• Checkers (check equivalence of internal events)
  • Ensure that coverage obtained before the change is valid after
• "Audit" known mismatches to enable meaningful proofs
Sequential Equivalence Checking Applications
• Used at block/unit level on multiple projects…
  • To verify remaps, retiming, synthesis optimizations, …
    • CEC is inadequate to deal with these changes
  • Exposed 100's of unintended mismatches/design errors
• No need to run lengthy regression buckets for lesser coverage
  • SixthSense often provides proofs/bugs in less time
• No need to debug lengthy, cluttered traces
  • SixthSense traces are short, with minimal activity, to illustrate the bug
• Quickly finds bugs before faulty logic is released
Example SEC Applications
• Timing optimizations: retiming, adding redundant logic, …
• Power optimizations: clock gating, logic minimization, …
• Check specific modes of design behavior
  • Backward-compatibility modes of a redesign preserve functionality
  • BIST change must not alter functionality
• Verifying RTL vs. higher-level models
• Quantifying late design fixes
  • E.g., constrain SEC to disallow the operations affected by a fix
Example Applications: Clock-Gating Verification
• Clock-gating:
  • Disables clocks to certain state elements when they are not required to update
• Approach: equivalence-check identical units (sketched below)
  • One with clock-gating enabled, one disabled
  • Check that design behavior does not change during care time-frames
• Leveraged to converge upon an optimal clock-gating solution
  • Iteratively apply SEC to ascertain if clock-gating a latch alters function
[Figure: the same unit instantiated twice, once with clock gating disabled and once with clock gating enabled, both driven by the same input]
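A minimal sketch of this style of check, on an invented one-register unit: the gated version blocks its register clock when the "need" qualifier is low, and an exhaustive bounded comparison confirms the outputs agree in every care time-frame.

```python
from itertools import product

def run_ungated(ins):
    reg, outs = 0, []
    for d, need in ins:
        reg = d                       # clocked every cycle
        outs.append((reg, need))
    return outs

def run_gated(ins):
    reg, outs = 0, []
    for d, need in ins:
        if need:                      # clock enabled only when required
            reg = d
        outs.append((reg, need))
    return outs

def check(depth=5):
    for ins in product([(d, n) for d in (0, 1) for n in (0, 1)],
                       repeat=depth):
        for (o1, n), (o2, _) in zip(run_ungated(ins), run_gated(ins)):
            if n and o1 != o2:        # compare only in care time-frames
                return ins            # gating bug: behavior changed
    return None

print(check())                        # -> None: this gating is safe
```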
Example Applications: Quantifying a Late Design Fix
• Late bug involving specific commands on a target memory node
  • Fix made with a backwards-compatible "disable" chicken-switch
• Wanted to validate:
  • "disable" mode truly disabled the fix
  • Fix had no impact upon other commands, non-target nodes
• Several quick SixthSense equivalence-check runs performed:
  • With a straightforward comparison, 192/217 outputs mismatched
  • "Disabled" NEW design is equivalent to OLD
  • If configured as a non-target node, NEW is equivalent to OLD
  • If the specific commands are excluded (via a driver), NEW is equivalent to OLD
Example Applications: Hierarchical Design Flow
• FPU designed hierarchically
  • Conventional latch-equivalent VHDL (yellow)
  • Simple, abstract cycle-accurate VHDL (green)
  • FPU spec (blue) is a behavioral model
• Verification approach:
  • First, formally verify that the green box is equivalent to its spec using SixthSense (SEC)
  • Next, the yellow box is verified equivalent to green, macro by macro (takes minutes)
  • Finally, schematics are verified using Verity (CEC)
• FPU verification is done completely by formal methods
[Figure: FPU spec → SixthSense → high-level design VHDL → SixthSense → (latch-equivalent) VHDL → Verity (CEC) → schematics]
SEC Future Directions: Hierarchical Design Flow
• Enables raising the level of abstraction (ESL)
  • IBM methodology requires an RTL model that is CEC-equivalent to the circuit
    • Allows for verifying self-test logic, asynchronous crossings, scan, …
  • Specification of each macro precisely captured by a high-level model
    • Allows creativity in designing the optimal circuit for the macro
• Verification can begin without having the entire design ready
  • Verify the high-level macros, unit/core/chip compositions
  • Verification done in parallel with circuit design; reduces the design + verification cycle
• Formal correctness eliminates the risk of late design changes
• Efficient automated equivalence proof of high-level vs. circuit-accurate macros
SEC Future Directions: Sequential Optimizations
• SEC is an enabler for "safe" sequential synthesis
  • E.g., retiming, addition/deletion of sequential redundancy
  • Opens the door for automated (behavioral) synthesis
    • Results in higher-quality, more optimized designs
  • Enabler for system-level design and verification
• SEC enables sequential optimizations
  • Identify sequential redundancy, unreachable states, …
  • Validate user-specified don't-care conditions
  • Verify "global" optimizations, e.g., FSM re-encoding, clock-gating, …
  • Leveraged in diverse areas such as power-gating, fencing, etc.
SEC Challenges: Scalability
• SEC has to scale to real-world problems
  • Large design slices, arbitrary transforms, low-level HDL spec, …
• Tighten induction to resolve miters in the speculatively-reduced model
  • TBV attempts to do just that, but further improvements are welcome!
  • Improved proof techniques are critical to improving scalability
• Improved falsification methods to help with candidate guessing
  • Helps distinguish false equivalences to converge faster
• Abstractions to reduce computational complexity
  • Leverage techniques such as uninterpreted functions, blackboxing, …
• Hierarchical proof decomposition
  • Bottom-up approach: blackbox verified portions of the logic, and capture constraints at the interfaces
SEC Challenges: Combined CEC and SEC
• Leverage mappings of state elements obtained from CEC (sketched below)
  • Take advantage of the wealth of techniques to correspond latches
    • Name-based, structural, functional, scan-based, …
  • Used as cutpoints to define a boundary between CEC and SEC
    • Significantly simplifies the SEC problem via correlation hints
    • Refining a cut when a false negative is obtained is a hard problem
  • Automatically propagate constraints across mapped state elements
• Benefits to CEC
  • Improved latch-pair matching via functional analysis
    • Latch-phase determination, functional correspondence, …
  • Apply constraints derived from SEC to simplify problems
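One plausible shape for the name-based portion of such a combined flow (the normalization rules and signal names below are entirely invented): latches are paired by normalized names, and the resulting pairs seed the SEC candidate classes as correlation hints, while unmapped latches are left to full sequential analysis.

```python
import re

def normalize(name):
    base = name.split("/")[-1].lower()            # drop hierarchy prefix
    # Strip tool-added suffixes (illustrative rules only).
    return re.sub(r"(_reg|_q|_lat)(\[\d+\])?$", r"\2", base)

def map_latches(old_latches, new_latches):
    by_name = {normalize(n): n for n in old_latches}
    pairs, unmapped = [], []
    for n in new_latches:
        key = normalize(n)
        if key in by_name:
            pairs.append((by_name[key], n))       # candidate/cutpoint hint
        else:
            unmapped.append(n)                    # stays fully sequential
    return pairs, unmapped

old = ["top/fetch/pc_reg[0]", "top/fetch/pc_reg[1]", "top/dec/state_q"]
new = ["core/fetch/PC_lat[0]", "core/fetch/PC_lat[1]", "core/dec/fsm_q"]
print(map_latches(old, new))
# -> the pc/PC bits pair up; fsm_q is left for full sequential analysis
```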
Conclusion: Sequential Equivalence Checking
• Eliminates risk:
  • SEC is exhaustive, unlike simulation regressions
• Improves design quality:
  • Enables aggressive optimizations, even late in the design flow
• Saves resources:
  • Obviates lengthy verification regressions
• Generalizes CEC, and improves productivity
• Opens the door to automated sequential synthesis
Conclusion: SEC at IBM
• SEC becoming part of the standard methodology at IBM
  • Pre-synthesis HDL-to-HDL applications
    • CEC closes the gap with the combinational synthesis flow
• IBM's SEC solution driven by scalability across arbitrary design transforms
  • Hooks for: initial values, interface constraints, "partial equivalence", …
• SixthSense: TBV-powered SEC
  • Leverages a rich set of synergistic algorithms for highly scalable SEC
Conclusion: References/Links
• Website (lists SixthSense publications): www.research.ibm.com/sixthsense
• Relevant papers:
  • "Exploiting Suspected Redundancy without Proving It", DAC 2005
  • "Scalable Sequential Equivalence Checking across Arbitrary Design Transformations", ICCD 2006