Modeling the Effectiveness of Reuse in SoC Design
William Fornaciari, Fabio Salice
FDL'00, Tübingen, Germany, September 2000
Politecnico di Milano, Italy / CEFRIEL Research Center, Italy
Presenter: William Fornaciari, fornacia@elet.polimi.it
Presentation outline
• Problem definition and paper goal
• The identified models for
  • financial analysis
  • project size estimation
• Methodology assessment
• Analysis of the LEON-1 microprocessor
• Concluding remarks
Problem background
• The productivity gap is becoming the crucial factor influencing technological/financial choices
• Almost all digital designs are centered on HDL-based (e.g., VHDL) design flows
• Developing a design requires coping with the management/refinement of specifications
• Problems
  • Introduction of design templates
  • Storing/retrieval of reusable parts
  • Make-or-buy dilemma
  • Financial analysis of reuse benefits
  • Estimation of development time vs. TTM (time to market)
  • Strong sensitivity of any tradeoff to the prediction of the project size
Paper goals
• Overall goal of our ongoing project
  • Estimation of the cost-effectiveness of soft reuse of VHDL-based designs
• Main focus of this paper
  • Outline of the financial model
    • Customization of the COCOMO analysis
  • Identification of a strategy to predict the crucial factor, namely the project size, from a high-level system characterization
    • Function Point (FP) analysis has been adapted to deal with the peculiarities of VHDL-based designs
  • Application of the analysis to a real VHDL design: LEON-1, a 32-bit microprocessor
The Economical Model (EM)
• A possible solution to identify the cost of a hardware project is the use of parametric models
  • Identification of the factors influencing the quantities to be estimated
  • Definition of the sources of cost (NRC, RC, fixed) and their cross-relations with the above quantities
  • Definition of the predictive variables (parameters) to compute the estimates via a formal model
• Three models are useful
  • From scratch, for reuse, with reuse
• In addition, the following should be considered
  • Losses (cost) due to TTM windows
  • ROI
The Economical Model - cont'd
• Definitions
  • Eff [pm]: global development effort for a project (person-months)
  • T [m]: time (months) to develop the project for a properly sized team of R designers
  • S: project size (K lines of code)
• In general, the cost C is proportional to Eff: C = k * Eff
• The top-level relations are those of COCOMO-2:
  Eff = A * S^B
  T = A2 * Eff^B2
  so that R = Eff / T
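The three top-level relations above can be sketched in a few lines of code. Note that the coefficient values used here as defaults are illustrative assumptions only, not the calibrated values used in the paper:

```python
# Sketch of the COCOMO-2 top-level relations from the slide:
#   Eff = A * S^B        (effort, person-months)
#   T   = A2 * Eff^B2    (development time, months)
#   R   = Eff / T        (average team size)
# A, B, A2, B2 defaults below are illustrative assumptions.

def cocomo_estimates(size_kloc, A=2.94, B=1.10, A2=3.67, B2=0.28):
    eff = A * size_kloc ** B      # global development effort [pm]
    t = A2 * eff ** B2            # development time [m]
    r = eff / t                   # implied team size
    return eff, t, r

# LEON-1 is ~7655 lines of code, i.e. about 7.655 KLOC
eff, t, r = cocomo_estimates(7.655)
```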
Factors of the EM
• A, A2 are multiplicative factors for the effort
  • Personnel, product, project
• B, B2 account for the economies/diseconomies arising when developing projects of different sizes
  • Familiarity, flexibility, group cohesion, risk resolution, maturity of the development process
  • Typical values exist for different project complexities
• For our purposes, the formalism selected to estimate S is VHDL complexity
EM for reuse: static aspects
• Integration costs can offset the reuse benefits
• Static aspects
  • SMes is the equivalent size of a module M to be reused, whose original size is SM
  • AA [0..8]: evaluation and selection (Assessment and Assimilation)
  • CU: Code Understanding, and UNFM: unfamiliarity
  • AAF: modification; it depends on the percent of design modified (DM), the percent of code modified (CM), and the integration effort for the modified code (IM)
  SMes = 0.01 * SM * [AA + AAF * (1 + 0.02 * CU * UNFM)]   for AAF <= 0.5
  SMes = 0.01 * SM * [AA + AAF + CU * UNFM]                for AAF > 0.5
• SMes is used to compute Eff and T for the reuse of M
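The equivalent-size adjustment above can be sketched directly; the parameter values in the usage line are illustrative assumptions, not calibrated data:

```python
# Sketch of the static reuse-size adjustment: SMes is the equivalent size
# of a reused module of original size SM.
#   AAF <= 0.5:  SMes = 0.01 * SM * [AA + AAF * (1 + 0.02 * CU * UNFM)]
#   AAF >  0.5:  SMes = 0.01 * SM * [AA + AAF + CU * UNFM]

def equivalent_size(sm, aa, aaf, cu, unfm):
    if aaf <= 0.5:
        return 0.01 * sm * (aa + aaf * (1 + 0.02 * cu * unfm))
    return 0.01 * sm * (aa + aaf + cu * unfm)

# Hypothetical module: 1000 lines, AA = 4, lightly modified, CU = 30,
# moderately unfamiliar team (UNFM = 0.4)
smes = equivalent_size(1000, aa=4, aaf=0.3, cu=30, unfm=0.4)
```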
EM for reuse - dynamic aspects
• Key point: prediction of the design costs while considering the evolution of productivity
• Pr (productivity): # of tested gates produced in one month, Pr = gates / Eff
• G: # of gates after t years, G = 1.588^t * G0
• The steady improvement of EDA tools yields PrMt = (1 + qM)^t * PrM0, with qM = 0.25 - 0.30
• The Design for Reuse Factor (DRF, 1.5 - 4.0) captures the extra effort needed to design a component for reuse
• The Design for Integration Factor (DOIF, 0.1 - 0.3) considers the trend of the integration effort
• Combining the three equations reveals the evolution over the years of the effort for reuse and for single use
Project size estimation
• The starting point of any analysis is the project size S
  • Typically underestimated by 50%-150%
• For VHDL we split S into two components: S = Ssystem + Stestbench
• Estimation of Ssystem
  • Lines of VHDL code (KLOC or LOCVHDL)
  • Function Point VHDL (FPVHDL)
• LOCVHDL problems
  • Requires a well-structured and detailed view of the project to obtain reliable values
  • VHDL is inherently parallel, and the different statements vary in expressiveness and complexity
LOCVHDL: direct metric
• Direct computation of LOC through analysis of the different contributions
  • Ports (I/O), signals, concurrent statements, packages and libraries
  • Processes and components are the relevant ones
• Components
  • LOC is of the same order of magnitude as the number of signals composing the component interface (e.g., LOCcomp = 1.5 * N)
• Processes
  • Inputs: primary inputs and outputs of the process
  • Vectors and signals have weight = 1, while records must be decomposed and only the relevant fields are considered
  • LOC is a parabolic function of the number of inputs and outputs
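The component rule above is simple enough to sketch; the interface sizes in the usage line are hypothetical, and the parabolic process formula is not reproduced since its coefficients are not given on the slide:

```python
# Sketch of the direct LOC metric for components: a component's LOC is of
# the same order as its interface size, e.g. LOC_comp = 1.5 * N for N
# interface signals (the example coefficient given above).

def loc_component(n_interface_signals, k=1.5):
    # k = 1.5 is the slide's example coefficient
    return k * n_interface_signals

# Three hypothetical components with 8, 12 and 5 interface signals
loc = sum(loc_component(n) for n in (8, 12, 5))
```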
Example of LOCVHDL: the ctrl process
• Inputs: mpcio (4 components), r (2 components), rst (1 component)
• Outputs: ri (2 components), mpcii (3 components)

  ctrl: process(mpcio, r)
    variable v : pci_type;
    variable ready, mexc : std_logic;
  begin
    v.data := (others => '0'); v.state := '0';
    ready := '0'; mexc := '0';
    -- pragma translate_off
    v := r;
    case r.state is
      when '0' =>
        if mpcio.en = '1' then
          v.state := '1';
          v.data := mpcio.addr;
          ready := '1';
          mexc := not mpcio.read;
        end if;
      when others => v.state := '0';
    end case;
    if rst = '0' then
      v.state := '0';
    end if;
    -- pragma translate_on
    ri <= v;
    mpcii.data <= r.data;
    mpcii.ready <= ready;
    mpcii.mexc <= mexc;
  end process;

• Actual LOC: 25; Estimated LOC: 33
Function Point VHDL
• Suitable for the analysis of new projects
  • Requires a structured, but not necessarily detailed, view of the project
  • The starting point is a reasonable hierarchical decomposition; the primary inputs and outputs of the system and subsystems must be known
• From the description, some functional classes are identified and associated with a weight depending on their complexity
• To quantify S, the weights are finally converted into LOC
• Elements of a generic specification vs. VHDL
  • General activities: acquisition of information, processing, memory access and emission of information
  • Functional categories in VHDL: primary inputs and outputs, basic blocks, internal signals, interface signals
FPVHDL: weights calculation
• A weight is associated with each element composing a given unit, according to a lookup table that accounts for its complexity
• Given a functional unit k, the average of the weights of its mk components is considered
• For an intermediate node, the contributions of all children are summed up
• For the entire system, all contributions are collapsed
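The aggregation scheme above (average within a unit, sum over children, collapse for the system) can be sketched as follows; the weights and tree shape are illustrative assumptions, since the actual lookup tables are not reproduced here:

```python
# Sketch of the FP_VHDL weight-aggregation scheme: each functional unit
# contributes the average of its elements' weights, an intermediate node
# adds up its children's contributions, and the system total collapses
# everything.  Weights below are hypothetical.

def unit_weight(element_weights):
    # average over the m_k elements of one functional unit
    return sum(element_weights) / len(element_weights)

def node_fp(local_units, children_fp=()):
    # a node's FP: its own units plus all children's contributions
    return sum(unit_weight(u) for u in local_units) + sum(children_fp)

leaf = node_fp([[4, 6], [7, 5, 3]])            # avg 5 + avg 5 = 10
root = node_fp([[10, 6]], children_fp=[leaf])  # avg 8 + 10 = 18
```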
FPVHDL: weights calculation - cont'd
• Primary inputs
  • Comprise the inputs arriving from the environment surrounding the system, both for control and data acquisition
  • Complexity depends on
    • Uniformity of the data constituting the inputs, i.e., their type
    • Number of involved blocks
• Primary outputs
  • Similar to primary inputs but with a different lookup table
FPVHDL: weights calculation - cont'd
• Internal signals
  • Involve the uniformity of the data and control information exchanged among components and subsystems
  • The table reports the correspondence between the number of uniform data constituting the signal and its complexity
• Interface signals
  • Similar to internal signals but with a different lookup table
FPVHDL: weights calculation - cont'd
• Basic blocks
  • Instances of blocks identifiable from the specification
  • It is important to identify the functionalities that will be converted into VHDL as components or processes
  • Test-bench functionalities can also be considered as basic blocks
• The complexity of a basic block depends on
  • The number of concurrent sub-blocks composing it
  • The level of communication between sub-blocks, expressed as a function of the complexity of internal and external signals
From FPVHDL to LOC
• In the literature there exist coefficients to convert FP into LOC for different languages
• We found the following conversion factors to be sound
  • For a single node: LOC = 19 * FPVHDL
  • For a node at level Lev of the graph hierarchy: LOC = 19 * FPVHDL * Lev
• The value Lev is computed with a "levelized" procedure in which the top entity assumes the maximum value
• LOC as a function of the level is used to estimate the global cost of a given module when its final decomposition is unknown but its level is predictable
From FPVHDL to LOC: example
• In the example, C can be further decomposed into sub-modules
• The LOC for the overall project is the sum of the local costs of A and B and the global cost of C: LOC = 19 * (FPVHDL-A + FPVHDL-B + FPVHDL-C * 3)
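The conversion on the example above can be sketched in code; the FP counts for A, B and C are hypothetical, since the slides do not give them:

```python
# Sketch of the FP_VHDL-to-LOC conversion: A and B are fully decomposed
# (level 1), while C is only known down to level 3, so its contribution
# is scaled by its level.  FP counts below are hypothetical.

def loc_from_fp(fp, level=1, k=19):
    # k = 19 is the conversion factor from the slides
    return k * fp * level

fp_a, fp_b, fp_c = 12, 8, 20   # hypothetical FP counts
total = loc_from_fp(fp_a) + loc_from_fp(fp_b) + loc_from_fp(fp_c, level=3)
```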
Experimental results
• Real-world complex VHDL benchmark: LEON-1
  • 32-bit SPARC V8 architecture for embedded applications
  • 7655 lines of code partitioned into 20 files
  • A sufficiently documented and well-structured example
Example: FP analysis of the UART
• Primary inputs
  • 3 elementary, involving three subcomponents (A complexity)
  • 2 structured, each joining only one subcomponent (A complexity)
• Internal signals
  • 1 small vector (VL complexity)
  • 2 records, each of 34 entries (H complexity)
• Primary outputs
  • 2 elements of L complexity
FPVHDL = 21, so LOC = 21 * 19 = 399
LEON-1: structure and local costs
• Local cost [LOC], including BODY and ARCHITECTURE, excluding comments, with a single statement per line
LEON-1: global costs
• The table reports the global estimated costs, i.e., the number of VHDL lines constituting the portion of the project included in the considered module, assuming the corresponding hierarchical level
• Example: LEON module
  • FPVHDL-LEON = 85
  • Global cost = 2206 * 5 = 11030 LOC
Concluding remarks
• The paper approached the problem of predicting the cost of large VHDL-based projects
• A conceptual analysis framework has been identified to analyze the effectiveness of reuse
• The focus has been on the evaluation of the project size through the introduction of FP analysis
• The soundness of the model has been assessed experimentally, considering both small benchmarks and a 32-bit microprocessor
  • The average accuracy of the local estimates is ~20%, with 15% variance; however, errors tend to cancel out
  • Predictions starting only from the top-level view of the project have an accuracy of 40%
Future work
• Hw/Sw systems: extension to also cover the software costs in a unified analysis strategy
• Hybrid approach to include custom cost figures for regularly generated sub-components (e.g., caches)
• Introduction of risk-analysis virtual costs
• Integration with other metrics to evaluate the reuse effort