Design Hierarchy Guided Multilevel Circuit Partitioning
Yongseok Cheon and D.F. Wong
Department of Computer Sciences, The University of Texas at Austin
Outline
• Motivation & Contribution
• Problem
• Design hierarchy
• Rent’s rule & Rent exponent
• Our approach
  • Design hierarchy guided clustering
  • Design hierarchy guided ML partitioning
• Experimental results
Motivation
• Natural question: how can the design hierarchy be used for partitioning?
• Effectiveness of multilevel partitioning
• Similarity between the design hierarchy (DH) and the ML clustering tree
Contribution
• Rent exponent used as a quality indicator
• Intelligent and systematic use of hierarchical logical grouping information for better partitioning
• Partitioning results of higher quality and greater stability
Partitioning problem
• Netlist hypergraph → partitioned hypergraph
Multilevel partitioning
• (1) Multilevel clustering (coarsening)
• (2) Initial partitioning
• (3) Multilevel FM refinement with unclustering (uncoarsening)
• hMetis
Design hierarchy
• A hierarchical grouping that already carries implications about connectivity
• Rent’s rule is used to identify which hierarchical elements are good or bad in terms of physical connectivity
Rent’s rule
• Rent’s rule & Rent exponent: E = P · B^r
  • E = external pin count
  • B = # of cells inside
  • P = avg # of pins per cell
  • r = Rent exponent
Rent exponent
• For a hierarchical element H, the Rent exponent is
  r(H) = log(E / P) / log(|H|) = 1 + log(E / (I + E)) / log(|H|)
  • E = external pin count
  • I = internal pin count
  • P = avg # of pins per cell = (I + E) / |H|
Rent exponent
• Small r → more strongly connected cells inside
• Large r → more weakly connected cells inside
• Examples: r = ln(4/34)/ln 10 + 1 = 0.0147 and r = ln(15/25)/ln 10 + 1 = 0.778
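As a quick check of the formula above, a small Python helper (not from the paper) reproduces the second example on this slide from E, I, and |H|:

```python
from math import log

def rent_exponent(e_pins, i_pins, num_cells):
    """Rent exponent of a hierarchical element H.

    From Rent's rule E = P * B**r with B = num_cells and
    P = (I + E) / B, it follows that
    r = log(E / P) / log(B) = 1 + log(E / (I + E)) / log(B).
    """
    p = (i_pins + e_pins) / num_cells       # average pins per cell
    return log(e_pins / p) / log(num_cells)

# Second example on the slide: E = 15, I = 10, |H| = 10 cells
print(rent_exponent(15, 10, 10))            # ~0.778
```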
Selective preservation of DH
• Global Rent exponent r = weighted average of the Rent exponents of all hierarchical elements in DH
• A hierarchical element H is preserved or broken according to r(H):
  • If r(H) ≤ r: H is used as a search scope for clustering the cells inside H – positive scope
  • If r(H) > r: H is removed from DH, and the cells inside H can be freely clustered with outside cells – negative scope
Modification of DH
• Remove all negative scopes from the design hierarchy D → scope tree D'
• H(v) (the parent of v in D') serves as the clustering scope for v
[Figure: design hierarchy D with positive/negative scopes and the resulting scope tree D', where H(u) = H1 and H(v) = H2]
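A minimal sketch of how D could be pruned into the scope tree D', assuming a simple tree representation (ScopeNode, build_scope_tree, and the attribute names are illustrative, not from the paper):

```python
class ScopeNode:
    """Node of the design hierarchy tree D; leaf nodes are cells."""
    def __init__(self, name, children=None, rent_exp=None):
        self.name = name
        self.children = children or []
        self.rent_exp = rent_exp            # None for leaf cells

def build_scope_tree(node, global_r):
    """Return the surviving subtree(s) of `node` after removing negative
    scopes (r(H) > global_r); children of a removed scope are promoted to
    its parent, so its cells may later cluster with outside cells."""
    promoted = []
    for child in node.children:
        promoted.extend(build_scope_tree(child, global_r))
    node.children = promoted
    is_cell = node.rent_exp is None
    if is_cell or node.rent_exp <= global_r:   # cell or positive scope: keep
        return [node]
    return promoted                             # negative scope: splice out
```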
DH guided ML clustering
• Input: bottommost hypergraph G1 & design hierarchy D
• Output: k-level clustering tree C
• Modify D to D'
• do
  • Perform cluster_one_level(Gk) with D' → upper-level hypergraph Gk+1
  • Update D'
  • k = k + 1
• until Gk is saturated
Global saturation
• Saturation condition (stopping criterion):
  • # of vertices drops below a threshold, or
  • problem size reduction rate exceeds a threshold
  • (thresholds of 100 and 0.9, respectively, in our experiments)
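The coarsening loop of the two previous slides might look roughly as follows; cluster_one_level, restructure, and num_vertices are passed-in placeholders for the steps named on these slides, and the threshold comparisons reflect one reading of the saturation condition:

```python
def dh_guided_ml_clustering(g1, scope_tree, cluster_one_level, restructure,
                            num_vertices, min_vertices=100, max_ratio=0.9):
    """Coarsen level by level until the hypergraph is saturated: too few
    vertices left, or too little reduction between consecutive levels."""
    levels = [g1]
    while True:
        coarser = cluster_one_level(levels[-1], scope_tree)   # one coarsening pass
        scope_tree = restructure(scope_tree, coarser)         # drop saturated scopes
        ratio = num_vertices(coarser) / num_vertices(levels[-1])
        levels.append(coarser)
        if num_vertices(coarser) <= min_vertices or ratio >= max_ratio:
            break
    return levels                                             # k-level clustering tree C
```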
Clustering scope
• A hierarchical node serves as a clustering scope
• For each anchor v, the best neighbor w to be matched with v is searched within H(v)
• u is selected as an anchor before v if H(u) ⊂ H(v)
[Figure: scope tree D' with scopes H1–H4 and vertices u, v]
Scope restricted clustering
• cluster_one_level()
  • For a randomly selected unmatched vertex v, find the w within the scope H(v) that maximizes the clustering cost
  • Vertices with smaller scopes are selected as anchors earlier
  • Create a new upper-level cluster v' from v and w
  • H(v') := H(v), since H(v) ⊆ H(w)
Scope restricted clustering (cont’d)
• cluster_one_level() – continued
  • If there is no best target w, create v' with v alone
  • If w is already matched into a cluster v', append v to that v'
  • The “unmatched” condition is relaxed – an already matched neighbor w is also considered → more problem size reduction
  • H(v') := H(v), since H(v) ⊆ H(v')
One level clustering
• No reduction rate control, to take full advantage of the design hierarchy → aggressively reduced # of levels in the resulting clustering tree
• Cluster sizes are controlled so that they cannot exceed bal_ratio × total_size
• Local saturation condition for a scope X:
  • # of vertices in X drops below a threshold, or
  • size reduction rate in X exceeds a threshold
  • (threshold values fixed in our experiments)
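Putting the matching rules of the last three slides together, one coarsening pass could be sketched as below; the data layout (plain dicts), the cost callback, and the random tie-breaking are assumptions rather than the paper’s exact procedure:

```python
import random

def cluster_one_level(vertices, neighbors, scope_of, sizes, cost, max_cluster_size):
    """One scope-restricted coarsening pass.

    vertices : iterable of vertex ids
    neighbors: dict vertex -> iterable of adjacent vertices
    scope_of : dict vertex -> set of vertices inside the scope H(v)
    sizes    : dict vertex -> vertex size
    cost     : function (v, w) -> clustering gain to maximize
    Returns a dict mapping each vertex to its cluster id at the next level.
    """
    # Vertices with smaller scopes become anchors earlier; ties broken randomly.
    order = sorted(vertices, key=lambda v: (len(scope_of[v]), random.random()))
    cluster_of, cluster_size, next_id = {}, {}, 0
    for v in order:
        if v in cluster_of:                        # v already matched into a cluster
            continue
        best, best_gain = None, float("-inf")
        for w in neighbors[v]:
            if w == v or w not in scope_of[v]:     # search only within the scope H(v)
                continue
            cid = cluster_of.get(w)                # an already matched w is also allowed
            merged = sizes[v] + (cluster_size[cid] if cid is not None else sizes[w])
            if merged > max_cluster_size:          # respect the cluster size bound
                continue
            gain = cost(v, w)
            if gain > best_gain:
                best, best_gain = w, gain
        if best is None:                           # no valid target: singleton cluster
            cluster_of[v] = next_id
            cluster_size[next_id] = sizes[v]
            next_id += 1
        elif best in cluster_of:                   # append v to w's existing cluster
            cid = cluster_of[best]
            cluster_of[v] = cid
            cluster_size[cid] += sizes[v]
        else:                                      # new cluster v' from v and w
            cluster_of[v] = cluster_of[best] = next_id
            cluster_size[next_id] = sizes[v] + sizes[best]
            next_id += 1
    return cluster_of
```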
Scope tree restructuring
• The scope tree is restructured after one level of clustering by removing saturated scopes
• Enlarged clustering scopes are then used for higher-level clustering over bigger & fewer clusters
[Figure: scope tree D' with H(u) = H1 and H(v) = H2; after one level of clustering H1 and H2 are saturated, so the restructured tree has H(u') = H3 and H(v') = H4]
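Reusing the ScopeNode sketch from earlier, the restructuring step could look like this; is_saturated is a placeholder for the local saturation test on the previous slide:

```python
def restructure_scope_tree(node, is_saturated):
    """Remove saturated scopes after a clustering level; their children
    are promoted to the parent, so the enlarged parent scope is used when
    clustering the bigger, fewer clusters of the next level."""
    promoted = []
    for child in node.children:
        promoted.extend(restructure_scope_tree(child, is_saturated))
    node.children = promoted
    if is_saturated(node):          # saturated scope: splice it out of D'
        return promoted
    return [node]
```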
DH guided ML partitioning
• dhml:
  • Perform the Rent exponent computation on D
  • Apply DH guided ML clustering to obtain a k-level clustering tree C
  • At the coarsest level, execute 20 runs of FM and pick the best one
  • From the partition at level k down to level 0, apply unclustering and FM_partition to improve the partition inherited from the upper levels
DH guided ML partitioning
• Multi-way partitioning: dhml RBP
  • Recursive bi-partitioning
  • Partial design hierarchy trees are used for each sub-partitioning
  • Performance compared with the hMetis RBP version
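For a single bi-partition, the overall dhml flow above could be wired together as in the sketch below; every helper behind `steps` (rent_exponents, ml_clustering, fm_partition, project, cut_size) is a stand-in for a stage named on these slides, not the paper’s actual API:

```python
def dhml_bipartition(netlist, hierarchy, steps, fm_runs=20):
    """Top-level flow of DH guided ML bi-partitioning (a sketch)."""
    global_r = steps.rent_exponents(hierarchy)                  # Rent exponents on D
    levels = steps.ml_clustering(netlist, hierarchy, global_r)  # k-level clustering tree C
    # 20 FM runs on the coarsest hypergraph; keep the best initial partition.
    best = min((steps.fm_partition(levels[-1]) for _ in range(fm_runs)),
               key=steps.cut_size)
    # Uncluster level by level, projecting the partition down and refining with FM.
    for level in reversed(levels[:-1]):
        best = steps.fm_partition(level, initial=steps.project(best, level))
    return best
```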
Experimental results • Circuit characteristics
Experimental results
• Cut set size comparison (minimum cut size from 5 runs of dhml & 10 runs of hMetis RBP)
• Up to 16% better quality with half the # of runs
Experimental results • Quality stability
Experimental results
• Observations:
  • 20-50% better quality in the initial partition at the coarsest level
  • Number of levels reduced to 55-75% of hMetis while still producing up to 16% better cut quality
  • More stable cut quality, implying a smaller # of runs is needed to obtain a near-best solution
  • Runtime similar to or slightly higher than hMetis
Summary
• A systematic ML partitioning method exploiting design hierarchy was presented
• ML clustering guided by design hierarchy
  • Rent exponent
  • Clustering scope restriction
  • Dynamic scope restructuring
• Experimental results show…
  • Better clustering trees
  • More stable, higher-quality solutions