Hierarchical Face Clustering on Polygonal Surfaces Michael Garland University of Illinois at Urbana–Champaign Andrew Willmott Paul S. Heckbert Carnegie Mellon University
Overview • Surface models are often too complex • may exceed processing, storage, … capacity • may represent unnecessary detail • Must be able to control level of detail • has motivated work on surface simplification • Here we describe an alternative approach • still produces hierarchical representation • aggregate, rather than approximate geometry
Surface Simplification Iteratively merging adjacent vertices
Iterative Face Clustering Repeatedly merge pairs of adjacent clusters
Face Clusters • A connected set of triangles on the surface • use disjoint clusters to partition the surface • record certain aggregate properties • representative plane • surface area & boundary perimeter • Clusters may be non-simple regions • individual clusters may have holes (e.g., cluster A in the figure has 3 boundary loops)
Our Sample Application: Multiresolution Radiosity • Existing history of hierarchical methods • Hierarchical radiosity [Hanrahan et al 91] • Hier. radiosity + volume clustering [Smits et al 94] • Essential for good performance • non-hierarchical methods have O(n²) performance For details on the radiosity algorithm: Willmott et al. Face Cluster Radiosity. Eurographics Rendering Workshop, 1999. 3.3 million polygon scene
The Basic Problem: Excessively Detailed Input 1,000,000 input triangles 870,000 input triangles Fine detail has little effect on the ultimate solution.
Dual Graph of Meshes • Assume we start with a triangulated mesh • construct one dual node per face • connect dual nodes if their faces are adjacent • Non-manifolds can cause efficiency problems • k faces per edge = O(k²) dual edges
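The construction is only a few lines. The sketch below (C++; the Mesh/Face types and edge-keying scheme are illustrative assumptions, not the authors' actual data structures) builds one dual node per face and links faces that share a mesh edge; it also shows exactly where the O(k²) blow-up for non-manifold edges comes from.

```cpp
#include <map>
#include <utility>
#include <vector>

struct Face { int v[3]; };                  // vertex indices of a triangle
struct Mesh { std::vector<Face> faces; };

struct DualGraph {
    std::vector<std::pair<int,int>> edges;  // pairs of adjacent face indices
};

DualGraph buildDualGraph(const Mesh& mesh) {
    DualGraph dual;
    // Map each undirected mesh edge (a < b) to the faces touching it.
    std::map<std::pair<int,int>, std::vector<int>> edgeToFaces;
    for (int f = 0; f < (int)mesh.faces.size(); ++f) {
        for (int i = 0; i < 3; ++i) {
            int a = mesh.faces[f].v[i], b = mesh.faces[f].v[(i + 1) % 3];
            if (a > b) std::swap(a, b);
            edgeToFaces[{a, b}].push_back(f);
        }
    }
    // One dual node per face; faces sharing a mesh edge get a dual edge.
    // A non-manifold edge with k incident faces yields O(k^2) dual edges,
    // the efficiency problem noted above.
    for (const auto& entry : edgeToFaces) {
        const std::vector<int>& faces = entry.second;
        for (size_t i = 0; i < faces.size(); ++i)
            for (size_t j = i + 1; j < faces.size(); ++j)
                dual.edges.push_back({faces[i], faces[j]});
    }
    return dual;
}
```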
Clustering = Dual Contraction • Consider contracting an edge of dual graph • merges two dual nodes into a single node • Equivalent to merging associated clusters • hence, iterative clustering = iterative contraction
Iterative Clustering Algorithm • Construct dual graph for mesh (every face is a singleton cluster) • Find cost of contraction of all dual edges (place in heap for efficient queries) • Loop until finished • contract dual edge of least cost • update costs of neighboring edges
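A minimal sketch of this greedy loop is shown below. The cost, merge, and neighbor callbacks are placeholders supplied by the caller (these interfaces are assumptions, not the authors' implementation), and stale heap entries are skipped lazily via a version counter rather than being removed from the heap.

```cpp
#include <functional>
#include <queue>
#include <vector>

struct Candidate {
    double cost;
    int a, b;      // cluster (dual node) ids to merge
    long stamp;    // sum of the two clusters' version counters when queued
};

void clusterGreedily(
    const std::vector<std::pair<int,int>>& dualEdges,
    int liveClusters, int targetClusters,
    std::vector<long>& version,                          // bumped by merge()
    const std::function<double(int,int)>& cost,          // e.g. planarity quadric
    const std::function<int(int,int)>& merge,            // returns merged cluster id
    const std::function<std::vector<int>(int)>& neighbors)
{
    auto greater = [](const Candidate& x, const Candidate& y) { return x.cost > y.cost; };
    std::priority_queue<Candidate, std::vector<Candidate>, decltype(greater)> heap(greater);

    // Cost every dual edge up front.
    for (const auto& e : dualEdges)
        heap.push({cost(e.first, e.second), e.first, e.second,
                   version[e.first] + version[e.second]});

    while (liveClusters > targetClusters && !heap.empty()) {
        Candidate c = heap.top(); heap.pop();
        if (c.stamp != version[c.a] + version[c.b])
            continue;                                    // stale entry, skip it
        int m = merge(c.a, c.b);                         // contract the dual edge
        --liveClusters;
        for (int n : neighbors(m))                       // re-cost neighboring edges
            heap.push({cost(m, n), m, n, version[m] + version[n]});
    }
}
```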
Iterative Clustering Algorithm:Things to Notice • Surface is always completely partitioned • into disjoint face clusters — one per dual node • does not alter surface geometry at all • Looks very much like surface simplification • clustering is the dual of simplification • As with simplification, produces hierarchy • simplification — tree of vertex neighborhoods • clustering — tree of face clusters
Face Cluster Hierarchies • Contraction merges 2 clusters • producing new larger cluster • Iteration forms binary tree • original faces at leaves • children merge & form parent • 1 root per connected component • Similar to vertex hierarchies • result from simplification • used in applications such as view-dependent refinement
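For illustration, a hierarchy node might look like the sketch below; the field names and layout are assumptions, but they capture the structure just described: original faces at the leaves, two children per interior node, one root per connected component, and aggregate properties carried by every cluster.

```cpp
#include <vector>

struct ClusterNode {
    int face   = -1;               // original face index at a leaf, -1 otherwise
    int left   = -1;               // child cluster indices (-1 at leaves)
    int right  = -1;
    int parent = -1;               // -1 at a root (one root per component)
    // Aggregate properties recorded per cluster:
    double area = 0, perimeter = 0;
    double plane[4] = {0, 0, 0, 0}; // representative plane (normal, offset)
};

// The whole hierarchy is just an array of nodes; merging clusters appends
// a new parent whose aggregates are the sums of its children's.
using ClusterHierarchy = std::vector<ClusterNode>;
```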
Measuring Contraction Cost • This is the big outstanding question • the choice of metric has a great effect on results • Our choice: we want “mostly planar” clusters • Why? Consider the radiosity application • clusters are approximated with planar elements • non-planarity leads to an imprecise solution
Measuring Planarity • Consider the set of all vertices in a cluster • can fit some plane to this set of points • the quality of the fit will measure planarity • We choose the least squares best plane • find the plane that minimizes the mean squared distance of the points to the plane • this minimum mean squared distance itself measures the degree of planarity
Planarity Metric • Formally, this measure of planarity is the mean (1/n) Σᵢ Pᵢ, where Pᵢ = the squared distance of point i to the given plane • this sum is computed using the dual quadric error metric
Using Quadric Metrics • Each node has an associated quadric • initially constructed from the 3 corners of each face • sum quadrics when merging nodes • To compute the cost of contracting an edge • add the quadrics of its endpoints • find the representative plane, and evaluate the combined quadric's fit error at that plane
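A small sketch of such a quadric is shown below. The class layout is an assumption, but the arithmetic follows the slides: accumulate point sums (10 coefficients in total), add quadrics when clusters merge, and evaluate the mean squared point-to-plane distance for a candidate plane.

```cpp
// Dual quadric for a cluster: A = sum of v v^T (symmetric 3x3, 6 values),
// b = sum of v (3 values), c = number of points. 10 doubles total.
struct DualQuadric {
    double A[6] = {0, 0, 0, 0, 0, 0};   // xx, xy, xz, yy, yz, zz
    double b[3] = {0, 0, 0};
    double c    = 0;

    void addPoint(double x, double y, double z) {
        A[0] += x * x; A[1] += x * y; A[2] += x * z;
        A[3] += y * y; A[4] += y * z; A[5] += z * z;
        b[0] += x; b[1] += y; b[2] += z; c += 1;
    }
    // Merging two clusters just adds their quadrics (10 additions).
    void add(const DualQuadric& q) {
        for (int i = 0; i < 6; ++i) A[i] += q.A[i];
        for (int i = 0; i < 3; ++i) b[i] += q.b[i];
        c += q.c;
    }
    // Mean squared distance of the cluster's points to the plane
    // n . v + d = 0 (n assumed unit): (n^T A n + 2 d b.n + c d^2) / c.
    double fitError(const double n[3], double d) const {
        double nAn = n[0] * (A[0]*n[0] + A[1]*n[1] + A[2]*n[2])
                   + n[1] * (A[1]*n[0] + A[3]*n[1] + A[4]*n[2])
                   + n[2] * (A[2]*n[0] + A[4]*n[1] + A[5]*n[2]);
        double bn  = b[0]*n[0] + b[1]*n[1] + b[2]*n[2];
        return (nAn + 2.0 * d * bn + c * d * d) / c;
    }
};
```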
Finding Planes for Clusters • Fit least squares best plane to set of points • we use principal component analysis (PCA) • a very common approach to the problem • Construct the sample covariance matrix • smallest eigenvector is normal of optimal plane • assume plane passes through mean of points
Finding Planes for Clusters • Can derive the covariance matrix directly from the quadrics • this ignores the 1/n averaging factor • which is fine, because only the relative order of the eigenvalues matters • smallest eigenvector provides the normal • the other 2 eigenvectors provide axes for an oriented bounding box
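Concretely, the PCA plane fit can be computed from the same accumulated sums. The sketch below uses the Eigen library for the symmetric eigen-decomposition (its solver returns eigenvalues in increasing order); the function signature itself is an assumption made for illustration.

```cpp
#include <Eigen/Dense>

void fitPlane(const Eigen::Matrix3d& sumVVt,   // A = sum of v v^T
              const Eigen::Vector3d& sumV,     // b = sum of v
              double count,                    // c = number of points
              Eigen::Vector3d& normal, double& d)
{
    Eigen::Vector3d mean = sumV / count;
    // Sample covariance up to the ignored 1/n averaging factor:
    // only the relative order of eigenvalues matters.
    Eigen::Matrix3d cov = sumVVt - count * mean * mean.transpose();

    Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> es(cov);
    normal = es.eigenvectors().col(0);   // eigenvector of smallest eigenvalue
    d = -normal.dot(mean);               // plane passes through the mean
    // The other two eigenvectors give axes for an oriented bounding box.
}
```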
Why Use Quadrics? • Expresses the error we want to measure • namely planarity (in the least squares sense) • Fairly compact, efficient representation • storage: 10 doubles per quadric • time: easy formula to evaluate • Very easy updating rules • when 2 nodes are merged, their quadrics are simply added • so only 10 additions per contraction
Example Results • clusterings with 11,036 / 6,000 / 1,000 / 200 clusters
Adding Orientation Bias • Both regions (a) and (b) in the figure are equally “planar” • their least squares planes are roughly the same • The leftmost region's shape would be preferable • we want regions with consistent normals • so we add an additional error term
Orientation Bias Metric • Each cluster has a representative normal • we measure the deviation of all face normals from it • As with planarity, this can be written as a quadric
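One way such a deviation term can be accumulated and evaluated is sketched below. The area weighting and the exact normalization are assumptions; the slide only states that the deviation measure can be written as a quadric, which this form is (it is quadratic in the cluster normal N).

```cpp
// Average squared deviation of a candidate cluster normal N from the
// (unit) face normals, merged by simple addition like the fit quadric.
struct NormalDeviation {
    double b[3] = {0, 0, 0};   // sum of w_i * n_i
    double w    = 0;           // total weight (e.g. face area)

    void addFaceNormal(const double n[3], double weight) {
        for (int i = 0; i < 3; ++i) b[i] += weight * n[i];
        w += weight;
    }
    void add(const NormalDeviation& o) {       // merging clusters: just add
        for (int i = 0; i < 3; ++i) b[i] += o.b[i];
        w += o.w;
    }
    // (1/w) * sum_i w_i * ||N - n_i||^2
    //   = (w |N|^2 - 2 b.N + w) / w   for unit face normals n_i.
    double deviation(const double N[3]) const {
        double NN = N[0]*N[0] + N[1]*N[1] + N[2]*N[2];
        double bN = b[0]*N[0] + b[1]*N[1] + b[2]*N[2];
        return (w * NN - 2.0 * bN + w) / w;
    }
};
```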
Resulting Cluster Shape • Result of using planarity + orientation bias alone • very jagged boundaries • long, irregular shapes (“gerrymandering”) • May be undesirable (application dependent) • yields poor radiosity solutions due to shadows
Measuring Cluster Irregularity • We define the irregularity of a cluster as perimeter² / (4π · area) • minimum value is 1 (achieved by a circle) • higher irregularity values mean less circular shapes • This is a fairly common definition • image processing, surface clustering, politics, …
Cluster Shape Bias • So we introduce an additional shape bias • it penalizes contractions that increase irregularity • Why a bias and not a hard limit? • guarantees that progress can always be made • will always produce a complete cluster hierarchy
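A direct translation of the irregularity measure, plus one possible way to turn it into a merge penalty, is sketched below. The penalty form is an assumption made for illustration, not the authors' exact term; only the compactness formula follows the slide above.

```cpp
#include <algorithm>

// Irregularity as described above: 1 for a circle, larger for less
// circular clusters.
double irregularity(double perimeter, double area) {
    const double pi = 3.14159265358979323846;
    return (perimeter * perimeter) / (4.0 * pi * area);
}

// A possible shape-bias term (assumption): penalize a merge by how much
// the merged cluster's irregularity exceeds that of its worse child.
double shapeBias(double irregMerged, double irregChildA, double irregChildB) {
    double worstChild = std::max(irregChildA, irregChildB);
    return std::max(0.0, irregMerged - worstChild);
}
```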
The Final Cost Metric • example clusterings shown without shape bias and with shape bias
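The combined formula itself is not reproduced on this slide, so the sketch below shows only one plausible way the three terms could be combined into a single contraction cost; the additive form and the weight parameters are assumptions for illustration.

```cpp
// Hypothetical combined contraction cost: planarity fit error plus
// orientation deviation plus shape penalty, each with an assumed weight.
double contractionCost(double planarityError,    // from the fit quadric
                       double orientationError,  // normal-deviation term
                       double shapePenalty,      // irregularity increase
                       double wFit = 1.0, double wDir = 1.0, double wShape = 1.0)
{
    return wFit * planarityError + wDir * orientationError + wShape * shapePenalty;
}
```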
Running Times on 450 MHz Intel Pentium III system
Sample Radiosity Results • Large scene complexity (3,350,000 triangles) • Solution time: 450 sec (on 195 MHz MIPS R10000)
Radiosity Solution Time • chart of running time (seconds) vs. number of input triangles, comparing progressive radiosity, volume clustering, and face clustering
Why Nearly Flat Growth? • Solutions are run with fixed error threshold • settles on an “appropriate” level in the hierarchy • sufficient detail for requested precision • far above the leaves of the hierarchy • once found, never descends below this level • Works because clusters fit surface well • poor clusters = descend further for accuracy • this is a problem for volume clustering • often little coherence of elements in a single cell
Why Not Use Simplification? • Vertex hierarchies are less effective • empirically tested against face hierarchies • intuitively, they optimize the wrong thing • a planar vertex neighborhood is a useless one • Static levels of detail wouldn’t work • transport links established between node pairs • desired LOD varies with transport partner • nearby patches need finer detail than far away ones
Related Work • General idea of clustering is an old one • most commonly on (multi-dimensional) point sets • Duality is often used in geometric algorithms • “simplex meshes” representation [Delingette 94] • Some region clustering work on … • subdivision surfaces [DeRose et al 98] • (range) images [Faugeras et al 87, Willersinn et al 94] • polygon meshes [Kalvin & Taylor 96]
Future Directions • Alternative clustering heuristics • planarity is quite well suited to radiosity • but is certainly not ideal for all applications • Balancing competing goals • summing error terms causes some problems • might want other terms (e.g., balanced tree) • Non-greedy algorithm framework • might produce noticeably better partitions
Future Directions:Other Possible Applications • Distance & intersection queries • oriented bounding boxes for all clusters • similar to STRIP tree [Ballard 81] curve representation • could be used for ray tracing, for example • Collision detection • very similar to OBBTrees [Gottschalk et al 96] • bottom up merging vs. top down partitioning • partitions surface rather than partitioning space
Summary • Face clustering is the dual of simplification • same LOD problem — same framework works • greedy, iterative edge contraction on dual graph • dual quadric error metric for guiding iteration • aggregate properties vs. approximate geometry • Important features of our method • hierarchical representation of surface partitions • practical & efficient construction algorithm • an effective (dual) quadric error metric
More Details Available Online • Radiosity software • Clustering software • Related papers • http://graphics.cs.uiuc.edu/~garland/research/cluster.html