A Partition-Based Heuristic for Translational Box Covering Ben England and Karen Daniels Department of Computer Science University of Massachusetts Lowell supported in part by NSF and DARPA under grant DMS-0310589
Motivation for 2D Polygonal Covering
• Sensor coverage
• Repair work: a collection of pieces covers a hole
NP-hard problem. Supported under NSF/DARPA CARGO program
Box Covering
• Goal: Translate a collection of boxes (orthotopes) Q = {Q1, Q2, ..., QN} to cover another box P in 2d, 3d, ... (d = dimension)
• Motivation: Boxes can form enclosures for general shapes.
NP-hard problem; 1st published results in > 2d.
Figures: 2d views of 3d covering: partial cover (red part uncovered) and full cover, with 40 and 20 covering shapes.
With Masters student B. England. Supported under NSF/DARPA CARGO program
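To make the part-based notion of coverage concrete, here is a minimal, dimension-independent sketch (not the paper's algorithm): P is cut into a uniform grid of small parts, and a part counts as covered only if it lies entirely inside a single translated box. All function and parameter names here are illustrative.

```python
from itertools import product

def box_contains(box, cell):
    """True if axis-aligned `box` fully contains axis-aligned `cell`.
    Both are lists of (lo, hi) intervals, one per dimension."""
    return all(b_lo <= c_lo and c_hi <= b_hi
               for (b_lo, b_hi), (c_lo, c_hi) in zip(box, cell))

def covered_fraction(P, placed_boxes, cells_per_axis=16):
    """Cut P into a uniform grid of parts and report the fraction of parts
    that lie entirely inside at least one placed covering box."""
    d = len(P)
    steps = [(hi - lo) / cells_per_axis for lo, hi in P]
    covered = total = 0
    for idx in product(range(cells_per_axis), repeat=d):
        cell = [(P[a][0] + idx[a] * steps[a],
                 P[a][0] + (idx[a] + 1) * steps[a]) for a in range(d)]
        total += 1
        covered += any(box_contains(b, cell) for b in placed_boxes)
    return covered / total

# 2d example: unit square P, two translated 0.6 x 1.0 boxes covering it.
P = [(0.0, 1.0), (0.0, 1.0)]
placed = [[(0.0, 0.6), (0.0, 1.0)], [(0.4, 1.0), (0.0, 1.0)]]
print(covered_fraction(P, placed))   # 1.0 -> full cover at this resolution
```

At a coarse grid this conservative test can under-report coverage; making the grid finer plays roughly the role that refinement plays in the heuristic described on the later slides.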
Selected Prior Covering Work
Covering taxonomy: combinatorial covering and geometric covering.
• Combinatorial covering: VERTEX-COVER, SET-COVER (including [Gri99]), EDGE-COVER, VLSI logic minimization, facility location.
• Geometric covering, including translational covering (P: finite point sets or shapes; Q: identical, convex, or nonconvex).
• Decomposition: 1D interval covered by annuli using an approximation algorithm [Hoc87]; BOX-COVER is NP-complete [Fow81]. Partition: decomposition with covering.
• Survey of non-algorithmic results [Tot04]; thin coverings of the plane with congruent convex shapes.
• Translational covering of arbitrary polygonal shapes [Dan01, Dan03].
• Translational B-spline covering [Nea06].
• Volume condition for translational covering of a cube by a sequence of convex shapes (arbitrary dimension) [Gro85].
• Volume condition for an on-line algorithm for translational covering of a cube by a sequence of convex shapes (arbitrary dimension) [Las97].
• NP-hardness proofs for 4 polygon covering problems [Cul88].
• Approximation algorithms for some orthogonal covering problems [Ber92].
• Approximation algorithm to cover an orthogonal polygon (with holes) with a minimum number of rectangles [Kum03].
• Clique-based Integer Programming (IP) model for covering an orthogonal polygon with a minimum number of rectangles [Hei05].
• Polynomial-time results for restricted orthogonal polygon covering and horizontally convex polygons; polynomial-time algorithms for triangulation [Cha91] and some tilings.
Box Covering Outline
• Set covering approach
• Key volume expressions
• Partition-based heuristic
• Experimental highlights
• Dimension-independent volume test
• Computational considerations (execution time dominated by 1-OPT; alternatives to 1-OPT; 1-OPT preprocessing; monotonicity across calls to the Lagrangian heuristic)
• Conclusion and future work
The Lagrangian heuristic comes from a Lagrangian relaxation of the IP model. The 1-OPT heuristic swaps in, for some cover shape, the group that best improves the objective function, repeating until no swap improves it.
Set Covering Approach Applied to Box Covering
Parts of P: 1, 2, 3, 4. Part groups for each cover shape:
• C1 = {g11, g12}, with g11 = {1,3}, g12 = {2,4}
• C2 = {g21, g22}, with g21 = {1,2}, g22 = {3,4}
• C3 = {g31, g32, g33, g34}, with g31 = {1}, g32 = {2}, g33 = {3}, g34 = {4}
Problem: choose just one part group gjk from the set of part groups Cj for each cover shape Qj such that every part of P is in one of the chosen part groups. Solution: {g11, g21, g34}.
IP model, maximizing the number of parts covered, treated with a Lagrangian heuristic + 1-OPT. Based on Daniels and Grinde, IIE Transactions, 1999.
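A brute-force illustration of this selection problem on the slide's example, purely for intuition; the paper solves it via the IP model and Lagrangian heuristic + 1-OPT rather than enumeration, and the dictionary names below are illustrative.

```python
from itertools import product

# Part groups from this slide: exactly one group must be chosen per cover shape.
groups = {"Q1": {"g11": {1, 3}, "g12": {2, 4}},
          "Q2": {"g21": {1, 2}, "g22": {3, 4}},
          "Q3": {"g31": {1}, "g32": {2}, "g33": {3}, "g34": {4}}}
all_parts = {1, 2, 3, 4}

# Enumerate every way of picking one group per cover shape; keep the
# selections whose chosen groups together contain every part of P.
for combo in product(*(opts.items() for opts in groups.values())):
    if set().union(*(parts for _, parts in combo)) == all_parts:
        print([name for name, _ in combo])    # e.g. ['g11', 'g21', 'g34']
```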
Key Volume Expressions: volume, quantized volume, effective volume, quantized effective volume (δ = a generic part of P).
Heuristic: uniform refinement scheme, unlike the general polygonal approach of [Dan03], which subdivides one triangle during each iteration of its repeat loop.
Legend: d = dimension; δ = a generic part of P; j = total number of parts of P; N = number of covering shapes; LGC_Cover( ) = modified Lagrangian Heuristic + 1-OPT.
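A schematic of the uniform-refinement loop sketched here, with LGC_Cover( ) treated as a black-box callable. The helper names, the split-along-the-longest-axis rule, and the iteration cap are assumptions for illustration, not the paper's exact pseudocode.

```python
def split_along_longest_axis(box):
    """Split an axis-aligned box (a list of (lo, hi) intervals) into two
    halves along its longest axis, doubling the number of parts."""
    axis = max(range(len(box)), key=lambda a: box[a][1] - box[a][0])
    lo, hi = box[axis]
    mid = (lo + hi) / 2.0
    left, right = list(box), list(box)
    left[axis], right[axis] = (lo, mid), (mid, hi)
    return left, right

def orthotope_cover(P, Q, lgc_cover, max_refinements=12):
    """Uniform-refinement loop. `lgc_cover(parts, Q)` stands in for the
    Lagrangian heuristic + 1-OPT and is assumed to return the chosen
    placements and the set of parts it managed to cover."""
    parts = [P]                                   # start with P as one part
    for _ in range(max_refinements):
        placements, covered_parts = lgc_cover(parts, Q)
        if len(covered_parts) == len(parts):      # every part covered: done
            return placements
        # Uniform refinement: split every part, doubling the part count.
        parts = [half for p in parts for half in split_along_longest_axis(p)]
    return None                                   # no full cover in budget
```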
Experimental Highlights: 2D Validation Experiment
• 20 instances with square P and N = 2…6 rectangular covering shapes
• OrthotopeCover( ) outperforms the polygonal solver of Daniels, et al. [CCCG2003] by at least 2 orders of magnitude
• Simpler geometric operations: no Minkowski sums (the Minkowski sum of two sets A and B is A ⊕ B = {a + b : a ∈ A, b ∈ B})
• Volume tests that do not generalize to arbitrary polygons
Example: 4 identical square covering shapes; cover shape volume = 1.21.
Daniels, et al. [CCCG2003]: 167 triangular parts, 94 triangle vertices, 888 groups, 875 seconds.
Current paper: 4 square parts, 5 square vertices, 80 groups, 0.2 seconds.
450 MHz Sun SPARC Ultra 60™ CPU with 512 MB memory.
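For reference, the Minkowski sum that the polygonal solver relies on reduces, in the special case of two axis-aligned boxes, to summing intervals axis by axis; a tiny sketch (illustrative only):

```python
def minkowski_sum_boxes(A, B):
    """Minkowski sum of two axis-aligned boxes, each given as a list of
    (lo, hi) intervals per axis: add the intervals axis by axis."""
    return [(a_lo + b_lo, a_hi + b_hi)
            for (a_lo, a_hi), (b_lo, b_hi) in zip(A, B)]

# Example: two unit squares give a 2 x 2 square.
print(minkowski_sum_boxes([(0, 1), (0, 1)], [(0, 1), (0, 1)]))  # [(0, 2), (0, 2)]
```

For general polygons the sum is far more involved, which is part of why the box heuristic's avoidance of Minkowski sums pays off.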
Experimental Highlights: Results in 3d and 4d.
Legend: d = dimension; N = number of covering shapes; csv = cover shape volume; j = total number of parts of P; maximum aspect ratio of a covering shape = 4.
3 GHz 64-bit Intel Pentium™ D CPU with 2 GB memory.
Experimental Highlights: More results in 3d (2d views of 3d covering, rendered using OpenGL).
1 GHz Intel Pentium™ 4 CPU with 1/2 GB memory.
Problem Instance “Hardness” Characterization: quantized effective volume ratio and dimension-independent volume margin.
Legend: N = number of covering shapes; d = dimension; δ = a generic part of P; j = total number of parts of P.
Effectiveness of Y: ~10 instances for each parameter combination; tp = total points; r = correlation coefficient between Y and % coverage.
1 GHz Intel Pentium™ 4 CPU with 1/2 GB memory.
Heuristic: Effectiveness of Y (volume-margin test added).
d | # instances | j | N | # calls saved | % savings
3 | 17 | 8192 | 12-18 | 2.8 | 47.2
3 | 13 | 128-4096 | 12-16 | 2.3 | 58.8
# calls saved = average per instance; % savings = average relative % savings of LGC_Cover( ) calls. 100% coverage was reached by both the original and revised heuristic in all these cases.
3 GHz 64-bit Intel Pentium™ D CPU with 2 GB memory.
Computational Considerations
• Execution Time
• OrthotopeCover( ) dominated by LGC_Cover( )
• LGC_Cover( ) dominated by deterministic 1-OPT
• 1-OPT attempts to increase the lower bound on the Lagrangian dual
• Unlike the polygonal heuristic, in which group maintenance dominates
• Alternatives to 1-OPT
• 2-OPT too expensive
• Randomization: simulated annealing's random swaps are inferior to 1-OPT; random group sampling weakens 1-OPT
• 1-OPT Preprocessing
• 1-OPT really behaves like a greedy global improvement strategy
• 1-OPT preprocessing yields improvement in: 75% of 2d instances, 87% of 3d instances, 64% of 4d instances
Test suite = subset of 30 of our randomly generated instances: 10 2d, 10 3d, 10 4d. 3 GHz 64-bit Intel Pentium™ D CPU with 2 GB memory.
The 1-OPT heuristic swaps in, for some cover shape, the group that best improves the objective function, repeating until no swap improves it (see the sketch below).
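A minimal sketch of a 1-OPT-style local swap as described on these slides, applied here to a plain count-of-covered-parts objective for readability (the paper's 1-OPT works with the Lagrangian objective and its preprocessing variant); all names are illustrative.

```python
def parts_covered(choice, groups):
    """Illustrative objective: number of distinct parts covered."""
    return len(set().union(*(groups[s][g] for s, g in choice.items())))

def one_opt(choice, groups, objective=parts_covered):
    """Greedy 1-OPT: repeatedly apply the single group swap (for one cover
    shape) that most improves the objective, until no swap improves it."""
    best = objective(choice, groups)
    while True:
        best_swap = None
        for shape, options in groups.items():
            for name in options:
                if name == choice[shape]:
                    continue
                trial = dict(choice, **{shape: name})   # swap one group
                score = objective(trial, groups)
                if score > best:
                    best, best_swap = score, (shape, name)
        if best_swap is None:
            return choice, best                         # local optimum
        choice[best_swap[0]] = best_swap[1]

# Tiny run on the group structure from the set-covering slide.
groups = {"Q1": {"g11": {1, 3}, "g12": {2, 4}},
          "Q2": {"g21": {1, 2}, "g22": {3, 4}},
          "Q3": {"g31": {1}, "g32": {2}, "g33": {3}, "g34": {4}}}
start = {"Q1": "g12", "Q2": "g22", "Q3": "g32"}         # covers {2, 3, 4}
print(one_opt(start, groups))                           # reaches all 4 parts
```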
Computational Considerations
• Monotonicity across calls to the LGC_Cover( ) Lagrangian heuristic
• No theoretical guarantee that the number of parts covered increases.
• The number of parts doubles before each LGC_Cover( ) call.
• LGC_Cover( ) is only a heuristic.
• Success depends on N, d, thickness of the cover, richness of the group structure, and strength of LGC_Cover( ).
• Group structure is rich.
• 1-OPT helps LGC_Cover( ) cover an increasing number of parts.
• Sample progression (monotonically increasing) for 2d, N = 6, csv = 1.25:
• 504 of 512 parts covered (98.4%)
• 1014 of 1024 parts covered (99.0%)
• 2039 of 2048 parts covered (99.6%)
• 4096 of 4096 parts covered (100%)
Test suite = subset of 30 of our randomly generated instances: 10 2d, 10 3d, 10 4d. 3 GHz 64-bit Intel Pentium™ D CPU with 2 GB memory.
The 1-OPT heuristic swaps in, for some cover shape, the group that best improves the objective function, repeating until no swap improves it.
Conclusion & Future Work • Partition-based, translational, box covering heuristic has dimension as an input. • Set covering approach uses uniform refinement. • First 3d, 4d heuristic results for our translational box covering problem • Found covers for some instances with as many as 50 covering shapes. • Box covering heuristic outperforms general, polygonal heuristic in 2d rectangular experiment. • Dimension-independent volume margin avoids many refinement steps. • Computational considerations • Execution time is dominated by deterministic 1-OPT improvement heuristic. • 1-OPT outperforms 2-OPT, simulated annealing and randomized 1-OPT. • 1-OPT preprocessing improves results. • Monotonicity across calls to Lagrangian heuristic occurs often in practice, although not theoretically guaranteed. • Future work: • Use boxes as enclosures for more general shapes to: • improve 2d general covering heuristic • treat 3d general covering • Allow rotations
References
Acknowledgement: Thanks to Michelle Daniels for comments.
For More Information
• Email: kdaniels@cs.uml.edu
• Web sites: http://www.cs.uml.edu/~kdaniels/covering/covering.htm and http://www.cs.uml.edu/~bengland/cg/orthotope_cover/
• Thanks for your attention! Questions?
BACKUP SLIDES (from CCCG 2003, etc.)
Translational 2D Polygon Covering: 2D Polygonal Covering [CCCG 2001, CCCG 2003]
• Input: covering polygons Q = {Q1, Q2, ..., Qm}; target polygons (or point-sets) P = {P1, P2, ..., Pn}
• Output: translations g = {g1, g2, ..., gm} such that P1 ∪ ... ∪ Pn ⊆ (Q1 + g1) ∪ ... ∪ (Qm + gm)
Figures: sample P and Q; translated Q covers P.
Supported under NSF/DARPA CARGO program. With graduate students R. Inkulu, A. Mathur, C. Neacsu, & UNH professor R. Grinde.
2D B-Spline Covering [CORS/INFORMS 2004, UMass Lowell Student Research Symposium 2004, Computer Graphics Forum, 2006]
Motivated by 3D CAD. NP-hard problem. With graduate student C. Neacsu. Supported under NSF/DARPA CARGO program.
Covering Web Site: http://www.cs.uml.edu/~kdaniels/covering/covering.htm
With graduate student C. Neacsu and undergraduate A. Hussin.
Combinatorial Covering Procedure: LAGRANGIAN-COVER IP Model
• Exactly 1 group is chosen for each Qj.
• A value of 1 is contributed to the objective function for each triangle covered by a Qj, where that triangle is in a group chosen for that Qj.
Variables and parameters are defined on the following slides.
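A hedged LaTeX reconstruction of the IP model, assembled from the variable and parameter descriptions on the slides that follow; the authoritative formulation is the one in Daniels and Grinde (IIE Transactions, 1999). Here ti = 1 if triangle Ti is covered, gkj = 1 if group Gk is chosen for Qj, aik = 1 if Ti belongs to Gk, and bkj = 1 if Gk is a candidate group for Qj.

```latex
\begin{align*}
\max \quad & \sum_{i} t_i \\
\text{s.t.} \quad & \sum_{k} b_{kj}\, g_{kj} = 1 && \text{for each cover shape } Q_j \\
& t_i \le \sum_{j} \sum_{k} a_{ik}\, b_{kj}\, g_{kj} && \text{for each triangle } T_i \\
& t_i \in \{0,1\}, \quad g_{kj} \in \{0,1\}.
\end{align*}
```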
Combinatorial Covering Procedure: LAGRANGIAN-COVER IP Parameters
Triangles: T1, T2, T3, T4, T5. Groups: G1, G2, G3. Cover shapes: Q1, Q2.
b11=1, b12=0; b21=0, b22=1; b31=1, b32=1
a11=1, a12=1, a13=1; a21=1, a22=1, a23=1; a31=1, a32=0, a33=0; a41=1, a42=0, a43=0; a51=0, a52=1, a53=0
Combinatorial Covering Procedure: LAGRANGIAN-COVER IP Constraints
Exactly 1 group for each Qj (j = 1, 2; k = 1, 2, 3).
b11=1, b12=0; b21=0, b22=1; b31=1, b32=1
Variables and parameters as defined above.
Combinatorial Covering Procedure: LAGRANGIAN-COVER IP Constraints
A value of 1 is contributed to the objective function for each triangle covered by a Qj, where that triangle is in a group chosen for that Qj (j = 1, 2; k = 1, 2, 3).
b11=1, b12=0; b21=0, b22=1; b31=1, b32=1
a11=1, a12=1, a13=1; a21=1, a22=1, a23=1; a31=1, a32=0, a33=0; a41=1, a42=0, a43=0; a51=0, a52=1, a53=0
Variables and parameters as defined above.
Combinatorial Covering Procedure: LAGRANGIAN-COVER IP Variables
Triangles: T1, T2, T3, T4, T5. Groups: G1, G2, G3. Group choices: G1 for Q1, G2 for Q2.
g11=1, g12=0; g21=0, g22=1; g31=0, g32=0
t1=1, t2=1, t3=1, t4=1, t5=1 (T1 and T2 are multiply covered)
Lagrangian Relaxation
Bring the triangle-coverage constraints into the objective function, keeping the constraint that exactly 1 group is chosen for each Qj. (Recall: a value of 1 is contributed to the objective function for each triangle covered by a Qj, where that triangle is in a group chosen for that Qj.)
Variables and parameters as defined above.
Lagrangian Relaxation
Removing the coverage constraints from the maximization and bringing them into the objective with Lagrange multipliers λ ≥ 0 (subtracting a term that is ≤ 0) gives the Lagrangian relaxation LR(λ). LR(λ) is an upper bound on the original IP; lower bounds come from any feasible solution to the original IP.
Lagrangian Dual: minimize LR(λ), subject to λ ≥ 0.
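In the same notation, a sketch of LR(λ) consistent with the SP1/SP2 rules on the next slide (an assumption drawn from those rules, not a transcription of the slide's formula): the coverage constraints are dualized with multipliers λi ≥ 0 and the objective regrouped into a t-part and a g-part.

```latex
\begin{align*}
LR(\lambda) \;=\; \max \quad & \sum_i t_i \;-\; \sum_i \lambda_i \Big( t_i - \sum_j \sum_k a_{ik}\, b_{kj}\, g_{kj} \Big) \\
            \;=\; \max \quad & \underbrace{\sum_i (1-\lambda_i)\, t_i}_{\text{SP1}}
            \;+\; \underbrace{\sum_j \sum_k \Big( \sum_i \lambda_i a_{ik} \Big) b_{kj}\, g_{kj}}_{\text{SP2}} \\
\text{s.t.} \quad & \sum_k b_{kj}\, g_{kj} = 1 \;\; \forall j, \qquad t_i, g_{kj} \in \{0,1\}, \qquad \lambda \ge 0.
\end{align*}
```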
Lagrangian Relaxation
LR(λ) is separable into subproblems SP1 (over the ti) and SP2 (over the gkj).
• Solve SP1: if (1 − λi) ≥ 0 then set ti = 1, else set ti = 0.
• Solve SP2: redistribute the multiplier-weighted terms and solve one sub-subproblem per j: compute the gkj coefficients and set to 1 the gkj with the largest coefficient.
For candidate λ values, solve SP1 and SP2.
Lagrangian Relaxation
• Generating a lower bound for the original IP:
• The SP2 solution yields gkj values feasible for the original IP.
• Modify the ti values accordingly.
• The result is feasible for the original IP.
Lagrangian Relaxation
• SP1 and SP2 have the integrality property: their solutions are unchanged when variable integrality is not enforced.
• So the optimal value of the Lagrangian Dual is no better than the Linear Programming relaxation of the original IP.
• Use as a heuristic:
• Upper bound for the original IP.
• Lower bound for the original IP, by generating a feasible solution to it.
• Fast, predictable execution time.
• Optimization software libraries not required.
Lagrangian Relaxation
• Search the λ space using subgradient optimization (see the sketch below). Iterate until stopping criteria are satisfied:
• Initialize the λi (e.g., to 0)
• Solve SP1 and SP2
• Update the upper bound using the sum of the SP1 and SP2 solution values
• Generate a feasible solution
• Improve the feasible solution using a local exchange heuristic
• Update the lower bound using the feasible solution
• Calculate subgradients
• Calculate the step size
• Take a step in the subgradient direction
• Update the λi
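A generic projected-subgradient skeleton matching the steps above. The two callables stand in for the SP1/SP2 solve and for the feasible-solution construction plus local exchange; the Held-style step-size rule and the stopping tests are assumptions, not necessarily the paper's choices.

```python
def subgradient_search(solve_relaxation, feasible_value, n_multipliers,
                       max_iters=200, agility=2.0):
    """Minimize the Lagrangian dual min_{lam >= 0} LR(lam) by projected
    subgradient steps (schematic; callables supplied by the user).

    solve_relaxation(lam) -> (LR(lam) value, list of constraint violations
                              t_i - coverage_i from the relaxed solution)
    feasible_value(lam)   -> objective of a feasible solution built from
                             that relaxed solution (after local exchange)
    """
    lam = [0.0] * n_multipliers                  # initialize multipliers to 0
    best_upper, best_lower = float("inf"), float("-inf")
    for _ in range(max_iters):
        upper, violation = solve_relaxation(lam)
        best_upper = min(best_upper, upper)      # LR(lam) is an upper bound
        best_lower = max(best_lower, feasible_value(lam))
        norm_sq = sum(v * v for v in violation)
        if norm_sq == 0 or best_upper - best_lower < 1e-9:
            break                                # stopping criteria
        step = agility * (best_upper - best_lower) / norm_sq
        # Increase lam_i where the relaxed constraint is violated, decrease
        # it where there is slack; project back onto lam_i >= 0.
        lam = [max(0.0, l + step * v) for l, v in zip(lam, violation)]
    return best_lower, best_upper
```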