A New Analysis of the LebMeasure Algorithm for Calculating Hypervolume Lyndon While Walking Fish Group School of Computer Science & Software Engineering The University of Western Australia
Overview • Metrics for MOEAs • Hypervolume • LebMeasure and its behaviour • Empirical data on the performance of LebMeasure • A lower bound on the complexity of LebMeasure • The general case • Conclusions and future work
Metrics for MOEAs • A MOEA produces a front of mutually non-dominating solutions to a given problem • m points in n objectives • To compare the performance of MOEAs, we need metrics to compare fronts • Many metrics have been proposed, of several types • cardinality-based metrics • convergence-based metrics • spread-based metrics • volume-based metrics
Hypervolume (S-metric, Lebesgue measure) • The hypervolume of a front is the size of the portion of objective space collectively dominated by the points on the front • Hypervolume captures in one scalar both the convergence and the spread of the front • Hypervolume has nicer mathematical properties than many other metrics • Hypervolume can be sensitive to scaling of objectives and to extremal values • Hypervolume is expensive to calculate • enter LebMeasure
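To make the definition concrete, here is a minimal sketch of an exact hypervolume calculation for the easy two-objective case (maximisation, with an explicit reference point). The function name and the reference-point convention are illustrative assumptions, not taken from the slides.

```python
def hypervolume_2d(front, ref):
    """Hypervolume of a mutually non-dominating 2-objective front (maximisation).

    Sweep the points in decreasing order of objective 1 and add one rectangle
    per point: its width is the distance from the reference point in objective 1,
    its height is the improvement in objective 2 over the previous point.
    """
    volume = 0.0
    prev_y = ref[1]
    for x, y in sorted(front, reverse=True):   # decreasing x, hence increasing y
        volume += (x - ref[0]) * (y - prev_y)
        prev_y = y
    return volume

# Example: two points whose dominated regions cover 14 square units above (0,0).
print(hypervolume_2d([(3.0, 4.0), (2.0, 5.0)], ref=(0.0, 0.0)))   # 14.0
```

This one-pass sweep is special to two objectives; the general case is what LebMeasure targets.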
LebMeasure (LM) • Given a mutually non-dominating front S, LM • calculates the hypervolume dominated exclusively by the first point p, then • discards p and processes the rest of S • If the hypervolume dominated exclusively by p is not a single hyper-cuboid, LM • lops off a hyper-cuboid that is dominated exclusively by p, and • replaces p with up to n “spawns” that collectively dominate the remainder of p’s exclusive hypervolume • A spawn is discarded immediately if it dominates no exclusive hypervolume, either because • it has a “zero” objective, or • it is dominated by an unprocessed point
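A minimal Python sketch of this scheme, assuming maximisation, a reference point at the origin, and a front whose points are strictly better than the reference in every objective. The names (leb_measure, covers) and the list-based processing order are my own reading of the description above, not the paper's code.

```python
from math import prod

def covers(a, b):
    """True if point a is at least as good as point b in every objective."""
    return all(x >= y for x, y in zip(a, b))

def leb_measure(front, ref=None):
    """Hypervolume of a mutually non-dominating front (maximisation), LebMeasure-style."""
    n = len(front[0])
    ref = ref or (0.0,) * n
    work = [tuple(p) for p in front]   # unprocessed points; the head is "the first point"
    total = 0.0
    while work:
        p = work.pop(0)
        # Corner of the hyper-cuboid dominated exclusively by p: in each objective,
        # the nearest lower value among the remaining points, else the reference.
        z = tuple(max([q[i] for q in work if q[i] < p[i]] + [ref[i]])
                  for i in range(n))
        total += prod(p[i] - z[i] for i in range(n))    # lop off the hyper-cuboid
        # Replace p by up to n spawns that cover the rest of its exclusive volume.
        for i in range(n):
            if z[i] == ref[i]:                          # spawn has a "zero" objective
                continue
            spawn = p[:i] + (z[i],) + p[i + 1:]
            if any(covers(q, spawn) for q in work):     # dominated by an unprocessed point
                continue
            work.insert(0, spawn)                       # process p's spawns before the rest
    return total

# Tiny check against the two-objective sweep above (reference at the origin).
print(leb_measure([(3.0, 4.0), (2.0, 5.0)]))            # 14.0
```

Putting the spawns at the head of the list matters: each spawn's exclusive region is then measured against its still-unprocessed siblings as well as the remaining points, which is what stops the overlapping spawn regions from being double-counted.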
LebMeasure in action • A dominates exclusively the yellow shape • A lops off the pink hyper-cuboid • A has three potential spawns: A1 = (4,9,4), A2 = (6,7,4), A3 = (6,9,3) • But A2 is dominated by B, so it is discarded immediately
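Reading the spawn coordinates back gives a worked check of the figure's numbers (my arithmetic, not a value quoted from the slide): A itself must be (6,9,4), since A1 lowers objective 1 to 4, A2 lowers objective 2 to 7, and A3 lowers objective 3 to 3. The pink hyper-cuboid therefore runs from (4,7,3) to (6,9,4) and has volume (6−4) × (9−7) × (4−3) = 2 × 2 × 1 = 4.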
A boost for LebMeasure • Some “spawns of spawns” are guaranteed to be dominated, so LM doesn’t need to generate them at all • This limits the maximum depth of the stack to m + n – 1 • [Figure: stack contents while processing points A, B, C, D and spawns A1, A3, A11, A12, A13; a brace marks the spawns guaranteed to be dominated]
But… • This boost greatly reduces the space complexity of LM • the maximum depth of the stack is linear in both m and n • But it does far less for the time complexity of LM • note that the time complexity depends not only on the number of stack slots used, but also on how many times each slot is used • We shall measure the time complexity of LM in terms of the number of points (and spawns, and spawns of spawns, etc) that actually contribute to the hypervolume • i.e. the number of hyper-cuboids that must be summed
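Because the analysis counts hyper-cuboids rather than seconds, the LebMeasure sketch given earlier is easy to instrument for exactly this measure: count one unit of work per lopped-off cuboid. The function below is that sketch with the volume sum replaced by a counter; the same assumptions (maximisation, origin reference point, illustrative names) apply.

```python
def leb_measure_cuboid_count(front, ref=None):
    """Number of hyper-cuboids summed by the LebMeasure sketch for this front."""
    def covers(a, b):
        return all(x >= y for x, y in zip(a, b))
    n = len(front[0])
    ref = ref or (0.0,) * n
    work = [tuple(p) for p in front]
    cuboids = 0
    while work:
        p = work.pop(0)
        z = tuple(max([q[i] for q in work if q[i] < p[i]] + [ref[i]])
                  for i in range(n))
        cuboids += 1                                    # one hyper-cuboid summed
        for i in range(n):
            if z[i] == ref[i]:
                continue
            spawn = p[:i] + (z[i],) + p[i + 1:]
            if not any(covers(q, spawn) for q in work):
                work.insert(0, spawn)
    return cuboids
```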
Running LebMeasure: m points in n objectives [empirical data not reproduced]
Running LebMeasure (in reverse order): m points in n objectives [empirical data not reproduced]
Running LebMeasure (in optimal order): m points in n objectives [empirical data not reproduced]
Running LebMeasure (first point only): m points in n objectives [empirical data not reproduced]
A lower bound on the complexity of LebMeasure • We can determine a lower bound on the worst-case complexity of LM by considering a single example • We will derive a recurrence for the number of hyper-cuboids summed for this example, then prove that its solution is 2^n − 1
The simple picture • 12222 | 11222 12122 12212 12221 | 11122 11212 12112 11221 12121 12211 | 11112 11121 11211 12111
The recursive picture • the same fifteen labels: 12222 | 11222 12122 12212 12221 | 11122 11212 12112 11221 12121 12211 | 11112 11121 11211 12111
A recurrence • h(n,k) gives the number of hyper-cuboids summed for a point (or spawn) with n 2s, of which we can reduce k and still generate points that aren’t dominated by their relatives • hcs(n) gives the total number of hyper-cuboids summed for the example, with n objectives
The recurrence in action • (1,2,2,2,2) [h(4,4)] spawns (1,1,2,2,2) [h(3,0)], (1,2,1,2,2) [h(3,1)], (1,2,2,1,2) [h(3,2)], and (1,2,2,2,1) [h(3,3)]
The recurrence solved • Simple expansion shows that the recurrence solves to the 2^n − 1 claimed earlier • The paper gives a formal proof using mathematical induction
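A quick way to sanity-check the closed form is to evaluate the recurrence directly. The recursive step follows the “recurrence in action” slide (one cuboid for the point itself plus one term per surviving spawn); the base cases below, h(n,0) = 1 and h(0,·) = 0, are my reading of the pictures, not formulas quoted from the paper.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def h(n, k):
    """Hyper-cuboids summed for a point with n 2s, k of them still reducible (assumed)."""
    if n == 0:
        return 0                  # the fully reduced point contributes nothing
    if k == 0:
        return 1                  # no surviving spawns: only the point's own cuboid
    return 1 + sum(h(n - 1, i) for i in range(k))   # h(4,4) -> h(3,0)..h(3,3), etc.

print([h(n, n) for n in range(1, 6)])               # [1, 3, 7, 15, 31]
assert all(h(n, n) == 2 ** n - 1 for n in range(1, 16))
```

With these base cases, h(4,4) comes out at 15, matching the fifteen labels in the pictures above.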
The general case • It is difficult to be certain what patterns of points will perform worst for LM • We will describe the behaviour of an illegal “beyond worst case” pattern • Illegal because some points dominate others
m points in 2 objectives • [Figure: tree rooted at u1 v1] • xi denotes the i-th best value in objective x • Each vertical list has length m • Total size m^2
m points in 3 objectives • [Figure: tree rooted at u1 v1 w1] • Each vertical list has length m • Each 2-way sub-tree has size m^2 • Total size m^3
m points in 4 objectives • [Figure: tree rooted at u1 v1 w1 x1] • A circled k denotes a k-way sub-tree • Each k-way sub-tree has size m^k • Total size m^4
A recurrence and its solution • Again, we can capture this behaviour as a recurrence • By simple expansion (and proved formally in the paper), the total for n objectives is m^n
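Spelling out the pattern on the previous slides (this expansion is mine, not quoted from the paper): if size(k) denotes the number of nodes in a k-way sub-tree, then size(1) = m and size(k) = m · size(k−1), since a k-way sub-tree is a vertical list of m nodes each heading a (k−1)-way sub-tree. Telescoping gives size(n) = m^n, which matches the m^2, m^3, and m^4 totals shown for two, three, and four objectives.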
Conclusions • LM is exponential in the number of objectives, in the worst case • Re-ordering the points often makes LM go faster, but the worst case is still exponential • the proof technique used for the “simple” case will also work for the “unreorderable” case
Future work • Try to make LM faster • re-order the points • re-order the objectives • Develop and refine other algorithms (e.g. HSO) • possibly develop a hybrid algorithm • Prove that no polynomial-time algorithm exists for calculating hypervolume