
On the limits of partial compaction


Presentation Transcript


  1. On the limits of partial compaction. Nachshon Cohen and Erez Petrank, Technion

  2. Fragmentation • When a program allocates and de-allocates, holes appear in the heap. • These holes are called “fragmentation”. As a result: • Large objects cannot be allocated (even after GC), • The heap gets larger and locality deteriorates, • Garbage collection work becomes harder, • Allocation gets complicated. • The amount of fragmentation is hard to define or measure, because it depends on future allocations.

  3. How Bad can Fragmentation Be? • Consider a game: • The program tries to consume as much space as possible. • The memory manager tries to satisfy the program’s demands within a given space. • The program never uses more than M words simultaneously. • How much space will the memory manager need to satisfy the requests in the worst case? • [Robson 1971, 1974]: There exists a program that makes any allocator use ½ M log(n) space, where n is the size of the largest object. • [Robson 1971, 1974]: There is an allocator that can always make do with M log(n).
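A quick back-of-the-envelope illustration (mine, not from the talk): the sketch below evaluates both Robson bounds for sample parameters, assuming sizes are measured in words and logarithms are base 2.

```python
import math

def robson_lower_bound(M, n):
    # Some program forces any non-moving allocator to use ~0.5 * M * log2(n) words.
    return 0.5 * M * math.log2(n)

def robson_upper_bound(M, n):
    # Some allocator always makes do with ~M * log2(n) words.
    return M * math.log2(n)

# Example (hypothetical numbers): 1M live words, largest object of 1024 words.
M, n = 1_000_000, 1_024
print(robson_lower_bound(M, n))  # 5,000,000 words: 5x the live space
print(robson_upper_bound(M, n))  # 10,000,000 words
```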

  4. Compaction Kills Fragmentation • A memory manager that applies compaction after each deletion never needs more than M words. • But compaction is costly! • A common solution: partial compaction. • “Once in a while” compact “some of the objects”. • Our focus: the effectiveness of partial compaction.

  5. Setting a Limit on Compaction • How do we measure the amount of (partial) compaction? • Compaction ratio 1/c: after the program allocates B words, the memory manager is allowed to move B/c words. • Now we can ask: how bad can fragmentation be when partial compaction is limited by a budget of 1/c?
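To make the definition concrete, here is a minimal sketch (my own, not from the paper) of a quota tracker that enforces a compaction ratio of 1/c: every allocated word earns 1/c words of permitted moving.

```python
class CompactionBudget:
    """Tracks the moving quota implied by a compaction ratio of 1/c."""

    def __init__(self, c):
        self.c = c
        self.quota = 0.0

    def on_allocate(self, words):
        # Allocating B words entitles the memory manager to move B/c more words.
        self.quota += words / self.c

    def on_move(self, words):
        assert words <= self.quota, "would exceed the compaction budget"
        self.quota -= words

budget = CompactionBudget(c=100)
budget.on_allocate(800)   # after allocating 800 words...
budget.on_move(8)         # ...up to 8 words may be moved
print(budget.quota)       # 0.0
```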

  6. Bendersky-Petrank [POPL’11] • Lower bound: there exists a program such that, for all allocators, the required heap space is at least: (formula omitted from the transcript). • [BP11] built the machinery, proposed the definitions, and did the math. • But: the results are asymptotic and not applicable to realistic parameters. • Question: what do we need to strengthen the results and make them relevant to practice? Answer: more math!

  7. Theorem 1 • There exists a program such that, for all allocators, the required heap space is at least: (formula omitted from the transcript). This holds for any integral γ. • Recall: 1/c is the compaction ratio, M is the overall space alive at any point in time, and n is the size of the largest object.

  8. Theorem 1 Digest (chart: the lower bound as a function of the compaction budget c, for M = 256MB, n = 1MB.) E.g., for c = 100 (which we consider realistic), at least 3.5M (about 896MB) is required.

  9. Proving the Lower Bound • Provide a program that behaves “terribly” • Show that it consumes a large space overhead against any allocator. • Let’s start with Robson: the allocator cannot move objects at all. • The bad program is provably bad for any allocator. • (Even if the allocator is designed specifically to handle this program only…)

  10. Robson’s “Bad” Program (Simplified) • Allocate objects in phases. • for (i = 0; i <= log(n); ++i) • Request allocations of objects of size 2^i (as many as possible). • Delete as many objects as possible, so that an object of size 2^(i+1) cannot be placed in the freed spaces. • (A runnable sketch of this construction appears after the first-fit walkthrough below.)

  11. Bad Program Against First Fit • Assume (max live space) M = 48. • Start by allocating 48 1-word objects. The heap: (diagram)

  12. Bad Program Against First Fit • Phase 0: Start by allocating 48 1-word objects. The heap: (diagram)

  13. Bad Program Against First Fit • Phase 0: Start by allocating 48 1-word objects. • Next, delete so that 2-word objects cannot be placed. The heap: (diagram)

  14. Bad Program Against First Fit • Phase 0: Start by allocating 48 1-word objects. • Next, delete so that 2-word objects cannot be placed. • Phase 1: allocate 12 2-word objects. The heap: (diagram)

  15. Bad Program Against First Fit • Phase 0: Start by allocating 48 1-word objects. • Next, delete so that 2-word objects cannot be placed. • Phase 1: allocate 12 2-word objects. • Next, delete so that 4-word objects cannot be placed. The heap: (diagram)

  16. Bad Program Against First Fit • Phase 1: allocate 12 2-word objects. • Next, delete so that 4-word objects cannot be placed. • Phase 2: allocate 6 4-word objects. The heap: (diagram)

  17. First Fit Example -- Observations • In each phase (after the first), we allocate ½M words, and reuse of space is not possible. • We have log(n) phases. • Thus, ½ M log(n) space must be used. • To be accurate: • The lower bound is ½ M log(n) + M - n + 1. • The actual bad program is more complex. • The proof for a general allocator is more complex.
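The walkthrough above is easy to check mechanically. The following is a minimal simulation sketch (my own simplification, not the paper's exact construction): a plain first-fit allocator and a greedy version of the bad program. With M = 48 and n = 4 the heap grows by roughly ½M words per phase, matching the observation above.

```python
import math

def run_bad_program(M=48, n=4):
    """Simplified Robson-style bad program against a first-fit allocator.
    The heap is a word array; used[w] is True iff word w holds a live object."""
    used = []
    objects = {}   # start address -> size, for every live object

    def allocate(size):
        run = 0
        for w in range(len(used)):              # first fit: lowest adequate gap
            run = run + 1 if not used[w] else 0
            if run == size:
                start = w - size + 1
                break
        else:
            start = len(used)                   # no gap fits: grow the heap
            used.extend([False] * size)
        for w in range(start, start + size):
            used[w] = True
        objects[start] = size

    def delete(start):
        for w in range(start, start + objects.pop(start)):
            used[w] = False

    def gap_if_deleted(start):
        # Length of the contiguous free run that deleting this object would create.
        lo, hi = start, start + objects[start]
        while lo > 0 and not used[lo - 1]:
            lo -= 1
        while hi < len(used) and not used[hi]:
            hi += 1
        return hi - lo

    for i in range(int(math.log2(n)) + 1):
        size = 2 ** i
        # Phase i: allocate size-2^i objects while live space stays within M...
        while sum(objects.values()) + size <= M:
            allocate(size)
        # ...then greedily delete objects as long as no gap can hold 2^(i+1) words.
        for start in sorted(objects):
            if gap_if_deleted(start) < 2 * size:
                delete(start)
        print(f"phase {i}: heap = {len(used)} words, live = {sum(objects.values())}")
    return len(used)

run_bad_program()   # heap reaches M + (M/2) * log2(n) = 96 words for M = 48, n = 4
```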

  18. Is This Program Bad Also When Partial Compaction is Allowed? • Observation: • Small objects are surrounded by large gaps. • The memory manager could move a few of them and make room for future allocations. • Idea: a bad program in the presence of partial compaction monitors the density of objects in all areas. The heap: (diagram)

  19. The Adversarial Program • Simplification: assume aligned allocation. • for (i = 0; i <= log(n); ++i) • Request allocations of objects of size 2^i, with an overall space of X words. • Delete all compacted objects. • Partition the memory into consecutive aligned areas of size 2^(i+1). • Delete as many objects as possible, so that the density of each area stays above the threshold.
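A minimal sketch (mine, with simplifications) of the program's deletion step only, assuming aligned allocation so that every live object lies inside a single aligned area; the memory manager's allocation and compaction decisions are not modeled here.

```python
def deletion_step(objects, i, threshold):
    """Greedy deletion for phase i: remove as many objects as possible while
    every 2^(i+1)-aligned area that still holds objects keeps a density of at
    least `threshold`. `objects` maps each start address to the object's size."""
    area = 2 ** (i + 1)
    by_area = {}
    for addr, size in objects.items():
        by_area.setdefault(addr // area, []).append((addr, size))
    for objs in by_area.values():
        live = sum(size for _, size in objs)
        for addr, size in sorted(objs):
            if live - size >= threshold * area:   # deletion keeps the area dense enough
                del objects[addr]
                live -= size
    return objects

# Example: 48 one-word objects at addresses 0..47, phase i=0, threshold 1/2:
# each 2-word area keeps one object, so 24 objects survive.
print(len(deletion_step({a: 1 for a in range(48)}, i=0, threshold=0.5)))   # 24
```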

  20. An Execution Example (compaction fraction 1/c = 1/8, threshold = 1/2) • i=0, allocate M=48 objects of size 1. Compaction quota: 48/8 = 6. The heap: (diagram)

  21. An Execution Example (compaction fraction 1/c = 1/8, threshold = 1/2) • i=0, allocate M=48 objects of size 1. • i=1, the memory manager does not compact. • i=1, deletion step. Compaction quota: 6. The heap: (diagram)

  22. An Execution Example (compaction fraction 1/c = 1/8, threshold = 1/2) • i=1, allocate 11 objects of size 2. Compaction quota: 6 + 22/8 = 8.75. The heap: (diagram)

  23. An Execution Example (compaction fraction 1/c = 1/8, threshold = 1/2) • i=1, allocate 11 objects of size 2. • The memory manager compacts. Compaction quota: 8.75 - 8 = 0.75. The heap: (diagram)

  24. An Execution Example (compaction fraction 1/c = 1/8, threshold = 1/2) • i=1, allocate 11 objects of size 2. • The memory manager compacts, • and the program deletes the compacted objects. • i=1, deletion. Density threshold = 1/2. Compaction quota: 0.75. The heap: (diagram)

  25. An Execution Example (compaction fraction 1/c = 1/8, threshold = 1/2) • i=2, allocate 5 objects of size 4. Compaction quota: 0.75. The heap: (diagram)

  26. Bad Program Intuition • The goal is to: • minimize reuse of space (obtained by maintaining high density), • (delete and) allocate a lot (achieved by maintaining low density). • A careful choice of the density threshold is crucial to obtain the bound.

  27. Proof Skeleton • Lemma 1: an allocation budget of X is available in each phase, so space allocated = X log(n). • Fact: space used ≥ space allocated - space reused. • Lemma 2: space reused < compaction / density. • By definition, compaction < (space allocated) / c = X log(n) / c, so space reused < (X log(n) / c) / density. • Conclusion: space used ≥ space allocated - space reused ≥ X log(n) (1 - 1 / (density · c)). • And the lower bound follows. The full proof works with non-aligned objects...
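The same chain of inequalities, written out (X is the per-phase allocation budget, following the slide):

```latex
\begin{align*}
\text{compaction} &< \frac{\text{space allocated}}{c} = \frac{X \log n}{c},\\
\text{space reused} &< \frac{\text{compaction}}{\text{density}}
                     < \frac{X \log n}{c \cdot \text{density}},\\
\text{space used} &\ge \text{space allocated} - \text{space reused}
                   \ge X \log n \left(1 - \frac{1}{c \cdot \text{density}}\right).
\end{align*}
```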

  28. Obtaining the Improved Bounds • Use a modified Robson construction in the first log(1/density) steps • and analyze it properly. • Use a potential function to combine the analysis of the two algorithms. • Allocate a fixed amount of memory per step • this allows maintaining the proper density. • Handle unaligned objects better • use a virtual association of objects to intervals. • Stronger (and dirtier) mathematical analysis.

  29. Related Work • Theoretical work: • Robson [1971, 1974] • Luby-Naor-Orda [1994, 1996] • Bendersky-Petrank [2011] • Various memory managers employ partial compaction. For example: • Ben-Yitzhak et al. [2003] • Metronome by Bacon et al. [2003] • The Pauseless collector by Click et al. [2005] • Concurrent real-time garbage collectors by Pizlo et al. [2007, 2008]

  30. Conclusion • Partial compaction is useful for ameliorating the pauses imposed by full compaction. • In this work we studied the limits of partial compaction. • We improved the previously known bounds substantially. • The lower bound is now relevant to realistic systems and parameters.
