
Intelligent Compilation

Explore the optimization problem using intelligent compilation and machine learning. Learn about method-specific compilation details, code properties, and the application of logistic regression to selecting beneficial optimizations. Experimental results show significant improvements in total and running times for a Java JIT compiler.


Presentation Transcript


  1. Intelligent Compilation. John Cavazos, Computer & Information Sciences Department, University of Delaware

  2.–6. The Optimization Problem

  7. The Optimization Problem: Intelligent Compilation

  8. Architectural Characteristics • Can be used as machine learning features • Example:
      Mnemonic   Description               Avg Value
      FPU_IDL    Floating Unit Idle        0.473
      VEC_INS    Vector Instructions       0.017
      BR_INS     Branch Instructions       0.047
      L1_ICH     L1 Icache Hits            0.0006
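
As a rough illustration of how such counters could feed a learned model, here is a small Java sketch (hypothetical code, not a PAPI or Jikes RVM interface) that packages per-program counter rates like the ones above into a feature vector:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical example: package hardware-counter rates (e.g. FPU_IDL, VEC_INS,
// BR_INS, L1_ICH) as a machine-learning feature vector.
public class CounterFeatures {
    public static double[] toFeatureVector(Map<String, Double> counterRates) {
        double[] features = new double[counterRates.size()];
        int i = 0;
        for (double rate : counterRates.values()) {
            features[i++] = rate;   // counters are assumed to be pre-normalized rates
        }
        return features;
    }

    public static void main(String[] args) {
        Map<String, Double> rates = new LinkedHashMap<>();   // insertion-ordered
        rates.put("FPU_IDL", 0.473);    // floating-point unit idle
        rates.put("VEC_INS", 0.017);    // vector instructions
        rates.put("BR_INS",  0.047);    // branch instructions
        rates.put("L1_ICH",  0.0006);   // L1 I-cache hits
        System.out.println(java.util.Arrays.toString(toFeatureVector(rates)));
    }
}
```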

  9. Domain Knowledge • Generate training data from different domains • Berkeley “motifs” can serve as a starting point

  10. Application of Machine Learning

  11. Case Study: Method-Specific Compilation • Use static code properties to characterize Java methods and control which optimizations are applied

  12. Method-Specific Compilation Details • Intelligent Java JIT compiler • Used simple code properties • Extracted in a linear pass over the code • 26 features used to describe each method • Model controlled up to 20 optimizations • Outperformed hand-tuned heuristics

  13. Code Properties Used (inputs)
      Method Feature          Meaning
      Size                    Number of bytecodes
      Locals Space            Words allocated for locals
      Characteristics         Is synchronized, has exceptions, is leaf method
      Declaration             Is it declared final, static, private
      Fraction of Bytecodes   Array loads and stores; primitive and long computations; compares, branches, jsrs, switches; put, get, invoke, new; arraylength, athrow, checkcast, monitor
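
A minimal Java sketch of the kind of single-pass extractor these properties imply; it assumes the method's bytecodes are already decoded into an opcode stream, and the opcodes and handful of features shown are illustrative rather than the exact 26-feature set:

```java
// Illustrative single-pass feature extractor over a method's decoded opcodes.
public class MethodFeatureExtractor {
    // A few JVM opcode values, for illustration only.
    private static final int IFEQ = 0x99, IFNE = 0x9A, GOTO = 0xA7;  // branches
    private static final int IALOAD = 0x2E, IASTORE = 0x4F;          // array ops
    private static final int NEW = 0xBB;                             // allocation
    private static final int INVOKEVIRTUAL = 0xB6;                   // call

    public static double[] extract(int[] opcodes, int localsWords,
                                   boolean isSynchronized, boolean isLeaf) {
        int branches = 0, arrayOps = 0, allocs = 0, calls = 0;
        for (int op : opcodes) {                 // one linear pass over the code
            switch (op) {
                case IFEQ, IFNE, GOTO -> branches++;
                case IALOAD, IASTORE  -> arrayOps++;
                case NEW              -> allocs++;
                case INVOKEVIRTUAL    -> calls++;
                default               -> { }
            }
        }
        double n = Math.max(1, opcodes.length);
        return new double[] {
            opcodes.length,          // size: number of bytecodes
            localsWords,             // words allocated for locals
            isSynchronized ? 1 : 0,  // characteristics
            isLeaf ? 1 : 0,
            branches / n,            // fractions of bytecodes, by category
            arrayOps / n,
            allocs / n,
            calls / n
        };
    }
}
```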

  14. Optimizations (outputs)
      Optimization Level   Optimizations Controlled
      Opt Level O0         Branch Opts Low, Constant Prop, Local CSE, Reorder Code
      Opt Level O1         Copy Prop, Tail Recursion, Static Splitting, Branch Opts Med, Simple Opts Low
      Opt Level O2         While into Untils, Loop Unroll, Branch Opts High, Redundant BR, Simple Opts Med, Load Elim, Expression Fold, Coalesce, Global Copy Prop, Global CSE, SSA
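
As a rough illustration of the output encoding (the names follow the table above, but the enum itself is hypothetical, not a Jikes RVM type), the 20 controlled optimizations can be held in a fixed order so that one boolean flag per optimization forms the model's output vector:

```java
// Illustrative fixed ordering of the 20 controllable optimizations, grouped by
// the opt level that introduces them.
public enum Optimization {
    // Opt Level O0
    BRANCH_OPTS_LOW, CONSTANT_PROP, LOCAL_CSE, REORDER_CODE,
    // Opt Level O1
    COPY_PROP, TAIL_RECURSION, STATIC_SPLITTING, BRANCH_OPTS_MED, SIMPLE_OPTS_LOW,
    // Opt Level O2
    WHILE_INTO_UNTILS, LOOP_UNROLL, BRANCH_OPTS_HIGH, REDUNDANT_BR, SIMPLE_OPTS_MED,
    LOAD_ELIM, EXPRESSION_FOLD, COALESCE, GLOBAL_COPY_PROP, GLOBAL_CSE, SSA;

    /** One boolean flag per optimization, in declaration order; all on by default. */
    public static boolean[] allEnabled() {
        boolean[] flags = new boolean[values().length];   // length is 20
        java.util.Arrays.fill(flags, true);
        return flags;
    }
}
```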

  15. Generate Training Data • For each method • Evaluate many optimization settings • Use fine-grained timers • Record running time • Record total time • Level O2 (20 optimizations) • Evaluate 1000 random settings
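
A sketch of that training-data loop, with hypothetical jitCompile and run helpers standing in for the real JIT entry points:

```java
import java.util.Random;

// Sketch of training-data generation: for each method, try many random
// optimization settings and record, with fine-grained timers, both total time
// (compilation + running) and running time alone.
public class TrainingDataGenerator {
    static final int NUM_OPTS = 20;        // optimizations controlled at level O2
    static final int NUM_SETTINGS = 1000;  // random settings evaluated per method

    public static void main(String[] args) {
        Random rng = new Random(42);
        for (int i = 0; i < NUM_SETTINGS; i++) {
            boolean[] setting = new boolean[NUM_OPTS];
            for (int j = 0; j < NUM_OPTS; j++) setting[j] = rng.nextBoolean();

            long t0 = System.nanoTime();
            Object compiled = jitCompile("someMethod", setting);  // compilation time
            long t1 = System.nanoTime();
            run(compiled);                                        // running time
            long t2 = System.nanoTime();

            long runningTime = t2 - t1;
            long totalTime   = t2 - t0;   // compilation + running time
            // (setting, totalTime, runningTime) becomes one training sample
        }
    }

    static Object jitCompile(String method, boolean[] setting) { return method; } // placeholder
    static void run(Object compiled) { /* placeholder for executing the method */ }
}
```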

  16. Logistic Regression • Variant of ordinary regression • Inputs • Continuous, discrete, or a mix • Outputs • Restricted to between 0 and 1 • Probability that an optimization is beneficial
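
A minimal sketch of such a model, assuming trained weights are available: one logistic unit per controllable optimization maps the method's feature vector to a probability that the optimization pays off, which can then be thresholded into a flag:

```java
// Illustrative logistic-regression model: one weight vector per optimization.
public class LogisticOptModel {
    private final double[][] weights;  // [optimization][feature], from offline training
    private final double[] bias;       // [optimization]

    public LogisticOptModel(double[][] weights, double[] bias) {
        this.weights = weights;
        this.bias = bias;
    }

    /** Probability, between 0 and 1, that optimization `opt` helps this method. */
    public double probability(int opt, double[] features) {
        double z = bias[opt];
        for (int i = 0; i < features.length; i++) {
            z += weights[opt][i] * features[i];
        }
        return 1.0 / (1.0 + Math.exp(-z));   // logistic (sigmoid) function
    }

    /** Threshold the probabilities into on/off optimization flags. */
    public boolean[] predictFlags(double[] features) {
        boolean[] flags = new boolean[bias.length];
        for (int opt = 0; opt < flags.length; opt++) {
            flags[opt] = probability(opt, features) > 0.5;
        }
        return flags;
    }
}
```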

  17.–18. Jikes RVM (diagram): a feature extractor runs over the method bytecodes, and a logistic regression (LR) model acts as the tuned heuristic in place of the compiler's hand-written heuristic, driving the optimizer that produces the optimized method

  19.–20. Jikes RVM (diagram, continued): the feature extractor emits a feature vector for the method, e.g. {108, 25, 0, 0, 0, 0, 1, 0, 0.2, 0.0, 0.0, 0.0, 0.0, 0.0, 0.12, 0.0, 0.08, 0.0, 0.0, 0.0, 0.2, 0.32, 0.08, 0.0}

  21. Jikes RVM (diagram, continued): the LR model maps the feature vector to per-optimization flags, e.g. {1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 0}
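
A hedged sketch of the glue this diagram implies, reusing the illustrative MethodFeatureExtractor and LogisticOptModel classes from the earlier sketches; optimizingCompile is a hypothetical stand-in for the compiler entry point, not a real Jikes RVM API:

```java
// Hypothetical pipeline glue: bytecodes -> features -> LR model -> opt flags
// -> optimizing compiler -> optimized method.
public class MethodSpecificJit {
    private final LogisticOptModel model;

    public MethodSpecificJit(LogisticOptModel model) {
        this.model = model;
    }

    public Object compile(int[] opcodes, int localsWords,
                          boolean isSynchronized, boolean isLeaf) {
        double[] features = MethodFeatureExtractor.extract(
                opcodes, localsWords, isSynchronized, isLeaf);
        boolean[] optFlags = model.predictFlags(features);  // replaces the hand-tuned heuristic
        return optimizingCompile(opcodes, optFlags);
    }

    private Object optimizingCompile(int[] opcodes, boolean[] optFlags) {
        return opcodes;  // placeholder for invoking the optimizer with the chosen flags
    }
}
```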

  22. Experimental Results • Jikes RVM • Highly tuned Java JIT compiler • Benchmarks • SPEC JVM 98 • DaCapo benchmarks + 2 others • Total Time (compilation + running time) • Running Time (no compilation time)

  23. SPEC JVM (Highest Opt Level): Total time improvements of 30%!

  24. DaCapo (Highest Opt Level): Running time improvements over 50%!

  25. % of O3 Applied to Hot Methods

  26. % of O3 Applied to Hot Methods
      OPT      compress  jess  raytrace  db   javac  mpegaudio  jack
      BrchOpt  100       78    88        100  96     98         91
      Reorder  100       27    70        83   7      92         97
      UNROLL   50        0     5         0    0      0          14

  27. Single-Core Opts Still Important! • Machine learning applied to auto-tuning • Optimization Phase Ordering • Speculative optimizations

  28. Optimizations for Multi-Cores • Auto-code partitioning and mapping • Communication/computation overlap • Task and Data placement/migration

  29. Conclusion • Using machine learning is successful • Outperforms hand-tuning • Simple code features can drive optimizations • ML determines which features are important • Optimizations are applied only when beneficial
