
Oracle Database Performance Secrets Finally Revealed


Presentation Transcript


  1. Oracle Database Performance Secrets Finally Revealed Greg Rahn & Michael Hallas Oracle Real-World Performance Group Server Technologies

  2. About The Real-World Performance Group

  3. The secret to a chef’s masterpiece is all in the recipe

  4. Agenda • Troubleshooting approach • Intro to the optimizer • Demonstration • Summary of secrets and lessons learned

  5. What is your performance troubleshooting approach?

  6. Do you... • Use Google as your first resource? • Use the Oracle Documentation as your second resource? • Believe in tuning databases by changing block size? • Frequently use database parameters to tune queries? • Believe in “Silver Bullet Tuning”? • Blindly apply previously successful solutions? • Practice the “Change and Test” troubleshooting approach?

  7. The Change and Test Troubleshooting Approach

  8. What really matters for troubleshooting performance?

  9. Quite simply... it's all in the approach.

  10. Stack Visualization

  11. The Systematic Troubleshooting Approach • Define the problem, the scope and identify symptoms • Collect and analyze data/metrics • Ask “Why?” five times to identify root cause • Understand and devise change • Make a single change • Observe and measure results • If necessary, back out change; repeat process

  12. Systematic Troubleshooting Guidelines • Don’t just look inside the database, look outside as well • Make exactly one change at a time • Scope of solution matches scope of problem • Choose the right tool for the job • Carefully document change and impact • Suppressing the problem is not the same as root cause • Realize that you may not be able to get to the root cause in just one step

  13. Fix on Failure vs. Fix it Forever: The benefits of root cause analysis • Fix on Failure • Finger pointing and the blame game • Stressful for everyone • Never time to fix it right the first time, but always plenty of time to keep fixing it time and time again • Fix it Forever • Identify root causes of problems so that permanent solutions can be implemented • Develop a logical, systematic, and data-driven approach to problem solving

  14. Example of Applying the “5 Whys”

  15. Applying the “5 Whys” to “My batch job ran long last night” • Why? - A specific query took 5x as long • Why? - Execution plan changed from HJ to NLJ • Why? - Query optimizer costed the NLJ to be cheaper • Why? - Variables involved in the costing have changed • Why? - Statistics were gathered with the wrong options
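
A minimal sketch of the permanent fix for that root cause: re-gather statistics with deliberate options. The schema, table, and option values here are illustrative assumptions, not from the presentation.

    -- Re-gather statistics with explicit, deliberate options (names are illustrative)
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname          => 'SCOTT',
        tabname          => 'ORDERS',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,  -- let Oracle choose the sample size
        method_opt       => 'FOR ALL COLUMNS SIZE AUTO',  -- histograms only where column usage warrants
        cascade          => TRUE                          -- refresh index statistics too
      );
    END;
    /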

  16. Choosing Different Levels of Scope • System level • database parameters • alter system • object statistics • Session level • alter session • Statement level • hints • SQL profiles, outlines, and baselines
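
A hedged illustration of the three scope levels (the parameter values and index name are assumptions for the example, not recommendations):

    -- System level: affects every session; match this scope only to a system-wide problem
    ALTER SYSTEM SET optimizer_index_cost_adj = 50 SCOPE = BOTH;

    -- Session level: affects only the current session
    ALTER SESSION SET optimizer_mode = FIRST_ROWS_100;

    -- Statement level: affects a single query via a hint (emp_ename_ix is hypothetical)
    SELECT /*+ index(e emp_ename_ix) */ * FROM emp e WHERE ename = 'KING';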

  17. Performance Troubleshooting Toolbox • ADDM, AWR, ASH reports and raw data • SQL Monitoring Active Report (11g) • DBMS_XPLAN • SQL Trace • V$SESSTAT • V$SESSION • Stack dumps (pstack) • OS metrics tools (collectl, iostat, vmstat, mpstat, etc.)
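
For example, DBMS_XPLAN can show actual row counts next to the optimizer's estimates, often the fastest way to spot a cardinality problem. A minimal sketch:

    -- Run the statement with rowsource statistics enabled...
    SELECT /*+ gather_plan_statistics */ * FROM emp WHERE ename != 'KING';

    -- ...then compare E-Rows (estimated) with A-Rows (actual) in the plan
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR(NULL, NULL, 'ALLSTATS LAST'));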

  18. Quick Introduction To The Optimizer

  19. An Important Note About Cardinality Estimates Good cardinality estimates generally result in a good plan; however, bad cardinality estimates do not always result in a bad plan

  20. Introducing the Cost-Based Optimizer: Cost and Cardinality • Cardinality • Estimated number of rows returned from a join, a table, or an index • Factors influencing cardinality • Query predicates and query variables • Object statistics
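
A back-of-the-envelope illustration (table, column, and numbers invented for this example): with no histogram, the estimate for an equality predicate is roughly num_rows divided by the number of distinct values.

    -- emp has 100,000 rows; job has 50 distinct values and no histogram
    -- SELECT * FROM emp WHERE job = 'CLERK'
    -- estimated cardinality = num_rows * selectivity
    --                       = 100,000 * (1/50)
    --                       = 2,000 rows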

  21. Introducing the Optimizer: Cost and Cardinality • Cost • Representation of resource consumption • CPU • Disk I/O • Memory • Network I/O • Factors influencing cost • Cardinality and selectivity • Cost model • Parameters • System statistics

  22. Good SQL and Bad SQL • Good SQL • SQL that makes it possible for the optimizer to produce a good cardinality estimate • select * from emp where ename != 'KING' • Bad SQL • SQL that makes it difficult for the optimizer to produce a good cardinality estimate • select * from emp where replace(ename, 'KING') is not null

  23. Good Plans and Bad Plans • Good Plan • Efficient retrieval or modification of the desired rows • Highly selective index to retrieve few rows from a large table • Scan to retrieve many rows from a large table • Bad Plan • Inefficient retrieval or modification of the desired rows • Scan to retrieve few rows from a large table • Non-selective index to retrieve many rows from a large table

  24. What is a Query Plan? • Access Path • Table scan • Index { fast full | full | range | skip | unique } scan • Join Method • Hash • Merge • Nested loops • Join Order • Distribution Method • Broadcast • Hash
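
A quick way to see these components is to display a plan; a sketch using the classic emp/dept demo schema:

    EXPLAIN PLAN FOR
      SELECT e.ename, d.dname FROM emp e JOIN dept d ON e.deptno = d.deptno;
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- The output shows the join method (e.g. HASH JOIN), the access path for
    -- each table (e.g. TABLE ACCESS FULL), the join order (implicit in the
    -- plan tree), and, for parallel plans, the distribution method in the
    -- PQ Distrib column (e.g. BROADCAST or HASH)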

  25. Challenges in Cardinality Estimation • Complex predicates • Correlation • Non-representative bind values • Out of range predicates • Skew • Statistics Quality • Frequency • Histograms • Sample Size

  26. What Is Dynamic Sampling? • Improves quality of cardinality estimates • Objects with no statistics • Avoids the use of heuristics • Less complete than statistics stored in the dictionary • Objects with existing statistics • Predicates with complex expressions
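
Two common ways to invoke dynamic sampling, shown as a sketch (the table and predicate are illustrative; level 4 mirrors what a later slide mentions for data warehouses):

    -- Session scope
    ALTER SESSION SET optimizer_dynamic_sampling = 4;

    -- Statement scope via a hint
    SELECT /*+ dynamic_sampling(s 4) */ COUNT(*)
    FROM   sales s
    WHERE  channel_id = 3 AND promo_id = 999;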

  27. Demonstration

  28. What Have We Seen? Cardinality Drives Plan Selection (diagram: as the cardinality estimate falls, broadcast distribution is favored over hash; as it rises, hash distribution is favored over broadcast)

  29. What Have We Seen? • SQL Monitor Report • Ideal tool to use for statement troubleshooting • Can be used on active statements • Dynamic Sampling • Good way of getting better cardinality estimates • Be cautious when using DS without table stats • Parallel execution chooses the level automatically (11.2) • RWP used level 4 or 5 for data warehouses (11.1)
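
A minimal sketch of generating the active report from SQL*Plus (the sql_id is a placeholder):

    SET LONG 1000000 LONGCHUNKSIZE 1000000 PAGESIZE 0
    SELECT DBMS_SQLTUNE.REPORT_SQL_MONITOR(
             sql_id => 'an05rsj1jnnbd',   -- placeholder sql_id
             type   => 'ACTIVE') AS report
    FROM dual;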

  30. What Have We Seen? • SQL Tuning Advisor • Helps identify better plans • SQL Profile • “Patch” a plan • Generated by SQL Tuning Advisor • Identified by the user • Force matching can match literals • SQLT/SQLTXPLAIN • coe_xfr_sql_profile.sql • See MOS note 215187.1
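
A minimal sketch of driving the advisor and accepting its profile from PL/SQL (the sql_id is a placeholder; ACCEPT_SQL_PROFILE succeeds only if the advisor actually recommended a profile):

    DECLARE
      l_task VARCHAR2(64);
    BEGIN
      l_task := DBMS_SQLTUNE.CREATE_TUNING_TASK(sql_id => 'an05rsj1jnnbd');
      DBMS_SQLTUNE.EXECUTE_TUNING_TASK(task_name => l_task);
      -- force_match => TRUE lets the profile match the same statement text
      -- with different literal values
      DBMS_SQLTUNE.ACCEPT_SQL_PROFILE(task_name => l_task, force_match => TRUE);
    END;
    /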

  31. What People Think Are Performance Secrets • Undocumented parameters • Documented parameters • Undocumented events • Index rebuilds and table reorgs • Fiddling with block size • Silver Bullets

  32. What Are The Real-World Performance Secrets • Use a systematic approach, always • Let data (metrics, etc.) guide your troubleshooting • Match the scope of the problem and solution • Realize that you may not be able to get to the root cause in just one step

  33. The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle’s products remains at the sole discretion of Oracle.
