
The Smoothed Analysis of Algorithms: Simplex Methods and Beyond



Presentation Transcript


  1. The Smoothed Analysis of Algorithms: Simplex Methods and Beyond. Shang-Hua Teng, Boston University/Akamai. Joint work with Daniel Spielman (MIT).

  2. Outline: Why; What; Simplex Method; Numerical Analysis; Condition Numbers/Gaussian Elimination; Conjectures and Open Problems.

  3. Motivation for Smoothed Analysis: wonderful algorithms and heuristics work well in practice, but their performance cannot be understood through traditional analyses. Worst-case analysis: when the bound is good, it is wonderful; but for these heuristics it is often exponential, because it examines the most contrived inputs. Average-case analysis: studies a very special class of inputs; the bound may be good, but is it meaningful?

  4. Random is not typical

  5. Analyses of Algorithms: worst case: max_x T(x); average case: E_r T(r); smoothed complexity: max_x E_g T(x + g), the worst case over inputs x of the expected time on a random perturbation of x.

  6. Instance of the smoothed framework: x is a real n-vector; g is a Gaussian random vector with variance σ² per entry; measure the smoothed complexity max_x E_g T(x + g) as a function of n and σ.
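
As a minimal sketch (the cost function T here is a hypothetical toy, not from the talk), the smoothed measure can be estimated empirically: fix an input x, average the cost over Gaussian perturbations of x, and take the worst x.

```python
import numpy as np

def smoothed_cost(T, x, sigma, trials=200, rng=None):
    """Estimate E_g[T(x + g)] where g is Gaussian with variance sigma^2 per entry."""
    rng = np.random.default_rng() if rng is None else rng
    return np.mean([T(x + sigma * rng.standard_normal(x.shape)) for _ in range(trials)])

# Toy cost function (hypothetical, for illustration only): the cost blows up near the
# origin, so the worst case is terrible but the smoothed cost at that input is moderate.
T = lambda v: 1.0 / (1e-9 + np.linalg.norm(v))
x_bad = np.zeros(5)                          # a worst-case input for this toy T
print(T(x_bad))                              # worst-case cost: about 1e9
print(smoothed_cost(T, x_bad, sigma=0.1))    # smoothed cost at x_bad: far smaller
```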

  7. Complexity Landscape (figure: running time plotted over the input space)

  8. Complexity Landscape (figure: the worst case is the highest peak of the landscape)

  9. Complexity Landscape (figure: the average case shown alongside the worst case)

  10. Smoothed Complexity Landscape (figure: running time over the input space after smoothing)

  11. Smoothed Complexity Landscape (figure: the smoothed complexity curve over the input space)

  12. Smoothed Analysis of Algorithms • Interpolate between worst case and average case • Consider a neighborhood of every input instance • If the smoothed complexity is low, one has to be unlucky to hit a bad input instance

  13. Motivating Example: the Simplex Method for Linear Programming. max z^T x s.t. Ax ≤ y • Worst case: exponential • Average case: polynomial • Widely used in practice

  14. The Diet Problem

                             Carbs  Protein  Fat   Iron  Cost
      1 slice bread            30      5     1.5    10   30¢
      1 cup yogurt             10      9     2.5     0   80¢
      2 tsp peanut butter       6      8      18     6   20¢
      US RDA minimum           300     50     70   100

      Minimize  30 x1 + 80 x2 + 20 x3
      s.t.      30 x1 + 10 x2 +  6 x3 ≥ 300
                 5 x1 +  9 x2 +  8 x3 ≥ 50
               1.5 x1 + 2.5 x2 + 18 x3 ≥ 70
                10 x1         +  6 x3 ≥ 100
                x1, x2, x3 ≥ 0
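
A sketch of solving this diet LP with an off-the-shelf solver (assuming SciPy is available; the solver choice is not part of the talk):

```python
import numpy as np
from scipy.optimize import linprog   # assumes SciPy is available

# The diet LP from the slide: minimize cost subject to the RDA minimums.
cost = [30, 80, 20]                  # cents: bread slice, cup of yogurt, 2 tsp peanut butter
A = np.array([[30, 10, 6],           # carbs
              [5, 9, 8],             # protein
              [1.5, 2.5, 18],        # fat
              [10, 0, 6]])           # iron
rda = np.array([300, 50, 70, 100])

# linprog expects "<=" constraints, so negate the ">=" rows; variables are nonnegative.
res = linprog(cost, A_ub=-A, b_ub=-rda, bounds=[(0, None)] * 3, method="highs")
print(res.x)      # optimal servings of each food
print(res.fun)    # minimum cost in cents
```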

  15. The Simplex Method (figure: pivot path from the start vertex to opt)

  16. History of Linear Programming • Simplex Method (Dantzig '47) • Exponential worst case (Klee-Minty '72) • Average-case analysis (Borgwardt '77, Smale '82, Haimovich, Adler, Megiddo, Shamir, Karp, Todd) • Ellipsoid Method (Khachiyan '79) • Interior-Point Method (Karmarkar '84) • Randomized simplex methods, m^O(√d) (Kalai '92, Matousek-Sharir-Welzl '92)

  17. Smoothed Analysis of the Simplex Method [Spielman-Teng 01]. Original program: max z^T x s.t. Ax ≤ y; perturbed program: max z^T x s.t. (A + G)x ≤ y, where G is a Gaussian matrix of variance σ². Theorem: for all A, the simplex method (with the shadow-vertex pivot rule) takes expected time polynomial in m, d, and 1/σ.
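
A sketch of the perturbed program on a hypothetical small instance (SciPy's HiGHS solver stands in here for the shadow-vertex simplex method that the theorem actually analyzes):

```python
import numpy as np
from scipy.optimize import linprog   # stand-in LP solver, not the shadow-vertex simplex method

def solve_perturbed_lp(z, A, y, sigma, rng):
    """Solve max z^T x  s.t.  (A + G) x <= y, where G has i.i.d. N(0, sigma^2) entries."""
    G = sigma * rng.standard_normal(A.shape)
    # linprog minimizes, so negate the objective; a large box keeps the toy instance bounded.
    return linprog(-z, A_ub=A + G, b_ub=y, bounds=[(-10, 10)] * len(z), method="highs")

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))       # hypothetical constraint matrix
y = np.ones(6)
z = np.array([1.0, 1.0])
res = solve_perturbed_lp(z, A, y, sigma=0.1, rng=rng)
print(res.x, -res.fun)                # optimum of the perturbed program
```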

  18. Shadow Vertices

  19. Another shadow

  20. Shadow vertex pivot rule (figure: start vertex and objective z)

  21. Theorem: for every plane, the expected size of the shadow of the perturbed polytope is poly(m, d, 1/σ).

  22. Polar Linear Program: max λ s.t. λz ∈ ConvexHull(a1, a2, ..., am)
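
A sketch of the polar form written out as an explicit LP (assuming SciPy; the point set is hypothetical): maximize λ subject to λz being a convex combination of a1, ..., am.

```python
import numpy as np
from scipy.optimize import linprog

def polar_lp(z, points):
    """max lambda  s.t.  lambda * z lies in ConvexHull(rows of points).
    Variables: [lambda, mu_1, ..., mu_m] with mu >= 0, sum(mu) = 1,
    and points^T mu - lambda * z = 0."""
    m, d = points.shape
    c = np.zeros(m + 1)
    c[0] = -1.0                                   # linprog minimizes, so minimize -lambda
    A_eq = np.zeros((d + 1, m + 1))
    A_eq[:d, 0] = -z                              # -lambda * z + points^T mu = 0
    A_eq[:d, 1:] = points.T
    A_eq[d, 1:] = 1.0                             # convex-combination weights sum to 1
    b_eq = np.zeros(d + 1)
    b_eq[d] = 1.0
    bounds = [(None, None)] + [(0, None)] * m     # lambda free, mu >= 0
    return linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")

rng = np.random.default_rng(5)
a = rng.standard_normal((30, 2)) + np.array([3.0, 0.0])   # hypothetical points, hull around (3, 0)
res = polar_lp(np.array([1.0, 0.0]), a)
print(-res.fun)                                           # largest lambda with lambda*z in the hull
```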

  23. Initial simplex and optimal simplex (figure)

  24. Shadow vertex pivot rule

  25. Count facets of the shadow by discretizing into N directions and letting N → ∞.

  26. Pr[two adjacent directions land in different facets] < c/N. Counting over the N adjacent pairs of directions, we expect at most c facets.

  27. Expect cone of large angle

  28. Intuition for the smoothed analysis of the simplex method: after perturbation, "most" corners have angles bounded away from flat (figure: path from start to opt). "Most": in an appropriate measure; "angle": measured by the condition number of the defining matrix.

  29. Condition number at a corner. The corner is the solution of C x = b, where C is the basis of tight constraints. Its condition number κ(C) = ‖C‖‖C⁻¹‖ measures • the sensitivity of x to changes in C and b • the distance of C to the nearest singular matrix.

  30. Condition number at a corner (figure illustrating the corner C x = b and its condition number)
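
A small numerical sketch (hypothetical 4×4 corner system, not from the talk) of both readings of the condition number:

```python
import numpy as np

# The corner is the solution of a d-by-d system C x = b built from the tight constraints.
# kappa(C) = ||C|| * ||C^{-1}|| controls (i) how sensitive x is to changes in (C, b) and
# (ii) how close C is to a singular matrix (relative distance about 1/kappa).
rng = np.random.default_rng(1)
C = rng.standard_normal((4, 4))       # hypothetical basis matrix of a corner
b = rng.standard_normal(4)
x = np.linalg.solve(C, b)
kappa = np.linalg.cond(C)             # 2-norm condition number
print(kappa)

# A relative perturbation of C of size 1e-8 moves x by at most about kappa * 1e-8 (relative).
G = rng.standard_normal(C.shape)
dC = 1e-8 * np.linalg.norm(C, 2) * G / np.linalg.norm(G, 2)
x_pert = np.linalg.solve(C + dC, b)
print(np.linalg.norm(x_pert - x) / np.linalg.norm(x), kappa * 1e-8)
```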

  31. Connection to Numerical Analysis Measure performance of algorithms in terms of condition number of input Average-case framework of Smale: 1. Bound the running time of an algorithm solving a problem in terms of its condition number. 2. Prove it is unlikely that a random problem instance has large condition number.

  32. Connection to Numerical Analysis Measure performance of algorithms in terms of condition number of input Smoothed Suggestion: 1. Bound the running time of an algorithm solving a problem in terms of its condition number. 2’. Prove it is unlikely that a perturbed problem instance has large condition number.

  33. Condition Number. Edelman '88: tail bound on κ(A) for a standard Gaussian random matrix. Theorem [Sankar-Spielman-Teng 02]: a comparable tail bound, polynomial in n and 1/σ, for a Gaussian random matrix of variance σ² centered at any matrix.

  34. Condition Number (continued). The bound for a Gaussian random matrix of variance σ², centered anywhere, is conjectured to be sharper [Sankar-Spielman-Teng 02].
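
An empirical sketch of the tail of the condition number of a perturbed matrix; the center matrix, σ, and n are arbitrary choices for illustration:

```python
import numpy as np

# Empirical tail of the condition number of A_bar + sigma*G, where A_bar is an arbitrary
# fixed center (here a singular, badly conditioned one) and G has standard Gaussian entries.
rng = np.random.default_rng(2)
n, sigma, trials = 20, 0.1, 2000
A_bar = np.ones((n, n))
kappas = np.array([np.linalg.cond(A_bar + sigma * rng.standard_normal((n, n)))
                   for _ in range(trials)])
for x in (1e2, 1e3, 1e4, 1e5):
    print(x, np.mean(kappas >= x))    # empirical Pr[kappa >= x]
```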

  35. Gaussian Elimination • A = LU • Growth factor: the largest intermediate entry divided by the largest entry of A • With partial pivoting it can be 2^n, so Ω(n) bits of precision may be needed • For every A, the growth factor of a Gaussian perturbation of A is polynomial in n and 1/σ with high probability.
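
A sketch comparing growth factors, using max|U|/max|A| as a standard proxy: Wilkinson's worst-case example versus a Gaussian random matrix.

```python
import numpy as np
from scipy.linalg import lu   # LU factorization with partial pivoting

def growth_factor_proxy(A):
    """max|U| / max|A|: a standard proxy for the partial-pivoting growth factor."""
    P, L, U = lu(A)
    return np.abs(U).max() / np.abs(A).max()

def wilkinson(n):
    """Wilkinson's example: growth 2^(n-1) under partial pivoting."""
    W = np.eye(n) - np.tril(np.ones((n, n)), -1)
    W[:, -1] = 1.0
    return W

print(growth_factor_proxy(wilkinson(20)))                  # ~2^19 = 524288
rng = np.random.default_rng(3)
print(growth_factor_proxy(rng.standard_normal((20, 20))))  # random matrix: typically tiny by comparison
```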

  36. Condition Number and Iterative LP Solvers. Renegar defined a condition number C(A, b, c) for the linear program max c^T x s.t. Ax ≤ b: • the distance of (A, b, c) to the nearest ill-posed linear program • related to the sensitivity of x to changes in (A, b, c). The number of iterations of many LP solvers is bounded by a function of this condition number: ellipsoid, perceptron, interior point, von Neumann.

  37. Smoothed Analysis of the Perceptron Algorithm [Blum-Dunagan 01]. Theorem: the perceptron algorithm has polynomial smoothed complexity, with high probability. The bound goes through the "wiggle room" (margin), a condition number. Note: a high-probability bound is slightly weaker than a bound on the expectation.
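
A sketch of the perceptron, in its feasibility form, on a hypothetical instance built to have wiggle room at least 0.2; the iteration bound (R/γ)² is the classical one:

```python
import numpy as np

def perceptron(A, max_iters=10_000):
    """Find x with a_i . x > 0 for every row a_i (feasibility form of the perceptron).
    Classical bound: at most (R / gamma)^2 updates, where gamma is the margin
    ("wiggle room") of the best solution and R bounds the row norms."""
    x = np.zeros(A.shape[1])
    for _ in range(max_iters):
        violated = np.nonzero(A @ x <= 0)[0]
        if violated.size == 0:
            return x
        x += A[violated[0]]            # perceptron update on one violated constraint
    return None                        # iteration budget exhausted

# Hypothetical separable instance with wiggle room >= 0.2 around the direction w.
rng = np.random.default_rng(4)
w = np.array([3.0, -1.0, 2.0])
X = rng.standard_normal((200, 3))
X = X[np.abs(X @ w) / np.linalg.norm(w) > 0.2]    # keep points with margin >= 0.2
A = X * np.sign(X @ w)[:, None]                   # orient each row so w is strictly feasible
x = perceptron(A)
print(x is not None and bool(np.all(A @ x > 0)))  # True
```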

  38. Smoothed Analysis of Renegar's Condition Number. Theorem [Dunagan-Spielman-Teng 02]: the condition number of the perturbed program is small with high probability. Corollary: the smoothed complexity of the interior point method is polynomial in m, d, 1/σ, and log(1/ε) for accuracy ε. (Compare with the worst-case iteration complexity of interior point methods.)

  39. Perturbations of Structured and Sparse Problems: structured perturbations of structured inputs (perturb within the structure); zero-preserving perturbations of sparse inputs (perturb only the non-zero entries); or perturb the discrete structure itself…

  40. Goals of Smoothed Analysis: relax worst-case analysis; maintain mathematical rigor; provide a plausible explanation for the practical behavior of algorithms; develop a theory closer to practice. http://math.mit.edu/~spielman/SmoothedAnalysis

  41. Geometry of … (figure)

  42. Geometry of …: the bound should be d^(1/2) (union bound)

  43. Improving the bound on …: for …, apply the lemma to the random perturbation; this yields the conjectured bound.

  44. Smoothed Analysis of Renegar's Condition Number, revisited with the conjectured bound of slide 34. Theorem [Dunagan-Spielman-Teng 02] and corollary as before: the smoothed complexity of the interior point method is polynomial in m, d, 1/σ, and log(1/ε) for accuracy ε; the conjectured condition-number bound would sharpen this. (Compare with the worst-case iteration complexity of interior point methods.)
