
ECE 5424: Introduction to Machine Learning



Presentation Transcript


  1. ECE 5424: Introduction to Machine Learning • Topics: • SVM • Lagrangian Duality • (Maybe) SVM dual & kernels • Readings: Barber 17.5 • Stefan Lee Virginia Tech

  2. Recap of Last Time (C) Dhruv Batra

  3. (C) Dhruv Batra Image Courtesy: Arthur Gretton

  4. (C) Dhruv Batra Image Courtesy: Arthur Gretton

  5. (C) Dhruv Batra Image Courtesy: Arthur Gretton

  6. (C) Dhruv Batra Image Courtesy: Arthur Gretton

  7. Generative vs. Discriminative • Generative Approach (Naïve Bayes) • Estimate p(x|y) and p(y) • Use Bayes Rule to predict y • Discriminative Approach • Estimate p(y|x) directly (Logistic Regression) • Learn “discriminant” function f(x) (Support Vector Machine) (C) Dhruv Batra
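The two approaches on this slide can be sketched in a few lines. The toy 1-D data and the hand-set discriminant below are hypothetical stand-ins of mine, not from the lecture:

```python
import math

# Toy 1-D data: class 0 centered near -1, class 1 near +1 (hypothetical data).
X0 = [-1.2, -0.8, -1.0, -0.9]
X1 = [0.9, 1.1, 1.0, 0.8]

# Generative (Naive Bayes): estimate p(x|y) as a Gaussian per class, p(y) from counts.
def gaussian_fit(xs):
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, var

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

(mu0, v0), (mu1, v1) = gaussian_fit(X0), gaussian_fit(X1)
p0 = len(X0) / (len(X0) + len(X1))   # prior p(y=0)
p1 = 1 - p0

def predict_generative(x):
    # Bayes rule: argmax_y p(x|y) p(y)
    return int(gaussian_pdf(x, mu1, v1) * p1 > gaussian_pdf(x, mu0, v0) * p0)

# Discriminative: learn a discriminant f(x) = w*x + b directly.
# (A trained model would fit w, b; this hand-set threshold is a placeholder.)
w, b = 1.0, 0.0
def predict_discriminative(x):
    return int(w * x + b > 0)

print(predict_generative(0.5), predict_discriminative(0.5))   # both predict class 1
```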

  8. SVMs are Max-Margin Classifiers • Hyperplanes: w·x + b = +1, w·x + b = 0, w·x + b = −1, with support points x+ and x− on the outer planes • Maximize this margin while getting examples correct.
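The margin the slide refers to is the distance 2/‖w‖ between the planes w·x + b = +1 and w·x + b = −1. A quick numeric check, with a made-up w and b:

```python
import math

# Hypothetical separating hyperplane w·x + b = 0 in 2-D.
w = [3.0, 4.0]
b = -1.0

# The planes w·x + b = +1 and w·x + b = -1 bound the margin;
# the distance between them is 2 / ||w||, which the SVM maximizes.
norm_w = math.sqrt(sum(wi * wi for wi in w))   # ||w|| = 5
margin = 2.0 / norm_w                          # = 0.4
print(margin)
```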

  9. Last Time • Hard-Margin SVM Formulation • Soft-Margin SVM Formulation
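The soft-margin objective mentioned above is (1/2)‖w‖² + C·Σᵢ max(0, 1 − yᵢ(w·xᵢ + b)); the hard-margin version requires every slack term to be exactly zero. A small sketch evaluating it, with toy data and hand-picked w, b, C of my own:

```python
# Soft-margin SVM objective: (1/2)||w||^2 + C * sum_i max(0, 1 - y_i (w·x_i + b)).
# Toy data and parameters below are hypothetical.
X = [(2.0, 1.0), (1.0, 2.0), (-1.0, -1.0), (-2.0, -0.5)]
y = [+1, +1, -1, -1]
w, b, C = (0.5, 0.5), 0.0, 1.0

def objective(w, b, C):
    reg = 0.5 * sum(wi * wi for wi in w)                 # margin term (1/2)||w||^2
    slack = sum(max(0.0, 1 - yi * (w[0] * xi[0] + w[1] * xi[1] + b))
                for xi, yi in zip(X, y))                 # hinge slack per example
    return reg + C * slack

# Here every example sits outside the margin, so slack = 0 and only
# the regularizer contributes: objective = 0.25.
print(objective(w, b, C))
```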

  10. Last Time • Can rewrite as a hinge loss SVM: Hinge Loss LR: Logistic Loss
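Both losses can be written as functions of the margin m = y·f(x) and compared directly (a small sketch, not from the slides):

```python
import math

# Hinge loss (SVM): zero once the margin exceeds 1, linear penalty below.
def hinge(m):
    return max(0.0, 1.0 - m)

# Logistic loss (LR): smooth, never exactly zero, same asymptotic slope.
def logistic(m):
    return math.log(1.0 + math.exp(-m))

for m in (-1.0, 0.0, 1.0, 2.0):
    print(f"m={m:+.1f}  hinge={hinge(m):.3f}  logistic={logistic(m):.3f}")
```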

  11. Today • I want to show you how useful SVMs really are by explaining the Kernel trick but to do that…. • …. we need to talk about Lagrangian Duality

  12. Constrained Optimization • How do I solve these sorts of problems: (C) Dhruv Batra

  13. Introducing the Lagrangian • Let's assume we have a problem of the form: minimize f(x) subject to gᵢ(x) ≤ 0, i = 1…m, and hⱼ(x) = 0, j = 1…p • Define the Lagrangian function: L(x, λ, ν) = f(x) + Σᵢ λᵢ gᵢ(x) + Σⱼ νⱼ hⱼ(x) • where λ and ν are called Lagrange Multipliers (or dual variables) (C) Dhruv Batra
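A minimal sketch of this definition in code; the instance at the bottom (minimize x² subject to x ≥ 1, i.e. g(x) = 1 − x ≤ 0) is a hypothetical example of mine:

```python
# Generic Lagrangian for: min f(x) s.t. g_i(x) <= 0, h_j(x) = 0.
# L(x, lam, nu) = f(x) + sum_i lam_i * g_i(x) + sum_j nu_j * h_j(x)
def lagrangian(f, gs, hs, x, lam, nu):
    return (f(x)
            + sum(l * g(x) for l, g in zip(lam, gs))
            + sum(n * h(x) for n, h in zip(nu, hs)))

# Hypothetical instance: minimize x^2 subject to x >= 1 (g(x) = 1 - x <= 0).
f = lambda x: x * x
g = lambda x: 1 - x
print(lagrangian(f, [g], [], 2.0, [0.5], []))   # 4 + 0.5*(1 - 2) = 3.5
```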

  14. Gradient of Lagrangian • Setting the gradient of the Lagrangian to zero, ∇L(x, λ, ν) = 0, finds the critical points of the constrained problem. (C) Dhruv Batra

  15. Building Intuition

  16. Geometric Intuition Image Credit: Wikipedia

  17. Geometric Intuition h Image Credit: Wikipedia

  18. A simple example • Minimize f(x, y) = x + y s.t. x² + y² = 1 • L(x, y, λ) = x + y + λ(x² + y² − 1) • Setting ∂L/∂x = 1 + 2λx = 0 and ∂L/∂y = 1 + 2λy = 0 makes x = y = −1/(2λ) • The constraint then gives λ = ±1/√2; the minimum is at x = y = −1/√2, where f = −√2
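The stationary point of this example can be checked numerically (a small verification sketch):

```python
import math

# Verify the stationary point of L(x, y, lam) = x + y + lam*(x^2 + y^2 - 1)
# for: min x + y  s.t.  x^2 + y^2 = 1.
lam = 1 / math.sqrt(2)
x = y = -1 / (2 * lam)          # from dL/dx = 1 + 2*lam*x = 0

# All three partial derivatives vanish at this point:
dLdx = 1 + 2 * lam * x
dLdy = 1 + 2 * lam * y
dLdlam = x**2 + y**2 - 1        # recovers the constraint
print(dLdx, dLdy, dLdlam)       # all ~0

print(x + y)                    # minimum value -sqrt(2) ≈ -1.414
```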

  19. Why go through this trouble? • Problems rewritten in Lagrangian form are often easier to solve • Sometimes the form offers useful intuition about the problem • It builds character.

  20. Lagrangian Duality • More formally on overhead • Starring a game-theoretic interpretation, the duality gap, and KKT conditions.
