Proximal Support Vector Machine Classifiers. KDD 2001, San Francisco, August 26-29, 2001


Presentation Transcript


  1. Proximal Support Vector Machine Classifiers. KDD 2001, San Francisco, August 26-29, 2001. Glenn Fung & Olvi Mangasarian, Data Mining Institute, University of Wisconsin - Madison

  2. Support Vector Machines: Maximizing the Margin between Bounding Planes.
  [Figure: the two classes $A+$ and $A-$ separated by the bounding planes $x'w = \gamma + 1$ and $x'w = \gamma - 1$, with margin $2/\|w\|$ between them.]

  3. Proximal Support Vector Machines: Fitting the Data Using Two Parallel Bounding Planes.
  [Figure: the same two planes, but now the points of $A+$ and $A-$ cluster around $x'w = \gamma + 1$ and $x'w = \gamma - 1$ respectively; the planes fit the data rather than merely bound it.]

  4. Standard Support Vector Machine Formulation.
  Solve the quadratic program, for some $\nu > 0$:
  $\min_{(w,\gamma,y)} \; \nu e'y + \frac{1}{2} w'w$ (QP)
  s.t. $D(Aw - e\gamma) + y \ge e$, $y \ge 0$,
  where $D_{ii} = \pm 1$ denotes $A+$ or $A-$ membership.
  • The margin is maximized by minimizing $\frac{1}{2}\|w\|^2$.
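  For concreteness, this quadratic program can be written down directly. A minimal sketch using NumPy and cvxpy (the solver choice and the function name standard_svm_qp are illustrative assumptions, not part of the talk):

```python
import cvxpy as cp  # generic QP solver; an assumption of this sketch
import numpy as np

def standard_svm_qp(A, d, nu=1.0):
    """Standard soft-margin SVM as a quadratic program (illustrative).

    A  : (m, n) data matrix, one point per row.
    d  : (m,) labels, +1 for A+ membership and -1 for A- (diagonal of D).
    nu : slack penalty, nu > 0.
    Returns (w, gamma) defining the separating plane x'w = gamma.
    """
    m, n = A.shape
    w, gamma, y = cp.Variable(n), cp.Variable(), cp.Variable(m)
    # Constraints: D(Aw - e*gamma) + y >= e and y >= 0.
    constraints = [cp.multiply(d, A @ w - gamma) + y >= 1, y >= 0]
    # Objective: nu * e'y + (1/2) w'w.
    cp.Problem(cp.Minimize(nu * cp.sum(y) + 0.5 * cp.sum_squares(w)),
               constraints).solve()
    return w.value, gamma.value
```

  The point of PSVM, developed next, is that this QP machinery becomes unnecessary.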

  5. PSVM Formulation.
  We have from the QP SVM formulation:
  $\min_{(w,\gamma,y)} \; \frac{\nu}{2}\|y\|^2 + \frac{1}{2}\|[w;\gamma]\|^2$ (QP)
  s.t. $D(Aw - e\gamma) + y = e$,
  where the standard SVM's inequality constraint has become an equality, the slack $y$ is penalized by its squared 2-norm, and $\gamma$ joins the regularization term. Solving for $y$ in terms of $w$ and $\gamma$ gives:
  $\min_{(w,\gamma)} \; \frac{\nu}{2}\|D(Aw - e\gamma) - e\|^2 + \frac{1}{2}\|[w;\gamma]\|^2$.
  This simple, but critical, modification changes the nature of the optimization problem tremendously!

  6. Advantages of New Formulation.
  • The objective function remains strongly convex.
  • An explicit exact solution can be written in terms of the problem data.
  • The PSVM classifier is obtained by solving a single system of linear equations in the usually small dimensional input space.
  • Exact leave-one-out correctness can be obtained in terms of the problem data.

  7. Linear PSVM.
  We want to solve:
  $\min_{(w,\gamma)} \; \frac{\nu}{2}\|D(Aw - e\gamma) - e\|^2 + \frac{1}{2}\|[w;\gamma]\|^2$.
  • Setting the gradient equal to zero gives a nonsingular system of linear equations.
  • Solution of the system gives the desired PSVM classifier.
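  The gradient step alluded to above can be made explicit; the following is a sketch in the reconstructed notation, writing $z = [w;\gamma]$ and $E = [A \;\; -e]$ (defined on the next slide):

```latex
f(z) \;=\; \frac{\nu}{2}\,\lVert DEz - e \rVert^2 \;+\; \frac{1}{2}\,\lVert z \rVert^2 ,
\qquad
\nabla f(z) \;=\; \nu\,E'D\,(DEz - e) \;+\; z \;=\; 0
\;\Longrightarrow\;
\Bigl(\frac{I}{\nu} + E'E\Bigr) z \;=\; E'De ,
```

  using $D'D = I$. Since $I/\nu + E'E$ is symmetric positive definite, the system is nonsingular.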

  8. Linear PSVM Solution.
  $[w;\gamma] = (I/\nu + E'E)^{-1} E'De$, where $E = [A \;\; -e]$.
  • The linear system to solve depends on $E'E$, which is of size $(n+1)\times(n+1)$; the input-space dimension $n$ is usually much smaller than the number of points $m$.

  9. Linear Proximal SVM Algorithm.
  Input: $A$, $D$.
  Define: $E = [A \;\; -e]$.
  Calculate: $I/\nu + E'E$ and $E'De$.
  Solve: $(I/\nu + E'E)\,[w;\gamma] = E'De$.
  Classifier: $\mathrm{sign}(x'w - \gamma)$.
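  A minimal NumPy sketch of the linear algorithm above (not the authors' code; the function names and the toy data below are illustrative):

```python
import numpy as np

def linear_psvm(A, d, nu=1.0):
    """Linear proximal SVM: one (n+1)x(n+1) linear system, no QP.

    A  : (m, n) data matrix, one point per row.
    d  : (m,) labels in {+1, -1} (the diagonal of D).
    nu : trade-off parameter, nu > 0.
    Returns (w, gamma) for the separating plane x'w = gamma.
    """
    m, n = A.shape
    E = np.hstack([A, -np.ones((m, 1))])      # E = [A  -e]
    lhs = np.eye(n + 1) / nu + E.T @ E        # I/nu + E'E, size (n+1)x(n+1)
    rhs = E.T @ d                             # E'De, since De = d
    z = np.linalg.solve(lhs, rhs)             # z = [w; gamma]
    return z[:-1], z[-1]

def linear_psvm_predict(X, w, gamma):
    """Classifier: sign(x'w - gamma) for each row x of X."""
    return np.sign(X @ w - gamma)

# Toy usage on two synthetic blobs.
rng = np.random.default_rng(0)
A = np.vstack([rng.normal(+2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
d = np.hstack([np.ones(50), -np.ones(50)])
w, gamma = linear_psvm(A, d, nu=1.0)
print("training accuracy:", (linear_psvm_predict(A, w, gamma) == d).mean())
```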

  10. Nonlinear PSVM Formulation.
  Linear PSVM (linear separating surface $x'w = \gamma$):
  $\min_{(w,\gamma,y)} \; \frac{\nu}{2}\|y\|^2 + \frac{1}{2}\|[w;\gamma]\|^2$ (QP)
  s.t. $D(Aw - e\gamma) + y = e$.
  Maximizing the margin by QP "duality" ($w = A'Du$) in the "dual space" gives:
  $\min_{(u,\gamma,y)} \; \frac{\nu}{2}\|y\|^2 + \frac{1}{2}\|[u;\gamma]\|^2$ s.t. $D(AA'Du - e\gamma) + y = e$.
  • Replace $AA'$ by a nonlinear kernel $K(A, A')$:
  $\min_{(u,\gamma,y)} \; \frac{\nu}{2}\|y\|^2 + \frac{1}{2}\|[u;\gamma]\|^2$ s.t. $D(K(A,A')Du - e\gamma) + y = e$.

  11. The Nonlinear Classifier.
  The nonlinear classifier: $K(x', A')Du = \gamma$, where $K$ is a nonlinear kernel, e.g. the Gaussian (radial basis) kernel:
  $(K(A, A'))_{ij} = \exp(-\mu\|A_i - A_j\|^2)$.
  • The $ij$-entry of $K(A, A')$ represents the "similarity" of data points $A_i$ and $A_j$.
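  A short NumPy sketch of the Gaussian kernel above (the function name and the default value of $\mu$ are illustrative):

```python
import numpy as np

def gaussian_kernel(A, B, mu=0.1):
    """(K(A, B'))_{ij} = exp(-mu * ||A_i - B_j||^2).

    A : (m, n) points as rows; B : (l, n) points as rows.
    Returns the (m, l) kernel matrix of pairwise similarities.
    """
    sq_dist = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-mu * sq_dist)
```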

  12. Nonlinear PSVM.
  Similar to the linear case, setting the gradient equal to zero we obtain:
  $[u;\gamma] = (I/\nu + G'G)^{-1} G'De$,
  defining $G$ slightly differently: $G = [KD \;\; -e]$, where $K = K(A, A')$.
  • Here, the linear system to solve is of size $(m+1)\times(m+1)$. However, reduced kernel techniques (RSVM) can be used to reduce the dimensionality.

  13. Nonlinear Proximal SVM Algorithm.
  Input: $A$, $D$.
  Define: $K = K(A, A')$, $G = [KD \;\; -e]$.
  Calculate: $I/\nu + G'G$ and $G'De$.
  Solve: $(I/\nu + G'G)\,[u;\gamma] = G'De$.
  Classifier: $\mathrm{sign}(K(x', A')Du - \gamma)$.
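  A matching NumPy sketch of the nonlinear algorithm (illustrative, not the authors' code; it reuses the gaussian_kernel sketch above and forms the full $(m+1)\times(m+1)$ system, so it is only practical for modest $m$, which is exactly what RSVM addresses):

```python
import numpy as np

def nonlinear_psvm(A, d, kernel, nu=1.0):
    """Nonlinear proximal SVM: one (m+1)x(m+1) linear system.

    A      : (m, n) data matrix.
    d      : (m,) labels in {+1, -1} (the diagonal of D).
    kernel : function (X, Y) -> kernel matrix, e.g. gaussian_kernel.
    nu     : trade-off parameter, nu > 0.
    Returns (u, gamma) for the surface K(x', A') D u = gamma.
    """
    m = A.shape[0]
    K = kernel(A, A)                                   # K = K(A, A')
    G = np.hstack([K * d[None, :], -np.ones((m, 1))])  # G = [KD  -e]
    z = np.linalg.solve(np.eye(m + 1) / nu + G.T @ G, G.T @ d)
    return z[:-1], z[-1]                               # u, gamma

def nonlinear_psvm_predict(X, A, d, u, gamma, kernel):
    """Classifier: sign(K(x', A') D u - gamma) for each row x of X."""
    return np.sign(kernel(X, A) @ (d * u) - gamma)
```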
