Explore linear hyperplanes for data separation in Support Vector Machines (SVM) and determine which solution is better by maximizing margin. Learn how to define the optimal boundary and deal with non-linear boundaries.
Support Vector Machines
• Find a linear hyperplane (decision boundary) that will separate the data
• One possible solution
• Another possible solution
• Other possible solutions
• Which one is better, B1 or B2? How do you define "better"?
• Find the hyperplane that maximizes the margin ⇒ B1 is better than B2
[Figure: margin geometry. The hyperplanes w·x + b = 0, w·x + b = 1, and w·x + b = −1 are drawn in the (X1, X2) plane, with margin width d between the two outer lines. w is the normal vector: the central line is perpendicular to w and lies at distance −b/|w| from the origin along the direction of w.]
Derivation • What kind of line is w·x + b = 0? Let xa and xb be two points on it, so that w·xa + b = 0 and w·xb + b = 0. Subtracting the two equations gives w·(xa − xb) = 0: the vector formed by any two points on the line is perpendicular to w, so w is the normal vector. • w·xs + b = k for a point xs above the line, with k > 0 • w·xc + b = k′ for a point xc below the line, with k′ < 0 • Squares are classified as y = 1, circles as y = −1 • Classification rule: a point z is assigned to class y = 1 or y = −1 according to the sign of w·z + b
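The normal-vector argument above can be checked numerically. The values of w, b, and the two on-line points below are made-up illustrations, not from the slides:

```python
import numpy as np

# Illustrative hyperplane w·x + b = 0 in 2-D (w and b chosen by hand).
w = np.array([2.0, 1.0])
b = -4.0

# Two points satisfying w·x + b = 0.
xa = np.array([0.0, 4.0])   # 2*0 + 1*4 - 4 = 0
xb = np.array([2.0, 0.0])   # 2*2 + 1*0 - 4 = 0

# The difference of two on-line points lies in the hyperplane,
# so it is orthogonal to w, confirming that w is the normal vector.
print(np.dot(w, xa - xb))   # → 0.0
```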
Rescaling w and b so that the two margin hyperplanes become w·x + b = 1 and w·x + b = −1 gives • w·(x1 − x2) = 2 • |w|·d = 2 • ∴ d = 2/|w|
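The margin formula d = 2/|w| can be verified directly: pick a point on each margin line and measure the distance between them along the normal. The specific w, b, and points are illustrative choices, not values from the slides:

```python
import numpy as np

w = np.array([3.0, 4.0])          # example weight vector, |w| = 5
b = 1.0

# A point on the upper margin line w·x + b = 1.
x1 = np.array([0.0, 0.0])         # w·x1 + b = 1

# Step from x1 along -w (the normal direction) to the lower margin line.
x2 = x1 - 2 * w / np.dot(w, w)

print(np.dot(w, x2) + b)          # → -1.0 (x2 lies on w·x + b = -1)

# Distance between the two margin lines equals 2/|w|.
d = np.linalg.norm(x1 - x2)
print(d, 2 / np.linalg.norm(w))   # → 0.4 0.4
```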
Support Vector Machines • We want to maximize: d = 2/|w| • Which is equivalent to minimizing: f(w) = |w|²/2 • But subject to the following constraints: yi(w·xi + b) ≥ 1 for every training point (xi, yi) • This is a constrained optimization problem • Numerical approaches exist to solve it (e.g., quadratic programming)
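As a minimal sketch of the numerical approach the slide mentions, the constrained problem can be handed to a general-purpose solver. This assumes SciPy is available; the tiny linearly separable data set is made up for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data, labels y in {+1, -1} (made up for illustration).
X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [-1.0, 0.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

def objective(p):                 # p = (w1, w2, b); minimize |w|^2 / 2
    return 0.5 * np.dot(p[:2], p[:2])

# One inequality constraint per point: yi (w·xi + b) - 1 >= 0.
constraints = [
    {"type": "ineq",
     "fun": lambda p, xi=xi, yi=yi: yi * (np.dot(p[:2], xi) + p[2]) - 1.0}
    for xi, yi in zip(X, y)
]

res = minimize(objective, x0=np.zeros(3), constraints=constraints, method="SLSQP")
w, b = res.x[:2], res.x[2]

# At the solution, every margin constraint should hold (up to solver tolerance).
print(np.all(y * (X @ w + b) >= 1 - 1e-6))
```

Dedicated quadratic-programming solvers (or SVM libraries) are far more efficient in practice; this only shows the optimization problem being solved as stated.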
Min f(w) = |w|²/2 • Subject to yi(w·xi + b) ≥ 1 • If the data are not perfectly separable, introduce slack variables ξi and change to • Min f(w) = |w|²/2 + C Σ ξi • Subject to yi(w·xi + b) ≥ 1 − ξi, with ξi ≥ 0
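The soft-margin objective can be evaluated by hand for a fixed hyperplane, which makes the role of the slack variables concrete. The hyperplane, C, and data below are illustrative values, not from the slides:

```python
import numpy as np

# Illustrative hyperplane and penalty (made up): one point violates its margin.
w, b, C = np.array([1.0, 1.0]), -3.0, 10.0
X = np.array([[3.0, 2.0], [4.0, 4.0], [0.0, 1.0], [2.0, 2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Slack xi_i = max(0, 1 - yi (w·xi + b)) measures each margin violation.
slack = np.maximum(0.0, 1 - y * (X @ w + b))
print(slack)       # → [0. 0. 0. 2.]  (only the last point needs slack)

# Soft-margin objective: margin term plus C times the total slack.
objective = 0.5 * np.dot(w, w) + C * slack.sum()
print(objective)   # → 21.0  (= 1.0 + 10 * 2.0)
```

Larger C penalizes violations more heavily, trading margin width for fewer misclassified or margin-violating points.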
Nonlinear Support Vector Machines • What if the decision boundary is not linear?
Nonlinear Support Vector Machines • Transform the data into a higher-dimensional space, where a linear boundary can separate the classes
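A classic one-dimensional illustration of this idea: points that no single threshold can separate become linearly separable after the (hand-picked, illustrative) feature map φ(x) = (x, x²):

```python
import numpy as np

# 1-D points: class +1 at the extremes, class -1 in the middle, so no single
# threshold on x (i.e., no linear boundary in 1-D) separates the classes.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([1, 1, -1, -1, -1, 1, 1])

# Map to 2-D feature space phi(x) = (x, x^2); the classes now sit above and
# below the horizontal line x^2 = 2, a linear boundary in feature space.
phi = np.column_stack([x, x**2])
w, b = np.array([0.0, 1.0]), -2.0          # hyperplane x^2 - 2 = 0 (chosen by hand)
pred = np.sign(phi @ w + b)
print(np.array_equal(pred, y))             # → True
```

Kernel functions make this practical by computing inner products in the transformed space without constructing φ(x) explicitly.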