MULTILAYER PERCEPTRON Nurochman, Teknik Informatika UIN Sunan Kalijaga Yogyakarta
Review SLP: [diagram of a single-layer perceptron: inputs X1, X2, …, X3 are multiplied by weights w1, w2, …, wi, summed (Σ xi·wi), and passed through the activation function f(y) to produce the output]
Activation Functions • Binary step function (hard limit) • Binary step function with threshold
Activation Functions • Bipolar function • Bipolar function with threshold
Activation Functions • Linear (identity) function • Binary sigmoid function
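The activation functions listed on these slides can be sketched as follows. This is an illustrative sketch; the threshold defaults and function names are assumptions, not from the slides.

```python
import math

def binary_step(v, theta=0.0):
    """Binary step (hard limit / with threshold): 1 if v >= theta, else 0."""
    return 1 if v >= theta else 0

def bipolar_step(v, theta=0.0):
    """Bipolar (with threshold): +1 if v >= theta, else -1."""
    return 1 if v >= theta else -1

def linear(v):
    """Linear (identity) function: output equals input."""
    return v

def binary_sigmoid(v):
    """Binary sigmoid: f(v) = 1 / (1 + e^(-v)), output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))
```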
Learning Algorithm • Initialize the learning rate (α), the threshold (θ), the weights, and the bias • Compute net = Σ xi·wi + b • Compute the output y = f(net)
Learning Algorithm • If y ≠ target, update the weights and bias: Wi(new) = Wi(old) + α·t·Xi and b(new) = b(old) + α·t • Repeat from step 2 until no more weights are updated
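The learning algorithm above can be sketched as a short training loop. One assumption: bipolar targets t ∈ {−1, +1} and a bipolar step activation, because the update rule Wi + α·t·Xi can only correct a false positive when t can be −1 (with 0/1 targets, a t = 0 sample would never change the weights).

```python
# Assumption: bipolar targets t in {-1, +1}, not the 0/1 coding of the
# truth tables on the later slides.

def predict(w, b, x, theta=0.0):
    """Bipolar step activation: +1 if net >= theta, else -1."""
    net = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if net >= theta else -1

def train_perceptron(samples, alpha=1.0, epochs=100):
    """samples: list of (inputs, target). Returns (weights, bias)."""
    w, b = [0.0] * len(samples[0][0]), 0.0
    for _ in range(epochs):
        updated = False
        for x, t in samples:
            if predict(w, b, x) != t:  # y != target: update weights and bias
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                b += alpha * t
                updated = True
        if not updated:                # stop once no weights change
            break
    return w, b

# Bipolar encoding of the OR problem:
OR = [((1, 1), 1), ((1, -1), 1), ((-1, 1), 1), ((-1, -1), -1)]
w, b = train_perceptron(OR)
```

Since OR is linearly separable, the loop converges and the trained weights classify all four patterns correctly.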
Problem "OR": X1 and X2 each connect to Y with weight 1; threshold 1.

X1  X2  net              Y (1 if net >= 1, 0 if net < 1)
1   1   1·1 + 1·1 = 2    1
1   0   1·1 + 0·1 = 1    1
0   1   0·1 + 1·1 = 1    1
0   0   0·1 + 0·1 = 0    0

The pattern is recognized SUCCESSFULLY.
Problem "AND": X1 and X2 each connect to Y with weight 1; threshold 2.

X1  X2  net              Y (1 if net >= 2, 0 if net < 2)
1   1   1·1 + 1·1 = 2    1
1   0   1·1 + 0·1 = 1    0
0   1   0·1 + 1·1 = 1    0
0   0   0·1 + 0·1 = 0    0

The pattern is recognized SUCCESSFULLY.
Problem "X1 and not(X2)": X1 connects to Y with weight 2, X2 with weight −1; threshold 2.

X1  X2  net                  Y (1 if net >= 2, 0 if net < 2)
1   1   1·2 + 1·(−1) = 1     0
1   0   1·2 + 0·(−1) = 2     1
0   1   0·2 + 1·(−1) = −1    0
0   0   0·2 + 0·(−1) = 0     0

The pattern is recognized SUCCESSFULLY.
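The three truth tables above can be checked directly with the fixed weights and thresholds from the slides (OR: w = (1, 1), θ = 1; AND: w = (1, 1), θ = 2; X1 AND NOT X2: w = (2, −1), θ = 2):

```python
def gate(w1, w2, theta):
    """Build a two-input threshold unit: 1 if w1*x1 + w2*x2 >= theta."""
    return lambda x1, x2: 1 if w1 * x1 + w2 * x2 >= theta else 0

or_gate     = gate(1, 1, 1)    # weights (1, 1), threshold 1
and_gate    = gate(1, 1, 2)    # weights (1, 1), threshold 2
andnot_gate = gate(2, -1, 2)   # weights (2, -1), threshold 2

for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(x1, x2, or_gate(x1, x2), and_gate(x1, x2), andnot_gate(x1, x2))
```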
Problem "XOR": F(1,1) = 0, F(1,0) = 1, F(0,1) = 1, F(0,0) = 0

X1  X2  Y
1   1   0
1   0   1
0   1   1
0   0   0

The single-layer perceptron FAILS!
Solution: [diagram: X1 and X2 feed two hidden units, Z1 (weights 2, −1, threshold 2) and Z2 (weights −1, 2, threshold 2), which feed Y (weights 1, 1, threshold 1)] • XOR = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2) • It turns out that a hidden layer is required
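The hidden-layer solution can be written out directly: Z1 computes x1 AND NOT x2 (weights 2, −1, threshold 2), Z2 computes NOT x1 AND x2 (weights −1, 2, threshold 2), and Y ORs them (weights 1, 1, threshold 1):

```python
def step(net, theta):
    """Binary step activation: 1 if net >= theta, else 0."""
    return 1 if net >= theta else 0

def xor(x1, x2):
    z1 = step(2 * x1 - 1 * x2, 2)    # hidden unit Z1: x1 AND NOT x2
    z2 = step(-1 * x1 + 2 * x2, 2)   # hidden unit Z2: NOT x1 AND x2
    return step(1 * z1 + 1 * z2, 1)  # output Y: z1 OR z2

for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(x1, x2, xor(x1, x2))  # reproduces the XOR truth table
```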
Multi-Layer Perceptron • An MLP is a feedforward neural network with at least one hidden layer (Li Min Fu) • Limitations of the Single-Layer Perceptron • Neural Networks for Nonlinear Pattern Recognition • The XOR Problem
Solution for XOR Problem: [network diagram with connection weights of +1 and −1 from x1 and x2 to the hidden and output units]

φ(v) = +1 if v > 0, −1 if v ≤ 0 is the sign function.
Learning Algorithm • Backpropagation Algorithm • It adjusts the weights of the NN in order to minimize the average squared error [diagram: function signals flow forward through the network; error signals flow backward]
BP has two phases • Forward pass phase: computes the 'function signal', the feedforward propagation of input pattern signals through the network • Backward pass phase: computes the 'error signal', propagating the error backwards through the network, starting at the output units (where the error is the difference between actual and desired output values)
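The two phases can be sketched for a tiny 2-2-1 sigmoid network. This is an illustrative sketch, not the slides' formulation: the network size, random initialization, and learning rate are all assumptions.

```python
import math
import random

random.seed(0)
alpha = 0.5                                  # assumed learning rate
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def forward(x):
    """Forward pass: propagate the function signal through the network."""
    z = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(W2[0] * z[0] + W2[1] * z[1] + b2)
    return z, y

def backward(x, t):
    """Backward pass: propagate error signals and update the weights."""
    global b2
    z, y = forward(x)
    d_out = (t - y) * y * (1 - y)                  # error signal at the output
    for j in range(2):
        d_hid = d_out * W2[j] * z[j] * (1 - z[j])  # error propagated backwards
        W2[j] += alpha * d_out * z[j]
        W1[j][0] += alpha * d_hid * x[0]
        W1[j][1] += alpha * d_hid * x[1]
        b1[j] += alpha * d_hid
    b2 += alpha * d_out
```

A single backward pass on one sample moves the output toward its target, so the squared error on that sample shrinks.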
Activation Function • Sigmoidal function [plot over v from −10 to 10; the curve steepens with increasing a]
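The plotted family can be sketched in code. The exact form f(v) = 1 / (1 + e^(−a·v)) is an assumption consistent with the plot's slope parameter a: increasing a steepens the curve while f(0) stays at 0.5.

```python
import math

def sigmoid(v, a=1.0):
    """Sigmoidal function with slope parameter a (assumed form)."""
    return 1.0 / (1.0 + math.exp(-a * v))

for a in (0.5, 1.0, 5.0):
    # larger a pushes the curve closer to a step at v = 0
    print(a, sigmoid(-2.0, a), sigmoid(0.0, a), sigmoid(2.0, a))
```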