Learning Algorithm of MLP
Neural Networks: Multi-Layer Perceptrons

"Backpropagation Learning Algorithm"
Forward propagation: the function signal flows forward through the activation function f(.) of each layer.
Backward propagation: the error signal flows backward through the network.
Computations at each neuron j:
• Neuron output, yj
• Vector of error gradient, ∂E/∂wji
Learning Algorithm of MLP
[Network diagram: inputs x1(i), x2(i), …, xm(i); output yk(i)]
Goal: minimize the cost function / performance index, E(i) = ½ Σk (dk(i) − yk(i))²
Weight modification rule (gradient descent): Δwji = −η ∂E/∂wji
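The cost function and weight modification rule above can be sketched as follows. This is a minimal illustration assuming the standard squared-error cost and plain gradient descent; the function names, learning rate, and toy numbers are made up for the example:

```python
import numpy as np

# Squared-error cost over the outputs: E = 1/2 * sum((d - y)^2)
def cost(d, y):
    return 0.5 * np.sum((d - y) ** 2)

# Gradient-descent weight modification rule: w <- w - eta * dE/dw
def update_weights(w, grad, eta=0.1):
    return w - eta * grad

d = np.array([1.0, 0.0])   # desired outputs (toy values)
y = np.array([0.8, 0.3])   # actual network outputs (toy values)
print(cost(d, y))          # 0.5 * (0.04 + 0.09) = 0.065
```

Repeating the update with the current gradient is what drives the error downward over training iterations.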
Learning Algorithm of MLP
Backpropagation Learning Algorithm:
• Learning on the output neuron
• Learning on the hidden neurons
Learning Algorithm of MLP
Notations:
• the output of the k-th neuron of the l-th layer, at the i-th time instant
• the output of the j-th neuron of the (l−1)-th layer, at the i-th time instant
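In conventional backpropagation notation these two quantities are commonly written as below; the symbols are an assumption, since the slide's own formula images did not survive extraction:

```latex
y_k^{(l)}(i)   \quad \text{output of the $k$-th neuron of the $l$-th layer at time instant $i$}

y_j^{(l-1)}(i) \quad \text{output of the $j$-th neuron of the $(l-1)$-th layer at time instant $i$}
```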
Backpropagation Learning Algorithm
Learning on the output neuron: the local gradient depends on the activation function.
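For a logistic (sigmoid) output neuron, the output-layer gradient term can be sketched as below. The sigmoid choice is an assumption (the slide only says the term "depends on the activation function"), and the function names are illustrative:

```python
import numpy as np

def sigmoid(net, a=1.0):
    # Logistic activation with slope parameter a (a = 1 as in Homework 4)
    return 1.0 / (1.0 + np.exp(-a * net))

def output_delta(d, y, a=1.0):
    # delta_k = (d_k - y_k) * f'(net_k); for the logistic function,
    # f'(net) = a * y * (1 - y), expressed via the output y itself
    return (d - y) * a * y * (1.0 - y)

# The update for an output weight w_kj is then eta * delta_k * y_j,
# where y_j is the output of hidden neuron j feeding neuron k.
```

Expressing f'(net) through y itself is why the logistic function is convenient here: the derivative needs no extra forward computation.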
Backpropagation Learning Algorithm
Learning on the hidden neuron: the error signal is propagated back from the output layer through the connecting weights.
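For a hidden neuron, the local gradient combines the deltas of the layer above, weighted by the connecting weights. A sketch under the same sigmoid assumption as before, with illustrative names and toy numbers:

```python
import numpy as np

def hidden_delta(y_hidden, deltas_next, w_next, a=1.0):
    # delta_j = f'(net_j) * sum_k (delta_k * w_kj)
    # w_next[k, j] is the weight from hidden neuron j to output neuron k
    return a * y_hidden * (1.0 - y_hidden) * (w_next.T @ deltas_next)

y_hidden = np.array([0.5, 0.5])    # hidden-layer outputs (toy values)
deltas_next = np.array([0.2])      # delta of the single output neuron (toy)
w_next = np.array([[0.4, -0.4]])   # 1x2 hidden-to-output weights (toy)
print(hidden_delta(y_hidden, deltas_next, w_next))  # [0.02, -0.02]
```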
Backpropagation Learning Algorithm
Both the output-layer and the hidden-layer gradient terms depend on the derivative of the activation function.
Backpropagation Learning Algorithm
Forward propagation:
• Set the weights
• Calculate the output
Backward propagation:
• Calculate the error
• Calculate the gradient vector
• Update the weights
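The forward/backward steps above can be assembled into one complete training step. This is a minimal sketch for a single-hidden-layer network with sigmoid activations and no biases; the shapes, learning rate, and single toy training pattern are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def sigmoid(net, a=1.0):
    return 1.0 / (1.0 + np.exp(-a * net))

def train_step(x, d, W1, W2, eta=0.5, a=1.0):
    # Forward propagation: calculate the outputs layer by layer
    y1 = sigmoid(W1 @ x, a)                       # hidden-layer outputs
    y2 = sigmoid(W2 @ y1, a)                      # output-layer outputs
    # Backward propagation: error, local gradients, weight updates
    e = d - y2                                    # error at the output
    delta2 = e * a * y2 * (1 - y2)                # output-layer deltas
    delta1 = a * y1 * (1 - y1) * (W2.T @ delta2)  # hidden-layer deltas
    W2 = W2 + eta * np.outer(delta2, y1)          # update output weights
    W1 = W1 + eta * np.outer(delta1, x)           # update hidden weights
    return W1, W2, 0.5 * np.sum(e ** 2)

rng = np.random.default_rng(0)
W1 = rng.uniform(-1, 1, (2, 2))   # 2 inputs -> 2 hidden neurons
W2 = rng.uniform(-1, 1, (1, 2))   # 2 hidden -> 1 output neuron
x, d = np.array([1.0, 0.0]), np.array([1.0])
errors = []
for _ in range(100):
    W1, W2, E = train_step(x, d, W1, W2)
    errors.append(E)
print(errors[0], errors[-1])      # the error should decrease over iterations
```

With a small enough learning rate, each step moves the weights down the error gradient, so the recorded errors decrease, which is exactly the convergence check the homework asks for.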
Influential Factors in Learning
• Initial weights and bias
• Cost function / performance index
• Training data and generalization
• Network structure
  • Number of layers
  • Number of neurons
  • Interconnections
• Learning methods
  • Weight modification rule
  • Variable or fixed learning rate
Homework 4
• Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 2 neurons, and 1 output layer of 1 neuron, no bias at all (all a = 1).
• Be sure to obtain decreasing errors.
• Note: Submit the hardcopy and softcopy of the m-file.
• Hint: The number of parameters to be trained is six.
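A quick sanity check on the hint: with no biases, the trainable parameters are just the 2×2 input-to-hidden weight matrix plus the 2×1 hidden-to-output weight vector. The arithmetic below is only a counting illustration, not the m-file solution:

```python
# Weights from the 2 inputs to the 2 hidden neurons
hidden_weights = 2 * 2
# Weights from the 2 hidden neurons to the 1 output neuron
output_weights = 2 * 1
print(hidden_weights + output_weights)  # -> 6 parameters, matching the hint
```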
Homework 4A (Odd Student-ID)
• Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 3 neurons, and 1 output layer of 1 neuron, with bias (all a = 1).
• Be sure to obtain decreasing errors (convergence).
• Note: Submit the hardcopy and softcopy of the m-file.
• Hint: The number of parameters to be trained is eleven.
Homework 4A (Even Student-ID)
• Write an m-file that will perform the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 3 neurons, and 1 output layer of 1 neuron, with bias (all a = 0.8).
• Be sure to obtain decreasing errors (convergence).
• Note: Submit the hardcopy and softcopy of the m-file.
• Hint: The number of parameters to be trained is twelve.