Neural Networks And Its Applications By Dr. Surya Chitra
OUTLINE • Introduction & Software • Basic Neural Network & Processing • Software Exercise Problem/Project • Complementary Technologies • Genetic Algorithms • Fuzzy Logic • Examples of Applications • Manufacturing • R&D • Sales & Marketing • Financial
Introduction What is a Neural Network? "A computing system made up of a number of highly interconnected processing elements, which processes information by its dynamic state response to external inputs." — Dr. Robert Hecht-Nielsen. More simply: a parallel information-processing system, modeled on the human nervous system, consisting of a large number of neurons that operate in parallel.
Biological Neuron & Its Function Information is processed in the neuron cell body and transferred to the next neuron via the synaptic terminal.
Processing in Biological Neuron Neurotransmitters carry information to the next neuron, where it is further processed in that neuron's cell body.
Artificial Neuron & Its Function (diagram: inputs play the role of dendrites, the processing element the neuron body, and the outputs the axon)
Processing Steps Inside a Neuron (Electronic Implementation) • Sum the inputs (Sum, Min, Max, Mean, OR/AND) • Add the bias weight • Transform the result (Sigmoid, Hyperbola, Sine, Linear) • Send the result to the outputs
Sigmoid Transfer Function f(sum) = 1 / (1 + e^(−sum))
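The sigmoid transfer function above can be sketched in a few lines; this is a minimal illustration (the function name is ours, not from the slides):

```python
import math

def sigmoid(weighted_sum):
    """Sigmoid transfer function: maps any weighted sum into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# The output is 0.5 at zero and saturates toward 0 or 1 at the extremes.
mid = sigmoid(0.0)    # 0.5
high = sigmoid(5.0)   # close to 1
low = sigmoid(-5.0)   # close to 0
```

Its smooth, bounded shape is what lets back-propagation compute gradients through each neuron.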
Basic Neural Network & Its Elements Clustering of Neurons Bias Neurons Output Neurons Hidden Neurons Input Neurons
Back-Propagation Network: Forward Output Flow • A random set of weights is generated • Inputs are sent to the neurons • Each neuron computes its output • Calculate the weighted sum: I_j = Σ_i W_i,j−1 · X_i,j−1 + B_j • Transform the weighted sum: X_j = f(I_j) = 1 / (1 + e^−(I_j + T)) • Repeat for all the neurons
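The forward pass above (weighted sum plus bias, then a sigmoid transform) can be sketched for a single layer. Function and variable names here are illustrative, not from the slides:

```python
import math

def forward_layer(inputs, weights, biases):
    """Forward pass through one layer.

    For each neuron j: I_j = sum_i(weights[j][i] * inputs[i]) + biases[j],
    then X_j = sigmoid(I_j). The threshold T from the slide is folded
    into the bias here for simplicity.
    """
    outputs = []
    for j, row in enumerate(weights):
        i_j = sum(w * x for w, x in zip(row, inputs)) + biases[j]  # weighted sum I_j
        outputs.append(1.0 / (1.0 + math.exp(-i_j)))               # transform X_j = f(I_j)
    return outputs

# Tiny 2-input, 2-neuron layer with fixed (not random) weights for clarity.
y = forward_layer([1.0, 0.5], [[0.4, -0.2], [0.1, 0.3]], [0.0, 0.1])
```

Stacking calls to `forward_layer`, with each layer's outputs feeding the next, gives the full forward output flow.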
Back-Propagation Network: Backward Error Propagation • Errors are propagated backwards • Update the network weights • Gradient descent algorithm: ΔW_ji(n) = η · δ_j · X_i and W_ji(n+1) = W_ji(n) + ΔW_ji(n) • Add momentum for convergence: ΔW_ji(n) = η · δ_j · X_i + α · ΔW_ji(n−1) Where n = iteration number; η = learning rate; α = rate of momentum (0 to 1)
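The weight-update rule with momentum can be written directly from the equations above. This is a sketch of a single weight update (names and the sample values of η and α are ours):

```python
def update_weight(w, delta_j, x_i, prev_dw, eta=0.5, alpha=0.9):
    """One back-propagation weight update with momentum.

    dW_ji(n)   = eta * delta_j * x_i + alpha * dW_ji(n-1)
    W_ji(n+1)  = W_ji(n) + dW_ji(n)
    eta is the learning rate, alpha the momentum rate (0 to 1).
    """
    dw = eta * delta_j * x_i + alpha * prev_dw
    return w + dw, dw

# First iteration: no previous step, so the momentum term is zero.
w, dw = update_weight(0.2, delta_j=0.1, x_i=1.0, prev_dw=0.0)
# dw = 0.5 * 0.1 * 1.0 = 0.05, so w moves from 0.2 to 0.25
```

The momentum term α · ΔW_ji(n−1) keeps the update moving in its previous direction, which helps the iteration roll through shallow valleys of the error surface instead of stalling.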
Back-Propagation Network: Backward Error Propagation • Gradient descent minimizes the mean squared error • The error surface is complex, multidimensional, and roughly bowl-shaped, with hills and valleys • Training proceeds by iterations • Reaching the global minimum is challenging
Recurrent Neural Network (diagram: input units, context units, a bias unit, and computation nodes)
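The context unit in a recurrent network feeds the previous output back in alongside the current input. A minimal single-unit sketch, in the style of an Elman network (all names and weight values are illustrative assumptions):

```python
import math

def recurrent_step(x, context, w_in, w_ctx, bias):
    """One time step of a single recurrent unit.

    The context value is the unit's own output from the previous step,
    so the unit's response depends on the input sequence, not just
    the current input.
    """
    s = w_in * x + w_ctx * context + bias
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid, as in the feed-forward case

# Run a short input sequence; each output becomes the next context.
h = 0.0
for x in [1.0, 0.0, 1.0]:
    h = recurrent_step(x, h, w_in=0.8, w_ctx=0.5, bias=-0.2)
```

A time-delay network (next slide) achieves a related effect differently: instead of feedback, it feeds the unit a window of delayed copies of the input.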
Time Delay Neural Network (diagram: input units with time delays, a bias unit, a higher-order unit, and computation nodes)
Training - Supervised • Both Inputs & Outputs are Provided • Designer Can Manipulate • Number of Layers • Neurons per Layer • Connection Between Layers • The Summation & Transform Function • Initial Weights • Rules of Training • Back Propagation • Adaptive Feedback Algorithm
Training - Unsupervised • Only inputs are provided • The system has to figure out • Self-organization • Adaptation to input changes/patterns • Grouping of neurons into fields • Topological order • Based on the mammalian brain • Rules of Training • Adaptive feedback algorithm (Kohonen) Topology: mapping one space to another without changing the geometric configuration
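The Kohonen self-organizing idea above can be sketched as a single unsupervised update: find the neuron whose weights best match the input, then pull those weights toward it. This is a bare-bones illustration (names are ours; a real Kohonen map also updates the winner's neighbors to preserve topological order, which is omitted here for brevity):

```python
def kohonen_update(weights, x, eta=0.3):
    """One unsupervised Kohonen-style update.

    weights is a list of weight vectors, one per neuron; x is an input
    vector. Only inputs are provided - no target outputs. The
    best-matching unit (smallest squared distance to x) moves a
    fraction eta toward the input.
    """
    bmu = min(range(len(weights)),
              key=lambda j: sum((wj - xi) ** 2 for wj, xi in zip(weights[j], x)))
    weights[bmu] = [wj + eta * (xi - wj) for wj, xi in zip(weights[bmu], x)]
    return bmu

# Two neurons; the input [1.0, 0.0] is closest to the second.
w = [[0.1, 0.9], [0.9, 0.1]]
winner = kohonen_update(w, [1.0, 0.0])
# winner is neuron 1, whose weights move toward [1.0, 0.0]
```

Repeated over many inputs, neurons that win for similar inputs end up with similar weights, which is how the grouping of neurons into fields emerges without any supervision.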