Artificial Neural Network: Motivation
By Dr. Rezaeian; modified by Vali Derhami, Yazd University, Computer Department
vderhami@yazduni.ac.ir
HomePage: http://ce.yazduni.ac.ir/derhami
We have a highly interconnected set of some 10^11 neurons that facilitate our activities such as breathing, reading, thinking, and motion. • Each biological neuron is a rich assembly of tissue and chemistry, yet a slow processor with limited computing power. • Some of our neural structure was present at birth; the rest has been established by experience. • It is generally understood that all biological neural functions, including memory, are stored in the connections between neurons. Learning is viewed as the establishment of new connections between neurons or the modification of existing connections. • Artificial neural networks are an attempt to model the information processing capabilities of nervous systems.
Biological neurons: Structure and function of a single neuron + Each neuron has a cell body (soma), an axon, and many dendrites. + A neuron can be in one of two states: firing or at rest. It fires when the total incoming stimulus exceeds its threshold.
+ Synapse: the thin gap between the axon of one neuron and a dendrite of another. The number of synapses received by each neuron ranges from 100 to 100,000. + Synaptic strength/efficiency: the magnitude of the signal a neuron receives (from another) depends on the efficiency of the synaptic transmission. + Two types of synapses: excitatory and inhibitory. + A neuron will fire, i.e., send an output impulse of about 100 mV down its axon, if sufficient signals from other neurons fall upon its dendrites in a short period of time.
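The threshold behavior described above can be sketched as a simple McCulloch-Pitts-style artificial neuron: the unit "fires" (outputs 1) only when the weighted sum of its inputs exceeds a threshold. Weights play the role of synaptic strengths, with positive weights acting as excitatory synapses and negative weights as inhibitory ones. The function name, weight values, and threshold below are all illustrative, not taken from the slides.

```python
def threshold_neuron(inputs, weights, threshold):
    """Return 1 (fire) if the total incoming stimulus exceeds the threshold, else 0 (rest)."""
    # Total stimulus is the weighted sum of inputs, like signals arriving at dendrites.
    total_stimulus = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total_stimulus > threshold else 0

# Two excitatory inputs (+0.6 each) and one inhibitory input (-1.0), threshold 1.0:
weights = [0.6, 0.6, -1.0]

print(threshold_neuron([1, 1, 0], weights, 1.0))  # both excitatory inputs active -> fires (1)
print(threshold_neuron([1, 0, 0], weights, 1.0))  # one input alone stays below threshold -> rest (0)
print(threshold_neuron([1, 1, 1], weights, 1.0))  # inhibitory input suppresses firing -> rest (0)
```

Note how the inhibitory input alone is enough to keep the neuron at rest even when both excitatory inputs are active, mirroring the excitatory/inhibitory synapse distinction above.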
History of ANN • 1943: McCulloch & Pitts designed the first neural network, showing that combining many simple processing units could lead to an overall increase in computational power. • 1949: Hebb developed the first learning rule (Hebbian learning: strengthen a connection when the neurons it joins are active together). • 1969: After Minsky and Papert's proof that the perceptron could not learn certain types of functions, research into neural networks went into decline throughout the 1970s. • Parker (1985) and LeCun (1986) independently discovered a learning algorithm for multi-layer networks, called backpropagation, that could solve problems that were not linearly separable.
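The limitation Minsky and Papert identified can be seen concretely with the classic perceptron learning rule: a single-layer perceptron converges on a linearly separable problem such as AND, but no choice of weights lets it represent XOR. The learning rate and epoch count below are illustrative, assuming the standard error-correction update rule rather than anything specific to the slides.

```python
def train_perceptron(samples, lr=0.1, epochs=100):
    """Train a single threshold unit with the perceptron rule; returns (weights, bias)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out  # error-correction update: adjust only when wrong
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print([predict(w, b, x) for x, _ in AND])  # matches the AND targets: [0, 0, 0, 1]

w, b = train_perceptron(XOR)
print([predict(w, b, x) for x, _ in XOR])  # cannot match XOR's targets, however long it trains
```

XOR is not linearly separable, which is exactly the gap that the multi-layer networks trained with backpropagation later closed.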
Artificial Neural Network specifications • Non-linearity • Input-output mapping (learning ability) • Adaptability • Generalization • Evidential response (decisions with a measure of confidence) • Fault tolerance • VLSI implementability
Some of the applications • Aerospace: aircraft autopilots, flight path simulation • Banking & finance: credit application evaluation, currency price prediction • Defense: weapon steering, target tracking • Medical: EEG and ECG analysis, prosthesis design • Oil & gas: exploration • Robotics: vision systems • Speech: speech recognition, speech compression, text-to-speech synthesis • Telecommunications: image and data compression, real-time translation of spoken language