Chapter 9: Perceptrons and Their Generalizations
• Rosenblatt’s perceptron
• Proofs of the theorem
• Method of stochastic approximation and sigmoid approximation of indicator functions
• Method of potential functions and radial basis functions
• Three theorems of optimization theory
• Neural networks
Method of stochastic approximation and sigmoid approximation of indicator functions
Basic framework for the learning process:
• Use the sigmoid approximation at the stage of estimating the coefficients.
• Use the indicator function at the stage of recognition (see the sketch below).
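The code below is a minimal sketch of my own (data, step size, and function names are hypothetical, not from the text): the coefficients are estimated by gradient descent on a sigmoid-smoothed objective, and recognition then switches back to the hard indicator (threshold) function.

```python
# Minimal sketch of the two-stage scheme: smooth sigmoid for estimation,
# hard indicator for recognition. All concrete values are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_sigmoid(X, y, lr=0.1, epochs=200):
    """Training stage: gradient descent on the sigmoid-smoothed loss."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)                 # smooth approximation of the indicator
        w += lr * X.T @ (y - p) / len(y)   # gradient step on the log-likelihood
    return w

def predict_indicator(X, w):
    """Recognition stage: the hard indicator (threshold) function."""
    return (X @ w > 0).astype(int)

# Toy usage with made-up data (first column of ones acts as the bias term).
X = np.array([[1., 0.], [1., 1.], [1., 2.], [1., 3.]])
y = np.array([0, 0, 1, 1])
w = fit_sigmoid(X, y)
print(predict_indicator(X, w))
```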
Potential functions vs. radial basis functions:
• The method of potential functions is on-line: each update uses only one element of the training data.
• RBF networks (mid-1980s) are trained off-line, over the whole training set.
A sketch contrasting the two appears below.
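This is a rough sketch of my own, assuming a Gaussian potential function and ±1 labels; the mistake-driven update rule and the ridge term are illustrative choices, not taken from the text. It contrasts an on-line potential-function update, which touches one training element per step, with an off-line RBF fit over the full sample.

```python
# Illustrative contrast: on-line potential-function updates vs. off-line RBF fit.
import numpy as np

def potential(x, z, gamma=1.0):
    """Gaussian potential / RBF kernel (illustrative choice)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def online_potential_method(X, y, passes=5):
    """On-line rule: each step uses a single training element (mistake-driven)."""
    alpha = np.zeros(len(X))
    for _ in range(passes):
        for i, (x, t) in enumerate(zip(X, y)):
            f = sum(a * yi * potential(x, xi) for a, yi, xi in zip(alpha, y, X))
            if t * f <= 0:        # update only when the current example is misclassified
                alpha[i] += 1.0
    return alpha

def offline_rbf_fit(X, y):
    """Off-line RBF: solve for all weights at once over the whole training set."""
    K = np.array([[potential(a, b) for b in X] for a in X])
    return np.linalg.solve(K + 1e-6 * np.eye(len(X)), y)

X = np.array([[0.], [1.], [2.], [3.]])
y = np.array([-1, -1, 1, 1])
print(online_potential_method(X, y))
print(offline_rbf_fit(X, y))
```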
Method of potential functions in asymptotic learning theory:
• Separable condition: deterministic setting of the pattern recognition (PR) problem.
• Non-separable condition: stochastic setting of the PR problem.
Three theorems of optimization theory (standard statements are summarized below):
• Fermat’s theorem (1629): optimization over the entire space, without constraints.
• Lagrange multipliers rule (1788): conditional (equality-constrained) optimization problem.
• Kuhn-Tucker theorem (1951): convex optimization.
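For reference, here are the standard optimality conditions associated with the three theorems, written in conventional notation (the formulation is mine, not copied from the slides):

```latex
% Standard optimality conditions for the three theorems (my formulation).
\begin{align*}
&\text{Fermat (no constraints):} && \nabla f(x^*) = 0 \\
&\text{Lagrange (equality constraints } g_i(x)=0\text{):} &&
  \nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0 \\
&\text{Kuhn--Tucker (convex problem, } h_j(x)\le 0\text{):} &&
  \nabla f(x^*) + \sum_{j=1}^{k} \mu_j \nabla h_j(x^*) = 0,\quad
  \mu_j \ge 0,\quad \mu_j h_j(x^*) = 0
\end{align*}
```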
To find the stationary points of a function of n variables, it is necessary to set its gradient to zero and solve the resulting system of n equations in n unknowns.
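As a small illustration (the function here is my own example, not from the text), finding the stationary points of f(x, y) = x² + xy + y² − 3x amounts to solving ∂f/∂x = 0 and ∂f/∂y = 0, a system of two equations in two unknowns:

```python
# Worked example: stationary points via the system grad f = 0.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + x*y + y**2 - 3*x
stationary = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y])
print(stationary)   # {x: 2, y: -1}
```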