Widrow-Hoff Learning (LMS Algorithm)
ADALINE Network

The weight vector for neuron $i$ is the $i$th row of the weight matrix:

$${}_i\mathbf{w} = \begin{bmatrix} w_{i,1} & w_{i,2} & \cdots & w_{i,R} \end{bmatrix}^T$$
Mean Square Error

Training Set: $\{\mathbf{p}_1, t_1\}, \{\mathbf{p}_2, t_2\}, \ldots, \{\mathbf{p}_Q, t_Q\}$

Input: $\mathbf{p}_q$ — Target: $t_q$

Notation: $\mathbf{x} = \begin{bmatrix} {}_1\mathbf{w} \\ b \end{bmatrix}$, $\mathbf{z} = \begin{bmatrix} \mathbf{p} \\ 1 \end{bmatrix}$, so the network output is $a = {}_1\mathbf{w}^T\mathbf{p} + b = \mathbf{x}^T\mathbf{z}$

Mean Square Error: $F(\mathbf{x}) = E[e^2] = E[(t - a)^2] = E[(t - \mathbf{x}^T\mathbf{z})^2]$
Error Analysis

The mean square error for the ADALINE network is a quadratic function:

$$F(\mathbf{x}) = c - 2\mathbf{x}^T\mathbf{h} + \mathbf{x}^T\mathbf{R}\mathbf{x}$$

where $c = E[t^2]$, $\mathbf{h} = E[t\mathbf{z}]$ is the cross-correlation vector, and $\mathbf{R} = E[\mathbf{z}\mathbf{z}^T]$ is the input correlation matrix.
Stationary Point

Gradient: $\nabla F(\mathbf{x}) = -2\mathbf{h} + 2\mathbf{R}\mathbf{x}$

Hessian Matrix: $\nabla^2 F(\mathbf{x}) = 2\mathbf{R}$

The correlation matrix R must be at least positive semidefinite. If R has any zero eigenvalues, the performance index will either have a weak minimum or else no stationary point; otherwise there will be a unique global minimum x*. If R is positive definite:

$$\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}$$
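The stationary-point formula can be checked numerically. The matrix R and vector h below are illustrative made-up values, not statistics from the slides:

```python
import numpy as np

# Illustrative (made-up) correlation statistics: R = E[zz^T], h = E[tz]
R = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # symmetric positive definite
h = np.array([4.0, 5.0])

# Stationary point: gradient -2h + 2Rx = 0  =>  x* = R^{-1} h
x_star = np.linalg.solve(R, h)

# Hessian is 2R; positive eigenvalues confirm x* is the unique global minimum
hess_eigs = np.linalg.eigvalsh(2 * R)
print(x_star)     # [1. 2.]
print(hess_eigs)  # [2. 6.]
```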
Approximate Steepest Descent

Approximate mean square error (one sample): $\hat{F}(\mathbf{x}) = (t(k) - a(k))^2 = e^2(k)$

Approximate (stochastic) gradient: $\hat{\nabla}F(\mathbf{x}) = \nabla e^2(k)$, which yields the LMS update

$$\mathbf{x}(k+1) = \mathbf{x}(k) + 2\alpha e(k)\mathbf{z}(k)$$
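One LMS iteration for a single neuron can be sketched as follows, splitting x into the weight vector w and bias b; the function and variable names are mine, not the slides':

```python
import numpy as np

def lms_step(w, b, p, t, alpha):
    """One LMS iteration: approximate steepest descent on the
    single-sample squared error e^2(k)."""
    a = float(w @ p + b)            # ADALINE output (purely linear)
    e = t - a                       # error for this sample
    w_new = w + 2 * alpha * e * p   # w(k+1) = w(k) + 2 alpha e(k) p(k)
    b_new = b + 2 * alpha * e       # b(k+1) = b(k) + 2 alpha e(k)
    return w_new, b_new, e

# Illustrative single step from zero initial weights
w, b, e = lms_step(np.zeros(2), 0.0, np.array([1.0, -1.0]), 1.0, 0.25)
print(w, b, e)   # [ 0.5 -0.5] 0.5 1.0
```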
Multiple-Neuron Case

Matrix Form:

$$\mathbf{W}(k+1) = \mathbf{W}(k) + 2\alpha\,\mathbf{e}(k)\,\mathbf{p}^T(k), \qquad \mathbf{b}(k+1) = \mathbf{b}(k) + 2\alpha\,\mathbf{e}(k)$$
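A minimal sketch of one matrix-form update; the network sizes, input, and target are illustrative values:

```python
import numpy as np

alpha = 0.1
W = np.zeros((2, 3))                 # 2 neurons, 3 inputs (illustrative sizes)
b = np.zeros(2)
p = np.array([1.0, -1.0, 0.5])       # made-up input vector
t = np.array([1.0, -1.0])            # made-up target vector

e = t - (W @ p + b)                  # error vector e(k)
W = W + 2 * alpha * np.outer(e, p)   # W(k+1) = W(k) + 2 alpha e(k) p^T(k)
b = b + 2 * alpha * e                # b(k+1) = b(k) + 2 alpha e(k)
print(W)
print(b)
```

The outer product `np.outer(e, p)` is the rank-one matrix $\mathbf{e}(k)\mathbf{p}^T(k)$, so every neuron's row of W is updated by its own error times the shared input.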
Analysis of Convergence

Taking the expectation of the LMS update gives

$$E[\mathbf{x}(k+1)] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}(k)] + 2\alpha\mathbf{h}$$

For stability, the eigenvalues of the matrix $[\mathbf{I} - 2\alpha\mathbf{R}]$ must fall inside the unit circle.
Conditions for Stability

$$|1 - 2\alpha\lambda_i| < 1 \quad \text{(where } \lambda_i \text{ is an eigenvalue of } \mathbf{R}\text{)}$$

Since $\lambda_i > 0$, the condition $1 - 2\alpha\lambda_i < 1$ always holds, and $1 - 2\alpha\lambda_i > -1$ requires $\alpha < 1/\lambda_i$. Therefore the stability condition simplifies to

$$0 < \alpha < 1/\lambda_{\max}$$
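The bound on the learning rate can be verified numerically; R below is an illustrative correlation matrix, not one from the slides:

```python
import numpy as np

R = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # illustrative input correlation matrix
lam_max = np.linalg.eigvalsh(R).max()  # largest eigenvalue (here 3)
alpha_max = 1.0 / lam_max              # stability requires 0 < alpha < 1/lam_max

# Eigenvalues of [I - 2 alpha R] for a stable and an unstable learning rate
stable   = np.linalg.eigvalsh(np.eye(2) - 2 * (0.9 * alpha_max) * R)
unstable = np.linalg.eigvalsh(np.eye(2) - 2 * (1.1 * alpha_max) * R)
print(np.abs(stable).max() < 1)    # True  (inside the unit circle)
print(np.abs(unstable).max() < 1)  # False (outside the unit circle)
```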
Steady State Response

If the system is stable, then a steady state condition will be reached:

$$E[\mathbf{x}_{ss}] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}_{ss}] + 2\alpha\mathbf{h}$$

The solution to this equation is

$$E[\mathbf{x}_{ss}] = \mathbf{R}^{-1}\mathbf{h} = \mathbf{x}^*$$

This is also the strong minimum of the performance index.
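A small simulation suggests how the LMS iterates drift toward $\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}$. Here the inputs are unit-variance white noise, so R = I and x* equals the weights that generated the targets; all data below is synthetic and illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0])   # hypothetical weights that generate the targets

x = np.zeros(2)
alpha = 0.01
for _ in range(5000):
    z = rng.normal(size=2)       # white-noise input: R = E[zz^T] = I
    t = x_true @ z               # noiseless target, so h = R x_true
    e = t - x @ z                # single-sample error
    x = x + 2 * alpha * e * z    # LMS update

# Steady state: x approaches R^{-1} h = x_true
print(np.round(x, 3))
```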
Example

An ADALINE is trained with the LMS rule on two prototype patterns, a banana and an apple.

Iteration One: the banana is presented and the weights are updated.

Iteration Two: the apple is presented and the weights are updated again.
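The two iterations can be sketched in code. The pattern vectors, targets, and learning rate below are assumed values for illustration; the actual numbers from the slide figures are not reproduced here:

```python
import numpy as np

# Assumed prototype patterns and targets (illustrative, not from the slides)
p_banana, t_banana = np.array([-1.0, 1.0, -1.0]), -1.0
p_apple,  t_apple  = np.array([ 1.0, 1.0, -1.0]),  1.0

w = np.zeros(3)      # start from zero weights (bias omitted for simplicity)
alpha = 0.2          # assumed learning rate

# Iteration one: present the banana
e1 = t_banana - w @ p_banana
w = w + 2 * alpha * e1 * p_banana

# Iteration two: present the apple
e2 = t_apple - w @ p_apple
w = w + 2 * alpha * e2 * p_apple
print(w)
```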
Adaptive Filtering

Tapped Delay Line: the input signal enters a chain of unit delays, producing $y(k), y(k-1), \ldots, y(k-R+1)$.

Adaptive Filter: an ADALINE whose input is the tapped delay line, so the output is

$$a(k) = \sum_{i=1}^{R} w_{1,i}\, y(k - i + 1) + b$$
Signals

The noise that contaminates the signal is

$$m(k) = 1.2\sin\!\left(\frac{2\pi k}{3} - \frac{3\pi}{4}\right)$$

and its lag-one autocorrelation is

$$E[m(k)\,m(k-1)] = (1.2)^2 (0.5)\cos\!\left(\frac{2\pi}{3}\right) = -0.36$$
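Both the mean square value and the lag-one autocorrelation can be checked by averaging over one full period (k = 0, 1, 2), since the sinusoid has period 3:

```python
import numpy as np

# m(k) = 1.2 sin(2*pi*k/3 - 3*pi/4), sampled over one full period
k = np.arange(3)
m = 1.2 * np.sin(2 * np.pi * k / 3 - 3 * np.pi / 4)

mean_sq  = np.mean(m * m)              # E[m^2(k)]    = (1.2)^2 (0.5)            =  0.72
autocorr = np.mean(m * np.roll(m, 1))  # E[m(k)m(k-1)] = (1.2)^2 (0.5) cos(2pi/3) = -0.36
print(round(mean_sq, 2), round(autocorr, 2))  # 0.72 -0.36
```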
Stationary Point

$$\mathbf{h} = E[t\mathbf{z}] = \begin{bmatrix} E[(s(k) + m(k))\,v(k)] \\ E[(s(k) + m(k))\,v(k-1)] \end{bmatrix}$$

Since the signal $s(k)$ is uncorrelated with the noise source $v(k)$, the cross terms $E[s(k)v(k)]$ and $E[s(k)v(k-1)]$ are both 0, leaving only the noise correlations.
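An end-to-end sketch of the adaptive noise canceller. The forms of v(k) and s(k) below are assumptions chosen to be consistent with the m(k) given above, not values taken from the slides. The two-tap ADALINE learns to predict the noise m(k) from v(k) and v(k-1), so its error e(k) approximates the signal s(k):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.02
w = np.zeros(2)   # two-tap filter on v(k), v(k-1); bias omitted (zero-mean signals)

errs = []
for k in range(2000):
    v  = 1.2 * np.sin(2 * np.pi * k / 3)                  # noise source (assumed form)
    v1 = 1.2 * np.sin(2 * np.pi * (k - 1) / 3)            # delayed tap
    m  = 1.2 * np.sin(2 * np.pi * k / 3 - 3 * np.pi / 4)  # noise reaching the signal
    s  = rng.uniform(-0.2, 0.2)        # signal, uncorrelated with v (assumed)
    z  = np.array([v, v1])
    t  = s + m                         # contaminated signal is the target
    a  = w @ z                         # filter output: estimate of m(k)
    e  = t - a                         # error: restored estimate of s(k)
    w  = w + 2 * alpha * e * z         # LMS update
    errs.append(e - s)                 # residual noise after cancellation

# After adaptation the residual noise should be small
print(np.abs(np.array(errs[-100:])).max())
```

Because m(k) and v(k) are sinusoids of the same frequency, a linear combination of v(k) and v(k-1) can match the phase shift exactly, so the filter can cancel the noise almost completely.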