Recursive Least-Squares (RLS) Adaptive Filters
Definition
• With the arrival of new data samples, estimates are updated recursively.
• Introduce a weighting factor into the sum-of-error-squares definition (two time indices, n: outer, i: inner):
  \mathcal{E}(n) = \sum_{i=1}^{n} \beta(n,i)\,|e(i)|^2
• Weighting factor: \beta(n,i) = \lambda^{n-i}, where the forgetting factor \lambda is real, positive, \lambda < 1 and typically close to 1.
• \lambda = 1 → ordinary LS; 1/(1-\lambda) is a measure of the memory of the algorithm (ordinary LS has infinite memory).
• w(n) is kept fixed during the observation interval 1 ≤ i ≤ n for which the cost function \mathcal{E}(n) is defined.
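To make the weighting concrete, here is a minimal NumPy sketch (not from the original slides) that evaluates this cost for a fixed w; real-valued signals and the array names U, d are my assumptions:

```python
import numpy as np

def weighted_cost(w, U, d, lam):
    """Exponentially weighted sum of squared errors E(n) = sum_i lam^(n-i) |e(i)|^2.

    U[i] is the tap-input vector u(i), d[i] the desired response d(i);
    w is held fixed over the whole interval 1 <= i <= n, as on the slide.
    """
    n = len(d)
    e = d - U @ w                           # e(i) = d(i) - w^T u(i)
    beta = lam ** np.arange(n - 1, -1, -1)  # beta(n, i) = lam^(n-i): oldest samples weighted least
    return np.sum(beta * e**2)
```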
Regularisation
• The LS cost function can be ill-posed: there may be insufficient information in the input data to reconstruct the input-output mapping uniquely, and there is uncertainty in the mapping due to measurement noise.
• To overcome the problem, take 'prior information' into account by adding a regularisation term, which smooths and stabilises the solution:
  \mathcal{E}(n) = \sum_{i=1}^{n} \lambda^{n-i}\,|e(i)|^2 + \delta\,\lambda^{n}\,\|w(n)\|^2
  where \delta is the regularisation parameter.
• Prewindowing is assumed! (not the covariance method)
Normal Equations
• From the method of least squares we know that \hat{w}(n) satisfies the normal equations \Phi(n)\,\hat{w}(n) = z(n). With the regularisation term, the time-average autocorrelation matrix of the input u(n) becomes
  \Phi(n) = \sum_{i=1}^{n} \lambda^{n-i}\,u(i)\,u^H(i) + \delta\,\lambda^{n} I
  The \delta\lambda^{n} I term makes the autocorrelation matrix always non-singular (\Phi^{-1}(n) always exists!).
• Similarly, the time-average cross-correlation vector between the tap inputs and the desired response (unaffected by regularisation) is
  z(n) = \sum_{i=1}^{n} \lambda^{n-i}\,u(i)\,d^*(i)
• Hence, the optimum (in the LS sense) filter coefficients satisfy \hat{w}(n) = \Phi^{-1}(n)\,z(n).
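For reference, a direct (non-recursive) solve of these normal equations, sketched in NumPy under the assumption of real-valued data (so u^H becomes u^T); this cubic-cost-per-step baseline is what the recursion on the next slides avoids:

```python
import numpy as np

def direct_ls(U, d, lam=0.99, delta=1e-2):
    """Solve Phi(n) w = z(n) directly at time n (O(M^3) per solve)."""
    n, M = U.shape
    beta = lam ** np.arange(n - 1, -1, -1)                   # lam^(n-i)
    Phi = (U * beta[:, None]).T @ U + delta * lam**n * np.eye(M)  # regularised autocorrelation
    z = (U * beta[:, None]).T @ d                            # cross-correlation vector
    return np.linalg.solve(Phi, z)                           # w_hat(n) = Phi^{-1}(n) z(n)
```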
Recursive Computation
• Isolate the last term (i = n) from the sum:
  \Phi(n) = \lambda\,\Phi(n-1) + u(n)\,u^H(n)
• Similarly,
  z(n) = \lambda\,z(n-1) + u(n)\,d^*(n)
• We need to calculate \Phi^{-1}(n) to find \hat{w}(n) → direct calculation can be costly!
• Use the Matrix Inversion Lemma (MIL); for a rank-one update it reads
  (A + u\,u^H)^{-1} = A^{-1} - \frac{A^{-1}\,u\,u^H\,A^{-1}}{1 + u^H\,A^{-1}\,u}
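A quick numerical check of this rank-one MIL form, purely illustrative (the matrix size, seed, and variable names are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
M, lam = 4, 0.99
A = rng.standard_normal((M, M))
Phi_prev = A @ A.T + np.eye(M)        # a symmetric positive-definite Phi(n-1)
u = rng.standard_normal((M, 1))

# Direct inverse of the rank-one update lam*Phi(n-1) + u u^T ...
direct = np.linalg.inv(lam * Phi_prev + u @ u.T)

# ... versus the MIL form, which uses only P = Phi(n-1)^{-1} and scalar division
P = np.linalg.inv(Phi_prev)
Pu = (P @ u) / lam                    # lam^{-1} P(n-1) u(n)
mil = P / lam - (Pu @ Pu.T) / (1.0 + (u.T @ Pu).item())

print(np.allclose(direct, mil))       # True: MIL avoids the O(M^3) inversion
```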
Recursive Least-Squares Algorithm
• Let P(n) = \Phi^{-1}(n) (the inverse correlation matrix).
• Then, using the MIL on \Phi(n) = \lambda\,\Phi(n-1) + u(n)\,u^H(n):
  P(n) = \lambda^{-1} P(n-1) - \frac{\lambda^{-2}\,P(n-1)\,u(n)\,u^H(n)\,P(n-1)}{1 + \lambda^{-1}\,u^H(n)\,P(n-1)\,u(n)}
• Now, letting (the gain vector)
  k(n) = \frac{\lambda^{-1}\,P(n-1)\,u(n)}{1 + \lambda^{-1}\,u^H(n)\,P(n-1)\,u(n)}
• we obtain the Riccati equation:
  P(n) = \lambda^{-1} P(n-1) - \lambda^{-1}\,k(n)\,u^H(n)\,P(n-1)
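These two updates translate almost line by line into code; a hedged sketch for the real-valued case (u^H becomes u^T), with the function name riccati_step being my own:

```python
import numpy as np

def riccati_step(P_prev, u, lam):
    """One update of the inverse correlation matrix P(n) = Phi^{-1}(n).

    Returns the gain vector k(n) and the new P(n) via the Riccati equation;
    only a scalar division is needed, no matrix inversion.
    """
    Pu = P_prev @ u / lam                             # lam^{-1} P(n-1) u(n)
    k = Pu / (1.0 + u @ Pu)                           # gain vector k(n)
    P = P_prev / lam - np.outer(k, u @ P_prev) / lam  # Riccati equation
    return k, P
```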
Recursive Least-Squares Algorithm
• Rearranging the definition of k(n) gives k(n)\,[1 + \lambda^{-1}\,u^H(n)\,P(n-1)\,u(n)] = \lambda^{-1}\,P(n-1)\,u(n), and substituting the Riccati equation shows
  k(n) = P(n)\,u(n)
• How can \hat{w}(n) be calculated recursively? Let
  \hat{w}(n) = \Phi^{-1}(n)\,z(n) = P(n)\,z(n) = \lambda\,P(n)\,z(n-1) + P(n)\,u(n)\,d^*(n)
• After substituting the recursion for P(n) into the first term we obtain
  \hat{w}(n) = \hat{w}(n-1) - k(n)\,u^H(n)\,\hat{w}(n-1) + P(n)\,u(n)\,d^*(n)
• But P(n)\,u(n) = k(n), hence
  \hat{w}(n) = \hat{w}(n-1) + k(n)\,[d^*(n) - u^H(n)\,\hat{w}(n-1)]
Recursive Least-Squares Algorithm
• The term \xi(n) = d(n) - \hat{w}^H(n-1)\,u(n) is called the a priori estimation error, whereas the term e(n) = d(n) - \hat{w}^H(n)\,u(n) is called the a posteriori estimation error.
• Summary; the update equations:
  k(n) = \frac{\lambda^{-1}\,P(n-1)\,u(n)}{1 + \lambda^{-1}\,u^H(n)\,P(n-1)\,u(n)}  (gain vector)
  \xi(n) = d(n) - \hat{w}^H(n-1)\,u(n)  (a priori error)
  \hat{w}(n) = \hat{w}(n-1) + k(n)\,\xi^*(n)
  P(n) = \lambda^{-1} P(n-1) - \lambda^{-1}\,k(n)\,u^H(n)\,P(n-1)
• \Phi^{-1} is calculated recursively and with only a scalar division.
• Initialisation (n = 0): P(0) = \delta^{-1} I, where \delta is the regularisation parameter.
• If no a priori information exists: \hat{w}(0) = 0.
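Putting the summary equations together, a minimal end-to-end RLS sketch in NumPy, assuming real-valued signals (so \xi^* = \xi and u^H = u^T); the variable names and the FIR-identification usage example are mine, not from the slides:

```python
import numpy as np

def rls(U, d, lam=0.99, delta=1e-2):
    """Exponentially weighted RLS. U[n] = tap-input vector u(n), d[n] = desired d(n)."""
    N, M = U.shape
    w = np.zeros(M)          # w(0) = 0: no a priori information
    P = np.eye(M) / delta    # P(0) = delta^{-1} I
    for n in range(N):
        u = U[n]
        Pu = P @ u / lam                            # lam^{-1} P(n-1) u(n)
        k = Pu / (1.0 + u @ Pu)                     # gain vector k(n)
        xi = d[n] - w @ u                           # a priori error xi(n)
        w = w + k * xi                              # w(n) = w(n-1) + k(n) xi(n)
        P = P / lam - np.outer(k, u @ P) / lam      # Riccati equation
    return w

# Usage: identify a 3-tap FIR channel from noisy input/output data
rng = np.random.default_rng(1)
h = np.array([0.5, -0.4, 0.2])
x = rng.standard_normal(500)
U = np.column_stack([np.roll(x, i) for i in range(3)])  # tap-delay lines
d = U @ h + 0.01 * rng.standard_normal(500)
print(rls(U, d))   # close to h
```

Note that each iteration costs O(M^2) and uses only a scalar division, as the summary slide states, versus O(M^3) for re-inverting \Phi(n) at every step.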