7.1 ONE-DIMENSIONAL GRADIENT SEARCH METHOD
In general, adaptive algorithms are iterative search algorithms derived by minimizing a cost function in which the true statistics are replaced by their estimates. To study adaptive algorithms, it is therefore necessary to have a thorough understanding of iterative algorithms and their convergence properties. In this chapter, we discuss the steepest descent method and Newton's method.
The one-coefficient mean-square error (MSE) surface (a curve in this case) is given by [see (6.15)]

J(w) = sigma_d^2 - 2 w r_dx + w^2 r_x    (7.1)

and it is shown pictorially in Figure 7.1. The first and second derivatives are

dJ(w)/dw = -2 r_dx + 2 w r_x,    d^2 J(w)/dw^2 = 2 r_x > 0,

so the surface is a convex parabola with its minimum at w_opt = r_dx / r_x, where the first derivative vanishes.
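The one-dimensional gradient (steepest descent) search described above can be sketched numerically. The snippet below is a minimal illustration, not the text's own program; the statistics `sigma_d2`, `r_dx`, and `r_x`, the step size `mu`, and the variable names are illustrative assumptions, and the quadratic MSE curve follows the form of (7.1).

```python
# Illustrative statistics (assumed values, not from the text):
sigma_d2, r_dx, r_x = 2.0, 0.8, 1.0

def J(w):
    # One-coefficient MSE curve, as in (7.1): J(w) = sigma_d^2 - 2 w r_dx + w^2 r_x
    return sigma_d2 - 2 * w * r_dx + w ** 2 * r_x

def dJ(w):
    # First derivative: dJ/dw = -2 r_dx + 2 w r_x
    return -2 * r_dx + 2 * w * r_x

mu = 0.1    # step size; for this quadratic, convergence requires 0 < mu < 1/r_x
w = 0.0     # initial guess
for _ in range(100):
    # Steepest-descent update: step opposite the gradient
    w = w - mu * dJ(w)

w_opt = r_dx / r_x   # analytic minimizer, where dJ/dw = 0
print(w, w_opt)      # the iterate converges toward w_opt = 0.8
```

Each iteration multiplies the error w - w_opt by the factor (1 - 2*mu*r_x), so for step sizes in the stated range the error shrinks geometrically toward zero.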