7 Newton's and Steepest Descent Methods

7.1    ONE-DIMENSIONAL GRADIENT SEARCH METHOD

In general, adaptive algorithms are iterative search algorithms derived by minimizing a cost function in which the true statistics are replaced by their estimates. To study adaptive algorithms, it is therefore necessary to have a thorough understanding of iterative search algorithms and their convergence properties. In this chapter we discuss the steepest-descent method and Newton's method.
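As a concrete illustration of replacing true statistics with estimates, consider the autocorrelation $r_{xx}(0) = E[x^2(n)]$ that appears in (7.1) below. The following is a minimal Python sketch, in which the Gaussian signal model and the sample size are illustrative assumptions rather than anything from the text:

```python
import numpy as np

# Hypothetical illustration: the true statistic r_xx(0) = E[x^2(n)]
# is usually unknown, so an adaptive algorithm works with a
# time-average estimate computed from the observed data.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)  # assumed zero-mean signal; true r_xx(0) = 1

rxx0_hat = np.mean(x**2)             # sample estimate of r_xx(0)
print(f"estimated r_xx(0) = {rxx0_hat:.3f}")  # close to the true value 1.0
```

As the number of samples grows, the estimate approaches the true statistic, and the iterative search behaves increasingly like the ideal one studied in this chapter.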

The one-coefficient mean-square error (MSE) surface (a curve in this case) is given by [see (6.15)]

$$J(w) = J_{\min} + r_{xx}(0)\,(w - w_o)^2; \qquad J(w_o) \le J(w) \ \text{for all } w \tag{7.1}$$

and it is shown pictorially in Figure 7.1. The first and second derivatives are

$$\frac{\partial J(w)}{\partial w} = 2\,r_{xx}(0)\,(w - w_o), \qquad \frac{\partial^2 J(w)}{\partial w^2} = 2\,r_{xx}(0)$$
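With the gradient in hand, the one-dimensional gradient search can be carried out numerically. The sketch below applies the common steepest-descent update $w(n+1) = w(n) - \mu\,\partial J/\partial w$ to the surface of (7.1); the values of $r_{xx}(0)$, $w_o$, $J_{\min}$, the step size $\mu$, and the starting point are illustrative assumptions, and the exact update convention may differ from the one the book adopts later.

```python
# Minimal sketch: one-dimensional steepest descent on the quadratic
# MSE surface of (7.1). All numeric values are illustrative.
rxx0 = 1.0   # assumed autocorrelation at lag 0, r_xx(0)
wo   = 0.5   # assumed optimum coefficient w_o
Jmin = 0.0   # assumed minimum MSE

def J(w):
    """MSE surface, Eq. (7.1)."""
    return Jmin + rxx0 * (w - wo) ** 2

def dJ(w):
    """First derivative: dJ/dw = 2 r_xx(0) (w - w_o)."""
    return 2.0 * rxx0 * (w - wo)

mu = 0.1     # step size; this quadratic converges for 0 < mu < 1/r_xx(0)
w  = 2.0     # arbitrary starting point

for n in range(25):
    w -= mu * dJ(w)          # steepest-descent update

print(f"w = {w:.4f}, J(w) = {J(w):.6f}")   # w -> wo, J(w) -> Jmin
```

Each iteration shrinks the error $w(n) - w_o$ by the factor $1 - 2\mu\, r_{xx}(0)$, so convergence is geometric whenever $|1 - 2\mu\, r_{xx}(0)| < 1$.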
