
7.2.2 Newton's Method and the Secant Method

The bisection method is a very intuitive method for finding a root, but there are other methods that are more efficient (they find the root in fewer iterations). Perhaps the best-known root-finding algorithm is Newton's method (a.k.a. the Newton-Raphson method). Recall that the Taylor's series expansion for a function is

\begin{displaymath}f(x)=f(x_0)+f'(x_0)(x-x_0)+\cdots\end{displaymath}

In words, the first two terms of the series say that the function can be approximated, at one point, by taking its value ($f(x_0)$) at a nearby point ($x_0$) and using the slope ($f'(x_0)$) at that point to extrapolate to the point where we want to approximate the function. Figure 7.3 illustrates the idea.
Figure 7.3: Illustration of the Taylor's series expansion approximation for a function $f(x)$. The blue line is the actual function $f(x)$ and the red line is the approximation to $f(x)$ given by the first two terms of the Taylor's series. Close to $x_0$ the error in the approximation is small but as $\vert x-x_0\vert$ increases, the error becomes significant.
\includegraphics[width=0.5\linewidth]{newton.eps}
In the figure, the blue line is the actual function $f(x)$ and the red line is the approximation to $f(x)$ given by the first two terms of the Taylor's series. In Newton's method, the root of this linear approximation to $f(x)$ is used as a guess for the actual root (see figure). A new Taylor's series is then computed around this new guess, which yields another guess that is (usually) even closer to the actual root. Two steps of the process are shown in Fig. 7.4.
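The iteration formula follows directly from the two-term Taylor's series above: setting the approximation equal to zero and solving for $x$ gives the point where the tangent line crosses the axis, which becomes the next guess:

\begin{displaymath}0=f(x_0)+f'(x_0)(x-x_0)\quad\Rightarrow\quad x=x_0-\frac{f(x_0)}{f'(x_0)}\end{displaymath}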
Figure 7.4: Illustration of two steps of Newton's method for root finding. The starting point for the method is $x_0$. The red line is the first approximation to $f(x)$, which yields the first guess at the root, $x_1$. The function is then re-approximated by a Taylor's expansion about the point $x_1$, illustrated by the green line. The root of the second approximation is $x_2$, already close to the actual root of $f(x)$.
\includegraphics[width=0.5\linewidth]{newt2.eps}
The algorithm of Newton's root-finding method can be summarized as follows:
  1. Specify the starting guess $x_0$ and the desired accuracy $\epsilon$.
  2. Compute the next guess from the current guess using $x_{i+1}=x_i-f(x_i)/f'(x_i)$.
  3. Repeat step 2 until the desired accuracy is met, that is, until $\vert f(x_{i+1})\vert<\epsilon$ or $\vert x_{i+1}-x_i\vert<\epsilon$; either criterion can be used, depending on the situation.
Depending on the particular form of $f(x)$, Newton's method may not converge if the initial guess is too far from the actual root. So it is a good idea to limit the number of iterations allowed by your program, to prevent the algorithm from iterating forever without success. Implementation of Newton's method is left as an assignment for the student.
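As a rough illustration (not the assigned implementation), the three steps above might be sketched as follows. The test function $f(x)=x^2-2$, its derivative, the starting guess, and the iteration cap are illustrative choices, not taken from the text:

```python
def newton(f, fprime, x0, eps=1e-10, max_iter=50):
    """Return an approximate root of f, or raise if convergence fails."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)   # step 2: x_{i+1} = x_i - f(x_i)/f'(x_i)
        if abs(x_new - x) < eps:       # step 3: stop when |x_{i+1} - x_i| < eps
            return x_new
        x = x_new
    raise RuntimeError("Newton's method did not converge")

# Example: find the positive root of f(x) = x^2 - 2, i.e. sqrt(2)
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)
```

Note the iteration cap guards against the non-convergence discussed above: if the loop exhausts `max_iter` steps, the function raises instead of looping forever.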
Figure 7.5: Illustration of a secant and tangent to $f(x)$. Obviously, as $\Delta x$ becomes smaller and smaller, the secant and tangent will become more and more similar. In the limit $\Delta x\rightarrow 0$, they will become identical.
\includegraphics[width=0.5\linewidth]{secant.eps}
One of the problems with Newton's method is that the derivative of $f(x)$ must be known. In many cases, the derivative is not known or may be too time-consuming to compute. In these cases, the secant method offers a viable alternative. The secant method is really the same method as Newton's method, except that the exact derivative is replaced with an approximate, ``numerical'' derivative. The derivative $f'(x)=\frac{\mathrm{d} f}{\mathrm{d} x}$ is defined as $\lim_{\Delta x\rightarrow0}\frac{f(x+\Delta x)-f(x)}{\Delta x}$. If, instead of taking the limit, a finite $\Delta x$ is used, then this expression, rather than giving a tangent to $f(x)$, gives a line (a secant) that passes through $f(x)$ at $x+\Delta x$ and at $x$, as shown in Fig. 7.5. Obviously, as $\Delta x$ becomes smaller, the secant and tangent approach the same slope. In the limit $\Delta x\rightarrow 0$, they become identical. In the secant method, $f'(x)$ is approximated by a ``finite difference'' derivative (i.e., $\Delta x>0$); other than this, it is identical to Newton's method. By contrast, Newton's method could be called a ``tangent'' method. The secant method algorithm for a computer program is summarized in the following.
  1. Specify two starting guesses $x_0$ and $x_1$, and the desired accuracy $\epsilon$.
  2. Compute the next guess from the current guess using $x_{i+1}=x_i-f(x_i)\times\frac{(x_i-x_{i-1})}{f(x_i)-f(x_{i-1})}$. Note that $\frac{(x_i-x_{i-1})}{f(x_i)-f(x_{i-1})}=\frac{1}{m}$, where $m$ is just the slope of the line passing through the points $(x_{i-1},f(x_{i-1}))$ and $(x_i,f(x_i))$.
  3. Repeat step 2 until the desired accuracy is met, that is, until $\vert f(x_{i+1})\vert<\epsilon$ or $\vert x_{i+1}-x_i\vert<\epsilon$; either criterion can be used, depending on the situation.
Actual implementation of the method in a program will be left as an assignment.
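Again as a rough sketch only (not the assigned implementation), the secant algorithm above differs from Newton's method just in how the slope is obtained. The test function and starting guesses here are illustrative choices:

```python
def secant(f, x0, x1, eps=1e-10, max_iter=50):
    """Return an approximate root of f, or raise if convergence fails."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        # step 2: replace f'(x) by the finite-difference slope of the secant line
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < eps:         # step 3: stop when |x_{i+1} - x_i| < eps
            return x2
        x0, x1 = x1, x2                # shift the two guesses forward
    raise RuntimeError("secant method did not converge")

# Example: find the positive root of f(x) = x^2 - 2, i.e. sqrt(2)
root = secant(lambda x: x * x - 2.0, 1.0, 2.0)
print(root)
```

Notice that no derivative function is passed in: the two starting guesses $x_0$ and $x_1$ supply the first secant slope, which is why this method needs two initial points where Newton's method needs one.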
Gus Hart 2005-01-28