Newton’s method is a computational algorithm that can be used to find the roots of an equation. Let’s illustrate with an example.
Figure A.1.1.
In Figure A.1.1, the solid blue line represents the function \(f(x)\text{.}\) Assume that \(x^*\) is an unknown root of \(f(x)\) such that \(f(x^*)=0\text{.}\) We may be able to guess an approximate value of the root, \(x=x_0\text{,}\) but this guess is unlikely to be very precise. Examining Figure A.1.1, we can see that we can find a better approximation of the function’s root by following a straight line (the dashed red line) with the same slope as our function at the location of our initial guess down to where it crosses the \(x\)-axis, which gives
\begin{equation*}
x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}\text{.}
\end{equation*}
We’ll call this new approximated root \(x_1\text{.}\) It’s very possible that this new guess still lacks the precision we desire, so we can repeat the process, iterating as many times as needed using
\begin{equation*}
x_n = x_{n-1} - \frac{f(x_{n-1})}{f'(x_{n-1})}\text{,}
\end{equation*}
until \(\left|x_n-x_{n-1}\right|\) is less than some user-defined tolerance that defines the desired precision, since the change in our estimate of the root from one iteration to the next should shrink as the estimate approaches the true root. Note that this method requires knowledge of the functional form of \(f(x)\) as well as its analytic derivative \(f'(x)\) (or the ability to calculate it).
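As one way to put this into code, here is a minimal Python sketch of the iteration; the function name newton, the arguments tol and max_iter, the iteration cap, and the example at the bottom are our own illustrative choices, not anything prescribed above.
```python
def newton(f, fprime, x0, tol=1e-10, max_iter=100):
    """Find a root of f(x) using Newton's method.

    f        -- the function whose root we seek
    fprime   -- the analytic derivative f'(x)
    x0       -- initial guess for the root
    tol      -- stop when |x_n - x_(n-1)| < tol
    max_iter -- safety cap on the number of iterations
    """
    x_old = x0
    for _ in range(max_iter):
        x_new = x_old - f(x_old) / fprime(x_old)  # x_n = x_(n-1) - f/f'
        if abs(x_new - x_old) < tol:              # user-defined tolerance reached
            return x_new
        x_old = x_new
    raise RuntimeError("Newton's method did not converge; try a different initial guess.")

# Example: the positive root of f(x) = x**2 - 2 is sqrt(2).
print(newton(lambda x: x**2 - 2, lambda x: 2*x, x0=1.0))  # ~1.4142135623730951
```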
We can be a little more rigorous with the theory underlying Newton’s method. Let’s assume that we have a function \(f(x)\) that has a root \(x^*\) such that \(f(x^*)=0\text{.}\) We can expand our function as a Taylor series around some initial guess for the root, \(x_g\text{,}\) giving us
\begin{equation*}
0 = f(x^*) = f(x_g) + (x^* - x_g)\,f'(x_g) + \cdots\text{.}
\end{equation*}
Truncating the series at first order and solving for \(x^*\) gives
\begin{equation*}
x^*_\text{est} = x_g - \frac{f(x_g)}{f'(x_g)}\text{,}
\end{equation*}
where \(x^*_\text{est}\) is our estimate of the root based on one iteration of this method. Now, the closer \(x_g\) is to the real root \(x^*\text{,}\) the better our estimate \(x^*_\text{est}\) will be. So, once we have \(x^*_\text{est}\text{,}\) we can use it as our new guess for another iteration, giving us a result identical to what we found above,
\begin{equation*}
x_n = x_{n-1} - \frac{f(x_{n-1})}{f'(x_{n-1})}\text{,}
\end{equation*}
which we iterate until \(\left|x_n-x_{n-1}\right|\) is less than some user-defined tolerance that defines the desired precision.
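As a quick worked example of the iteration (our own choice of function, not one from the text), take \(f(x) = x^2 - 2\text{,}\) whose positive root is \(\sqrt{2}\approx 1.41421\text{,}\) with \(f'(x)=2x\) and initial guess \(x_0=1\text{:}\)
\begin{align*}
x_1 &= 1 - \frac{1^2-2}{2\cdot 1} = 1.5\text{,}\\
x_2 &= 1.5 - \frac{1.5^2-2}{2\cdot 1.5} \approx 1.41667\text{,}\\
x_3 &\approx 1.41422\text{.}
\end{align*}
After three iterations \(\left|x_3-x_2\right|\approx 2\times 10^{-3}\) and the estimate already agrees with \(\sqrt{2}\) to four decimal places.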
A word of caution: Newton’s method is not foolproof and can fail, for example when \(f'(x_n)\) is zero or very small at one of the iterates, or when the iterates cycle or run away from the root. Please take a look at other computational methods textbooks or webpages to investigate these failure modes. For the purposes of this text, try several different initial guesses if your first choice leads to a failure of the method.
Finding the roots of a system of nonlinear equations.
Newton’s method also generalizes to a system of \(N\) nonlinear equations in \(N\) unknowns, which we can write compactly as \(\vec{f}(\vec{x})=0\text{,}\) where \(\vec{x}=(x_1,\ldots,x_N)\) and \(\vec{f}=(f_1,\ldots,f_N)\text{.}\) If \(\vec{x}\) is a guess close to the root \(\vec{x}^*\text{,}\) a Taylor expansion about \(\vec{x}\) gives
\begin{equation*}
\vec{f}(\vec{x}^*) = \vec{f}(\vec{x}) + \nabla\vec{f}\,(\vec{x}^*-\vec{x}) + \cdots\text{,}
\end{equation*}
where \(\nabla\vec{f}\) is the \(N\times N\) matrix with elements \(\partial f_i /\partial x_j\text{.}\) Since \(\vec{x}^*\) is a root of the equations, we have \(\vec{f}(\vec{x}^*)=0\text{,}\) so that, to first order,
\begin{equation*}
\nabla\vec{f}\,\Delta\vec{x} = \vec{f}(\vec{x})\text{,}
\end{equation*}
where \(\Delta\vec{x}=\vec{x}-\vec{x}'\) now, with our new estimate \(\vec{x}'\) standing in for the unknown root \(\vec{x}^*\text{.}\) This is a set of ordinary linear simultaneous equations of the form \(\vec{A}\vec{x} = \vec{v}\text{,}\) which can be solved using the linalg.solve function in NumPy. Once we’ve solved for \(\Delta \vec{x}\text{,}\) our new estimate \(\vec{x}'\) of the root is
\begin{equation*}
\vec{x}' = \vec{x} - \Delta\vec{x}\text{,}
\end{equation*}
which we again iterate until the change from one estimate to the next is smaller than the desired tolerance.
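To make the linear-algebra step concrete, here is a minimal NumPy sketch for one possible two-equation system; the example functions, the hand-coded Jacobian, and every name below are our own illustrative assumptions rather than code from this text.
```python
import numpy as np

def f(x):
    """Example system: f1 = x^2 + y^2 - 1 and f2 = x - y."""
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def jacobian(x):
    """The N x N matrix of partial derivatives, J[i, j] = d f_i / d x_j."""
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0,        -1.0]])

x = np.array([1.0, 0.5])   # initial guess for the root
tol = 1e-10                # user-defined tolerance

while True:
    dx = np.linalg.solve(jacobian(x), f(x))  # solve (grad f) dx = f(x) for dx
    x = x - dx                               # new estimate x' = x - dx
    if np.max(np.abs(dx)) < tol:             # stop when the step is small enough
        break

print(x)  # approximately [0.70710678, 0.70710678], i.e. x = y = 1/sqrt(2)
```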