Newton's Method is an iterative method for finding a [[Roots|root]] of a [[Continuous Function]] $f\in \mathcal C[a,b]$.

$\huge \begin{align} x_{n+1} = x_{n} - \frac{f(x_{n})}{f'(x_{n})} \end{align}$

This method works by starting at a point, finding the tangent line to the function at that point, and then choosing the next point to be the $x$ value where that tangent line equals $0$.

### Asymptotic Behavior for Smooth Functions

Let $f(x)$ be a [[Smooth Function]] with $f(0)=0$. This is the same as any smooth function $g(x)$ with a [[Roots|root]] at $p$, shifted so that the root sits at $0$; therefore we can work with $f$ without [[Loss of Generality]].

$\huge f(x) = g(x+p) $

#### Behavior when the first and second derivatives are nonzero

Writing $f_0' = f'(0)$ and $f_0'' = f''(0)$, the [[Maclaurin Series]] of $f$ is

$\huge \begin{align} f(x) &= f'(0)x + \frac{1}{2}f''(0)x^{2} + \dots\\ &= f'(0)x + \frac{1}{2}f''(0)x^{2} + \mathcal O(x^{3})\\ \end{align} $

Apply [[Newton's Method]] at $x=x_n$.

$\huge x_{n+1} = x_{n} - \frac{f(x_{n})}{f'(x_{n})} $

$\huge \begin{align} f(x_{n}) &= f_{0}'x_{n} + \frac{1}{2}f_{0}''x_{n}^{2} + \mathcal O(x_{n}^{3}) \\ f'(x_{n}) &= f'_{0} + f''_{0}x_{n} + \mathcal O(x_{n}^{2})\\ \end{align}$

We substitute the [[Maclaurin Series]] for $f$ and $f'$, rewriting the leading $x_n$ over the common denominator:

$\large \begin{align} x_{n+1} &= x_{n} - \frac{ f_{0}'x_{n} + \frac{1}{2}f_{0}''x_{n}^{2} + \mathcal O(x_{n}^{3}) }{ f'_{0} + f''_{0}x_{n} + \mathcal O(x_{n}^{2}) } \\ x_{n+1} &= \frac{ f_{0}'x_{n} + f_{0}''x_{n}^{2} + \mathcal O(x_{n}^{3}) }{ f'_{0} + f''_{0}x_{n} + \mathcal O(x_{n}^{2}) } - \frac{ f_{0}'x_{n} + \frac{1}{2}f_{0}''x_{n}^{2} + \mathcal O(x_{n}^{3}) }{ f'_{0} + f''_{0}x_{n} + \mathcal O(x_{n}^{2}) } \\ &= \frac{ \frac{1}{2}f_{0}''x_{n}^{2} + \mathcal O(x_{n}^{3}) }{ f_{0}' + f_{0}''x_{n}+ \mathcal O(x_{n}^{2}) } \end{align} $

Assume $f_{0}''\neq 0$; if it were $0$, the leading behavior would instead come from the $\mathcal O(x_{n}^{3})$ remainder. As $n\to \infty$, $x_{n}\to 0$.
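The update rule at the top of these notes can be sketched as a short routine (a minimal sketch; the test function, tolerance, and iteration cap are illustrative choices, not from the notes):

```python
# Minimal sketch of Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n).
# The stopping tolerance and iteration cap are illustrative choices.
def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:  # stop once f(x) is effectively zero
            break
        x = x - fx / df(x)
    return x

# Example: the positive root of f(x) = x^2 - 2 is sqrt(2).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)
```

Starting from $x_0 = 1$, the iterates settle on $\sqrt{2}$ after only a handful of steps, consistent with the quadratic convergence derived below.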
$\huge \begin{align} x_{n+1} &= \frac{ \frac{1}{2}f_{0}'' x_{n}^{2} }{ f'_{0} + f_{0}''x_{n} } + \mathcal O(x_{n}^{3}) \end{align}$

If $f_0' \neq 0$,

$\huge x_{n+1} = \frac{f_{0}''}{2f_{0}'} x_{n}^{2} + \mathcal O(x_{n}^{3}) $

This means that (for [[Smooth Function|smooth functions]] with a simple root), [[Newton's Method]] has an [[Order of Convergence]] of $2$.

$\huge \alpha = 2$

We know that the target root is $0$, so we can predict the [[Numerical Analysis Error Bounds|Error Bound]] with the following [[Recurrence Relation]]:

$\huge \epsilon_{n+1} \approx \left| \frac{f_{0}''}{2f_{0}'} \right| \epsilon_{n}^{2} $

## Derivation

### Cubic [[Polynomial]]

Let $f(x)=ax^{3}+bx^{2}+cx+d$.

$\huge \begin{align} f(x)&=ax^{3}+bx^{2}+cx+d \\ f'(x) &= 3ax^{2} + 2bx + c \\ \mathcal N (z) &=z - \frac{f(z)}{f'(z)} \\ &= z - \frac{ az^{3}+bz^{2}+cz+d }{ 3az^{2} + 2bz + c } \\ &= \frac{ \pa{3az^{2} + 2bz + c}z - \pa{ az^{3}+bz^{2}+cz+d } }{ 3az^{2} + 2bz + c } \\ &= \frac{3az^{3}+2bz^{2}+cz - az^{3}-bz^{2}-cz-d}{ 3az^{2} + 2bz + c } \\ \mathcal N(z) &= \frac{ 2az^{3} + bz^{2} -d }{ 3az^{2} + 2bz + c } \end{align} $

In terms of the roots $r_1, r_2, r_3$ (taking $a=1$):

$ \large \begin{align} f(z) &= (z-r_{1})(z-r_{2})(z-r_{3}) \\ &=\pa{z-r_{3}}(z^{2}-(r_{1}+r_{2})z+r_{1}r_{2}) \\ &= z^{3} - (r_{1}+r_{2})z^{2} + r_{1}r_{2}z - r_{3}z^{2} + r_{3}(r_{1}+r_{2})z - r_{1}r_{2}r_{3} \\ &= z^{3} - (r_{1}+r_{2}+r_{3})z^{2} + (r_{1}r_{2}+r_{3}(r_{1}+r_{2}))z - r_{1}r_{2}r_{3} \\ f'(z) &= 3z^{2} - 2(r_{1}+r_{2}+r_{3})z + r_{1}r_{2}+r_{3}(r_{1}+r_{2}) \end{align} $
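The simplified map $\mathcal N(z) = \frac{2az^{3}+bz^{2}-d}{3az^{2}+2bz+c}$ can be checked numerically against the direct form $z - f(z)/f'(z)$ (a sketch; the cubic $(z-1)(z-2)(z-3)$ and the sample points are arbitrary choices):

```python
# Check the simplified cubic Newton map N(z) = (2az^3 + bz^2 - d) / (3az^2 + 2bz + c)
# against the direct form z - f(z)/f'(z). Coefficients chosen so f(z) = (z-1)(z-2)(z-3).
a, b, c, d = 1.0, -6.0, 11.0, -6.0

def f(z):
    return a * z**3 + b * z**2 + c * z + d

def df(z):
    return 3 * a * z**2 + 2 * b * z + c

def N_direct(z):
    return z - f(z) / df(z)

def N_simplified(z):
    return (2 * a * z**3 + b * z**2 - d) / (3 * a * z**2 + 2 * b * z + c)

# Sample points chosen to avoid the critical points of f, where f'(z) = 0.
for z in (0.5, 2.5, 10.0):
    assert abs(N_direct(z) - N_simplified(z)) < 1e-9
print("simplified map matches direct form")
```

The two forms are algebraically identical, so they agree at every point where $f'(z) \neq 0$; the simplified form just saves one multiplication per iteration.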