# 6. Exercises on Newton’s Method

## Exercise 1

a) Show that Newton’s method applied to solving the equation \(f(x) = 0\)

leads to fixed point iteration with the function

\[
g(x) = x - \frac{f(x)}{f'(x)}.
\]

b) Then verify mathematically that the iteration \(x_{k+1} = g(x_k)\) has super-linear convergence.
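
One way to approach part (b), assuming \(r\) is a simple root (so \(f'(r) \neq 0\)) and \(f\) is twice differentiable: compute \(g'\) and show that it vanishes at the fixed point.

```latex
g(x) = x - \frac{f(x)}{f'(x)}
\quad\Longrightarrow\quad
g'(x) = 1 - \frac{f'(x)^2 - f(x)\,f''(x)}{f'(x)^2}
      = \frac{f(x)\,f''(x)}{f'(x)^2}.
```

Since \(f(r) = 0\), this gives \(g'(r) = 0\), and a fixed point iteration whose derivative vanishes at the fixed point converges super-linearly.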

## Exercise 2

a) Create a Python function for Newton’s method, with usage

```
(root, errorEstimate, iterations, functionEvaluations) = newton(f, Df, x_0, errorTolerance, maxIterations)
```

(The last input parameter `maxIterations` could be optional, with a default value like `maxIterations=100`.)
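
A minimal sketch of one way such a function could look, assuming the size of the last step \(|x_{k+1} - x_k|\) serves as the error estimate and counting one evaluation each of `f` and `Df` per iteration (both of these are design choices, not part of the exercise statement):

```python
def newton(f, Df, x_0, errorTolerance, maxIterations=100):
    """Solve f(x) = 0 by Newton's method, starting from x_0.

    Returns (root, errorEstimate, iterations, functionEvaluations).
    Note: this sketch has no safeguard against Df(x) == 0.
    """
    x = x_0
    functionEvaluations = 0
    errorEstimate = float("inf")
    for iterations in range(1, maxIterations + 1):
        fx = f(x)
        Dfx = Df(x)
        functionEvaluations += 2  # one evaluation each of f and Df
        dx = fx / Dfx             # the Newton step
        x -= dx
        errorEstimate = abs(dx)   # estimate the error by the step size
        if errorEstimate <= errorTolerance:
            break
    return (x, errorEstimate, iterations, functionEvaluations)
```

For example, `newton(lambda x: x**2 - 2, lambda x: 2*x, 1.0, 1e-12)` should return an approximation of \(\sqrt{2}\) after only a handful of iterations.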

b) Based on your function `bisection2`, create a third (and final!) version with usage

```
(root, errorBound, iterations, functionEvaluations) = bisection(f, a, b, errorTolerance, maxIterations)
```
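
A possible sketch, assuming `errorBound` is half the length of the current bracketing interval and that `f(a)` and `f(b)` have opposite signs (this sketch does not check that precondition):

```python
def bisection(f, a, b, errorTolerance, maxIterations=100):
    """Solve f(x) = 0 by bisection on [a, b], where f(a) and f(b)
    are assumed to have opposite signs.

    Returns (root, errorBound, iterations, functionEvaluations).
    """
    fa = f(a)
    functionEvaluations = 1
    iterations = 0
    errorBound = (b - a) / 2
    while errorBound > errorTolerance and iterations < maxIterations:
        c = (a + b) / 2
        fc = f(c)
        functionEvaluations += 1
        iterations += 1
        if fa * fc > 0:       # no sign change on [a, c]: root is in [c, b]
            a, fa = c, fc
        else:                 # sign change on [a, c]: root is in [a, c]
            b = c
        errorBound = (b - a) / 2
    root = (a + b) / 2        # midpoint of the final bracket
    return (root, errorBound, iterations, functionEvaluations)
```

Note that, unlike Newton’s method, each iteration needs only one new function evaluation, and `errorBound` here is a guaranteed bound rather than an estimate.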

c) Use both of these to solve the equation

i) with [estimated] absolute error of no more than \(10^{-6}\), and then

ii) with [estimated] absolute error of no more than \(10^{-15}\).

Note in particular how many iterations and how many function evaluations are needed.

Graph the function; this will help you find a good starting interval \([a, b]\) and initial approximation \(x_0\).
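
The exercise’s equation \(f_1(x) = 0\) is not reproduced above, so the function below is only a stand-in to illustrate the graphing step; substitute the actual \(f_1\) from part (c). A Matplotlib sketch:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so this also runs headless
import matplotlib.pyplot as plt

# Stand-in function: replace with the exercise's actual f_1.
def f(x):
    return x**3 - 2.0 * x - 5.0

x = np.linspace(-3.0, 3.0, 400)
plt.plot(x, f(x), label="f(x)")
plt.axhline(0.0, color="gray", linewidth=0.8)  # the x-axis: sign changes bracket roots
plt.xlabel("x")
plt.ylabel("f(x)")
plt.legend()
plt.grid(True)
plt.savefig("graph.png")  # or plt.show() in an interactive session
```

For this stand-in, the graph crosses zero between \(x = 2\) and \(x = 3\), which would suggest \([a, b] = [2, 3]\) and \(x_0 = 2\) as reasonable starting choices.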

d) Repeat, this time finding the unique real root of

Again graph the function, to find a good starting interval \([a, b]\) and initial approximation \(x_0\).

e) This second case will behave differently from the case of \(f_1\) in part (c): describe the difference. (We will discuss the reasons in class.)