4.2. Project on Minimizing Functions (template)#

Brenton LeMesurier, lemesurierb@charleston.edu

Version of November 20, 2025

4.2.1. Introduction#

The background for this project is given in Chapter Optimization. It starts with the one-dimensional case of minimizing a continuous function \(f:[a, b] \to \mathbb{R}\),

comparing two derivative-free methods: the Golden Section Search and Successive Parabolic Interpolation.

If there is time, a subsequent goal will be to consider functions of more variables — for this project, two is enough: \(f:\mathbb{R}^2 \to \mathbb{R}\).

References:

Exercise 4.12

Describe the golden section search method precisely in pseudocode.
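
One possible shape for such a method — sketched here in Python rather than pseudocode, and assuming a continuous \(f\) with a single minimum in \([a, b]\) — is the following; the function name and tolerance parameters are illustrative choices, not part of the exercise statement:

```python
# A sketch of the Golden Section Search, assuming f is continuous and
# unimodal (a single minimum) on [a, b].
import math

def golden_section_search(f, a, b, tol=1e-8, max_iterations=200):
    """Return an approximate minimizer of f on [a, b]."""
    rho = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618: the golden ratio step
    c = b - rho * (b - a)         # left interior point
    d = a + rho * (b - a)         # right interior point
    fc, fd = f(c), f(d)
    for _ in range(max_iterations):
        if b - a <= tol:
            break
        if fc < fd:
            # Minimum lies in [a, d]; the old c becomes the new d,
            # so only one new function evaluation is needed.
            b, d, fd = d, c, fc
            c = b - rho * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; the old d becomes the new c.
            a, c, fc = c, d, fd
            d = a + rho * (b - a)
            fd = f(d)
    return (a + b) / 2
```

The key property being exploited is that with the golden-ratio spacing, one of the two interior points of the old interval is reused as an interior point of the new one, so each iteration costs only one new evaluation of \(f\).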

Exercise 4.13

Implement the Golden Section Search and test it on a few examples, such as

  1. \(f(x) = -x e^{-x}\) (whose minimum can be shown to occur at \(x=1\))

  2. The Leonard-Jones potential \(\displaystyle f(x) = \frac{1}{x^{12}} - \frac{1}{x^6}\).

The second in particular requires some preparatory work to find a suitable initial triplet.
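
A triplet \((a, b, c)\) with \(a < b < c\) brackets a minimum when \(f(b) < f(a)\) and \(f(b) < f(c)\); a quick check along these lines (the helper name here is a hypothetical one) can guide that preparatory work:

```python
# A quick bracketing check for the Lennard-Jones potential, using the
# standard condition f(a) > f(b) < f(c).
def f(x):
    return 1 / x**12 - 1 / x**6

def is_bracket(f, a, b, c):
    """True if (a, b, c) is a valid initial triplet for a minimum."""
    return a < b < c and f(b) < f(a) and f(b) < f(c)

print(is_bracket(f, 0.9, 1.1, 2.0))  # True: brackets the minimum near 2**(1/6)
print(is_bracket(f, 2.0, 3.0, 4.0))  # False: f is increasing toward 0 here
```

This matters for Lennard-Jones because \(f(x) \to 0\) from below as \(x \to \infty\), so a triplet taken too far to the right fails the bracketing condition even though all three values are negative.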

Note that with a nice smooth “target function” \(f\) as here, one can get numerical confirmation of the result by checking that \(f'(x) \approx 0\).

To be careful, that only checks that it is a critical point; the second derivative test can confirm that it is a local minimum.
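
One way to perform both checks numerically, sketched here with centered finite differences for the example \(f(x) = -x e^{-x}\) with known minimizer \(x = 1\) (the step size \(h\) is an illustrative choice):

```python
# Numerical confirmation that x_star is a local minimum: check that
# f'(x_star) ~ 0 (critical point) and f''(x_star) > 0 (second derivative test),
# both approximated by centered finite differences.
import math

def f(x):
    return -x * math.exp(-x)

x_star, h = 1.0, 1e-5
df = (f(x_star + h) - f(x_star - h)) / (2 * h)                  # ~ f'(x_star)
d2f = (f(x_star + h) - 2 * f(x_star) + f(x_star - h)) / h**2    # ~ f''(x_star)
print(abs(df) < 1e-8)  # True: critical point
print(d2f > 0)         # True: local minimum (exact value is 1/e)
```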

Exercise 4.14

Describe the method of Successive Parabolic Interpolation precisely in pseudocode.

Exercise 4.15

Implement the method of Successive Parabolic Interpolation and test it on a few examples, such as those in Exercise 4.13.
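
One possible sketch of such an implementation, using the simple update that discards the oldest of the three points each iteration (other variants keep the bracketing triplet instead); it finds the vertex of the parabola through the three current points exactly, so it terminates immediately on a quadratic:

```python
# A sketch of Successive Parabolic Interpolation, assuming an initial
# triplet (a, b, c) with f(b) < f(a) and f(b) < f(c).
def parabolic_interpolation_min(f, a, b, c, tol=1e-8, max_iterations=100):
    """Approximate a minimizer of f, starting from the triplet (a, b, c)."""
    fa, fb, fc = f(a), f(b), f(c)
    for _ in range(max_iterations):
        # Vertex of the parabola through (a, fa), (b, fb), (c, fc).
        num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
        den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
        if den == 0:
            break  # the three points are collinear: no parabola to use
        x = b - 0.5 * num / den
        if abs(x - b) <= tol:
            return x
        # Discard the oldest point: (a, b, c) <- (b, c, x).
        a, b, c = b, c, x
        fa, fb, fc = fb, fc, f(x)
    return b
```

Note that this simple variant can lose the bracketing property and even fail to converge for some starting triplets, which is itself a worthwhile observation to record when comparing it to the Golden Section Search.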

4.2.3. Observations and Conclusions#