Notebook for 2023-03-24
The topic for this lecture is optimization in 1D. We concretely consider minimizing $-\sin(x)$ on the interval $[0, 2]$. Of course, we know in advance that the solution to this problem is at $\pi/2$, but it is fun to see what different algorithms do.
Newton's iteration
To minimize a differentiable function $\phi(x)$, we want to solve the stationary equation $\phi'(x) = 0$. We know how to do this with Newton's method for nonlinear equations. Of course, we also know Newton's method can fail to converge without safeguarding! But even when it converges just fine, it may converge to a stationary point that is not a minimizer.
I suggest you add a monitor function and plot the convergence!
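Only the output of the defining cell appears below, so here is a minimal sketch of what a `newton_opt1d` routine might look like; the signature, keyword arguments, and the `monitor` hook are assumptions rather than the notebook's exact code. It runs Newton's iteration on $\phi'(x) = 0$ given the first and second derivatives, and calls `monitor` on each iterate so you can record the convergence history.

```julia
# Sketch of a Newton iteration for 1D optimization: solve ϕ'(x) = 0
# given dϕ = ϕ' and d2ϕ = ϕ''.  The monitor callback is invoked with
# each iterate, e.g. monitor = x -> push!(hist, x) to save a history.
# (Signature and defaults are assumptions, not the notebook's code.)
function newton_opt1d(dϕ, d2ϕ, x; rtol=1e-8, nmax=100, monitor=(x)->nothing)
    for k = 1:nmax
        monitor(x)
        dx = dϕ(x)/d2ϕ(x)    # Newton step for the equation ϕ'(x) = 0
        x -= dx
        if abs(dx) < rtol    # stop when the step is small
            return x
        end
    end
    error("Newton did not converge in $nmax steps")
end
```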
newton_opt1d (generic function with 1 method)
1.5707963267954879
-1.5707963267954879
The first value is $\pi/2$, the minimizer; the second is $-\pi/2$, a stationary point that is actually a maximizer of $-\sin$, which illustrates the caveat above.
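A hedged usage sketch of how results like these can arise (the starting points below are illustrative, not necessarily the ones used in this notebook): minimizing $\phi(x) = -\sin(x)$ means solving $\phi'(x) = -\cos(x) = 0$ with $\phi''(x) = \sin(x)$, and Newton converges to whichever zero of $\cos$ its starting point leads it to.

```julia
# Derivatives for ϕ(x) = -sin(x); the starting points are illustrative guesses.
dϕ(x)  = -cos(x)
d2ϕ(x) = sin(x)

newton_opt1d(dϕ, d2ϕ,  1.0)   # converges to π/2 ≈ 1.5708, the minimizer
newton_opt1d(dϕ, d2ϕ, -1.0)   # converges to -π/2, a maximizer of -sin(x)
```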
Golden section
As described in the notes, golden section search is a bisection-like solver for finding the minimum of a unimodal function (i.e., one with a single local minimum on the interval) in 1D.
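As with the Newton cell, only the output of the definition is shown below, so here is a minimal sketch of a golden section routine; the name, keyword arguments, and `monitor` hook are assumptions. Each step evaluates $\phi$ at one new interior point, reuses the other interior value, and shrinks the bracket by the inverse golden ratio $\tau = (\sqrt{5}-1)/2 \approx 0.618$.

```julia
# Sketch of golden section search for a unimodal ϕ on [a, b].
# (Interface details are assumptions, not the notebook's exact code.)
function golden_section(ϕ, a, b; atol=1e-6, nmax=1000, monitor=(a,b)->nothing)
    τ = (√5 - 1)/2                    # inverse golden ratio, τ^2 = 1 - τ
    x1 = b - τ*(b-a); ϕ1 = ϕ(x1)      # lower interior point
    x2 = a + τ*(b-a); ϕ2 = ϕ(x2)      # upper interior point
    for k = 1:nmax
        monitor(a, b)
        if b-a < atol
            return (a+b)/2
        end
        if ϕ1 < ϕ2
            # Minimizer lies in [a, x2]; old x1 becomes the new upper point
            b, x2, ϕ2 = x2, x1, ϕ1
            x1 = b - τ*(b-a); ϕ1 = ϕ(x1)
        else
            # Minimizer lies in [x1, b]; old x2 becomes the new lower point
            a, x1, ϕ1 = x1, x2, ϕ2
            x2 = a + τ*(b-a); ϕ2 = ϕ(x2)
        end
    end
    return (a+b)/2
end
```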
golden_section (generic function with 1 method)
6.72044e-7
1.06731e-6
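The small numbers above are the printed outputs of the golden section cell; they are on the order of the final bracket width, which suggests they measure how close the computed point is to $\pi/2$. A hedged usage sketch (not necessarily the cell that produced them):

```julia
# Run golden section on ϕ(x) = -sin(x) over [0, 2] and check the error.
xhat = golden_section(x -> -sin(x), 0.0, 2.0; atol=1e-6)
abs(xhat - π/2)    # expect something on the order of the tolerance, ~1e-6
```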