Numerical Optimization
Reference: Numerical Optimization by Nocedal and Wright
Categorize your optimization problem
Different optimization problems require different classes of optimization algorithms for efficient solution. The tree below can serve as a guide to which class of algorithm is appropriate for a given problem.
We will mainly discuss the optimization of smooth functions, with and without constraints.
Sketch of lecture
Problem classification
Tree
Concepts
Global and local solutions
Convex sets and functions
Conditions for optima from calculus
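For a smooth unconstrained problem, the calculus conditions above take the following standard form. If \(x^*\) is a local minimizer, then
\[\nabla f(x^*) = 0 \qquad \text{(first-order necessary condition)}\]
\[\nabla^2 f(x^*) \succeq 0 \qquad \text{(second-order necessary condition)}\]
Conversely, \(\nabla f(x^*) = 0\) together with \(\nabla^2 f(x^*) \succ 0\) is sufficient for \(x^*\) to be a strict local minimizer. If \(f\) is convex, every local minimizer is also a global minimizer.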
Iterative algorithms
\[x_0 \rightarrow x_1 \rightarrow \ldots \rightarrow x_k \rightarrow x_{k+1} \rightarrow \ldots \rightarrow x_\text{opt}\]
Gradient-free methods
Nelder-Mead simplex
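As a minimal illustration (not part of the original outline), the Nelder-Mead simplex method is available through SciPy's `scipy.optimize.minimize`; the Rosenbrock test function `rosen` is used here as an assumed example problem:

```python
from scipy.optimize import minimize, rosen

# Nelder-Mead uses only function values -- no gradients required
res = minimize(rosen, x0=[-1.2, 1.0], method='Nelder-Mead')
print(res.x)     # converges near the minimizer [1, 1]
print(res.nfev)  # gradient-free methods typically need many evaluations
```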
Line search
\[\min_{\alpha > 0} f(x_k + \alpha p_k)\]
Descent condition: \(p_k^T \nabla f_k < 0\)
Steepest descent \(p_k = -\nabla f_k\) (sensitive to scaling)
Newton direction \(p_k = -(\nabla^2 f_k)^{-1} \nabla f_k\) (obtained by minimizing the quadratic model \(m_k\))
Conjugate gradient method
Quasi-Newton methods (SR1, BFGS)
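To make the line-search ideas above concrete, here is a small sketch (my own illustration, not from the lecture) of steepest descent with a backtracking (Armijo) line search; the badly scaled quadratic is an assumed test function chosen to show the sensitivity to scaling noted above:

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds:
    f(x + alpha*p) <= f(x) + c * alpha * grad_f(x)^T p."""
    fx = f(x)
    slope = grad_f(x) @ p
    assert slope < 0, "p must satisfy the descent condition p^T grad f < 0"
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def steepest_descent(f, grad_f, x0, tol=1e-6, max_iter=10000):
    """Iterate x_{k+1} = x_k + alpha_k p_k with p_k = -grad f(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g  # steepest descent direction
        x = x + backtracking_line_search(f, grad_f, x, p) * p
    return x

# Badly scaled quadratic: steepest descent zigzags toward the minimum
f = lambda x: x[0]**2 + 10 * x[1]**2
grad_f = lambda x: np.array([2 * x[0], 20 * x[1]])
print(steepest_descent(f, grad_f, [1.0, 1.0]))  # -> approx [0, 0]
```

For the conjugate gradient and quasi-Newton directions, SciPy's `minimize` provides `method='CG'` and `method='BFGS'`, respectively.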
Trust regions
\[\min_{p} m_k(x_k + p)\]
where \(x_k + p\) lies inside the trust region, and
\[m_k(x_k + p) = f_k + p^T \nabla f_k + \tfrac{1}{2} p^T B_k p\]
where \(B_k \approx \nabla^2 f_k\).
Secant condition:
\[B_{k+1} s_k = y_k\]
\[s_k = x_{k+1} - x_k = \alpha_k p_k\]
\[y_k = \nabla f_{k+1} - \nabla f_k\]
Levenberg-Marquardt
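As a hedged sketch of how these pieces appear in practice: SciPy exposes a trust-region Newton method (`method='trust-ncg'`) and a Levenberg-Marquardt least-squares solver (`method='lm'` in `scipy.optimize.least_squares`). The exponential-fit data below is an invented example:

```python
import numpy as np
from scipy.optimize import minimize, least_squares, rosen, rosen_der, rosen_hess

# Trust-region Newton-CG: minimizes the quadratic model m_k within a
# trust region, here with the exact Hessian playing the role of B_k
res = minimize(rosen, x0=[-1.2, 1.0], method='trust-ncg',
               jac=rosen_der, hess=rosen_hess)
print(res.x)  # near [1, 1]

# Levenberg-Marquardt for least-squares: fit y = a * exp(b * t)
# (synthetic, noise-free data; true parameters a=2, b=-1)
t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.0 * t)

def residuals(params):
    a, b = params
    return a * np.exp(b * t) - y

fit = least_squares(residuals, x0=[1.0, 0.0], method='lm')
print(fit.x)  # approximately [2.0, -1.0]
```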
Model formulation
Objective function \(f\)
Variables \(x\)
Constraint functions
Equality
Inequality
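In the notation of Nocedal and Wright, these ingredients define the general problem
\[\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad c_i(x) = 0,\; i \in \mathcal{E}, \qquad c_i(x) \ge 0,\; i \in \mathcal{I}.\]
A small made-up example, solved with SciPy's SLSQP (the objective and constraints below are assumptions for illustration):

```python
from scipy.optimize import minimize

# Hypothetical example problem:
#   minimize   f(x) = (x1 - 1)^2 + (x2 - 2)^2
#   subject to x1 + x2 = 2        (equality)
#              x1 >= 0.5           (inequality)
f = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2

constraints = [
    {'type': 'eq',   'fun': lambda x: x[0] + x[1] - 2},  # c(x) = 0
    {'type': 'ineq', 'fun': lambda x: x[0] - 0.5},       # c(x) >= 0
]

res = minimize(f, x0=[0.0, 0.0], method='SLSQP', constraints=constraints)
print(res.x)  # -> approx [0.5, 1.5]
```

Note that SciPy encodes inequality constraints in the same \(c(x) \ge 0\) convention used in the formulation above.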