{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Numerical Optimization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Reference: [Numerical Optimization by Nocedal and Wright](http://www.springer.com/us/book/9780387303031)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Categorize your optimization problem\n", "\n", "Different optimization problems require different classes of optimization algorithms for efficient solution. The tree below organizes the fundamental decision points and can serve as a guide to which class of optimization algorithm is appropriate." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![Optimization Tree](https://neos-guide.org/sites/default/files/graphviz/dcd8800d05552f6159a3dc0a0b2bdb38.out.svg)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "We will mainly discuss the optimization of smooth functions, with and without constraints." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Sketch of lecture\n", "\n", "- Problem classification\n", "    - Tree\n", "- Concepts\n", "    - Global and local solutions\n", "    - Convex sets and functions\n", "    - Conditions for optima from calculus\n", "    - Iteration algorithms\n", "$$\n", "    x_0 \\rightarrow x_1 \\rightarrow \\ldots \\rightarrow x_k \\rightarrow x_{k+1} \\rightarrow x_\\text{opt}\n", "$$\n", "- Gradient-free methods\n", "    - Nelder-Mead simplex\n", "- Line search\n", "$$\n", "\\min_{\\alpha > 0} f(x_k + \\alpha p_k)\n", "$$\n", "    - Descent condition $p_k^T \\nabla f_k < 0$\n", "    - Steepest descent direction $-\\nabla f$ (sensitive to scaling)\n", "    - Newton direction $-(\\nabla^2 f)^{-1} \\nabla f$ (obtained by minimizing the quadratic model $m_k$)\n", "    - Conjugate gradient method\n", "    - Quasi-Newton methods (SR1, BFGS)\n", "- Trust regions\n", "  $$\\min_{p} m_k(x_k + p)$$\n", "  where $x_k + p$ lies inside the trust region and\n", "  $$\n", "  m_k(x_k + p) = f_k + p^T\\nabla f_k + \\tfrac{1}{2} p^T B_k p\n", "  $$\n", "  where $B_k \\sim \\nabla^2 
f_k$\n", "  - Secant condition\n", "    $$B_{k+1}s_k = y_k$$\n", "    $$s_k = x_{k+1}-x_k = \\alpha_k p_k$$\n", "    $$y_k = \\nabla f_{k+1} - \\nabla f_k$$\n", "  - Levenberg-Marquardt\n", "- Model formulation\n", "    - Objective function $f$\n", "    - Variables $x$\n", "    - Constraint functions\n", "        - Equality\n", "        - Inequality" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.5" } }, "nbformat": 4, "nbformat_minor": 2 }
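The line-search framework in the outline can be sketched in code. Below is a minimal pure-Python illustration (not from the lecture; the function and variable names are my own) of Newton's method with an Armijo backtracking line search on the 2-D Rosenbrock function. It exercises the pieces listed above: the descent condition $p_k^T \nabla f_k < 0$, the Newton direction $-(\nabla^2 f)^{-1} \nabla f$, and the step-length choice $\min_{\alpha > 0} f(x_k + \alpha p_k)$ approximated by backtracking.

```python
# Newton's method with Armijo backtracking line search on the 2-D
# Rosenbrock function f(x) = (x0 - 1)^2 + 100 (x1 - x0^2)^2.
# Pure Python, no dependencies; names here are illustrative only.

def f(x):
    return (x[0] - 1.0) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return [2.0 * (x[0] - 1.0) - 400.0 * x[0] * (x[1] - x[0] ** 2),
            200.0 * (x[1] - x[0] ** 2)]

def hess(x):
    return [[2.0 - 400.0 * x[1] + 1200.0 * x[0] ** 2, -400.0 * x[0]],
            [-400.0 * x[0], 200.0]]

def newton_direction(g, H):
    # Solve the 2x2 system H p = -g by Cramer's rule.
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    return [(-g[0] * H[1][1] + g[1] * H[0][1]) / det,
            (-g[1] * H[0][0] + g[0] * H[1][0]) / det]

def minimize(x, tol=1e-8, max_iter=100):
    for _ in range(max_iter):
        g = grad(x)
        if (g[0] ** 2 + g[1] ** 2) ** 0.5 < tol:
            break
        p = newton_direction(g, hess(x))
        # Enforce the descent condition p^T grad f < 0; if the Newton
        # direction fails it, fall back to steepest descent -grad f.
        slope = p[0] * g[0] + p[1] * g[1]
        if slope >= 0.0:
            p = [-g[0], -g[1]]
            slope = p[0] * g[0] + p[1] * g[1]
        # Backtracking (Armijo) line search:
        # accept alpha once f(x + alpha p) <= f(x) + c alpha p^T grad f.
        alpha, c, fx = 1.0, 1e-4, f(x)
        while f([x[0] + alpha * p[0], x[1] + alpha * p[1]]) > fx + c * alpha * slope:
            alpha *= 0.5
        x = [x[0] + alpha * p[0], x[1] + alpha * p[1]]
    return x

x_opt = minimize([-1.2, 1.0])
print(x_opt)  # converges to the minimizer (1, 1)
```

Replacing the Newton direction with `[-g[0], -g[1]]` throughout turns this into steepest descent, which on the badly scaled Rosenbrock valley needs vastly more iterations, illustrating the scaling sensitivity noted in the outline.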