{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Numerical Optimization 2: More Constraints\n",
"\n",
"This notebook uses the [NLopt](https://github.com/JuliaOpt/NLopt.jl) package, which has well-tested routines for constrained optimization.\n",
"\n",
"(The `Optim` package has preliminary code for similar functionality; see the `IPNewton` section of its documentation.)"
]
},
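  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a rough sketch (not run here, and assuming a recent version of `Optim`; check its documentation for the exact interface), the `IPNewton` route for the box-constrained problem below would look something like\n",
    "\n",
    "```\n",
    "using Optim\n",
    "fn(p) = (p[1]-2)^2 + (4*p[2]+3)^2      #objective\n",
    "df    = TwiceDifferentiable(fn,zeros(2))\n",
    "dfc   = TwiceDifferentiableConstraints([-100.0,-100.0],[100.0,100.0]) #box bounds\n",
    "Sol   = optimize(df,dfc,zeros(2),IPNewton())\n",
    "```"
   ]
  },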
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Load Packages and Extra Functions"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"printyellow (generic function with 1 method)"
]
},
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"using Printf, NLopt\n",
"\n",
"include(\"jlFiles/printmat.jl\")"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"using Plots\n",
"\n",
"#pyplot(size=(600,400))\n",
"gr(size=(480,320))\n",
"default(fmt = :svg)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Unconstrained Optimization\n",
"\n",
"In the example below, we want to choose $(x,y)$ so as to minimize the objective function\n",
"\n",
"$\n",
"(x-2)^2 + (4y+3)^2, \n",
"$\n",
"\n",
"without any constraints. The solution should be $(x,y)=(2,-3/4)$.\n",
"\n",
"We use the algorithm `LN_COBYLA`, which does not require us to code up the derivatives but can still handle constraints."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"lossfun (generic function with 1 method)"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"function lossfun(p,grad,A) #grad is never used, but must be in function def\n",
" (x,y) = (p[1],p[2]) #A could be data or other things that the function depends on \n",
" L = (x-2)^2 + (4*y+3)^2\n",
" return L\n",
"end"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"nx = 2*41\n",
"ny = 2*61\n",
"x = range(1,stop=5,length=nx)\n",
"y = range(-1,stop=0,length=ny)\n",
"\n",
"loss2d = fill(NaN,(nx,ny)) #matrix with loss fn values\n",
"for i = 1:nx, j = 1:ny\n",
" loss2d[i,j] = lossfun([x[i];y[j]],[],0)\n",
"end"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"image/svg+xml": [
"\n",
"\n"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"p1 = contour( x,y,loss2d', #notice: loss2d'\n",
" xlims = (1,5),\n",
" ylims = (-1,0),\n",
" legend = false,\n",
" levels = 21,\n",
" title = \"Contour plot of loss function\",\n",
" xlabel = \"x\",\n",
" ylabel = \"y\" )\n",
"scatter!([2],[-0.75],label=\"optimum\",legend=true)\n",
"display(p1)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
    "Tell NLopt what to minimize as follows:\n",
    "\n",
    "```\n",
    "min_objective!(opt,(p,g)->lossfun(p,g,0))\n",
    "```\n",
    "\n",
    "Here `p` holds the parameters (over which we optimize), `g` is the gradient (not used by `LN_COBYLA`, but required by the calling convention) and `0` corresponds to the extra argument `A` in the definition of `lossfun`. We use `A=0` here, but in other contexts it could be data or something else."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"minimum at:\n",
" 2.000\n",
" -0.750\n",
"\n"
]
}
],
"source": [
"opt = Opt(:LN_COBYLA, 2) #unconstrained minimization\n",
"lower_bounds!(opt,[-100,-100]) #LN_COBYLA seems to need some restrictions \n",
"min_objective!(opt,(p,g)->lossfun(p,g,0))\n",
"xtol_rel!(opt,1e-8) #set convergence tolerance to a reasonable number \n",
"\n",
"(minf,minx,ret) = optimize(opt,[0.0,0.0])\n",
"println(\"minimum at:\")\n",
"printmat(minx)"
]
},
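  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "`optimize` also returns a status flag (`ret` above), for instance, `:XTOL_REACHED` when the parameter tolerance is met or `:FORCED_STOP` on a failure. It is good practice to check it:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "println(\"return code: \",ret)       #check that the algorithm converged\n",
    "ret in (:SUCCESS,:XTOL_REACHED,:FTOL_REACHED) || @warn(\"convergence failed\")"
   ]
  },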
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Optimization with Bounds on the Parameters\n",
"\n",
"The next few cells illustrate how to impose bounds like $a \\leq x$ and $y \\leq b$."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"image/svg+xml": [
"\n",
"\n"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"p1 = contour( x,y,loss2d',\n",
" xlims = (1,5),\n",
" ylims = (-1,0),\n",
" legend = false,\n",
" levels = 21,\n",
" title = \"Contour plot of loss function\",\n",
" xlabel = \"x\",\n",
" ylabel = \"y\",\n",
" annotation = (3.5,-0.7,text(\"somewhere here\",8)) )\n",
"plot!([2.75,2.75],[-1,0.5],linecolor=:red,linewidth=2,label=\"2.75 < x\",legend=true)\n",
"plot!([1,5],[-0.3,-0.3],linecolor=:black,linewidth=2,label=\"y < -0.3\")\n",
"scatter!([2.75],[-0.75],markercolor=:red,label=\"optimum\")\n",
"display(p1)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"minimum at:\n",
" 2.750\n",
" -0.750\n",
"\n"
]
}
],
"source": [
"opt = Opt(:LN_COBYLA, 2) #bounds on the parameters\n",
"lower_bounds!(opt,[2.75, -Inf])\n",
"upper_bounds!(opt,[Inf, -0.3])\n",
"min_objective!(opt,(p,g)->lossfun(p,g,0))\n",
"xtol_rel!(opt,1e-8)\n",
"\n",
"(minf,minx,ret) = optimize(opt,[3.0, -0.5])\n",
"println(\"minimum at:\")\n",
"printmat(minx)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Optimization with Equality Constraints\n",
"\n",
    "We now impose the constraint that $x+2y-3=0$."
]
},
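  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This case can also be solved analytically, which is useful for verifying the numerical result. With the Lagrangian\n",
    "\n",
    "$\n",
    "\\mathcal{L} = (x-2)^2 + (4y+3)^2 + \\lambda(x+2y-3),\n",
    "$\n",
    "\n",
    "the first order conditions are $2(x-2)+\\lambda=0$ and $8(4y+3)+2\\lambda=0$. Combining them with the constraint gives $(x,y)=(4,-1/2)$."
   ]
  },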
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"image/svg+xml": [
"\n",
"\n"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"p1 = contour(x,y,loss2d',\n",
" xlims = (1,5),\n",
" ylims = (-1,0),\n",
" legend = false,\n",
" levels = 21, \n",
" title = \"Contour plot of loss function\",\n",
" xlabel = \"x\",\n",
" ylabel = \"y\" )\n",
"plot!(3 .- 2*y,y,linecolor=:red,linewidth=3,label=\"restriction\",legend=true)\n",
"scatter!([4],[-0.5],markercolor=:red,label=\"optimum\")\n",
"display(p1)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"minimum at:\n",
" 4.000\n",
" -0.500\n",
"\n"
]
}
],
"source": [
    "function EqConstrfun(p,grad)     #equality constraint: must be 0 at the solution\n",
" (x,y) = (p[1],p[2])\n",
" c = x + 2*y - 3\n",
" return c\n",
"end\n",
"\n",
"opt = Opt(:LN_COBYLA, 2)\n",
"equality_constraint!(opt,(p,g) -> EqConstrfun(p,g))\n",
"min_objective!(opt,(p,g)->lossfun(p,g,0))\n",
"xtol_rel!(opt,1e-8)\n",
"\n",
"(minf,minx,ret) = optimize(opt,[3.0, 0.0])\n",
"println(\"minimum at:\")\n",
"printmat(minx)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Optimization with Inequality Constraints\n",
"\n",
"We now impose the constraint that $y \\le -(x-4)^2$."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"data": {
"image/svg+xml": [
"\n",
"\n"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"yRestriction = -(x.-4).^2 #y should be less than this\n",
"\n",
"p1 = contour( x,y,loss2d',\n",
" legend = false,\n",
" levels = 21,\n",
" title = \"Contour plot of loss function\",\n",
" xlabel = \"x\",\n",
" ylabel = \"y\",\n",
" xlims = (1,5),\n",
" ylims = (-1,0),\n",
" annotation = (4.0,-0.7,text(\"somewhere here\",8)) )\n",
"plot!(x,yRestriction,linecolor=:red,linewidth=3,label=\"y is below this curve\",legend=:topleft)\n",
"scatter!([3.1],[-0.79],markercolor=:red,label=\"optimum\")\n",
"display(p1)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"minimum at:\n",
" 3.112\n",
" -0.789\n",
"\n"
]
}
],
"source": [
    "function IneqConstrfun(p,grad)   #inequality constraint: must be <= 0 at the solution\n",
" (x,y) = (p[1],p[2])\n",
" r = y + (x-4)^2\n",
" return r\n",
"end\n",
"\n",
"opt = Opt(:LN_COBYLA, 2)\n",
"inequality_constraint!(opt,(p,g) -> IneqConstrfun(p,g))\n",
"min_objective!(opt,(p,g)->lossfun(p,g,0))\n",
"xtol_rel!(opt,1e-8)\n",
"\n",
"(minf,minx,ret) = optimize(opt,[4.0, -0.1])\n",
"println(\"minimum at:\")\n",
"printmat(minx)"
]
},
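  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Derivative-free algorithms like `LN_COBYLA` may converge slowly. As a sketch, the same problem can instead be solved with a gradient-based algorithm (here `:LD_SLSQP`) by filling in `grad` in place whenever NLopt asks for it:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "function lossfun2(p,grad)          #objective with analytical gradient\n",
    "    (x,y) = (p[1],p[2])\n",
    "    if length(grad) > 0            #NLopt expects grad to be filled in place\n",
    "        grad[1] = 2*(x-2)\n",
    "        grad[2] = 8*(4*y+3)\n",
    "    end\n",
    "    return (x-2)^2 + (4*y+3)^2\n",
    "end\n",
    "\n",
    "function IneqConstrfun2(p,grad)    #constraint y + (x-4)^2 <= 0, with gradient\n",
    "    (x,y) = (p[1],p[2])\n",
    "    if length(grad) > 0\n",
    "        grad[1] = 2*(x-4)\n",
    "        grad[2] = 1.0\n",
    "    end\n",
    "    return y + (x-4)^2\n",
    "end\n",
    "\n",
    "opt = Opt(:LD_SLSQP, 2)\n",
    "inequality_constraint!(opt,IneqConstrfun2)\n",
    "min_objective!(opt,lossfun2)\n",
    "xtol_rel!(opt,1e-8)\n",
    "\n",
    "(minf,minx,ret) = optimize(opt,[4.0, -0.1])\n",
    "println(\"minimum at:\")\n",
    "printmat(minx)"
   ]
  },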
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"@webio": {
"lastCommId": null,
"lastKernelId": null
},
"anaconda-cloud": {},
"kernelspec": {
"display_name": "Julia 1.6.0",
"language": "julia",
"name": "julia-1.6"
},
"language_info": {
"file_extension": ".jl",
"mimetype": "application/julia",
"name": "julia",
"version": "1.6.0"
}
},
"nbformat": 4,
"nbformat_minor": 1
}