MATLAB Optimization Toolbox
Presented by Chin Pei
February 28, 2003
Presentation Outline
Introduction
Function Optimization
Optimization Toolbox
Routines / Algorithms available
Minimization Problems
Unconstrained
Constrained
Example
The Algorithm Description
Multiobjective Optimization
Function Optimization
Optimization concerns the minimization or maximization of functions
Standard Optimization Problem
min f(x)
 x

Subject to:
  g_j(x) ≤ 0   (inequality constraints)
  h_i(x) = 0   (equality constraints)
Function Optimization
f(x) is the objective function, which measures and evaluates the performance of a system.
In a standard problem, we are minimizing the function.
Maximization is equivalent to minimizing the negative of the objective function.
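As a sketch (the quadratic g below is a hypothetical example, and the anonymous-function syntax assumes a newer MATLAB release than the one in this talk):

```matlab
% Hypothetical function to maximize: g(x) = -(x-2)^2, maximum at x = 2
g = @(x) -(x - 2)^2;
% Minimize the negative of g in order to maximize g
negg = @(x) -g(x);
options = optimset('LargeScale','off');
xmax = fminunc(negg, 0, options);   % xmax is approximately 2
```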
x is a column vector of design variables, which the optimizer can adjust to change the performance of the system.
Function Optimization
Constraints are limitations on the design space.
They can be linear or nonlinear, explicit or implicit functions.
  g_j(x) ≤ 0   (inequality constraints)
  h_i(x) = 0   (equality constraints)
Note: most algorithms require the inequality constraints in "less than or equal to" form!
Optimization Toolbox
The Optimization Toolbox is a collection of functions that extends the capabilities of MATLAB. The toolbox includes routines for:
Unconstrained optimization
Constrained nonlinear optimization, including goal attainment problems, minimax problems, and semi-infinite minimization problems
Quadratic and linear programming
Nonlinear least squares and curve fitting
Solving nonlinear systems of equations
Constrained linear least squares
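For instance, a small linear program min f'*x subject to A*x <= b can be solved with the toolbox routine linprog (the numbers below are made up for illustration):

```matlab
f  = [-1; -2];        % minimize -x1 - 2*x2 (i.e., maximize x1 + 2*x2)
A  = [1 1];  b = 4;   % subject to x1 + x2 <= 4
lb = [0; 0];          % and x1, x2 >= 0
x = linprog(f, A, b, [], [], lb)   % optimum at the vertex x = [0; 4]
```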
Minimization Algorithm
Minimization Algorithm (Cont.)
Equation Solving Algorithms
Least-Squares Algorithms
Implementing Opt. Toolbox
Most of these optimization routines require the definition of an M-file containing the function, f, to be minimized.
Maximization is achieved by supplying the routines with -f.
Default optimization parameters can be changed by passing an options structure to the routines.
Unconstrained Minimization
Consider the problem of finding the vector x = [x1 x2]^T that solves

  min f(x) = exp(x1) * (4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1)
   x
Steps
Create an M-file that returns the function value (Objective Function)
Call it objfun.m
Then, invoke the unconstrained minimization routine
Use fminunc
Step 1 – Obj. Function
function f = objfun(x)
f=exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);
where x = [x1 x2]^T is the vector of design variables.
Step 2 – Invoke Routine
x0 = [-1,1];
options = optimset('LargeScale','off');
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options);

The input arguments are the objective function, a starting guess x0, and the optimization parameter settings; the output arguments are collected on the left-hand side.
Results
xmin =
    0.5000   -1.0000
feval =
    1.3028e-010
exitflag =
    1
output =
    iterations: 7
    funcCount: 40
    stepsize: 1
xmin is the minimum point of the design variables, and feval is the objective function value there. exitflag indicates whether the algorithm converged; if exitflag > 0, a local minimum was found. The output structure holds additional information about the run.
More on fminunc – Input
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
fun: the objective function to be minimized.
x0: the initial guess; a vector whose length equals the number of design variables.
options: sets some of the optimization parameters. (More after a few slides.)
P1,P2,…: additional parameters passed through to the objective function.
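A sketch of passing an additional parameter through fminunc (the function name scaledobj and the parameter a are hypothetical):

```matlab
% In scaledobj.m -- an objective with an extra parameter a:
%   function f = scaledobj(x, a)
%   f = a*(x(1)^2 + x(2)^2);

options = optimset('LargeScale','off');
[xmin, feval] = fminunc('scaledobj', [1; 1], options, 5);  % a = 5 is passed to scaledobj
```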
More on fminunc – Output
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
xmin: vector of the minimum (optimal) point; its size is the number of design variables.
feval: the objective function value at the optimal point.
exitflag: a value showing whether the optimization routine terminated successfully. (Converged if > 0.)
output: a structure giving more details about the optimization.
grad: the gradient value at the optimal point.
hessian: the Hessian value at the optimal point.
Options Setting – optimset
Options = optimset('param1',value1,'param2',value2,…)
The routines in the Optimization Toolbox have a set of default optimization parameters.
However, the toolbox allows you to alter some of those parameters, for example: the tolerances, the step size, the gradient or Hessian values, the maximum number of iterations, etc.
There are also features available such as displaying the values at each iteration or comparing user-supplied gradients or Hessians.
Options Setting (Cont.)
Options = optimset('param1',value1,'param2',value2,…)
Type help optimset in the command window and a list of the available option settings will be displayed.
How to read it? For example:
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
The default value is the one shown in { }.
Options Setting (Cont.)
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
Since the default is on, to turn it off we just type:
Options = optimset('LargeScale','off')
Useful Option Settings
Display - Level of display [ off | iter | notify | final ]
MaxIter - Maximum number of iterations allowed [ positive integer ]
TolCon - Termination tolerance on the constraint violation [ positive scalar ]
TolFun - Termination tolerance on the function value [ positive scalar ]
TolX - Termination tolerance on X [ positive scalar ]
These settings are highly recommended!
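Combining these settings in one optimset call might look like the following (the specific tolerance values are arbitrary):

```matlab
options = optimset('Display', 'iter', ...   % show progress at each iteration
                   'MaxIter', 200, ...      % allow at most 200 iterations
                   'TolFun',  1e-8, ...     % stop when f changes less than this
                   'TolX',    1e-8);        % stop when x changes less than this
```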
fminunc and fminsearch
fminunc uses algorithms with gradient and Hessian information.
Two modes:
Large-Scale: interior-reflective Newton
Medium-Scale: quasi-Newton (BFGS)
It is not preferred for solving highly discontinuous functions.
This function may only give local solutions.
fminunc and fminsearch
fminsearch is generally less efficient than fminunc for problems of order greater than two. However, when the problem is highly discontinuous, fminsearch may be more robust.
It is a direct search method that does not use numerical or analytic gradients as fminunc does.
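As a sketch, the objfun example from the earlier slides can be solved with fminsearch in the same way:

```matlab
x0 = [-1, 1];
[xmin, feval, exitflag] = fminsearch('objfun', x0);
```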
Constrained Minimization
[xmin,feval,exitflag,output,lambda,grad,hessian] =
  fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

lambda: vector of Lagrange multipliers at the optimal point.
Example
  min f(x) = -x1*x2*x3
   x

Subject to:
  2*x1^2 + x2 ≤ 0
  -x1 - 2*x2 - 2*x3 ≤ 0    (i.e. x1 + 2*x2 + 2*x3 ≥ 0)
   x1 + 2*x2 + 2*x3 ≤ 72
  0 ≤ x1, x2, x3 ≤ 30

In matrix form, the linear inequality constraints A*x ≤ B use
  A = [-1 -2 -2; 1 2 2],  B = [0; 72]
and the bounds are
  LB = [0 0 0]',  UB = [30 30 30]'.

function f = myfun(x)
f = -x(1)*x(2)*x(3);
Example (Cont.)
For the nonlinear constraint 2*x1^2 + x2 ≤ 0, create a function called nonlcon which returns the two constraint vectors [C,Ceq]:

function [C,Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);
Ceq = [];   % remember to return a null matrix if the constraint does not apply
Example (Cont.)
x0=[10;10;10];
A=[-1 -2 -2;1 2 2];
B=[0 72]';
LB = [0 0 0]';
UB = [30 30 30]';
[x,feval]=fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)
Here x0 is the initial guess (three design variables), and A and B encode the two linear inequality constraints -x1 - 2*x2 - 2*x3 ≤ 0 and x1 + 2*x2 + 2*x3 ≤ 72. CAREFUL: the constraints must be written in the form A*x ≤ B, so watch the signs!
Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
  In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
Optimization terminated successfully:
 Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
Active Constraints:
     2
     9
x =
   0.00050378663220   0.00000000000000  30.00000000000000
The active constraint numbers refer to the full constraint list:
  Const. 1:  -x1 - 2*x2 - 2*x3 ≤ 0
  Const. 2:   x1 + 2*x2 + 2*x3 ≤ 72
  Const. 3, 4:  0 ≤ x1 ≤ 30
  Const. 5, 6:  0 ≤ x2 ≤ 30
  Const. 7, 8:  0 ≤ x3 ≤ 30
Multiobjective Optimization
Previous examples involved problems with a single objective function.
Now let us look at solving a problem with a multiobjective function using lsqnonlin.
The example is the design of an optimal PID controller for a plant.
Simulink Example
Goal: Optimize the control parameters in Simulink model optsim.mdl in order to minimize the error between the output and input.
Plant description:
• Third order under-damped with actuator limits.
• Actuation limits are a saturation limit and a slew rate limit.
Simulink Example (Cont.)
Initial PID Controller Design
Solving Methodology
Design variables are the gains in the PID controller (Kp, Ki, and Kd).
Objective function is the error between the output and input.
Solving Methodology (Cont.)
Let pid = [Kp Ki Kd]^T.
Assume also that the step input is unity, so the error at each time step is
F = yout - 1
Construct a function tracklsq for objective function.
Objective Function
function F = tracklsq(pid,a1,a2)
Kp = pid(1);
Ki = pid(2);
Kd = pid(3);
% Compute function value
opt = simset('solver','ode5','SrcWorkspace','Current');
[tout,xout,yout] = sim('optsim',[0 100],opt);
F = yout - 1;
The idea is to perform nonlinear least-squares minimization of the errors from time 0 to 100 at a time step of 1.
So there are 101 objective functions (residuals) to minimize.
The lsqnonlin Routine
[X,RESNORM,RESIDUAL,EXITFLAG,OUTPUT,LAMBDA,JACOBIAN]
= LSQNONLIN(FUN,X0,LB,UB,OPTIONS,P1,P2,..)
Invoking the Routine
clear all
optsim
pid0 = [0.63 0.0504 1.9688];
a1 = 3; a2 = 43;
options = optimset('LargeScale','off','Display','iter','TolX',0.001,'TolFun',0.001);
pid = lsqnonlin(@tracklsq,pid0,[],[],options,a1,a2)
Results
Results (Cont.)
(Figures: initial design response, optimization process, and optimal controller result.)
Conclusion
Easy to use! But we do not know what is happening behind the routine, so it is still important to understand the limitations of each routine.
Basic steps:
Recognize the class of optimization problem
Define the design variables
Create objective function
Recognize the constraints
Start with an initial guess
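The basic steps can be sketched on a toy problem (the quadratic objective below is a made-up example, using anonymous-function syntax from newer MATLAB releases):

```matlab
% 1. Recognize the class of problem: unconstrained minimization -> fminunc
% 2. Define the design variables: x = [x1; x2]
% 3. Create the objective function:
fun = @(x) (x(1) - 1)^2 + (x(2) + 2)^2;
% 4. Recognize the constraints: none in this case
% 5. Start with an initial guess:
x0 = [0; 0];
options = optimset('LargeScale', 'off');
[xmin, feval] = fminunc(fun, x0, options);   % xmin is approximately [1; -2]
```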