(1)

MATLAB Optimization Toolbox

Presented by Chin Pei

February 28, 2003

(2)

Presentation Outline

Introduction

Function Optimization

Optimization Toolbox

Routines / Algorithms available

Minimization Problems

Unconstrained

Constrained

Example

The Algorithm Description

Multiobjective Optimization

(3)

Function Optimization

• Optimization concerns the minimization or maximization of functions

• Standard Optimization Problem:


$$\min_{\tilde{x}} \; f(\tilde{x})$$

Subject to:

$$h_i(\tilde{x}) = 0 \quad \text{(equality constraints)}$$

$$g_j(\tilde{x}) \le 0 \quad \text{(inequality constraints)}$$

(4)

Function Optimization

f(x) is the objective function, which measures and evaluates the performance of a system.

In a standard problem, we are minimizing the function.

For maximization, it is equivalent to minimizing the negative of the objective function.

x is a column vector of design variables, which can be varied to optimize the performance of the system.


(5)

Function Optimization

Constraints – limitations on the design space.

They can be linear or nonlinear, explicit or implicit functions.

$$h_i(\tilde{x}) = 0 \quad \text{(equality constraints)}$$

$$g_j(\tilde{x}) \le 0 \quad \text{(inequality constraints)}$$


Note: most algorithms require the inequality constraints in "less than or equal to" form, g_j(x) ≤ 0. For example, a constraint x1 + x2 ≥ 10 must be supplied as 10 - x1 - x2 ≤ 0.

(6)

Optimization Toolbox

The Optimization Toolbox is a collection of functions that extend the capability of MATLAB. The toolbox includes routines for:

Unconstrained optimization

Constrained nonlinear optimization, including goal attainment problems, minimax problems, and semi-infinite minimization problems

Quadratic and linear programming

Nonlinear least squares and curve fitting

Solving nonlinear systems of equations

Constrained linear least squares


(7)

Minimization Algorithm


(8)

Minimization Algorithm (Cont.)


(9)

Equation Solving Algorithms


(10)

Least-Squares Algorithms


(11)

Implementing Opt. Toolbox

• Most of these optimization routines require the definition of an M-file containing the function, f, to be minimized.

• Maximization is achieved by supplying the routines with –f (a short sketch follows this list).

• Optimization options passed to the routines change optimization parameters.

• Default optimization parameters can be changed through an options structure.
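A minimal sketch of the maximization idea (the M-file name negobj and the sample function x*exp(-x^2) are illustrative, not from the slides): to maximize f, hand the routine -f and negate the returned value.

% negobj.m  (hypothetical M-file: returns the NEGATIVE of the function to maximize)
function f = negobj(x)
f = -( x(1)*exp(-x(1)^2) );

% At the command line: minimize -f, then negate the result to recover the maximum
x0 = 0.1;
options = optimset('LargeScale','off');
[xmax, fneg] = fminunc('negobj', x0, options);
fmax = -fneg;     % value of the original (maximized) function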


(12)

Unconstrained Minimization

• Consider the problem of finding a set of values $[x_1 \; x_2]^T$ that solves


$$\min_{\tilde{x}} \; f(\tilde{x}) = e^{x_1}\left(4x_1^2 + 2x_2^2 + 4x_1 x_2 + 2x_2 + 1\right)$$

where $\tilde{x} = [x_1 \; x_2]^T$.

(13)

Steps

• Create an M-file that returns the function value (objective function)

Call it objfun.m

• Then, invoke the unconstrained minimization routine

Use fminunc


(14)

Step 1 – Obj. Function


function f = objfun(x)

f=exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);

 1 2 

~

xx x T

Objective function

(15)

Step 2 – Invoke Routine


x0 = [-1,1];
options = optimset('LargeScale','off');
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options);

The input arguments are x0 (the starting guess) and options (the optimization parameter settings); the left-hand side collects the output arguments.

(16)

Results


xmin =
    0.5000   -1.0000

feval =
    1.3028e-010

exitflag =
    1

output =
    iterations: 7
    funcCount: 40
    stepsize: 1

xmin is the minimum point of the design variables, feval is the objective function value at that point, and exitflag tells whether the algorithm converged (if exitflag > 0, a local minimum was found). The output structure gives some other information about the run.
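A small illustrative check (not from the slides) of how exitflag and the output structure can be used programmatically after the call above:

if exitflag > 0
    fprintf('Converged: local minimum found after %d iterations.\n', output.iterations);
else
    fprintf('The routine did not converge (exitflag = %d).\n', exitflag);
end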

(17)

More on fminunc – Input

[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,…)

fun: the objective function to be minimized (e.g. a function name or handle).

x0: the initial guess; it must be a vector whose length equals the number of design variables.

options: sets some of the optimization parameters (more in a few slides).

P1,P2,…: additional parameters, passed on to the objective function (sketched below).
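As an illustration of the P1,P2,… arguments, here is a hedged sketch using the 2003-era calling syntax shown above; the M-file name scaledobj and its parameter a are hypothetical, not from the slides. Whatever is supplied after options is forwarded to the objective function.

% scaledobj.m  (hypothetical objective with an extra parameter a)
function f = scaledobj(x, a)
f = exp(x(1))*(a*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);

% The trailing argument 4 is passed through to scaledobj as a
x0 = [-1, 1];
options = optimset('LargeScale','off');
[xmin, feval] = fminunc('scaledobj', x0, options, 4);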


(18)

More on fminunc – Output


[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,…)

xmin: vector of the minimum point (optimal point). Its size is the number of design variables.

feval: the objective function value at the optimal point.

exitflag: a value that shows whether the optimization routine terminated successfully (converged if > 0).

output: a structure that gives more details about the optimization run.

grad: the gradient value at the optimal point.

hessian: the Hessian value at the optimal point.

(19)

Options Setting – optimset


Options = optimset('param1',value1, 'param2',value2, …)

The routines in the Optimization Toolbox have a set of default optimization parameters.

However, the toolbox allows you to alter some of those parameters, for example: the tolerances, the step size, the gradient or Hessian values, the maximum number of iterations, etc.

There is also a list of features available, for example: displaying the values at each iteration, checking the user-supplied gradient or Hessian, etc.

(20)

Options Setting (Cont.)

Options = optimset('param1',value1, 'param2',value2, …)


Type help optimset in the command window and a list of the available option settings will be displayed.

How to read them? For example:

LargeScale - Use large-scale algorithm if possible [ {on} | off ]

The default value is the one shown in braces { }.

(21)

Options Setting (Cont.)

LargeScale - Use large-scale algorithm if possible [ {on} | off ]

Since the default is on, if we would like to turn it off, we just type:

Options = optimset('LargeScale','off')


(22)

Useful Option Settings

Display - Level of display [ off | iter | notify | final ]

MaxIter - Maximum number of iterations allowed [ positive integer ]

TolCon - Termination tolerance on the constraint violation [ positive scalar ]

TolFun - Termination tolerance on the function value [ positive scalar ]

TolX - Termination tolerance on X [ positive scalar ]


Highly recommended to use!!!
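A minimal sketch combining these settings (the particular tolerance values are illustrative, not from the slides), reusing objfun.m from the unconstrained example. Note that TolCon applies to the constrained routines such as fmincon rather than fminunc.

x0 = [-1, 1];
options = optimset('Display','iter', ...   % show the values at each iteration
                   'MaxIter',200, ...      % allow at most 200 iterations
                   'TolFun',1e-6, ...      % termination tolerance on the function value
                   'TolX',1e-6);           % termination tolerance on x
[xmin, feval, exitflag, output] = fminunc('objfun', x0, options);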

(23)

fminunc and fminsearch

fminunc uses algorithms that exploit gradient and Hessian information.

• Two modes:

Large-Scale: interior-reflective Newton

Medium-Scale: quasi-Newton (BFGS)

• Not preferred for solving highly discontinuous functions.

This function may only give local solutions.


(24)

fminunc and fminsearch

fminsearch is generally less efficient than fminunc for problems of order greater than two. However, when the problem is highly discontinuous, fminsearch may be more robust.

• This is a direct search method that does not use numerical or analytic gradients as in fminunc.
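For comparison, a minimal sketch calling fminsearch on the same objfun.m used earlier (illustrative, not from the slides):

% Nelder-Mead direct search: no gradient information is used
x0 = [-1, 1];
[xmin, feval, exitflag] = fminsearch('objfun', x0);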


(25)

Constrained Minimization


[xmin,feval,exitflag,output,lambda,grad,hessian] = fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

lambda: vector of Lagrange multipliers at the optimal point.

(26)

Example


$$\min_{\tilde{x}} \; f(\tilde{x}) = -x_1 x_2 x_3$$

Subject to:

$$2x_1^2 + x_2 \le 0$$

$$-x_1 - 2x_2 - 2x_3 \le 0$$

$$x_1 + 2x_2 + 2x_3 \le 72$$

$$0 \le x_1, x_2, x_3 \le 30$$

In matrix form, the linear inequality constraints $A\tilde{x} \le B$ and the bounds are

$$A = \begin{bmatrix} -1 & -2 & -2 \\ 1 & 2 & 2 \end{bmatrix}, \quad B = \begin{bmatrix} 0 \\ 72 \end{bmatrix}, \quad LB = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}, \quad UB = \begin{bmatrix} 30 \\ 30 \\ 30 \end{bmatrix}$$

function f = myfun(x)

f=-x(1)*x(2)*x(3);

(27)

Example (Cont.)

For the nonlinear constraint $2x_1^2 + x_2 \le 0$, create a function called nonlcon which returns the two constraint vectors [C,Ceq]:

function [C,Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);
Ceq = [];   % remember to return an empty matrix if the constraint type does not apply


(28)

Example (Cont.)

x0=[10;10;10];

A=[-1 -2 -2;1 2 2];

B=[0 72]';

LB = [0 0 0]';

UB = [30 30 30]';

[x,feval]=fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)

x0 is the initial guess (3 design variables). Careful: fmincon expects the linear constraints in the form A*x ≤ B, so the two-sided constraint 0 ≤ x1 + 2x2 + 2x3 ≤ 72 is supplied as the two rows -x1 - 2x2 - 2x3 ≤ 0 and x1 + 2x2 + 2x3 ≤ 72, giving A = [-1 -2 -2; 1 2 2] and B = [0 72]'.

(29)

Example (Cont.)

Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).

> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213 In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6

Optimization terminated successfully:

Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon

Active Constraints:
     2
     9

x =
   0.00050378663220   0.00000000000000  30.00000000000000
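To relate this to the lambda output listed in the fmincon signature, here is a hedged sketch that re-runs the call from the previous slide (with the same workspace variables) and inspects the Lagrange multipliers; nonzero entries correspond to active constraints. The field names are those documented for fmincon's multiplier structure.

[x, feval, exitflag, output, lambda] = ...
    fmincon(@myfun, x0, A, B, [], [], LB, UB, @nonlcon);
lambda.lower        % multipliers for the lower bounds LB
lambda.upper        % multipliers for the upper bounds UB
lambda.ineqlin      % multipliers for the linear inequalities A*x <= B
lambda.ineqnonlin   % multipliers for the nonlinear inequalities C(x) <= 0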


(Slide annotation: the constraints are labelled Const. 1 through Const. 8 – the two linear inequalities -x1 - 2x2 - 2x3 ≤ 0 and x1 + 2x2 + 2x3 ≤ 72, followed by the bounds 0 ≤ x1, x2, x3 ≤ 30.)

(30)

Multiobjective Optimization

• Previous examples involved problems with a single objective function.

• Now let us look at solving a problem with a multiobjective function using lsqnonlin.

• The example is the design of an optimal PID controller for a plant.


(31)

Simulink Example


Goal: Optimize the control parameters in the Simulink model optsim.mdl in order to minimize the error between the output and the input.

Plant description:

• Third order under-damped with actuator limits.

• Actuation limits are a saturation limit and a slew rate limit.

(32)

Simulink Example (Cont.)

Initial PID Controller Design

(33)

Solving Methodology

• The design variables are the gains of the PID controller (K_P, K_I and K_D).

• The objective function is the error between the output and the input.


(34)

Solving Methodology (Cont.)

• Let pid = [K_p K_i K_d]^T.

• Let the step input also be unity, so the error signal is

F = yout - 1

Construct a function tracklsq for the objective function.


(35)

Objective Function

function F = tracklsq(pid,a1,a2)
Kp = pid(1);
Ki = pid(2);
Kd = pid(3);
% Compute function value
opt = simset('solver','ode5','SrcWorkspace','Current');
[tout,xout,yout] = sim('optsim',[0 100],opt);
F = yout - 1;   % error between the simulated output and the unit-step input


The idea is to perform nonlinear least-squares minimization of the errors from time 0 to 100 at a time step of 1. So, there are 101 objective functions (residuals) to minimize.

(36)

The lsqnonlin


[X,RESNORM,RESIDUAL,EXITFLAG,OUTPUT,LAMBDA,JACOBIAN] = LSQNONLIN(FUN,X0,LB,UB,OPTIONS,P1,P2,..)

(37)

Invoking the Routine

clear all
optsim         % open the Simulink model optsim.mdl

pid0 = [0.63 0.0504 1.9688];
a1 = 3; a2 = 43;
options = optimset('LargeScale','off','Display','iter','TolX',0.001,'TolFun',0.001);
pid = lsqnonlin(@tracklsq,pid0,[],[],options,a1,a2)


(38)

Results


(39)

Results (Cont.)

Plots shown on the slide: Initial Design, Optimization Process, and Optimal Controller Result.


(40)

Conclusion

Easy to use! But we do not know what is happening behind each routine, so it is still important to understand the limitations of each routine.

Basic steps:

Recognize the class of optimization problem

Define the design variables

Create objective function

Recognize the constraints

Start with an initial guess

Invoke a suitable routine (a recap sketch follows)
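As a quick recap of these steps, a minimal sketch reusing objfun.m from the unconstrained example (the steps are annotated in the comments; since that problem is unconstrained, there are no constraints to recognize):

% 1. Class of problem: unconstrained nonlinear minimization
% 2. Design variables: x = [x1 x2]
% 3. Objective function: objfun.m defined earlier
% 4. Constraints: none
% 5. Initial guess
x0 = [-1, 1];
% 6. Invoke a suitable routine
options = optimset('LargeScale','off','Display','final');
[xmin, feval, exitflag] = fminunc('objfun', x0, options);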


(41)

Thank You!

Questions & Suggestions?
