
Introduction to Optimization Methods

Introduction to Non-Linear
Optimization

2008 Solutions 4U Sdn Bhd. All Rights Reserved


Fundamental of Optimization

Optimization in Process Plants


Optimization Tree

Figure 1: Optimization tree.


What is Optimization?
Optimization is an iterative process by which a desired solution
(maximum or minimum) of a problem is found while satisfying all of its
constraints or bounds.

Figure 2: The optimum solution is found while satisfying the constraints
(the derivative must be zero at the optimum).
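As a one-variable illustration of the zero-derivative condition (an assumed example, not taken from the slides): to minimize $f(x) = (x - 3)^2 + 1$, set the derivative to zero,

$$f'(x) = 2(x - 3) = 0 \;\Rightarrow\; x^* = 3, \qquad f(x^*) = 1.$$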

An optimization problem can be linear or non-linear.

Non-linear optimization is accomplished by numerical search methods.
Search methods are applied iteratively until a solution is reached.
The search procedure is termed an algorithm.

What is Optimization? (Cont.)

Linear problems are solved by the Simplex or graphical methods.

The solution of a linear problem lies on the boundary of the feasible
region.

Figure 3: Solution of a linear problem.
Figure 4: Three-dimensional solution of a non-linear problem.

A non-linear problem's solution may lie within or on the boundary of the
feasible region.

Fundamentals of Non-Linear Optimization

Single objective function f(x): maximization or minimization
Design variables xi, i = 1, 2, 3, ...
Constraints: inequality and equality

Example:
Maximize X1 + 1.5 X2
Subject to:
  X1 + X2 ≤ 150
  0.25 X1 + 0.5 X2 ≤ 50
  X1 ≤ 50
  X2 ≤ 25
  X1 ≥ 0, X2 ≥ 0

Figure 5: Example of design variables and constraints used in non-linear optimization.
Optimal points
Local minimum/maximum: a solution x* is a local optimum if no other
point x in its neighborhood has a better objective value than x*.
Global minimum/maximum: a solution x** is the global optimum if no other
point x in the entire search space has a better objective value than x**.


Fundamentals of Non-Linear Optimization (Cont.)

Figure 6: Global versus local optimization.
Figure 7: A local optimum equals the global optimum if the function is convex.


Fundamentals of Non-Linear Optimization (Cont.)


A function f is convex if, for any point Xa between X1 and X2, f(Xa)
lies below the value at the corresponding point on the straight line
joining f(X1) and f(X2).
Convexity condition: the Hessian (second-order derivative) matrix of f
must be positive semi-definite (eigenvalues positive or zero).
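Written as a worked inequality (the standard convexity definition, consistent with the statement above):

$$f\bigl(\lambda X_1 + (1 - \lambda) X_2\bigr) \le \lambda f(X_1) + (1 - \lambda) f(X_2), \qquad 0 \le \lambda \le 1.$$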

Figure 8: Convex and non-convex sets.
Figure 9: Convex function.


Mathematical Background
The slope or gradient of the objective function f represents the
direction in which the function decreases/increases most rapidly:

$$\frac{df}{dx} = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}
               = \lim_{\Delta x \to 0} \frac{\Delta f}{\Delta x}$$

Taylor series expansion:

$$f(x_p + \Delta x) = f(x_p)
  + \left.\frac{df}{dx}\right|_{x_p} \Delta x
  + \frac{1}{2!}\left.\frac{d^2 f}{dx^2}\right|_{x_p} (\Delta x)^2 + \cdots$$

Jacobian matrix: the gradients of f and g with respect to several variables,

$$J = \begin{bmatrix}
  \dfrac{\partial f}{\partial x} & \dfrac{\partial f}{\partial y} & \dfrac{\partial f}{\partial z} \\[6pt]
  \dfrac{\partial g}{\partial x} & \dfrac{\partial g}{\partial y} & \dfrac{\partial g}{\partial z}
\end{bmatrix}$$
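A minimal MATLAB sketch of the gradient idea, using a forward-difference approximation (the function and evaluation point are illustrative assumptions, not from the slides):

% Forward-difference approximation of the gradient of an assumed
% two-variable function f(x1, x2) = (x1 - 1.5*x2)^2 + (x2 - 2)^2.
f  = @(x) (x(1) - 1.5*x(2))^2 + (x(2) - 2)^2;
x0 = [0.5 0.5];                    % point at which the gradient is evaluated
dx = 1e-6;                         % small perturbation
g  = zeros(1, 2);
for k = 1:2
    xp    = x0;
    xp(k) = xp(k) + dx;            % perturb one variable at a time
    g(k)  = (f(xp) - f(x0))/dx;    % df/dx_k ~ [f(x + dx) - f(x)]/dx
end
g                                  % approximately [-0.5  -2.25]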


Mathematical Background (Cont.)


Slope - first-order condition (FOC): provides the function's slope
information,

$$\nabla f(X^*) = 0$$

Hessian - second derivative of a function of several variables; its sign
indicates a minimum (positive definite) or a maximum (negative definite),

$$H = \begin{bmatrix}
  \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial y\,\partial x} \\[6pt]
  \dfrac{\partial^2 f}{\partial x\,\partial y} & \dfrac{\partial^2 f}{\partial y^2}
\end{bmatrix}$$

Second-order condition (SOC) for a minimum:
  Eigenvalues of H(X*) are all positive
  Determinants of all leading principal minors of H(X*) are positive
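A minimal MATLAB check of the SOC (an assumed example, not from the slides): for f(x, y) = (x - 1.5y)^2 + (y - 2)^2 the Hessian is constant, and its eigenvalues decide the nature of the stationary point.

% Analytical Hessian of f(x, y) = (x - 1.5*y)^2 + (y - 2)^2:
%   d2f/dx2 = 2,  d2f/dxdy = -3,  d2f/dy2 = 6.5
H = [ 2   -3 ;
     -3    6.5];
lambda = eig(H)   % eigenvalues 0.5 and 8 -> both positive
% H is positive definite, so the stationary point is a minimum (SOC satisfied).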


Optimization Algorithm

Deterministic - specific rules (e.g., gradient and Hessian information)
are used to move from one iteration to the next.

Stochastic - probabilistic rules are used for the subsequent iteration.

Optimal Design - engineering design based on an optimization algorithm.

Lagrangian method - the sum of the objective function and a linear
combination of the constraints (see the formula below).
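As a worked formula, the standard textbook Lagrangian for equality constraints g_i(x) = 0 (the notation here is assumed; the course may write it slightly differently):

$$L(x, \lambda) = f(x) + \sum_{i=1}^{m} \lambda_i \, g_i(x)$$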


Optimization Methods
Deterministic
  Direct Search - uses objective function values to locate the minimum.
  Gradient Based - uses first- or second-order derivatives of the objective function.
  Minimization of the objective function f(x) is assumed; a maximization
  problem is handled by minimizing -f(x).
Single Variable
  Newton-Raphson - gradient-based technique (FOC).
  Golden Search - step-size-reducing iterative method (a minimal sketch
  follows this list).
Multivariable Techniques (make use of single-variable techniques,
especially Golden Section)
  Unconstrained Optimization
  a.) Powell Method - quadratic (degree 2) polynomial approximation of the
      objective function; non-gradient based.
  b.) Gradient Based - Steepest Descent (FOC) or Least Square minimum (LMS).
  c.) Hessian Based - Conjugate Gradient (FOC) and BFGS (SOC).
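A minimal golden-section search sketch in MATLAB (an illustration of the idea, not the routine used elsewhere in these slides):

%goldensearch.m
% Golden-section search: minimizes a one-variable function f on [a, b]
% by repeatedly shrinking the bracketing interval by the golden ratio.
function [xmin, fmin] = goldensearch(f, a, b, tol)
    phi = (sqrt(5) - 1)/2;                 % golden ratio factor ~0.618
    x1 = b - phi*(b - a);
    x2 = a + phi*(b - a);
    while (b - a) > tol
        if f(x1) < f(x2)                   % minimum lies in [a, x2]
            b = x2;  x2 = x1;  x1 = b - phi*(b - a);
        else                               % minimum lies in [x1, b]
            a = x1;  x1 = x2;  x2 = a + phi*(b - a);
        end
    end
    xmin = (a + b)/2;
    fmin = f(xmin);
end

For example, goldensearch(@(x) (x - 2).^2 + 1, 0, 5, 1e-4) returns xmin close to 2.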

Optimization Methods - Constrained

Constrained Optimization
a.) Indirect approach - transform the constrained problem into an
    unconstrained one.
b.) Exterior Penalty Function (EPF) and Augmented Lagrange Multiplier
    (ALM) methods.
c.) Direct methods - Sequential Linear Programming (SLP), Sequential
    Quadratic Programming (SQP) and the Generalized Reduced Gradient
    (GRG) method.
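A common textbook form of the exterior penalty function for inequality constraints g_i(x) ≤ 0, shown here as an illustration (not necessarily the exact form used in the course):

$$\phi(x, r) = f(x) + r \sum_{i} \bigl[\max\bigl(0,\ g_i(x)\bigr)\bigr]^2$$

As the penalty parameter r is increased, the unconstrained minima of φ approach the constrained minimum of f.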

Figure 10: Descent Gradient or LMS



Optimization Methods (Cont.)

Global Optimization - Stochastic techniques

Simulated Annealing (SA) - based on the minimum-energy principle of the
crystalline structure of a cooling metal (a minimal sketch follows).
Genetic Algorithm (GA) - "survival of the fittest" principle based on
evolutionary theory.
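A minimal simulated-annealing sketch in MATLAB (an illustrative assumption; the test function, step size and cooling schedule are not from the slides):

% Simulated annealing for a one-variable function: accept downhill moves
% always, and uphill moves with probability exp(-dE/T) while T cools.
f  = @(x) (x - 2).^2 + sin(5*x);   % assumed multimodal test function
x  = 0;  fx = f(x);                % initial state
T  = 1.0;                          % initial "temperature"
for k = 1:2000
    xn = x + 0.5*randn;            % random neighbouring move
    dE = f(xn) - fx;               % change in "energy"
    if dE < 0 || rand < exp(-dE/T)
        x = xn;  fx = f(xn);       % accept the move
    end
    T = 0.995*T;                   % geometric cooling schedule
end
[x fx]                             % typically a point near the global minimum of f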


Optimization Methods (Examples)

Multivariable gradient-based optimization

J is the cost function to be minimized in two dimensions.

The contours of the J paraboloid shrink as J decreases.
Figure 11: Multivariable Gradient based optimization
function retval = Example6_1(x)
% example 6.1
retval = 3 + (x(1) - 1.5*x(2))^2 + (x(2) - 2)^2;

>> SteepestDescent('Example6_1', [0.5 0.5], 20, 0.0001, 0, 1, 20)

where
  [0.5 0.5] - initial guess
  20        - number of iterations
  0.0001    - golden search tolerance
  0         - initial step size
  1         - step interval
  20        - scanning steps

>> ans
   2.7585    1.8960
Figure 12: Steepest Descent
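As a check on this run (worked from the Example6_1 expression above), the first-order condition gives the analytical minimizer:

$$\frac{\partial f}{\partial x_1} = 2(x_1 - 1.5x_2) = 0, \qquad
\frac{\partial f}{\partial x_2} = -3(x_1 - 1.5x_2) + 2(x_2 - 2) = 0
\;\Rightarrow\; x^* = (3,\ 2),\quad f(x^*) = 3,$$

so the 20-iteration result (2.7585, 1.8960) is converging toward the true minimizer.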

Numerical Optimization
Newton-Raphson Method
1. Root solver - system of non-linear equations
   (MATLAB Optimization Toolbox: fsolve)

$$x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}$$

2. One-dimensional optimization solver
   (MATLAB Optimization Toolbox)

$$x_{i+1} = x_i - \frac{f'(x_i)}{f''(x_i)}$$

Steepest Gradient Ascent/Descent Methods

$$x_{i+1} = x_i + \alpha\, d_i, \qquad
  f(x_{i+1}) \approx f(x_i) + \alpha\, \nabla f(x_i) \cdot d_i$$

$\nabla f(x_i) \cdot d_i$ is the magnitude of descent along the direction $d_i$.
$d_i = -\nabla f(x_i)$ achieves the steepest descent; $d_i = +\nabla f(x_i)$ achieves the steepest ascent.
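A minimal MATLAB sketch of the one-dimensional Newton-Raphson update x_{i+1} = x_i - f'(x_i)/f''(x_i) (the function and starting point are assumptions, not from the slides):

% Newton-Raphson minimization of f(x) = (x - 3)^2 + 1.
df  = @(x) 2*(x - 3);      % first derivative f'(x)
d2f = @(x) 2;              % second derivative f''(x)
x   = 0;                   % initial guess
for i = 1:5
    x = x - df(x)/d2f(x);  % Newton-Raphson update
end
x                          % converges to the minimizer x = 3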


Steepest Gradient
Solve the following for two steps of steepest ascent:

$$f(x, y) = 2.25xy + 1.75y - 1.5x^2 - 2y^2$$

The partial derivatives can be evaluated at the initial guesses, x = 1 and y = 1:

$$\frac{\partial f}{\partial x} = -3x + 2.25y = -3(1) + 2.25(1) = -0.75$$

$$\frac{\partial f}{\partial y} = 2.25x - 4y + 1.75 = 2.25(1) - 4(1) + 1.75 = 0$$

Therefore, the search direction is -0.75i.

$$f(1 - 0.75h,\ 1) = 0.5 + 0.5625h - 0.84375h^2$$

This can be differentiated, set equal to zero and solved for h* = 0.33333. Therefore, the result of the
first iteration is x = 1 - 0.75(0.3333) = 0.75 and y = 1 + 0(0.3333) = 1. For the second iteration, the
partial derivatives are evaluated as

$$\frac{\partial f}{\partial x} = -3(0.75) + 2.25(1) = 0$$

$$\frac{\partial f}{\partial y} = 2.25(0.75) - 4(1) + 1.75 = -0.5625$$

Therefore, the search direction is -0.5625j.

$$f(0.75,\ 1 - 0.5625h) = 0.59375 + 0.316406h - 0.63281h^2$$

This can be differentiated, set equal to zero and solved for h* = 0.25.
Therefore, the result of the second iteration is x = 0.75 + 0(0.25) = 0.75 and y = 1 - 0.5625(0.25) = 0.859375.

[Contour plot of f(x, y) over 0 to 1.2 in x and y, with the maximum marked and the steepest-ascent steps shown.]


%chapra14.5 Contd
% Steepest-ascent iterations on J(w1, w2), plotted over its contours.
clear
clc
clf
ww1=0:0.01:1.2;
ww2=ww1;
[w1,w2]=meshgrid(ww1,ww2);
J=-1.5*w1.^2+2.25*w2.*w1-2*w2.^2+1.75*w2;
cs=contour(w1,w2,J,70);
%clabel(cs);
hold
grid
w1=1; w2=1; h=0;
for i=1:10
    syms h
    dfw1=-3*w1(i)+2.25*w2(i);              % dJ/dw1 at the current point
    dfw2=2.25*w1(i)-4*w2(i)+1.75;          % dJ/dw2 at the current point
    fw1=-1.5*(w1(i)+dfw1*h).^2 + 2.25*(w2(i)+dfw2*h).*(w1(i)+ ...
        dfw1*h)-2*(w2(i)+dfw2*h).^2+1.75*(w2(i)+dfw2*h);  % J along the gradient direction
    J=-1.5*w1(i)^2+2.25*w2(i)*w1(i)-2*w2(i)^2+1.75*w2(i)
    g=solve(fw1);
    h=double(sum(g)/2);                    % vertex of the quadratic = optimal step h*
    w1(i+1)=w1(i)+dfw1*h;
    w2(i+1)=w2(i)+dfw2*h;
    plot(w1,w2)
    pause(0.05)
end
MATLAB OPTIMIZATION TOOLBOX

%startchapra.m
clc
clear
x0=[1 1];
options=optimset('LargeScale','off','Display','iter','Maxiter',20, ...
    'MaxFunEvals',100,'TolX',1e-3,'TolFun',1e-3);
[x,fval]=fminunc(@chaprafun,x0,options)

%chaprafun.m
function J=chaprafun(x)
w1=x(1);
w2=x(2);
% fminunc minimizes, so return -J in order to maximize J(w1, w2)
J=-(-1.5*w1^2+2.25*w2*w1-2*w2^2+1.75*w2);


Newton Raphson Four Bar Mechanism


Sine and cosine angle components - all angles are referenced from the global x-axis.
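A reconstruction of the position (loop-closure) equations, inferred from the fouropt.m code later in this deck (link lengths r1 to r4, angles θ1 to θ4):

$$r_2\cos\theta_2 + r_3\cos\theta_3 - r_1\cos\theta_1 - r_4\cos\theta_4 = 0$$
$$r_2\sin\theta_2 + r_3\sin\theta_3 - r_1\sin\theta_1 - r_4\sin\theta_4 = 0$$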

In the above equations θ1 = 0, since link 1 lies along the x-axis; the other three angles are time-varying.
Differentiating once with respect to time gives the angular-velocity relations.


If the input is applied to link 2 (a DC motor), then θ2 is the input to the system.

In matrix form:

2. Numerical solution of the non-linear algebraic equations (Newton-Raphson):

$$x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}$$


$$q_{i+1} = q_i - \frac{f(q_i)}{f'(q_i)}$$

Newton-Raphson update for the unknown angles:

$$\theta_{i+1} = \theta_i - \frac{f(\theta_i)}{f'(\theta_i)}$$

%fouropt.m
function f=fouropt(x)
% Residual of the four-bar loop-closure equations.
% x(1) = theta3, x(2) = theta4 (in radians); the = theta2 (input angle).
the = 0;
r1=12; r2=4; r3=10; r4=7;
f=-[r2*cos(the)+r3*cos(x(1))-r1*cos(0)-r4*cos(x(2));
   r2*sin(the)+r3*sin(x(1))-r1*sin(0)-r4*sin(x(2))];

%startfouropt.m
clc
clear
x0=[0.1 0.1];
options=optimset('LargeScale','off','Display','iter','Maxiter',200, ...
    'MaxFunEvals',100,'TolX',1e-8,'TolFun',1e-8);
[x,fval]=fsolve(@fouropt,x0,options);
theta3=x(1)*57.3   % 57.3 = 180/pi converts radians to degrees
theta4=x(2)*57.3

Related files: Foursimmechm.m, foursimmech.mdl and possol4.m


References:

1) Steven C. Chapra and Raymond P. Canale, Numerical Methods for Engineers, McGraw-Hill,
   Singapore, 2006.
2) Kalyanmoy Deb, Optimization for Engineering Design, Prentice Hall, New Delhi, 1996.
3) Optimization Toolbox for Use with MATLAB, User's Guide Ver. 3, The MathWorks, Natick, MA, USA,
   2006.

