
Advanced Mathematics

Outline: Linear Algebra

Reference book: 《⼯程最优化设计》 (Engineering Optimization Design)

optimization is the process of minimizing or maximizing a function subject to constraints on its variables

Basic steps:
- mathematical modeling: identifying the objective, the variables, and the constraints
- construction of an appropriate model
- once the model is formulated, an optimization algorithm can be used to find the solution

Feasible region: the region enclosed by the constraints (the domain together with its boundaries)

Chapter 1

- linear problems: also called "linear programming"; solved by the simplex method, which moves from one vertex of the feasible region to another
- nonlinear problems:
  - one-dimensional methods:
    - golden section search method (GSM)
    - Fibonacci search method (FSM)
    - quadratic interpolation method
    - cubic interpolation method
  - unconstrained optimization:
    - using derivative information: steepest descent method, Newton method, quasi-Newton method, conjugate gradient method
    - without derivative information: Powell method
  - constrained optimization:
    - solved directly: feasible direction method
    - solved indirectly: penalty function method, SQP method

Graphical method:
1. determine the domains of the variables
2. identify the feasible region of the solutions
3. plot a few contours of the objective function to find the descent direction of the function
4. find the optimal solution of the problem

Iteration (descent) methods:
- choose a search direction S0
- determine the optimal step length a that minimizes the function value along S0
- obtain the new iteration point X1 = X0 + a*S0
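The iteration scheme above can be sketched in code. This is a minimal sketch under an assumption not in the notes: the objective is the quadratic f(x, y) = x^2 + 2y^2, for which the optimal step length from df/da = 0 has the closed form a = -(g0·S0)/(S0·H·S0).

```python
# One descent iteration X1 = X0 + a*S0 with the step length a found by
# exact line search. Assumes f(x, y) = x^2 + 2*y^2 (illustrative choice).

def grad(x, y):
    # analytic gradient of f(x, y) = x^2 + 2*y^2
    return (2.0 * x, 4.0 * y)

def descent_step(x0, y0, s):
    """One iteration: minimize f(X0 + a*S0) over the step length a.

    For a quadratic with Hessian H, df/da = 0 gives
    a = -(g0 . S0) / (S0 . H . S0); here H = diag(2, 4).
    """
    gx, gy = grad(x0, y0)
    num = -(gx * s[0] + gy * s[1])           # -(g0 . S0)
    den = 2.0 * s[0] ** 2 + 4.0 * s[1] ** 2  # S0 . H . S0
    a = num / den
    return x0 + a * s[0], y0 + a * s[1], a

# one step from X0 = (2, 1) along the negative gradient
g = grad(2.0, 1.0)
x1, y1, a = descent_step(2.0, 1.0, (-g[0], -g[1]))
```

With this exact line search the function value is guaranteed to decrease at each iteration.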

Gradient:
- a vector comprised of all the first-order partial derivatives at a point
- points in the direction in which the function value increases most rapidly; the negative gradient direction is the direction of most rapid decrease
- describes the local variation of the function values at one point
- the gradient is a column vector, and direction vectors are also column vectors; pay attention to the transpose

Directional derivative: the rate of change of the function along a given direction
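These two notions can be checked numerically. A minimal sketch, assuming the illustrative test function f(x, y) = x^2 * y (not from the notes), approximating partial derivatives by central differences:

```python
# Gradient as the vector of first-order partial derivatives, and the
# directional derivative as gradient . (unit direction).

import math

def f(x, y):
    return x * x * y

def numerical_gradient(fn, x, y, h=1e-6):
    """Central-difference approximation of the gradient (a column
    vector in the notes; represented here as a tuple)."""
    dfdx = (fn(x + h, y) - fn(x - h, y)) / (2 * h)
    dfdy = (fn(x, y + h) - fn(x, y - h)) / (2 * h)
    return (dfdx, dfdy)

def directional_derivative(fn, x, y, d):
    """Rate of change of fn at (x, y) along direction d:
    grad(f) . d / |d|."""
    gx, gy = numerical_gradient(fn, x, y)
    norm = math.hypot(d[0], d[1])
    return (gx * d[0] + gy * d[1]) / norm

g = numerical_gradient(f, 1.0, 2.0)  # analytic gradient is (2xy, x^2) = (4, 1)
dd = directional_derivative(f, 1.0, 2.0, (1.0, 0.0))
```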

Taylor expansion:
- one variable
- multiple variables: first-order; second-order (involves the Hessian matrix, significant in optimization; gives a quadratic approximation)
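The second-order (quadratic) approximation f(X0 + d) ≈ f(X0) + g·d + ½ d^T H d can be verified numerically. A sketch, assuming the illustrative function f(x, y) = exp(x) + y^2, whose gradient and Hessian are easy to write down:

```python
# Second-order Taylor approximation for a function of two variables.

import math

def f(x, y):
    return math.exp(x) + y * y

def taylor2(x0, y0, dx, dy):
    """f(x0+dx, y0+dy) ~ f + g.d + 0.5 * d^T H d with
    g = (exp(x0), 2*y0) and H = diag(exp(x0), 2)."""
    f0 = f(x0, y0)
    g_dot_d = math.exp(x0) * dx + 2.0 * y0 * dy
    d_H_d = math.exp(x0) * dx * dx + 2.0 * dy * dy
    return f0 + g_dot_d + 0.5 * d_H_d

exact = f(0.1, 1.05)
approx = taylor2(0.0, 1.0, 0.1, 0.05)   # expand around (0, 1)
```

The error is of third order in the step, so for this small step it is well under 1e-3.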

Fundamentals of Optimization

Types of matrix:
- positive definite
- positive semi-definite
- negative definite
- negative semi-definite
- indefinite

Positive Definite Quadratic Function (PDQF), characteristics:
1. the contours of a PDQF are a set of concentric ellipses; the center of this set is the minimum point of the PDQF
2. for a general (non-quadratic) function, the contours near the minimum point are approximately ellipses; the contours become irregular as they move away from the minimum point
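The matrix types above can be decided by Sylvester's criterion on the leading principal minors: all positive means positive definite; alternating signs starting negative means negative definite. A minimal sketch using plain nested lists instead of a matrix library:

```python
# Classify a small symmetric matrix by its leading principal minors.

def det(m):
    """Determinant by cofactor expansion (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += ((-1) ** j) * m[0][j] * det(minor)
    return total

def classify(m):
    minors = [det([row[:k] for row in m[:k]]) for k in range(1, len(m) + 1)]
    if all(d > 0 for d in minors):
        return "positive definite"
    if all((d < 0 if k % 2 == 0 else d > 0) for k, d in enumerate(minors)):
        return "negative definite"
    return "not definite (semi-definite or indefinite)"

kind = classify([[2.0, 1.0], [1.0, 2.0]])  # minors 2 and 3 -> positive definite
```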

Convexity:
- convex set: the straight line segment connecting any two points in S lies entirely inside S
- convex function: for any two points, the straight line segment (chord) between them lies above the graph of the function

Minimizer:
- global minimizer
- local minimizer
- strong minimizer

Descent direction: judged by the angle between the direction and the gradient (a descent direction makes an obtuse angle with the gradient)

Extremum conditions:
- one variable
- multiple variables: first-order condition; second-order condition
Optimization Methods
1-D (one-dimensional) search methods:
- apply to unimodal functions
- STEP 1: determine an initial interval that includes the minimum point (there are basic steps to determine the initial interval)
- STEP 2: interpolate two points inside the current known interval [a, b] and compare them; the length of the interval is reduced after every iteration until the minimum point is found; when the given convergence accuracy is reached, some point inside the interval can be approximately taken as the minimum point along this search direction
- characterized by the ratio of interval reduction

Linear search methods:
- GSM: the interval reduction ratio is 0.618; defined by the principle used to choose the two interior points and by the convergence criterion
- FSM: the Fibonacci numbers Fn increase (1, 1, 2, 3, 5, 8, 13, 21, ...) while the interval lengths Ln decrease
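The GSM described above can be sketched directly. A minimal sketch; the test function and tolerance are illustrative assumptions:

```python
# Golden section search: the interval shrinks by the ratio 0.618 each
# iteration, for a unimodal function on [a, b].

def golden_section_search(fn, a, b, tol=1e-6):
    """Return the approximate minimum point of a unimodal fn on [a, b]."""
    ratio = 0.6180339887498949   # (sqrt(5) - 1) / 2
    x1 = b - ratio * (b - a)     # two interior points, symmetric in [a, b]
    x2 = a + ratio * (b - a)
    f1, f2 = fn(x1), fn(x2)
    while (b - a) > tol:
        if f1 < f2:              # minimum lies in [a, x2]; reuse x1
            b, x2, f2 = x2, x1, f1
            x1 = b - ratio * (b - a)
            f1 = fn(x1)
        else:                    # minimum lies in [x1, b]; reuse x2
            a, x1, f1 = x1, x2, f2
            x2 = a + ratio * (b - a)
            f2 = fn(x2)
    return 0.5 * (a + b)

xmin = golden_section_search(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

Note that each iteration evaluates the function only once, because one of the two interior points is reused; this is the practical advantage of the 0.618 ratio.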

Powell's quadratic interpolation method (QIM):
- the minimum point Xp is determined from two conditions (similar to the method of undetermined coefficients); the fitted parabola must open upward
- convergence criterion of QIM: |X2 - Xp| is small enough

Davidon's cubic interpolation method (CIM)
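One step of the quadratic interpolation above fits a parabola through three points and takes its vertex as the estimate Xp. A sketch; the sample points and test function are assumptions:

```python
# Vertex of the parabola through three points (x1,f1), (x2,f2), (x3,f3).

def parabola_vertex(x1, x2, x3, f1, f2, f3):
    """Minimum point Xp of the interpolating parabola."""
    num = ((x2 ** 2 - x3 ** 2) * f1 + (x3 ** 2 - x1 ** 2) * f2
           + (x1 ** 2 - x2 ** 2) * f3)
    den = (x2 - x3) * f1 + (x3 - x1) * f2 + (x1 - x2) * f3
    return 0.5 * num / den

# for an exactly quadratic f the vertex is recovered in one step
f = lambda x: (x - 1.5) ** 2
xp = parabola_vertex(0.0, 1.0, 3.0, f(0.0), f(1.0), f(3.0))
```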

Steepest Descent Method:
- search direction: the negative gradient direction at the iteration point, Sk = -gk (gk is the gradient)
- X2 = X1 + a*Sk, where the step length a satisfies df/da = 0
- convergence judgement
- drawback: needs many iterations
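The method above can be sketched on a quadratic, where the exact step length from df/da = 0 is available in closed form. The test function f(x, y) = x^2 + 4y^2 is an illustrative assumption:

```python
# Steepest descent: Sk = -gk, with exact line search at each step.

def grad(x, y):
    # gradient of f(x, y) = x^2 + 4*y^2
    return (2.0 * x, 8.0 * y)

def steepest_descent(x, y, tol=1e-8, max_iter=500):
    for k in range(max_iter):
        gx, gy = grad(x, y)
        if gx * gx + gy * gy < tol * tol:     # convergence judgement on |g|
            return x, y, k
        sx, sy = -gx, -gy                     # Sk = -gk
        # exact line search: a = (g . g) / (S . H . S), H = diag(2, 8)
        a = (gx * gx + gy * gy) / (2.0 * sx * sx + 8.0 * sy * sy)
        x, y = x + a * sx, y + a * sy
    return x, y, max_iter

x, y, iters = steepest_descent(4.0, 1.0)
```

The zig-zag path toward (0, 0) takes dozens of iterations even on this mildly ill-conditioned quadratic, which illustrates the drawback noted above.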

Chapter 4: Unconstrained Optimization Methods

Newton Method:
- based on the second-order Taylor expansion; utilizes the negative gradient and the Hessian matrix of the function
- Newton's direction: Sk = -H^{-1} * gk; the step length α is 1
- drawback: if the Hessian is not positive definite, the method is not applicable
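A one-variable sketch of the method above, where the second derivative plays the role of the Hessian. The test function f(x) = x^4 - 2x^2 (local minimizers at x = ±1) is an assumption:

```python
# Newton method for minimization: step = -f'(x)/f''(x), alpha = 1.

def newton_minimize(x, tol=1e-10, max_iter=100):
    for _ in range(max_iter):
        d1 = 4.0 * x ** 3 - 4.0 * x   # f'(x)
        d2 = 12.0 * x ** 2 - 4.0      # f''(x), the "Hessian" in 1-D
        if d2 <= 0.0:
            # the drawback noted above: the step is invalid here
            raise ValueError("second derivative not positive: Newton step invalid")
        step = -d1 / d2               # Newton direction with step length 1
        x = x + step
        if abs(step) < tol:
            return x
    return x

xstar = newton_minimize(2.0)          # converges to the minimizer x = 1
```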

Quasi-Newton Method:
- uses a positive definite matrix to approximate the second-order partial derivatives (the Hessian) of f; there are two ways to obtain the correction term
- DFP: through the iterations, H approaches the inverse of the Hessian
- BFGS: through the iterations, B approaches the Hessian (B = H^{-1}); BFGS works better
- superlinear convergence: 1 < β < 2

Conjugate Gradient Method:
- two formulas for the update coefficient β: PRP and FR
- main idea: utilize the information of the first-order and second-order derivatives of the objective function
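The method above can be sketched with the Fletcher-Reeves (FR) formula β = |g_new|^2/|g_old|^2. The 2-D quadratic f(x, y) = x^2 + 4y^2 is an assumption; on an n-dimensional quadratic, CG with exact line search terminates in at most n steps:

```python
# Conjugate gradient with the FR beta on f(x, y) = x^2 + 4*y^2.

def grad(x, y):
    return (2.0 * x, 8.0 * y)

def cg_fr(x, y, iters=2):
    gx, gy = grad(x, y)
    sx, sy = -gx, -gy                 # first direction: steepest descent
    for _ in range(iters):
        # exact step length for H = diag(2, 8): a = -(g . S)/(S . H . S)
        a = -(gx * sx + gy * sy) / (2.0 * sx * sx + 8.0 * sy * sy)
        x, y = x + a * sx, y + a * sy
        gx_new, gy_new = grad(x, y)
        # FR formula: beta = |g_new|^2 / |g_old|^2
        beta = (gx_new ** 2 + gy_new ** 2) / (gx ** 2 + gy ** 2)
        sx, sy = -gx_new + beta * sx, -gy_new + beta * sy
        gx, gy = gx_new, gy_new
    return x, y

x, y = cg_fr(4.0, 1.0)   # reaches the minimizer (0, 0) in two steps
```

Compare with the steepest descent sketch on the same function, which needs dozens of iterations: the β term bends each new direction so it stays conjugate to the previous one.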

Linear Programming

Constraints:
- the constraint equations ought to be equalities; if there exist inequalities, bring in slack variables to turn them into equalities
- nonnegativity constraints on the variables

Solutions:
- basic solution: only satisfies the constraint equations
- basic feasible solution: satisfies both the constraint equations and the nonnegativity constraints on the variables
- optimal solution: the feasible solution which achieves the optimal value

With n variables and m equations (n > m): set n - m variables to zero (the non-basic variables); the remaining m are the basic variables.

Finding an initial basic feasible solution:
- if all the constraint equations are inequalities: introduce slack variables as the basic variables (their coefficients in the objective function are zero; the other coefficients are those of the original expression)
- if the constraint equations are equalities: introduce artificial variables, with an auxiliary objective function equal to the sum of all the artificial variables; when the value of this objective function is zero, a solution is found; the basic feasible solution is the part without the artificial variables

The simplex method:
- pivot column: the column with the smallest judgement number
- pivot row: the row with the smallest quotient b/a (where a is the entry of that row in pivot column k, and a > 0)
- the judgement number for a basic variable is always zero; the judgement number for a non-basic variable is given by a formula
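The simplex method moves between vertices of the feasible region; since the optimum of an LP lies at a vertex, a tiny 2-variable problem can instead be solved by enumerating the vertices directly, which this sketch does. The example problem (maximize 3x + 5y subject to x ≤ 4, 2y ≤ 12, 3x + 2y ≤ 18, x, y ≥ 0) is an illustrative assumption:

```python
# Solve a small 2-variable LP by enumerating feasible vertices.

from itertools import combinations

# constraints in the form a*x + b*y <= c, including nonnegativity
constraints = [
    (1.0, 0.0, 4.0),    # x <= 4
    (0.0, 2.0, 12.0),   # 2y <= 12
    (3.0, 2.0, 18.0),   # 3x + 2y <= 18
    (-1.0, 0.0, 0.0),   # -x <= 0  (x >= 0)
    (0.0, -1.0, 0.0),   # -y <= 0  (y >= 0)
]

def feasible(x, y, eps=1e-9):
    return all(a * x + b * y <= c + eps for a, b, c in constraints)

def vertices():
    """Intersect every pair of constraint boundaries; keep feasible points."""
    pts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        d = a1 * b2 - a2 * b1
        if abs(d) < 1e-12:
            continue                       # parallel boundaries
        x = (c1 * b2 - c2 * b1) / d        # Cramer's rule
        y = (a1 * c2 - a2 * c1) / d
        if feasible(x, y):
            pts.append((x, y))
    return pts

best = max(vertices(), key=lambda p: 3.0 * p[0] + 5.0 * p[1])
```

Enumeration is exponential in general; the simplex method is the practical algorithm because it visits only a path of improving vertices.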

the iteration points are always confined in the


direct methods feasible region, after considering all the Feasible Direction Method
constraints.

constraints are put into the objective function


such that constrained problems can be converted Penalty Function Method
indirect methods to unconstrained problem, or nonlinear problems
are converted to relatively simple quadratic SQP (Sequential Quadratic Programming) Method
Constrained Optimization Methods
programming problems.
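The penalty function idea can be sketched on a one-variable problem. All problem data here are illustrative assumptions: minimize f(x) = x^2 subject to x ≥ 1 by adding the exterior penalty r·max(0, 1-x)^2 and letting the penalty factor r grow; the inner unconstrained solve uses plain gradient descent:

```python
# Exterior penalty method: the penalized problem is unconstrained, and
# its minimizer approaches the constrained one as r increases.

def penalized_grad(x, r):
    # d/dx [ x^2 + r * max(0, 1 - x)^2 ]
    g = 2.0 * x
    if x < 1.0:
        g += -2.0 * r * (1.0 - x)
    return g

def inner_minimize(x, r, iters=2000):
    step = 1.0 / (2.0 + 2.0 * r)   # 1/curvature: a safe step for this f
    for _ in range(iters):
        x = x - step * penalized_grad(x, r)
    return x

x = 0.0
for r in (1.0, 10.0, 100.0, 1000.0):
    x = inner_minimize(x, r)       # each solve starts from the previous one
# x approaches the constrained minimizer 1 from outside as r grows
```

For this penalized quadratic the minimizer is r/(1+r), so the iterates approach 1 but never reach it for finite r; that is characteristic of exterior penalty methods.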

Kuhn-Tucker Condition:
- effective vs. ineffective constraints: depending on whether the investigated point is on the constraint boundary or not
- extremum conditions (ECs) for equality-constrained problems
- ECs for inequality-constrained problems: handled by introducing slack variables

