
Linear and Nonlinear Optimization
HW2
Naeemullah Khan & Sultan Albarakati

Question 1. Newton Method with Backtracking Line Search.

Codes:
function [value1, hist] = newton_bt(f, grad, hess, x0)
% Newton's method with backtracking line search.
x = x0;
hist = x0;                           % store iterates for the convergence plots
while norm(grad(x)) > 1e-5
    p = -hess(x)\grad(x);            % Newton direction: solve H*p = -g
    t = btlinesearch(f, grad(x), p, x);
    x = x + t*p;
    hist = [hist x];
end
value1 = x;
end
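The routine above calls a helper btlinesearch that is not included in the listing. A minimal backtracking (Armijo) line search consistent with that call might look like the sketch below; the argument order and the constants alpha and beta are assumptions, not taken from the submitted code.

function t = btlinesearch(f, g, p, x)
% Sketch of a backtracking (Armijo) line search.
% alpha and beta are illustrative values, not from the original submission.
alpha = 0.25;                 % sufficient-decrease parameter
beta = 0.5;                   % step shrink factor
t = 1;                        % start from the full Newton step
fx = f(x);
while f(x + t*p) > fx + alpha*t*(g'*p)
    t = beta*t;               % shrink until sufficient decrease holds
end
end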

function g = rosengrad(x)
% Gradient of f(x) = 10*(x2 - x1^2)^2 + (1 - x1)^2.
x1_g = x(1);
x2_g = x(2);
grad_x1 = -40*(x2_g - x1_g^2)*x1_g - 2*(1 - x1_g);
grad_x2 = 20*(x2_g - x1_g^2);
g = [grad_x1; grad_x2];
end

function h = rosenhessian(x)
% Hessian of f(x) = 10*(x2 - x1^2)^2 + (1 - x1)^2.
h = zeros(2,2);
h(1,1) = 2 - 40*x(2) + 120*x(1)^2;
h(2,2) = 20;
h(1,2) = -40*x(1);
h(2,1) = -40*x(1);
end

function f = rosen(xi)
% Objective for Question 1.
x1 = xi(1);
x2 = xi(2);
f = 10*(x2 - x1^2)^2 + (1 - x1)^2;
end
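For reference, the Question 1 routines can be driven with function handles as sketched below; the starting point is an illustrative choice, not necessarily the one used for the reported figures.

% Example call (the starting point [-1.2; 1] is an assumption).
[xmin, hist] = newton_bt(@rosen, @rosengrad, @rosenhessian, [-1.2; 1]);
disp(xmin)    % should be close to the minimizer (1, 1)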


Question 2. Performance of Newton Method.

a)

Figure 1 (Newton's Method)
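One way a convergence plot like Figure 1 can be produced from the hist output is to plot the objective error on a log scale; since the Question 1 objective attains its minimum value f* = 0 at (1, 1), the error at iterate k is just f(x_k). The starting point and plotting details below are assumptions.

% Sketch: log-scale plot of the objective error |f_k - f*|, with f* = 0 at (1, 1).
[~, hist] = newton_bt(@rosen, @rosengrad, @rosenhessian, [-1.2; 1]);
err = arrayfun(@(k) rosen(hist(:,k)), 1:size(hist,2));
semilogy(0:size(hist,2)-1, err, '-o');
xlabel('iteration k'); ylabel('|f_k - f^*|');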


b)

The expression for the rate of convergence of Newton's method near the minimum is

|f_k − f*| < C |f_{k−1} − f*|^2
c) The rate of convergence of Newton's method is quadratic, while for steepest descent it is linear, so Newton's method converges much faster. Furthermore, the constant C in the steepest descent bound depends on the condition number of the Hessian and gives poor results when the condition number is large; this is not the case for Newton's method. Newton's method, however, requires the solution of a linear system of size equal to the number of variables to find the search direction at every iteration.

Figure 2 (Steepest Descent)


Note: The rate of convergence for the steepest descent method is given by

|f_k − f*| < C |f_{k−1} − f*|
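The cost difference discussed in part (c) shows up in how the search direction is formed at each iteration; the schematic comparison below is not part of the submitted code, and the sample point is an arbitrary illustrative choice.

% Per-iteration search directions, evaluated at a sample point.
x = [-1.2; 1];
g = rosengrad(x);
p_newton = -rosenhessian(x)\g;   % Newton: solve an n-by-n linear system each iteration
p_sd     = -g;                   % steepest descent: gradient only, no linear solve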


Question 3

Figure 3 (Convergence for a function of 100 variables)


The figure above plots the log error against the iteration count for a function of 100 variables; the error converges in 11 iterations.


Code:
function f = q3(x)
% Question 3 objective: f(x) = -sum_i log(1 - a_i'*x) - sum_j log(1 - x_j^2).
% Returns inf outside the domain so the backtracking line search rejects the step.
global a;
if max(abs(x)) > 1
    f = inf;
    return
end
f = 0;
for i = 1:500
    if a(:,i)'*x > 1
        f = inf;
        return
    end
    f = f + log(1 - a(:,i)'*x);
end
f = f + sum(log(1 - x.^2));
f = -f;
disp(f)    % print the current objective value
end

function g = grad3(x)
% Gradient of the Question 3 objective.
global a;
g = zeros(100,1);
for j = 1:100
    s = 0;
    for i = 1:500
        s = s + a(j,i)/(1 - a(:,i)'*x);
    end
    g(j) = s + 2*x(j)/(1 - x(j)^2);
end
end

function h = hess3(x)
% Hessian of the Question 3 objective.
global a;
h = zeros(100,100);
for j = 1:100
    for k = 1:100
        s = 0;
        for i = 1:500
            s = s + a(j,i)*a(k,i)/(1 - a(:,i)'*x)^2;
        end
        if j == k
            h(j,k) = s + (2 + 2*x(j)^2)/(1 - x(j)^2)^2;
        else
            h(j,k) = s;
        end
    end
end
end
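The Question 3 routines rely on a global data matrix a and on a driver that is not shown. A minimal driver sketch follows; the random data, its scaling, and the plotting details are assumptions, and the actual data behind Figure 3 is not reproduced here.

% Sketch of a driver for Question 3 (data matrix a and plotting are assumptions).
global a;
a = 0.01*randn(100, 500);               % illustrative random problem data
x0 = zeros(100, 1);                     % strictly feasible: a_i'*x0 = 0 < 1 and |x0_j| < 1
[xmin, hist] = newton_bt(@q3, @grad3, @hess3, x0);

fstar = q3(xmin);                       % use the final value as a stand-in for f*
err = arrayfun(@(k) q3(hist(:,k)) - fstar, 1:size(hist,2)-1);
semilogy(0:size(hist,2)-2, err, '-o');  % error vs. iteration, as in Figure 3
xlabel('iteration k'); ylabel('f_k - f^*');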
