
Practice Problem Set 7

OA4201 Nonlinear Programming

1. Consider the problem

min x₁² + x₂²

s.t. x₁ + x₂ ≥ 1
     x₁ ≤ 2
     x₂ ≤ 2

a) Explain why the feasible set {x ∈ ℝ² : x₁ + x₂ ≥ 1, x₁ ≤ 2, x₂ ≤ 2} is convex.

b) Draw the feasible region and the contours of the objective function. Identify the
globally optimal solution graphically.

c) Show that the direction dᵏ = x̄ᵏ − xᵏ of the conditional gradient method at the
solution xᵏ = (2, 0) equals dᵏ = (−3, 2)'. Draw this direction. (Hint: solve the
direction-finding problem graphically and set its optimal solution equal to x̄ᵏ.)

d) With dᵏ and xᵏ as given above, show that mₖ = 1 in the limited Armijo rule (Armijo
with feasibility checks) using parameters s = 1, σ = 0.1, β = 0.5. That is, show that

for m = 0, f(xᵏ) − f(xᵏ + βᵐsdᵏ) < −σβᵐs∇f(xᵏ)'dᵏ, and

for m = 1, f(xᵏ) − f(xᵏ + βᵐsdᵏ) ≥ −σβᵐs∇f(xᵏ)'dᵏ and xᵏ + βᵐsdᵏ satisfies the
constraints.

e) Show that x* = (½, ½) satisfies the first-order necessary condition for constrained
optimization over a convex set.

2. Consider the following problem with nonlinear objective function and linear constraints:

min f(x)
s.t. Ax = b
     x ≥ 0

In feasible direction methods (such as the conditional gradient method and the reduced
gradient method), we need an initial feasible solution. Formulate a linear program for
finding a feasible solution of the given problem.
Solutions to Practice Problem Set 7
OA4201 Nonlinear Programming
1. Consider the problem

min x₁² + x₂²

s.t. x₁ + x₂ ≥ 1
     x₁ ≤ 2
     x₂ ≤ 2

a) Explain why the feasible set {x ∈ ℝ² : x₁ + x₂ ≥ 1, x₁ ≤ 2, x₂ ≤ 2} is convex.

Let’s use the definition of a convex set. Pick x, y ∈ {x ∈ ℝ² : x₁ + x₂ ≥ 1, x₁ ≤ 2, x₂ ≤ 2}
and α ∈ [0, 1] arbitrarily. Consider z = αx + (1 − α)y and show that this point satisfies
all three constraints:

z₁ + z₂ = α(x₁ + x₂) + (1 − α)(y₁ + y₂) ≥ α · 1 + (1 − α) · 1 = 1,
z₁ = αx₁ + (1 − α)y₁ ≤ α · 2 + (1 − α) · 2 = 2,
z₂ = αx₂ + (1 − α)y₂ ≤ 2,

which completes the proof.
Note also that this feasible set is defined by a system of linear inequalities. As we have
seen, such a set is always convex.

b) Draw the feasible region and the contours of the objective function. Identify the
globally optimal solution graphically.

The contours of the objective are circles centered at the origin, so the optimum is the
feasible point closest to the origin; graphically, this is where the smallest such circle
touches the constraint x₁ + x₂ = 1. The globally optimal solution is x* = (½, ½).
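As a numerical sanity check (not part of the original solution), a brute-force grid search over the feasible region confirms the graphical answer; the grid step 0.01 is our choice, and it happens to contain the true optimum exactly:

```python
# Brute-force check that x* = (0.5, 0.5) minimizes x1^2 + x2^2
# over the feasible set {x1 + x2 >= 1, x1 <= 2, x2 <= 2}.
# Note x1 >= 1 - x2 >= -1 (and likewise x2 >= -1), so the grid
# [-1, 2] x [-1, 2] covers the entire feasible region.

def f(x1, x2):
    return x1**2 + x2**2

def feasible(x1, x2):
    return x1 + x2 >= 1 and x1 <= 2 and x2 <= 2

best = None
for i in range(-100, 201):          # x1 in [-1, 2], step 0.01
    for j in range(-100, 201):      # x2 in [-1, 2], step 0.01
        x1, x2 = i / 100, j / 100
        if feasible(x1, x2) and (best is None or f(x1, x2) < best[0]):
            best = (f(x1, x2), x1, x2)

print(best)  # -> (0.5, 0.5, 0.5)
```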


c) Show that the direction dᵏ = x̄ᵏ − xᵏ of the conditional gradient method at the solution
xᵏ = (2, 0) equals dᵏ = (−3, 2)'. Draw this direction. (Hint: solve the direction-finding
problem graphically and set its optimal solution equal to x̄ᵏ.)

f(x) = x₁² + x₂²
∇f(x) = (2x₁, 2x₂)'

The objective function in the direction-finding problem is ∇f(xᵏ)'(x − xᵏ).
With ∇f(xᵏ) = (4, 0)', this equals 4(x₁ − 2) = 4x₁ − 8, so the direction-finding
problem takes the form:

min 4x₁ − 8
s.t. x₁ + x₂ ≥ 1
     x₁ ≤ 2
     x₂ ≤ 2

Graphically we obtain x̄ᵏ = (−1, 2). Hence,

dᵏ = (−1, 2)' − (2, 0)' = (−3, 2)'.

d) With dᵏ and xᵏ as given above, show that mₖ = 1 in the limited Armijo rule (Armijo
with feasibility checks) using parameters s = 1, σ = 0.1, β = 0.5. That is, show that

for m = 0, f(xᵏ) − f(xᵏ + βᵐsdᵏ) < −σβᵐs∇f(xᵏ)'dᵏ, and

for m = 1, f(xᵏ) − f(xᵏ + βᵐsdᵏ) ≥ −σβᵐs∇f(xᵏ)'dᵏ and xᵏ + βᵐsdᵏ satisfies the
constraints.
For m = 0: f(xᵏ) − f(xᵏ + βᵐsdᵏ) = 4 − (2 − 0.5⁰·3)² − (0 + 0.5⁰·2)² = 4 − 1 − 4 = −1, while
−σβᵐs∇f(xᵏ)'dᵏ = −0.1 · 0.5⁰ · (4, 0)(−3, 2)' = −0.1 · (−12) = 1.2, so the Armijo
inequality fails.

For m = 1: f(xᵏ) − f(xᵏ + βᵐsdᵏ) = 4 − (2 − 0.5·3)² − (0 + 0.5·2)² = 4 − 0.25 − 1 = 2.75, while
−σβᵐs∇f(xᵏ)'dᵏ = −0.1 · 0.5 · (4, 0)(−3, 2)' = 0.6, so the inequality holds.

Furthermore, xᵏ + βsdᵏ = (2, 0)' + 0.5 · (−3, 2)' = (0.5, 1)', which is feasible. Hence mₖ = 1.

e) Show that x* = (½, ½) satisfies the first-order necessary condition for constrained
optimization over a convex set.

f(x) = x₁² + x₂²
∇f(x) = (2x₁, 2x₂)'

Hence, with ∇f(x*) = (1, 1)',

∇f(x*)'(x − x*) = (1, 1)(x₁ − ½, x₂ − ½)' = x₁ + x₂ − 1 ≥ 0

for all feasible solutions, since every feasible x satisfies x₁ + x₂ ≥ 1.
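Since ∇f(x*)'(x − x*) is linear in x, it suffices to verify the inequality at the three vertices of the feasible triangle; a quick check:

```python
# Verify the first-order condition ∇f(x*)'(x - x*) >= 0 at the
# vertices of the feasible triangle; a linear function attains its
# minimum over a polytope at a vertex, so this covers all feasible x.

xstar = (0.5, 0.5)
grad = (2*xstar[0], 2*xstar[1])      # ∇f(x*) = (1, 1)'

for v in [(2.0, 2.0), (2.0, -1.0), (-1.0, 2.0)]:
    val = grad[0]*(v[0]-xstar[0]) + grad[1]*(v[1]-xstar[1])
    print(v, val)                    # values: 3.0, 0.0, 0.0
    assert val >= 0
```

The value 0 at the vertices (2, −1) and (−1, 2) reflects that these points lie on the active constraint x₁ + x₂ = 1 through x*.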

2. Consider the following problem with nonlinear objective function and linear constraints:

min f(x)
s.t. Ax = b
     x ≥ 0

In feasible direction methods (such as the conditional gradient method and the reduced
gradient method), we need an initial feasible solution. Formulate a linear program for
finding a feasible solution of the given problem.

Answer: Solve the linear program

min c'x
s.t. Ax = b
     x ≥ 0

for an arbitrary cost vector c (for example, c = 0). This LP has the same feasible set
as the original problem, so any LP method you learned (e.g., two-phase simplex) either
returns a feasible point or certifies that no feasible solution exists.
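As an illustration (assuming SciPy is available), passing a zero cost vector to a standard LP solver returns a feasible point whenever one exists; the matrix A and vector b below are a made-up instance, not from the problem:

```python
# Find a feasible point of {Ax = b, x >= 0} by solving an LP with an
# arbitrary (here zero) cost vector. A and b are an example instance.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0, 1.0],
              [1.0, -1.0, 0.0]])
b = np.array([6.0, 1.0])
c = np.zeros(3)                      # arbitrary cost vector

res = linprog(c, A_eq=A, b_eq=b, bounds=(0, None))
assert res.success                   # a feasible point was found
x0 = res.x                           # initial point for a feasible direction method
assert np.allclose(A @ x0, b) and (x0 >= -1e-9).all()
print(x0)
```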
