ABSTRACT
In this paper we use one of the preconditioned conjugate gradient algorithms with a quasi-Newton approximation, namely the parallel preconditioned BFGS algorithm suggested by (Al-Bayati and Aref, 2001), to solve constrained optimization problems, and we propose a new algorithm in this field. The proposed algorithm was tested on a number of constrained problems, and the numerical results demonstrated the efficiency of the new algorithm compared with the original algorithms used for solving constrained optimization problems.
1. Introduction
Consider the following constrained optimization problem:

min f(x)
s.t. c_k(x) ≥ 0,  ∀k ∈ I    …(1)

where f and the c_k are functions from E^n to E, and I is a finite set of positive integers. Assume that f(x) and the c_k(x) are twice differentiable on an open set F, where:
Abbas Younis Al-Bayati and Ban Ahmed Mitras
A solution of a general constrained problem can be found by using a modified objective function

θ(x, r) = f(x) + r B(x)    …(3)

where r is a control parameter and the second term B(x) is a barrier function. The initial value of r is important in reducing the number of iterations needed to minimize θ(x, r). The following initial value r0 was suggested by Fiacco and McCormick in 1968:
r0 = [∇f(x)^T ∇B(x)] / [∇B(x)^T ∇B(x)]    …(4)
Several alternatives to this value of r0 have been derived; see (Al-Bayati and Hamed, 2000) and (Al-Bayati and Hamed, 1999).
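To make eq. (4) concrete, the sketch below evaluates r0 for a small hypothetical problem. The quadratic objective, the circular constraint, and the logarithmic barrier B(x) = −log c(x) are all illustrative assumptions; the paper only requires B to be a barrier function.

```python
# Hypothetical example (not from the paper):
#   min f(x) = (x1 - 2)^2 + (x2 - 1)^2   s.t.  c(x) = 1 - x1^2 - x2^2 >= 0
# with B(x) = -log c(x), a common barrier choice.

def grad_f(x1, x2):
    return (2.0 * (x1 - 2.0), 2.0 * (x2 - 1.0))

def c(x1, x2):
    return 1.0 - x1 * x1 - x2 * x2

def grad_B(x1, x2):
    # For B = -log c:  grad B = -grad c / c, with grad c = (-2 x1, -2 x2)
    cv = c(x1, x2)
    return (2.0 * x1 / cv, 2.0 * x2 / cv)

def r0(x1, x2):
    # Fiacco-McCormick starting value, eq. (4):
    #   r0 = (grad f . grad B) / (grad B . grad B)
    gf, gB = grad_f(x1, x2), grad_B(x1, x2)
    num = gf[0] * gB[0] + gf[1] * gB[1]
    den = gB[0] * gB[0] + gB[1] * gB[1]
    return num / den
```

The starting point must be strictly feasible (c > 0) so that the barrier and its gradient are defined.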
In this paper the parallel preconditioned BFGS algorithm proposed by (Al-Bayati and Aref, 2001) is used to solve constrained optimization problems.
2. Preconditioned Conjugate Gradient (PCG) Methods
Quasi-Newton (QN) and conjugate gradient (CG) algorithms each have practical advantages and disadvantages when applied to general functions. A QN algorithm is faster (and requires fewer evaluations of the objective function) than CG algorithms, and QN algorithms generate a sequence of square, symmetric, positive-definite matrices. Limited variable-metric algorithms, however, become inefficient in their storage use when the number of variables grows steadily. For this reason a new class of modified CG algorithms was introduced, called preconditioned conjugate gradient (PCG) algorithms. The idea of preconditioning is to transform the problem so that the Hessian matrix of the transformed problem has clustered eigenvalues and is well-conditioned. The purpose of a PCG algorithm is to keep the storage requirement of order n while improving the convergence of the algorithm; for more details see (Buckley and LeNir, 1983; Nazareth, 1979), and on PCG algorithms also (Buckley, 1978; Al-Bayati and Mohammed Ali, 2002). In PCG algorithms the conjugate gradient direction is generated by means of matrix transformations.
Using the Preconditioned BFGS Update in Constrained Optimization
The approximation of the inverse Hessian must be positive definite and symmetric. In general, the preconditioning matrix H must be used throughout in order to obtain quadratic termination with the conjugate gradient algorithm. This holds for quadratic models; for non-quadratic models, see (Al-Bayati and Al-Assady, 1997).
2.3 Limited-Memory Quasi-Newton Approximations
Limiting the memory of QN approximations has proved to be another way of preconditioning with QN approximations. The initial approximation of the inverse Hessian matrix can be any symmetric positive-definite matrix that can be stored in the available memory and multiplied by a vector in O(n) operations; it is usually a diagonal or banded matrix. Instead of updating the explicit elements of the approximation, the update is represented entirely by storing the vectors and parameters that define it (Shanno, 1982). The initial approximation therefore requires little storage, and a set of pairs of vectors and parameters is stored instead of an n×n matrix. An approximation stored in this way is a limited-memory QN approximation. There are one-step and two-step storage variants; for more information see (Al-Bayati and Ahmed, 1997).
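The paper gives no formulas for the limited-memory representation; as a hedged sketch, the widely used two-loop recursion (one standard realization of this idea) shows how the product H·v can be formed from a diagonal initial matrix H0 and a short list of stored (σ, y) pairs, in O(n) work per stored pair:

```python
# Limited-memory QN matrix-vector product: instead of an n x n matrix,
# H is represented by a diagonal initial matrix H0 and a few stored
# (sigma, y) pairs.  This is the standard two-loop recursion, used here
# as an illustrative stand-in for the storage schemes the paper cites.

def lm_apply(h0_diag, pairs, v):
    """Return H @ v, where H is the limited-memory inverse-Hessian
    approximation built from H0 = diag(h0_diag) and pairs = [(sigma, y), ...]."""
    q = list(v)
    stack = []
    # First loop: walk the pairs from newest to oldest.
    for sigma, y in reversed(pairs):
        rho = 1.0 / sum(yi * si for yi, si in zip(y, sigma))
        alpha = rho * sum(si * qi for si, qi in zip(sigma, q))
        q = [qi - alpha * yi for qi, yi in zip(q, y)]
        stack.append((rho, alpha, sigma, y))
    # Apply the diagonal initial approximation.
    r = [d * qi for d, qi in zip(h0_diag, q)]
    # Second loop: walk back from oldest to newest.
    for rho, alpha, sigma, y in reversed(stack):
        beta = rho * sum(yi * ri for yi, ri in zip(y, r))
        r = [ri + (alpha - beta) * si for ri, si in zip(r, sigma)]
    return r
```

With an empty pair list the product reduces to H0·v; with the most recent pair (σ, y) stored, the represented H satisfies the secant condition H y = σ.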
3. Improved BFGS Algorithm
In the well-known preconditioned variable-metric algorithms we start from an arbitrary point and compute the search direction s_k at iteration k as

s_k = −H_k ∇f(x_k)    …(5)

where f is the function being minimized (assumed continuous with second derivatives), x_k is the current iterate, and H_k is the approximation to the inverse Hessian matrix at x_k. The next iterate x_{k+1} is obtained from

x_{k+1} = x_k + λ_k s_k    …(6)
where λ_k is chosen to (approximately) minimize f along the search direction s_k from x_k. Finally, H_k is updated by adding to it a rank-one or rank-two correction matrix D_k, to obtain

H_{k+1} = H_k + D_k    …(7)
where D_k depends on

σ_k = x_{k+1} − x_k    …(8)

y_k = g(x_{k+1}) − g(x_k)    …(9)
where g(x) = ∇f(x).
There are many possible choices for the correction matrix D_k; among the most important are those of (Fletcher and Powell, 1963; Broyden, 1970; Davidon, 1959; Al-Bayati, 1991).
We first choose the improved rank-two BFGS matrix, where D_k is defined as follows:
D_k = − [H_k y_k (y_k)^T H_k] / [(y_k)^T H_k y_k] + [σ_k (σ_k)^T] / [(σ_k)^T y_k] + w_k (w_k)^T    …(10)

where

w_k = ((y_k)^T H_k y_k)^{1/2} [ σ_k / ((σ_k)^T y_k) − H_k y_k / ((y_k)^T H_k y_k) ]
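A minimal sketch of the update (7) with the correction (10), on plain Python lists; the small matrix helpers are illustrative scaffolding, not from the paper. A quick property to check is the secant condition H_{k+1} y_k = σ_k, which the rank-two BFGS correction satisfies by construction.

```python
# Rank-two BFGS correction of eq. (10) and the update of eq. (7).
# Matrices are nested lists; any linear-algebra library would do the same job.

def mat_vec(M, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in M]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def bfgs_correction(H, sigma, y):
    """D_k of eq. (10):
    -H y y^T H / (y^T H y) + sigma sigma^T / (sigma^T y) + w w^T,
    with w as defined below eq. (10)."""
    Hy = mat_vec(H, y)
    yHy = dot(y, Hy)
    sy = dot(sigma, y)
    w = [(yHy ** 0.5) * (si / sy - Hyi / yHy) for si, Hyi in zip(sigma, Hy)]
    n = len(sigma)
    return [[-Hy[i] * Hy[j] / yHy + sigma[i] * sigma[j] / sy + w[i] * w[j]
             for j in range(n)] for i in range(n)]

def bfgs_update(H, sigma, y):
    D = bfgs_correction(H, sigma, y)                      # eq. (10)
    return [[hij + dij for hij, dij in zip(hrow, drow)]   # eq. (7)
            for hrow, drow in zip(H, D)]
```

The update preserves symmetry, and positive definiteness is preserved whenever the curvature condition (σ_k)^T y_k > 0 holds.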
H_k can also be combined with the conjugate gradient method through the search direction s_{k+1}, to obtain

s_{k+1} = −H_{k+1} g_{k+1} + [ (g_{k+1})^T H_{k+1} g_{k+1} / ((g_k)^T g_k) ] s_k    …(11)
where H_{k+1} is a symmetric positive-definite matrix. For more information on the BFGS algorithm see (Al-Bayati and Ahmed, 1996).
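Eq. (11) translates directly into code; the sketch below assumes H_{k+1}, the two gradients, and the previous direction are available as plain lists and nested lists (an illustrative representation, not prescribed by the paper).

```python
# Preconditioned CG search direction of eq. (11):
#   s_{k+1} = -H_{k+1} g_{k+1}
#             + [ (g_{k+1})^T H_{k+1} g_{k+1} / (g_k)^T g_k ] * s_k

def pcg_direction(H_next, g_next, g_prev, s_prev):
    Hg = [sum(hij * gj for hij, gj in zip(row, g_next)) for row in H_next]
    beta = (sum(gi * hgi for gi, hgi in zip(g_next, Hg))
            / sum(gi * gi for gi in g_prev))
    return [-hgi + beta * si for hgi, si in zip(Hg, s_prev)]
```

With H_{k+1} equal to the identity this reduces to an ordinary CG-style direction, which is one way to sanity-check an implementation.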
Step Four: check whether ||g(x_{k+1})|| < ε; if so, stop (where ε is a small positive number).
Algorithm (2)
Steps One and Two: as in Algorithm (1).
Step Three: compute

φ(x, r) = f(x) + Σ_{k=1}^{m} r_k B{c_k(x)}
Step Five: check whether ||g(x_{k+1})|| < ε; if so, stop (where ε is a small positive number); otherwise go to Step Six.
Step Six: compute in parallel f(x_k), g(x_k), g(x_{k1}), …, g(x_{kn}), where x_{kj} = x_k + δ_j and

y_{kj} = g(x_{kj}) − g(x_k)

and compute γ_{kj}, j = 1, 2, …, n, from

γ_{kj} = δ_j / ((y_{kj})^T δ_j) − H_k y_{kj} / ((y_{kj})^T H_k y_{kj})
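Step Six can be sketched as follows. The coordinate perturbation δ_j = h·e_j and the thread pool are illustrative assumptions; the paper fixes neither the perturbations nor the parallel machinery.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of Step Six of Algorithm (2): evaluate the n shifted gradients
# g(x_k + delta_j) concurrently, then form y_kj and gamma_kj.
# Here delta_j = h * e_j (a coordinate step), an illustrative choice.

def step_six(g, x, H, h=1e-3):
    n = len(x)
    deltas = [[h if i == j else 0.0 for i in range(n)] for j in range(n)]
    points = [[xi + di for xi, di in zip(x, d)] for d in deltas]
    with ThreadPoolExecutor() as pool:
        grads = list(pool.map(g, [x] + points))   # g(x_k) and the g(x_kj)
    g0, gj = grads[0], grads[1:]
    ys = [[a - b for a, b in zip(gv, g0)] for gv in gj]   # y_kj
    gammas = []
    for d, y in zip(deltas, ys):
        Hy = [sum(hij * yi for hij, yi in zip(row, y)) for row in H]
        yd = sum(yi * di for yi, di in zip(y, d))
        yHy = sum(yi * hyi for yi, hyi in zip(y, Hy))
        # gamma_kj = delta_j / (y^T delta_j) - H y / (y^T H y)
        gammas.append([di / yd - hyi / yHy for di, hyi in zip(d, Hy)])
    return ys, gammas
```

For a quadratic objective the y_{kj} are exact curvature pairs, so the step also doubles as a cheap consistency check on the gradient code.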
Table (1)
Comparison of the new algorithm with the SUMT algorithm

Test function    SUMT algorithm     New algorithm
                 NOI (NOF)          NOI (NOF)
1.               32 (129)           26 (106)
2.               24 (81)            21 (73)
3.               37 (137)           27 (99)
4.               13 (50)            10 (42)
5.               39 (122)           31 (100)
Total            145 (519)          115 (420)
Table (2)

        SUMT algorithm    New algorithm
NOI     100%              79.3%
NOF     100%              82.3%
7. Appendix: Test Functions
(1) min f(x) = (x1 − 2)^2 + (x2 − 1)^2,    s.p. (2, 7)
    s.t. x1 − 2 x2 = −1
         −x1^2/4 − x2^2 + 1 ≥ 0
(2) min f(x) = −x1^2 x2
    s.t. x1 x2 + x1^2/2 = 6
         x1 + x2 ≥ 0
(3) min f(x) = x1^2 + x2^2,    s.p. (0.9, 2)
    s.t. x1 + 2 x2 = 4
         x1^2 + x2^2 ≤ 5
         xi ≥ 0, i = 1, 2
(4) min f(x) = (x1 − x2)^2 + (x2 − 2 x2)^2,    s.p. (0, 1)
    s.t. x1^2 − x2 ≤ 0
(5) min f(x) = x1 x4 (x1 + x2 + x3) + x3,    x0 = (4, 3, 3, 3)
    s.t. x1^2 + x2^2 + x3^2 + x4^2 = 40
         x1 x2 x3 ≥ 25
         1 ≤ xi ≤ 5, i = 1, …, 4
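For reference, test problem (1) can be coded together with a mixed penalty-barrier function in the spirit of Algorithm (2). The log barrier on the inequality and the quadratic penalty on the equality are illustrative choices, since the paper leaves the form of B unspecified; the inequality is read as an ellipse-interior constraint.

```python
import math

# Test problem (1):
#   min f(x) = (x1 - 2)^2 + (x2 - 1)^2
#   s.t.  x1 - 2 x2 = -1,   -x1^2/4 - x2^2 + 1 >= 0

def f1(x1, x2):
    return (x1 - 2.0) ** 2 + (x2 - 1.0) ** 2

def h1(x1, x2):
    # equality constraint x1 - 2 x2 = -1, rewritten as h(x) = 0
    return x1 - 2.0 * x2 + 1.0

def c1(x1, x2):
    # inequality constraint: positive strictly inside the ellipse
    return -x1 * x1 / 4.0 - x2 * x2 + 1.0

def phi(x1, x2, r):
    # penalized objective: log barrier on c1 (needs c1 > 0) and a
    # quadratic penalty weighted by 1/r on the equality
    return f1(x1, x2) + h1(x1, x2) ** 2 / r - r * math.log(c1(x1, x2))
```

Driving r toward zero tightens both terms, which is the usual SUMT-style sequence against which Table (1) compares the new algorithm.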
References