Copyright © 2005 Andreas Antoniou
Victoria, BC, Canada
Email: aantoniou@ieee.org
[Figure: idealized gain response M0(ω) versus frequency ω, rad/s]
H(z) = H0 ∏_{j=1}^{J} (a0j + a1j z + z²) / (b0j + b1j z + z²)
a0j = b0j = 0
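As an illustration, the cascade transfer function above can be evaluated on the unit circle z = e^{jω} to obtain the gain. The sketch below is a minimal NumPy implementation; the function name and the coefficient values are placeholders for illustration, not a designed filter.

```python
import numpy as np

def cascade_gain(a, b, H0, w):
    """Evaluate |H(e^{jw})| for a cascade of second-order sections.

    a, b: sequences of pairs (a0j, a1j) and (b0j, b1j); each section
    contributes (a0j + a1j*z + z^2) / (b0j + b1j*z + z^2).
    """
    z = np.exp(1j * np.asarray(w, dtype=float))
    H = np.full_like(z, H0, dtype=complex)
    for (a0, a1), (b0, b1) in zip(a, b):
        H *= (a0 + a1 * z + z**2) / (b0 + b1 * z + z**2)
    return np.abs(H)

# Placeholder coefficients for a two-section cascade (illustrative only)
a = [(1.0, 2.0), (1.0, -2.0)]
b = [(0.25, 0.5), (0.25, -0.5)]
w = np.linspace(0, np.pi, 5)
gain = cascade_gain(a, b, H0=0.1, w=w)
```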
ei (x) = e(x, ωi )
for i = 1, 2, . . . , K .
[Figure: actual gain M(x, ω) and idealized gain M0(ω) versus ω, rad/s, with the error e(x, ωi) shown at the sample frequencies ω1, ω2, . . . , ωi, . . . , ωK]
• For filter design, the most important norms are the L2 and L∞ norms, which are defined as

L2(x) = [ Σ_{i=1}^{K} |ei(x)|² ]^{1/2}

and

L∞(x) = lim_{p→∞} [ Σ_{i=1}^{K} |ei(x)|^p ]^{1/p}
      = Ê(x) lim_{p→∞} { Σ_{i=1}^{K} [ |ei(x)| / Ê(x) ]^p }^{1/p}
      = Ê(x)

where Ê(x) = max_{1≤i≤K} |ei(x)|.
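As a sketch of these definitions, the L2 and Lp norms of an error vector can be computed directly, and the Lp norm can be seen to approach max |ei(x)| as p grows, consistent with the limit above. The error values below are arbitrary illustrative numbers.

```python
import numpy as np

def L2_norm(e):
    """L2(x) = [sum of |e_i|^2]^(1/2) over the sample-point errors."""
    return np.sqrt(np.sum(np.abs(e) ** 2))

def Lp_norm(e, p):
    """General Lp norm; as p grows it approaches max |e_i|."""
    return np.sum(np.abs(e) ** p) ** (1.0 / p)

def Linf_norm(e):
    """L_inf(x) = max_i |e_i(x)|, the limit of Lp as p -> infinity."""
    return np.max(np.abs(e))

e = np.array([0.3, -0.7, 0.5, 0.2])  # illustrative error samples
```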
minimize Ψ(x) with respect to x
• If Ψ(x) = L2(x), a least-squares solution is obtained, and if Ψ(x) = L∞(x), a minimax solution is obtained.

[Figure: contours of the objective function showing the minimum point]
∂f(x + δ)/∂δk = 0 for k = 1, 2, . . . , n.
• The solution of the above equation can be expressed as

δ = −H⁻¹g
and

H = ⎡ ∂²f(x)/∂x1²     ∂²f(x)/∂x1∂x2   ···   ∂²f(x)/∂x1∂xn ⎤
    ⎢ ∂²f(x)/∂x2∂x1   ∂²f(x)/∂x2²     ···   ∂²f(x)/∂x2∂xn ⎥
    ⎢       ⋮                ⋮          ⋱          ⋮       ⎥
    ⎣ ∂²f(x)/∂xn∂x1   ∂²f(x)/∂xn∂x2   ···   ∂²f(x)/∂xn²   ⎦
δ = −H⁻¹g

• This equation will give the solution if and only if the following two conditions hold:
1. The remainder o(||δ||₂²) can be neglected.
2. The Hessian is nonsingular.
• If f(x) is a quadratic function, its second partial derivatives are constants, i.e., H is a constant symmetric matrix, and its third and higher derivatives are zero, i.e., condition (1) holds.
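The quadratic case can be sketched numerically: for f(x) = ½xᵀQx + cᵀx, one Newton step δ = −H⁻¹g from any starting point lands exactly on the stationary point. The matrix Q, vector c, and starting point below are arbitrary illustrative values.

```python
import numpy as np

# Quadratic f(x) = 0.5 x^T Q x + c^T x with positive-definite Q (illustrative)
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([-1.0, 2.0])

x = np.array([5.0, -7.0])        # arbitrary starting point
g = Q @ x + c                    # gradient at x
H = Q                            # Hessian is constant for a quadratic
delta = -np.linalg.solve(H, g)   # Newton step: delta = -H^{-1} g
x_min = x + delta                # stationary point reached in one step

# At x_min the gradient Q x_min + c vanishes
```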
• If f(x) has a minimum at a stationary point, then the Hessian matrix is positive definite at the minimum point. In such a case, the Hessian is nonsingular, i.e., condition (2) holds.
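Positive definiteness can be checked numerically: a Cholesky factorization of a symmetric matrix succeeds exactly when the matrix is positive definite. A minimal sketch using NumPy, with illustrative matrices:

```python
import numpy as np

def is_positive_definite(H):
    """Return True if symmetric H is positive definite (hence nonsingular),
    using the fact that Cholesky factorization fails otherwise."""
    try:
        np.linalg.cholesky(H)
        return True
    except np.linalg.LinAlgError:
        return False

H_good = np.array([[2.0, 0.5],
                   [0.5, 1.0]])   # positive definite (det > 0, trace > 0)
H_bad = np.array([[1.0, 2.0],
                  [2.0, 1.0]])    # indefinite: eigenvalues 3 and -1
```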
• When the two conditions apply, the solution of the optimization problem can be computed very quickly (e.g., in MATLAB), i.e., the minimum point x̂ and the minimum value of the function f(x̂) are obtained as

x̂ = x + δ,  f(x̂)
• We need optimization algorithms to solve problems that are not quadratic.
• In iterative algorithms, the quantity −H⁻¹g is used as a search direction

d = −H⁻¹g

and the change in x is computed as

δ = αd

where α is a step size determined by a line search; with H = I, d reduces to the steepest-descent direction −g.
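The relations d = −H⁻¹g, δ = αd, and the H = I special case can be sketched as a simple iterative loop. The test function, iteration count, and crude backtracking rule below are illustrative assumptions, not part of the slides.

```python
import numpy as np

def f(x):
    return x[0] ** 4 + x[1] ** 2            # simple non-quadratic test function

def grad(x):
    return np.array([4 * x[0] ** 3, 2 * x[1]])

def hess(x):
    return np.array([[12 * x[0] ** 2, 0.0],
                     [0.0, 2.0]])

def minimize(x, use_newton=True, iters=50):
    """Iterate x <- x + alpha*d with d = -H^{-1} g.

    With use_newton=False, H = I and the loop becomes steepest descent.
    """
    for _ in range(iters):
        g = grad(x)
        H = hess(x) if use_newton else np.eye(2)   # H = I: steepest descent
        d = -np.linalg.solve(H, g)                 # search direction d = -H^{-1} g
        alpha = 1.0
        while f(x + alpha * d) > f(x) and alpha > 1e-12:
            alpha *= 0.5                           # crude backtracking line search
        x = x + alpha * d                          # delta = alpha * d
    return x
```

With the Newton direction, the iterates converge rapidly to the minimum at the origin; with H = I, progress is slower but the function value still decreases.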