Estimates of the Power Spectrum
7.3 Least Squares Estimation
In this chapter we minimize the total squared error; in the next chapter we will minimize the mean square error. The method of least squares estimation may be introduced with the following situation. We conduct an experiment by measuring specific data values at successive sampling instants. We will then estimate the data at time n using one previous data sample at time n - 1, namely

    x̂(n) = -a x(n - 1)                                            (7.3.1)
where x̂(n) and x(n) are real discrete sequences, and a is a parameter to be estimated. The quantity x̂(n) is an estimate of the true value x(n). Consequently, we have an error, defined as

    e(n) = x(n) - x̂(n) = x(n) + a x(n - 1)                        (7.3.2)
The error model is predicting the true value of the data at time n using a weighted value of the data at time n - 1. This model is called linear prediction (LP); it is also called autoregression (AR). The method of least squares says that the constant a can be chosen so as to minimize the sum of the squared errors over some interval from n = 0 to n = N; that is, we minimize the total squared error:
    E = Σ e²(n) = Σ [x(n) - x̂(n)]² = Σ [x(n) + a x(n - 1)]²      (7.3.3)
The total squared error is sometimes normalized by the number of data points; expressed either way, this has no effect on the solution, as we show below. The total squared error is often called the prediction error or the prediction error power.
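As a minimal numeric sketch of the total squared error in (7.3.3), the function below evaluates E for a given a; the sum is taken from n = 1 so that the sample x(n - 1) always exists (an assumption about the interval, since the text's limits are not restated here). The example data sequence is hypothetical.

```python
def total_squared_error(x, a):
    """Total squared error E of the predictor xhat(n) = -a*x(n-1).

    E = sum over n of (x(n) + a*x(n-1))**2; the sum starts at n = 1
    so x(n-1) is always available (an assumed interval convention).
    """
    return sum((x[n] + a * x[n - 1]) ** 2 for n in range(1, len(x)))

# Hypothetical data: x(n) = 0.5*x(n-1) exactly, so a = -0.5 gives E = 0.
x = [1.0, 0.5, 0.25, 0.125]
print(total_squared_error(x, -0.5))  # 0.0
print(total_squared_error(x, 0.0))   # sum of x(n)^2 for n >= 1
```

Varying a and watching E change makes the minimization in the next paragraph concrete.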
A word about notation is appropriate here. The method of least squares assumes we are dealing with data and does not use statistics. Consequently, the lower-case notation is used for functions. In Chapter 8 we will consider x(n) to be a random process and solve for the parameter a using expectation. The procedure will minimize the mean square error and give the true value (not an estimate) of a. The minimum of the total squared error is found by the usual procedure of setting its derivative with respect to a equal to zero:
    dE/da = 2 Σ [x(n) + a x(n - 1)] x(n - 1) = 0                  (7.3.4)

    a Σ x(n - 1) x(n - 1) = - Σ x(n) x(n - 1)
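Dividing the rearranged equation through by Σ x(n - 1)x(n - 1) gives a = -Σ x(n)x(n - 1) / Σ x²(n - 1). The sketch below computes this estimate directly from the sums, again starting at n = 1 so x(n - 1) exists (an assumed interval convention); the data sequence is hypothetical.

```python
def lp_coefficient(x):
    """Least squares estimate of a in xhat(n) = -a*x(n-1).

    From a*sum(x(n-1)**2) = -sum(x(n)*x(n-1)),
    a = -sum(x(n)*x(n-1)) / sum(x(n-1)**2).
    """
    num = sum(x[n] * x[n - 1] for n in range(1, len(x)))  # sum x(n)x(n-1)
    den = sum(x[n - 1] ** 2 for n in range(1, len(x)))    # sum x(n-1)^2
    return -num / den

# Hypothetical data satisfying x(n) = 0.5*x(n-1) exactly:
x = [1.0, 0.5, 0.25, 0.125]
print(lp_coefficient(x))  # -0.5, i.e. the predictor xhat(n) = 0.5*x(n-1)
```

For data that exactly follow a first-order recursion, the estimate recovers the recursion coefficient and the prediction error is zero; for noisy data it gives the a that minimizes E.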