1. Consider two equally probable one-dimensional densities of the form
   f_X(x|ω_i) = K e^(−|x − a_i|/b_i).
   (a) Determine K in terms of a_i and b_i.
   (b) Calculate the likelihood ratio f_X(x|ω_1)/f_X(x|ω_2) as a function of the variables.
   (c) Sketch a graph of the likelihood ratio for the case a_1 = 0, b_1 = 1, a_2 = 1, and b_2 = 2.
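A quick numerical sketch of this setup: normalizing the double-exponential density over the real line forces K = 1/(2b_i), and the likelihood ratio follows directly from the two densities. The function names below are illustrative, not part of the problem statement.

```python
import numpy as np

def laplace_pdf(x, a, b):
    """Density K * exp(-|x - a| / b); normalization over R gives K = 1 / (2b)."""
    return np.exp(-np.abs(x - a) / b) / (2 * b)

# Numerical check that K = 1/(2b) integrates to 1 (part (a)):
x = np.linspace(-50.0, 50.0, 200001)
dx = x[1] - x[0]
area = np.sum(laplace_pdf(x, a=0.0, b=1.0)) * dx
print(round(area, 4))  # ≈ 1.0

# Likelihood ratio for a1 = 0, b1 = 1, a2 = 1, b2 = 2 (parts (b), (c)):
def likelihood_ratio(x):
    return laplace_pdf(x, 0.0, 1.0) / laplace_pdf(x, 1.0, 2.0)

print(likelihood_ratio(0.0))  # = 2 * exp(1/2) ≈ 3.2974
```

Evaluating `likelihood_ratio` on a grid of x values is enough to produce the sketch asked for in part (c).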
2. (a) Suppose the maximum acceptable error rate for classifying a pattern that is actually
   in ω_1 as if it were in ω_2 is E_1. Determine the single-point decision boundary in
   terms of the variables given.
   (b) For this boundary, what is the error rate for classifying ω_2 as ω_1?
   (c) Assuming equal prior probabilities, derive an expression for the minimum probability
   of error.
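One way to make the boundary concrete, assuming the Laplacian densities of Problem 1 with the illustrative parameters a_1 = 0, b_1 = 1, a_2 = 1, b_2 = 2, and a single boundary x* with ω_1 chosen for x < x* and ω_2 for x > x*. The constraint on the ω_1 error fixes x* through the tail probability of the first density:

```python
import numpy as np

def laplace_tail(t, a, b):
    """P(X > t) for the density exp(-|x - a|/b) / (2b); valid for t >= a."""
    return 0.5 * np.exp(-(t - a) / b)

E1 = 0.05                    # chosen maximum acceptable w1-as-w2 error rate
x_star = -np.log(2 * E1)     # solves 0.5 * exp(-x*) = E1 with a1 = 0, b1 = 1
print(x_star)                # ≈ 2.3026

# Part (b): error rate for classifying w2 patterns as w1 is P(x < x* | w2).
# Here x_star > a2, so the w2 tail formula applies.
E2 = 1.0 - laplace_tail(x_star, a=1.0, b=2.0)
print(round(E2, 4))          # ≈ 0.7393
```

The closed form x* = a_1 − b_1 ln(2 E_1) holds whenever x* lands to the right of a_1; other parameter regimes need the other branch of the absolute value.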
3. Let ω_max(x) be the state of nature for which P(ω_max|x) ≥ P(ω_i|x), i = 1, ..., c.
   (a) Show that P(ω_max|x) ≥ 1/c.
   (b) Show that for the minimum-error-rate decision rule the average probability of error
   is given by:
       P(error) = 1 − ∫ P(ω_max|x) f_X(x) dx
   (c) Use these two results to show that P(error) ≤ (c − 1)/c.
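Both inequalities are easy to illustrate numerically on a discrete feature space, where the integral in (b) becomes a sum. The random joint distribution below is purely illustrative:

```python
import numpy as np

# Random joint distribution P(x, w_j) over N feature values and c classes.
rng = np.random.default_rng(0)
c, N = 4, 50
joint = rng.random((N, c))
joint /= joint.sum()                   # P(x, w_j)
p_x = joint.sum(axis=1)                # P(x)
post = joint / p_x[:, None]            # P(w_j | x), rows sum to 1

p_max = post.max(axis=1)               # P(w_max | x)
print(np.all(p_max >= 1.0 / c))        # part (a): True

p_error = 1.0 - np.sum(p_max * p_x)    # part (b), discrete form of the integral
print(p_error <= (c - 1) / c)          # part (c): True
```

The bound in (c) is tight exactly when every posterior is uniform, i.e. P(ω_max|x) = 1/c everywhere.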
Questions are taken from Pattern Classification by Duda and Hart.
4. Often in pattern classification, one has the option either to assign the pattern to one of
c classes or to reject it as being unrecognizable. If the cost for rejects is not too high,
rejection may be a desirable action. Let
       λ(α_i|ω_j) = 0     if i = j,  i, j = 1, ..., c
                    λ_r   if i = c + 1
                    λ_s   otherwise
where λ_r is the loss incurred for choosing the (c + 1)-th action, rejection, and λ_s is the
loss incurred for making any substitution error. Show that the minimum risk is obtained
if we decide ω_i if P(ω_i|x) ≥ P(ω_j|x) for all j and if P(ω_i|x) ≥ 1 − λ_r/λ_s, and reject
otherwise. What happens if λ_r = 0? What happens if λ_r > λ_s?
6. Let the components of the vector x = (x_1, ..., x_d)^t be binary-valued and let P(ω_j)
   be the prior probability for the state of nature ω_j, j = 1, ..., c. Define p_ij =
   Pr(x_i = 1|ω_j) for i = 1, ..., d and j = 1, ..., c, with the components of x being statistically
   independent for all x in ω_j. Show that the minimum probability of error is achieved by
   the following decision rule: Decide ω_k if g_k(x) ≥ g_j(x) for all j and k, where
       g_j(x) = Σ_{i=1}^{d} x_i ln( p_ij / (1 − p_ij) ) + Σ_{i=1}^{d} ln(1 − p_ij) + ln P(ω_j)
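The discriminant above is just the log of P(x|ω_j)P(ω_j) rearranged into a form linear in x, which the sketch below verifies for random parameters (all names and values are illustrative, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(1)
d, c = 5, 3
p = rng.uniform(0.05, 0.95, size=(d, c))   # p[i, j] = Pr(x_i = 1 | w_j)
prior = rng.dirichlet(np.ones(c))          # P(w_j)

def g(x):
    """g_j(x) = sum_i x_i ln(p_ij/(1-p_ij)) + sum_i ln(1-p_ij) + ln P(w_j)."""
    return x @ np.log(p / (1 - p)) + np.log(1 - p).sum(axis=0) + np.log(prior)

def log_joint(x):
    """Exact ln[P(x|w_j) P(w_j)] with statistically independent binary x_i."""
    ll = (x[:, None] * np.log(p) + (1 - x[:, None]) * np.log(1 - p)).sum(axis=0)
    return ll + np.log(prior)

x = rng.integers(0, 2, size=d).astype(float)
print(np.allclose(g(x), log_joint(x)))     # True: the two agree exactly
print(int(np.argmax(g(x))))                # the minimum-error decision
```

Since g_j(x) equals the log joint exactly (not merely up to a constant in j), maximizing it over j is the same as maximizing the posterior P(ω_j|x).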