
Name: Roll no:

Department of Electronics & Communication Engineering


National Institute of Technology Karnataka, Surathkal
EC 447 Pattern Recognition and Machine Learning
Assignment I Max Marks: 0

Please avoid proofs by stories/intuitions

1. Consider two equally probable one-dimensional densities of the form
   $$f_X(x \mid \omega_i) = K e^{-|x - a_i|/b_i}.$$
   (a) Determine $K$ in terms of $a_i$ and $b_i$.
   (b) Calculate the likelihood ratio $f_X(x \mid \omega_1)/f_X(x \mid \omega_2)$ as a function of $x$.
   (c) Sketch a graph of the likelihood ratio for the case $a_1 = 0$, $b_1 = 1$, $a_2 = 1$, and $b_2 = 2$.
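As a numerical sanity check (not a proof), the sketch below in Python takes $K = 1/(2b_i)$ as a candidate normalizer — an assumption you should confirm in part (a) — integrates the density numerically, and evaluates the likelihood ratio at a point for the parameters of part (c):

```python
import math

def density(x, a, b):
    # Candidate density K * exp(-|x - a| / b); K = 1/(2b) is an
    # assumption here -- part (a) asks you to derive it.
    return math.exp(-abs(x - a) / b) / (2.0 * b)

# Numerical check that the candidate K makes the density integrate to 1.
h = 0.001
total = sum(density(-50.0 + k * h, 0.0, 1.0) for k in range(100001)) * h

def likelihood_ratio(x, a1=0.0, b1=1.0, a2=1.0, b2=2.0):
    # The ratio f(x|w1) / f(x|w2) for the parameter values of part (c).
    return density(x, a1, b1) / density(x, a2, b2)
```

Plotting `likelihood_ratio` over a range of `x` gives the graph asked for in part (c).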

2. Consider two Cauchy distributions in one dimension:
   $$f_X(x \mid \omega_i) = \frac{1}{\pi b} \cdot \frac{1}{1 + \left(\frac{x - a_i}{b}\right)^2}.$$
   (a) Suppose the maximum acceptable error rate for classifying a pattern that is actually in $\omega_1$ as if it were in $\omega_2$ is $E_1$. Determine the single-point decision boundary in terms of the variables given.
   (b) For this boundary, what is the error rate for classifying $\omega_2$ as $\omega_1$?
   (c) Assuming equal prior probabilities, derive an expression for the minimum probability of error.
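For experimenting with part (a) numerically, the Cauchy CDF has the closed form $F(x) = \frac{1}{2} + \frac{1}{\pi}\arctan\left(\frac{x-a}{b}\right)$. The Python sketch below is illustrative only; it adopts the convention $a_1 < a_2$, so class-1 errors are the right-tail mass beyond the boundary:

```python
import math

def cauchy_tail(x, a, b):
    # P(X > x) for a Cauchy(a, b) variable, from its CDF
    # F(x) = 1/2 + arctan((x - a)/b) / pi.
    return 0.5 - math.atan((x - a) / b) / math.pi

def boundary_for_error(e1, a, b):
    # The single point x* with P(X > x* | Cauchy(a, b)) = e1.
    # (Illustrative convention: assumes a1 < a2, so class-1 errors
    # lie in the right tail beyond the boundary.)
    return a + b * math.tan(math.pi * (0.5 - e1))

# Example: boundary for a 10% class-1 error rate with a1 = 0, b = 1.
x_star = boundary_for_error(0.1, 0.0, 1.0)
```

Evaluating `cauchy_tail` at `x_star` recovers the chosen error rate, which is a useful check on the closed-form boundary you derive.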

3. Let $\omega_{\max}(x)$ be the state of nature for which $P(\omega_{\max} \mid x) \ge P(\omega_i \mid x)$, $i = 1, \ldots, c$.
   (a) Show that $P(\omega_{\max} \mid x) \ge 1/c$.
   (b) Show that for the minimum-error-rate decision rule the average probability of error is given by
   $$P(\mathrm{error}) = 1 - \int P(\omega_{\max} \mid x)\, f_X(x)\, dx.$$
   (c) Use these two results to show that $P(\mathrm{error}) \le \dfrac{c-1}{c}$.
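Part (a) is easy to spot-check numerically before proving it: any valid posterior over $c$ classes must have its maximum at least $1/c$. A minimal Python check (random posteriors, purely a sanity check and not a proof):

```python
import random

random.seed(0)

def max_posterior_holds(c, trials=1000):
    # Spot-check part (a): draw random posteriors over c classes and
    # verify the largest posterior never falls below 1/c.
    for _ in range(trials):
        raw = [random.random() for _ in range(c)]
        s = sum(raw)
        if max(w / s for w in raw) < 1.0 / c:
            return False
    return True
```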
Questions are taken from Pattern Classification by Duda and Hart.
4. Often in pattern classification, one has the option either to assign the pattern to one of $c$ classes or to reject it as being unrecognizable. If the cost for rejects is not too high, rejection may be a desirable action. Let
   $$\lambda(\alpha_i \mid \omega_j) = \begin{cases} 0 & i = j \\ \lambda_r & i = c + 1 \\ \lambda_s & \text{otherwise} \end{cases}$$
   where $\lambda_r$ is the loss incurred for choosing the $(c+1)$th action, rejection, and $\lambda_s$ is the loss incurred for making any substitution error. Show that the minimum risk is obtained if we decide $\omega_i$ if $P(\omega_i \mid x) \ge P(\omega_j \mid x)$ for all $j$ and if $P(\omega_i \mid x) \ge 1 - \lambda_r/\lambda_s$, and reject otherwise. What happens if $\lambda_r = 0$? What happens if $\lambda_r > \lambda_s$?
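The claimed rule can be exercised numerically against a direct risk minimization. The Python sketch below (illustrative helper names; `lam_r` and `lam_s` are the two losses) compares the conditional risk of every action with the threshold rule stated in the problem:

```python
def bayes_action(posterior, lam_r, lam_s):
    # Conditional risk of deciding class i is lam_s * (1 - P(w_i|x));
    # the risk of rejecting is lam_r.  Pick the minimum-risk action.
    # (Illustrative sketch; rejection is returned as the string "reject".)
    risks = [lam_s * (1.0 - p) for p in posterior]
    best = min(range(len(posterior)), key=lambda i: risks[i])
    return best if risks[best] <= lam_r else "reject"

def rule_action(posterior, lam_r, lam_s):
    # The rule stated in the problem: decide the max-posterior class
    # iff its posterior is at least 1 - lam_r/lam_s, else reject.
    best = max(range(len(posterior)), key=lambda i: posterior[i])
    return best if posterior[best] >= 1.0 - lam_r / lam_s else "reject"
```

Trying a few posteriors shows the two functions agree, which is the equivalence the problem asks you to prove.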

5. Let $f_X(x \mid \omega_i) \sim N(\mu_i, \sigma^2)$ for a two-category one-dimensional problem with $P(\omega_1) = P(\omega_2) = \frac{1}{2}$. Show that the minimum probability of error is given by
   $$P_e = \frac{1}{\sqrt{2\pi}} \int_a^{\infty} e^{-u^2/2}\, du$$
   where $a = |\mu_2 - \mu_1| / (2\sigma)$.
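The stated result is a Gaussian tail integral, which Python's `math.erfc` evaluates in closed form as $Q(a) = \frac{1}{2}\operatorname{erfc}(a/\sqrt{2})$, so the answer is easy to check for concrete parameters (the values below are illustrative, not from the problem):

```python
import math

def min_error(mu1, mu2, sigma):
    # P_e = (1/sqrt(2*pi)) * integral_a^inf exp(-u^2/2) du, with
    # a = |mu2 - mu1| / (2*sigma); this tail equals erfc(a/sqrt(2))/2.
    a = abs(mu2 - mu1) / (2.0 * sigma)
    return 0.5 * math.erfc(a / math.sqrt(2.0))

# Illustrative check: identical means give chance-level error,
# and well-separated means give a small error.
pe = min_error(0.0, 4.0, 1.0)   # a = 2
```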

6. Let the components of the vector $x = (x_1, \ldots, x_d)^t$ be binary valued and let $P(\omega_j)$ be the prior probability for the state of nature $\omega_j$, $j = 1, \ldots, c$. Define $p_{ij} = \Pr(x_i = 1 \mid \omega_j)$ for $i = 1, \ldots, d$ and $j = 1, \ldots, c$, with the components of $x$ statistically independent for all $x$ in $\omega_j$. Show that the minimum probability of error is achieved by the following decision rule: decide $\omega_k$ if $g_k(x) \ge g_j(x)$ for all $j$ and $k$, where
   $$g_j(x) = \sum_{i=1}^{d} x_i \ln \frac{p_{ij}}{1 - p_{ij}} + \sum_{i=1}^{d} \ln(1 - p_{ij}) + \ln P(\omega_j)$$
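To see the discriminant in action, the Python sketch below computes $g_j(x)$ exactly as written and picks the argmax. The two-class, three-feature data are illustrative, not part of the problem:

```python
import math

def g(x, p_j, prior_j):
    # g_j(x) = sum_i x_i*ln(p_ij/(1-p_ij)) + sum_i ln(1-p_ij) + ln P(w_j)
    return (sum(xi * math.log(p / (1.0 - p)) for xi, p in zip(x, p_j))
            + sum(math.log(1.0 - p) for p in p_j)
            + math.log(prior_j))

def decide(x, p, priors):
    # p[j][i] = Pr(x_i = 1 | w_j); decide the class with the largest g_j(x).
    scores = [g(x, p[j], priors[j]) for j in range(len(priors))]
    return max(range(len(scores)), key=lambda j: scores[j])

# Illustrative parameters: class 0 tends to have features 1 and 2 on,
# class 1 tends to have feature 3 on.
p = [[0.9, 0.8, 0.1],
     [0.1, 0.2, 0.9]]
label = decide([1, 1, 0], p, [0.5, 0.5])
```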

7. The Poisson distribution for a discrete variable $x = 0, 1, 2, \ldots$ and real parameter $\lambda$ is
   $$P(x \mid \lambda) = e^{-\lambda}\, \frac{\lambda^x}{x!}.$$
   Consider two equally probable categories having Poisson distributions with differing parameters; assume for definiteness $\lambda_1 > \lambda_2$. What is the Bayes classification decision? What is the Bayes error rate?
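With equal priors, the Bayes decision at each integer $x$ picks the larger of the two likelihoods, and the Bayes error is half the total mass of the smaller one. The Python sketch below checks your closed-form answer numerically (the truncation point `x_max` and the example parameters are illustrative):

```python
import math

def poisson(x, lam):
    # P(x | lambda) = exp(-lambda) * lambda**x / x!
    return math.exp(-lam) * lam ** x / math.factorial(x)

def bayes_error(lam1, lam2, x_max=100):
    # With equal priors, decide the class whose likelihood is larger at x;
    # the error is half the sum of the losing likelihood.  The sum is
    # truncated at x_max, which is ample for moderate lambda.
    return 0.5 * sum(min(poisson(x, lam1), poisson(x, lam2))
                     for x in range(x_max + 1))

err = bayes_error(4.0, 1.0)
```

As expected, the error shrinks as the two parameters move apart.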
