PROBLEM SHEET #1
FEB 3, 2020
INDIAN INSTITUTE OF SCIENCE
(a) Prove that under 0–1 risk the decision boundary will satisfy

    ∫_{R2} P(x|ω1) dx = ∫_{R1} P(x|ω2) dx
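As a quick numeric sanity check of the claimed boundary condition, consider a hypothetical 1-D setup (not part of the sheet): equal priors and two equal-variance Gaussian class-conditional densities. The 0–1-risk boundary then falls at the midpoint, and the mass of each class density on the "wrong" side of the boundary is the same:

```python
import math

# Hypothetical setup (an assumption for illustration): equal priors and
# p(x|ω1) = N(0, 1), p(x|ω2) = N(2, 1). With equal variances and priors,
# the 0-1-risk boundary lies at the midpoint x* = 1.
mu1, mu2, sigma = 0.0, 2.0, 1.0
x_star = (mu1 + mu2) / 2  # boundary where p(x|ω1) = p(x|ω2)

def gauss_cdf(x, mu, sigma):
    """Gaussian CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# mass of class 1's density in region R2 (x > x*) ...
p1_in_R2 = 1.0 - gauss_cdf(x_star, mu1, sigma)
# ... and of class 2's density in region R1 (x < x*)
p2_in_R1 = gauss_cdf(x_star, mu2, sigma)

print(p1_in_R2, p2_in_R1)  # the two masses coincide
```

This only illustrates the boundary condition for one symmetric case; the exercise asks for a proof in general.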
Now suppose that Gaussian noise ηi with zero mean and variance σ² is added independently to each of the variables xi. By making use of E[ηi] = 0 and E[ηi ηj] = δij σ², show that minimizing ED averaged over the noise distribution is equivalent to minimizing the sum-of-squares error for noise-free input variables with the addition of a weight-decay regularization term, in which the bias parameter w0 is omitted from the regularizer. (Bishop 3.4)
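The identity can be checked numerically before proving it. The sketch below (dataset, weights, and noise level are all made up for illustration) averages the sum-of-squares error of a linear model over many input-noise draws and compares it against the noise-free error plus the weight-decay term (N/2)·σ²·Σ_j w_j², with the bias w0 excluded:

```python
import numpy as np

# Monte-Carlo check of the Bishop 3.4 identity (a sketch; all quantities
# below are assumed/made-up). Model: y(x) = w0 + w·x with fixed parameters.
rng = np.random.default_rng(0)
N, D, sigma = 50, 3, 0.1
X = rng.normal(size=(N, D))          # noise-free inputs
t = rng.normal(size=N)               # targets
w0, w = 0.5, rng.normal(size=D)      # fixed model parameters

def sse(inputs):
    """Sum-of-squares error (1/2) * sum_n (y_n - t_n)^2."""
    y = w0 + inputs @ w
    return 0.5 * np.sum((y - t) ** 2)

M = 20000                                      # number of noise draws
noise = rng.normal(scale=sigma, size=(M, N, D))
Y = w0 + (X[None, :, :] + noise) @ w           # (M, N) noisy predictions
avg_noisy = 0.5 * np.mean(np.sum((Y - t) ** 2, axis=1))

# noise-free error plus weight-decay term, bias omitted from the regularizer
predicted = sse(X) + 0.5 * N * sigma**2 * np.sum(w ** 2)
print(avg_noisy, predicted)  # agree up to Monte-Carlo error
```

The cross term 2·r_n·(w·η_n) vanishes in expectation because E[ηi] = 0, and E[(w·η_n)²] = σ²·Σ_j w_j² by E[ηi ηj] = δij σ², which is exactly where the regularizer comes from.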
5. A student needs to decide which courses to take, based only on his first lecture. From previous experience he knows the following:
where λr is the loss incurred for choosing the (c + 1)-th action, rejection, and λs is the loss incurred for making a substitution error. Show that the minimum risk is obtained if we decide ωi when P(ωi|x) ≥ P(ωj|x) for all j and P(ωi|x) ≥ 1 − λr/λs, and reject otherwise. What happens if λr = 0? What happens if λr > λs?
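The decision rule the exercise asks you to derive can be sketched directly, which also makes the two limiting cases visible. The function and the posterior vectors below are hypothetical, purely for illustration:

```python
def decide_with_reject(posteriors, lam_r, lam_s):
    """Minimum-risk decision with a reject option (a sketch of the rule
    to be derived; inputs are hypothetical). Returns the index of the
    chosen class, or None for the (c+1)-th action, rejection. We pick the
    class with the largest posterior and accept it only if that posterior
    clears the threshold 1 - lam_r/lam_s."""
    i = max(range(len(posteriors)), key=lambda k: posteriors[k])
    if posteriors[i] >= 1.0 - lam_r / lam_s:
        return i
    return None  # reject

# λr/λs = 0.5 → threshold 0.5: the top posterior 0.6 is accepted
print(decide_with_reject([0.6, 0.4], lam_r=0.5, lam_s=1.0))  # 0
# λr/λs = 0.2 → threshold 0.8: 0.6 falls short, so we reject
print(decide_with_reject([0.6, 0.4], lam_r=0.2, lam_s=1.0))  # None
# λr = 0 → threshold 1: rejecting is free, so we reject unless certain
print(decide_with_reject([0.7, 0.3], lam_r=0.0, lam_s=1.0))  # None
# λr > λs → threshold < 0: rejecting costs more than any error; never reject
print(decide_with_reject([0.2, 0.8], lam_r=2.0, lam_s=1.0))  # 1
```

The last two calls preview the answers to the exercise's final questions: with λr = 0 the threshold is 1 and rejection dominates, while with λr > λs the threshold is negative and rejecting is never optimal.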
E0 270: MACHINE LEARNING (JAN-APRIL 2020)