
STATS 200 (Stanford University, Summer 2015)

Homework 4:

Due at 12:50 p.m. on August 3

DeGroot & Schervish X.Y.Z means Exercise Z at the end of Section X.Y in our text,
Probability and Statistics (Fourth Edition) by Morris H. DeGroot and Mark J. Schervish.
1. Let X be drawn from a discrete uniform distribution on {1, . . . , N}, where N ≥ 1 is an
   unknown positive integer. The maximum likelihood estimator of N is N̂ = X. (We have
   discussed this result before, so you do not need to show it.)
   (a) Find the bias, variance, and mean squared error of N̂ (as an estimator of N).
       Note: The discrete uniform distribution on {1, . . . , N} has mean (N + 1)/2 and
       variance (N² − 1)/12. You may use these facts without proof.
   (b) Find an estimator Ñ that is an unbiased estimator of N.
   (c) Find MSE_N(Ñ), and show that MSE_N(Ñ) > MSE_N(N̂) for all N ≥ 2.
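For readers who want to sanity-check their part (a) answers numerically, here is a quick simulation sketch (not part of the assignment; the value N = 10 is an arbitrary choice for the check):

```python
import random

# Sanity-check simulation for part (a): empirically estimate the bias,
# variance, and MSE of the MLE N-hat = X at an arbitrary true value N = 10.
random.seed(0)
N = 10
reps = 200_000
draws = [random.randint(1, N) for _ in range(reps)]  # each draw is N-hat = X

mean_hat = sum(draws) / reps
bias = mean_hat - N
var = sum((x - mean_hat) ** 2 for x in draws) / reps
mse = sum((x - N) ** 2 for x in draws) / reps
print(bias, var, mse)  # MSE should match var + bias**2
```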
2. Let X ∼ Bin(n, θ), where 0 < θ < 1 and θ is unknown. Let ψ = 1/θ. Prove that no unbiased
   estimator of ψ exists.
   Hint: Here, an estimator is fully specified by the value it takes for each x ∈ {0, . . . , n}.
   Let ψ̂ be any arbitrary estimator of ψ, and let t_x be the value that ψ̂ takes when X = x.
   Then look at the form of E_θ(ψ̂) when ψ̂ is specified in this way.
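A small numerical illustration of the hint (not a proof, and not part of the assignment): for any choice of the values t_0, . . . , t_n, the resulting E_θ(ψ̂) stays bounded as θ → 0, whereas ψ = 1/θ does not. The particular t values below are arbitrary.

```python
from math import comb

# E_theta(psi-hat) = sum_x t_x * C(n, x) * theta^x * (1 - theta)^(n - x).
# The t_x values below are an arbitrary example of an estimator's values.
n = 5
t = [3.0, -1.0, 4.0, 1.0, -5.0, 9.0]

def expectation(theta):
    return sum(t[x] * comb(n, x) * theta**x * (1 - theta)**(n - x)
               for x in range(n + 1))

for theta in (0.1, 0.01, 0.001):
    print(theta, expectation(theta), 1 / theta)
# expectation(theta) approaches t[0] as theta -> 0, while 1/theta diverges.
```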
3. DeGroot & Schervish 8.8.2.
Note: The Geometric(p) distribution as given in Definition 5.5.2 of DeGroot & Schervish
has mean (1 − p)/p. You may use this fact without proof.
4. DeGroot & Schervish 8.8.14.
5. Let X1, . . . , Xn ∼ iid Bin(1, θ), where 0 < θ < 1 and θ is unknown.
   (a) Find the Fisher information I(θ) for the sample.
   (b) We have shown before that the maximum likelihood estimator of θ is θ̂ = (1/n) ∑ᵢ₌₁ⁿ Xᵢ.
       Use your answer to part (a) to state the asymptotic distribution of θ̂. Does it agree
       with the result obtained by using the central limit theorem?
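As a numerical cross-check of part (b) (not part of the assignment; θ = 0.3 and n = 200 are arbitrary choices), one can simulate the sampling distribution of θ̂ and compare its empirical variance with θ(1 − θ)/n, the variance suggested by the central limit theorem:

```python
import random

# Simulate the sampling distribution of theta-hat for Bernoulli data and
# compare its empirical variance with theta*(1 - theta)/n.
random.seed(0)
theta, n, reps = 0.3, 200, 10_000
estimates = []
for _ in range(reps):
    xs = [1 if random.random() < theta else 0 for _ in range(n)]
    estimates.append(sum(xs) / n)

m = sum(estimates) / reps
v = sum((e - m) ** 2 for e in estimates) / reps
print(m, v, theta * (1 - theta) / n)  # v should be close to the last value
```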
6. Let X1, . . . , Xn ∼ iid Pareto(k, θ), where the Pareto(k, θ) distribution has pdf

       f(x) = θ k^θ / x^(θ+1)   if x ≥ k,
       f(x) = 0                 if x < k.

   Suppose that k > 0 is known and θ > 0 is unknown.
   (a) Find the maximum likelihood estimator θ̂ of θ.
   (b) Find the asymptotic distribution of θ̂.
   (c) Now suppose instead that k > 0 is unknown and θ > 0 is known. Compute E_k[ℓ′_X(k)],
       and explain why your answer does not contradict Lemma 7.2.1 from the lecture notes.
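For anyone who wants to check answers to (a) and (b) by simulation, Pareto(k, θ) draws can be generated by inverse-CDF sampling (a sketch, not part of the assignment; k = 2 and θ = 3 are arbitrary choices): since F(x) = 1 − (k/x)^θ for x ≥ k, setting X = k U^(−1/θ) with U ∼ Uniform(0, 1) works.

```python
import random

# Inverse-CDF sampler for the Pareto(k, theta) distribution above.
random.seed(0)

def pareto_sample(k, theta, n):
    # 1 - random.random() lies in (0, 1], avoiding a zero under the power.
    return [k * (1.0 - random.random()) ** (-1.0 / theta) for _ in range(n)]

k, theta = 2.0, 3.0
xs = pareto_sample(k, theta, 200_000)
# For theta > 1 the true mean is theta*k/(theta - 1), i.e. 3.0 here.
print(sum(xs) / len(xs), min(xs))
```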


7. Let X1, . . . , Xn be iid continuous random variables with a pdf f(x) that is symmetric
   about θ, where θ ∈ ℝ is unknown. Suppose that Var_θ(X1) = σ² < ∞ is known, which
   implies that E_θ(X1) = θ. Then θ is both the true mean and true median of the pdf f(x),
   so it seems plausible that both the sample mean X̄n and the sample median, which we
   will call Mn, could be good estimators of θ.
   (a) Use the central limit theorem to state the asymptotic distribution of the sample
       mean X̄n.
   Suppose f(θ), the value of the pdf at the true mean (and median), satisfies f(θ) > 0.
   Then it can be shown that the asymptotic distribution of the sample median Mn is

       √n (Mn − θ) →D N(0, 1 / (4[f(θ)]²)).

   (You do not need to show this.)


   (b) Suppose f(x) is the pdf of a N(θ, σ²) distribution, where σ² > 0 is known. Find the
       asymptotic relative efficiency of the sample median compared to the sample mean,
       and use it to state which estimator performs better asymptotically.
   (c) Suppose f(x) = (λ/2) exp(−λ|x − θ|), where λ > 0 is known. Find the asymptotic
       relative efficiency of the sample median compared to the sample mean, and use it to
       state which estimator performs better asymptotically.
       Note: Under this pdf, Var_θ(X1) = 2/λ². You may use this fact without proof.
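A simulation sketch for comparing the two estimators in part (c) (not part of the assignment; θ = 0, λ = 1, n = 101 are arbitrary choices). Draws from this pdf can be generated as the difference of two independent Exponential(λ) variables:

```python
import random
import statistics

# Compare empirical variances of the sample mean and sample median for
# data from f(x) = (lam/2) exp(-lam |x - theta|), simulated as
# theta + Exp(lam) - Exp(lam).
random.seed(0)
theta, lam, n, reps = 0.0, 1.0, 101, 5_000
means, medians = [], []
for _ in range(reps):
    xs = [theta + random.expovariate(lam) - random.expovariate(lam)
          for _ in range(n)]
    means.append(statistics.fmean(xs))
    medians.append(statistics.median(xs))

print(statistics.pvariance(means))    # should be near 2/(lam**2 * n)
print(statistics.pvariance(medians))  # compare with 1/(4 f(theta)^2 n)
```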
