
University of California, Los Angeles
Department of Statistics
Statistics 100B
Instructor: Nicolas Christou
Exam 2
20 February 2014
Name:
Problem 1 (25 points)
Suppose the prices (in $), $Y_1, Y_2, Y_3, Y_4$, of objects A, B, C, and D are jointly normally distributed as $\mathbf{Y} \sim N_4(\boldsymbol{\mu}, \Sigma)$, where
\[
\mathbf{Y} = \begin{pmatrix} Y_1 \\ Y_2 \\ Y_3 \\ Y_4 \end{pmatrix}, \qquad
\boldsymbol{\mu} = \begin{pmatrix} 1 \\ 3 \\ 6 \\ 4 \end{pmatrix}, \qquad
\Sigma = \begin{pmatrix} 3 & 2 & 3 & 3 \\ 2 & 5 & 5 & 4 \\ 3 & 5 & 9 & 5 \\ 3 & 4 & 5 & 6 \end{pmatrix}.
\]
Answer the following questions:
a. Suppose a person wants to buy three of product A, four of product B, and one of product C. Find the
probability that the person will spend more than $30.
b. Find the moment generating function of $Y_1$.
c. Find the joint moment generating function of $(Y_1, Y_3)$.
d. Find the correlation coefficient between $Y_3$ and $Y_4$.
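
For part (a), the standard route is that any linear combination $a'\mathbf{Y}$ of a multivariate normal vector is univariate normal with mean $a'\boldsymbol{\mu}$ and variance $a'\Sigma a$. A minimal numeric cross-check, not part of the exam (variable names are mine):

```python
# Sketch: W = 3*Y1 + 4*Y2 + 1*Y3 is univariate normal with
# mean a'mu and variance a'Sigma a; then P(W > 30) is a normal tail.
import numpy as np
from scipy.stats import norm

mu = np.array([1, 3, 6, 4])
Sigma = np.array([[3, 2, 3, 3],
                  [2, 5, 5, 4],
                  [3, 5, 9, 5],
                  [3, 4, 5, 6]])
a = np.array([3, 4, 1, 0])          # 3 of A, 4 of B, 1 of C, none of D

mean_W = a @ mu                      # a'mu
var_W = a @ Sigma @ a                # a'Sigma a
print(1 - norm.cdf(30, loc=mean_W, scale=np.sqrt(var_W)))  # P(W > 30)
```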
Problem 2 (25 points)
Answer the following questions:
a. We discussed in class that the Fisher information matrix can also be found using the variance-covariance matrix of the score function vector in the multi-parameter case. For example, if $\boldsymbol{\theta}' = (\theta_1, \theta_2, \ldots, \theta_k)$, then the Fisher information matrix based on a random sample $X_1, X_2, \ldots, X_n$ can be computed using
\[
I_n = \operatorname{var}(S) = n \begin{pmatrix}
\operatorname{var}(S_1) & \operatorname{cov}(S_1, S_2) & \cdots & \operatorname{cov}(S_1, S_k) \\
\operatorname{cov}(S_2, S_1) & \operatorname{var}(S_2) & \cdots & \operatorname{cov}(S_2, S_k) \\
\vdots & \vdots & \ddots & \vdots \\
\operatorname{cov}(S_k, S_1) & \operatorname{cov}(S_k, S_2) & \cdots & \operatorname{var}(S_k)
\end{pmatrix}, \quad \text{where } S_i = \frac{\partial \ln f(x)}{\partial \theta_i}.
\]
Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables with $X_i \sim N(\mu, \sigma)$. Use the theorem above to find the Fisher information matrix. It is given that $E(X - \mu)^3 = 0$ and $E(X - \mu)^4 = 3\sigma^4$.
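
As an informal illustration of the theorem in part (a), a Monte Carlo sketch can estimate the per-observation score covariance for $\theta = (\mu, \sigma^2)$ and compare it with the familiar normal-model result $\operatorname{diag}\!\big(1/\sigma^2,\, 1/(2\sigma^4)\big)$; multiplying by $n$ then gives $I_n$. This is a check under assumed parameter values, not part of the exam:

```python
# Monte Carlo sketch of the score-covariance identity for N(mu, sigma^2),
# parameterized by theta = (mu, sigma^2).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2 = 2.0, 4.0
x = rng.normal(mu, np.sqrt(sigma2), size=1_000_000)

S1 = (x - mu) / sigma2                                  # d ln f / d mu
S2 = -1 / (2 * sigma2) + (x - mu)**2 / (2 * sigma2**2)  # d ln f / d sigma^2

# Rowwise covariance of the score vector; should be close to
# [[1/sigma^2, 0], [0, 1/(2*sigma^4)]].
print(np.cov(np.vstack([S1, S2])))
```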
b. Suppose $X_1, X_2, X_3$ denote an i.i.d. random sample from an exponential distribution with parameter $\frac{1}{\lambda}$. Consider the following five estimators of $\lambda$:
\[
\hat{\theta}_1 = X_1, \qquad
\hat{\theta}_2 = \frac{X_1 + X_2}{2}, \qquad
\hat{\theta}_3 = \frac{X_1 + 2X_2}{3}, \qquad
\hat{\theta}_4 = \min(X_1, X_2, X_3), \qquad
\hat{\theta}_5 = \bar{X}.
\]
Which estimators are unbiased? Among the unbiased estimators, which has the smallest variance?
Please show all your work.
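
A simulation sketch for part (b), reading the parametrization so that $E(X_i) = \lambda$; it only suggests which estimators look unbiased and which variance is smallest, and is no substitute for the requested algebra:

```python
# Estimate the mean (bias check) and variance of each estimator by simulation.
import numpy as np

rng = np.random.default_rng(1)
lam, reps = 2.0, 200_000
X = rng.exponential(scale=lam, size=(reps, 3))   # E(X_i) = lam

est = {
    "theta1": X[:, 0],
    "theta2": (X[:, 0] + X[:, 1]) / 2,
    "theta3": (X[:, 0] + 2 * X[:, 1]) / 3,
    "theta4": X.min(axis=1),
    "theta5": X.mean(axis=1),
}
for name, vals in est.items():
    # compare each mean to lam (bias) and look at the variances
    print(name, round(vals.mean(), 3), round(vals.var(), 3))
```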
c. Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables with $X_i \sim N(5, \sigma)$. Find the MLE of $\sigma^2$.
d. Refer to question (c). Is the estimator of $\sigma^2$ the MVUE? Explain.
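
A hedged numeric companion to part (c): maximizing the log-likelihood over $\sigma^2$ numerically (on simulated data, with the mean fixed at 5) and comparing against the natural closed-form candidate $\frac{1}{n}\sum_i (x_i - 5)^2$:

```python
# Numerically maximize the N(5, sigma) likelihood in sigma^2 and compare
# with the closed-form candidate; data and true sigma here are illustrative.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
x = rng.normal(5, 3, size=500)

def neg_loglik(s2):
    # negative log-likelihood with mu fixed at 5
    return 0.5 * len(x) * np.log(2 * np.pi * s2) + np.sum((x - 5)**2) / (2 * s2)

opt = minimize_scalar(neg_loglik, bounds=(0.01, 100), method="bounded")
print(opt.x, np.mean((x - 5)**2))   # the two should agree
```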
Problem 3 (25 points)
Answer the following questions:
a. Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables on the interval $[0, 1]$ with pdf
\[
f(x) = \frac{\Gamma(3\alpha)}{\Gamma(\alpha)\Gamma(2\alpha)}\, x^{\alpha-1} (1 - x)^{2\alpha-1}, \quad \alpha > 0.
\]
It can be shown that $E(X) = \frac{1}{3}$ and $\operatorname{var}(X) = \frac{2}{9(3\alpha+1)}$. Find the method of moments estimator of $\alpha$.
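
One observation that helps with part (a): $E(X) = \frac{1}{3}$ does not involve $\alpha$, so the first moment alone cannot identify it and the method of moments must match a higher moment instead. A sketch, noting that this pdf is the Beta$(\alpha, 2\alpha)$ density:

```python
# Match the sample variance to 2/(9*(3*alpha + 1)) and solve for alpha.
import numpy as np

rng = np.random.default_rng(3)
alpha = 2.0
x = rng.beta(alpha, 2 * alpha, size=100_000)   # the given pdf is Beta(a, 2a)

m2 = np.mean((x - x.mean())**2)                # sample variance (1/n version)
alpha_hat = (2 / (9 * m2) - 1) / 3             # invert 2/(9*(3a+1)) = m2
print(alpha_hat)                               # close to the true alpha = 2
```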
b. Consider the linear combination $Z_0 = w_1 Z_1 + w_2 Z_2 + w_3 Z_3 + w_4 Z_4$, where $Z$ is a random process with mean zero and covariance function $\operatorname{cov}(Z_i, Z_j) = 4e^{-d_{ij}/100}$, where $d_{ij}$ is the distance between points $i$ and $j$. The four points form a rectangle with sides of 30 meters and 40 meters (the distance between points 1 and 2 is 30, while the distance between points 2 and 3 is 40; see the picture below). Compute the variance of $Z_0$. It is given that $w_1 = w_2 = 1$ and $w_3 = w_4 = -1$.
[Figure: rectangle with corners labeled 1 and 2 along one 30 m side, and 4 and 3 along the opposite side; side 2-3 is 40 m.]
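
A quick computational sketch for part (b): place the four corners at assumed coordinates consistent with the picture (so the diagonals are 50 m by Pythagoras), build $C_{ij} = 4e^{-d_{ij}/100}$, and evaluate $\operatorname{var}(Z_0) = \mathbf{w}' C \mathbf{w}$:

```python
# Build the 4x4 covariance matrix from pairwise distances and compute w'Cw.
import numpy as np

# assumed coordinates: points 1, 2 on top; 3, 4 below (sides 30 m and 40 m)
pts = np.array([[0, 40], [30, 40], [30, 0], [0, 0]])   # points 1, 2, 3, 4
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
C = 4 * np.exp(-d / 100)

w = np.array([1, 1, -1, -1])
print(w @ C @ w)    # var(Z0)
```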
c. Let $X_1, X_2, \ldots, X_{49}$ be i.i.d. random variables with $X_i$ uniform on the interval $(0, 10)$. Find the expected value of $X_{(n)}$, i.e. find the expected value of $\max(X_1, X_2, \ldots, X_{49})$.
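
For part (c), the standard order-statistic result for $U(0, 10)$ samples is $E(X_{(n)}) = 10 \cdot \frac{n}{n+1}$, which is $9.8$ at $n = 49$; a one-line simulation check under these assumptions:

```python
# Average the maximum of 49 uniforms over many replications.
import numpy as np

rng = np.random.default_rng(4)
print(rng.uniform(0, 10, size=(200_000, 49)).max(axis=1).mean())  # near 9.8
```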
d. Consider the joint bivariate normal distribution of $X$ and $Y$:
\[
f(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \left(\frac{x-\mu_x}{\sigma_x}\right)^2 - 2\rho\,\frac{(x-\mu_x)(y-\mu_y)}{\sigma_x \sigma_y} + \left(\frac{y-\mu_y}{\sigma_y}\right)^2 \right] \right\}.
\]
Is it true that when $\rho = 0$ the joint pdf can be expressed as $f(x, y) = f(x) \times f(y)$? What does this imply?
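
For part (d), a numeric sanity check: at $\rho = 0$ the joint density evaluated at an arbitrary point should coincide with the product of the two marginal normal densities (the parameter values below are illustrative):

```python
# Compare the rho = 0 bivariate normal pdf with the product of marginals.
import numpy as np
from scipy.stats import multivariate_normal, norm

mx, my, sx, sy = 1.0, -2.0, 2.0, 0.5
joint = multivariate_normal(mean=[mx, my], cov=[[sx**2, 0], [0, sy**2]])

x, y = 1.7, -1.9
print(joint.pdf([x, y]), norm.pdf(x, mx, sx) * norm.pdf(y, my, sy))  # equal
```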
Problem 4 (25 points)
Answer the following questions:
a. Suppose $\mathbf{Z}' = (Z_1, Z_2, \ldots, Z_n)$ follows a multivariate normal distribution with $E(Z_i) = \mu$, $\operatorname{var}(Z_i) = \sigma^2$, and a covariance matrix of $\mathbf{Z}$ that can be expressed as $\Sigma = \sigma^2 V$, where $V$ is an $n \times n$ matrix of known constants. Write the log-likelihood function and use it to obtain an explicit expression for the maximum likelihood estimate of $\sigma^2$. Note: $|\sigma^2 V| = (\sigma^2)^n |V|$.
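
For orientation on part (a): the density being maximized is the multivariate normal one, and on the log scale, using the given determinant identity, it takes the form below (a sketch, writing $\mathbf{1}$ for the vector of ones); differentiating with respect to $\sigma^2$ is the step the problem asks for.
\[
\ell(\mu, \sigma^2) = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln\sigma^2 - \frac{1}{2}\ln|V| - \frac{1}{2\sigma^2}(\mathbf{z} - \mu\mathbf{1})' V^{-1} (\mathbf{z} - \mu\mathbf{1}).
\]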
b. Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables with $X_i \sim N(\mu, \sigma)$. The unbiased estimator of $\sigma^2$ is
\[
s^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n-1}.
\]
Find $E(S)$. Hint: if $X \sim \Gamma(\alpha, \beta)$ then for any $k > -\alpha$ we have $E(X^k) = \beta^k \frac{\Gamma(\alpha+k)}{\Gamma(\alpha)}$.
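
For part (b), the hint leads, via $(n-1)s^2/\sigma^2 \sim \chi^2_{n-1}$ (a Gamma distribution), to the closed form $E(S) = \sigma \sqrt{2/(n-1)}\, \Gamma(n/2)/\Gamma((n-1)/2)$; a simulation sketch comparing the two under assumed values of $n$ and $\sigma$:

```python
# Compare the simulated mean of S with the chi-square-based closed form.
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(5)
n, sigma = 10, 2.0
s = rng.normal(0, sigma, size=(200_000, n)).std(axis=1, ddof=1)  # sample sd

closed_form = sigma * np.sqrt(2 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)
print(s.mean(), closed_form)   # the two agree
```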
c. Consider the simple regression model through the origin (no intercept): $y_i = \beta_1 x_i + \epsilon_i$, for $i = 1, \ldots, n$, with $E(\epsilon_i) = 0$, $\operatorname{var}(\epsilon_i) = \sigma^2$, and $\operatorname{cov}(\epsilon_i, \epsilon_j) = 0$ for $i \neq j$ because the $\epsilon_i$ are independent, with $\epsilon_i \sim N(0, \sigma)$. It can be shown that the method of maximum likelihood gives
\[
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}.
\]
Find $\operatorname{cov}(\hat{Y}_i, \hat{Y}_j)$.
d. Refer to question (c). Find the distribution of $\hat{\beta}_1$.
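
For part (d), note that $\hat{\beta}_1$ is a linear combination of the independent normal $y_i$, which pins down its family; a simulation sketch (with illustrative $x$ values and parameters) of its mean and variance:

```python
# Simulate the sampling distribution of beta1_hat = sum(x*y) / sum(x^2)
# and compare its mean and variance with beta1 and sigma^2 / sum(x^2).
import numpy as np

rng = np.random.default_rng(6)
beta1, sigma = 2.0, 1.5
x = np.linspace(1, 5, 20)

y = beta1 * x + rng.normal(0, sigma, size=(100_000, x.size))
b1 = (y @ x) / (x @ x)                       # MLE for each simulated sample

print(b1.mean(), b1.var())
print(beta1, sigma**2 / (x @ x))             # theoretical mean and variance
```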