
SOME GENERAL PROBLEMS
Problem
• A certain lion has three possible states of activity each night; they are 'very active' (denoted by θ1), 'moderately active' (denoted by θ2), and 'lethargic (lacking energy)' (denoted by θ3). Also, each night this lion eats people; it eats i people with probability p(i|θ), θ ∈ Θ = {θ1, θ2, θ3}. Of course, the probability distribution of the number of people eaten depends on the lion's activity state θ ∈ Θ. The numeric values are given in the following table.
Problem
i          0      1      2      3      4
p(i|θ1)    0      0.05   0.05   0.8    0.1
p(i|θ2)    0.05   0.05   0.8    0.1    0
p(i|θ3)    0.9    0.08   0.02   0      0
If we are told X=x0 people were eaten last
night, how should we estimate the lion’s
activity state (θ1, θ2 or θ3)?
Solution
• One reasonable method is to estimate θ as the value in Θ for which p(x0|θ) is largest; in other words, the θ ∈ Θ that provides the largest probability of observing what we did observe.

\hat{\theta} = \hat{\theta}(X): the MLE of θ based on X

\hat{\theta}(0) = \theta_3, \quad \hat{\theta}(1) = \theta_3, \quad \hat{\theta}(2) = \theta_2, \quad \hat{\theta}(3) = \theta_1, \quad \hat{\theta}(4) = \theta_1
(Taken from “Dudewicz and Mishra, 1988, Modern Mathematical
Statistics, Wiley”)
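The table-lookup argument above can be checked mechanically. The short Python sketch below is an added illustration (the values are just the table restated); for each observed x0 it picks the state θ that maximizes p(x0|θ).

# Added illustration: the table-lookup MLE, with the table values restated as a dict.
probs = {
    "theta1": [0, 0.05, 0.05, 0.8, 0.1],    # p(i | theta1) for i = 0,...,4
    "theta2": [0.05, 0.05, 0.8, 0.1, 0],    # p(i | theta2)
    "theta3": [0.9, 0.08, 0.02, 0, 0],      # p(i | theta3)
}

def mle(x0):
    # Return the activity state maximizing p(x0 | theta) over Theta.
    return max(probs, key=lambda theta: probs[theta][x0])

print([mle(x) for x in range(5)])
# ['theta3', 'theta3', 'theta2', 'theta1', 'theta1']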
Problem
• Consider the Laplace distribution centered at the origin and with the shape parameter β, which for all x has the p.d.f.

f(x \mid \beta) = \frac{1}{2\beta}\, e^{-|x|/\beta}, \qquad \beta > 0.

Find the MME and MLE of β.
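One standard route, sketched numerically below as an added hint rather than a quote from the source: since E[X] = 0 and E[X²] = 2β², the MME based on the second moment is β̂ = (Σxᵢ²/(2n))^{1/2}, while maximizing the log-likelihood −n ln(2β) − Σ|xᵢ|/β gives the MLE β̂ = Σ|xᵢ|/n.

import numpy as np

def laplace_mme(x):
    # MME via the second moment, E[X^2] = 2*beta^2 (the first moment is 0 and is uninformative).
    x = np.asarray(x)
    return np.sqrt(np.mean(x ** 2) / 2.0)

def laplace_mle(x):
    # MLE: maximizing -n*log(2*beta) - sum|x_i|/beta gives beta_hat = mean of |x_i|.
    return np.mean(np.abs(x))

rng = np.random.default_rng(1)
sample = rng.laplace(loc=0.0, scale=3.0, size=10_000)  # true beta = 3
print(laplace_mme(sample), laplace_mle(sample))        # both should be close to 3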
Problem
• Let X1, …, Xn be independent r.v.s each with the lognormal distribution, ln N(μ, σ²). Find the MMEs of μ and σ².
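One common route (an added sketch, not quoted from the source): equate the first two sample moments to m₁ = E[X] = exp(μ + σ²/2) and m₂ = E[X²] = exp(2μ + 2σ²), which solve to σ̂² = ln(m̂₂/m̂₁²) and μ̂ = ln m̂₁ − σ̂²/2.

import numpy as np

def lognormal_mme(x):
    x = np.asarray(x)
    m1 = np.mean(x)        # sample analogue of exp(mu + sigma^2/2)
    m2 = np.mean(x ** 2)   # sample analogue of exp(2*mu + 2*sigma^2)
    sigma2_hat = np.log(m2 / m1 ** 2)
    mu_hat = np.log(m1) - sigma2_hat / 2.0
    return mu_hat, sigma2_hat

rng = np.random.default_rng(2)
sample = rng.lognormal(mean=1.0, sigma=0.5, size=50_000)  # mu = 1, sigma^2 = 0.25
print(lognormal_mme(sample))                              # roughly (1.0, 0.25)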
STATISTICAL INFERENCE
PART III
BETTER OR BEST ESTIMATORS,
FISHER INFORMATION, CRAMER-RAO LOWER BOUND (CRLB)
RECALL: EXPONENTIAL CLASS OF PDFS
• If the pdf can be written in the following form

f(x;\theta) = h(x)\, c(\theta) \exp\!\left(\sum_{j=1}^{k} w_j(\theta)\, t_j(x)\right)

then the pdf is a member of the exponential class of pdfs. (Here, k is the number of parameters.)
EXPONENTIAL CLASS and CSS
• Random Sample from Regular Exponential Class
Y = \sum_{i=1}^{n} t_j(X_i)

is a css for θ.
RAO-BLACKWELL THEOREM
• Let X1, X2, …, Xn have joint pdf or pmf f(x1, x2, …, xn; θ) and let S = (S1, S2, …, Sk) be a vector of jss for θ. If T is an UE of τ(θ) and φ(S) = E(T|S), then
i) φ(S) is an UE of τ(θ),
ii) φ(S) is a fn of S, so it is free of θ,
iii) Var(φ(S)) ≤ Var(T) for all θ.
• φ(S) is a better unbiased estimator of τ(θ).
RAO-BLACKWELL THEOREM
• Notes:
• (S)=E(TS) is at least as good as T.
• For finding the best UE, it is enough to
consider UEs that are functions of a ss,
because all such estimators are at least as good
as the rest of the UEs.
11
Example
• Hogg & Craig, Exercise 10.10
• X1, X2 ~ Exp(θ)
• Find the joint p.d.f. of the ss Y1 = X1 + X2 for θ and Y2 = X2.
• Show that Y2 is an UE of θ with variance θ².
• Find φ(y1) = E(Y2|Y1) and the variance of φ(Y1).
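A small simulation makes the variance reduction in this exercise concrete. The Python sketch below is an added illustration (not from the source); it assumes the parameterization Exp(θ) with mean θ, so that Y2 = X2 is unbiased with variance θ², and uses the symmetry fact E(Y2 | Y1) = Y1/2.

import numpy as np

rng = np.random.default_rng(0)
theta = 2.0           # true mean of the exponential, so Var(X) = theta^2 = 4
reps = 100_000

x1 = rng.exponential(theta, reps)
x2 = rng.exponential(theta, reps)

y2 = x2               # crude unbiased estimator of theta
phi = (x1 + x2) / 2   # Rao-Blackwellized estimator phi(Y1) = E(Y2 | Y1) = Y1 / 2

print(y2.mean(), y2.var())    # approx. theta = 2 and theta^2 = 4
print(phi.mean(), phi.var())  # approx. theta = 2 and theta^2 / 2 = 2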
THE MINIMUM VARIANCE UNBIASED
ESTIMATOR
• Rao-Blackwell Theorem: If T is an unbiased estimator of θ, and S is a ss for θ, then φ(S) = E(T|S) is
– an UE of θ, i.e., E[φ(S)] = E[E(T|S)] = θ, and
– has a variance no larger than Var(T).
LEHMANN-SCHEFFE THEOREM
• Let Y be a css for θ. If there is a function of Y which is an UE of θ, then this function is the unique Minimum Variance Unbiased Estimator (UMVUE) of θ.
• Y: a css for θ.
• T(Y) = fn(Y) and E[T(Y)] = θ
⇒ T(Y) is the UMVUE of θ.
So, it is the best unbiased estimator of θ.
THE MINIMUM VARIANCE UNBIASED
ESTIMATOR
• Let Y be a css for θ. Since Y is complete, there can be only a unique function of Y which is an UE of θ.
• Let U1(Y) and U2(Y) be two functions of Y. Since they are UE's, E(U1(Y) − U2(Y)) = 0, and completeness implies W(Y) = U1(Y) − U2(Y) = 0 for all possible values of Y. Therefore, U1(Y) = U2(Y) for all Y.
Example
• Let X1, X2, …, Xn ~ Poi(μ). Find the UMVUE of μ.
• Solution steps:
– Show that S = \sum_{i=1}^{n} X_i is a css for μ.
– Find a statistic (such as S*) that is an UE of μ and a function of S.
– Then, S* is the UMVUE of μ by the Lehmann-Scheffe Thm.
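Carrying the steps out (a brief added completion; the facts are standard): S = ΣXi is a css because Poi(μ) is a regular exponential class with t1(x) = x, and

E\!\left[\frac{S}{n}\right] = \frac{1}{n}\sum_{i=1}^{n} E[X_i] = \mu,

so S* = S/n = X̄ is an UE of μ and a function of S; by the Lehmann-Scheffe Theorem, X̄ is the UMVUE of μ.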
Note
• The estimator found by Rao-Blackwell Thm
may not be unique. But, the estimator found
by Lehmann-Scheffe Thm is unique.
RECALL: EXPONENTIAL CLASS OF PDFS
• If the pdf can be written in the following form

f(x;\theta) = h(x)\, c(\theta) \exp\!\left(\sum_{j=1}^{k} w_j(\theta)\, t_j(x)\right)

then the pdf is a member of the exponential class of pdfs. (Here, k is the number of parameters.)
EXPONENTIAL CLASS and CSS
• Random Sample from Regular Exponential Class
Y = \sum_{i=1}^{n} t_j(X_i)

is a css for θ.
If Y is an UE of θ, Y is the UMVUE of θ.
EXAMPLES
Let X1, X2, … ~ Bin(1, p), i.e., Ber(p).
This family is a member of the exponential family of distributions, with t1(x) = x.

\sum_{i=1}^{n} t_1(x_i) = \sum_{i=1}^{n} x_i \in \{0, 1, \ldots, n\}

is a CSS for p.
X̄ is an UE of p and a function of the CSS.
⇒ X̄ is the UMVUE of p.
EXAMPLES
X ~ N(μ, σ²) where both μ and σ² are unknown.
Find a css for μ and σ².
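One standard answer (an added sketch, not quoted from the source): write the N(μ, σ²) pdf in the exponential-class form above,

f(x;\mu,\sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\mu^2/(2\sigma^2)}\, \exp\!\left(\frac{\mu}{\sigma^2}\, x - \frac{1}{2\sigma^2}\, x^2\right),

so t1(x) = x and t2(x) = x², and (ΣXi, ΣXi²) is a css for (μ, σ²).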
FISHER INFORMATION AND
INFORMATION CRITERIA
• X, f(x; θ), θ ∈ Θ, x ∈ A (A does not depend on θ).
Definitions and notations:

\ell(x;\theta) = \ln f(x;\theta)

\ell'(x;\theta) = \frac{\partial \ln f(x;\theta)}{\partial \theta}, \qquad \ell''(x;\theta) = \frac{\partial^2 \ln f(x;\theta)}{\partial \theta^2}

f'(x;\theta) = \frac{\partial f(x;\theta)}{\partial \theta}, \qquad f''(x;\theta) = \frac{\partial^2 f(x;\theta)}{\partial \theta^2}
FISHER INFORMATION AND
INFORMATION CRITERIA
The Fisher Information in a random variable X:

I(\theta) = E\big[\big(\ell'(x;\theta)\big)^2\big] = V\big[\ell'(x;\theta)\big] = -E\big[\ell''(x;\theta)\big] \ge 0

The Fisher Information in the random sample:

I_n(\theta) = n\, I(\theta)

Let's prove the equalities above.
FISHER INFORMATION AND
INFORMATION CRITERIA

\int_A f(x;\theta)\,dx = 1 \;\Rightarrow\; \frac{d}{d\theta}\int_A f(x;\theta)\,dx = 0 \;\Rightarrow\; \int_A f'(x;\theta)\,dx = 0 \;\text{ and }\; \int_A f''(x;\theta)\,dx = 0

\ell'(x;\theta) = \frac{\partial \ln f(x;\theta)}{\partial\theta} = \frac{f'(x;\theta)}{f(x;\theta)} \;\Rightarrow\; f'(x;\theta) = \ell'(x;\theta)\,f(x;\theta)

\ell''(x;\theta) = \frac{f''(x;\theta)}{f(x;\theta)} - \big(\ell'(x;\theta)\big)^2
FISHER INFORMATION AND
INFORMATION CRITERIA

E\big[\ell'(X;\theta)\big] = \int_A \ell'(x;\theta)\,f(x;\theta)\,dx = \int_A f'(x;\theta)\,dx = 0

E\big[\ell''(X;\theta)\big] = \int_A \ell''(x;\theta)\,f(x;\theta)\,dx = \int_A \left[\frac{f''(x;\theta)}{f(x;\theta)} - \big(\ell'(x;\theta)\big)^2\right] f(x;\theta)\,dx

= \int_A f''(x;\theta)\,dx - \int_A \big(\ell'(x;\theta)\big)^2 f(x;\theta)\,dx = 0 - E\big[\big(\ell'(X;\theta)\big)^2\big]
FISHER INFORMATION AND
INFORMATION CRITERIA

\Rightarrow\; -E\big[\ell''(x;\theta)\big] = E\big[\big(\ell'(x;\theta)\big)^2\big] = V\big[\ell'(x;\theta)\big]

The Fisher Information in a random variable X:

I(\theta) = E\big[\big(\ell'(x;\theta)\big)^2\big] = V\big[\ell'(x;\theta)\big] = -E\big[\ell''(x;\theta)\big] \ge 0

The Fisher Information in the random sample:

I_n(\theta) = n\, I(\theta)

Proof of the last equality is available in Casella & Berger (1990), pp. 310-311.
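As a worked illustration of these formulas (added here; the computation is standard), take X ~ Poi(θ):

\ell(x;\theta) = x\ln\theta - \theta - \ln x! \;\Rightarrow\; \ell'(x;\theta) = \frac{x}{\theta} - 1, \qquad \ell''(x;\theta) = -\frac{x}{\theta^2},

I(\theta) = -E\big[\ell''(X;\theta)\big] = \frac{E[X]}{\theta^2} = \frac{1}{\theta}, \qquad I_n(\theta) = \frac{n}{\theta}.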
CRAMER-RAO LOWER BOUND (CRLB)
• Let X1, X2, …, Xn be sample random variables.
• The range of X does not depend on θ.
• Y = U(X1, X2, …, Xn): a statistic; it does not contain θ.
• Let E(Y) = m(θ).

V(Y) \ge \frac{\big(m'(\theta)\big)^2}{I_n(\theta)} \qquad \text{(the Cramer-Rao Lower Bound)}

• Let's prove this!
CRAMER-RAO LOWER BOUND (CRLB)
• −1 ≤ Corr(Y, Z) ≤ 1
• 0 ≤ [Corr(Y, Z)]² ≤ 1 ⇒

0 \le \frac{\big(\mathrm{Cov}(Y,Z)\big)^2}{V(Y)\,V(Z)} \le 1 \;\Rightarrow\; 0 \le \frac{\big(\mathrm{Cov}(Y,Z)\big)^2}{V(Z)} \le V(Y)

• Take Z = ℓ′(X1, X2, …, Xn; θ).
• Then, E(Z) = 0 and V(Z) = In(θ) (from the previous slides).
CRAMER-RAO LOWER BOUND (CRLB)
• Cov(Y, Z) = E(YZ) − E(Y)E(Z) = E(YZ)

E(YZ) = \int\!\cdots\!\int u(x_1,\ldots,x_n)\,\ell'(x_1,\ldots,x_n;\theta)\,f(x_1,\ldots,x_n;\theta)\,dx_1\cdots dx_n

= \int\!\cdots\!\int u(x_1,\ldots,x_n)\,\frac{f'(x_1,\ldots,x_n;\theta)}{f(x_1,\ldots,x_n;\theta)}\,f(x_1,\ldots,x_n;\theta)\,dx_1\cdots dx_n

= \frac{\partial}{\partial\theta}\int\!\cdots\!\int u(x_1,\ldots,x_n)\,f(x_1,\ldots,x_n;\theta)\,dx_1\cdots dx_n = m'(\theta)
CRAMER-RAO LOWER BOUND (CRLB)
• E(Y·Z) = m′(θ), Cov(Y, Z) = m′(θ), V(Z) = In(θ)

0 \le \frac{\big(\mathrm{Cov}(Y,Z)\big)^2}{V(Z)} \le V(Y)

The Cramer-Rao Inequality (Information Inequality):

V(Y) \ge \frac{\big(m'(\theta)\big)^2}{I_n(\theta)} \qquad \text{(the Cramer-Rao Lower Bound)}
CRAMER-RAO LOWER BOUND (CRLB)
• CRLB is the lower bound for the variance of an unbiased estimator of m(θ).
• When V(Y) = CRLB, Y is the MVUE of m(θ).
• For a r.s., remember that In(θ) = n I(θ), so

V(Y) \ge \frac{\big(m'(\theta)\big)^2}{n\, I(\theta)} \qquad \text{(the Cramer-Rao Lower Bound)}
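Continuing the Poisson illustration from above (an added worked example): with m(θ) = θ and In(θ) = n/θ,

V(Y) \ge \frac{\big(m'(\theta)\big)^2}{I_n(\theta)} = \frac{1}{n/\theta} = \frac{\theta}{n},

and V(X̄) = θ/n attains this bound, so X̄ is the MVUE of θ.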
ASYMPTOTIC DISTRIBUTION OF MLEs
• \hat{\theta}: MLE of θ
• X1, X2, …, Xn is a random sample.

m(\hat{\theta}) \;\overset{\text{asymptotically}}{\sim}\; N\!\left(m(\theta),\ \frac{\big(m'(\theta)\big)^2}{n\, I(\theta)}\right) = N\big(m(\theta),\ \mathrm{CRLB}\big)

\hat{\theta} \;\overset{\text{asympt. (large } n)}{\sim}\; N\!\left(\theta,\ \frac{1}{n\, I(\theta)}\right)

\sqrt{n\, I(\theta)}\,\big(\hat{\theta} - \theta\big) \;\xrightarrow{\ d\ }\; N(0,1)
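For instance (an added illustration continuing the Poisson case), the MLE is θ̂ = X̄ and I(θ) = 1/θ, so

\bar{X} \;\overset{\text{asympt.}}{\sim}\; N\!\left(\theta,\ \frac{\theta}{n}\right), \qquad \sqrt{n/\theta}\,\big(\bar{X} - \theta\big) \xrightarrow{\ d\ } N(0,1).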
EFFICIENT ESTIMATOR
• T is an efficient estimator (EE) of θ if
– T is an UE of θ, and
– Var(T) = CRLB.
• T is an efficient estimator (EE) of its expectation, m(θ), if its variance reaches the CRLB.
• An EE of m(θ) may not exist.
• The EE of m(θ), if it exists, is unique.
• The EE of m(θ) is the unique MVUE of m(θ).
ASYMPTOTIC EFFICIENT ESTIMATOR
• Y is an asymptotic EE of m(θ) if

\lim_{n\to\infty} E(Y) = m(\theta) \quad\text{and}\quad \lim_{n\to\infty} V(Y) = \mathrm{CRLB}
EXAMPLES
A r.s. of size n from X ~ Poi(θ).
a) Find the CRLB for any UE of θ.
b) Find the UMVUE of θ.
c) Find an EE for θ.
d) Find the CRLB for any UE of exp{−2θ}. Assume n = 1, and show that (−1)^X is the UMVUE of exp{−2θ}. Is this a reasonable estimator?
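For part d), the central computation is standard and worth recording (added here as a hint, not quoted from the source):

E\big[(-1)^X\big] = \sum_{x=0}^{\infty} (-1)^x\,\frac{e^{-\theta}\theta^{x}}{x!} = e^{-\theta}\, e^{-\theta} = e^{-2\theta},

so (−1)^X is unbiased for exp{−2θ} and, being a function of the css X, it is the UMVUE; yet it only takes the values ±1, which is exactly why the reasonableness question is asked.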
EXAMPLE
A r.s. of size n from X ~ Exp(θ). Find the UMVUE of θ, if it exists.
Summary
• We covered 3 methods for finding good
estimators (possibly UMVUE):
– Rao-Blackwell Theorem (Use a ss T, an UE U, and
create a new statistic by E(U|T))
– Lehmann-Scheffe Theorem (Use a css T which is
also UE)
– Cramer-Rao Lower Bound (Find an UE with
variance=CRLB)
Problems
• Let X1, X2, …, Xn be a random sample from a gamma distribution, Xi ~ Gamma(2, θ). The p.d.f. of X1 is given by

f(x_1) = \frac{1}{\Gamma(2)\,\theta^{2}}\; x_1\, e^{-x_1/\theta}, \qquad x_1 > 0,\ \theta > 0.
a) Find a complete and sufficient statistic for θ.
b) Find a minimal sufficient statistic for θ.
c) Find CRLB for the variance of an unbiased estimator
of θ.
d) Find a UMVUE of θ.
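One standard route through a)-d) (an added sketch under the usual regularity assumptions, not the worked solution from the source): Gamma(2, θ) is a regular exponential family with t(x) = x, so T = ΣXi is a css (and therefore also minimal sufficient); since E[Xi] = 2θ, the statistic T/(2n) is unbiased for θ; and I(θ) = 2/θ² gives a CRLB that T/(2n) attains:

T = \sum_{i=1}^{n} X_i, \qquad E\!\left[\frac{T}{2n}\right] = \theta, \qquad I(\theta) = \frac{2}{\theta^2} \;\Rightarrow\; \mathrm{CRLB} = \frac{\theta^2}{2n} = V\!\left(\frac{T}{2n}\right),

so T/(2n) is the UMVUE of θ.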
Problems
• Suppose X1,…,Xn are independent with density
for θ>0
a) Find a complete sufficient statistic.
b) Find the CRLB for the variance of unbiased
estimators of 1/θ.
c) Find the UMVUE of 1/θ if there is one.