Moment Generating Functions


Lecture X
Definition 2.3.3. Let X be a random variable with cdf F_X. The moment generating function (mgf) of X (or F_X), denoted M_X(t), is

M_X(t) = E[e^{tX}]

provided that the expectation exists for t in some neighborhood of 0. That is, there is an h > 0 such that E[e^{tX}] exists for all t in -h < t < h. If the expectation does not exist in a neighborhood of 0, we say that the moment generating function does not exist.
More explicitly, the moment generating function can be defined as:

M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx  for continuous random variables, and

M_X(t) = \sum_{x} e^{tx} P(X = x)  for discrete random variables.
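As a quick numerical illustration (a sketch of my own, not part of the lecture; the function name `mgf_discrete` and the fair-die pmf are illustrative), the discrete definition can be evaluated directly:

```python
import math

def mgf_discrete(t, pmf):
    """Discrete mgf: M_X(t) = sum over x of e^(t*x) * P(X = x)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# Illustrative example: a fair six-sided die.
die = {k: 1 / 6 for k in range(1, 7)}
print(mgf_discrete(0.0, die))  # M_X(0) = E[e^0] = 1 for any valid pmf
```

Note that M_X(0) = 1 always, since the probabilities sum to one.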
Theorem 2.3.2: If X has mgf M_X(t), then

E[X^n] = M_X^{(n)}(0)

where we define

M_X^{(n)}(0) = \frac{d^n}{dt^n} M_X(t) \Big|_{t=0}
First note that e^{tx} can be approximated around zero using a Taylor series expansion:

M_X(t) = E[e^{tx}] = E\left[1 + tx + \frac{1}{2}(tx)^2 + \frac{1}{6}(tx)^3 + \cdots\right]
= 1 + E[x]\,t + \frac{1}{2}E[x^2]\,t^2 + \frac{1}{6}E[x^3]\,t^3 + \cdots
Note for any moment n:

M_X^{(n)}(t) = \frac{d^n}{dt^n} M_X(t) = E[x^n] + E[x^{n+1}]\,t + \frac{1}{2}E[x^{n+2}]\,t^2 + \cdots

Thus, as t → 0,

M_X^{(n)}(0) = E[x^n]
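This result is easy to check numerically (a sketch of my own; the die mgf and step size h are illustrative choices): central finite differences of the mgf at t = 0 should approximate the first two moments of a fair die, E[X] = 3.5 and E[X^2] = 91/6.

```python
import math

def mgf_die(t):
    # mgf of a fair six-sided die: (1/6) * sum of e^(k*t), k = 1..6
    return sum(math.exp(k * t) for k in range(1, 7)) / 6

h = 1e-4  # finite-difference step
first = (mgf_die(h) - mgf_die(-h)) / (2 * h)                 # approximates E[X] = 3.5
second = (mgf_die(h) - 2 * mgf_die(0) + mgf_die(-h)) / h**2  # approximates E[X^2] = 91/6
print(first, second)
```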
Leibniz's Rule: If f(x, θ), a(θ), and b(θ) are differentiable with respect to θ, then

\frac{d}{d\theta} \int_{a(\theta)}^{b(\theta)} f(x, \theta)\,dx = f(b(\theta), \theta)\,\frac{d}{d\theta}b(\theta) - f(a(\theta), \theta)\,\frac{d}{d\theta}a(\theta) + \int_{a(\theta)}^{b(\theta)} \frac{\partial}{\partial\theta} f(x, \theta)\,dx
Casella and Berger proof: Assuming that we can differentiate under the integral using Leibniz's rule, we have

\frac{d}{dt} M_X(t) = \frac{d}{dt} \int e^{tx} f(x)\,dx = \int \left(\frac{d}{dt} e^{tx}\right) f(x)\,dx = \int x e^{tx} f(x)\,dx

Letting t → 0, this integral simply becomes

\int x f(x)\,dx = E[x]

This proof can be extended to any moment of the distribution function.
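The interchange of derivative and integral can be checked numerically (a sketch under my own assumptions: the standard normal density, a plain trapezoid rule, and the helper names `phi` and `integrate` are all illustrative). Differentiating the integral by finite differences should match integrating the differentiated integrand:

```python
import math

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def integrate(g, lo=-10.0, hi=10.0, n=4000):
    # plain trapezoid rule; the range is wide enough to cover the normal's tails
    step = (hi - lo) / n
    total = 0.5 * (g(lo) + g(hi)) + sum(g(lo + i * step) for i in range(1, n))
    return total * step

t, h = 0.3, 1e-5
# d/dt of the integral, by central difference...
lhs = (integrate(lambda x: math.exp((t + h) * x) * phi(x))
       - integrate(lambda x: math.exp((t - h) * x) * phi(x))) / (2 * h)
# ...versus the integral of the derivative of the integrand
rhs = integrate(lambda x: x * math.exp(t * x) * phi(x))
print(lhs, rhs)
```

For the standard normal both quantities equal t·e^{t²/2}, which the test below uses as a reference.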
Moment Generating Functions for Specific Distributions
Application to the Uniform Distribution:

M_X(t) = \int_a^b e^{tx} \frac{1}{b-a}\,dx = \frac{1}{b-a}\,\frac{1}{t} e^{tx} \Big|_a^b = \frac{e^{bt} - e^{at}}{t(b-a)}
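The closed form can be sanity-checked against a direct numerical approximation of E[e^{tX}] (a sketch of my own; `mgf_uniform`, the midpoint rule, and the parameter values are illustrative):

```python
import math

def mgf_uniform(t, a, b):
    # closed form for the uniform mgf; the t = 0 limit is 1
    if t == 0:
        return 1.0
    return (math.exp(b * t) - math.exp(a * t)) / (t * (b - a))

# cross-check against a midpoint-rule approximation of E[e^(tX)]
a, b, t, n = 0.0, 1.0, 0.7, 100_000
approx = sum(math.exp(t * (a + (i + 0.5) * (b - a) / n)) for i in range(n)) / n
print(mgf_uniform(t, a, b), approx)
```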
Following the expansion developed earlier, we have:

M_X(t) = \frac{\left(1 + bt + \frac{1}{2}b^2t^2 + \frac{1}{6}b^3t^3 + \cdots\right) - \left(1 + at + \frac{1}{2}a^2t^2 + \frac{1}{6}a^3t^3 + \cdots\right)}{(b-a)t}
= 1 + \frac{(b^2 - a^2)t^2}{2(b-a)t} + \frac{(b^3 - a^3)t^3}{6(b-a)t} + \cdots
= 1 + \frac{1}{2}\frac{(b-a)(b+a)t^2}{(b-a)t} + \frac{1}{6}\frac{(b-a)(b^2 + ab + a^2)t^3}{(b-a)t} + \cdots
= 1 + \frac{1}{2}(a+b)\,t + \frac{1}{6}(a^2 + ab + b^2)\,t^2 + \cdots
Letting b = 1 and a = 0, the last expression becomes:

M_X(t) = 1 + \frac{1}{2}t + \frac{1}{6}t^2 + \frac{1}{24}t^3 + \cdots
The first three moments of the uniform distribution are then:

M_X^{(1)}(0) = \frac{1}{2}
M_X^{(2)}(0) = 2 \cdot \frac{1}{6} = \frac{1}{3}
M_X^{(3)}(0) = 6 \cdot \frac{1}{24} = \frac{1}{4}
Application to the Univariate Normal Distribution:

M_X(t) = \frac{1}{\sqrt{2\pi}\,\sigma} \int e^{tx} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx
= \frac{1}{\sqrt{2\pi}\,\sigma} \int \exp\!\left(tx - \frac{(x-\mu)^2}{2\sigma^2}\right) dx
Focusing on the term in the exponent, we have

tx - \frac{(x-\mu)^2}{2\sigma^2} = -\frac{(x-\mu)^2 - 2t\sigma^2 x}{2\sigma^2}
= -\frac{x^2 - 2\mu x + \mu^2 - 2t\sigma^2 x}{2\sigma^2}
= -\frac{x^2 - 2(\mu + t\sigma^2)x + \mu^2}{2\sigma^2}
The next step is to complete the square in the numerator. We want a constant c such that

x^2 - 2(\mu + t\sigma^2)x + \mu^2 + c = \left(x - (\mu + t\sigma^2)\right)^2

Expanding the right-hand side gives x^2 - 2(\mu + t\sigma^2)x + \mu^2 + 2\mu t\sigma^2 + t^2\sigma^4, so

c = 2\mu t\sigma^2 + t^2\sigma^4
The complete expression then becomes:

tx - \frac{(x-\mu)^2}{2\sigma^2} = -\frac{\left(x - (\mu + t\sigma^2)\right)^2 - 2\mu t\sigma^2 - t^2\sigma^4}{2\sigma^2}
= -\frac{\left(x - (\mu + t\sigma^2)\right)^2}{2\sigma^2} + \mu t + \frac{1}{2}t^2\sigma^2
The moment generating function then becomes:

M_X(t) = \exp\!\left(\mu t + \frac{1}{2}\sigma^2 t^2\right) \frac{1}{\sqrt{2\pi}\,\sigma} \int \exp\!\left(-\frac{\left(x - (\mu + t\sigma^2)\right)^2}{2\sigma^2}\right) dx
= \exp\!\left(\mu t + \frac{1}{2}\sigma^2 t^2\right)

since the remaining integral is that of a normal density with mean μ + tσ² and variance σ², which integrates to one.
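This closed form can be checked by simulation (a sketch of my own; the parameter values, sample size, and seed are illustrative): a Monte Carlo average of e^{tX} over normal draws should be close to exp(μt + σ²t²/2).

```python
import math
import random

def mgf_normal(t, mu, sigma):
    # closed form for the normal mgf: exp(mu*t + sigma^2 * t^2 / 2)
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

random.seed(0)  # fixed seed so the run is reproducible
mu, sigma, t = 1.0, 2.0, 0.25
mc = sum(math.exp(t * random.gauss(mu, sigma)) for _ in range(200_000)) / 200_000
print(mgf_normal(t, mu, sigma), mc)  # the two should agree closely
```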
Taking the first derivative with respect to t, we get:

M_X'(t) = (\mu + \sigma^2 t) \exp\!\left(\mu t + \frac{1}{2}\sigma^2 t^2\right)

Letting t → 0, this becomes:

M_X'(0) = \mu
The second derivative of the moment generating function with respect to t yields:

M_X''(t) = \sigma^2 \exp\!\left(\mu t + \frac{1}{2}\sigma^2 t^2\right) + (\mu + \sigma^2 t)^2 \exp\!\left(\mu t + \frac{1}{2}\sigma^2 t^2\right)

Again, letting t → 0 yields

M_X''(0) = \sigma^2 + \mu^2
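Both derivative values can be confirmed numerically (a sketch of my own; the parameter values μ = 1.5 and σ = 0.8 and the step size are illustrative): finite differences of the closed-form mgf at t = 0 should return μ and σ² + μ².

```python
import math

mu, sigma = 1.5, 0.8  # illustrative parameter values

def M(t):
    # closed-form normal mgf
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

h = 1e-4
d1 = (M(h) - M(-h)) / (2 * h)            # approximates M'(0)  = mu
d2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # approximates M''(0) = sigma^2 + mu^2
print(d1, d2)
```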
Let X and Y be independent random variables with moment generating functions M_X(t) and M_Y(t). Consider their sum Z = X + Y and its moment generating function:

M_Z(t) = E[e^{tz}] = E[e^{t(x+y)}] = E[e^{tx} e^{ty}] = E[e^{tx}]\,E[e^{ty}] = M_X(t)\,M_Y(t)

We conclude that the moment generating function of the sum of two independent random variables is equal to the product of the moment generating functions of each variable.
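For two independent normals this is easy to verify directly, since the sum of independent normals is again normal with the means and variances added (a sketch of my own; the particular means and variances are illustrative):

```python
import math

def mgf_normal(t, mu, var):
    # univariate normal mgf, parameterized by the variance
    return math.exp(mu * t + 0.5 * var * t**2)

t = 0.6
product = mgf_normal(t, 1.0, 0.5) * mgf_normal(t, -2.0, 2.0)
direct = mgf_normal(t, 1.0 + (-2.0), 0.5 + 2.0)  # mgf of the sum: N(-1, 2.5)
print(product, direct)  # the product of the mgfs equals the mgf of the sum
```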
Skipping ahead slightly, the multivariate normal distribution function can be written as:

f(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\!\left(-\frac{1}{2}(x - \mu)' \Sigma^{-1} (x - \mu)\right)

where Σ is the variance matrix and μ is a vector of means.
In order to derive the moment generating function, we now need a vector t. The moment generating function can then be defined as:

M_X(t) = \exp\!\left(\mu' t + \frac{1}{2} t' \Sigma t\right)
Normal variables are independent if the variance matrix is a diagonal matrix. Note that if the variance matrix is diagonal, the moment generating function for the normal can be written as:

M_X(t) = \exp\!\left(\mu' t + \frac{1}{2} t' \begin{pmatrix} \sigma_1^2 & 0 & 0 \\ 0 & \sigma_2^2 & 0 \\ 0 & 0 & \sigma_3^2 \end{pmatrix} t\right)
= \exp\!\left(\mu_1 t_1 + \mu_2 t_2 + \mu_3 t_3 + \frac{1}{2}\left(t_1^2\sigma_1^2 + t_2^2\sigma_2^2 + t_3^2\sigma_3^2\right)\right)
= \exp\!\left(\mu_1 t_1 + \frac{1}{2}t_1^2\sigma_1^2\right) \exp\!\left(\mu_2 t_2 + \frac{1}{2}t_2^2\sigma_2^2\right) \exp\!\left(\mu_3 t_3 + \frac{1}{2}t_3^2\sigma_3^2\right)
= M_{X_1}(t_1)\,M_{X_2}(t_2)\,M_{X_3}(t_3)
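The factorization can be checked with plain arithmetic (a sketch of my own; the mean vector, diagonal variances, and evaluation point t are illustrative values): the joint mgf with a diagonal variance matrix should equal the product of the three univariate normal mgfs.

```python
import math

mu = [1.0, -0.5, 2.0]    # mean vector (illustrative values)
var = [0.5, 1.0, 2.0]    # diagonal of the variance matrix
t = [0.3, -0.2, 0.1]     # evaluation point

# joint mgf with a diagonal variance matrix: exp(mu't + (1/2) t' diag(var) t)
joint = math.exp(sum(m * ti for m, ti in zip(mu, t))
                 + 0.5 * sum(v * ti**2 for v, ti in zip(var, t)))

# product of the three univariate normal mgfs
product = math.prod(math.exp(m * ti + 0.5 * v * ti**2)
                    for m, v, ti in zip(mu, var, t))
print(joint, product)  # equal, reflecting independence of the components
```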