Relation between Binomial and Poisson Distributions


More on Distribution Function
• The distribution of a random variable X can be determined directly from its cumulative distribution function $F_X$.
• Theorem: Let X be any random variable with cumulative distribution function $F_X$, and let B be any subset of the real numbers. Then $P(X \in B)$ can be determined solely from the values of $F_X(x)$.
• Proof:
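As a sketch of why this holds (the slide leaves the proof blank): probabilities of intervals follow directly from $F_X$, and more general sets B are built from intervals by countable unions, intersections, and complements. For example,
$$P(X \le b) = F_X(b), \qquad P(a < X \le b) = F_X(b) - F_X(a), \qquad P(X = a) = F_X(a) - \lim_{x \to a^-} F_X(x).$$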
Expectation
• In the long run, rolling a die repeatedly, what average result do you expect?
• In 6,000,000 rolls we expect about 1,000,000 1's, 1,000,000 2's, etc. The average is then
$$\frac{1{,}000{,}000 \cdot (1 + 2 + 3 + 4 + 5 + 6)}{6{,}000{,}000} = \frac{21}{6} = 3.5$$
• For a random variable X, the Expectation (or expected value or mean) of X is the long-run average value of X.
• Symbols: $\mu$, $\mu_X$, $E(X)$ and $EX$.
Expectation of Discrete Random Variable
• For a discrete random variable X with pmf $p_X(x)$, the expectation of X is
$$E(X) = \sum_{x} x\, p_X(x),$$
whenever the sum converges absolutely.
Examples
1) Roll a die. Let X = outcome on 1 roll. Then E(X) = 3.5.
2) X ~ Bernoulli(p): $p_X(1) = p$ and $p_X(0) = 1 - p$. Then $E(X) = 1 \cdot p + 0 \cdot (1 - p) = p$.
3) X ~ Binomial(n, p). Then $E(X) = np$.
4) X ~ Geometric(p). Then $E(X) = 1/p$.
5) X ~ Poisson(λ). Then $E(X) = \lambda$.
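As one worked case (a standard derivation, filled in here since the slides leave the results blank), the Poisson mean follows from the series for $e^{\lambda}$:
$$E(X) = \sum_{x=0}^{\infty} x\, \frac{\lambda^x e^{-\lambda}}{x!} = \lambda e^{-\lambda} \sum_{x=1}^{\infty} \frac{\lambda^{x-1}}{(x-1)!} = \lambda e^{-\lambda} e^{\lambda} = \lambda.$$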
Expectation of Continuous Random Variable
• For a continuous random variable X with density $f_X(x)$, the expectation of X is
$$E(X) = \int_{-\infty}^{\infty} x\, f_X(x)\,dx,$$
whenever this integral converges absolutely.
Examples
1) X ~ Uniform(a, b). Then $E(X) = \frac{a + b}{2}$.
2) X ~ Exponential(λ). Then $E(X) = 1/\lambda$.
3) X is a random variable with density $f_X(x)$.
(i) Check if this is a valid density.
(ii) Find E(X).
4) X ~ Gamma(α, λ). Then $E(X) = \alpha/\lambda$.
5) X ~ Beta(α, β). Then $E(X) = \frac{\alpha}{\alpha + \beta}$.
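For instance, the Exponential(λ) mean (with density $f_X(x) = \lambda e^{-\lambda x}$ for $x \ge 0$) can be worked out by integration by parts, a step added here for completeness:
$$E(X) = \int_0^{\infty} x\, \lambda e^{-\lambda x}\,dx = \left[-x e^{-\lambda x}\right]_0^{\infty} + \int_0^{\infty} e^{-\lambda x}\,dx = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}.$$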
Theorem
For g: R  R
• If X is a discrete random variable then
Eg  X    g x  p X x 
x
• If X is a continuous random variable
Eg  X    g x  f X x dx


• Proof:
Examples
1. Suppose X ~ Uniform(0, 1). Let $Y = X^2$; then $E(Y) = \int_0^1 x^2\,dx = \frac{1}{3}$.
2. Suppose X ~ Poisson(λ). Let $Y = e^X$; then $E(Y) = \sum_{x=0}^{\infty} e^x\, \frac{\lambda^x e^{-\lambda}}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e)^x}{x!} = e^{\lambda(e - 1)}$.
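A quick Monte Carlo check of example 2 (a sketch, not part of the original notes; the rate λ = 2 is an arbitrary choice for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 2.0                          # arbitrary rate, for illustration only
    x = rng.poisson(lam, size=1_000_000)

    # Monte Carlo estimate of E[e^X] vs. the closed form e^{lam(e-1)}
    print(np.exp(x).mean())            # approximately 31.2
    print(np.exp(lam * (np.e - 1)))    # 31.187...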
Properties of Expectation
For X, Y random variables and $a, b \in \mathbb{R}$ constants,
• E(aX + b) = aE(X) + b
Proof: Continuous case
• E(aX + bY) = aE(X) + bE(Y)
Proof to come…
• If X is a non-negative random variable, then E(X) = 0
if and only if X = 0 with probability 1.
• If X is a non-negative random variable, then E(X) ≥ 0
• E(a) = a
Moments
• The kth moment of a distribution is $E(X^k)$. We are usually interested in the 1st and 2nd moments (sometimes in the 3rd and 4th).
• Some second moments:
1. Suppose X ~ Uniform(0, 1); then $E(X^2) = \int_0^1 x^2\,dx = \frac{1}{3}$.
2. Suppose X ~ Geometric(p); then, with $q = 1 - p$,
$$E(X^2) = \sum_{x=1}^{\infty} x^2\, p q^{x-1} = \frac{1 + q}{p^2}$$
Variance
• The expected value of a random variable E(X) is a measure of the “center”
of a distribution.
• The variance is a measure of how closely the probability is concentrated around the center (μ). It is also called the 2nd central moment.
• Definition
The variance of a random variable X is
$$\mathrm{Var}(X) = E\big[(X - E(X))^2\big] = E\big[(X - \mu)^2\big]$$
• Claim: $\mathrm{Var}(X) = E(X^2) - [E(X)]^2 = E(X^2) - \mu^2$
Proof:
• We can use the above formula for convenience of calculation.
• The standard deviation of a random variable X is denoted by $\sigma_X$; it is the square root of the variance, i.e. $\sigma_X = \sqrt{\mathrm{Var}(X)}$.
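The claim follows by expanding the square (a one-line argument, filled in here since the slide leaves the proof blank):
$$E\big[(X - \mu)^2\big] = E\big[X^2 - 2\mu X + \mu^2\big] = E(X^2) - 2\mu\, E(X) + \mu^2 = E(X^2) - \mu^2.$$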
Properties of Variance
For X, Y random variables and $a, b \in \mathbb{R}$ constants,
• Var(aX + b) = a²Var(X)
Proof:
• Var(aX + bY) = a²Var(X) + b²Var(Y) + 2abE[(X − E(X))(Y − E(Y))]
Proof:
• Var(X) ≥ 0
• Var(X) = 0 if and only if X = E(X) with probability 1
• Var(a) = 0
Examples
1. Suppose X ~ Uniform(0, 1). Then $E(X) = \frac{1}{2}$ and $E(X^2) = \frac{1}{3}$; therefore
$$\mathrm{Var}(X) = \frac{1}{3} - \left(\frac{1}{2}\right)^2 = \frac{1}{12}$$
2. Suppose X ~ Geometric(p). Then $E(X) = \frac{1}{p}$ and $E(X^2) = \frac{1 + q}{p^2}$; therefore
$$\mathrm{Var}(X) = \frac{1 + q}{p^2} - \frac{1}{p^2} = \frac{q}{p^2}$$
3. Suppose X ~ Bernoulli(p). Then $E(X) = p$ and $E(X^2) = 1^2 \cdot p + 0^2 \cdot q = p$; therefore
$$\mathrm{Var}(X) = p - p^2 = p(1 - p)$$
Joint Distribution of Two or More Random Variables
• Sometimes more than one measurement (r.v.) is taken on each member of
the sample space. In cases like this there will be a few random variables
defined on the same probability space and we would like to explore their
joint distribution.
• The joint behavior of two random variables X and Y (continuous or discrete) is determined by their joint cumulative distribution function
$$F_{X,Y}(x, y) = P(X \le x,\, Y \le y).$$
• n-dimensional case:
$$F_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = P(X_1 \le x_1, \ldots, X_n \le x_n).$$
Properties of Joint Distribution Function
For random variables X, Y, the joint CDF $F_{X,Y}: \mathbb{R}^2 \to [0,1]$ is given by $F_{X,Y}(x, y) = P(X \le x, Y \le y)$, and it satisfies:
• $\lim_{x \to -\infty} F_{X,Y}(x, y) = \lim_{y \to -\infty} F_{X,Y}(x, y) = 0$ and $\lim_{x \to \infty,\, y \to \infty} F_{X,Y}(x, y) = 1$
• $F_{X,Y}(x, y)$ is non-decreasing in each variable, i.e.
$$F_{X,Y}(x_1, y_1) \le F_{X,Y}(x_2, y_2) \quad \text{if } x_1 \le x_2 \text{ and } y_1 \le y_2$$
• $\lim_{x \to \infty} F_{X,Y}(x, y) = F_Y(y)$ and $\lim_{y \to \infty} F_{X,Y}(x, y) = F_X(x)$
Discrete case
• Suppose X, Y are discrete random variables defined on the same probability
space.
• The joint probability mass function of two discrete random variables X and Y is the function $p_{X,Y}(x, y)$ defined for all pairs of real numbers x and y by
$$p_{X,Y}(x, y) = P(X = x \text{ and } Y = y)$$
• For a joint pmf $p_{X,Y}(x, y)$ we must have: $p_{X,Y}(x, y) \ge 0$ for all values of x, y, and
$$\sum_{x} \sum_{y} p_{X,Y}(x, y) = 1$$
Example for illustration
• Toss a coin 3 times. Define,
X: number of heads on 1st toss, Y: total number of heads.
• The sample space is Ω ={TTT, TTH, THT, HTT, THH, HTH, HHT, HHH}.
• We display the joint distribution of X and Y in the following table (all 8 outcomes are equally likely):

            Y = 0   Y = 1   Y = 2   Y = 3
    X = 0    1/8     2/8     1/8      0
    X = 1     0      1/8     2/8     1/8
• Can we recover the probability mass function for X and Y from the joint table?
• To find the probability mass function of X we sum the appropriate rows of the
table of the joint probability function.
• Similarly, to find the mass function for Y we sum the appropriate columns.
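A small enumeration script (a sketch, not part of the original notes) reproduces the table above and both marginals:

    from itertools import product
    from collections import Counter

    # Enumerate the 8 equally likely outcomes of 3 coin tosses.
    joint = Counter()
    for toss in product("HT", repeat=3):
        x = int(toss[0] == "H")            # heads on 1st toss
        y = sum(t == "H" for t in toss)    # total number of heads
        joint[(x, y)] += 1 / 8

    # Marginals: sum over rows (for X) and over columns (for Y).
    p_x = {x0: sum(p for (x, y), p in joint.items() if x == x0) for x0 in (0, 1)}
    p_y = {y0: sum(p for (x, y), p in joint.items() if y == y0) for y0 in range(4)}
    print(p_x)  # {0: 0.5, 1: 0.5}
    print(p_y)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}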
Marginal Probability Function
• The marginal probability mass function for X is
$$p_X(x) = \sum_{y} p_{X,Y}(x, y)$$
• The marginal probability mass function for Y is
$$p_Y(y) = \sum_{x} p_{X,Y}(x, y)$$
• The case of several discrete random variables is analogous.
• If $X_1, \ldots, X_m$ are discrete random variables on the same sample space with joint probability function
$$p_{X_1,\ldots,X_m}(x_1, \ldots, x_m) = P(X_1 = x_1, \ldots, X_m = x_m),$$
the marginal probability function for $X_1$ is
$$p_{X_1}(x_1) = \sum_{x_2,\ldots,x_m} p_{X_1,\ldots,X_m}(x_1, \ldots, x_m)$$
• The 2-dimensional marginal probability function for $X_1$ and $X_2$ is
$$p_{X_1,X_2}(x_1, x_2) = \sum_{x_3,\ldots,x_m} p_{X_1,\ldots,X_m}(x_1, x_2, x_3, \ldots, x_m)$$
Example
• Roll a die twice. Let X = number of 1's and Y = total of the two dice. There is no simple closed-form expression for the joint mass function of X and Y, so we display their joint distribution in a table, built by enumerating the 36 equally likely outcomes (see the sketch below).
• The marginal probability mass functions of X and Y are obtained by summing the rows and columns of that table.
• Find P(X ≤ 1 and Y ≤ 4)
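A brute-force enumeration (a sketch, not part of the original notes) builds this joint table and answers the last question:

    from itertools import product
    from collections import Counter

    # Enumerate the 36 equally likely outcomes of two die rolls.
    joint = Counter()
    for d1, d2 in product(range(1, 7), repeat=2):
        x = (d1 == 1) + (d2 == 1)   # number of 1's
        y = d1 + d2                 # total of the two dice
        joint[(x, y)] += 1 / 36

    # P(X <= 1 and Y <= 4): all pairs summing to at most 4 except (1, 1)
    p = sum(pr for (x, y), pr in joint.items() if x <= 1 and y <= 4)
    print(p)  # 5/36 ≈ 0.1389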
The Joint Distribution of two Continuous R.V’s
• Definition
Random variables X and Y are (jointly) continuous if there is a non-negative function $f_{X,Y}(x, y)$ such that
$$P((X, Y) \in A) = \iint_A f_{X,Y}(x, y)\,dx\,dy$$
for any "reasonable" 2-dimensional set A.
• $f_{X,Y}(x, y)$ is called a joint density function for (X, Y).
• In particular, if $A = \{(X, Y): X \le x, Y \le y\}$, the joint CDF of X, Y is
$$F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\,dv\,du$$
• From the Fundamental Theorem of Calculus we have
$$f_{X,Y}(x, y) = \frac{\partial^2}{\partial x\,\partial y} F_{X,Y}(x, y) = \frac{\partial^2}{\partial y\,\partial x} F_{X,Y}(x, y)$$
Properties of joint density function
• $f_{X,Y}(x, y) \ge 0$ for all $(x, y) \in \mathbb{R}^2$
• Its integral over $\mathbb{R}^2$ is
$$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\,dy = 1$$
Example
• Consider the following bivariate density function
$$f_{X,Y}(x, y) = \begin{cases} \frac{12}{7}\left(x^2 + xy\right) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}$$
• Check if it’s a valid density function.
• Compute P(X > Y).
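The validity check works out as follows (a worked step added here; P(X > Y) is left as the exercise):
$$\int_0^1 \int_0^1 \frac{12}{7}\left(x^2 + xy\right)dx\,dy = \frac{12}{7}\left(\int_0^1 x^2\,dx + \int_0^1 x\,dx \int_0^1 y\,dy\right) = \frac{12}{7}\left(\frac{1}{3} + \frac{1}{4}\right) = 1.$$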
Marginal Densities and Distribution Functions
• The marginal (cumulative) distribution function of X is
$$F_X(x) = P(X \le x) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f_{X,Y}(u, y)\,dy\,du$$
• The marginal density of X is then
$$f_X(x) = F_X'(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy$$
• Similarly, the marginal density of Y is
$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx$$
Example
• Consider the following bivariate density function
$$f_{X,Y}(x, y) = \begin{cases} 6xy^2 & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}$$
• Check if it is a valid density function.
• Find the joint CDF of (X, Y) and compute P(X ≤ ½ ,Y ≤ ½ ).
• Compute P(X ≤ 2 ,Y ≤ ½ ).
• Find the marginal densities of X and Y.
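A symbolic check of these computations (a sketch using sympy, not part of the original notes):

    import sympy as sp

    x, y = sp.symbols("x y", nonnegative=True)
    f = 6 * x * y**2   # joint density on the unit square

    # Valid density: integrates to 1 over [0, 1] x [0, 1]
    print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))   # 1

    # Joint CDF on the square is F(x, y) = x^2 * y^3, so
    # P(X <= 1/2, Y <= 1/2) = (1/4)(1/8) = 1/32
    half = sp.Rational(1, 2)
    print(sp.integrate(f, (x, 0, half), (y, 0, half)))   # 1/32

    # Marginal densities
    print(sp.integrate(f, (y, 0, 1)))   # 2*x    -> f_X(x) = 2x
    print(sp.integrate(f, (x, 0, 1)))   # 3*y**2 -> f_Y(y) = 3y^2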
Generalization to higher dimensions
Suppose X, Y, Z are jointly continuous random variables with density f(x,y,z), then
• The marginal density of X is given by
$$f_X(x) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y, z)\,dy\,dz$$
• The marginal density of (X, Y) is given by
$$f_{X,Y}(x, y) = \int_{-\infty}^{\infty} f(x, y, z)\,dz$$
Example
Given the following joint CDF of X, Y
$$F_{X,Y}(x, y) = x^2 y + x y^2 - x^2 y^2, \qquad 0 \le x \le 1,\ 0 \le y \le 1$$
• Find the joint density of X, Y.
• Find the marginal densities of X and Y and identify them.
Example
Consider the joint density
$$f_{X,Y}(x, y) = \begin{cases} \lambda^2 e^{-\lambda y} & 0 \le x \le y \\ 0 & \text{otherwise} \end{cases}$$
where λ is a positive parameter.
• Check if it is a valid density.
• Find the marginal densities of X and Y and identify them.
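Worked out (a sketch of the computation, added here), the marginals are
$$f_X(x) = \int_x^{\infty} \lambda^2 e^{-\lambda y}\,dy = \lambda e^{-\lambda x} \ (x \ge 0), \qquad f_Y(y) = \int_0^{y} \lambda^2 e^{-\lambda y}\,dx = \lambda^2 y\, e^{-\lambda y} \ (y \ge 0),$$
so X ~ Exponential(λ) and Y ~ Gamma(2, λ); integrating either marginal over its range confirms the density is valid.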
Independence of random variables
• Recall the definition of a random variable X: a mapping from Ω to $\mathbb{R}$ such that $\{\omega \in \Omega : X(\omega) \le x\} \in \mathcal{F}$ for all $x \in \mathbb{R}$.
• By the definition of $\mathcal{F}$, this implies that (X > 1.4) is an event, and in the discrete case (X = 2) is an event.
• In general, $(X \in A) = \{\omega \in \Omega : X(\omega) \in A\}$ is an event for any set A that is formed by taking unions / complements / intersections of intervals from $\mathbb{R}$.
• Definition
Random variables X and Y are independent if the events $(X \in A)$ and $(Y \in B)$ are independent for all such sets A and B.
Theorem
• Two discrete random variables X and Y with joint pmf $p_{X,Y}(x, y)$ and marginal mass functions $p_X(x)$ and $p_Y(y)$ are independent if and only if
$$p_{X,Y}(x, y) = p_X(x)\, p_Y(y) \quad \text{for all } x, y$$
• Proof:
• Question: Back to the rolling die 2 times example, are X and Y
independent?
Theorem
• Suppose X and Y are jointly continuous random variables. Then X and Y are independent if and only if the product of their marginal densities is a joint density for the pair (X, Y), i.e.
$$f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$$
Proof:
• If X and Y are independent random variables and Z = g(X), W = h(Y), then Z and W are also independent.
Example
• Suppose X and Y are discrete random variables whose values are the non-negative integers and whose joint probability function is
$$p_{X,Y}(x, y) = \frac{1}{x!\, y!}\, \lambda^x \mu^y\, e^{-\lambda - \mu}, \qquad x, y = 0, 1, 2, \ldots$$
Are X and Y independent? What are their marginal distributions?
• Factorization is enough for independence, but we need to be careful of
constant terms for factors to be marginal probability functions.
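With the joint pmf as written above, the factorization is explicit:
$$p_{X,Y}(x, y) = \left(\frac{\lambda^x e^{-\lambda}}{x!}\right)\left(\frac{\mu^y e^{-\mu}}{y!}\right),$$
a Poisson(λ) pmf in x times a Poisson(μ) pmf in y, so X and Y are independent with X ~ Poisson(λ) and Y ~ Poisson(μ). Note the constants: $e^{-\lambda-\mu}$ had to split as $e^{-\lambda} \cdot e^{-\mu}$ for each factor to be a genuine pmf.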
Example and Important Comment
• The joint density for X, Y is given by
$$f_{X,Y}(x, y) = \begin{cases} 4(x + y)^2 & x, y \ge 0,\ x + y \le 1 \\ 0 & \text{otherwise} \end{cases}$$
• Are X, Y independent?
• Independence requires that the set of points where the joint density is positive be the Cartesian product of the set of points where the marginal densities are positive, i.e. the set of points where $f_{X,Y}(x, y) > 0$ must be a (possibly infinite) rectangle.
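Here the support is the triangle $\{x, y \ge 0,\ x + y \le 1\}$, not a rectangle, so X and Y cannot be independent. Concretely (a worked step added here),
$$f_X(x) = \int_0^{1-x} 4(x + y)^2\,dy = \frac{4}{3}\left(1 - x^3\right), \qquad 0 \le x \le 1,$$
and by symmetry $f_Y(y) = \frac{4}{3}(1 - y^3)$. The product $f_X(x)\, f_Y(y)$ is positive on the whole unit square, while $f_{X,Y}(x, y)$ vanishes above the line $x + y = 1$, so the two cannot be equal.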