Jointly distributed Random variables Multivariate distributions

Quite often two or more random variables
(X, Y, etc.) are defined for the same
random experiment.
Example:
A bridge hand (13 cards) is selected from a deck
of 52 cards.
X = the number of spades in the hand.
Y = the number of hearts in the hand.
In this example we will define:
p(x,y) = P[X = x, Y = y]
The function
p(x,y) = P[X = x, Y = y]
is called the joint probability function of
X and Y.
Note:
The possible values of X are 0, 1, 2, …, 13
The possible values of Y are also 0, 1, 2, …, 13
and X + Y ≤ 13.
p  x, y   P  X  x, Y  y 
The number of
ways of choosing
the x spades for the
hand
The number of
ways of choosing
the y hearts for the
hand
The number of ways
of completing the hand
with diamonds and
clubs.
13 13  26 
  

x  y 13  x  y 


 52 
The total number of
 
ways of choosing the
13
 
13 cards for the hand
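As a quick sanity check, the formula can be evaluated directly (a sketch in Python; the helper name `p` is my own, using `math.comb` for the binomial coefficients):

```python
from math import comb

def p(x, y):
    """Joint pmf: x spades and y hearts in a 13-card bridge hand."""
    if x < 0 or y < 0 or x + y > 13:
        return 0.0
    return comb(13, x) * comb(13, y) * comb(26, 13 - x - y) / comb(52, 13)

print(round(p(3, 3), 4))   # 0.0847
total = sum(p(x, y) for x in range(14) for y in range(14))
print(round(total, 6))     # 1.0
```

The same call reproduces every entry of the table that follows.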
Table: p(x, y)  (rows x = 0, …, 13; columns y = 0, …, 13; "-" marks impossible pairs with x + y > 13)

       y=0     y=1     y=2     y=3     y=4     y=5     y=6     y=7     y=8     y=9     y=10    y=11    y=12    y=13
x=0   0.0000  0.0002  0.0009  0.0024  0.0035  0.0032  0.0018  0.0006  0.0001  0.0000  0.0000  0.0000  0.0000  0.0000
x=1   0.0002  0.0021  0.0085  0.0183  0.0229  0.0173  0.0081  0.0023  0.0004  0.0000  0.0000  0.0000  0.0000    -
x=2   0.0009  0.0085  0.0299  0.0549  0.0578  0.0364  0.0139  0.0032  0.0004  0.0000  0.0000  0.0000    -       -
x=3   0.0024  0.0183  0.0549  0.0847  0.0741  0.0381  0.0116  0.0020  0.0002  0.0000  0.0000    -       -       -
x=4   0.0035  0.0229  0.0578  0.0741  0.0530  0.0217  0.0050  0.0006  0.0000  0.0000    -       -       -       -
x=5   0.0032  0.0173  0.0364  0.0381  0.0217  0.0068  0.0011  0.0001  0.0000    -       -       -       -       -
x=6   0.0018  0.0081  0.0139  0.0116  0.0050  0.0011  0.0001  0.0000    -       -       -       -       -       -
x=7   0.0006  0.0023  0.0032  0.0020  0.0006  0.0001  0.0000    -       -       -       -       -       -       -
x=8   0.0001  0.0004  0.0004  0.0002  0.0000  0.0000    -       -       -       -       -       -       -       -
x=9   0.0000  0.0000  0.0000  0.0000  0.0000    -       -       -       -       -       -       -       -       -
x=10  0.0000  0.0000  0.0000  0.0000    -       -       -       -       -       -       -       -       -       -
x=11  0.0000  0.0000  0.0000    -       -       -       -       -       -       -       -       -       -       -
x=12  0.0000  0.0000    -       -       -       -       -       -       -       -       -       -       -       -
x=13  0.0000    -       -       -       -       -       -       -       -       -       -       -       -       -
Bar graph: p(x, y)
[3-D bar chart of p(x, y) over the (x, y) grid, x = 0, …, 13 and y = 0, …, 13; vertical axis from 0 to 0.0900, peaking near x = 3, y = 3.]
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
Now
p(x,y) = P[X = x, Y = y]
The possible values of X are 0, 1, 2, 3, 4, 5.
The possible values of Y are 0, 1, 2, 3, 4, 5.
and X + Y ≤ 5
A typical outcome of rolling a die n = 5 times
will be a sequence such as F5FF6, where F denotes an
outcome in {1, 2, 3, 4}. The probability of any such
sequence will be:
(1/6)^x (1/6)^y (4/6)^(5 − x − y)
where
x = the number of sixes in the sequence and
y = the number of fives in the sequence
Now
p(x, y) = P[X = x, Y = y] = K (1/6)^x (1/6)^y (4/6)^(5 − x − y)
where K = the number of sequences of length 5
containing x sixes and y fives:
K = C(5, x) C(5 − x, y)
  = [5! / (x! (5 − x)!)] × [(5 − x)! / (y! (5 − x − y)!)]
  = 5! / (x! y! (5 − x − y)!)
Thus
p(x, y) = P[X = x, Y = y]
        = [5! / (x! y! (5 − x − y)!)] (1/6)^x (1/6)^y (4/6)^(5 − x − y)   if x + y ≤ 5,
        = 0 otherwise.
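This trinomial pmf is easy to evaluate directly (a sketch; the helper name `p` is my own):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf: x sixes and y fives in n rolls of a fair die."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

print(round(p(0, 0), 4))  # 0.1317
print(round(p(1, 1), 4))  # 0.1646
```

These match the table entries below.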
Table: p(x, y)  (rows x = 0, …, 5; columns y = 0, …, 5)

p(x, y) = [5! / (x! y! (5 − x − y)!)] (1/6)^x (1/6)^y (4/6)^(5 − x − y)

        y=0     y=1     y=2     y=3     y=4     y=5
x=0   0.1317  0.1646  0.0823  0.0206  0.0026  0.0001
x=1   0.1646  0.1646  0.0617  0.0103  0.0006  0
x=2   0.0823  0.0617  0.0154  0.0013  0       0
x=3   0.0206  0.0103  0.0013  0       0       0
x=4   0.0026  0.0006  0       0       0       0
x=5   0.0001  0       0       0       0       0
Bar graph: p(x, y)
[3-D bar chart of p(x, y) over the (x, y) grid, x = 0, …, 5 and y = 0, …, 5; vertical axis from 0.0000 to 0.1800.]
General properties of the joint probability
function:
p(x, y) = P[X = x, Y = y]
1.  0 ≤ p(x, y) ≤ 1
2.  Σ_x Σ_y p(x, y) = 1
3.  P[(X, Y) ∈ A] = Σ_{(x, y) ∈ A} p(x, y)
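These three properties can be verified numerically for the die example (a sketch; the pmf helper `p` is my own):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf for the die example: x sixes, y fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

values = [p(x, y) for x in range(6) for y in range(6)]
assert all(0 <= v <= 1 for v in values)   # property 1
assert abs(sum(values) - 1) < 1e-12       # property 2
# Property 3, with A = {(x, y): x + y = 5}:
prob_A = sum(p(x, 5 - x) for x in range(6))
print(round(prob_A, 4))  # 0.0041
```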
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
What is the probability that we roll more sixes
than fives,
i.e. what is P[X > Y]?
Table: p(x, y)  (rows x = 0, …, 5; columns y = 0, …, 5)

        y=0     y=1     y=2     y=3     y=4     y=5
x=0   0.1317  0.1646  0.0823  0.0206  0.0026  0.0001
x=1   0.1646  0.1646  0.0617  0.0103  0.0006  0
x=2   0.0823  0.0617  0.0154  0.0013  0       0
x=3   0.0206  0.0103  0.0013  0       0       0
x=4   0.0026  0.0006  0       0       0       0
x=5   0.0001  0       0       0       0       0

P[X > Y] = Σ_{x > y} p(x, y) = 0.3441
(the sum of all entries with x > y, i.e. below the main diagonal)
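Summing the entries with x > y reproduces this value (a sketch; the helper name `p` is my own):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf for the die example: x sixes, y fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

prob = sum(p(x, y) for x in range(6) for y in range(6) if x > y)
print(round(prob, 4))  # 0.3441
```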
Marginal and conditional
distributions
Definition:
Let X and Y denote two discrete random variables
with joint probability function
p(x,y) = P[X = x, Y = y]
Then
pX(x) = P[X = x] is called the marginal
probability function of X.
and
pY(y) = P[Y = y] is called the marginal
probability function of Y.
Note: Let y1, y2, y3, … denote the possible values of Y.
pX(x) = P[X = x]
      = P[{X = x, Y = y1} ∪ {X = x, Y = y2} ∪ ⋯]
      = P[X = x, Y = y1] + P[X = x, Y = y2] + ⋯
      = p(x, y1) + p(x, y2) + ⋯
      = Σ_j p(x, yj) = Σ_y p(x, y)
Thus the marginal probability function of X, pX(x) is
obtained from the joint probability function of X and Y by
summing p(x,y) over the possible values of Y.
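This summing-out operation is mechanical (a sketch for the die example; the helper names are my own):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf for the die example: x sixes, y fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

def p_X(x):
    return sum(p(x, y) for y in range(6))   # sum out y

print([round(p_X(x), 4) for x in range(6)])
# [0.4019, 0.4019, 0.1608, 0.0322, 0.0032, 0.0001]
```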
Also
pY(y) = P[Y = y]
      = P[{X = x1, Y = y} ∪ {X = x2, Y = y} ∪ ⋯]
      = P[X = x1, Y = y] + P[X = x2, Y = y] + ⋯
      = p(x1, y) + p(x2, y) + ⋯
      = Σ_i p(xi, y) = Σ_x p(x, y)
Thus the marginal probability function of Y, pY(y), is
obtained by summing p(x, y) over the possible values of X.
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
Table: p(x, y) with marginals  (rows y = 0, …, 5; columns x = 0, …, 5)

          x=0     x=1     x=2     x=3     x=4     x=5    pY(y)
y=0     0.1317  0.1646  0.0823  0.0206  0.0026  0.0001  0.4019
y=1     0.1646  0.1646  0.0617  0.0103  0.0006  0       0.4019
y=2     0.0823  0.0617  0.0154  0.0013  0       0       0.1608
y=3     0.0206  0.0103  0.0013  0       0       0       0.0322
y=4     0.0026  0.0006  0       0       0       0       0.0032
y=5     0.0001  0       0       0       0       0       0.0001
pX(x)   0.4019  0.4019  0.1608  0.0322  0.0032  0.0001
Conditional Distributions
Definition:
Let X and Y denote two discrete random variables
with joint probability function
p(x,y) = P[X = x, Y = y]
Then
pX|Y(x|y) = P[X = x | Y = y] is called the conditional
probability function of X given Y = y
and
pY|X(y|x) = P[Y = y | X = x] is called the conditional
probability function of Y given X = x
Note
pX|Y(x|y) = P[X = x | Y = y]
          = P[X = x, Y = y] / P[Y = y]
          = p(x, y) / pY(y)
and
pY|X(y|x) = P[Y = y | X = x]
          = P[X = x, Y = y] / P[X = x]
          = p(x, y) / pX(x)
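For instance, for the die example (a sketch; the helper names are my own):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf for the die example: x sixes, y fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

def p_Y(y):
    return sum(p(x, y) for x in range(6))

def p_X_given_Y(x, y):
    """Conditional pmf of X given Y = y: joint over marginal."""
    return p(x, y) / p_Y(y)

print(round(p_X_given_Y(0, 0), 4))  # 0.3277
```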
• Marginal distributions describe how one
variable behaves, ignoring the other variable.
• Conditional distributions describe how one
variable behaves when the other variable is
held fixed.
Example:
A die is rolled n = 5 times
X = the number of times a “six” appears.
Y = the number of times a “five” appears.
Table: p(x, y) with marginals  (rows y = 0, …, 5; columns x = 0, …, 5)

          x=0     x=1     x=2     x=3     x=4     x=5    pY(y)
y=0     0.1317  0.1646  0.0823  0.0206  0.0026  0.0001  0.4019
y=1     0.1646  0.1646  0.0617  0.0103  0.0006  0       0.4019
y=2     0.0823  0.0617  0.0154  0.0013  0       0       0.1608
y=3     0.0206  0.0103  0.0013  0       0       0       0.0322
y=4     0.0026  0.0006  0       0       0       0       0.0032
y=5     0.0001  0       0       0       0       0       0.0001
pX(x)   0.4019  0.4019  0.1608  0.0322  0.0032  0.0001
The conditional distribution of X given Y = y.
pX |Y(x|y) = P[X = x|Y = y]
(rows y = 0, …, 5; columns x = 0, …, 5; each row sums to 1)

         x=0     x=1     x=2     x=3     x=4     x=5
y=0    0.3277  0.4096  0.2048  0.0512  0.0064  0.0003
y=1    0.4096  0.4096  0.1536  0.0256  0.0016  0.0000
y=2    0.5120  0.3840  0.0960  0.0080  0.0000  0.0000
y=3    0.6400  0.3200  0.0400  0.0000  0.0000  0.0000
y=4    0.8000  0.2000  0.0000  0.0000  0.0000  0.0000
y=5    1.0000  0.0000  0.0000  0.0000  0.0000  0.0000
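As a check, each row of this table can be recomputed from the joint pmf and must sum to 1 (a sketch; the helper names are my own):

```python
from math import factorial

def p(x, y, n=5):
    """Joint pmf for the die example: x sixes, y fives in n = 5 rolls."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * (1/6)**x * (1/6)**y * (4/6)**(n - x - y)

def p_Y(y):
    return sum(p(x, y) for x in range(6))

# Each conditional distribution p(x|y) = p(x, y)/p_Y(y) sums to 1 over x.
for y in range(6):
    row = [p(x, y) / p_Y(y) for x in range(6)]
    assert abs(sum(row) - 1) < 1e-12

print(round(p(0, 2) / p_Y(2), 4))  # 0.512
```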
The conditional distribution of Y given X = x.
pY |X(y|x) = P[Y = y|X = x]
(rows y = 0, …, 5; columns x = 0, …, 5; each column sums to 1)

         x=0     x=1     x=2     x=3     x=4     x=5
y=0    0.3277  0.4096  0.5120  0.6400  0.8000  1.0000
y=1    0.4096  0.4096  0.3840  0.3200  0.2000  0.0000
y=2    0.2048  0.1536  0.0960  0.0400  0.0000  0.0000
y=3    0.0512  0.0256  0.0080  0.0000  0.0000  0.0000
y=4    0.0064  0.0016  0.0000  0.0000  0.0000  0.0000
y=5    0.0003  0.0000  0.0000  0.0000  0.0000  0.0000
Example
A Bernoulli trial (S with probability p, F with
probability q = 1 − p) is repeated until
two successes have occurred.
X = the trial on which the first success occurs
and
Y = the trial on which the 2nd success occurs.
Find the joint probability function of X, Y.
Find the marginal probability functions of X and Y.
Find the conditional probability functions of Y given
X = x and of X given Y = y.
Solution
A typical outcome would be:
FF…FSFF…FS
with x − 1 F's before the first S (on trial x) and
y − x − 1 F's between the two S's (the second S on trial y).
Hence
p(x, y) = P[X = x, Y = y]
        = q^(x−1) p q^(y−x−1) p = q^(y−2) p²   if y > x
and
p(x, y) = q^(y−2) p²  if y > x,
        = 0           otherwise.
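A quick numeric check that this joint pmf sums to 1 (a sketch with an arbitrary choice p = 0.5; the function name `joint` is my own):

```python
def joint(x, y, p=0.5):
    """p(x, y) = p^2 q^(y-2) when y > x >= 1, else 0."""
    q = 1 - p
    return p * p * q ** (y - 2) if 1 <= x < y else 0.0

# The truncated double sum should be very close to 1.
total = sum(joint(x, y) for x in range(1, 100) for y in range(1, 101))
print(round(total, 6))  # 1.0
```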
p(x, y) - Table  (rows y = 1, …, 8; columns x = 1, …, 8; p(x, y) = p²q^(y−2) when y > x)

        x=1    x=2    x=3    x=4    x=5    x=6    x=7    x=8
y=1     0      0      0      0      0      0      0      0
y=2     p²     0      0      0      0      0      0      0
y=3     p²q    p²q    0      0      0      0      0      0
y=4     p²q²   p²q²   p²q²   0      0      0      0      0
y=5     p²q³   p²q³   p²q³   p²q³   0      0      0      0
y=6     p²q⁴   p²q⁴   p²q⁴   p²q⁴   p²q⁴   0      0      0
y=7     p²q⁵   p²q⁵   p²q⁵   p²q⁵   p²q⁵   p²q⁵   0      0
y=8     p²q⁶   p²q⁶   p²q⁶   p²q⁶   p²q⁶   p²q⁶   p²q⁶   0
The marginal distribution of X
pX(x) = P[X = x] = Σ_y p(x, y)
      = Σ_{y = x+1}^∞ p² q^(y−2)
      = p² q^(x−1) + p² q^x + p² q^(x+1) + p² q^(x+2) + ⋯
      = p² q^(x−1) (1 + q + q² + q³ + ⋯)
      = p² q^(x−1) × 1/(1 − q)
      = p q^(x−1)   for x = 1, 2, 3, …
This is the geometric distribution.
The marginal distribution of Y
pY(y) = P[Y = y] = Σ_x p(x, y)
      = (y − 1) p² q^(y−2)   for y = 2, 3, 4, …
      = 0 otherwise.
This is the negative binomial distribution with k = 2.
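Both marginals can be checked numerically against these closed forms (a sketch with an arbitrary choice p = 0.3; the names are my own):

```python
p_, q_ = 0.3, 0.7   # an arbitrary success probability for the check

def joint(x, y):
    """p(x, y) = p^2 q^(y-2) when y > x >= 1, else 0."""
    return p_ * p_ * q_ ** (y - 2) if 1 <= x < y else 0.0

# Marginal of X: summing out y should give the geometric pmf p q^(x-1).
for x in range(1, 8):
    marg_x = sum(joint(x, y) for y in range(x + 1, 2000))
    assert abs(marg_x - p_ * q_ ** (x - 1)) < 1e-12

# Marginal of Y: summing out x should give (y - 1) p^2 q^(y-2).
for y in range(2, 10):
    marg_y = sum(joint(x, y) for x in range(1, y))
    assert abs(marg_y - (y - 1) * p_ ** 2 * q_ ** (y - 2)) < 1e-12
```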
The conditional distribution of X given Y = y
pX|Y(x|y) = P[X = x | Y = y]
          = P[X = x, Y = y] / P[Y = y]
          = p(x, y) / pY(y)
          = p² q^(y−2) / [(y − 1) p² q^(y−2)]
          = 1/(y − 1)   for x = 1, 2, …, y − 1
This is the uniform distribution on the values 1, 2, …, (y − 1).
The conditional distribution of Y given X = x
pY|X(y|x) = P[Y = y | X = x]
          = P[X = x, Y = y] / P[X = x]
          = p(x, y) / pX(x)
          = p² q^(y−2) / (p q^(x−1))
          = p q^(y−x−1)   for y = x + 1, x + 2, x + 3, …
This is the geometric distribution with time starting at x.
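A numeric check of both conditional distributions, that X given Y = y is uniform on 1, …, y − 1 and Y given X = x is geometric (a sketch with an arbitrary choice p = 0.3; the names are my own):

```python
p_, q_ = 0.3, 0.7   # an arbitrary success probability for the check

def joint(x, y):
    """p(x, y) = p^2 q^(y-2) when y > x >= 1, else 0."""
    return p_ * p_ * q_ ** (y - 2) if 1 <= x < y else 0.0

def p_X(x):
    return p_ * q_ ** (x - 1)            # geometric marginal

def p_Y(y):
    return (y - 1) * p_ ** 2 * q_ ** (y - 2)   # negative-binomial marginal

# X | Y = 6 should be uniform on {1, ..., 5}:
cond_x = [joint(x, 6) / p_Y(6) for x in range(1, 6)]
assert all(abs(c - 1/5) < 1e-12 for c in cond_x)

# Y | X = 2 should be geometric in y, starting at y = 3:
for y in range(3, 10):
    assert abs(joint(2, y) / p_X(2) - p_ * q_ ** (y - 3)) < 1e-12
```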