Probability theory
Random Vector
Martina LitschmannovΓ‘
[email protected]
K210
Random Vectors
• A k-dimensional random vector is a function $\mathbf{X} = (X_1, \ldots, X_k)^T$ that associates a vector of real numbers with each outcome of a random experiment; each component $X_i$ is a random variable.
For example:
• A semiconductor chip is divided into M regions. For the random experiment of finding the number of defects and their locations, let $N_i$ denote the number of defects in the $i$-th region. Then $(N_1, \ldots, N_M)^T$ is a discrete random vector.
• In a random experiment of selecting a student's name, let $H_i$ = height of the $i$-th student in inches and $W_i$ = weight of the $i$-th student in pounds. Then $(H_i, W_i)^T$ is a continuous random vector.
2-dimensional Random Vectors
• We will focus on 2-dimensional distributions (i.e. the random vector consists of only two random variables), but higher dimensions (more than two variables) are also possible.
Joint Probability Distribution
• The joint distribution of a random vector $(X, Y)^T$ defines the probability of events described in terms of both X and Y.
The joint cumulative distribution function of $\mathbf{X} = (X, Y)^T$ is given by
$F(x, y) = P(X < x \wedge Y < y) = P(X < x;\ Y < y).$
Joint cumulative distribution function
The joint cumulative distribution function of $\mathbf{X} = (X, Y)^T$ is given by
$F(x, y) = P(X < x \wedge Y < y) = P(X < x;\ Y < y).$
Properties of the joint CDF:
1. $\forall (x, y) \in \mathbb{R}^2:\ 0 \le F(x, y) \le 1$,
2. $\lim_{x \to -\infty} F(x, y) = \lim_{y \to -\infty} F(x, y) = 0$,
3. $\lim_{(x, y) \to (\infty, \infty)} F(x, y) = 1$,
4. $F(x, y)$ is nondecreasing in each variable,
5. $F(x, y)$ is continuous from the left in each variable,
6. $P(a \le X < b,\ c \le Y < d) = F(b, d) - F(a, d) - F(b, c) + F(a, c)$.
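Property 6 is how a joint CDF is used in practice: the probability of a rectangle comes from four CDF evaluations. A minimal Python sketch; the uniform joint CDF here is an illustrative assumption, not part of the slides.

```python
def rect_prob(F, a, b, c, d):
    """P(a <= X < b, c <= Y < d) via property 6 of the joint CDF."""
    return F(b, d) - F(a, d) - F(b, c) + F(a, c)

def F_unif(x, y):
    """Assumed example: X, Y independent and uniform on (0, 1)."""
    clip = lambda t: min(max(t, 0.0), 1.0)
    return clip(x) * clip(y)

print(rect_prob(F_unif, 0.2, 0.5, 0.1, 0.4))  # 0.3 * 0.3 = 0.09
```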
Discrete Joint Probability Distributions
The probability function, also known as the probability mass function, of a joint probability distribution is defined by
$p(x_i, y_j) = P(X = x_i \wedge Y = y_j) = P(X = x_i;\ Y = y_j).$
$p(x_i, y_j) = P(X = x_i \mid Y = y_j) \cdot P(Y = y_j) = P(Y = y_j \mid X = x_i) \cdot P(X = x_i)$
$\sum_{i=1}^{n_1} \sum_{j=1}^{n_2} p(x_i, y_j) = 1, \qquad n_1 \ge 1,\ n_2 \ge 1.$
Discrete Joint Probability Distributions
Probability mass function of a joint probability distribution:
$p(x_i, y_j) = P(X = x_i \wedge Y = y_j) = P(X = x_i;\ Y = y_j)$
Properties of the p.m.f.:
1. $p(x_i, y_j) > 0$ only for a finite or countable set of values $(x_i, y_j)$,
2. $0 \le p(x_i, y_j) \le 1$,
3. $\sum_{i=1}^{n_1} \sum_{j=1}^{n_2} p(x_i, y_j) = 1, \qquad n_1 \ge 1,\ n_2 \ge 1$,
4. $p(x_i, y_j) = P(X = x_i \mid Y = y_j) \cdot P(Y = y_j) = P(Y = y_j \mid X = x_i) \cdot P(X = x_i)$,
5. if X and Y are independent, $p(x_i, y_j) = P(X = x_i) \cdot P(Y = y_j)$.
Table of joint probabilities
 X \ Y |    y1        y2      ...     yn2
-------+------------------------------------
  x1   | p(x1,y1)  p(x1,y2)   ...  p(x1,yn2)
  x2   | p(x2,y1)  p(x2,y2)   ...  p(x2,yn2)
  ...  |   ...        ...     ...     ...
  xn1  | p(xn1,y1) p(xn1,y2)  ...  p(xn1,yn2)
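A joint probability table maps naturally onto a 2-D array. The sketch below (with made-up numbers, not from the slides) checks p.m.f. properties 2 and 3:

```python
import numpy as np

# Illustrative joint table: rows are values of X, columns are values of Y.
p = np.array([[0.10, 0.20, 0.20],
              [0.15, 0.15, 0.20]])

assert np.all((p >= 0) & (p <= 1))   # property 2: entries in [0, 1]
assert np.isclose(p.sum(), 1.0)      # property 3: total mass is one
```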
1. A random experiment consists of tossing a coin (X) and rolling a die (Y). Find the probability mass function of the joint probability distribution of the random vector (X, Y).

Since the coin and the die are independent, $p(x_i, y_j) = P(X = x_i) \cdot P(Y = y_j) = \frac{1}{2} \cdot \frac{1}{6} = \frac{1}{12}$ for every pair:

 Y \ X | 0 (head) | 1 (tail)
-------+----------+---------
   1   |   1/12   |   1/12
   2   |   1/12   |   1/12
   3   |   1/12   |   1/12
   4   |   1/12   |   1/12
   5   |   1/12   |   1/12
   6   |   1/12   |   1/12

Control cell: the twelve probabilities sum to 1.
2. The probability mass function of the joint probability distribution of a random vector (X, Y) is given as:

 Y \ X |  -1  |  0   |  1
-------+------+------+------
  -2   | 0.13 | 0    | 0.23
   0   | 0.11 | 0.11 | 0.01
   1   | 0.07 | 0    | 0
   2   | 0    | 0.33 | 0.01

(Control cell: the entries sum to 1.)

Find:
a) $p(0; 1)$
b) $p(1; 0)$
c) $p(3; 1)$
d) $P(X < 1.3;\ Y < -0.6)$
e) $P(X < 1.3;\ Y > -0.6)$
f) $F(0.7; 1.3)$
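A sketch of how the lookups in this exercise can be checked numerically; the array mirrors the table above, and the probabilities follow from summing the relevant cells:

```python
import numpy as np

x_vals = np.array([-1, 0, 1])        # columns
y_vals = np.array([-2, 0, 1, 2])     # rows
p = np.array([[0.13, 0.00, 0.23],
              [0.11, 0.11, 0.01],
              [0.07, 0.00, 0.00],
              [0.00, 0.33, 0.01]])

def pmf(x, y):
    """p(x, y); zero for pairs outside the table, e.g. p(3, 1)."""
    j, = np.where(x_vals == x)
    i, = np.where(y_vals == y)
    return p[i[0], j[0]] if i.size and j.size else 0.0

print(pmf(0, 1))       # a) p(0; 1) = 0.0
mask = (x_vals[None, :] < 1.3) & (y_vals[:, None] < -0.6)
print(p[mask].sum())   # d) P(X < 1.3, Y < -0.6) = 0.36
```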
Continuous Joint Probability Distributions
$F(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(s, t)\, ds\, dt,$
where $f(x, y)$ is the joint probability density function.
Properties of the joint probability density function:
1. $f(x, y) \ge 0$,
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1$,
3. if $\frac{\partial^2 F(x, y)}{\partial x\, \partial y}$ exists, then $f(x, y) = \frac{\partial^2 F(x, y)}{\partial x\, \partial y}$,
4. $P(a \le X < b,\ c \le Y < d) = \int_{c}^{d} \int_{a}^{b} f(x, y)\, dx\, dy$.
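Property 2 can also be checked numerically. A sketch with scipy, using the density f(x, y) = x + y on the unit square (this is the c = 1 case solved in the next exercise):

```python
from scipy.integrate import dblquad

# dblquad integrates its first argument innermost: f(y, x), y from g(x) to h(x).
f = lambda y, x: x + y

total, err = dblquad(f, 0, 1, lambda x: 0.0, lambda x: 1.0)
print(total)   # 1.0 -> the density integrates to one (property 2)
```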
3. Find the constant c so that the function
$f(x, y) = \begin{cases} c(x + y) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{elsewhere} \end{cases}$
can be a joint probability density function of a random vector $(X, Y)^T$.
For the function $f(x, y)$ to be a joint probability density function of a random vector $(X, Y)^T$, it has to satisfy the condition
$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1.$
$\int_{0}^{1} \int_{0}^{1} c(x + y)\, dx\, dy = 1$
$c \int_{0}^{1} \int_{0}^{1} (x + y)\, dx\, dy = c \int_{0}^{1} \left[ \frac{x^2}{2} + xy \right]_{x=0}^{x=1} dy = c \int_{0}^{1} \left( \frac{1}{2} + y \right) dy = c \left[ \frac{y}{2} + \frac{y^2}{2} \right]_{0}^{1} = c \quad \Rightarrow \quad c = 1$
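The same normalization condition can be handed to a computer algebra system; a short sympy sketch confirming c = 1:

```python
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)
total = sp.integrate(c * (x + y), (x, 0, 1), (y, 0, 1))   # equals c
print(sp.solve(sp.Eq(total, 1), c))   # [1] -> c = 1
```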
Marginal probability distributions
Marginal distributions are obtained by summing or integrating the joint probability distribution over the values of the other random variable.
• Discrete random vector:
$p_X(x_i) = \sum_{y_j} p(x_i, y_j),\ i \ge 1, \qquad p_Y(y_j) = \sum_{x_i} p(x_i, y_j),\ j \ge 1.$
• Continuous random vector:
$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy,\ x \in \mathbb{R}, \qquad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx,\ y \in \mathbb{R}.$
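For a discrete joint table the marginals are just row and column sums. A quick numpy sketch using the table from exercise 2:

```python
import numpy as np

p = np.array([[0.13, 0.00, 0.23],   # rows: y in {-2, 0, 1, 2}
              [0.11, 0.11, 0.01],   # columns: x in {-1, 0, 1}
              [0.07, 0.00, 0.00],
              [0.00, 0.33, 0.01]])

p_X = p.sum(axis=0)   # marginal of X: [0.31, 0.44, 0.25]
p_Y = p.sum(axis=1)   # marginal of Y: [0.36, 0.23, 0.07, 0.34]
```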
4. A random experiment consists of tossing a coin (X) and rolling a die (Y). The probability mass function of the joint probability distribution of the random vector (X, Y) is given as:

 Y \ X | 0 (head) | 1 (tail)
-------+----------+---------
   1   |   1/12   |   1/12
   2   |   1/12   |   1/12
   3   |   1/12   |   1/12
   4   |   1/12   |   1/12
   5   |   1/12   |   1/12
   6   |   1/12   |   1/12

Find the marginal probability mass functions $p_X(x_i)$ and $p_Y(y_j)$.

Summing the columns and rows of the table (e.g. $p_Y(1) = 1/12 + 1/12 = 2/12 = 1/6$) gives:

 Y \ X    | 0 (head) | 1 (tail) | p_Y(y_j)
----------+----------+----------+---------
    1     |   1/12   |   1/12   |   1/6
    2     |   1/12   |   1/12   |   1/6
    3     |   1/12   |   1/12   |   1/6
    4     |   1/12   |   1/12   |   1/6
    5     |   1/12   |   1/12   |   1/6
    6     |   1/12   |   1/12   |   1/6
 p_X(x_i) |   1/2    |   1/2    |    1

So the marginal probability mass functions are:

    X     | 0 (head) | 1 (tail)
 p_X(x_i) |   1/2    |   1/2

    Y     |  1  |  2  |  3  |  4  |  5  |  6
 p_Y(y_j) | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6
5. A certain farm produces two kinds of eggs on any given day: organic and non-organic. Let these two kinds of eggs be represented by the random variables X and Y, respectively. The joint probability density function of these variables is given by
$f(x; y) = \begin{cases} \frac{2}{3}(x + 2y) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{elsewhere.} \end{cases}$
Find:
a) the marginal density functions $f_X(x)$ and $f_Y(y)$.

$f_X(x) = \int_{-\infty}^{\infty} f(x; y)\, dy = \int_{0}^{1} \frac{2}{3}(x + 2y)\, dy = \frac{2}{3}\left[ xy + y^2 \right]_{y=0}^{y=1} = \frac{2}{3}(x + 1)$
$f_Y(y) = \int_{-\infty}^{\infty} f(x; y)\, dx = \int_{0}^{1} \frac{2}{3}(x + 2y)\, dx = \frac{2}{3}\left[ \frac{x^2}{2} + 2xy \right]_{x=0}^{x=1} = \frac{1}{3}(1 + 4y)$
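A sympy sketch reproducing both marginal densities:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 3) * (x + 2*y)

f_X = sp.integrate(f, (y, 0, 1))   # 2*x/3 + 2/3 = (2/3)(x + 1)
f_Y = sp.integrate(f, (x, 0, 1))   # 4*y/3 + 1/3 = (1/3)(1 + 4y)
print(sp.factor(f_X), sp.factor(f_Y))
```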
5. (continued) With the same joint density
$f(x; y) = \begin{cases} \frac{2}{3}(x + 2y) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{elsewhere,} \end{cases}$
find:
b) $P(X < \frac{1}{2};\ Y \le \frac{1}{2})$

$P\!\left(X < \tfrac{1}{2};\ Y \le \tfrac{1}{2}\right) = \int_{0}^{0.5} \int_{0}^{0.5} \frac{2}{3}(x + 2y)\, dx\, dy = \int_{0}^{0.5} \frac{2}{3}\left[ \frac{x^2}{2} + 2xy \right]_{x=0}^{x=0.5} dy = \int_{0}^{0.5} \frac{2}{3}\left( \frac{1}{8} + y \right) dy = \frac{2}{3}\left[ \frac{y}{8} + \frac{y^2}{2} \right]_{0}^{0.5} = \frac{1}{8}$
5. (continued) Find:
c) $P(X < \frac{1}{2};\ Y > \frac{1}{4})$

$P\!\left(X < \tfrac{1}{2};\ Y > \tfrac{1}{4}\right) = \int_{0.25}^{1} \int_{0}^{0.5} \frac{2}{3}(x + 2y)\, dx\, dy = \int_{0.25}^{1} \frac{2}{3}\left[ \frac{x^2}{2} + 2xy \right]_{x=0}^{x=0.5} dy = \int_{0.25}^{1} \frac{2}{3}\left( \frac{1}{8} + y \right) dy =$
$= \frac{2}{3}\left[ \frac{y}{8} + \frac{y^2}{2} \right]_{0.25}^{1} = \frac{5}{12} - \frac{1}{24} = \frac{3}{8}$
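Both probabilities in parts b) and c) are double integrals over rectangles, so they are easy to confirm; a sympy sketch:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 3) * (x + 2*y)
half, quarter = sp.Rational(1, 2), sp.Rational(1, 4)

p_b = sp.integrate(f, (x, 0, half), (y, 0, half))     # b) 1/8
p_c = sp.integrate(f, (x, 0, half), (y, quarter, 1))  # c) 3/8
print(p_b, p_c)
```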
Conditional probability distributions
Conditional probability distributions arise from joint probability distributions when we need the probability of one event given that the other event has happened, where the random variables behind these events are jointly distributed.
• Discrete random vector:
$p(x_i \mid y_j) = \frac{p(x_i, y_j)}{p_Y(y_j)},\ \ p_Y(y_j) \ne 0, \qquad p(y_j \mid x_i) = \frac{p(x_i, y_j)}{p_X(x_i)},\ \ p_X(x_i) \ne 0.$
• Continuous random vector:
$f(x \mid y) = \frac{f(x, y)}{f_Y(y)},\ \ f_Y(y) \ne 0, \qquad f(y \mid x) = \frac{f(x, y)}{f_X(x)},\ \ f_X(x) \ne 0.$
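For a discrete table, conditioning amounts to renormalizing a row (or column). A sketch with the exercise-2 table, conditioning on Y = 1:

```python
import numpy as np

p = np.array([[0.13, 0.00, 0.23],   # rows: y in {-2, 0, 1, 2}
              [0.11, 0.11, 0.01],   # columns: x in {-1, 0, 1}
              [0.07, 0.00, 0.00],
              [0.00, 0.33, 0.01]])

row = p[2]                  # the y = 1 row
print(row / row.sum())      # p(x | y = 1) = [1., 0., 0.]
```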
6. The joint and marginal probability distribution of a random vector (X, Y) is given as:

 Y \ X    |  -1  |  0   |  1   | p_Y(y_j)
----------+------+------+------+---------
   -2     | 0.13 | 0    | 0.23 |  0.36
    0     | 0.11 | 0.11 | 0.01 |  0.23
    1     | 0.07 | 0    | 0    |  0.07
    2     | 0    | 0.33 | 0.01 |  0.34
 p_X(x_i) | 0.31 | 0.44 | 0.25 |    1

Find:
a) $P(X = 0 \mid Y = 1)$
b) $P(Y = 0 \mid X = 1)$
c) $P(X = 0 \mid Y = 2)$
d) $P(Y = 0 \mid X = 2)$
e) $p(x \mid y)$
f) $p(y \mid x)$
7. The joint probability density function of (X, Y) is given by
$f(x; y) = \begin{cases} \frac{2}{3}(x + 2y) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{elsewhere.} \end{cases}$
Find:
a) the conditional density function $f(y \mid x)$.

$f(y \mid x) = \frac{f(x, y)}{f_X(x)}, \qquad f_X(x) = \begin{cases} \frac{2}{3}(x + 1) & 0 \le x \le 1 \\ 0 & \text{elsewhere,} \end{cases}$
so for each fixed $0 \le x \le 1$: $f(y \mid x) = \frac{x + 2y}{x + 1}$ for $0 \le y \le 1$, and 0 elsewhere.
7. (continued) Find:
b) the conditional density function $f(x \mid y)$.

$f(x \mid y) = \frac{f(x, y)}{f_Y(y)}, \qquad f_Y(y) = \begin{cases} \frac{1}{3}(1 + 4y) & 0 \le y \le 1 \\ 0 & \text{elsewhere,} \end{cases}$
so for each fixed $0 \le y \le 1$: $f(x \mid y) = \frac{2(x + 2y)}{1 + 4y}$ for $0 \le x \le 1$, and 0 elsewhere.
Conditional expected value (expectation)
Conditional expectation is the expected value of a real random variable
with respect to a conditional probability distribution.
Discrete random vector:
• $E(X \mid Y = y) = \sum_{(x)} x \cdot p(x \mid Y = y)$
• $E(Y \mid X = x) = \sum_{(y)} y \cdot p(y \mid X = x)$
Continuous random vector:
• $E(X \mid Y = y) = \int_{-\infty}^{\infty} x \cdot f(x \mid Y = y)\, dx$
• $E(Y \mid X = x) = \int_{-\infty}^{\infty} y \cdot f(y \mid X = x)\, dy$
Conditional variance
Conditional variance is the variance of a conditional probability
distribution.
$D(X \mid Y = y) = E\big[ (X - E(X \mid Y = y))^2 \mid Y = y \big] = E(X^2 \mid Y = y) - \big[ E(X \mid Y = y) \big]^2$
$D(Y \mid X = x) = E\big[ (Y - E(Y \mid X = x))^2 \mid X = x \big] = E(Y^2 \mid X = x) - \big[ E(Y \mid X = x) \big]^2$
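Both conditional moments reduce to weighted sums over a renormalized row. A sketch with the exercise-2 table, conditioning on Y = 0 (exercise 8 below asks for the same quantities with Y = 1):

```python
import numpy as np

x_vals = np.array([-1, 0, 1])
p = np.array([[0.13, 0.00, 0.23],   # rows: y in {-2, 0, 1, 2}
              [0.11, 0.11, 0.01],
              [0.07, 0.00, 0.00],
              [0.00, 0.33, 0.01]])

cond = p[1] / p[1].sum()            # p(x | y = 0)
e_x = x_vals @ cond                 # E(X | Y = 0) ~ -0.435
d_x = (x_vals**2) @ cond - e_x**2   # D(X | Y = 0) = E(X^2|Y=0) - E(X|Y=0)^2
print(e_x, d_x)
```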
8. The joint and marginal probability distribution of a random vector (X, Y) is given as:

 Y \ X    |  -1  |  0   |  1   | p_Y(y_j)
----------+------+------+------+---------
   -2     | 0.13 | 0    | 0.23 |  0.36
    0     | 0.11 | 0.11 | 0.01 |  0.23
    1     | 0.07 | 0    | 0    |  0.07
    2     | 0    | 0.33 | 0.01 |  0.34
 p_X(x_i) | 0.31 | 0.44 | 0.25 |    1

Find:
a) $E(X \mid Y = 1)$
b) $D(X \mid Y = 1)$
9. The joint probability density function of (X, Y) is given by
$f(x; y) = \begin{cases} \frac{2}{3}(x + 2y) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{elsewhere.} \end{cases}$
Find:
a) $E(X \mid Y = 0.5)$,
b) $D(X \mid Y = 0.5)$.
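A sympy sketch for this continuous case: build f(x | y), substitute y = 0.5, and integrate.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 3) * (x + 2*y)

f_Y = sp.integrate(f, (x, 0, 1))                 # marginal (1/3)(1 + 4y)
f_cond = (f / f_Y).subs(y, sp.Rational(1, 2))    # f(x | y = 0.5) = (2/3)(x + 1)

e_x = sp.integrate(x * f_cond, (x, 0, 1))              # E(X | Y = 0.5) = 5/9
d_x = sp.integrate(x**2 * f_cond, (x, 0, 1)) - e_x**2  # D(X | Y = 0.5) = 13/162
print(e_x, d_x)
```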
Independence
Two random variables X and Y are independent if
• Discrete random variables: $\forall i, j:\ p(x_i, y_j) = p_X(x_i) \cdot p_Y(y_j)$,
• Continuous random variables: $\forall x, y:\ f(x, y) = f_X(x) \cdot f_Y(y)$.
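For a finite table, independence means the table equals the outer product of its marginals. A quick check on the exercise-2 table:

```python
import numpy as np

p = np.array([[0.13, 0.00, 0.23],
              [0.11, 0.11, 0.01],
              [0.07, 0.00, 0.00],
              [0.00, 0.33, 0.01]])

outer = np.outer(p.sum(axis=1), p.sum(axis=0))   # p_Y(y) * p_X(x)
print(np.allclose(p, outer))   # False -> X and Y are dependent
```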
Measures of Linear Dependence
Covariance:
$\sigma_{XY} = \mathrm{cov}(X, Y) = E\big[(X - EX)(Y - EY)\big] = E(XY) - EX \cdot EY$
• $\sigma_{XY} \in (-\infty, \infty)$
Correlation coefficient:
$\rho_{XY} = \frac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y}$
• $\rho_{XY} \in [-1; 1]$
• $\rho_{XY}$ is a scaled version of the covariance
Covariance
Covariance:
$\sigma_{XY} = \mathrm{cov}(X, Y) = E\big[(X - EX)(Y - EY)\big] = E(XY) - EX \cdot EY$
• $\sigma_{XY} \in (-\infty, \infty)$
Covariance matrix:
$\begin{pmatrix} DX & \mathrm{cov}(X, Y) \\ \mathrm{cov}(Y, X) & DY \end{pmatrix}$
Correlation
Correlation coefficient:
$\rho_{XY} = \frac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y}$
• $\rho_{XY} \in [-1; 1]$
• $\rho_{XY} > 0$ … X, Y are positively correlated
• $\rho_{XY} < 0$ … X, Y are negatively correlated
• $\rho_{XY} = 0$ … X, Y are uncorrelated
Correlation matrix:
$\begin{pmatrix} 1 & \rho_{XY} \\ \rho_{YX} & 1 \end{pmatrix}$
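A numpy sketch computing $\sigma_{XY}$ and $\rho_{XY}$ from a joint table (the table used in exercises 6-10), following $E(XY) - EX \cdot EY$:

```python
import numpy as np

x_vals = np.array([-1, 0, 1])
y_vals = np.array([-2, 0, 1, 2])
p = np.array([[0.13, 0.00, 0.23],
              [0.11, 0.11, 0.01],
              [0.07, 0.00, 0.00],
              [0.00, 0.33, 0.01]])

p_X, p_Y = p.sum(axis=0), p.sum(axis=1)
EX, EY = x_vals @ p_X, y_vals @ p_Y
EXY = y_vals @ p @ x_vals            # E(XY) = sum over x, y of x*y*p(x, y)
cov = EXY - EX * EY
rho = cov / (np.sqrt((x_vals**2) @ p_X - EX**2) *
             np.sqrt((y_vals**2) @ p_Y - EY**2))
print(cov, rho)                      # covariance and correlation coefficient
```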
Correlation
[Figure: a grid of scatterplots illustrating different strengths of linear dependence, with correlation coefficients ρ(X, Y) = 1.000, 0.967, 0.934, 0.857, 0.608, 0.000, −0.143 and −1.000.]
10. The joint and marginal probability distribution of a random vector (X, Y) is given as:

 Y \ X    |  -1  |  0   |  1   | p_Y(y_j)
----------+------+------+------+---------
   -2     | 0.13 | 0    | 0.23 |  0.36
    0     | 0.11 | 0.11 | 0.01 |  0.23
    1     | 0.07 | 0    | 0    |  0.07
    2     | 0    | 0.33 | 0.01 |  0.34
 p_X(x_i) | 0.31 | 0.44 | 0.25 |    1

Find:
a) $E(X)$, $E(Y)$
b) $D(X)$, $D(Y)$
c) $\sigma_X$, $\sigma_Y$
d) $\mathrm{cov}(X, Y)$
e) $\rho(X, Y)$
f) Are the random variables X, Y independent?
g) Are the random variables X, Y linearly independent?
11. The joint probability density function of (X, Y) is given by
$f(x; y) = \begin{cases} \frac{2}{3}(x + 2y) & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{elsewhere.} \end{cases}$
The marginal density functions are:
$f_X(x) = \begin{cases} \frac{2}{3}(x + 1) & 0 \le x \le 1 \\ 0 & \text{elsewhere,} \end{cases} \qquad f_Y(y) = \begin{cases} \frac{1}{3}(1 + 4y) & 0 \le y \le 1 \\ 0 & \text{elsewhere.} \end{cases}$
Find:
a) $E(X)$, $E(Y)$
b) $D(X)$, $D(Y)$
c) $\sigma_X$, $\sigma_Y$
d) $\mathrm{cov}(X, Y)$
e) $\rho(X, Y)$
f) Are the random variables X, Y independent?
g) Are the random variables X, Y linearly independent?
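A sympy sketch of the moment computations needed for this exercise (the setup only; interpreting the results is the exercise):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 3) * (x + 2*y)

EX = sp.integrate(x * f, (x, 0, 1), (y, 0, 1))        # 5/9
EY = sp.integrate(y * f, (x, 0, 1), (y, 0, 1))        # 11/18
EXY = sp.integrate(x * y * f, (x, 0, 1), (y, 0, 1))   # 1/3
cov = sp.simplify(EXY - EX * EY)                      # -1/162, nonzero
print(EX, EY, cov)
```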
Study materials:
• http://homel.vsb.cz/~bri10/Teaching/Bris%20Prob%20&%20Stat.pdf (pp. 64-70)