MATH30-6 Lecture 9



Joint Probability Distributions

MATH30-6 Probability and Statistics

Objectives

At the end of the lesson, the students are expected to
• Use joint probability mass functions and joint probability density functions to calculate probabilities;
• Calculate marginal and conditional probability distributions from joint probability distributions; and
• Interpret and calculate covariances and correlations between random variables.

Joint Probability Mass Function

The joint probability mass function of the discrete random variables X and Y, denoted as f_XY(x, y), satisfies

(1) f_XY(x, y) ≥ 0
(2) Σ_x Σ_y f_XY(x, y) = 1
(3) f_XY(x, y) = P(X = x, Y = y)   (5-1)

• Sometimes referred to as the bivariate probability distribution or bivariate distribution of the random variables.
• P(X = x and Y = y) is usually written as P(X = x, Y = y).

Joint Probability Mass Function

Examples: 5-1/153 Signal Bars Calls are made to check the airline schedule at your departure city. You monitor the number of bars of signal strength on your cell phone and the number of times you have to state the name of your departure city before the voice system recognizes the name.

Let X denote the number of bars of signal strength on your cell phone, and let Y denote the number of times you need to state your departure city.

Joint Probability Mass Function

By specifying the probability of each of the points in Fig. 5-1, we specify the joint probability distribution of X and Y. Similarly to an individual random variable, we define the range of the random variables (X, Y) to be the set of points (x, y) in two-dimensional space for which the probability that X = x and Y = y is positive.

Joint Probability Mass Function

3.14/95 Two refills for a ballpoint pen are selected at random from a box that contains 3 blue refills, 2 red refills, and 3 green refills. If X is the number of blue refills and Y is the number of red refills selected, find (a) the joint probability function f(x, y), and (b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.
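The counting behind this example can be checked numerically. The sketch below (Python; it assumes the region A in part (b) is x + y ≤ 1, which the truncated statement suggests) builds the joint pmf f(x, y) = C(3, x)·C(2, y)·C(3, 2−x−y)/C(8, 2) and verifies that it sums to 1:

```python
from fractions import Fraction
from math import comb

# Joint pmf for Example 3.14: 2 refills drawn from 3 blue, 2 red, 3 green.
# X = number of blue refills, Y = number of red refills selected.
def f(x, y):
    if x < 0 or y < 0 or x + y > 2:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

total = sum(f(x, y) for x in range(3) for y in range(3))
prob_A = sum(f(x, y) for x in range(3) for y in range(3) if x + y <= 1)
print(total)   # 1
print(prob_A)  # 9/14
```

The value P[(X, Y) ∈ A] = 9/14 agrees with summing the entries of Table 3.1 (shown later) for which x + y ≤ 1.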

Joint Probability Density Function

A joint probability density function for the continuous random variables X and Y, denoted as f_XY(x, y), satisfies the following properties:

(1) f_XY(x, y) ≥ 0 for all x, y
(2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
(3) For any region R of two-dimensional space, P[(X, Y) ∈ R] = ∫∫_R f_XY(x, y) dx dy   (5-2)

Joint Probability Density Function

Examples: 5-2/155 Server Access Time Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds).

Each of these random variables measures the wait from a common starting time and X < Y. Assume that the joint probability density function for X and Y is
f_XY(x, y) = 6 × 10⁻⁶ e^(−0.001x − 0.002y) for x < y

Joint Probability Density Function

The region with nonzero probability is shaded in Fig. 5-4.

The property that this joint probability density function integrates to 1 can be verified by integrating over this region as follows:

∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dy dx = ∫_0^∞ ( ∫_x^∞ 6 × 10⁻⁶ e^(−0.001x − 0.002y) dy ) dx
= 6 × 10⁻⁶ ∫_0^∞ ( ∫_x^∞ e^(−0.002y) dy ) e^(−0.001x) dx
= 6 × 10⁻⁶ ∫_0^∞ ( e^(−0.002x) / 0.002 ) e^(−0.001x) dx
= 0.003 ∫_0^∞ e^(−0.003x) dx
= 0.003 ( 1 / 0.003 ) = 1
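The same verification can be sketched numerically. Here the inner integral over y has already been done in closed form, and the remaining one-dimensional integral is approximated with a midpoint rule (a rough check, not part of the text):

```python
import math

# f_XY(x, y) = 6e-6 * exp(-0.001x - 0.002y) on 0 < x < y.
# The inner integral over y from x to infinity equals exp(-0.002x)/0.002,
# so the outer integrand reduces to 0.003 * exp(-0.003x).
outer = lambda x: 0.003 * math.exp(-0.003 * x)

# Midpoint-rule approximation on [0, 10000] milliseconds;
# the tail beyond 10000 contributes about e^{-30}, i.e. negligibly.
n = 100_000
h = 10000 / n
total = sum(outer((k + 0.5) * h) for k in range(n)) * h
print(round(total, 4))  # ≈ 1.0
```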


Joint Probability Density Function

The probability that X ≤ 1000 and Y ≤ 2000 is determined as the integral over the darkly shaded region in Fig. 5-5.

P(X ≤ 1000, Y ≤ 2000) = ∫_0^1000 ∫_x^2000 f_XY(x, y) dy dx
= 6 × 10⁻⁶ ∫_0^1000 ( ∫_x^2000 e^(−0.002y) dy ) e^(−0.001x) dx
= 6 × 10⁻⁶ ∫_0^1000 ( (e^(−0.002x) − e^(−4)) / 0.002 ) e^(−0.001x) dx
= 0.003 ∫_0^1000 ( e^(−0.003x) − e^(−4) e^(−0.001x) ) dx
= 0.003 ( (1 − e^(−3)) / 0.003 − e^(−4) (1 − e^(−1)) / 0.001 )

P(X ≤ 1000, Y ≤ 2000) = 0.003 ( 316.738 − 11.578 )

= 0.915
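As a quick arithmetic check of the final line, the closed-form expression can be evaluated directly:

```python
import math

# P(X <= 1000, Y <= 2000) = 0.003 * [(1 - e^-3)/0.003 - e^-4 (1 - e^-1)/0.001]
p = 0.003 * ((1 - math.exp(-3)) / 0.003
             - math.exp(-4) * (1 - math.exp(-1)) / 0.001)
print(round(p, 3))  # 0.915
```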

Joint Probability Density Function

3.15/96 A privately owned business operates a drive-in facility and a walk-in facility. On a randomly selected day, let X and Y, respectively, be the proportions of the time that the drive-in and the walk-in facilities are in use, and suppose that the joint density function of these random variables is
f(x, y) = (2/5)(2x + 3y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
        = 0, elsewhere.

(a) Verify condition 2.

(b) Find P[(X, Y) ∈ A], where A = {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.

Marginal Probability Mass Function

• Marginal probability distribution: the individual probability distribution of a random variable.

The marginal probability mass functions of X alone and of Y alone are
f_X(x) = Σ_y f(x, y) and f_Y(y) = Σ_x f(x, y)

Marginal Probability Mass Function

Examples: 5-3/157 Marginal Distribution The joint probability distribution of X and Y in Fig. 5-1 can be used to find the marginal probability distribution of X. For example,
f_X(3) = P(X = 3) = P(X = 3, Y = 1) + P(X = 3, Y = 2) + P(X = 3, Y = 3) + P(X = 3, Y = 4)
= 0.25 + 0.2 + 0.05 + 0.05 = 0.55

The marginal probability distribution for X is found by summing the probabilities in each column, whereas the marginal probability distribution for Y is found by summing the probabilities in each row. The results are shown in Fig. 5-6.


Marginal Probability Mass Function

3.16/98 Show that the column and row totals of Table 3.1 give the marginal distribution of X alone and of Y alone.

Table 3.1: Joint Probability Distribution for Example 3.14

f(x, y)                   x
                    0       1       2     Row Totals
y       0          3/28    9/28    3/28     15/28
        1          3/14    3/14    0        3/7
        2          1/28    0       0        1/28
Column Totals      5/14    15/28   3/28     1
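The row and column totals can be checked mechanically. A minimal Python sketch using exact fractions, with the entries taken from Table 3.1:

```python
from fractions import Fraction as F

# Joint pmf of Table 3.1 as a dict {(x, y): f(x, y)}.
f = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
     (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
     (0, 2): F(1, 28), (1, 2): F(0),    (2, 2): F(0)}

# Marginals are the column and row totals: sum out the other variable.
f_X = {a: sum(p for (i, j), p in f.items() if i == a) for a in range(3)}
f_Y = {b: sum(p for (i, j), p in f.items() if j == b) for b in range(3)}
print(f_X[0], f_X[1], f_X[2])  # 5/14 15/28 3/28
print(f_Y[0], f_Y[1], f_Y[2])  # 15/28 3/7 1/28
```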

Marginal Probability Mass Function

3.50/106 Suppose that X and Y have the following joint probability distribution:

f(x, y)          x
               2       4
y     1      0.10    0.15
      3      0.20    0.30
      5      0.10    0.15

(a) Find the marginal distribution of X.

(b) Find the marginal distribution of Y.

Marginal Probability Density Function

If the joint probability density function of random variables X and Y is f_XY(x, y), the marginal probability density functions of X and Y are
f_X(x) = ∫_y f_XY(x, y) dy and f_Y(y) = ∫_x f_XY(x, y) dx   (5-3)
where the first integral is over all points in the range of (X, Y) for which X = x and the second integral is over all points in the range of (X, Y) for which Y = y.

Marginal Probability Density Function

A probability for only one random variable, say, for example, P(a < X < b), can be found from the marginal probability distribution of X or from the integral of the joint probability distribution of X and Y as
P(a < X < b) = ∫_a^b f_X(x) dx = ∫_a^b ∫_{−∞}^{∞} f_XY(x, y) dy dx

Marginal Probability Density Function

Examples: 5-4/157 Server Access Time For the random variables that denote times in Example 5-2, calculate the probability that Y exceeds 2000 milliseconds.

This probability is determined as the integral of 𝑓 𝑋𝑌 𝑥, 𝑦 over the darkly shaded region in Fig. 5-7. The region is partitioned into two parts and different limits of integration are determined for each part.

Marginal Probability Density Function


P(Y > 2000) = ∫_0^2000 ∫_2000^∞ 6 × 10⁻⁶ e^(−0.001x − 0.002y) dy dx + ∫_2000^∞ ∫_x^∞ 6 × 10⁻⁶ e^(−0.001x − 0.002y) dy dx

The first integral is
6 × 10⁻⁶ ∫_0^2000 [ e^(−0.002y) / (−0.002) ]_2000^∞ e^(−0.001x) dx
= (6 × 10⁻⁶ e^(−4) / 0.002) ∫_0^2000 e^(−0.001x) dx
= (6 × 10⁻⁶ e^(−4) / 0.002) (1 − e^(−2)) / 0.001
= 0.0475

The second integral is
6 × 10⁻⁶ ∫_2000^∞ [ e^(−0.002y) / (−0.002) ]_x^∞ e^(−0.001x) dx
= (6 × 10⁻⁶ / 0.002) ∫_2000^∞ e^(−0.003x) dx
= (6 × 10⁻⁶ / 0.002) ( e^(−6) / 0.003 )
= 0.0025

Therefore, P(Y > 2000) = 0.0475 + 0.0025 = 0.05

Marginal Probability Density Function

Alternatively, the probability can be calculated from the marginal probability distribution of Y as follows. For y > 0,

f_Y(y) = ∫_0^y 6 × 10⁻⁶ e^(−0.001x − 0.002y) dx
= 6 × 10⁻⁶ e^(−0.002y) ∫_0^y e^(−0.001x) dx
= 6 × 10⁻⁶ e^(−0.002y) [ e^(−0.001x) / (−0.001) ]_0^y
= 6 × 10⁻⁶ e^(−0.002y) (1 − e^(−0.001y)) / 0.001

f_Y(y) = 6 × 10⁻³ e^(−0.002y) (1 − e^(−0.001y)) for y > 0

We have obtained the marginal probability density function of Y. Now,

P(Y > 2000) = ∫_2000^∞ 6 × 10⁻³ e^(−0.002y) (1 − e^(−0.001y)) dy
= 6 × 10⁻³ ( e^(−4) / 0.002 − e^(−6) / 0.003 )
= 0.05
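Both routes to P(Y > 2000) can be cross-checked numerically with the marginal density of Y derived above (midpoint rule on a truncated range, an approximation only):

```python
import math

# Marginal of Y derived above: f_Y(y) = 6e-3 e^{-0.002y}(1 - e^{-0.001y}), y > 0.
f_Y = lambda y: 6e-3 * math.exp(-0.002 * y) * (1 - math.exp(-0.001 * y))

# Closed form of the tail probability: 6e-3 (e^{-4}/0.002 - e^{-6}/0.003).
p = 6e-3 * (math.exp(-4) / 0.002 - math.exp(-6) / 0.003)

# Midpoint-rule cross-check on [2000, 20000]; the tail beyond is negligible.
n, a, b = 200_000, 2000.0, 20000.0
h = (b - a) / n
approx = sum(f_Y(a + (k + 0.5) * h) for k in range(n)) * h
print(round(p, 4), round(approx, 4))  # both ≈ 0.05
```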

Marginal Probability Density Function

3.40/105 A fast-food restaurant operates both a drive-through facility and a walk-in facility. On a randomly selected day, let X and Y, respectively, be the proportions of the time that the drive-through and walk-in facilities are in use, and suppose that the joint density function of these random variables is
f(x, y) = (2/3)(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
        = 0, elsewhere.

(a) Find the marginal density of X.

(b) Find the marginal density of Y.

(c) Find the probability that the drive-through facility is busy less than one-half of the time.
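Since f is a polynomial, the marginal and the probability in part (c) can be worked by hand; a short sketch with exact fractions (the marginal f_X(x) = (2/3)(x + 1) follows from integrating out y over [0, 1]):

```python
from fractions import Fraction as F

# f(x, y) = (2/3)(x + 2y) on the unit square.
# Integrating over y in [0, 1] gives the marginal f_X(x) = (2/3)(x + 1).
# Then P(X < 1/2) = int_0^{1/2} (2/3)(x + 1) dx = (2/3)(x^2/2 + x) at x = 1/2.
half = F(1, 2)
p = F(2, 3) * (half**2 / 2 + half)
print(p)  # 5/12
```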

Conditional Probability Mass Function

Let X and Y be two random variables, discrete or continuous. The conditional distribution of the random variable Y, given that X = x, is
f_{Y|x}(y) = f_XY(x, y) / f_X(x),  f_X(x) > 0.

Similarly, the conditional distribution of the random variable X, given that Y = y, is
f_{X|y}(x) = f_XY(x, y) / f_Y(y),  f_Y(y) > 0.

Conditional Probability Mass Function

Examples: 5-5/159 Signal Bars For Example 5-1, X and Y denote the number of bars of signal strength and the number of times you need to state your departure city, respectively. Then, using the probabilities from Example 5-3,
P(Y = 1|X = 3) = f_XY(3, 1) / f_X(3) = 0.25/0.55 = 0.455

The probability that Y = 2 given that X = 3 is
P(Y = 2|X = 3) = f_XY(3, 2) / f_X(3) = 0.2/0.55 = 0.364

Conditional Probability Mass Function

Further work shows that P(Y = 3|X = 3) = 0.091 and P(Y = 4|X = 3) = 0.091.

Note that P(Y = 1|X = 3) + P(Y = 2|X = 3) + P(Y = 3|X = 3) + P(Y = 4|X = 3) = 1. This set of probabilities defines the conditional probability distribution of Y given that X = 3.
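The conditional distribution of Y given X = 3 can be tabulated directly from the probabilities quoted in Example 5-3 (f_XY(3, y) for y = 1, …, 4, whose sum is f_X(3) = 0.55):

```python
# f_XY(3, y) for y = 1..4, from Example 5-3; f_X(3) = 0.55 is their sum.
f_3 = {1: 0.25, 2: 0.20, 3: 0.05, 4: 0.05}
f_X3 = sum(f_3.values())

# Conditional pmf of Y given X = 3: divide each joint probability by f_X(3).
cond = {y: p / f_X3 for y, p in f_3.items()}
print({y: round(p, 3) for y, p in cond.items()})
print(sum(cond.values()))  # the conditional pmf sums to 1
```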

Conditional Probability Mass Function

3.18/99 Referring to Example 3.14, find the conditional distribution of X, given that Y = 1, and use it to determine 𝑃 𝑋 = 0|𝑌 = 1 .

Table 3.1: Joint Probability Distribution for Example 3.14

f(x, y)                   x
                    0       1       2     Row Totals
y       0          3/28    9/28    3/28     15/28
        1          3/14    3/14    0        3/7
        2          1/28    0       0        1/28
Column Totals      5/14    15/28   3/28     1

Conditional Probability Mass Function

3.49/106 Let X denote the number of times a certain numerical control machine will malfunction: 1, 2, or 3 times on any given day. Let Y denote the number of times a technician is called on an emergency call. Their joint probability distribution is given as

f(x, y)          x
               1       2       3
y     1      0.05    0.05    0.10
      3      0.05    0.10    0.35
      5      0.00    0.20    0.10

(a) Evaluate the marginal distribution of X.

(b) Evaluate the marginal distribution of Y.

(c) Find 𝑃 𝑌 = 3|𝑋 = 2 .

Conditional Probability Density Function

Given continuous random variables X and Y with joint probability density function f_XY(x, y), the conditional probability density function of Y given X = x is
f_{Y|x}(y) = f_XY(x, y) / f_X(x) for f_X(x) > 0   (5-4)

Equivalently,
f_{X|y}(x) = f_XY(x, y) / f_Y(y),  f_Y(y) > 0

Conditional Probability Density Function

Because the conditional probability density function f_{Y|x}(y) is a probability density function for all y in R_x, the following properties are satisfied:

(1) f_{Y|x}(y) ≥ 0
(2) ∫ f_{Y|x}(y) dy = 1
(3) P(Y ∈ B|X = x) = ∫_B f_{Y|x}(y) dy for any set B in the range of Y   (5-5)

Conditional Probability Density Function

Examples: 3.19/100 The joint density function of the random variables (X, Y), where X is the unit temperature change and Y is the proportion of spectrum shift that a certain atomic particle produces, is
f_XY(x, y) = 10xy², 0 < x < y < 1,
           = 0, elsewhere.

a) Find the marginal densities f_X(x), f_Y(y), and the conditional density f_{Y|x}(y).

b) Find the probability that the spectrum shifts more than half of the total observations, given that the temperature is increased to 0.25 units.
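Part (b) can be sketched symbolically (assuming SymPy is available); the conditional density and the requested probability fall out directly:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 10 * x * y**2                      # joint density on 0 < x < y < 1

f_X = sp.integrate(f, (y, x, 1))       # marginal of X: 10x(1 - x^3)/3
cond = sp.simplify(f / f_X)            # f_{Y|x}(y) = 3y^2 / (1 - x^3)

# P(Y > 1/2 | X = 0.25): integrate the conditional density over y in (1/2, 1).
p = sp.integrate(cond.subs(x, sp.Rational(1, 4)), (y, sp.Rational(1, 2), 1))
print(sp.nsimplify(p))  # 8/9
```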

Conditional Probability Density Function

3.20/100 Given the joint density function
f_XY(x, y) = x(1 + 3y²)/4, 0 < x < 2, 0 < y < 1,
           = 0, elsewhere,
find f_X(x), f_Y(y), f_{X|y}(x), and evaluate P(1/4 < X < 1/2 | Y = 1/3).

Conditional Probability Density Function

3.53/106 Given the joint density function
f_XY(x, y) = (6 − x − y)/8, 0 < x < 2, 2 < y < 4,
           = 0, elsewhere,
find P(1 < Y < 3 | X = 1).

Independence

For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent.

(1) f_XY(x, y) = f_X(x) f_Y(y) for all x and y
(2) f_{Y|x}(y) = f_Y(y) for all x and y with f_X(x) > 0
(3) f_{X|y}(x) = f_X(x) for all x and y with f_Y(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for any sets A and B in the range of X and Y, respectively.   (5-7)

Independence

Examples: 5-11/162 Independent Random Variables Suppose that Example 5-2 is modified so that the joint probability density function of X and Y is
f_XY(x, y) = 2 × 10⁻⁶ exp(−0.001x − 0.002y) for x ≥ 0 and y ≥ 0.
Show that X and Y are independent and determine P(X > 1000, Y < 1000).

Independence

The marginal probability density function of Y is
f_Y(y) = ∫_0^∞ 2 × 10⁻⁶ e^(−0.001x − 0.002y) dx = 0.002 e^(−0.002y) for y > 0

Similarly, the marginal probability density function of X is f_X(x) = 0.001 e^(−0.001x) for x > 0. Therefore, f_XY(x, y) = f_X(x) f_Y(y) for all x and y, and X and Y are independent.

Independence

To determine the probability requested, property (4) of Equation 5-7 and the fact that each random variable has an exponential distribution can be applied.

P(X > 1000, Y < 1000) = P(X > 1000) P(Y < 1000) = e^(−1)(1 − e^(−2)) = 0.318
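A quick arithmetic check of this product, using the exponential marginals identified above:

```python
import math

# Under independence, X has marginal 0.001 e^{-0.001x} and Y has 0.002 e^{-0.002y},
# so the joint probability factors into the two marginal probabilities.
p_x = math.exp(-0.001 * 1000)        # P(X > 1000) = e^{-1}
p_y = 1 - math.exp(-0.002 * 1000)    # P(Y < 1000) = 1 - e^{-2}
print(round(p_x * p_y, 4))  # 0.3181
```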

Independence

3.21/102 Show that the random variables of Example 3.14

are not statistically independent.

Table 3.1: Joint Probability Distribution for Example 3.14

f(x, y)                   x
                    0       1       2     Row Totals
y       0          3/28    9/28    3/28     15/28
        1          3/14    3/14    0        3/7
        2          1/28    0       0        1/28
Column Totals      5/14    15/28   3/28     1

Examples

5-1/167 Show that the following function satisfies the properties of a joint probability mass function.

x             1     1.5    1.5    2.5    3
y             1     2      3      4      5
f_XY(x, y)    1/4   1/8    1/4    1/4    1/8

Determine the following:
(a) P(X < 2.5, Y < 3)
(b) P(X < 2.5)
(c) P(Y < 3)
(d) P(X > 1.8, Y > 4.7)

Examples

(e) E(X), E(Y), V(X), and V(Y)
(f) Marginal probability distribution of the random variable X
(g) Conditional probability distribution of Y given that X = 2.5

(h) Conditional probability distribution of X given that Y = 2.

(i) 𝐸 𝑌|𝑋 = 1.5

(j) Are X and Y independent?

Examples

5-2/167 Determine the value of c that makes the function f(x, y) = c(x + y) a joint probability mass function over the nine points with x = 1, 2, 3 and y = 1, 2, 3. Determine the following:
(a) P(X = 1, Y < 4)
(b) P(X = 1)
(c) P(Y = 2)
(d) P(X < 2, Y < 2)
(e) E(X), E(Y), V(X), and V(Y)
(f) Marginal probability distribution of the random variable X
(g) Conditional probability distribution of Y given that X = 1

Examples

(h) Conditional probability distribution of X given that Y = 2
(i) E(Y|X = 1)
(j) Are X and Y independent?
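For the normalization step in Exercise 5-2, c follows from requiring the nine probabilities to sum to 1; a sketch with exact fractions:

```python
from fractions import Fraction

# f(x, y) = c(x + y) over x, y in {1, 2, 3}; choose c so the pmf sums to 1.
points = [(x, y) for x in (1, 2, 3) for y in (1, 2, 3)]
c = Fraction(1, sum(x + y for x, y in points))
print(c)  # 1/36

f = {(x, y): c * (x + y) for (x, y) in points}
marg = sum(p for (x, y), p in f.items() if x == 1)  # marginal f_X(1)
print(sum(f.values()), marg)  # 1 1/4
```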

Examples

5-4/167 Four electronic printers are selected from a large lot of damaged printers. Each printer is inspected and classified as containing either a major or a minor defect.

Let the random variables X and Y denote the number of printers with major and minor defects, respectively.

Determine the range of the joint probability distribution of X and Y.

Examples

5-6/167 A small-business Web site contains 100 pages, of which 60, 30, and 10 pages contain low, moderate, and high graphic content, respectively. A sample of four pages is selected without replacement, and X and Y denote the number of pages with moderate and high graphics output in the sample. Determine:
(a) f_XY(x, y)
(b) f_X(x)
(c) E(X)
(d) f_{Y|3}(y)
(e) E(Y|X = 3)
(f) V(Y|X = 3)
(g) Are X and Y independent?

Expected Value of a Function of Two Random Variables

E[h(X, Y)] = Σ_x Σ_y h(x, y) f_XY(x, y), (X, Y) discrete
E[h(X, Y)] = ∫∫ h(x, y) f_XY(x, y) dx dy, (X, Y) continuous   (5-13)

Expected Value of a Function of Two Random Variables

Example: 5-19/171 For the joint probability distribution of the two random variables in Fig. 5-12, calculate E[(X − μ_X)(Y − μ_Y)].

Covariance

• A measure of the linear relationship between random variables.
• If the relationship between the random variables is nonlinear, the covariance might not be sensitive to the relationship, as illustrated in Fig. 5-13(d). The only points with nonzero probability are the points on the circle.

cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)]   (5-14)


Covariance

Examples: 5-20/173 In Example 5-1, the random variables X and Y are the number of signal bars and the number of times you need to state your departure city, respectively. Is the covariance between X and Y positive or negative?

Negative covariance

Covariance

4.47/127 For the random variables X and Y whose joint density function is given in Exercise 3.40 on page 105, find the covariance.

f(x, y) = (2/3)(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
        = 0, elsewhere
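The covariance asked for here can be sketched symbolically (assuming SymPy); the answer comes out as an exact fraction:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 3) * (x + 2 * y)    # joint density of Exercise 3.40

# Expectation of any g(X, Y) via the double integral over the unit square.
E = lambda g: sp.integrate(g * f, (x, 0, 1), (y, 0, 1))
cov = E(x * y) - E(x) * E(y)
print(cov)  # -1/162
```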

Correlation

The correlation between random variables X and Y, denoted as ρ_XY, is
ρ_XY = cov(X, Y) / √(V(X) V(Y)) = σ_XY / (σ_X σ_Y)   (5-15)

For any two random variables X and Y,
−1 ≤ ρ_XY ≤ +1   (5-16)

If X and Y are independent random variables,
σ_XY = ρ_XY = 0   (5-17)

Covariance

Examples: 5-21/173 Covariance For the discrete random variables X and Y with the joint distribution shown in Fig. 5-14, determine σ_XY and ρ_XY.

Correlation

5-22/174 Correlation Suppose that the random variable X has the following distribution: P(X = 1) = 0.2, …

Examples

5-29/175 Determine the covariance and correlation for the following joint probability distribution:

x             1     1     2     4
y             3     4     5     6
f_XY(x, y)    1/8   1/4   1/2   1/8
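With the four points of Exercise 5-29 read as x = 1, 1, 2, 4 and y = 3, 4, 5, 6 (a reconstruction of the flattened table), the covariance and correlation follow from the definitions:

```python
import math

# Points (x, y, probability) of the joint distribution, as reconstructed.
pts = [(1, 3, 1/8), (1, 4, 1/4), (2, 5, 1/2), (4, 6, 1/8)]

EX  = sum(x * p for x, y, p in pts)
EY  = sum(y * p for x, y, p in pts)
EXY = sum(x * y * p for x, y, p in pts)
VX  = sum(x**2 * p for x, y, p in pts) - EX**2
VY  = sum(y**2 * p for x, y, p in pts) - EY**2

cov = EXY - EX * EY                    # sigma_XY = E(XY) - mu_X mu_Y
rho = cov / math.sqrt(VX * VY)         # rho_XY = sigma_XY / (sigma_X sigma_Y)
print(round(cov, 4), round(rho, 3))  # 0.7031 0.885
```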

Examples

5-31/175 Determine the value for c and the covariance and correlation for the joint probability mass function f_XY(x, y) = c(x + y) for x = 1, 2, 3 and y = 1, 2, 3.

5-37/175 Determine the covariance and correlation for the joint probability density function f_XY(x, y) = e^(−x−y) over the range 0 < x and 0 < y.

Examples

5-39/175 The joint probability distribution is

x             −1    0     0     1
y             0     −1    1     0
f_XY(x, y)    1/4   1/4   1/4   1/4

Show that the correlation between X and Y is zero, but X and Y are not independent.
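This exercise makes a good sanity check in code: the covariance vanishes by symmetry, yet the factorization test for independence fails at the point (0, 0):

```python
# Joint distribution of Exercise 5-39: four points, each with probability 1/4.
pts = {(-1, 0): 0.25, (0, -1): 0.25, (0, 1): 0.25, (1, 0): 0.25}

EX  = sum(x * p for (x, y), p in pts.items())      # 0 by symmetry
EY  = sum(y * p for (x, y), p in pts.items())      # 0 by symmetry
EXY = sum(x * y * p for (x, y), p in pts.items())  # every product x*y is 0
print(EXY - EX * EY)  # covariance (hence correlation) is 0

# Yet X and Y are dependent: f(0, 0) = 0 while f_X(0) * f_Y(0) = 0.5 * 0.5.
f_X0 = sum(p for (x, y), p in pts.items() if x == 0)
f_Y0 = sum(p for (x, y), p in pts.items() if y == 0)
print(pts.get((0, 0), 0.0), f_X0 * f_Y0)  # 0.0 vs 0.25
```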

Summary

• A joint probability mass function is a function used to calculate probabilities for two or more discrete random variables.

• A joint probability density function is a function used to calculate probabilities for two or more continuous random variables.

• A marginal probability mass function is the probability mass function of a discrete random variable obtained from the joint probability distribution of two or more random variables.

Summary

• A marginal probability density function is the probability density function of a continuous random variable obtained from the joint probability distribution of two or more random variables.

• A conditional probability mass function is the probability mass function of the conditional probability distribution of a discrete random variable.

• A conditional probability density function is the probability density function of the conditional probability distribution of a continuous random variable.

Summary

• The covariance is a measure of association between two random variables, obtained as the expected value of the product of the deviations of the two random variables from their means; that is, Cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)].

References

• Montgomery and Runger. Applied Statistics and Probability for Engineers, 5th Ed. © 2011.
• Walpole, et al. Probability and Statistics for Engineers and Scientists, 9th Ed. © 2012, 2007, 2002.