Transforming and Combining
Random Variables
Section 6.2
Reference Text:
The Practice of Statistics, Fourth Edition.
Starnes, Yates, Moore
Objectives
1. Multiplying or Dividing by a constant
2. Adding or Subtracting by a constant
3. Putting It Together: Adding, Subtracting, Multiplying, or Dividing (Linear Transformations!)
4. Mean of the Sum of Random Variables
5. Independent Random Variables
6. Variance of the Sum of Independent Random
Variables
7. Mean of the Difference of Random Variables
8. Variance of the Differences of Random
Variables
Multiplying or Dividing
by a Constant
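• Multiplying (or dividing) each value of a random variable by a constant b multiplies (or divides) measures of center and location (mean, median, quartiles, percentiles) by b, multiplies (or divides) measures of spread by |b|, and does not change the shape of the distribution.
• A minimal Python sketch of this rule (the values below are made up for illustration, not the slide's data):

    import numpy as np

    x = np.array([2, 3, 4, 5, 6], dtype=float)  # pretend these outcomes are equally likely
    b = 150                                      # multiply every value by the constant b

    print((b * x).mean(), b * x.mean())          # means match: the mean is multiplied by b
    print((b * x).std(), b * x.std())            # SDs match: the SD is multiplied by |b|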
Adding or Subtracting
by a Constant
• If we take each value of a random variable and add a constant a to it (adding a negative a amounts to subtracting)…
• Adds a to measures of center and location (mean, median, quartiles, percentiles).
• Does not change measures of spread (range, IQR, standard deviation).
• Does not change the shape of the distribution.
• Shall we look at Pete’s Jeep Tours? (A quick simulation sketch of this rule follows below.)
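• A quick simulation sketch of the adding rule (the distribution and the constant below are assumptions chosen only for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.choice([2, 3, 4, 5, 6], size=100_000,
                   p=[0.15, 0.25, 0.35, 0.20, 0.05])  # illustrative distribution
    a = 7.5                                            # constant to add (could be negative)

    print(x.mean(), (x + a).mean())         # the mean shifts up by a
    print(np.median(x), np.median(x + a))   # the median also shifts by a
    print(x.std(), (x + a).std())           # the spread is unchanged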
Putting It All Together:
Linear Transformation
• What happens if we transform a random variable by both
adding or subtracting a constant and multiplying or
dividing by a constant?
• We could have gone directly from the number of passengers X on Pete’s Jeep Tours to the profit:
V = 150X - 100
where we multiplied by 150 and then subtracted 100 (a quick numerical check of this appears below).
• This is a linear transformation! In general, it can be written in the form Y = a + bX, where a and b are constants.
• Let’s generalize on the next slide….
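• As a concrete check, here is a small sketch of V = 150X - 100 using an assumed passenger distribution (illustrative numbers, not necessarily the textbook's table):

    import numpy as np

    values = np.array([2, 3, 4, 5, 6], dtype=float)    # possible passenger counts X
    probs = np.array([0.15, 0.25, 0.35, 0.20, 0.05])   # assumed probabilities

    mean_X = np.sum(values * probs)
    sd_X = np.sqrt(np.sum(probs * (values - mean_X) ** 2))

    profit = 150 * values - 100                        # V = 150X - 100, applied value by value
    mean_V = np.sum(profit * probs)
    sd_V = np.sqrt(np.sum(probs * (profit - mean_V) ** 2))

    print(mean_V, 150 * mean_X - 100)  # mean: multiplied by 150, then shifted down by 100
    print(sd_V, 150 * sd_X)            # SD: multiplied by 150 only; the -100 has no effect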
Effects of Linear Transformation on
the mean and SD
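• If Y = a + bX (a linear transformation), then:
• μ_Y = a + b·μ_X
• σ_Y = |b|·σ_X
• A linear transformation changes the center and spread of a distribution, but it does not change the shape.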
Mean of the Sum of Random
Variables
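• For any two random variables X and Y, the mean of their sum is the sum of their means:
• μ_(X+Y) = μ_X + μ_Y
• This holds whether or not X and Y are independent.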
Independent Random Variables
• If knowing whether any event involving X
alone has occurred tells us nothing about
the occurrence of any event involving Y
alone, and vice versa, then X and Y are
independent random variables.
• But we already knew this! Just restating the
idea of being independent!
Independent Random Variables
• Probability models often assume independence
when the random variables describe outcomes
that appear unrelated to each other. You should
always ask whether the assumption of
independence seems reasonable.
• For instance, it’s reasonable to treat the random variables
X = number of passengers on Pete’s trip and Y = number
of passengers on Erin’s trip on a randomly chosen day as
independent, since the siblings operate their trips in
different parts of the country.
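• A small simulation sketch of this idea (the two distributions below are assumptions chosen for illustration): when X and Y are generated independently, knowing X tells us nothing useful about Y.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.choice([2, 3, 4, 5, 6], size=200_000)  # "Pete-like" variable (illustrative)
    y = rng.choice([2, 3, 4, 5], size=200_000)     # "Erin-like" variable, drawn independently

    print(y[x <= 3].mean(), y[x >= 5].mean())      # about the same: X carries no information about Y
    print(np.corrcoef(x, y)[0, 1])                 # close to 0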
Variance of the Sum of
Independent Random Variables
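• If X and Y are independent random variables, then the variance of their sum is the sum of their variances:
• σ²_(X+Y) = σ²_X + σ²_Y
• To get the standard deviation of the sum, add the variances first and then take the square root: σ_(X+Y) = √(σ²_X + σ²_Y).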
By the Way…
• You might be wondering whether there’s a
formula for computing the variance of the
sum of two random variables that are not
independent. There is, but it’s beyond the
scope of this course.
• Just remember: you can add variances only if the two random variables are independent, and you can never add standard deviations (see the sketch below).
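• A simulation sketch of this warning (the distributions are assumed for illustration and generated independently): variances add, standard deviations do not.

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.choice([2, 3, 4, 5, 6], size=500_000, p=[0.15, 0.25, 0.35, 0.20, 0.05])
    y = rng.choice([2, 3, 4, 5], size=500_000, p=[0.3, 0.4, 0.2, 0.1])

    print(np.var(x + y), np.var(x) + np.var(y))  # approximately equal: variances add
    print(np.std(x + y), np.std(x) + np.std(y))  # NOT equal: standard deviations do not add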
Mean of the Difference of Random
Variables
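• For any two random variables X and Y, the mean of their difference is the difference of their means:
• μ_(X-Y) = μ_X - μ_Y
• As with sums, this holds whether or not X and Y are independent.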
Variance of the Differences of
Random Variables
• Earlier, we saw that the variance of the sum
of two independent random variables is the
sum of their variances. Can you guess what
the variance of the difference of two
independent random variables will be?
• (If you guessed “the difference of their variances”… WRONG! THINK AGAIN! MUHAHAHA!)
Variance of the Differences of
Random Variables
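• For independent random variables X and Y, the variance of the difference is still the SUM of the variances:
• σ²_(X-Y) = σ²_X + σ²_Y
• Why? Subtracting an independent quantity adds just as much variability as adding one: Var(-Y) = (-1)²·Var(Y) = Var(Y).
• As always, take the square root of the summed variances to get σ_(X-Y); never add (or subtract) standard deviations.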
Objectives
1. Multiplying or Dividing by a constant
2. Adding or Subtracting by a constant
3. Putting It Together: Adding, Subtracting, Multiplying, or Dividing (Linear Transformations!)
4. Mean of the Sum of Random Variables
5. Independent Random Variables
6. Variance of the Sum of Independent Random Variables
7. Mean of the Difference of Random Variables
8. Variance of the Differences of Random Variables
Homework
Worksheet: I’m going to post the homework online. However, the deal is:
If I don’t post the homework online by Tuesday 11/25/14, then there is no homework over break.
Deal?!