Random Variables
A random variable assumes a value based on
the outcome of a random event.
◦ We use a capital letter, like X, to denote a random
variable.
◦ A particular value of a random variable will be
denoted with a lower case letter, in this case x.
There are two types of random variables:
◦ Discrete random variables can take one of a finite
number of distinct outcomes.
Example: Number of credit hours
◦ Continuous random variables can take any numeric
value within a range of values.
Example: Cost of books this term
Example
A professor asked his introductory statistics students to
state how many siblings they have. The result is
organized in the following table.
Let X denote the number of siblings of a randomly
selected student.
a. Determine the probability
distribution of the random
variable X.
b. Construct a probability
histogram for the random
variable X.
Solution Example
The table displays these probabilities and provides the
probability distribution.
Example
Toss a balanced dime three times and record the
number of heads. Let X be the number of heads in each
three-toss experiment.
a. Repeat the experiment 1000 times and record the
frequencies and proportions for the number of heads.
b. Determine the probability distribution of the random
variable X.
c. Construct a probability histogram for the random
variable X.
Solution Example
a) We used a computer to simulate 1000 observations of
the random variable X, the number of heads obtained in
three tosses of a balanced dime. The following table
shows the frequencies and proportions for the numbers
of heads obtained in the 1000 observations.
Solution Example
b) For three tosses of a balanced dime there are eight equally likely outcomes, so the probability distribution of X is P(X = 0) = 1/8, P(X = 1) = 3/8, P(X = 2) = 3/8, and P(X = 3) = 1/8.
c) The closeness of the simulated proportions to these probabilities is more easily seen if we compare the proportion histogram to the probability histogram of the random variable X.
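A minimal Python sketch of this comparison (the simulation code and variable names are illustrative, not from the original slides): it repeats the 1000-trial experiment and prints the observed frequencies and proportions next to the theoretical probabilities.

```python
import random
from collections import Counter

# Theoretical distribution for X = number of heads in three tosses of a
# balanced coin: the 8 equally likely outcomes give these probabilities.
theoretical = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

# Simulate 1000 three-toss experiments and count the heads in each one.
trials = 1000
counts = Counter(sum(random.randint(0, 1) for _ in range(3)) for _ in range(trials))

print("x  frequency  proportion  probability")
for x in range(4):
    freq = counts.get(x, 0)
    print(f"{x}  {freq:9d}  {freq / trials:10.3f}  {theoretical[x]:11.3f}")
```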
A probability model for a random variable
consists of:
◦ The collection of all possible values of a random
variable, and
◦ the probabilities that the values occur.
Of particular interest is the value we expect a
random variable to take on, notated μ (for
population mean) or E(X) for expected value.
The expected value of a (discrete) random
variable can be found by summing the products
of each possible value and the probability that
it occurs:
E(X) = Σ x · P(x)
An American Roulette wheel has 38 slots, of
which 18 are black, 18 are red, and 2 are
green. The dealer spins the wheel and whirls
a small ball in the opposite direction within
the wheel. Gamblers bet on where the ball
will come to rest. One of the simplest bets is to
choose red (or black). A bet of $1 on red pays
off an additional $1 if the ball lands in a red
slot. Otherwise, the player loses his $1. Is it a
fair game?
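To answer the question, the expected winnings follow directly from E(X) = Σ x · P(x). A minimal sketch, assuming a $1 bet that wins $1 with probability 18/38 and loses $1 otherwise:

```python
# Expected winnings of a $1 bet on red in American roulette.
# X = +1 with probability 18/38 (red), X = -1 with probability 20/38 (black or green).
outcomes = {+1: 18 / 38, -1: 20 / 38}

expected_value = sum(x * p for x, p in outcomes.items())
print(f"E(X) = {expected_value:.4f}")   # about -0.0526: the player loses roughly
                                        # 5.3 cents per dollar bet on average,
                                        # so the game is not fair.
```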
Insurance companies make bets. They bet that
you’re going to live a long life. You bet that
you’re going to die sooner. How do we find a
“fair price” for this kind of bet?
Suppose an insurance company offers a
“death and disability” policy that pays
$10,000 when you die or $5000 if you are
permanently disabled. How can the insurance
company determine the annual premium?
Suppose the insurance company insures 1000
people. Assume that the death and disability
rates in any year are as follows:
Policyholder outcome    Probability P(X = x)    Payout
Death                   1/1000                  $10,000
Disability              2/1000                  $5,000
Neither                 997/1000                $0
Let X be the insurance company’s payout in a
given year.
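The break-even premium is the expected payout per policyholder, E(X) = Σ x · P(x). A short sketch using the table above:

```python
# Expected payout per policy: payout value -> probability.
policy = {10_000: 1 / 1000,    # death
           5_000: 2 / 1000,    # disability
               0: 997 / 1000}  # neither

expected_payout = sum(x * p for x, p in policy.items())
print(f"E(X) = ${expected_payout:.2f}")  # $20.00: the company breaks even at a
                                         # $20 annual premium, so it must charge
                                         # more than that to cover expenses and profit.
```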
For data, we calculated the standard deviation
by first computing the deviation from the mean
and squaring it. We do that with discrete
random variables as well.
The variance of a (discrete) random variable is:
Var(X) = σ² = Σ (x − μ)² · P(x)
The standard deviation of a random variable is:
SD(X) = σ = √Var(X)
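Continuing the insurance example, a brief sketch applying these formulas (the numbers follow from the payout table above):

```python
from math import sqrt

# Payout value -> probability, from the death-and-disability policy table.
policy = {10_000: 1 / 1000, 5_000: 2 / 1000, 0: 997 / 1000}

mu = sum(x * p for x, p in policy.items())               # E(X) = 20
var = sum((x - mu) ** 2 * p for x, p in policy.items())  # Var(X)
sd = sqrt(var)                                           # SD(X)

print(f"mu = {mu:.2f}, Var(X) = {var:.2f}, SD(X) = {sd:.2f}")
# mu = 20.00, Var(X) = 149600.00, SD(X) is about 386.78
```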
Adding or subtracting a constant from data shifts
the mean but doesn’t change the variance or S.D:
E(X ± c) = E(X) ± c
Var(X ± c) = Var(X)
◦ Example: Consider everyone in a company receiving a
$5000 increase in salary.
In general, multiplying each value of a random
variable by a constant multiplies the mean by
that constant and the variance by the square of
the constant:
E(aX) = aE(X)
Var(aX) = a²Var(X)
◦ Example: Consider everyone in a company receiving a
10% increase in salary.
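Both salary examples can be checked numerically. A small sketch, using a made-up salary distribution purely for illustration, confirming that adding $5000 shifts the mean but leaves the SD alone, while a 10% raise scales both:

```python
from math import sqrt

# A small illustrative salary distribution (dollars; probabilities sum to 1).
salaries = {40_000: 0.5, 60_000: 0.3, 90_000: 0.2}

def mean(dist):
    return sum(x * p for x, p in dist.items())

def sd(dist):
    mu = mean(dist)
    return sqrt(sum((x - mu) ** 2 * p for x, p in dist.items()))

plus_5000 = {x + 5_000: p for x, p in salaries.items()}  # everyone gets +$5000
times_1_1 = {1.1 * x: p for x, p in salaries.items()}    # everyone gets +10%

print(mean(salaries), sd(salaries))    # original mean and SD
print(mean(plus_5000), sd(plus_5000))  # mean shifts by 5000, SD unchanged
print(mean(times_1_1), sd(times_1_1))  # mean and SD both scale by 1.1
```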
The mean of the sum (or difference) of two
random variables is the sum (or difference) of
the means.
E(X ± Y) = E(X) ± E(Y)
If the random variables are independent, the
variance of their sum or difference is always
the sum of the variances.
Var(X ± Y) = Var(X) + Var(Y)
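A quick sketch of these addition rules, using two independent fair dice as a stand-in example (not part of the original slides): the exact distribution of the sum is built from independence, and its mean and variance match the sums of the individual means and variances.

```python
from itertools import product

# Two independent fair dice, X and Y, with the same distribution.
die = {x: 1 / 6 for x in range(1, 7)}

def mean(dist):
    return sum(x * p for x, p in dist.items())

def var(dist):
    mu = mean(dist)
    return sum((x - mu) ** 2 * p for x, p in dist.items())

# Exact distribution of X + Y: independence lets us multiply probabilities.
total = {}
for (x, px), (y, py) in product(die.items(), die.items()):
    total[x + y] = total.get(x + y, 0) + px * py

print(mean(total), mean(die) + mean(die))  # both 7.0
print(var(total), var(die) + var(die))     # both about 5.833
```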
Combining Random Variables (The Bad News)
It would be nice if we could go directly from
models of each random variable to a model for
their sum.
But, the probability model for the sum of two
random variables is not necessarily the same as
the model we started with even when the
variables are independent.
Thus, even though expected values may add, the
probability model itself is different.
Combining Random Variables (The Good News)
Nearly everything we’ve said about how
discrete random variables behave is true of
continuous random variables, as well.
When two independent continuous random
variables have Normal models, so does their
sum or difference.
This fact will let us apply our knowledge of
Normal probabilities to questions about the
sum or difference of independent random
variables.
Consider the company that manufactures and ships
small stereo systems. The times required to pack
the stereos can be described by a Normal model
with a mean of 9 minutes and standard deviation of
1.5 minutes. The times for the boxing stage can
also be modeled as Normal, with a mean of 6
minutes and standard deviation of 1 minute.
Find the expected total times for packing and
boxing a system.
Find the standard deviation of the total packing and
boxing time.
Find the expected time for packing two systems.
◦ What’s the probability that packing two systems takes over
20 minutes?
◦ What percentage of stereo systems take longer to pack than
to box?
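A sketch of these calculations, assuming the packing and boxing times are independent and using SciPy's Normal CDF (the library choice is an assumption; the arithmetic follows from the rules above):

```python
from math import sqrt
from scipy.stats import norm

mu_pack, sd_pack = 9.0, 1.5   # packing time (minutes)
mu_box,  sd_box  = 6.0, 1.0   # boxing time (minutes)

# Total time for packing and boxing one system (times assumed independent).
mu_total = mu_pack + mu_box               # 15 minutes
sd_total = sqrt(sd_pack**2 + sd_box**2)   # about 1.80 minutes

# Packing two systems: means add, and variances add for independent systems.
mu_two = 2 * mu_pack                      # 18 minutes
sd_two = sqrt(2 * sd_pack**2)             # about 2.12 minutes
p_over_20 = 1 - norm.cdf(20, loc=mu_two, scale=sd_two)       # about 0.17

# Packing longer than boxing: D = pack - box is Normal(3, about 1.80).
mu_diff = mu_pack - mu_box
sd_diff = sqrt(sd_pack**2 + sd_box**2)
p_pack_longer = 1 - norm.cdf(0, loc=mu_diff, scale=sd_diff)  # about 0.95

print(f"Total: mean {mu_total} min, SD {sd_total:.2f} min")
print(f"P(two systems take over 20 min) = {p_over_20:.3f}")
print(f"P(packing takes longer than boxing) = {p_pack_longer:.3f}")
```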
Pages 427–430
Problems 1–15 odd, 19, 23, 25, 27, 29, 35, 43.