Point Estimation of Parameters and Sampling Distributions

Outline: sampling distributions and the central limit theorem; point estimation; methods of point estimation (method of moments, maximum likelihood).
Sampling Distributions and the Central Limit Theorem

A statistic computed from a random sample has its own probability distribution, called the sampling distribution of the statistic. Ex. The probability distribution of X̄ is called the sampling distribution of the mean.
Sampling Distribution of the Sample Mean

Let X₁, X₂, ..., Xₙ be a random sample of size n from a normal population with mean μ and variance σ², so the Xᵢ are independent normal random variables. The sample mean is

  X̄ = (X₁ + X₂ + ... + Xₙ)/n

Its mean and variance are

  E(X̄) = [E(X₁) + E(X₂) + ... + E(Xₙ)]/n = nμ/n = μ
  V(X̄) = [V(X₁) + V(X₂) + ... + V(Xₙ)]/n² = nσ²/n² = σ²/n

Therefore X̄ ~ N(μ, σ²/n).
Central limit theorem
If X̄ is the mean of a random sample of size n ≥ 30 drawn from a population (of possibly unknown distribution) with mean μ and variance σ², then the sampling distribution of X̄ is approximately normal with mean μ and variance σ²/n.
Ex. Suppose that a random variable X has a continuous uniform distribution

  f(x) = 1/2, 4 ≤ x ≤ 6
  f(x) = 0, otherwise

Find the distribution of the sample mean of a random sample of size n = 40.

Method:
1. Calculate the mean and variance of X: μ = (4 + 6)/2 = 5 and σ² = (6 − 4)²/12 = 1/3.
2. By the central limit theorem, X̄ is approximately N(μ, σ²/n) = N(5, 1/120).
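The approximation can be checked by simulation. The sketch below (an illustration, not part of the original slides) draws many samples of size 40 from the uniform distribution on [4, 6] and compares the empirical mean and variance of X̄ with the theoretical values μ = 5 and σ²/n = 1/120:

```python
import random

random.seed(1)
n, reps = 40, 20_000

# Draw `reps` samples of size n from Uniform(4, 6) and record each sample mean.
means = []
for _ in range(reps):
    sample = [random.uniform(4, 6) for _ in range(n)]
    means.append(sum(sample) / n)

# Empirical mean and variance of the sampling distribution of X-bar.
m = sum(means) / reps
v = sum((x - m) ** 2 for x in means) / reps

print(m)  # close to mu = 5
print(v)  # close to sigma^2/n = 1/120 ~ 0.00833
```

A histogram of `means` would also look bell-shaped, even though each individual observation is uniform, which is the content of the central limit theorem.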
Sampling Distribution of a Difference
Consider two independent populations, the first with mean μ₁ and variance σ₁² and the second with mean μ₂ and variance σ₂². Suppose both populations are normally distributed, and let X̄₁ and X̄₂ be the means of independent random samples of sizes n₁ and n₂. Then

  E(X̄₁ − X̄₂) = μ₁ − μ₂
  V(X̄₁ − X̄₂) = σ₁²/n₁ + σ₂²/n₂

so X̄₁ − X̄₂ ~ N(μ₁ − μ₂, σ₁²/n₁ + σ₂²/n₂).
Ex. The effective life of a jet-turbine aircraft engine is a random variable with mean 5000 hr and standard deviation 40 hr, and its distribution is fairly close to normal. The engine manufacturer introduces an improvement into the manufacturing process that increases the mean life to 5050 hr and decreases the standard deviation to 30 hr. Suppose 16 components are sampled from the old process and 25 components are sampled from the improved process. What is the probability that the difference in the two sample means is at least 25 hr?
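By the result above, the difference X̄_new − X̄_old is approximately normal with mean 5050 − 5000 = 50 and variance 30²/25 + 40²/16 = 36 + 100 = 136. A short Python check (using only the standard library's `math.erf` for the normal CDF):

```python
import math

def norm_cdf(z):
    """Standard normal CDF expressed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu_diff = 5050 - 5000                 # mean of X_new_bar - X_old_bar
var_diff = 30**2 / 25 + 40**2 / 16    # 36 + 100 = 136
sd_diff = math.sqrt(var_diff)

# P(difference >= 25) = 1 - Phi((25 - 50)/sd)
p = 1.0 - norm_cdf((25 - mu_diff) / sd_diff)
print(round(p, 4))  # ~ 0.984
```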
Point estimation
Parameter estimation: calculating a reasonable number that describes a characteristic of the population.

Ex. X is normally distributed with unknown mean μ. The sample mean X̄ is a point estimator of the population mean μ. After the sample has been selected, the observed value x̄ is the point estimate of μ.
Unbiased Estimators
Ex. Suppose that X is a random variable with mean μ and variance σ². Let X₁, X₂, ..., Xₙ be a random sample of size n from the population. Show that the sample mean X̄ and the sample variance S² are unbiased estimators of μ and σ², respectively.

1. Prove that E(X̄) = μ.
2. Prove that E(S²) = σ².
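Unbiasedness can also be seen empirically. The sketch below (an addition for illustration) averages many realizations of the sample variance; dividing the sum of squares by n − 1 gives an average near σ², while dividing by n systematically underestimates it:

```python
import random

random.seed(2)
n, reps = 5, 40_000
mu, sigma2 = 0.0, 1.0

s2_unbiased, s2_biased = 0.0, 0.0
for _ in range(reps):
    x = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    xbar = sum(x) / n
    ss = sum((xi - xbar) ** 2 for xi in x)
    s2_unbiased += ss / (n - 1)   # sample variance S^2
    s2_biased += ss / n           # divides by n instead of n - 1

print(s2_unbiased / reps)  # close to sigma^2 = 1
print(s2_biased / reps)    # close to (n-1)/n * sigma^2 = 0.8
```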
Sometimes there are several unbiased estimators of the same population parameter.

Ex. Suppose we take a random sample of size n = 10 from a normal population and obtain the data x1 = 12.8, x2 = 9.4, x3 = 8.7, x4 = 11.6, x5 = 13.1, x6 = 9.8, x7 = 14.1, x8 = 8.5, x9 = 12.1, x10 = 10.3. Estimators such as the sample mean and the sample median are both unbiased estimators of μ.
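A short sketch computing two unbiased estimators of μ, the sample mean and the sample median, from these data (the estimates differ even though both estimators are unbiased):

```python
data = [12.8, 9.4, 8.7, 11.6, 13.1, 9.8, 14.1, 8.5, 12.1, 10.3]

# Two unbiased estimators of the normal mean mu.
mean = sum(data) / len(data)

s = sorted(data)
mid = len(s) // 2
median = (s[mid - 1] + s[mid]) / 2  # even sample size: average the middle pair

print(mean)    # 11.04
print(median)  # 10.95
```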
Minimum Variance Unbiased Estimator (MVUE)

Among all unbiased estimators of a parameter, the one with the smallest variance is called the MVUE. For a normal distribution, the sample mean X̄ is the MVUE for μ.
Methods of Point Estimation

Method of moments; method of maximum likelihood; Bayesian estimation of parameters.
Method of Moments
The general idea of the method of moments is to equate the population moments to the corresponding sample moments.

The first population moment is E(X) = μ ........(1)
The first sample moment is (1/n) Σᵢ₌₁ⁿ Xᵢ ........(2)
Equating (1) and (2): μ̂ = X̄ = (1/n) Σᵢ₌₁ⁿ Xᵢ

The sample mean is the moment estimator of the population mean.
Method of Moments
Moment Estimators

Ex. Let X₁, X₂, ..., Xₙ be a random sample from an exponential distribution with parameter λ. Find the moment estimator of λ.

There is one parameter to estimate, so we equate the first population moment to the first sample moment:

  first population moment = E(X) = 1/λ
  first sample moment = (1/n) Σᵢ₌₁ⁿ Xᵢ = X̄

Equating the two gives 1/λ = X̄, so λ̂ = 1/X̄.
Method of Moments
Ex. Let X₁, X₂, ..., Xₙ be a random sample from a normal distribution with parameters μ and σ². Find the moment estimators of μ and σ².

For μ (k = 1): the first population moment is E(X) = μ ........(1), and the first sample moment is (1/n) Σᵢ₌₁ⁿ Xᵢ ........(2). Equating (1) and (2) gives

  μ̂ = X̄ = (1/n) Σᵢ₌₁ⁿ Xᵢ

For σ² (k = 2): the second population moment is E(X²) = μ² + σ² ........(3), and the second sample moment is (1/n) Σᵢ₌₁ⁿ Xᵢ² ........(4). Equating (3) and (4) gives μ̂² + σ̂² = (1/n) Σᵢ₌₁ⁿ Xᵢ², so

  σ̂² = (1/n) Σᵢ₌₁ⁿ Xᵢ² − X̄² = Σᵢ₌₁ⁿ (Xᵢ − X̄)² / n
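The two equivalent forms of σ̂² are easy to verify numerically. A short sketch using the ten observations from the earlier example (note the moment estimator divides by n, not n − 1):

```python
data = [12.8, 9.4, 8.7, 11.6, 13.1, 9.8, 14.1, 8.5, 12.1, 10.3]
n = len(data)

# Moment estimators for a normal population:
mu_hat = sum(data) / n                                   # mu_hat = X-bar
sigma2_hat = sum(x**2 for x in data) / n - mu_hat**2     # (1/n) sum Xi^2 - X-bar^2

# Equivalent form: sum of squared deviations divided by n.
sigma2_alt = sum((x - mu_hat) ** 2 for x in data) / n

print(mu_hat)       # 11.04
print(sigma2_hat)   # same value as sigma2_alt
```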
Method of Maximum Likelihood

Concept: the estimator is the value of the parameter that maximizes the likelihood function.
Method of Maximum Likelihood

Ex. Let X be a Bernoulli random variable. The probability mass function is

  f(x; p) = p^x (1 − p)^(1−x) for x = 0, 1; f(x; p) = 0 otherwise.

The likelihood function of a random sample x₁, x₂, ..., xₙ is

  L(p) = p^(x₁)(1 − p)^(1−x₁) · p^(x₂)(1 − p)^(1−x₂) ··· p^(xₙ)(1 − p)^(1−xₙ)
       = ∏ᵢ₌₁ⁿ p^(xᵢ)(1 − p)^(1−xᵢ)
       = p^(Σᵢ xᵢ) (1 − p)^(n − Σᵢ xᵢ)

Taking logarithms,

  ln L(p) = (Σᵢ₌₁ⁿ xᵢ) ln p + (n − Σᵢ₌₁ⁿ xᵢ) ln(1 − p)

Setting the derivative equal to zero,

  d ln L(p)/dp = (Σᵢ₌₁ⁿ xᵢ)/p − (n − Σᵢ₌₁ⁿ xᵢ)/(1 − p) = 0

Solving for p gives the maximum likelihood estimator

  p̂ = (1/n) Σᵢ₌₁ⁿ Xᵢ
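The closed-form answer can be sanity-checked numerically. The sketch below (hypothetical data, added for illustration) maximizes ln L(p) over a fine grid for a small Bernoulli sample and confirms the maximizer matches the sample mean:

```python
import math

x = [1, 0, 1, 1, 0, 1, 0, 1]   # a small Bernoulli sample (hypothetical data)
n, s = len(x), sum(x)

def log_lik(p):
    # ln L(p) = (sum xi) ln p + (n - sum xi) ln(1 - p)
    return s * math.log(p) + (n - s) * math.log(1 - p)

# Grid search over (0, 1); the MLE should equal the sample mean s/n.
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=log_lik)

print(p_hat)   # 0.625
print(s / n)   # 0.625
```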
Method of Maximum Likelihood

Ex. Let X be normally distributed with unknown μ and known σ². Find the maximum likelihood estimator of μ.
Method of Maximum Likelihood

Ex. Let X be exponentially distributed with parameter λ. Find the maximum likelihood estimator of λ.
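For this exercise the MLE works out to λ̂ = n/Σxᵢ = 1/X̄, the same as the moment estimator. A sketch with hypothetical data, checking that the closed form beats nearby parameter values:

```python
import math

x = [0.5, 1.2, 0.3, 2.1, 0.9]   # hypothetical exponential data

def log_lik(lam):
    # ln L(lambda) = n ln(lambda) - lambda * sum(xi) for an exponential sample
    return len(x) * math.log(lam) - lam * sum(x)

# Closed form: lambda_hat = n / sum(xi) = 1 / x-bar
lam_hat = len(x) / sum(x)

# The log-likelihood at lam_hat should exceed nearby values.
assert log_lik(lam_hat) >= log_lik(lam_hat * 0.9)
assert log_lik(lam_hat) >= log_lik(lam_hat * 1.1)
print(round(lam_hat, 6))  # 1.0 for these data
```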
Method of Maximum Likelihood
Ex. Let X be normally distributed with unknown μ and unknown σ². Find the maximum likelihood estimators of μ and σ².
Method of Maximum Likelihood
The method of maximum likelihood is often the estimation method that mathematical statisticians prefer, because it is usually easy to use and produces estimators with good statistical properties.