Lecture 6: Normal distribution, central limit theorem


Stats for Engineers, Lecture 6. Answers for Question Sheet 1 are now online at http://cosmologist.info/teaching/STAT/. Answers for Question Sheet 2 should be available Friday evening.

Summary From Last Time

$P(-1.5 < X < -0.7)$

Continuous Random Variables

Probability density function (PDF) $f(x)$: $P(a \le X \le b) = \int_a^b f(x')\,dx'$, with $\int_{-\infty}^{\infty} f(x)\,dx = 1$.

Exponential distribution

$f(y) = \nu e^{-\nu y}$ for $y > 0$ and $f(y) = 0$ for $y < 0$: the probability density for the separation of random independent events occurring with constant rate $\nu$.

Normal/Gaussian distribution

𝜎 𝑓 π‘₯ = 1 2πœ‹πœŽ 2 𝑒 βˆ’ π‘₯βˆ’πœ‡ 2 2𝜎2 (βˆ’βˆž < π‘₯ < ∞) πœ‡: mean 𝜎: standard deviation

Normal distribution

Question from Derek Bruff

Consider the continuous random variable X = the weight in pounds of a randomly selected newborn baby. Suppose that X can be modelled with a normal distribution with mean $\mu = 7.57$ and standard deviation $\sigma = 1.06$. If the standard deviation were $\sigma = 1.26$ instead, how would that change the graph of the pdf of X?

1. The graph would be narrower and have a greater maximum value.
2. The graph would be narrower and have a lesser maximum value.
3. The graph would be narrower and have the same maximum value.
4. The graph would be wider and have a greater maximum value.
5. The graph would be wider and have a lesser maximum value.
6. The graph would be wider and have the same maximum value.

Clicker results: 12%, 3%, 0%, 0%, 82%, 3% (options 1–6).

Answer: 5. Since $\int_{-\infty}^{\infty} f(x)\,dx = 1$, a larger $\sigma$ makes the graph wider, so its maximum must be lower to keep the total area equal to 1.
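A quick numerical check (a sketch in Python with numpy/scipy, not part of the original slides): the peak of a normal pdf is $\frac{1}{\sqrt{2\pi\sigma^2}}$, so increasing $\sigma$ from 1.06 to 1.26 lowers the maximum while the total area stays 1.

    # Sketch: peak height of the normal pdf for two values of sigma (assumes scipy installed)
    import numpy as np
    from scipy.stats import norm

    for sigma in (1.06, 1.26):
        peak = norm.pdf(7.57, loc=7.57, scale=sigma)    # pdf at the mean = 1/sqrt(2*pi*sigma^2)
        area = norm.cdf(np.inf, loc=7.57, scale=sigma)  # total probability, always 1
        print(f"sigma={sigma}: peak={peak:.3f}, total area={area:.0f}")
    # sigma=1.06 -> peak ~0.376; sigma=1.26 -> peak ~0.317: wider and lower maximum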

𝑃 π‘Ž < 𝑋 < 𝑏 = π‘Ž 𝑏 𝑓 π‘₯ 𝑑π‘₯ BUT: for normal distribution cannot integrate analytically.

Instead, use tables for the standard Normal distribution: $f(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}$. If $X \sim N(\mu, \sigma^2)$, then $Z = \frac{X-\mu}{\sigma} \sim N(0,1)$.

Why does this work?

Change of variable

For a distribution $f(x)$, the probability that $X$ lies in a small range $dx$ around $x$ is $f(x)\,dx$.

The probability should be the same if it is written in terms of another variable $y = y(x)$. Hence $f(x)\,dx = f(y)\,dy$, so the density of $y$ is given by $f(y) = f(x)\,\frac{dx}{dy}$.

i.e. change $x$ to $z = \frac{x-\mu}{\sigma}$, so $x = \mu + \sigma z$ and $\frac{dx}{dz} = \sigma$. Then $f(z) = f(x)\,\frac{dx}{dz} = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)} \times \sigma = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}$, which is $N(0,1)$, the standard Normal distribution.

Use Normal tables for $Q(z) = P(Z \le z)$ [also called $\Phi(z)$]: $Q(z) = P(Z \le z) = \int_{-\infty}^{z} f(x)\,dx$, where $Z = \frac{X-\mu}{\sigma} \sim N(0,1)$.

Outside of exams this is probably best evaluated using a computer package (e.g. Maple, Mathematica, Matlab, Excel); for historical reasons you still have to use tables.
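For instance, a minimal sketch in Python (assuming numpy and scipy are available) evaluates $Q(z) = \Phi(z)$ directly with scipy.stats.norm, matching the table values used below:

    # Sketch: Q(z) = P(Z <= z) without tables (assumes scipy installed)
    from scipy.stats import norm

    print(norm.cdf(1.22))      # Q(1.22) ~ 0.8888, cf. example (a) below
    print(1 - norm.cdf(0.5))   # P(Z > 0.5) ~ 0.3085
    print(norm.ppf(0.975))     # inverse lookup: z with Q(z) = 0.975, ~1.96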

[Table of $Q(z)$ for $0 \le z \le 3.59$.]

Example: If $Z \sim N(0,1)$:

(a) $P(Z \le 1.22) = Q(1.22) = 0.8888$

(b) $P(Z > -0.5) = P(Z \le 0.5) = Q(0.5) = 0.6915$

(c) $P(Z \le -1.0) = P(Z \ge 1.0) = 1 - P(Z < 1.0) = 1 - Q(1.0) = 1 - 0.8413 = 0.1587$

Symmetries

If $Z \sim N(0,1)$, which of the following is NOT the same as $P(Z < 0.7)$?

1. $1 - P(Z > 0.7)$
2. $P(Z > -0.7)$
3. $1 - P(Z > -0.7)$
4. $1 - P(Z < -0.7)$

Clicker results: 10%, 13%, 60%, 17% (options 1–4).

Symmetries

If $Z \sim N(0,1)$, which of the following is NOT the same as $P(Z < 0.7)$?

1. $1 - P(Z > 0.7)$ — same ✓
2. $P(Z > -0.7)$ — same ✓
3. $1 - P(Z > -0.7)$ — NOT the same: this equals $P(Z \le -0.7) = 1 - P(Z < 0.7)$
4. $1 - P(Z < -0.7)$ — same ✓

(d) $P(0.5 < Z < 1.5) = P(Z < 1.5) - P(Z < 0.5) = Q(1.5) - Q(0.5) = 0.9332 - 0.6915 = 0.2417$

(e) $P(Z < 1.356)$: this lies between $Q(1.35) = 0.9115$ and $Q(1.36) = 0.9131$.

Using linear interpolation, $Q(1.356) = A\,Q(1.35) + B\,Q(1.36)$, where $B$ is the fraction of the distance from 1.35 to 1.36: $B = \frac{1.356 - 1.35}{1.36 - 1.35} = 0.6$ and $A = 1 - B = 0.4$.

Hence $Q(1.356) = 0.4\,Q(1.35) + 0.6\,Q(1.36) = 0.9125$.
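The same interpolation is easy to check numerically; a small sketch (Python with scipy assumed) compares the table interpolation with the exact CDF value:

    # Sketch: linear interpolation between table entries vs the exact value
    from scipy.stats import norm

    z = 1.356
    B = (z - 1.35) / (1.36 - 1.35)       # fraction of the way from 1.35 to 1.36 (= 0.6)
    A = 1 - B
    q_interp = A * 0.9115 + B * 0.9131   # table values Q(1.35), Q(1.36)
    print(q_interp, norm.cdf(z))         # both ~0.9125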

(f) $0.8 = P(Z \le c) = Q(c)$. What is $c$?

Use the table in reverse: $z$ lies between 0.84 and 0.85. Interpolating as before, $c = A \times 0.84 + B \times 0.85$, with $B = \frac{0.8 - 0.7995}{0.8023 - 0.7995} = 0.18$ and $A = 1 - B = 0.82$.

$\Rightarrow c = 0.82 \times 0.84 + 0.18 \times 0.85 \approx 0.842$
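Reading the table in reverse corresponds to the inverse CDF (the percent-point function); a minimal sketch in Python (scipy assumed):

    # Sketch: find c with Q(c) = 0.8 using the inverse CDF
    from scipy.stats import norm

    c = norm.ppf(0.8)   # inverse of the CDF
    print(c)            # ~0.8416, consistent with the interpolated value ~0.842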

Using Normal tables

The error $Z$ (in Celsius) on a cheap digital thermometer has a normal distribution, with $Z \sim N(0,1)$.

What is the probability that a given temperature measurement is too cold by more than 1.54 °C?

1. 0.0618
2. 0.9382
3. 0.1236
4. 0.0735

Clicker results: 43%, 36%, 19%, 2% (options 1–4).

Using Normal tables

The error $Z$ (in Celsius) on a cheap digital thermometer has a normal distribution, with $Z \sim N(0,1)$.

What is the probability that a given temperature measurement is too cold by more than 1.54 °C?

Answer: We want $P(Z < -1.54) = P(Z > 1.54) = 1 - P(Z < 1.54) = 1 - Q(1.54) = 1 - 0.9382 = 0.0618$.

(g) Finding a range of values within which $Z$ lies with probability 0.95: the answer is not unique, but suppose we want an interval symmetric about zero, i.e. between $-d$ and $d$.

Each tail then contains probability $0.05/2 = 0.025$, so $d$ is where $Q(d) = 0.025 + 0.95 = 0.975$.

Use the table in reverse: $Q(d) = 0.975 \Rightarrow d = 1.96$.

95% of the probability is in the range $-1.96 < Z < 1.96$.

In general, 95% of the probability lies within $1.96\sigma$ of the mean $\mu$, with $P = 0.025$ in each tail. The range $\mu \pm 1.96\sigma$ is called a 95% confidence interval.
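The 1.96 figure can also be obtained from the inverse CDF; a small sketch (Python, scipy assumed):

    # Sketch: central 95% interval for Z ~ N(0,1), tails of 0.025 each
    from scipy.stats import norm

    d = norm.ppf(0.975)                # Q(d) = 0.975
    print(d)                           # ~1.96
    print(norm.cdf(d) - norm.cdf(-d))  # probability inside (-d, d), ~0.95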

Question from Derek Bruff

Normal distribution

If $X$ has a Normal distribution with mean $\mu = 20$ and standard deviation $\sigma = 4$, which of the following could be a graph of the pdf of $X$?

[Four candidate graphs shown.]

Clicker results: 45%, 20%, 32%, 2% (options 1–4).

Normal distribution

If $X$ has a Normal distribution with mean $\mu = 20$ and standard deviation $\sigma = 4$, which of the following could be a graph of the pdf of $X$?

Answer: 2. The mean is at $\mu = 20$, and 95% of the probability (5% outside) lies within $\mu \pm 2\sigma$, i.e. $20 \pm 8$. Graph 1 is too wide, graph 3 has the wrong mean, and graph 4 is too narrow.

Example: Manufacturing variability

The outside diameter, $X$ mm, of a copper pipe is $N(15.00, 0.02^2)$ and the fittings for joining the pipe have inside diameter $Y$ mm, where $Y \sim N(15.07, 0.022^2)$.

(i) Find the probability that X exceeds 14.99 mm.

(ii) Within what range will X lie with probability 0.95?

(iii) Find the probability that a randomly chosen pipe fits into a randomly chosen fitting (i.e. X < Y).


Example: Manufacturing variability

The outside diameter, $X$ mm, of a copper pipe is $N(15.00, 0.02^2)$ and the fittings for joining the pipe have inside diameter $Y$ mm, where $Y \sim N(15.07, 0.022^2)$.

(i) Find the probability that X exceeds 14.99 mm.

Answer: $X \sim N(\mu, \sigma^2) = N(15.0, 0.02^2)$.

$P(X > 14.99) = P\left(Z > \frac{14.99 - 15.0}{0.02}\right) = P(Z > -0.5) = P(Z < 0.5) = Q(0.5) \approx 0.6915$

[Reminder: $Z = \frac{X - \mu}{\sigma}$]
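As a quick check, the same probability evaluated directly (a sketch in Python with scipy, using the numbers above):

    # Sketch: P(X > 14.99) for X ~ N(15.00, 0.02^2)
    from scipy.stats import norm

    mu, sigma = 15.00, 0.02
    print(1 - norm.cdf(14.99, loc=mu, scale=sigma))  # ~0.6915 (equivalently norm.sf)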

Example: Manufacturing variability

The outside diameter, $X$ mm, of a copper pipe is $N(15.00, 0.02^2)$ and the fittings for joining the pipe have inside diameter $Y$ mm, where $Y \sim N(15.07, 0.022^2)$.

(ii) Within what range will X lie with probability 0.95?

Answer: From the previous example, $P(-1.96 < Z < 1.96) = 0.95$, i.e. $X$ lies in $\mu \pm 1.96\sigma$ with probability 0.95.

$\Rightarrow X = 15 \pm 1.96 \times 0.02$, i.e. $14.96\,\text{mm} < X < 15.04\,\text{mm}$.

Where is the probability?

We found 95% of the probability lies within $14.96\,\text{mm} < X < 15.04\,\text{mm}$.

What is the probability that $X > 15.04\,\text{mm}$?

1. 0.025
2. 0.05
3. 0.95
4. 0.975

Clicker results: 71%, 14%, 4%, 11% (options 1–4).

Answer: 0.025. The remaining 5% of the probability is split equally between the two tails, $P = 0.025$ in each.

Example: Manufacturing variability

The outside diameter, $X$ mm, of a copper pipe is $N(15.00, 0.02^2)$ and the fittings for joining the pipe have inside diameter $Y$ mm, where $Y \sim N(15.07, 0.022^2)$.

(iii) Find the probability that a randomly chosen pipe fits into a randomly chosen fitting (i.e. X < Y).

Answer: For $X < Y$ we want $P(Y - X > 0)$. To answer this we need to know the distribution of $Y - X$, where $X$ and $Y$ both have (different) Normal distributions.

Distribution of the sum of Normal variates

Means and variances of independent random variables just add. If $X_1, X_2, \ldots, X_n$ are independent and each has a normal distribution $X_i \sim N(\mu_i, \sigma_i^2)$, then $\mu_{X_1+X_2} = \mu_{X_1} + \mu_{X_2}$, $\sigma^2_{X_1+X_2} = \sigma^2_{X_1} + \sigma^2_{X_2}$, etc.

A special property of the Normal distribution is that the distribution of the sum of Normal variates is also a Normal distribution [stated without proof]. If $c_1, c_2, \ldots, c_n$ are constants then
$c_1 X_1 + c_2 X_2 + \cdots + c_n X_n \sim N(c_1\mu_1 + \cdots + c_n\mu_n,\; c_1^2\sigma_1^2 + c_2^2\sigma_2^2 + \cdots + c_n^2\sigma_n^2)$.

E.g. $X_1 + X_2 \sim N(\mu_1 + \mu_2,\; \sigma_1^2 + \sigma_2^2)$ and $X_1 - X_2 \sim N(\mu_1 - \mu_2,\; \sigma_1^2 + \sigma_2^2)$.
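A quick way to see this concretely is by simulation; a minimal Monte Carlo sketch (Python with numpy assumed, using the pipe/fitting numbers from the example below) checks that the mean and variance of a difference behave as stated:

    # Sketch: mean and variance of Y - X for independent normals
    import numpy as np

    rng = np.random.default_rng(0)
    mu_y, s_y, mu_x, s_x, N = 15.07, 0.022, 15.00, 0.02, 1_000_000
    d = rng.normal(mu_y, s_y, N) - rng.normal(mu_x, s_x, N)
    print(d.mean(), mu_y - mu_x)     # ~0.07
    print(d.var(), s_y**2 + s_x**2)  # ~0.000884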

Example: Manufacturing variability

The outside diameter, $X$ mm, of a copper pipe is $N(15.00, 0.02^2)$ and the fittings for joining the pipe have inside diameter $Y$ mm, where $Y \sim N(15.07, 0.022^2)$.

(iii) Find the probability that a randomly chosen pipe fits into a randomly chosen fitting (i.e. X < Y).

Answer: For $X < Y$ we want $P(Y - X > 0)$.

$Y - X \sim N(\mu_Y - \mu_X,\; \sigma_Y^2 + \sigma_X^2) = N(15.07 - 15,\; 0.02^2 + 0.022^2) = N(0.07,\, 0.000884)$

Hence $P(Y - X > 0) = P\left(Z > \frac{0 - 0.07}{\sqrt{0.000884}}\right) = P(Z > -2.35) = P(Z < 2.35) \approx 0.991$.
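The same answer evaluated directly (sketch, Python with scipy assumed):

    # Sketch: P(Y - X > 0) with Y - X ~ N(0.07, 0.000884)
    import numpy as np
    from scipy.stats import norm

    mu_d = 15.07 - 15.00
    sd_d = np.sqrt(0.02**2 + 0.022**2)
    print(1 - norm.cdf(0, loc=mu_d, scale=sd_d))  # ~0.991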

Which of the following would make a random pipe more likely to fit into a random fitting?

The outside diameter, $X$ mm, of a copper pipe is $N(15.00, 0.02^2)$ and the fittings for joining the pipe have inside diameter $Y$ mm, where $Y \sim N(15.07, 0.022^2)$.

1. Decreasing the mean of Y
2. Increasing the variance of X
3. Decreasing the variance of X
4. Increasing the variance of Y

Clicker results: 55%, 16%, 14%, 16% (options 1–4).

Which of the following would make a random pipe more likely to fit into a random fitting?

The outside diameter, $X$ mm, of a copper pipe is $N(15.00, 0.02^2)$ and the fittings for joining the pipe have inside diameter $Y$ mm, where $Y \sim N(15.07, 0.022^2)$.

Answer: Common sense, or use $d = Y - X \sim N(\mu_Y - \mu_X,\; \sigma_Y^2 + \sigma_X^2)$.

$P(X < Y) = P(d > 0) = P\left(Z > \frac{0 - \mu_d}{\sigma_d}\right) = P\left(Z < \frac{\mu_d}{\sigma_d}\right)$

The probability is larger if $\mu_d$ is larger (bigger average gap between pipe and fitting) or $\sigma_d$ is smaller (less fluctuation in gap size). Since $\sigma_d^2 = \sigma_X^2 + \sigma_Y^2$, $\sigma_d$ is smaller if the variance of $X$ is decreased.

Normal approximations

Central Limit Theorem: If $X_1, X_2, \ldots$ are independent random variables with the same distribution, which has mean $\mu$ and variance $\sigma^2$ (both finite), then the sum $\sum_{i=1}^{n} X_i$ tends to the distribution $N(n\mu, n\sigma^2)$ as $n \to \infty$.

Hence: the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ is distributed approximately as $N(\mu, \frac{\sigma^2}{n})$.

For the approximation to be good, $n$ has to be around 30 or more for skewed distributions, but can be quite small for simple symmetric distributions. The approximation tends to have much better fractional accuracy near the peak than in the tails: don't rely on the approximation to estimate the probability of very rare events.

It often also works for the sum of non-independent random variables, i.e. the sum tends to a normal distribution (but the variance is harder to calculate).

Example: Average of $n$ samples from a uniform distribution. [Plots for increasing $n$ not shown.]
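A quick simulation illustrates this (a sketch in Python with numpy, not from the original slides): the mean of $n$ Uniform(0,1) samples has mean 0.5 and standard deviation $\sqrt{1/(12n)}$, and its histogram rapidly approaches a normal shape as $n$ grows.

    # Sketch: CLT for the mean of n Uniform(0,1) samples
    import numpy as np

    rng = np.random.default_rng(1)
    mu, var = 0.5, 1 / 12                 # mean and variance of Uniform(0,1)
    for n in (1, 2, 10, 50):
        means = rng.uniform(0, 1, size=(100_000, n)).mean(axis=1)
        print(n, means.mean(), means.std(), np.sqrt(var / n))  # simulated vs CLT prediction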

Example:

The mean weight of people in England is $\mu = 72.4$ kg, with standard deviation $\sigma = 15$ kg.

The London Eye at capacity holds 800 people at once.

What is the distribution of the weight of the passengers at any random time when the Eye is full?

Answer:

The total weight $W$ of the passengers is the sum of $n = 800$ individual weights. Assuming the weights are independent, by the central limit theorem $W \sim N(n\mu, n\sigma^2)$.

With $\mu = 72.4$ kg, $\sigma = 15$ kg and $n = 800$: $W \sim N(800 \times 72.4\,\text{kg},\; 800 \times 15^2\,\text{kg}^2) = N(57920\,\text{kg},\; 180000\,\text{kg}^2)$, i.e. Normal with $\mu_W \approx 58000$ kg and $\sigma_W = \sqrt{180000\,\text{kg}^2} \approx 424$ kg.

[Usual caveat: people visiting the Eye are unlikely to actually have independent weights, e.g. families, school trips, etc.]
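The same numbers in a couple of lines (sketch, Python with numpy assumed; independence assumed as per the caveat above):

    # Sketch: total passenger weight W ~ N(n*mu, n*sigma^2) by the CLT
    import numpy as np

    mu, sigma, n = 72.4, 15.0, 800
    print(n * mu)              # mean of W, ~57920 kg
    print(np.sqrt(n) * sigma)  # standard deviation of W, ~424 kg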

Course Feedback: Which best describes your experience of the lectures so far?

1. Too slow
2. Speed OK, but struggling to understand many things
3. Speed OK, I can understand most things
4. A bit fast, I can only just keep up
5. Too fast, I don’t have time to take notes though I still follow most of it
6. Too fast, I feel completely lost most of the time
7. I switch off and learn on my own from the notes and doing the questions
8. I can’t hear the lectures well enough (e.g. speech too fast to understand or other people talking)

Results (poll stopped prematurely, not many answers): 31%, 38%, 13%, 13%, 6%, 0%, 0%, 0%.

Course Feedback: What do you think of clickers?

1. I think they are a good thing, help me learn and make lectures more interesting
2. I enjoy the questions, but don’t think they help me learn
3. I think they are a waste of time
4. I think they are a good idea, but better questions would make them more useful
5. I think they are a good idea, but need longer to answer questions

Results: 65%, 7%, 2%, 14%, 12% (options 1–5).

Course Feedback: How did you find the question sheets so far?

1. Challenging but I managed most of it OK
2. Mostly fairly easy
3. Had difficulty, but workshops helped me to understand
4. Had difficulty and workshops were very little help
5. I’ve not tried them

Results: 47%, 20%, 14%, 16%, 4% (options 1–5).

Normal approximation to the Binomial

If $X \sim B(n, p)$, $n$ is large, and $p$ is not too near 0 or 1, then $X$ is approximately $N(np,\; np(1-p))$.

[Plots: Binomial distribution compared with the Normal approximation for $p = \frac{1}{2}$, $n = 10$ and $p = \frac{1}{2}$, $n = 50$.]

Approximating a range of possible results from a Binomial distribution, e.g. $P(\text{6 or fewer heads when tossing a coin 10 times}) = P(X \le 6)$ if $X \sim B(10, 0.5)$.

Exactly: $P(X \le 6) = P(k=0) + P(k=1) + \cdots + P(k=6) = 0.8281$.

Normal approximation, with $\mu = np = 5$ and $\sigma^2 = np(1-p) = 2.5$, integrating up to 6.5 (a continuity correction, since the Binomial is discrete):

$P(X \le 6) \approx \int_{-\infty}^{6.5} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}\,dx = Q\left(\frac{6.5 - \mu}{\sigma}\right) = Q\left(\frac{1.5}{\sqrt{2.5}}\right) = Q(0.9487) = 0.8286$

[Not always so accurate at such low $n$!]
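Both numbers can be checked directly (sketch, Python with scipy assumed):

    # Sketch: exact Binomial vs Normal approximation with continuity correction
    import numpy as np
    from scipy.stats import binom, norm

    n, p = 10, 0.5
    mu, sigma = n * p, np.sqrt(n * p * (1 - p))
    print(binom.cdf(6, n, p))            # exact: 0.8281
    print(norm.cdf((6.5 - mu) / sigma))  # approximation: ~0.8286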

If π‘Œ ∼ 𝑁(πœ‡, 𝜎 2 ) what is the best approximation for 𝑃(3 or more heads when tossing a coin 10 times) ?

i.e. If 𝑋 ∼ 𝐡(10,0.5) , πœ‡ = 5 , 𝜎 2 = 𝑛𝑝 1 βˆ’ 𝑝 = 2.5

, what is the best approximation for 𝑃(𝑋 β‰₯ 3) ?

1.

2.

3.

𝑃(π‘Œ > 2.5) 𝑃(π‘Œ > 3) 𝑃(π‘Œ > 3.5)

60% 13% 27% 1 2 3

Quality control example:

The manufacturing of computer chips produces 10% defective chips. 200 chips are randomly selected from a large production batch. What is the probability that fewer than 15 are defective?

Answer: mean $np = 200 \times 0.1 = 20$, variance $np(1-p) = 200 \times 0.1 \times 0.9 = 18$. So if $X$ is the number of defective chips, approximately $X \sim N(20, 18)$.

Hence $P(X < 15) \approx P\left(Z < \frac{14.5 - 20}{\sqrt{18}}\right) = P(Z < -1.296) = 1 - P(Z < 1.296) = 1 - [0.9015 + 0.6 \times (0.9032 - 0.9015)] \approx 0.097$.

This compares to the exact Binomial answer $\sum_{k=0}^{14} {}^{n}C_{k}\, p^k (1-p)^{n-k} \approx 0.093$. The Binomial answer is easy to calculate on a computer, but the Normal approximation is much easier if you have to do it by hand. The Normal approximation is about right, but not very accurate here.
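For completeness, both values in a few lines (sketch, Python with scipy assumed):

    # Sketch: P(fewer than 15 defective) out of 200 chips with p = 0.1
    import numpy as np
    from scipy.stats import binom, norm

    n, p = 200, 0.1
    mu, sigma = n * p, np.sqrt(n * p * (1 - p))   # 20 and sqrt(18)
    print(binom.cdf(14, n, p))                    # exact Binomial, ~0.093
    print(norm.cdf((14.5 - mu) / sigma))          # Normal approximation, ~0.097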