
Introduction to Stochastic Models
GSLM 54100
Outline
• memoryless property of geometric and exponential
• V(X) = E[V(X|Y)] + V[E(X|Y)]
• conditional probability
Memoryless Property
of Geometric Distribution
• X ~ Geo(p)
• Y = the remaining life of X given that X > 1
• Y = (X − 1|X > 1)
• P(Y = k) = P(X − 1 = k|X > 1) = P(X − 1 = k, X > 1)/P(X > 1)
  = P(X = 1 + k)/P(X > 1) = (1 − p)^k p/(1 − p) = (1 − p)^{k−1} p = P(X = k)
• Y ~ X
• similarly, (X − m|X > m) ~ X for all m ≥ 1
• the memoryless property of geometric (sketch below)
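Not part of the original slides: a minimal Monte Carlo sketch of this property, with p = 0.3 and the sample size chosen arbitrarily for illustration.

```python
# Sketch: check empirically that (X - 1 | X > 1) ~ X for X ~ Geo(p).
import random

p, n = 0.3, 200_000          # arbitrary illustration values
random.seed(0)

def geo(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

xs = [geo(p) for _ in range(n)]
ys = [x - 1 for x in xs if x > 1]          # remaining life given X > 1

for k in range(1, 6):
    px = sum(x == k for x in xs) / len(xs)
    py = sum(y == k for y in ys) / len(ys)
    print(f"k={k}: P(X=k)~{px:.4f}  P(Y=k)~{py:.4f}  exact={(1 - p)**(k - 1) * p:.4f}")
```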
Memoryless Property
of Exponential Distribution
• X ~ exp(λ), P(X > s) = e^{−λs}, s > 0
• Y = the remaining life of X given that X > t (> 0)
• Y = (X − t|X > t)
• P(Y > s) = P(X − t > s|X > t) = P(X > t + s, X > t)/P(X > t)
  = P(X > t + s)/P(X > t) = e^{−λ(t+s)}/e^{−λt} = e^{−λs} = P(X > s)
• Y ~ X
• the memoryless property of exponential (sketch below)
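Again as a sketch (not in the slides), the same check for the exponential case; λ = 0.5 and t = 2.0 are arbitrary illustration values.

```python
# Sketch: check P(X - t > s | X > t) = P(X > s) for X ~ exp(lam).
import math
import random

lam, t, n = 0.5, 2.0, 200_000    # arbitrary illustration values
random.seed(0)

xs = [random.expovariate(lam) for _ in range(n)]
tail = [x for x in xs if x > t]                       # condition on X > t

for s in (0.5, 1.0, 2.0):
    cond = sum(x - t > s for x in tail) / len(tail)   # P(X - t > s | X > t)
    print(f"s={s}: conditional~{cond:.4f}  P(X>s)={math.exp(-lam * s):.4f}")
```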
Finding Variance by Conditioning
Finding Variance by Conditioning
• Proposition 3.1 of Ross
• X & Y: two random variables
  • both V(X|Y) and E(X|Y) are random variables
  • E[V(X|Y)] & V[E(X|Y)] are well-defined
• E[V(X|Y)] = E[E(X²|Y)] − E[E²(X|Y)]
• V[E(X|Y)] = E[E²(X|Y)] − E²[E(X|Y)]
• E[V(X|Y)] + V[E(X|Y)] = E[E(X²|Y)] − E²[E(X|Y)] = E[X²] − E²[X] = V(X)
  (numerical check below)
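A small numerical sketch of Proposition 3.1 (not from Ross); the joint pmf below is made up purely for illustration.

```python
# Sketch: verify V(X) = E[V(X|Y)] + V[E(X|Y)] on a small, made-up joint pmf.
joint = {                      # (x, y) -> P(X = x, Y = y); arbitrary numbers
    (0, 0): 0.1, (1, 0): 0.2, (2, 0): 0.1,
    (0, 1): 0.2, (1, 1): 0.1, (2, 1): 0.3,
}

ys = {y for (_, y) in joint}
pY = {y: sum(pr for (x, yy), pr in joint.items() if yy == y) for y in ys}

# conditional moments E(X|Y=y) and V(X|Y=y)
EXgY = {y: sum(x * pr for (x, yy), pr in joint.items() if yy == y) / pY[y] for y in ys}
EX2gY = {y: sum(x * x * pr for (x, yy), pr in joint.items() if yy == y) / pY[y] for y in ys}
VXgY = {y: EX2gY[y] - EXgY[y] ** 2 for y in ys}

EX = sum(x * pr for (x, _), pr in joint.items())
EX2 = sum(x * x * pr for (x, _), pr in joint.items())
VX = EX2 - EX ** 2

E_VXgY = sum(pY[y] * VXgY[y] for y in ys)                  # E[V(X|Y)]
V_EXgY = sum(pY[y] * EXgY[y] ** 2 for y in ys) - EX ** 2   # V[E(X|Y)]
print(VX, E_VXgY + V_EXgY)                                 # the two numbers agree
```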
Variance of Random Sum
• X_i's ~ i.i.d. random variables, mean μ & variance σ²
• N: a non-negative integer-valued random variable independent of the X_i's, mean ν & variance τ²
• S = Σ_{i=1}^{N} X_i
Variance of Random Sum
• V(S) = E[V(S|N)] + V[E(S|N)]
• V(S|N = n) = V(Σ_{i=1}^{n} X_i) = nσ²
• E[V(S|N)] = E[Nσ²] = σ²E[N] = σ²ν
• E(S|N = n) = E(Σ_{i=1}^{n} X_i) = nμ
• V[E(S|N)] = V[μN] = μ²V(N) = μ²τ²
• V(S) = σ²ν + μ²τ²
• not necessary to find the distribution of S (simulation sketch below)
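A simulation sketch of the formula (not in the slides); X_i ~ exp(1) (so μ = σ² = 1) and N ~ Geo(r) on {1, 2, …} are arbitrary choices.

```python
# Sketch: Monte Carlo check of V(S) = sigma^2*nu + mu^2*tau^2 for a random sum.
import random
import statistics

r, runs = 0.4, 100_000       # arbitrary illustration values
random.seed(0)

def geo(r):
    k = 1
    while random.random() >= r:
        k += 1
    return k

S = []
for _ in range(runs):
    n = geo(r)                                        # draw N
    S.append(sum(random.expovariate(1.0) for _ in range(n)))

mu, sig2 = 1.0, 1.0                  # mean and variance of X_i ~ exp(1)
nu, tau2 = 1 / r, (1 - r) / r ** 2   # mean and variance of N ~ Geo(r)
print("simulated V(S):", statistics.pvariance(S))
print("sigma^2*nu + mu^2*tau^2:", sig2 * nu + mu * mu * tau2)
```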
Variance of Geometric
• X ~ Geo(p), 0 < p < 1
• I_A = 1 if {X = 1} and I_A = 0 if {X > 1}
• P(I_A = 1) = p and P(I_A = 0) = 1 − p = q
• E[V(X|I_A)]
  • V(X|X = 1) = 0
  • V(X|X > 1) = V(1 + X − 1|X > 1) = V(X − 1|X > 1) = V(X)
  • E[V(X|I_A)] = p·0 + q·V(X) = qV(X)
Variance of Geometric
• Example 3.18 of Ross
• V[E(X|I_A)]
  • E[X|X = 1] = 1
  • E[X|X > 1] = 1 + E[X − 1|X > 1] = 1 + 1/p   (memoryless property of geometric)
• V[E(X|I_A)] = E[E²(X|I_A)] − E²[E(X|I_A)] = p(1)² + (1 − p)(1 + 1/p)² − (1/p)² = q/p
• V(X) = E[V(X|I_A)] + V[E(X|I_A)] = qV(X) + q/p
• solving, V(X) = q/p²   (sketch below)
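A quick arithmetic sketch tying the two conditioning pieces together; p = 0.25 is an arbitrary illustration value.

```python
# Sketch: the two conditioning pieces add up to V(X) = q/p^2 for X ~ Geo(p).
p = 0.25                             # arbitrary illustration value
q = 1 - p

EX = 1 / p
V = q / p ** 2                       # claimed variance

E_V_given_I = q * V                                         # E[V(X|I_A)] = qV(X)
V_E_given_I = p * 1 ** 2 + q * (1 + 1 / p) ** 2 - EX ** 2   # V[E(X|I_A)]
print(V_E_given_I, q / p)            # agree: V[E(X|I_A)] = q/p
print(V, E_V_given_I + V_E_given_I)  # agree: V(X) = qV(X) + q/p
```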
Conditional Probability
First Part of Example 3.26 of Ross
• n men mixed their hats & randomly picked one
• E_n: no matches for these n men
• P(E_n) = ?
• p_1 = 0
• p_2 = 0.5
First Part of Example 3.26 of Ross
• n = 3
• M: correct hat for the first man
• M^c: wrong hat for the first man
[diagram: men A, B, C and their hats a, b, c]
• p_3 = P(E_3) = P(E_3|M)P(M) + P(E_3|M^c)P(M^c) = P(E_3|M^c)P(M^c)
• P(M^c) = 2/3 & P(E_3|M^c) = 1/2
First Part of Example 3.26 of Ross
• n = 4
• M: correct hat for the first man
• M^c: wrong hat for the first man
[diagram: men A, B, C, D and their hats a, b, c, d]
• p_4 = P(E_4) = P(E_4|M)P(M) + P(E_4|M^c)P(M^c) = P(E_4|M^c)P(M^c)
• P(M^c) = 3/4 & P(E_4|M^c) = ???
First Part of Example 3.26 of Ross
• P(E_4|M^c) = ???
• C_a: C gets a
• C_na: C does not get a
[diagram: men A, B, C, D and their hats a, b, c, d]
• {E_4|M^c} = {E_4 C_a|M^c} ∪ {E_4 C_na|M^c}
• P(E_4|M^c) = P(E_4 C_a|M^c) + P(E_4 C_na|M^c)
First Part of Example 3.26 of Ross
• P(E_4 C_na|M^c) = P(E_3)
• P(E_4 C_a|M^c) = P(E_4|C_a M^c)P(C_a|M^c) = P(E_2)(1/3)
[diagram: men A, B, C, D and their hats a, b, c, d]
First Part of Example 3.26 of Ross
• p_n = P(E_n) = P(E_n|M)P(M) + P(E_n|M^c)P(M^c) = P(E_n|M^c)P(M^c)
• H_a: the man whose hat was picked by A picked A's hat
• H_na: the man whose hat was picked by A did not pick A's hat
• P(E_n|M^c) = P(E_n H_na|M^c) + P(E_n H_a|M^c)
• P(E_n H_na|M^c) = p_{n−1}
• P(E_n H_a|M^c) = P(E_n|H_a M^c)P(H_a|M^c) = p_{n−2} · [1/(n − 1)]
• so p_n − p_{n−1} = −(1/n)(p_{n−1} − p_{n−2})
• solving, p_n = 1/2! − 1/3! + 1/4! − … + (−1)ⁿ/n!   (sketch below)
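A numerical sketch (not in Ross): iterating the recursion reproduces the closed form, and both columns tend to e⁻¹ ≈ 0.3679.

```python
# Sketch: hats recursion versus the closed form; both tend to 1/e ~ 0.3679.
from math import factorial

p = {1: 0.0, 2: 0.5}
for n in range(3, 11):
    # p_n - p_{n-1} = -(1/n)(p_{n-1} - p_{n-2})
    p[n] = p[n - 1] - (p[n - 1] - p[n - 2]) / n

for n in range(1, 11):
    closed = sum((-1) ** k / factorial(k) for k in range(2, n + 1))
    print(n, round(p[n], 6), round(closed, 6))   # the two columns agree
```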
Recursive Relationships
Ex. #4 of WS#10
• #1. (Solve Exercise #4 of Worksheet #10 by conditioning, not direct computation.) Let X and Y be two independent random variables ~ Geo(p).
• (a). Find P(X = Y).
• (b). Find P(X > Y).
• (c). Find P(min(X, Y) > k) for k ∈ {1, 2, …}.
• (d). From (c) or otherwise, find E[min(X, Y)].
• (e). Show that max(X, Y) + min(X, Y) = X + Y. Hence find E[max(X, Y)].
Ex. #4 of WS#10
• (a). different ways to solve the problem
• by computation:
  P(X = Y) = Σ_{k=1}^{∞} P(X = Y = k) = Σ_{k=1}^{∞} P(X = k, Y = k) = Σ_{k=1}^{∞} P(X = k)P(Y = k)
  = Σ_{k=1}^{∞} (1 − p)^{k−1} p · (1 − p)^{k−1} p = p²/[1 − (1 − p)²] = p/(2 − p)
Ex. #4 of WS#10
• by conditioning:
  • P(X = Y|X = 1, Y = 1) = 1
  • P(X = Y|X = 1, Y > 1) = 0
  • P(X = Y|X > 1, Y = 1) = 0
  • P(X = Y|X > 1, Y > 1) = P(X = Y)
• P(X = Y) = P(X = 1, Y = 1)(1) + P(X > 1, Y > 1)P(X = Y)
• P(X = Y) = p² + (1 − p)² P(X = Y)
• i.e., P(X = Y) = p/(2 − p)   (sketch below)
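A Monte Carlo sketch of part (a), not part of the worksheet; p = 0.3 and the run count are arbitrary.

```python
# Sketch: Monte Carlo check of P(X = Y) = p/(2 - p) for independent X, Y ~ Geo(p).
import random

p, n = 0.3, 200_000                  # arbitrary illustration values
random.seed(0)

def geo(p):
    k = 1
    while random.random() >= p:
        k += 1
    return k

hits = sum(geo(p) == geo(p) for _ in range(n))   # two independent draws per trial
print("simulated:", hits / n, " p/(2-p):", p / (2 - p))
```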
Ex. #4 of WS#10
• (b) P(X > Y)
• by symmetry
  • P(X > Y) + P(X = Y) + P(X < Y) = 1
  • P(X > Y) = P(X < Y)
  • P(X > Y) = [1 − P(X = Y)]/2 = (1 − p)/(2 − p)
Ex. #4 of WS#10
• by direct computation
  P(X > Y) = Σ_{i=2}^{∞} Σ_{j=1}^{i−1} P(X = i, Y = j) = Σ_{i=2}^{∞} Σ_{j=1}^{i−1} P(X = i)P(Y = j)
  = Σ_{i=2}^{∞} (1 − p)^{i−1} p · Σ_{j=1}^{i−1} (1 − p)^{j−1} p
  = Σ_{i=2}^{∞} (1 − p)^{i−1} p [1 − (1 − p)^{i−1}]
  = Σ_{i=2}^{∞} p[(1 − p)^{i−1} − (1 − p)^{2i−2}]
  = (1 − p) − (1 − p)²/(2 − p)
  = (1 − p)/(2 − p)
Ex. #4 of WS#10
• by conditioning
  • P(X > Y) = E[P(X > Y|Y)]
  • P(X > Y|Y = y) = P(X > y) = (1 − p)^y
  • E[P(X > Y|Y)] = E[(1 − p)^Y] = Σ_{y=1}^{∞} (1 − p)^y (1 − p)^{y−1} p
    = (1 − p)p/[1 − (1 − p)²] = (1 − p)/(2 − p)
Ex. #4 of WS#10
• yet another way of conditioning
  • P(X > Y|X = 1, Y = 1) = 0
  • P(X > Y|X = 1, Y > 1) = 0
  • P(X > Y|X > 1, Y = 1) = 1
  • P(X > Y|X > 1, Y > 1) = P(X > Y)
• P(X > Y) = P(X > 1, Y = 1) + P(X > 1, Y > 1)P(X > Y)
  = (1 − p)p + (1 − p)² P(X > Y)
• P(X > Y) = (1 − p)/(2 − p)   (sketch below)
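A matching Monte Carlo sketch for part (b), consistent with all three derivations above; p = 0.3 is arbitrary.

```python
# Sketch: Monte Carlo check of P(X > Y) = (1 - p)/(2 - p), X, Y ~ Geo(p) independent.
import random

p, n = 0.3, 200_000                  # arbitrary illustration values
random.seed(1)

def geo(p):
    k = 1
    while random.random() >= p:
        k += 1
    return k

wins = sum(geo(p) > geo(p) for _ in range(n))    # two independent draws per trial
print("simulated:", wins / n, " (1-p)/(2-p):", (1 - p) / (2 - p))
```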
Ex. #5 of WS#10
• In the sea battle of Cape of No Return, two cruisers of country Landpower (unluckily) ran into two battleships of country Seapower. With artilleries of shorter range, the two cruisers had no choice other than receiving rounds of bombardment by the two battleships. Suppose that in each round of bombardment, a battleship only aimed at one cruiser, and it sank the cruiser with probability p in a round, 0 < p < 1, independent of everything else. The two battleships fired simultaneously in each round.
Ex. #5 of WS#10
• (a) Suppose that the two battleships first co-operated to sink the same cruiser. Find the expected number of rounds of bombardment taken to sink both of the two cruisers.
• p_c = P(a cruiser was sunk in a round) = 1 − P(a cruiser was not sunk in a round) = 1 − (1 − p)²
• the number of rounds taken to sink a cruiser ~ Geo(p_c), with mean 1/p_c
• expected number of rounds taken to sink two cruisers = 2/p_c
Ex. #5 of WS#10
• (b) Now suppose that initially the two battleships aimed at different cruisers. Each battleship helped the other only if its own targeted cruiser was sunk before the other one.
• (i) What is the probability that the two cruisers were sunk at the same time (i.e., with the same number of rounds of bombardment)?
Ex. #5 of WS#10
• (b)(i). N_i = the number of rounds taken to sink the ith cruiser; N_i ~ Geo(p); N_1 and N_2 are independent
• p_s = P(2 cruisers sunk in the same round) = P(N_1 = N_2)
• discussed before
Ex. #5 of WS#10
• different ways to solve the problem
• by computation:
  P(N_1 = N_2) = Σ_{k=1}^{∞} P(N_1 = N_2 = k) = Σ_{k=1}^{∞} P(N_1 = k, N_2 = k)
  = Σ_{k=1}^{∞} P(N_1 = k)P(N_2 = k) = Σ_{k=1}^{∞} (1 − p)^{k−1} p · (1 − p)^{k−1} p = p/(2 − p)
Ex. #5 of WS#10
• by conditioning:
  • P(N_1 = N_2|N_1 = 1, N_2 = 1) = 1
  • P(N_1 = N_2|N_1 = 1, N_2 > 1) = 0
  • P(N_1 = N_2|N_1 > 1, N_2 = 1) = 0
  • P(N_1 = N_2|N_1 > 1, N_2 > 1) = P(N_1 = N_2)
• P(N_1 = N_2) = P(N_1 = 1, N_2 = 1)(1) + P(N_1 > 1, N_2 > 1)P(N_1 = N_2)
• P(N_1 = N_2) = p² + (1 − p)² P(N_1 = N_2)
• i.e., P(N_1 = N_2) = p/(2 − p)
Ex. #5 of WS#10
• (ii). Find the probability of taking k (∈ {1, 2, …}) rounds to have the first sinking of a cruiser.
• Y = the number of rounds to have the first sinking of a cruiser
• different ways to find P(Y = k)
Ex. #5 of WS#10
• from the tail distribution
  • P(min(N_1, N_2) > k) = P(N_1 > k, N_2 > k) = P(N_1 > k)P(N_2 > k) = (1 − p)^{2k}
  • P(Y = k) = P(Y > k − 1) − P(Y > k)
    = P(min(N_1, N_2) > k − 1) − P(min(N_1, N_2) > k)
    = P(N_1 > k − 1)P(N_2 > k − 1) − P(N_1 > k)P(N_2 > k)
    = (1 − p)^{2k−2} − (1 − p)^{2k}
    = (1 − p)^{2k−2} p(2 − p)
Ex. #5 of WS#10
• from direct calculation
  • P(Y = k) = P(min(N_1, N_2) = k)
    = P(N_1 = k, N_2 = k) + P(N_1 > k, N_2 = k) + P(N_1 = k, N_2 > k)
    = P(N_1 = k)P(N_2 = k) + P(N_1 > k)P(N_2 = k) + P(N_1 = k)P(N_2 > k)
    = (1 − p)^{2k−2} p² + 2(1 − p)^k (1 − p)^{k−1} p
    = (1 − p)^{2k−2} p(2 − p)   (sketch below)
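A simulation sketch of the pmf of Y = min(N_1, N_2); p = 0.4 and the sample size are arbitrary.

```python
# Sketch: check P(Y = k) = (1-p)^(2k-2) * p * (2-p) for Y = min(N1, N2), Ni ~ Geo(p).
import random

p, n = 0.4, 200_000                  # arbitrary illustration values
random.seed(0)

def geo(p):
    k = 1
    while random.random() >= p:
        k += 1
    return k

ys = [min(geo(p), geo(p)) for _ in range(n)]
for k in range(1, 5):
    emp = sum(y == k for y in ys) / n
    exact = (1 - p) ** (2 * k - 2) * p * (2 - p)
    print(f"k={k}: simulated~{emp:.4f}  formula={exact:.4f}")
```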
Ex. #5 of WS#10
• (iii). Find the expected number of rounds taken to have the first sinking of a cruiser
• different ways to find E(Y)
• by direct computation
  E(Y) = Σ_{k=1}^{∞} k·P(Y = k) = Σ_{k=1}^{∞} k(1 − p)^{2k−2} p(2 − p)
  = p(2 − p)/[1 − (1 − p)²]² = 1/[p(2 − p)]
Ex. #5 of WS#10
• for X ≥ 0, E(X) = ∫₀^∞ P(X > s) ds; for integer-valued Y ≥ 0, E(Y) = Σ_{k=0}^{∞} P(Y > k)
• from the tail distribution
  E(Y) = Σ_{k=0}^{∞} P(Y > k) = Σ_{k=0}^{∞} P(min(N_1, N_2) > k)
  = Σ_{k=0}^{∞} P(N_1 > k, N_2 > k) = Σ_{k=0}^{∞} P(N_1 > k)P(N_2 > k)
  = Σ_{k=0}^{∞} (1 − p)^{2k} = 1/[1 − (1 − p)²] = 1/[p(2 − p)]
Ex. #5 of WS#10
• by conditioning
  • E[min(N_1, N_2)|N_1 = 1, N_2 = 1] = 1
  • E[min(N_1, N_2)|N_1 = 1, N_2 > 1] = 1
  • E[min(N_1, N_2)|N_1 > 1, N_2 = 1] = 1
  • E[min(N_1, N_2)|N_1 > 1, N_2 > 1] = 1 + E[min(N_1, N_2)]
  • E[min(N_1, N_2)] = 1 + (1 − p)² E[min(N_1, N_2)]
  • E[min(N_1, N_2)] = 1/[p(2 − p)]   (sketch below)
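A numerical sketch checking that the pmf route, the tail route, and the closed form for E[min(N_1, N_2)] agree; p = 0.4 and the series truncation are arbitrary.

```python
# Sketch: the pmf route, the tail route, and 1/[p(2-p)] agree for E[min(N1, N2)].
p, K = 0.4, 500                      # arbitrary value and series truncation
q = 1 - p

pmf_sum = sum(k * q ** (2 * k - 2) * p * (2 - p) for k in range(1, K))  # sum of k*P(Y=k)
tail_sum = sum(q ** (2 * k) for k in range(K))                          # sum of P(Y>k)
print(pmf_sum, tail_sum, 1 / (p * (2 - p)))    # all three agree
```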
Ex. #5 of WS#10
• (iv). Z = # of rounds taken to sink both cruisers; find E(Z)
• by conditioning
  • E(Z|N_1 = 1, N_2 = 1) = 1
  • E(Z|N_1 = 1, N_2 > 1) = 1 + E(N_2′)
  • E(Z|N_1 > 1, N_2 = 1) = 1 + E(N_1′)
  • E(Z|N_1 > 1, N_2 > 1) = 1 + E(Z)
Ex. #5 of WS#10
• N_1′, N_2′ ~ Geo(1 − (1 − p)²)
• E(Z) = 1 + 2p(1 − p)/[1 − (1 − p)²] + (1 − p)² E(Z)
• solving, E(Z) = (4 − 3p)/[p(2 − p)²]
Ex. #5 of WS#10
• by direct computation
• Z = (# of rounds to have the first sinking) + {0, if both are sunk at the same time; N′, otherwise}
• E(# of rounds to have the first sinking) = 1/[p(2 − p)]
• P(both are sunk in the same round) = p/(2 − p)
• E(N′) = 1/[1 − (1 − p)²]
• solving, E(Z) = (4 − 3p)/[p(2 − p)²]   (simulation sketch below)
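A simulation sketch of part (b)(iv) under the stated firing rules (each battleship fires at its own target until it is sunk, then both fire at the survivor); p = 0.4 and the run count are arbitrary.

```python
# Sketch: simulate the battle of part (b) and compare with E(Z) = (4-3p)/[p(2-p)^2].
import random

p, runs = 0.4, 100_000               # arbitrary illustration values
random.seed(0)

def rounds_to_sink_both(p):
    alive = [True, True]             # cruiser 1, cruiser 2
    rounds = 0
    while any(alive):
        rounds += 1
        if all(alive):               # each battleship shoots its own target
            for i in (0, 1):
                if random.random() < p:
                    alive[i] = False
        else:                        # both battleships shoot the lone survivor
            if any(random.random() < p for _ in range(2)):
                alive = [False, False]
    return rounds

sim = sum(rounds_to_sink_both(p) for _ in range(runs)) / runs
print("simulated E(Z):", sim, " formula:", (4 - 3 * p) / (p * (2 - p) ** 2))
```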
Exercise 3.62 of Ross
• A, B, and C are evenly matched tennis players. Initially A and B play a set, and the winner then plays C. This continues, with the winner always playing the waiting player, until one of the players has won two sets in a row. That player is then declared the overall winner. Find the probability that A is the overall winner.
Exercise 3.62 of Ross
• by direct computation
• convention of a sample point: x₁x₂…xₙ, where x_j is the winner of the jth match
• AA = A first beats B and then beats C
• sample points with A winning the first match and eventually winning the game: AA, ACBAA, ACBACBAA, …
• sample points with A losing the first match but eventually winning the game: BCAA, BCACBAA, BCACBACBAA, …
Exercise 3.62 of Ross
• by direct computation
• P(A wins the game)
  = P({AA, ACBAA, ACBACBAA, …}) + P({BCAA, BCACBAA, BCACBACBAA, …})
  = [(1/2)² + (1/2)⁵ + (1/2)⁸ + …] + [(1/2)⁴ + (1/2)⁷ + (1/2)¹⁰ + …]
  = 2/7 + 1/14 = 5/14
• P(B wins the game) = 5/14
• P(C wins the game) = 1 − 5/14 − 5/14 = 2/7
Exercise 3.62 of Ross
• by conditioning
• p_i = P(i wins eventually), i = A, B, C
• p_A = p_B; p_A + p_B + p_C = 1
• P(A wins the game|AA) = 1; P(A wins the game|BB) = 0
• P(A wins the game|BC) = p_C
• P(A wins the game|AC)
  = P(A wins the game|ACC)P(ACC|AC) + P(A wins the game|ACB)P(ACB|AC)
  = 0 + p_C·(1/2)
• p_A = (1/4)(1) + p_C(1/4) + p_C(1/8) = 1/4 + (3/8)p_C
• solving, p_A = p_B = 5/14; p_C = 2/7   (simulation sketch below)
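A simulation sketch of Exercise 3.62 (not in Ross); each set is modeled as a fair coin since the players are evenly matched.

```python
# Sketch: simulate the tennis game and compare with p_A = 5/14 ~ 0.357.
import random

random.seed(0)
runs = 200_000                       # arbitrary run count

def winner():
    current, waiting = ['A', 'B'], 'C'     # A plays B first; C waits
    last_winner = None
    while True:
        w = random.choice(current)         # evenly matched: fair coin
        loser = current[0] if current[1] == w else current[1]
        if w == last_winner:
            return w                       # won two sets in a row
        last_winner = w
        current, waiting = [w, waiting], loser

wins_A = sum(winner() == 'A' for _ in range(runs))
print("simulated p_A:", wins_A / runs, " 5/14:", 5 / 14)
```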
Stochastic Modeling
• given a problem statement
• formulate the problem by defining events, random variables, etc.
• understand the stochastic mechanism
• deduce means (including probabilities, variances, etc.)
• identify special structure and properties of the stochastic mechanism
[the statement of Ex. #5(b)(i) of WS#10 is repeated alongside as an illustration]
Examples of Ross in Chapter 3
• Examples 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.11, 3.12, 3.13
Exercises of Ross in Chapter 3
• Exercises 3.1, 3.3, 3.5, 3.7, 3.8, 3.14, 3.21, 3.23, 3.24, 3.25, 3.27, 3.29, 3.30, 3.34, 3.37, 3.40, 3.41, 3.44, 3.49, 3.51, 3.54, 3.61, 3.62, 3.63, 3.64, 3.66