15. Poisson Processes
In Lecture 4, we introduced Poisson arrivals as the limiting behavior
of Binomial random variables. (Refer to Poisson approximation of
Binomial random variables.)
From the discussion there (see (4-6)-(4-8), Lecture 4),

  P{"k arrivals occur in an interval of duration Δ"} = e^{-λ} λ^k / k!,   k = 0, 1, 2, …,   (15-1)

where

  λ = np ≃ μT (Δ/T) = μΔ.   (15-2)

Fig. 15.1 (an interval (0, T) containing two nonoverlapping subintervals, of durations Δ and 2Δ, each with k arrivals)

PILLAI
It follows that (refer to Fig. 15.1)

  P{"k arrivals occur in an interval of duration 2Δ"} = e^{-2λ} (2λ)^k / k!,   k = 0, 1, 2, …,   (15-3)

since in that case

  λ₂ = np₁ ≃ μT (2Δ/T) = 2μΔ = 2λ.   (15-4)
From (15-1)-(15-4), Poisson arrivals over an interval form a Poisson random variable whose parameter depends on the duration of that interval. Moreover, because of the Bernoulli nature of the underlying basic random arrivals, events over nonoverlapping intervals are independent. We shall use these two key observations to define a Poisson process formally. (Refer to Example 9-5, Text.)

Definition: X(t) = n(0, t) represents a Poisson process if
(i) the number of arrivals n(t₁, t₂) in an interval (t₁, t₂) of length t = t₂ − t₁ is a Poisson random variable with parameter λt. Thus

  P{n(t₁, t₂) = k} = e^{-λt} (λt)^k / k!,   k = 0, 1, 2, …,   t = t₂ − t₁,   (15-5)
and
(ii) If the intervals (t1, t2) and (t3, t4) are nonoverlapping, then the
random variables n(t1, t2) and n(t3, t4) are independent.
Since n(0, t) ~ P(λt), we have

  E[X(t)] = E[n(0, t)] = λt   (15-6)

and

  E[X²(t)] = E[n²(0, t)] = λt + λ²t².   (15-7)

To determine the autocorrelation function R_XX(t₁, t₂), let t₂ > t₁; then from (ii) above, n(0, t₁) and n(t₁, t₂) are independent Poisson random variables with parameters λt₁ and λ(t₂ − t₁) respectively. Thus

  E[n(0, t₁) n(t₁, t₂)] = E[n(0, t₁)] E[n(t₁, t₂)] = λ²t₁(t₂ − t₁).   (15-8)
But

  n(t₁, t₂) = n(0, t₂) − n(0, t₁) = X(t₂) − X(t₁),

and hence the left side of (15-8) can be rewritten as

  E[X(t₁){X(t₂) − X(t₁)}] = R_XX(t₁, t₂) − E[X²(t₁)].   (15-9)

Using (15-7) in (15-9) together with (15-8), we obtain

  R_XX(t₁, t₂) = λ²t₁(t₂ − t₁) + E[X²(t₁)] = λt₁ + λ²t₁t₂,   t₂ ≥ t₁.   (15-10)

Similarly,

  R_XX(t₁, t₂) = λt₂ + λ²t₁t₂,   t₂ ≤ t₁.   (15-11)

Thus

  R_XX(t₁, t₂) = λ²t₁t₂ + λ min(t₁, t₂).   (15-12)
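As a numerical sanity check (not part of the original lecture), the mean (15-6) and autocorrelation (15-12) can be verified by direct simulation. The sketch below uses only the Python standard library and arbitrary illustrative values λ = 1.5, t₁ = 2, t₂ = 5, building each sample path from exponential inter-arrival gaps:

```python
import random

# Monte Carlo check of (15-6) and (15-12):
#   E[X(t1)] = lam*t1   and   E[X(t1)X(t2)] = lam^2*t1*t2 + lam*min(t1, t2).
def simulate_counts(lam, t1, t2, trials, seed=1):
    rng = random.Random(seed)
    sum_x1 = sum_prod = 0.0
    for _ in range(trials):
        t, n1, n2 = 0.0, 0, 0
        while True:
            t += rng.expovariate(lam)   # exponential inter-arrival times
            if t > t2:
                break
            n2 += 1                     # arrival counted in (0, t2]
            if t <= t1:
                n1 += 1                 # also counted in (0, t1]
        sum_x1 += n1
        sum_prod += n1 * n2
    return sum_x1 / trials, sum_prod / trials

lam, t1, t2 = 1.5, 2.0, 5.0
mean_x1, r_est = simulate_counts(lam, t1, t2, 50_000)
```

The estimates should land near the theoretical values λt₁ = 3 and λ²t₁t₂ + λ min(t₁, t₂) = 25.5.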
From (15-12), notice that the Poisson process X(t) does not represent a wide sense stationary process. Define a binary level process

  Y(t) = (−1)^{X(t)}   (15-13)

that represents a telegraph signal (Fig. 15.2). Notice that the transition instants {tᵢ} are random. (See Example 9-6, Text, for the mean and autocorrelation function of a telegraph signal.)

Fig. 15.2 (Poisson arrival instants tᵢ; the staircase counting process X(t); and the corresponding ±1 telegraph signal Y(t))

Although X(t) does not represent a wide sense stationary process,
its derivative X′(t) does represent a wide sense stationary process.

Fig. 15.3 (the derivative d(·)/dt viewed as an LTI system with input X(t) and output X′(t))

To see this, we can make use of Fig. 14.7 and (14-34)-(14-37). From there

  μ_{X′}(t) = dμ_X(t)/dt = d(λt)/dt = λ, a constant,   (15-14)

and

  R_{XX′}(t₁, t₂) = ∂R_XX(t₁, t₂)/∂t₂ = λ²t₁ + λ U(t₁ − t₂),   (15-15)

  R_{X′X′}(t₁, t₂) = ∂R_{XX′}(t₁, t₂)/∂t₁ = λ² + λ δ(t₁ − t₂).   (15-16)
From (15-14) and (15-16) it follows that X′(t) is a wide sense stationary process. Thus nonstationary inputs to linear systems can lead to wide sense stationary outputs, an interesting observation.
• Sum of Poisson Processes:
If X₁(t) and X₂(t) represent two independent Poisson processes, then their sum X₁(t) + X₂(t) is also a Poisson process with parameter (λ₁ + λ₂)t. (This follows from (6-86), Text, and the definition of the Poisson process in (i) and (ii).)
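The closure of Poisson counts under summation can be checked numerically: convolving the pmfs of two independent Poisson counts reproduces the Poisson pmf with the summed parameter. A small sketch (the parameter values are arbitrary illustrative choices, not from the text):

```python
from math import exp, factorial

def poisson_pmf(k, a):
    return exp(-a) * a ** k / factorial(k)

def conv_pmf(a1, a2, kmax):
    # pmf of X1(t) + X2(t) by direct convolution of the two Poisson pmfs
    return [sum(poisson_pmf(j, a1) * poisson_pmf(k - j, a2) for j in range(k + 1))
            for k in range(kmax + 1)]

a1, a2 = 1.2, 2.3        # lambda1*t and lambda2*t (illustrative values)
summed = conv_pmf(a1, a2, 20)
direct = [poisson_pmf(k, a1 + a2) for k in range(21)]
```

The two lists agree term by term up to floating-point error.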
• Random selection of Poisson Points:
Let t₁, t₂, …, tᵢ, … represent the random arrival points associated with a Poisson process X(t) with parameter λt, and with each arrival point associate an independent Bernoulli random variable Nᵢ, where

  P(Nᵢ = 1) = p,   P(Nᵢ = 0) = q = 1 − p.   (15-17)

Fig. 15.4 (Bernoulli tagging of the Poisson arrival instants t₁, t₂, …, tᵢ, …)
Define the processes

  Y(t) = Σ_{i=1}^{X(t)} Nᵢ ;   Z(t) = Σ_{i=1}^{X(t)} (1 − Nᵢ) = X(t) − Y(t);   (15-18)

we claim that both Y(t) and Z(t) are independent Poisson processes with parameters λpt and λqt respectively.

Proof:

  P{Y(t) = k} = Σ_{n=k}^{∞} P{Y(t) = k | X(t) = n} P{X(t) = n}.   (15-19)

But given X(t) = n, we have Y(t) = Σ_{i=1}^{n} Nᵢ ~ B(n, p), so that

  P{Y(t) = k | X(t) = n} = C(n, k) p^k q^{n−k},   0 ≤ k ≤ n,   (15-20)

and

  P{X(t) = n} = e^{−λt} (λt)^n / n!.   (15-21)

Substituting (15-20)-(15-21) into (15-19), we get
P{Y (t )  k }  e
 t


n k
n!
( n  k )! k !
k
p q
n  k ( t )
n!
n
p k e  t

(t )k
k!


n k
( q t ) n  k
( n  k )!
e qt
 (1 q )  t
k
e
(

pt
)
 ( pt )k
 e   pt
,
k!
k!
~ P ( pt ).
k  0, 1, 2,
(15-22)
More generally,

  P{Y(t) = k, Z(t) = m} = P{Y(t) = k, X(t) − Y(t) = m}
    = P{Y(t) = k, X(t) = k + m}
    = P{Y(t) = k | X(t) = k + m} P{X(t) = k + m}
    = C(k + m, k) p^k q^m e^{−λt} (λt)^{k+m} / (k + m)!
    = [e^{−λpt} (λpt)^k / k!] [e^{−λqt} (λqt)^m / m!]
    = P{Y(t) = k} P{Z(t) = m},   (15-23)
which completes the proof.
Notice that Y(t) and Z(t) are generated as a result of random Bernoulli selections from the original Poisson process X(t) (Fig. 15.5), where each arrival gets tossed over to either Y(t) with probability p or to Z(t) with probability q. Each such sub-arrival stream is also a Poisson process. Thus random selection of Poisson points preserves the Poisson nature of the resulting processes. However, as we shall see, deterministic selection from a Poisson process destroys the Poisson property of the resulting processes.

Fig. 15.5 (the arrival stream of X(t) ~ P(λt) split by Bernoulli selections, with probabilities p and q, into Y(t) ~ P(λpt) and Z(t) ~ P(λqt))
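The identity (15-22) can be confirmed numerically by evaluating the conditioning sum (15-19)-(15-21) directly and comparing it against the claimed P(λpt) pmf; the parameter choices below are arbitrary illustrative values:

```python
from math import exp, comb, factorial

def poisson(k, a):
    return exp(-a) * a ** k / factorial(k)

def thinned_pmf(k, lam_t, p, nmax=60):
    # P{Y(t)=k} from (15-19): condition on X(t)=n, where Y(t)|X(t)=n ~ B(n, p);
    # the sum is truncated at nmax (tail of Poisson(lam_t) beyond 60 is negligible)
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) * poisson(n, lam_t)
               for n in range(k, nmax + 1))

lam_t, p = 4.0, 0.3      # illustrative values for lambda*t and p
lhs = [thinned_pmf(k, lam_t, p) for k in range(15)]
rhs = [poisson(k, p * lam_t) for k in range(15)]   # claimed P(lambda*p*t) pmf
```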
Inter-arrival Distribution for Poisson Processes

Let τ₁ denote the time interval (delay) to the first arrival from any fixed point t₀. To determine the probability distribution of the random variable τ₁, we argue as follows: observe that the event "τ₁ > t" is the same as "n(t₀, t₀ + t) = 0", or the complement event "τ₁ ≤ t" is the same as the event "n(t₀, t₀ + t) > 0".

Fig. 15.6 (the 1st, 2nd, …, nth arrivals after the fixed point t₀, with inter-arrival durations τ₁, τ₂, …)

Hence the distribution function of τ₁ is given by

  F_{τ₁}(t) = P{τ₁ ≤ t} = P{X(t) > 0} = P{n(t₀, t₀ + t) > 0}
            = 1 − P{n(t₀, t₀ + t) = 0} = 1 − e^{−λt}   (15-24)

(use (15-5)), and hence its derivative gives the probability density function for τ₁ to be

  f_{τ₁}(t) = dF_{τ₁}(t)/dt = λe^{−λt},   t ≥ 0,   (15-25)

i.e., τ₁ is an exponential random variable with parameter λ, so that E(τ₁) = 1/λ.
Similarly, let t_n represent the nth random arrival point for a Poisson process. Then

  F_{t_n}(t) = P{t_n ≤ t} = P{X(t) ≥ n} = 1 − P{X(t) < n}
             = 1 − Σ_{k=0}^{n−1} (λt)^k e^{−λt} / k!,   (15-26)

and hence

  f_{t_n}(x) = dF_{t_n}(x)/dx = −Σ_{k=1}^{n−1} [λ(λx)^{k−1}/(k−1)!] e^{−λx} + Σ_{k=0}^{n−1} [λ(λx)^k/k!] e^{−λx}
             = λⁿ x^{n−1} e^{−λx} / (n − 1)!,   x ≥ 0,   (15-27)

which represents a gamma density function, i.e., the waiting time to the nth Poisson arrival instant has a gamma distribution. Moreover,

  t_n = Σ_{i=1}^{n} τᵢ,

where τᵢ is the random inter-arrival duration between the (i − 1)th and ith events. Notice that the τᵢ are independent, identically distributed random variables. Hence, using their characteristic functions, it follows that all inter-arrival durations of a Poisson process are independent exponential random variables with common parameter λ, i.e.,

  f_{τᵢ}(t) = λe^{−λt},   t ≥ 0.   (15-28)
Alternatively, from (15-24)-(15-25), τ₁ is an exponential random variable. By repeating that argument after shifting t₀ to the new point t₁ in Fig. 15.6, we conclude that τ₂ is an exponential random variable. Thus the sequence τ₁, τ₂, …, τₙ, … are independent exponential random variables with common p.d.f. as in (15-25). Thus if we systematically tag every mth outcome of a Poisson process X(t) with parameter λt to generate a new process e(t), then the inter-arrival time between any two events of e(t) is a gamma random variable.
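A quick simulation (not in the original notes) of the systematic tagging just described: the tagged inter-arrival time is a sum of m Exponential(λ) gaps, i.e., Gamma(m, λ), so the sample gaps should show mean m/λ and variance m/λ². Here λ = 2 and m = 3 are arbitrary illustrative choices:

```python
import random
from statistics import mean, variance

# Tag every mth arrival of a rate-lam Poisson stream; each tagged gap
# is the sum of m independent Exponential(lam) inter-arrival times.
def tagged_gaps(lam, m, count, seed=7):
    rng = random.Random(seed)
    return [sum(rng.expovariate(lam) for _ in range(m)) for _ in range(count)]

lam, m = 2.0, 3
gaps = tagged_gaps(lam, m, 40_000)
# sample mean should approach m/lam = 1.5, sample variance m/lam^2 = 0.75
```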
Notice that E[e(t)] = m/λ, and if λ = mμ, then E[e(t)] = 1/μ. The inter-arrival time of e(t) in that case represents an Erlang-m random variable, and e(t) an Erlang-m process (see (10-90), Text).

In summary, if Poisson arrivals are randomly redirected to form new queues, then each such queue generates a new Poisson process (Fig. 15.5). However, if the arrivals are systematically redirected (1st arrival to 1st counter, 2nd arrival to 2nd counter, …, mth to mth, (m+1)st arrival to 1st counter, …), then the new subqueues form Erlang-m processes.

Interestingly, we can also derive the key Poisson properties (15-5) and (15-25) by starting from a simple axiomatic approach, as shown below.
Axiomatic Development of Poisson Processes:

The defining properties of a Poisson process are that in any "small" interval Δt, one event can occur with probability proportional to Δt; further, the probability that two or more events occur in that interval is o(Δt) (higher powers of Δt), and events over nonoverlapping intervals are independent of each other. This gives rise to the following axioms.

Axioms:

  (i)   P{n(t, t + Δt) = 1} = λΔt + o(Δt)
  (ii)  P{n(t, t + Δt) = 0} = 1 − λΔt + o(Δt)
  (iii) P{n(t, t + Δt) ≥ 2} = o(Δt)                                  (15-29)
  (iv)  n(t, t + Δt) is independent of n(0, t).

Notice that axiom (iii) specifies that the events occur singly, and axiom (iv) specifies the randomness of the entire series. Axiom (ii) follows from (i) and (iii) together with the axiom of total probability.
We shall use these axioms to rederive (15-25) first. Let t₀ be any fixed point (see Fig. 15.6) and let t₀ + τ₁ represent the time of the first arrival after t₀. Notice that the random variable τ₁ is independent of the occurrences prior to the instant t₀ (axiom (iv)). With F_{τ₁}(t) = P{τ₁ ≤ t} representing the distribution function of τ₁, as in (15-24) define Q(t) = 1 − F_{τ₁}(t) = P{τ₁ > t}. Then for Δt ≥ 0,

  Q(t + Δt) = P{τ₁ > t + Δt}
            = P{τ₁ > t, and no event occurs in (t₀ + t, t₀ + t + Δt)}
            = P{τ₁ > t, n(t₀ + t, t₀ + t + Δt) = 0}
            = P{n(t₀ + t, t₀ + t + Δt) = 0 | τ₁ > t} P{τ₁ > t}.

From axiom (iv), the conditional probability in the above expression is not affected by the event {τ₁ > t}, which refers to {n(t₀, t₀ + t) = 0}, i.e., to events before t₀ + t, and hence the unconditional probability in axiom (ii) can be used there. Thus

  Q(t + Δt) = [1 − λΔt + o(Δt)] Q(t),

or

  lim_{Δt→0} [Q(t + Δt) − Q(t)] / Δt = Q′(t) = −λQ(t)   ⇒   Q(t) = c e^{−λt}.
But c  Q(0)  P{1  0}  1 so that
Q(t )  1  F1 (t )  e t
or
F1 (t )  1  e t ,
t 0
which gives
f1 (t ) 
dF1 (t )
dt
  e  t ,
t0
(15-30)
to be the p.d.f of  1 as in (15-25).
Similarly, (15-5) can be derived from axioms (i)-(iv) in (15-29) as well. To see this, let

  p_k(t) = P{n(0, t) = k},   k = 0, 1, 2, …,

represent the probability that the total number of arrivals in the interval (0, t) equals k. Then

  p_k(t + Δt) = P{n(0, t + Δt) = k} = P{X₁ ∪ X₂ ∪ X₃},
where the events

  X₁ = "n(0, t) = k, and n(t, t + Δt) = 0",
  X₂ = "n(0, t) = k − 1, and n(t, t + Δt) = 1",
  X₃ = "n(0, t) = k − i, and n(t, t + Δt) = i, i ≥ 2"

are mutually exclusive. Thus

  p_k(t + Δt) = P(X₁) + P(X₂) + P(X₃).

But as before,

  P(X₁) = P{n(t, t + Δt) = 0 | n(0, t) = k} P{n(0, t) = k}
        = P{n(t, t + Δt) = 0} P{n(0, t) = k} ≃ (1 − λΔt) p_k(t),

  P(X₂) = P{n(t, t + Δt) = 1 | n(0, t) = k − 1} P{n(0, t) = k − 1} ≃ λΔt p_{k−1}(t),

and

  P(X₃) ≃ 0,
where once again we have made use of axioms (i)-(iv) in (15-29). This gives

  p_k(t + Δt) = (1 − λΔt) p_k(t) + λΔt p_{k−1}(t),

or, with

  lim_{Δt→0} [p_k(t + Δt) − p_k(t)] / Δt = p_k′(t),

we get the differential equation

  p_k′(t) = −λ p_k(t) + λ p_{k−1}(t),   k = 0, 1, 2, …,

whose solution gives (15-5). Here p_{−1}(t) ≡ 0. [The solution to the above differential equation is worked out in (16-36)-(16-41), Text.] This completes the axiomatic development for Poisson processes.
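The differential recursion above can be integrated numerically and compared with the Poisson pmf (15-5); the sketch below uses a simple forward-Euler step (an illustrative check, not part of the lecture, with arbitrary λ and t):

```python
from math import exp, factorial

def solve_pk(lam, t_end, kmax, steps=20_000):
    # forward-Euler integration of p_k'(t) = -lam*p_k(t) + lam*p_{k-1}(t)
    dt = t_end / steps
    p = [1.0] + [0.0] * kmax            # initial condition: p_0(0)=1, p_k(0)=0
    for _ in range(steps):
        # the comprehension reads the old list p, so all states update together
        p = [p[k] + dt * lam * ((p[k - 1] if k else 0.0) - p[k])
             for k in range(kmax + 1)]
    return p

lam, t_end = 1.0, 2.0
numeric = solve_pk(lam, t_end, 10)
exact = [exp(-lam * t_end) * (lam * t_end) ** k / factorial(k) for k in range(11)]
```

The Euler error is O(dt), so the two lists agree to a few parts in 10⁵ here.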
Poisson Departures between Exponential Inter-arrivals

Let X(t) ~ P(λt) and Y(t) ~ P(μt) represent two independent Poisson processes, called the arrival and departure processes.

Fig. 15.7 (arrivals of X(t) and departures of Y(t) on a common time axis; Z denotes the interval between two successive arrivals of X(t))

Let Z represent the random interval between any two successive arrivals of X(t). From (15-28), Z has an exponential distribution with parameter λ. Let N represent the number of "departures" of Y(t) between any two successive arrivals of X(t). Then, from the Poisson nature of the departures, we have

  P{N = k | Z = t} = e^{−μt} (μt)^k / k!.

Thus
P{N  k }   0 P{N  k | Z  t} f Z (t )dt

 0 e

  t (  t )k
k!
 e  t dt

 k !  0 (  t ) k e  (    ) t dt
k
    1  k x
   
x e dx


k
!
0
   
k!
k
       ,
       
k  0, 1, 2,
(15-31)
i.e., the random variable N has a geometric distribution. Thus if
customers come in and get out according to two independent
Poisson processes at a counter, then the number of arrivals between
any two departures has a geometric distribution. Similarly the
number of departures between any two arrivals also represents
another geometric distribution.
21
PILLAI
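The integral in (15-31) can be checked by direct numerical quadrature against the geometric pmf; λ = 1 and μ = 2 below are arbitrary illustrative rates:

```python
from math import exp, factorial

# midpoint-rule quadrature of the integrand in (15-31):
#   integral of e^{-mu t}(mu t)^k/k! * lam e^{-lam t} dt over t >= 0
def p_N_k(lam, mu, k, T=60.0, steps=100_000):
    h = T / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h     # midpoint of the ith subinterval
        total += exp(-mu * t) * (mu * t) ** k / factorial(k) * lam * exp(-lam * t)
    return total * h

lam, mu = 1.0, 2.0
numeric = [p_N_k(lam, mu, k) for k in range(6)]
geometric = [(lam / (lam + mu)) * (mu / (lam + mu)) ** k for k in range(6)]
```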
Stopping Times, Coupon Collecting, and Birthday Problems

Suppose a cereal manufacturer inserts a sample of one type of coupon randomly into each cereal box, and suppose there are n such distinct types of coupons. One interesting question is: how many boxes of cereal should one buy on the average in order to collect at least one coupon of each kind?

We shall reformulate the above problem in terms of Poisson processes. Let X₁(t), X₂(t), …, Xₙ(t) represent n independent identically distributed Poisson processes with common parameter λt. Let t_{i1}, t_{i2}, … represent the first, second, … random arrival instants of the process Xᵢ(t), i = 1, 2, …, n. They will correspond to the first, second, … appearance of the ith type coupon in the above problem. Let

  X(t) = Σ_{i=1}^{n} Xᵢ(t),   (15-32)

so that the sum X(t) is also a Poisson process with parameter μt, where

  μ = nλ.   (15-33)
From Fig. 15.8, 1/λ represents the average inter-arrival duration between any two arrivals of Xᵢ(t), i = 1, 2, …, n, whereas 1/μ represents the average inter-arrival time for the combined sum process X(t) in (15-32).

Fig. 15.8 (the merged arrival stream of X(t) with inter-arrival durations Y₁, Y₂, Y₃, …, Y_k; the first arrivals t₁₁, t₂₁, t₃₁, …, t_{n1} of the component processes; the Nth arrival of X(t) marks the stopping time T)

Define the stopping time T to be that random time instant by which at least one arrival of each of X₁(t), X₂(t), …, Xₙ(t) has occurred. Clearly, we have

  T = max(t₁₁, t₂₁, …, t_{i1}, …, t_{n1}).   (15-34)

But from (15-25), t_{i1}, i = 1, 2, …, n, are independent exponential random variables with common parameter λ. This gives

  F_T(t) = P{T ≤ t} = P{max(t₁₁, t₂₁, …, t_{n1}) ≤ t}
         = P{t₁₁ ≤ t, t₂₁ ≤ t, …, t_{n1} ≤ t}
         = P{t₁₁ ≤ t} P{t₂₁ ≤ t} ⋯ P{t_{n1} ≤ t} = [F_{t_{i1}}(t)]ⁿ.
Thus

  F_T(t) = (1 − e^{−λt})ⁿ,   t ≥ 0,   (15-35)

represents the probability distribution function of the stopping time random variable T in (15-34). To compute its mean, we can make use of Eqs. (5-52)-(5-53), Text, valid for nonnegative random variables. From (15-35) we get

  P(T > t) = 1 − F_T(t) = 1 − (1 − e^{−λt})ⁿ,   t ≥ 0,

so that

  E{T} = ∫₀^∞ P(T > t) dt = ∫₀^∞ {1 − (1 − e^{−λt})ⁿ} dt.   (15-36)

Let 1 − e^{−λt} = x, so that λe^{−λt} dt = dx, or dt = dx / (λ(1 − x)), and

  E{T} = (1/λ) ∫₀¹ (1 − xⁿ)/(1 − x) dx = (1/λ) ∫₀¹ (1 + x + x² + ⋯ + x^{n−1}) dx = (1/λ) Σ_{k=1}^{n} 1/k.
1 1 1
1   
 2 3
1
n(ln n   )
E{T } 

0.5772157
where 
1
n
1 1
    1   
n  2 3
1
 
n
(15-37)
is the Euler’s constant1.
Let the random variable N denote the total number of all arrivals up to the stopping time T, and Y_k, k = 1, 2, …, N, the inter-arrival random variables associated with the sum process X(t) (see Fig. 15.8).

¹Euler's constant: The sequence {1 + 1/2 + 1/3 + ⋯ + 1/n − ln n} converges, since with

  uₙ = 1/n − ln((n + 1)/n) = ∫₀¹ [1/n − 1/(n + x)] dx = ∫₀¹ x / (n(n + x)) dx,   (1)

we have 0 ≤ uₙ ≤ ∫₀¹ dx/n² = 1/n², so that Σ_{n=1}^{∞} uₙ ≤ Σ_{n=1}^{∞} 1/n² = π²/6, and the series Σ uₙ converges to some number γ ≥ 0. From (1) we also obtain

  Σ_{k=1}^{n} u_k = Σ_{k=1}^{n} 1/k − ln(n + 1),

so that

  lim_{n→∞} {1 + 1/2 + 1/3 + ⋯ + 1/n − ln n} = lim_{n→∞} [Σ_{k=1}^{n} u_k + ln((n + 1)/n)] = Σ_{k=1}^{∞} u_k = γ ≃ 0.5772157.
Then we obtain the key relation

  T = Σ_{i=1}^{N} Yᵢ,   (15-38)

so that

  E{T | N = n} = E{Σ_{i=1}^{n} Yᵢ | N = n} = E{Σ_{i=1}^{n} Yᵢ} = n E{Yᵢ},   (15-39)

since {Yᵢ} and N are independent random variables, and hence

  E{T} = E[E{T | N = n}] = E{N} E{Yᵢ}.   (15-40)

But Yᵢ ~ Exponential(μ), so that E{Yᵢ} = 1/μ, and substituting this into (15-37) and (15-40), we obtain

  E{N} = μE{T} ≃ n(ln n + γ).   (15-41)

Thus on the average a customer should buy about n ln n, or slightly more, boxes to guarantee that at least one coupon of each type has been collected.
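A direct simulation of the original cereal-box formulation confirms the n(ln n + γ) estimate in (15-41); this is an illustrative check (trial count and seed are arbitrary), not part of the lecture:

```python
import random
from math import log

# coupon collector: boxes bought until all n coupon types have appeared
def boxes_needed(n, rng):
    seen, boxes = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))   # each box holds one uniformly random type
        boxes += 1
    return boxes

rng = random.Random(3)
n, trials = 365, 1000
avg = sum(boxes_needed(n, rng) for _ in range(trials)) / trials
gamma = 0.5772157
approx = n * (log(n) + gamma)        # about 2364 for n = 365
```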
Next, consider a slight generalization of the above problem: what if two kinds of coupons (each of n types) are mixed up, and the objective is to collect one complete set of coupons of either kind?

Let Xᵢ(t) and Yᵢ(t), i = 1, 2, …, n, represent the two kinds of coupons (independent Poisson processes) that have been mixed up to form a single Poisson process Z(t) with normalized parameter unity, i.e.,

  Z(t) = Σ_{i=1}^{n} [Xᵢ(t) + Yᵢ(t)] ~ P(t).   (15-42)

As before, let t_{i1}, t_{i2}, … represent the first, second, … arrivals of the process Xᵢ(t), and τ_{i1}, τ_{i2}, … represent the first, second, … arrivals of the process Yᵢ(t), i = 1, 2, …, n.
The stopping time T₁ in this case represents that random instant at which either all X-type or all Y-type coupons have occurred at least once. Thus

  T₁ = min{X, Y},   (15-43)

where

  X = max(t₁₁, t₂₁, …, t_{n1})   (15-44)

and

  Y = max(τ₁₁, τ₂₁, …, τ_{n1}).   (15-45)

Notice that the random variables X and Y have the same distribution as in (15-35), with λ replaced by 1/2n (since μ = 1 and there are 2n independent and identical processes in (15-42)), and hence

  F_X(t) = F_Y(t) = (1 − e^{−t/2n})ⁿ,   t ≥ 0.   (15-46)

Using (15-43), we get
  F_{T₁}(t) = P(T₁ ≤ t) = P(min{X, Y} ≤ t)
            = 1 − P(min{X, Y} > t) = 1 − P(X > t, Y > t)
            = 1 − P(X > t) P(Y > t)
            = 1 − (1 − F_X(t))(1 − F_Y(t))
            = 1 − {1 − (1 − e^{−t/2n})ⁿ}²,   t ≥ 0,   (15-47)

to be the probability distribution function of the new stopping time T₁. Also, as in (15-36),

  E{T₁} = ∫₀^∞ P(T₁ > t) dt = ∫₀^∞ {1 − F_{T₁}(t)} dt = ∫₀^∞ {1 − (1 − e^{−t/2n})ⁿ}² dt.

Let 1 − e^{−t/2n} = x, or (1/2n) e^{−t/2n} dt = dx, dt = 2n dx/(1 − x). Then

  E{T₁} = 2n ∫₀¹ (1 − xⁿ)² dx/(1 − x) = 2n ∫₀¹ [(1 − xⁿ)/(1 − x)] (1 − xⁿ) dx
        = 2n ∫₀¹ (1 + x + x² + ⋯ + x^{n−1})(1 − xⁿ) dx.
1 n 1
n 1
k 0
k 0
E{T1 }  2n  0 (  x k   x n  k ) dx
 2n
 11213  n1    n11 n1 2
1
2n

2n(ln( n / 2)   ).
(15-48)
Once again the total number of random arrivals N up to T1 is related
as in (15-38) where Yi ~ Exponential (1), and hence using (15-40) we
get the average number of total arrivals up to the stopping time to be
E{N }  E{T1 } 2n(ln( n / 2)   ).
(15-49)
We can generalize the stopping times in yet another way:
Poisson Quotas
Let
n
X (t )   X i (t ) ~ P( t )
i 1
where Xi(t) are independent, identically distributed Poisson
(15-50)
30
PILLAI
processes with common parameter i t so that   1  2    n .
Suppose integers m₁, m₂, …, mₙ represent the preassigned numbers of arrivals (quotas) required for the processes X₁(t), X₂(t), …, Xₙ(t), in the sense that when mᵢ arrivals of the process Xᵢ(t) have occurred, the process Xᵢ(t) has satisfied its "quota" requirement.

The stopping time T in this case is that random time instant at which any r processes have met their quota requirement, where r ≤ n is given. The problem is to determine the probability density function of the stopping time random variable T, and to determine the mean and variance of the total number of random arrivals N up to the stopping time T.

Solution: As before, let t_{i1}, t_{i2}, … represent the first, second, … arrivals of the ith process Xᵢ(t), and define

  Y_{ij} = t_{ij} − t_{i,j−1}.   (15-51)

Notice that the inter-arrival times Y_{ij} are independent, exponential random variables with parameter λᵢ, and hence

  t_{i,mᵢ} = Σ_{j=1}^{mᵢ} Y_{ij} ~ Gamma(mᵢ, λᵢ).
Define Tᵢ to be the stopping time for the ith process, i.e., the occurrence instant of its mᵢth arrival. Thus

  Tᵢ = t_{i,mᵢ} ~ Gamma(mᵢ, λᵢ),   i = 1, 2, …, n,   (15-52)

or

  f_{Tᵢ}(t) = λᵢ^{mᵢ} t^{mᵢ−1} e^{−λᵢt} / (mᵢ − 1)!,   t ≥ 0.   (15-53)

Since the n processes in (15-50) are independent, the associated stopping times Tᵢ, i = 1, 2, …, n, defined in (15-52) are also independent random variables, a key observation.

Given independent gamma random variables T₁, T₂, …, Tₙ as in (15-52)-(15-53), we form their order statistics:

  T₍₁₎ ≤ T₍₂₎ ≤ ⋯ ≤ T₍ᵣ₎ ≤ ⋯ ≤ T₍ₙ₎.   (15-54)

Note that the two extremes in (15-54) represent

  T₍₁₎ = min(T₁, T₂, …, Tₙ)   (15-55)

and

  T₍ₙ₎ = max(T₁, T₂, …, Tₙ).   (15-56)
The desired stopping time T, when r processes have satisfied their quota requirement, is given by the rth order statistic T₍ᵣ₎. Thus

  T = T₍ᵣ₎,   (15-57)

where T₍ᵣ₎ is as in (15-54). We can use (7-14), Text, to compute the probability density function of T. From there, the probability density function of the stopping time random variable T in (15-57) is given by

  f_T(t) = [n! / ((r − 1)!(n − r)!)] F_{Tᵢ}^{r−1}(t) [1 − F_{Tᵢ}(t)]^{n−r} f_{Tᵢ}(t),   (15-58)

where F_{Tᵢ}(t) is the distribution of the i.i.d. random variables Tᵢ and f_{Tᵢ}(t) their density function given in (15-53). Integrating (15-53) by parts as in (4-37)-(4-38), Text, we obtain

  F_{Tᵢ}(t) = 1 − Σ_{k=0}^{mᵢ−1} (λᵢt)^k e^{−λᵢt} / k!,   t ≥ 0.   (15-59)

Together with (15-53) and (15-59), Eq. (15-58) completely specifies the density function of the stopping time random variable T,
where the quota requirements for r types of arrivals have been satisfied.

If N represents the total number of all random arrivals up to T, then arguing as in (15-38)-(15-40) we get

  T = Σ_{i=1}^{N} Yᵢ,   (15-60)

where the Yᵢ are the inter-arrival intervals for the process X(t), and hence

  E{T} = E{N} E{Yᵢ} = (1/μ) E{N};   (15-61)

with normalized mean value μ (= 1) for the sum process X(t), we get

  E{N} = E{T}.   (15-62)

To relate the higher order moments of N and T we can use their characteristic functions. From (15-60),

  E{e^{jωT}} = E{e^{jω Σ_{i=1}^{N} Yᵢ}} = E{E[e^{jω Σ_{i=1}^{n} Yᵢ} | N = n]} = E[(E{e^{jωYᵢ}})ⁿ].   (15-63)
But Yᵢ ~ Exponential(1) and independent of N, so that

  E{e^{jωYᵢ} | N = n} = E{e^{jωYᵢ}} = 1/(1 − jω),

and hence from (15-63)

  E{e^{jωT}} = Σ_{n=0}^{∞} [E{e^{jωYᵢ}}]ⁿ P(N = n) = E{[1/(1 − jω)]^N} = E{(1 − jω)^{−N}},

which gives (expanding both sides)

  Σ_{k=0}^{∞} [(jω)^k / k!] E{T^k} = Σ_{k=0}^{∞} [(jω)^k / k!] E{N(N + 1)⋯(N + k − 1)},

or

  E{T^k} = E{N(N + 1)⋯(N + k − 1)},   (15-64)

a key identity. From (15-62) and (15-64), we get

  var{N} = var{T} − E{T}.   (15-65)
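The key identity (15-64), together with E{N} = E{T} from (15-62), can be spot-checked by simulation. Below, N is drawn as a Poisson count with an arbitrary illustrative mean ν = 3 and T is assembled from unit-rate exponential gaps, so E{T²} should match the rising-factorial moment E{N(N+1)}:

```python
import random

rng = random.Random(11)
nu, trials = 3.0, 100_000

def draw_poisson(a):
    # count unit-rate exponential arrivals falling in [0, a]
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(1.0)
        if t > a:
            return n
        n += 1

Ns, Ts = [], []
for _ in range(trials):
    N = draw_poisson(nu)
    T = sum(rng.expovariate(1.0) for _ in range(N))   # T = Y_1 + ... + Y_N
    Ns.append(N)
    Ts.append(T)

ET   = sum(Ts) / trials                       # should be near E{N} = nu
ET2  = sum(t * t for t in Ts) / trials        # E{T^2}
ENN1 = sum(n * (n + 1) for n in Ns) / trials  # E{N(N+1)}, the k = 2 case
```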
As an application of the Poisson quota problem, we can reexamine the birthday pairing problem discussed in Example 2-20, Text.

"Birthday Pairing" as Poisson Processes:

In the birthday pairing case (refer to Example 2-20, Text), we may assume that the n = 365 possible birthdays in a year correspond to n independent identically distributed Poisson processes, each with parameter t/n, and in that context each individual is an "arrival" corresponding to his/her particular "birthday process". It follows that the birthday pairing problem (i.e., two people having the same birth date) corresponds to the first occurrence of the 2nd return for any one of the 365 processes. Hence

  m₁ = m₂ = ⋯ = mₙ = 2,   r = 1,   (15-66)

so that from (15-52), for each process,

  Tᵢ ~ Gamma(2, 1/n).   (15-67)

Since λᵢ = μ/n = 1/n, from (15-57) and (15-66) the stopping time in this case satisfies

  T = min(T₁, T₂, …, Tₙ).   (15-68)
Thus the distribution function for the "birthday pairing" stopping time turns out to be

  F_T(t) = P{T ≤ t} = 1 − P{T > t}
         = 1 − P{min(T₁, T₂, …, Tₙ) > t}
         = 1 − [P{Tᵢ > t}]ⁿ = 1 − [1 − F_{Tᵢ}(t)]ⁿ
         = 1 − (1 + t/n)ⁿ e^{−t},   (15-69)

where we have made use of (15-59) with mᵢ = 2 and λᵢ = 1/n.

As before, let N represent the number of random arrivals up to the stopping time T. Notice that in this case N represents the number of people required in a crowd for at least two people to have the same birth date. Since T and N are related as in (15-60), using (15-62) we get

  E{N} = E{T} = ∫₀^∞ P(T > t) dt = ∫₀^∞ {1 − F_T(t)} dt = ∫₀^∞ (1 + t/n)ⁿ e^{−t} dt.   (15-70)

To obtain an approximate value for the above integral, we can expand ln(1 + t/n)ⁿ = n ln(1 + t/n) in a Taylor series. This gives
  ln(1 + t/n)ⁿ = n ln(1 + t/n) = n (t/n − t²/2n² + t³/3n³ − ⋯) = t − t²/2n + t³/3n² − ⋯,

and hence

  (1 + t/n)ⁿ = e^{t − t²/2n + t³/3n² − ⋯},

so that

  (1 + t/n)ⁿ e^{−t} = e^{−t²/2n + t³/3n² − ⋯} ≃ e^{−t²/2n} (1 + t³/3n²),   (15-71)
and substituting (15-71) into (15-70) we get the mean number of people in a crowd for a two-person birthday coincidence to be

  E{N} ≃ ∫₀^∞ e^{−t²/2n} (1 + t³/3n²) dt
       = ∫₀^∞ e^{−t²/2n} dt + (1/3n²) ∫₀^∞ t³ e^{−t²/2n} dt
       = √(πn/2) + (1/3n²) · 2n² ∫₀^∞ x e^{−x} dx = √(πn/2) + 2/3 ≃ 24.612.   (15-72)

On comparing (15-72) with the mean value obtained in Lecture 6 (Eq. (6-60)) using entirely different arguments (E{X} ≃ 24.44),
we observe that the two results are essentially equal. Notice that the probability that there will be a coincidence among 24-25 people is about 0.55. To compute the variance of T and N we can make use of the expression (see (15-64))

  E{N(N + 1)} = E{T²} = ∫₀^∞ 2t P(T > t) dt = ∫₀^∞ 2t (1 + t/n)ⁿ e^{−t} dt
              ≃ 2 ∫₀^∞ t e^{−t²/2n} dt + (2/3n²) ∫₀^∞ t⁴ e^{−t²/2n} dt
              = 2n ∫₀^∞ e^{−x} dx + (2/3n²)(3n²/2) √(2πn) = 2n + √(2πn),   (15-73)

which gives (use (15-65))

  σ_T ≃ 13.12,   σ_N ≃ 12.146.   (15-74)

The high values of the standard deviations indicate that in reality the crowd size could vary considerably around the mean value. Unlike Example 2-20 in the Text, the method developed here can be used to derive the distribution and average value for a variety of "birthday coincidence" problems.
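The numbers in (15-72) and (15-74) can be reproduced by simulating the crowd directly (an illustrative check with an arbitrary seed, not part of the original notes): people draw uniform birthdays until the first repeat.

```python
import random
from statistics import stdev

# crowd size at the moment the first shared birthday appears
def crowd_until_match(n, rng):
    seen, count = set(), 0
    while True:
        count += 1
        day = rng.randrange(n)     # uniform birthday
        if day in seen:
            return count
        seen.add(day)

rng = random.Random(5)
sizes = [crowd_until_match(365, rng) for _ in range(20_000)]
avg = sum(sizes) / len(sizes)      # should be near 24.6, cf. (15-72)
sd = stdev(sizes)                  # should be near 12, cf. (15-74)
```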
Three-person birthday coincidence:

For example, if we are interested in the average crowd size where three people have the same birthday, then arguing as above we obtain

  m₁ = m₂ = ⋯ = mₙ = 3,   r = 1,   (15-75)

so that

  Tᵢ ~ Gamma(3, 1/n)   (15-76)

and T is as in (15-68), which gives

  F_T(t) = 1 − [1 − F_{Tᵢ}(t)]ⁿ = 1 − (1 + t/n + t²/2n²)ⁿ e^{−t},   t ≥ 0   (15-77)

(use (15-59) with mᵢ = 3, λᵢ = 1/n) to be the distribution of the stopping time in this case. As before, the average crowd size for a three-person birthday coincidence equals

  E{N} = E{T} = ∫₀^∞ P(T > t) dt = ∫₀^∞ (1 + t/n + t²/2n²)ⁿ e^{−t} dt.
By Taylor series expansion,

  ln(1 + t/n + t²/2n²) = (t/n + t²/2n²) − (1/2)(t/n + t²/2n²)² + (1/3)(t/n + t²/2n²)³ − ⋯ = t/n − t³/6n³ + ⋯,

so that

  E{N} ≃ ∫₀^∞ e^{−t³/6n²} dt = (6n²)^{1/3} (1/3) ∫₀^∞ x^{1/3 − 1} e^{−x} dx
       = 6^{1/3} Γ(4/3) n^{2/3} ≃ 82.85.   (15-78)

Thus for a three-person birthday coincidence the average crowd size should be around 82 (which corresponds to 0.44 probability). Notice that other generalizations, such as "two distinct birthdays each having a pair of coincidences" (mᵢ = 2, r = 2), can be easily worked out in the same manner.

We conclude this discussion with the other extreme case, where the crowd size needs to be determined so that "all days in the year are birthdays" among the persons in a crowd.
All days are birthdays:

Once again, from the above analysis, in this case we have

  m₁ = m₂ = ⋯ = mₙ = 1,   r = n = 365,   (15-79)

so that the stopping time T satisfies

  T = max(T₁, T₂, …, Tₙ),   (15-80)

where the Tᵢ are independent exponential random variables with common parameter λ = 1/n. This situation is similar to the coupon collecting problem discussed in (15-32)-(15-34), and from (15-35) the distribution function of T in (15-80) is given by

  F_T(t) = (1 − e^{−t/n})ⁿ,   t ≥ 0,   (15-81)

and the mean values of T and N are given by (see (15-37)-(15-41))

  E{N} = E{T} = n(ln n + γ) ≃ 2,364.14.   (15-82)

Thus for "every day to be a birthday" for someone in a crowd, the average crowd size should be about 2,364, in which case there is 0.57 probability that the event actually happens. For a more detailed analysis of this problem using Markov chains, refer to Examples 15-12 and 15-18 in Chapter 15, Text. From there (see Eq. (15-80), Text), to be quite certain (with 0.98 probability) that all 365 days are birthdays, the crowd size should be around 3,500.
Bulk Arrivals and Compound Poisson Processes

In an ordinary Poisson process X(t), only one event occurs at any arrival instant (Fig. 15.9a). Instead, suppose a random number of events Cᵢ occurs simultaneously as a cluster at every arrival instant of a Poisson process (Fig. 15.9b). If X(t) represents the total number of all occurrences in the interval (0, t), then X(t) represents a compound Poisson process, or a bulk arrival process. Inventory orders, arrivals at an airport queue, tickets purchased for a show, etc., follow this process (when things happen, they happen in bulk, or a bunch of items is involved).

Fig. 15.9 ((a) an ordinary Poisson process with single arrivals at t₁, t₂, …, tₙ; (b) a compound Poisson process with cluster sizes such as C₁ = 3, C₂ = 2, …, Cᵢ = 4 at the arrival instants)

Let

  p_k = P{Cᵢ = k},   k = 0, 1, 2, …,   (15-83)
represent the common probability mass function for the number of occurrences in any cluster Cᵢ. Then the compound process X(t) satisfies

  X(t) = Σ_{i=1}^{N(t)} Cᵢ,   (15-84)

where N(t) represents an ordinary Poisson process with parameter λ. Let

  P(z) = E{z^{Cᵢ}} = Σ_{k=0}^{∞} p_k z^k   (15-85)

represent the moment generating function associated with the cluster statistics in (15-83). Then the moment generating function of the compound Poisson process X(t) in (15-84) is given by

  Φ_X(z) = Σ_{n=0}^{∞} zⁿ P{X(t) = n} = E{z^{X(t)}}
         = E{E[z^{X(t)} | N(t) = k]} = E[E{z^{Σ_{i=1}^{k} Cᵢ} | N(t) = k}]
         = Σ_{k=0}^{∞} (E{z^{Cᵢ}})^k P{N(t) = k}
         = Σ_{k=0}^{∞} P^k(z) e^{−λt} (λt)^k / k! = e^{−λt(1 − P(z))}.   (15-86)
If we let

  P^k(z) = (Σ_{n=0}^{∞} p_n z^n)^k = Σ_{n=0}^{∞} p_n^{(k)} z^n,   (15-87)

where {p_n^{(k)}} represents the k-fold convolution of the sequence {p_n} with itself, we obtain

  P{X(t) = n} = Σ_{k=0}^{∞} e^{−λt} [(λt)^k / k!] p_n^{(k)},   (15-88)

which follows by substituting (15-87) into (15-86). Eq. (15-88) represents the probability that there are n arrivals in the interval (0, t) for a compound Poisson process X(t).

Substituting (15-85) into (15-86), we can rewrite Φ_X(z) also as

  Φ_X(z) = e^{−λ₁t(1−z)} e^{−λ₂t(1−z²)} ⋯ e^{−λ_k t(1−z^k)} ⋯,   (15-89)

where λ_k = p_k λ, which shows that the compound Poisson process can be expressed as the sum of integer-scaled independent Poisson processes m₁(t), m₂(t), …. Thus

  X(t) = Σ_{k=1}^{∞} k m_k(t).   (15-90)
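Eq. (15-88) can be turned into a short computation: truncate the k-sum, build the k-fold convolutions iteratively, and check that the resulting pmf is proper and has mean λt·E{C}. The cluster pmf below is an arbitrary illustrative choice, not from the text:

```python
from math import exp, factorial

def conv(a, b):
    # discrete convolution of two equal-length pmf vectors, truncated to len(a)
    n = len(a)
    return [sum(a[j] * b[i - j] for j in range(i + 1)) for i in range(n)]

def compound_pmf(lam_t, cluster_p, nmax, kmax=60):
    # P{X(t)=n} from (15-88): sum over k of e^{-lam t}(lam t)^k/k! * p_n^{(k)}
    p = (list(cluster_p) + [0.0] * (nmax + 1))[: nmax + 1]
    pk = [1.0] + [0.0] * nmax              # 0-fold convolution: unit mass at 0
    pmf = [exp(-lam_t) * v for v in pk]    # k = 0 term
    for k in range(1, kmax + 1):
        pk = conv(pk, p)                   # k-fold convolution p_n^{(k)}
        w = exp(-lam_t) * lam_t ** k / factorial(k)
        pmf = [s + w * v for s, v in zip(pmf, pk)]
    return pmf

cluster = [0.0, 0.5, 0.3, 0.2]   # illustrative cluster-size pmf for C_i = 0..3
lam_t = 2.0
pmf = compound_pmf(lam_t, cluster, 40)
mean_C = sum(n * pn for n, pn in enumerate(cluster))   # E{C} = 1.7
mean_X = sum(n * pn for n, pn in enumerate(pmf))       # should equal lam_t*E{C}
```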
More generally, every linear combination of independent Poisson processes represents a compound Poisson process (see Eqs. (10-120)-(10-124), Text).

Here is an interesting problem involving compound Poisson processes and coupon collecting: suppose a cereal manufacturer inserts either one or two coupons randomly (from a set consisting of n types of coupons) into every cereal box. How many boxes should one buy on the average to collect at least one coupon of each type? We leave it to the reader to work out the details.

PILLAI