Queueing Theory (Delay Models)


Introduction

Total delay of the i-th customer in the system:
Ti = Wi + τi
where Wi is the waiting time in queue and τi the service time of customer i.

Notation:
N(t)  : the number of customers in the system at time t
Nq(t) : the number of customers in the queue
Ns(t) : the number of customers in service
N     : the average number of customers in the system
τ     : the service time
T     : the total delay in the system
λ     : the customer arrival rate [#/sec]
Little's Theorem

E[N] = λ E[T]

Number of customers in the system at time t:
N(t) = A(t) - D(t)
where
A(t) : the number of customer arrivals up to time t
D(t) : the number of customer departures up to time t
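To make Little's theorem concrete, here is a minimal simulation sketch (not part of the original slides): a single FIFO server with Poisson arrivals and exponential service, with illustrative rates lam and mu, used to check that the time-average of N(t) is close to λ E[T].

import random

def littles_law_check(lam=4.0, mu=5.0, n_customers=200_000, seed=1):
    # Single FIFO server, Poisson arrivals (rate lam), exponential service (rate mu).
    random.seed(seed)
    t_arrival, server_free = 0.0, 0.0
    total_delay = 0.0                                  # sum of the delays T_i
    for _ in range(n_customers):
        t_arrival += random.expovariate(lam)           # next Poisson arrival
        start = max(t_arrival, server_free)            # wait if the server is busy
        server_free = start + random.expovariate(mu)   # exponential service time
        total_delay += server_free - t_arrival         # T_i = W_i + tau_i
    mean_T = total_delay / n_customers
    mean_N = total_delay / server_free                 # integral of N(t) dt equals sum of T_i
    eff_lambda = n_customers / t_arrival
    print(f"E[N] ~= {mean_N:.3f}   lambda*E[T] ~= {eff_lambda * mean_T:.3f}")

littles_law_check()

The two printed numbers should agree closely, which is exactly Little's theorem applied to the whole system.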
Poisson Process

The interarrival probability density function:
p(τ) = λ e^(-λτ),   mean 1/λ,  variance 1/λ^2

For every t, δ ≥ 0:
P[A(t+δ) - A(t) = 0] = 1 - λδ + o(δ)
P[A(t+δ) - A(t) = 1] = λδ + o(δ)
P[A(t+δ) - A(t) ≥ 2] = o(δ)
where lim (δ→0) o(δ)/δ = 0
Poisson Process

P[A(t+δ) - A(t) = n] = e^(-λδ) (λδ)^n / n!,   n = 0, 1, 2, ...

Characteristics of the Poisson process:
Interarrival times are independent and exponentially distributed.
If tn denotes the n-th arrival time and τn = t(n+1) - tn the n-th
interarrival interval, then its probability distribution is
P[τn ≤ s] = 1 - e^(-λs),   s ≥ 0
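A small sketch (NumPy; the rate lam and the horizon are illustrative) that builds a Poisson process from i.i.d. exponential interarrivals and compares the empirical count distribution with e^(-λT)(λT)^n / n!:

import math
import numpy as np

def poisson_counts(lam=2.0, horizon=5.0, runs=50_000, seed=0):
    # Count arrivals on [0, horizon] generated from Exp(lam) interarrival times.
    rng = np.random.default_rng(seed)
    counts = np.empty(runs, dtype=int)
    for r in range(runs):
        t, n = 0.0, 0
        while True:
            t += rng.exponential(1.0 / lam)    # interarrival time ~ Exp(lam)
            if t > horizon:
                break
            n += 1
        counts[r] = n
    return counts

lam, T = 2.0, 5.0
counts = poisson_counts(lam, T)
for n in range(4):
    theory = math.exp(-lam * T) * (lam * T) ** n / math.factorial(n)
    print(f"n={n}:  empirical {np.mean(counts == n):.4f}   theory {theory:.4f}")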
Sum of Poisson Random Variables

Xi, i = 1, 2, ..., n, are independent RVs
Xi follows a Poisson distribution with parameter λi
Partial sum defined as: Sn = X1 + X2 + ... + Xn
Then Sn follows a Poisson distribution with parameter
λ = λ1 + λ2 + ... + λn
Sum of Poisson Random Variables

Proof: For n = 2; generalization by induction. The pmf of S = X1 + X2 is

P{S = m} = Σ (k=0..m) P{X1 = k, X2 = m-k}
         = Σ (k=0..m) P{X1 = k} P{X2 = m-k}
         = Σ (k=0..m) e^(-λ1) λ1^k / k! · e^(-λ2) λ2^(m-k) / (m-k)!
         = e^(-(λ1+λ2)) / m! · Σ (k=0..m) m! / (k!(m-k)!) · λ1^k λ2^(m-k)
         = e^(-(λ1+λ2)) (λ1 + λ2)^m / m!

i.e., Poisson with parameter λ = λ1 + λ2.
Sampling a Poisson Variable

X follows a Poisson distribution with parameter λ
Each of the X arrivals is of type i with probability pi,
i = 1, 2, ..., n, independently of other arrivals;
p1 + p2 + ... + pn = 1
Xi denotes the number of type i arrivals
Then:
X1, X2, ..., Xn are independent
Xi follows a Poisson distribution with parameter λ pi
Sampling a Poisson Variable (cont.)

Proof: For n = 2; generalize by induction. Joint pmf:

P{X1 = k1, X2 = k2} = P{X1 = k1, X2 = k2 | X = k1 + k2} P{X = k1 + k2}
  = C(k1+k2, k1) p1^k1 p2^k2 · e^(-λ) λ^(k1+k2) / (k1+k2)!
  = (λ p1)^k1 (λ p2)^k2 / (k1! k2!) · e^(-λ(p1+p2))
  = e^(-λ p1) (λ p1)^k1 / k1! · e^(-λ p2) (λ p2)^k2 / k2!

Hence X1 and X2 are independent, with
P{X1 = k1} = e^(-λ p1) (λ p1)^k1 / k1!,   P{X2 = k2} = e^(-λ p2) (λ p2)^k2 / k2!

so Xi follows a Poisson distribution with parameter λ pi.
Merging & Splitting Poisson Processes

Merging: A1, ..., Ak are independent Poisson processes with rates λ1, ..., λk.
Merged into a single process A = A1 + ... + Ak, A is a Poisson process with
rate λ = λ1 + ... + λk.

Splitting: A is a Poisson process with rate λ, split into processes A1 and A2
independently, with probabilities p and 1-p respectively.
A1 is Poisson with rate λ1 = λp, and A2 is Poisson with rate λ2 = λ(1-p).
Poisson Variable

Mean:
E[N_T] = Σ (k≥1) k e^(-λT) (λT)^k / k!
       = λT Σ (k≥1) e^(-λT) (λT)^(k-1) / (k-1)!
       = λT

Variance:
σ^2(N_T) = E[N_T^2] - (E[N_T])^2 = λT

Memoryless property (of the exponentially distributed interarrival times):
P[τn > δ + t | τn > t] = P[τn > δ + t] / P[τn > t]
                       = e^(-λ(δ+t)) / e^(-λt)
                       = e^(-λδ) = P[τn > δ]
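A quick empirical check of the memoryless property (NumPy; the rate and thresholds are illustrative): among exponential samples that exceed t, the excess over t has the same distribution as the original.

import numpy as np

rng = np.random.default_rng(7)
lam, t, delta = 1.5, 0.8, 0.5                # illustrative rate and thresholds
tau = rng.exponential(1.0 / lam, 1_000_000)  # interarrival samples ~ Exp(lam)

lhs = np.mean(tau[tau > t] - t > delta)      # P[tau > t+delta | tau > t]
rhs = np.mean(tau > delta)                   # P[tau > delta]
print(f"conditional {lhs:.4f}   unconditional {rhs:.4f}   exact {np.exp(-lam*delta):.4f}")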
Review of Markov chain theory

Discrete time Markov chains:
A discrete time stochastic process {Xn | n = 0, 1, 2, ...} taking values in
the set of nonnegative integers is a Markov chain if
Pij = P{X(n+1) = j | Xn = i, X(n-1) = i(n-1), ..., X0 = i0}
    = P{X(n+1) = j | Xn = i}
where
Pij ≥ 0,   Σ (j≥0) Pij = 1,   i = 0, 1, ...
Markov chain formulation

Consider a discrete time MC with
Pij = P[N(k+1) = j | Nk = i]
where Nk is the number of customers at step k (N(t) being the number of
customers at time t).

Transition probabilities over a small interval δ:
P[1 arrival in δ] = (λδ) e^(-λδ) / 1! = λδ (1 - λδ + ...) = λδ + o(δ)
where the arrival and departure processes are independent.
Review of Markov chain theory

The transition probability matrix:

        | P00  P01  P02  ... |
    P = | P10  P11  P12  ... |
        | ...  ...  ...      |

n-step transition probabilities:
Pij^n = P{X(n+m) = j | Xm = i}
Review of Markov chain theory

Chapman-Kolmogorov equations:
Pij^(n+m) = Σ (k≥0) Pik^n Pkj^m,   n, m, i, j ≥ 0

Detailed balance equations for birth-death systems (in steady state):
pj Pji = pi Pij,   i, j ≥ 0
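Since the Chapman-Kolmogorov equations say the n-step transition matrix is the n-th matrix power, a tiny NumPy sketch (with a made-up 3-state transition matrix, purely for illustration) can verify P^(n+m) = P^n P^m and extract the stationary distribution:

import numpy as np

# Illustrative 3-state transition matrix (rows sum to 1).
P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

n, m = 3, 4
lhs = np.linalg.matrix_power(P, n + m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)
print("Chapman-Kolmogorov holds:", np.allclose(lhs, rhs))

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()
print("stationary distribution:", np.round(pi, 4))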
Example

A two-state chain (states 0 and 1) with the transition probabilities shown
in the state diagram (diagram omitted). Solving the balance equations gives
P0 and P1, and the throughput is obtained by conditioning on the state:
throughput = P0 [0·P(s=0 | state 0) + 1·P(s=1 | state 0) + 2·P(s=2 | state 0)]
           + P1 [0·P(s=0 | state 1) + 1·P(s=1 | state 1) + 2·P(s=2 | state 1)]
Continuous time Markov chains

{X(t) | t ≥ 0} taking nonnegative integer values
υi : the transition rate out of state i
qij : the transition rate from state i to j,  qij = υi Pij
The steady state occupancy probability of state j:
pj = lim (t→∞) P[X(t) = j | X(0) = i]

Analog of the detailed balance equations for DTMCs:
pj qji = pi qij
Birth-And-Death Process

States 0, 1, 2, ..., N, N+1, ... with birth (arrival) rates λ0, λ1, ..., λ(N-1), λN, ...
and death (service) rates μ1, μ2, ..., μN, μ(N+1), ...
(state transition diagram omitted)

In the long run, we have the Rate In = Rate Out principle.
Birth-And-Death Process (cont.)

Equations expressing this (Rate In = Rate Out), state by state:

State 0:    μ1 P1 = λ0 P0
State 1:    λ0 P0 + μ2 P2 = (λ1 + μ1) P1
State 2:    λ1 P1 + μ3 P3 = (λ2 + μ2) P2
...
State N-1:  λ(N-2) P(N-2) + μN PN = (λ(N-1) + μ(N-1)) P(N-1)
State N:    λ(N-1) P(N-1) + μ(N+1) P(N+1) = (λN + μN) PN
...
Birth-And-Death Process (cont.)

Finding the steady state probabilities:

State 0:  P1 = (λ0 / μ1) P0
State 1:  P2 = (λ1 / μ2) P1 + (μ1 P1 - λ0 P0) / μ2
             = (λ1 / μ2) P1 + (μ1 P1 - μ1 P1) / μ2
             = (λ1 / μ2) P1 = (λ1 λ0) / (μ2 μ1) P0
Birth-And-Death Process (cont.)

Finding the steady state probabilities (cont.):

State n-1:  Pn = (λ(n-1) / μn) P(n-1) + (μ(n-1) P(n-1) - λ(n-2) P(n-2)) / μn
               = (λ(n-1) / μn) P(n-1) + (μ(n-1) P(n-1) - μ(n-1) P(n-1)) / μn
               = (λ(n-1) / μn) P(n-1)
               = (λ(n-1) λ(n-2) ... λ0) / (μn μ(n-1) ... μ1) P0
Birth-And-Death Process (cont.)

Finding the steady state probabilities (cont.):

State n:  P(n+1) = (λn / μ(n+1)) Pn + (μn Pn - λ(n-1) P(n-1)) / μ(n+1)
                 = (λn / μ(n+1)) Pn
                 = (λn λ(n-1) ... λ0) / (μ(n+1) μn ... μ1) P0

To simplify:
Let Cn = (λ(n-1) λ(n-2) ... λ0) / (μn μ(n-1) ... μ1)
Then Pn = Cn P0,   n = 1, 2, ...
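A generic sketch that computes Pn = Cn P0 numerically for a truncated birth-death chain from arbitrary rate functions; the example rates below are simply the M/M/1 ones (λn = λ, μn = μ), chosen for illustration only.

import numpy as np

def birth_death_stationary(lam_fn, mu_fn, n_max):
    # Stationary probabilities P0..P(n_max) of a truncated birth-death chain.
    # lam_fn(n): birth rate in state n;  mu_fn(n): death rate in state n (n >= 1).
    coeff = np.ones(n_max + 1)                               # C0 = 1
    for n in range(1, n_max + 1):
        coeff[n] = coeff[n - 1] * lam_fn(n - 1) / mu_fn(n)   # Cn = C(n-1) * lam(n-1)/mu(n)
    return coeff / coeff.sum()                               # normalize so that sum Pn = 1

lam, mu = 3.0, 4.0                                           # illustrative M/M/1 rates
p = birth_death_stationary(lambda n: lam, lambda n: mu, n_max=200)
print("P0 =", round(p[0], 4), " (M/M/1 theory: 1 - rho =", 1 - lam / mu, ")")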
M/M/1 queueing system

Arrival statistics:
A stochastic process {A(t) | t ≥ 0} taking nonnegative integer values is
called a Poisson process with rate λ if
- A(t) is a counting process representing the total number of arrivals
  from 0 to t
- the numbers of arrivals in disjoint intervals are independent
- the number of arrivals in an interval of length τ has the Poisson
  probability distribution: P[A(t+τ) - A(t) = n] = e^(-λτ) (λτ)^n / n!
M/M/1 queueing system

P[1 arrival and no departure in δ]
  = (λδ) e^(-λδ) / 1! · (μδ)^0 e^(-μδ) / 0!
  = λδ (1 - λδ + ...)(1 - μδ + ...)
  = λδ + o(δ)
P[0 arrivals and 1 departure in δ] = μδ + o(δ)
P[0 arrivals and 0 departures in δ] = 1 - λδ - μδ + o(δ)
where the arrival and departure processes are independent.
M/M/1 queueing system

Global balance equations:
(λ + μ) pj = λ p(j-1) + μ p(j+1),   j ≥ 1
λ p0 = μ p1

Detailed balance: λ pn = μ p(n+1), so with ρ = λ/μ
p(n+1) = ρ^(n+1) p0,   n = 0, 1, ...

From Σ (n≥0) pn = 1:
p0 Σ (n≥0) ρ^n = p0 / (1 - ρ) = 1   ⇒   p0 = 1 - ρ
Average number of customers in the system:
N = lim (t→∞) E[N(t)] = Σ (n≥0) n pn
  = Σ (n≥0) n ρ^n (1 - ρ)
  = ρ (1 - ρ) Σ (n≥0) n ρ^(n-1)
  = ρ (1 - ρ) d/dρ ( Σ (n≥0) ρ^n )
  = ρ (1 - ρ) d/dρ ( 1 / (1 - ρ) )
  = ρ / (1 - ρ) = λ / (μ - λ)
M/M/1 queueing system

Average delay per customer (waiting time + service time), by Little's theorem:
E[T] = E[N] / λ = 1 / (μ - λ)

Average waiting time:
E[W] = E[T] - 1/μ = ρ / (μ - λ)

Average number of customers in queue:
E[NQ] = λ E[W] = ρ^2 / (1 - ρ)

Server utilization:
1 - p0 = ρ = λ / μ
M/M/1 queueing system

Example: 1/λ = 4 ms, 1/μ = 3 ms, so ρ = 3/4.

E[N] = ρ / (1 - ρ) = (3/4) / (1 - 3/4) = 3
E[T] = E[N] / λ = 3 / (1/4) = 12 [ms]
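A small helper collecting the M/M/1 formulas above (a sketch, not part of the original slides), run on this example's rates in units of 1/ms:

def mm1_metrics(lam, mu):
    # Standard M/M/1 steady-state quantities; requires lam < mu for stability.
    assert lam < mu, "queue is unstable unless lambda < mu"
    rho = lam / mu
    N = rho / (1 - rho)            # E[N], customers in the system
    T = 1 / (mu - lam)             # E[T], total delay (Little: N / lam)
    W = T - 1 / mu                 # E[W], waiting time in queue
    NQ = lam * W                   # E[NQ] = rho^2 / (1 - rho)
    return {"rho": rho, "E[N]": N, "E[T]": T, "E[W]": W, "E[NQ]": NQ}

print(mm1_metrics(lam=1/4, mu=1/3))   # 1/lam = 4 ms, 1/mu = 3 ms
# E[N] = 3 customers and E[T] = 12 ms, as computed above.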
Exercise

- The arrival rate to the system is n·λ
- The service rate is n·μ
a. Draw the state transition diagram.
b. Find the stationary probabilities.
c. Find the average number of customers in the system in steady state.
d. Find the average delay in the system using Little's theorem.
Solution

a. State transition diagram (omitted): states 0, 1, 2, ..., n, ..., with
   arrival rate λ out of state 0 and nλ out of state n (n ≥ 1), and service
   rate nμ out of state n.

b. Detailed balance equations, with ρ = λ/μ:
   λ P0 = μ P1              ⇒   P1 = ρ P0
   nλ Pn = (n+1)μ P(n+1)    ⇒   P(n+1) = ρ (n / (n+1)) Pn
   ⇒   Pn = (ρ^n / n) P0,   n ≥ 1

   Normalization, using Σ (n≥1) x^n / n = -ln(1 - x) for 0 ≤ x < 1:
   1 = Σ (n≥0) Pn = P0 [ 1 + Σ (n≥1) ρ^n / n ] = P0 [ 1 - ln(1 - ρ) ]
   ⇒   P0 = 1 / (1 - ln(1 - ρ)),   valid for 0 ≤ ρ < 1
   (for ρ ≥ 1 the sum diverges and no steady state exists)

c. Average number of customers in the system in steady state:
   N = Σ (n≥0) n Pn = P0 Σ (n≥1) ρ^n = P0 ρ / (1 - ρ)
     = ρ / [ (1 - ρ)(1 - ln(1 - ρ)) ]

d. Average arrival rate:
   λ̄ = Σ (n≥0) λn Pn = λ P0 + Σ (n≥1) nλ (ρ^n / n) P0 = λ P0 Σ (n≥0) ρ^n
      = λ P0 / (1 - ρ)
   By Little's theorem:
   T = N / λ̄ = ρ / λ = 1 / μ
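A numerical sketch of parts (b)-(d), truncating the infinite chain at an illustrative level and using an illustrative ρ = 0.5:

import numpy as np

lam, mu = 1.0, 2.0                                 # illustrative rates, rho = 0.5
rho = lam / mu
n_max = 2000                                       # truncation level for the infinite chain

n = np.arange(1, n_max + 1)
weights = np.concatenate(([1.0], rho ** n / n))    # C0 = 1, Cn = rho^n / n
P = weights / weights.sum()                        # stationary probabilities

N = (np.arange(n_max + 1) * P).sum()               # average number in the system
lam_bar = lam * P[0] + lam * (n * P[1:]).sum()     # average arrival rate: sum of lam_n * Pn
print("P0 :", round(P[0], 5), "  theory:", round(1 / (1 - np.log(1 - rho)), 5))
print("T  :", round(N / lam_bar, 5), "  theory 1/mu:", 1 / mu)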
Exercise

A node in a network uses the following routing policy: when a packet arrives,
regardless of its destination, it is routed to a preferred output line if and
only if the queue for that line is empty and no packet is currently being
transmitted on it; otherwise the packet is routed over some other line.
Assume that packet arrivals at the node form a Poisson process with rate λ,
that packet lengths are exponentially distributed with parameter μ, and that
the line capacity is C.
a. What fraction of the packets is sent over the preferred line?
b. If it is decided to attach a queue to this fast line, what is the minimal
   queue length such that the probability that a packet is transmitted on
   this line is at least 0.9, assuming ρ = λ/(μC) → 1? (Is the system stable?)
Solution

a. Packet length: l [bits], exponentially distributed: P(l ≤ a) = 1 - e^(-μa)
   Transmission rate: C [bits/sec]
   Transmission time: T = l / C [sec], so
   P(T ≤ t) = P(l/C ≤ t) = P(l ≤ Ct) = 1 - e^(-μCt)
   i.e., the service rate is μC [customers/sec].

   With no queue, the preferred line has only the states 0 (idle) and
   1 (transmitting):
   λ P0 = μC P1   ⇒   P1 = (λ/(μC)) P0 = ρ P0,   ρ = λ/(μC)
   P0 + P1 = P0 (1 + ρ) = 1   ⇒   P0 = 1 / (1 + ρ)
   A packet is sent over the preferred line iff it finds the line idle, so
   the fraction of packets sent over the preferred line is P0 = 1/(1 + ρ).
Solution

b. With a buffer, the preferred line is an M/M/1/m system with states
   0, 1, ..., m, arrival rate λ and service rate μC (state transition
   diagram omitted):

   λ Pn = μC P(n+1)   ⇒   P(n+1) = ρ Pn   ⇒   Pn = ρ^n P0,   0 ≤ n ≤ m

   1 = Σ (n=0..m) Pn = P0 (1 - ρ^(m+1)) / (1 - ρ)
   ⇒   P0 = (1 - ρ) / (1 - ρ^(m+1))

   The probability that an arriving packet is transmitted on this line is
   1 - Pm = 1 - ρ^m (1 - ρ) / (1 - ρ^(m+1)) = (1 - ρ^m) / (1 - ρ^(m+1))

   As ρ → 1 this tends to m / (m + 1), so
   m / (m + 1) ≥ 0.9   ⇒   m ≥ 9

   (The chain has a finite number of states, so it is stable even with ρ = 1.)
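A short sketch for part (b): compute 1 - Pm for the M/M/1/m model and find the smallest m meeting the 0.9 target; ρ = 1 here is the assumption taken from the exercise.

def accept_prob(rho, m):
    # P[arriving packet is transmitted] = 1 - Pm for an M/M/1/m line.
    if abs(rho - 1.0) < 1e-12:
        return m / (m + 1)                        # limit as rho -> 1
    return (1 - rho**m) / (1 - rho**(m + 1))

rho = 1.0                                         # assumption from the exercise
m = 1
while accept_prob(rho, m) < 0.9:
    m += 1
print("minimal m:", m)                            # prints 9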
M/M/1 Example I
Traffic to a message switching center for one of the
outgoing communication lines arrives in a random
pattern at an average rate of 240 messages per
minute. The line has a transmission rate of 800
characters per second. The message length
distribution (including control characters) is
approximately exponential with an average length
of 176 characters. Calculate the following principal
statistical measures of system performance,
assuming that a very large number of message
buffers are provided:
M/M/1 Example I (cont.)

(a) Average number of messages in the system.
(b) Average number of messages in the queue waiting to be transmitted.
(c) Average time a message spends in the system.
(d) Average time a message waits for transmission.
(e) Probability that 10 or more messages are waiting to be transmitted.
M/M/1 Example I (cont.)

1. E[s] = Average Message Length / Line Speed
        = {176 char/message} / {800 char/sec}
        = 0.22 sec/message
2. μ = 1 / 0.22 = 4.55 message/sec
3. λ = 240 message/min = 4 message/sec
4. ρ = λ E[s] = λ / μ = 0.88
M/M/1 Example I (cont.)

(a) N = ρ / (1 - ρ) = 7.33 (messages)
(b) Nq = ρ^2 / (1 - ρ) = 6.45 (messages)
(c) W = E[s] / (1 - ρ) = 1.83 (sec)
(d) Wq = ρ × E[s] / (1 - ρ) = 1.61 (sec)
(e) P[11 or more messages in the system] = ρ^11 = 0.245
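These numbers can be reproduced with the mm1_metrics helper sketched in the M/M/1 section (rates in messages per second):

m = mm1_metrics(lam=4.0, mu=800 / 176)   # 240 msg/min arrivals, 176-char messages at 800 char/s
print(round(m["E[N]"], 2), round(m["E[NQ]"], 2),     # (a) 7.33,  (b) 6.45
      round(m["E[T]"], 2), round(m["E[W]"], 2),      # (c) 1.83,  (d) 1.61
      round(m["rho"] ** 11, 3))                      # (e) P[N >= 11] = rho^11 ~= 0.245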
M/M/1 Example II
A branch office of a large engineering firm has one
on-line terminal that is connected to a central
computer system during the normal eight-hour
working day. Engineers, who work throughout the
city, drive to the branch office to use the terminal
to make routine calculations. Statistics collected
over a period of time indicate that the arrival
pattern of people at the branch office to use the
terminal has a Poisson (random) distribution, with a
mean of 10 people coming to use the terminal each
day. The distribution of time spent by an engineer
at a terminal is exponential, with a
M/M/1 Example II (cont.)
mean of 30 minutes. The branch office receives
complaints from the staff about the terminal
service. It is reported that individuals often wait
over an hour to use the terminal and it rarely
takes less than an hour and a half in the office
to complete a few calculations. The manager is
puzzled because the statistics show that the
terminal is in use only 5 hours out of 8, on the
average. This level of utilization would not seem
to justify the acquisition of another terminal.
What insight can queueing theory provide?
M/M/1 Example II (cont.)

λ: {10 person/day} × {1 day / 8 hr} × {1 hr / 60 min}
   = 10 person / 480 min = 1 person / 48 min
   ⇒ λ = 1/48 (person/min)
μ: 30 minutes per person, i.e. 1 (min) : 1/30 (person)
   ⇒ μ = 1/30 (person/min)
ρ = λ / μ = {1/48} / {1/30} = 30/48 = 5/8
M/M/1 Example II (cont.)

Arrival rate: λ = 1/48 (customer/min)
Server utilization: ρ = λ/μ = 5/8 = 0.625
Probability of 2 or more customers in system: P[N ≥ 2] = ρ^2 = 0.391
Mean steady-state number in the system: L = E[N] = ρ / (1 - ρ) = 1.667
S.D. of number of customers in the system: σ_N = sqrt(ρ) / (1 - ρ) = 2.108
M/M/1 Example II (cont.)

Mean time a customer spends in the system: W = E[w] = E[s] / (1 - ρ) = 80 (min)
S.D. of time a customer spends in the system: σ_w = E[w] = 80 (min)
Mean steady-state number of customers in queue: Nq = ρ^2 / (1 - ρ) = 1.04
Mean steady-state queue length of nonempty queues: E[Nq | Nq > 0] = 1 / (1 - ρ) = 2.67
Mean time in queue: Wq = E[q] = ρ × E[s] / (1 - ρ) = 50 (min)
M/M/1 Example II (cont.)

Mean time in queue for those who must wait: E[q | q > 0] = E[w] = 80 (min)
90th percentile of the time in queue:
π_q(90) = E[w] ln(10ρ) = 80 × 1.8326 = 146.6 (min)
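A small sketch reproducing Example II, including the 90th-percentile value, using the M/M/1 relation P(time in queue > t) = ρ e^(-t/W), which gives π_q(90) = W ln(10ρ):

import math

lam, mu = 1 / 48, 1 / 30          # person/min
rho = lam / mu
W = (1 / mu) / (1 - rho)          # mean time in system (min)
Wq = rho * W                      # mean time in queue (min)
pct90_q = W * math.log(10 * rho)  # 90th percentile of time in queue (min)
print(round(W, 1), round(Wq, 1), round(pct90_q, 1))   # 80.0 50.0 146.6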
M/M/m, M/M/m/m, M/M/∞

M/M/m (infinite buffer)

Detailed balance equations in steady state:
λ p(n-1) = nμ pn,   n ≤ m
λ p(n-1) = mμ pn,   n > m

With ρ = λ/(mμ):
pn = p0 (mρ)^n / n!,       n ≤ m
pn = p0 m^m ρ^n / m!,      n ≥ m

M/M/m

From Σ (n≥0) pn = 1, where ρ = λ/(mμ) < 1:
p0 = [ Σ (n=0..m-1) (mρ)^n / n!  +  (mρ)^m / (m!(1 - ρ)) ]^(-1)

The probability that all servers are busy (Erlang C formula):
PQ = P[all servers are busy] = Σ (n≥m) pn = p0 (mρ)^m / (m!(1 - ρ))

Expected number of customers waiting in queue:
NQ = Σ (n≥0) n p(m+n) = Σ (n≥0) n p0 m^m ρ^(m+n) / m! = PQ ρ / (1 - ρ)
M/M/m

Average waiting time of a customer in queue:
W = NQ / λ = ρ PQ / (λ(1 - ρ))

Average delay per customer:
T = 1/μ + W

Average number of customers in the system, by Little's theorem:
N = λT = mρ + ρ PQ / (1 - ρ)
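A sketch of these M/M/m (Erlang C) formulas, checked against the M/M/2 example that follows (λ = 1/10, μ = 1/8 per minute):

from math import factorial

def mmm_metrics(lam, mu, m):
    # Steady-state M/M/m quantities; requires rho = lam/(m*mu) < 1.
    rho = lam / (m * mu)
    a = lam / mu                                   # offered load, m * rho
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(m))
                + a**m / (factorial(m) * (1 - rho)))
    PQ = p0 * a**m / (factorial(m) * (1 - rho))    # Erlang C: all servers busy
    NQ = PQ * rho / (1 - rho)                      # customers waiting in queue
    W = NQ / lam                                   # waiting time in queue
    T = 1 / mu + W                                 # total delay
    return {"p0": p0, "PQ": PQ, "NQ": NQ, "W": W, "T": T}

print(mmm_metrics(lam=1/10, mu=1/8, m=2))   # p0 ~ 0.429, NQ ~ 0.152, W ~ 1.52 min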
M/M/s Case Example I

Example: M/M/2 (s = 2). Find p0.
μ = 1/8 (service rate per server), λ = 1/10

(state transition diagram omitted: arrival rate 1/10 from every state;
service rate 1/8 from state 1 and 2 × 1/8 from states 2, 3, ...)

ρ = λ / (sμ) = (1/10) / (2 × 1/8) = 0.4
M/M/s Case Example I (cont.)

P0 = 1 / [ 0.8^0/0! + 0.8^1/1! + 0.8^2 / (2!(1 - 0.4)) ]
   = 0.429   (≈ 43% of the time the system is empty)
as compared to m = 1: P0 = 0.20
M/M/s Case Example I (cont.)

Find W:
Wq = Lq / λ = 0.152 / (1/10) = 1.52 (min)
W = Wq + 1/μ = 1.52 + 1/(1/8) = 9.52 (min)

What proportion of time are both repairmen busy (in the long run)?
P(N ≥ 2) = 1 - P0 - P1 = 1 - 0.429 - 0.343 = 0.228   (Good or bad?)
M/M/∞

M/M/∞: the infinite server case

The detailed balance equations:
λ p(n-1) = nμ pn,   n = 1, 2, ...
pn = p0 (λ/μ)^n / n!,   n = 1, 2, ...

From Σ (n≥0) pn = 1:
p0 = [ 1 + Σ (n≥1) (λ/μ)^n / n! ]^(-1) = e^(-λ/μ)

Then
pn = (λ/μ)^n e^(-λ/μ) / n!,   n = 0, 1, ...
M/M/m/m

M/M/m/m: the m server loss system
- when all m servers are busy, the next arrival is lost
- circuit switched network model

(state transition diagram omitted: states 0, 1, ..., m, arrival rate λ in
every state, service rate nμ out of state n)

Detailed balance equations:
λ p(n-1) = nμ pn,   n = 1, 2, ..., m
pn = p0 (λ/μ)^n / n!,   n = 1, 2, ..., m

From Σ (n=0..m) pn = 1:
p0 = [ Σ (n=0..m) (λ/μ)^n / n! ]^(-1)

The blocking probability (Erlang B formula):
PB = pm = ( (λ/μ)^m / m! ) / ( Σ (n=0..m) (λ/μ)^n / n! )
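A short sketch of the Erlang B formula, using the numerically stable recursion B(0) = 1, B(k) = a·B(k-1) / (k + a·B(k-1)) instead of the factorials directly (it is algebraically equivalent to the expression above); the offered load and number of circuits are illustrative.

def erlang_b(a, m):
    # Blocking probability of an M/M/m/m system with offered load a = lam/mu.
    b = 1.0                          # B(0) = 1
    for k in range(1, m + 1):
        b = a * b / (k + a * b)      # standard recursion, avoids large factorials
    return b

# Illustrative: offered load of 5 Erlangs on 8 circuits.
print(round(erlang_b(5.0, 8), 4))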
Moment Generating Function

1. Definition: for any t ∈ R:
   M_X(t) = E[e^(tX)] = ∫ e^(tx) f_X(x) dx            (X continuous)
                      = Σ_j e^(t·xj) P{X = xj}         (X discrete)

2. If the moment generating function M_X(t) of X exists and is finite in
   some neighborhood of t = 0, it determines the distribution of X uniquely.

3. Fundamental properties: for any n ∈ N:
   (i)  d^n/dt^n M_X(t) = E[X^n e^(tX)]
   (ii) d^n/dt^n M_X(0) = E[X^n]

4. Moment generating functions and independence:
   X, Y independent  ⇒  M_(X+Y)(t) = M_X(t) M_Y(t)
   The opposite is not true.
Discrete Random Variables

Distribution     PMF P{X = k}                            MGF M_X(t)                    Mean E[X]   Variance Var(X)
(parameters)
Binomial (n, p)  C(n,k) p^k (1-p)^(n-k), k = 0,1,...,n   (p e^t + 1 - p)^n             np          np(1-p)
Geometric (p)    (1-p)^(k-1) p, k = 1, 2, ...            p e^t / (1 - (1-p) e^t)       1/p         (1-p)/p^2
Negative Bin.    C(k-1, r-1) p^r (1-p)^(k-r),            [p e^t / (1 - (1-p) e^t)]^r   r/p         r(1-p)/p^2
(r, p)           k = r, r+1, ...
Poisson (λ)      e^(-λ) λ^k / k!, k = 0, 1, ...          e^(λ(e^t - 1))                λ           λ
Continuous Random Variables

Distribution     PDF f_X(x)                                MGF M_X(t)                     Mean E[X]   Variance Var(X)
(parameters)
Uniform over     1/(b-a),  a < x < b                       (e^(tb) - e^(ta)) / (t(b-a))   (a+b)/2     (b-a)^2/12
(a, b)
Exponential (λ)  λ e^(-λx),  x ≥ 0                         λ / (λ - t)                    1/λ         1/λ^2
Normal (μ, σ^2)  (1/√(2πσ^2)) e^(-(x-μ)^2/(2σ^2)),         e^(μt + (σt)^2/2)              μ           σ^2
                 -∞ < x < ∞