Lecture 11 – Stochastic Processes


Topics
• Definitions
• Review of probability
• Realization of a stochastic process
• Continuous vs. discrete systems
• Examples
Basic Definitions
Stochastic process: System that changes over time in
an uncertain manner
Examples
• Automated teller machine (ATM)
• Printed circuit board assembly operation
• Runway activity at airport
State: Snapshot of the system at some fixed point in
time
Transition: Movement from one state to another
Elements of Probability Theory
Experiment: Any situation where the outcome is uncertain.
Sample Space, S: All possible outcomes of an experiment (we
will call them the “state space”).
Event: Any collection of outcomes (points) in the sample
space. A collection of events E1, E2,…,En is said to be
mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j; i, j = 1,…,n.
Random Variable (RV): Function or procedure that
assigns a real number to each outcome in the sample
space.
Cumulative Distribution Function (CDF), F(·): Probability
distribution function for the random variable X such that
F(a) ≡ Pr{X ≤ a}
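To make the RV and CDF definitions concrete, here is a small Python sketch (Python is my choice; the lecture contains no code) for the random variable "number showing on a fair six-sided die":

```python
from fractions import Fraction

# Sample space: the six faces of a fair die.
# Random variable X: the number showing (the identity map here).
outcomes = [1, 2, 3, 4, 5, 6]

def cdf(a):
    """F(a) = Pr{X <= a} for a fair six-sided die."""
    favorable = sum(1 for x in outcomes if x <= a)
    return Fraction(favorable, len(outcomes))

print(cdf(3))   # 1/2
print(cdf(6))   # 1
```

Note that F is a step function here: it jumps by 1/6 at each outcome and is flat in between, which is typical of a discrete random variable.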
Components of Stochastic Model
Time: Either continuous or discrete parameter.
[Timeline: epochs t0, t1, t2, t3, t4 marked along the time axis]
State: Describes the attributes of a system at some point in time.
s = (s1, s2, . . . , sv); for ATM example s = (n)
It is convenient to assign a unique nonnegative integer index to
each possible value of the state vector. We call this X and
require a unique X for each s.
For the ATM example, X = n.
In general, Xt is a random variable.
Model Components (continued)
Activity: Takes some amount of time – duration.
Culminates in an event.
For the ATM example → service completion.
Transition: Caused by an event and results in movement
from one state to another. For ATM example,
[State-transition diagram: states 0, 1, 2, 3; an arrival (a) moves the
system from state n to n + 1, a departure (d) from n to n − 1.
# = state, a = arrival, d = departure]
Stochastic Process: A collection of random variables {Xt},
where t  T = {0, 1, 2, . . .}.
Realization of the Process
Deterministic Process
Time between arrivals: Pr{ ta ≤ τ } = 0 for τ < 1 min;
= 1 for τ ≥ 1 min. (Arrivals occur every minute.)
Time for servicing a customer: Pr{ ts ≤ τ } = 0 for τ < 0.75 min;
= 1 for τ ≥ 0.75 min. (Processing takes exactly 0.75 minutes.)
[Plot: number in system, n, vs. time from 0 to 10 min; n alternates
between 1 and 0 because each arrival is served before the next one
appears (no transient response)]
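The deterministic realization can be reproduced with a short event-driven simulation. This Python sketch is an illustration, not part of the lecture; the function name `realization` and the 3-minute horizon are my choices:

```python
# Deterministic ATM realization: arrivals every 1.0 min,
# service takes exactly 0.75 min, single server.
def realization(horizon=3.0):
    events = []                    # (time, n) after each transition
    n = 0                          # number in system
    t_arrive = 0.0                 # first arrival at t = 0
    t_depart = float("inf")        # no departure scheduled yet
    t = 0.0
    while t < horizon:
        if t_arrive <= t_depart:   # next event is an arrival
            t = t_arrive
            n += 1
            if n == 1:             # server was idle: start service
                t_depart = t + 0.75
            t_arrive += 1.0
        else:                      # next event is a departure
            t = t_depart
            n -= 1
            t_depart = (t + 0.75) if n > 0 else float("inf")
        if t < horizon:
            events.append((round(t, 2), n))
    return events

print(realization(3.0))
# [(0.0, 1), (0.75, 0), (1.0, 1), (1.75, 0), (2.0, 1), (2.75, 0)]
```

Because service (0.75 min) always finishes before the next arrival (1 min), n simply alternates between 1 and 0, matching the "no transient response" plot.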
Realization of the Process (continued)
Stochastic Process
Time for servicing a customer: Pr{ ts ≤ τ } = 0 for τ < 0.75 min;
= 0.6 for 0.75 ≤ τ < 1.5 min;
= 1 for τ ≥ 1.5 min.
[Plot: number in system, n, vs. time from 0 to 12 min; arrival (a) and
departure (d) events occur at irregular intervals, so n fluctuates
between 0 and 3]
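The service-time CDF above corresponds to a two-point distribution: ts = 0.75 with probability 0.6 and ts = 1.5 with probability 0.4. A Python sketch (an illustration, not from the lecture) of sampling it:

```python
import random

# Service-time distribution implied by the CDF above:
# Pr{ts = 0.75} = 0.6 and Pr{ts = 1.5} = 0.4.
def sample_service_time(rng):
    return 0.75 if rng.random() < 0.6 else 1.5

rng = random.Random(42)            # fixed seed for repeatability
times = [sample_service_time(rng) for _ in range(10_000)]
frac_short = sum(t == 0.75 for t in times) / len(times)
print(frac_short)                  # ≈ 0.6
```

With some service times at 1.5 min exceeding the 1-min interarrival time, customers can pile up, which is why the stochastic realization fluctuates while the deterministic one does not.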
Markovian Property
Given that the present state is known, the conditional probability of
the next state is independent of the states prior to the present state.
Present state at time t is i: Xt = i
Next state at time t + 1 is j: Xt+1 = j
Conditional Probability Statement of Markovian Property:
Pr{Xt+1 = j | X0 = k0, X1 = k1,…, Xt–1 = kt–1, Xt = i } = Pr{Xt+1 = j | Xt = i }
for t = 0, 1,…, and all possible sequences i, j, k0, k1, . . . , kt–1
Interpretation: Given the present, the past is irrelevant in
determining the future.
Transitions for Markov Processes
State space: S = {1, 2, . . . , m}
Probability of going from state i to state j in one move: pij
State-transition matrix

        | p11  p12  ...  p1m |
        | p21  p22  ...  p2m |
    P = |  .    .          . |
        |  .    .          . |
        | pm1  pm2  ...  pmm |

or, compactly, P = [ pij ].
Theoretical requirements: 0 ≤ pij ≤ 1; Σj pij = 1 for i = 1,…,m.
Discrete-Time Markov Chain
• A discrete state space
• Markovian property for transitions
• One-step transition probabilities, pij, remain constant over
time (stationary)
Simple Example
State-transition matrix (states 0, 1, 2):

          0     1     2
    0 | 0.6   0.3   0.1 |
P = 1 | 0.8   0.2   0   |
    2 | 1     0     0   |

State-transition diagram: from state 0, go to 0 with probability 0.6,
to 1 with 0.3, to 2 with 0.1; from state 1, go to 0 with 0.8 and stay
at 1 with 0.2; from state 2, go to 0 with probability 1.
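The long-run behavior of this chain can be checked numerically by iterating the distribution pi ← pi·P. This Python sketch is an illustration only (the lecture shows no code):

```python
# Transition matrix from the simple example (states 0, 1, 2).
P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]

# Each row must sum to 1 (theoretical requirement on P).
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

def step(pi, P):
    """One transition of the chain: pi' = pi * P."""
    m = len(P)
    return [sum(pi[i] * P[i][j] for i in range(m)) for j in range(m)]

pi = [1.0, 0.0, 0.0]       # start in state 0
for _ in range(100):       # iterate until the distribution settles
    pi = step(pi, P)
print([round(x, 4) for x in pi])   # ≈ [0.678, 0.2542, 0.0678]
```

The limiting vector is the stationary distribution: solving pi = pi·P by hand gives pi1 = 0.375·pi0 and pi2 = 0.1·pi0, so pi0 = 1/1.475 ≈ 0.678, in agreement with the iteration.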
Game of Craps
• Roll 2 dice
• Outcomes
– Win = 7 or 11
– Lose = 2, 3, 12
– Point = 4, 5, 6, 8, 9, 10
• If point, then roll again.
– Win if point
– Lose if 7
– Otherwise roll again, and so on
(There are other possible bets not included here.)
State-Transition Network for Craps
[State-transition network: from Start, a roll of (7, 11) → Win and a
roll of (2, 3, 12) → Lose; any other sum k ∈ {4, 5, 6, 8, 9, 10}
establishes the point and moves to state Pk. From Pk, rolling k → Win,
rolling 7 → Lose, and any other sum, not(k, 7), loops back to Pk.]
Transition Matrix for Game of Craps
Sum:    2      3      4      5      6      7      8      9      10     11     12
Prob:  0.028  0.056  0.083  0.111  0.139  0.167  0.139  0.111  0.083  0.056  0.028
Probability of win = Pr{ 7 or 11 } = 0.167 + 0.056 = 0.223
Probability of loss = Pr{ 2, 3, 12 } = 0.028 + 0.056 + 0.028 = 0.112
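These probabilities follow from counting the 36 equally likely outcomes of two dice. A Python sketch (illustration only, not from the lecture) using exact fractions:

```python
from fractions import Fraction
from collections import Counter

# Distribution of the sum of two fair dice: count the 36 outcomes.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
prob = {s: Fraction(c, 36) for s, c in counts.items()}

win = prob[7] + prob[11]              # come-out win: 7 or 11
lose = prob[2] + prob[3] + prob[12]   # come-out loss: 2, 3, or 12
print(float(win))    # 8/36 ≈ 0.222
print(float(lose))   # 4/36 ≈ 0.111
```

The exact win probability is 8/36 ≈ 0.222; adding the rounded table entries (0.167 + 0.056) gives 0.223 only because of rounding.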
P = (rows = current state, columns = next state)

        Start   Win     Lose    P4      P5      P6      P8      P9      P10
Start   0       0.222   0.111   0.083   0.111   0.139   0.139   0.111   0.083
Win     0       1       0       0       0       0       0       0       0
Lose    0       0       1       0       0       0       0       0       0
P4      0       0.083   0.167   0.75    0       0       0       0       0
P5      0       0.111   0.167   0       0.722   0       0       0       0
P6      0       0.139   0.167   0       0       0.694   0       0       0
P8      0       0.139   0.167   0       0       0       0.694   0       0
P9      0       0.111   0.167   0       0       0       0       0.722   0
P10     0       0.083   0.167   0       0       0       0       0       0.75
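Win and Lose are absorbing states, so the overall probability of winning can be computed exactly from the chain's structure. A Python sketch (not part of the lecture) using exact fractions instead of the rounded matrix entries:

```python
from fractions import Fraction

# Pr{sum = s} for two fair dice: 6 - |s - 7| favorable outcomes of 36.
p = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}

points = [4, 5, 6, 8, 9, 10]
# From point state Pk the roll repeats until k (win) or 7 (lose), so
# Pr{win | point k} = p[k] / (p[k] + p[7]).
win_from_point = {k: p[k] / (p[k] + p[7]) for k in points}

# Total: win on the come-out roll, or make a point and then win.
p_win = p[7] + p[11] + sum(p[k] * win_from_point[k] for k in points)
print(p_win, float(p_win))   # 244/495 ≈ 0.4929
```

This reproduces the classic result that the pass-line bet wins with probability 244/495 ≈ 0.493, slightly less than one half.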
Examples of Stochastic Processes
Single stage assembly process with single worker, no queue
[State-transition network: arrival a takes 0 → 1; departure d takes 1 → 0]
State = 0, worker is idle
State = 1, worker is busy
Multistage assembly process with single worker, no queue
[State-transition network: a: 0 → 1; dk: k → k + 1 for k = 1, . . . , 4;
d5: 5 → 0]
State = 0, worker is idle
State = k, worker is performing operation k = 1, . . . , 5
Examples (continued)
Multistage assembly process with single worker with queue
(Assume 3 stages only; i.e., 3 operations)
s1  number of parts in the system
s = (s1, s2) where 
s2  current operation being performed
[State-transition network (operations k = 1, 2, 3): from (0,0), an
arrival a leads to (1,1); d1 and d2 advance the current part, e.g.
d1: (n,1) → (n,2) and d2: (n,2) → (n,3); d3 completes the part,
(n,3) → (n−1,1), or (0,0) if the system empties; an arrival a
increments s1, e.g. (1,k) → (2,k); the network continues for
s1 = 3, 4, . . .]
Queuing Model with Two Servers
s = (s1, s2, s3), where si = 0 if server i is idle and 1 if server i
is busy, for i = 1, 2; s3 = number in the queue.
[System diagram: arrivals feed a single queue served by servers 1
and 2. State-transition network: state 0 = (0,0,0), 1 = (1,0,0),
2 = (0,1,0), 3 = (1,1,0), 4 = (1,1,1), 5 = (1,1,2), . . . ; arrivals a
move the system toward fuller states, departures d1, d2 toward
emptier ones.]
Series System with No Queues
[System diagram: Arrivals → operation 1 → transfer → operation 2 →
transfer → operation 3 → Finished]
State: s = (s1, s2, s3), where si = 0 if server i is idle and 1 if
server i is busy, for i = 1, 2, 3.
State space: S = { (0,0,0), (1,0,0), . . . , (0,1,1), (1,1,1) }.
The state space consists of all possible binary vectors with 3
components.
Activities: Y = {a, d1, d2, d3}, where a = arrival at operation 1
and dj = completion of operation j for j = 1, 2, 3.
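Since the state space is just the set of binary 3-vectors, it can be enumerated directly. A small Python sketch (an illustration, not from the lecture):

```python
from itertools import product

# State space for the 3-stage series system: all binary 3-vectors,
# one bit per server (0 = idle, 1 = busy).
S = list(product((0, 1), repeat=3))
print(len(S))         # 8
print(S[0], S[-1])    # (0, 0, 0) (1, 1, 1)
```

In general, a system of v binary components has 2^v states, which is why state spaces grow quickly as models add stages or servers.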
What You Should Know
About Stochastic Processes
• What a state is.
• What a realization is (stationary vs.
transient).
• What the difference is between a
continuous and discrete-time system.
• What the common applications are.
• What a state-transition matrix is.