Transcript Document
Extensions of the Basic Model
Chapter 6
Elements of Sequencing and Scheduling
by Kenneth R. Baker
Byung-Hyun Ha
Outline
Introduction
Nonsimultaneous arrivals
Minimizing the makespan
Minimizing maximum tardiness
Other measures of performance
Dependent jobs
Minimizing maximum tardiness
Minimizing total flowtime with strings
Minimizing total flowtime with parallel chains
Sequence-dependent setup times
Dynamic programming solutions
Branch and bound solutions
Heuristic solutions
Summary
Introduction
Basic single-machine model
Assumptions
C1. A set of n independent, single operation jobs is available simultaneously (at
time zero).
C2. Setup times for the jobs are independent of job sequence and are included
in processing times.
C3. Job descriptors are deterministic and known in advance.
C4. One machine is continuously available and never kept idle while work is
waiting.
C5. Once an operation begins, it proceeds without interruption.
An opportunity to study a variety of scheduling criteria as well as a
number of solution techniques
Possible generalizations (in this chapter)
C1: nonsimultaneous job arrivals
C1: dependent job sets
C2: sequence-dependent setups
C3: use of probabilistic methods
Nonsimultaneous Arrivals
Static version of single-machine problem
All jobs are simultaneously available for processing
e.g., 1 || ΣCj , 1 || ΣwjUj
Dynamic version
Allowing different ready times (rj)
Examples of scheduling with ready times

Job j | 1 | 2
rj    | 0 | 1
pj    | 5 | 2
dj    | 7 | 4

• Basic model (C4 and C5 are respected) -- sequence 1-2: job 1 on [0, 5], job 2 on [5, 7]; ΣTj* = 3
• Inserted idle time is allowed -- 1 | rj | ΣTj : idle on [0, 1], job 2 on [1, 3], job 1 on [3, 8]; ΣTj* = 1
• With job preemption allowed (preempt-resume mode) -- 1 | rj , prmp | ΣTj : job 1 on [0, 1], job 2 on [1, 3], job 1 resumes on [3, 7]; ΣTj* = 0
Nonsimultaneous Arrivals
Scheduling with preemption allowed
Preempt-resume mode
• Schedules without inserted idle time constitute a dominant set for regular
measures.
• Properties associated with transitive rules are often essentially unchanged
• Optimal decisions can be made by pure dispatching -- no look-ahead is needed
• Example -- 1 | rj , prmp | Tmax
  » Keep the machine assigned to the available job with the earliest due date
• Example -- 1 | rj , prmp | ΣCj
  » Keep the machine assigned to the available job with the minimum
remaining processing time (SRPT: shortest remaining processing
time)
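The SRPT rule can be sketched as an event-driven simulation that re-evaluates the dispatch decision at each arrival and completion. A minimal sketch, reusing the two-job instance from the earlier example (rj = 0, 1 and pj = 5, 2); the function name is illustrative:

```python
# A minimal sketch of preempt-resume SRPT dispatching for 1 | rj, prmp | sum Cj.
# The two-job instance is the one used in the earlier example.

def srpt_total_completion(jobs):
    """jobs: dict name -> (rj, pj). Returns (sum of Cj, dict of Cj)."""
    remaining = {j: p for j, (r, p) in jobs.items()}
    completion = {}
    t = 0
    while remaining:
        avail = [j for j in remaining if jobs[j][0] <= t]
        if not avail:
            t = min(jobs[j][0] for j in remaining)  # wait for the next arrival
            continue
        j = min(avail, key=lambda x: remaining[x])  # shortest remaining processing time
        finish = t + remaining[j]
        arrivals = [jobs[x][0] for x in remaining if jobs[x][0] > t]
        nxt = min(arrivals) if arrivals else finish
        if finish <= nxt:                           # j finishes before the next arrival
            t, completion[j] = finish, finish
            del remaining[j]
        else:                                       # preempt j when a new job arrives
            remaining[j] -= nxt - t
            t = nxt
    return sum(completion.values()), completion

jobs = {1: (0, 5), 2: (1, 2)}
```

Here SRPT gives C1 = 7 and C2 = 3 (ΣCj = 10), whereas the best nonpreemptive schedule achieves 11; note that minimizing ΣCj and ΣFj are equivalent, since they differ by the constant Σrj.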
Preempt-repeat mode
• Job must be restarted each time it is interrupted
• Schedules without preemption (permutation schedules) constitute a dominant
set
• Inserted idle time is determined uniquely by choice of permutation
• Need to use look-ahead information (which makes solution approach complex)
Nonsimultaneous Arrivals
Minimizing the makespan
1 | rj | Cmax
• Makespan -- denoted by M or Cmax
• Related to throughput of schedule
Cmax is always constant in basic model (1 || Cmax)
M is minimized by Earliest Ready Time (ERT) rule
• A nondelay dispatching procedure
• Yielding blocks of jobs
Generalization of the problem
Each job with delivery time, qj
• Delivery begins immediately after the job completes, and deliveries proceed in parallel
• The makespan includes delivery time
• Symmetric property -- the problem is equivalent to its reversed problem (heads and tails interchanged)
[Figure: Gantt chart of jobs 1, 2, 3 with delivery tails; here job 2's delivery finishes last, so M = C2 + q2.]
Nonsimultaneous Arrivals
Generalization of the problem (cont’d)
Head-body-tail problem
• Job specification with triples (rj, pj, qj)
• NP-hard -- equivalent to 1 | rj | Lmax (discussed later)
A good heuristic solution
• Nondelay dispatching procedure that always selects the available job with the
largest tail qj
ALGORITHM 1 -- The Largest Tail (LT) Procedure
1. Initially, let t = 0.
2. If no unscheduled job is available at time t, set t equal to the minimum ready
time among unscheduled jobs; otherwise, proceed.
3. Find the job j with the largest qj among unscheduled jobs available at time t.
Schedule j to begin at time t.
4. Increase t by pj . If all n jobs are scheduled, stop; otherwise return to Step 2.
Exercise -- head-body-tail problem with 5 jobs
• Algorithm 1 should be executed twice (why?)
Job j | 1 | 2 | 3 | 4 | 5
rj    | 0 | 2 | 3 | 0 | 6
pj    | 2 | 1 | 2 | 3 | 2
qj    | 5 | 2 | 6 | 3 | 1
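Algorithm 1 can be sketched directly from the steps above, applied to the five-job exercise instance (the function name is illustrative):

```python
# A sketch of Algorithm 1 (Largest Tail procedure) on the 5-job exercise
# instance; job data are the (rj, pj, qj) triples from the table above.

def largest_tail(jobs):
    """jobs: dict name -> (r, p, q). Returns (sequence, makespan incl. tails)."""
    unsched = dict(jobs)
    t, seq, M = 0, [], 0
    while unsched:
        avail = [j for j in unsched if unsched[j][0] <= t]
        if not avail:
            t = min(v[0] for v in unsched.values())   # Step 2: jump to next ready time
            continue
        j = max(avail, key=lambda x: unsched[x][2])   # Step 3: largest tail qj
        r, p, q = unsched.pop(j)
        seq.append(j)
        t += p                                        # Step 4
        M = max(M, t + q)                             # completion plus delivery tail
    return seq, M

jobs = {1: (0, 2, 5), 2: (2, 1, 2), 3: (3, 2, 6), 4: (0, 3, 3), 5: (6, 2, 1)}
```

On this instance the forward pass yields the sequence 1-4-3-2-5 with M = 13, while the same procedure on the reversed instance (heads and tails interchanged) yields M = 12 -- which is why the exercise asks for two executions.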
Nonsimultaneous Arrivals
Generalization of the problem (cont’d)
Optimality condition
• Makespan M = ri + Σj=i..k pj + qk ,
• for some job i that initiates a block and for some k in the block called the
critical job (jobs are renumbered according to sequence)

[Figure: a block initiated by job i; the critical job k determines M = Ck + qk.]

• If qk ≤ qj for all jobs j from i to k, M is optimal. (Theorem 1, discussed later)
  This is a sufficient condition, i.e., if it fails, M may or may not be optimal.
Nonsimultaneous Arrivals
Minimizing maximum tardiness
1 | rj | Lmax
• Strongly NP-hard (p. 44 of Pinedo, 2008)
• 3-PARTITION reduces to 1 | rj | Lmax
EDD solves 1 || Lmax
Equivalence to head-body-tail problem
• Let qj = D – dj , where D = max{dj}
• min. Lmax = max{Cj – dj} = max{Cj – (D – qj)} = max{Cj + qj} – D
Theorem 1
• In the dynamic Lmax-problem, a non-delay implementation of the EDD rule
yields
Lmax = ri + Σj=i..k pj – dk
for some job i that initiates a block, and for some job k in the same block,
where the jobs are numbered in order of appearance in the schedule. If dk ≥ dj
for all jobs j from i to k, then Lmax is optimal.
Proof of Theorem 1
• Relaxation: consider only the jobs from i to k, with all their ready times set to ri
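A nondelay EDD dispatch for the dynamic Lmax-problem can be sketched as below, using the two-job instance from the earlier slide (rj = 0, 1; pj = 5, 2; dj = 7, 4); the function name is illustrative:

```python
# A sketch of nondelay EDD dispatching for 1 | rj | Lmax, on the two-job
# instance from the earlier example.

def nondelay_edd(jobs):
    """jobs: dict name -> (r, p, d). Returns (sequence, Lmax)."""
    unsched = dict(jobs)
    t, seq, lmax = 0, [], float("-inf")
    while unsched:
        avail = [j for j in unsched if unsched[j][0] <= t]
        if not avail:
            t = min(v[0] for v in unsched.values())   # jump to the next ready time
            continue
        j = min(avail, key=lambda x: unsched[x][2])   # earliest due date first
        r, p, d = unsched.pop(j)
        t += p
        seq.append(j)
        lmax = max(lmax, t - d)                        # lateness Cj - dj
    return seq, lmax

jobs = {1: (0, 5, 7), 2: (1, 2, 4)}
```

Here the nondelay schedule is 1-2 with Lmax = 3; the sufficient condition of Theorem 1 fails (d2 = 4 < d1 = 7), and indeed inserting idle time and running job 2 first achieves Lmax = 1.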
Nonsimultaneous Arrivals
Other measures of performance
Mostly NP-hard, except in preempt-resume mode
• e.g., 1 | rj | Lmax , 1 | rj | ΣCj , 1 | rj | ΣUj (then, how about 1 | rj | ΣTj ?)
• The preempt-resume relaxation provides a lower bound for branch-and-bound
• The value of such a bound is less clear in the case of 1 | rj | ΣUj or 1 | rj | ΣTj
1 | rj | Lmax
• Theorem 2
• In the dynamic Lmax-problem, suppose that the nondelay implementation
of EDD yields a sequence of the jobs in EDD order. Then this nondelay
schedule is optimal.
• Proof of Theorem 2
• Relaxation: set all ready times to zero
• Theorem 3
• In the dynamic Lmax-problem, if the ready times and due dates are
agreeable, then the nondelay implementation of EDD is optimal.
Nonsimultaneous Arrivals
Other measures of performance (cont’d)
1 | rj | ΣCj
• Theorem 4
• In the dynamic F-problem, if the ready times and processing times are
agreeable, then the nondelay implementation of SPT is optimal.
• Some heuristics for general cases
• Nondelay adaptation of SPT
• First Off First On (FOFO) rule
» Exploiting look-ahead information (so, not dispatching)
• Priority to the job with the smallest sum of earliest start time (rj) and earliest
finish time (rj + pj), i.e., smallest 2rj + pj
1 | rj | ΣTj
• Theorem 5
• In the dynamic T-problem, if the ready times, processing times and due
dates are all agreeable, the nondelay implementation of MDD is optimal.
1 | rj | ΣUj
• ALGORITHM 2 -- Minimizing U (Dynamic Version)
• Optimal in case of agreeable ready times and due dates
Dependent Jobs
Constraints in scheduling
Machine capacity (in basic model)
+ Technological restrictions -- specified by the admissible sequences of pairs of jobs
• They reduce the set of feasible solutions
Dominance between jobs
Precedence constraints, i → j
Job j is not permitted to begin until job i is complete
Job i is predecessor of job j, job j is successor of job i
Direct predecessor, direct successor
Example -- 1 | prec | ΣCj
• Three jobs a, b, c with pa ≤ pb ≤ pc
• Optimal without precedence: a-b-c
• With additional precedence c → a
  • Clearly, c-b-a is not optimal (why?)
  • Then, c-a-b or b-c-a?
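The question above can be explored by enumeration. With hypothetical processing times (all names and values below are illustrative), either answer can win: comparing the two candidates, F(c-a-b) − F(b-c-a) = pa + pc − 2pb, so c-a-b is better exactly when pa + pc < 2pb.

```python
# Enumerate feasible sequences under a single precedence constraint and pick
# the one minimizing total flowtime; processing times are hypothetical.
from itertools import permutations

def total_flowtime(seq, p):
    t = F = 0
    for j in seq:
        t += p[j]   # completion time of job j
        F += t      # accumulate flowtimes
    return F

def best_with_precedence(p, before, after):
    """Best sequence (for total flowtime) with `before` preceding `after`."""
    feas = [s for s in permutations(p) if s.index(before) < s.index(after)]
    return min(feas, key=lambda s: total_flowtime(s, p))
```

With pa, pb, pc = 1, 2, 4 (pa + pc > 2pb) the optimum is b-c-a; with 1, 3, 4 (pa + pc < 2pb) it is c-a-b.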
Dependent Jobs
Minimizing maximum tardiness
1 | rj , prec | Lmax -- NP-hard (why?)
• Apply any optimization approach for 1 | rj | Lmax , after the following revision
• Dominance property
  » Job j follows job i in an optimal sequence if ri ≤ rj and di ≤ dj .
• For each precedence i → j, revise rj and di to rj' and di' such that
  » rj' = max{rj , ri + pi} and di' = min{di , dj – pj}
  » i.e., make the ready times and due dates agreeable and consistent with the
precedence
It is not necessary to design a new algorithm for this special case
• Justification of the revision -- when di > dj – pj (so that di' = dj – pj)
  • Li' = Ci – di' = Ci – (dj – pj) ≤ (Cj – pj) – (dj – pj) = Cj – dj = Lj
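The revision can be sketched as two passes over the precedence pairs: ready times propagate forward, due dates backward. A minimal sketch, assuming the pairs are given in topological order (function and field names are illustrative):

```python
# Revise ready times and due dates to be consistent with precedences i -> j:
# rj' = max(rj, ri' + pi) forward, di' = min(di, dj' - pj) backward.

def revise(jobs, prec):
    """jobs: dict j -> {'r':.., 'p':.., 'd':..}; prec: list of (i, j) pairs,
    assumed topologically ordered. Returns revised ready times and due dates."""
    r = {j: v['r'] for j, v in jobs.items()}
    d = {j: v['d'] for j, v in jobs.items()}
    for i, j in prec:              # forward pass over predecessors
        r[j] = max(r[j], r[i] + jobs[i]['p'])
    for i, j in reversed(prec):    # backward pass over successors
        d[i] = min(d[i], d[j] - jobs[j]['p'])
    return r, d
```

For example, with jobs 1 → 2 where (r, p, d) = (0, 3, 10) and (1, 2, 6), the revision gives r2' = 3 and d1' = 4.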
1 | prec | Lmax
• Revise only due dates, and apply EDD
1 | prec | max gj(Cj) -- extension of Theorem 1 of Ch. 3
• When the objective is to minimize the maximum penalty, job i may be
assigned the last position in sequence if job i has no unscheduled
successors and gi(P) ≤ gj(P) for all such jobs j ≠ i, where P = Σj pj .
Dependent Jobs
1 | prec | ΣCj
Strongly NP-hard for arbitrary precedence structure
Some special cases admit polynomial-time algorithms
• e.g., precedence structures built from strings and chains
Minimizing total flowtime with strings
String
• A set of jobs that must appear together (continuously) and in a fixed order
• e.g., 4 jobs with a string (1-2-3)
• Only two possible sequences: 1-2-3-4 or 4-1-2-3
Some applications
• Conflict between sorting and precedence constraints
• e.g., a single relevant precedence constraint i → j but pj < pi
  » j is preferred to i for the F criterion
  » There exists an optimal sequence in which jobs i and j are adjacent,
in that order (why?)
• Contiguity constraint, e.g., group of jobs with common major setup
• Chains and series-parallel network (discussed next)
Dependent Jobs
Minimizing total flowtime with strings (cont’d)
Problem with s strings
• nk -- number of jobs in string k (1 ≤ k ≤ s)
• pkj -- processing time of the jth job in string k (1 ≤ j ≤ nk)
Let
• pk = Σj=1..nk pkj -- total processing time in string k
• F(k, j) -- flowtime of the jth job in string k
• F(k) = F(k, nk) -- flowtime of string k
Objective -- to minimize the total flowtime (of jobs)
• Min. F = Σk=1..s Σj=1..nk F(k, j)
Theorem 6
• In the single-machine problem with job strings, total flowtime is minimized by
sequencing the strings in the order
p[1]/n[1] ≤ p[2]/n[2] ≤ ... ≤ p[s]/n[s]
Proof of Theorem 6
• F = Σk=1..s Σj=1..nk F(k, j) = Σk=1..s Σj=1..nk (F(k) – Σi=j+1..nk pki)
  = Σk=1..s nk F(k) – c, where c = Σk=1..s Σj=1..nk Σi=j+1..nk pki is a constant
independent of the string sequence; then the SPT argument applies with strings
treated as composite jobs
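Theorem 6 can be checked numerically against brute-force enumeration; the string data below are hypothetical:

```python
# Order strings by p(k)/n(k) (Theorem 6) and verify against enumeration.
from itertools import permutations

def total_flowtime(strings):
    """Total flowtime of the jobs when the strings run back to back."""
    t = F = 0
    for s in strings:
        for p in s:
            t += p
            F += t
    return F

def theorem6_order(strings):
    return sorted(strings, key=lambda s: sum(s) / len(s))

# hypothetical strings, each a tuple of job processing times
strings = [(4, 1), (2,), (3, 3, 1)]
```

Here the p/n ratios are 2.5, 2, and 7/3, so the order is (2), (3, 3, 1), (4, 1), giving total flowtime 51, which matches the minimum over all 6 string permutations.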
Dependent Jobs
Minimizing total flowtime with parallel chains
Chain
• Precedence structure in which each job has at most one direct predecessor
and one direct successor
• The jobs in a chain do not necessarily have to be sequenced contiguously
  (whereas the jobs in a string must be)
• Example with 9 jobs in three chains: 1 → 2 → 3, 4 → 5 → 6, 7 → 8 → 9
[Figure: the three chains, with processing times p1..p9 = 10, 4, 6, 5, 7, 1, 8, 4, 7]
• Feasible sequences: 4-1-2-3-7-8-9-5-6, 7-1-4-2-5-6-3-8-9, ...
ALGORITHM 3 -- Parallel Chain Algorithm for F
1. Initially, let each job be a string.
2. Find a pair of strings, u and v, such that u directly precedes v and pv /nv ≤ pu /nu .
Replace the pair by the string (u, v). Then repeat this step. When no such pair
can be found, proceed to Step 3.
3. Sort the strings in nondecreasing order of p/n. This yields an optimal schedule.
Justification of Algorithm 3
• Extended from Theorem 6 and related analysis
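Algorithm 3 can be sketched as below and run on the nine-job example; the chain structure follows the feasible sequences listed, and the processing times are assumed from the figure:

```python
# A sketch of Algorithm 3 (parallel chains): merge strings within each chain,
# then sort the resulting strings by p/n. Processing times assumed from figure.

def parallel_chains(chains, p):
    """chains: list of job lists in precedence order. Returns a sequence
    minimizing total flowtime."""
    all_strings = []
    for chain in chains:
        strings = [[j] for j in chain]        # Step 1: each job is a string
        merged = True
        while merged:                          # Step 2: merge u with successor v
            merged = False
            for k in range(len(strings) - 1):  # whenever p(v)/n(v) <= p(u)/n(u)
                u, v = strings[k], strings[k + 1]
                if sum(p[j] for j in v) / len(v) <= sum(p[j] for j in u) / len(u):
                    strings[k:k + 2] = [u + v]
                    merged = True
                    break
        all_strings += strings
    # Step 3: sort strings by p/n and concatenate
    all_strings.sort(key=lambda s: sum(p[j] for j in s) / len(s))
    return [j for s in all_strings for j in s]

chains = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
p = {1: 10, 2: 4, 3: 6, 4: 5, 5: 7, 6: 1, 7: 8, 8: 4, 9: 7}
```

With these data the strings become (1,2,3), (4,5,6), (7,8), and (9), with ratios 20/3, 13/3, 6, and 7, so the sequence is 4-5-6-7-8-1-2-3-9.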
Dependent Jobs
Minimizing total flowtime with parallel chains (cont’d)
Series-parallel precedence structure
• A network N with a single node, or one that can be partitioned into two
subnetworks N1 and N2 which are themselves series-parallel and where either:
  • N1 is in series with N2 (if i ∈ N1 and j ∈ N2 , then i → j), or
  • N1 is in parallel with N2 (if i ∈ N1 and j ∈ N2 , then neither i → j nor j → i)
• Example with 8 jobs
[Figure: an 8-job series-parallel network and its decomposition tree, with S (series) and P (parallel) internal nodes]
• Optimal sequence construction -- recursively apply the following from the leaves of the decomposition tree:
  • Series-type N: form the string (N1, N2)
  • Parallel-type N: apply Algorithm 3
Sequence-dependent Setup Times
1 | sjk | Cmax
Setup times that cannot be absorbed into a job’s processing time
Examples
• Production of different chemical compounds, colors of paint, strengths of
detergent, blends of fuels (with cleansing required for switching)
• Process line for four types of gasoline
• Setup times -- sij matrix
From \ To        Racing (1)   Premium (2)   Regular (3)   Unleaded (4)
Racing (1)            –            30            50            90
Premium (2)          40             –            20            80
Regular (3)          30            30             –            60
Unleaded (4)         20            15            10             –
• Makespan
  • 1-2-3-4-1: p1 + 30 + p2 + 20 + p3 + 60 + p4 + 20 = Σj pj + 30 + 20 + 60 + 20
  • 1-2-4-3-1: p1 + 30 + p2 + 80 + p4 + 10 + p3 + 30 = Σj pj + 30 + 80 + 10 + 30
  • ...
Objective -- minimizing the makespan is equivalent to minimizing the total setup time
• Min. M = F[n] + s[n],[n+1] = Σj=1..n+1 s[j–1],[j] + Σj=1..n pj  ⟺  min. Σj=1..n+1 s[j–1],[j]
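The equivalence can be checked numerically. A sketch with the setup matrix from the table and hypothetical (equal) processing times, since the pj contribute the same constant to every sequence:

```python
# Makespan of a cyclic product sequence: total processing time plus the
# sequence-dependent setups. Processing times are hypothetical placeholders.
INF = float("inf")
s = [[INF, 30, 50, 90],
     [40, INF, 20, 80],
     [30, 30, INF, 60],
     [20, 15, 10, INF]]          # s[i][j]: setup from product i+1 to product j+1
p = {1: 100, 2: 100, 3: 100, 4: 100}   # hypothetical processing times

def makespan(cycle):
    """cycle: product sequence returned to its start, e.g. [1, 2, 3, 4, 1]."""
    setups = sum(s[a - 1][b - 1] for a, b in zip(cycle, cycle[1:]))
    return sum(p[j] for j in cycle[:-1]) + setups
```

The two sequences from the slide give Σpj + 130 and Σpj + 150 respectively, so only the setup totals distinguish them.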
Sequence-dependent Setup Times
1 | sjk | Cmax (cont’d)
Strongly NP-hard
• Traveling salesman problem (TSP) reduces to 1 | sjk | Cmax .
TSP -- mathematical programming model
• Decision variables
• xij = 1, if path (i, j) is part of a tour
xij = 0, otherwise
• Objective
• z = Σi Σj≠i sij xij
• Constraints
• xij’s make a tour
setup times sij
        1    2    3    4
   1    –   30   50   90
   2   40    –   20   80
   3   30   30    –   60
   4   20   15   10    –

[Figure: directed-graph representation of the four cities, with arc lengths given by sij]
Representation of a solution by selection of paths
• Cost of a solution -- the length of the tour
  • Sum of the lengths of the selected paths
• Example
  • x12 = x24 = x43 = x31 = 1, and all other xij = 0 -- the tour 1-2-4-3(-1)
  • Length: s12 + s24 + s43 + s31 = 30 + 80 + 10 + 30 = 150
Sequence-dependent Setup Times
Dynamic programming solutions
Let
• n -- the number of cities
• X -- the set of all cities
• J -- a subset of X
• i -- an arbitrarily chosen origin of the tour
A representation of the optimal tour
• A sequence of sets {i}, S, {k}, J, {i}
• where i ≠ k, S ∩ J = ∅, |S| + |J| = n – 2, and {i, k} ∪ S ∪ J = X
• The tour begins at city i, proceeds through the cities in S, visits city k, then
proceeds through the cities in J, and finally returns to i.
Formulation
• f(k, J) = minj∈J {skj + f(j, J – {j})}
  • The length of the shortest path that starts at city k, passes through all the
cities in J, and finishes at city i
• f(k, ∅) = ski -- base case
• f(i, X – {i}) -- the length of the optimal tour
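The recursion above can be sketched with memoization, applied to the four-product setup matrix (a city corresponds to a product, and the tour length is the total setup time):

```python
# A sketch of the dynamic program f(k, J) on the four-product setup matrix.
from functools import lru_cache

s = {(1, 2): 30, (1, 3): 50, (1, 4): 90,
     (2, 1): 40, (2, 3): 20, (2, 4): 80,
     (3, 1): 30, (3, 2): 30, (3, 4): 60,
     (4, 1): 20, (4, 2): 15, (4, 3): 10}
origin = 1
cities = (1, 2, 3, 4)

@lru_cache(maxsize=None)
def f(k, J):
    """Shortest path from k through all cities in frozenset J, ending at origin."""
    if not J:
        return s[(k, origin)]                       # base case: f(k, {}) = s_ki
    return min(s[(k, j)] + f(j, J - {j}) for j in J)

optimal = f(origin, frozenset(cities) - {origin})   # length of the optimal tour
```

Here the optimal tour is 1-2-3-4-1 with total setup time 130; compare the length 150 of the tour 1-2-4-3-1 from the previous slide.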
Sequence-dependent Setup Times
Branch and bound solutions
Branching scheme
• Creating two subproblems at each level
• One containing a specific path constrained to be part of the solution
• The other subproblem prohibiting that same path
• e.g., a partition by solutions with (2,1) and solutions without (2,1), ...
[Figure: branch-and-bound tree for the four-city example. The root problem P is partitioned into one subproblem in which path (2,1) is required and one in which it is prohibited (marked X in the matrix); each child is partitioned further in the same way, e.g., on paths (2,3) and (3,4).]
Sequence-dependent Setup Times
Branch and bound solutions (cont’d)
Reduction of the sij matrix
• Subtract the minimum element of each row from that row
• Then subtract the minimum element of each column from that column
Lower bound
• The sum of the subtraction constants used in the reduction
A better bound can be obtained by solving the (relaxed) assignment problem
Example
• Root node (original problem) reduction: LB = 20

P -- sij
        1    2    3    4    5
   1    –    4    8    6    8
   2    5    –    7   11   13
   3   11    6    –    8    4
   4    5    7    2    –    2
   5   10    9    7    5    –

P (reduced) -- s'ij (row minima 4, 5, 4, 2, 5 subtracted; every column then already contains a zero)
        1    2    3    4    5
   1    –    0    4    2    4
   2    0    –    2    6    8
   3    7    2    –    4    0
   4    3    5    0    –    0
   5    5    4    2    0    –

• z = Σi Σj≠i sij xij = Σi Σj≠i s'ij xij + 4 + 5 + 4 + 2 + 5 = Σi Σj≠i s'ij xij + 20
Selection of the paths x12 = x21 = x35 = x43 = x54 = 1?
• All five are zeros of the reduced matrix, so Σi Σj≠i sij xij = Σi Σj≠i s'ij xij + 20 = 20
  » Optimal? No! The selection forms subtours (1-2-1 and 3-5-4-3), not a tour.
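The reduction and its lower bound can be sketched as below, on the five-city matrix P of the running example (entries as reconstructed above); the function name is illustrative:

```python
# Row-and-column reduction of a TSP cost matrix; the sum of the subtraction
# constants is a lower bound on any tour length.
INF = float("inf")

def reduce_matrix(m):
    """Returns (reduced copy, lower bound = sum of subtraction constants)."""
    m = [row[:] for row in m]
    lb = 0
    for row in m:                                    # row reduction
        c = min(x for x in row if x < INF)
        lb += c
        for j in range(len(row)):
            if row[j] < INF:
                row[j] -= c
    for j in range(len(m)):                          # column reduction
        c = min(m[i][j] for i in range(len(m)) if m[i][j] < INF)
        lb += c
        for i in range(len(m)):
            if m[i][j] < INF:
                m[i][j] -= c
    return m, lb

P = [[INF, 4, 8, 6, 8],
     [5, INF, 7, 11, 13],
     [11, 6, INF, 8, 4],
     [5, 7, 2, INF, 2],
     [10, 9, 7, 5, INF]]
```

On P the row constants are 4, 5, 4, 2, 5 and all column constants are zero, so LB = 20, matching the example.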
Sequence-dependent Setup Times
Branch and bound solutions (cont’d)
Justification of the reduction and lower bound
• Exactly one element of each row is contained in any solution
• Exactly one element of each column is contained in any solution
• The lower bound is the distance that is unavoidable in any solution
Example of reduction
• Original and reduced lengths (LB = 120)

original
   –   30   50   90
  40    –   20   80
  30   30    –   60
  20   15   10    –

row reduction (subtract 30, 20, 30, 10)
   –    0   20   60
  20    –    0   60
   0    0    –   30
  10    5    0    –

column reduction (subtract 0, 0, 0, 30) -- reduced
   –    0   20   30
  20    –    0   30
   0    0    –    0
  10    5    0    –

• LB = (30 + 20 + 30 + 10) + (0 + 0 + 0 + 30) = 120
• Analysis from the perspective of node 4
[Figure: graph views of the arc lengths before reduction, after row reduction by 30+20+30+10, and after column reduction by 0+0+0+30]
Sequence-dependent Setup Times
Branch and bound solutions (cont’d)
Branching scheme (cont’d)
• Select a zero element of the reduced matrix to define the two subproblems (why?)
• A possible element selection criterion
  • Choose the zero that would permit the largest further reduction when it is prohibited
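The criterion can be sketched as below on the reduced five-city matrix s' of the running example (entries as reconstructed earlier): for a zero at (i, j), prohibiting that path permits a further reduction equal to the smallest remaining element of row i plus the smallest remaining element of column j.

```python
# For each zero of the reduced matrix, compute the further reduction permitted
# when that path is prohibited, and pick the zero maximizing it.
INF = float("inf")
sp = [[INF, 0, 4, 2, 4],
      [0, INF, 2, 6, 8],
      [7, 2, INF, 4, 0],
      [3, 5, 0, INF, 0],
      [5, 4, 2, 0, INF]]

def prohibition_penalty(m, i, j):
    row = min(m[i][k] for k in range(len(m)) if k != j)   # row i without column j
    col = min(m[k][j] for k in range(len(m)) if k != i)   # column j without row i
    return row + col

zeros = [(i, j) for i in range(5) for j in range(5) if sp[i][j] == 0]
best = max(zeros, key=lambda z: prohibition_penalty(sp, *z))
```

Here the zero at row 2, column 1 (index (1, 0)) is selected with penalty 2 + 3 = 5, matching the branching on path (2,1) in the example.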
[Figure: worked branching example. From P (reduced by 20, LB = 20), the zero element (2,1) is selected; the subproblem prohibiting path (2,1) can be reduced by a further 5 (LB = 25), while the subproblem requiring (2,1) can be reduced by a further 4 (LB = 24).]
Sequence-dependent Setup Times
Heuristic solutions
Simple greedy procedures
• Closest unvisited city
• Variations
  • Closest unvisited city based on the reduced matrix (relative distance)
  • Closest unvisited pair of cities (using look-ahead)
  • Applying the procedure once with every city as origin, and keeping the best tour
Insertion procedure
1. Select two cities arbitrarily and form a partial tour.
2. Insert each remaining city, one at a time, at every possible position of the
current partial tour, and keep the best resulting tour.
3. Repeat Step 2 until a complete tour is constructed.
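Both heuristics can be sketched on the four-product setup matrix from earlier (function names are illustrative; neither heuristic is optimal in general, though both happen to reach the optimum of 130 here):

```python
# Sketches of the closest-unvisited-city and insertion heuristics on the
# four-product setup matrix (cities indexed 0..3).
INF = float("inf")
s = [[INF, 30, 50, 90],
     [40, INF, 20, 80],
     [30, 30, INF, 60],
     [20, 15, 10, INF]]

def tour_length(tour):
    """Length of the closed tour through the listed cities."""
    return sum(s[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

def nearest_neighbor(start=0):
    tour, left = [start], set(range(len(s))) - {start}
    while left:
        nxt = min(left, key=lambda j: s[tour[-1]][j])   # closest unvisited city
        tour.append(nxt)
        left.remove(nxt)
    return tour

def cheapest_insertion(a=0, b=1):
    tour = [a, b]                       # Step 1: an arbitrary two-city partial tour
    for j in range(len(s)):
        if j in tour:
            continue
        # Step 2: try every insertion position and keep the best partial tour
        tour = min((tour[:k] + [j] + tour[k:] for k in range(1, len(tour) + 1)),
                   key=tour_length)
    return tour
```

On this instance both heuristics return the tour 1-2-3-4 (indices 0-1-2-3) of length 130.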
General search methods
• A huge variety of general-purpose search methods can also be applied
Summary
Generalization of basic single-machine model
Greater applicability, but also new difficulties
Dynamic models
Job preemption
• Preempt-resume, preempt-repeat
Inserted idle times
Look-ahead procedures
Precedence constraints
Strings and chains
Series-parallel precedence
Sequence-dependent setup times
Traveling salesman problem
END OF SINGLE MACHINE!! AT LAST!!