Short Term Scheduling


Characteristics

- Planning horizon is short
- Multiple unique jobs (tasks) with varying processing times and due dates
- Multiple unique jobs sharing the same set of resources (machines)
- Time is treated as continuous (not discretized into periods)
- Varying objective functions

Characteristics (Continued…)

- Common in make-to-order environments with high product variety
- Common as a support tool for MRP in generating detailed schedules once orders have been released

Example

• Two jobs, A and B
• Two machines, M1 and M2
• Jobs are processed on M1 and then on M2
• Job A: 9 minutes on M1 and 2 minutes on M2
• Job B: 4 minutes on M1 and 9 minutes on M2

Example

[Gantt chart: sequence A then B on M1 and M2, time axis 0–22; makespan = 22.]

Example (Continued…)

[Gantt chart: sequence B then A on M1 and M2; makespan = 15.]
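A minimal sketch (the function name is illustrative) that reproduces the two makespans above from the job data on the "Example" slide:

```python
def flow_shop_makespan(sequence, times):
    """Makespan of a two-machine flow shop (every job visits M1, then M2).

    times[job] = (time on M1, time on M2).
    """
    m1_done = m2_done = 0
    for job in sequence:
        t1, t2 = times[job]
        m1_done += t1                          # job finishes on M1
        m2_done = max(m2_done, m1_done) + t2   # M2 starts when both job and M2 are free
    return m2_done

times = {"A": (9, 2), "B": (4, 9)}
print(flow_shop_makespan(["A", "B"], times))   # 22
print(flow_shop_makespan(["B", "A"], times))   # 15
```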

Challenge

As the number of jobs increases, complete enumeration becomes difficult: 3! = 6, 4! = 24, 5! = 120, 6! = 720, …, 10! = 3,628,800, while 13! = 6,227,020,800 and 25! = 15,511,210,043,330,985,984,000,000.
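A quick way to see this growth (a minimal sketch; the job counts are the ones quoted above):

```python
import math

# Number of possible sequences of n jobs on a single machine.
for n in (3, 4, 5, 6, 10, 13, 25):
    print(f"{n} jobs -> {math.factorial(n):,} possible sequences")
```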

Classification of Scheduling Problems

- Number of jobs
- Number of machines
- Type of production facility: single machine, flow shop, parallel machines, job shop
- Job arrivals: static or dynamic
- Performance measures

A Single Machine Example

Job j                  1    2    3    4    5    6
Processing time, pj   12    8    3   10    4   18
Release time, rj     -20  -15   12  -10    3    2
Due date, dj          10    2   72   -8    8   60


The Single Machine Problem

Single machine scheduling problems seek an optimal sequence (for a given criterion) in which to complete a given collection of jobs on a single machine that can accommodate only one job at a time.

Decision Variables

$x_j$: the time at which job j is started (relative to time 0 = now), with $x_j \ge \max(0, r_j)$ for all values of j.

Sequencing constraints

Either job j precedes job j′:
(start time of j) + (processing time of j) ≤ (start time of j′),
or job j′ precedes job j:
(start time of j′) + (processing time of j′) ≤ (start time of j).

In symbols: $x_j + p_j \le x_{j'}$ or $x_{j'} + p_{j'} \le x_j$.

Disjunctive variables

Introduce disjunctive variables $y_{jj'}$, where $y_{jj'} = 1$ if job j is scheduled before job j′ and $y_{jj'} = 0$ otherwise. Then

$x_j + p_j \le x_{j'} + M(1 - y_{jj'})$,
$x_{j'} + p_{j'} \le x_j + M y_{jj'}$,

for all pairs of j and j′ (for every j and every j′ > j), where M is a large positive constant.

Due date constraints

$x_j + p_j \le d_j$ for all values of j.

Examples of Performance Measures

- Completion time of job j: $x_j + p_j$
- Flow time of job j: $x_j + p_j - r_j$
- Lateness of job j: $x_j + p_j - d_j$
- Tardiness of job j: $\max\{0,\; x_j + p_j - d_j\}$

Example

Job               1    2    3
Processing time  15    6    9
Release time      5   10    0
Due date         20   25   36
Start time        9   24    0
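A minimal sketch that computes the four performance measures for this example, following the definitions on the previous slide:

```python
# Jobs from the example: job -> (processing time, release time, due date, start time)
jobs = {1: (15, 5, 20, 9), 2: (6, 10, 25, 24), 3: (9, 0, 36, 0)}

for j, (p, r, d, x) in jobs.items():
    completion = x + p
    flow = completion - r
    lateness = completion - d
    tardiness = max(0, lateness)
    print(f"job {j}: completion={completion}, flow={flow}, "
          f"lateness={lateness}, tardiness={tardiness}")
```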

Objective functions

- Maximum completion time: $\max_j \{x_j + p_j\}$
- Mean completion time: $\frac{1}{N}\sum_j (x_j + p_j)$
- Maximum flow time: $\max_j \{x_j + p_j - r_j\}$
- Mean flow time: $\frac{1}{N}\sum_j (x_j + p_j - r_j)$
- Maximum lateness: $\max_j \{x_j + p_j - d_j\}$
- Mean lateness: $\frac{1}{N}\sum_j (x_j + p_j - d_j)$
- Maximum tardiness: $\max_j \left[\max\{0,\; x_j + p_j - d_j\}\right]$
- Mean tardiness: $\frac{1}{N}\sum_j \max\{0,\; x_j + p_j - d_j\}$

Formulation: Minimizing Makespan (Maximum Completion Time)

$$
\begin{aligned}
\text{Minimize } \; & z = \max_j\,(x_j + p_j) \\
\text{subject to } \; & x_j + p_j \le x_{j'} + M(1 - y_{jj'}) && \text{for all } j,\ j' > j \\
& x_{j'} + p_{j'} \le x_j + M y_{jj'} && \text{for all } j,\ j' > j \\
& y_{jj'} \in \{0, 1\} && \text{for all } j,\ j' > j \\
& x_j \ge \max(0, r_j) && \text{for all } j
\end{aligned}
$$
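A sketch of this disjunctive formulation as a mixed-integer program, assuming the open-source PuLP package (with its bundled CBC solver) is available; the data is taken from the single machine example above, and the max in the objective is linearized with an auxiliary variable z, as in the linear-objective formulation on the next slide:

```python
import pulp

# Data from "A Single Machine Example"
p = {1: 12, 2: 8, 3: 3, 4: 10, 5: 4, 6: 18}      # processing times
r = {1: -20, 2: -15, 3: 12, 4: -10, 5: 3, 6: 2}  # release times
jobs = list(p)
M = sum(p.values()) + max(0, max(r.values()))    # a safely large constant

prob = pulp.LpProblem("single_machine_makespan", pulp.LpMinimize)
x = {j: pulp.LpVariable(f"x_{j}", lowBound=max(0, r[j])) for j in jobs}
y = {(j, k): pulp.LpVariable(f"y_{j}_{k}", cat="Binary")
     for j in jobs for k in jobs if k > j}
z = pulp.LpVariable("z", lowBound=0)

prob += z                                 # minimize the makespan
for j in jobs:
    prob += x[j] + p[j] <= z              # z is at least every completion time
for j, k in y:
    prob += x[j] + p[j] <= x[k] + M * (1 - y[j, k])   # either j before k ...
    prob += x[k] + p[k] <= x[j] + M * y[j, k]         # ... or k before j

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan:", pulp.value(z))
for j in sorted(jobs, key=lambda j: x[j].value()):
    print(f"job {j} starts at {x[j].value():.0f}")
```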

A Formulation with a Linear Objective Function

$$
\begin{aligned}
\text{Minimize } \; & f \\
\text{subject to } \; & x_j + p_j \le x_{j'} + M(1 - y_{jj'}) && \text{for all } j,\ j' > j \\
& x_{j'} + p_{j'} \le x_j + M y_{jj'} && \text{for all } j,\ j' > j \\
& x_j + p_j \le f && \text{for all } j \\
& y_{jj'} \in \{0, 1\} && \text{for all } j,\ j' > j \\
& x_j \ge \max(0, r_j) && \text{for all } j
\end{aligned}
$$

Similar formulations can be constructed with other min-max objective functions, such as minimizing maximum lateness or maximum tardiness.

Other objective functions involving minimizing means (other than mean tardiness) are already linear.

The Job Shop Scheduling Problem

- N jobs
- M machines
- Each job j visits a subset of the machines in a specified sequence

Notation

- $p_{jm}$: processing time of job j on machine m
- $x_{jm}$: start time of job j on machine m
- $y_{jj'm}$: equals 1 if job j is scheduled before job j′ on machine m, and 0 otherwise
- $M(j)$: the subset of the machines visited by job j
- $SS(m, j)$: the set of machines that job j visits after visiting machine m

Formulation

$$
\begin{aligned}
\text{Minimize } \; & z = \max_{j,\,m}\,(x_{jm} + p_{jm}) \\
\text{subject to } \; & x_{jm} + p_{jm} \le x_{j'm} + M(1 - y_{jj'm}) && \text{for all } j,\ j' > j,\ m \in M(j) \cap M(j') \\
& x_{j'm} + p_{j'm} \le x_{jm} + M y_{jj'm} && \text{for all } j,\ j' > j,\ m \in M(j) \cap M(j') \\
& x_{jm} + p_{jm} \le x_{jm'} && \text{for all } j,\ m' \in SS(m, j) \\
& y_{jj'm} \in \{0, 1\} && \text{for all } j,\ j' > j,\ m \in M(j) \cap M(j') \\
& x_{jm} \ge \max(0, r_j) && \text{for all } j,\ m \in M(j)
\end{aligned}
$$

Solution Methods

- Small to medium problems can be solved exactly (to optimality) using techniques such as branch and bound and dynamic programming.
- Structural results and polynomial-time (fast) algorithms exist for certain special cases.
- Large problems in general may not be solvable within a reasonable amount of time (the problem belongs to a class of combinatorial optimization problems called NP-hard).
- Large problems can be solved approximately using heuristic approaches.

Single Machine Results

- Makespan: not affected by sequence.
- Average flow time: minimized by performing jobs in "shortest processing time" (SPT) order.
- Average lateness: minimized by performing jobs in SPT order.
- Maximum lateness (or tardiness): minimized by performing jobs in "earliest due date" (EDD) order. If there exists a sequence with no tardy jobs, EDD will find one.

Single Machine Results (Continued…)

- Average weighted flow time: minimized by performing jobs in order of the smallest ratio of processing time to weight (weighted SPT).
- Average tardiness: no simple sequencing rule will work.
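These rules are easy to apply; a minimal sketch using the data from "A Single Machine Example" (the weights are hypothetical, since that example does not specify any):

```python
# Data from "A Single Machine Example"
p = {1: 12, 2: 8, 3: 3, 4: 10, 5: 4, 6: 18}    # processing times
d = {1: 10, 2: 2, 3: 72, 4: -8, 5: 8, 6: 60}   # due dates
w = {j: 1.0 for j in p}                        # weights (hypothetical: all equal)

spt = sorted(p, key=lambda j: p[j])            # shortest processing time first
edd = sorted(p, key=lambda j: d[j])            # earliest due date first
wspt = sorted(p, key=lambda j: p[j] / w[j])    # smallest processing time / weight ratio

print("SPT :", spt)    # [3, 5, 2, 4, 1, 6]
print("EDD :", edd)    # [4, 2, 5, 1, 6, 3]
print("WSPT:", wspt)
```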


Two Machine Results

Given a set of jobs that must go through a sequence of two machines, what sequence will yield the minimum makespan?

Johnson's Algorithm

A simple algorithm (Johnson 1954):
1. Sort the processing times of the jobs on the two machines into two lists.
2. Find the shortest processing time in either list and remove the corresponding job from both lists.
   – If the time came from the first list, place the job in the first available position in the sequence.
   – If the time came from the second list, place the job in the last available position in the sequence.
3. Repeat until all jobs have been sequenced.
The resulting sequence minimizes makespan.
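A minimal sketch of Johnson's rule (names illustrative); run on the data of the example that follows, it returns the sequence 1–2–3:

```python
def johnson_two_machine(times):
    """Johnson's rule for a two-machine flow shop.

    times: dict mapping job -> (time on M1, time on M2).
    Returns a job sequence that minimizes makespan.
    """
    front, back = [], []
    remaining = dict(times)
    while remaining:
        # shortest processing time on either machine among unscheduled jobs
        job = min(remaining, key=lambda j: min(remaining[j]))
        t1, t2 = remaining.pop(job)
        if t1 <= t2:
            front.append(job)     # time came from the M1 list: as early as possible
        else:
            back.insert(0, job)   # time came from the M2 list: as late as possible
    return front + back

times = {1: (4, 9), 2: (7, 10), 3: (6, 5)}    # data from the example below
print(johnson_two_machine(times))             # [1, 2, 3]
```

Combined with a two-machine makespan evaluator like the one sketched earlier, this reproduces the makespan of 28 reported in the example.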

Johnson's Algorithm Example

Data:
Job   Time on M1   Time on M2
 1        4             9
 2        7            10
 3        6             5

Sorted lists:
List 1 (M1)   List 2 (M2)
4 (job 1)     5 (job 3)
6 (job 3)     9 (job 1)
7 (job 2)    10 (job 2)

Iteration 1: the minimum time is 4 (job 1 on M1); place this job first and remove it from both lists.

Johnson's Algorithm Example (Continued…)

Remaining lists:
List 1 (M1)   List 2 (M2)
6 (job 3)     5 (job 3)
7 (job 2)    10 (job 2)

Iteration 2: the minimum time is 5 (job 3 on M2); place this job last and remove it from both lists.

Iteration 3: the only job left is job 2; place it in the remaining (middle) position.

Final sequence: 1–2–3. Makespan: 28.

Gantt Chart for Johnson's Algorithm Example

[Gantt chart: sequence 1–2–3 on Machine 1 and Machine 2, time axis 0–28. A short task on M1 "loads up" the shop quickly; a short task on M2 "clears out" the shop quickly.]

Three Machine Results

Johnson's algorithm can be extended to three machines by creating two composite machines, M1* = M1 + M2 and M2* = M2 + M3, and then applying Johnson's algorithm to these two composite machines.

Optimality is guaranteed only when certain conditions are met:
- the smallest processing time on M1 is greater than or equal to the largest processing time on M2, or
- the smallest processing time on M3 is greater than or equal to the largest processing time on M2.

Multi-Machine Results

Generate M-1 pairs of composite (dummy) machines. Example: with 4 machines, we have the following three pairs: (M1, M4), (M1+M2, M3+M4), (M1+M2+M3, M2+M3+M4).

Apply Johnson's algorithm to each pair and select the best resulting schedule out of the M-1 schedules generated, as in the sketch below.

Optimality is not guaranteed.
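A sketch of this composite-pair heuristic (the function names and the 4-machine data are illustrative); it reuses Johnson's rule together with a permutation flow-shop makespan evaluator:

```python
def johnson_two_machine(times):
    """Johnson's rule; times[job] = (time on composite machine 1, time on composite machine 2)."""
    front, back, remaining = [], [], dict(times)
    while remaining:
        job = min(remaining, key=lambda j: min(remaining[j]))
        t1, t2 = remaining.pop(job)
        if t1 <= t2:
            front.append(job)
        else:
            back.insert(0, job)
    return front + back

def flow_shop_makespan(sequence, proc):
    """Makespan of a permutation flow shop; proc[job] = times on M1..Mm in visiting order."""
    m = len(next(iter(proc.values())))
    done = [0] * m                    # completion time of the latest operation on each machine
    for job in sequence:
        for i in range(m):
            start = max(done[i], done[i - 1] if i else 0)
            done[i] = start + proc[job][i]
    return done[-1]

def composite_johnson(proc):
    """Apply Johnson's rule to the M-1 composite machine pairs; keep the best schedule."""
    m = len(next(iter(proc.values())))
    best_seq, best_mk = None, float("inf")
    for k in range(1, m):
        # pair k: (M1 + ... + Mk, M(m-k+1) + ... + Mm)
        pair = {j: (sum(t[:k]), sum(t[m - k:])) for j, t in proc.items()}
        seq = johnson_two_machine(pair)
        mk = flow_shop_makespan(seq, proc)
        if mk < best_mk:
            best_seq, best_mk = seq, mk
    return best_seq, best_mk

# Illustrative (made-up) data: 4 jobs on 4 machines.
proc = {"J1": (3, 6, 2, 4), "J2": (5, 2, 7, 3), "J3": (2, 4, 4, 6), "J4": (6, 3, 3, 2)}
print(composite_johnson(proc))
```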

Dispatching Rules

In general, simple sequencing rules (dispatching rules) do not lead to optimal schedules. However, they are often used to solve complex scheduling problems approximately (heuristically).

Basic approach: decompose a multi-machine problem (e.g., a job shop scheduling problem) into sub-problems, each involving a single machine, and use a simple dispatching rule to sequence the jobs on each of these machines.

Example Dispatching Rules

- FIFO: the simplest rule; seems "fair".
- SPT: actually works quite well with tight due dates.
- EDD: works well when jobs are mostly the same size.
- Critical ratio (time until due date / work remaining): works well for tardiness measures.
- Many (hundreds of) others.
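A minimal sketch of the decomposition idea from the previous slide: a non-delay list scheduler for a job shop that, whenever a machine becomes free, picks the next job by one of these dispatching rules (the function name and the 3-job routing data are illustrative):

```python
def dispatch_schedule(routes, due, rule="SPT"):
    """Non-delay list scheduling for a job shop with a dispatching rule.

    routes[job] = list of (machine, processing time) in visiting order.
    rule: "SPT", "EDD", or "CR" (critical ratio).
    Returns {(job, operation index): (start, finish)}.
    """
    next_op = {j: 0 for j in routes}       # index of each job's next operation
    job_ready = {j: 0.0 for j in routes}   # when the job finishes its previous operation
    mach_free = {}                         # when each machine becomes free
    schedule = {}

    def priority(j, now):
        machine, t = routes[j][next_op[j]]
        if rule == "SPT":
            return t
        if rule == "EDD":
            return due[j]
        work_left = sum(t for _, t in routes[j][next_op[j]:])
        return (due[j] - now) / work_left  # critical ratio: smaller = more urgent

    while any(next_op[j] < len(routes[j]) for j in routes):
        # earliest time at which some pending operation could start
        candidates = [(max(job_ready[j], mach_free.get(routes[j][next_op[j]][0], 0.0)), j)
                      for j in routes if next_op[j] < len(routes[j])]
        now = min(t for t, _ in candidates)
        ready = [j for t, j in candidates if t == now]
        j = min(ready, key=lambda j: priority(j, now))   # apply the dispatching rule
        machine, t = routes[j][next_op[j]]
        schedule[(j, next_op[j])] = (now, now + t)
        job_ready[j] = mach_free[machine] = now + t
        next_op[j] += 1
    return schedule

# Illustrative (made-up) job shop: job -> route of (machine, time) pairs.
routes = {"A": [("M1", 3), ("M2", 2)], "B": [("M2", 4), ("M1", 3)], "C": [("M1", 2), ("M2", 5)]}
due = {"A": 8, "B": 9, "C": 12}
for rule in ("SPT", "EDD", "CR"):
    sched = dispatch_schedule(routes, due, rule)
    print(rule, "makespan:", max(finish for _, finish in sched.values()))
```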

Heuristic Algorithms

Construction heuristics: use a procedure (a set of rules) to construct a good (but not necessarily optimal) schedule from scratch.

Improvement heuristics: starting from a feasible schedule (possibly obtained using a construction heuristic), use a procedure to further improve the schedule.

Example: A Single Machine with Setups

N jobs are to be scheduled on a single machine, with a sequence-dependent setup preceding the processing of each job.

The objective is to identify a sequence that minimizes makespan.

The problem is an instance of the Traveling Salesman Problem (TSP).

The problem is NP-hard (the number of computational steps required to solve it exactly grows exponentially with the number of jobs).


A Heuristic Algorithm

Greedy heuristic: Start with an arbitrary job from the set of N jobs. Schedule jobs subsequently based on “next shortest setup time.”

Improved greedy heuristic: Evaluate sequences with all possible starting jobs (N different schedules). Choose schedule with the shortest makespan.

Improved heuristic: Starting from the improved greedy heuristic solution carry out a series of pair-wise interchanges in the job sequence. Stop when solution stops improving.
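A minimal sketch of the greedy and improved greedy heuristics (all names and the setup data are illustrative; the initial setup before the first job is ignored here):

```python
def greedy_sequence(setup, start):
    """Greedy heuristic: after the current job, always pick the job with the
    next shortest setup time.  setup[i][j] = setup time when job j follows job i."""
    seq, unscheduled = [start], set(setup) - {start}
    while unscheduled:
        nxt = min(unscheduled, key=lambda j: setup[seq[-1]][j])
        seq.append(nxt)
        unscheduled.remove(nxt)
    return seq

def makespan(seq, setup, proc):
    """Total processing time plus the sequence-dependent setups between jobs."""
    return sum(proc[j] for j in seq) + sum(setup[a][b] for a, b in zip(seq, seq[1:]))

def improved_greedy(setup, proc):
    """Improved greedy heuristic: run the greedy heuristic from every possible
    starting job and keep the schedule with the shortest makespan."""
    return min((greedy_sequence(setup, s) for s in setup),
               key=lambda seq: makespan(seq, setup, proc))

# Illustrative (made-up) data: 4 jobs, equal processing times, asymmetric setups.
proc = {"A": 5, "B": 5, "C": 5, "D": 5}
setup = {"A": {"A": 0, "B": 2, "C": 6, "D": 3},
         "B": {"A": 4, "B": 0, "C": 1, "D": 5},
         "C": {"A": 3, "B": 2, "C": 0, "D": 2},
         "D": {"A": 1, "B": 6, "C": 4, "D": 0}}
best = improved_greedy(setup, proc)
print(best, makespan(best, setup, proc))
```

A pairwise-interchange improvement search of the kind used by the third heuristic is sketched with the tardiness example below.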

A Problem Instance

- 16 jobs
- Each job takes 1 hour on a single machine (the bottleneck resource)
- 4 hours of setup to change over from one job family to another
- Fixed due dates
- Find a solution that minimizes tardiness

EDD Sequence

Average tardiness: 10.375

Job   Family   Due Date   Completion Time   Lateness
  1      1        5.00         5.00            0.00
  2      1        6.00         6.00            0.00
  3      1       10.00         7.00           -3.00
  4      2       13.00        12.00           -1.00
  5      1       15.00        17.00            2.00
  6      2       15.00        22.00            7.00
  7      1       22.00        27.00            5.00
  8      2       22.00        32.00           10.00
  9      1       23.00        37.00           14.00
 10      3       29.00        42.00           13.00
 11      2       30.00        47.00           17.00
 12      2       31.00        48.00           17.00
 13      3       32.00        53.00           21.00
 14      3       32.00        54.00           22.00
 15      3       33.00        55.00           22.00
 16      3       40.00        56.00           16.00

A Greedy Search

- Consider all pair-wise interchanges.
- Choose the one that reduces average tardiness the most.
- Continue until no further improvement is possible (a sketch follows below).
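A sketch of this search on the 16-job instance (function names are illustrative). The completion-time model charges 1 hour per job plus a 4-hour setup before the first job and whenever the job family changes, which reproduces the completion times in the EDD table above:

```python
def avg_tardiness(seq, family, due, proc_time=1.0, setup=4.0):
    """Average tardiness of a single-machine sequence with family setup times."""
    t, total, prev = 0.0, 0.0, None
    for j in seq:
        if family[j] != prev:
            t += setup                 # changeover (also before the first job)
        t += proc_time
        total += max(0.0, t - due[j])  # tardiness of job j
        prev = family[j]
    return total / len(seq)

def pairwise_interchange(seq, family, due):
    """Steepest-descent search: repeatedly apply the best pairwise interchange."""
    seq = list(seq)
    best = avg_tardiness(seq, family, due)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            for k in range(i + 1, len(seq)):
                cand = list(seq)
                cand[i], cand[k] = cand[k], cand[i]
                val = avg_tardiness(cand, family, due)
                if val < best - 1e-9:
                    best, best_cand, improved = val, cand, True
        if improved:
            seq = best_cand
    return seq, best

# Family and due date of each of the 16 jobs (from the EDD table above).
family = {1: 1, 2: 1, 3: 1, 4: 2, 5: 1, 6: 2, 7: 1, 8: 2, 9: 1,
          10: 3, 11: 2, 12: 2, 13: 3, 14: 3, 15: 3, 16: 3}
due = {1: 5, 2: 6, 3: 10, 4: 13, 5: 15, 6: 15, 7: 22, 8: 22, 9: 23,
       10: 29, 11: 30, 12: 31, 13: 32, 14: 32, 15: 33, 16: 40}

edd = sorted(due, key=lambda j: due[j])
print("EDD average tardiness:", avg_tardiness(edd, family, due))   # 10.375
seq, val = pairwise_interchange(edd, family, due)
print("after pairwise interchanges:", val)
```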

First Interchange: Exchange Jobs 4 and 5

Average tardiness: 5.0 (a reduction of 5.375!)

Job   Family   Due Date   Completion Time   Lateness
  1      1        5.00         5.00            0.00
  2      1        6.00         6.00            0.00
  3      1       10.00         7.00           -3.00
  5      1       15.00         8.00           -7.00
  4      2       13.00        13.00            0.00
  6      2       15.00        14.00           -1.00
  7      1       22.00        19.00           -3.00
  8      2       22.00        24.00            2.00
  9      1       23.00        29.00            6.00
 10      3       29.00        34.00            5.00
 11      2       30.00        39.00            9.00
 12      2       31.00        40.00            9.00
 13      3       32.00        45.00           13.00
 14      3       32.00        46.00           14.00
 15      3       33.00        47.00           14.00
 16      3       40.00        48.00            8.00

Greedy Search Final Sequence

Average tardiness: 0.5 (9.875 lower than EDD)

Job   Family   Due Date   Completion Time   Lateness
  1      1        5.00         5.00            0.00
  2      1        6.00         6.00            0.00
  3      1       10.00         7.00           -3.00
  5      1       15.00         8.00           -7.00
  4      2       13.00        13.00            0.00
  6      2       15.00        14.00           -1.00
  8      2       22.00        15.00           -7.00
  7      1       22.00        20.00           -2.00
  9      1       23.00        21.00           -2.00
 11      2       30.00        26.00           -4.00
 12      2       31.00        27.00           -4.00
 10      3       29.00        32.00            3.00
 13      3       32.00        33.00            1.00
 14      3       32.00        34.00            2.00
 15      3       33.00        35.00            2.00
 16      3       40.00        36.00           -4.00

A Better (Due-Date Feasible) Sequence

Average tardiness: 0

Job   Family   Due Date   Completion Time   Lateness
  1      1        5.00         5.00            0.00
  2      1        6.00         6.00            0.00
  3      1       10.00         7.00           -3.00
  5      1       15.00         8.00           -7.00
  4      2       13.00        13.00            0.00
  6      2       15.00        14.00           -1.00
  8      2       22.00        15.00           -7.00
 11      2       30.00        16.00          -14.00
 12      2       31.00        17.00          -14.00
  7      1       22.00        22.00            0.00
  9      1       23.00        23.00            0.00
 13      3       32.00        28.00           -4.00
 10      3       29.00        29.00            0.00
 16      3       40.00        30.00          -10.00
 14      3       32.00        31.00           -1.00
 15      3       33.00        32.00           -1.00

Revisiting Computational Times

Current situation: computers can examine 1,000,000 sequences per second, and we wish to build a scheduling system that has a response time of no longer than one minute. How many jobs can we sequence optimally (using a brute-force approach)?

Number of Jobs   Computer Time
      5           0.12 millisec
      6           0.72 millisec
      7           5.04 millisec
      8          40.32 millisec
      9           0.36 sec
     10           3.63 sec
     11          39.92 sec
     12           7.98 min
     13           1.73 hr
     14          24.22 hr
     15          15.14 day
     20          77,147 years
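A minimal sketch that reproduces (up to rounding) the entries in this table and the next one, assuming a 365-day year:

```python
import math

SECONDS = {"year": 31_536_000, "day": 86_400, "hr": 3_600, "min": 60, "sec": 1}

def brute_force_time(n_jobs, sequences_per_second):
    """Time to enumerate all n! sequences at the given evaluation rate."""
    seconds = math.factorial(n_jobs) / sequences_per_second
    for unit, size in SECONDS.items():
        if seconds >= size:
            return f"{seconds / size:,.2f} {unit}"
    return f"{seconds * 1000:.2f} millisec"

for n in (10, 12, 15, 20):
    print(n, brute_force_time(n, 10**6), "|", brute_force_time(n, 10**9))
```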

Effect of Faster Computers

Future situation: new computers will be 1,000 times faster (i.e., they can examine 1 billion sequences per second). How many jobs can we sequence optimally now?

Number of Jobs   Computer Time
      5           0.12 microsec
      6           0.72 microsec
      7           5.04 microsec
      8          40.32 microsec
      9         362.88 microsec
     10           3.63 millisec
     11          39.92 millisec
     12         479.00 millisec
     13           6.23 sec
     14          87.18 sec
     15          21.79 min
     20          77.15 years

Implications for Real Problems

- Computation: exact algorithms for NP-hard problems are slow; no polynomial-time algorithms are known for them.
- No technology fix: faster computers do not help much on NP-hard problems.
- Exact algorithms: specialized algorithms that take advantage of the structure of the problem are needed to reduce the search space.
- Heuristics: likely to continue to be the dominant approach to solving large problems in practice (e.g., multi-step exchange algorithms, genetic algorithms, simulated annealing, tabu search, among others).

Implications for Real Problems (Continued…)

- Robustness: NP-hard problems have many solutions, and presumably many "good" ones. Example: in a 25-job sequencing problem, suppose that only one in a trillion of the possible sequences is "good"; that still leaves about 15 trillion "good" sequences. Our task is to find one of these.
- Focus on the bottleneck: we can often concentrate on scheduling the bottleneck process, which simplifies the problem closer to a single-machine case.
