Greedy Algorithms: 4.1 Interval Scheduling

Uploaded by

Huanye Liu
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
100 views13 pages

Greedy Algorithms: 4.1 Interval Scheduling

Uploaded by

Huanye Liu
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 13

4.1 Interval Scheduling

Chapter 4
Greedy Algorithms

Slides by Kevin Wayne.
Copyright © 2005 Pearson-Addison Wesley. All rights reserved.

Interval Scheduling

Interval scheduling.
 Job j starts at sj and finishes at fj.
 Two jobs are compatible if they don't overlap.
 Goal: find a maximum subset of mutually compatible jobs.

[Figure: jobs a through h drawn as intervals on a time line from 0 to 11.]

Interval Scheduling: Greedy Algorithms

Greedy template. Consider jobs in some natural order.
Take each job provided it's compatible with the ones already taken.

 [Earliest start time] Consider jobs in ascending order of sj.
 [Earliest finish time] Consider jobs in ascending order of fj.
 [Shortest interval] Consider jobs in ascending order of fj - sj.
 [Fewest conflicts] For each job j, count the number of
  conflicting jobs cj. Schedule in ascending order of cj.
Interval Scheduling: Greedy Algorithms

Greedy template. Consider jobs in some natural order.
Take each job provided it's compatible with the ones already taken.

[Figure: counterexamples for earliest start time, shortest interval,
and fewest conflicts.]

Interval Scheduling: Greedy Algorithm

Greedy algorithm. Consider jobs in increasing order of finish time.
Take each job provided it's compatible with the ones already taken.

   Sort jobs by finish times so that f1 ≤ f2 ≤ ... ≤ fn.

   A ← ∅                      (set of jobs selected)
   for j = 1 to n {
      if (job j compatible with A)
         A ← A ∪ {j}
   }
   return A

Implementation. O(n log n).
 Remember job j* that was added last to A.
 Job j is compatible with A if sj ≥ fj*.

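The earliest-finish-time rule above is only a few lines of Python. A minimal sketch; the job list is illustrative, not the exact instance drawn on the slide:

```python
# Earliest-finish-time greedy for interval scheduling.
# Sorting dominates the cost: O(n log n).

def interval_schedule(jobs):
    """jobs: list of (start, finish) pairs. Returns a maximum
    subset of mutually compatible jobs, in scheduled order."""
    selected = []
    last_finish = float("-inf")   # finish time f_{j*} of last job taken
    for s, f in sorted(jobs, key=lambda job: job[1]):  # ascending f_j
        if s >= last_finish:      # compatible with the jobs already taken
            selected.append((s, f))
            last_finish = f
    return selected

jobs = [(0, 6), (1, 4), (3, 5), (3, 8), (4, 7), (5, 9), (6, 10), (8, 11)]
print(interval_schedule(jobs))  # → [(1, 4), (4, 7), (8, 11)]
```

Keeping only the finish time of the last accepted job is exactly the implementation note on the slide: job j is compatible with A iff sj ≥ fj*.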

Interval Scheduling: Analysis

Theorem. Greedy algorithm is optimal.

Pf. (by contradiction)
 Assume greedy is not optimal, and let's see what happens.
 Let i1, i2, ..., ik denote the set of jobs selected by greedy.
 Let j1, j2, ..., jm denote the set of jobs in the optimal solution with
  i1 = j1, i2 = j2, ..., ir = jr for the largest possible value of r.
 Job ir+1 finishes before jr+1, so why not replace job jr+1 with job ir+1?
 The resulting solution is still feasible and optimal, but it agrees with
  greedy on r+1 jobs, contradicting the maximality of r. ▪

   Greedy:  i1  i2  ...  ir  ir+1
   OPT:     j1  j2  ...  jr  jr+1  ...
4.1 Interval Partitioning

Interval Partitioning

Interval partitioning.
 Lecture j starts at sj and finishes at fj.
 Goal: find minimum number of classrooms to schedule all lectures
  so that no two occur at the same time in the same room.

Ex: This schedule uses 4 classrooms to schedule 10 lectures.

[Figure: classroom 4: e, j; classroom 3: c, d, g; classroom 2: b, h;
classroom 1: a, f, i; time line from 9am to 4:30pm.]

Interval Partitioning

Interval partitioning.
 Lecture j starts at sj and finishes at fj.
 Goal: find minimum number of classrooms to schedule all lectures
  so that no two occur at the same time in the same room.

Ex: This schedule uses only 3.

[Figure: classroom 3: c, d, f, j; classroom 2: b, g, i;
classroom 1: a, e, h; time line from 9am to 4:30pm.]

Interval Partitioning: Lower Bound on Optimal Solution

Def. The depth of a set of open intervals is the maximum number that
contain any given time.

Key observation. Number of classrooms needed ≥ depth.

Ex: Depth of schedule below = 3 (a, b, c all contain 9:30)
⇒ schedule below is optimal.

Q. Does there always exist a schedule equal to depth of intervals?
Interval Partitioning: Greedy Algorithm

Greedy algorithm. Consider lectures in increasing order of start time:
assign lecture to any compatible classroom.

   Sort intervals by starting time so that s1 ≤ s2 ≤ ... ≤ sn.

   d ← 0                      (number of allocated classrooms)
   for j = 1 to n {
      if (lecture j is compatible with some classroom k)
         schedule lecture j in classroom k
      else
         allocate a new classroom d + 1
         schedule lecture j in classroom d + 1
         d ← d + 1
   }

Implementation. O(n log n).
 For each classroom k, maintain the finish time of the last job added.
 Keep the classrooms in a priority queue.

Interval Partitioning: Greedy Analysis

Observation. Greedy algorithm never schedules two incompatible
lectures in the same classroom.

Theorem. Greedy algorithm is optimal.

Pf.
 Let d = number of classrooms that the greedy algorithm allocates.
 Classroom d is opened because we needed to schedule a job, say j,
  that is incompatible with all d-1 other classrooms.
 These d jobs each end after sj.
 Since we sorted by start time, all these incompatibilities are caused
  by lectures that start no later than sj.
 Thus, we have d lectures overlapping at time sj + ε.
 Key observation ⇒ all schedules use ≥ d classrooms. ▪

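The priority-queue implementation suggested above fits in a short Python sketch using `heapq`: keep one heap entry per open classroom, holding the finish time of its last lecture. The lecture times below are a transcription of the slide's 10-lecture example (times as hours, e.g. 10.5 = 10:30am); treat them as illustrative.

```python
import heapq

def partition_intervals(lectures):
    """lectures: list of (start, finish). Returns the number of
    classrooms used, assigning each lecture to a room that is free."""
    rooms = []  # min-heap of finish times, one entry per open classroom
    for s, f in sorted(lectures):              # ascending start time
        if rooms and rooms[0] <= s:            # some room is free: reuse it
            heapq.heapreplace(rooms, f)
        else:                                  # all rooms busy: open a new one
            heapq.heappush(rooms, f)
    return len(rooms)

lectures = [(9, 10.5), (9, 12.5), (9, 10.5), (11, 12.5), (11, 14),
            (13, 14.5), (13, 14.5), (14, 16.5), (15, 16.5), (15, 16.5)]
print(partition_intervals(lectures))  # → 3
```

With the heap, each lecture costs O(log n), for O(n log n) overall, matching the implementation note on the slide.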

4.2 Scheduling to Minimize Lateness

Scheduling to Minimize Lateness

Minimizing lateness problem.
 Single resource processes one job at a time.
 Job j requires tj units of processing time and is due at time dj.
 If j starts at time sj, it finishes at time fj = sj + tj.
 Lateness: lj = max { 0, fj - dj }.
 Goal: schedule all jobs to minimize maximum lateness L = max lj.

Ex:
         1    2    3    4    5    6
    tj   3    2    1    4    3    2
    dj   6    8    9    9   14   15

[Figure: the schedule d3, d2, d6, d1, d5, d4 on a time line from 0 to 15;
job 1 has lateness 2, job 6 has lateness 0, and max lateness = 6.]
Minimizing Lateness: Greedy Algorithms

Greedy template. Consider jobs in some order.

 [Shortest processing time first] Consider jobs in ascending order
  of processing time tj.

   counterexample:        1    2
                    tj    1   10
                    dj  100   10

 [Earliest deadline first] Consider jobs in ascending order of
  deadline dj.

 [Smallest slack] Consider jobs in ascending order of slack dj - tj.

   counterexample:        1    2
                    tj    1   10
                    dj    2   10

Minimizing Lateness: Greedy Algorithm

Greedy algorithm. Earliest deadline first.

   Sort n jobs by deadline so that d1 ≤ d2 ≤ … ≤ dn

   t ← 0
   for j = 1 to n
      Assign job j to interval [t, t + tj]
      sj ← t, fj ← t + tj
      t ← t + tj
   output intervals [sj, fj]

[Figure: the EDF schedule d1 = 6, d2 = 8, d3 = 9, d4 = 9, d5 = 14,
d6 = 15 on a time line from 0 to 15; max lateness = 1.]

Minimizing Lateness: No Idle Time

Observation. There exists an optimal schedule with no idle time.

[Figure: a schedule with gaps between jobs (d = 4, d = 6, d = 12), and
the same jobs compacted so the machine is never idle.]

Observation. The greedy schedule has no idle time.

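Earliest-deadline-first can be sketched directly from the pseudocode. Running it on the six-job example from these slides (tj = 3, 2, 1, 4, 3, 2 and dj = 6, 8, 9, 9, 14, 15) reproduces the max lateness of 1 shown in the figure:

```python
def edf_schedule(jobs):
    """jobs: list of (t_j, d_j) pairs (processing time, deadline).
    Returns (intervals, max_lateness) under earliest-deadline-first."""
    t = 0
    intervals, max_lateness = [], 0      # lateness l_j = max{0, f_j - d_j}
    for tj, dj in sorted(jobs, key=lambda job: job[1]):  # ascending d_j
        s, f = t, t + tj                 # schedule job with no idle time
        intervals.append((s, f))
        max_lateness = max(max_lateness, f - dj)
        t = f
    return intervals, max_lateness

jobs = [(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)]
print(edf_schedule(jobs)[1])  # → 1
```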
Minimizing Lateness: Inversions

Def. Given a schedule S, an inversion is a pair of jobs i and j such that:
i < j but j is scheduled before i.
[ as before, we assume jobs are numbered so that d1 ≤ d2 ≤ … ≤ dn ]

Observation. Greedy schedule has no inversions.

Observation. If a schedule (with no idle time) has an inversion, it has
one with a pair of inverted jobs scheduled consecutively.

Claim. Swapping two consecutive, inverted jobs reduces the number of
inversions by one and does not increase the max lateness.

[Figure: jobs j and i adjacent before the swap, with i finishing at fi;
after the swap, i precedes j and j finishes at time f'j = fi.]

Pf. Let lk be the lateness of job k before the swap, and l'k afterwards.
 l'k = lk for all k ≠ i, j.
 l'i ≤ li.
 If job j is late:
    l'j = f'j - dj     (definition)
        = fi - dj      (j now finishes at time fi)
        ≤ fi - di      (i < j, so di ≤ dj)
        ≤ li           (definition)  ▪

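The swap claim is easy to check numerically. A minimal sketch with a hypothetical two-job instance (jobs numbered so that d1 ≤ d2): scheduling job 2 before job 1 is an inversion, and swapping them does not increase the max lateness.

```python
def max_lateness(order, jobs):
    """jobs: {name: (t, d)}; order: job names run back-to-back, no idle time.
    Returns L = max over jobs of max{0, finish - deadline}."""
    t, L = 0, 0
    for name in order:
        tj, dj = jobs[name]
        t += tj
        L = max(L, t - dj)
    return L

jobs = {1: (3, 3), 2: (2, 8)}          # hypothetical instance, d1 ≤ d2
print(max_lateness([2, 1], jobs))      # inverted order  → 2
print(max_lateness([1, 2], jobs))      # after the swap  → 0
```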

Minimizing Lateness: Analysis of Greedy Algorithm

Theorem. Greedy schedule S is optimal.
Pf. Define S* to be an optimal schedule that has the fewest number of
inversions, and let's see what happens.
 Can assume S* has no idle time.
 If S* has no inversions, then S = S*.
 If S* has an inversion, let i-j be an adjacent inversion.
  – swapping i and j does not increase the maximum lateness and
    strictly decreases the number of inversions
  – this contradicts the definition of S* ▪

Greedy Analysis Strategies

Greedy algorithm stays ahead. Show that after each step of the greedy
algorithm, its solution is at least as good as any other algorithm's.

Structural. Discover a simple "structural" bound asserting that every
possible solution must have a certain value. Then show that your
algorithm always achieves this bound.

Exchange argument. Gradually transform any solution to the one found
by the greedy algorithm without hurting its quality.

Other greedy algorithms. Kruskal, Prim, Dijkstra, Huffman, …
4.3 Optimal Caching

Optimal Offline Caching

Caching.
 Cache with capacity to store k items.
 Sequence of m item requests d1, d2, …, dm.
 Cache hit: item already in cache when requested.
 Cache miss: item not already in cache when requested: must bring
  requested item into cache, and evict some existing item, if full.

Goal. Eviction schedule that minimizes number of cache misses.

Ex: k = 2, initial cache = ab, requests: a, b, c, b, c, a, a, b.
Optimal eviction schedule: 2 cache misses.

   request   cache
      a       a b
      b       a b
      c       c b    (miss)
      b       c b
      c       c b
      a       a b    (miss)
      a       a b
      b       a b

Optimal Offline Caching: Farthest-In-Future

Farthest-in-future. Evict the item in the cache that is not requested
until farthest in the future.

   current cache:  a b c d e f
   future queries: g a b c e d a b b a c d e a f a d e f g h ...

On a cache miss, eject f: of the cached items, it is the one whose next
request lies farthest in the future.

Theorem. [Belady, 1960s] FF is an optimal eviction schedule.
Pf. Algorithm and theorem are intuitive; proof is subtle.

Reduced Eviction Schedules

Def. A reduced schedule is a schedule that only inserts an item into
the cache in a step in which that item is requested.

Intuition. Can transform an unreduced schedule into a reduced one
with no more cache misses.

[Figure: side-by-side cache traces on the same request sequence: an
unreduced schedule that at one point loads x without a request, and a
reduced schedule with no more misses.]

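A direct simulation of Belady's farthest-in-future rule is a useful sanity check; on the earlier k = 2 example (initial cache {a, b}, requests a, b, c, b, c, a, a, b) it achieves the 2 misses the slides call optimal. A minimal sketch, not tuned for efficiency:

```python
def farthest_in_future(k, requests, initial):
    """Simulate farthest-in-future eviction; return the miss count.
    k = cache capacity; initial = items already in the cache."""
    cache = list(initial)
    misses = 0
    for i, item in enumerate(requests):
        if item in cache:
            continue                        # cache hit
        misses += 1
        if len(cache) < k:
            cache.append(item)
            continue
        # Evict the cached item whose next request is farthest away
        # (infinity if it is never requested again).
        def next_use(x):
            rest = requests[i + 1:]
            return rest.index(x) if x in rest else float("inf")
        cache.remove(max(cache, key=next_use))
        cache.append(item)
    return misses

# Slide example: k = 2, cache starts as {a, b}.
print(farthest_in_future(2, list("abcbcaab"), ["a", "b"]))  # → 2
```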
Reduced Eviction Schedules

Claim. Given any unreduced schedule S, can transform it into a reduced
schedule S' with no more cache misses.
Pf. (by induction on number of unreduced items)
 Suppose S brings d into the cache at time t, without a request.
 Let c be the item S evicts when it brings d into the cache.
 Case 1: d evicted at time t', before the next request for d.
 Case 2: d requested at time t', before d is evicted. ▪

[Figure: cache traces of S and S' in each case; in both, S' defers
bringing in d, so it doesn't enter the cache until it is requested.]

Farthest-In-Future: Analysis

Theorem. FF is an optimal eviction algorithm.
Pf. (by induction on number of requests j)

Invariant: There exists an optimal reduced schedule S that makes
the same eviction decisions as SFF through the first j+1 requests.

Let S be a reduced schedule that satisfies the invariant through j
requests. We produce S' that satisfies the invariant after j+1 requests.
 Consider the (j+1)st request d = dj+1.
 Since S and SFF have agreed up until now, they have the same cache
  contents before request j+1.
 Case 1: (d is already in the cache). S' = S satisfies the invariant.
 Case 2: (d is not in the cache and S and SFF evict the same element).
  S' = S satisfies the invariant.

Farthest-In-Future: Analysis

Pf. (continued)
 Case 3: (d is not in the cache; SFF evicts e; S evicts f ≠ e).
  – begin construction of S' from S by evicting e instead of f

    [Figure: after request j+1, S holds e and d where S' holds d and f;
    the caches otherwise agree.]

  – now S' agrees with SFF on the first j+1 requests; we show that
    having element f in the cache is no worse than having element e

Let j' be the first time after j+1 that S and S' take a different action,
and let g be the item requested at time j'. The differing action must
involve e or f (or both).

 Case 3a: g = e. Can't happen with farthest-in-future, since there
  must be a request for f before e.
 Case 3b: g = f. Element f can't be in the cache of S, so let e' be the
  element that S evicts.
  – if e' = e, S' accesses f from the cache; now S and S' have the
    same cache
  – if e' ≠ e, S' evicts e' and brings e into the cache; now S and S'
    have the same cache

Note: S' is no longer reduced, but it can be transformed into
a reduced schedule that agrees with SFF through step j+1.
Farthest-In-Future: Analysis

 Case 3c: g ≠ e, f. S must evict e (otherwise S' would take the same
  action). Make S' evict f; now S and S' have the same cache. ▪

Caching Perspective

Online vs. offline algorithms.
 Offline: full sequence of requests is known a priori.
 Online (reality): requests are not known in advance.
 Caching is among the most fundamental online problems in CS.

LIFO. Evict page brought in most recently.
LRU. Evict page whose most recent access was earliest.
     (FF with the direction of time reversed!)

Theorem. FF is optimal offline eviction algorithm.
 Provides basis for understanding and analyzing online algorithms.
 LRU is k-competitive. [Section 13.8]
 LIFO is arbitrarily bad.

4.4 Shortest Paths in a Graph

Shortest Path Problem

Shortest path network.
 Directed graph G = (V, E).
 Source s, destination t.
 Length le = length of edge e.

Shortest path problem: find the shortest directed path from s to t,
where the cost of a path is the sum of its edge costs.

[Figure: a network on nodes s, 2, 3, 4, 5, 6, 7, t with edge lengths;
cost of path s-2-3-5-t = 9 + 23 + 2 + 16 = 50. Caption: shortest path
from Princeton CS department to Einstein's house.]
Dijkstra's Algorithm

Dijkstra's algorithm.
 Maintain a set of explored nodes S for which we have determined
  the shortest path distance d(u) from s to u.
 Initialize S = { s }, d(s) = 0.
 Repeatedly choose the unexplored node v which minimizes

    π(v) = min { d(u) + le : e = (u, v), u ∈ S },

  the length of a shortest path to some u in the explored part,
  followed by a single edge (u, v).
 Add v to S, and set d(v) = π(v).

[Figure: explored set S containing s and u; edge (u, v) of length le
crossing from S to the unexplored node v.]

Dijkstra's Algorithm: Proof of Correctness

Invariant. For each node u ∈ S, d(u) is the length of the shortest s-u path.
Pf. (by induction on |S|)
Base case: |S| = 1 is trivial.
Inductive hypothesis: Assume true for |S| = k ≥ 1.
 Let v be the next node added to S, and let u-v be the chosen edge.
 The shortest s-u path plus (u, v) is an s-v path of length π(v).
 Consider any s-v path P. We'll see that it's no shorter than π(v).
 Let x-y be the first edge in P that leaves S, and let P' be the
  subpath to x.
 P is already too long as soon as it leaves S:

    l(P) ≥ l(P') + l(x, y) ≥ d(x) + l(x, y) ≥ π(y) ≥ π(v)

  (nonnegative weights; inductive hypothesis; definition of π(y);
   Dijkstra chose v instead of y)  ▪

Dijkstra's Algorithm: Implementation

For each unexplored node, explicitly maintain

    π(v) = min { d(u) + le : e = (u, v), u ∈ S }.

 Next node to explore = node with minimum π(v).
 When exploring v, for each incident edge e = (v, w), update

    π(w) = min { π(w), π(v) + le }.

Efficient implementation. Maintain a priority queue of unexplored
nodes, prioritized by π(v).

Priority queue costs († = individual ops are amortized bounds):

   PQ Operation   # ops   Array    Binary heap   d-way heap       Fib heap †
   Insert           n       n       log n         d log_d n        1
   ExtractMin       n       n       log n         d log_d n        log n
   ChangeKey        m       1       log n         log_d n          1
   IsEmpty          n       1       1             1                1
   Total                    n²      m log n       m log_{m/n} n    m + n log n

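The priority-queue implementation above can be sketched in Python with `heapq`, using lazy deletion in place of ChangeKey (re-insert with the smaller key and skip stale entries). The adjacency list below is reconstructed from the slide's figure and should be treated as illustrative; it does recover the stated s-2-3-5-t cost of 50.

```python
import heapq

def dijkstra(graph, s):
    """graph: {u: [(v, length), ...]} with nonnegative lengths.
    Returns a dict of shortest-path distances from s."""
    dist = {}
    pq = [(0, s)]                      # (tentative distance pi(v), node)
    while pq:
        d, u = heapq.heappop(pq)
        if u in dist:                  # stale entry: u already explored
            continue
        dist[u] = d                    # d(u) is final (proof of correctness)
        for v, le in graph.get(u, []):
            if v not in dist:
                heapq.heappush(pq, (d + le, v))
    return dist

graph = {  # edge lengths adapted from the figure (illustrative)
    "s": [("2", 9), ("6", 14), ("7", 15)],
    "2": [("3", 23)],
    "3": [("5", 2), ("t", 19)],
    "4": [("t", 6)],
    "5": [("4", 11), ("t", 16)],
    "6": [("3", 18), ("5", 30), ("7", 5)],
    "7": [("5", 20), ("t", 44)],
}
print(dijkstra(graph, "s")["t"])  # → 50
```

With a binary heap this runs in O(m log n), matching the table above.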
Edsger W. Dijkstra

 "The question of whether computers can think is like the
  question of whether submarines can swim."

 "Do only what only you can do."

 "In their capacity as a tool, computers will be but a ripple
  on the surface of our culture. In their capacity as
  intellectual challenge, they are without precedent in the
  cultural history of mankind."

 "The use of COBOL cripples the mind; its teaching should,
  therefore, be regarded as a criminal offence."

 "APL is a mistake, carried through to perfection. It is the
  language of the future for the programming techniques
  of the past: it creates a new generation of coding bums."

Extra Slides

Coin Changing

"Greed is good. Greed is right. Greed works. Greed clarifies,
cuts through, and captures the essence of the evolutionary spirit."
   - Gordon Gekko (Michael Douglas)

Goal. Given currency denominations 1, 5, 10, 25, 100, devise a method
to pay an amount to a customer using the fewest number of coins.
Ex: 34¢.

Cashier's algorithm. At each iteration, add the coin of the largest value
that does not take us past the amount to be paid.
Ex: $2.89.
Coin-Changing: Greedy Algorithm

Cashier's algorithm. At each iteration, add the coin of the largest value
that does not take us past the amount to be paid.

   Sort coin denominations by value: c1 < c2 < … < cn.

   S ← ∅                      (coins selected)
   while (x ≠ 0) {
      let k be largest integer such that ck ≤ x
      if (k = 0)
         return "no solution found"
      x ← x - ck
      S ← S ∪ {k}
   }
   return S

Q. Is cashier's algorithm optimal?

Coin-Changing: Analysis of Greedy Algorithm

Theorem. Greedy algorithm is optimal for U.S. coinage: 1, 5, 10, 25, 100.
Pf. (by induction on x)
 Consider the optimal way to change ck ≤ x < ck+1: greedy takes coin k.
 We claim that any optimal solution must also take coin k.
  – if not, it needs enough coins of type c1, …, ck-1 to add up to x
  – the table below indicates no optimal solution can do this
 The problem reduces to coin-changing x - ck cents, which, by induction,
  is optimally solved by the greedy algorithm. ▪

   k    ck    All optimal solutions    Max value of coins
              must satisfy             1, 2, …, k-1 in any OPT
   1     1    P ≤ 4                    -
   2     5    N ≤ 1                    4
   3    10    N + D ≤ 2                4 + 5 = 9
   4    25    Q ≤ 3                    20 + 4 = 24
   5   100    no limit                 75 + 24 = 99

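The cashier's algorithm is a one-screen sketch in Python. Run on U.S. coinage it changes 34¢ optimally; run on the postal denominations discussed in these slides it exhibits the 140¢ counterexample:

```python
def cashier(x, denominations):
    """Greedy change-making: repeatedly take the largest coin <= x.
    Returns the list of coins used, or None if exact change is impossible."""
    coins = []
    for c in sorted(denominations, reverse=True):
        while x >= c:
            x -= c
            coins.append(c)
    return coins if x == 0 else None

print(cashier(34, [1, 5, 10, 25, 100]))        # → [25, 5, 1, 1, 1, 1]
print(cashier(140, [1, 10, 21, 34, 70, 100]))  # → [100, 34, 1, 1, 1, 1, 1, 1]
# Greedy uses 8 stamps for 140¢ of postage, but [70, 70] uses only 2.
```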

Coin-Changing: Analysis of Greedy Algorithm

Observation. Greedy algorithm is sub-optimal for U.S. postal
denominations: 1, 10, 21, 34, 70, 100, 350, 1225, 1500.

Counterexample. 140¢.
 Greedy: 100, 34, 1, 1, 1, 1, 1, 1.
 Optimal: 70, 70.

Selecting Breakpoints
Selecting Breakpoints

Selecting breakpoints.
 Road trip from Princeton to Palo Alto along a fixed route.
 Refueling stations at certain points along the way.
 Fuel capacity = C.
 Goal: make as few refueling stops as possible.

Greedy algorithm. Go as far as you can before refueling.

[Figure: a route with breakpoints 1-7 between Princeton and Palo Alto;
the truck covers distance at most C between consecutive stops.]

Selecting Breakpoints: Greedy Algorithm

Truck driver's algorithm.

   Sort breakpoints so that: 0 = b0 < b1 < b2 < ... < bn = L

   S ← {0}                    (breakpoints selected)
   x ← 0                      (current location)
   while (x ≠ bn)
      let p be largest integer such that bp ≤ x + C
      if (bp = x)
         return "no solution"
      x ← bp
      S ← S ∪ {p}
   return S

Implementation. O(n log n).
 Use binary search to select each breakpoint p.

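The truck driver's algorithm maps onto Python's `bisect` for the binary-search step noted above. The breakpoint distances here are made up for illustration; only the structure (sorted breakpoints starting at 0 and ending at L) comes from the slide:

```python
import bisect

def select_breakpoints(breakpoints, C):
    """breakpoints: sorted list with breakpoints[0] = 0 and
    breakpoints[-1] = L (the destination). C = distance per full tank.
    Greedy: drive as far as possible before refueling.
    Returns the stop locations (including 0 and L), or None if stuck."""
    stops = [0]
    x = 0
    while x != breakpoints[-1]:
        # Largest p with b_p <= x + C, via binary search.
        p = bisect.bisect_right(breakpoints, x + C) - 1
        if breakpoints[p] == x:        # can't reach the next breakpoint
            return None
        x = breakpoints[p]
        stops.append(x)
    return stops

print(select_breakpoints([0, 100, 180, 250, 330, 400], 200))
# → [0, 180, 330, 400]
```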

Selecting Breakpoints: Correctness

Theorem. Greedy algorithm is optimal.

Pf. (by contradiction)
 Assume greedy is not optimal, and let's see what happens.
 Let 0 = g0 < g1 < ... < gp = L denote the set of breakpoints chosen by greedy.
 Let 0 = f0 < f1 < ... < fq = L denote the set of breakpoints in an optimal
  solution with f0 = g0, f1 = g1, ..., fr = gr for the largest possible value of r.
 Note: gr+1 > fr+1 by the greedy choice of the algorithm. (Why doesn't the
  optimal solution drive a little further?)
 Replacing fr+1 by gr+1 yields another optimal solution with one more
  breakpoint in common ⇒ contradiction. ▪

   Greedy:  g0  g1  g2  ...  gr  gr+1
   OPT:     f0  f1  f2  ...  fr  fr+1  ...  fq
