
Day 8—More Divide and Conquer and Master Method for Solving Recurrences

Neil Rhodes, CSE 101

UC San Diego

MIDTERM: April 29, 4–5 PM, Center Hall Room 101 (NOT HSS)

5.4 Closest Pair of Points

Closest Pair of Points

Closest pair. Given n points in the plane, find a pair with the smallest Euclidean distance between them.

Fundamental geometric primitive. Graphics, computer vision, geographic information systems, molecular modeling, air traffic control. Special case of nearest neighbor, Euclidean MST, Voronoi (the fast closest-pair algorithm inspired fast algorithms for these problems).

Brute force. Check all pairs of points p and q with Θ(n^2) comparisons.

1-D version. O(n log n) is easy if the points are on a line (sort, then compare consecutive points; a short sketch follows below).

Assumption. No two points have the same x coordinate (to make the presentation cleaner).
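A minimal sketch of the 1-D special case (my own illustration, assuming the points are given as a list of at least two numbers): after sorting, the closest pair must be adjacent in sorted order.

def closest_pair_1d(xs):
    """Closest pair of points on a line in O(n log n): sort, then scan neighbors."""
    xs = sorted(xs)                                              # O(n log n)
    return min(xs[i + 1] - xs[i] for i in range(len(xs) - 1))    # O(n) scan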

Closest Pair of Points: First Attempt

Divide. Sub-divide region into 4 quadrants.

Obstacle. Impossible to ensure n/4 points in each piece.

Closest Pair of Points

Algorithm.
Divide: draw vertical line L so that roughly ½n points are on each side.
Conquer: find closest pair in each side recursively.
Combine: find closest pair with one point in each side (seems like Θ(n^2)). Return best of the 3 solutions.

[Figure: point set split by vertical line L; the closest pairs in the two halves have distances 12 and 21, and a split pair has distance 8.]

Closest Pair of Points

Find closest pair with one point in each side, assuming that distance < δ (in the running example, δ = min(12, 21)).
Observation: only need to consider points within δ of line L.
Sort points in the 2δ-strip by their y coordinate.
Only check distances of those within 11 positions in the sorted list!

[Figure: the 2δ-strip around line L, with the 7 points inside it numbered 1–7 in y order.]

Closest Pair of Points

Def. Let s_i be the point in the 2δ-strip with the ith smallest y-coordinate.

Claim. If |i − j| ≥ 12, then the distance between s_i and s_j is at least δ.
Pf. No two points lie in the same ½δ-by-½δ box (two points in the same box would be on the same side of L and within ½δ·√2 < δ of each other, contradicting the definition of δ). Two points at least 2 rows apart have distance ≥ 2(½δ) = δ. ▪

Fact. Still true if we replace 12 with 7.

[Figure: the 2δ-strip divided into ½δ-by-½δ boxes; points s_i and s_j are at least 2 rows apart.]

Closest Pair Algorithm

Closest-Pair(p1, …, pn)
   Compute separation line L such that half the points      O(n log n)
   are on one side and half on the other side.
   δ1 = Closest-Pair(left half)                              2T(n / 2)
   δ2 = Closest-Pair(right half)
   δ = min(δ1, δ2)
   Delete all points further than δ from separation line L.  O(n)
   Sort remaining points by y-coordinate.                    O(n log n)
   Scan points in y-order and compare distance between       O(n)
   each point and next 11 neighbors. If any of these
   distances is less than δ, update δ.
   return δ.
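A minimal runnable sketch of the pseudocode above (my own, assuming points are (x, y) tuples with distinct x coordinates; function names are mine). It re-sorts the strip on every call, so it has the O(n log^2 n) running time analyzed on the next slide.

import math

def closest_pair(points):
    """Distance of the closest pair among >= 2 points; O(n log^2 n) version."""
    return _closest(sorted(points))          # sort by x coordinate once

def _closest(pts):
    n = len(pts)
    if n <= 3:                               # base case: brute force
        return min(math.dist(p, q)
                   for i, p in enumerate(pts) for q in pts[i + 1:])
    mid = n // 2
    x_line = pts[mid][0]                     # separation line L
    delta = min(_closest(pts[:mid]), _closest(pts[mid:]))

    # Keep only points within delta of L and sort them by y coordinate.
    strip = sorted((p for p in pts if abs(p[0] - x_line) < delta),
                   key=lambda p: p[1])

    # Compare each strip point with the next 11 points in y order.
    for i, p in enumerate(strip):
        for q in strip[i + 1:i + 12]:
            delta = min(delta, math.dist(p, q))
    return delta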

Closest Pair of Points: Analysis

Running time. T(n) ≤ 2T(n/2) + O(n log n) ⇒ T(n) = O(n log^2 n)

Q. Can we achieve O(n log n)?

A. Yes. Don't sort the points in the strip from scratch each time. Each recursive call returns two lists: all points sorted by y coordinate, and all points sorted by x coordinate. Sort by merging two pre-sorted lists.

T(n) ≤ 2T(n/2) + O(n) ⇒ T(n) = O(n log n)

O(n log n) Closest Pair Algorithm

Closest-Pair(p1, …, pn)
   Compute separation line L such that half the points          O(n)
   are on one side and half on the other side.
   (xsorted1, ysorted1, δ1) = Closest-Pair(left half)            2T(n / 2)
   (xsorted2, ysorted2, δ2) = Closest-Pair(right half)
   δ = min(δ1, δ2)
   Merge xsorted1 and xsorted2 into xsorted.                     O(n)
   Merge ysorted1 and ysorted2 into ysorted.
   Scan points in y-order and compare distance between           O(n)
   each point and next 11 neighbors. If any of these
   distances is less than δ, update δ.
   return (xsorted, ysorted, δ).
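A minimal sketch of this refinement (my own, same (x, y) tuple representation as before): each call returns its points already sorted by y, built by merging the two halves' pre-sorted lists, so the strip scan needs no fresh sort. Unlike the pseudocode, the x-sorted order is kept implicitly by slicing the pre-sorted input, so only the y-sorted list and δ are returned.

import math
from heapq import merge      # merges two sorted sequences in linear time

def closest_pair_fast(points):
    """Distance of the closest pair among >= 2 points; O(n log n) version."""
    _, delta = _closest(sorted(points))      # single O(n log n) sort by x
    return delta

def _closest(pts):
    n = len(pts)
    if n <= 3:                               # base case: brute force
        delta = min(math.dist(p, q)
                    for i, p in enumerate(pts) for q in pts[i + 1:])
        return sorted(pts, key=lambda p: p[1]), delta

    mid = n // 2
    x_line = pts[mid][0]                     # separation line L
    ysorted1, d1 = _closest(pts[:mid])
    ysorted2, d2 = _closest(pts[mid:])
    delta = min(d1, d2)

    # Merge the two pre-sorted halves by y in O(n) instead of re-sorting.
    ysorted = list(merge(ysorted1, ysorted2, key=lambda p: p[1]))

    # Scan the 2δ-strip in y order, comparing each point to the next 11.
    strip = [p for p in ysorted if abs(p[0] - x_line) < delta]
    for i, p in enumerate(strip):
        for q in strip[i + 1:i + 12]:
            delta = min(delta, math.dist(p, q))
    return ysorted, delta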

Master method for solving recurrence relations

Master method idea

A recurrence tree will, in general, have one of three behaviors:
• Time dominated by the cost of the leaves
• Time dominated by the cost of the root
• Time evenly distributed across the levels of the tree

Time dominated by leaves

Tree Method: T(n) = 4T(⌈n/2⌉) + n

[Recursion tree: the root costs n; each level has 4 times as many nodes as the one above, each costing half as much, so the level costs are n, 2n, 4n, …; the tree has depth log_2 n and 4^(log_2 n) = n^2 leaves.]

T(n) = n + 2n + 4n + ... + 4^(log_2 n)
     = Σ_{i=0}^{log_2 n} n·2^i
     = n · Σ_{i=0}^{log_2 n} 2^i
     = n · (2^(log_2 n + 1) − 1)
     = n · (2n − 1)

T(n) = Θ(n^2)

Time evenly distributed across levels

Tree Method: T(n) = 4T(⌈n/4⌉) + n

[Recursion tree: the root costs n; each level has 4 times as many nodes as the one above, each costing a quarter as much, so every level costs n; the tree has depth log_4 n and 4^(log_4 n) = n leaves.]

T(n) = n + n + n + ... + 4^(log_4 n)
     = Σ_{i=0}^{log_4 n} n
     = n · Σ_{i=0}^{log_4 n} 1
     = n · Θ(log n)

T(n) = Θ(n log n)

Time dominated by root

Tree Method: T(n) = 3T(⌈n/2⌉) + O(n^2)

[Recursion tree: the root costs cn^2; each level has 3 times as many nodes as the one above, each costing a quarter as much (c(n/2)^2, c(n/4)^2, …), so the level costs are cn^2, (3/4)cn^2, (3/4)^2·cn^2, …; the tree has depth log_2 n and the leaf level costs c·3^(log_2 n).]

T(n) = cn^2 + cn^2·(3/4)^1 + cn^2·(3/4)^2 + ... + c·3^(log_2 n)
     = cn^2·(3/4)^0 + cn^2·(3/4)^1 + ... + c·3^(log_2 n)·(n / 2^(log_2 n))^2
     = cn^2·(3/4)^0 + cn^2·(3/4)^1 + ... + cn^2·3^(log_2 n) / 4^(log_2 n)
     = Σ_{i=0}^{log_2 n} cn^2·(3/4)^i
     = cn^2 · Σ_{i=0}^{log_2 n} (3/4)^i
     = cn^2 · Θ(1)

T(n) = Θ(n^2)
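A small numeric sanity check (my own, not from the slides) that evaluates the three recurrences above directly, with base case T(1) = 1, exact halving/quartering (n is a power of 2), and constant c = 1 for the third recurrence's O(n^2) term. Each ratio settles near a constant, matching the claimed growth rates.

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T_leaves(n):                 # T(n) = 4T(n/2) + n    (leaves dominate)
    return 1 if n <= 1 else 4 * T_leaves(n // 2) + n

@lru_cache(maxsize=None)
def T_even(n):                   # T(n) = 4T(n/4) + n    (levels about equal)
    return 1 if n <= 1 else 4 * T_even(n // 4) + n

@lru_cache(maxsize=None)
def T_root(n):                   # T(n) = 3T(n/2) + n^2  (root dominates, c = 1)
    return 1 if n <= 1 else 3 * T_root(n // 2) + n * n

for n in (2 ** 10, 2 ** 14, 2 ** 18):            # powers of 2, so divisions are exact
    print(n,
          round(T_leaves(n) / n ** 2, 3),            # roughly constant: Θ(n^2)
          round(T_even(n) / (n * math.log2(n)), 3),  # roughly constant: Θ(n log n)
          round(T_root(n) / n ** 2, 3))              # roughly constant: Θ(n^2)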

Limited master method

Limited Master Method. Recurrence: T(n) = aT(n/b) + n

[Recursion tree: the root costs n; level i has a^i nodes, each costing n/b^i, so level i costs (a/b)^i·n; the tree has depth log_b n and a^(log_b n) = n^(log_b a) leaves.]

T(n) = n + (a/b)·n + (a/b)^2·n + ... + a^(log_b n)
     = Σ_{i=0}^{log_b n} (a/b)^i·n
     = n · Σ_{i=0}^{log_b n} (a/b)^i

Evaluate: T(n) = n · Σ_{i=0}^{log_b n} (a/b)^i

1. a/b > 1: Time dominated by leaves.
   T(n) = n · Σ_{i=0}^{log_b n} (a/b)^i = n · [(a/b)^(log_b n + 1) − 1] / (a/b − 1)
        = O(n·(a/b)^(log_b n)) = O(n·a^(log_b n) / b^(log_b n)) = O(a^(log_b n)) = O(n^(log_b a))

2. a/b = 1: Time evenly distributed across all levels.
   T(n) = n·(log_b n + 1) = O(n log n)

3. a/b < 1: Time dominated by root.
   T(n) = n · Σ_{i=0}^{log_b n} (a/b)^i < n · 1/(1 − a/b) = O(n)

Limited Master Method: Examples

• T(n) = 3T(⌈n/2⌉) + n: a/b = 3/2 > 1. T(n) = O(n^(log_2 3)) = O(n^1.585)
• T(n) = 4T(⌈n/4⌉) + n: a/b = 1. T(n) = O(n log n)
• T(n) = 4T(⌈n/5⌉) + n: a/b < 1. T(n) = O(n)
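A small helper (my own illustration, not part of the lecture) that applies the three cases of the limited master method to T(n) = aT(n/b) + n and reproduces the examples above.

import math

def limited_master(a, b):
    """Asymptotic solution of T(n) = a*T(n/b) + n via the limited master method."""
    ratio = a / b
    if ratio > 1:                    # case 1: leaves dominate
        return f"O(n^{math.log(a, b):.3f})"      # O(n^(log_b a))
    if ratio == 1:                   # case 2: all levels about equal
        return "O(n log n)"
    return "O(n)"                    # case 3: root dominates

print(limited_master(3, 2))   # O(n^1.585)
print(limited_master(4, 4))   # O(n log n)
print(limited_master(4, 5))   # O(n)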


Master Method

Recurrence: T(n) = aT(n/b) + f(n)

[Recursion tree: the root costs f(n); level i has a^i nodes, each costing f(n/b^i), so level i costs a^i·f(n/b^i); the tree has depth log_b n and a^(log_b n) = n^(log_b a) leaves.]

Case 1: f(n) polynomially smaller than n^(log_b a) (lots of tiny subproblems)

   f(n) = O(n^(log_b a − ε))
   T(n) = Θ(a^(log_b n)) = Θ(n^(log_b a)) = Θ(n^(log a / log b))


Master Method

Recurrence: T(n) = aT(n/b) + f(n)

Case 2: f(n) = Θ(n^(log_b a)) (all levels about the same)

   T(n) = log_b n · Θ(n^(log_b a)) = Θ(n^(log_b a) · log n)

Master Method

Recurrence: T(n) = aT(n/b) + f(n)

Case 3: f(n) polynomially larger than n^(log_b a) (big step is most expensive)

   f(n) = Ω(n^(log_b a + ε))
   T(n) = Θ(f(n))


Master Method

Additional requirement for Case 3 (regularity): there exists c < 1 such that, for sufficiently large n, a·f(n/b) ≤ c·f(n). That is, each level costs at most a constant fraction of the level above it. In fact, this regularity condition implies that f(n) is polynomially larger than n^(log_b a), so Case 3 only needs to meet this requirement.

Polynomials, and more generally f(n) = c·n^k·log^j n for c, k, j ≥ 0, are regular.

The book shows how floors and ceilings within the recursive call can be ignored in the order analysis: if T uses ⌈n/b⌉, S uses ⌊n/b⌋, and R uses exact n/b, then T(n) = Θ(S(n)) = Θ(R(n)).

Master Method

Examples

• T(n) = 3T(n/2) + O(n^2): f(n) = Ω(n^(log_2 3 + 0.2)), so Case 3. T(n) = O(n^2)
• T(n) = 3T(n/2) + n^1.4: f(n) = O(n^(log_2 3 − 0.1)), so Case 1. T(n) = O(n^(log_2 3))
• T(n) = 16T(n/4) + 10n^2 + 3n + 5: f(n) = Θ(n^(log_4 16)) = Θ(n^2), so Case 2. T(n) = O(n^2 log n)
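A small helper (my own illustration) that picks the master-method case for T(n) = aT(n/b) + Θ(n^k) and reproduces the examples above; it handles only polynomial f(n), matching the three cases as stated on the slides.

import math

def master(a, b, k):
    """Solve T(n) = a*T(n/b) + Θ(n^k) using the three master-method cases."""
    crit = math.log(a, b)                 # critical exponent log_b(a)
    if math.isclose(k, crit):             # Case 2: f matches n^(log_b a)
        return f"Θ(n^{k:g} log n)"
    if k < crit:                          # Case 1: f polynomially smaller
        return f"Θ(n^{crit:.3f})"
    return f"Θ(n^{k:g})"                  # Case 3: f polynomially larger
                                          # (regularity holds for polynomials here)

print(master(3, 2, 2))      # Θ(n^2)        — T(n) = 3T(n/2) + Θ(n^2)
print(master(3, 2, 1.4))    # Θ(n^1.585)    — T(n) = 3T(n/2) + n^1.4
print(master(16, 4, 2))     # Θ(n^2 log n)  — T(n) = 16T(n/4) + Θ(n^2)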


5.5 Integer Multiplication


Integer Arithmetic

Add. Given two n-digit integers a and b, compute a + b. O(n) bit operations.

Multiply. Given two n-digit integers a and b, compute a × b. Brute force solution: Θ(n^2) bit operations.

[Figure: grade-school worked examples of binary addition and binary multiplication.]
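A minimal sketch (my own, assuming each number is a list of bits stored least-significant first) of the grade-school addition above, which uses O(n) bit operations.

def add_bits(a, b):
    """Grade-school binary addition of two bit lists (LSB first): O(n) bit operations."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        result.append(s & 1)          # current bit of the sum
        carry = s >> 1                # carry into the next position
    if carry:
        result.append(carry)
    return result

print(add_bits([1, 0, 1, 1], [1, 1, 0, 1]))   # 13 + 11 = 24 -> [0, 0, 0, 1, 1]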

Divide-and-Conquer Multiplication: Warmup

To multiply two n-digit integers (assume n is a power of 2):
Multiply four ½n-digit integers.
Add two ½n-digit integers, and shift to obtain the result.

x = 2^(n/2)·x1 + x0
y = 2^(n/2)·y1 + y0
xy = (2^(n/2)·x1 + x0)(2^(n/2)·y1 + y0) = 2^n·x1·y1 + 2^(n/2)·(x1·y0 + x0·y1) + x0·y0

T(n) = 4T(n/2)  [recursive calls]  + Θ(n)  [add, shift]  ⇒  T(n) = Θ(n^2)

Master method: log_b a = log_2 4 = 2; f(n) = O(n^(2−1)), so Case 1: T(n) = Θ(n^2).
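A quick numeric check (my own) of the split-and-recombine identity above for a concrete pair of 8-bit integers; the shifts by 2^n and 2^(n/2) become multiplications by 256 and 16.

n = 8                                  # number of bits; assume n is a power of 2
x, y = 0b10110111, 0b11010101          # two n-bit integers (183 and 213)
half = 1 << (n // 2)                   # 2^(n/2) = 16
x1, x0 = divmod(x, half)               # x = 2^(n/2)*x1 + x0
y1, y0 = divmod(y, half)               # y = 2^(n/2)*y1 + y0

# Four half-size multiplications, then adds and shifts:
product = half * half * (x1 * y1) + half * (x1 * y0 + x0 * y1) + x0 * y0
assert product == x * y
print(product, x * y)                  # both print 38979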

Karatsuba Multiplication

To multiply two n-digit integers:
Add two ½n-digit integers.
Multiply three ½n-digit integers.
Add, subtract, and shift ½n-digit integers to obtain the result.

Theorem. [Karatsuba-Ofman, 1962] Can multiply two n-digit integers in O(n^1.585) bit operations.

x = 2^(n/2)·x1 + x0
y = 2^(n/2)·y1 + y0
xy = 2^n·x1·y1 + 2^(n/2)·(x1·y0 + x0·y1) + x0·y0
   = 2^n·x1·y1 + 2^(n/2)·((x1 + x0)(y1 + y0) − x1·y1 − x0·y0) + x0·y0

T(n) ≤ T(⌊n/2⌋) + T(⌈n/2⌉) + T(1 + ⌈n/2⌉)  [recursive calls]  + Θ(n)  [add, subtract, shift]
⇒ T(n) = O(n^(log_2 3)) = O(n^1.585)
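A minimal sketch of Karatsuba multiplication on Python integers (my own illustration; it splits by bits rather than decimal digits, which does not change the analysis, and falls back to the built-in multiply below a cutoff).

def karatsuba(x, y, cutoff=64):
    """Multiply non-negative integers with 3 half-size recursive multiplications."""
    if x < (1 << cutoff) or y < (1 << cutoff):
        return x * y                          # base case: small enough to multiply directly
    n = max(x.bit_length(), y.bit_length())
    half = n // 2                             # split position (in bits)
    x1, x0 = x >> half, x & ((1 << half) - 1) # x = 2^half * x1 + x0
    y1, y0 = y >> half, y & ((1 << half) - 1) # y = 2^half * y1 + y0
    a = karatsuba(x1, y1)                     # x1*y1
    b = karatsuba(x0, y0)                     # x0*y0
    c = karatsuba(x1 + x0, y1 + y0)           # (x1 + x0)*(y1 + y0)
    # Middle term recovered by subtraction: x1*y0 + x0*y1 = c - a - b.
    return (a << (2 * half)) + ((c - a - b) << half) + b

assert karatsuba(183, 213) == 183 * 213                        # hits the base case
assert karatsuba(2**100 + 12345, 3**70 + 678) == (2**100 + 12345) * (3**70 + 678)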

Karatsuba: Master Method

T(n) = 0 if n = 1; T(n) = 3T(n/2) + n otherwise

log_b a = log_2 3 ≈ 1.585
f(n) = n
f(n) = O(n^(1.585 − 0.1))  (polynomially smaller, so Case 1)
T(n) = Θ(n^(log_2 3)) = Θ(n^1.585)

Karatsuba: Recursion Tree

T(n) = 0 if n = 1; T(n) = 3T(n/2) + n otherwise

[Recursion tree: T(n) at the root costs n, its 3 children T(n/2) cost n/2 each, and so on; level k costs 3^k·(n/2^k), down to the T(2) leaves at the bottom.]

T(n) = n · Σ_{k=0}^{log_2 n} (3/2)^k
     = n · [(3/2)^(log_2 n + 1) − 1] / (3/2 − 1)
     = 3n^(log_2 3) − 2n