Theory of Algorithms: Divide and Conquer

Algorithms - Chapter 4 Divide_conquer


Page 1: Algorithms - Chapter 4 Divide_conquer

Theory of Algorithms: Divide and Conquer

Page 2: Algorithms - Chapter 4 Divide_conquer

Objectives

To introduce the divide-and-conquer mindset

To show a variety of divide-and-conquer solutions:
  Merge Sort
  Quick Sort
  Strassen's Matrix Multiplication
  Convex Hull by Divide-and-Conquer

To discuss the strengths and weaknesses of a divide-and-conquer strategy

Page 3: Algorithms - Chapter 4 Divide_conquer

Divide-and-Conquer

Best known algorithm design strategy:
1. Divide an instance of the problem into two or more smaller instances
2. Solve the smaller instances recursively
3. Obtain the solution to the original (larger) instance by combining these solutions

Silly Example (Addition), sketched in code below:
a_0 + … + a_{n-1} = (a_0 + … + a_{n/2-1}) + (a_{n/2} + … + a_{n-1})
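A minimal Python sketch of this addition example, with the three steps marked in comments (the function name divide_and_conquer_sum is ours, not from the slides):

```python
def divide_and_conquer_sum(a, lo=0, hi=None):
    """Sum a[lo:hi] by splitting the range in half and combining the partial sums."""
    if hi is None:
        hi = len(a)
    if hi - lo == 0:          # empty range
        return 0
    if hi - lo == 1:          # single element: trivially solved instance
        return a[lo]
    mid = (lo + hi) // 2                          # 1. divide into two smaller instances
    left = divide_and_conquer_sum(a, lo, mid)     # 2. solve the smaller instances recursively
    right = divide_and_conquer_sum(a, mid, hi)
    return left + right                           # 3. combine the solutions

print(divide_and_conquer_sum([7, 2, 1, 6, 4]))    # 20
```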

Page 4: Algorithms - Chapter 4 Divide_conquer

Divide-and-Conquer

Recurrence Templates apply.
General divide-and-conquer recurrence: T(n) = a T(n/b) + f(n), a ≥ 1, b ≥ 2

Efficiency analysis of d&c algorithms is simplified by the Master Theorem:
If f(n) ∈ Θ(n^d) where d ≥ 0, then:

T(n) ∈ Θ(n^d)           if a < b^d
T(n) ∈ Θ(n^d log n)     if a = b^d
T(n) ∈ Θ(n^(log_b a))   if a > b^d
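For example, the Mergesort recurrence analysed later in this chapter fits this template; a worked application of the theorem:

```latex
T(n) = 2\,T(n/2) + \Theta(n):\quad a = 2,\; b = 2,\; d = 1
\;\Rightarrow\; a = b^{d}
\;\Rightarrow\; T(n) \in \Theta(n^{d}\log n) = \Theta(n\log n).
```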

Page 5: Algorithms - Chapter 4 Divide_conquer

Divide-and-Conquer Illustrated

[Figure: a problem of size n is divided into subproblem 1 of size n/2 and subproblem 2 of size n/2; the solutions to the two subproblems are combined into a solution to the original problem.]

Page 6: Algorithms - Chapter 4 Divide_conquer

Perfect example: Mergesort

Algorithm:
1. Split A[1..n] in half and put a copy of each half into arrays B[1..n/2] and C[1..n/2]
2. Recursively MergeSort arrays B and C
3. Merge sorted arrays B and C into array A

Merging (a code sketch follows below):
REPEAT until no elements remain in one of B or C
1. Compare the first elements in the remaining parts of B and C
2. Copy the smaller one into A, incrementing the index of the corresponding array
3. Once all elements in one of B or C are processed, copy the remaining unprocessed elements from the other array into A
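A minimal runnable sketch of this algorithm in Python (the names merge_sort and merge are ours, and the lists are 0-indexed rather than 1-indexed as on the slide):

```python
def merge_sort(A):
    """Sort list A in place using the mergesort scheme described above."""
    n = len(A)
    if n <= 1:
        return
    B = A[:n // 2]          # 1. split A in half into copies B and C
    C = A[n // 2:]
    merge_sort(B)           # 2. recursively sort both halves
    merge_sort(C)
    merge(B, C, A)          # 3. merge the sorted halves back into A

def merge(B, C, A):
    """Merge sorted lists B and C into A."""
    i = j = k = 0
    while i < len(B) and j < len(C):        # repeat until one input is exhausted
        if B[i] <= C[j]:                    # compare the first remaining elements
            A[k] = B[i]; i += 1             # copy the smaller one into A
        else:
            A[k] = C[j]; j += 1
        k += 1
    A[k:] = B[i:] if i < len(B) else C[j:]  # copy whatever remains in the other array

data = [7, 2, 1, 6, 4]
merge_sort(data)
print(data)   # [1, 2, 4, 6, 7]
```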

Page 7: Algorithms - Chapter 4 Divide_conquer


Page 8: Algorithms - Chapter 4 Divide_conquer

Mergesort Example

Input:        7 2 1 6 4
Split:        7 2 1  |  6 4
Split again:  7 2 | 1   and   6 | 4
Merge:        7, 2 → 2 7;   then 2 7, 1 → 1 2 7
Merge:        6, 4 → 4 6
Final merge:  1 2 4 6 7

Page 9: Algorithms - Chapter 4 Divide_conquer

Efficiency of Mergesort

Recurrence: C(n) = 2 C(n/2) + C_merge(n) for n > 1, C(1) = 0

C_merge(n) = n - 1 in the worst case

All cases have the same efficiency: Θ(n log n)

The number of comparisons is close to the theoretical minimum for comparison-based sorting:
log2(n!) ≈ n lg n - 1.44 n

Page 10: Algorithms - Chapter 4 Divide_conquer

Improving Efficiency of Mergesort

Principal drawback is the space requirement: Θ(n) extra space (NOT in-place)

CAN be done in place, but with a large multiplicative constant
Can be implemented without recursion (bottom-up)

Page 11: Algorithms - Chapter 4 Divide_conquer

Quicksort

Select a pivot (partitioning element)

Rearrange the list into two sublists:
  All elements positioned before the pivot are ≤ the pivot
  Those positioned after the pivot are > the pivot
  Requires a pivoting algorithm

Exchange the pivot with the last element in the first sublist
  The pivot is now in its final position

QuickSort the two sublists

[Diagram: pivot p in its final position, with A[i] ≤ p to its left and A[i] > p to its right]

Page 12: Algorithms - Chapter 4 Divide_conquer

The Partition Algorithm
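The partition pseudocode from the original slide is not included in this transcript; below is a minimal Python sketch of a Hoare-style partition (pivot = first element, indices i and j scanning towards each other), consistent with the i-j trace on the next slide. The function names partition and quicksort are ours.

```python
def partition(A, lo, hi):
    """Hoare-style partition of A[lo..hi] around pivot p = A[lo]; returns the pivot's final index."""
    p = A[lo]
    i, j = lo + 1, hi
    while True:
        while i <= hi and A[i] <= p:   # i scans right over elements <= p
            i += 1
        while A[j] > p:                # j scans left over elements > p
            j -= 1
        if i >= j:                     # pointers crossed: partitioning is done
            break
        A[i], A[j] = A[j], A[i]        # swap the out-of-place pair
        i += 1
        j -= 1
    A[lo], A[j] = A[j], A[lo]          # put the pivot into its final position
    return j

def quicksort(A, lo=0, hi=None):
    """Sort A[lo..hi] in place by recursive partitioning."""
    if hi is None:
        hi = len(A) - 1
    if lo < hi:
        s = partition(A, lo, hi)       # pivot ends up at index s
        quicksort(A, lo, s - 1)        # sort the elements <= pivot
        quicksort(A, s + 1, hi)        # sort the elements > pivot

data = [5, 3, 1, 9, 8, 2, 7]
quicksort(data)
print(data)   # [1, 2, 3, 5, 7, 8, 9]
```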

Page 13: Algorithms - Chapter 4 Divide_conquer

Quicksort Example

Array: 5 3 1 9 8 2 7   (pivot p = 5; i scans right for an element > p, j scans left for an element ≤ p)

5 3 1 9 8 2 7   (i at 9, j at 2)  →  swap  →  5 3 1 2 8 9 7
5 3 1 2 8 9 7   (i and j cross)   →  exchange pivot with A[j]  →  2 3 1 5 8 9 7

Recursive Call: Quicksort(2 3 1) & Quicksort(8 9 7)

2 3 1   (i at 3, j at 1)  →  swap  →  2 1 3  →  i and j cross  →  exchange pivot  →  1 2 3
Recursive Call: Quicksort(1) & Quicksort(3)

8 9 7   (i at 9, j at 7)  →  swap  →  8 7 9  →  i and j cross  →  exchange pivot  →  7 8 9
Recursive Call: Quicksort(7) & Quicksort(9)

Page 14: Algorithms - Chapter 4 Divide_conquer

Worst Case Efficiency of Quicksort

In the worst case all splits are completely skewed

For instance, an already sorted list! One subarray is empty, the other is reduced by only one element:

  Make n+1 comparisons
  Exchange the pivot with itself
  Quicksort left = Ø, right = A[1..n-1]
  C_worst(n) = (n+1) + n + … + 3 = (n+1)(n+2)/2 - 3 ∈ Θ(n²)
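The closed form on the last line is just an arithmetic series with its first two terms removed:

```latex
C_{\mathrm{worst}}(n) = (n+1) + n + \cdots + 3
                      = \sum_{i=1}^{n+1} i \;-\; (1 + 2)
                      = \frac{(n+1)(n+2)}{2} - 3 \;\in\; \Theta(n^2)
```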


Page 15: Algorithms - Chapter 4 Divide_conquer

General Efficiency of Quicksort

Efficiency Cases:
  Best: split in the middle — Θ(n log n)
  Worst: sorted array! — Θ(n²)
  Average: random arrays — Θ(n log n)

Improvements (in combination 20-25% faster):
  Better pivot selection: median-of-three partitioning avoids the worst case on sorted files
  Switch to Insertion Sort on small subfiles
  Elimination of recursion

Considered the method of choice for internal sorting of large files (n ≥ 10000)

Page 16: Algorithms - Chapter 4 Divide_conquer

Binary Search

An O(log n) algorithm for searching sorted arrays

An atypical example of divide-and-conquer: it solves just one problem of half the size on each of its iterations (a sketch follows below)
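The slides give no pseudocode here, so the following iterative Python sketch fills in details that are ours; half of the remaining range is discarded on each iteration:

```python
def binary_search(A, key):
    """Return an index of key in the sorted list A, or -1 if key is absent."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if A[mid] == key:
            return mid
        elif A[mid] < key:        # key can only be in the right half
            lo = mid + 1
        else:                     # key can only be in the left half
            hi = mid - 1
    return -1

print(binary_search([1, 2, 4, 6, 7], 6))   # 3
print(binary_search([1, 2, 4, 6, 7], 5))   # -1
```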


Page 17: Algorithms - Chapter 4 Divide_conquer

Objectives

To introduce the divide-and-conquer mindset

To show a variety of divide-and-conquer solutions:
  Merge Sort
  Quick Sort
  Strassen's Matrix Multiplication
  Convex Hull by Divide-and-Conquer

To discuss the strengths and weaknesses of a divide-and-conquer strategy

Page 18: Algorithms - Chapter 4 Divide_conquer

Strassen’s Matrix Multiplication

Strassen observed [1969] that the product of two matrices can be computed as follows:

[ C00  C01 ]   [ A00  A01 ]   [ B00  B01 ]   [ M1 + M4 - M5 + M7    M3 + M5           ]
[ C10  C11 ] = [ A10  A11 ] * [ B10  B11 ] = [ M2 + M4              M1 + M3 - M2 + M6 ]

where:
M1 = (A00 + A11) * (B00 + B11)
M2 = (A10 + A11) * B00
M3 = A00 * (B01 - B11)
M4 = A11 * (B10 - B00)
M5 = (A00 + A01) * B11
M6 = (A10 - A00) * (B00 + B01)
M7 = (A01 - A11) * (B10 + B11)

Page 19: Algorithms - Chapter 4 Divide_conquer

Strassen’s Matrix Multiplication

Op Count:
Each of M1, …, M7 requires 1 multiplication and 1 or 2 additions/subtractions

Total: 7 multiplications and 18 additions/subtractions, compared with brute force, which requires 8 multiplications and 4 additions for a 2×2 product. It is the asymptotic behaviour that matters, though.

The n/2 × n/2 submatrices are computed recursively by the same method (a sketch follows below)
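A minimal recursive sketch of the scheme in Python with NumPy, assuming n is a power of 2 (the next slide notes that padding with zeros handles the general case). The function name strassen and the cutoff for switching to ordinary multiplication are ours, not from the slides.

```python
import numpy as np

def strassen(A, B, cutoff=64):
    """Multiply square matrices A and B (n a power of 2) with Strassen's scheme."""
    n = A.shape[0]
    if n <= cutoff:                      # small blocks: use ordinary multiplication
        return A @ B
    h = n // 2
    A00, A01, A10, A11 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B00, B01, B10, B11 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # The seven products from the previous slide, each computed recursively
    M1 = strassen(A00 + A11, B00 + B11, cutoff)
    M2 = strassen(A10 + A11, B00, cutoff)
    M3 = strassen(A00, B01 - B11, cutoff)
    M4 = strassen(A11, B10 - B00, cutoff)
    M5 = strassen(A00 + A01, B11, cutoff)
    M6 = strassen(A10 - A00, B00 + B01, cutoff)
    M7 = strassen(A01 - A11, B10 + B11, cutoff)
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7        # C00
    C[:h, h:] = M3 + M5                  # C01
    C[h:, :h] = M2 + M4                  # C10
    C[h:, h:] = M1 + M3 - M2 + M6        # C11
    return C

A = np.random.rand(128, 128)
B = np.random.rand(128, 128)
print(np.allclose(strassen(A, B), A @ B))   # True
```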

Page 20: Algorithms - Chapter 4 Divide_conquer

Efficiency of Strassen’s Algorithm

If n is not a power of 2, the matrices can be padded with zeros

Number of multiplications:
M(n) = 7 M(n/2) for n > 1, M(1) = 1
Set n = 2^k and apply backward substitution:
M(2^k) = 7 M(2^(k-1)) = 7 [7 M(2^(k-2))] = 7^2 M(2^(k-2)) = … = 7^k

M(n) = 7^(log2 n) = n^(log2 7) ≈ n^2.807

Number of additions: A(n) ∈ Θ(n^(log2 7))

Other algorithms come closer to the lower limit of n² multiplications, but are even more complicated

Page 21: Algorithms - Chapter 4 Divide_conquer

QuickHull Algorithm
1. Sort points by increasing x-coordinate values
2. Identify the leftmost and rightmost extreme points P1 and P2 (part of the hull)
3. Compute the upper hull:
   Find the point Pmax that is farthest away from line P1P2
   Quickhull the points to the left of line P1Pmax
   Quickhull the points to the left of line PmaxP2
4. Similarly compute the lower hull

Page 22: Algorithms - Chapter 4 Divide_conquer

QuickHull Algorithm
1. Sort points by increasing x-coordinate values
2. Identify the leftmost and rightmost extreme points P1 and P2 (part of the hull)
3. Compute the upper hull:
   Find the point Pmax that is farthest away from line P1P2
   Quickhull the points to the left of line P1Pmax
   Quickhull the points to the left of line PmaxP2
4. Similarly compute the lower hull

[Figure: the upper hull, with Pmax above the line L through P1 and P2]

Page 23: Algorithms - Chapter 4 Divide_conquer

Finding the Furthest Point

Given three points in the plane p1, p2, p3:

Area of triangle Δ p1 p2 p3 = 1/2 |D|, where

      | x1  y1  1 |
D  =  | x2  y2  1 |  =  x1 y2 + x3 y1 + x2 y3 - x3 y2 - x2 y1 - x1 y3
      | x3  y3  1 |

Properties of D:
  D is positive iff p3 is to the left of the directed line p1p2
  |D| is proportional to the distance of p3 from the line p1p2 (see the sketch below)
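A compact Python sketch of the QuickHull steps from the earlier slides, using this determinant D both as the "to the left of" test and as the farthest-point measure. The function names signed_area, upper_hull and quickhull are ours; the lower hull is obtained by reusing upper_hull with P1 and P2 swapped.

```python
def signed_area(p1, p2, p3):
    """D = x1 y2 + x3 y1 + x2 y3 - x3 y2 - x2 y1 - x1 y3:
    positive iff p3 lies to the left of the directed line p1 -> p2,
    and proportional to the distance of p3 from that line."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return x1*y2 + x3*y1 + x2*y3 - x3*y2 - x2*y1 - x1*y3

def upper_hull(points, p1, p2):
    """Hull vertices strictly between p1 and p2, in order along the hull."""
    left = [p for p in points if signed_area(p1, p2, p) > 0]    # points left of p1 -> p2
    if not left:
        return []
    pmax = max(left, key=lambda p: signed_area(p1, p2, p))      # farthest from line p1p2
    return upper_hull(left, p1, pmax) + [pmax] + upper_hull(left, pmax, p2)

def quickhull(points):
    pts = sorted(points)                        # sort by x-coordinate (then y)
    p1, p2 = pts[0], pts[-1]                    # leftmost and rightmost extreme points
    return [p1] + upper_hull(pts, p1, p2) + [p2] + upper_hull(pts, p2, p1)

print(quickhull([(0, 0), (4, 0), (2, 3), (1, 1), (3, 2), (2, -1)]))
# [(0, 0), (2, 3), (3, 2), (4, 0), (2, -1)]
```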

Page 24: Algorithms - Chapter 4 Divide_conquer

Efficiency of Quickhull

Finding the point farthest away from line P1P2 is linear in the number of points

This gives the same efficiency as quicksort:
  Worst case: Θ(n²)
  Average case: Θ(n log n)
  If an initial sort is required, this can be accomplished in Θ(n log n) — no increase in asymptotic efficiency class

Alternative Divide-and-Conquer Convex Hull algorithms:
  Graham's scan and DCHull
  Worst-case efficiency Θ(n log n)

Page 25: Algorithms - Chapter 4 Divide_conquer

Strengths and Weaknesses of Divide-and-Conquer

Strengths:
  Generally improves on brute force by one base efficiency class
  Easy to analyse using the Recurrence Templates
  Ideally suited for parallel computations

Weaknesses:
  Often requires recursion, which introduces overheads
  Can be inapplicable and inferior to simpler algorithmic solutions