Analysis of Algorithms 1
Growth Rates
Growth rates of functions:
- Linear: n
- Quadratic: n^2
- Cubic: n^3
In a log-log chart, the slope of the line corresponds to the growth rate of the function
[Figure: log-log plot of T(n) versus n for the cubic, quadratic, and linear functions]
Analysis of Algorithms 2
Constant Factors
The growth rate is not affected by
- constant factors or
- lower-order terms
Examples:
- 10^2 n + 10^5 is a linear function
- 10^5 n^2 + 10^8 n is a quadratic function
[Figure: log-log plot showing 10^5 n^2 + 10^8 n parallel to n^2 and 10^2 n + 10^5 parallel to n]
Analysis of Algorithms 3
Big-Oh Notation
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) ≤ c·g(n) for n ≥ n0.
Example: 2n + 10 is O(n)
- 2n + 10 ≤ cn
- (c - 2)n ≥ 10
- n ≥ 10/(c - 2)
- Pick c = 3 and n0 = 10
[Figure: log-log plot of 3n, 2n + 10, and n; 2n + 10 lies below 3n for n ≥ 10]
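The witnesses above can be spot-checked numerically. A minimal sketch (a finite scan over n, not a proof; the helper name `satisfies_big_oh` is ours):

```python
# Numeric spot-check (not a proof) of the big-Oh witnesses from this slide:
# f(n) = 2n + 10 is O(n) with c = 3 and n0 = 10.
def satisfies_big_oh(f, g, c, n0, n_max=10**5):
    """Check f(n) <= c*g(n) for every integer n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

print(satisfies_big_oh(lambda n: 2*n + 10, lambda n: n, c=3, n0=10))  # True
print(satisfies_big_oh(lambda n: 2*n + 10, lambda n: n, c=3, n0=9))   # False: 2*9 + 10 = 28 > 27
```

The second call shows that n0 = 10 is tight for c = 3: at n = 9 the inequality 2n + 10 ≤ 3n fails.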
Analysis of Algorithms 4
Big-Oh Example
Example: the function n^2 is not O(n)
- n^2 ≤ cn implies n ≤ c
- The above inequality cannot be satisfied, since c must be a constant
[Figure: log-log plot of n^2, 100n, 10n, and n; n^2 eventually exceeds every cn]
Analysis of Algorithms 5
More Big-Oh Examples
- 7n - 2 is O(n): need c > 0 and n0 ≥ 1 such that 7n - 2 ≤ c·n for n ≥ n0; this is true for c = 7 and n0 = 1
- 3n^3 + 20n^2 + 5 is O(n^3): need c > 0 and n0 ≥ 1 such that 3n^3 + 20n^2 + 5 ≤ c·n^3 for n ≥ n0; this is true for c = 4 and n0 = 21
- 3 log n + log log n is O(log n): need c > 0 and n0 ≥ 1 such that 3 log n + log log n ≤ c·log n for n ≥ n0; this is true for c = 4 and n0 = 2
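The three witness pairs can be spot-checked numerically. A sketch (a finite scan, not a proof; the helper `holds` is ours):

```python
import math

# Spot-check (not a proof) of the constants claimed on this slide,
# over a finite range of n.
def holds(f, g, c, n0, n_max=4096):
    """Check f(n) <= c*g(n) for every integer n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

print(holds(lambda n: 7*n - 2, lambda n: n, 7, 1))                      # True
print(holds(lambda n: 3*n**3 + 20*n**2 + 5, lambda n: n**3, 4, 21))     # True
print(holds(lambda n: 3*n**3 + 20*n**2 + 5, lambda n: n**3, 4, 20))     # False: n0 = 21 is tight
print(holds(lambda n: 3*math.log2(n) + math.log2(math.log2(n)),
            lambda n: math.log2(n), 4, 2))                              # True
```

Note the third call: at n = 20, 3n^3 + 20n^2 + 5 = 32005 > 4n^3 = 32000, so n0 = 21 really is needed for c = 4.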
Analysis of Algorithms 6
Big-Oh and Growth Rate
The big-Oh notation gives an upper bound on the growth rate of a function. The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n). We can use the big-Oh notation to rank functions according to their growth rate:

                     f(n) is O(g(n))    g(n) is O(f(n))
  g(n) grows more    Yes                No
  f(n) grows more    No                 Yes
  Same growth        Yes                Yes
Analysis of Algorithms 7
Big-Oh Rules
- If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
  1. drop lower-order terms
  2. drop constant factors
- Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)"
- Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"
Analysis of Algorithms 8
Relatives of Big-Oh
- big-Omega: f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0
- big-Theta: f(n) is Θ(g(n)) if there are constants c' > 0 and c'' > 0 and an integer constant n0 ≥ 1 such that c'·g(n) ≤ f(n) ≤ c''·g(n) for n ≥ n0
- little-oh: f(n) is o(g(n)) if, for any constant c > 0, there is an integer constant n0 > 0 such that f(n) < c·g(n) for n ≥ n0
- little-omega: f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n0 > 0 such that f(n) > c·g(n) for n ≥ n0
Analysis of Algorithms 9
Intuition for Asymptotic Notation
- Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
- big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
- big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
- little-oh: f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
- little-omega: f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)
Analysis of Algorithms 10
Example Uses of the Relatives of Big-Oh
- 5n^2 is ω(n): f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n0 > 0 such that f(n) > c·g(n) for n ≥ n0. Here we need 5n0^2 > c·n0; given c, any n0 > c/5 > 0 satisfies this.
- 5n^2 is Ω(n): f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0. Let c = 1 and n0 = 1.
- 5n^2 is Ω(n^2): by the same definition, let c = 5 and n0 = 1.
Analysis of Algorithms 11
Divide-and-Conquer
Divide-and-conquer is a general algorithm design paradigm:
- Divide: divide the input data S into two or more disjoint subsets S1, S2, ...
- Recur: solve the subproblems recursively
- Conquer: combine the solutions for S1, S2, ..., into a solution for S
The base cases for the recursion are subproblems of constant size. Analysis can be done using recurrence equations.
Analysis of Algorithms 12
Merge-Sort Review
Merge-sort on an input sequence S with n elements consists of three steps:
- Divide: partition S into two sequences S1 and S2 of about n/2 elements each
- Recur: recursively sort S1 and S2
- Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S, C)
  Input: sequence S with n elements, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    mergeSort(S1, C)
    mergeSort(S2, C)
    S ← merge(S1, S2)
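The pseudocode above translates directly to runnable Python. A sketch, not the slide's exact code: the comparator C is modeled as a two-argument less-than-or-equal function, and the partition is done by slicing:

```python
def merge(s1, s2, leq):
    """Merge two sequences already sorted w.r.t. the comparator leq."""
    out, i, j = [], 0, 0
    while i < len(s1) and j < len(s2):
        if leq(s1[i], s2[j]):
            out.append(s1[i]); i += 1
        else:
            out.append(s2[j]); j += 1
    return out + s1[i:] + s2[j:]            # append the leftover tail

def merge_sort(s, leq=lambda a, b: a <= b):
    """Sort sequence s according to comparator leq by merge-sort."""
    if len(s) <= 1:                         # base case: constant size
        return list(s)
    mid = len(s) // 2                       # Divide: partition into halves
    s1 = merge_sort(s[:mid], leq)           # Recur on each half
    s2 = merge_sort(s[mid:], leq)
    return merge(s1, s2, leq)               # Conquer: merge sorted halves

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

Passing a different comparator, e.g. `lambda a, b: a >= b`, sorts in descending order.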
Analysis of Algorithms 13
Recurrence Equation Analysis
The conquer step of merge-sort, which merges two sorted sequences of about n/2 elements each (implemented by means of a doubly linked list), takes at most bn steps, for some constant b. Likewise, the base case (n < 2) takes at most b steps. Therefore, if we let T(n) denote the running time of merge-sort:

  T(n) = b               if n < 2
  T(n) = 2T(n/2) + bn    if n ≥ 2

We can therefore analyze the running time of merge-sort by finding a closed-form solution to the above equation, that is, a solution that has T(n) only on the left-hand side.
Analysis of Algorithms 14
Iterative Substitution
In the iterative substitution, or "plug-and-chug," technique, we iteratively apply the recurrence equation to itself and see if we can find a pattern:

  T(n) = 2T(n/2) + bn
       = 2(2T(n/2^2) + b(n/2)) + bn
       = 2^2 T(n/2^2) + 2bn
       = 2^3 T(n/2^3) + 3bn
       = 2^4 T(n/2^4) + 4bn
       = ...
       = 2^i T(n/2^i) + i·bn

Note that the base case, T(n) = b, occurs when 2^i = n, that is, i = log n. So

  T(n) = bn + bn log n

Thus, T(n) is O(n log n).
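A quick way to gain confidence in this closed form is to evaluate the recurrence directly and compare. A sketch with b = 1; for powers of two the match is exact:

```python
import math

# Sanity check of the closed form: for powers of two, the recurrence
# T(n) = b if n < 2, else 2T(n/2) + bn, equals bn log n + bn exactly.
def T(n, b=1):
    return b if n < 2 else 2 * T(n // 2, b) + b * n

for k in range(11):
    n = 2 ** k
    assert T(n) == n * math.log2(n) + n    # with b = 1: T(n) = n log n + n
print("closed form matches for n = 1, 2, 4, ..., 1024")
```

This is only a consistency check for powers of two, not a derivation; the substitution argument above is the actual proof.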
Analysis of Algorithms 15
The Recursion Tree
Draw the recursion tree for the recurrence relation

  T(n) = b               if n < 2
  T(n) = 2T(n/2) + bn    if n ≥ 2

and look for a pattern:

  depth   T's    size     time
  0       1      n        bn
  1       2      n/2      bn
  i       2^i    n/2^i    bn
  ...     ...    ...      ...

Total time = bn + bn log n (last level plus all previous levels)
Analysis of Algorithms 16
Guess-and-Test Method
In the guess-and-test method, we guess a closed-form solution and then try to prove it is true by induction. Consider the recurrence

  T(n) = b                    if n < 2
  T(n) = 2T(n/2) + bn log n   if n ≥ 2

Guess: T(n) < cn log n.

  T(n) = 2T(n/2) + bn log n
       < 2(c(n/2) log(n/2)) + bn log n
       = cn(log n - log 2) + bn log n
       = cn log n - cn + bn log n

Wrong: we cannot make this last line be less than cn log n.
Analysis of Algorithms 17
Guess-and-Test Method, Part 2
Recall the recurrence equation:

  T(n) = b                    if n < 2
  T(n) = 2T(n/2) + bn log n   if n ≥ 2

Guess #2: T(n) < cn log^2 n.

  T(n) = 2T(n/2) + bn log n
       < 2(c(n/2) log^2(n/2)) + bn log n
       = cn(log n - log 2)^2 + bn log n
       = cn log^2 n - 2cn log n + cn + bn log n
       ≤ cn log^2 n

if c > b. So, T(n) is O(n log^2 n). In general, to use this method, you need to have a good guess and you need to be good at induction proofs.
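The guessed bound can also be spot-checked numerically. A sketch with b = 1 and c = 2 > b (a finite scan, not a substitute for the induction proof):

```python
import math

# Numeric spot-check (not a proof) of guess #2: with b = 1 and c = 2 > b,
# the recurrence T(n) = 2T(n/2) + bn log n stays below cn log^2 n.
def T(n, b=1):
    return b if n < 2 else 2 * T(n // 2, b) + b * n * math.log2(n)

c = 2
for k in range(1, 16):
    n = 2 ** k
    assert T(n) <= c * n * math.log2(n) ** 2
print("T(n) <= 2 n log^2 n holds for n = 2, 4, ..., 2**15")
```

At n = 2 the bound is met with equality (T(2) = 4 = 2·2·1^2), consistent with the requirement c > b being tight.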
Analysis of Algorithms 18
Master Method
Many divide-and-conquer recurrence equations have the form:

  T(n) = c                if n < d
  T(n) = aT(n/b) + f(n)   if n ≥ d

The Master Theorem:
1. If f(n) is O(n^(log_b a - ε)), then T(n) is Θ(n^(log_b a)).
2. If f(n) is Θ(n^(log_b a) log^k n), then T(n) is Θ(n^(log_b a) log^(k+1) n).
3. If f(n) is Ω(n^(log_b a + ε)), then T(n) is Θ(f(n)), provided a·f(n/b) ≤ δ·f(n) for some δ < 1.
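When f(n) has the common shape n^p log^k n, applying the three cases can be mechanized. A sketch under that assumption (the function `master` and its output format are ours; case 3 additionally requires the regularity condition a·f(n/b) ≤ δ·f(n), which this code takes on faith):

```python
import math

def master(a, b, p, k=0):
    """Classify T(n) = aT(n/b) + Θ(n^p log^k n) via the Master Theorem.
    Assumes the regularity condition holds whenever case 3 applies."""
    E = math.log(a, b)                  # the critical exponent log_b a
    if math.isclose(p, E):              # case 2: f matches n^E log^k n
        return f"Θ(n^{E:g} log^{k + 1} n)"
    if p < E:                           # case 1: f grows polynomially slower
        return f"Θ(n^{E:g})"
    return f"Θ(n^{p:g} log^{k} n)"      # case 3: f polynomially dominates

print(master(4, 2, 1))       # Θ(n^2)                          (example 1 below)
print(master(2, 2, 1, k=1))  # Θ(n^1 log^2 n)                  (example 2 below)
print(master(1, 2, 0))       # Θ(n^0 log^1 n), i.e. Θ(log n)   (example 6 below)
```

The output keeps the raw exponents (so Θ(log n) prints as Θ(n^0 log^1 n)); simplifying the notation is left to the reader.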
Analysis of Algorithms 19
Master Method, Example 1
Example: T(n) = 4T(n/2) + n
Solution: log_b a = 2, so case 1 says T(n) is O(n^2).
Analysis of Algorithms 20
Master Method, Example 2
Example: T(n) = 2T(n/2) + n log n
Solution: log_b a = 1, so case 2 says T(n) is O(n log^2 n).
Analysis of Algorithms 21
Master Method, Example 3
Example: T(n) = T(n/3) + n log n
Solution: log_b a = 0, so case 3 says T(n) is O(n log n).
Analysis of Algorithms 22
Master Method, Example 4
Example: T(n) = 8T(n/2) + n^2
Solution: log_b a = 3, so case 1 says T(n) is O(n^3).
Analysis of Algorithms 23
Master Method, Example 5
Example: T(n) = 9T(n/3) + n^3
Solution: log_b a = 2, so case 3 says T(n) is O(n^3).
Analysis of Algorithms 24
Master Method, Example 6
Example: T(n) = T(n/2) + 1 (binary search)
Solution: log_b a = 0, so case 2 says T(n) is O(log n).
Analysis of Algorithms 25
Master Method, Example 7
Example: T(n) = 2T(n/2) + log n (heap construction)
Solution: log_b a = 1, so case 1 says T(n) is O(n).
Analysis of Algorithms 26
Iterative “Proof” of the Master Theorem
Applying iterative substitution to the general form, we look for a pattern:

  T(n) = aT(n/b) + f(n)
       = a(aT(n/b^2) + f(n/b)) + f(n)
       = a^2 T(n/b^2) + a f(n/b) + f(n)
       = a^3 T(n/b^3) + a^2 f(n/b^2) + a f(n/b) + f(n)
       = ...
       = a^(log_b n) T(1) + Σ_{i=0}^{(log_b n) - 1} a^i f(n/b^i)
       = n^(log_b a) T(1) + Σ_{i=0}^{(log_b n) - 1} a^i f(n/b^i)
Analysis of Algorithms 27
Integer Multiplication
Algorithm: multiply two n-bit integers I and J.
- Divide step: split I and J into high-order and low-order bits:

  I = I_h 2^(n/2) + I_l
  J = J_h 2^(n/2) + J_l

- We can then define I*J by multiplying the parts and adding:

  I*J = (I_h 2^(n/2) + I_l)(J_h 2^(n/2) + J_l)
      = I_h J_h 2^n + I_h J_l 2^(n/2) + I_l J_h 2^(n/2) + I_l J_l

- So, T(n) = 4T(n/2) + n, which implies T(n) is O(n^2). But that is no better than the algorithm we learned in grade school.
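The divide step's identity is easy to check numerically. A small sketch (the helper `split` and the sample values are ours):

```python
# Numeric check of the divide step: with I = I_h * 2^(n/2) + I_l (and
# likewise for J), I*J = I_h J_h 2^n + (I_h J_l + I_l J_h) 2^(n/2) + I_l J_l.
def split(x, n):
    """Split an n-bit integer into (high, low) halves; n assumed even."""
    half = n // 2
    return x >> half, x & ((1 << half) - 1)

n = 8
I, J = 0b10110101, 0b01101100            # two sample 8-bit integers
Ih, Il = split(I, n)
Jh, Jl = split(J, n)
product = (Ih*Jh << n) + ((Ih*Jl + Il*Jh) << n//2) + Il*Jl
assert product == I * J                  # the four-product identity holds
print(product == I * J)  # True
```

The four half-size products Ih·Jh, Ih·Jl, Il·Jh, Il·Jl are exactly what gives the recurrence T(n) = 4T(n/2) + n.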
Analysis of Algorithms 28
An Improved Integer Multiplication Algorithm
Algorithm: multiply two n-bit integers I and J.
- Divide step: split I and J into high-order and low-order bits:

  I = I_h 2^(n/2) + I_l
  J = J_h 2^(n/2) + J_l

- Observe that there is a different way to multiply the parts:

  I*J = I_h J_h 2^n + [(I_h - I_l)(J_l - J_h) + I_h J_h + I_l J_l] 2^(n/2) + I_l J_l
      = I_h J_h 2^n + [I_h J_l - I_h J_h - I_l J_l + I_l J_h + I_h J_h + I_l J_l] 2^(n/2) + I_l J_l
      = I_h J_h 2^n + (I_h J_l + I_l J_h) 2^(n/2) + I_l J_l

- So, T(n) = 3T(n/2) + n, which implies T(n) is O(n^(log_2 3)) by the Master Theorem. Thus, T(n) is O(n^1.585).
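This three-product scheme is known as Karatsuba's algorithm. A runnable sketch (our own Python rendering, assuming n is a power of two and I, J are nonnegative n-bit integers):

```python
def karatsuba(I, J, n):
    """Multiply n-bit nonnegative integers I and J using 3 recursive
    half-size products instead of 4 (n assumed a power of two)."""
    if n <= 1:
        return I * J
    half = n // 2
    Ih, Il = I >> half, I & ((1 << half) - 1)     # high/low halves of I
    Jh, Jl = J >> half, J & ((1 << half) - 1)     # high/low halves of J
    hh = karatsuba(Ih, Jh, half)                  # I_h * J_h
    ll = karatsuba(Il, Jl, half)                  # I_l * J_l
    a, b = Ih - Il, Jl - Jh
    sign = -1 if (a < 0) != (b < 0) else 1        # track the sign separately
    mid = sign * karatsuba(abs(a), abs(b), half)  # (I_h - I_l)(J_l - J_h)
    # I*J = hh*2^n + [(I_h - I_l)(J_l - J_h) + hh + ll]*2^(n/2) + ll
    return (hh << n) + ((mid + hh + ll) << half) + ll

print(karatsuba(181, 108, 8), 181 * 108)  # 19548 19548
```

The sign is handled outside the recursion because (I_h - I_l) and (J_l - J_h) may be negative while the recursive calls expect nonnegative inputs.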