CSC236 Week 3
Larry Zhang
Announcements
● Problem Set 1 due Oct 4
● Make sure to read “Submission Instructions” on the handout and the course website.
○ submit both tex and pdf
○ make sure you can login to MarkUs
○ create a group and submit one PDF for each group
○ late submissions are not accepted
NEW TOPIC
Asymptotic Notations: Big-Oh, Big-Omega, Big-Theta
Review: Asymptotic Notations
● Algorithm Runtime: we describe it in terms of the number of steps as a function of input size
○ Like n², nlog(n), n, sqrt(n), ...
● Asymptotic notations are for describing the growth rate of functions.
○ constant factors don’t matter
○ only the highest-order term matters
ConcepTest
Big-Oh, Big-Omega, Big-Theta
O( f(n) ): the set of functions that grow no faster than f(n)
● asymptotic upper bound on growth rate
Ω( f(n) ): the set of functions that grow no slower than f(n)
● asymptotic lower bound on growth rate
Θ( f(n) ): the set of functions that grow no faster and no slower than f(n)
● asymptotic tight bound on growth rate
[Figure: growth-rate ranking of typical functions, ordered from slow-growing to fast-growing]
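The ranking itself was an image that did not survive this transcript. As a quick numeric illustration (my own, not from the slides), evaluating some typical runtime functions at growing n shows the ordering emerge:

```python
# Evaluate typical runtime functions at growing n; for large n the values
# line up in the same order as the growth-rate ranking.
import math

funcs = [
    ("log n", lambda n: math.log2(n)),
    ("sqrt n", lambda n: math.sqrt(n)),
    ("n", lambda n: float(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2", lambda n: float(n ** 2)),
    ("2^n", lambda n: 2.0 ** n),
]

for n in (10, 100, 1000):
    row = ", ".join(f"{name}={f(n):.3g}" for name, f in funcs)
    print(f"n={n}: {row}")
```

At n = 1000 the six values are already sorted in the order listed, which is exactly the slow-to-fast ranking.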
The formal mathematical definition of big-Oh
Chicken grows slower than turkey, or
chicken size is in O(turkey size).
What it really means:
● The baby chicken might be larger than the baby turkey at the beginning.
● But after a certain “breakpoint”, the chicken’s size is surpassed by the turkey’s size.
● From the breakpoint on, the chicken is always smaller than the turkey.
Definition of f(n) is in O(g(n))
Let f(n) and g(n) be two functions of n. By “f(n)’s growth rate is upper-bounded by g(n)”, we mean the following:
● f(n) is bounded from above by g(n) in some way, something like f(n) ⩽ g(n)
● constant factors don’t matter, so we’re okay with f(n) being bounded by g(n) multiplied by some constant factor, i.e., f(n) ⩽ cg(n)
● we only care about large inputs, so we just need the above inequality to be true when n is larger than some “breakpoint” n₀
So the formal definition of “f(n) is in O(g(n))” is the following:

Definition of big-Oh

f(n) ∈ O(g(n)) if and only if there exist constants c > 0 and n₀ > 0 such that for all n ≥ n₀, f(n) ≤ c·g(n).

[Figure: f(n) and cg(n) plotted against n; beyond the breakpoint n₀, the curve f(n) stays below cg(n)]
Beyond the breakpoint n₀, f(n) is upper-bounded by cg(n), where c is some wisely chosen constant multiplier.
Side Note
Both ways below are fine
● f(n) ∈ O(g(n)), i.e., f(n) is in O(g(n))
● f(n) = O(g(n)), i.e., f(n) is O(g(n))
Both mean the same thing; the latter is a slight abuse of notation.
Knowing the definition, now we can write proofs for big-Oh.
The key is finding n₀ and c
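As a supplement (not from the slides), a small script can sanity-check candidate witnesses numerically. Passing the check is not a proof, since it only samples finitely many n, but a single failure disproves that particular (c, n₀) pair. The function name `looks_big_oh` is my own:

```python
# Numeric sanity-check for big-Oh witnesses: given candidate c and n0,
# test f(n) <= c * g(n) on a range of sample points n >= n0.
# Passing does NOT prove f in O(g); a failure disproves this (c, n0) pair.

def looks_big_oh(f, g, c, n0, samples=1000):
    return all(f(n) <= c * g(n) for n in range(n0, n0 + samples))

# The witnesses chosen in Example 1 below check out:
print(looks_big_oh(lambda n: 100 * n + 10000, lambda n: n * n, c=10100, n0=1))  # True

# A bad claim, n^2 in O(n) with c = 5, fails as soon as n exceeds 5:
print(looks_big_oh(lambda n: n * n, lambda n: n, c=5, n0=1))  # False
```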
Example 1
Prove that 100n + 10000 is in O(n²)
Need to find n₀ and c such that 100n + 10000 can be upper-bounded by n² multiplied by c.
Over-estimate and simplify:

Pick n₀ = 1

100n + 10000
≤ 100n² + 10000  # because n ≥ 1 implies n ≤ n²
≤ 100n² + 10000n²  # because n ≥ 1 implies 1 ≤ n²
= 10100n²

Pick c = 10100
Write up the proof that 100n + 10000 is in O(n²)

Proof:
Choose n₀ = 1, c = 10100.
Then for all n ≥ n₀,
100n + 10000 ≤ 100n² + 10000n²  # because n ≥ 1
= 10100n²
= cn²
Therefore, by the definition of big-Oh, 100n + 10000 is in O(n²). ∎
Quick note
The choice of n₀ and c is not unique.
There can be many (actually, infinitely many) different combinations of n₀ and c that would make the proof work.
It depends on what inequalities you use while doing the upper/lower-bounding.
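To illustrate this non-uniqueness numerically (my own spot-check, not from the slides), several quite different (c, n₀) pairs all satisfy the Example 1 inequality on sampled points:

```python
# Spot-check several witness pairs for 100n + 10000 <= c * n^2.
# Sampling is not a proof, but it shows many (c, n0) combinations work.

def holds(c, n0, samples=1000):
    return all(100 * n + 10000 <= c * n * n for n in range(n0, n0 + samples))

for c, n0 in [(10100, 1), (200, 100), (101, 10000)]:
    print((c, n0), holds(c, n0))  # all True

print((1, 1), holds(1, 1))  # False: c = 1, n0 = 1 is too weak
```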
Example 2
Prove that 5n² - 3n + 20 is in O(n²)
Over-estimate and simplify:

Pick n₀ = 1

5n² - 3n + 20
≤ 5n² + 20  # remove a negative term
≤ 5n² + 20n²  # because n ≥ 1
= 25n²

Pick c = 25
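The slides stop after picking the witnesses; following the Example 1 template, a full writeup (my reconstruction, not from the slides) would be:

```latex
\textbf{Proof:} Choose $n_0 = 1$, $c = 25$. Then for all $n \ge n_0$,
\begin{align*}
5n^2 - 3n + 20 &\le 5n^2 + 20 && \text{(remove a negative term)} \\
               &\le 5n^2 + 20n^2 && \text{(because } n \ge 1\text{)} \\
               &= 25n^2 = cn^2.
\end{align*}
Therefore, by the definition of big-Oh, $5n^2 - 3n + 20 \in O(n^2)$. $\blacksquare$
```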
Takeaway

● To upper-bound (big-Oh): over-estimate and simplify until you reach a term of the form c·g(n).
● To lower-bound (big-Omega): under-estimate and simplify.
● Choose n₀ and c so that every inequality in the chain holds for all n ≥ n₀.
The formal mathematical definition of big-Omega

Definition of big-Omega

f(n) ∈ Ω(g(n)) if and only if there exist constants c > 0 and n₀ > 0 such that for all n ≥ n₀, f(n) ≥ c·g(n).

The only difference from the big-Oh definition is the direction of the inequality (≥ instead of ≤).
Example 3
Prove that 2n³ - 7n + 1 = Ω(n³)
Under-estimate and simplify:

2n³ - 7n + 1
= n³ + n³ - 7n + 1  # split the higher-order term
≥ n³ + 1  # because n³ - 7n > 0 when n ≥ 3
≥ n³

Pick n₀ = 3 (because we want n³ - 7n > 0)
Pick c = 1
Write up the proof that 2n³ - 7n + 1 is in Ω(n³)

Proof:
Choose n₀ = 3, c = 1.
Then for all n ≥ n₀,
2n³ - 7n + 1 = n³ + (n³ - 7n) + 1
≥ n³ + 1  # because n ≥ 3
≥ n³ = cn³
Therefore, by the definition of big-Omega, 2n³ - 7n + 1 is in Ω(n³). ∎
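As a quick numeric spot-check of these witnesses (my own, and not a proof), and of why n₀ = 3 is needed:

```python
# Check 2n^3 - 7n + 1 >= 1 * n^3 on sampled n >= 3 (c = 1, n0 = 3).
ok = all(2 * n**3 - 7 * n + 1 >= n**3 for n in range(3, 1000))
print(ok)  # True

# At n = 2 the bound fails (3 >= 8 is false), which is why n0 = 3 was chosen.
print(2 * 2**3 - 7 * 2 + 1 >= 2**3)  # False
```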
Takeaway

Additional tricks learned:
● Splitting a higher-order term, e.g., 2n³ = n³ + n³
● Choosing n₀ to be however large you need it to be
The formal mathematical definition of big-Theta

Definition of big-Theta

f(n) ∈ Θ(g(n)) if and only if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).

In other words, if you want to prove big-Theta, just prove both big-Oh and big-Omega, separately.
Summary of Asymptotic Notations

● We use functions to describe algorithm runtime
○ number of steps as a function of input size
● Big-Oh/Omega/Theta are used for describing function growth rates
● A proof for big-Oh or big-Omega is basically a chain of inequalities
○ choose the n₀ and c that make the chain work
ConcepTest
ConcepTest
Handout, Exercise 1
ConcepTest
Analyzing Algorithm Runtime Using the Asymptotic Notations
Runtime Analysis

● We can use the asymptotic notations to describe algorithm runtimes.
● We will first focus on iterative algorithms.
● The strategy: sum the work done by all the iterations of the algorithm, then determine a Θ bound.
● In this course, we focus on worst-case analysis.
○ What is the maximum number of operations used by the algorithm on an input of size n?
Example
● total = total + 1 occurs at least as often as any other operation
● So, we can count the number of times this operation occurs, and use that count as an asymptotic bound on the algorithm’s runtime. (Why?)
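The algorithm on this slide was an image that did not survive the transcript. As a hypothetical stand-in (the loop structure here is my assumption, not the slide's), this is the typical shape of such an example, where `total = total + 1` is the most frequent operation:

```python
# A typical nested-loop example: count how many times the dominant
# operation (total = total + 1) executes for an input of size n.
def count_steps(n):
    total = 0
    for i in range(n):           # outer loop: n iterations
        for j in range(i):       # inner loop: i iterations
            total = total + 1    # the operation we count
    return total

# total = 0 + 1 + ... + (n-1) = n(n-1)/2, so the operation count,
# and hence the runtime, is Theta(n^2).
print(count_steps(10))  # 45
```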
ConcepTest
ConcepTest
Bounding the sum
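The sum worked on this slide is not preserved in the transcript. A standard example of the technique, bounding ∑ᵢ₌₁ⁿ i from above and below to get a Θ bound, looks like this (my illustration, not necessarily the slide's sum):

```latex
% Over-estimate: replace every term by the largest one.
\sum_{i=1}^{n} i \;\le\; \sum_{i=1}^{n} n \;=\; n^2
% Under-estimate: keep only the larger half of the terms.
\sum_{i=1}^{n} i \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} i
  \;\ge\; \frac{n}{2} \cdot \frac{n}{2} \;=\; \frac{n^2}{4}
% Together: \sum_{i=1}^{n} i \in \Theta(n^2).
```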
ConcepTest