M2
Logic, Algorithms and Data Structures
The Big O(h)
How do we measure complexity? There are four interesting aspects to complexity:
Accuracy
Speed: number of operations required in the (best, average, worst) case
Space: memory usage (in memory and on disk)
Code: readability and portability of the code
Algorithmic speed
The Big O(h) notation (“Order of magnitude”) O(n), O(n^2), O(n log n), …
Refers to the performance of the algorithm in the worst case
An approximation to make it easier to discuss the relative performance of algorithms
Expresses the rate of growth in computational resources needed
Some common expressions
O(1): constant time, the best for any algorithm; it takes a fixed amount of time regardless of data size
O(n): linear time; grows in direct proportion to the data size
O(log n): logarithmic increase of time in relation to data size
O(n^2): time increases with the square of the data size
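As a rough illustration, each class above can be matched to a familiar operation. These example functions are ours, not from the slides, and are only a sketch of typical representatives:

```python
# Hypothetical examples of each complexity class (function names are ours).

def first_element(data):          # O(1): fixed work regardless of size
    return data[0]

def linear_search(data, target):  # O(n): may scan every element
    for i, v in enumerate(data):
        if v == target:
            return i
    return -1

def binary_search(data, target):  # O(log n): halves the search range each step
    lo, hi = 0, len(data) - 1     # requires sorted input
    while lo <= hi:
        mid = (lo + hi) // 2
        if data[mid] == target:
            return mid
        if data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def all_pairs(data):              # O(n^2): every element against every other
    return [(a, b) for a in data for b in data]
```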
Calculating the Big O
Calculate taking the worst case into consideration!
Count the number of operations needed to complete the algorithm.
The highest-order term is usually the dominating rate of growth (log n, n, n^2, n^3)
Example algorithm
1: y := 0
2: x := 0
3: while x < N do begin
4: z := x * 10
5: y := y + z
6: x := x + 1
7: end
Constant time: lines 1, 2
Variable time: N * (lines 4, 5, 6)
Complexity: O(2 + 3n)
Constants in Big O Notation
We can usually ignore the constants when reasoning about performance.
By O(2 + 3n) we mean: in the worst case, the algorithm goes through the entire data set of n elements, performing 3 operations for each, plus 2 fixed operations.
Compare against O(10 + 50n), O(log n), O(n^2): n is the highest-order term and determines the rate of growth, so we simplify to O(n).
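The 2 + 3n count can be checked directly by translating the pseudocode to Python with an explicit operation counter (the counter is our instrumentation, not part of the original algorithm):

```python
def count_operations(N):
    """Run the example algorithm, counting assignment operations performed."""
    ops = 0
    y = 0; ops += 1          # line 1: constant
    x = 0; ops += 1          # line 2: constant
    while x < N:
        z = x * 10; ops += 1  # line 4: once per iteration
        y = y + z;  ops += 1  # line 5: once per iteration
        x = x + 1;  ops += 1  # line 6: once per iteration
    return ops

# Exactly 2 fixed operations plus 3 per iteration: 2 + 3n
```

Doubling N roughly doubles the count, which is why only the linear term matters for the rate of growth.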
Another example algorithm
i := 2
while i < N do
begin
  A[i] := i
  i := i * 2
end;
[Chart: growth of the loop count as N increases]
O(log n)
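A quick way to verify the O(log n) claim is to run the doubling loop in Python and count iterations (the counter is our addition): multiplying N by two adds only about one more pass.

```python
def fill_powers(N):
    """The doubling loop: i = 2, 4, 8, ... while i < N.
    Returns the number of iterations, which grows as ~log2(N)."""
    A = {}
    iterations = 0
    i = 2
    while i < N:
        A[i] = i           # touch one slot per iteration
        i = i * 2          # i doubles, so the loop runs ~log2(N) times
        iterations += 1
    return iterations
```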
A variation of the same
for j := 1 to N do
begin
  i := 2
  while i < N do
  begin
    A[i] := i
    i := i * 2
  end;
end;
O(n log n)
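The same counting trick shows where O(n log n) comes from: the outer loop runs N times, and each pass repeats the ~log2(N) inner loop. A Python sketch (the counter is ours):

```python
def nested_loop_iterations(N):
    """Count inner-loop passes for the nested version: N outer passes,
    each running the doubling loop ~log2(N) times, giving ~N * log2(N)."""
    total = 0
    for j in range(1, N + 1):  # outer loop: N times
        i = 2
        while i < N:           # inner loop: ~log2(N) times
            i = i * 2
            total += 1
    return total
```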
Performance of insertion sort?
If the data is sorted: O(n)
If the data is reversed: O(n*n) = O(n^2)
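Both cases can be seen concretely with a Python insertion sort instrumented with a comparison counter (the counter is our addition): sorted input needs only n-1 comparisons, reversed input needs n(n-1)/2.

```python
def insertion_sort_comparisons(data):
    """Insertion sort; returns (sorted list, number of key comparisons)."""
    a = list(data)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift the larger element right
                j -= 1
            else:
                break            # sorted input exits after one comparison
        a[j + 1] = key
    return a, comparisons
```

For n = 10: sorted input costs 9 comparisons (linear), reversed input costs 45 (quadratic).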
Getting rid of m through a different data structure. This is what we have:
l: 8 7 3 5 9   (the unsorted input)
m: 7 8 ? ? ?   (the sorted output being built)
Modifying both l and m takes time, but most importantly, memory!
Our new data structure
Insert Sort Version 2:
l: 7 8 | 3 5 9   (s marks how far the sorted prefix reaches)
int [] l;  // An array
int s;     // How much of l is sorted
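A sketch of Version 2 in Python, under the representation the slide gives (one array `l` plus an index `s` for the sorted prefix): sorting happens in place, so no second array `m` is needed.

```python
def insertion_sort_v2(l):
    """In-place insertion sort: l[0:s] is the sorted prefix, grown one step at a time."""
    s = 1                       # l[0:1] is trivially sorted
    while s < len(l):
        key = l[s]              # next element to insert
        j = s - 1
        while j >= 0 and l[j] > key:
            l[j + 1] = l[j]     # shift larger elements right
            j -= 1
        l[j + 1] = key          # insert into the sorted prefix
        s += 1                  # the sorted prefix grows
    return l
```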
Measuring the performance
[Chart: running time of V1, V2, V3 — V1 and V2 behave alike, V3 differs]
Effect on space usage
[Chart: space usage of V1, V2, V3 — V1 differs, V2 and V3 behave alike]
What happened with accuracy? Of course, you need to verify your algorithms! Simple algorithms are easy to implement; clever algorithms, not so much. Most clever algorithms have already been invented and proven correct.
Refer to your literature and known sources!
Comparing our algorithms
Time complexity: V1 and V2 alike; V3 differs
Space complexity: V1 alone; V2 and V3 alike
Readability: V1 good, V2 OK, V3 tricky
Which is the better one?
Choosing an algorithm
Implementations of algorithms vary in speed, space usage, and code quality.
Choosing an algorithm is a tradeoff between those qualities.
Word of warning: implement clever algorithms, don't invent clever algorithms.
Relating ADTs to the Big O(h)
Final notes
Algorithms affect performance through: accuracy, speed, space usage, and code readability.
Choose wisely, prioritising between these qualities!