1

Algorithms

An algorithm is simply a list of steps required to solve a particular problem

They are designed as abstractions of processes carried out by computer programs

Examples include: sorting, determining if a student qualifies for financial aid, and determining the steps to set up a dating service

2

Algorithms

In some cases we have only one algorithm for a problem, or the problem is so straightforward that there is no need to consider anything other than the obvious approach

Some other problems have many known algorithms; we obviously want to choose the "best" algorithm

Other problems have no known algorithm!

3

What is the "best" Algorithm?

Traditionally we focused on two questions

1. How fast does it run?

In the early days, this was measured by timing an implementation of the algorithm

It was common to hear that a "new" SuperDuper Sort could sort a list of 1 million integers in 17 seconds, whereas Junk Sort required 43 seconds (a rough version of this kind of timing is sketched after this list)

2. How much memory does it require?
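Picking up the timing question above: a minimal sketch of this kind of wall-clock measurement, assuming Python (the name junk_sort is just a stand-in for the slide's "Junk Sort"; the built-in sorted() plays the part of the faster competitor). The numbers it prints depend on the machine, the interpreter and the current load, which is exactly the weakness discussed on the next slides.

import random
import time

def junk_sort(items):
    # deliberately slow selection sort, standing in for "Junk Sort"
    items = list(items)
    for i in range(len(items)):
        smallest = min(range(i, len(items)), key=items.__getitem__)
        items[i], items[smallest] = items[smallest], items[i]
    return items

data = [random.randint(0, 1000000) for _ in range(5000)]

start = time.perf_counter()
junk_sort(data)
print("selection sort:", time.perf_counter() - start, "seconds")

start = time.perf_counter()
sorted(data)  # built-in sort as the "SuperDuper" competitor
print("built-in sort:", time.perf_counter() - start, "seconds")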

4

Analysis of Algorithms

The running time of a program depends on the operating system, the machine, the compiler/interpreter used, etc.

Analysis of algorithms compares algorithms, not programs

It is based on the premise that the more work an algorithm requires, the longer its implementation will run: sorting 1 million items ought to take longer than sorting 1000

But if we are comparing algorithms that have not yet been implemented, how can we express their performance? How can we "measure" the performance of an algorithm?

5

Analysis of Algorithms

We want an expression that can be applied to any computer. This is only possible by stating the efficiency in terms of some critical operations

These operations depend on the problem

We could, for instance, say that in sorting algorithms the critical operation is the number of times two elements are compared
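As an illustration, here is a small sketch, assuming Python (the name selection_sort_comparisons is purely illustrative), that counts the comparisons performed by a simple selection sort, i.e. the critical operation suggested above:

def selection_sort_comparisons(items):
    # returns the number of element-to-element comparisons performed
    items = list(items)
    comparisons = 0
    for i in range(len(items)):
        smallest = i
        for j in range(i + 1, len(items)):
            comparisons += 1          # the critical operation
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return comparisons

print(selection_sort_comparisons([5, 3, 8, 1]))       # 6 comparisons for n = 4
print(selection_sort_comparisons(list(range(100))))   # 4950 comparisons for n = 100

This count is the same on any machine: selection sort always performs n(n-1)/2 comparisons for n items, regardless of processor, compiler or interpreter.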

6

Analysis of Algorithms

In general we do analysis of algorithms using the RAM model (Random Access Machine)

Instructions are executed one after the other; there is no concurrency

Basic operations take the same time (constant time)

We normally say that each line (step) in the algorithm takes time 1 (one)

7

Analysis of Algorithms

But you could be asking: If each line takes a constant time then the whole algorithm (any algorithm) will take constant time, right?

Wrong!

Although some algorithms may take constant time, most algorithms vary their number of steps based on the size of the instance we're trying to solve.

8

Number of steps

It is easy to see that most algorithms vary their number of steps, as in the two loops below

for i = 1 .. N
    a = a + 2
    i = i + 1

for i = 1 .. N
    a = a + 2
    i = i + 2

So we must also consider the number of steps it will take to process the N items.
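A quick sketch of what this slide is getting at, assuming Python and reading each pseudocode loop as a while loop whose update is its last line: the version that advances i by 2 performs roughly half as many steps.

def count_steps(n, increment):
    # count how many times the loop body runs as i goes from 1 to n
    i, a, steps = 1, 0, 0
    while i <= n:
        a = a + 2
        i = i + increment
        steps += 1
    return steps

print(count_steps(1000, 1))  # 1000 iterations
print(count_steps(1000, 2))  # 500 iterations

Either way the number of steps grows with N, which is why efficiency has to be stated as a function of the problem size (next slide).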

9

Analysis of Algorithms

Therefore the efficiency of an algorithm is normally stated as a function of the problem size

We generally use the variable n to represent the problem size

From an implementation, we could find out that our SuperDuper Sort takes 0.6n^2 + 0.3n + 0.45 seconds on a Pentium.

Plug in a value for n and you have how long it takes
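For example, plugging n = 10 into that (hypothetical) formula gives 0.6(10)^2 + 0.3(10) + 0.45 = 60 + 3 + 0.45 = 63.45 seconds on that particular machine.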

10

Analysis of Algorithms

But we're not yet independent of the machine.

Remember that we said the formula for SuperDuper Sort is valid only for a Pentium

We need to identify the most important aspect of the function that represents the running time of an algorithm

Which one is the "best"?

f(n) = 10000000n
g(n) = n^2 + n

11

Asymptotic Analysis

Asymptotic analysis of an algorithm describes the relative efficiency of an algorithm as n gets very large.

In the example it is easy to see that for very large n, g(n) grows faster than f(n)

Take for instance the value n=20000000
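As a rough calculation, f(20000000) = 10000000 x 20000000 = 2 x 10^14, while g(20000000) = (20000000)^2 + 20000000, which is about 4 x 10^14. So at this point g(n) has already overtaken f(n), and the gap only widens as n grows.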

Remember that the goal here is to compare algorithms. In practice, if you're writing small programs, asymptotic analysis may not be that important

When you're dealing with small input size, most algorithms will do

When the input size is very large, things change

12

A simple comparison

Let's assume that you have 3 algorithms to sort a list:

f(n) = n log2 n
g(n) = n^2
h(n) = n^3

Let's also assume that each step takes 1 microsecond (10^-6 s)

n        n log2 n    n^2          n^3
10       33.2 µs     100 µs       1000 µs
100      664 µs      10000 µs     1 sec
1000     9966 µs     1 sec        16 min
100000   1.7 sec     2.8 hours    31.7 years
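The table can be reproduced with a few lines of Python (a sketch; the one-microsecond-per-step figure is the assumption stated above):

import math

def estimated_seconds(steps):
    return steps * 1e-6  # assume each step takes one microsecond

for n in [10, 100, 1000, 100000]:
    nlogn = estimated_seconds(n * math.log2(n))
    quadratic = estimated_seconds(n ** 2)
    cubic = estimated_seconds(n ** 3)
    print(f"n={n}: n log n = {nlogn:.6f} s, n^2 = {quadratic:.6f} s, n^3 = {cubic:.1f} s")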

13

Higher order Term

Most of the algorithms discussed here will be given in terms of common functions: polynomials, logarithms, exponentials and products of these functions

Analyzing the table given earlier, we can see that in an efficiency function we are interested in the term of highest order

If we have a function f(n) = n^3 + n^2, then for the case when n = 100000 the running time of the algorithm is 31.7 years + 2.8 hours

It's clear that a couple of hours does not make much difference if the program is to run for 31.7 years!

14

Higher order Term

In the case above we say that f(n) is O(n^3), meaning that f(n) is of the order of n^3.

This is called big-O notation.

It disregards any constant multiplying the term of highest order, and any term of smaller order: f(n) = 10000000000000n^3 is O(n^3)

15

Common Functions

Constant (1): Very fast. Some hash table algorithms can look up one item from a table of n items in an average time which is constant (independent of the table size)

Logarithmic (lg2 of N): Also very fast. Typical of many algorithms that use (binary) trees.

Linear Time (n): Typical of fast algorithms on a single-processor computer, where all of the input of size n has to be read.

Poly-logarithmic (n log n): Typical of the best sorting algorithms. Considered a good solution

Polynomial (n^2): When a problem of size n can be solved in time n^k, where k is a constant. Small exponents (k <= 3) are OK.

Exponential (2^n): These problems cannot be done in a reasonable time - see next slide

16

Common Functions

Exponential: These are functions that use time k^n, where k is a constant.

Algorithms that grow at this rate are suitable only for small problems.

Unfortunately, the best algorithms known for many problems use exponential time

Much of the work on developing algorithms today is focused on these problems, because they take a huge amount of time to execute (even for reasonably small input sizes)

There is a large variation in the size of different exponential functions (compare 2^(0.0001n) with 2^n), but for large n both become huge

17

Comparison of Algorithms

Big-O Notation: A notation that expresses computing time (complexity) as the term in a function that increases most rapidly relative to the size of a problem

If f(N) = N^4 + 100N^2 + 10N + 50, then f(N) is O(N^4).

N represents the size of the problem.

18

Worst Case and Best Case

If we return to our original question of "how fast does a program run?" we can see that this question is not enough

Inputs vary in the way they are organized and this can influence the number of critical operations performed

Suppose that we are searching for an element in an ordered list. If the target key is the first in the list, our function takes constant time

If the target key is not in the list our function takes O(n), where n is the size of the list
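A sketch of such a search, assuming Python (the name sequential_search is illustrative); it returns the number of elements examined, which makes the best case (target in the first position) and the worst case (target absent) easy to see:

def sequential_search(items, target):
    # items is assumed to be sorted in ascending order
    examined = 0
    for value in items:
        examined += 1
        if value == target:
            return True, examined
        if value > target:   # we are past where the target would be: stop
            return False, examined
    return False, examined

data = list(range(0, 1000, 2))       # 0, 2, 4, ..., 998 (500 items)
print(sequential_search(data, 0))    # best case:  (True, 1)
print(sequential_search(data, 999))  # worst case: (False, 500)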

19

Worst Case and Best Case

The examples above are referred to as best case analysis and worst case analysis.

Which is the really relevant case?

Worst case is more important because it gives us a bound on how long the function might have to run

20

Average Case

In some situations neither the best nor the worst case analysis express well the performance of an algorithm

Average case analysis can be used if necessary. Still, average case analysis is uncommon because:

It may be cumbersome to do an average analysis of non-trivial algorithms

In most cases the "order" of the average case is the same as that of the worst case

21

Comparison of Rates of Growth

[Table comparing rates of growth: N, log2 N, N log2 N, N^2, N^3 and 2^N for increasing values of N]
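Since only the column headings of the original table survive in this transcription, this small Python sketch reproduces the comparison for a few values of N:

import math

print(f"{'N':>6} {'log2 N':>8} {'N log2 N':>10} {'N^2':>12} {'N^3':>15} {'2^N':>22}")
for n in [2, 8, 32, 128]:
    print(f"{n:>6} {math.log2(n):>8.0f} {n * math.log2(n):>10.0f} "
          f"{n**2:>12} {n**3:>15} {2**n:>22}")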

22

Comparison of Linear and Binary Searches
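The figure for this slide is not reproduced here; as a substitute, the sketch below (assuming Python, with illustrative function names) counts the elements examined by a linear search versus a binary search on the same sorted list:

def linear_search_steps(items, target):
    steps = 0
    for value in items:
        steps += 1
        if value == target:
            break
    return steps

def binary_search_steps(items, target):
    steps, low, high = 0, 0, len(items) - 1
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return steps

data = list(range(100000))
print(linear_search_steps(data, 99999))  # 100000 elements examined
print(binary_search_steps(data, 99999))  # about 17 elements examined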

23

Big-O Comparison of List Operations

[Table comparing the Big-O of list operations on an unsorted list versus a sorted list; most entries are missing from this transcription, apart from two surviving O(log N) entries]

24

Review Questions:

1. What problems arise when we "measure" the performance of an algorithm? What problems arise if we time it?

2. What is a “critical operation”?

3. How then do we measure the efficiency of an algorithm?

1. The efficiency of an algorithm is stated as a function of the problem size. We generally use the variable N to represent the problem size.

2. We must also consider the number of steps it will take to process the N items.

4. What is big-O notation? What are the common functions? Give an example of each:

Constant (1):
Logarithmic (lg2 of N):
Linear Time (n):
Poly-logarithmic (n log n):
Polynomial (n^2):
Exponential (2^n):

5. What is Worst Case Analysis?