
Complexity Analysis

(newer version)

What is an algorithm?

An algorithm is the idea behind a computer program. It stays the same no matter:

- which kind of hardware it is running on
- which programming language it is written in

An algorithm solves a general, well-specified problem. It is specified by:

- describing the set of instances (input) it must work on
- describing the desired properties of the output

Important Properties of Algorithms

Correct

always returns the desired output for all legal instances of the problem.

Efficient

Can be measured in terms of:

- time
- space

Time tends to be more important.

Expressing Algorithms

- English description (most easily expressed)
- Pseudocode
- High-level programming language (most precise)

Moving from an English description toward a programming language, the expression becomes more precise but less easily expressed.

Pseudocode

A shorthand for specifying algorithms:

- leaves out the implementation details
- leaves in the essence of the algorithm

Algorithm ArrayMax(A, n)
    Input: an array A storing n ≥ 1 integers
    Output: the maximum element in A
    currentMax ← A[0]
    for i ← 1 to n−1 do
        if currentMax < A[i] then
            currentMax ← A[i]
    return currentMax
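For reference, a direct Python rendering of this pseudocode (a minimal sketch; the names follow the slide):

def array_max(A, n):
    # currentMax <- A[0]
    current_max = A[0]
    # for i <- 1 to n-1 do
    for i in range(1, n):
        if current_max < A[i]:
            current_max = A[i]
    return current_max

print(array_max([3, 1, 4, 1, 5], 5))   # prints 5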

Analysis of Algorithms

Why analyze algorithms?


- Evaluate algorithm performance
- Compare different algorithms

Analyze what about them?

- Running time, memory usage, solution quality
- Worst-case and typical case

Analysis of algorithms compares algorithms, not programs.

Analysis of Algorithms (Cont.)


If each line takes constant time, does the whole algorithm (any algorithm) take constant time? Wrong! Although some algorithms may take constant time, most algorithms vary their number of steps based on the size of the instance we are trying to solve. Therefore the efficiency of an algorithm is normally stated as a function of the problem size.

We generally use the variable n to represent the problem size.

From an actual implementation, we could find that SuperDuper Sort takes 0.6n² + 0.3n + 0.45 seconds on a Pentium 3. Plug in a value for n and you have how long it takes.
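For illustration, plugging a value into that formula is a one-liner (SuperDuper Sort and its coefficients come from the slide; the helper name is mine):

def superduper_sort_seconds(n):
    # Hypothetical timing model from the slide: 0.6n^2 + 0.3n + 0.45 seconds
    return 0.6 * n**2 + 0.3 * n + 0.45

print(superduper_sort_seconds(100))   # 6030.45 seconds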

Algorithm Complexity

Worst Case Complexity:

The function defined by the maximum number of steps taken on any instance of size n

Best Case Complexity:

The function defined by the minimum number of steps taken on any instance of size n

Average Case Complexity:

The function defined by the average number of steps taken over all instances of size n
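To make the three cases concrete, here is a hypothetical step-counting sketch (sequential search is my example, not the slides'): the best case finds the key on the first comparison (1 step); the worst case scans all n elements.

def search_steps(lst, key):
    # Count the comparisons a sequential search performs (illustration only).
    steps = 0
    for x in lst:
        steps += 1
        if x == key:
            break
    return steps

data = [7, 3, 9, 1]
print(search_steps(data, 7))   # best case: 1 step
print(search_steps(data, 1))   # worst case: n = 4 steps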

Best, Worst, and Average Case Complexity


[Figure: number of steps vs. n (input size); the worst-case complexity curve lies above the average-case curve, which lies above the best-case curve.]

Doing the Analysis

It's hard to estimate the running time exactly:

- the best case depends on the input
- the average case is difficult to compute

So we usually focus on worst-case analysis:

- easier to compute
- usually close to the actual running time

[Figure: the actual worst-case function bracketed between an upper bound and a lower bound.]

Strategy: try to find upper and lower bounds of the worst-case function.

Running Time Analysis


Comparing Algorithms
Establish a relative order among different algorithms in terms of their relative rates of growth. The rates of growth are expressed as functions, generally in terms of the number of inputs n.

Asymptotic Analysis

Asymptotic analysis describes the relative efficiency of an algorithm as n gets very large. When you're dealing with small input sizes, most algorithms will do; when the input size is very large, things change. In the example, it is easy to see that for very large n, g(n) grows faster than f(n).

Take, for instance, the value n = 20000000.

Remember that the goal here is to compare algorithms. In practice, if you're writing small programs, asymptotic analysis may not be that important.
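As a numeric sketch of this point, consider two hypothetical functions (chosen for illustration; the slide's own f and g are not reproduced here): f(n) = 100n and g(n) = n².

def f(n):
    return 100 * n   # hypothetical: slower growth, large constant factor

def g(n):
    return n * n     # hypothetical: faster growth

for n in (10, 100, 1000, 20000000):
    print(n, f(n), g(n))
# n = 10: g(n) < f(n); n = 100: they are equal; n = 20000000: g(n) is 200000x larger.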

A simple comparison

Let's assume that you have 3 algorithms to sort a list: f(n) = n log₂ n, g(n) = n², h(n) = n³. Let's also assume that each step takes 1 microsecond (10⁻⁶ s).

n        n log₂ n   n²          n³
10       33.2 µs    100 µs      1000 µs
100      664 µs     10000 µs    1 s
1000     9966 µs    1 s         16 min
100000   1.7 s      2.8 hours   31.7 years
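A short script to reproduce the table's figures (a sketch, assuming 1 µs per step as the slide does):

import math

def seconds(steps):
    return steps * 1e-6   # each step takes 1 microsecond

for n in (10, 100, 1000, 100000):
    print(n,
          seconds(n * math.log2(n)),   # f(n) = n log2 n
          seconds(n ** 2),             # g(n) = n^2
          seconds(n ** 3))             # h(n) = n^3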

Most of the algorithms discussed here will be given in terms of common functions: polynomials, logarithms, exponentials, and products of these functions.

Big-Oh Notation
T(n) = O(f(n)) if there are positive constants c and n₀ such that T(n) ≤ c·f(n) for n ≥ n₀. This says that the function T(n) grows at a rate no faster than f(n); thus, f(n) is an upper bound on T(n).

Big-Oh Upper Bound

[Figure: c·f(n) lying above T(n) for all n ≥ n₀, illustrating T(n) = O(f(n)).]

Example

Prove that 7n³ + 2n² = O(n³).
Since 7n³ + 2n² ≤ 7n³ + 2n³ = 9n³ for n ≥ 1,
then 7n³ + 2n² = O(n³) with c = 9 and n₀ = 1.

Similarly, we can prove that 7n³ + 2n² = O(n⁴). The first bound is tighter.
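A quick numeric sanity check of the bound (an illustration, not a proof):

def T(n):
    return 7 * n**3 + 2 * n**2

# Verify T(n) <= 9n^3 from n0 = 1 up to an arbitrary cutoff.
assert all(T(n) <= 9 * n**3 for n in range(1, 10001))
print("T(n) <= 9n^3 holds for 1 <= n <= 10000")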

Tighter Upper Bound

[Figure: T(n) lying under both the tighter 9n³ bound and a looser c·n⁴ bound for n ≥ n₀.]

Big-Omega Notation

T(n) = Ω(f(n)) if there are positive constants c and n₀ such that T(n) ≥ c·f(n) for n ≥ n₀. This says that the function T(n) grows at a rate no slower than f(n); thus, f(n) is a lower bound on T(n).

Big-Omega Lower Bound

[Figure: T(n) lying above c·f(n) for all n ≥ n₀, illustrating T(n) = Ω(f(n)).]

Example

Prove that 2n + 5n² = Ω(n²).
Since 2n + 5n² > 5n² > 1·n² for n ≥ 1,
then 2n + 5n² = Ω(n²) with c = 1 and n₀ = 1.

Similarly, we can prove that 2n + 5n² = Ω(n). The first bound is tighter.
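The same kind of numeric sanity check, now for the lower bound:

def T(n):
    return 2 * n + 5 * n**2

# Verify T(n) >= 1 * n^2 from n0 = 1 up to an arbitrary cutoff.
assert all(T(n) >= n**2 for n in range(1, 10001))
print("T(n) >= n^2 holds for 1 <= n <= 10000")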

Tighter Lower Bound

[Figure: the tighter Ω(n²) bound lies closer to T(n) than the looser Ω(n) bound for n ≥ n₀.]

Big-Theta Notation
T(n) = Θ(f(n)) if and only if T(n) = O(f(n)) and T(n) = Ω(f(n)). This says that the function T(n) grows at the same rate as f(n). Put another way: T(n) = Θ(f(n)) if there are positive constants c₁, c₂, and n₀ such that c₁·f(n) ≤ T(n) ≤ c₂·f(n) for n ≥ n₀.

[Figure: T(n) sandwiched between c₁f(n) and c₂f(n) for n ≥ n₀.]

Example

If the two functions f(n) and g(n) are proportional, then f(n) = Θ(g(n)). Since log_A n = log_B n / log_B A, then log_A n = Θ(log_B n),

i.e., the base of the logarithm is irrelevant.
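A tiny check of the change-of-base identity: the ratio log₂ n / log₁₀ n is the constant log₂ 10 ≈ 3.32, whatever n is.

import math

for n in (10, 1000, 10**9):
    print(math.log2(n) / math.log10(n))   # always ~3.3219 = log2(10)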

Little-oh

T(n) = o(f(n)) if and only if T(n) = O(f(n)) and T(n) ≠ Θ(f(n)). This says that the function T(n) grows at a rate strictly less than f(n).
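Equivalently (when the limit exists), T(n)/f(n) → 0 as n grows. A quick numeric illustration with T(n) = n and f(n) = n² (my example):

# n = o(n^2): the ratio n / n^2 = 1/n shrinks toward 0.
for n in (10, 1000, 10**6):
    print(n / n**2)   # 0.1, 0.001, 1e-06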

Hierarchy of Growth Rates

c < log n < log² n < logᵏ n < n < n log n < n² < n³ < 2ⁿ < 3ⁿ < n! < nⁿ