COMP 320: Introduction to Computer Organization



Asymptotes: Why?

How to describe an algorithm’s running time (or space, …)?

How does the running time depend on the input?

T(x) = running time for instance x

Problem:

Impractical to use, e.g., “15 steps to sort [3 9 1 7], 13 steps to sort [1 2 0 3 9], …” Need to abstract away from the individual instances.
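To make the per-instance bookkeeping concrete, here is a minimal sketch that counts the comparisons insertion sort makes on individual instances. The cost model (comparisons) and the function name are my own; the slide's "15 steps" and "13 steps" figures use an unspecified notion of a step, so the printed counts will differ.

def insertion_sort_steps(a):
    """Sort a copy of `a` and return the number of comparisons performed."""
    a, steps = list(a), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            steps += 1                       # one comparison of adjacent elements
            if a[j - 1] <= a[j]:
                break
            a[j - 1], a[j] = a[j], a[j - 1]  # out of order: swap and keep going
            j -= 1
    return steps

print(insertion_sort_steps([3, 9, 1, 7]))     # T([3 9 1 7]): one count per instance
print(insertion_sort_steps([1, 2, 0, 3, 9]))  # T([1 2 0 3 9]): another instance, another count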


Asymptotes: Why?

Standard solution:

Abstract based on the size of the input.

How does the running time depend on the input?

T(n) = running time for instances of size n

Problem:

Time also depends on other factors.

E.g., on sortedness of array.
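A quick empirical illustration of this point (my example, not the slides': it times CPython's built-in sort, which is adaptive Timsort): two inputs of the same size n can take noticeably different time depending on how sorted they already are.

import random
import time

n = 1_000_000
already_sorted = list(range(n))
shuffled = random.sample(range(n), n)        # same size n, different order

for label, data in [("already sorted", already_sorted), ("shuffled", shuffled)]:
    start = time.perf_counter()
    sorted(data)                             # same algorithm, same input size n
    print(label, time.perf_counter() - start)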


Asymptotes: Why?

Solution:

Provide a bound over these instances.

Worst case: T(n) = max{ T(x) | x is an instance of size n }. Most common; the default.

Best case: T(n) = min{ T(x) | x is an instance of size n }.

Average case: T(n) = Σ_{|x| = n} Pr{x} · T(x). Determining the input probability distribution can be difficult.
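A small worked example of all three cases, using linear search rather than sorting for simplicity (the algorithm choice, the comparison-count cost model, and the uniform distribution over target positions are all my assumptions):

from fractions import Fraction

def linear_search_comparisons(a, target):
    """Comparisons a linear scan makes before finding `target` (or giving up)."""
    for i, x in enumerate(a):
        if x == target:
            return i + 1
    return len(a)

n = 10
a = list(range(n))
costs = [linear_search_comparisons(a, t) for t in a]  # one instance per target position
print(max(costs))                        # worst case: n comparisons
print(min(costs))                        # best case: 1 comparison
print(Fraction(sum(costs), len(costs)))  # average case (uniform Pr{x}): (n+1)/2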


Asymptotes: Why?

What’s confusing about this notation?

Worst case: T(n) = max{ T(x) | x is an instance of size n }

Best case: T(n) = min{ T(x) | x is an instance of size n }

Average case: T(n) = Σ_{|x| = n} Pr{x} · T(x)

There are two different kinds of functions here: T(instance) and T(size of instance). We won't use the T(instance) notation again, so this can be ignored.


Asymptotes: Why?

Problem:

T(n) = 3n² + 14n + 17. Too much detail: the constants may reflect implementation details, and the low-order terms are insignificant.

Solution:

Ignore the constants & low-order terms.

(Omitted details are still important pragmatically.)

n          1     10     100       1000         10000
3n²        3     300    30,000    3,000,000    300,000,000
14n+17     31    157    1,417     14,017       140,017

3n² > 14n+17 for all "large enough" n.
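A side computation (not on the slide) pinning down where "large enough" starts for this particular pair of terms:

# Smallest n at which 3n^2 overtakes 14n + 17.
n = 1
while 3 * n * n <= 14 * n + 17:
    n += 1
print(n)   # 6: at n = 5 we have 75 < 87, but at n = 6 we have 108 > 101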

Upper Bounds

Creating an algorithm proves we can solve the problem within a given bound.

But another algorithm might be faster.

E.g., sorting an array.

Insertion sort ∈ O(n²). What are example algorithms for O(1), O(log n), O(n), O(n log n), O(n²), O(n³), O(2ⁿ)?
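One reasonable answer set (my suggestions; the lecture may have used different examples), with binary search written out as the O(log n) representative:

# Commonly cited representatives for each class:
#   O(1)        array indexing, or hash-table lookup in expectation
#   O(log n)    binary search on a sorted array (below)
#   O(n)        linear scan for the maximum element
#   O(n log n)  merge sort, heapsort
#   O(n^2)      insertion sort (worst case), selection sort
#   O(n^3)      naive matrix multiplication
#   O(2^n)      enumerating all subsets of an n-element set

def binary_search(a, target):
    """Return an index of `target` in sorted list `a`, or -1, in O(log n) comparisons."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 7, 9], 7))   # 2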


Lower Bounds

Sometimes can prove that we cannot compute something without a sufficient amount of time.

That doesn't necessarily mean we know how to compute it in time matching this lower bound.

E.g., sorting an array.

# of comparisons needed in the worst case ∈ Ω(n log n). (Shown in COMP 482.)
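A sketch of the standard decision-tree argument behind this bound (presumably the proof given in COMP 482): a comparison sort must distinguish all n! orderings of the input, and a binary decision tree of height h has at most 2^h leaves, so

\[
  2^{h} \ge n!
  \quad\Longrightarrow\quad
  h \;\ge\; \log_2 n! \;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n \log n).
\]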


Definitions: O, Ω

T(n) ∈ O(g(n))  ⇔  ∃ constants C, k > 0 such that ∀ n ≥ k, T(n) ≤ C · g(n).

T(n) ∈ Ω(g′(n))  ⇔  ∃ constants C′, k′ > 0 such that ∀ n ≥ k′, T(n) ≥ C′ · g′(n).

(Figure: T(n) stays below C · g(n) for n ≥ k, and above C′ · g′(n) for n ≥ k′.)
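These definitions can be spot-checked numerically (an illustration only: testing finitely many n is evidence, not a proof; the helper name and the witness constants are my own choices):

def holds_up_to(T, g, C, k, N=10_000):
    """Check T(n) <= C * g(n) for every integer n with k <= n <= N."""
    return all(T(n) <= C * g(n) for n in range(k, N + 1))

# 2n + 13 is O(n): the witnesses C = 15, k = 1 work, since 2n + 13 <= 2n + 13n = 15n for n >= 1.
print(holds_up_to(lambda n: 2 * n + 13, lambda n: n, C=15, k=1))        # True

# 2^n is not O(n): no constant works; even C = 1000 fails once n is moderately large.
print(holds_up_to(lambda n: 2 ** n, lambda n: n, C=1000, k=1, N=30))    # False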

Examples: O, Ω

2n+13 ∈ O(?)  Ω(?)

O(n). Also O(n²), O(5n), …: can always weaken the bound.

Ω(n); also Ω(log n), Ω(1), ….

2ⁿ ∈ O(n)? Ω(n)?

Given any C, 2ⁿ ≥ C · n for all but small n. Thus 2ⁿ ∈ Ω(n), not O(n).

n log n ∈ O(n⁵)? Ω(n⁵)?

It is O(n⁵), though a weak bound. It is not Ω(n⁵): given any C, n log n < C · n⁵ for all large enough n.
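Explicit witness constants for the first example (one valid choice among many):

\[
  2n+13 \le 15n \ \text{for all } n \ge 1 \ \Rightarrow\ 2n+13 \in O(n),
  \qquad
  2n+13 \ge 2n \ \text{for all } n \ge 1 \ \Rightarrow\ 2n+13 \in \Omega(n).
\]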


Definitions: Θ

T(n) ∈ Θ(g(n))  ⇔  T(n) ∈ O(g(n)) and T(n) ∈ Ω(g(n)).

Ideally, find algorithms that are asymptotically as good as possible.
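Continuing the earlier example, the two witness bounds combine exactly as this definition requires:

\[
  2n+13 \in O(n) \ \text{and}\ 2n+13 \in \Omega(n)
  \ \Longrightarrow\ 2n+13 \in \Theta(n).
\]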


Notation

O(), Ω(), Θ() are sets of functions.

But it is common to abuse the notation, writing T(n) = O(…) instead of T(n) ∈ O(…), as well as T(n) = f(n) + O(…).
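For instance, with the running time from the earlier slide, the additive form hides exactly the low-order detail we chose to ignore:

\[
  T(n) \;=\; 3n^2 + 14n + 17 \;=\; 3n^2 + O(n),
\]

meaning that T(n) minus 3n² is some function belonging to O(n).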