Transcript Document
Introduction to Algorithms
Chapter 3: Growth of Functions
How fast will your program run?
The running time of your program will depend upon:
The algorithm
The input
Your implementation of the algorithm in a programming language
The compiler you use
The OS on your computer
Your computer hardware
Our motivation: analyze the running time of an algorithm as a function of only simple parameters of the input.
Complexity
Complexity is the number of steps required to solve a problem.
The goal is to find the best algorithm: one that solves the problem in the fewest steps.
Complexity of Algorithms
The size of the problem is a measure of the quantity of the input data, n
The time needed by an algorithm, expressed as a function of the size of the problem it solves, is called the (time) complexity of the algorithm, T(n)
Basic idea: counting operations
Running Time: the number of primitive steps that are executed
Most statements require roughly the same amount of time:
y = m*x + b
c = 5 / 9 * (t - 32)
z = f(x) + g(y)
Each algorithm performs a sequence of basic operations:
Arithmetic: (low + high)/2
Comparison: if ( x > 0 ) …
Assignment: temp = x
Branching: while ( true ) { … }
…
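The counting idea can be made concrete in code. The following is an illustrative Python sketch (not from the slides): a loop that sums 0..n−1 while tallying each assignment, comparison, and arithmetic step as one basic operation.

```python
def sum_first_n(n):
    """Sum 0..n-1, tallying each basic operation
    (assignment, comparison, arithmetic) as one step."""
    ops = 0
    total = 0; ops += 1              # one assignment
    i = 0;     ops += 1              # one assignment
    while True:
        ops += 1                     # one loop test: i < n
        if not (i < n):
            break
        total = total + i; ops += 2  # one addition + one assignment
        i = i + 1;         ops += 2  # one addition + one assignment
    return total, ops                # ops = 2 + (n+1) + 4n = 5n + 3
```

For input size n the tally is 2 + (n + 1) + 4n = 5n + 3 steps, a linear function of n, which previews the T(n) analysis below.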
Basic idea: counting operations
Idea: count the number of basic operations
performed on the input.
Difficulties:
Which operations are basic?
Not all operations take the same amount of time.
Operations take different amounts of time on different hardware or with different compilers.
Measures of Algorithm Complexity
Let T(n) denote the number of operations required
by an algorithm to solve a given class of problems
Often T(n) depends on the input; in such cases one can talk about the
Worst-case complexity,
Best-case complexity, or
Average-case complexity of an algorithm
Alternatively, one can determine bounds (upper or lower) on T(n)
Measures of Algorithm Complexity
Worst-Case Running Time: the longest time for any input of size n
provides an upper bound on running time for any input
Best-Case Running Time: the shortest time for any input of size n
provides a lower bound on running time for any input
Average-Case Behavior: the expected performance averaged over all possible inputs
generally better than worst-case behavior, but sometimes roughly as bad as the worst case
difficult to compute
Example: Sequential Search
Algorithm
// Searches for x in array A of n items
// returns index of found item, or n+1 if not found
Statement                              Step Count
Seq_Search( A[n]: array, x: item){     0
  done = false                         1
  i = 1                                1
  while ((i <= n) and (A[i] <> x)){    n+1
    i = i + 1                          n
  }                                    0
  return i                             1
}                                      0
Total                                  2n + 4
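A runnable Python version of the pseudocode above, with a step counter that mirrors the table (a sketch; Python lists are 0-based, so A[i-1] stands in for the pseudocode's A[i]):

```python
def seq_search(A, x):
    """Sequential search for x in A; returns a 1-based index
    (n+1 if x is absent) plus a step count matching the table."""
    n = len(A)
    steps = 2                 # done = false, i = 1: one step each
    i = 1
    while True:
        steps += 1            # one while-loop test
        if not (i <= n and A[i - 1] != x):
            break
        i = i + 1
        steps += 1            # i = i + 1
    steps += 1                # return i
    return i, steps
```

When x is absent, the loop test runs n+1 times and the increment n times, giving 2 + (n+1) + n + 1 = 2n + 4 steps, matching the table's total; when x sits at A[1], the count is just 4.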
Example: Sequential Search
worst-case running time
when x is not in the array A
in this case, the while loop needs 2(n + 1) comparisons + c other operations
So T(n) = 2n + 2 + c → linear complexity
best-case running time
when x is found at A[1]
in this case, the while loop needs 2 comparisons + c other operations
So T(n) = 2 + c → constant complexity
Order of Growth
For very large input sizes, it is the rate of growth, or order of growth, that matters asymptotically
We can ignore the lower-order terms, since they are relatively insignificant for very large n
We can also ignore the leading term's constant coefficient, since it is not as important for the rate of growth in computational efficiency for very large n
Higher-order functions of n are normally considered less efficient
Asymptotic Notation
Θ, O, Ω, o, ω
Used to describe the running times of algorithms
Instead of an exact running time, we say, e.g., Θ(n²)
Defined for functions whose domain is the set of natural numbers, N
They determine sets of functions; in practice they are used to compare two functions
Asymptotic Notation
By now you should have an intuitive feel for
asymptotic (big-O) notation:
What does O(n) running time mean? O(n²)? O(n lg n)?
Our first task is to define this notation more
formally and completely
Big-O notation
(Upper Bound – Worst Case)
For a given function g(n), we denote by O(g(n)) the set of functions
O(g(n)) = { f(n): there exist positive constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
We say g(n) is an asymptotic upper bound for f(n):
0 ≤ lim_{n→∞} f(n)/g(n) < ∞
O(g(n)) means that as n → ∞, the execution time f(n) is at most c·g(n) for some constant c
What does O(g(n)) running time mean?
The worst-case running time (upper bound) is a function of g(n) to within a constant factor
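The definition can be spot-checked numerically. A minimal Python sketch (the helper name is mine, and the n + 80 example is borrowed from a later slide); a finite check is evidence, not a proof:

```python
def holds_upper_bound(f, g, c, n0, n_max=10_000):
    """Check the Big-O condition f(n) <= c*g(n) for every
    n0 <= n <= n_max.  Finite evidence only, not a proof."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# n + 80 is O(n): witnessed by c = 2, n0 = 80
print(holds_upper_bound(lambda n: n + 80, lambda n: n, 2, 80))
```

At n = 80 the bound holds with equality (160 ≤ 160) and the gap only grows from there, so (c, n0) = (2, 80) is a valid witness pair.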
Big-O notation
(Upper Bound – Worst Case)
[Figure: the curve c·g(n) lies above f(n) for all n ≥ n0; axes are time vs. n; caption: f(n) = O(g(n))]
Big-O notation
(Upper Bound – Worst Case)
This is a mathematically formal way of ignoring constant factors and looking only at the “shape” of the function
f(n) = O(g(n)) should be read as saying that “f(n) is at most g(n), up to constant factors”
We will usually have f(n) be the running time of an algorithm and g(n) a nicely written function
E.g., the running time of the insertion sort algorithm is O(n²)
Example: 2n² = O(n³), with c = 1 and n0 = 2.
Examples of functions in O(n²)
n²
n² + n
n² + 1000n
1000n² + 1000n
Also,
n
n/1000
n^1.99999
n²/lg lg lg n
Big-O notation
(Upper Bound – Worst Case)
Example 1: Is 2n + 7 = O(n)?
Let T(n) = 2n + 7 = n(2 + 7/n)
Note that for n ≥ 7: 2 + 7/n ≤ 2 + 7/7 = 3
So T(n) ≤ 3n for all n ≥ 7 (c = 3, n0 = 7)
Then T(n) = O(n)
Alternatively, lim_{n→∞} T(n)/n = 2 < ∞, so T(n) = O(n)
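The constants from Example 1 can be verified numerically; a quick Python check over a finite range:

```python
# 2n + 7 <= 3n for all n >= 7 (witnessed by c = 3, n0 = 7) ...
assert all(2 * n + 7 <= 3 * n for n in range(7, 10_001))
# ... and the bound fails just below n0:
assert not (2 * 6 + 7 <= 3 * 6)   # n = 6: 19 > 18
print("2n + 7 = O(n), witnessed by c = 3, n0 = 7")
```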
Big-O notation
(Upper Bound – Worst Case)
Example 2: Is 5n³ + 2n² + n + 10⁶ = O(n³)?
Let T(n) = 5n³ + 2n² + n + 10⁶ = n³(5 + 2/n + 1/n² + 10⁶/n³)
Note that for n ≥ 100: 5 + 2/n + 1/n² + 10⁶/n³ ≤ 5 + 2/100 + 1/10000 + 1 = 6.0201 ≤ 6.05
So T(n) ≤ 6.05·n³ for all n ≥ 100 (c = 6.05, n0 = 100)
Then T(n) = O(n³)
Alternatively, lim_{n→∞} T(n)/n³ = 5 < ∞, so T(n) = O(n³)
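The same numeric spot-check works for Example 2 (a finite check, not a proof):

```python
def T(n):
    return 5 * n**3 + 2 * n**2 + n + 10**6

# T(n) <= 6.05 * n^3 for all n >= 100 (c = 6.05, n0 = 100)
assert all(T(n) <= 6.05 * n**3 for n in range(100, 2_001))
# The ratio T(n)/n^3 tends to the leading coefficient, 5
assert abs(T(10**6) / 10**18 - 5) < 1e-5
```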
Big-O notation
(Upper Bound – Worst Case)
Express the execution time as a function of the input size n
Since only the growth rate matters, we can ignore the multiplicative constants and the lower-order terms, e.g.,
n, n+1, n+80, 40n, n + log n are all O(n)
n^1.1 + 10000000000n is O(n^1.1)
n² is O(n²)
3n² + 6n + log n + 24.5 is O(n²)
O(1) < O(log n) < O((log n)³) < O(n) < O(n²) < O(n³) < O(n^log n) < O(2^√n) < O(2ⁿ) < O(n!) < O(nⁿ)
Constant < Logarithmic < Linear < Quadratic < Cubic < Polynomial < Exponential < Factorial
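The ordering can be sanity-checked by evaluating each function at one large n with exact Python integers. A sketch with two caveats of mine: n! and nⁿ are omitted only because they are impractically large to materialize, and n must be past the crossover near n = 2¹⁶ where 2^√n overtakes n^log n.

```python
import math

n = 2 ** 20          # large enough that 2^sqrt(n) > n^(log n)
lg = 20              # log2(n), exact for this n
values = [
    1,                   # constant
    lg,                  # log n
    lg ** 3,             # (log n)^3
    n,                   # linear
    n ** 2,              # quadratic
    n ** 3,              # cubic
    n ** lg,             # n^(log n) = 2^400
    2 ** math.isqrt(n),  # 2^sqrt(n) = 2^1024
    2 ** n,              # exponential (exact Python int)
]
assert values == sorted(values)   # increasing order at this n
```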
Ω-notation (Omega)
(Lower Bound – Best Case)
For a given function g(n), we denote by Ω(g(n)) the set of functions
Ω(g(n)) = { f(n): there exist positive constants c > 0 and n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
We say g(n) is an asymptotic lower bound for f(n):
lim_{n→∞} f(n)/g(n) > 0
Ω(g(n)) means that as n → ∞, the execution time f(n) is at least c·g(n) for some constant c
What does Ω(g(n)) running time mean?
The best-case running time (lower bound) is a function of g(n) to within a constant factor
Ω-notation
(Lower Bound – Best Case)
[Figure: the curve c·g(n) lies below f(n) for all n ≥ n0; axes are time vs. n; caption: f(n) = Ω(g(n))]
Ω-notation (Omega)
(Lower Bound – Best Case)
We say Insertion Sort’s running time T(n) is Ω(n)
For example,
the worst-case running time of insertion sort is O(n²), and
the best-case running time of insertion sort is Ω(n)
Its running time therefore falls anywhere between a linear function of n and a quadratic function of n
Example: √n = Ω(lg n), with c = 1 and n0 = 16.
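The slide's witness pair for √n = Ω(lg n) can be checked numerically (finite range only):

```python
import math

# sqrt(n) >= 1 * lg(n) for all n >= 16, with equality exactly at n0 = 16
assert math.sqrt(16) == math.log2(16) == 4.0
assert all(math.sqrt(n) >= math.log2(n) for n in range(16, 10_001))
# The bound fails below n0, e.g. at n = 8: sqrt(8) ~ 2.83 < lg 8 = 3
assert math.sqrt(8) < math.log2(8)
```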
Examples of functions in Ω(n²)
n²
n² + n
n² − n
1000n² + 1000n
1000n² − 1000n
Also,
n³
n^2.00001
n² lg lg lg n
Θ-notation (Theta)
(Tight Bound)
In some cases,
f(n) = O(g(n)) and f(n) = Ω(g(n))
This means that the worst and best cases require the same amount of time to within a constant factor
In this case we use a new notation, “theta” (Θ)
For a given function g(n), we denote by Θ(g(n)) the set of functions
Θ(g(n)) = { f(n): there exist positive constants c1 > 0, c2 > 0 and n0 > 0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
Θ-notation (Theta)
(Tight Bound)
We say g(n) is an asymptotic tight bound for f(n):
0 < lim_{n→∞} f(n)/g(n) < ∞
Θ(g(n)) means that as n → ∞, the execution time f(n) is at most c2·g(n) and at least c1·g(n) for some constants c1 and c2.
f(n) = Θ(g(n)) if and only if
f(n) = O(g(n)) and f(n) = Ω(g(n))
Θ-notation (Theta)
(Tight Bound)
[Figure: f(n) lies between c1·g(n) and c2·g(n) for all n ≥ n0; axes are time vs. n; caption: f(n) = Θ(g(n))]
Θ-notation (Theta)
(Tight Bound)
Example:
n²/2 − 2n = Θ(n²), with c1 = 1/4, c2 = 1/2, and n0 = 8.
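The constants in this Θ example can be verified numerically (a finite check, not a proof):

```python
def f(n):
    return n * n / 2 - 2 * n

# c1*n^2 <= f(n) <= c2*n^2 for all n >= 8, with c1 = 1/4, c2 = 1/2
assert all(n * n / 4 <= f(n) <= n * n / 2 for n in range(8, 10_001))
# The lower bound first holds at n0 = 8:
assert f(7) < 7 * 7 / 4    # 10.5 < 12.25
```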