Discrete Mathematics Summer 2006 By Dan Barrish-Flood originally for Fundamental Algorithms For use by Harper Langston in D.M.
Discrete Mathematics
Summer 2006
By Dan Barrish-Flood originally for
Fundamental Algorithms
For use by Harper Langston in D.M.
1
What is an Algorithm?
• A problem is a precise specification of input
and desired output.
• An algorithm is a precise specification of a
method for solving a problem.
2
What We Want in an Algorithm
• Correctness
– Always halts
– Always gives the right answer
• Efficiency
– Time
– Space
• Simplicity
3
A Typical Problem
Sorting
Input:
A sequence of n numbers a1, a2, ..., an
Output:
A permutation b1, b2, ..., bn of that sequence such
that
b1 ≤ b2 ≤ ... ≤ bn
4
A Typical Algorithm
Insertion-Sort(A)
1  for j ← 2 to length[A]
2      do key ← A[j]
3         ▷ Insert A[j] into the sorted
           sequence A[1..j-1]
4         i ← j - 1
5         while i > 0 and A[i] > key
6             do A[i+1] ← A[i]
7                i ← i - 1
8         A[i+1] ← key
5
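The pseudocode above translates into Python almost line for line. A sketch (using 0-based indexing, where the slides' arrays are 1-based; `insertion_sort` is our name, not one from the slides):

```python
def insertion_sort(a):
    """Sort list a in place, mirroring the slide's pseudocode (0-indexed)."""
    for j in range(1, len(a)):          # line 1: j = 2 to length[A]
        key = a[j]                      # line 2
        # line 3: insert a[j] into the sorted prefix a[0..j-1]
        i = j - 1                       # line 4
        while i >= 0 and a[i] > key:    # line 5
            a[i + 1] = a[i]             # line 6: shift right
            i -= 1                      # line 7
        a[i + 1] = key                  # line 8
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```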
Running Time
Insertion-Sort(A)                                 cost   times
1  for j ← 2 to length[A]                         c1     n
2      do key ← A[j]                              c2     n-1
3         ▷ Insert A[j] into the sorted
           sequence A[1..j-1]                     0      n-1
4         i ← j - 1                               c4     n-1
5         while i > 0 and A[i] > key              c5     Σ_{j=2}^{n} t_j
6             do A[i+1] ← A[i]                    c6     Σ_{j=2}^{n} (t_j - 1)
7                i ← i - 1                        c7     Σ_{j=2}^{n} (t_j - 1)
8         A[i+1] ← key                            c8     n-1

t_j is the number of times the “while” loop test in line 5 is
executed for that value of j. Total running time is
c1·n + c2(n-1) + … + c8(n-1)
6
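To make the t_j counts concrete, here is an instrumented variant (our own illustration, not from the slides) that records how many times the line-5 test runs for each j. On reverse-sorted input, t_j takes its worst-case value j:

```python
def insertion_sort_counting(a):
    """Return (sorted a, list of t_j): t_j counts executions of the
    while-loop test on line 5 for each j, including the final failing test."""
    t = []
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        tests = 1                        # the test that ends the loop counts too
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            tests += 1
        a[i + 1] = key
        t.append(tests)
    return a, t

# Worst case (reverse-sorted input): t_j = j, so the t's are 2, 3, ..., n.
_, t = insertion_sort_counting([6, 5, 4, 3, 2, 1])
print(t)        # [2, 3, 4, 5, 6]
print(sum(t))   # 20 = 2 + 3 + ... + 6
```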
A Better Way to Estimate
Running Time
• Trying to determine the exact running time is
tedious, complicated and platform-specific.
• Instead, we’ll bound the worst-case running
time to within a constant factor for large
inputs.
• Why worst-case?
– It’s easier.
– It’s a guarantee.
– It’s well-defined.
– Average case is often no better.
7
Insertion Sort, Redux
• Outer loop happens about n times.
• Inner loop is done j times for each j, worst
case.
• j is always < n
• So running time is upper-bounded by about
n · n = n^2
“Insertion sort runs in Θ(n^2) time, worst-case.”
8
• We’ll make all the “about”s and “rough”s
precise.
• These so-called asymptotic running times
dominate constant factors for large enough
inputs.
9
Functions
• constants
  0, 1
• polynomials
  n, 2n^3 - 5
• exponentials
  2^n, 3^n, 2^(n^2)
• logarithms and polylogarithms
  log_2 n, log^2 n
• factorial
  n! = n·(n-1)·...·1
10
Polynomials
Σ_{i=0}^{d} a_i n^i
E.g. 2n^3 - 5 = -5n^0 + 0n^1 + 0n^2 + 2n^3
(More generally, a sum of terms a·n^c,
where c is some real number. E.g.
n^(1/2) = √n, n^(-3))
11
f(n) is monotonically increasing iff m ≤ n implies
f(m) ≤ f(n).
I.e. it never “goes down”: it always increases
or stays the same.
For strictly increasing, replace ≤ in the
above with <.
I.e. The function is always going up.
12
Are constant functions monotonically
increasing?
Yes! (But not strictly increasing.)
If c > 0, then n^c is strictly increasing.
So even n^0.001 / 1000 increases forever to ∞.
13
Exponentials
f(n) = a^n
Facts:
a^0 = 1
a^1 = a
a^(-1) = 1/a
a^(1/n) = ⁿ√a
a^(m+n) = a^m a^n
e.g. 2^(n+1) = 2^n · 2^1 = 2·2^n
(a^m)^n = a^(mn) = (a^n)^m
e.g. 2^(n/2) = (2^(1/2))^n = (√2)^n
If a > 1, a^n is strictly increasing to ∞.
14
Logarithms
log_b n = that number x such that b^x = n.
Some notation:
lg n = log_2 n
ln n = log_e n
log^k n = (log n)^k
log log n = log(log n)
Recall:
log_b b = 1
log_b 1 = 0
For 0 < x < 1, log_b x < 0.
15
Log Facts
1. n = b^(log_b n) (definition)
2. log_b(xy) = log_b x + log_b y
3. log_b a^n = n·log_b a
4. log_b x = log_c x / log_c b
5. log_b(1/a) = -log_b a
6. log_b a = 1/(log_a b)
7. a^(log_b c) = c^(log_b a)
• If b > 1, then log_b n is
strictly increasing.
16
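None of these facts need to be taken on faith. A quick numeric spot-check (an illustration at arbitrary sample values, not a proof) exercises all seven using Python's `math` module:

```python
import math

# Arbitrary sample values for the bases and arguments.
b, c, a, x, y, n = 2.0, 10.0, 3.0, 5.0, 7.0, 6.0

assert math.isclose(b ** math.log(n, b), n)                     # fact 1
assert math.isclose(math.log(x * y, b),
                    math.log(x, b) + math.log(y, b))            # fact 2
assert math.isclose(math.log(a ** n, b), n * math.log(a, b))    # fact 3
assert math.isclose(math.log(x, b),
                    math.log(x, c) / math.log(b, c))            # fact 4
assert math.isclose(math.log(1 / a, b), -math.log(a, b))        # fact 5
assert math.isclose(math.log(a, b), 1 / math.log(b, a))         # fact 6
assert math.isclose(a ** math.log(c, b), c ** math.log(a, b))   # fact 7
print("all seven log facts check out at these sample values")
```

Fact 7 is the least obvious one; taking log_b of both sides reduces it to log_b c · log_b a = log_b a · log_b c.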
More About Logs
A polylogarithm is just
log^k n
For k > 0 and n > 0, log^k n is strictly
increasing to ∞.
17
Factorial
n! = n·(n-1)·(n-2)·...·1 = Π_{i=1}^{n} i
0! = 1 (by definition)
n! is strictly increasing to ∞.
Stirling's approximation:
n! = √(2πn) (n/e)^n (1 + Θ(1/n))
Even without Stirling, we can see:
For n > 3, 2^n < n! < n^n
18
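A small numeric check makes both claims tangible: the ratio n! / (√(2πn)(n/e)^n) drifts toward 1 as Stirling predicts, and the sandwich 2^n < n! < n^n holds once n > 3. This is an illustration sketch, not part of the slides:

```python
import math

# Compare n! with Stirling's approximation sqrt(2*pi*n) * (n/e)**n.
for n in (1, 5, 10, 20):
    exact = math.factorial(n)
    stirling = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
    print(n, exact, round(stirling, 2), round(exact / stirling, 4))
# The ratio tends to 1 (it is 1 + Theta(1/n)).

# Even without Stirling: for n > 3, 2^n < n! < n^n.
assert all(2 ** n < math.factorial(n) < n ** n for n in range(4, 50))
print("2^n < n! < n^n verified for n = 4..49")
```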
More About Monotonically
Increasing Functions
If f(n) and g(n) are monotonically
increasing, then so are
f(n) + g(n)
f(g(n))
and if f(n) and g(n) are also ≥ 0, then
f(n)·g(n)
is monotonically increasing, too.
Examples: n log n
log log n
2^n + n^3
19
Bounding Functions
• We are more interested in finding upper and lower
bounds on functions that describe the running
times of algorithms, rather than finding the exact
function.
• Also, we are only interested in considering large
inputs: everything is fast for small inputs.
• So we want asymptotic bounds.
• Finally, we are willing to neglect constant factors
when bounding. They are tedious to compute, and
are often hardware-dependent anyway.
20
Example
[Plot: 3n+8 vs. n, for n = 0..3]
3n + 8 is certainly not upper-bounded
by n,
[Plot: 3n+8 vs. 10n, for n = 0..3]
but when n ≥ 2, 3n + 8 is upper-bounded by 10n.
21
Notation for Function Bounds
(The most important definitions in the course.
We'll use them every day.)
Upper Bound: O ("big-oh")
We say f(n) = O(g(n)) iff there exist two
positive constants c and n_0 such that
f(n) ≤ c·g(n) for all n ≥ n_0
E.g.
3n + 8 = O(n)
[Proof: choose c = 10 and n_0 = 2;
there are many other choices.]
22
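The witness pair (c, n_0) can be spot-checked by brute force. A loop over finitely many n is evidence, not a proof (the proof is the algebra 3n + 8 ≤ 10n ⟺ 8 ≤ 7n, true for n ≥ 2), but it is a useful sanity check:

```python
# Spot-check the witness c = 10, n0 = 2 for 3n + 8 = O(n).
c, n0 = 10, 2
assert all(3 * n + 8 <= c * n for n in range(n0, 10_000))
# The bound genuinely needs n0: at n = 1, 3n + 8 = 11 > 10 = 10n.
assert 3 * 1 + 8 > c * 1
print("3n + 8 <= 10n holds for all tested n >= 2")
```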
Big-Oh Facts
• The “=” is misleading. E.g.
O(g(n)) = f(n)
is meaningless.
• It denotes an upper bound, not necessarily
a tight bound. Thus,
n = O(n)
n = O(n^2)
n = O(2^n), etc.
• Transitivity:
If f(n) = O(g(n)) and g(n) = O(h(n)),
then f(n) = O(h(n)).
23
Examples
We’ve seen that 3n + 8 = O(n).
Is 10n = O(3n + 8)?
Yes: choose c = 10, n_0 = 1:
10n ≤ 10(3n + 8) = 30n + 80
24
More Examples
2n^2 + 5n - 8 = O(n^2)?
Yes:
2n^2 + 5n - 8 ≤ cn^2
Divide by n^2: 2 + 5/n - 8/n^2 ≤ c
Choose c = 3; subtracting 2: 5/n - 8/n^2 ≤ 1
Choose n_0 = 5
25
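As before, the chosen witness pair can be sanity-checked numerically (the slide's derivation yields c = 3, n_0 = 5; other pairs also work):

```python
# Spot-check the witness c = 3, n0 = 5 for 2n^2 + 5n - 8 = O(n^2).
c, n0 = 3, 5
assert all(2 * n * n + 5 * n - 8 <= c * n * n for n in range(n0, 10_000))
print("2n^2 + 5n - 8 <= 3n^2 holds for all tested n >= 5")
```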
More Examples
Is 2^n = O(3^n)?
Yes: in fact, 2^n ≤ 3^n
(c = 1, n_0 = 1)
Is 3^n = O(2^n)?
We would need 3^n ≤ c·2^n
Divide by 2^n: 3^n / 2^n ≤ c
(3/2)^n ≤ c
But since 3/2 > 1, (3/2)^n is strictly increasing
to ∞; it cannot be upper-bounded by any constant.
26
Lower Bound Notation
f(n) = Ω(g(n)) iff there are positive constants
c and n_0 such that f(n) ≥ c·g(n) for all n ≥ n_0.
(Same as definition of O, with ≤ replaced by ≥.)
E.g.
3^n = Ω(2^n)
n^2 = Ω(n)
In fact:
If f(n) = O(g(n)), then g(n) = Ω(f(n)),
and vice versa.
27
“Tight” Bound Notation
(Both upper and lower)
f(n) = Θ(g(n)) iff there are positive constants
c_1, c_2, and n_0 such that
c_1·g(n) ≤ f(n) ≤ c_2·g(n) for all n ≥ n_0.
[Plot: f(n) sandwiched between c_1·g(n) below and
c_2·g(n) above, for all n ≥ n_0]
28
More Facts
Θ is transitive, just like O and Ω.
f(n) = Θ(g(n)) iff f(n) = O(g(n))
and f(n) = Ω(g(n))
f(n) = Θ(g(n)) iff f(n) = O(g(n))
and g(n) = O(f(n))
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
29
Examples
We've seen that 3n + 8 = O(n), and n = O(3n + 8), so
3n + 8 = Θ(n).
2n^2 + 5n - 8 = Θ(n^2). This is because:
We saw earlier that 2n^2 + 5n - 8 = O(n^2).
Also, 2n^2 + 5n - 8 = Ω(n^2) [c = 1, n_0 = 2].
n^2 ≠ Θ(n^3). This is because:
n^2 = O(n^3), but not vice versa.
log_b1 n = Θ(log_b2 n),
since log functions differ by a constant factor.
Therefore, we can write Θ(lg n) for all logarithmic growth.
30
Another Upper-Bound
Notation
We say f(n) = o(g(n)) ["little-oh"] iff
lim_{n→∞} f(n)/g(n) = 0.
E.g. 2n = o(n^2)
2n^2 is not o(n^2).
Think of f(n) = o(g(n)) as
"f(n) is (asymptotically) negligible with
respect to g(n)."
31
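The two examples above can be watched numerically: the ratio 2n/n^2 shrinks toward 0, while 2n^2/n^2 is stuck at 2. A small illustration:

```python
# f(n) = o(g(n)) means f(n)/g(n) -> 0. Watch the ratios:
for n in (10, 100, 1000, 10_000):
    print(n, (2 * n) / n**2, (2 * n**2) / n**2)
# First column of ratios: 0.2, 0.02, 0.002, 0.0002 -> 0, so 2n = o(n^2).
# Second column: always exactly 2, never approaching 0,
# so 2n^2 is not o(n^2).
```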
Functions and Bounds
Constant functions:
All constant functions are Θ(1). (Also Θ(2), Θ(3),
etc., but we use 1 by convention.)
If f(n) is a constant function and g(n) is strictly
increasing to ∞, then f(n) = o(g(n)).
(Proof: g(n) → ∞, so lim_{n→∞} f(n)/g(n) = 0.)
32
Logarithms
lg n = o(n)
In fact,
(*) log_b n = o(n^a) for any constants a, b
where a > 0.
So, perhaps surprisingly, lg^100 n = o(n^0.01),
in spite of the fact that for n = 4,
lg^100 n = 2^100 (huge!) while n^0.01 ≈ 1.
Substituting lg n for n in (*), we see (lg lg n)^b = o(lg^a n),
and so on.
33
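Both sides of lg^100 n vs. n^0.01 overflow floating point long before the crossover, so a direct numeric comparison is hopeless. Comparing their base-2 logs instead works: with n = 2^m, lg(lg^100 n) = 100·lg m and lg(n^0.01) = 0.01·m. A sketch (the helper names `lhs`/`rhs` are ours):

```python
import math

# lg^100 n vs n^0.01, compared via their base-2 logs, with n = 2**m.
def lhs(m): return 100 * math.log2(m)   # lg of lg^100 n, where m = lg n
def rhs(m): return 0.01 * m             # lg of n^0.01

for m in (100, 10_000, 100_000, 1_000_000):
    winner = "polynomial wins" if rhs(m) > lhs(m) else "polylog wins"
    print(m, round(lhs(m), 1), round(rhs(m), 1), winner)
# The polylog stays ahead until lg n is around 2 * 10**5
# (i.e. n near 2**200000!), after which n^0.01 wins forever:
# lg^100 n = o(n^0.01).
```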
Polynomials
Any polynomial of degree d whose leading
coefficient is positive is Θ(n^d).
Proof:
Homework for next class.
Substituting 2^n for n in lg^b n = o(n^a), a > 0,
we have:
(lg(2^n))^b = o((2^n)^a)
n^b = o((2^a)^n) (recall lg n means log base 2)
Renaming 2^a (where a > 0) to a (where a > 1):
n^b = o(a^n), a > 1, i.e. polynomials = o(exponentials)
Thus, n^100 = o(1.001^n)
34
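The same log-space trick locates the crossover for n^100 vs. 1.001^n: compare natural logs, 100·ln n vs. n·ln(1.001). A sketch illustrating where the exponential finally overtakes:

```python
import math

# n^100 vs 1.001^n, compared via natural logs (the values themselves
# overflow floats long before the crossover).
for n in (10**3, 10**5, 10**6, 10**7):
    poly, expo = 100 * math.log(n), n * math.log(1.001)
    winner = "exponential wins" if expo > poly else "polynomial wins"
    print(n, round(poly, 1), round(expo, 1), winner)
# The exponential overtakes near n = 1.4 * 10**6 and wins forever:
# n^100 = o(1.001^n).
```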
Comparing Functions
n vs. √n
√n vs. lg^2 n
2^n vs. 2^(n/2)
n lg n vs. n^2
lg^2 n vs. ln n
n lg n vs. n^(lg n)
n^(lg n) vs. lg(n^n)
lg √n vs. √(lg n)
n / lg n vs. lg n
35