Transcript Document

Chapter 5
Decrease and Conquer
Homework 7

hw7 (due 3/17)
– page 127: question 5
– page 132: questions 5 and 6
– page 137: questions 5 and 6
– page 168: questions 1 and 4
Decrease and Conquer

Also referred to as the inductive or incremental approach.
1. Reduce the problem instance to a smaller instance of the same problem
2. Solve the smaller instance
3. Extend the solution of the smaller instance to obtain a solution to the original problem
Note: We are not dividing the problem into two smaller problems.
Examples of Decrease and Conquer

Decrease by one:
– Insertion sort
– Graph search algorithms:
   DFS
   BFS
   Topological sorting
– Algorithms for generating permutations, subsets

Decrease by a constant factor:
– Binary search
– Fake-coin problems
– Multiplication à la russe
– Josephus problem

Variable-size decrease:
– Euclid’s algorithm
– Selection by partition
What’s the difference?

Consider the problem of exponentiation: compute a^n.
– Brute force
– Divide and conquer
– Decrease by one
– Decrease by constant factor
a^n: Brute force

a^n = a*a*a*a*a* ... *a*a*a
Requires n multiplications
Programmed as a loop
Obviously O(n)
a^n: Divide and conquer

a^n = a^(n/2) * a^(n/2)
Isn’t this clever?
a^8 = a^4 * a^4 = (a^2 * a^2) * (a^2 * a^2) = (a*a*a*a) * (a*a*a*a)
Why is this wasteful?
If you compute a^4 once, why compute it again?
Sometimes divide and conquer doesn’t yield an advantage!
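A minimal Python sketch (mine, not from the slides; the name power_dc and the counter are assumptions) of this naive divide-and-conquer version; the counter shows that recomputing both halves costs as many multiplications as brute force:

def power_dc(a, n, counter):
    # Naive divide and conquer: both halves are recomputed independently
    if n == 1:
        return a
    left = power_dc(a, n // 2, counter)        # a^(n/2)
    right = power_dc(a, n - n // 2, counter)   # a^(n - n/2), computed again from scratch
    counter[0] += 1                            # one multiplication to combine the halves
    return left * right

count = [0]
print(power_dc(2, 8, count), "multiplications:", count[0])  # 256 multiplications: 7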
a^n: Decrease by one

a^n = a^(n-1) * a
Q: Is this really any different from the brute force method?
A: No, except that it can be programmed recursively.
We still haven’t done better than O(n).
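A short Python sketch (assumed, not from the slides) of the decrease-by-one recursion; it performs the same n − 1 multiplications as the loop:

def power_dec1(a, n):
    # Decrease by one: a^n = a^(n-1) * a, still O(n) multiplications
    if n == 1:
        return a
    return power_dec1(a, n - 1) * a

print(power_dec1(2, 8))  # 256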
a^n: Decrease by constant factor

a^n = (a^(n/2))^2
You’re probably thinking: Dr. B. is kidding, right?
Q: Isn’t this exactly the same as the divide and conquer approach?
A: No, check this out.
a^8 = (a^4)^2 = ((a^2)^2)^2 = ((a*a)^2)^2
We actually do only 3 multiplications:
a*a = v1
v1 * v1 = v2
v2 * v2 = a^8
a^n: Decrease by constant factor

a^n = (a^(n/2))^2
a^16 = (((a*a)^2)^2)^2 → 4 multiplications
a^32 = ((((a*a)^2)^2)^2)^2 → 5 multiplications
…
a^1024 → 10 multiplications
Obviously this is an O(log n) algorithm.
What about a^47?
a^n: Decrease by constant factor

What about a^47?
a^47 = a*(a^23)^2 → 2 multiplications
a^23 = a*(a^11)^2 → 2 multiplications
a^11 = a*(a^5)^2 → 2 multiplications
a^5 = a*(a^2)^2 → 2 multiplications
a^2 = a*a → 1 multiplication
It might actually take 2*log n multiplications in the worst case, which is still O(log n).
Isn’t Big-O nice?
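A compact Python sketch (assumed, not from the slides) of the decrease-by-constant-factor scheme, multiplying by an extra a when the exponent is odd, as in the a^47 example:

def power_fast(a, n):
    # Decrease by a constant factor: a^n = (a^(n//2))^2, times a if n is odd
    if n == 1:
        return a
    half = power_fast(a, n // 2)   # one recursive call, not two
    result = half * half           # squaring: one multiplication
    if n % 2 == 1:
        result *= a                # extra multiplication for odd exponents
    return result

print(power_fast(2, 47) == 2 ** 47)  # True, using O(log n) multiplications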
Graph Traversal

Many problems require processing all graph vertices in a systematic fashion.
Graph traversal algorithms:
– Depth-first search
– Breadth-first search
Depth-first search

Given a graph G = (V, E), explore the graph, always moving away from the last visited vertex.
G is a graph which consists of two sets:
– V is a set of vertices
   V = {A, B, C, D, E, F}
– E is a set of edges
   E = {(A,B), (A,C), (C,D), (D,E), (E,C), (B,F)}
[Figure: drawing of this example graph on vertices A–F]
Depth-first search

DFS(G)
    count := 0
    mark each vertex with 0 (unvisited)
    for each vertex v in V do
        if v is marked with 0
            dfs(v)

dfs(v)
    count := count + 1
    mark v with count
    for each vertex w adjacent to v do
        if w is marked with 0
            dfs(w)
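A runnable Python sketch of this pseudocode (the adjacency-list dict and names such as dfs_all are my own, not from the slides), using the example graph above:

def dfs_all(graph):
    # DFS over every component; graph is a dict: vertex -> list of neighbors
    order = {}          # vertex -> visit number (the "mark")
    count = 0

    def dfs(v):
        nonlocal count
        count += 1
        order[v] = count
        for w in graph[v]:
            if w not in order:      # marked with 0 (unvisited)
                dfs(w)

    for v in graph:
        if v not in order:
            dfs(v)
    return order

graph = {'A': ['B', 'C'], 'B': ['A', 'F'], 'C': ['A', 'D', 'E'],
         'D': ['C', 'E'], 'E': ['C', 'D'], 'F': ['B']}
print(dfs_all(graph))   # {'A': 1, 'B': 2, 'F': 3, 'C': 4, 'D': 5, 'E': 6}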
Types of edges

– Tree edges: edges comprising the forest
– Back edges: edges to ancestor nodes
– Forward edges: edges to descendants (digraphs only)
– Cross edges: none of the above
Depth-first search: Notes

DFS can be implemented with graphs represented as:
– Adjacency matrices: Θ(|V|²)
– Adjacency linked lists: Θ(|V| + |E|)

Yields two distinct orderings of vertices:
– preorder: as vertices are first encountered (pushed onto stack)
– postorder: as vertices become dead ends (popped off stack)
Depth-first search: Notes

Applications:
– checking connectivity, finding connected components
– checking acyclicity
– searching the state space of a problem for a solution (AI)
Breadth-first search

Explore the graph by moving across to all the neighbors of the last visited vertex.
Similar to level-by-level tree traversal.
Instead of a stack, breadth-first search uses a queue.
Applications: same as DFS, but can also find paths from a vertex to all other vertices with the smallest number of edges.
Breadth-first search

algorithm BFS(G)
    count := 0
    mark each vertex with 0 (unvisited)
    for each vertex v in V do
        if v is marked with 0
            bfs(v)

bfs(v)
    count := count + 1
    mark v with count
    initialize queue with v
    while queue is not empty do
        a := front of queue
        for each vertex w adjacent to a do
            if w is marked with 0
                count := count + 1
                mark w with count
                add w to the end of the queue
        remove a from the front of the queue
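A runnable Python sketch of the BFS pseudocode (the adjacency-list dict and names are assumptions, mirroring the DFS sketch above):

from collections import deque

def bfs_all(graph):
    # BFS over every component; graph is a dict: vertex -> list of neighbors
    order = {}
    count = 0

    def bfs(v):
        nonlocal count
        count += 1
        order[v] = count
        queue = deque([v])
        while queue:
            a = queue[0]                 # front of the queue
            for w in graph[a]:
                if w not in order:       # unvisited
                    count += 1
                    order[w] = count
                    queue.append(w)
            queue.popleft()              # remove a from the front

    for v in graph:
        if v not in order:
            bfs(v)
    return order

graph = {'A': ['B', 'C'], 'B': ['A', 'F'], 'C': ['A', 'D', 'E'],
         'D': ['C', 'E'], 'E': ['C', 'D'], 'F': ['B']}
print(bfs_all(graph))   # {'A': 1, 'B': 2, 'C': 3, 'F': 4, 'D': 5, 'E': 6}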
Breadth-first search: Notes

BFS has the same efficiency as DFS and can be implemented with graphs represented as:
– Adjacency matrices: Θ(|V|²)
– Adjacency linked lists: Θ(|V| + |E|)

Yields a single ordering of vertices (order added to / deleted from the queue is the same).
Directed acyclic graph (dag)

A directed graph with no cycles.
Arises in modeling many problems, e.g.:
– prerequisite structure
– food chains
Implies a partial ordering on the domain.
Topological sorting

Problem: find a total order consistent with a partial order.
Example: a food chain over tiger, human, fish, sheep, shrimp, plankton, wheat. Order them so that they don’t have to wait for any of their food (i.e., from lower to higher, consistent with the food chain).
[Figure: food-chain dag over these seven species]
Problem is solvable iff the graph is a dag.
Topological sorting algorithms

1. DFS-based algorithm (see the sketch after this list):
   – DFS traversal, noting the order vertices are popped off the stack
   – The reverse of that order solves topological sorting
   – Back edges encountered? → NOT a dag!
2. Source removal algorithm:
   – Repeatedly identify and remove a source vertex, i.e., a vertex that has no incoming edges
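A minimal Python sketch (not from the slides) of the DFS-based algorithm: record each vertex as its recursive call finishes (the "pop"), then reverse that order. The food-chain edges below are an assumed orientation, since the original figure is not in the transcript.

def topological_sort(digraph):
    # DFS-based topological sort; digraph is a dict: vertex -> list of successors.
    # Assumes the input is a dag (no back-edge check in this sketch).
    popped = []            # vertices in the order their dfs call finishes
    visited = set()

    def dfs(v):
        visited.add(v)
        for w in digraph.get(v, []):
            if w not in visited:
                dfs(w)
        popped.append(v)   # v becomes a dead end: "popped off the stack"

    for v in digraph:
        if v not in visited:
            dfs(v)
    return popped[::-1]    # reverse pop order = topological order

# Assumed food-chain dag: an edge u -> v means u must come before v (v eats u)
chain = {'plankton': ['shrimp'], 'shrimp': ['fish'], 'fish': ['human'],
         'wheat': ['sheep', 'human'], 'sheep': ['human', 'tiger'],
         'human': ['tiger'], 'tiger': []}
print(topological_sort(chain))
# ['wheat', 'sheep', 'plankton', 'shrimp', 'fish', 'human', 'tiger']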
Variable-size decrease: Binary search trees

Arrange keys in a binary tree with the binary search tree property: for a node with key k, all keys in the left subtree are < k and all keys in the right subtree are > k.
[Figure: node k with left subtree < k and right subtree > k]
Example 1: 5, 10, 3, 1, 7, 12, 9
Example 2: 4, 5, 7, 2, 1, 3, 6
• What about repeated keys?
Searching and insertion in binary search trees

Searching – straightforward
Insertion – search for the key, insert at the leaf where the search terminated
All operations: worst-case # key comparisons = h + 1
– lg n ≤ h ≤ n – 1, with average (random files) ≈ 1.41 lg n
– Thus all operations have:
   worst case: Θ(n)
   average case: Θ(lg n)

Bonus: inorder traversal produces a sorted list (treesort)
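A small Python sketch (assumed, not from the slides) of BST insertion and search, plus the inorder-traversal "treesort" bonus, using Example 1's keys:

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    # Insert by searching for key and attaching a new leaf where the search ends
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root                      # repeated keys are simply ignored here

def search(root, key):
    if root is None or root.key == key:
        return root
    return search(root.left, key) if key < root.key else search(root.right, key)

def inorder(root):
    return inorder(root.left) + [root.key] + inorder(root.right) if root else []

root = None
for k in [5, 10, 3, 1, 7, 12, 9]:    # Example 1 from the slide
    root = insert(root, k)
print(search(root, 7) is not None)   # True
print(inorder(root))                 # [1, 3, 5, 7, 9, 10, 12] -- treesort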
Homework 7

hw7 (due 3/17)
– page 127: question 5
– page 132: questions 5 and 6
– page 137: questions 5 and 6
– page 168: questions 1 and 4