
COSC 3101A - Design and Analysis of Algorithms
Lecture 9
Knapsack Problem
Huffman Codes
Introduction to Graphs
Many of these slides are taken from Monica Nicolescu, Univ. of Nevada, Reno, [email protected]
The Knapsack Problem
• The 0-1 knapsack problem
  – A thief robbing a store finds n items: the i-th item is worth vi dollars and weighs wi pounds (vi, wi integers)
  – The thief can only carry W pounds in his knapsack
  – Items must be taken entirely or left behind
  – Which items should the thief take to maximize the value of his load?
• The fractional knapsack problem
  – Similar to above
  – The thief can take fractions of items
6/29/2004 Lecture 9, COSC3101A
Fractional Knapsack Problem
• Knapsack capacity: W
• There are n items: the i-th item has value vi and weight wi
• Goal:
  – find xi such that 0 ≤ xi ≤ 1 for i = 1, 2, .., n,
    Σ wi·xi ≤ W, and
    Σ xi·vi is maximum
Fractional Knapsack Problem
• Greedy strategy 1:
  – Pick the item with the maximum value
• E.g.:
  – W = 1
  – w1 = 100, v1 = 2
  – w2 = 1, v2 = 1
  – Taking from the item with the maximum value:
    total value taken = v1/w1 = 2/100
  – Smaller than what the thief can take if choosing the other item:
    total value (choose item 2) = v2/w2 = 1
Fractional Knapsack Problem
• Greedy strategy 2:
  – Pick the item with the maximum value per pound vi/wi
  – If the supply of that item is exhausted and the thief can carry more: take as much as possible from the item with the next greatest value per pound
  – It is good to order items based on their value per pound:
    v1/w1 ≥ v2/w2 ≥ ... ≥ vn/wn
Fractional Knapsack Problem
Alg.: Fractional-Knapsack(W, v[n], w[n])
1. while w > 0 and as long as there are items remaining
2.     pick item i with maximum vi/wi
3.     xi ← min(1, w/wi)
4.     remove item i from list
5.     w ← w – xi·wi

• w – the amount of space remaining in the knapsack (initially w = W)
• Running time: Θ(n) if items already ordered; else Θ(n lg n)
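The greedy algorithm above can be sketched in Python. This is a minimal illustration, not the course's reference code; the function name and the (value, weight) pair representation are my own choices.

```python
def fractional_knapsack(W, items):
    """Greedy fractional knapsack.

    items: list of (value, weight) pairs; W: knapsack capacity.
    Returns (total value, list of fractions x_i taken of each item).
    """
    # Order item indices by value per pound, highest first (the O(n lg n) step).
    order = sorted(range(len(items)),
                   key=lambda i: items[i][0] / items[i][1], reverse=True)
    x = [0.0] * len(items)
    w = W                       # space remaining in the knapsack
    total = 0.0
    for i in order:
        if w <= 0:
            break
        v_i, w_i = items[i]
        x[i] = min(1.0, w / w_i)   # take as much of item i as fits
        total += x[i] * v_i
        w -= x[i] * w_i
    return total, x
```

On the example from the next slide (W = 50, items worth $60/$100/$120 weighing 10/20/30 pounds) this returns a total value of 240, taking items 1 and 2 entirely and 2/3 of item 3.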
Fractional Knapsack - Example
• E.g.: knapsack capacity W = 50
  – Item 1: w1 = 10, v1 = $60 ($6/pound)
  – Item 2: w2 = 20, v2 = $100 ($5/pound)
  – Item 3: w3 = 30, v3 = $120 ($4/pound)
• Greedy solution: take all of item 1 ($60), all of item 2 ($100), and 20 of the 30 pounds of item 3 ($80)
  – Total value = $60 + $100 + $80 = $240
Greedy Choice
Items:             1    2    3   ...  j   ...  n
Optimal solution:  x1   x2   x3       xj       xn
Greedy solution:   x1'  x2'  x3'      xj'      xn'
• We know that: x1' ≥ x1
  – greedy choice takes as much as possible from item 1
• Modify the optimal solution to take x1' of item 1
  – We have to decrease the quantity taken from some item j: the new xj is decreased by (x1' – x1)·w1/wj
• Increase in profit: (x1' – x1)·v1
• Decrease in profit: (x1' – x1)·w1·vj/wj
(x1' – x1)·v1 ≥ (x1' – x1)·w1·vj/wj  ⇔  v1 ≥ w1·vj/wj  ⇔  v1/w1 ≥ vj/wj
True, since item 1 had the best value/pound ratio
Optimal Substructure
• Consider the most valuable load that weighs at most W pounds
• If we remove a weight w of item j from the optimal load
  ⇒ The remaining load must be the most valuable load weighing at most W – w that can be taken from the remaining n – 1 items plus wj – w pounds of item j
The 0-1 Knapsack Problem
• Thief has a knapsack of capacity W
• There are n items: the i-th item has value vi and weight wi
• Goal:
  – find xi such that xi ∈ {0, 1} for i = 1, 2, .., n,
    Σ wi·xi ≤ W, and
    Σ xi·vi is maximum
0-1 Knapsack - Greedy Strategy
• E.g.: knapsack capacity W = 50
  – Item 1: w1 = 10, v1 = $60 ($6/pound)
  – Item 2: w2 = 20, v2 = $100 ($5/pound)
  – Item 3: w3 = 30, v3 = $120 ($4/pound)
• Greedy (by value per pound) takes item 1 first:
  – item 1 + item 2 = $160; item 1 + item 3 = $180
  – optimal: item 2 + item 3 = $220
• None of the solutions involving the greedy choice (item 1) leads to an optimal solution
  – The greedy choice property does not hold
0-1 Knapsack - Dynamic Programming
• P(i, w) – the maximum profit that can be obtained from items 1 to i, if the knapsack has size w
• Case 1: thief takes item i
  P(i, w) = vi + P(i - 1, w - wi)
• Case 2: thief does not take item i
  P(i, w) = P(i - 1, w)
0-1 Knapsack - Dynamic Programming
P(i, w) = max {vi + P(i - 1, w - wi), P(i - 1, w)}
           (item i was taken)      (item i was not taken)
[Table with rows i = 0..n and columns w = 0..W; row 0 and column 0 are all zeros. Entry (i, w) is computed from two entries in the previous row: first P(i - 1, w - wi), second P(i - 1, w).]
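The recurrence fills the table bottom-up. A short Python sketch of this table construction (function name and list-of-lists layout are my own; the slides' P(i, w) becomes `P[i][w]`):

```python
def knapsack_01(W, values, weights):
    """Bottom-up DP for the 0-1 knapsack.

    P[i][w] = max profit using items 1..i with knapsack size w,
    via P(i, w) = max{v_i + P(i-1, w-w_i), P(i-1, w)}.
    """
    n = len(values)
    # Row 0 and column 0 are all zeros (no items, or no capacity).
    P = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(1, W + 1):
            P[i][w] = P[i - 1][w]                   # case 2: item i not taken
            if weights[i - 1] <= w:                 # case 1: item i taken
                P[i][w] = max(P[i][w],
                              values[i - 1] + P[i - 1][w - weights[i - 1]])
    return P
```

With the example that follows (W = 5, weights 2, 1, 3, 2 and values 12, 10, 20, 15) the bottom-right entry P[4][5] is 37.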
Example: W = 5
P(i, w) = max {vi + P(i - 1, w - wi), P(i - 1, w)}

Item  Weight  Value
 1      2      12
 2      1      10
 3      3      20
 4      2      15

      w=0  1   2   3   4   5
i=0:   0   0   0   0   0   0
i=1:   0   0  12  12  12  12
i=2:   0  10  12  22  22  22
i=3:   0  10  12  22  30  32
i=4:   0  10  15  25  30  37

P(1, 1) = P(0, 1) = 0
P(1, 2) = max{12+0, 0} = 12 = P(1, 3) = P(1, 4) = P(1, 5)
P(2, 1) = max{10+0, 0} = 10
P(2, 2) = max{10+0, 12} = 12
P(2, 3) = max{10+12, 12} = 22 = P(2, 4) = P(2, 5)
P(3, 1) = P(2, 1) = 10
P(3, 2) = P(2, 2) = 12
P(3, 3) = max{20+0, 22} = 22
P(3, 4) = max{20+10, 22} = 30
P(3, 5) = max{20+12, 22} = 32
P(4, 1) = P(3, 1) = 10
P(4, 2) = max{15+0, 12} = 15
P(4, 3) = max{15+10, 22} = 25
P(4, 4) = max{15+12, 30} = 30
P(4, 5) = max{15+22, 32} = 37
Reconstructing the Optimal Solution

      w=0  1   2   3   4   5
i=0:   0   0   0   0   0   0
i=1:   0   0  12  12  12  12
i=2:   0  10  12  22  22  22
i=3:   0  10  12  22  30  32
i=4:   0  10  15  25  30  37

• Start at P(n, W)
• When you go left-up ⇒ item i has been taken
• When you go straight up ⇒ item i has not been taken
• Here: P(4, 5) = 37 ≠ P(3, 5) ⇒ take item 4; P(3, 3) = P(2, 3) ⇒ skip item 3; P(2, 3) ≠ P(1, 3) ⇒ take item 2; P(1, 2) ≠ P(0, 2) ⇒ take item 1
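The "left-up vs. straight up" walk can be coded directly: item i was taken exactly when P(i, w) differs from P(i - 1, w). A self-contained sketch (it rebuilds the table first; names are my own):

```python
def knapsack_items(W, values, weights):
    """Build the 0-1 knapsack DP table, then walk back from P[n][W]
    to recover which items were taken."""
    n = len(values)
    P = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(1, W + 1):
            P[i][w] = P[i - 1][w]
            if weights[i - 1] <= w:
                P[i][w] = max(P[i][w],
                              values[i - 1] + P[i - 1][w - weights[i - 1]])
    # Trace back: a "left-up" move (value changed) means item i was taken.
    taken, w = [], W
    for i in range(n, 0, -1):
        if P[i][w] != P[i - 1][w]:
            taken.append(i)              # 1-based item index
            w -= weights[i - 1]
    return P[n][W], sorted(taken)
```

On the example above this recovers value 37 with items 1, 2 and 4, matching the slide's trace.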
Optimal Substructure
• Consider the most valuable load that weighs at most W pounds
• If we remove item j from this load
  ⇒ The remaining load must be the most valuable load weighing at most W – wj that can be taken from the remaining n – 1 items
Overlapping Subproblems
P(i, w) = max {vi + P(i - 1, w - wi), P(i - 1, w)}
[Table with rows i = 0..n and columns w = 0..W]
E.g.: all the subproblems in row i (shown in grey in the table) may depend on P(i - 1, w)
Huffman Codes
• Widely used technique for data compression
• Assume the data to be a sequence of characters
• Looking for an effective way of storing the data
Huffman Codes
• Idea:
  – Use the frequencies of occurrence of characters to build an optimal way of representing each character

  Character:               a    b    c    d    e    f
  Frequency (thousands):  45   13   12   16    9    5

• Binary character code
  – Uniquely represents a character by a binary string
Fixed-Length Codes
E.g.: Data file containing 100,000 characters

  Character:               a    b    c    d    e    f
  Frequency (thousands):  45   13   12   16    9    5

• 3 bits needed
• a = 000, b = 001, c = 010, d = 011, e = 100, f = 101
• Requires: 100,000 × 3 = 300,000 bits
Variable-Length Codes
E.g.: Data file containing 100,000 characters

  Character:               a    b    c    d    e    f
  Frequency (thousands):  45   13   12   16    9    5

• Assign short codewords to frequent characters and long codewords to infrequent characters
• a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100
• (45×1 + 13×3 + 12×3 + 16×3 + 9×4 + 5×4) × 1,000 = 224,000 bits
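The 224,000-bit total is just frequency × codeword length, summed. A quick check of that arithmetic (the dictionaries simply transcribe the slide's table and code):

```python
# Character frequencies (in thousands) and the variable-length code from the slide.
freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

# Total bits = sum over characters of (frequency x codeword length), times 1,000.
bits = sum(freq[ch] * len(code[ch]) for ch in freq) * 1000
print(bits)  # 224000, vs. 300,000 for the fixed-length code
```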
Prefix Codes
• Prefix codes:
  – Codes for which no codeword is also a prefix of some other codeword
  – A better name would be “prefix-free codes”
• We can achieve optimal data compression using prefix codes
  – We will restrict our attention to prefix codes
Encoding with Binary Character Codes
• Encoding
  – Concatenate the codewords representing each character in the file
• E.g.:
  – a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100
  – abc = 0 · 101 · 100 = 0101100
Decoding with Binary Character Codes
• Prefix codes simplify decoding
  – No codeword is a prefix of another ⇒ the codeword that begins an encoded file is unambiguous
• Approach
  – Identify the initial codeword
  – Translate it back to the original character
  – Repeat the process on the remainder of the file
• E.g.:
  – a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100
  – 001011101 = 0 · 0 · 101 · 1101 = aabe
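The decoding approach above maps directly to code: read bits until the buffer matches a codeword, emit that character, and repeat. A minimal sketch (function name is my own):

```python
def decode(bits, code):
    """Decode a bit string under a prefix code by repeatedly matching
    the unique codeword at the front of the remaining input."""
    inverse = {cw: ch for ch, cw in code.items()}
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:        # prefix property: the first match is unambiguous
            out.append(inverse[buf])
            buf = ''
    return ''.join(out)
```

With the slide's code, `decode('001011101', ...)` yields "aabe", as in the example.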
Prefix Code Representation
• Binary tree whose leaves are the given characters
• Binary codeword
  – the path from the root to the character, where 0 means “go to the left child” and 1 means “go to the right child”
• Length of the codeword
  – Length of the path from root to the character leaf (depth of node)
[Two example trees, each with total frequency 100 at the root: the fixed-length code tree and the variable-length code tree for a: 45, b: 13, c: 12, d: 16, e: 9, f: 5]
Optimal Codes
• An optimal code is always represented by a full binary tree
  – Every non-leaf has two children
  – Fixed-length code is not optimal, variable-length is
• How many bits are required to encode a file?
  – Let C be the alphabet of characters
  – Let f(c) be the frequency of character c
  – Let dT(c) be the depth of c’s leaf in the tree T corresponding to a prefix code

  B(T) = Σc∈C f(c)·dT(c)   (the cost of tree T)
Constructing a Huffman Code
• A greedy algorithm that constructs an optimal prefix code called a Huffman code
• Assume that:
  – C is a set of n characters
  – Each character has a frequency f(c)
  – The tree T is built in a bottom-up manner
• Idea:
  – Start with a set of |C| leaves:  f: 5  e: 9  c: 12  b: 13  d: 16  a: 45
  – At each step, merge the two least frequent objects: the frequency of the new node = sum of the two frequencies
  – Use a min-priority queue Q, keyed on f, to identify the two least frequent objects
Example
Initial queue:  f: 5  e: 9  c: 12  b: 13  d: 16  a: 45
Step 1: merge f: 5 and e: 9 → node 14;   queue:  c: 12  b: 13  (14)  d: 16  a: 45
Step 2: merge c: 12 and b: 13 → node 25;  queue:  (14)  d: 16  (25)  a: 45
Step 3: merge (14) and d: 16 → node 30;   queue:  (25)  (30)  a: 45
Step 4: merge (25) and (30) → node 55;    queue:  a: 45  (55)
Step 5: merge a: 45 and (55) → root (100)
In each merge, the first extracted tree becomes the left (0) child and the second the right (1) child.
Building a Huffman Code
Alg.: HUFFMAN(C)
1. n ← |C|
2. Q ← C                                     O(n)
3. for i ← 1 to n – 1
4.     do allocate a new node z
5.        left[z] ← x ← EXTRACT-MIN(Q)
6.        right[z] ← y ← EXTRACT-MIN(Q)      O(n lg n)
7.        f[z] ← f[x] + f[y]
8.        INSERT(Q, z)
9. return EXTRACT-MIN(Q)
Running time: O(n lg n)
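The algorithm can be sketched in Python with `heapq` as the min-priority queue. This is an illustrative implementation, not the course's reference code; the tuple-based tree representation and tie-breaking counter are my own choices.

```python
import heapq

def huffman(freq):
    """Greedy Huffman-code construction using a min-priority queue.

    freq: dict character -> frequency. Returns dict character -> codeword.
    A tree is either a single character or a (left, right) pair of subtrees.
    """
    # Heap entries are (frequency, tie-breaker, tree); the unique tie-breaker
    # keeps heapq from ever comparing two trees directly.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two least frequent trees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')      # 0 = go to the left child
            walk(tree[1], prefix + '1')      # 1 = go to the right child
        else:
            codes[tree] = prefix or '0'      # single-character alphabet edge case
    walk(heap[0][2], '')
    return codes
```

For the slide's frequencies this produces a code of total cost 224 (thousand bits), with a single-bit codeword for a; the exact bit patterns may differ from the slide's tree when ties are broken differently, but the cost is the same.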
Greedy Choice Property
Lemma: Let C be an alphabet in which each character c ∈ C has frequency f[c]. Let x and y be two characters in C having the lowest frequencies.
Then, there exists an optimal prefix code for C in which the codewords for x and y have the same length and differ only in the last bit.
Proof of the Greedy Choice
• Idea:
  – Consider a tree T representing an arbitrary optimal prefix code
  – Modify T to make a tree representing another optimal prefix code in which x and y will appear as sibling leaves of maximum depth
  ⇒ The codewords of x and y will have the same length and differ only in the last bit
Proof of the Greedy Choice (cont.)
[Trees T, T’, T’’: in T, a and b are sibling leaves of maximum depth while x and y sit elsewhere; T’ is T with a and x exchanged; T’’ is T’ with b and y exchanged, so that x and y end up as sibling leaves of maximum depth]
• a, b – two characters, sibling leaves of maximum depth in T
• Assume: f[a] ≤ f[b] and f[x] ≤ f[y]
• f[x] and f[y] are the two lowest leaf frequencies, in order
  ⇒ f[x] ≤ f[a] and f[y] ≤ f[b]
• Exchange the positions of a and x (T’) and of b and y (T’’)
Proof of the Greedy Choice (cont.)
B(T) – B(T’) = Σc∈C f(c)·dT(c) – Σc∈C f(c)·dT’(c)
  = f[x]dT(x) + f[a]dT(a) – f[x]dT’(x) – f[a]dT’(a)
  = f[x]dT(x) + f[a]dT(a) – f[x]dT(a) – f[a]dT(x)
  = (f[a] – f[x])·(dT(a) – dT(x)) ≥ 0
since f[a] – f[x] ≥ 0 (x is a minimum frequency leaf)
and dT(a) – dT(x) ≥ 0 (a is a leaf of maximum depth)
Proof of the Greedy Choice (cont.)
B(T) – B(T’) ≥ 0
Similarly, exchanging y and b does not increase the cost
⇒ B(T’) – B(T’’) ≥ 0
⇒ B(T’’) ≤ B(T), and since T is optimal, B(T) ≤ B(T’’)
⇒ B(T) = B(T’’) ⇒ T’’ is an optimal tree, in which x and y are sibling leaves of maximum depth
Discussion
• Greedy choice property:
– Building an optimal tree by mergers can begin with
the greedy choice: merging the two characters with
the lowest frequencies
– The cost of each merger is the sum of frequencies of
the two items being merged
– Of all possible mergers, HUFFMAN chooses the one
that incurs the least cost
Graphs
• Applications that involve not only a set of items,
but also the connections between them
• Maps
• Hypertexts
• Circuits
• Schedules
• Transactions
• Matching
• Computer Networks
Graphs - Background
Graphs = a set of nodes (vertices) with edges (links) between them.
Notations:
• G = (V, E) - graph
• V = set of vertices, |V| = n
• E = set of edges, |E| = m
[Examples on vertices 1, 2, 3, 4: a directed graph, an undirected graph, and an acyclic graph]
Other Types of Graphs
• A graph is connected if there is a path between every two vertices
  [Examples on vertices 1, 2, 3, 4: a connected graph and a graph that is not connected]
• A bipartite graph is an undirected graph G = (V, E) in which V = V1 ∪ V2 and there are edges only between vertices in V1 and V2
  [Example: a bipartite graph]
Graph Representation
• Adjacency list representation of G = (V, E)
  – An array of |V| lists, one for each vertex in V
  – Each list Adj[u] contains all the vertices v such that there is an edge between u and v
    • Adj[u] contains the vertices adjacent to u (in arbitrary order)
  – Can be used for both directed and undirected graphs
E.g. (undirected graph on vertices 1..5):
  Adj[1]: 2, 5
  Adj[2]: 1, 5, 3, 4
  Adj[3]: 2, 4
  Adj[4]: 2, 5, 3
  Adj[5]: 4, 1, 2
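Building the adjacency lists above is a one-pass loop over the edge set. A minimal sketch (function name and edge-list input format are my own choices):

```python
from collections import defaultdict

def adjacency_list(edges, directed=False):
    """Adjacency-list representation: for each vertex u, adj[u] lists
    the vertices adjacent to u (in arbitrary order)."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        if not directed:
            adj[v].append(u)   # an undirected edge (u, v) appears in both lists
    return adj
```

For the undirected example above, the sum of all list lengths comes out to 2|E|, matching the property on the next slide.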
Properties of Adjacency-List Representation
• Sum of the lengths of all the adjacency lists
  – Directed graph: |E|
    • Edge (u, v) appears only once, in u’s list
  – Undirected graph: 2|E|
    • u and v appear in each other’s adjacency lists: edge (u, v) appears twice
Properties of Adjacency-List Representation
• Memory required
  – Θ(V + E)
• Preferred when
  – the graph is sparse: |E| << |V|²
• Disadvantage
  – no quick way to determine whether there is an edge between nodes u and v
• Time to list all vertices adjacent to u:
  – Θ(degree(u))
• Time to determine if (u, v) ∈ E:
  – O(degree(u))
Graph Representation
• Adjacency matrix representation of G = (V, E)
  – Assume vertices are numbered 1, 2, …, |V|
  – The representation consists of a matrix A of size |V| × |V|:
    aij = 1 if (i, j) ∈ E, 0 otherwise
E.g. (undirected graph on vertices 1..5):

       1  2  3  4  5
  1    0  1  0  0  1
  2    1  0  1  1  1
  3    0  1  0  1  0
  4    0  1  1  0  1
  5    1  1  0  1  0

Matrix A is symmetric: aij = aji, A = Aᵀ
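The matrix can be built from the same edge list. A minimal sketch (1-based indexing with an unused row/column 0, so indices match the slide; names are my own):

```python
def adjacency_matrix(n, edges, directed=False):
    """|V| x |V| matrix with a[i][j] = 1 iff (i, j) is in E,
    for vertices numbered 1..n (row/column 0 is left unused)."""
    a = [[0] * (n + 1) for _ in range(n + 1)]
    for u, v in edges:
        a[u][v] = 1
        if not directed:
            a[v][u] = 1        # undirected graph: the matrix is symmetric
    return a
```

For the undirected example, row 1 reads 0 1 0 0 1 (vertex 1 is adjacent to 2 and 5) and the matrix equals its transpose, as the slide notes.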
Properties of Adjacency Matrix Representation
• Memory required
  – Θ(V²), independent of the number of edges in G
• Preferred when
  – The graph is dense: |E| is close to |V|²
  – We need to quickly determine if there is an edge between two vertices
• Time to list all vertices adjacent to u:
  – Θ(V)
• Time to determine if (u, v) ∈ E:
  – Θ(1)
Weighted Graphs
• Weighted graphs = graphs for which each edge has an associated weight w(u, v)
  w: E → R, weight function
• Storing the weights of a graph
  – Adjacency list:
    • Store w(u, v) along with vertex v in u’s adjacency list
  – Adjacency matrix:
    • Store w(u, v) at location (u, v) in the matrix
Searching in a Graph
• Graph searching = systematically follow the edges of the graph so as to visit the vertices of the graph
• Two basic graph-searching algorithms:
  – Breadth-first search
  – Depth-first search
  – The difference between them is in the order in which they explore the unvisited edges of the graph
• Graph algorithms are typically elaborations of the basic graph-searching algorithms
Breadth-First Search (BFS)
• Input:
  – A graph G = (V, E) (directed or undirected)
  – A source vertex s ∈ V
• Goal:
  – Explore the edges of G to “discover” every vertex reachable from s, taking the ones closest to s first
• Output:
  – d[v] = distance (smallest # of edges) from s to v, for all v ∈ V
  – A “breadth-first tree” rooted at s that contains all reachable vertices
Breadth-First Search (cont.)
• Discover vertices in increasing order of distance from the source s – search in breadth, not depth
  – Find all vertices at 1 edge from s, then all vertices at 2 edges from s, and so on
[Figure: an example graph with vertices grouped by their distance from the source]
Breadth-First Search (cont.)
• Keeping track of progress:
  – Color each vertex white, gray, or black
  – Initially, all vertices are white
  – When being discovered, a vertex becomes gray
  – After discovering all its adjacent vertices, the node becomes black
  – Use a FIFO queue Q to maintain the set of gray vertices
[Figure: an example graph on vertices 1..5 at successive stages of discovery from the source]
Breadth-First Tree
• BFS constructs a breadth-first tree
  – Initially it contains only the root (source vertex s)
  – When vertex v is discovered while scanning the adjacency list of a vertex u ⇒ vertex v and edge (u, v) are added to the tree
  – u is the predecessor (parent) of v in the breadth-first tree
  – A vertex is discovered only once ⇒ it has at most one parent
BFS Additional Data Structures
• G = (V, E) represented using adjacency lists
• color[u] – the color of vertex u, for all u ∈ V
• π[u] – the predecessor of u
  – If u = s (root) or node u has not yet been discovered ⇒ π[u] = NIL
• d[u] – the distance from the source s to vertex u
• Use a FIFO queue Q to maintain the set of gray vertices
[Figure: source vertex 1 with d = 1, π = 1 for its neighbours and d = 2 for vertices two edges away]
BFS(G, s)
1. for each u ∈ V[G] - {s}
2.     do color[u] ← WHITE
3.        d[u] ← ∞
4.        π[u] ← NIL
5. color[s] ← GRAY
6. d[s] ← 0
7. π[s] ← NIL
8. Q ← ∅
9. ENQUEUE(Q, s)
[Figure: vertices r, s, t, u, v, w, x, y; initially d = ∞ everywhere except d[s] = 0; Q: s]
BFS(G, s) (cont.)
10. while Q ≠ ∅
11.     do u ← DEQUEUE(Q)
12.        for each v ∈ Adj[u]
13.            do if color[v] = WHITE
14.               then color[v] ← GRAY
15.                    d[v] ← d[u] + 1
16.                    π[v] ← u
17.                    ENQUEUE(Q, v)
18.        color[u] ← BLACK
[Figure: after dequeuing s and scanning Adj[s], w and r are discovered with d = 1; Q goes from s to w to w, r]
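The pseudocode above translates almost line for line into Python with `collections.deque` as the FIFO queue. A minimal sketch, where `None` plays the role of both WHITE and ∞ (names and the dictionary-based graph format are my own choices):

```python
from collections import deque

def bfs(adj, s):
    """Breadth-first search from source s over adjacency lists adj.
    Returns (d, pi); d[u] stays None for unreachable (still white) vertices."""
    d = {u: None for u in adj}
    pi = {u: None for u in adj}
    d[s] = 0
    Q = deque([s])                 # FIFO queue holding the gray vertices
    while Q:
        u = Q.popleft()
        for v in adj[u]:
            if d[v] is None:       # v is white: discover it
                d[v] = d[u] + 1    # one edge farther than u
                pi[v] = u
                Q.append(v)
    return d, pi
```

Run on a graph like the r..y example from the slides (my reconstruction of its edges), it produces the final distances of the trace: d[s] = 0, d[r] = d[w] = 1, d[t] = d[x] = d[v] = 2, d[u] = d[y] = 3; the exact queue order depends on the order within each adjacency list.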
Example
[Figure: successive BFS snapshots on the graph with vertices r, s, t, u, v, w, x, y, starting from source s]
Queue contents step by step:
Q: s → Q: w, r → Q: r, t, x → Q: t, x, v → Q: x, v, u → Q: v, u, y → Q: u, y → Q: y → Q: ∅
Final distances: d[s] = 0; d[r] = d[w] = 1; d[t] = d[x] = d[v] = 2; d[u] = d[y] = 3
Analysis of BFS
1. for each u ∈ V - {s}
2.     do color[u] ← WHITE         O(V)
3.        d[u] ← ∞
4.        π[u] ← NIL
5. color[s] ← GRAY
6. d[s] ← 0
7. π[s] ← NIL                      Θ(1)
8. Q ← ∅
9. ENQUEUE(Q, s)
Analysis of BFS (cont.)
10. while Q ≠ ∅
11.     do u ← DEQUEUE(Q)                 Θ(1)
12.        for each v ∈ Adj[u]
13.            do if color[v] = WHITE
14.               then color[v] ← GRAY
15.                    d[v] ← d[u] + 1
16.                    π[v] ← u
17.                    ENQUEUE(Q, v)      Θ(1)
18.        color[u] ← BLACK
• Each vertex is scanned only once, when it is dequeued
• Sum of the lengths of all adjacency lists = Θ(E) ⇒ scanning operations: O(E)
• Total running time for BFS = O(V + E)
Shortest Paths Property
• BFS finds the shortest-path distance from the source vertex s ∈ V to each node in the graph
• Shortest-path distance = δ(s, u)
  – Minimum number of edges in any path from s to u
[Figure: the example graph with each vertex labeled by its distance from the source s]
Readings
• Chapter 16
• Chapter 22