
Pattern Recognition Concepts




Chapter 4: Shapiro and Stockman
How should objects be represented?
Algorithms for recognition/matching:
* nearest neighbors
* decision trees
* decision functions
* artificial neural networks
How should learning/training be done?
Stockman CSE803 Fall 2009
Feature Vector Representation

X = [x1, x2, …, xn], where each xj is a real number
xj may be an object measurement
xj may be a count of object parts
Example object representation: [#holes, area, moments, …]
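As a minimal sketch of the idea above, an object can be packed into a fixed-length feature vector; the feature names and values here are illustrative assumptions, not data from the slides.

```python
# Illustrative sketch: pack object measurements and part counts into
# a feature vector X = [x1, ..., xn] of real numbers.

def make_feature_vector(num_holes, area, moment):
    """Counts of object parts and object measurements, as one vector."""
    return [float(num_holes), float(area), float(moment)]

# e.g. a character image with 1 hole, area 42 pixels, moment 3.5
X = make_feature_vector(1, 42, 3.5)
```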
Possible features for character recognition
Some Terminology

Classes: a set of m known classes of objects
(a) we might have a known description for each
(b) we might have a set of samples for each
Reject class: a generic class for objects not in any of the designated known classes
Classifier: assigns an object to a class based on its features
Classification paradigms
Discriminant functions

Functions f(x, K) perform some computation on feature vector x
Knowledge K, from training or programming, is used
The final stage determines the class
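One common concrete form of this scheme is a linear discriminant per class; the sketch below assumes that form, and the weights in the usage example are illustrative, not trained values.

```python
# Sketch of discriminant functions: one linear function per class, with the
# knowledge K encoded as (weights, bias) obtained by training or programming.

def discriminant(x, weights, bias):
    """f(x, K) = w . x + b for one class."""
    return sum(wi * xi for wi, xi in zip(weights, x)) + bias

def classify(x, K):
    """Final stage: pick the class whose discriminant scores highest."""
    return max(K, key=lambda label: discriminant(x, *K[label]))
```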
Decision-Tree Classifier

Uses subsets of features in sequence
Feature extraction may be interleaved with classification decisions
Can be easy to design and efficient in execution
Decision Trees

[Figure: example decision tree for character recognition. The root tests #holes (0, 1, or 2); lower nodes test moment of inertia against a threshold t, best axis direction (0, 60, or 90 degrees), and #strokes, leading to leaves for the characters '-', '/', '1', 'x', 'w', '0', 'A', '8', and 'B'.]
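A tree like the one in the figure can be hand-coded as nested tests. The sketch below follows the figure's general shape, but the exact tests and the `slender` stand-in for the moment-of-inertia threshold are illustrative assumptions.

```python
# Hand-built decision tree for character classification, sketched after the
# slide's figure. Thresholds and branch details are illustrative only.

def classify_char(num_holes, num_strokes, slender):
    """slender: True if the moment of inertia about the best axis is below
    some threshold t (the shape is a thin bar); an assumed simplification."""
    if num_holes == 0:
        if slender:
            # '-', '/', and '1' would be split further by best axis
            # direction; collapsed here to keep the sketch short
            return "-"
        return "x" if num_strokes == 2 else "w"
    elif num_holes == 1:
        return "A"
    else:  # two holes
        return "8" if num_strokes == 0 else "B"
```

Note how only the features needed along one root-to-leaf path are ever examined, which is why extraction can be interleaved with the decisions.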
Classification using nearest class mean

Compute the Euclidean distance between feature vector X and the mean of each class.
Choose the closest class, if close enough (otherwise reject).
The error rate is low when the classes are well separated, as in the figure at left.
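The rule above can be sketched directly; the reject-distance threshold is an assumed parameter.

```python
import math

def nearest_mean_classify(x, class_means, reject_dist):
    """Assign x to the class with the nearest mean (Euclidean distance).
    Return None (reject class) if even the closest mean is farther than
    reject_dist, i.e. x is not close enough to any known class."""
    best_label, best_d = None, float("inf")
    for label, mean in class_means.items():
        d = math.dist(x, mean)
        if d < best_d:
            best_label, best_d = label, d
    return best_label if best_d <= reject_dist else None
```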
Nearest mean might yield poor results with complex structure

Class 2 has two modes
If the modes are detected, two subclass mean vectors can be used
Scaling coordinates by standard deviation
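A minimal sketch of this scaling, dividing each coordinate by that feature's standard deviation over the samples so no single feature dominates the distance computation:

```python
from statistics import stdev

def scale_features(samples):
    """Divide each coordinate by its standard deviation across the sample
    set, so Euclidean distances are not dominated by large-spread features."""
    cols = list(zip(*samples))          # one tuple per feature dimension
    sds = [stdev(c) for c in cols]      # per-feature standard deviation
    return [[x / s for x, s in zip(row, sds)] for row in samples]
```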
Another problem for nearest mean classification

If the coordinates are unscaled, object X is equidistant from each class mean
With scaling, X is closer to the left distribution
The coordinate axes are not natural for this data
1-D discrimination is possible with PCA
Receiver Operating Curve (ROC)

Plots the correct detection rate versus the false alarm rate
Generally, false alarms go up with attempts to detect higher percentages of known objects
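One way to trace such a curve, assuming the classifier produces a score per object and a detection threshold is swept; the scores in the usage example are made up for illustration.

```python
def roc_points(scores_pos, scores_neg, thresholds):
    """For each threshold, compute (false-alarm rate, detection rate):
    one point on the ROC curve. Lowering the threshold detects more of
    the known objects but also raises the false-alarm rate."""
    pts = []
    for t in thresholds:
        tpr = sum(s >= t for s in scores_pos) / len(scores_pos)  # detections
        fpr = sum(s >= t for s in scores_neg) / len(scores_neg)  # false alarms
        pts.append((fpr, tpr))
    return pts
```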
Confusion matrix shows empirical performance
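A confusion matrix can be accumulated from labeled test data as a simple sketch; off-diagonal entries are the confusions between classes.

```python
from collections import Counter

def confusion_matrix(true_labels, predicted_labels):
    """counts[(true, predicted)]: diagonal entries (t == p) are correct
    classifications; off-diagonal entries show which classes get confused."""
    return Counter(zip(true_labels, predicted_labels))
```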
Bayesian decision-making
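The standard Bayesian decision rule picks the class maximizing prior times likelihood. The sketch below assumes one-dimensional Gaussian class models; the priors, means, and deviations in the usage example are illustrative.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Likelihood p(x | class) under a normal model with mean mu, std sigma."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bayes_classify(x, classes):
    """classes: {label: (prior, mu, sigma)}. Choose the label maximizing
    prior * likelihood, i.e. the maximum a posteriori class."""
    return max(classes,
               key=lambda c: classes[c][0] * gaussian_pdf(x, classes[c][1], classes[c][2]))
```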
Normal distribution

Zero mean and unit standard deviation
The table enables us to fit histograms and represent them simply
A new observation of variable x can then be translated into a probability
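In code, the printed table can be replaced by the standard normal CDF via the error function; the observation is first converted to a z-score, as described above.

```python
import math

def standard_normal_cdf(z):
    """P(Z <= z) for a zero-mean, unit-std normal; stands in for the table."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_at_most(x, mu, sigma):
    """Translate a new observation x of a fitted variable into a
    probability via its z-score (x - mu) / sigma."""
    return standard_normal_cdf((x - mu) / sigma)
```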
Parametric models can be used
Cherry with bruise

Intensities at about 750 nanometers wavelength
Some overlap is caused by the cherry surface turning away