week03-patternRec.ppt

Pattern Recognition Concepts

Chapter 4: Shapiro and Stockman
How should objects be represented?
Algorithms for recognition/matching:
* nearest neighbors
* decision trees
* decision functions
* artificial neural networks
How should learning/training be done?
Feature Vector Representation
* X = [x1, x2, …, xn], each xj a real number
* xj may be an object measurement
* xj may be a count of object parts
* Example object representation: [#holes, A, moments, …]
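As a concrete illustration (not taken from the slides), a feature vector is just a fixed-length numeric array; the particular features and values below are hypothetical.

import numpy as np

# Hypothetical feature vector for one object:
# [number of holes, area in pixels, two shape moments]
x = np.array([1.0, 742.0, 0.23, 0.07])

n = x.shape[0]  # the object is one point in an n-dimensional feature space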
Possible features for character recognition
Some Terminology
* Classes: a set of m known classes of objects
  (a) might have a known description for each
  (b) might have a set of samples for each
* Reject class: a generic class for objects not in any of the designated known classes
* Classifier: assigns an object to a class based on its features
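A minimal sketch of how this terminology can map onto code; the names REJECT and make_classifier, the score function, and the threshold are all assumptions made for illustration.

from typing import Callable, Sequence
import numpy as np

REJECT = "reject"  # generic class for objects not in any designated known class

def make_classifier(class_names: Sequence[str],
                    score: Callable[[np.ndarray, str], float],
                    threshold: float) -> Callable[[np.ndarray], str]:
    """Build a classifier: assign x to the best-scoring known class,
    or to the reject class if no class scores well enough."""
    def classify(x: np.ndarray) -> str:
        best = max(class_names, key=lambda c: score(x, c))
        return best if score(x, best) >= threshold else REJECT
    return classify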
Classification paradigms
Discriminant functions
* Functions f(x, K) perform some computation on feature vector x
* Knowledge K from training or programming is used
* A final stage determines the class, typically the one whose discriminant value is largest
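A minimal sketch of the discriminant-function pipeline, assuming linear discriminants whose weights (the knowledge K) would come from training; the weight values here are made up.

import numpy as np

# Knowledge K: one weight vector and bias per class (would come from training)
K = {
    "class1": (np.array([0.8, -0.2]), 0.1),
    "class2": (np.array([-0.5, 0.6]), 0.0),
}

def f(x, w, b):
    return float(w @ x + b)  # f(x, K): a computation on feature vector x

def classify(x):
    scores = {c: f(x, w, b) for c, (w, b) in K.items()}
    return max(scores, key=scores.get)  # final stage: take the largest value

print(classify(np.array([1.0, 0.5])))  # -> "class1"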
Decision-Tree Classifier
* Uses subsets of the features in sequence
* Feature extraction may be interleaved with classification decisions
* Can be easy to design and efficient in execution
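A hand-built sketch of such a tree for character-like objects. The point is that each node tests only the features it needs, so features on untaken branches are never extracted; the feature functions and character labels are hypothetical stand-ins.

def classify_char(img, features):
    """Tiny hand-designed decision tree over lazily computed features."""
    if features["num_holes"](img) == 0:
        return "1" if features["aspect_ratio"](img) > 2.0 else "L"
    elif features["num_holes"](img) == 1:
        return "A" if features["has_crossing"](img) else "O"
    else:
        return "B"

# Stand-in feature extractors for a binary image given as a 2D list
features = {
    "num_holes":    lambda img: 0,
    "aspect_ratio": lambda img: len(img) / max(len(img[0]), 1),
    "has_crossing": lambda img: False,
}
print(classify_char([[0, 1], [1, 0]], features))  # -> "L"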
Classification using nearest class mean

* Compute the Euclidean distance between feature vector X and the mean of each class
* Choose the closest class, if it is close enough (reject otherwise)
* Low error rate in the example at left
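A minimal nearest-mean classifier with a reject option, assuming the class means are estimated from labeled training samples; the distance threshold and the toy data are illustrative choices.

import numpy as np

def class_means(X, y):
    """Mean feature vector of each class, from labeled training data."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_mean(x, means, reject_dist=3.0):
    """Assign x to the class with the closest mean (Euclidean distance),
    or reject if even the closest mean is too far away."""
    dists = {c: np.linalg.norm(x - m) for c, m in means.items()}
    best = min(dists, key=dists.get)
    return best if dists[best] <= reject_dist else "reject"

X = np.array([[0.0, 0.1], [0.2, -0.1], [4.0, 4.2], [3.8, 3.9]])
y = np.array(["c1", "c1", "c2", "c2"])
print(nearest_mean(np.array([0.1, 0.0]), class_means(X, y)))  # -> "c1"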
Nearest mean might yield poor results with complex structure
* Class 2 has two modes
* If the modes are detected, two subclass mean vectors can be used
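One hedged way to obtain such subclass means: cluster the samples of the multimodal class into two groups and use both cluster centers. Using k-means (here via scikit-learn) is an assumption for illustration; the slide only requires that the modes be detected somehow.

import numpy as np
from sklearn.cluster import KMeans

def subclass_means(samples, k=2):
    """Cluster one class's samples into k subclasses and return the
    k subclass mean vectors (the cluster centers)."""
    return KMeans(n_clusters=k, n_init=10).fit(samples).cluster_centers_

# Class 2 samples drawn around two separate modes (illustrative data)
class2 = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.8]])
print(subclass_means(class2))  # two mean vectors, one per mode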
Scaling coordinates by std dev
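A minimal sketch of the scaling step: divide each coordinate by that feature's standard deviation (estimated from training data) so that distances are not dominated by features with large numeric ranges; the data is illustrative.

import numpy as np

X = np.array([[170.0, 0.2], [180.0, 0.4], [160.0, 0.1]])  # training data (illustrative)
sigma = X.std(axis=0)         # per-feature standard deviation
X_scaled = X / sigma          # each coordinate divided by its std dev

x_new = np.array([175.0, 0.3])
x_new_scaled = x_new / sigma  # scale new observations the same way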
Another problem for nearest mean classification

* If the features are unscaled, object X is equidistant from each class mean
* With scaling, X is closer to the left distribution
* The coordinate axes are not natural for this data
* 1D discrimination is possible with PCA (a short sketch follows)
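A minimal numpy sketch of the PCA step: center the data and rotate it onto its principal axes; in the slide's example one of the rotated coordinates, a single number per object, is then enough for discrimination. The data below is an illustrative stand-in.

import numpy as np

X = np.array([[1.0, 1.2], [2.0, 2.1], [3.0, 3.2],
              [1.5, 0.9], [2.5, 1.9], [3.5, 2.8]])  # stand-in for the two clusters

Xc = X - X.mean(axis=0)                    # center the data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X_rot = Xc @ Vt.T                          # coordinates along the principal axes
# In the slide's example, one column of X_rot suffices for 1D discrimination.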
Receiver Operating Curve (ROC)

* Plots correct detection rate versus false alarm rate
* Generally, false alarms go up with attempts to detect higher percentages of the known objects
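A minimal sketch of how ROC points are traced out, assuming each test object has a detector score and a ground-truth label (both illustrative here): each threshold yields one (false alarm rate, detection rate) pair.

import numpy as np

scores = np.array([0.9, 0.8, 0.7, 0.55, 0.4, 0.2])     # detector confidence
is_object = np.array([1, 1, 0, 1, 0, 0], dtype=bool)   # ground truth

for t in np.unique(scores)[::-1]:
    detected = scores >= t
    detect_rate = (detected & is_object).sum() / is_object.sum()
    false_alarm = (detected & ~is_object).sum() / (~is_object).sum()
    print(f"threshold {t:.2f}: detection {detect_rate:.2f}, false alarm {false_alarm:.2f}")
# Lowering the threshold raises the detection rate but also the false-alarm rate.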
Example Face ID Methods
From Colbry, Stockman, et al.
Different Test Curves
Confusion matrix shows empirical performance
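A minimal sketch of building a confusion matrix from true and assigned labels; rows are the true classes, columns the classifier's decisions, and the labels are illustrative.

import numpy as np

labels = ["A", "B", "C"]
true = ["A", "A", "B", "B", "C", "C", "C"]
pred = ["A", "B", "B", "B", "C", "A", "C"]

idx = {c: i for i, c in enumerate(labels)}
cm = np.zeros((len(labels), len(labels)), dtype=int)
for t, p in zip(true, pred):
    cm[idx[t], idx[p]] += 1   # row = true class, column = assigned class

print(cm)  # off-diagonal counts are the confusions between classes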
Bayesian decision-making
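The body of this slide did not survive the transcript. As a hedged sketch of the usual Bayesian decision rule, choose the class that maximizes p(x | class) * P(class), which is proportional to the posterior; the priors and the 1D normal class-conditional densities below are assumptions for illustration.

import math

classes = {
    "c1": {"prior": 0.6, "mean": 0.0, "std": 1.0},
    "c2": {"prior": 0.4, "mean": 3.0, "std": 1.5},
}

def normal_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def bayes_decide(x):
    # Pick the class with the largest p(x | c) * P(c)
    return max(classes,
               key=lambda c: normal_pdf(x, classes[c]["mean"], classes[c]["std"])
                             * classes[c]["prior"])

print(bayes_decide(1.2))  # -> "c1"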
Normal distribution
* Zero mean and unit standard deviation
* The table enables us to fit histograms and represent them simply
* A new observation of the variable x can then be translated into a probability
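A minimal sketch of this fitting idea: estimate a mean and standard deviation from samples, then turn a new observation into a standard-normal z value and a tail probability via the standard-normal CDF (using math.erf instead of a printed table); the sample values are illustrative.

import math

samples = [4.1, 4.5, 3.8, 4.0, 4.4, 4.2]   # training measurements (illustrative)
mu = sum(samples) / len(samples)
sigma = (sum((s - mu) ** 2 for s in samples) / len(samples)) ** 0.5

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

x_new = 4.9
z = (x_new - mu) / sigma                    # translate x into a standard-normal value
p_at_least = 1.0 - std_normal_cdf(z)        # probability of a value this large or larger
print(f"z = {z:.2f}, P(X >= {x_new}) = {p_at_least:.3f}")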
Parametric Models can be used
Cherry with bruise
* Intensities at about 750 nanometers wavelength
* Some overlap is caused by the cherry surface turning away