Transcript Slide 1

Introduction To The Pattern Recognition Applet
Ryan Irwin
Intelligent Electronics Systems
Human and Systems Engineering
Center for Advanced Vehicular Systems
URL: www.cavs.msstate.edu/hse/ies/publications/seminars/msstate/2006/pattern_recognition/
General Overview
o Java-based applet that demonstrates various algorithms implemented at IES
o Each implementation closely mirrors the code and functionality of the actual implementation in the repository
o Two types of algorithms are implemented
 Pattern Classification: PCA, LDA, SVM, RVM
• Separation of two or more classes
 Signal Tracking/Modeling: LP, KF, UKF, PF
• Time-based
• One signal/class at a time
Pattern Classification
o Algorithms separate the different classes with a line of discrimination
o Different colored points represent different classes
o Classification is deemed successful if no points of different colors fall on the same side of the line (the check sketched below)
o At left, an orange line separates the red and green classes
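To make the success criterion concrete, the following is a minimal Java sketch (not the applet's actual code) that checks whether a line a*x + b*y + c = 0 separates two point sets; the example points, line coefficients, and class colors are made up for illustration.

```java
// Minimal sketch: checks the success criterion described above
// for a 2-D line of discrimination a*x + b*y + c = 0.
public class LineCheck {
    // Returns +1 or -1 depending on which side of the line (a, b, c) the point lies.
    static int side(double a, double b, double c, double x, double y) {
        return (a * x + b * y + c) >= 0 ? 1 : -1;
    }

    // Successful if every red point is on one side and every green point on the other.
    static boolean separated(double[][] red, double[][] green,
                             double a, double b, double c) {
        int redSide = side(a, b, c, red[0][0], red[0][1]);
        for (double[] p : red)
            if (side(a, b, c, p[0], p[1]) != redSide) return false;
        for (double[] p : green)
            if (side(a, b, c, p[0], p[1]) == redSide) return false;
        return true;
    }

    public static void main(String[] args) {
        double[][] red   = { {1.0, 1.0}, {1.5, 2.0} };   // illustrative data
        double[][] green = { {4.0, 4.0}, {5.0, 3.5} };
        // Line x + y - 5 = 0 separates the two example clusters.
        System.out.println(separated(red, green, 1.0, 1.0, -5.0)); // prints true
    }
}
```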
Pattern Classification – Principal Component Analysis
o A covariance matrix describes how the variables of a dataset vary with respect to each other
o A transform maps points from the current space to a new feature space
o Class-Independent PCA – one covariance matrix and one transform are calculated for all points
o Class-Dependent PCA – a covariance matrix and transform are calculated for each class
o Points are mapped from the current space to the new space using the transforms (see the sketch below)
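The sketch below illustrates class-independent PCA on 2-D points as described above: it computes the covariance matrix, finds its leading eigenvector in closed form, and projects the mean-centered points onto it. The data values are invented and the code is not taken from the IES repository.

```java
// Minimal class-independent PCA sketch for 2-D points.
public class Pca2D {
    public static void main(String[] args) {
        double[][] pts = { {2.5, 2.4}, {0.5, 0.7}, {2.2, 2.9}, {1.9, 2.2}, {3.1, 3.0} };
        int n = pts.length;

        // Mean of each dimension
        double mx = 0, my = 0;
        for (double[] p : pts) { mx += p[0]; my += p[1]; }
        mx /= n; my /= n;

        // 2x2 covariance matrix [[a, b], [b, d]]
        double a = 0, b = 0, d = 0;
        for (double[] p : pts) {
            double dx = p[0] - mx, dy = p[1] - my;
            a += dx * dx; b += dx * dy; d += dy * dy;
        }
        a /= (n - 1); b /= (n - 1); d /= (n - 1);

        // Leading eigenvector of the symmetric 2x2 covariance (closed form)
        double lambda = (a + d) / 2 + Math.sqrt(Math.pow((a - d) / 2, 2) + b * b);
        double vx = (b != 0) ? lambda - d : 1, vy = (b != 0) ? b : 0;
        double norm = Math.hypot(vx, vy);
        vx /= norm; vy /= norm;

        // Project each mean-centered point onto the leading principal component
        for (double[] p : pts) {
            double proj = (p[0] - mx) * vx + (p[1] - my) * vy;
            System.out.printf("(%.1f, %.1f) -> %.3f%n", p[0], p[1], proj);
        }
    }
}
```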
Pattern Classification – Linear Discriminant Analysis
o Within-class scatter describes the distribution of points within each class
o Between-class scatter describes the scatter of the expected (mean) vectors around the global mean
o Class-Independent – a single between-class scatter matrix
o Class-Dependent – a between-class scatter matrix per class
o The goal is to minimize within-class scatter and maximize between-class scatter (both are computed as in the sketch below)
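As a concrete reading of the scatter definitions above, here is a hedged Java sketch that computes a within-class scatter Sw and a between-class scatter Sb for two invented 2-D classes; weighting Sb by class size is one common convention and may differ from the applet's implementation.

```java
// Sketch: within-class and between-class scatter matrices for two 2-D classes.
public class Scatter2D {
    // Adds the outer product v * v^T of a 2-D vector into matrix m.
    static void addOuter(double[][] m, double[] v, double weight) {
        for (int i = 0; i < 2; i++)
            for (int j = 0; j < 2; j++)
                m[i][j] += weight * v[i] * v[j];
    }

    static double[] mean(double[][] pts) {
        double[] mu = new double[2];
        for (double[] p : pts) { mu[0] += p[0]; mu[1] += p[1]; }
        mu[0] /= pts.length; mu[1] /= pts.length;
        return mu;
    }

    public static void main(String[] args) {
        double[][][] classes = {
            { {1.0, 2.0}, {1.2, 1.8}, {0.8, 2.1} },   // class 0 (illustrative)
            { {4.0, 4.5}, {4.3, 4.1}, {3.9, 4.4} }    // class 1 (illustrative)
        };

        // Global mean over all points
        int total = 0;
        double[] g = new double[2];
        for (double[][] c : classes) for (double[] p : c) { g[0] += p[0]; g[1] += p[1]; total++; }
        g[0] /= total; g[1] /= total;

        double[][] sw = new double[2][2];  // within-class scatter
        double[][] sb = new double[2][2];  // between-class scatter
        for (double[][] c : classes) {
            double[] mu = mean(c);
            for (double[] p : c)
                addOuter(sw, new double[]{ p[0] - mu[0], p[1] - mu[1] }, 1.0);
            // Between-class term: class-size-weighted outer product of (mu - global mean)
            addOuter(sb, new double[]{ mu[0] - g[0], mu[1] - g[1] }, c.length);
        }
        System.out.println("Sw = " + java.util.Arrays.deepToString(sw));
        System.out.println("Sb = " + java.util.Arrays.deepToString(sb));
    }
}
```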
Pattern Classification – Support Vector Machine
o Training is comparatively light
o Training picks out the points nearest to the other classes (the support vectors)
o This reduces the number of points used for final classification
o Final classification takes more computation with SVM than with RVM
o More practical when training and classification each happen only once (the decision function is sketched below)
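The sketch below shows how a trained SVM typically classifies a point: the decision function sums kernel evaluations against the retained support vectors. The support vectors, weights, bias, and RBF kernel parameter are illustrative assumptions, not values from the applet.

```java
// Sketch of SVM classification with an RBF kernel (illustrative values only).
public class SvmDecision {
    static double rbf(double[] a, double[] b, double gamma) {
        double d2 = 0;
        for (int i = 0; i < a.length; i++) d2 += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.exp(-gamma * d2);
    }

    public static void main(String[] args) {
        // Hypothetical support vectors, labels (+1/-1), and weights from training
        double[][] sv    = { {1.0, 1.0}, {2.0, 1.5}, {4.0, 4.0} };
        double[]   label = { +1, +1, -1 };
        double[]   alpha = { 0.7, 0.3, 1.0 };
        double     bias  = 0.1, gamma = 0.5;

        double[] x = { 1.5, 1.2 };   // point to classify
        double f = bias;
        for (int i = 0; i < sv.length; i++)
            f += alpha[i] * label[i] * rbf(sv[i], x, gamma);
        System.out.println(f >= 0 ? "class +1" : "class -1");
    }
}
```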
Pattern Classification – RVM
o Training is more computationally involved
o Training selects the points most suitable for classification (the relevance vectors)
o Only a few points are used for final classification (fewer than SVM)
o More practical if training is not needed every time a classification is made
Signal Tracking
o Algorithms track a time-based signal from left to right
o A signal's next state is predicted given the previous states
o Samples are taken at regular intervals by interpolation (sketched below)
o The algorithms are recursive in nature
o Noise is simulated
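As an example of regular-interval sampling by interpolation, here is a small Java sketch that linearly interpolates irregularly spaced samples onto a fixed time step; the applet's actual resampling details may differ.

```java
// Sketch of regular-interval resampling by linear interpolation.
public class Resample {
    // t and y are the irregularly spaced samples (t must be increasing).
    static double interp(double[] t, double[] y, double tq) {
        if (tq <= t[0]) return y[0];
        if (tq >= t[t.length - 1]) return y[y.length - 1];
        int i = 1;
        while (t[i] < tq) i++;                         // first sample at or past tq
        double w = (tq - t[i - 1]) / (t[i] - t[i - 1]);
        return (1 - w) * y[i - 1] + w * y[i];
    }

    public static void main(String[] args) {
        double[] t = { 0.0, 0.7, 1.5, 3.0 };           // illustrative signal
        double[] y = { 0.0, 1.4, 1.0, 4.0 };
        for (double tq = 0.0; tq <= 3.0; tq += 0.5)    // resample every 0.5 time units
            System.out.printf("t=%.1f  y=%.3f%n", tq, interp(t, y, tq));
    }
}
```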
Signal Tracking – Kalman Filter
o The observation equation relates observations to states
o The state equation predicts the next state
o The algorithm repeats two stages
 The state prediction stage uses the state equation and the state gain factor to predict the next state
 The state update stage combines the predicted state and the observation, weighted by their noises, to make the final estimate
o Upon completion, the mean square error is given (both stages appear in the sketch below)
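A minimal one-dimensional Kalman filter sketch follows, assuming a random-walk state equation and illustrative noise variances; it runs the prediction and update stages described above and reports a mean square error at the end. It is a simplification, not the applet's implementation.

```java
// Minimal 1-D Kalman filter sketch: predict/update over noisy observations.
public class Kalman1D {
    public static void main(String[] args) {
        double[] z = { 1.1, 0.9, 1.3, 0.8, 1.0, 1.2 };  // noisy observations (illustrative)
        double q = 0.01;    // process (state) noise variance
        double r = 0.25;    // observation noise variance
        double x = z[0];    // state estimate
        double p = 1.0;     // estimate variance

        double mse = 0;
        for (double obs : z) {
            // Predict: random-walk state equation x_k = x_{k-1} + w
            double xPred = x;
            double pPred = p + q;

            // Update: blend prediction and observation via the Kalman gain
            double k = pPred / (pPred + r);
            x = xPred + k * (obs - xPred);
            p = (1 - k) * pPred;

            mse += (obs - x) * (obs - x);
            System.out.printf("z=%.2f  estimate=%.3f%n", obs, x);
        }
        System.out.printf("MSE vs. observations: %.4f%n", mse / z.length);
    }
}
```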
Signal Tracking – Unscented Kalman Filter
o The algorithm has the same basic operation as the conventional Kalman filter
o Sigma points are used, controlled by the parameters alpha, beta, and kappa
o Each sigma point has a weight that ends up affecting the overall mean of the filtered signal (the standard weights are sketched below)
o The modification generally reduces the mean square error
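The sketch below computes the standard scaled unscented-transform weights from alpha, beta, and kappa, showing how each sigma point's weight feeds the filtered mean; the parameter values are illustrative and may not match the applet's defaults.

```java
// Sketch of the standard scaled unscented-transform sigma-point weights.
public class SigmaWeights {
    public static void main(String[] args) {
        int n = 1;                           // state dimension
        double alpha = 1e-3, beta = 2.0, kappa = 0.0;   // illustrative parameters

        double lambda = alpha * alpha * (n + kappa) - n;
        double[] wMean = new double[2 * n + 1];
        double[] wCov  = new double[2 * n + 1];

        wMean[0] = lambda / (n + lambda);
        wCov[0]  = wMean[0] + (1 - alpha * alpha + beta);
        for (int i = 1; i <= 2 * n; i++)
            wMean[i] = wCov[i] = 1.0 / (2 * (n + lambda));

        // The weighted sum of transformed sigma points gives the filtered mean.
        System.out.println("mean weights: " + java.util.Arrays.toString(wMean));
        System.out.println("cov  weights: " + java.util.Arrays.toString(wCov));
    }
}
```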
Signal Tracking – Particle Filtering
o Based on sequential Monte Carlo techniques
o Has state and observation equations like the KF
o Particles are used for prediction (one step is sketched below)
o At each step the weighted particles form a probability distribution given the observation
o The algorithm works best when applied to non-linear signals
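To show how particles form a distribution at each step, here is a sketch of a single bootstrap particle-filter step (predict, weight by the observation likelihood, estimate); the state model, noise levels, and particle count are assumptions for illustration, not the applet's code.

```java
// Sketch of one bootstrap particle-filter step (random-walk state model,
// Gaussian observation noise).
import java.util.Random;

public class ParticleStep {
    public static void main(String[] args) {
        Random rng = new Random(0);
        int n = 500;
        double procStd = 0.1, obsStd = 0.3;
        double observation = 1.0;                       // illustrative observation

        // Initialize particles around a rough prior guess
        double[] particles = new double[n];
        for (int i = 0; i < n; i++) particles[i] = 0.8 + 0.5 * rng.nextGaussian();

        // Predict: propagate each particle through the (random-walk) state equation
        for (int i = 0; i < n; i++) particles[i] += procStd * rng.nextGaussian();

        // Weight: Gaussian likelihood of the observation given each particle
        double[] w = new double[n];
        double sum = 0;
        for (int i = 0; i < n; i++) {
            double d = observation - particles[i];
            w[i] = Math.exp(-0.5 * d * d / (obsStd * obsStd));
            sum += w[i];
        }

        // The weighted particles approximate p(state | observation);
        // the weighted mean is the filtered estimate for this step.
        double estimate = 0;
        for (int i = 0; i < n; i++) estimate += (w[i] / sum) * particles[i];
        System.out.printf("filtered estimate: %.3f%n", estimate);
    }
}
```

In a full filter, a resampling step would normally follow the weighting so that low-weight particles are replaced before the next prediction.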
Important Points
o Pattern Classification
 Multiple classes
 Not time-based data
 Performance based on the percentage of correctly classified points
o Signal Tracking
 Single class of points
 Time-based and interpolated data
 Performance based on the mean square error
o Is there a need for separate applets?
Tutorials
o Detailed operation of each algorithm is given
o More algorithm detail is given in the tutorial section
Go to tutorials
References
• S. Haykin and E. Moulines, "From Kalman to Particle Filters," IEEE International Conference on Acoustics, Speech, and Signal Processing, Philadelphia, Pennsylvania, USA, March 2005.
• M. W. Andrews, "Learning and Inference in Nonlinear State-Space Models," Gatsby Computational Neuroscience Unit, University College London, U.K., December 2004.
• P. M. Djuric, J. H. Kotecha, J. Zhang, Y. Huang, T. Ghirmai, M. Bugallo, and J. Miguez, "Particle Filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, pp. 19-38, September 2003.
• M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, February 2002.
• R. van der Merwe, N. de Freitas, A. Doucet, and E. Wan, "The Unscented Particle Filter," Technical Report CUED/F-INFENG/TR 380, Cambridge University Engineering Department, Cambridge, U.K., August 2000.
• S. Gannot and M. Moonen, "On the Application of the Unscented Kalman Filter to Speech Processing," International Workshop on Acoustic Echo and Noise Control, Kyoto, Japan, pp. 27-30, September 2003.
• J. P. Norton and G. V. Veres, "Improvement of the Particle Filter by Better Choice of the Predicted Sample Set," 15th IFAC Triennial World Congress, Barcelona, Spain, July 2002.
• J. Vermaak, C. Andrieu, A. Doucet, and S. J. Godsill, "Particle Methods for Bayesian Modeling and Enhancement of Speech Signals," IEEE Transactions on Speech and Audio Processing, vol. 10, no. 3, pp. 173-185, March 2002.
• M. Gabrea, "Robust Adaptive Kalman Filtering-Based Speech Enhancement Algorithm," Proceedings of ICASSP 2004, vol. 1, pp. I-301 - I-304, May 2004.
• K. Paliwal, "Estimation of Noise Variance from the Noisy AR Signal and Its Application in Speech Enhancement," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 36, no. 2, pp. 292-294, February 1988.