Transcript Lesson 5

M.Tech. (CS), Semester III, Course B50
Functional Brain Signal
Processing: EEG & fMRI
Lesson 5
Kaushik Majumdar
Indian Statistical Institute
Bangalore Center
[email protected]
Sur & Sinha, Ind. J. Psychiatr., 18(1): 70 – 73, 2009;
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3016705/
Some Important ERPs

N100: A negative deflection peaking between
90 and 200 msec after stimulus onset, observed
when an unexpected stimulus is presented. It
reflects an orienting response or a "matching
process": whenever a stimulus is presented, it
is matched against previously experienced
stimuli. It has maximum amplitude over Cz and
is therefore also called the "vertex potential."
Important ERPs (cont.)

P300: The P3 wave was discovered by
Sutton et al. in 1965 and has since been a
major focus of research in the field of ERP.
For auditory stimuli, the latency range is
250-400 msec for most adult subjects between
20 and 70 years of age. The latency is
usually interpreted as the speed of stimulus
classification resulting from the
discrimination of one event from another.
Important ERPs (cont.)
Shorter latencies indicate superior mental
performance relative to longer latencies. P3
amplitude seems to reflect stimulus
information, such that greater attention
produces larger P3 waves. A wide variety of
paradigms have been used to elicit the P300,
of which the "oddball" paradigm is the most
widely used: different stimuli are presented
in a series such that one of them occurs
relatively infrequently; that infrequent
stimulus is the oddball.
Important ERPs (cont.)
The subject is instructed to respond to the
infrequent (target) stimulus and not to the
frequently presented (standard) stimulus.
Reduced P300 amplitude is an indicator of
the broad neurobiological vulnerability that
underlies disorders within the externalizing
spectrum (alcohol dependence, drug
dependence, nicotine dependence, conduct
disorder, and adult antisocial behavior)
(Patrick et al., 2006).
http://pubs.niaaa.nih.gov/publications/arh313/238-242.htm
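The oddball protocol described above can be sketched in code. The parameters here (20% target probability, a minimum gap between targets, a fixed seed) are illustrative assumptions for the sketch, not values from the cited studies.

```python
import random

def oddball_sequence(n_trials, p_target=0.2, min_gap=2, seed=0):
    """Generate an oddball stimulus sequence: 'T' = rare target,
    'S' = frequent standard. Targets are kept at least `min_gap`
    standards apart, as is common in P300 paradigms."""
    rng = random.Random(seed)
    seq, since_target = [], min_gap  # allow a target at the start
    for _ in range(n_trials):
        if since_target >= min_gap and rng.random() < p_target:
            seq.append("T")
            since_target = 0
        else:
            seq.append("S")
            since_target += 1
    return seq

seq = oddball_sequence(200)
print(seq.count("T"), "targets out of", len(seq), "trials")
```

During the experiment each 'T' would be presented as the rare stimulus the subject responds to, and each 'S' as the standard.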
ERP Plot by Time-Frequency Diagram

[Figure: time-frequency diagram of an ERP.]
Spectral Estimation

Spectral estimation can be achieved by the
Fourier transform or by autoregressive
models (there are other methods as well; see
Chapter 14 of Digital Signal Processing:
Principles, Algorithms & Applications, 4th ed.,
by J. G. Proakis and D. J. Manolakis,
Pearson, 2007).
http://paulbourke.net/miscellaneous/windows/
Spectral Estimation by Fourier Transform (Nonparametric)

Continuous form:

\[ \int_{-\infty}^{\infty} x(t) \exp(-j 2\pi n t)\, dt = a_n - j b_n \]

With a window function W(·) applied:

\[ \int_{-\infty}^{\infty} x(t) \exp(-j 2\pi n t)\, W(t)\, dt = a_n - j b_n \]

Discrete form:

\[ \lim_{M \to \infty} \sum_{m=-M}^{M} x(m) \exp(-j 2\pi n m)\, W(m) = a_n - j b_n \]

Welch window:

\[ W(m) = 1 - \left( \frac{m - M/2}{M/2} \right)^2 \]

\(a_n^2 + b_n^2\) is the power associated with frequency n.
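As a sketch of the discrete windowed estimator above, the Welch window and the power \(a_n^2 + b_n^2\) at each frequency bin can be computed with NumPy's FFT. The sampling rate, signal length, and test tone are illustrative assumptions.

```python
import numpy as np

def welch_window(M):
    """Welch (parabolic) window on indices m = 0..M-1:
    W(m) = 1 - ((m - M/2) / (M/2))**2."""
    m = np.arange(M)
    return 1.0 - ((m - M / 2) / (M / 2)) ** 2

def windowed_power_spectrum(x):
    """Power a_n^2 + b_n^2 at each DFT frequency n, with the
    Welch window applied before the transform."""
    xw = x * welch_window(len(x))
    X = np.fft.rfft(xw)  # X[n] = a_n - j*b_n (up to sign convention)
    return X.real ** 2 + X.imag ** 2

# A pure tone should concentrate its power at its own frequency bin.
fs, M, f0 = 256, 256, 10.0          # 256 Hz sampling, 1 s of data, 10 Hz tone
t = np.arange(M) / fs
x = np.sin(2 * np.pi * f0 * t)
P = windowed_power_spectrum(x)
print(np.argmax(P))  # -> 10
```

With 256 samples at 256 Hz the bin spacing is 1 Hz, so the peak lands in bin 10; the window slightly broadens the peak, which is the leakage-resolution trade-off listed below.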
Limitations

- Long data length required.
- Spectral leakage masks weak signals.
- Poor frequency resolution.
- Signal has to be stationary (statistical properties do not change over time).
Gaussian (White) Noise

- A Gaussian white noise signal has zero mean.
- It is wide-sense stationary: if the digitized Gaussian white noise signal is ω(n), then E[ω(n)ω*(n + k)] = 0 for all k ≠ 0.
- In addition, it can be made to have unit variance.
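These properties can be checked numerically on a simulated signal; the sample size and seed below are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
w = rng.standard_normal(N)  # zero-mean, unit-variance Gaussian white noise

# Sample versions of the wide-sense-stationarity properties quoted above:
mean = w.mean()                      # should be near 0
var = w.var()                        # should be near 1
r1 = np.mean(w[:-1] * w[1:])         # E[w(n) w(n+1)] should be near 0
r5 = np.mean(w[:-5] * w[5:])         # E[w(n) w(n+5)] should be near 0
print(round(mean, 3), round(var, 3), round(r1, 3), round(r5, 3))
```

The sample statistics deviate from the ideal values only by terms of order 1/sqrt(N).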
Spectral Estimation by Autoregressive Moving Average (ARMA) (Parametric)

This approach has the following advantages:
- Suitable for short data lengths.
- Gives better frequency resolution.
- Avoids spectral leakage.
ARMA (cont.)

\[ H(z) = \frac{B(z)}{A(z)} = \frac{\sum_{k=0}^{q} b_k z^{-k}}{1 + \sum_{k=1}^{p} a_k z^{-k}} \]

H(z) is the system function of ARMA(p,q), where (p,q) is the model order.

The corresponding difference equation, with output y(n) and input x(n), is

\[ y(n) = \underbrace{-\sum_{k=1}^{p} a_k\, y(n-k)}_{\text{autoregressive (AR)}} + \underbrace{\sum_{k=0}^{q} b_k\, x(n-k)}_{\text{moving average (MA)}} \qquad (1) \]

Proakis and Manolakis, 2007, p. 987
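A minimal sketch of difference equation (1), with illustrative ARMA(2,1) coefficients chosen only for the example, cross-checked against scipy.signal.lfilter, which implements the same recursion.

```python
import numpy as np
from scipy.signal import lfilter

# Illustrative coefficients: A(z) = 1 + a1 z^-1 + a2 z^-2, B(z) = b0 + b1 z^-1.
a = [1.0, -0.5, 0.25]   # [1, a1, a2]
b = [1.0, 0.4]          # [b0, b1]

def arma_filter(b, a, x):
    """y(n) = -sum_k a_k y(n-k) + sum_k b_k x(n-k), directly from (1)."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y[n] = acc
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal(64)
y_direct = arma_filter(b, a, x)
y_scipy = lfilter(b, a, x)      # library implementation of the same recursion
print(np.allclose(y_direct, y_scipy))  # -> True
```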
ARMA (cont.)

For a Gaussian white noise signal x, where σ_x² is the variance of the noise and δ is the Dirac delta function:

\[ \gamma_{xx}(m) = \sigma_x^2\, \delta(m) \]

The power spectrum at frequency m is estimated from the autocorrelation of the data:

\[ \Gamma_{xx}(m) = \frac{1}{1 + 2M} \sum_{k=-M}^{M} \sum_{n=-M}^{M} x(n)\, x(n+k) \exp(-j 2\pi m k) \]

For the power spectrum of a general signal x, where ω is a Gaussian white noise signal:

\[ \Gamma_{xx}(f) = |H(f)|^2\, \Gamma_{\omega\omega}(f) = \sigma_\omega^2\, |H(f)|^2 = \sigma_\omega^2\, \frac{|B(f)|^2}{|A(f)|^2} \]
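The last relation can be evaluated numerically. The coefficients below are illustrative; the sketch checks that evaluating |B(f)|²/|A(f)|² directly on the unit circle agrees with scipy.signal.freqz.

```python
import numpy as np
from scipy.signal import freqz

# Illustrative ARMA(2,1) model; A(z) = 1 + a1 z^-1 + a2 z^-2.
b = [1.0, 0.4]
a = [1.0, -0.5, 0.25]
sigma2 = 1.0

# Evaluate B(f) and A(f) on the unit circle directly from the definitions.
f = np.linspace(0.0, 0.5, 256, endpoint=False)
zinv = np.exp(-2j * np.pi * f)              # z^{-1} on the unit circle
B = np.polyval(b[::-1], zinv)               # b0 + b1 z^{-1}
A = np.polyval(a[::-1], zinv)               # 1 + a1 z^{-1} + a2 z^{-2}
Gamma_direct = sigma2 * np.abs(B / A) ** 2

# Same spectrum via scipy's frequency-response helper H(f) = B(f)/A(f).
_, H = freqz(b, a, worN=f, fs=1.0)
Gamma_freqz = sigma2 * np.abs(H) ** 2
print(np.allclose(Gamma_direct, Gamma_freqz))  # -> True
```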
[Figure: autocorrelation and cross-correlation plots, from Gray et al., Nature, 338: 334 – 337, 23 March 1989.]
ARMA (cont.)

- In an ARMA model, if A(z) = 1 then H(z) = B(z) and the model reduces to a moving average (MA) process of order q.
- In an ARMA model, if B(z) = 1 then H(z) = 1/A(z) and the model reduces to an autoregressive (AR) process of order p.

Proakis and Manolakis, 2007, p. 987
Wold’s Decomposition Theorem

- Any ARMA or MA process can be represented uniquely by an AR model of possibly infinite order.
- Any ARMA or AR process can be represented by an MA model of possibly infinite order.
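A numerical illustration of the first statement for an MA(1) process: its AR(∞) representation has coefficients (−b1)^k, so a long truncation of that AR polynomial approximately inverts the MA filter. The value of b1 and the truncation length are illustrative; |b1| < 1 (invertibility) is assumed for convergence.

```python
import numpy as np
from scipy.signal import lfilter

# MA(1): x(n) = w(n) + b1*w(n-1), i.e. B(z) = 1 + b1 z^-1, A(z) = 1.
# Its AR(inf) form has A_inf(z) = 1/B(z) = sum_k (-b1)^k z^-k, |b1| < 1.
b1, K = 0.6, 40
a_inf = (-b1) ** np.arange(K)   # truncated AR(inf) coefficients

rng = np.random.default_rng(7)
w = rng.standard_normal(5000)
x = lfilter([1.0, b1], [1.0], w)   # simulate the MA(1) process

# Filtering x with the truncated AR polynomial should recover w.
w_rec = lfilter(a_inf, [1.0], x)
print(np.max(np.abs(w_rec - w)))   # tiny truncation error
```

The residual error is of order |b1|^K, so the truncated AR model is an arbitrarily good substitute for the MA model as K grows.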
Model Parameters of ARMA Given by Yule-Walker Equations

\[
\begin{bmatrix}
\gamma_{xx}(q) & \gamma_{xx}(q-1) & \cdots & \gamma_{xx}(q-p+1) \\
\gamma_{xx}(q+1) & \gamma_{xx}(q) & \cdots & \gamma_{xx}(q-p+2) \\
\vdots & \vdots & \ddots & \vdots \\
\gamma_{xx}(q+p-1) & \gamma_{xx}(q+p-2) & \cdots & \gamma_{xx}(q)
\end{bmatrix}
\begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_p \end{bmatrix}
= -\begin{bmatrix} \gamma_{xx}(q+1) \\ \gamma_{xx}(q+2) \\ \vdots \\ \gamma_{xx}(q+p) \end{bmatrix}
\]

Here m > q, where m is the lag at which the autocorrelation γ_xx(m) is evaluated.
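For the pure AR special case (q = 0) these equations reduce to a Toeplitz system in the autocorrelation lags, which can be solved directly. The AR(2) model, seed, and sample size below are illustrative choices for the sketch.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

# Illustrative AR(2) model: A(z) = 1 - 0.75 z^-1 + 0.5 z^-2, unit-variance noise.
a_true = np.array([1.0, -0.75, 0.5])

rng = np.random.default_rng(3)
x = lfilter([1.0], a_true, rng.standard_normal(200_000))

def autocorr(x, maxlag):
    """Biased sample autocorrelation estimates gamma(0..maxlag)."""
    N = len(x)
    return np.array([np.dot(x[:N - m], x[m:]) / N for m in range(maxlag + 1)])

p = 2
g = autocorr(x, p)
# Toeplitz Yule-Walker system: solve for [a_1, ..., a_p].
a_hat = solve_toeplitz((g[:p], g[:p]), -g[1:p + 1])
# Noise variance: sigma^2 = gamma(0) + sum_k a_k gamma(k) for a real signal.
sigma2_hat = g[0] + np.dot(a_hat, g[1:p + 1])
print(a_hat, sigma2_hat)
```

The recovered coefficients approach (a_1, a_2) = (-0.75, 0.5) and σ² = 1 as the data length grows.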
Yule-Walker Equations (cont.)

The Yule-Walker equations give the model parameters for the AR part of the ARMA model:

\[
\gamma_{xx}(m) =
\begin{cases}
-\sum_{k=1}^{p} a_k\, \gamma_{xx}(m-k), & m > q \\[4pt]
-\sum_{k=1}^{p} a_k\, \gamma_{xx}(m-k) + \sigma^2 \sum_{k=0}^{q-m} h(k)\, b_{k+m}, & 0 \le m \le q \\[4pt]
\gamma_{xx}^{*}(-m), & m < 0
\end{cases}
\]

Proakis and Manolakis, 2007, p. 990
Model Parameters for the MA Part

\[
\gamma_{xx}(m) =
\begin{cases}
\sigma^2 \sum_{k=0}^{q-m} b_k\, b_{k+m}, & 0 \le m \le q \\[4pt]
0, & m > q \\[4pt]
\gamma_{xx}^{*}(-m), & m < 0
\end{cases}
\]

\[ \sigma^2 = \gamma_{xx}(0) + \sum_{k=1}^{p} a_k\, \gamma_{xx}(-k) \]

Proakis and Manolakis, 2007, p. 837
Derivation of Yule-Walker Equations

When the power spectral density of the stationary random process is a rational function, there is a basic relationship between the autocorrelation sequence {γ_xx(m)} and the model parameters a_k and b_k. This relationship is given by the Yule-Walker equations.
Derivation (cont.)

The ARMA difference equation (1), with white Gaussian noise ω as input and x as output, is

\[ x(n) = -\sum_{k=1}^{p} a_k\, x(n-k) + \sum_{k=0}^{q} b_k\, \omega(n-k) \]

Multiplying both sides by x*(n − m) and taking expectations, we get

\[ E[x(n)\, x^{*}(n-m)] = -\sum_{k=1}^{p} a_k\, E[x(n-k)\, x^{*}(n-m)] + \sum_{k=0}^{q} b_k\, E[\omega(n-k)\, x^{*}(n-m)] \]

Proakis and Manolakis, 2007, p. 837
Derivation (cont.)

\[ \gamma_{xx}(m) = -\sum_{k=1}^{p} a_k\, \gamma_{xx}(m-k) + \sum_{k=0}^{q} b_k\, \gamma_{\omega x}(m-k) \qquad (2) \]

While

\[ \gamma_{\omega x}(m) = E[x^{*}(n)\, \omega(n+m)] = E\!\left[ \sum_{k=0}^{\infty} h(k)\, \omega^{*}(n-k)\, \omega(n+m) \right] = \sigma^2\, h(-m) \]
Derivation (cont.)

Since ω is white noise, in the last step we have used

\[ \gamma_{\omega x}(m) = \begin{cases} 0, & m > 0 \\ \sigma^2\, h(-m), & m \le 0 \end{cases} \qquad (3) \]

Combining (2) and (3) we get
Derivation (cont.)

\[
\gamma_{xx}(m) =
\begin{cases}
-\sum_{k=1}^{p} a_k\, \gamma_{xx}(m-k), & m > q \\[4pt]
-\sum_{k=1}^{p} a_k\, \gamma_{xx}(m-k) + \sigma^2 \sum_{k=0}^{q-m} h(k)\, b_{k+m}, & 0 \le m \le q \\[4pt]
\gamma_{xx}^{*}(-m), & m < 0
\end{cases}
\]
Probability Density Function Estimation

- Probability density function (PDF) estimation for a one-dimensional data set is very similar to power spectrum estimation. The only difference is that a PDF must integrate to unity over the whole space, whereas power spectrum estimation has no such constraint.
- For details, see "Model-based probability density function estimation," S. Kay, IEEE Sig. Proc. Lett., 5(12): 318 – 320, 1998.
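A minimal numerical illustration of the unit-integral constraint, using a histogram-based density estimate (the bin count, sample size, and Gaussian test data are arbitrary choices for the sketch, not Kay's model-based method).

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.standard_normal(50_000)

# Histogram density estimate, normalized so it integrates to 1 -- the
# constraint that distinguishes PDF estimation from spectrum estimation.
counts, edges = np.histogram(data, bins=100)
widths = np.diff(edges)
pdf = counts / (counts.sum() * widths)

area = np.sum(pdf * widths)   # total probability mass
print(round(float(area), 6))  # -> 1.0
```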
References

- Digital Signal Processing: Principles, Algorithms & Applications, 4th ed., J. G. Proakis and D. J. Manolakis, Pearson, 2007. Chapter 14.
- Modern Spectral Estimation: Theory & Application, S. M. Kay, Pearson, 1988, for a general treatment of nonparametric spectral estimation and probability density function estimation methods beyond the Yule-Walker equations.
THANK YOU
This lecture is available at http://www.isibang.ac.in/~kaushik