MEG/EEG Inverse problem and solutions
In a Bayesian Framework?
Jérémie Mattout
Lyon Neuroscience Research Centre
With many thanks to Karl Friston, Christophe Phillips,
Rik Henson, Jean Daunizeau
EEG/MEG SPM course, Bruxelles, 2011
Talk’s Overview
• SPM rationale
- generative models
- probabilistic framework
- twofold inference: parameters & models
• EEG/MEG inverse problem and SPM solution(s)
- probabilistic generative models
- parameter inference and model comparison
A word about generative models
Model: "measure, standard"; a representation or object that describes the functioning of a physical system or concept.
A model enables you to:
- Simulate data
- Estimate (non-observable) parameters
- Predict future observations
- Test hypotheses / compare models
[Figure: Stimulations → system → Physiological Observations, Behavioural Observations]
A word about generative models
[Figure: Auditory-Visual Stimulations (u) → Sources/Network (θ) → MEG Observations (Y); Y = f(θ, u); Model m: f, θ, u]
Probabilistic / Bayesian framework
Probability of an event:
- is represented by real numbers
- conforms to intuition
- is consistent
• normalization: Σ_a p(a) = 1
• marginalization: p(a) = Σ_b p(a, b)
• conditioning (Bayes rule): p(a|b) = p(a, b) / p(b)
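As a quick illustration of these three rules (not part of the original slides), here is a minimal numerical sketch on an invented discrete joint distribution:

```python
import numpy as np

# Invented joint distribution p(a, b) over a in {0, 1} and b in {0, 1, 2}
p_ab = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])

# Normalization: the joint distribution sums to 1
assert np.isclose(p_ab.sum(), 1.0)

# Marginalization: p(a) = sum_b p(a, b)
p_a = p_ab.sum(axis=1)
print(p_a)                      # [0.4 0.6]

# Conditioning (Bayes rule): p(a|b) = p(a, b) / p(b)
p_b = p_ab.sum(axis=0)
p_a_given_b = p_ab / p_b        # each column is now a distribution over a
print(p_a_given_b.sum(axis=0))  # [1. 1. 1.]
```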
Probabilistic modelling
[Figure: Auditory-Visual Stimulations (u) → Sources/Network (θ) → MEG Observations (Y); Y = f(θ, u); Model m: f, θ, u]
Bayes rule relates the likelihood, the prior, the posterior and the evidence:
Posterior: p(θ|Y, m) = p(Y|θ, m) p(θ|m) / p(Y|m)
Likelihood: p(Y|θ, m); Prior: p(θ|m)
Marginal likelihood or evidence: p(Y|m) = ∫ p(Y|θ, m) p(θ|m) dθ
Probabilistic modelling enables us:
- To formalize our knowledge mathematically in a model m
- To account for uncertainty
- To make inferences about both model parameters and the models themselves
A toy example
[Figure: MEG Observations (Y)]
- One dipolar source with known position and orientation; unknown amplitude θ.
Model m: linear f, source gain vector L, measurement noise ε, Gaussian distributions:
Y = Lθ + ε
Likelihood: p(Y|θ) = N(Lθ, V1), i.e. p(ε) = N(0, V1)
Prior: p(θ) = N(0, V2)
A toy example
[Figure: MEG Observations (Y)]
Model m: p(Y|θ) = N(Lθ, V1) and p(θ) = N(0, V2)
Bayes rule gives the posterior:
p(θ|Y) = p(Y|θ) p(θ) / p(Y) ∝ p(Y|θ) p(θ)
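For this linear Gaussian model the posterior is available in closed form: p(θ|Y) = N(μ, Σ) with Σ = (Lᵀ V1⁻¹ L + V2⁻¹)⁻¹ and μ = Σ Lᵀ V1⁻¹ Y. A minimal numerical sketch of this conjugate update, with all dimensions and values invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

n_sensors = 10
L = rng.standard_normal((n_sensors, 1))   # gain vector of the known dipole (invented)
V1 = 0.5 * np.eye(n_sensors)              # sensor noise covariance (invented)
V2 = np.array([[4.0]])                    # prior variance of the amplitude (invented)

# Simulate data Y = L*theta + eps for a known amplitude
theta_true = 2.0
eps = rng.multivariate_normal(np.zeros(n_sensors), V1)[:, None]
Y = L * theta_true + eps

# Conjugate Gaussian update:
# Sigma = (L' V1^-1 L + V2^-1)^-1,  mu = Sigma L' V1^-1 Y
V1_inv = np.linalg.inv(V1)
Sigma = np.linalg.inv(L.T @ V1_inv @ L + np.linalg.inv(V2))
mu = Sigma @ L.T @ V1_inv @ Y

print(f"posterior mean {mu.item():.2f}, posterior sd {np.sqrt(Sigma.item()):.2f}")
```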
Hypothesis testing: model comparison
Occam's razor, or the principle of parsimony: "complexity should not be assumed without necessity".
Bayes rule: p(θ|Y) = p(Y|θ) p(θ) / p(Y)
Model evidence: p(Y|m) = ∫ p(Y|θ, m) p(θ|m) dθ
[Figure: model evidence p(y|m) over the space of all data sets]
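For the linear Gaussian toy model, the integral above is tractable: marginalising θ out gives p(Y|m) = N(Y; 0, L V2 Lᵀ + V1). A hedged sketch (the function name and setup are mine, not SPM's):

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_evidence(Y, L, V1, V2):
    """ln p(Y|m) for Y = L theta + eps, theta ~ N(0, V2), eps ~ N(0, V1).
    Marginally, Y ~ N(0, L V2 L' + V1)."""
    S = L @ V2 @ L.T + V1
    return multivariate_normal(mean=np.zeros(len(Y)), cov=S).logpdf(Y.ravel())
```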
Hypothesis testing: model comparison
Bayes factor
• define the null and the alternative hypothesis H (or model m) in terms of priors, e.g.:
H0: p(θ|H0) = 1 if θ = 0, 0 otherwise
H1: p(θ|H1) = N(0, Σ)
• invert both generative models (obtain both model evidences, p(Y|H0) and p(Y|H1))
• apply the decision rule, i.e.: if P(H0|Y) / P(H1|Y) ≤ 1, then reject H0
[Figure: evidences p(Y|H0) and p(Y|H1) over the space of all data sets]
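In the toy model both evidences follow from the same formula: under H0 the prior variance is zero, so Y ~ N(0, V1); under H1, Y ~ N(0, L V2 Lᵀ + V1). A sketch reusing log_evidence and the variables from the earlier toy-example sketches:

```python
# Reusing log_evidence and Y, L, V1, V2 from the sketches above
logev_H1 = log_evidence(Y, L, V1, V2)                # theta ~ N(0, V2)
logev_H0 = log_evidence(Y, L, V1, np.zeros((1, 1)))  # point mass at theta = 0

bayes_factor = np.exp(logev_H1 - logev_H0)
print(f"BF10 = {bayes_factor:.1f}")   # > 1 favours H1
```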
EEG/MEG inverse problem
Probabilistic framing
Forward computation gives the likelihood & prior: p(Y|θ, m) and p(θ|m)
Inverse computation gives the posterior & evidence: p(θ|Y, m) and p(Y|m)
EEG/MEG inverse problem
Distributed/Imaging model
Likelihood: Y = LJ + ε, p(Y|θ, m) = N(LJ, Σε)
Parameters: (J, ε)
Hypotheses m: distributed (linear) model, gain matrix L, Gaussian distributions
Prior, source level: p(J) = N(0, ΣJ), a (#sources × #sources) covariance, e.g. IID (Minimum Norm) or maximum smoothness (LORETA-like)
Prior, sensor level: Σε = σ²I, a (#sensors × #sensors) covariance
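Under these assumptions the posterior mean of J has the familiar regularised form Ĵ = ΣJ Lᵀ (L ΣJ Lᵀ + Σε)⁻¹ Y; with ΣJ = I this is the classical minimum-norm estimate, and a LORETA-like solution simply swaps in a spatially smooth ΣJ. A minimal sketch with invented dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_sources = 64, 500           # invented dimensions

L = rng.standard_normal((n_sensors, n_sources))  # gain matrix (invented)
Y = rng.standard_normal((n_sensors, 1))          # one time sample (invented)

Sigma_J = np.eye(n_sources)              # IID source prior -> minimum norm
Sigma_eps = 0.1 * np.eye(n_sensors)      # sensor noise, sigma^2 * I

# Posterior mean: J_hat = Sigma_J L' (L Sigma_J L' + Sigma_eps)^-1 Y
G = L @ Sigma_J @ L.T + Sigma_eps
J_hat = Sigma_J @ L.T @ np.linalg.solve(G, Y)
print(J_hat.shape)                       # (500, 1)
```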
EEG/MEG inverse problem
Incorporating Multiple Constraints
Likelihood: Y = LJ + ε, p(Y|θ, m) = N(LJ, Σε)
Parameters: (J, ε, λ)
Hypotheses m: hierarchical model, operator L + covariance components Qk
Prior, source (or sensor) level: p(J) = N(0, Σ), with Σ = λ1·Q1 + … + λk·Qk
Hyperprior: log λ ~ N(α, β)
Multiple Sparse Priors (MSP)
Estimation procedure
Expectation Maximization (EM) / Restricted Maximum Likelihood (ReML) / Free-Energy
optimization / Parametric Empirical Bayes (PEB)
Iterative scheme
E-step: q̂(J|m) = argmax_q F = p(J|Y, λ̂, m)
M-step: λ̂ = argmax_λ F
Free energy: F = log p(Y|m) - KL[q(θ), p(θ|Y, m)] = ⟨log p(Y|θ, m)⟩_q - KL[q(θ), p(θ|m)]
where ⟨log p(Y|θ, m)⟩_q is the accuracy and KL[q(θ), p(θ|m)] the complexity.
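SPM's actual EM/ReML implementation is more refined, but the core of the M-step, choosing the hyperparameters λk of Σ = Σk λk·Qk to maximise the model evidence, can be illustrated with a generic optimiser. Everything below (component shapes, optimiser, dimensions) is an assumption for illustration, not SPM code:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_evidence(log_lam, Y, L, Qs, Sigma_eps):
    """-ln p(Y|lambda) (up to a constant) for
    Y ~ N(0, L (sum_k lam_k Q_k) L' + Sigma_eps), one lam per component.
    Optimising log lambda keeps the lambdas positive."""
    lam = np.exp(log_lam)
    Sigma_J = sum(l * Q for l, Q in zip(lam, Qs))
    S = L @ Sigma_J @ L.T + Sigma_eps
    _, logdet = np.linalg.slogdet(S)
    t = Y.shape[1]
    return 0.5 * (t * logdet + np.trace(Y.T @ np.linalg.solve(S, Y)))

# Invented setup: two source covariance components (two "patches")
rng = np.random.default_rng(2)
n_sensors, n_sources, n_times = 32, 100, 50
L = rng.standard_normal((n_sensors, n_sources))
Q1 = np.diag((np.arange(n_sources) < 50).astype(float))
Q2 = np.diag((np.arange(n_sources) >= 50).astype(float))
Sigma_eps = 0.1 * np.eye(n_sensors)

# Simulate data with known hyperparameters, then recover them
J = rng.multivariate_normal(np.zeros(n_sources), 2.0 * Q1 + 0.1 * Q2, n_times).T
E = rng.multivariate_normal(np.zeros(n_sensors), Sigma_eps, n_times).T
Y = L @ J + E

res = minimize(neg_log_evidence, x0=np.zeros(2), args=(Y, L, [Q1, Q2], Sigma_eps))
print(np.exp(res.x))   # estimated (lambda_1, lambda_2), close to (2.0, 0.1)
```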
Estimation procedure
Model comparison based on the Free-energy
At convergence: F ≈ ln p(Y|M) = accuracy(M) - complexity(M)
[Figure: free energies Fi for candidate models Mi, i = 1, 2, 3]
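Since F approximates ln p(Y|M), candidate models, i.e. candidate sets of prior components, can be scored on the same data and the model with the highest F retained. Continuing the sketch above (again an assumption, not SPM code):

```python
# Score an alternative hypothesis (a single IID component) on the same data
res_iid = minimize(neg_log_evidence, x0=np.zeros(1),
                   args=(Y, L, [np.eye(n_sources)], Sigma_eps))

# Constant terms are shared across models, so -res.fun compares like a log evidence
F_two, F_iid = -res.fun, -res_iid.fun
print(f"F(two components) = {F_two:.1f}, F(IID) = {F_iid:.1f}")
```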
At the end of the day
[Figure: somatosensory data and its source reconstruction]
Example
MEG - Epilepsy
- Pharmacoresistant epilepsy (surgery planning):
• symptoms
• PET + structural MRI
• SEEG
Romain Bouet
Julien Jung
François Maugière
Could MEG replace, or at least complement and guide, SEEG?
[Figure: seizure recording, 30 s]
120 patients: MEG proved highly informative in 85 of them.
Example
MEG - Epilepsy
Patient 1: model comparison
Romain Bouet
Julien Jung
François Maugière
SEEG
MEG
(best model)
Example
MEG - Epilepsy
Patient 2: estimated dynamics
Romain Bouet
Julien Jung
François Maugière
[Figure: estimated source dynamics over time, compared with SEEG; occipital lesion]
Conclusion
The SPM probabilistic inverse modelling approach makes it possible to:
• Estimate both parameters and hyperparameters from the data
• Incorporate multiple priors of different natures
• Estimate a full posterior distribution over model parameters
• Estimate an approximation to the log-evidence (the free energy), which enables model comparison based on the same data
• Encompass multimodal fusion and group analysis gracefully
• Note that SPM also includes a flexible and convenient meshing tool, as well as beamforming solutions and a Bayesian ECD approach…
Thank you for your attention
EEG/MEG inverse problem
Graphical representation
[Figure: graphical model. Source-level covariance components Q1(j), Q2(j), … weighted by hyperparameters λi(j) define C(j); sensor-level components Q1(e), Q2(e), … weighted by λi(e) define C(e). J ~ N(0, C(j)) and E ~ N(0, C(e)); the gain matrix L is fixed, the hyperparameters are variable, and the data are Y = LJ + E.]
Fusion of different modalities
[Figure: multimodal graphical model. Common source-level components Q1(j), Q2(j), … define C(j); modality-specific sensor-level components Q11(e), Q12(e) and Q21(e), Q22(e) define C1(e) and C2(e). J ~ N(0, C(j)); gain matrices LMEG and LEEG map the common sources J to the data YMEG and YEEG.]
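One way to read the fusion graph (a sketch under my own assumptions, not SPM's implementation) is that the modalities are concatenated: a single shared J, stacked gain matrices, and separate sensor noise components. In practice the modalities also need rescaling to comparable units; that step is omitted here:

```python
import numpy as np

rng = np.random.default_rng(3)
n_meg, n_eeg, n_sources = 32, 16, 100          # invented dimensions

L_meg = rng.standard_normal((n_meg, n_sources))
L_eeg = rng.standard_normal((n_eeg, n_sources))

# Stack the modalities: one shared J, block-diagonal sensor covariance
L = np.vstack([L_meg, L_eeg])
Sigma_eps = np.diag(np.concatenate([0.1 * np.ones(n_meg),    # MEG noise level
                                    0.3 * np.ones(n_eeg)]))  # EEG noise level

# From here the fused model is inverted exactly like a single-modality one,
# e.g. with the minimum-norm sketch above (Sigma_J = I for simplicity):
Y = rng.standard_normal((n_meg + n_eeg, 1))    # placeholder data
G = L @ L.T + Sigma_eps
J_hat = L.T @ np.linalg.solve(G, Y)
```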
Incorporating fMRI priors
Hypothesis testing: inference on parameters
Frequentist vs. Bayesian approach
Classical inference (SPM):
• define the null, e.g.: H0: θ = 0
• estimate parameters (obtain a test statistic t = t(Y) and its null distribution p(t|H0))
• apply the decision rule, i.e.: if P(t > t*|H0) ≤ α, then reject H0

Bayesian inference (PPM):
• define the null, e.g.: H0: θ > 0
• invert the model (obtain the posterior pdf p(θ|Y))
• apply the decision rule, i.e.: if P(H0|Y) ≥ α, then accept H0
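Continuing the toy example (a sketch, not SPM's PPM machinery): with a Gaussian posterior, P(θ > 0|Y) is just a normal tail probability, and the Bayesian decision rule compares it to the chosen threshold:

```python
import numpy as np
from scipy.stats import norm

# mu and Sigma are the toy-example posterior moments computed earlier
p_pos = 1.0 - norm.cdf(0.0, loc=mu.item(), scale=np.sqrt(Sigma.item()))
print(f"P(theta > 0 | Y) = {p_pos:.3f}")   # compare to a threshold, e.g. 0.95
```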