Large-Scale Brain Modeling


Transcript: Large-Scale Brain Modeling

Cognitive Computing … Computational Neuroscience
Jerome Swartz
The Swartz Foundation
May 10, 2006
Large Scale Brain Modeling
• Science IS modeling
• Models have power
– To explain
– To predict
– To simulate
– To augment
Why model the brain?
Brains are not computers …
• But they are supported by the same physics
– Energy conservation
– Entropy increase
– Least action
– Time direction
• Brains are supported by the same logic,
but implemented differently…
– Low speed; parallel processing; no symbolic software layer;
fundamentally adaptive / interactive; organic vs. inorganic
Brain research must be multi-level
• Scientific collaboration is needed
– Across spatial scales
– Across time scales
– Across measurement techniques
• Current field borders should not remain
boundaries… Curtail Scale Chauvinism!
…both scientifically and mathematically
• To understand, both theoretically and practically,
how brains support behavior and experience
• To model brain / behavior dynamics as active requires:
– Better behavioral measures and modeling
– Better brain dynamic imaging / analysis
– Better joint brain / behavior analysis
… the next research frontier
• Brains are active and multi-scale / multi-level
• The dominant multi-level model: Computers
… with their physical / logical computer hierarchy
– the OSI stack
– physical / implementation levels
– logical / instruction levels
A Multi-Level View of Learning

| LEVEL    | UNIT       | INTERACTIONS         | LEARNING                           |
| ecology  | society    | predation, symbiosis | natural selection                  |
| society  | organism   | behaviour            | sensory-motor learning             |
| organism | cell       | spikes               | synaptic plasticity (= STDP)       |
| cell     | synapse    | voltage, Ca          | bulk molecular changes             |
| synapse  | protein    | direct, V, Ca        | molecular changes                  |
| protein  | amino acid | molecular forces     | gene expression, protein recycling |
(timescale increases toward the higher levels)

LEARNING at a LEVEL is CHANGE IN INTERACTIONS between its UNITS,
implemented by INTERACTIONS at the LEVEL beneath, and by extension
resulting in CHANGE IN LEARNING at the LEVEL above.
Separation of timescales allows INTERACTIONS at one LEVEL
to be LEARNING at the LEVEL above.
Interactions = fast; Learning = slow.
T. Bell
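To make the "interactions = fast, learning = slow" separation concrete, here is a minimal numerical sketch (not from the talk; the variable names, timescale ratio, and set-point are illustrative assumptions): a fast "interaction" variable fluctuates on a short timescale while a slow "learning" variable integrates its statistics on a timescale hundreds of times longer.

```python
import numpy as np

# Minimal two-timescale sketch (illustrative assumptions, not from the talk):
#   fast variable x ~ "interactions" (e.g. neural activity), timescale tau_fast
#   slow variable w ~ "learning"     (e.g. a synaptic weight), timescale tau_slow
rng = np.random.default_rng(0)

tau_fast, tau_slow = 1.0, 500.0   # assumed 1:500 separation of timescales
dt, target = 0.1, 2.0             # step size and an assumed activity set-point
x, w = 0.0, 0.0

for _ in range(200_000):
    # fast dynamics: noisy relaxation of activity toward the current weight
    x += (dt / tau_fast) * (w - x) + np.sqrt(dt / tau_fast) * rng.normal()
    # slow dynamics: the weight integrates the time-averaged activity error,
    # so the fast fluctuations appear to it as a stationary signal to model
    w += (dt / tau_slow) * (target - x)

print(f"slow 'learning' variable settles near the set-point: w = {w:.2f}")
```

Because w moves roughly 500 times more slowly than x, it effectively sees only the time-averaged statistics of x; this is the sense in which fast interactions at one level become slow learning at the level above.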
A Multi-Level View of Learning (restated with DYNAMICS in place of INTERACTIONS)

(The table is the same as above, with the INTERACTIONS column relabelled DYNAMICS.)

LEARNING at one LEVEL is implemented by
DYNAMICS between UNITS at the LEVEL below.
Separation of timescales allows DYNAMICS at one LEVEL
to be LEARNING at the LEVEL above.
Dynamics = fast; Learning = slow.
What idea will fill in the question mark?
Alongside the levels sit the existing bodies of theory: probabilistic machine
learning, the physiology of STDP (STDP = spike timing-dependent plasticity),
and the physics of self-organisation, with a "?" marking the idea still needed.
T. Bell
? = the Levels Hypothesis:
Learning in the brain is:
– unsupervised probability density estimation across scales
– the smaller (molecular) models the larger (spikes)…
suggested by STDP physiology, where information flow
from neurons to synapses is inter-level…
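Since the hypothesis leans on STDP physiology, here is a minimal sketch of the standard pair-based exponential STDP window, a generic textbook form rather than anything specific to the talk; the amplitudes and time constants below are assumptions. Pre-before-post spike pairs strengthen a synapse, post-before-pre pairs weaken it.

```python
import numpy as np

def stdp_dw(delta_t_ms, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window (generic exponential form; parameters assumed).

    delta_t_ms = t_post - t_pre:
      > 0 : pre fires before post -> potentiation (weight increase)
      < 0 : post fires before pre -> depression   (weight decrease)
    """
    dt = np.asarray(delta_t_ms, dtype=float)
    return np.where(dt >= 0.0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

# Weight change for a few pre/post timing differences (in ms)
for d in (-40, -10, 10, 40):
    print(f"t_post - t_pre = {d:+4d} ms  ->  dw = {float(stdp_dw(d)):+.4f}")
```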
Multi-level modeling: networks within networks
– a network of 2 brains
– 1 brain
– a network of neurons
– 1 cell
– a network of protein complexes (e.g., synapses)
– a network of macromolecules
T. Bell
Infomax between Levels
(eg: synapses density-estimate spikes)
   all neural spikes (times t1, t2, …) → synapses, dendrites → y = all synaptic readout
   the pdf of all spike times is modelled via the pdf of all synaptic 'readouts':
   if we can make this pdf uniform, then we have a model
   constructed from all synaptic and dendritic causality
   (a small numerical check follows this comparison)
• between-level
• includes all feedback
• molecular net models/creates
• social net is boundary condition
• permits arbitrary activity dependencies
• models input and intrinsic together

ICA/Infomax between Layers
(eg: V1 density-estimates Retina)
   x = retina → synaptic weights → y = V1
• within-level
• feedforward
• molecular sublevel is 'implementation'
• social superlevel is 'reward'
• predicts independent activity
• only models outside input
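The "make this pdf uniform" criterion is the probability integral transform: pushing data through the cumulative distribution of a candidate model yields uniform outputs exactly when the model's density matches the data. A small check of that fact on an assumed Gaussian toy example (illustrative, not the talk's own demonstration):

```python
import numpy as np
from scipy.stats import norm

# Illustrative check (assumed Gaussian toy data): outputs of a CDF transform
# are uniform on [0, 1] exactly when the transform embodies the data's density.
rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=3.0, size=100_000)

y_matched = norm.cdf(x, loc=2.0, scale=3.0)   # correct density model
y_wrong = norm.cdf(x, loc=0.0, scale=1.0)     # mismatched density model

for name, y in (("matched model", y_matched), ("mismatched model", y_wrong)):
    hist, _ = np.histogram(y, bins=10, range=(0.0, 1.0), density=True)
    print(f"{name:16s} output histogram: {np.round(hist, 2)}")
```

The matched model's output histogram is flat (every bin near 1.0), while the mismatched model's is strongly peaked; a uniform output pdf therefore certifies a density model of the input.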
The ICA transform minimises statistical dependence between its outputs. The
bases produced are data-dependent, not fixed as in Fourier or wavelet
transforms.
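For concreteness, a compact sketch of the Infomax ICA learning rule in its natural-gradient form, applied to a toy two-source mixture; the Laplacian sources, mixing matrix, learning rate, and epoch count are illustrative assumptions, not the talk's own example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy problem: two super-Gaussian (Laplacian) sources mixed by an assumed matrix
n = 20_000
s = rng.laplace(size=(2, n))
a_mix = np.array([[1.0, 0.6],
                  [0.4, 1.0]])        # illustrative mixing matrix
x = a_mix @ s                         # observed mixtures

# Infomax ICA, natural-gradient form:
#   dW = eta * (I + (1 - 2*g(u)) u^T) W,  with u = W x and g the logistic
#   nonlinearity (appropriate for super-Gaussian sources)
w = np.eye(2)
eta, batch = 0.01, 100
for _ in range(30):                   # passes over the data
    for i in range(0, n, batch):
        u = w @ x[:, i:i + batch]
        g = 1.0 / (1.0 + np.exp(-u))
        w += eta * (np.eye(2) + (1.0 - 2.0 * g) @ u.T / batch) @ w

# If unmixing succeeded, W @ A is close to a scaled permutation of the identity
print(np.round(w @ a_mix, 2))
```

Minimising the outputs' statistical dependence in this way is what makes the learned bases adapt to the data, in contrast to fixed Fourier or wavelet bases.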
The Infomax principle/ICA algorithms
Many applications (6 international ICA workshops)…
• audio separation in real acoustic environments (as above)
• biomedical data-mining: EEG, fMRI
• image coding
T. Bell