My Liquid Brain - University of Sussex


The Liquid Brain
Chrisantha Fernando &
Sampsa Sojakka
Motivations
• Only 30,000 genes, ≈10¹¹ neurons
• Attractor neural networks, Turing machines
• Problems with classical models
– Often depend on synchronization by a central clock
– Particular recurrent circuits need to be constructed for each task
– Recurrent circuits often unstable and difficult to regulate
– Lack parallelism
– Real organisms cannot wait for convergence to an attractor
• Wolfgang Maass invented the Liquid State Machine (a model of the cortical microcircuit), in which he viewed the network as a liquid (or liquid-like dynamical system).
Liquid State Machine (LSM)
• Maass’ LSM is a spiking recurrent neural network which satisfies two properties
– Separation property (liquid)
– Approximation property (readout)
• LSM features
– Only attractor is rest
– Temporal integration
– Memoryless linear readout map
– Universal computational power: can approximate any time-invariant filter with fading memory
– Requires no a-priori decision regarding the “neural code” by which information is represented within the circuit
Maass’ Definition of the Separation Property
The current state x(t) of the microcircuit at time t has to hold all information about
preceding inputs.
Approximation Property
Readout can approximate any continuous
function f that maps current liquid states x(t) to
outputs v(t).
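A minimal sketch of these two properties, using simple rate units in place of Maass’ spiking integrate-and-fire neurons (the network size, scalings, and the delay task are illustrative assumptions, not from the talk): the recurrent “liquid” separates input histories into distinct states x(t), and a memoryless linear readout, fitted by least squares, approximates a fading-memory filter of the input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): a 100-unit "liquid" driven by a 1-D input.
N, T = 100, 500
W_in = rng.normal(0.0, 0.5, size=N)                  # input weights
W = rng.normal(0.0, 1.0, size=(N, N)) / np.sqrt(N)   # recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # scale down for fading memory

def run_liquid(u):
    """Drive the recurrent network with input u and record the state x(t)."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)  # rate units stand in for spiking neurons
        states.append(x.copy())
    return np.array(states)

u = rng.uniform(-1.0, 1.0, T)
X = run_liquid(u)        # separation: different input histories -> different states

# Target: a time-invariant filter with fading memory (the input delayed by 3 steps).
y = np.roll(u, 3)

# Approximation: a memoryless linear readout of x(t), fitted by least squares.
w_out, *_ = np.linalg.lstsq(X[10:], y[10:], rcond=None)
mse = np.mean((X[10:] @ w_out - y[10:]) ** 2)
print(mse)
```

The readout never sees past inputs directly; any memory it exploits must already be held in the liquid’s current state, which is exactly the separation property at work.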
• We took the metaphor seriously and made the real liquid brain shown below. WHY? BECAUSE.
• Real water is computationally efficient.
Maass et al. used a small recurrent network of leaky integrate-and-fire neurons
– But it was computationally expensive to model.
– And I had to do quite a bit of parameter tweaking.
• Exploits real physical properties of water.
• Simple local rules, complex dynamics.
• Potential for parallel computation applications.
• Educational aid: demonstration of a physical representation that does computation.
• Contributes to current work on computation in non-linear media, e.g. Adamatsky, database search.
Pattern Recognition in a Bucket
• 8 motors, glass tray, overhead projector
• Web cam to record footage at 320x240, 5 fps
• Frames Sobel-filtered to find edges and averaged to produce 700 outputs
• 50 perceptrons in parallel trained using the p-delta rule
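The data flow from webcam frame to perceptron vote could be sketched roughly as follows. The frame size, block-averaging scheme, and random weights are assumptions for illustration; in particular, the actual p-delta training rule (Auer, Burgsteiner and Maass) is omitted here, so the weights are untrained and only the pipeline shape is shown.

```python
import numpy as np

def sobel_edges(frame):
    """Sobel edge magnitude via explicit 3x3 convolutions (edge-padded)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(frame, 1, mode="edge")
    gx = np.zeros(frame.shape)
    gy = np.zeros(frame.shape)
    for i in range(3):
        for j in range(3):
            win = pad[i:i + frame.shape[0], j:j + frame.shape[1]]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)

def liquid_features(frame, n_out=700):
    """Sobel-filter a frame, then average pixel blocks down to n_out values."""
    edges = sobel_edges(frame).ravel()
    return np.array([b.mean() for b in np.array_split(edges, n_out)])

def ensemble_output(weights, features):
    """50 perceptrons fire in parallel; their +/-1 votes are summed."""
    return np.sign(features @ weights.T).sum()

rng = np.random.default_rng(1)
frame = rng.random((240, 320))        # stand-in for one 320x240 webcam frame
w = rng.normal(size=(50, 700))        # untrained weights, one row per perceptron
print(ensemble_output(w, liquid_features(frame)))
```

The summed vote lies between -50 and +50, matching the range of the perceptron-output plots in the next slides.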
Experiment 1: The XOR Problem
• 2 motors, 1 minute of footage for each case, 3400 frames
• Readouts could utilize wave interference patterns
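For reference, the XOR targets implied by the four driving conditions shown in the plots (class 1 exactly when one motor runs), together with a simple sign threshold on the summed perceptron outputs; the exact decision rule used in the experiment is an assumption here.

```python
# The four driving conditions and their XOR targets:
# the readout should answer 1 exactly when a single motor is on.
conditions = {
    "still water":  (0, 0),
    "motor A only": (1, 0),
    "motor B only": (0, 1),
    "both motors":  (1, 1),
}
targets = {name: a ^ b for name, (a, b) in conditions.items()}

def classify(summed_votes):
    """Threshold the summed output of the 50 perceptrons (range -50..50)."""
    return 1 if summed_votes > 0 else 0

print(targets)
print(classify(37), classify(-12))  # prints "1 0"
```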
[Figure: Correctness (1 - |target output - actual output|) per frame over the 3400-frame training footage, with correct/incorrect bands marked.]
[Figure: Sum of the 50 perceptron outputs (range -50 to 50) per frame, across the still water, both motors, motor A, and motor B conditions.]
Correctness (Generalization)
[Figure: Correctness per frame on a novel input file (4000 frames).]
[Figure: Summed perceptron outputs per frame on the novel input file, across the still water, both motors, motor A, and motor B conditions.]
Can Anyone Guess How it Works?
Experiment 2: Speech Recognition
[Figure: 4 presentations of the word "Zero" and 4 presentations of the word "One", plotted as motor number (frequency) against frame (8 frames per word).]
• Objective: robust spatiotemporal pattern recognition in a noisy environment
• 20+20 samples of 12 kHz pulse-code modulated wave files (“zero” and “one”), 1.5-2 seconds in length
• Short-Time Fourier transform on the active frequency range (1-3000 Hz) to create an 8x8 matrix of inputs from each sample (8 motors, 8 time slices)
• Each sample drove the motors for 4 seconds, one after the other
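The encoding step, sound file to motor drives, might be sketched like this. Only the 12 kHz rate, the 1-3000 Hz active range, and the 8x8 layout come from the slide; the windowing and band-averaging details are assumptions.

```python
import numpy as np

def sample_to_motor_matrix(signal, rate=12000, n_motors=8, n_slices=8, f_max=3000):
    """Reduce a recording to an 8x8 motor-drive matrix:
    8 frequency bands (one per motor) x 8 time slices."""
    mat = np.zeros((n_motors, n_slices))
    for t, chunk in enumerate(np.array_split(signal, n_slices)):
        spectrum = np.abs(np.fft.rfft(chunk))
        freqs = np.fft.rfftfreq(len(chunk), d=1.0 / rate)
        active = spectrum[(freqs >= 1) & (freqs <= f_max)]   # 1-3000 Hz only
        bands = np.array_split(active, n_motors)
        mat[:, t] = [b.mean() for b in bands]                # mean power per band
    return mat / mat.max()                                   # normalise drives to [0, 1]

rng = np.random.default_rng(2)
word = rng.normal(size=12000 * 2)        # stand-in for a 2-second recording
m = sample_to_motor_matrix(word)
print(m.shape)  # prints "(8, 8)"
```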
Perceptrons trained on a dataset NOT passed through the water.
[Figure: MSE against training epochs (0-2000); the final error rates correspond to 25% mistakes in classification.]
Analysis
Conclusion
• Properties of a natural dynamical system
(water) can be harnessed to solve non-linear
pattern recognition problems.
• A set of simple linear readouts suffices.
• No tweaking of parameters required.
• Further work will explore neural networks
which exploit the epigenetic self-organising
physical properties of materials.
Acknowledgements
• Inman Harvey
• Phil Husbands
• Ezequiel Di Paolo
• Emmet Spier
• Bill Bigge
• Aisha Thorn, Hanneke De Jaegher, Mike Beaton
• Sally Milwidsky