www.gfai.de/~heinz
How Network Topology Defines its Behavior: Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V.
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensory and motor homunculus.
Natural History Museum, London
Contents
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, the sound and code analysis of the nervous
system is fascinating
We distinguish the whisper of the wind from the breaking of waves,
we know the songs of birds, we hear the dangerous noises of a defective car
engine, we feel when an airplane takes off
And we speak and understand languages: do we have a chance to
interpret the function of a nerve net at the level of its structure?
We try to analyze one of the simplest delaying networks with a nerve-like structure
Our net consists of delays T and weights W
Based on the Interference Network (IN) abstraction, we transform the net
into the transfer function H of a linear time-invariant system (LTI system)
We use the convolution between the input time function and the transfer function
to find the "behaviour" of the LTI system
* The work is based on the book "Neuronale Interferenzen", ch. 8b, p. 181,
download: www.gfai.de/~heinz/publications/NI/index.htm
Convolution
"Faltung" (German for convolution; the term was coined by Felix Bernstein, 1922):
y(t) = x(t) * h(t) = ∫₀ᵗ x(t − τ) h(τ) dτ
Discrete form (Cauchy product):
y_n = Σ_{k=0..n} h_k · x_{n−k}
Example: an FIR filter is a direct implementation of the convolution, form: Y = X * S
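As an illustrative sketch (not part of the slides), the discrete convolution above can be written directly in Python; an FIR filter computes exactly this sum over its coefficient vector:

```python
def convolve(x, h):
    """Full discrete convolution y[n] = sum_k h[k] * x[n - k] (Cauchy product)."""
    y = [0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):   # skip terms that fall outside x
                y[n] += h[k] * x[n - k]
    return y

print(convolve([1, 2, 3], [1, 1]))    # [1, 3, 5, 3]
```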
A Small Interference Network
Form / our abstraction:
[Diagram: the input x(t) fans out over n paths with delays τ1 … τn and weights w1 … wn, which are summed (+) at the neuron N to the output y(t)]
Delay vector: T = [τ1, τ2, …, τn]
Weight vector: W = [w1, w2, …, wn]
Output function: y(t) = (1/n) Σ_{i=1..n} w_i · x(t − τ_i)
Construction of Transfer Function H
(Transfer function of LTI-system)
The discrete transfer function H is seen as a discrete time function with sample
distance ts = 1/fs and a growing index i:
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
The length of H exceeds the delay difference: length(H) ≥ max(T) − min(T)
The transfer function of the net is constructed by adding the weights at their delay indices:
H(T(i)) = H(T(i)) + W(i), i.e. H(j) = H(j) + W(i) with j = T(i)
Get Transfer Function with Scilab
function [H] = trans(T, W, fs)
  if length(T) == length(W) then
    T = T * fs;                 // apply the sample rate of H
    T = round(T);               // T becomes an integer index
    H = zeros(1, max(T));       // create an empty H
    for i = 1:length(T)         // for all T(i), W(i)
      j = T(i);                 // the delay becomes the H-index j
      H(j) = H(j) + W(i);       // add the weight to H
    end
  else
    printf('\n\nerror: T and W have different size\n');
  end
endfunction
H is the transfer function (pulse response) of an LTI system!
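A minimal Python sketch of the same construction (a hypothetical port of the Scilab trans, assuming fs scales delays in seconds to 1-based sample indices):

```python
def trans(T, W, fs):
    """Build the discrete transfer function H from delays T and weights W."""
    if len(T) != len(W):
        raise ValueError("T and W have different size")
    idx = [round(t * fs) for t in T]   # delay -> integer sample index (1-based)
    H = [0.0] * max(idx)               # create an empty H
    for j, w in zip(idx, W):
        H[j - 1] += w                  # add the weight at the delay position
    return H

# fs = 1: the delays are already sample indices (H is not yet reduced to min(T))
print(trans([5, 3, 8], [1, 0.5, 1], 1))   # [0.0, 0.0, 0.5, 0.0, 1.0, 0.0, 0.0, 1.0]
```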
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simply the convolution with H, the multiplication of the time series:
y(t) = h(t) * x(t)
Using vectors: Y = X * H
Scilab form: Y = convol(H, X)
Fourier analysis of H: F = abs(fft(H))
Barker Codes and Spikes
The Hebbian rule in neuroscience shows that a neuron needs highly
synchronous inputs to learn
We need spikes at the output of the neuron
In radar technology, Barker codes maximize the spike-like output of
long sequences:
Example (Barker code no. 5):
H = [1, 1, 1, -1, 1]
X = rev(H)
Y = convol(X, H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
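The Barker-5 numbers above can be checked with a short, illustrative Python sketch:

```python
def convolve(x, h):
    # full discrete convolution y[n] = sum_k h[k] * x[n - k]
    y = [0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):
                y[n] += h[k] * x[n - k]
    return y

H = [1, 1, 1, -1, 1]      # Barker code no. 5
X = H[::-1]               # time-reversed code as input
print(convolve(X, H))     # [1, 0, 1, 0, 5, 0, 1, 0, 1] -- one sharp spike
```

The central value 5 is the spike; all sidelobes have magnitude at most 1, which is the defining property of a Barker code.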
Spectral Analysis of Transfer Function H
F(e^{jω}) = Σ_n H(n) · e^{−jωn}
The FFT of any unipolar transfer function has its maximum at frequency
f = 0 Hz (DC)
It is not possible to learn with a unipolar H; codes are AC:
[Spectra: the unipolar H {0…1} has its highest level at 0 Hz; the bipolar H {-1…1} does not]
Unipolar or Bipolar Signal Levels?
[Plots comparing signals and spectra for three cases: unipolar signals with unipolar synapses {0…1}; bipolar signals with bipolar synapses {-1…1}; unipolar signals with bipolar synapses {0…1}/{-1…1}]
Unipolar or Bipolar Signal Levels?
Unipolar signals and bipolar synapses (neuron): X, Y unipolar {0…1}; H bipolar {-1…1}
Big surprise: using unipolar signals X, Y and a bipolar H, the system is not
significantly worse than the best case (bi/bi)
Test it: use the related Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect codes and sounds, if
the synapses are bipolar (inhibiting or exciting)!
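A rough numerical check of this conclusion (an illustrative sketch, not from the slides): correlate the unipolar {0, 1} version of Barker-5, where the negative chip becomes silence, against a bipolar weight vector H and look at the peak-to-sidelobe ratio.

```python
def convolve(x, h):
    # full discrete convolution y[n] = sum_k h[k] * x[n - k]
    y = [0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):
                y[n] += h[k] * x[n - k]
    return y

H = [1, 1, 1, -1, 1]         # bipolar synapse weights (Barker-5)
X = [1, 1, 1, 0, 1][::-1]    # unipolar spike input: -1 chips become silence
Y = convolve(X, H)
print(Y)                     # [1, 1, 2, 1, 4, 1, 1, 0, 1]
print(max(Y))                # 4: a clear detection peak survives unipolar signals
```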
Interpreting Bursts
Noisy groups of pulses (bursts) are known from different locations in the nervous system
Is it possible to find the net structure behind them?
The Inverse Procedure
We interpret a burst as the transfer function H (seen as a pulse response)
and reconstruct the delays T and weights W of the network behind it:
function [T, W] = net(H, fs)    // returns T and W
  j = 1;                        // W-index j
  for i = 1:length(H)           // H-index i
    if H(i) == 0 then
      ;                         // do nothing
    else
      W(j) = H(i);              // write the value to W
      T(j) = i;                 // write the index to T
      j = j + 1;                // increment j
    end
  end
  T = T ./ fs;                  // divide by the sample rate: index becomes time
  T = T - min(T);               // shift to the minimum: reduced T-vector
endfunction
Example H = f(T,W)
Delays T, weights W, transfer function H; reduced vectors carry the index R
Delays: T = [τ1, τ2, τ3] = [5, 3, 8]
Weights: W = [w1, w2, w3] = [1, 0.5, 1]
Reduced: TR = [τR1, τR2, τR3] = [0, 2, 5]
WR = [wR1, wR2, wR3] = [0.5, 1, 1]
Transfer function: H = (w2, 0, w1, 0, 0, w3) = (0.5, 0, 1, 0, 0, 1)
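The example can be checked with a small Python round trip (hypothetical ports of the Scilab trans and net; fs = 1 so that delays equal sample indices):

```python
def trans(T, W, fs):
    # delays/weights -> discrete transfer function H (1-based index -> 0-based list)
    idx = [round(t * fs) for t in T]
    H = [0.0] * max(idx)
    for j, w in zip(idx, W):
        H[j - 1] += w
    return H

def net(H, fs):
    # transfer function H -> reduced delay vector T and weight vector W
    T = [i + 1 for i, h in enumerate(H) if h != 0]
    W = [h for h in H if h != 0]
    m = min(T)
    T = [(t - m) / fs for t in T]   # shift so the smallest delay is 0
    return T, W

H = trans([5, 3, 8], [1, 0.5, 1], 1)
print(H)            # [0.0, 0.0, 0.5, 0.0, 1.0, 0.0, 0.0, 1.0]
print(net(H, 1))    # ([0.0, 2.0, 5.0], [0.5, 1.0, 1.0]) -- TR and WR from the slide
```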
Example
[Plot: key X and keyhole H, both unipolar; max(FFT) at 0 Hz (the uni/uni case)]
Conclusion
To characterize the time and frequency domain, we transform the delays and
weights of one of the simplest interference networks into an LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from the delay vector T (delay mask)
and the weight vector W
The FFT shows learning problems for unipolar signals and a unipolar H
because of the dominant DC value
A mixture of unipolar signals and a bipolar transfer function (weights)
is a good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from the transfer function H
Find the Scilab sources and the paper on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows that even the smallest delays and delay differences change the
pulse response H of the network
Recalling the "Neural Networks" (NN, ANN) approach with layers clocked
in cycles, we find that the NN approach completely destroys the
sequential structure of the network
In no case are ANN or NN candidates for understanding the function of
nerve-like structures
Thinking about nerves, we need interferential approaches that do not
destroy the delay structure of the net.
And the Lord spoke: "Thus I led you
onto the path of knowledge.
Go now, and carry the message
out into the world!"
Many thanks for your attention!
Successful Google search terms:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
Slide 2
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 3
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 4
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize the time and frequency domain, we transform the delays and weights of a simplest interference network into an LTI pulse response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer function H (pulse response) of the net from the delay vector T (delay mask) and the weight vector W
The FFT reveals learning problems for unipolar signals with unipolar H, because the DC value dominates the spectrum
A mixture of unipolar signals and a bipolar transfer function (weights) is a good alternative (as in nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an inverse procedure [T,W] = net(H,fs) that reconstructs the net structure [T,W] from the transfer function H
The Scilab sources and the paper are available on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
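The DC-dominance argument from the conclusion can be verified numerically. The sketch below computes DFT magnitudes by hand (standard library only); the unipolar and bipolar weight vectors are illustrative choices based on the Barker-5 code, not data from the slides:

```python
import cmath

def dft_mag(h):
    """Magnitudes of the discrete Fourier transform of h."""
    N = len(h)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n, x in enumerate(h)))
            for k in range(N)]

uni = [1, 1, 1, 0, 1]    # unipolar weights {0...1}
bi  = [1, 1, 1, -1, 1]   # bipolar Barker-5 weights {-1...1}

# unipolar:  |DFT| = [4, 1, 1, 1, 1] -> the DC bin towers over the rest
# bipolar:   |DFT| = [3, 2, 2, 2, 2] -> the DC value shrinks, the gap closes
```

With purely unipolar weights the DC bin dominates every other bin (4 vs. 1 here); with bipolar weights the spectrum flattens (3 vs. 2), which is the advantage the slides attribute to mixing excitatory and inhibitory synapses.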
Relevance for ANN
The transfer function (pulse response) H is responsible for all sequential properties of a network: for code and sound generation and detection
The lecture shows that even the smallest delays and delay differences change the pulse response H of the network
Recalling the "Neural Networks" (NN, ANN) approach, with layers driven by clock cycles, we find that the NN approach completely destroys the sequential structure of the network
ANN and NN are therefore in no case candidates for understanding the function of nerve-like structures
Thinking about nerves, we need interferential approaches that do not destroy the delay structure of the net.
And the Lord spoke: "Thus I led you onto the path of knowledge. Go now, and carry the message out into the world!"
Many thanks for your attention!
Successful Google search terms: "Interferenznetze", "Mathematik des Nervensystems", "Heinz", "Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr. 3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
Slide 8
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);    // returns T and W
  j = 1;                       // W-index j
  for i = 1:length(H)          // H-index i
    if H(i) == 0 then ;        // do nothing
    else                       // write the value to W, the index to T
      W(j) = H(i);             // value to W
      T(j) = i;                // index to T
      j = j + 1;               // increment j
    end;                       // endif
  end;                         // endfor
  T = T ./ fs;                 // multiply with the sample duration ts = 1/fs
  T = T - min(T);              // shift to min: reduced T-vector
endfunction;
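For readers without Scilab, a minimal Python equivalent of net() (my re-implementation, not the original source) shows the same reconstruction:

```python
def net(H, fs):
    # Reconstruct the delay vector T (in seconds) and the weight vector W
    # from a discrete pulse response H sampled at rate fs.
    T, W = [], []
    for i, h in enumerate(H):
        if h != 0:          # every non-zero sample marks one synapse
            W.append(h)     # the sample value is the weight
            T.append(i)     # the sample index is the delay
    T = [t / fs for t in T]        # index -> time (multiply by ts = 1/fs)
    tmin = min(T)
    T = [t - tmin for t in T]      # reduced T-vector: smallest delay = 0
    return T, W

# H = (0.5, 0, 1, 0, 0, 1) from the example on the next slide, fs = 1:
print(net([0.5, 0, 1, 0, 0, 1], 1))   # ([0.0, 2.0, 5.0], [0.5, 1, 1])
```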
Example: H = f(T,W)
Delays T, weights W, transfer function H; reduced vectors carry the index R:
Delays:        T  = [τ1, τ2, τ3]    = [5, 3, 8]
Weights:       W  = [w1, w2, w3]    = [1, 0.5, 1]
Reduced T, W:  TR = [τR1, τR2, τR3] = [0, 2, 5]
               WR = [wR1, wR2, wR3] = [0.5, 1, 1]
Transfer function:
H = (w2, 0, w1, 0, 0, w3) = (0.5, 0, 1, 0, 0, 1)
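The same numbers can be checked with a small Python sketch of the forward construction H(T(i)) = H(T(i)) + W(i) (a hedged re-implementation, not the original Scilab trans):

```python
def trans(T, W, fs=1):
    # Build the discrete pulse response H from delays T and weights W:
    # each weight is added at the sample index of its (reduced) delay.
    idx = [round(t * fs) for t in T]    # delay -> integer sample index
    shift = min(idx)                    # reduce: smallest delay -> index 0
    H = [0.0] * (max(idx) - shift + 1)
    for i, w in zip(idx, W):
        H[i - shift] += w               # superpose weights falling on equal delays
    return H

print(trans([5, 3, 8], [1, 0.5, 1]))    # [0.5, 0.0, 1.0, 0.0, 0.0, 1.0]
```

Note that the reduction to the smallest delay is done here directly, so the result matches the reduced vectors TR, WR above.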
Example
Key X and keyhole H, both unipolar (uni/uni); max(FFT) at 0 Hz.
(Figure: key/keyhole signals and their spectra)
Conclusion
To characterize the time and frequency domain, we transform the delays and weights of a simplest interference network into an LTI transfer response.
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer function H (pulse response) of the net from the delay vector T (delay mask) and the weight vector W.
The FFT reveals learning problems for unipolar signals and a unipolar H, because the DC value is the highest.
A mixture of unipolar signals and a bipolar transfer function (weights) is a good alternative (nerve nets).
Interpreting bursts as transfer functions (pulse responses), we design an inverse procedure [T,W] = net(H,fs) that reconstructs the net structure [T,W] from the transfer function H.
Find the Scilab sources and the paper on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN
The transfer function (pulse response) H is responsible for all sequential properties of a network: for code and sound generation or detection.
The lecture shows that even the smallest delays and delay differences change the pulse response H of the network.
Recalling the "Neural Networks" (NN, ANN) approach with layers clocked by clock cycles, we find that the NN approach completely destroys the sequential structure of the network.
ANN or NN are therefore in no case candidates for understanding the function of nerve-like structures.
Thinking about nerves, we need interferential approaches that do not destroy the delay structure of the net.
And the Lord spoke: "Thus I led you onto the path of knowledge. Go now, and carry the message out into the world!"
Thank you for your attention!
Successful Google search terms:
"Interferenznetze", "Mathematik des Nervensystems", "Heinz", "Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 9
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 10
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 11
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 12
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals {0…1}, unipolar synapses {0…1}:
Unipolar or Bipolar Signal Levels?
Bipolar signals {-1…1}, bipolar synapses {-1…1}:
Unipolar or Bipolar Signal Levels?
Unipolar signals {0…1}, bipolar synapses {-1…1} (neuron):
Unipolar or Bipolar Signal Levels?
Unipolar signals and bipolar synapses (neuron):
X, Y: unipolar {0…1}
H: bipolar {-1…1}
Big surprise:
With unipolar signals X, Y and a bipolar H, the system is not significantly
worse than the best case uni/uni
Test it:
Use the related Scilab sources at
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound if
the synapses are bipolar (inhibiting or exciting)!
Interpreting Bursts
Noisy groups of pulses (bursts) are observed at various locations in the nerve system
Is it possible to find the net structure behind them?
The Inverse Procedure
We interpret a burst as transfer function H (seen as pulse response)
and reconstruct the delays T and weights W of the network behind it:
function [T, W] = net(H, fs)
    // returns the delays T and weights W of the net behind pulse response H
    j = 1;                        // W-index j
    for i = 1:length(H)           // H-index i
        if H(i) <> 0 then         // nonzero taps carry a weight
            W(j) = H(i);          // value to W
            T(j) = i;             // index to T
            j = j + 1;            // increment j
        end
    end
    T = T ./ fs;                  // indices become delays (multiply by sample duration)
    T = T - min(T);               // shift to min: reduced T-vector
endfunction
Example H = f(T, W)
Delays T, weights W, transfer function H; reduced vectors carry index R:
Delays:        T  = [τ1, τ2, τ3]    = [5, 3, 8]
Weights:       W  = [w1, w2, w3]    = [1, 0.5, 1]
Reduced T, W:  TR = [τR1, τR2, τR3] = [0, 2, 5]
               WR = [wR1, wR2, wR3] = [0.5, 1, 1]
Transfer function:
H = (w2, 0, w1, 0, 0, w3) = (0.5, 0, 1, 0, 0, 1)
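The inverse procedure applied to this example can be sketched in Python (a hypothetical equivalent of the Scilab net above, with 0-based indices):

```python
def net(H, fs):
    """Recover reduced delays T and weights W from pulse response H."""
    T, W = [], []
    for i, h in enumerate(H):
        if h != 0:               # nonzero taps carry a weight
            T.append(i)          # index to T
            W.append(h)          # value to W
    T = [t / fs for t in T]      # indices become delays
    m = min(T)
    T = [t - m for t in T]       # shift to min: reduced T-vector
    return T, W

T, W = net([0.5, 0, 1, 0, 0, 1], fs=1)
print(T, W)  # TR = [0, 2, 5], WR = [0.5, 1, 1]
```

The reduced delay mask TR and the weights WR of the slide are recovered exactly; only the common offset min(T) of the original delays is lost, as expected.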
Example
Key X and keyhole H, unipolar (uni/uni); max(FFT) at 0 Hz
Conclusion
To characterize time and frequency domain, we transform the delays and
weights of a simplest interference network into an LTI pulse response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from the delay vector T (delay mask)
and the weight vector W
The FFT reveals learning problems for unipolar signals with a unipolar H,
because the DC value dominates the spectrum
A mixture of unipolar signals and a bipolar transfer function (weights)
is a good alternative (as in nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from the transfer function H
The Scilab sources and the paper are available on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows that even the smallest delays and delay differences change
the pulse response H of the network
Recalling the "Neural Networks" (NN, ANN) approach with layers clocked
in discrete cycles, we find that the NN approach completely destroys the
sequential structure of the network
ANN or NN are in no case candidates for understanding the function of
nerve-like structures
Thinking about nerves, we need interferential approaches that do not
destroy the delay structure of the net.
And the Lord said: "Thus I led you
on the path of knowledge.
Go now, and carry the message
out into the world!"
Many thanks for your
attention!
Successful Google search terms:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
Slide 13
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 14
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 15
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 16
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T, W, fs);
  if length(T) == length(W) then
    T = T * fs;                // apply sample rate of H
    T = round(T);              // T becomes an index: integer
    H = 1:max(T); H = H * 0;   // create an empty H
    for i = 1:length(T),       // for all T(i), W(i)
      j = T(i);                // delay becomes the H-index j
      H(j) = H(j) + W(i);      // add the weight to H
    end                        // for
  else
    printf('\n\nerror: T and W have different size\n');
  end                          // if
endfunction;
H is the transfer function of an LTI system!
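A rough Python port of the Scilab trans() above, for illustration only (Python lists are 0-based, so the 1-based Scilab index is shifted by one):

```python
def trans(T, W, fs):
    """Build the discrete transfer function H from delays T (seconds),
    weights W and sample rate fs, mirroring the Scilab trans()."""
    if len(T) != len(W):
        raise ValueError("T and W have different size")
    idx = [round(t * fs) for t in T]   # delay -> integer sample index
    H = [0.0] * max(idx)               # empty H, long enough for max delay
    for j, w in zip(idx, W):
        H[j - 1] += w                  # 1-based Scilab index, shifted
    return H

# Slide-17 values: weights land at sample indices 3, 5 and 8 (1-based)
print(trans([5, 3, 8], [1, 0.5, 1], 1))
```

The nonzero entries of the returned H carry the weights at their (rounded) delay positions.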
Applying a Convolution
What is the system answer Y for different input functions X?
It is simply the convolution with H, the Cauchy product of the time series:
y(t) = h(t) * x(t)
Using vectors: Y = X * H
[Block diagram: X → H → Y]
Scilab form: Y = convol(H, X)
Fourier Analysis of H
F = abs(fft(H))
Barker Codes and Spikes
The Hebbian rule in neuroscience shows that a neuron needs highly synchronous input to learn
We need spikes at the output of the neuron
Barker codes, used in RADAR technology, maximize the spike-like output of long sequences:
Example:
H = [1, 1, 1, -1, 1] (Barker code no. 5)
X = rev(H)
Y = convol(X, H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
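The slide's Barker numbers can be reproduced in a few lines of plain Python (a sketch, not the author's code):

```python
H = [1, 1, 1, -1, 1]                  # Barker code no. 5
X = H[::-1]                           # the "key": time-reversed code, X = rev(H)
# Y = convol(X, H): discrete convolution, i.e. the code's autocorrelation
Y = [sum(X[i] * H[k - i] for i in range(len(X)) if 0 <= k - i < len(H))
     for k in range(len(X) + len(H) - 1)]
print(Y)  # [1, 0, 1, 0, 5, 0, 1, 0, 1]
```

A single dominant spike of height 5 appears, with all sidelobes at 0 or 1, which is exactly the property Barker codes are chosen for.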
Spectral Analysis of Transfer Function H
F(e^{jω}) = Σₙ H(n) · e^{−jωn}
The FFT of any unipolar transfer function shows its maximum at frequency f = 0 Hz (DC)
It is not possible to learn with a unipolar H; codes are AC:
[Spectra: a unipolar H {0…1} has its highest level at 0 Hz; a bipolar H {−1…1} has not]
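The DC claim is easy to verify; a Python sketch with a naive DFT in place of Scilab's fft (example codes are ours):

```python
import cmath

def spectrum(H):
    """|DFT(H)| per frequency bin, i.e. abs(fft(H)), computed naively."""
    N = len(H)
    return [abs(sum(h * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n, h in enumerate(H))) for k in range(N)]

uni = [1, 1, 1, 0, 1]    # unipolar code {0...1}
bi  = [1, -1, 1, -1]     # bipolar code {-1...1}

F_uni, F_bi = spectrum(uni), spectrum(bi)
print(max(F_uni) == F_uni[0])   # True: the unipolar maximum sits at 0 Hz (DC)
print(F_bi[0])                  # 0.0: this bipolar code has no DC at all
```

For any nonnegative H the DC bin equals the sum of all samples, so it can never be exceeded by another bin; a bipolar code can cancel to zero at DC.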
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses: {0…1}
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses: {−1…1}
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron): signals {0…1}, synapses {−1…1}
Unipolar or Bipolar Signal Levels?
Unipolar signals and bipolar synapses (neuron):
X, Y: unipolar {0…1}
H: bipolar {−1…1}
Big surprise: using unipolar signals X, Y and a bipolar H, the system is not significantly worse than the best case uni/uni
Test it: use the related Scilab sources at
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound if the synapses are bipolar (inhibitory or excitatory)!
Interpreting Bursts
Noisy groups of pulses are observed at different locations in the nervous system
Is it possible to find the net structure behind them?
The Inverse Procedure
We interpret a burst as a transfer function H (seen as a pulse response) and recover the delays T and weights W of the network behind it:
function [T, W] = net(H, fs);   // returns T and W
  j = 1;                        // W-index j
  for i = 1:length(H)           // H-index i
    if H(i) == 0 then ;         // do nothing
    else                        // write the value to W, the index to T
      W(j) = H(i);              // value to W
      T(j) = i;                 // index to T
      j = j + 1;                // increment j
    end;                        // endif
  end;                          // endfor
  T = T ./ fs;                  // divide by fs: multiply with sample duration
  T = T - min(T);               // shift to min: reduced T-vector
endfunction;
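A Python sketch of the same inverse procedure (our port, keeping the 1-based indexing convention of the Scilab original):

```python
def net(H, fs):
    """Recover delays T (seconds) and weights W from a pulse response H:
    every nonzero sample becomes a (delay, weight) pair."""
    T, W = [], []
    for i, h in enumerate(H, start=1):   # 1-based, like Scilab indices
        if h != 0:
            T.append(i)                  # index -> delay (in samples)
            W.append(h)                  # value -> weight
    T = [t / fs for t in T]              # samples -> seconds
    tmin = min(T)
    T = [t - tmin for t in T]            # shift to min: reduced T-vector
    return T, W

# Round trip with the later example H = (0.5, 0, 1, 0, 0, 1):
T, W = net([0.5, 0, 1, 0, 0, 1], fs=1)
print(T, W)  # [0.0, 2.0, 5.0] [0.5, 1, 1]
```

The result matches the reduced vectors T_R = [0, 2, 5] and W_R = [0.5, 1, 1] of the worked example on the next slide.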
Example H = f(T,W)
Delays T, weights W, transfer function H; reduced vectors carry the index R
Delays: T = [τ₁, τ₂, τ₃] = [5, 3, 8]
Weights: W = [w₁, w₂, w₃] = [1, 0.5, 1]
Reduced T, W: T_R = [τ_R1, τ_R2, τ_R3] = [0, 2, 5]; W_R = [w_R1, w_R2, w_R3] = [0.5, 1, 1]
Transfer function: H = (w₂, 0, w₁, 0, 0, w₃) = (0.5, 0, 1, 0, 0, 1)
Example
Key X and keyhole H, unipolar; max(FFT) at 0 Hz (uni/uni)
Conclusion
To characterize the time and frequency domain, we transform the delays and weights of a simple interference network into an LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer function H (pulse response) of the net from the delay vector T (delay mask) and the weight vector W
The FFT shows learning problems for unipolar signals and a unipolar H, because the DC value dominates
A mixture of unipolar signals and a bipolar transfer function (weights) is a good alternative (as in nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an inverse procedure [T,W] = net(H,fs) that reconstructs the net structure [T,W] from the transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential properties of a network: for code and sound generation or detection
The lecture shows that even the smallest delays and delay differences change the pulse response H of the network
Recalling the "Neural Networks" (NN, ANN) approach, with layers clocked in cycles, we find that the NN approach completely destroys the sequential structure of every network
In no case are ANN or NN candidates for understanding the function of nerve-like structures
Thinking about nerves, we need interferential approaches that do not destroy the delay structure of the net.
And the Lord spoke: "Thus I led you onto the path of knowledge. Go now, and carry the message out into the world!"
Many thanks for your attention!
Successful Google search terms: "Interferenznetze", "Mathematik des Nervensystems", "Heinz", "Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 17
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 18
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 19
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 20
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (term coined by Felix Bernstein, 1922):
y(t) = x(t) * h(t) = ∫_0^t x(t − τ) · h(τ) dτ
Discrete form (Cauchy product):
y_n = Σ_{k=0}^{n} h_k · x_{n−k}
Example: the FIR filter is a direct implementation of convolution, form: Y = X * S
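The discrete form above can be checked with a few lines of code. The deck's own sources are Scilab; the following is a minimal Python sketch of the Cauchy product (the function name convolve is ours, not from the slides):

```python
def convolve(x, h):
    """Discrete convolution y_n = sum_k h_k * x_(n-k) (the Cauchy product)."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):
                y[n] += h[k] * x[n - k]
    return y

# A 2-tap FIR filter (h = [1, 1]) applied to x = [1, 2, 3]:
print(convolve([1, 2, 3], [1, 1]))  # [1.0, 3.0, 5.0, 3.0]
```

This is exactly the structure of the FIR filter pictured on the slide: each output sample is a weighted sum of delayed input samples.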
A Small Interference Network
Form and abstraction: n paths lead from the input x(t) to the output y(t); path i carries a delay τ_i (between the nodes N and N') and a weight w_i, and the delayed, weighted copies are summed. (figure)
Delay vector: T = [τ_1, τ_2, ..., τ_n]
Weight vector: W = [w_1, w_2, ..., w_n]
Output: y(t) = (1/n) · Σ_{i=1}^{n} w_i · x(t − τ_i)
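For sampled signals the output equation can be simulated directly. This is a sketch under our own naming (in_response is not from the slides); each delay becomes an index shift of the input:

```python
def in_response(x, delays, weights):
    """y[k] = (1/n) * sum_i w_i * x[k - d_i] for a sampled input x (zero-padded)."""
    n = len(delays)
    y = [0.0] * (len(x) + max(delays))
    for d, w in zip(delays, weights):
        for k in range(len(x)):
            y[k + d] += w * x[k] / n   # delayed, weighted copy of the input
    return y

# A unit pulse through the net with T = [0, 2, 5] samples and W = [0.5, 1, 1]:
y = in_response([1], [0, 2, 5], [0.5, 1, 1])
# scaled by n = 3, the pulse response reads [0.5, 0, 1, 0, 0, 1]
print([round(3 * v, 6) for v in y])
```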
Construction of Transfer Function H
(Transfer function of the LTI system)
The discrete transfer function H is seen as a discrete time function with sample distance ts = 1/fs (fs = 1/ts) and a growing index i:
i = [..., 2, 3, 4, 5, 6, 7, 8, 9, ...]
H = [..., wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, ...]
The length of H is at least the delay difference: length(H) ≥ max(T) − min(T)
The transfer function of the net is constructed by adding the weights at the delay indices: H(j) = H(j) + W(i) with j = T(i), i.e. H(T(i)) = H(T(i)) + W(i)
Get Transfer Function with Scilab
function [H] = trans(T,W,fs)
    if length(T) == length(W) then
        T = T * fs;               // apply the sample rate of H
        T = round(T);             // T becomes an index: integer
        H = 1:max(T); H = H * 0;  // create an empty H
        for i = 1:length(T)       // for all T(i), W(i)
            j = T(i);             // the delay becomes the H-index j
            H(j) = H(j) + W(i);   // add the weight to H
        end
    else
        printf('\n\nerror: T and W have different size\n');
    end
endfunction
H is the transfer function of an LTI system!
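For readers without Scilab, here is a hypothetical Python port. Unlike the Scilab listing it uses 0-based indexing and reduces the delays to min(T), matching the "reduced" vectors used in the example slide; the name trans_py is ours:

```python
def trans_py(T, W, fs):
    """Build the discrete transfer function H from delays T (seconds) and weights W."""
    if len(T) != len(W):
        raise ValueError("T and W have different size")
    idx = [round(t * fs) for t in T]   # delay times -> integer sample indices
    base = min(idx)
    idx = [i - base for i in idx]      # reduce to the smallest delay
    H = [0.0] * (max(idx) + 1)
    for i, w in zip(idx, W):
        H[i] += w                      # add the weight at the delay index
    return H

print(trans_py([5, 3, 8], [1, 0.5, 1], 1))  # [0.5, 0.0, 1.0, 0.0, 0.0, 1.0]
```

The printed vector is the deck's later example H = (0.5, 0, 1, 0, 0, 1).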
Applying a Convolution
What is the system answer Y for different input functions X?
It is simply the convolution of the input with H, a multiplication of time series:
y(t) = h(t) * x(t)
Using vectors: Y = X * H
Scilab form: Y = convol(H,X)
Fourier analysis of H: F = abs(fft(H))
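One way to see why H is also called the pulse response: convolving a unit impulse with H returns H itself. A small sketch (our own inline code, not from the slides):

```python
def system_answer(X, H):
    """Y = X * H, computed term by term."""
    return [sum(H[k] * X[n - k] for k in range(len(H)) if 0 <= n - k < len(X))
            for n in range(len(X) + len(H) - 1)]

H = [0.5, 0, 1, 0, 0, 1]
print(system_answer([1], H))  # the unit impulse reproduces H
```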
Barker Codes and Spikes
The Hebbian rule in neuroscience indicates that a neuron needs highly synchronous input to learn.
We need spikes at the output of the neuron.
In RADAR technology, Barker codes maximize the spike-like output of long sequences.
Example (Barker code no. 5):
H = [1, 1, 1, -1, 1]
X = rev(H)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values! What can we do?
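The claimed output can be reproduced: convolving the Barker-5 code with its reversal is its autocorrelation, giving a sharp peak of 5 with sidelobes of magnitude at most 1. A quick check (convolve is our helper, not a slide function):

```python
def convolve(x, h):
    return [sum(h[k] * x[n - k] for k in range(len(h)) if 0 <= n - k < len(x))
            for n in range(len(x) + len(h) - 1)]

H = [1, 1, 1, -1, 1]   # Barker code no. 5
X = H[::-1]            # rev(H)
print(convolve(X, H))  # [1, 0, 1, 0, 5, 0, 1, 0, 1]
```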
Spectral Analysis of Transfer Function H
F(e^{jω}) = Σ_n H(n) · e^{−jωn}
The FFT of any unipolar transfer function shows its maximum at frequency f = 0 Hz (DC).
It is not possible to learn with a unipolar H; codes are AC, but the highest level sits at 0 Hz.
(figure: spectra of a unipolar {0…1} and a bipolar {-1…1} transfer function)
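The DC dominance is easy to demonstrate: for a nonnegative H, |F(0)| = Σ H(n) bounds every other bin by the triangle inequality. A small DFT-magnitude sketch in Python (dft_mag is our own name):

```python
import cmath

def dft_mag(h):
    """Magnitudes of the discrete Fourier transform of h."""
    N = len(h)
    return [abs(sum(h[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n in range(N))) for k in range(N)]

uni = [1, 0, 1, 0, 1, 0]     # unipolar {0...1}: maximum at the DC bin
bi  = [1, -1, 1, -1, 1, -1]  # bipolar {-1...1}, zero mean: DC bin is zero
print(round(dft_mag(uni)[0], 6), round(dft_mag(bi)[0], 6))
```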
Unipolar or Bipolar Signal Levels?
Three simulated cases (figures in the original slides):
Unipolar signals, unipolar synapses: levels {0…1}
Bipolar signals, bipolar synapses: levels {-1…1}
Unipolar signals, bipolar synapses (neuron): signals {0…1}, weights {-1…1}
Unipolar or Bipolar Signal Levels?
Unipolar signals and bipolar synapses (neuron): X, Y unipolar {0…1}; H bipolar {-1…1}
The big surprise: using unipolar signals X, Y and a bipolar H, the system is not significantly worse than the best case uni/uni.
Test it with the relevant Scilab sources under:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion: nerve systems do not need bipolar signals to detect codes and sounds, if the synapses are bipolar (inhibiting or exciting)!
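The uni/bi combination can be sketched numerically: map the reversed Barker-5 code to unipolar levels {0, 1}, keep H bipolar, and the convolution still shows a clear detection peak. An illustrative construction of our own, not taken from the slides:

```python
def convolve(x, h):
    return [sum(h[k] * x[n - k] for k in range(len(h)) if 0 <= n - k < len(x))
            for n in range(len(x) + len(h) - 1)]

H = [1, 1, 1, -1, 1]                 # bipolar synapses (weights)
X = [(v + 1) // 2 for v in H[::-1]]  # unipolar input: rev(H) mapped to {0, 1}
Y = convolve(X, H)
print(Y)                             # [1, 1, 2, 1, 4, 1, 1, 0, 1]
```

The peak drops from 5 to 4, but it remains the unique maximum, in line with the slide's observation.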
Interpreting Bursts
Noisy groups of pulses (bursts) are observed at different locations in the nerve system.
Is it possible to find the net structure behind them?
The Inverse Procedure
We interpret a burst as a transfer function H (seen as a pulse response) and reproduce the delays T and weights W of the network behind it:
function [T,W] = net(H,fs)    // returns T and W
    j = 1;                    // W-index j
    for i = 1:length(H)       // H-index i
        if H(i) == 0 then     // do nothing
        else                  // write the value to W, the index to T
            W(j) = H(i);      // value to W
            T(j) = i;         // index to T
            j = j + 1;        // increment j
        end
    end
    T = T ./ fs;              // multiply by the sample duration
    T = T - min(T);           // shift to the minimum: reduced T-vector
endfunction
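A hypothetical Python port (net_py, 0-based, our naming) makes the round trip checkable: decoding a pulse response must return the reduced delays and weights it was built from.

```python
def net_py(H, fs):
    """Recover reduced delays T (seconds) and weights W from a pulse response H."""
    T = [i / fs for i, h in enumerate(H) if h != 0]  # indices of nonzero taps
    W = [h for h in H if h != 0]                     # their values
    t0 = min(T)
    return [t - t0 for t in T], W                    # reduced T-vector

# Round trip with the deck's example H = (0.5, 0, 1, 0, 0, 1) at fs = 1:
T, W = net_py([0.5, 0, 1, 0, 0, 1], 1)
print(T, W)  # [0.0, 2.0, 5.0] [0.5, 1, 1]
```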
Example H = f(T,W)
Delays T, weights W, transfer function H; the reduced vectors carry the index R.
Delays: T = [τ_1, τ_2, τ_3] = [5, 3, 8]
Weights: W = [w_1, w_2, w_3] = [1, 0.5, 1]
Reduced: TR = [τ_R1, τ_R2, τ_R3] = [0, 2, 5]
WR = [w_R1, w_R2, w_R3] = [0.5, 1, 1]
Transfer function: H = (w_2, 0, w_1, 0, 0, w_3) = (0.5, 0, 1, 0, 0, 1)
Example
Key X and keyhole H, unipolar case (uni/uni): max(FFT) at 0 Hz. (figure)
Conclusion
To characterize the time and frequency domain, we transform the delays and weights of a simplest interference network into an LTI pulse response.
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer function H (pulse response) of the net from the delay vector T (delay mask) and the weight vector W.
The FFT shows learning problems for unipolar signals and a unipolar H, because the DC value dominates.
A mixture of unipolar signals and a bipolar transfer function (weights) acts as a good alternative (nerve nets).
Interpreting bursts as transfer functions (pulse responses), we design an inverse procedure [T,W] = net(H,fs) that reconstructs the net structure [T,W] from the transfer function H.
Find the Scilab sources and the paper on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential properties of a network: for code and sound generation or detection.
The lecture shows that even the smallest delays and delay differences change the pulse response H of the network.
Recalling the "Neural Networks" (NN, ANN) approach with layers clocked by cycles, we find that the NN approach completely destroys the sequential structure of each network.
In no case are ANN or NN candidates for understanding the function of nerve-like structures.
Thinking about nerves, we need interferential approaches that do not destroy the delay structure of the net.
And the Lord said: "Thus I led you onto the path of knowledge. Go now, and carry the message into the world!"
Thank you for your attention!
Successful Google search terms: "Interferenznetze", "Mathematik des Nervensystems", "Heinz", "Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr. 3, 12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 21
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 2
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 6
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows that even the smallest delays and delay differences
change the pulse response H of the network
Recalling the "Neural Networks" (NN, ANN) approach, with layers clocked
in cycles, we find that the NN approach completely destroys the
sequential structure of every network
In no case are ANN or NN candidates for understanding the function of
nerve-like structures
Thinking about nerves, we need interferential approaches that do not
destroy the delay structure of the net.
And the Lord spoke: "Thus I have led
you onto the path of knowledge.
Go now, and carry the message
out into the world!"
Thank you very much for your
attention!
Successful Google search terms:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 7
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 8
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 9
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 10
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H; reduced vectors carry index R:
Delays:        T  = [τ1, τ2, τ3]    = [5, 3, 8]
Weights:       W  = [w1, w2, w3]    = [1, 0.5, 1]
Reduced T, W:  TR = [τR1, τR2, τR3] = [0, 2, 5]
               WR = [wR1, wR2, wR3] = [0.5, 1, 1]
Transfer function:
H = (w2, 0, w1, 0, 0, w3) = (0.5, 0, 1, 0, 0, 1)
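The forward construction can be sketched the same way (an assumption for illustration: unlike the deck's trans(T,W,fs), this Python version folds in the min-delay reduction, so it returns the reduced H directly):

```python
# Hypothetical Python sketch of the forward construction H = f(T, W).
def trans(T, W, fs=1.0):
    """Build the sampled pulse response H from delays T and weights W."""
    idx = [round(t * fs) for t in T]   # delays -> integer sample indices
    lo = min(idx)
    idx = [i - lo for i in idx]        # reduce: smallest delay becomes index 0
    H = [0.0] * (max(idx) + 1)
    for i, w in zip(idx, W):
        H[i] += w                      # coincident delays add their weights
    return H

print(trans([5, 3, 8], [1, 0.5, 1]))   # [0.5, 0.0, 1.0, 0.0, 0.0, 1.0]
```

With T = [5, 3, 8] and W = [1, 0.5, 1] this reproduces the H = (0.5, 0, 1, 0, 0, 1) of the example.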
Example
Key X and keyhole H, both unipolar (uni/uni): max(FFT) at 0 Hz
(Figure: key/keyhole signals and their spectrum)
Conclusion
To characterize the time and frequency domain, we transform the delays and
weights of a simplest interference network into an LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from the delay vector T (delay mask)
and the weight vector W
The FFT shows learning problems for unipolar signals and unipolar H,
because the DC value dominates
A mixture of unipolar signals and a bipolar transfer function (weights)
is a good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from the transfer function H
Find the Scilab sources and the paper on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows that even the smallest delays and delay differences change
the pulse response H of the network
Recalling the "Neural Networks" (NN, ANN) approach with layers clocked by
clock cycles, we find that the NN approach completely destroys the
sequential structure of the network
In no case are ANN or NN candidates for understanding the function of
nerve-like structures
Thinking about nerves, we need interferential approaches that do not
destroy the delay structure of the net.
And the Lord spoke: "Thus I led you
onto the path of knowledge.
Go now, and carry the message
out into the world!"
Thank you for your attention!
Successful Google search terms:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr. 3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
Slide 14
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
Unipolar signals and bipolar synapses (neuron)
X, Y: unipolar {0…1}
H: bipolar {-1…1}
Big surprise:
Using unipolar signals X, Y and a bipolar H, the system is not
significantly worse than the best case uni/uni
Test it:
Use the accompanying Scilab sources at
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses (bursts) are observed at different locations in the
nervous system
Is it possible to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inverse Procedure
We interpret a burst as a transfer function H (seen as a pulse response)
and recover the delays T and weights W of the network behind it:
function [T,W] = net(H,fs);        // returns T and W
  j = 1;                           // W-index j
  for i = 1:length(H)              // H-index i
    if H(i) == 0 then              // do nothing
    else                           // write the value to W, the index to T
      W(j) = H(i);                 // value to W
      T(j) = i;                    // index to T
      j = j + 1;                   // increment j
    end                            // endif
  end                              // endfor
  T = T ./ fs;                     // multiply with sample duration 1/fs
  T = T - min(T);                  // shift to min: reduced T-vector
endfunction
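The same reconstruction, sketched in Python (the function name and behavior follow the Scilab net above; the final min-shift makes the 0-based vs 1-based indexing difference irrelevant):

```python
import numpy as np

def net(H, fs):
    """Recover delays T (seconds) and weights W from pulse response H
    sampled at rate fs, as in the Scilab sketch above."""
    idx = np.flatnonzero(H)     # indices of the nonzero taps
    W = np.asarray(H)[idx]      # tap values become the weights
    T = idx / fs                # index -> time via sample duration 1/fs
    T = T - T.min()             # shift to min: reduced T-vector
    return T, W

T, W = net([0.5, 0, 1, 0, 0, 1], fs=1000.0)
# T = 0, 0.002, 0.005 seconds; W = 0.5, 1, 1
```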
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H; reduced vectors carry index R
Delays:         T  = [τ1, τ2, τ3]    = [5, 3, 8]
Weights:        W  = [w1, w2, w3]    = [1, 0.5, 1]
Reduced T, W:   TR = [τR1, τR2, τR3] = [0, 2, 5]
                WR = [wR1, wR2, wR3] = [0.5, 1, 1]
Transfer function:
                H  = (w2, 0, w1, 0, 0, w3)
                   = (0.5, 0, 1, 0, 0, 1)
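The forward procedure [H] = trans(T,W,fs) reproduces this table; a Python sketch (with fs = 1, i.e. delays already given in samples, and with the min-shift to the reduced vector TR built in):

```python
import numpy as np

def trans(T, W, fs=1.0):
    """Build pulse response H from delay vector T and weight vector W,
    shifting the delays to their minimum (reduced vector TR)."""
    idx = np.round(np.asarray(T, dtype=float) * fs).astype(int)
    idx -= idx.min()                 # TR = [0, 2, 5] for T = [5, 3, 8]
    H = np.zeros(idx.max() + 1)
    for j, w in zip(idx, W):
        H[j] += w                    # add each weight at its delay index
    return H

# H = (0.5, 0, 1, 0, 0, 1), as in the table
print(trans([5, 3, 8], [1, 0.5, 1]))
```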
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H (uni/uni)
(Figure: unipolar X and H; max(FFT) at 0 Hz)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize the time and frequency domains, we transform the delays
and weights of a simplest interference network into an LTI pulse response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from the delay vector T (delay
mask) and the weight vector W
The FFT reveals learning problems for unipolar signals and a unipolar H
because of the dominant DC value
A mixture of unipolar signals and a bipolar transfer function (weights)
is a good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from the transfer function H
Find the Scilab sources and the paper on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows that even the smallest delays and delay differences
change the pulse response H of the network
Recalling the "Neural Networks" (NN, ANN) approach, with layers clocked
by cycles, we find that the NN approach completely destroys the
sequential structure of each network
In no case are ANN or NN candidates for understanding the function of
nerve-like structures
Thinking about nerves, we need interferential approaches that do not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
And the Lord spoke: "Thus I led you
onto the path of knowledge.
Go now, and carry the message
out into the world!"
Many thanks for your attention!
Successful Google search terms:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 15
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 16
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 17
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes
Hebbian rule in neuro-science shows, that a neuron needs high
synchronous emissions to learn
We need spikes at the output of the neuron
Barker codes maximize spike-like output of long sequences in
RADAR technology:
Example:
H = [1, 1, 1, -1, 1]
X = rev(H)
(Barker code no. 5)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
[email protected]
www.gfai.de/~heinz
9
Spectral Analysis of Transfer Function H
j
F (e )
H (n)e
jn
n
FFT of any unipolar transfer function shows the maximum for frequency
f = 0 Hz (DC)
It is not possible to learn with unipolar H ; codes are AC:
Highest level at 0 Hz
Unipolar
{0…1}
Bipolar
{-1…1}
[email protected]
www.gfai.de/~heinz
10
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses:
[email protected]
www.gfai.de/~heinz
{-1…0…1}
11
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses:
[email protected]
www.gfai.de/~heinz
{0…1}
12
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron)
[email protected]
www.gfai.de/~heinz
{0…1} {-1…1}
13
Unipolar or Bipolar Signal Levels?
unipolar signals and bipolar synapses (neuron)
X, Y:
uni {0…1}
H:
bi {-1…1}
Big surprize:
Using unipolar signals X, Y and bipolar H, the system is not significant
worse compared to the best case uni/uni
Test it:
Use relating Scilab sources under
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Conclusion
Nerve systems do not need bipolar signals to detect code and sound, if
the synapses are bipolar (inhibiting or exciting)!
[email protected]
www.gfai.de/~heinz
14
Interpreting Bursts
Noisy groups of pulses are known at different locations in nerve system
Is it possible, to find the net structure behind them?
[email protected]
www.gfai.de/~heinz
15
The Inversive Procedure
We interprete a burst as transfer function H (seen as pulse response)
and reproduce the delays T and weights W of the network behind:
function [T,W] = net(H,fs);
// returns T and W
j=1;
// W-index j
for i=1:length(H)
// H-index i
if H(i) == 0 then ;
// do nothing
else
// write the value to W, the index to T
W(j) = H(i);
// value to W
T(j) = i;
// index to T
j = j+1;
// increment j
end;
// endif
end;
// endfor
T = T ./ fs;
// multiply with sample duration
T = T - min(T);
// scale to min: reduced T-vector
endfunction;
[email protected]
www.gfai.de/~heinz
16
Example H = f(T,W)
Delays T, weights W, transfer function H, reducing vectors: index r
Delays:
Weights:
Reduced T, W:
T [ 1 , 2 , 3 ] [5, 3, 8]
W [w1 , w2 , w3 ] [1, 0.5, 1]
TR [ R1 , R 2 , R3 ] [0, 2, 5]
WR [wR1 , wR 2 , wR3 ] [.5, 1, 1]
Transfer function:
H (w2 , 0, w1 , 0, 0, w3 )
H 0.5, 0, 1, 0, 0, 1
[email protected]
www.gfai.de/~heinz
17
Example
Key X and keyhole H
unipolar
max(FFT) at 0 Hz
(uni/uni)
[email protected]
www.gfai.de/~heinz
18
Conclusion
To characterize time- and frequency domain, we transform delays and
weights of a simplest interference network into a LTI transfer response
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer
function H (pulse response) of the net from delay vector T (delay mask)
and weight vector W
The FFT shows learning problems for unipolar signals and unipolar H
because of highest DC-value
A mixture between unipolar signals and bipolar transfer function (weights)
acts as good alternative (nerve nets)
Interpreting bursts as transfer functions (pulse responses), we design an
inverse procedure [T,W] = net(H,fs) that reconstructs the net structure
[T,W] from transfer function H
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
[email protected]
www.gfai.de/~heinz
19
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential
properties of a network: for code and sound generation or detection
The lecture shows, that smallest delays and delay differences change the
pulse response H of the network
Remembering the "Neural Networks" (NN, ANN) approach with layers
clocked by clock cycles we find, that the NN-approach destroyes the
sequential structure of each network complete
In no case ANN or NN are candidates to understand the function of nerve
like structures
Thinking about nerves we need interferential approches that does not
destroy the delay structure of the net.
[email protected]
www.gfai.de/~heinz
20
Und der Herr sprach: "So führte ich
euch auf den Weg der Erkenntnis.
Gehet nun, und traget die Botschaft in
die Welt hinaus!"
Vielen Dank für die
Aufmerksamkeit!
Erfolgreiche Google-Suchterme:
"Interferenznetze", "Mathematik des
Nervensystems", "Heinz",
"Akustische Kamera"
[email protected]
www.gfai.de/~heinz
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]
21
Slide 18
www.gfai.de/~heinz
How Network Topology Defines its Behavior Serial Code Detection with Spiking Networks
Dr. Gerd Heinz
Gesellschaft zur Förderung
angewandter Informatik e.V
Berlin-Adlershof
Workshop „Autonomous Systems”
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor
Mallorca, 13-17 Oct. 2013
Sensor- und Motor- Homunculus.
Natural History Museum, London
Contents
www.gfai.de/~heinz
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
Abstract
Compared with technical sensors, sound and code analysis of nerve
system is fascinating
We differ between the whisper of the wind or the branding of waves,
we know the songs of birds, we hear dangerous noises of a defect car
engine, we feel, if an airplane starts
And we speak and understand languages: Do we have a chance, to
interprete the function of a nerve net on the level of net structure?
We try to analyze a simplest delaying network in nerve-like structure
Our net consists of delays T and weights W
Basing on Interference Network (IN) abstraction we transform the net
into a transfer function H of a linear time-invariant system (LTI-system)
We use convolution between input time-function and transfer function
to find the "behaviour" of the LTI-system
* The work bases on the book "Neuronale Interferenzen", Kap.8b, S.181,
download: www.gfai.de/~heinz/publications/NI/index.htm
[email protected]
www.gfai.de/~heinz
3
Convolution
"Faltung" (terminus created by Felix Bernstein, 1922):
t
y(t ) x(t ) * h(t ) x(t ) h( ) d
0
Discrete form (Cauchy product):
n
yn hk xnk
k 0
Example:
FIR-filter
as direct
implementation
of convolution,
form: Y = X * S
[email protected]
www.gfai.de/~heinz
4
A Small Interference Network
Form:
Our Abstraction:
1
x(t)
N'
y(t)
x(t)
n
[email protected]
w1 N'
w2
N
+
...
N
...
2
y(t)
wn
Delay vector:
T [ 1 , 2 ,..., n ]
Weight vector:
W [w1 , w2 ,...,wn ]
Transfer function:
1 n
y(t ) wi x(t i )
n i
www.gfai.de/~heinz
5
Construction of Transfer Function H
(Transfer function of LTI-system)
Discrete transfer function H seen as discrete time function with sample
distance ts = 1/fs and with growing index i :
i = [… 2, 3, 4, 5, 6, 7, 8, 9, …]
fs = 1/ts
H = [… wi-1, wi, wi+1, wi+2, wi+3, wi+4, wi+5, wi+6, …]
Length of H is greater the delay difference:
Construction of the transfer function of the net by addition of weights:
H(j) = H(j) + W(i) mit j = T(i) :
length(H) ≥ max(T) – min(T)
H(T(i)) = H(T(i)) + W(i)
[email protected]
www.gfai.de/~heinz
6
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
if length(T) == length(W) then
T = T * fs;
// apply sample rate of H
T = round(T);
// T becomes index: integer
H = 1:max(T); H = H * 0;
// create an empty H
for i = 1:length(T),
// for all T(i), W(i)
j = T(i),
// delay becomes the H-index j
H(j) = H(j) + W(i),
// add the weight to H
end // for
else // if
printf('\n\nerror: T and W have different size\n');
end // if
endfunction;
H is the transfer function of a LTI-system!
[email protected]
www.gfai.de/~heinz
7
Applying a Convolution
What is the system answer Y for different input functions X ?
It is simple the convolution with H , the multiplication of time series
y(t) = h(t) * x(t)
Using vectors
Y=X*H
X
H
Y
Scilab form
Y = convol(H,X)
Fourier Analysis of H
F = abs(fft(H))
[email protected]
www.gfai.de/~heinz
8
Barker Codes and Spikes

The Hebbian rule in neuroscience shows that a neuron needs highly synchronous input to learn.
We need spikes at the output of the neuron.
Barker codes maximize the spike-like output of long sequences in RADAR technology.

Example (Barker code of length 5):

H = [1, 1, 1, -1, 1]
X = rev(H)
Y = convol(X,H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]
But neurons don't have negative signal values!
What can we do?
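The Barker example can be checked numerically. A small Python sketch (rev and convol of the Scilab original replaced by list reversal and a hand-written convolution):

```python
def convol(h, x):
    """Full discrete convolution, as in the deck's Scilab calls."""
    y = [0] * (len(h) + len(x) - 1)
    for i, hi in enumerate(h):
        for j, xj in enumerate(x):
            y[i + j] += hi * xj
    return y

H = [1, 1, 1, -1, 1]       # Barker code of length 5
X = list(reversed(H))      # the time-reversed code acts as the "key"
print(convol(X, H))        # [1, 0, 1, 0, 5, 0, 1, 0, 1] -- one sharp peak of height 5
```

Convolving a code with its time reversal yields its autocorrelation; Barker codes are exactly those codes whose sidelobes stay at magnitude 1 while the main peak reaches the code length.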
Spectral Analysis of Transfer Function H

F(e^jω) = Σₙ H(n) · e^(−jωn)

The FFT of any unipolar transfer function has its maximum at frequency f = 0 Hz (DC).
It is not possible to learn with a unipolar H, because codes are AC and the highest level sits at 0 Hz.

[Figure: magnitude spectra of a unipolar {0…1} and a bipolar {−1…1} transfer function.]
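The DC dominance for unipolar codes is easy to reproduce (a Python sketch with a plain DFT so no external package is assumed; the example sequence is made up):

```python
import cmath

def dft_mag(h):
    """Magnitude spectrum |F(k)| = |sum_n h[n] * exp(-2*pi*j*k*n/N)|."""
    N = len(h)
    return [abs(sum(h[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n in range(N)))
            for k in range(N)]

uni = [1, 0, 1, 1, 0, 1, 0, 1]     # unipolar {0,1} code
bi = [2 * v - 1 for v in uni]      # the same code, bipolar {-1,1}

# For the unipolar code the DC bin (k = 0) dominates the spectrum;
# for the bipolar version it does not:
print(dft_mag(uni)[0], max(dft_mag(uni)[1:]))
print(dft_mag(bi)[0], max(dft_mag(bi)[1:]))
```

The unipolar code's mean value piles all its energy into the 0 Hz bin; recentring the levels around zero removes that offset, which is the point of the uni/bi comparison on the following slides.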
Unipolar or Bipolar Signal Levels?

[Figures compare three simulated cases:]
Unipolar signals, unipolar synapses: X, Y and H in {0…1}
Bipolar signals, bipolar synapses: X, Y and H in {−1…1}
Unipolar signals, bipolar synapses (neuron): X, Y in {0…1}, H in {−1…1}

Big surprise: using unipolar signals X, Y and a bipolar H, the system is not significantly worse than the best case (uni/uni).

Test it: use the related Scilab sources at
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv

Conclusion: Nerve systems do not need bipolar signals to detect code and sound if the synapses are bipolar (inhibiting or exciting)!
Interpreting Bursts

Noisy groups of pulses are observed at different locations in the nerve system.
Is it possible to find the net structure behind them?
The Inverse Procedure

We interpret a burst as transfer function H (seen as pulse response) and reproduce the delays T and weights W of the network behind it:

function [T,W] = net(H,fs);   // returns T and W
  j = 1;                      // W-index j
  for i = 1:length(H)         // H-index i
    if H(i) == 0 then ;       // do nothing
    else                      // write the value to W, the index to T
      W(j) = H(i);            // value to W
      T(j) = i;               // index to T
      j = j+1;                // increment j
    end;                      // endif
  end;                        // endfor
  T = T ./ fs;                // multiply with the sample duration
  T = T - min(T);             // shift to min: reduced T-vector
endfunction;
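A Python counterpart of net (an illustrative sketch under the same assumptions: H is a sampled pulse response, the returned delays are in seconds and shifted so the first delay is 0):

```python
def net(H, fs):
    """Recover delays T and weights W from a pulse response H sampled at fs."""
    T, W = [], []
    for i, h in enumerate(H, start=1):   # 1-based index as in Scilab
        if h != 0:
            W.append(h)                  # value to W
            T.append(i)                  # index to T
    T = [t / fs for t in T]              # index becomes time
    tmin = min(T)
    T = [t - tmin for t in T]            # shift so the first delay is 0
    return T, W

print(net([0.5, 0, 1, 0, 0, 1], 1))      # ([0.0, 2.0, 5.0], [0.5, 1, 1])
```

Fed with the worked example's H = (0.5, 0, 1, 0, 0, 1), this reproduces exactly the reduced vectors TR = [0, 2, 5] and WR = [0.5, 1, 1] of the next slide, so trans and net are inverse up to the reduction.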
Example H = f(T,W)

Delays T, weights W, transfer function H; reduced vectors carry the index R:

Delays:      T  = [τ₁, τ₂, τ₃]    = [5, 3, 8]
Weights:     W  = [w₁, w₂, w₃]    = [1, 0.5, 1]
Reduced T:   TR = [τR₁, τR₂, τR₃] = [0, 2, 5]
Reduced W:   WR = [wR₁, wR₂, wR₃] = [0.5, 1, 1]

Transfer function:
H = (w₂, 0, w₁, 0, 0, w₃) = (0.5, 0, 1, 0, 0, 1)
Example

[Figure: a "key" X and "keyhole" H, both unipolar (uni/uni); the FFT maximum lies at 0 Hz.]
Conclusion

To characterize time and frequency domain, we transform the delays and weights of a simplest interference network into an LTI transfer response.
A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer function H (pulse response) of the net from the delay vector T (delay mask) and the weight vector W.
The FFT shows learning problems for unipolar signals and a unipolar H, because the DC value is the highest.
A mixture of unipolar signals and a bipolar transfer function (weights) is a good alternative (as in nerve nets).
Interpreting bursts as transfer functions (pulse responses), we design an inverse procedure [T,W] = net(H,fs) that reconstructs the net structure [T,W] from the transfer function H.
Find Scilab sources and the paper on the web
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN

The transfer function or pulse response H is responsible for all sequential properties of a network: for code and sound generation or detection.
The lecture shows that even the smallest delays and delay differences change the pulse response H of the network.
Recalling the "Neural Networks" (NN, ANN) approach with layers driven by clock cycles, we find that the NN approach completely destroys the sequential structure of the network.
ANN or NN are in no case candidates for understanding the function of nerve-like structures.
Thinking about nerves, we need interferential approaches that do not destroy the delay structure of the net.
And the Lord spoke: "Thus I led you onto the path of knowledge. Go now and carry the message out into the world!"

Many thanks for your attention!

Successful Google search terms: "Interferenznetze", "Mathematik des Nervensystems", "Heinz", "Akustische Kamera"
Dr. G. Heinz, GFaI
Volmerstr.3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
[email protected]