Artificial Intelligence (Umjetna inteligencija) - Katedra za strojarsku automatiku
Exercises: Neural Networks
MATLAB: Neural Network Toolbox
>> neural >> nntool
Neuron Model - Simple Neuron
n = -5:0.1:5; plot(n,hardlim(n));
Multiple-Input Neuron
MATLAB code:
n = W*p + b
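As an illustration (not part of the original MATLAB material), the neuron equation n = W*p + b followed by the hard-limit transfer function can be sketched in NumPy; the weight, bias, and input values below are made up:

```python
import numpy as np

# One neuron with R = 3 inputs: a row of weights W (1 x R) and a scalar bias b.
W = np.array([[0.5, -1.0, 2.0]])     # illustrative weights
b = np.array([[1.5]])                # illustrative bias
p = np.array([[1.0], [2.0], [3.0]])  # input column vector

n = W @ p + b                        # net input, as on the slide
a = (n >= 0).astype(float)           # hardlim: 1 if n >= 0, else 0
print(n, a)
```

Here n = 0.5*1 - 1.0*2 + 2.0*3 + 1.5 = 6.0, so hardlim fires (a = 1).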
Multiple-Input Neuron - Abbreviated Notation
Layer of Neurons
Layer of Neurons - Abbreviated Notation
$$
W = \begin{bmatrix}
w_{1,1} & w_{1,2} & \cdots & w_{1,R} \\
w_{2,1} & w_{2,2} & \cdots & w_{2,R} \\
\vdots  & \vdots  &        & \vdots  \\
w_{S,1} & w_{S,2} & \cdots & w_{S,R}
\end{bmatrix},\quad
p = \begin{bmatrix} p_1 \\ p_2 \\ \vdots \\ p_R \end{bmatrix},\quad
b = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_S \end{bmatrix},\quad
a = \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_S \end{bmatrix}
$$
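The layer equation a = f(Wp + b) with the dimensions above (S neurons, R inputs) can be sketched in NumPy; the sizes and random values below are illustrative:

```python
import numpy as np

S, R = 4, 3                         # S neurons in the layer, R inputs
rng = np.random.default_rng(1)
W = rng.standard_normal((S, R))     # weight matrix, S x R
b = rng.standard_normal((S, 1))     # bias vector, S x 1
p = rng.standard_normal((R, 1))     # input vector, R x 1

n = W @ p + b                       # net input, S x 1
a = np.tanh(n)                      # MATLAB's 'tansig' is tanh
print(a.shape)
```

One matrix product computes all S neuron outputs at once, which is exactly what the abbreviated notation expresses.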
Multiple Layers of Neurons
Multiple Layers of Neurons - Abbreviated Notation
Backpropagation - Creating a Network (newff)

newff(PR, [S1 S2...SNl], {TF1 TF2...TFNl}, BTF, BLF, PF)

PR  - R x 2 matrix of min and max values for R input elements
Si  - size of the i-th layer, for Nl layers
TFi - transfer function of the i-th layer, default = 'tansig'
BTF - backpropagation network training function, default = 'traingdx'
BLF - backpropagation weight/bias learning function, default = 'learngdm'
PF  - performance function, default = 'mse'
Example 1. Two-layer network

net = newff([-1 2; 0 5], [3,1], {'tansig','purelin'}, 'traingd');
net.inputs{1}.range
net.layers{1}.size
net.layers{2}.size
net.layers{1}.transferFcn
net.layers{2}.transferFcn
net.b{1}
net.b{2}
net.IW{1,1}
net.LW{2,1}
net.trainFcn
gensim(net,-1)
Backpropagation - Simulation (sim)

Incremental mode of simulation (one input vector):
p = [1;2]; a = sim(net,p)

Batch mode of simulation (one input vector per column):
p = [1 3 2;2 4 1]; a = sim(net,p)
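A sketch of why batch mode works: with inputs stored one per column, a single matrix product simulates all of them at once. The two-layer forward pass below is a NumPy stand-in for sim (the weights are random placeholders, not a trained network):

```python
import numpy as np

def sim_net(W1, b1, W2, b2, p):
    """Forward pass of a two-layer tansig/purelin network.
    Each column of p is one input vector, so a batch of Q inputs
    is simulated in one call, like MATLAB's sim(net, p)."""
    a1 = np.tanh(W1 @ p + b1)
    return W2 @ a1 + b2

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((3, 2)), rng.standard_normal((3, 1))
W2, b2 = rng.standard_normal((1, 3)), rng.standard_normal((1, 1))

p_one = np.array([[1.0], [2.0]])         # incremental: a single column
p_batch = np.array([[1.0, 3.0, 2.0],
                    [2.0, 4.0, 1.0]])    # batch: three columns

a_one = sim_net(W1, b1, W2, b2, p_one)
a_batch = sim_net(W1, b1, W2, b2, p_batch)
print(a_one.shape, a_batch.shape)
```

The first column of the batch result matches the incremental result, since both are the same input vector.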
[Figure: input signals p1 = sin(2t) and p2 = sin(3t) plotted over 0 <= t <= 10]
>> t = 0:0.01:10;
>> p1 = sin(2*t);
>> p2 = sin(3*t);
>> plot(t,p1,t,p2)
>> p = [p1;p2];
>> a = sim(net,p);
>> plot(t,a)
[Figure: network output a plotted over 0 <= t <= 10]
Backpropagation - Simulation (gensim)
>> gensim(net,-1)
[Figure: Simulink model generated by gensim; Sine Wave source blocks feed the Neural Network block's input p{1}, which produces output y{1}]
Backpropagation - Training (train, traingd)
p = [-1 -1 2 2; 0 5 0 5];   % inputs
t = [-1 -1 1 1];            % targets
net = newff(minmax(p),[3,1],{'tansig','purelin'},'traingd');
net.trainParam.show = 50;
net.trainParam.epochs = 300;
net.trainParam.goal = 1e-3; % performance goal
[net,tr] = train(net,p,t);
TRAINGD, Epoch 0/300, MSE 1.59423/0.001, Gradient 2.76799/1e-010
TRAINGD, Epoch 50/300, MSE 0.101785/0.001, Gradient 0.517769/1e-010
TRAINGD, Epoch 100/300, MSE 0.0310146/0.001, Gradient 0.263201/1e-010
TRAINGD, Epoch 150/300, MSE 0.010717/0.001, Gradient 0.145051/1e-010
TRAINGD, Epoch 200/300, MSE 0.00442531/0.001, Gradient 0.082443/1e-010
TRAINGD, Epoch 250/300, MSE 0.00229352/0.001, Gradient 0.05011/1e-010
TRAINGD, Epoch 300/300, MSE 0.00143589/0.001, Gradient 0.0339086/1e-010
TRAINGD, Maximum epoch reached, performance goal was not met.
a = sim(net,p)
a = -0.9510 -1.0209 0.9720 1.0184

>> plotperf(tr)
[Figure: performance curve (log-scale MSE vs. 300 epochs); "Performance is 0.00143589, Goal is 0.001"]
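For intuition, the batch gradient descent that traingd performs can be sketched in NumPy on the same inputs and targets (plain MSE backpropagation; the learning rate and epoch count are illustrative, not the toolbox defaults):

```python
import numpy as np

p = np.array([[-1., -1., 2., 2.],
              [ 0.,  5., 0., 5.]])        # inputs, one column per sample
t = np.array([[-1., -1., 1., 1.]])        # targets
Q = p.shape[1]

rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.standard_normal((3, 2)), np.zeros((3, 1))
W2, b2 = 0.5 * rng.standard_normal((1, 3)), np.zeros((1, 1))
lr = 0.02                                 # illustrative learning rate

for epoch in range(20000):
    a1 = np.tanh(W1 @ p + b1)             # hidden layer (tansig)
    a2 = W2 @ a1 + b2                     # output layer (purelin)
    e = t - a2
    # gradients of MSE = mean(e^2), propagated backwards
    d2 = -2.0 * e / Q
    d1 = (W2.T @ d2) * (1.0 - a1**2)      # tanh derivative = 1 - a1^2
    W2 -= lr * d2 @ a1.T
    b2 -= lr * d2.sum(axis=1, keepdims=True)
    W1 -= lr * d1 @ p.T
    b1 -= lr * d1.sum(axis=1, keepdims=True)

mse = float(np.mean((t - (W2 @ np.tanh(W1 @ p + b1) + b2))**2))
print(mse)
```

As in the TRAINGD log above, plain gradient descent needs many epochs to push the MSE down; the faster variants on the following slides exist precisely because of this.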
Training Functions (net.trainFcn)
trainbfg - BFGS quasi-Newton backpropagation.
trainbr  - Bayesian regularization.
traincgb - Powell-Beale conjugate gradient backpropagation.
traincgf - Fletcher-Powell conjugate gradient backpropagation.
traincgp - Polak-Ribiere conjugate gradient backpropagation.
traingd  - Gradient descent backpropagation.
traingda - Gradient descent with adaptive learning rate backpropagation.
traingdm - Gradient descent with momentum backpropagation.
traingdx - Gradient descent with momentum and adaptive learning rate backpropagation.
trainlm  - Levenberg-Marquardt backpropagation.
trainoss - One-step secant backpropagation.
trainrp  - Resilient backpropagation (Rprop).
trainscg - Scaled conjugate gradient backpropagation.
trainb   - Batch training with weight and bias learning rules.
trainc   - Cyclical order incremental training with learning functions.
trainr   - Random order incremental training with learning functions.
Faster Training - Variable Learning Rate (traingda, traingdx)
p = [-1 -1 2 2; 0 5 0 5];
t = [-1 -1 1 1];
net = newff(minmax(p),[3,1],{'tansig','purelin'},'traingda');
net.trainParam.show = 50;
net.trainParam.lr = 0.05;
net.trainParam.lr_inc = 1.05;
net.trainParam.epochs = 300;
net.trainParam.goal = 1e-5;
[net,tr] = train(net,p,t);

a = sim(net,p)
a = -0.9962 -1.0026 0.9974 1.0031

[Figure: performance curve; "Performance is 9.37659e-006, Goal is 1e-005", goal met after 89 epochs]
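The rule behind traingda can be sketched in a few lines: grow the learning rate while performance keeps improving, and shrink it (rejecting the step) when performance worsens by more than a set factor. A minimal one-parameter illustration of that rule, not the toolbox implementation:

```python
# Sketch of the adaptive-learning-rate rule used by traingda-style training:
# if the error decreases, multiply lr by lr_inc; if it increases by more
# than max_perf_inc, reject the step and multiply lr by lr_dec.
def adaptive_gd(grad, f, w, lr=0.05, lr_inc=1.05, lr_dec=0.7,
                max_perf_inc=1.04, epochs=100):
    perf = f(w)
    for _ in range(epochs):
        w_new = w - lr * grad(w)
        perf_new = f(w_new)
        if perf_new > max_perf_inc * perf:
            lr *= lr_dec                   # step rejected, shrink lr
        else:
            if perf_new < perf:
                lr *= lr_inc               # progress made, grow lr
            w, perf = w_new, perf_new      # accept the step
    return w, lr

# Toy "performance surface": f(w) = w^2, minimum at w = 0.
w_final, lr_final = adaptive_gd(grad=lambda w: 2.0 * w,
                                f=lambda w: w * w, w=3.0)
print(w_final)
```

The learning rate climbs while descent is easy and backs off automatically when a step overshoots, which is why traingda typically needs far fewer epochs than traingd on the same problem.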
Faster Training - Resilient Backpropagation, Rprop (trainrp)
p = [-1 -1 2 2; 0 5 0 5];
t = [-1 -1 1 1];
net = newff(minmax(p),[3,1],{'tansig','purelin'},'trainrp');
net.trainParam.show = 10;
net.trainParam.epochs = 300;
net.trainParam.goal = 1e-5;
[net,tr] = train(net,p,t);

TRAINRP, Epoch 0/300, MSE 0.426019/1e-005, Gradient 1.18783/1e-006
TRAINRP, Epoch 10/300, MSE 0.00610265/1e-005, Gradient 0.172704/1e-006
TRAINRP, Epoch 20/300, MSE 2.30232e-005/1e-005, Gradient 0.00614419/1e-006
TRAINRP, Epoch 23/300, MSE 5.73054e-006/1e-005, Gradient 0.00202166/1e-006
TRAINRP, Performance goal met.

a = sim(net,p)
a = -0.9962 -1.0029 0.9996 1.0000

[Figure: performance curve; "Performance is 5.73054e-006, Goal is 1e-005", goal met after 23 epochs]
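Rprop updates each weight using only the sign of its gradient, with a per-weight step size that grows while the gradient sign is stable and shrinks when it flips. A minimal NumPy sketch on a toy quadratic (a simplified Rprop, not the toolbox code; the constants are the commonly used defaults):

```python
import numpy as np

# Ill-conditioned quadratic f(w) = c1*w1^2 + c2*w2^2 (illustrative).
c = np.array([1.0, 10.0])
grad = lambda w: 2.0 * c * w

w = np.array([2.0, -3.0])
step = np.full_like(w, 0.1)           # per-weight update values (delta)
inc, dec = 1.2, 0.5                   # grow/shrink factors
prev_g = np.zeros_like(w)

for _ in range(100):
    g = grad(w)
    same_sign = g * prev_g
    step = np.where(same_sign > 0, step * inc,     # stable sign: grow
            np.where(same_sign < 0, step * dec,    # sign flip: shrink
                     step))
    step = np.clip(step, 1e-6, 50.0)
    w = w - np.sign(g) * step         # only the gradient's sign is used
    prev_g = g
print(w)
```

Because the step size, not the gradient magnitude, sets the update, Rprop is insensitive to the vanishing gradients of saturated tansig neurons, which is one reason trainrp converged in 23 epochs above.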
Task 1. Function approximation

Approximate the following function with a neural network:

f(x) = e^(-x^2/10) * sin(5 sin(3x))

on the interval 0 < x < 5, with an accuracy (MSE goal) of 2e-4.
[Figure: plot of f(x) over 0 <= x <= 5]
One possible solution

clear;
p = 0:0.01:5;
t = exp(-0.1*p.^2).*sin(5*sin(3*p));
net = newff(minmax(p),[30,1],{'tansig','purelin'},'traincgb');
net.trainParam.show = 100;
net.trainParam.epochs = 2000;
net.trainParam.goal = 2e-4;
[net,tr] = train(net,p,t);
a = sim(net,p);
figure(1), plot(p,t,p,a,':r','linewidth',1.5), legend('Function','Neural network')
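The same fit can be sketched in NumPy with a small tanh network and plain gradient descent; unlike the traincgb run above it will not reach the 2e-4 goal, it only shows the error falling (the network size, learning rate, and epoch count are illustrative):

```python
import numpy as np

# Target from Task 1, fitted with a small tanh network and batch
# gradient descent (the MATLAB solution uses 30 neurons + traincgb).
p = np.arange(0.0, 5.0, 0.05).reshape(1, -1)
t = np.exp(-0.1 * p**2) * np.sin(5 * np.sin(3 * p))
Q = p.shape[1]

rng = np.random.default_rng(0)
S = 15                                    # hidden neurons (illustrative)
W1, b1 = rng.standard_normal((S, 1)), rng.standard_normal((S, 1))
W2, b2 = 0.1 * rng.standard_normal((1, S)), np.zeros((1, 1))

def mse():
    a = W2 @ np.tanh(W1 @ p + b1) + b2
    return float(np.mean((t - a)**2))

mse0 = mse()                              # error before training
lr = 0.01
for _ in range(5000):
    a1 = np.tanh(W1 @ p + b1)
    e = t - (W2 @ a1 + b2)
    d2 = -2.0 * e / Q
    d1 = (W2.T @ d2) * (1.0 - a1**2)
    W2 -= lr * d2 @ a1.T; b2 -= lr * d2.sum(1, keepdims=True)
    W1 -= lr * d1 @ p.T;  b1 -= lr * d1.sum(1, keepdims=True)

print(mse0, mse())
```

The oscillatory sin(5 sin(3x)) term is what forces the relatively large hidden layer in the MATLAB solution; with too few neurons the network underfits the fast wiggles.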
[Figure: comparison of the NN approximation with the given function over 0 <= x <= 5]
Task 2. Approximation of a noisy signal

p = 0:0.01:5;
t0 = exp(-0.1*p.^2).*sin(5*sin(3*p));    % signal
t = t0 + 0.05*randn(size(p));            % signal + noise
net = newff(minmax(p),[30,1],{'tansig','purelin'},'traincgf');
net.trainParam.show = 100;
net.trainParam.epochs = 2000;
net.trainParam.goal = 2e-4;
[net,tr] = train(net,p,t);
a = sim(net,p);
figure(1), plot(p,t0,p,t,':r','linewidth',1.5), legend('Signal','Signal + noise')
figure(2), plot(p,t,':r',p,a,'b','linewidth',1.5), legend('Signal + noise','Neural network')
figure(3), plot(p,t0,p,a,':r','linewidth',1.5), legend('Signal','Neural network')
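A side observation on the parameters above (not from the original slides): noise with standard deviation 0.05 has variance 0.05^2 = 2.5e-3, which is larger than the 2e-4 MSE goal measured against the noisy targets, so the goal can only be approached by fitting the noise itself rather than the underlying signal. A quick NumPy check of that arithmetic:

```python
import numpy as np

# MSE goal from the script vs. the variance of the added noise.
goal = 2e-4
noise_std = 0.05
noise_var = noise_std**2          # expected MSE of a *perfect* fit of t0
print(noise_var, goal)

# Empirical check: average squared noise over many samples.
rng = np.random.default_rng(0)
noise = noise_std * rng.standard_normal(100000)
print(float(np.mean(noise**2)))
```

This is why figure 3 matters: comparing the network output to the noise-free t0, not to the noisy t, shows whether the network smoothed the noise or memorized it.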
[Figure 1: the signal and the noisy signal, 0 <= x <= 5]
[Figure 2: NN approximation of the noisy signal]
[Figure 3: comparison of the NN approximation with the noise-free signal]
Task 3. Approximation of a noisy signal (II)

Approximate the following (noisy) function with a neural network:

f(x) = e^(…) sin(…)

on the interval -4 < x < 4, with an accuracy (MSE goal) of 1e-4.

[Figure: the target function and its noisy version over -4 <= x <= 4]